The Detector Control Power System of the Monitored Drift Tubes of the ATLAS Experiment
Theodoros Alexopoulos, NTU Athens
TWEPP08, September 17, 2008
Naxos, Greece
Outline
• Introductory Remarks about DCS
• MDT in ATLAS Experiment
• Structure of DCS
• Power Systems for MDT
• Summary
Detector Control System (DCS)
DCS - Introduction
Detector Control Systems
◦ Systems to control (via computers) the operation of a HEP experiment
◦ Appeared in the 1980s
◦ Increasing complexity of the experiments
◦ Control of a large number of parameters
◦ Check of the smooth operation of the experiment
Characteristics
◦ Operators don't need to be sub-detector experts
◦ Operation procedures become much simpler
◦ Automatic error-recovery procedures can be implemented
Standard DCS Model (Client-Server Model)
[Figure: OPI clients connected over LAN-Ethernet to IOC/IOS servers, which drive the controllers]
OPI: Operator Interface, IOC: Input/Output Controller, IOS: Input/Output Server
What does DCS involve?
Control, Configuration, Readout, Monitoring of Hardware devices (but not readout of event data)
Monitoring of external systems (LHC, safety, electrical, gas, etc.)
Implementation of Finite State Machine behavior
Communication with DAQ
Logging of data and status, storage in database
Partitioning
[Figure: partitioning tree: a central control station above Control Units (sub-detectors subDet 1-3, sub-systems subSys 1-2) down to Device Units (drivers, devices)]
DCS Architecture
[Figure: DCS architecture: OPI terminals and local/remote workstations (SCADA) on LAN/WAN; I/O servers distributed in the experimental area; I/O controllers with analog/digital channels and fieldbuses down to the experiment sub-detectors and experimental equipment; external systems (LHC, safety, beam-line data) connected through dedicated servers]
Typical DCS Architecture (physical subsystem layout)
[Figure: OPI terminals on LAN-Ethernet; bridges to fieldbuses (ELMB, CAEN HV); I/O servers; Oracle DB server; remote OPI users (laptops) over WAN]
Supervisory Control And Data Acquisition System: PVSS
Commercial product PVSS II from ETM, Austria; chosen LHC-wide
Can be distributed over many stations
Flexible and open architecture
Basic functions for automation
Standardized interface to the hardware
Application programming interfaces
What is OPC? OPC defines a standard to interface programs (SCADA, HMI) and hardware devices in a control system.
OPC provides multi-vendor interoperability.
No more specific drivers needed for each client.
OPC (OLE for Process Control) is based on the MS object model COM/DCOM (Component Object Model). OPC includes three interface specifications:
◦ Data Access
◦ Alarm and Event Handling
◦ Historical Data Access
[Figure: without OPC, every application needs a vendor-specific driver for each device; with OPC, applications X and Y reach devices A and B through a common OPC interface and per-device OPC servers]
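A minimal sketch of this interoperability idea in Python (the class and item names are hypothetical, not a real OPC API): each vendor ships one server implementing a common read/write contract, so a single generic client can talk to any device without a vendor-specific driver.

  from abc import ABC, abstractmethod

  # Hypothetical stand-in for the OPC Data Access contract:
  # every vendor's server exposes the same read/write interface.
  class OpcServer(ABC):
      @abstractmethod
      def read(self, item: str) -> float: ...
      @abstractmethod
      def write(self, item: str, value: float) -> None: ...

  class VendorAServer(OpcServer):
      """Vendor A hides its proprietary protocol behind the contract."""
      def __init__(self):
          self._items = {"Device.Voltage": 48.0}
      def read(self, item): return self._items[item]
      def write(self, item, value): self._items[item] = value

  class VendorBServer(OpcServer):
      """Vendor B does the same; the client code never changes."""
      def __init__(self):
          self._items = {"Crate00.Board01.Ch03.VMon": 3.45}
      def read(self, item): return self._items[item]
      def write(self, item, value): self._items[item] = value

  # One generic client (e.g. a SCADA tool) serves all vendors.
  def poll(server: OpcServer, item: str) -> float:
      return server.read(item)

  print(poll(VendorAServer(), "Device.Voltage"))
  print(poll(VendorBServer(), "Crate00.Board01.Ch03.VMon"))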
LHC Experiments: ATLAS
Diameter: 25 m
Length: 44 m
Weight: 7 kt
~100 M electronic channels
1150 MDT chambers
354,384 tubes
214 tonnes
725 m³ gas volume
5520 m² chamber area
ATLAS: A Toroidal LHC ApparatuS
[Figure: ATLAS layout, 46 m long and 25 m in diameter: Inner Detector (Pixel, SCT, TRT), Calorimeters (LAr, Tile), Muon system with trigger (TGC, RPC) and precision (MDT, CSC) chambers]
TRT: Transition Radiation Tracker, SCT: SemiConductor Tracker, TGC: Thin Gap Chamber, RPC: Resistive Plate Chamber, MDT: Monitored Drift Tubes, CSC: Cathode Strip Chamber
Muon Spectrometer
Two basic parts, Barrel and EndCap:
◦ Barrel: 3 cylinders (inner, middle, outer)
◦ EndCap: 3 disks
Chambers: MDT, CSC, RPC, TGC
1150 MDT chambers, 354,384 tubes, 214 tonnes, 725 m³ gas volume, 5520 m² chamber area
DCS Architecture in ATLAS
[Figure: ATLAS DCS hierarchy]
Global Control Stations (GCS): Operator Interface, Data Viewer, Alarm Status, Web; connected over WAN to CERN-wide services (DCS_IS, Magnet, LHC, DSS)
Subdetector Control Stations (SCS): CIC, Tile, Pixel, SCT, TRT, MDT, TGC, RPC, CSC, TDAQ, LAr
Local Control Stations (LCS): High/Low Voltage, Gas, Alignment, Temp, B-field, JTAG, ROD Power
Front-End Systems
Finite State Machine (FSM)
[Figure: MDT FSM hierarchy: ATLAS (GCS: overall control of the detector) → MDT (SCS: full local operation of the sub-detector) → TTC partitions (Barrel A/C, EndCap A/C) → layers (Inner, Middle, Outer) → chambers (e.g. BMS1C14, BML1C01); commands flow down, State & Status flow up]
Structure of FSM:
Definition: A finite state machine (FSM) or finite state automaton or simply a state machine is a model of behavior composed of a finite number of states, transitions between those states and actions.
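As a toy illustration of this definition (a sketch only, not the JCOP FSM toolkit), the Python fragment below models a power-supply-like device with a finite set of states, the allowed transitions between them, and an action per transition; the state names follow the deck (NOT_READY, STANDBY, RAMP_UP, READY).

  # Allowed transitions: (current state, command) -> next state.
  TRANSITIONS = {
      ("NOT_READY", "SWITCH_ON"):  "STANDBY",
      ("STANDBY",   "GOTO_READY"): "RAMP_UP",
      ("RAMP_UP",   "RAMP_DONE"):  "READY",
      ("READY",     "SWITCH_OFF"): "NOT_READY",
  }

  class Device:
      def __init__(self):
          self.state = "NOT_READY"
      def handle(self, command):
          key = (self.state, command)
          if key not in TRANSITIONS:
              raise ValueError(f"{command} not allowed in state {self.state}")
          self.state = TRANSITIONS[key]            # the transition
          print(f"{command}: now in {self.state}") # the action (here: log)

  dev = Device()
  for cmd in ("SWITCH_ON", "GOTO_READY", "RAMP_DONE"):
      dev.handle(cmd)  # NOT_READY -> STANDBY -> RAMP_UP -> READY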
[Figure: example FSM tree (LAr): ATLAS (Global Control Station) → sub-detector SCSs / DAQ partitions (Pixel, SCT, TRT, LAr, Tile, MDT, TGC, CSC, RPC, CIC; DAQ-DDC) → geographic partitions (EMB-A, EMB-C, EMEC-A, EMEC-C, HEC, FCAL; quadrants Q1-Q4) → Local Control Stations (HV, THERMO, ROD, FEC, HV-PS) → Device Units (HV sector 1 ... HV sector n)]
Operator Interface: Finite State Machine (FSM) Architecture
The FSM (part of the JCOP Framework) is the main tool for the implementation of the full control hierarchy in ATLAS DCS
It is envisaged that the shift operators will operate DCS ONLY through the Operator Interface (based on the FSM) and the PVSS alarm screen
JCOP: Joint Controls Project
Operator Interface: FSM (Partitioning)
[Figure: the FSM tree (GCS → SCS 1..N → TTC partitions → LCSs → sub-systems → devices Dev1..DevN): the shift operator runs the full tree from the GCS, while detector experts can take over individual sub-trees, e.g. one SCS or one TTC partition]
Operator Interface: FSM (STATE and STATUS)
STATE (values defined by the user): READY, NOT_READY, ON, STANDBY, RAMP_UP, ...
STATUS:
◦ OK: the system works fine
◦ WARNING: low severity; the system can go on working
◦ ALARM: high severity; a serious error to be fixed ASAP
◦ FATAL: very high severity; the system cannot work
[Figure: the same FSM tree with operators attached at different levels; commands propagate downwards, STATE and STATUS propagate upwards]
Messages travel over a double information path: STATE & STATUS
STATE defines the operational mode of the system (ON, OFF, etc.)
STATUS defines how well the system is working (OK, WARNING, ALARM, FATAL)
The two paths are parallel: e.g. the HV system is in the RAMP_UP state (which takes several minutes) when an error triggers; the error is propagated through the STATUS while the STATE stays the same.
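The two parallel paths can be sketched as follows (hypothetical names, not the JCOP API): STATE and STATUS are separate attributes updated through separate channels, so an error during a ramp changes only the STATUS.

  class Node:
      def __init__(self, name):
          self.name = name
          self.state = "OFF"    # operational mode: OFF, RAMP_UP, ON, ...
          self.status = "OK"    # health: OK, WARNING, ALARM, FATAL

      def command(self, new_state):      # command/state path
          self.state = new_state

      def report_error(self, severity):  # status path; STATE untouched
          self.status = severity

  hv = Node("HV sector 1")
  hv.command("RAMP_UP")        # ramping takes several minutes
  hv.report_error("ALARM")     # an error triggers during the ramp
  print(hv.state, hv.status)   # RAMP_UP ALARM: same STATE, new STATUS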
ATLAS DCS at First Beam Events
CAEN Power Supply Modules
Crates
Boards: A3025 & A3016 (LV), A3540P (HV)
Mainframe: CAEN SY1527
Branch Controllers: A1676
Structure of MDT PS
[Figure: electronics room: SY1527 system mainframe and A1676 branch controller; experimental area: EASY modules with control logic on a control bus, fed by 48 V power sources (with power-fail interrupt), a 48 V backup battery, and power/service lines; 64 crates in total]
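A sketch of how this chain could be addressed in software (structure and numbers illustrative only, not CAEN's interface): mainframe, branch controller, EASY crate, board, channel, matching the SY1527/A1676/EASY hierarchy above.

  # Illustrative topology: SY1527 mainframe -> A1676 branch controller
  # -> EASY crate (one of 64) -> LV/HV boards -> channels.
  topology = {
      "SY1527": {
          "branch00": {
              "crate00": {
                  "slot01": {"type": "A3025"},    # LV board
                  "slot05": {"type": "A3540P"},   # HV board
              },
          },
      },
  }

  def channel_path(mainframe, branch, crate, slot, ch):
      """Unique channel address, e.g. usable as an OPC item name."""
      return f"{mainframe}.{branch}.{crate}.{slot}.ch{ch:02d}"

  board = topology["SY1527"]["branch00"]["crate00"]["slot05"]
  print(channel_path("SY1527", "branch00", "crate00", "slot05", 3),
        "on board", board["type"])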
PS OPC Server
A single OPC server runs on the PS1 PC (scattered system)
PS2: Barrel system
PS3: EndCap system
[Figure: the SY1527 mainframe and EASY modules are read out through the single OPC server on PS1; OPC clients on the PS2, PS3, and CSC machines connect to it]
PVSS datapoints
Datapoints that correspond to hardware modules (channels, boards, crates, branch controllers, mainframe)
Datapoints that correspond to the chambers
Internal datapoints:
◦ for the FSM
◦ for the OPC server (communication with the hardware)
◦ for the RDB manager (archiving of datapoint elements)
Alarms are activated for the datapoints and their elements
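A rough Python analogue of the datapoint idea (PVSS has its own datapoint types and CTRL scripting language; this is only a sketch): a datapoint is a structured record whose elements can carry an alarm range checked on every update. The values below are illustrative.

  class DPE:  # datapoint element with an optional alarm range
      def __init__(self, value=0.0, lo=None, hi=None):
          self.value, self.lo, self.hi = value, lo, hi
      def set(self, v):
          self.value = v
          if self.lo is not None and not (self.lo <= v <= self.hi):
              print(f"ALARM: {v} outside [{self.lo}, {self.hi}]")

  # Datapoint mirroring one HV channel of a chamber.
  channel = {
      "vMon":   DPE(0.0, lo=3050.0, hi=3120.0),  # monitored voltage (V)
      "iMon":   DPE(0.0, lo=0.0,   hi=20.0),     # monitored current (uA)
      "status": DPE(0),
  }

  channel["vMon"].set(3080.0)  # inside the range: silent
  channel["vMon"].set(2900.0)  # outside the range: alarm activated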
FSM trees
One FSM tree is implemented for each partition
Hierarchy is set for the parts of the system (nodes and their children)
Commands propagate down the tree; states and statuses are summarized upwards
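The propagation behavior can be sketched like this (a toy model, not the JCOP implementation): a command fans out to all children, and each logical node summarizes its children's states on the way back up.

  class FsmNode:
      def __init__(self, name, children=()):
          self.name, self.children = name, list(children)
          self.state = "NOT_READY"

      def command(self, cmd):
          for c in self.children:          # commands propagate down
              c.command(cmd)
          if not self.children:            # device unit: execute
              self.state = "READY" if cmd == "GOTO_READY" else "NOT_READY"
          else:                            # logical node: summarize up
              states = {c.state for c in self.children}
              self.state = states.pop() if len(states) == 1 else "MIXED"

  barrel_a = FsmNode("Barrel A", [FsmNode("BMS1C14"), FsmNode("BML1C01")])
  mdt = FsmNode("MDT", [barrel_a])
  mdt.command("GOTO_READY")
  print(mdt.state)  # READY, since every chamber reached READY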
FSM panels
FSM panels: EndCaps
Present status of PS system
Barrel
◦ All chambers are incorporated in the system with LV modules
◦ HV modules for all chambers in all sectors
EndCaps
◦ Both Big Wheels and Small Wheels (one per side) are incorporated in the system with both LV and HV modules
Summary
• The ATLAS MDT PS DCS is a robust system
• It is being used…
• Delivered & tested on time
Members of the ATLAS NTU-Athens DCS group: T. Alexopoulos, T. Argyropoulos, E. Gazis, E. Mountricha, C. Tsarouchas, G. Tsipolitis
Ex-members: M. Bachtis, A. Iliopoulos