
Page 1: Controls & Monitoring Status Update J. Leaver 05/11/2009

Controls & Monitoring Status Update

J. Leaver
05/11/2009

Page 2: Controls & Monitoring Status Update J. Leaver 05/11/2009

Infrastructure

Page 3: Controls & Monitoring Status Update J. Leaver 05/11/2009

Infrastructure Issues

• Control PCs & servers
• Application management
  – Client-side application launcher
  – Soft IOC run control
• Configuration Database (CDB)
  – EPICS Interface
  – Ensuring the validity of recorded run parameters
• Protocols for updating & maintaining software on control PCs (see PH’s talk)
• Alarm Handler (see PH’s talk)
• Channel Archiver (see PH’s talk)
• Remote Access (see PH’s talk)

Page 4: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

[Diagram: MICE control network – miceiocpc1, target1ctl, miceioc1, miceioc2, miceioc4, miceioc5, miceserv1, miceecserv, miceopipc1, micecon1, target2]

Page 5: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

miceiocpc1:
• General purpose server PC
  – Currently runs EPICS servers for:
    • FNAL BPMs
    • DATE Status
    • Config. DB User Entry Data
    • CKOV Temp. & Humidity Monitor
    • DSA Neutron Monitor
  – Will also run EPICS servers for:
    • Network Status
    • CKOV + TOF CAEN HV Supplies

Page 6: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

target1ctl:
• Target DAQ & Control PC
  – Currently runs:
    • Target / Beam Loss Monitor DAQ
  – Will run EPICS servers for:
    • Beam Loss Monitor
    • Target Controller

Page 7: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

• Target Drive IOC (vxWorks)
  – EPICS server for Target PSU & extraction motor

Page 8: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

• Beamline Magnets IOC (vxWorks)
  – EPICS server for Q1-9, D1-2

Page 9: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

• Decay Solenoid IOC (vxWorks)

Page 10: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

• Linde Refrigerator IOC (PC)

Page 11: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

miceecserv:
• ‘EPICS Client’ Server PC
  – Runs all client-side control & monitoring applications
  – Runs infrastructure services:
    • Alarm Handler
    • Channel Archiver
• Large wall mount display shows:
  – Alarm Handler panel
  – Log message viewer
• Display may also be used to show any (non-interactive) panel containing information that must be monitored for the duration of a specific run

Page 12: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

miceserv1:
• Gateway / Archiver Web Server PC
  – Runs Channel Access Gateway, providing read-only access to PVs between MICE Network & heplnw17
  – Runs web server enabling read-only access to Channel Archiver data
  – Currently running non-standard OS for control PCs
    • Will reformat after current November/December run period
  – See PH’s talk

Page 13: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

miceopipc1:
• General purpose Operator Interface PC
  – Primary access point for users to interact with control & monitoring panels
  – Essentially a ‘dumb’ X server – runs all applications via SSH from miceecserv

Page 14: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control PCs & Servers: Current Status

micecon1, target2:
• Additional general purpose Operator Interface PCs
  – Currently running non-standard OS for control PCs
  – Useable, but not optimally configured…
  – Cannot disturb at present – will reformat after current November/December run period
    • See PH’s talk
  – Shall be renamed miceopipc2 & miceopipc3

Page 15: Controls & Monitoring Status Update J. Leaver 05/11/2009

Application Launcher

• New application launcher replaces DL TCL script
  – XML configuration file, easy to add items (see the sketch below)
  – Unlimited subcategory levels
• Provides real-time display of application status
• Configurable response to existing application instances
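The launcher’s actual configuration schema is not shown in this talk; below is a minimal sketch of what an XML file with arbitrarily nested subcategories might look like, parsed with the Python standard library. The element and attribute names (<category>, <app>, cmd) are illustrative assumptions.

```python
# Sketch: nested XML launcher configuration, parsed with the standard
# library. Element/attribute names are assumptions, not the MICE schema.
import xml.etree.ElementTree as ET

CONFIG = """
<launcher>
  <category name="Target">
    <app name="Target Controller" cmd="target_ctl_gui"/>
    <category name="Beam Loss">
      <app name="BL Monitor" cmd="bl_monitor"/>
    </category>
  </category>
  <category name="Infrastructure">
    <app name="Alarm Handler" cmd="alh"/>
  </category>
</launcher>
"""

def walk(node, depth=0):
    """Print the launcher menu tree; recursion gives unlimited levels."""
    for child in node:
        if child.tag == "category":
            print("  " * depth + child.get("name") + "/")
            walk(child, depth + 1)
        elif child.tag == "app":
            print("  " * depth + child.get("name") + " -> " + child.get("cmd"))

walk(ET.fromstring(CONFIG))
```

Adding a new menu item is then a one-line edit to the XML file rather than a code change, which is presumably what “easy to add items” refers to.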

Page 16: Controls & Monitoring Status Update J. Leaver 05/11/2009

Application Launcher: App Status

• Application was previously launched by an external process, but is no longer running (return value unknown)
• Application was killed by a signal & is no longer running
• Application is running, but was not executed by this launcher
• Application is running
• Application quit with an error code & is no longer running
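These states map naturally onto what a POSIX process handle reports. A sketch, assuming a subprocess-based launcher (the talk does not describe the actual implementation):

```python
# Sketch: deriving the legend's states from subprocess.Popen.
import subprocess

def app_status(proc):
    """Classify a tracked child process into the status categories above."""
    rc = proc.poll()                        # None while still running
    if rc is None:
        return "running"
    if rc < 0:
        return f"killed by signal {-rc}"    # negative = terminated by signal
    if rc > 0:
        return f"quit with error code {rc}"
    return "exited normally"

# An application that is running but was *not* started by this launcher
# has no Popen handle; detecting it requires scanning the process table,
# as does the 'launched externally, no longer running' case.

proc = subprocess.Popen(["sleep", "2"])
print(app_status(proc))                     # -> running
```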

Page 17: Controls & Monitoring Status Update J. Leaver 05/11/2009


Application Launcher: External App Response

• Multiple application launchers will be operated simultaneously

– On miceopipc1-3 (via SSH from miceecserv) & miceecserv itself

• Need to ensure that shifters using different launchers do not ‘conflict’

• If operator attempts to execute an application that is already running, launcher has a configurable response:

– Ignore: Launch another instance

– Inhibit: Prevent another instance from running

– Kill: Close existing instance & run a new one (e.g. could be used for a ‘master’ override control panel)

• Typical configuration (see the sketch after this list):

– Only one instance of each ‘control’ application may run (cannot have multiple users modifying the same parameter!)

– Unlimited numbers of monitoring panels may be opened
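A sketch of the Ignore / Inhibit / Kill logic and the typical configuration above; the launcher internals and command names are assumptions for illustration:

```python
# Sketch: configurable response when an instance already exists.
import subprocess

running = {}   # application name -> Popen handle

def launch(name, cmd, policy="inhibit"):
    old = running.get(name)
    if old is not None and old.poll() is None:   # instance still alive
        if policy == "inhibit":
            print(name + ": already running - launch refused")
            return old
        if policy == "kill":                     # e.g. 'master' override panel
            old.terminate()
            old.wait()
        # policy == "ignore": fall through, start another instance
        # (this sketch only tracks the newest handle per name)
    proc = subprocess.Popen(cmd)
    running[name] = proc
    return proc

# Typical configuration: one instance of each control panel,
# any number of monitoring panels (command names are hypothetical).
launch("beamline-control", ["beamline_gui"], policy="inhibit")
launch("tof-monitor", ["tof_monitor"], policy="ignore")
```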

Page 18: Controls & Monitoring Status Update J. Leaver 05/11/2009

Soft IOC Management

• Application launcher primarily concerned with managing client-side control & monitoring panels running on miceecserv

• Also need to implement run control for corresponding EPICS servers

– ‘Hard IOCs’ running on vxWorks (i.e. servers provided by DL) are always ‘on’ → require no routine external intervention

– ‘Soft IOCs’ running on control PCs (i.e. servers produced within the Collaboration) are executed like any other application → require user control

• Why can’t soft IOCs just run at system start-up, like any other service?

– Assumes that servers run perpetually, unattended – not true!

– Sometimes need to modify configuration files, requiring server restart

– Servers sometimes crash due to hardware problems, requiring restart

– May need to turn off or reconfigure hardware – cannot do this while a soft IOC is running

– Shifters should not have to worry about Linux service management…

Page 19: Controls & Monitoring Status Update J. Leaver 05/11/2009


Soft IOC Management

• CmdExServer provides similar functionality to normal application launcher, but with an EPICS interface

• Each IOC PC runs an instance of the CmdExServer at start-up

• CmdExServer manages local soft IOCs

• Client-side ‘remote application launcher’ communicates with all CmdExServers & allows user to start/stop IOCs

NB

- Current remote launcher configuration only includes a subset of the servers assigned to miceiocpc1

- Others will be added as they become available

[Diagram: CmdExServer deployment]
• miceiocpc1: CmdExServer → FNAL BPM Server, DATE Status Server, Config. DB User Entry Data Server, Network Status Server, etc.
• micetk1pc: CmdExServer → AFEIIt Server
• target1ctl: CmdExServer → Beam Loss Monitor Server, Target Controller Server
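The CmdExServer’s PV interface is not detailed in the talk; purely for illustration, the remote application launcher might drive it through command/status PVs along these lines (all PV names are invented):

```python
# Sketch: starting/stopping a soft IOC through a CmdExServer's EPICS
# interface. PV names below are invented - the real interface is not
# documented in this talk.
from epics import caget, caput

def start_soft_ioc(host, ioc):
    # hypothetical command PV hosted by the CmdExServer on 'host'
    caput(f"MICE:{host}:CMDEX:{ioc}:START", 1, wait=True)

def ioc_running(host, ioc):
    # hypothetical status PV: 1 = running, 0 = stopped
    return caget(f"MICE:{host}:CMDEX:{ioc}:STATUS") == 1

start_soft_ioc("miceiocpc1", "FNALBPM")
print(ioc_running("miceiocpc1", "FNALBPM"))
```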

Page 20: Controls & Monitoring Status Update J. Leaver 05/11/2009


Configuration Database: EPICS Interface

• Custom EPICS PV backup & restore client is functionally complete

– Enables manual backup & restore of set point values

– Automatically backs up set parameters when DATE signals end of run

• Currently stores values in local XML file archive

– Automatic backup files transferred to CDB via SSH/SCP to heplnw17

– Temporary solution → will be replaced with direct SOAP XML transactions once RAL networking issues resolved

• Need publicly accessible web server on heplnw17

– Restoration of parameters from CDB will also be implemented once SOAP transfers are possible
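A sketch of the interim path described above: read set points over Channel Access, write a local XML archive, then copy it to heplnw17 with scp. The PV names, file layout and remote account are assumptions.

```python
# Sketch: set-point backup to a local XML archive, then SCP transfer.
import subprocess
import xml.etree.ElementTree as ET
from epics import caget

SETPOINT_PVS = ["MICE:Q1:CURRENT:SP", "MICE:D1:CURRENT:SP"]  # hypothetical

def backup(run_number, dest="cdb@heplnw17:/data/cdb/incoming/"):
    root = ET.Element("setpoints", run=str(run_number))
    for pv in SETPOINT_PVS:
        ET.SubElement(root, "pv", name=pv, value=str(caget(pv)))
    fname = f"run{run_number}_setpoints.xml"
    ET.ElementTree(root).write(fname)
    # interim transfer; to be replaced by direct SOAP XML transactions
    subprocess.run(["scp", fname, dest], check=True)
```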

Page 21: Controls & Monitoring Status Update J. Leaver 05/11/2009

Configuration Database: User Entry

• Not all parameters required for a CDB ‘run’ entry are available through normal EPICS channels
  – i.e. relevant IOCs & integration with the DAQ are not yet complete
  – Currently only beamline magnet currents can be backed up from ‘live’ servers
• (Quasi) temporary solution:
  – Generic EPICS data server hosts PVs for all values missing from existing IOCs, so they can be read by backup/restore client
  – User entry client allows shifter to enter required parameters before initiating a run
  – As future work progresses, unnecessary user entry items will be removed
• However, shall always require some degree of manual data entry

[Diagram: User Entry Client & Backup / Restore Client communicating with CDB Data Server (miceiocpc1)]

Page 22: Controls & Monitoring Status Update J. Leaver 05/11/2009

Ensuring the Validity of CDB Entries

• Vital that set point values remain fixed during each standard run
  – If set point value record in CDB does not represent physical state of system for entire run, data are invalid
• Implement following protocol to ensure invalid runs are correctly identified (monitor logic sketched below):
  – CDB data server hosts run status PV
  – User entry client automatically sets run status to true when user submits current run parameters
    • At this stage, users should not modify set point values again until run is complete
  – Dedicated monitor IOC checks all set point PVs while DATE is in ‘data taking’ state → sets run status to false if any value changes (to do)
  – Alarm Handler monitors run status → immediately warns that run is invalid if any user modifies a set point value (to do)
  – run status incorporated in CDB run parameters record
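A sketch of the two ‘to do’ items: monitoring set points during data taking and flipping the run status PV on any change. All PV names and the DATE state string are assumptions.

```python
# Sketch: set-point monitor that invalidates the run-status PV.
from epics import PV

SETPOINTS = ["MICE:Q1:CURRENT:SP", "MICE:D1:CURRENT:SP"]  # hypothetical
run_status = PV("MICE:CDB:RUNSTATUS")                     # hypothetical
date_state = PV("MICE:DATE:STATE")                        # hypothetical

def on_setpoint_change(pvname=None, value=None, **kw):
    # only changes made while DATE is taking data invalidate the run
    if date_state.get(as_string=True) == "data taking":
        print(f"{pvname} changed mid-run -> run marked invalid")
        run_status.put(0)     # Alarm Handler then warns the shifter

# subscribing with a callback gives a change notification per PV
monitors = [PV(name, callback=on_setpoint_change) for name in SETPOINTS]
```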

Page 23: Controls & Monitoring Status Update J. Leaver 05/11/2009

Control & Monitoring Systems

Page 24: Controls & Monitoring Status Update J. Leaver 05/11/2009

C&M Systems Overview

System | Owner | EPICS Developer | EPICS Status
Target: Drive | Paul Smith; Paul Hodgeson; Chris Booth (UOS) | Adrian Oates; Graham Cox (DL) | Operational. Requires additional safety interlocks and read out of chiller temperature and flow rate.
Target: Controller | Paul Smith (UOS); James Leaver (IC) | James Leaver (IC) | Software framework and initial versions of EPICS server/client applications are complete. Low level hardware drivers are to be implemented once firmware is in an operational state.
Target: Beam Loss | Paul Smith; Paul Hodgeson (UOS); James Leaver (IC) | Pierrick Hanlet (IIT) | Functionally complete.
Beamline Magnets | Martin Hughes (RAL) | Peter Owens (DL) | Functionally complete.
Pion Decay Solenoid | Mike Courthold (RAL) | Adrian Oates; Graham Cox (DL) | Functionally complete.
FNAL Beam Position Monitors | Alan Bross (FNAL) | James Leaver (IC) | Complete.
TOF | Maurizio Bonesini (INFN) | Pierrick Hanlet (IIT) | EPICS server/client applications for HV power supply control are functional, but development of hardware drivers has not yet commenced.
CKOV | Lucien Cremaldi; David Sanders (OLEMISS) | Pierrick Hanlet (IIT) | Monitoring of temperature and humidity is functionally complete. EPICS server/client applications for HV power supply control are functional, but development of hardware drivers has not yet commenced.
Tracker: Diffuser | Wing Lau (OU) | Pierrick Hanlet (IIT) | Not yet commenced.
Tracker: Spectrometer Solenoids | Steve Virostek (LBNL) | Adrian Oates; Graham Cox (DL) | Included as part of a solenoid package to be provided by Daresbury Lab. Most of the design drawings are complete, and ~£15.5K of capital and 0.4 man years of effort are required to finish the project. Funding has been confirmed by the US (2/3) and UK (1/3).
Tracker: B-Field Probes | Frank Filthaut (RUN) | Frank Filthaut (RUN) | Functionally complete.
Tracker: AFEIIts | Alan Bross (FNAL) | James Leaver (IC); Jean-Sebastien Graulich (UNIGE) | Complete, but integration with DATE DAQ to be finalised.
Tracker: AFEIIt Infrastructure | Alan Bross (FNAL) | Adrian Oates; Graham Cox (DL) | Included as part of a solenoid package to be provided by Daresbury Lab. See Tracker: Spectrometer Solenoids entry.
Calorimeter: KL Calorimeter | Virgilio Chimenti (INFN) | Unknown | Unallocated.
Calorimeter: Electron Muon Ranger | A. Blondel; J-S Graulich; V. Verguilov; F. Masciocchi; L. Nicola; R. Bloch; P. Béné; F. Cadoux (UNIGE) | Unknown | Unallocated.
H2 Absorbers: Focus Coils | Wing Lau (OU) | Adrian Oates; Graham Cox (DL) | Included as part of a solenoid package to be provided by Daresbury Lab. See Tracker: Spectrometer Solenoids entry.
H2 Absorbers: Hydrogen System | Yury Ivanyushenkov; Tom Bradshaw (RAL) | Adrian Oates; Graham Cox (DL) | DL have acquired necessary safety training and have started evaluating PLC systems. Early stages of development.
RF Cavities: Coupling Coils | Derun Li; Steve Virostek (LBNL) | Adrian Oates; Graham Cox (DL) | Included as part of a solenoid package to be provided by Daresbury Lab. See Tracker: Spectrometer Solenoids entry.
RF Cavities: RF System | Andy Moss (ASTeC) | Dimity Tettyleman (LBNL); Adrian Oates; Graham Cox (DL) | Initial discussions have taken place.
DATE Status | Jean-Sebastien Graulich (UNIGE) | James Leaver (IC); Jean-Sebastien Graulich (UNIGE) | Complete.
Network Status | Anyone with a PC/IOC in the MLCR/Hall | James Leaver (IC) | Complete.
DSA Neutron Monitor | Andy Nichols; Tim Hayler (RAL) | Pierrick Hanlet (IIT) | Functionally complete.

Page 25: Controls & Monitoring Status Update J. Leaver 05/11/2009

C&M Systems Developed by Local MICE Community

Page 26: Controls & Monitoring Status Update J. Leaver 05/11/2009

Target: Controller

• Target Controller Stage 1 upgrade underway
  – Hardware essentially complete
  – Currently working on Controller firmware (P. Smith)
• Software nearing completion
  – Hardware driver framework in place
  – Have implemented all EPICS server / client functionality
    • Tested using ‘virtual’ Target Controller device
  – Remaining task: write low-level hardware driver plug-in once firmware is complete

Stage 1 upgrade – control functionality includes:
- Set delays
- Set / monitor Target depth
- Park / hold, start / stop actuation
- Monitor hardware status

Page 27: Controls & Monitoring Status Update J. Leaver 05/11/2009

Target: Beam Loss

• Standalone DAQ system upgrades:
  – Final algorithm selected for ‘absolute’ beam loss calculation
    • Thanks to AD for implementation
  – Standalone event viewer can now follow real-time output of DAQ
• Target Beam Loss IOC will read local data archive written by DAQ & serve values as PVs
  – Enables integration with Alarm Handler & readout of actuation numbers for CDB run data entries
  – IOC will use same analysis & file access code as standalone DAQ, via C wrapper functions
  – See PH’s talk

Page 28: Controls & Monitoring Status Update J. Leaver 05/11/2009


DATE Status

• Need mechanism for reporting current DAQ state via EPICS

– Required for user feedback, alarm handling & triggering automatic CDB run set point value backups

• Simple (‘dumb’) data server hosts DATE status PV

• Client application reads DATE status from DIM server, forwards value to EPICS server (sketched below)

• Server & display client complete – DATE integration should be complete before end of Collaboration Meeting (JSG…?)

[Diagram: DATE Client → EPICS Data Server (single ‘status’ PV)]
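A minimal sketch of that bridge: poll the DAQ state and forward it to the status PV. The DIM query is left as a stub and the PV name is an assumption.

```python
# Sketch: forward the DATE run state to the EPICS status PV.
import time
from epics import caput

def read_date_state():
    """Stub for the DIM query against the DATE information service."""
    return "data taking"   # placeholder value

while True:
    caput("MICE:DATE:STATE", read_date_state())   # hypothetical PV name
    time.sleep(1)
```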

Page 29: Controls & Monitoring Status Update J. Leaver 05/11/2009

Network Status

• Need to verify that all machines on Control & DAQ networks are functional throughout MICE operation
• Two types of machine:
  – Generic PC (Linux, Windows)
  – Hard IOC (vxWorks)
• EPICS Network Status server contains one status PV for each valid MICE IP address
• Read status: PC
  – SSH into PC (using key files, for security)
    • Verifies network connectivity & PC identity
  – If successful, check list of currently running processes for required services
• Read status: Hard IOC
  – Check that standard internal status PV is accessible, with valid contents
    • e.g. ‘TIME’ PV, served by all MICE ‘hard’ IOCs
(Both checks are sketched below.)
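A sketch of the two checks, assuming key-only SSH and pyepics for the Channel Access read; the required-service list and IOC PV prefix are illustrative.

```python
# Sketch: the two Network Status probes.
import subprocess
from epics import caget

def pc_ok(host, required_services):
    """SSH in (key auth only) and check the process list for services."""
    try:
        out = subprocess.run(
            ["ssh", "-o", "BatchMode=yes", host, "ps", "-e", "-o", "comm="],
            capture_output=True, text=True, timeout=10, check=True).stdout
    except (subprocess.SubprocessError, OSError):
        return False          # unreachable, key rejected, or ssh failed
    procs = set(out.split())
    return all(svc in procs for svc in required_services)

def hard_ioc_ok(prefix):
    """A hard IOC is healthy if its standard TIME PV reads back."""
    return caget(f"{prefix}:TIME", timeout=5) is not None

print(pc_ok("miceiocpc1", {"CmdExServer"}))   # service name illustrative
print(hard_ioc_ok("MICE:IOC1"))               # hypothetical PV prefix
```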

Page 30: Controls & Monitoring Status Update J. Leaver 05/11/2009

Network Status

• Server & client are functionally complete
  – Client displays status of all PCs & hard IOCs, scans at user-specified period (with ‘check now’ override)
• Currently not configured for use on the MICE network…
  – Necessary to compile list of required services for each MICE PC
  – Network specification not yet finalised
    • Need to confirm how status server will connect to PCs on both Control & DAQ networks

Page 31: Controls & Monitoring Status Update J. Leaver 05/11/2009

Other ‘Local’ C&M Systems

• TOF
• CKOV
• Diffuser
• DSA Neutron Monitor
→ See PH’s talk

• KL Calorimeter
• Electron Muon Ranger
→ Remain unallocated…

Page 32: Controls & Monitoring Status Update J. Leaver 05/11/2009

Daresbury C&M Status Update
A. Oates

Page 33: Controls & Monitoring Status Update J. Leaver 05/11/2009

Recent DL Controls Work

• New ‘IOC controls server’ installed

– Provides boot/configuration parameters for DL hard IOCs (vxWorks): miceioc1, miceioc2, miceioc4

– Enabled Online Group to take ownership of old ‘IOC controls server’ (miceserv1) & associated Channel Archiver duties

• Provided the Online Group with general EPICS assistance

– EPICS architecture guidance

– Channel Archiver web server installation

• Changed both Target Drive systems to incorporate new split rail power supplies

• Fault diagnosis and correction on Target 2 Drive controls in R78

• Prepared quotation for Cooling Channel Control System

• Completed requested changes to Magnet PSU databases

Page 34: Controls & Monitoring Status Update J. Leaver 05/11/2009

Schedule & Final Notes

Page 35: Controls & Monitoring Status Update J. Leaver 05/11/2009


Schedule

Page 36: Controls & Monitoring Status Update J. Leaver 05/11/2009


Schedule

Page 37: Controls & Monitoring Status Update J. Leaver 05/11/2009

Final Notes

• Good progress on all fronts
• Groundwork required to unify MICE control systems & create a user-friendly interface is well underway
  – Alarm Handler, Channel Archiver operational
  – Unified application launchers complete (bad old days of logging into individual PCs & typing commands are over!)
  – CDB interface functional
• Lots of work still to do!
  – Controls effort is ‘significantly understaffed’ (DAQ & Controls Review, June ‘09)
  – We are always operating at full capacity
  – ‘Controls & Monitoring’ is such a broad subject that many unexpected additional requirements crop up all the time…
  – Please be patient!