1
LHCb's Experiment Control System
  • Step by Step

2
Overview
  • LHCb's Experiment Control System
  • What do we (JCOP/LHCb) provide
  • What sub-detectors/sub-systems need to implement
  • PVSS Framework reminder
  • Interfacing Electronics Boards
  • SPECS & CC-PC Tools
  • The Configuration DB
  • Hierarchical Control
  • The FSM Toolkit
  • Note: This tutorial is meant as an overview
  • The PVSS & Framework and the FSM courses are
    still required in order to use the tools!

3
ECS Scope
(Diagram: the ECS scope. It spans the DCS devices (HV, LV, gas, temperatures, etc.) and the detector channels; the L0 trigger, TFC and front-end electronics; the readout network, high-level trigger and storage on the DAQ side; and the external systems (LHC, technical services, safety, etc.).)
4
ECS Generic Architecture
(Diagram: generic architecture. The ECS sits on top of the DCS and DAQ branches and of interfaces to the LHC, Technical Services and DSS; below come abstract levels such as GAS, DetDcs1..DetDcsN and DetDaq1, then sub-systems SubSys1..SubSysN, and finally devices Dev1..DevN. Status and alarms flow up the tree, commands flow down to the devices (HW or SW).)
5
What do we provide?
  • JCOP & LHCb Online provide
  • Not complete applications, but
  • A Framework, i.e. a set of tools to help
    sub-systems create their control systems
  • Complete, configurable components (ex. CAEN HV)
  • Tools for defining User Components
  • Electronics boards (SPECS / CC-PC)
  • Specific equipment/software tasks (DIM protocol)
  • Other Tools, for example
  • FSM for Building Hierarchies
  • Configuration DB
  • Archiving, Alarm handling, etc.

6
We also provide
  • Integration of Infrastructure Services
  • Power Distribution and Rack/Crate Control
  • Cooling and Ventilation Control
  • Magnet Control (Monitoring)
  • Gas Control
  • Detector Safety System
  • And interface to
  • LHC machine
  • Access Control System
  • CERN Safety System
  • Sub-detectors can use these components
  • For defining logic rules (using their states)
  • For high-level operation (when applicable)
  • Switch ON, Switch Off, Set parameters

7
And also Database Tools
  • Interfaces to the three Logical Databases in the
    Online System

(Diagram: the many PVSS systems of the Online System are interfaced to the three logical databases: the Configuration DB, the PVSS Archive and the Conditions DB; conditions data is also shipped to Offline.)
8
Online Database Contents
  • Configuration DB contains
  • All data needed to configure the HW (or SW) for
    the various running modes
  • Ex. HV V0 Settings, Pedestal settings, trigger
    settings, etc.
  • PVSS Archive contains
  • All monitoring data read from HW for monitoring
    and debugging of the Online System
  • Ex. HV Vmon Readings, temperatures, pedestal
    readings, etc.
  • Conditions DB contains
  • A subset of the monitoring data read from HW if
    it is needed for Event processing (prob. packaged
    differently)
  • Ex. HV Vmon Readings if changed by more than n
    Volts
  • Some configuration data once it has been used
  • Ex. Trigger settings used by a particular run

9
The Configuration DB
  • The Configuration DB will contain
  • All "static" information about the devices
  • Connectivity, addresses, etc. (also inventory and
    history)
  • Developed within LHCb (supports queries)
  • All "dynamic" data needed by the devices (for
    different running modes and different versions)
  • Settings (voltages, alarm limits, etc.),
    Calibration constants, Pedestals, FPGA code
    (probably a pointer to it), etc.
  • The settings for a particular running mode are
    called a Recipe (partial recipes available)
  • The JCOP FW component implements a cache
  • Can be used without Oracle for tests
  • Can pre-load several recipes before Start of Run

10
What needs to be done
  • Start bottom up
  • Integrate each device into PVSS
  • Define configuration recipes
  • for the various running modes
  • Build a hierarchy for each sub-system
  • According to the guidelines
  • Integrate the devices in the hierarchy

11
Device Integration
  • Device Types
  • HV & LV channels
  • CAEN, ISEG, WIENER -> JCOP Framework
  • Analog inputs
  • ELMB -> JCOP Framework
  • Electronics boards
  • SPECS & CC-PC -> Tools to describe boards
  • TELL1 -> FW component (for common part)
  • Other Components
  • HW or SW -> FwDIM component
  • Needs PVSS, Framework, DIM,

12
PVSS
13
PVSS Distribution
14
Datapoint Concept
  • DP type -> DP
(Diagram: a datapoint type is instantiated into datapoints; each datapoint element can carry configs such as alarm handling, archiving or peripheral addresses.)
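As a rough illustration of the DP type -> DP relation, a minimal CTRL sketch could define a type and instantiate it (the type, element and datapoint names below are invented for this example):

    // Hedged sketch: define a small DP type and create one datapoint of it
    main()
    {
      dyn_dyn_string names;
      dyn_dyn_int    types;

      names[1] = makeDynString("MyHVChannel");   // type name (root level)
      names[2] = makeDynString("", "vSet");      // element: set voltage
      names[3] = makeDynString("", "vMon");      // element: monitored voltage
      types[1] = makeDynInt(DPEL_STRUCT);
      types[2] = makeDynInt(0, DPEL_FLOAT);
      types[3] = makeDynInt(0, DPEL_FLOAT);

      dpTypeCreate(names, types);                // create the DP type
      dpCreate("hvChannel001", "MyHVChannel");   // instantiate a datapoint
    }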
15
Graphical Objects
  • Reference Panels
  • Can be inherited dynamically
  • Parameters ($-parameters) get replaced by instance values

16
Building User Interfaces
  • Static Part -> Drag & Drop
  • Dynamic part -> Control Scripts ("C"-like; see the sketch below)
  • A few useful calls for accessing DPs
  • dpGet (string dpName, <data_type> value)
  • dpSet (string dpName, <data_type> value)
  • dpConnect (string callback, string dpName)
  • A few useful calls for accessing Widgets
  • getValue (string widgetName, string
    widgetProperty, <widget dependent data>)
  • setValue (string widgetName, string
    widgetProperty, <widget dependent data>)
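For illustration, the dynamic part of a panel could use these calls as in the minimal sketch below (the datapoint elements and the widget name are invented):

    // Hedged CTRL sketch: read a setting, subscribe to a reading,
    // and display it in a text field of the panel
    main()
    {
      float vSet;
      dpGet("hvChannel001.vSet", vSet);            // read a DP element
      dpSet("hvChannel001.vSet", vSet + 10.0);     // write it back, modified
      dpConnect("showVmon", "hvChannel001.vMon");  // callback on every change
    }

    showVmon(string dpe, float vMon)
    {
      setValue("txtVmon", "text", vMon);           // update the widget
    }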

17
PVSS Features
  • Open Architecture
  • We can write our own managers
  • It can be interfaced to anything (FSM, DIM)
  • Highly Distributed
  • 130 Systems (PCs) tested
  • No major problem found
  • Standard Interface
  • All data of all sub-systems defined as DataPoints!

18
Demo-1
  • Start PVSS console
  • Create a project (add installation tool)
  • PVSS basic functionality
  • PVSS Managers
  • Parameterization Module
  • Datapoint structures
  • Graphic editor

19
Demo-2
  • Install Framework
  • fwCore
  • fwAnalogDigital
  • fwCaen
  • fwConfigurationDB
  • fwDIM
  • fwSpecs
  • fwHw
  • CAEN component
  • Create Crates/Boards/Channels
  • Crate0 will be used by FSM later
  • Show Operation panels

20
DIM: Distributed Information Management System
  • Publish/Subscribe mechanism
  • Servers publish Services.
  • Clients subscribe to Services
  • On change or at regular intervals
  • Clients can send commands to Servers
  • Services
  • A set of data
  • any type or size
  • Identified by a name
  • A Name Server
  • Keeps a list of available Services

21
DIM: Some Characteristics
  • Transparency
  • DIM clients do not know where their interlocutors
    are.
  • DIM components can move from one machine to
    another, all connections are transparently
    re-established.
  • Available on mixed environments
  • UNIX (HP-UX, Sun-OS, Sun-Solaris, IBM-AIX,
    DEC-OSF, Linux), Windows, VMS, Real-time OSs
    (OS9, LynxOS, VxWorks)
  • API available in C, C++ and Java
  • Easy to Use
  • One call and a process can become a server or a
    client.
  • Monitoring and Visualization Tools Available.
  • Documentation and examples at http://www.cern.ch/dim

22
PVSS <-> DIM
  • FwDIM component
  • Server is a DIM Server
  • Client is a PVSS Manager (PVSS00dim)
  • Correspondence PVSS DPs <-> DIM Services
  • Can be set up graphically via the fwDIM panel
  • Or via a script library (see the sketch below)
  • When set up
  • When the Server updates a Service, the data goes into the DP
  • Writing to the DP will send a DIM Command
  • Documentation at
  • http://www.cern.ch/lhcb-online/ecs/fw/FwDim.html
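Via the script library, the setup could look roughly like the sketch below; the fwDim_* function names and argument order are assumptions taken from the fwDIM component (check FwDim.html for the exact signatures), and the service, command and DP names are invented:

    // Hedged sketch of the fwDIM script-library route
    main()
    {
      // subscriptions are grouped in a named configuration served by PVSS00dim
      fwDim_createConfig("myDimConfig");
      // DIM service MYDEV/STATUS updates the DP element myDev.status
      fwDim_subscribeService("myDimConfig", "MYDEV/STATUS", "myDev.status");
      // writing to myDev.command sends the DIM command MYDEV/SET_STATUS
      fwDim_subscribeCommand("myDimConfig", "MYDEV/SET_STATUS", "myDev.command");
    }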

23
Non-standard components
  • Integrating user components
  • Create a DIM server (C or C++; see the sketch after this list)
  • Publishes device status data
  • Receives Commands
  • Create a PVSS Datapoint
  • That matches the structure of DIM services
  • Connect the DP to the DIM services
  • Using the FwDIM tools
  • Make a PVSS panel to control the device
  • Used for farm monitoring, trigger algorithms,
    etc.
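A minimal server of this kind, written against the standard DIM C API (dis.h, documented at http://www.cern.ch/dim), might look like the sketch below; the MYDEV service and command names are invented:

    /* Hedged sketch: publish a status service and accept an ON/OFF command */
    #include <dis.h>
    #include <string.h>
    #include <unistd.h>

    static int      status = 0;    /* device status published to clients */
    static unsigned statusSvc;     /* DIM service id of the status       */

    /* called by DIM whenever a client (e.g. PVSS00dim) sends the command */
    void cmndHandler(long *tag, char *cmd, int *size)
    {
      if (strcmp(cmd, "ON") == 0)  status = 1;
      if (strcmp(cmd, "OFF") == 0) status = 0;
      dis_update_service(statusSvc);  /* push the new status to subscribers */
    }

    int main()
    {
      statusSvc = dis_add_service("MYDEV/STATUS", "I", &status,
                                  sizeof(status), NULL, 0);
      dis_add_cmnd("MYDEV/SET_STATUS", "C", cmndHandler, 0);
      dis_start_serving("MYDEV");    /* registers the server with the DNS */
      while (1) pause();             /* keep the server alive             */
      return 0;
    }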

24
Demo-3
  • FwDIM
  • Configure DIM_DNS_NODE
  • Start a DIM server (ex. pvss_dim_server)
  • Start DIM visualization tool
  • DIMTree on Windows
  • DID on Linux
  • Start fwDIM.pnl
  • Connect services to DPs
  • Visualize from PVSS

25
Electronics Interface
  • CC-PC & SPECS tools
  • Low-level Software
  • A C library for accessing board components
  • Via I2C, JTAG or parallel bus (and FPGA
    programming)
  • A Generic DIM server for PVSS Communication
  • PVSS Tools (FW components fwCcpc/fwSpecs)
  • A library (PVSS scripting) for accessing board
    components on any board with a CC-PC/Specs
    (equivalent to the low-level library)
  • A graphical user interface providing the
    functionality available in the library

26
Electronics Integration
  • Electronics Boards
  • Can use the CCPC/SPECS FW Tools for tests, but
    accessing the chips is not enough
  • Boards have to be modeled in PVSS according to
    guidelines (ex. registers have to correspond to
    datapoints) in order to
  • Provide access to the Conf. DB
  • Select a device/group of devices and say "Save as
    Physics recipe"
  • Be able to archive the data
  • Be able to send the data to the Cond. DB
  • Integrate into the FSM, Generate alarms, etc.

27
Electronics Integration
  • We provide a tool for modeling boards and their
    components (FW component FwHw)
  • Declaring boards (access via SPECS or
    CC-PC), containing:
  • Groups of Chips (recursive), containing:
  • Chips (TTCrx, Beetle, etc.), containing:
  • Registers (access via I2C/JTAG/Parallel Bus)
  • Contacts:
  • Ricardo Fernandes (SPECS)
  • Stefan Koestner (CC-PC)

28
Electronics boards
  • Demo Setup

(Diagram: demo setup. The client PC (a portable) runs PVSS; the server PC pclbcecs03 runs SpecsSrv and is reached over Ethernet via DIM (PVSS00dim); the SPECS master and mezzanine drive the I2C of the "Croquette" test hardware (I2C widget); the DIM DNS runs on the support PC pclhcb155.)
Note: The DNS should run on a stable machine (same as PVSS), not on a portable.
29
Demo-4
  • FwSpecs
  • Server PC pclbcecs03
  • Configure DIM_DNS_NODE
  • Start SpecsServer remotely
  • Client PC portable
  • Configure DIM_DNS_NODE
  • Start SpecsClient direct access panel
  • Exercise I2C, JTAG, DCU
  • Explain the Monitoring feature
  • Show Advanced Panel (User Scripts)
  • Documentation at (not this version yet)
  • http://www.cern.ch/lhcb-online/ecs/PVSS_SPECS
  • FwCcpc very similar
  • Tools will be presented at Online meeting

30
Custom Electronics
  • Demo Example

(Diagram: demo example. The server PC runs SpecsSrv; through the SPECS master and mezzanine it drives the I2C bus of a Velo board carrying Beetle1, Beetle2 and a TTCrx.)
31
Demo-5
  • FwHw
  • Create HW types
  • TTCrx, Beetle and VeloBoard
  • Configure Default Settings
  • Create veloBoards
  • Operate the board
  • Interface to Configuration Database (cache)
  • Save recipes (PHYSICS, TEST, etc.)
  • Download recipes

32
Electronics guidelines
  • FwHw Some Guidelines
  • If a chip has many registers
  • If they can be written in one single operation
  • Declare them as 1 register of size N
  • This will optimize configuration time
  • Some (a few) can also be declared separately
  • If they are often accessed individually
  • After using FwHw to define the boards
  • Design a user interface to operate each board
    type
  • The library fwSpecs or fwCcpc will give you
    access to the data to be visualized or sent to
    the board, e.g. fwSpecs_read(board1.ttcrx1.reg2,
    ...)
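As a hedged example, using the register path from the slide (the exact fwSpecs_read() argument list and return convention are assumptions, so check the fwSpecs documentation):

    // Sketch: read one register of a board defined with FwHw and display it
    main()
    {
      dyn_char data;   // register contents returned by the library
      int rc = fwSpecs_read("board1.ttcrx1.reg2", data);
      if (rc == 0)
        setValue("txtReg2", "text", data);   // show it in the board panel
      else
        DebugN("fwSpecs_read failed for board1.ttcrx1.reg2:", rc);
    }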

33
Control Hierarchy
  • Building a Control Hierarchy
  • And integrating Devices
  • Needs FwFSM, LHCb guidelines

(Diagram: the same generic hierarchy as before. ECS on top; DCS, DAQ and the LHC, T.S. and DSS interfaces below; then GAS, DetDcs1..DetDcsN and DetDaq1; sub-systems SubSys1..SubSysN; devices Dev1..DevN.)
34
Control Units
  • Each node is able to
  • Summarize information (for the above levels)
  • Expand actions (to the lower levels)
  • Implement specific behaviour: take local
    decisions
  • Sequence & automate operations
  • Recover errors
  • Include/Exclude children (i.e. partitioning)
  • Excluded nodes can run in stand-alone mode
  • User Interfacing
  • Present information and receive commands

(Diagram: example control-unit tree. A DCS node with Tracker and Muon children, which in turn own Temp, HV and GAS sub-systems.)
35
Device Units
  • Device Units
  • Provide the interface to real devices (Electronics
    Boards, HV channels, trigger algorithms, etc.)
  • Can be enabled/disabled
  • In order to integrate a device within FSM
  • Deduce a STATE from device readings (in DPs)
  • Implement COMMANDS as device settings
  • Commands can apply the recipes previously defined
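With the FSM toolkit this usually means filling in the CTRL functions attached to the device-unit type; the sketch below assumes the usual fwFsm naming scheme (<Type>_valueChanged / <Type>_doCommand) and invented DP elements, so treat it as an illustration only:

    // Hedged sketch of a device-unit script in the fwFsm style
    VeloBoard_valueChanged(string domain, string device,
                           int status, string &fwState)
    {
      // deduce a STATE from the device readings
      if (status == 1) fwState = "READY";
      else             fwState = "NOT_READY";
    }

    VeloBoard_doCommand(string domain, string device, string command)
    {
      // implement COMMANDS as device settings (e.g. apply a recipe)
      if (command == "Configure")
        dpSet(device + ".settings.mode", "PHYSICS");
    }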

36
The Control Framework
  • The FwFSM Component is based on
  • PVSS for
  • Device Description (Run-time Database)
  • Device Access (OPC, Profibus, drivers)
  • Alarm Handling (Generation, Filtering, Masking,
    etc)
  • Archiving, Logging, Scripting, Trending
  • User Interface Builder
  • Alarm Display, Access Control, etc.
  • SMI providing
  • Abstract behavior modeling (Finite State
    Machines)
  • Automation & Error Recovery (rule-based system)

37
SMI
  • Method
  • Classes and Objects
  • Allow the decomposition of a complex system into
    smaller manageable entities
  • Finite State Machines
  • Allow the modeling of the behavior of each entity
    and of the interaction between entities in terms
    of STATES and ACTIONS
  • Rule-based reasoning
  • Allow Automation and Error Recovery

38
SMI
  • Method (Cont.)
  • SMI Objects can be
  • Abstract (e.g. a Run or the DCS)
  • Concrete (e.g. a power supply or a temp. sensor)
  • Concrete objects are implemented externally,
    either in "C", C++, or in PVSS (CTRL scripts)
  • Logically related objects can be grouped inside
    "SMI domains" representing a given sub-system
    (Framework Control Unit)

39
SMI Run-time Environment
  • Device Level: Proxies
  • drive the hardware
  • deduce State
  • handle Commands
  • C, C++, PVSS CTRL scripts
  • Abstract Levels: Domains
  • Implement the logical model
  • Dedicated language: SML
  • A C++ engine: smiSM
  • User Interfaces
  • For User Interaction
  • All Tools available on
  • Windows, Unix (Linux)
  • All communications are transparent and
    dynamically (re)established

40
SMI
  • SMI - The Language
  • SML: the State Management Language
  • Finite State Logic
  • Objects are described as FSMs; their main
    attribute is a STATE
  • Parallelism
  • Actions can be sent in parallel to several
    objects. Tests on the state of objects can block
    if the objects are still transiting
  • Asynchronous Rules
  • Actions can be triggered by logical conditions on
    the state of other objects

41
SML: The language
  • Devices
  • Sub System
  • Objects can be dynamically included/excluded in a
    Set

42
SML example (automation)
  • External Device
  • Sub System
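The examples themselves were shown as figures; as a rough, hedged sketch of what such SML code looks like (class, set and state names are invented, and the exact syntax should be taken from the FSM course material):

    class: SubSysClass
       state: NOT_READY
          when ( all_in DEVICES in_state READY ) move_to READY
          action: CONFIGURE
             do CONFIGURE all_in DEVICES
       state: READY
          when ( any_in DEVICES in_state ERROR ) move_to ERROR
       state: ERROR
          action: RECOVER
             do RESET all_in DEVICES
             move_to NOT_READY

    object: SUBSYS1 is_of_class SubSysClass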

43
PVSS/SMI Integration
  • Graphical Configuration of SMI using PVSS

44
Building Hierarchies
  • Hierarchy of CUs
  • Distributed over several machines
  • "" means reference to a CU in another system
  • Editor Mode
  • Add / Remove / Change Settings
  • Navigator Mode
  • Start / Stop / View

45
Control Unit Run-Time
  • Dynamically generated operation panels (uniform
    look and feel)
  • Configurable User Panels

46
Features of PVSS/SMI
  • Task Separation
  • SMI Proxies/PVSS Scripts execute only basic
    actions: no intelligence
  • SMI Objects implement the logic behaviour
  • Advantages
  • Change the HW -> change only PVSS
  • Change logic behaviour (sequencing and dependency
    of actions, etc.) -> change only SMI rules

47
Features of PVSS/SMI
  • Error Recovery Mechanism
  • Bottom Up
  • SMI Objects react to changes of their children
  • In an event-driven, asynchronous, fashion
  • Distributed
  • Each Sub-System recovers its errors
  • Each team knows how to recover local errors
  • Hierarchical/Parallel recovery
  • Can provide complete automation even for very
    large systems

48
Demo-6
  • Show a simple Hierarchy
  • Install fwLHCb_FsmDomains
  • It installs the standard LHCb FSM Domain Types
  • And it creates a small tree: VELODCS with
    VELOMotors and VELOTemp children
  • Show Include/Exclude and Enable/Disable
  • Show Temp FSM and Alarm Handling
49
Sub-detector FSM Guidelines
  • Started defining naming conventions.
  • Defined standard domains per sub-detector
  • DCS
  • DCS Infrastructure (Cooling, Gas, Temperatures,
    pressures, etc) that is normally stable
    throughout a running period
  • HV
  • High Voltages or in general components that
    depend on the status of the LHC machine (fill
    related)
  • DAQ
  • All Electronics and components necessary to take
    data (run related)
  • DAQI
  • Infrastructure necessary for the DAQ to work
    (computers, networks, electrical power, etc.) in
    general also stable throughout a running period.
  • And standard states & transitions per domain.
  • Doc available in EDMS
  • https://edms.cern.ch/document/655828/1

50
FSM Guidelines
  • State Diagram for Trigger and DAQ Domains
  • Possible intermediate CONFIGURING and
    STARTING states if operations are slow

51
Hierarchy
(Diagram: the full LHCb control hierarchy. Top-level domains: Infrast., DCS, HV, DAQI, DAQ, L0, TFC, HLT (with SubFarm1..SubFarmN) and LHC. Each sub-detector contributes its own domains, e.g. MUONDCS, MUONHV, MUONDAQI, MUONDAQ and VELODCS, VELOHV, VELODAQI, VELODAQ; below these come sub-system nodes such as VELODAQ_1, VELODAQ_2, VELODCS_1, VELODCS_2, and finally the devices VELODev1..VELODevN.)
52
Hierarchy & Conf. DB
(Diagram: the same hierarchy connected to the Configuration DB. (1) A "Configure / mode=PHYSICS" command travels down the tree, (2) the PHYSICS settings (recipe) are fetched from the Conf. DB, (3) the settings are applied to the devices.)
53
Demo-7
  • Using type DAQ_Domain
  • Create
  • Create type VeloBoard
  • Integrate veloBoard1, veloBoard2,
  • Apply recipes on Configure command
  • Note: There will be a configurator object per
    CU, which gets recipes from the DB to the cache

(Diagram: the created VELODAQ tree, with VELOFEE and VELOTELL1 control units and VELOBoard1 / VELODev1.. device units.)
54
Demo-8
  • Using type ECS_Domain
  • Create

(Diagram: the created VELO tree: VELODAQ, VELODCS and VELOHV domains, with VELOFEE and VELOTELL1 under VELODAQ, VELOMotors and VELOTemp under VELODCS, and the VELOBoard1 / VELODev1.. device units at the bottom.)
55
The End
  • Questions?

56
Hierarchy Partitioning
(Diagram: the same hierarchy as slide 51, used to illustrate partitioning: sub-trees such as the MUON or VELO domains, or individual nodes like VELODAQ_1 and VELODAQ_2, can be included in or excluded from the tree.)