Overview of LHCb Computing Requirements, Organisation, Planning, Resources, Strategy LHCb UK Computing Meeting RAL May 27/28, 1999 John Harvey CERN / EP-ALC


Page 1

Overview of LHCb Computing: Requirements, Organisation, Planning, Resources, Strategy

LHCb UK Computing Meeting

RAL

May 27/28, 1999

John Harvey

CERN / EP-ALC

Page 2

Outline

- LHCb data handling requirements
- Project organisation - scope of LHCb Computing
- The DAQ and ECS project: goals, projects, milestones
- Software: strategy, projects, milestones
- Communication: meetings, videoconferencing, documentation, web

Page 3

LHCb Data Acquisition System

Page 4

LHCb datasets and processing stages

[Diagram: the detector/DAQ analysis cycle, split into data production and data analysis.
Data production: Monte Carlo generator -> generator data -> detector simulation -> simulated raw data; real data production -> raw data; reconstruction -> reconstructed data; event tag generation -> event tags.
Data analysis: analysis group production -> analysis group tags; analysis jobs -> analysis data and analysis job results.
Event sizes quoted in the figure: raw data 100 kB, simulated raw data 200 kB, reconstructed data 70 kB, event tags 0.1 kB, with further size labels of 10 kB and 150 kB; event rate from the DAQ 200 Hz.]

Page 5

Data Volumes

Assume a run of 10^7 seconds each year

Data type                         Rate       Volume/year
a) Raw data                       20 MB/s    200 TB
b) Interesting physics data        1 MB/s     10 TB
c) Simulated data (bx10)          10 MB/s    100 TB
d) Reconstructed raw              14 MB/s    140 TB
e) Reconstructed simulated         7 MB/s     70 TB
f) Analysis data                   4 MB/s     40 TB
TOTAL                                        550 TB
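A cross-check not on the original slide: each yearly volume is simply rate x live time, e.g. for the raw data

    20 MB/s x 10^7 s = 2 x 10^14 bytes ~= 200 TB/year

and likewise 14 MB/s x 10^7 s ~= 140 TB/year for the reconstructed raw data.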

Page 6

Data Storage Requirements

                                   Storage used (TB)    Storage needed (TB/yr)
                                   in 1997              in 2006
Real data                          0.01                 200
Simulated data                     1.0                  100
Calibration and alignment          -                    <1
Reconstructed data (real + sim.)   1.0                  200
Analysis                           0.05                 ~2
Total                              ~2                   ~500

Page 7

Processing Power Requirements

                                         CPU (MIPS)     CPU (MIPS)
                                         in 1997        in 2006
Triggering: Level-2 / Level-3            -              4 x 10^5 / 1 x 10^6
Reconstruction: real / simulated data    200            1.6 x 10^6 / 2 x 10^5
Simulation: B-meson / background         600            1.2 x 10^6 / 1.2 x 10^3
Analysis: real / simulated data          150            5 x 10^4 / 5 x 10^5
Total                                    950            ~5 x 10^6

Page 8

CPU Resources

Assumption: the CPU power per processor in 2004 will be ~4000 MIPS.

CPU resources at the experiment
- data production and triggering: 1,400 x 1000 MIPS -> 350 nodes
- reconstruction: 800 x 1000 MIPS -> 200 nodes
- Total: 550 nodes

Event simulation and reprocessing (4-month duty cycle)
- Monte Carlo production: 1,400 x 1000 MIPS -> 350 nodes
- reprocessing, event tag creation: 800 x 1000 MIPS -> 200 nodes
- Total: 550 nodes

Physics analysis
- physics production: 10 groups, 10^7 kMs per refinement per week -> 46 nodes
- user analysis: 100 users, 10^7 kMs per job, 2 jobs per week -> 91 nodes
- Total: 137 nodes
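The node counts follow from the ~4000 MIPS-per-processor assumption above (a worked example, not on the original slide):

    1,400 x 1000 MIPS / 4000 MIPS per node = 350 nodes
      800 x 1000 MIPS / 4000 MIPS per node = 200 nodes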

Page 9

Data Rates

[Diagram: data flows between the central data repository (raw data, reconstructed data, tags, analysis data) and the processing activities. User analysis and physics analysis read from the repository at 660 MB/s and 360 MB/s respectively; DAQ, reconstruction, reprocessing and simulation carry rate labels of 20 MB/s, 14 MB/s, 20 MB/s, 14 MB/s and 35 MB/s; analysis output is 4 MB/s.]

Page 10

Process for Organising Computing Activities

Manage
- plan, initiate, track, coordinate
- set priorities and schedules, resolve conflicts

Support
- support development processes
- manage and maintain components
- certify, classify, distribute
- document, give feedback

Assemble
- design application
- find and specialise components
- develop missing components
- integrate components

Build
- develop models, evaluate toolkits
- architect components and systems
- choose integration standard
- engineer reusable components

(Inputs: requirements, existing software systems)

Page 11

Project Organisation

[Organisation chart, grouped by the Manage / Support / Build / Assemble activities:
- Support - Facilities: CPU farms, desktop, storage, network, system management (with vendors, IT-IPT, IT-PDP, IT-ASD).
- Support - Software: SDE, process, quality, librarian, training, webmaster.
- Build: architecture and components - frameworks (GAUDI), glue, libraries (CLHEP, GUI, ...), data management.
- Assemble: reconstruction, simulation, analysis, controls, control room, DAQ sub-projects.
- Oversight: Steering Group, Technical Review, Architecture Review.
Roles marked in the chart (letters M, A, C, E): coordinator, architect, project manager, project engineer.]

Page 12

Schedule for DAQ and ECS

[Gantt chart, half-year resolution, 1998-2005. Tasks:]
1. R&D phase
2. Technology choices
3. TDR preparation
4. Implementation
5. Integration & commissioning
6. Exploitation

Page 13

Goals of R&D Phase

- Devise an architecture for the DAQ system and a specification for all dataflow and control elements
- Acquire knowledge of, and experience with, new technologies
- Assemble a small-scale hardware prototype of the DAQ system ('String Test') running at full speed
- Finally, take an educated decision on the technologies to use for the implementation of the final system

Page 14

DAQ Activities

- Architecture and protocol design
- Readout Unit implementation study
  - study functionality, interfacing
  - design and prototype, performance
- Event Building project
  - devise strategy for L1, L2/3
  - study technologies, e.g. Myrinet
  - simulation models, demonstrators
- Timing and Fast Control
  - Readout Supervisor
- FEM implementation study
- Event Filter Farm study (LCB project)
- Study capabilities of mass storage (ALICE/IT)

[Diagram: LHCb DAQ architecture. The LHC-B detector (VDET, TRACK, ECAL, HCAL, MUON, RICH) feeds the front-end electronics. The Level-0 and Level-1 triggers reduce the 40 MHz bunch-crossing rate to 1 MHz and then 40 kHz (L0: fixed latency ~3 µs; L1: variable latency <256 µs). Front-end multiplexers (FEM) at 1 MHz feed read-out units (RU), which send event fragments over the read-out network to sub-farm controllers (SFC) and the CPUs of the Level-2/3 event filter (variable latency, L2 ~10 ms, L3 ~200 ms). Timing & fast control and control & monitoring run alongside; accepted events go to storage over a LAN. Data rates quoted, from detector front-end down to storage: 40 TB/s, 1 TB/s, 2-4 GB/s, 4 GB/s, 20 MB/s.]
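A rough consistency check, not on the slide, assuming the ~100 kB raw event size quoted earlier:

    40 kHz x 100 kB/event = 4 GB/s into the event filter farm
    200 Hz x 100 kB/event = 20 MB/s to storage (the raw-data rate of the Data Volumes slide)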

Page 15

Control/Monitoring structure

[Diagram: control/monitoring hierarchy. Sensors and devices on field buses and PLCs connect through VME controllers to a field-management layer, then a process-management layer, and finally a supervision layer, linked by LAN/WAN to storage, the configuration DB, archives and logfiles, and to other systems (LHC, safety, ...). Technologies indicated per layer: field buses, PLC, OPC, communication protocols, SCADA.]

Technologies:
SCADA = supervisory control and data acquisition
OPC = OLE for Process Control
PLC = programmable logic controller
Field buses = CAN, ProfiBus, WorldFIP, ...

Page 16

Experiment Control System (ECS)

Joint Project to devise a common controls kernel for all LHC experiments and all aspects of control

Selected sub-projects:
- Joint URD for ALICE/LHCb - finished
- Hardware interface URD - by end '99
- Architecture design - ongoing
- Technology survey (SCADA) - finished
- Fieldbus evaluation - ongoing
- OPC evaluation - ongoing
- ...

Page 17

Schedule for DAQ/ECS

[Gantt chart, quarterly resolution, 1998-2002. Tasks (by ID):]
3. Architecture studies
4. RU (readout unit) project
5.   Design
6.   Test first prototypes
7. Event building project
8.   Technology evaluation
9.   Simulation models
10.  Event building strategy
11.  Build small prototype
12. Timing and fast control
13. Front-end multiplexer
14. String test
15. Technology choices
16. ECS (Experiment Control System)
17.  User requirements (sub-detectors, hall infrastructure)
18.  Architecture / evaluations / R&D (OPC, ...)
19.  Interface technology recommendations
20. Event filter farm (LCB project)
23. Mass storage (ALICE/IT)
29. TDR
30.  Writing
31.  Submission

Page 18

Where are we gOOing?

SICB (June 1998):
- FORTRAN toolkits
- ZEBRA banks
- ZEBRA and ASCII files

OO software (June 2000):
- OO frameworks
- OO toolkits
- OO event + geometry models
- OO database (Objectivity/DB)

Page 19

Schedule for Computing

[Gantt chart, quarterly resolution, 1998-2004+. Tasks (by ID):]
1. SICB
2.   Production
3.   Disposal
5. OO software (GAUDI)
6.   Prototype I
7.   Prototype review
8.   Prototype II
9.   Interim review
10.  Production version
11.  Sim data challenge
12.  Final commissioning
13.  Ready for real data
15. Computing model
16.  Tools and technologies
17.  Prototype and simulation
18.  Test implementation
20. Computing TDR

Page 20

Plans for SICb

SICB will be discarded when new software with the same or superior functionality becomes available; current planning: July 2000.

Until then:
- production simulation will continue with SICB
- interface to other event generators
- studies of alternative detector layout options
- enhanced detector response simulation

Page 21

Milestone 1: Working Prototype

By mid-2000, produce a new working prototype of LHCb software incorporating:
- a model describing the LHCb detector (structure, geometry, ...)
- a database containing ~1,000,000 simulated events (~100 GB)
- data processing programs based on LHCb's OO software framework:
  - simulation: integrate the GEANT4 simulation toolkit
  - reconstruction: pattern recognition algorithms for tracking, RICH, calorimetry, etc.
  - analysis: toolkit of analysis algorithms
- interactive analysis and event display facilities
  - evaluate available toolkits: ROOT, WIRED, JAS, OpenScientist
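A worked check, not on the slide: ~1,000,000 events x ~100 kB per event ~= 10^11 bytes ~= 100 GB, consistent with the database size quoted above.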

Page 22

Strategy for development of new software

We are convinced of the importance of the architecture
- architect (experienced designer) and design team (domain specialists)
- identify components, define their interfaces and the relationships among them (a minimal sketch follows below)
- build the framework from implementations of these components
  - "the framework is an artefact that guarantees the architecture is respected"
  - to be used in all LHCb event data processing applications, including high-level trigger, simulation, reconstruction and analysis
- build high-quality components and maximise reuse

Incremental approach to development
- new release 3-4 times per year
- gradually add functionality
- use what is produced and get rapid feedback
- GAUDI release 1 in Feb '99, release 2 next week
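To make the component idea concrete, here is a minimal C++ sketch of an algorithm component behind an abstract interface, with a framework-style event loop driving it. All names are illustrative assumptions for this note; they are not the actual GAUDI interfaces.

    #include <iostream>
    #include <memory>
    #include <vector>

    struct Event { long id = 0; };            // stand-in for the event data model

    // Abstract interface the framework depends on; concrete algorithms
    // implement it and can be exchanged without touching the framework.
    class IAlgorithm {
    public:
      virtual ~IAlgorithm() = default;
      virtual bool initialize() = 0;
      virtual bool execute(Event& ev) = 0;    // called once per event
      virtual bool finalize() = 0;
    };

    class TrackingAlgorithm : public IAlgorithm {   // hypothetical concrete component
    public:
      bool initialize() override { std::cout << "Tracking: init\n"; return true; }
      bool execute(Event& ev) override {
        std::cout << "Tracking: event " << ev.id << "\n";
        return true;
      }
      bool finalize() override { std::cout << "Tracking: done\n"; return true; }
    };

    int main() {
      // The "framework": run a configured sequence of algorithms over a few events.
      std::vector<std::unique_ptr<IAlgorithm>> algorithms;
      algorithms.push_back(std::make_unique<TrackingAlgorithm>());

      for (auto& alg : algorithms) alg->initialize();
      for (long i = 0; i < 3; ++i) {
        Event ev;
        ev.id = i;
        for (auto& alg : algorithms) alg->execute(ev);
      }
      for (auto& alg : algorithms) alg->finalize();
      return 0;
    }

The same loop can then serve simulation, reconstruction, trigger and analysis simply by configuring a different sequence of algorithms.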

Page 23

Strategy for next two years

Dilemma
- have to keep existing software in production
- update practices, use new technology for future development

OO migration
- do not isolate people; encourage the same people to do both
- minimise 'legacy' code - algorithm development is ongoing
- wrap existing FORTRAN code if appropriate (see the sketch after this list)
- use available libraries (CLHEP, STL) and toolkits (GEANT4, ODBMS)
- start component development (event model, geometry, ...)

Train 'just-in-time': LHCb OOAD course attended by 45 people
Share ideas, designs, components etc. with other experiments
Don't start too many activities at once; keep things under control
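As an illustration of wrapping existing FORTRAN from C++, a minimal sketch assuming the common Unix calling convention (Fortran symbols get a trailing underscore and arguments are passed by reference, as with g77/gfortran; this is compiler-dependent). The routine name is hypothetical, not an actual SICB routine.

    extern "C" {
      // Fortran:  SUBROUTINE SIMHIT(ID, ENERGY)  with  INTEGER ID, DOUBLE PRECISION ENERGY
      void simhit_(int* id, double* energy);
    }

    // Thin C++ wrapper giving the legacy routine a modern interface.
    double simulateHit(int detectorId) {
      double energy = 0.0;
      simhit_(&detectorId, &energy);   // call into the existing FORTRAN code
      return energy;
    }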

Page 24

LHCb Offline Software Road Map

[Diagram: release number versus time, covering 2000 to 2006, rising through incremental releases. Milestones marked: working prototype and 'retirement' of SICB; detailed implementation; integration and commissioning; exploitation.]

Page 25

Software Development Environment

- Documentation templates for user requirements, project plans, costings, ...
- Design method (UML) and tool (Rose, ...)
- Training: OO A&D course, 5 days, John Deacon
- Language: C++ (but with Java in mind)
- NT: Visual Developer Studio used by the GAUDI team
- Linux: preferred choice of physicists
- Code management (CVS) and software release scheme (CMT)

Work with the IT/IPT group:
- project management: process to manage software projects
- configuration management
- documentation: handbooks - Use, Manage, Engineer

Page 26

Computing Model

- Compute facilities: PC farms running NT or Linux
  - Marseilles, Liverpool, Rio; with other LHC experiments and the IT/PDP group
- "Data Management and Computing Using Distributed Architectures"
  - with other LHC experiments in a proposed LCB project; outside institutes + CERN/IT (LHCb/Oxford, ...)
  - determine which classes of models for distributed data analysis are feasible, taking into account the network capabilities and data handling resources likely to become available at the collaboration sites
  - identify and specify the main parameters, and build simulation tools to compare alternative strategies
- Make test implementations of elements of the computing model

Page 27

Communication

Meetings
- weekly Computing meeting at CERN - Wednesday morning
- irregular DAQ/ECS meeting at CERN - Wednesday afternoon
- LHCb software weeks started in 1999: Feb 8-12, June 2-4, Nov 24-26
- Data Handling meeting in LHCb weeks

Videoconferencing
- has been tried but not institutionalised
- a dedicated facility for LHCb at CERN is being discussed

LHCb web
- technical notes, news of meetings, talks, the slide gallery and the collaboration database are all maintained on the LHCb web: http://lhcb.cern.ch/
- we aim to improve: guidelines on producing web pages and the design of the public web are being discussed