Applications Area Status
Torre Wenaus, BNL/CERN
PEB Meeting, October 8, 2002
PEB meeting, October 8, 2002 Slide 2
Torre Wenaus, BNL/CERN
Applications Area Status
This will not be a full-scope status talk
- You heard the state of SPI in the work plan discussion last week; will not address it here
- POOL has been developing rapidly and a report will be given at the applications area meeting tomorrow
  - First internal release last week
  - Will report status briefly
- Current planning status will be shown
- SCRAM/CMT still a chain around my neck
  - No time to focus on this issue so far; no discussion here
- Architects Forum meeting this Friday to assess where we are on this issue, as well as general apps area discussion
Top Apps Area Priorities – A Current Look
1. Establish process and infrastructure project (Active)
2. Address core areas essential to building a coherent architecture, and develop the architecture
   - Object dictionary (Active)
   - Persistency (Active)
   - Interactive frameworks (RTAG in progress), analysis tools (RTAG soon)
3. Address priority common project opportunities
   - Driven opportunistically by a combination of experiment need, appropriateness to a common project, and 'the right moment' (existing but not entrenched solutions in some experiments)
     - Detector description and geometry model (RTAG in progress)
   - Driven by need and available manpower
     - Simulation tools (RTAG in progress)
Initiate: first half 2002
Near term priorities
Build outward from the core top-tier components
- Conditions database (Pending, but a common tool exists)
- Framework services, class libraries (RTAG in progress)
Address common project areas of less immediate priority
- Math libraries (Active)
- Physics packages (RTAG in progress)
Extend and elaborate the support infrastructure
- Software testing and distribution (Active)
Initiate: second half 2002
Deliver, fall/late 2002: basic working persistency framework (On schedule)
Medium term
The core components have been addressed, the architecture and component breakdown laid out, and work begun. Grid products have had another year to develop and mature. Now explicitly address integration of physics applications into the grid applications layer.
- Distributed production systems: end-to-end grid application/framework for production
- Distributed analysis interfaces: grid-aware analysis environment and grid-enabled tools
  - (Blueprint RTAG recommending prompt initiation of an RTAG)
Some common software components are now available. Build on them.
- Lightweight persistency, based on the persistency framework
- Release of the LCG benchmarking suite
Initiate: first half 2003
Activities: Candidate RTAGs by activity area
Application software infrastructure
- Software process; math libraries; C++ class libraries; software testing; software distribution; OO language usage; benchmarking
Common frameworks for simulation and analysis
- Simulation tools; detector description and model; interactive frameworks; statistical analysis; visualization
Support for physics applications
- Physics packages; data dictionary; framework services; event processing framework
Grid interface and integration
- Distributed analysis; distributed production; online notebooks
Physics data management
- Persistency framework; conditions database; lightweight persistency
Blue: currently being addressed in a project or RTAG
Candidate RTAG timeline
Candidate RTAGs with estimated durations in months, spread across quarters 02Q1 through 03Q4, with possible merges flagged:
- Simulation tools (1)
- Detector description & model (2)
- Conditions database (1)
- Data dictionary (2)
- Interactive frameworks (2)
- Statistical analysis (1)
- Detector & event visualization (2)
- Physics packages (2)
- Framework services (2)
- C++ class libraries (2)
- Event processing framework
- Distributed analysis interfaces (2)
- Distributed production systems (2)
- Small scale persistency (1)
- Software testing (1)
- Software distribution (1)
- OO language usage (2)
- LCG benchmarking suite (1)
- Online notebooks (2)
Personnel status
- 10 new LCG hires in place and working
  - 3 more started at the beginning of August
  - 3 more starting between now and December
- Manpower ramp is on schedule
- Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US
- Still working on accruing enough scope (via RTAGs) to employ this manpower optimally
  - But everyone is working productively
- ~10 FTEs from IT (DB and API groups) also participating
- ~7 FTEs from experiments (CERN EP and outside CERN) also participating, primarily in the persistency project at present
- Important experiment contributions also in the RTAG process
Activities started or starting, prioritized order
- Software process and infrastructure (SPI) – Alberto Aimar
- Persistency framework (POOL) – Dirk Duellmann
- Core tools and services (RTAG in progress – blueprint)
- Physics interfaces (RTAG in progress – blueprint)
- Simulation (RTAG in progress)
- Detector description (RTAG in progress)
- Event generators (RTAG in progress)
- Analysis tools, distributed analysis (RTAG soon)
- Math libraries – Fred James
Persistency Project Timeline
- Started officially 19 April, led by Dirk Duellmann, IT/DD
- Initially staffed with 1.6 FTE (1 from CMS, 0.6 (Dirk) from IT)
  - MySQL scalability and reliability test
  - Requirement list and experiment deployment plans
- Persistency Workshop 5-6 June at CERN
  - More requirement and implementation discussions
  - Work package breakdown and release sequence proposed
  - Additional experiment resource commitments received
  - A name, POOL: Pool of persistent objects for LHC
- Since beginning of July: real design discussions in work packages started
  - ~5 FTE active participation since then (LHCb, CMS, ATLAS, IT/DD)
  - Project hardware resources defined and deployed (10 nodes, 2 disk servers)
  - Software infrastructure defined and becoming progressively available
  - Work plan for 2002 presented to SC2 in August and approved
  - Coding!
POOL Work Packages
Storage Manager and Object References
- ROOT I/O based storage manager and persistent references
- Capable of storing objects 'foreign' to ROOT ('any' C++ class)
File Catalog and Grid Integration
- MySQL, XML and EDG based implementations
Collections and Metadata
- Collection implementations for RDBMS and RootI/O
Dictionary and Conversion
- Transient and persistent dictionaries
- Cross population between RootI/O dictionary and Dictionary import/export
Infrastructure, Integration and Testing
- Project specific development, integration and test infrastructure
Prioritized Experiment Focus of Interest
- RootIO integration with grid aware catalog (ALL)
  - EDG (ALL), Alien (ALICE), Magda (ATLAS)
- Transparent navigation (ATLAS/CMS/LHCb)
  - ALICE: maybe, but only in some places
- MySQL as RDBMS implementation until first release (ALL)
- Consistency between streaming data and meta-data (CMS/ATLAS)
  - At application defined checkpoints during a job
- Early separation of persistent and transient dictionary (ATLAS/LHCb)
- Initial release supports persistency for non-TObjects (ATLAS/LHCb) without changes to user class definitions
- Support for shallow (catalog-only) data copies (CMS)
  - Formerly known as cloned federations
- Support for deep (extracted part-of-object hierarchy) copies (CMS)
Release Sequence
End September - V0.1 - Basic Navigation
- All core components for navigation exist and interoperate: StorageMgr, Refs & CacheMgr, Dictionary, FileCatalog
- Some remaining simplifications
  - Assume TObject on read/write – simplified conversion
End October - V0.2 - Collections
- First collection implementation integrated
  - Support implicit and explicit collections on either RDBMS or RootIO
- Persistency for foreign classes working
  - Persistency for non-TObject classes without need for user code instrumentation
- EDG/Globus FileCatalog integrated (??)
End November - V0.3 - Meta Data & Query
- External release
- Annotation and query added for event, event collection and file based meta data
- Early testing & evaluation of this release by ATLAS, CMS, LHCb anticipated
POOL V0.1 Release Steps & Schedule
Complete/update component documentation - by 15th Sept. (Completed Sep 17)
- POOL (internal) component description: what will be done in this release? what is still left for later?
- Documents are announced on the pool list and appear on the POOL web site
- Version numbers for any external packages are fixed (Root, MySQL, MySQL++ …)
Component code freeze for release candidate - by 25th Sept. (Completed last week)
- All component code tagged in CVS
- All documented features are implemented and have a test case
- Compiles at least on the highest priority release platform (= Linux RH7.2 and gcc-2.95.2?)
- Survives regression tests at the component level
System & integration testing and later packaging - by 30th Sept. (Completed last week)
- Any remaining platform porting
- Integration tests & "end user" examples run on all platforms
Code review - early October
Start planning the next release cycle (Meeting today)
- POOL team already getting developer-user feedback
Math library recommendations & status
- Establish a support group to provide advice and information about existing libraries, and to identify and develop new functionality
  - Group to be established in October
- Experiments should specify the libraries and modules they use
  - Only LHCb has provided info so far
- A detailed study should be undertaken to assess needed functionality and how to provide it, particularly via free libraries such as GSL
  - A group in India is undertaking this study; agreement just signed
Applications Area Planning
- Planning materials are informed by the Architecture Blueprint RTAG
  - RTAG concludes this week; final report presented to SC2 this Friday
- Overall applications area plan mapped out in the context of this RTAG
- A series of anticipated new projects defined
  - Core tools and infrastructure (CTS)
  - Physics interfaces
  - Simulation
  - Generator services
- A new RTAG recommended
  - Physics analysis, including distributed aspects
  - Augments the limited initial scope of the physics interfaces project
Planning materials on web
New planning page linked from the applications area page; still in progress
- Applications area plan spreadsheet: http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  - Based partly on blueprint RTAG work
- Applications area plan document: http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  - Incomplete draft
- Personnel spreadsheet: http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
- XProject based planning materials (PBS, schedule, personnel resources): http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
  - Loading of personnel resources in progress; incomplete
Applications Area Plan Document
- Applications area plan in development
  - To be the 'foundation document' of applications area subproject plans
- Contents: overall scope; overall requirements; architecture overview; activity domains; summary of projects; applications area management; applications area planning (WBS, schedule, resources)
- 13 page draft based largely on earlier planning materials (e.g. the high level plan) and the blueprint RTAG
PBS, Schedule and Personnel
- Level 2 milestones filled in for existing and anticipated projects
  - Except for small projects, not done yet (math libs, generators)
  - At least until mid 2003
- Some level 2 milestones of my own invention still remain, but for the most part the milestones in the existing projects have been either defined or OK'd by the project managers
- Export of project data to MS Project not yet done
  - It is a matter of running it, not coding it
- Over to XProject… http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
  - LCG applications web page -> Planning -> Project breakdown and schedule
- XProject will shortly be moved to CERN and divorced from ATLAS planning
Planning ToDo
- Finish applications area plan document
- LCG-dedicated XProject running at CERN
- Earned value planning
  - Percent complete; anticipated cost
  - Interest from CMS
- Finish personnel resource loading
- Export to MS Project for Gantt chart (& personnel?)
- Finish first round of level 2 milestones
- Math libraries workplan
- …
Applications Architectural Blueprint RTAG
Preamble: without some overall view of LCG applications, the results from individual RTAGs, and thus the LCG work, may not be optimally coherent. Hence the need for an overall architectural 'blueprint'. This blueprint will then serve to spawn other RTAGs leading to specific proposals, and ensuring some degree of overall consistency.

Mandate: define the architectural 'blueprint' for LCG applications:
- Define the main architectural domains ('collaborating frameworks') of LHC experiments and identify their principal components. (For example: Simulation is such an architectural domain; Detector Description is a component which figures in several domains.)
- Define the architectural relationships between these 'frameworks' and components, including Grid aspects, identify the main requirements for their inter-communication, and suggest possible first implementations. (The focus here is on the architecture of how major 'domains' fit together, and not detailed architecture within a domain.)
- Identify the high level deliverables and their order of priority.
- Derive a set of requirements for the LCG.
Requirements
- Lifetime
- Languages
- Distributed applications
- TGV and airplane work
- Modularity of components
- Use of interfaces
- Interchangeability of implementations
- Integration
- Design for end-users
- Re-use of existing implementations
- Software quality at least as good as any LHC experiment's
- Platforms
How will ROOT be used?
- LCG software will be developed as a ROOT user
- Will draw on a great ROOT strength: users are listened to very carefully!
  - The ROOT team has been very responsive to needs for new and extended functionality coming from the persistency effort
- Drawing on ROOT in a user-provider relationship matches the reality of the ROOT development model of a very small number of 'empowered' developers
  - The ROOT development team is small and they like it that way
- While ROOT will be used at the core of much LCG software for the foreseeable future, we agree there needs to be a line with ROOT proper on one side and 'LCG software' on the other
- Despite the user-provider relationship, LCG software may nonetheless place architectural, organizational or other demands on ROOT
Software Structure
[Diagram: layered software structure – Foundation Libraries at the base; a Basic Framework and Optional Libraries above them; specialized Simulation, Reconstruction, Visualization and other frameworks built on the basic framework; Applications at the top]
Blueprint architecture design precepts
- Software structure: foundation; basic framework; specialized frameworks
- Component model: APIs, collaboration ('master/slave', 'peer-to-peer'), physical/logical module granularity, plug-ins, abstract interfaces, composition vs. inheritance, …
- Service model: uniform, flexible access to functionality
- Object models: dumb vs. smart, enforced policies with run-time checking, clear and bullet-proof ownership model
- Distributed operation
- Dependencies
- Interface to external components: generic adapters, …
Blueprint Architectural Elements
- Object dictionary and whiteboard
- Component bus
- Scripting language (ROOT CINT and Python both available)
- Component configuration
- Basic framework services
  - Framework infrastructures: creation of objects (factories), lifetime, multiplicity and scope (singleton, multiton, smart pointers), communication & discovery (e.g. registries), …
  - Core services: component management, incident management, monitoring & reporting, GUI manager, exception handling, …
- System services
- Foundation and utility libraries
[Diagram: blueprint architectural elements – Foundation and Utility Libraries and Core Services (Dictionary, Whiteboard, PluginMgr, Monitor, Scheduler, Scripting, GUI) at the center; domain frameworks around them: Event Generation (EvtGen), Detector Simulation (Engine, Geometry, Modeler), Reconstruction (Algorithms, Event Model, Calibration, Fitter), Analysis (NTuple), Persistency (StoreMgr, FileCatalog), Grid Services, Interactive Services; external components: ROOT, GEANT4, DataGrid, Python, Qt, MySQL]
Recommendations
Recommendations in the report include:
- 'Use of ROOT' recommended as described
  - Aspects of ROOT development of expected importance for LCG software are listed
- Initiation of a common project on core tools and services
- Initiation of a common project on physics interfaces
- Initiation of an RTAG on analysis, including distributed analysis
- Implementation of Python as interactive/scripting environment and 'component bus' (as an optional part of the architecture)
- Review CLHEP quickly; repackage it as discrete components and decide which components the LCG employs and develops
- Adopt AIDA
- Adopt Qt as GUI library
- Support standard Java compiler(s) and associated needed tools
- Develop a clear process for adopting third party software
Four experiments, Four Viewpoints
- A lot of accord between ATLAS, CMS and LHCb on the architecture emerging in the blueprint
  - There are points of difference, but not fundamental ones
  - Differences are between CMS and LHCb; no daylight between LHCb and ATLAS
- The ALICE point of view is distinct!
  - Sees the planned work as primarily a duplication of work already done over the last 7 years and available in ROOT
  - But ALICE and ROOT are prepared to work in the framework of the LCG software being a customer of ROOT, with ALICE contributing substantially to ROOT
- We do not have full accord among the four experiments, but we have come to a plan that should yield a productive working relationship among all
  - The ALICE/ROOT view is expressed in an appendix to the report, but ALICE and ROOT members sign the main report
Blueprint RTAG Status
- Almost finished
- This week received input from a group with a 'physics analysis user, non-expert in software' perspective, asked to review the draft and give comments
- Final report will be delivered this week and presented to the Oct 11 SC2 meeting
  - A draft of the report has gone to the SC2
  - ~36 pages and close to complete
- The RTAG has met 14 times, and should not have to meet any more!
Concerns
- The applications area needs an appropriate technical presence in the appropriate GDB working group(s) if the working groups are to be driving decisions on the grid middleware to be used
- Can't think of any more right now!
Conclusion
So far so good
- Good engagement and support from the experiments at all stages
  - SC2 and RTAG participation, project participation
  - Use of LCG software written into experiment planning
  - Now the LCG apps area has to deliver!
- LCG hires on track and contributing at a substantial level
- IT personnel involved; fully engaged in the DB group, ramping up in the API group
- EP/SFT group established Oct 1 to host LCG applications software activity in EP
- Essential physical gathering of LCG applications participants in building 32 has begun
- Progress is slower than we would like, but direction and approach seem to be OK
- First product deliverable, POOL, is on schedule