TB1: Data analysis
Antonio Bulgheroni, on behalf of the TB24 team
TRANSCRIPT
Introduction
Analysis software structure
Completely renewed analysis code based on algorithms already tested in sucimaPix and moved to Marlin + LCIO
All the I/O is based on LCIO, the official data format and event model of the ILC community.
This software tool is meant to take care of all the off-line analysis, from the handling of the raw data coming from the DAQ up to high-level objects (tracks) for general public use.
Where and how
CVS repository at DESY
http://www-zeuthen.desy.de/lc-cgi-bin/cvsweb.cgi/Eutelescope/?cvsroot=eutelescope
Documentation in Rome
http://www.roma3.infn.it/~bulgheroni/Eutelescope/head/
New for this test beam
EUTelMimoTelReader
The final DAQ system will output data directly in LCIO format, but in order to fully debug the system, in this test beam the data were saved using a native "homemade" data format.
Native data are then converted on the fly into LCIO by a dedicated Marlin processor that links against the DAQ library (thanks to Emlyn).
The reading of the serialized data is done with the same library used to write them to disk.
Please, don't try to decode the data without the eudaq deserializer!
New
EUTelStrasMimoTelReader
The data output format of the IPHC DAQ system can also be used as input to our analysis chain. Very important for benchmarking the analysis code! Special thanks to Jola and the Strasbourg team for their help!
New
EUTelClusterFilter
A very versatile processor able to select clusters according to several different selection criteria. The list of available cuts will grow soon...
Useful because at the beginning one wants to be sure not to throw anything away and so tends to keep the cuts loose; then, looking at reference figures of merit, the selection can be tightened.
Very fast!
New
New GEAR geometry
Tatsiana prepared an updated version of the geometry description implementing the current setup.
The same description can be used both for simulation and for real data.
This is the starting point for track fitting (hopefully starting this afternoon). This work will soon become part of the official GEAR release.
Some very fresh data
Landau distribution for 3 GeV e-
Noise and bad pixel masking
Hit map on detector FoR
Hit map on telescope FoR
Event display: typical event
Event display: less typical event
Pair production (?)
What's next?
Now things start to be interesting
Data are now migrating from the DAQ computer to the dCache system at DESY and are then registered on the GRID.
Anyone interested in data analysis should apply for a GRID certificate and contact Tatsiana in order to get into the game.
My suggestion is: do whatever analysis you want, but please don't touch the native raw data. The recommended starting point is the collection of hits; for expert users, the starting point is the LCIO raw data.
A lot of fun things to do...
First of all, benchmark the software by comparing the results of the Strasbourg telescope with other independent analyses.
Get the track fitting working: Filip provided a TrackFitter, but we haven't used it yet.
Think about alignment... Enjoy!
More seriously
Consider the integration of a conditions database for the next test beam.
Try to move some analysis steps from off-line to on-line.
Find a good data model for the zero-suppressed readout mode.
Consider a possible replacement for GEAR (it is too complicated for what we need).
Improve the event display.
Conclusions
Summary
The renewed software for the TB data analysis is behaving well and has been running continuously for one week, easily keeping pace with the data taking.
The implemented algorithms for pedestal/noise calculation, bad pixel masking, cluster search, cluster filtering and frame-of-reference conversion proved to work. Benchmarking against the Strasbourg data will give further confirmation.
The rest of the analysis chain will be tested starting from the end of this meeting.
Thanks!
To everybody, for this intense but extremely nice test beam period...