TRANSCRIPT
National Aeronautics and Space Administration
Jet Propulsion Laboratory
California Institute of Technology
WSDC Quality Assurance (QA)
Director’s Review – October 20, 2009
QA Overview
Philosophy: WISE data volume is large; timescale for public release is short. To be successful, QA must be quick and efficient. Automate it as much as possible, allowing the human to concentrate on the small percentage of data requiring detailed scrutiny.
Objectives
– Assess data through each stage of processing.
– Identify and flag data not meeting WISE science requirements.
– Characterize data so they can be correctly interpreted by the community.
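In practice, the automate-first philosophy above reduces to computing per-frame QA metrics in the pipeline and routing only out-of-family frames to a human reviewer. A minimal sketch of that pattern follows; the metric names and acceptance ranges are hypothetical illustrations, not the actual WSDC code.

```python
# Minimal sketch of threshold-based QA flagging; metric names and
# acceptance ranges are illustrative assumptions, not WSDC values.
THRESHOLDS = {
    "background_noise": (0.0, 50.0),   # DN; per-frame noise estimate
    "zero_point_offset": (-0.1, 0.1),  # mag; offset vs. calibration stars
}

def flag_frame(metrics):
    """Return the metrics of a frame that fall outside their accepted range."""
    failures = []
    for name, (lo, hi) in THRESHOLDS.items():
        value = metrics.get(name)
        if value is None or not (lo <= value <= hi):
            failures.append(name)
    return failures

# Only frames with a non-empty failure list reach the QA scientist.
frames = [
    {"id": "frame_001", "background_noise": 12.3, "zero_point_offset": 0.02},
    {"id": "frame_002", "background_noise": 88.0, "zero_point_offset": 0.01},
]
for frame in frames:
    failures = flag_frame(frame)
    if failures:
        print(f"{frame['id']}: review needed ({', '.join(failures)})")
```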
QA Overview
There are four flavors of WSDC QA needed to support launch activities:
– Ingest QA
– Quicklook QA
– ScanFrame QA
– Archive QA
Current QA products have been tested using ORT/MST data and other simulated data from Ned.
Current Implementation: QA summary page
Current Implementation: Ingest QA
Daily summary for each ingested delivery:
• Files received and ingested
• Error messages seen
Checks are also performed against the manifest from White Sands.
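The manifest check is essentially a set comparison between what White Sands reports shipping and what ingest actually received, plus checksum verification. A hedged sketch, assuming a simple two-column manifest of filename and MD5 (the file format and checksum choice are assumptions, not the documented interface):

```python
# Sketch of an ingest-vs-manifest check; the manifest format (one
# "filename md5" pair per line) and the MD5 choice are assumptions.
import hashlib
from pathlib import Path

def md5sum(path):
    """MD5 of a file, read in chunks to handle large deliveries."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_delivery(manifest_path, delivery_dir):
    """Report files missing from the delivery or failing their checksum."""
    missing, corrupt = [], []
    for line in Path(manifest_path).read_text().splitlines():
        name, expected = line.split()
        local = Path(delivery_dir) / name
        if not local.exists():
            missing.append(name)
        elif md5sum(local) != expected:
            corrupt.append(name)
    return missing, corrupt

missing, corrupt = check_delivery("manifest.txt", "delivery_001")
print("missing:", missing)
print("corrupt:", corrupt)
```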
Current Implementation: Quicklook QA, “scan” level
In Quicklook QA, we run 3% of the delivered data through processing and check diagnostics in the following areas:
• Image backgrounds and noise
• Scan synchronization
• Photometric calibration
• Visual checks
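For the backgrounds-and-noise check, one common approach is a robust per-frame statistic: the median level and a MAD-based noise estimate. A sketch under those assumptions (the actual Quicklook diagnostics are not specified in this transcript):

```python
# Sketch of a per-frame background/noise diagnostic using robust
# statistics; the actual Quicklook metrics may differ.
import numpy as np

def background_stats(image):
    """Return (background level, noise), both estimated robustly."""
    background = np.median(image)
    # 1.4826 * MAD approximates the standard deviation for Gaussian noise.
    noise = 1.4826 * np.median(np.abs(image - background))
    return background, noise

rng = np.random.default_rng(0)
frame = rng.normal(loc=100.0, scale=5.0, size=(1024, 1024))  # simulated frame
level, noise = background_stats(frame)
print(f"background = {level:.1f} DN, noise = {noise:.2f} DN")
```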
Current Implementation: Quicklook QA, “scan” level
Two examples…
[Figure: visual checks of the frame images in all four bands (W1–W4)]
[Figure: scan synchronization monitor]
Current Implementation: ScanFrame QA, scan level
In ScanFrame QA, we run all data through processing and check diagnostics in the following areas:
• Frame statistics & overlaps
• Image quality
• Photometric calibration
• Astrometric calibration
• Completeness and reliability
• Checks of solar system objects
• Color-color/color-mag plots
• Visual checks
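As one concrete instance from the list above, the photometric-calibration diagnostic can be expressed as a per-frame zero point and scatter measured against known calibration stars. A hedged sketch with simulated matches (not the documented ScanFrame implementation):

```python
# Sketch of a per-frame photometric zero-point check against calibration
# stars; the 0.1 mag review threshold is an illustrative assumption.
import numpy as np

def zero_point(inst_mags, cat_mags):
    """Median zero point and robust scatter from matched star magnitudes."""
    deltas = np.asarray(cat_mags) - np.asarray(inst_mags)
    zp = np.median(deltas)
    scatter = 1.4826 * np.median(np.abs(deltas - zp))
    return zp, scatter

# Simulated matches for one frame: true zero point 20.5, 0.03 mag scatter.
rng = np.random.default_rng(1)
inst = rng.uniform(10.0, 14.0, size=50)
cat = inst + 20.5 + rng.normal(0.0, 0.03, size=50)

zp, scatter = zero_point(inst, cat)
print(f"zero point = {zp:.3f} mag, scatter = {scatter:.3f} mag")
if scatter > 0.1:  # illustrative threshold
    print("flag frame for QA review")
```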
Current Implementation: ScanFrame QA, scan level
Two examples…
[Figure: photometric calibration (W2, W3)]
[Figure: mean counts in frames (W1, W2)]
Current Implementation: ScanFrame QA, frame level
Two examples…
[Figure: row and column medians in a single frame]
[Figure: frame image quality via aperture/PSF photometry comparison]
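Row and column medians are a cheap way to expose striping, bad rows or columns, or drifting bias levels in a single frame. A minimal sketch (illustrative, not the WSDC code):

```python
# Sketch of a row/column-median diagnostic: large excursions in either
# profile indicate striping or bad rows/columns. Values are simulated.
import numpy as np

def row_col_medians(image):
    """Median of each row and each column of a 2-D frame."""
    return np.median(image, axis=1), np.median(image, axis=0)

rng = np.random.default_rng(2)
frame = rng.normal(100.0, 5.0, size=(1024, 1024))
frame[512, :] += 50.0  # inject one bad row for illustration

rows, cols = row_col_medians(frame)
bad_rows = np.where(np.abs(rows - np.median(rows)) > 5 * rows.std())[0]
print("suspect rows:", bad_rows)  # -> [512]
```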
Procedural Flow: Ingest QA
Procedural Flow: Quicklook QA
Procedural Flow: ScanFrame QA
Procedural Flow: Archive QA
Procedural Flow: Anomaly alerting & tracking
WSDC Anomaly Reports
The WSDC uses Redmine software for recording anomalies that impact the integrity of WISE science data. The software allows anomalies to be entered, assigned, and tracked. Once an anomaly is analyzed and the issue closed, Redmine acts as a repository for the associated analysis reports.
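For illustration only, entering an anomaly programmatically could look like the sketch below, assuming a Redmine instance with its REST API enabled; the instance URL, project identifier, and API key are all hypothetical placeholders.

```python
# Hypothetical sketch of filing an anomaly through Redmine's REST API;
# the instance URL, project identifier, and API key are placeholders.
import requests

REDMINE_URL = "https://wsdc.example.edu/redmine"  # hypothetical instance
API_KEY = "replace-with-a-real-api-key"

issue = {
    "issue": {
        "project_id": "wise-anomalies",  # hypothetical project identifier
        "subject": "W3 background jump in scan 01234a",
        "description": "Mean background rose well above trend at frame 120.",
    }
}
resp = requests.post(
    f"{REDMINE_URL}/issues.json",
    json=issue,
    headers={"X-Redmine-API-Key": API_KEY},
)
resp.raise_for_status()
print("filed anomaly #", resp.json()["issue"]["id"])
```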
For some anomalies assigned to them, MOS may also track the same issues in their ISA system.
IRSA has its own issue-tracking system, the IRSA Help Desk (via Test Track), which will be used to track problems with WISE online services.
Key Tasks Remaining
• Items needed to support launch
  – Incorporate lessons learned from ORT2 and ORT3/MST7.
    • Better define the recipients and content of QA reports.
    • Optimize the workflow for QA reviewers.
  – Include color/flux checks for saturated sources.
  – Exercise the anomaly alerting system.
  – Interface with the SOC on quality scoring/reviewing/sign-off.
• Items needed post-launch
  – Refine QA thresholds (e.g., what is the mean background noise on orbit, and what size of deviation should trigger review by a QA scientist?); a hedged sketch of deriving such a threshold follows this list.
  – Some of these cannot be determined until IOC or after.
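As referenced above, one way such an on-orbit threshold might be derived once flight data exist: characterize the distribution of a metric (here, per-frame mean background noise) over many frames with robust statistics, then trigger review on large deviations. The 5-sigma trigger and the simulated values are assumptions.

```python
# Sketch of deriving a QA review threshold from on-orbit history; the
# 5-sigma trigger and the simulated noise values are assumptions.
import numpy as np

rng = np.random.default_rng(3)
# Simulated stand-in for on-orbit per-frame mean background noise (DN).
noise_history = rng.normal(8.0, 0.5, size=5000)

center = np.median(noise_history)
sigma = 1.4826 * np.median(np.abs(noise_history - center))  # robust sigma
trigger = 5.0 * sigma

def needs_review(frame_noise):
    """True if this frame's noise deviates enough to warrant QA review."""
    return abs(frame_noise - center) > trigger

print(f"typical noise = {center:.2f} +/- {sigma:.2f} DN")
print(needs_review(8.3))   # False: within normal scatter
print(needs_review(12.0))  # True: ~8 sigma high, route to QA scientist
```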