
Operational Quality Control in Helsinki Testbed

Mesoscale Atmospheric Network Workshop

University of Helsinki, 13 February 2007

Hannu Lahtela & Heikki Turtiainen

What is Quality?

“The degree to which a system, component, or process meets (1) specified requirements, and (2) customer or user needs or expectations” – IEEE

“Data are of good quality when they satisfy stated and implied needs... [such as] required accuracy, resolution and representativeness.” – WMO Guide to Meteorological Instruments and Methods of Observation

Quality Management and Quality Control (QC)

“The purpose of quality management is to ensure that data meet requirements (for uncertainty, resolution, continuity, homogeneity, representativeness, timeliness, format, etc.) for the intended application, at a minimum practicable cost. Good data are not necessarily excellent, but it is essential that their quality is known and demonstrable.”

“Quality control is the best known component of quality management systems, and it is the irreducible minimum of any system. It consists of examination of data at stations and at data centres to detect errors so that the data may be either corrected or deleted *.”

-WMO Guide to Meteorological Instruments and Methods of Observation

*) “Deleted” must be understood here in the sense that erroneous data are not used for applications; however, they should remain stored in the database, only flagged as faulty.

Other Quality Management functions

In addition to QC, Quality Management includes:

• equipment specification and selection

• station siting and sensor exposure planning

• maintenance & calibration procedures

• data acquisition and processing (sampling, averaging, filtering...)

• personnel training and education

• metadata management

• etc...

Levels of QC

Quality Flags

Information about suspicious or certainly wrong data values detected in the QC process should be passed on together with an information label, or flag, in order to:

• indicate the quality level

• inform which control methods and control levels data have passed

• inform about the error type if an error or suspicious value was found

Such flagging information is useful both in quality control phases (technical flags) and for users of meteorological information (end-user flags).

HTB uses FMI end-user flagging system

Four-digit code, one digit for each QC level:

HQC    QC2    QC1    QC0
1000   100    10     1

The value of each digit defines the quality of the data:

0 = no check
1 = OK
2 = suspicious, small difference
3 = suspicious, large difference
4 = calculated
5 = interpolated (spatial)
8 = missing
9 = deleted

Example: 1531

1 = QC0 at the site is OK

30 = QC1 found a large difference (e.g. monthly limit exceeded)

500 = QC2 interpolated the value using neighbour station data

1000 = HQC accepted the interpolated value
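To make the composition of the code concrete, here is a minimal Python sketch of packing and unpacking such a flag; the function names and the dictionary below are illustrative only, not part of the FMI or MetMan software:

# Minimal sketch (illustrative names): compose and decode the four-digit end-user flag.
FLAG_MEANINGS = {
    0: "no check",
    1: "OK",
    2: "suspicious, small difference",
    3: "suspicious, large difference",
    4: "calculated",
    5: "interpolated (spatial)",
    8: "missing",
    9: "deleted",
}

def compose_flag(qc0, qc1, qc2, hqc):
    """Pack one digit per QC level into the four-digit code."""
    return hqc * 1000 + qc2 * 100 + qc1 * 10 + qc0

def decode_flag(code):
    """Return each QC level's digit together with its meaning."""
    digits = {"QC0": code % 10, "QC1": code // 10 % 10,
              "QC2": code // 100 % 10, "HQC": code // 1000 % 10}
    return {level: (d, FLAG_MEANINGS.get(d, "unknown")) for level, d in digits.items()}

# The example from this slide:
assert compose_flag(qc0=1, qc1=3, qc2=5, hqc=1) == 1531
print(decode_flag(1531))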

Proposal for HTB QC process (by Jani Poutiainen) - so far implemented only partially and with some modifications.

Metman – QC1 Quality Control

Quality control of weather observations is based on real-time quality control, which comprises the following tests:

• range test

• step tests (1 h and 3 h)

• persistence test

• spatial test

At present the following observations are tested:

• wind speed (10 min. average)

• barometric pressure

• air temperature

The best-fitting quality control algorithms and recommendations from NORDKLIM (KLIMA report no. 8/2002) and the Oklahoma Mesonet QC are incorporated into the Metman quality control process.

Metman QC: Control Domains

Each weather station must be part of a quality control domain. Each quality control domain contains predetermined suspicious and erroneous limits for each parameter needed in each test. The values can be configured based on seasonal climate extremes.

Meteorologically representative and non-representative weather stations should also be located in different quality control domains.

The spatial test can be performed only between stations located in the same representative quality control domain.

In the Helsinki Testbed project, all weather stations belong to one and the same quality control domain.

However, some special sites should belong to a different domain; for example, air temperatures at Heimoonkruoppi differ dramatically from those at nearby weather stations.
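As an illustration only, per-domain limit configuration could be represented along these lines; the structure, parameter names and numbers in this sketch are assumptions, not the actual MetMan configuration:

# Hypothetical QC-domain configuration (all names and values illustrative).
qc_domains = {
    "helsinki_testbed": {
        "air_temperature": {                     # degrees Celsius
            "erroneous_range": (-80.0, 60.0),    # from sensor specification
            "suspicious_range": (-40.0, 35.0),   # from seasonal climate extremes
            "step_limit_1h": 8.0,
            "step_limit_3h": 15.0,
        },
        "wind_speed_10min": {                    # m/s
            "erroneous_range": (0.0, 75.0),
            "suspicious_range": (0.0, 30.0),
        },
        "barometric_pressure": {                 # hPa
            "erroneous_range": (500.0, 1100.0),
            "suspicious_range": (940.0, 1065.0),
        },
    },
    # A separate domain for non-representative special sites
    # (e.g. Heimoonkruoppi) would hold its own limits.
    "special_sites": {},
}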

Metman QC1 process

[Flow diagram: data enter without QC flags and pass in sequence through the range test, the step tests, the persistence test and the spatial test (under testing); each test can mark an observation valid, suspicious or erroneous, and the data leave the QC1 process with QC flags attached.]

The technical flag code is stored in the MetMan database. The four-digit end-user flag code is composed, converted to FMML and posted to CDW together with the observation data.
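To spell out the flow sketched above, here is a minimal Python version of the test ordering; the helper names and the stand-in tests are assumptions, and the handling of a suspicious result is simplified (the individual tests are described on the following slides):

def qc1_process(obs, tests, spatial_test):
    """Run the range, step and persistence tests in order; an erroneous result
    stops further testing, and a suspicious result triggers the spatial test
    (simplified sketch, not the actual MetMan implementation)."""
    suspicious = False
    for test in tests:                      # range test, step tests, persistence test
        result = test(obs)
        if result == "erroneous":
            return "erroneous"              # remaining tests are skipped
        if result == "suspicious":
            suspicious = True
    return spatial_test(obs) if suspicious else "valid"

# Illustrative usage with trivial stand-in tests:
stand_in_tests = [
    lambda v: "valid" if -40.0 <= v <= 35.0 else "suspicious",  # stand-in range test
    lambda v: "valid",                                          # stand-in step test
    lambda v: "valid",                                          # stand-in persistence test
]
print(qc1_process(25.0, stand_in_tests, spatial_test=lambda v: "valid"))   # valid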

Metman QC: Range test

The range test determines whether an observation lies within a predetermined range.

Erroneous ranges are based on sensor specifications, and suspicious ranges can be configured based on seasonal climate extremes.

The Metman real-time quality control process performs the range test first.

The range test does not need historical observations.

If the range test:

• succeeds, the step tests will be performed next

• fails, the rest of the tests will not be performed, and the observation is flagged as erroneous

• gives a suspicious value, the spatial test will be performed
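A minimal sketch of this logic, assuming simple per-parameter limits (the limit names and example values are illustrative, not the operational ones):

def range_test(value, err_range, susp_range):
    """Classify a single observation against erroneous and suspicious ranges."""
    lo, hi = err_range
    if not (lo <= value <= hi):
        return "erroneous"       # outside sensor specification
    lo, hi = susp_range
    if not (lo <= value <= hi):
        return "suspicious"      # outside seasonal climate extremes
    return "valid"

# Illustrative winter limits for air temperature (degC):
print(range_test(-45.0, err_range=(-80.0, 60.0), susp_range=(-40.0, 10.0)))  # suspicious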

Metman QC: Step tests

The step tests use sequential observations (1 hour and 3 hours apart) to determine which data represent unrealistic 'jumps' during the observation time interval.

Erroneous and suspicious step thresholds can be configured based on seasonal climate extremes.

The Metman real-time QC process performs the step tests after the range test.

The step tests need historical observations.

If the tests:

• succeed, the persistence test will be performed next

• fail, the rest of the tests will not be performed, and the observation is flagged as erroneous

• give a suspicious value, the spatial test will be performed
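A minimal sketch of a single step check, assuming the previous observation and the step limits are available (names and thresholds are illustrative):

def step_test(value_now, value_before, susp_step, err_step):
    """Compare sequential observations (e.g. 1 h or 3 h apart) against step limits."""
    step = abs(value_now - value_before)
    if step > err_step:
        return "erroneous"       # physically unrealistic jump
    if step > susp_step:
        return "suspicious"      # unusually large but possible jump
    return "valid"

# Illustrative 1-hour air temperature step limits (degC):
print(step_test(value_now=4.0, value_before=-5.0, susp_step=6.0, err_step=12.0))  # suspicious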

Metman QC: Persistence test

The persistence test analyses data on an hourly basis to determine whether an observation has undergone little or no variation.

The Metman real-time quality control process performs the persistence test after the step tests.

The persistence test needs historical observations.

If the test:

• succeeds, the observation is flagged as valid

• fails, the observation is flagged as erroneous

• gives a suspicious value, the spatial test will be performed
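A minimal sketch, assuming the last hour of values for the parameter is available and that a completely constant series is treated as erroneous (both the threshold and that choice are assumptions for the sketch):

def persistence_test(values, susp_variation):
    """Check recent values for little or no variation (e.g. a stuck sensor)."""
    variation = max(values) - min(values)
    if variation == 0.0:
        return "erroneous"       # no variation at all over the whole period
    if variation < susp_variation:
        return "suspicious"      # very little variation
    return "valid"

# Illustrative example: 10-minute average wind speeds (m/s) over one hour
print(persistence_test([3.2, 3.2, 3.2, 3.2, 3.2, 3.2], susp_variation=0.1))  # erroneous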

Metman QC: Spatial test

The spatial test performs an intercomparison between neighbouring stations in the same quality control domain.

The Metman real-time quality control process performs the spatial test only if one of the earlier tests returns a suspicious value.

The spatial test searches for a nearby reference station and compares the parameter under test with that of the reference station. The reference station must:

• belong to the same QC domain

• be sufficiently close

• have about the same altitude and installation heights

• have a reference parameter that has passed the range, step and persistence tests.

The spatial test is currently under testing and not yet operational.
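A minimal sketch of the reference-station selection and comparison described above; the station fields, distance and difference limits are illustrative assumptions, not the operational criteria:

def find_reference(station, candidates, max_dist_km=10.0, max_alt_diff_m=20.0):
    """Pick a reference station that satisfies the criteria listed above."""
    for cand in candidates:
        if (cand["domain"] == station["domain"]              # same QC domain
                and cand["dist_km"] <= max_dist_km            # sufficiently close
                and abs(cand["alt_m"] - station["alt_m"]) <= max_alt_diff_m
                and cand["passed_qc"]):                        # passed range, step, persistence
            return cand
    return None

def spatial_test(value, station, candidates, susp_diff, err_diff):
    """Compare a suspicious value with the same parameter at a reference station."""
    ref = find_reference(station, candidates)
    if ref is None:
        return "suspicious"                                    # no comparison possible
    diff = abs(value - ref["value"])
    if diff > err_diff:
        return "erroneous"
    if diff > susp_diff:
        return "suspicious"
    return "valid"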

HTB QC: next steps

• Extension of QC1 to all measured parameters

• Implementation of spatial test

• Availability of end-user flags through Researcher’s Interface

• Addition of technical flagging to CDW?

• Special challenge for dense mesoscale networks: with a large number of stations, maintenance based on immediate response is too expensive => new methods and tools are needed for QC, network diagnostics and maintenance!