

Calibration Technology

Basics, reference instruments for pressure and temperature, professional calibration

    VERLAG MODERNE INDUSTRIE


    Christian Elbert


This book was produced with the technical collaboration of WIKA Alexander Wiegand SE & Co. KG.

Christian Elbert is head of the Calibration Technology Product Unit at WIKA in Klingenberg, Germany. He is a member of the board of directors of the German Calibration Service (DKD) and chairman of the associated Technical Committee for Pressure. His special thanks go to Danica Schwarzkopf for the expert support in the key topic of temperature and to the marketing team of André Habel Nunes.

Translation: RKT Übersetzungs- und Dokumentations-GmbH, Schramberg

© 2012 All rights reserved with Süddeutscher Verlag onpact GmbH, 81677 Munich, www.sv-onpact.de

First published in Germany in the series Die Bibliothek der Technik. Original title: Kalibriertechnik. © 2012 by Süddeutscher Verlag onpact GmbH

Illustrations: WIKA Alexander Wiegand SE & Co. KG, Klingenberg
Editorial journalist: Bernhard Gerl, Mainz
Typesetting: JournalMedia GmbH, 85540 Munich-Haar
Printing and binding: Sellier Druck GmbH, 85354 Freising
Printed in Germany 236622


    Contents

Introduction 4

Traceability and calibration hierarchy 7
    Calibration on an international level 8
    National metrological institutes 9
    Accredited calibration laboratories 10
    In-house calibration 11

Professional calibration 12
    Standards, regulations and calibration directives 12
    Calibration capability 16
    Ambient conditions 17
    Calibration sequences 20

Reference instruments 25
    Reference standards 26
    Working standards 34
    Portable pressure calibration instruments 43

Calibration characteristics 49
    Measuring deviation 49
    Hysteresis 51
    Repeatability 53
    Determination of the characteristics in practice 54

Measurement uncertainty 55
    Basics according to GUM 55
    Measurement uncertainty budget 58
    Example calculation 60

Documentation 62
    Minimum data required for a DKD calibration certificate 63
    Graphic evaluation 64
    Single-figure indication 65

Trends 66

Technical terms 68

The company behind this book 71


    Introduction

Every measuring instrument is subject to ageing as a result of mechanical, chemical or thermal stress and thus delivers measured values that change over time. This cannot be prevented, but it can be detected in good time by calibration.

The Egyptians already knew this almost 5000 years ago. The workers calibrated their yardsticks by comparing them with a “royal cubit” (approx. 52.36 cm) made of stone and thus managed to achieve, for example, side lengths on the Cheops pyramid of 230.33 m which differ from each other by only about 0.05 per cent.

Definition

In the process of calibration, the displayed value of the measuring instrument is compared with the measuring result of a different measuring device which is known to function correctly and accurately and which itself has been made to coincide directly or indirectly with a national (or international) reference instrument (standard) (Fig. 1). One talks about verification when the calibration has been carried out or supervised by an official body. Both of these variants are purely intended for determining the quality of the displayed values. No intervention to the measuring instrument itself is allowed. With adjustment, it is understood that there is an intervention to the measuring device in order to minimise a detected measuring deviation. Typically, adjustment is followed by a further calibration, in order to check and document the final state of the measuring instrument following the intervention.

Fig. 1: Accredited calibration laboratory for the measurement parameter “temperature”

Period of validity

In contrast to verification, which will lose its validity after a period of time set by law, the validity period of a calibration is subject to practical specifications such as manufacturer’s instructions, requirements of a quality assurance standard or in-house and customer-specific regulations. Calibration must also be carried out when the measuring instrument is used for the preparation of products which are subject to official supervision such as drugs or foodstuffs.

Significance

As part of a survey of 100 management executives of international companies, the Nielsen Research Company established in 2008 that due to faulty calibrations, manufacturing companies were losing more than 1.7 million dollars annually on average, and those companies with a turnover of more than a billion dollars as much as 4 million dollars.

In connection with limitations on resources and the required increase in efficiency of manufacturing processes, calibration is increasingly gaining in importance. Increasing the measuring accuracy may result in raw material savings and fewer emissions of pollutants, for example, by supplying exactly the right amount of oxygen during a chemical reaction. The calibration of measuring instruments can sometimes also be relevant to safety: if pressure or temperature sensors (in the chemical industry, for example) do not provide correct values, the incorrect control of chemical processes may even result in a risk of explosion (Fig. 2). At the very least, the importance of calibration can be seen in everyday examples such as in household gas or water meters and in fuel gauges at petrol pumps.

In this book, the basics of calibration and calibration technology will be presented. It will describe which rules, methods and reference instruments are suitable for professional calibration. Pressure and temperature measuring instruments will serve as application examples.

Fig. 2: Calibration of pressure measuring instruments with a portable calibrator


Traceability and calibration hierarchy

Hierarchy of the standards and calibration services

To be able to compare measuring results, they must be “traceable” to a national or international standard via a chain of comparative measurements. To this end, the displayed values of the measuring instrument used or a measurement standard are compared over one or several stages to this standard. At each of these stages, calibration with a standard previously calibrated with a higher-ranking standard is carried out. In accordance with the ranking order of the standards – from the working standard or factory standard and the reference standard to the national standard – the calibration bodies have a calibration hierarchy. This ranges from the in-house calibration laboratory to the accredited calibration laboratory and to the national metrological institute (Fig. 3).

Fig. 3: Calibration hierarchy described by the example of Germany. From top to bottom: national metrology institute (national standard), accredited calibration laboratory (reference standards), internal calibration laboratory (working or factory standards), inspection equipment, e.g. of a company.


Traceability in practice

The German Calibration Service DKD (Deutscher Kalibrierdienst) designates the following as essential elements of traceability:

• The chain of comparison must not be interrupted.

• In each stage of the calibration chain, the measurement uncertainty must be known so that the total measurement uncertainty can be calculated. As a rule, a higher-ranking measuring instrument should have a measuring accuracy three to four times higher than the instrument calibrated by it.

• Each stage of the calibration chain must be documented, as must the result.

• All bodies carrying out a stage in this chain must prove their competence by means of accreditation.

• Depending on the required measuring accuracy and technical requirements, calibrations must be repeated at appropriate intervals.

Calibration on an international level

BIPM

On an international level, the BIPM (International Bureau of Weights and Measures, abbreviation of French: Bureau International des Poids et Mesures) coordinates the development and maintenance of primary standards and the organisation of international comparative measurements. Decisions about the representation of the primary standards are made by the CGPM (General Conference on Weights and Measures, abbreviation of French: Conférence Générale des Poids et Mesures). The participants of the conferences, which take place every four to six years, are the representatives of the 51 signatory states of the international Metre Convention and the representatives of those 26 associated member states without full voting rights.

    National metrological institutes

On a national level, institutes are responsible in most cases for metrology. They maintain the national standards to which all calibrated measuring instruments can be traced and ensure that these primary standards are comparable on an international level.

National or private institutes

In most countries, the top metrological institutes are state agencies or authorities. Thus, the national metrology institute of the Federal Republic of Germany, the PTB (Physikalisch-Technische Bundesanstalt) in Braunschweig and Berlin, is directly responsible to the Federal Ministry of Economy. In the USA, the NIST (National Institute of Standards and Technology) in Gaithersburg (Maryland) operates under the authority of the US Department of Commerce. However, in some countries, the national institutes are privatised. Thus in the U.K., the “guardian of measures” belongs to the SERCO Group.

The best-known of the standards stored at PTB are without doubt the atomic clocks, which serve among other things as the basis for the time signal in radio clocks and watches (Fig. 4). As the national metrological institute, the PTB also has the legislative mandate to offer scientific and technical services in the area of calibration to science and commerce. For this, it uses a network of accredited institutions.

Fig. 4: Atomic clock of the PTB (German national metrology institute)

    Accredited calibration laboratories

Accredited calibration laboratories often take on calibration as external service providers for those companies that do not have the required equipment themselves. However, they themselves can also be part of a company and calibrate all measuring instruments within it. To this end, they are equipped with their own working or factory standards which are calibrated at the proper time intervals with the smallest possible measurement uncertainty using the reference standard of the appropriate national metrological institute or other accredited calibration laboratories.

Coordination by the EA

To ensure the same level of all calibrations, the laboratories carrying them out must be accredited in accordance with internationally recognised rules (DIN EN ISO/IEC 17025). Within the EU, this is coordinated by the EA (European co-operation for Accreditation). In the member states, calibration laboratories can obtain a certificate from the national agencies responsible for them which certifies that they are working in accordance with these regulations. In Germany, the responsible body is the DAkkS (abbreviation of German: Deutsche Akkreditierungsstelle), which took over this mandate from the DKD from 17.12.2009. As an expert committee of the PTB and an association of calibration laboratories in industrial companies, inspection institutions and technical authorities, the DKD nowadays works exclusively on specialist grass-roots projects such as the development of standards and directives (Fig. 5).

Fig. 5: Logo of the German Calibration Service

DAkkS carries out a complete evaluation of every accredited calibration laboratory every five years and also pays monitoring visits every 18 months, to ensure that the high demands of measuring processes are met. In addition to the described process monitoring of laboratories, there are also expert committees to ensure technical development and knowledge transfer.

EU-wide recognised calibration certificates

Since all European bodies that accredit calibration laboratories collaborate in the EA, the work procedures of the laboratories have been matched to each other. This is why calibration certificates issued by any calibration laboratory are also recognised in all other European countries.

In-house calibration

It is up to each individual company how to set up its in-house calibration system, but all measurement and test equipment must be calibrated with its own reference standards, which in turn must be traceable to the national standard. As proof, in-house calibration certificates are issued.


Professional calibration

The professional execution of calibrations is governed by various standards, regulations and directives. For a measuring instrument to be calibrated in the first place, it must fulfil certain basic requirements. The physical conditions under which calibration can be carried out must also be known and taken into account. Under these conditions, it is possible to select a calibration procedure suitable for the requirements.

Standards, regulations and calibration directives

In essence, regulations for the calibration of measuring instruments take effect whenever a company decides to observe a standard or directive for its calibration or when it manufactures products whose production is subject to legal regulations.

Quality assurance standards

Of great importance for quality assurance are standards and directives such as the ISO 9000 series of standards, which is being implemented more and more frequently in all industrialised nations. In Clause 7.6 “Control of monitoring and measuring equipment” of the ISO 9001:2008 standard “Quality management systems – Requirements”, there is a specific requirement that any inspection equipment that directly or indirectly affects the quality of the products must be calibrated. This includes, for example, test equipment used as a reference in measurement rooms or directly in the production process. The ISO 9000 standards do not stipulate a validity period for calibrations – which would not make a lot of sense given the different technologies of measuring devices – but they do specify that any inspection equipment must be registered and then a distinction must be made as to whether or not it must be regularly calibrated. Inspection plans must be drawn up in which the scope, frequency, method and acceptance criteria are defined. Individual calibrations are to be documented in detail. Labels on the measuring instruments (Fig. 6) or appropriate lists must show when each piece of inspection equipment needs to be recalibrated. It is essential to recalibrate when a measuring instrument has been altered or damaged during handling, maintenance or storage.

Fig. 6: Calibration sticker on a mechanical pressure gauge

Requirements for measurement management systems

Closely related to the ISO 9000 series of standards in terms of its structure is the ISO 10012:2004 standard “Measurement management systems – Requirements for measurement processes and measuring equipment”. It defines the requirements of the quality management system that can be used by companies in order to establish confidence in the measurement results obtained. In measurement management systems, it is not only the measuring device but the entire measuring process that is considered. This means that those responsible not only have to determine the measurement uncertainty during calibration, but also have to verify and evaluate the measurement uncertainty in use. To this end, statistical methods are also used.

Industry-specific directives

In addition to such universal standards, individual sectors of industry have their own directives for the quality assurance of measuring devices, for example the automotive industry. American automobile manufacturers have developed the QS 9000 directive in which the ISO 9000 standards have been substantially supplemented by industry and manufacturer-specific requirements, and in part tightened. In the meantime, the American QS 9000, the German VDA 6.1 and other country-specific regulations have been combined to some extent in the international ISO/TS 16949 standard. This saves many suppliers multiple certifications.

Legal provisions

Quality assurance standards and directives must only be observed by companies that want to be certified. The situation is completely different when, for example, drugs, cosmetics or foodstuffs are being manufactured. Here legal regulations, whose compliance is controlled by state agencies, often apply. Due to international trade relations, the regulations of the American Food and Drug Administration (FDA) are important. Thus, the Code of Federal Regulations (CFR) requires the “calibration of instruments, apparatus, gauges, and recording devices at suitable intervals in accordance with an established written program containing specific directions, schedules, limits for accuracy and precision, and provisions for remedial action in the event accuracy and/or precision limits are not met”. European laws have similar stipulations.

Requirements for calibration services

Apart from the regulations mentioned, which all aim at the necessity of calibration, there are also regulations for calibration itself. These must be strictly adhered to by the body performing the calibration in order to achieve legal acceptance of the results. The most important of these is DIN EN ISO/IEC 17025 “General requirements for the competence of testing and calibration laboratories” (Fig. 7). It describes the process-oriented requirements of calibration laboratories and serves as the basis for accreditation.

Fig. 7: Accredited calibration laboratory for the measurement parameter “pressure”

Additional directives

National institutes and affiliated technical committees publish directives and documents to supplement this standard, in which concrete technical and organisational procedures are described which serve as a model to the calibration laboratories for defining internal processes. These directives can in turn be part of quality management manuals for companies entrusted with calibration. They guarantee that the instruments to be calibrated are treated identically and that the work of the bodies entrusted with calibration can be verified. In Germany, this function is performed by the DKD.
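Where a quality system requires labels or lists showing when each piece of inspection equipment is next due, the bookkeeping can be as simple as the following sketch. The dates and the one-year interval are purely illustrative assumptions, not requirements taken from the standards cited above.

```python
# Illustrative helper for the recalibration bookkeeping described above: given the
# last calibration date and the interval defined in the inspection plan, compute
# when the instrument is due again.
from datetime import date, timedelta

def next_calibration_due(last_calibration: date, interval_days: int) -> date:
    """Return the date on which the instrument must be recalibrated."""
    return last_calibration + timedelta(days=interval_days)

print(next_calibration_due(date(2023, 5, 10), interval_days=365))  # 2024-05-09
```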

    Calibration capability

Testing the measuring instrument

Before a measuring instrument can be calibrated, it must be checked as to whether it is suitable for this. The decisive factor is whether the condition of the instrument conforms to generally recognised rules of technology and to the manufacturer’s documentation. If calibration is possible only after maintenance or adjustment, this work must be agreed upon beforehand between the customer and the calibration laboratory (Fig. 8). Any subsequent intervention to the instrument will void a prior calibration.

Fig. 8: Adjustment of a Bourdon tube pressure gauge

To determine whether a measuring instrument is fit to be calibrated, its general condition and function are first checked. When calibrating mechanical pressure measuring instruments, the DKD directive DKD-R 6-1 designates, among other things, the following main tests:

• visual inspection for damage (pointer, thread, sealing face, pressure channel)
• contamination and cleanliness
• visual inspection with regard to labelling, readability of the displays
• check whether the documents required for calibration (specifications, instruction manual) are available
• proper functioning of the operating elements (e.g. adjustability of the zero point)
• adjusting elements in a defined position
• torque dependence (zero signal) during mounting.

For other measuring instruments, similar requirements apply in most cases. In connection with the calibration capability, very elementary questions also arise such as: Is the measuring instrument accessible? Can the measuring instrument be dismounted? Is it possible to calibrate the measuring instrument under conditions similar to those in which it is used?

    Ambient conditions

Measured values often depend to a greater or lesser extent on ambient conditions such as temperature, humidity, exposure to dust and electromagnetic radiation. Thus, the Egyptians’ royal cubit made of granite expanded at an increase in temperature of 30°C, not unusual for a desert climate, by 0.05 per cent − this alone would already explain the dimensional deviations during the construction of the Cheops pyramid. This is why measuring instruments should be calibrated if possible under the same ambient conditions in which they will be used.

Ideal: calibration under operating conditions
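The figures quoted above are consistent with simple linear thermal expansion; the expansion coefficient below is inferred from the book's own numbers rather than taken from a materials table:

\[
\frac{\Delta L}{L} = \alpha \,\Delta T \approx 1.7\times10^{-5}\,\mathrm{K^{-1}} \times 30\,\mathrm{K} \approx 5\times10^{-4} = 0.05\,\%,
\]

which for a royal cubit of 52.36 cm corresponds to roughly 0.26 mm.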


Temperature is a particularly important influence quantity in pressure measurement (Fig. 9). This is why for the accuracy classes 0.1, 0.25 and 0.6, calibration must take place in a range of ±2°C, and for all others in a range of ±5°C relative to the reference value. Prior to calibration, the measuring instrument must be given enough time to adopt the ambient temperature or to warm up to a constant operating temperature. In particular therefore, mechanical devices of large mass such as tube pressure gauges should be acclimatised overnight in the calibration room.

Fig. 9: Temperature control during calibration

Example: calibrating a pressure balance

The equation represented in Figure 10 shows which influencing factors must be taken into account, for example, when calibration is made with a pressure balance (see Reference standards for the calibration of pressure measuring instruments, p. 26 ff.). With this calibration instrument, pressure is produced mainly by metal weights which generate a load on a known piston area. The piston area changes with temperature because the piston material can expand thermally. Due to their volume, the weights experience a certain buoyancy which depends on their own density, but also on the density of air. The latter is affected by the ambient temperature and the humidity.

The value of the acceleration due to gravity (also called gravitational acceleration or gravitational constant) is not a constant at all. Due to the earth’s rotation around its axis and its topography and the inhomogeneous mass distribution in the earth’s interior, the gravitational constant varies regionally by a few per mille. In addition, it will decrease the farther you move away from the earth’s surface. Thus, the gravitational constant decreases by about 0.001 per cent when going up to the next floor in a factory building. This is an order of magnitude that is no longer negligible for high-quality pressure balances.

Buoyancy

The ambient conditions must be checked even after a calibration has been concluded. The reason for this is that, for example, sudden changes in temperature can affect the measuring instrument in such a way that calibration becomes void, which is why a calibrated measuring instrument must be professionally stored and transported.

Fig. 10: Influences on the measurement results of a pressure balance. The effective piston area depends on temperature and pressure, A = f(T) and A = f(p):

\[
p = \left[ \frac{m \, g_l \left( 1 - \dfrac{\rho_l}{\rho_m} \right)}{A_0 \left[ 1 + (\alpha + \beta)(T - T_0) \right] \left( 1 + \lambda \, p \right)} + \left( \rho_{Fl} - \rho_l \right) g_l \, \Delta h \right] \cdot 10^{-5}
\]

p  Pressure in bar
m  Mass load
ρ_l  Air density
ρ_m  Density of the masses
g_l  Local value of gravitational acceleration
A_0  Piston area under standard conditions
α, β  Temperature coefficients
T  Piston temperature
T_0  Reference temperature (293.15 K)
λ  Pressure distortion coefficient
ρ_Fl  Density of the pressure medium
Δh  Height difference
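The following sketch shows how such a correction formula can be evaluated numerically. The function name and all parameter values are illustrative assumptions, not data from the book; the factor 10⁻⁵ converts pascal to bar. Because the pressure appears on both sides via the distortion term (1 + λp), a short fixed-point iteration is used.

```python
# Minimal sketch of the pressure-balance equation from Fig. 10 (illustrative values only).
def pressure_balance_bar(m, g_l, rho_l, rho_m, A0, alpha, beta, T, T0, lam, rho_fl, dh):
    """Return the generated pressure in bar for a deadweight tester."""
    p = 0.0  # initial guess in bar
    for _ in range(20):  # converges in a few steps for realistic values
        force = m * g_l * (1.0 - rho_l / rho_m)                           # buoyancy-corrected force in N
        area = A0 * (1.0 + (alpha + beta) * (T - T0)) * (1.0 + lam * p)   # effective piston area in m^2
        head = (rho_fl - rho_l) * g_l * dh                                # height-difference term in Pa
        p = (force / area + head) * 1e-5                                  # Pa -> bar
    return p

# Example with plausible values for roughly a 5 bar load on a 1 cm^2 piston:
p = pressure_balance_bar(
    m=5.097, g_l=9.80665, rho_l=1.2, rho_m=7920.0, A0=1.0e-4,
    alpha=4.5e-6, beta=4.5e-6, T=293.65, T0=293.15, lam=8.0e-7, rho_fl=1.2, dh=0.0,
)
print(f"{p:.4f} bar")
```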


    Calibration sequences

Selection of the calibration points

In general when a measuring instrument is used, its measuring range is only partially utilised. The calibration points must be selected such that they are within the working range of the inspection equipment. In calibration, at least the upper and lower process points must be considered. To check the linearity of the test item, at least one more checkpoint lying between these end points is measured. To compare the measured values of the measuring instrument and the reference or working standard, the measurement parameter can, in many cases, be adjusted either according to the displayed values of the test item or according to the readings of the standard.

In many cases, the number of measuring points for the calibration of a measuring instrument is not defined in a standard. In this case, close coordination between the calibration laboratory and the user is required in order to define suitable test points.

Calibration of pressure measuring instruments

The calibration of pressure measuring instruments is regulated in great detail. The document DKD-R 6-1 of the German Calibration Service DKD describes three calibration sequences for different accuracy classes (Table 1).

Table 1: Calibration sequences as per DKD-R 6-1 (Source: DKD-R 6-1, p. 11)

Calibration sequence | Desired accuracy class | Number of measuring points | Number of pre-loads | Measurement series up | Measurement series down
A | < 0.1 | 9 | 3 | 2 | 2
B | 0.1 to 0.6 | 9 | 2 | 2 | 1
C | > 0.6 | 5 | 1 | 1 | 1

Calibration sequence A

The calibration sequence designated by A for the accuracy class < 0.1 prescribes three loadings up to the full scale value, to be completed prior to the actual measurement series (Fig. 11 top). The maximum pressure is held for at least 30 seconds and is then completely released again.

Over the course of the first measurement series, nine measuring points evenly distributed over the measuring range are measured by gradually increasing the pressure from “low to high”. The zero point counts as the first measuring point, but is only included in the statistical evaluation if it is free-moving, i.e. if the pointer movement is not limited by a stop pin. The pressure must be increased at such a low rate that the intended measuring point is not exceeded, since due to hysteresis (see p. 51 ff.), this could lead to a falsification of the results. If necessary, the pressure must be sharply reduced again to ensure that the measuring point is reached from below in all instances. The value reached is read after a holding time of at least 30 seconds. This holding time is necessary as the value displayed for the test item is not adopted directly, but only reached after a certain relaxation time (Fig. 11 bottom right). With tube pressure gauges, the housing should be given a light knock prior to reading in order to minimise friction effects on the movement.

The final value at the last test point is also read and documented after 30 seconds. After two minutes at constant pressure, this value is read a second time. With tube pressure gauges, the maximum pressure should even be held for five minutes. In contrast, with piezoelectric sensors, the holding times can be reduced. In the second measurement series, the same measuring points are measured from “top to bottom”, starting with the maximum pressure. To do so, the pressure is lowered such that it does not fall below the respective intended value. At the end of this first measurement cycle, the zero point is measured. For this, the instrument should remain depressurised for two minutes.

Calibration sequence A requires that the described cycle of increasing and decreasing the pressure by degrees is repeated once. If clamping effects are observed in pressure measuring instruments with large measuring ranges or exposed flush diaphragms, a third measurement cycle can be carried out in order to detect any dependence of the zero signal on the tightening torque − an effect often observed with low-cost electrical sensors.

Fig. 11: Illustration of the calibration sequences as per DKD-R 6-1 (Source: DKD-R 6-1, p. 12). The diagram plots pressure against time for sequences A, B and C, showing the pre-loads, the measurement series M1 to M6, the 30 s reading times, the 2-minute hold at the full scale value FS (5 minutes for Bourdon tube pressure gauges), the additional repeated measurement during the second setting and the zero-point adjustment.
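For orientation, the sequence parameters from Table 1 can be captured in a small lookup. This is only an illustrative sketch, not part of DKD-R 6-1 itself.

```python
# Illustrative lookup of the DKD-R 6-1 calibration sequences summarised in Table 1.
def calibration_sequence(accuracy_class: float) -> dict:
    """Return measuring points, pre-loads and up/down series for a desired accuracy class."""
    if accuracy_class < 0.1:
        return {"sequence": "A", "points": 9, "preloads": 3, "series_up": 2, "series_down": 2}
    if accuracy_class <= 0.6:
        return {"sequence": "B", "points": 9, "preloads": 2, "series_up": 2, "series_down": 1}
    return {"sequence": "C", "points": 5, "preloads": 1, "series_up": 1, "series_down": 1}

print(calibration_sequence(0.25))  # sequence B: 9 points, 2 pre-loads, 2 up / 1 down
```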


Calibration sequence B

Calibration sequence B (Fig. 11 centre), which is used for pressure measuring instruments of accuracy classes 0.1 to 0.6, requires less effort: there are only two preloads up to the maximum value, and the third measurement series ends shortly after the pressure has reached the maximum value. From there, the pressure is discharged immediately down to the zero value.

Calibration sequence C

Calibration sequence C can be applied to pressure measuring instruments of accuracy class > 0.6 (Fig. 11 bottom). It only requires one preload up to the maximum value and only one measurement series, which, including the zero value and the maximum value, consists of five measuring points and in which the pressure is increased by degrees and then lowered again.

Calibration of temperature measuring instruments

Calibration at a fixed point ...

Generating an accurately defined temperature is much more complicated than generating a particular pressure. This is why in the calibration of temperature measuring instruments, such extensive measurement series are usually not performed. In many cases, thermometers are only calibrated at a single fixed point such as the triple point of water. To adjust the temperature measuring instrument, its characteristic curve (usually very well known) is shifted upwards or downwards in such a way that the instrument displays the correct value at the fixed point.

... or in a measurement series

Measurement series are feasible when a temperature measuring instrument is not to be calibrated at a fixed point, but by comparison with a higher-quality measuring instrument. This is possible, for example, in immersion baths or ovens.
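A minimal sketch of the single-point adjustment described above, assuming an additive shift of the characteristic curve; the function names are illustrative and not taken from the book.

```python
# Single-point offset adjustment at the triple point of water (0.01 °C): the known
# characteristic curve is shifted so that the instrument reads the fixed-point value.
TRIPLE_POINT_WATER_C = 0.01

def offset_adjustment(reading_at_fixed_point_c: float) -> float:
    """Return the additive correction determined at the fixed point."""
    return TRIPLE_POINT_WATER_C - reading_at_fixed_point_c

def corrected(reading_c: float, offset_c: float) -> float:
    """Apply the fixed-point offset to an arbitrary reading."""
    return reading_c + offset_c

offset = offset_adjustment(reading_at_fixed_point_c=0.043)  # thermometer reads 0.043 °C in the cell
print(corrected(25.120, offset))                             # shifted reading at another temperature
```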


When performing such a measurement series, make sure that:

• enough time is available for the test item to adopt the temperature of the reference instrument
• the environment supplies a homogeneous temperature distribution over space and time, so that the temperature at the reference instrument has the same value as at the test item
• the immersion depth is sufficient and at least ten times the sensor tube diameter, to prevent heat from escaping to the environment.

Software validation

Since the measured data is increasingly recorded by computers, it is necessary also to check the software when calibrating a measuring chain. In particular, malfunctions may occur when modifications are made to the system environment. However, this is not referred to as calibration, but validation, i.e. an inspection that checks whether a process can fulfil the intended purpose.


    Reference instruments

Depending on the application and measuring accuracy of the measuring instrument to be calibrated, a suitable reference instrument is required. This chapter describes reference instruments which can be used as reference standards or working standards and as portable calibration instruments, using the measurement parameters pressure and temperature as an example.

Primary standard

The highest metrological requirements are met by primary standards whose measured values are recognised without reference to other standards, for example, by determining the measurement parameter via the parameters specified in its definition or via fixed points given by a natural law. Primary standards are usually used by national metrological institutes or accredited calibration laboratories. They serve as reference standards that can be used for calibrating the working standards or factory standards used in in-house calibration laboratories.

Hierarchy according to measurement uncertainty

It must be possible for each measuring instrument to be traced to a national standard by means of a chain of comparative measurements. In this chain, the measurement uncertainty of each reference instrument should be three to four times less than that of the measuring instrument to be calibrated. This also means that in each intermediate step, a loss in accuracy takes place and the measuring accuracy of the measuring instruments used in production is limited by their position in the calibration hierarchy. While, for example, pressure measuring instruments equipped with a pressure balance and used in the national institutes or calibration laboratories as primary standards achieve a measuring accuracy of 0.001 per cent of the measured value, portable pressure measuring instruments typically only achieve 0.2 per cent. In addition to more complicated measurement technology, a higher measuring accuracy also requires more effort and time during calibration.
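A minimal sketch of the "three to four times" rule mentioned here, useful as a quick plausibility check when choosing a reference; the threshold and the example values are illustrative.

```python
# Check whether a reference instrument is accurate enough to calibrate a given test item,
# following the rule of thumb that its uncertainty should be 3 to 4 times smaller.
def reference_is_adequate(ref_uncertainty: float, dut_uncertainty: float, min_ratio: float = 4.0) -> bool:
    """True if the reference uncertainty is at least `min_ratio` times smaller than the test item's."""
    return dut_uncertainty / ref_uncertainty >= min_ratio

# A 0.025 % reference for a 0.2 % portable instrument gives a ratio of 8 -> adequate.
print(reference_is_adequate(ref_uncertainty=0.025, dut_uncertainty=0.2))
```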

    Reference standards

Reference standards for the calibration of pressure measuring instruments

Pressure balances

The reference standards used for the calibration of pressure measuring instruments are pressure balances, which are also referred to as deadweight testers (Fig. 12). They measure the physical quantity pressure with respect to its definition as force per unit area. From that point of view, pressure balances are considered as primary standards.

Fig. 12: Deadweight tester

The core of the pressure balance is a very accurately manufactured piston/cylinder system with exactly measured cross-section. As the pressure transmission medium, the cylinder contains either a dry, purified, particle-free gas (such as nitrogen or synthetic air) or a hydraulic fluid. Since the gap between the piston and the cylinder is smaller than 1 µm, the system is sealed, even under high pressure, and has good running characteristics. Both the piston and the cylinder are manufactured from highly resistant tungsten carbide. This guarantees that the piston surface does not change in relation to temperature or pressure. A solid, stainless steel housing protects the piston/cylinder system effectively from contact, impacts or dirt from outside.

Operating principle

Via the piston, a force acts on the medium inside the cylinder from which the pressure inside the medium can be calculated − due to the known cross-section of the piston. This force is made up of the weight of the piston, the weights of the masses (calibrated with high accuracy and which are stacked on top of a bell-shaped device or a plate on the piston) and the weight of the bell or the plate (Fig. 13). The bell also serves to displace the centre of gravity of the assembly downwards so that the measurement can be more stable. The total mass made up of the piston, the applied weights and the bell or plate (also referred to as the weight applied) is proportional to the measured pressure.

Fig. 13: Mass load, including trim masses of a few milligrams

Each pressure balance includes a set of numbered weights in the form of metal discs. The respective weights and the pressure values resulting from them are listed in the calibration certificate (Table 2). Depending on the desired pressure to be measured, the appropriate weights are selected as required.

Table 2: Characterisation of the weights of a pressure balance (extract from calibration certificate)

Designation of the weight | No. | Actual mass in kg | Pressure value for the system in bar
Piston | 1262 | 0.08160 | 0.4002
Overhang (bell jar) | 1 | 0.81560 | 3.9998
Plate | 2 | 0.05097 | 0.2499
Mass | 3 | 1.01954 | 5.0000
Mass | 4 | 1.01954 | 5.0000
Mass | 5 | 1.01954 | 5.0000
Mass | 6 | 1.01954 | 5.0000
Mass | 7 | 1.01954 | 5.0000
Mass | 8 | 1.01954 | 5.0000
Mass | 9 | 1.01954 | 5.0000
Mass | 10 | 1.01953 | 5.0000
Mass | 11 | 1.01952 | 4.9999
Mass | 12 | 0.50976 | 2.5000
Mass | 13 | 0.20391 | 1.0000
Mass | 14 | 0.20391 | 1.0000
Mass | 15 | 0.12234 | 0.6000
Mass | 16 | 0.10196 | 0.5000
Mass | 17 | 0.07137 | 0.3500
Mass | 18 | 0.05098 | 0.2500

Considering the local gravitational acceleration

The fact that the acceleration due to gravity, g, is subject to local fluctuations that are not insignificant can be taken into account when initially preparing the weights applied. If at the place where the pressure balance will be used, g is, for example, 0.25 per cent above the standard value of 9.80665 m/s², the weights are made 0.25 per cent “lighter”. Alternatively, a deviation in the gravitational constant can also be corrected via the equation shown in Figure 10, p. 19, or else via a simpler equation that exclusively takes the dependence on the gravitational constant into account.

Pressure supply

To apply the exact test pressure predetermined by the weights applied to the system − made up of the pressure measuring instrument (sensor) to be tested and the piston − a pump integrated into the pressure balance or an external pressure supply is used. For fine adjustment, an adjustable volume regulated by a precision spindle is available. The approximate pressure can be read off a test pressure gauge mounted on the pressure balance. Once the desired test pressure has almost been reached, the rotary knob at the spindle is rotated and thus medium from the spindle is fed in until the piston along with the weights applied starts “floating”. To minimise frictional forces, the system is carefully set into rotary motion at the weights applied. The pressure acting in the cylinder, and thus also on the pressure measuring instrument to be calibrated, then corresponds exactly to the quotient of the weight of the total mass applied divided by the piston cross-section. It remains stable for several minutes, also allowing any adjustment tasks taking a longer time to be carried out on the test item without problems.

Piston-cylinder systems

To allow calibration in different pressure ranges, different piston/cylinder systems can be selected (Fig. 14), such as pneumatic systems for pressures from 1 bar up to typically 100 bar. For higher pressures of up to typically 1000 bar or in special cases up to 5000 bar, hydraulic systems are used. In this case, the oil used serves simultaneously as a lubricant for the piston/cylinder unit and optimises the running characteristics.

Fig. 14: Piston-cylinder system of a pressure balance

Connecting the calibration item

Depending on the manufacturer, a wide range of pressure measuring instruments can be connected to the pressure balance via a standard male thread or a quick-release connector. This makes it possible to cover large pressure ranges with an accuracy of up to 0.004 per cent of the measured value. Thus, pressure balances are the most accurate instruments for the calibration of electronic or mechanical pressure measuring instruments. Due to their excellent long-term stability of five years – according to the recommendations of the DKD – they have been in use as reference standards for many years, not only at national metrological institutes, but also in calibration and factory laboratories.
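A minimal sketch of the local-gravity correction of the weights described above; the mass value and the 0.25 per cent deviation are illustrative.

```python
# Weights are manufactured "lighter" or "heavier" so that m * g_local produces the force
# that the nominal mass would produce under standard gravity.
G_STANDARD = 9.80665  # m/s^2

def adjusted_mass(nominal_mass_kg: float, g_local: float) -> float:
    """Mass that generates the nominal force under the local gravitational acceleration."""
    return nominal_mass_kg * G_STANDARD / g_local

# If g_local is 0.25 % above the standard value, the weight is made about 0.25 % lighter:
print(adjusted_mass(1.01954, G_STANDARD * 1.0025))  # ~1.0170 kg
```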

Reference standards for the calibration of temperature measuring instruments

Temperature scales are established against fixed points for certain substances. It is well known that the Celsius scale is defined by the melting point and boiling point of water and a further 99 equidistant scale values lying in between.

Gas-actuated thermometers

In metrological institutes, the temperature can be determined by gas thermometers. These measuring instruments are based on the fact that the vapour pressure of liquids depends very strongly on temperature. To measure an absolute temperature, either the temperature-dependent difference in volume at constant pressure or the temperature-dependent pressure differential at constant volume relative to the corresponding value at a calibration point is determined. However, this method of temperature measurement is fairly complicated.

Fixed point cells

This is why cells in which fixed points of high-purity substances can be set are more suitable for the calibration of temperature measuring instruments. As a function of temperature and pressure, substances exist in the three classical physical states − solid, liquid and gas. Phase transitions, for example from solid to liquid, can be used for calibration since at constant pressure, the temperature of a substance remains constant until the phase transition is complete, i.e. until all the ice of a water/ice mixture has become liquid.

Triple point cells

Apart from the phase transitions between two states of matter, for some substances triple points are also used as fixed points for calibration. At the triple point, the three conventional phases (solid, liquid and gas) of a high-purity substance are present in a thermal equilibrium. Figure 15 shows the phase diagram of water and the position of its triple point. Triple points can be set very accurately and reproduced at any time. Moreover, they can be maintained over a longer period of time.

Fig. 15: Phase diagram of water (pressure versus temperature, showing the solid, liquid and gaseous regions, the melting and boiling points at 1.013 · 10⁵ Pa, the triple point at 0.01 °C and 6.116 · 10² Pa, and the critical point at 374 °C and 2.206 · 10⁷ Pa)

For the purpose of calibration, in 1990 the CIPM (International Committee for Weights and Measures, abbreviation of French: Comité International des Poids et Mesures) established the ITS-90 (International Temperature Scale of 1990), which defines temperatures via predefined temperature fixed points (Table 3).

Table 3: Selected fixed points as per ITS-90

Fixed point | Temperature in °C
Triple point of hydrogen | –259.3467
Triple point of neon | –248.5939
Triple point of oxygen | –218.7916
Triple point of argon | –189.3442
Triple point of mercury | –38.8344
Triple point of water | 0.01
Melting point of gallium | 29.7646
Melting point of indium | 156.5985
Melting point of tin | 231.928
Melting point of zinc | 419.527
Melting point of aluminium | 660.323
Melting point of silver | 961.78
Melting point of gold | 1064.18
Melting point of copper | 1084.62

Calibration with the water triple point cell

For the calibration of thermometers, the water triple point cell has the greatest importance (Fig. 16). To be able to use this fixed point cell for calibrating thermometers, it first must be cooled down to –6°C in a liquid bath. If this is done slowly enough, the water does not form ice in the cell, but rather remains liquid. Only after shaking the cell carefully does the water freeze completely from top to bottom within seconds. A closed ice jacket is formed around the inner tube. The water triple point cell is placed into a Dewar vessel which has been filled with crushed ice to a point just below the upper edge of the inner tube to prevent it from heating up too quickly. After a certain time, over a period of up to eight hours – depending on the number of calibrated thermometers – a constant temperature of +0.01°C (273.16 K) is obtained in the cell.

Fig. 16: Calibration with the water triple point cell

To calibrate thermometers by means of the triple point cell, a few millilitres of alcohol are poured into the inner tube for better heat transfer. Then, for example, platinum resistance thermometers can be inserted into the inner tube. After a few minutes, they will have adopted the temperature of the triple point, at which point the recording of the measured values displayed by them can begin.

Working standards

Working standards are usually calibrated by means of a reference standard. They are used for the quick and effective calibration of measuring instruments in calibration laboratories and factory laboratories.

Working standards for the calibration of pressure measuring instruments

The cost-effectiveness of the calibration of pressure measuring instruments and of their documentation depends on the level of automation possible (Fig. 17). The process can be automated to a high degree with measuring instruments that have an electronic output, as in this case, a higher-ranking evaluation instrument can read the signal of the measuring instrument and communicate simultaneously with a pressure controller (Fig. 18), operating as a working standard.

Fig. 17: Automated calibration of pressure measuring instruments

Pressure controllers

Pressure controllers are devices controlled via a keyboard or PC connection that are capable of quick and stable delivery of a desired pressure. Depending on their design, these instruments work in the high-precision pressure range from approx. 25 mbar and high pneumatic pressure ranges of up to 400 bar up to ultra-high pressure applications of 2000 bar and more. Owing to the high compressibility of gases, a hydraulic system is used from approx. 400 bar upwards.

Fig. 18: Example of a precision pressure controller

Pressure controllers generally consist of a reference pressure sensor (typically a piezo-resistive sensor) and a valve unit regulated via electronic control. To ensure that the output of a pressure controller is capable of delivering an exactly defined pressure, the instrument must be supplied from outside with an initial pressure above the desired maximum final value. If pressures below atmospheric pressure are also to be measured, an external vacuum supply, for example a pump, must also be available. Since the pressure measurement takes place electronically, no complicated mathematical corrections are necessary, in contrast to pressure balances. Pressure controllers achieve measurement uncertainties of at least 0.008 per cent of the measuring span.

The valve unit can have different designs: a simple 2-stage system, which only has the positions “open” and “closed”, and a more precise needle system, which enables a continuously variable larger or smaller aperture for the gas (Fig. 19). Although the desired pressures to be measured can usually be obtained a little more quickly using the 2-stage system, the continuously variable control has several advantages: applying the pressure more slowly reduces the mechanical stress on the measuring instrument to be calibrated. Since the speed of pressure build-up can be reduced near the measuring point, the risk of overshooting the desired pressure value is reduced. Its very precise control also makes it possible to compensate any slight losses in pressure caused by leaks in the tube system. The control characteristics can be affected not only by the needle geometry, but also by the selection of materials. Suitable ceramic valve systems ensure, for example, that control is achieved independently of temperature and thus in a more stable manner.

Fig. 19: Valve unit with needle system

Pressure controllers for automated calibration

It is not only in industrial plant that pressure measuring instruments need to be calibrated. Pressure sensors have also found their way into motor vehicles and washing machines. What is required in such calibration applications is less high accuracy, but rather high speed. Accordingly, there are also pressure controllers that have adaptive control algorithms, which by means of the temperature-insensitive ceramic valve system, are capable of moving to measuring points fully automatically within a few seconds. This allows five measured values of a pressure sensor whose job it is to determine the filling level of a washing machine, for example, to be checked within one minute (Fig. 20). Pressure controllers of this type still reach accuracies of up to 0.025 per cent. Frequently, overshoot-free control is sacrificed in favour of speed.

Fig. 20: Pressure controller with temperature-independent ceramic valve system
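Very roughly, the overshoot-avoiding behaviour described above can be pictured as a control loop that takes smaller steps the closer it gets to the setpoint. The sketch below is purely illustrative and does not represent any real controller's algorithm or tuning.

```python
# Toy setpoint approach: large valve opening far from the target, small opening near it,
# so the measuring point is reached from below without overshoot.
def step_towards_setpoint(current_bar: float, target_bar: float,
                          max_step_bar: float = 0.5, gain: float = 0.2) -> float:
    """Return the pressure after one control step."""
    error = target_bar - current_bar
    step = max(-max_step_bar, min(max_step_bar, gain * error))  # smaller steps near the target
    return current_bar + step

p = 0.0
for _ in range(40):
    p = step_towards_setpoint(p, target_bar=10.0)
print(round(p, 3))  # approaches 10 bar from below, without overshoot
```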


Working standards for the calibration of temperature measuring instruments

The calibration of thermometers at fixed points is very complicated. A much easier and quicker method is that of comparative measurement. This is done by exposing the thermometers to be calibrated together with a calibrated thermometer as a working reference to a constant temperature. As soon as thermal equilibrium has been reached, the temperature values can be read and any measuring deviations determined. Most calibration laboratories and factory laboratories usually use this method, also because it allows them to calibrate several thermometers simultaneously.

Standard resistance thermometers

The reference thermometers used are in general resistance thermometers. For the temperature range from −189°C to 660°C, accredited calibration laboratories require at least two 25-ohm standard resistance thermometers, depending on the operating range:

• one for temperatures from −189°C to 232°C which was calibrated at the triple points of argon and mercury listed in ITS-90 and at the fixed points of indium and tin
• one for temperatures from 0.01°C to 660°C calibrated at the fixed points of tin, zinc and aluminium.

Additionally, there are encapsulated thermometers that are used in cryostats down to −259°C and high-temperature designs up to 962°C.

Platinum resistance thermometers

However, in daily use the working standards used are not the standard resistance thermometers, but for example, platinum resistance thermometers which were calibrated by comparative measurements on these thermometers. The measuring resistor of these thermometers is a winding made of very pure platinum (Fig. 21). As their resistance changes with temperature in a defined way, the temperature can be determined via the resistor. The precision of the resistance measurement greatly affects the measurement uncertainty of the thermometer calibration. The comparison resistor required in the measurement bridge is temperature-controlled to approx. 23°C. This allows measurement uncertainties below 1 mK to be reached.

Fig. 21: Platinum coil for platinum resistance thermometers

Thermocouples

At temperatures above the operating range of resistance thermometers, the working standards used are usually thermocouples made of platinum and a platinum-rhodium alloy (Fig. 22). They are suitable for a temperature range from 0°C to 1300°C. Platinum-gold thermocouples are even more stable due to the fact that they hardly age.

Fig. 22: Thermocouple made of platinum and platinum-rhodium alloy
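A minimal sketch of how a resistance reading is turned into a temperature. It assumes the standard IEC 60751 relation for an ideal Pt100 above 0 °C; a calibrated working standard would use its own, individually determined coefficients instead.

```python
# Invert the Callendar-Van Dusen relation R(t) = R0 * (1 + A*t + B*t^2) for t >= 0 °C.
import math

R0 = 100.0      # resistance in ohm at 0 °C (Pt100)
A = 3.9083e-3   # 1/°C (IEC 60751)
B = -5.775e-7   # 1/°C^2 (IEC 60751)

def resistance_to_temperature(r_ohm: float) -> float:
    """Return the temperature in °C for a measured resistance (valid for t >= 0 °C)."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)

print(round(resistance_to_temperature(109.73), 2))  # about 25 °C
```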

    A constant temperature over time and space isessential for the calibration of thermometers.Amongst others, this can be generated byliquid baths or heating ovens. Stirred liquidbaths offer the most homogeneous spatial tem-perature distribution and a constant tempera-ture over a long period of time (Fig. 23). Their

    properties depend on the geometry of the bath,the bath liquid used and the temperature rangeto be investigated. Each bath should have suf-ficient immersion depth for the thermometersand the highest possible volume in order to beable to keep a certain temperature constantover a longer period of time, even if thermom-eters are immersed and removed in the mean-

    Liquid baths

    Fig. 23:Stirred liquid bath

  • 8/20/2019 Handbook Calibration

    42/73

    Working standards 41

Good insulation in relation to the surroundings is also important. Pumps and stirrers must not introduce any heat into the bath. The temperature control fluid must have a high thermal conductivity and low viscosity; it must be inert in the temperature range under consideration and have a low vapour pressure. Typical examples are listed in Table 4.
Each liquid bath is calibrated prior to its first use. This is done by measuring the temperature in the axial direction – from bottom to top – and the radial direction – from inside to outside – at three temperatures, i.e. at the initial, middle and final value of the measuring range, using a resistance thermometer as the working standard.
In the actual calibration, the resistance thermometers serving as working standards and the thermometers to be calibrated are introduced into the liquid (by means of a positioning device) in such a way that they are very close to each other. This minimises the effect of inhomogeneities and allows measurement uncertainties below 5 mK to be achieved.
The calibration of thermometers in the temperature range from 500°C to 1200°C requires tube furnaces (Fig. 24). The heating tube of such furnaces, well insulated from the outside, is heated electrically. The heated interior of typical tube furnaces has a diameter of 7 cm and is 50 cm long. At its centre is a compensation block made of nickel or steel containing six to eight concentric bores.


Table 4: Suitable temperature ranges of different tempering media

Temperature range in °C    Medium
−80 to 0                   Methanol and ethanol
−35 to 70                  Water-glycol mixture (60% glycol)
10 to 80                   Water
50 to 200                  Silicone oil
180 to 500                 Saltpetre


The test items and the working standards are inserted into these bores. The mass and structure of this metal block and the good insulation of the furnace ensure that the temperatures in all bores are identical and remain stable over time. Nevertheless, a tube furnace must be calibrated in the same way as a liquid bath. A particularly critical feature is the axial temperature profile, because a temperature gradient can easily form towards the inlet openings of the bores.
During calibration, care must be taken that the bore diameter is not more than 0.5 mm larger than the diameter of the thermometers, since air gaps would hinder heat transfer. This is less important above approx. 600°C, since heat transfer then takes place mainly by radiation.
As in the case of liquid baths, the test items and the standard resistance thermometers should be immersed into the block as deeply as possible; ideally their measuring resistors are at the same height.

Fig. 24: Schematic design of a tube furnace: 1 Ceramic heating pipe, 2 Metal block with bore holes, 3 Heating coil, 4 Thermal insulation, 5 Reference (standard), 6 Calibration item


Sensors that are shorter than 7 cm are best calibrated in a liquid bath, because the temperature of the furnace is no longer sufficiently constant over the whole immersion depth.

Portable pressure calibration instruments

Dismounting a measuring instrument, packing it and sending it to an external calibration laboratory is often a complex and time-consuming task. In many cases it is not even possible, for example if a production process would have to be interrupted while the instrument is removed. In such cases, on-site calibration is the only option. It also has the advantage that the measuring instrument can more easily be calibrated as part of the entire measuring chain and that its mounting position can be taken into account.
Suitable instruments are available for on-site calibration as well. Although they cannot match the low measurement uncertainty of the reference and working standards described above, they are fully adequate for most industrial processes.

Portable instruments for on-site calibration of pressure measuring instruments
On-site calibration of pressure measuring instruments requires a working standard and a pressure source (Fig. 25). This can be a compressed nitrogen gas cylinder or an external hand pump, which is available in a pneumatic version for pressures below 40 bar and in a hydraulic version for pressures up to 1000 bar. The most convenient solutions are portable pressure calibrators, which combine pressure generation and the working standard in one portable instrument. This avoids the cumbersome and time-consuming assembly of the instrument from several components and reduces the risk of leaks in the pressure system.



Depending on the version, the pressure in the instrument is built up by means of a manual or electric pump. Calibration instruments of this type make it possible to achieve measurement uncertainties down to 0.025 per cent of the measuring span.

Fig. 25: Portable pressure calibrator with accessories in a service case

Fig. 26: Calibration of a process transmitter with a portable calibrator


Many models not only measure pressures, but also ambient temperatures and the electric signals output by the measuring instruments to be calibrated (Fig. 26). This makes it possible, for example, to check the correct functioning of pressure switches or process transmitters. Pressure switches open or close a valve when a certain pressure value is exceeded. Process transmitters convert the measurement parameter of pressure into a proportional electric output signal.
When calibrating a process transmitter, the pressure connection and the electrical connection between the measuring instrument and the calibration instrument are established first. To avoid contamination of the portable standard, a dirt trap can be connected in between, since the measuring instrument may previously have been operated with contaminated media. This is followed by a zero point adjustment with the valves open. The individual pressure test points can then be established by means of the integrated pump, and the delivered electric signals can be measured.
Portable pressure calibrators are equipped with integrated data loggers, so the measured values are recorded on site. The data can be read out later at the laboratory or office by connecting the calibrator to a PC and can be displayed in a log. Some calibrators are also equipped with an electronic evaluation system, which performs an error calculation and shows the result on the display immediately after calibration.
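As an illustration of the error calculation such an evaluation system performs for a transmitter with current output, the following sketch compares the measured loop current with the current expected for the applied test pressure. The 0 to 10 bar measuring range, the 4–20 mA output and the readings are assumed example values, not data from this handbook:

    # Minimal sketch: error calculation for a pressure transmitter with 4-20 mA output.
    # Measuring range, output span and readings are assumed example values.

    SPAN_LOW, SPAN_HIGH = 0.0, 10.0   # bar, assumed measuring range of the transmitter
    I_LOW, I_HIGH = 4.0, 20.0         # mA, current output

    def expected_current(p_bar):
        """Ideal output current for the applied pressure."""
        return I_LOW + (p_bar - SPAN_LOW) / (SPAN_HIGH - SPAN_LOW) * (I_HIGH - I_LOW)

    def error_percent_of_span(p_bar, i_measured):
        """Deviation of the measured current in per cent of the output span."""
        return (i_measured - expected_current(p_bar)) / (I_HIGH - I_LOW) * 100.0

    for p, i in [(0.0, 4.02), (5.0, 12.05), (10.0, 19.95)]:
        print(f"{p:4.1f} bar: expected {expected_current(p):6.2f} mA, "
              f"measured {i:6.2f} mA, error {error_percent_of_span(p, i):+5.2f} % of span")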

Portable instruments for on-site calibration of temperature measuring instruments
On-site calibration of thermometers is usually carried out with dry-well calibrators and micro calibration baths (Fig. 27).



They are basically smaller, transportable versions of the heating ovens and liquid baths already described. In both, the reference thermometer used as a working standard is permanently built in and serves simultaneously to control the oven or bath temperature and to calibrate the test items.
Inserts with bores of different diameters can be fitted into the metal block of dry-well calibrators, enabling thermometers of different types to be tested. In on-site calibration, a thermometer measuring insert can be taken out of its thermowell fitted in the plant and placed in the calibrator, without having to interrupt the electrical connections to the electronic evaluation system. Temperature sensors are often calibrated at a single temperature only – that of the process they monitor.


    Fig. 27: Dry-well calibrator 


Calibration is carried out at a measurement uncertainty of 0.1 K to 3 K, depending on the temperature range and the properties of the calibration item.
Since all dry-well calibrators are closed at the bottom and open at the top, a temperature gradient inevitably forms in the bores, which may lead to measuring errors, especially if the temperature sensors to be calibrated are too short. They should be inserted into the calibrator as deeply as possible, i.e. about 15 cm. If the temperature sensors cannot be inserted into the bore of the calibrator by more than 7 cm, it is better to use an external reference thermometer as the working standard instead of the permanently built-in one (Fig. 28). For sensor lengths below 7 cm, micro calibration baths should be used.

Fig. 28: Portable calibrator with an external temperature sensor as reference thermometer


In contrast to the large-sized versions, micro calibration baths (Fig. 29) can only accommodate one or two test items, because the temperature of the relatively small amount of liquid cannot be kept constant as efficiently due to the heat dissipation of the thermometers. Despite the homogeneous temperature distribution in the bath, the thermometers to be calibrated must permit a minimum immersion depth; depending on their design, this should be ten to fifteen times the sensor diameter. In the range from −35°C to 255°C, micro calibration baths achieve measurement uncertainties of 0.1 K to 0.3 K.
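The immersion-depth guidance of the last two sections can be summarised in a small decision helper. The sketch below merely encodes the rules of thumb quoted above; the function name, the thresholds chosen for the built-in reference and the factor of fifteen are my own reading of that guidance, not a prescription from this handbook:

    # Minimal sketch: choose an on-site calibration set-up from the rules of thumb above.
    # The structure and thresholds are an assumed interpretation of the text.

    def suggest_setup(sensor_length_cm, sensor_diameter_cm):
        if sensor_length_cm >= 15:
            # long enough to be inserted about 15 cm deep
            return "dry-well calibrator with built-in reference"
        if sensor_length_cm > 7:
            return "dry-well calibrator with an external reference thermometer"
        # short sensors: micro calibration bath, observe the minimum immersion depth
        min_immersion = 15 * sensor_diameter_cm   # ten to fifteen times the diameter
        return f"micro calibration bath (minimum immersion depth approx. {min_immersion:.1f} cm)"

    print(suggest_setup(20.0, 0.6))
    print(suggest_setup(5.0, 0.3))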


Fig. 29: Micro calibration bath


Calibration characteristics

An analysis of the measured values obtained during the calibration sequence allows characteristic features of the measuring instrument to be determined which can be taken as an indication of its quality and its suitability for the measuring process. The determination of typical characteristics such as measuring deviation, hysteresis and repeatability is described below.

    Measuring deviation

The measuring deviation (also referred to as measuring error) indicates how far a value vd displayed or output by a measuring instrument deviates from the “true” value vt or from the “correct” value vc (Fig. 30). The true value of a measurement parameter is an imaginary value; to determine it, a measuring instrument with infinite accuracy and an infinitely fine display would have to be available. The correct value is the value measured with an error-free or – in practice – high-accuracy measuring instrument such as a primary standard.


Fig. 30: Definition and determination of the measuring deviation (output signal/displayed value plotted against pressure): FS Full scale value


In general it should be noted that the measured value can deviate from the displayed value. High-resolution digital displays may thus give the impression that a measuring instrument is particularly accurate. However, the number of displayed digits and the accuracy of the measuring instrument do not have to coincide. Ideally, the accuracy of the measuring instrument is reflected in its resolution, which should be five to ten times finer than the desired accuracy. The display of an electronic pressure measuring instrument that can determine pressure with an accuracy of ±0.1 bar should accordingly show a digital value with one or two decimal places in the unit “bar”.
According to DIN 1319, mechanical measuring instruments are divided into accuracy classes. Such an accuracy class includes all measuring instruments that meet predetermined metrological requirements, so that the measuring deviations of these instruments are within established limits. Thus, for example, the display of a pressure measuring instrument of accuracy class 0.6 deviates from the correct value by no more than 0.6 per cent of the measuring span. In contrast to instruments with a digital display, mechanical measuring instruments do not have a “fixed” resolution; depending on the spacing of the individual division marks on the measurement scale, an interpolation between 1/3 and 1/5 of the scale graduation is possible.
The measuring deviation is specified either as an absolute or a relative error. The absolute error is indicated with a sign and has the same unit as the measurement parameter:

    F = vd – vc

The relative measuring deviation is referred to the correct measured value. It is thus also indicated with a sign, but has no unit:



    f = (vd – vc)/vc · 100%
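Both formulas are straightforward to apply in software. The following sketch computes the absolute and the relative deviation for assumed example readings and, in addition, checks the absolute error against an accuracy-class limit expressed in per cent of the full scale value:

    # Minimal sketch: absolute error F = vd - vc and relative error f = (vd - vc)/vc * 100 %.
    # Readings, full scale value and accuracy class are assumed example values.

    v_correct = 4.000      # bar, value of the reference standard
    v_displayed = 4.018    # bar, reading of the test item
    full_scale = 10.0      # bar
    accuracy_class = 0.6   # permissible deviation in per cent of the full scale value

    F = v_displayed - v_correct                          # absolute error, in bar
    f = (v_displayed - v_correct) / v_correct * 100.0    # relative error, in per cent

    limit = accuracy_class / 100.0 * full_scale
    print(f"F = {F:+.3f} bar, f = {f:+.2f} %")
    print("within the class limit" if abs(F) <= limit else "outside the class limit")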

A distinction must be made between systematic and random measuring deviations. Systematic measuring deviations are unidirectional, i.e. they have a magnitude not equal to zero and a sign. Systematic deviations therefore always result in errors in the measuring result. Although known systematic measuring deviations can be compensated by means of mathematical methods or empirical data, such a “calibration without adjustment” is carried out only rarely in practice, so that their contribution to the measurement uncertainty (see p. 55 ff.) is usually included in the calculation. Random measuring deviations are not unidirectional. Even if a measurement is repeated under the same conditions, the displayed measured values are scattered around the correct value in magnitude and sign. Random measuring deviations thus lead “only” to uncertainties in the result. If it were possible to carry out an infinite number of repetitions, the random measuring deviations would average out, allowing the true value to be determined. In practice, the measurement uncertainty of the measuring instrument is estimated from a finite number of repetitions by means of statistical methods.

    Hysteresis

In connection with measuring instruments, the term hysteresis (according to DIN 1319 also referred to as reversal error) means that the value displayed by the test equipment depends not only on the input parameter (for example the applied pressure or the prevailing temperature), but also on the values of the input parameter that were measured previously.



Figure 31 shows the values output by a pressure measuring instrument both on increasing pressure (yellow) and on decreasing pressure (blue); the latter were determined only after the maximum value had been reached. Despite otherwise identical conditions, the measuring instrument shows different values at identical pressures. The correct measured value lies in the range between the two linear compensating curves and cannot be specified more precisely. One reason for hysteresis can be too slow a relaxation process in the measuring instrument. Such a process takes place, for example, when the Bourdon tube of a mechanical pressure measuring instrument does not react completely elastically to an elongation.
If an evaluation of hysteresis is desired, it must be ensured that the intended pressure to be measured is approached “asymptotically”: the measured value must be reached without exceeding the nominal value, since otherwise the hysteresis effects would be falsified.


Fig. 31: Definition and determination of the hysteresis (displayed values on increasing and on decreasing pressure): FS Full scale value


If a nominal value is accidentally exceeded, the test pressure must first be reduced well below the nominal value before the pressure to be measured is approached again from the correct test direction.

    Repeatability

Repeatability is a measure of how well a measuring result can be reproduced if it is determined again under the same conditions. In contrast to hysteresis, the input parameter must also follow the same curve. Figure 32 shows two measurement series on the same pressure measuring instrument. In both, the pressure is gradually increased at a uniform rate; nevertheless, the displayed values differ from one another.
The characteristic repeatability is also helpful when the values of the test item depend on the mounting state. In particular, when the display of a pressure measuring instrument depends on the tightening torque of the threaded connection, repeatability can take the resulting uncertainty contribution in the later application into account.
The determination of repeatability requires a high repeatable accuracy.


Fig. 32: Definition and determination of the repeatability (two measurement cycles at increasing pressure): FS Full scale value


To determine it, the same measurement procedure must be repeated several times by the same person, on the same measuring instrument, in the same place and under the same experimental conditions; moreover, the independent individual measurements must be carried out within a very short time span. The higher the repeatable accuracy, the smaller, as a rule, the scatter of the measured values and thus the measurement uncertainty of the instrument.

Determination of the characteristics in practice

With a suitable calibration sequence, it is possible to determine the measuring deviation Δp, the hysteresis h and the repeatability b. The measuring deviation at a certain pressure corresponds to the difference between the mean value of all measured values of the test item and the pressure indicated by the reference instrument (reference or working standard). The hysteresis at a certain pressure is obtained from the difference between the individual measured values obtained at increasing and at decreasing pressure. The repeatability at a certain pressure corresponds to the difference between the measured values obtained in repeated measurement series in the same test direction (for example, both at increasing pressure). The three calibration sequences described on page 20 ff. allow different evaluations: sequence A allows hysteresis and repeatability to be determined twice each, sequence B gives repeatability and hysteresis once each (Fig. 33), while sequence C gives hysteresis only once, but not repeatability.
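A rough sketch of such an evaluation for a single test point is shown below. The readings are invented example values, and the naming of the series (two at increasing pressure, one at decreasing pressure) only illustrates the idea of comparing measurement series; it is not the full DKD-R 6-1 procedure:

    # Minimal sketch: evaluation of one test point from several measurement series.
    # All readings in bar; the values are invented example data.

    reference = 10.000           # pressure applied via the reference standard
    up_1, up_2 = 10.006, 10.004  # readings at increasing pressure (two series)
    down_1 = 10.012              # reading at decreasing pressure

    mean_value = (up_1 + up_2 + down_1) / 3
    deviation = mean_value - reference      # measuring deviation
    hysteresis = abs(down_1 - up_1)         # increasing vs. decreasing pressure
    repeatability = abs(up_2 - up_1)        # repeated series in the same test direction

    print(f"measuring deviation: {deviation:+.4f} bar")
    print(f"hysteresis:          {hysteresis:.4f} bar")
    print(f"repeatability:       {repeatability:.4f} bar")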


Fig. 33: Calibration sequence B as per DKD-R 6-1 with evaluation: comparison of the measurement series M1, M2 and M3 shows hysteresis and repeatability


Measurement uncertainty

The measurement uncertainty can be determined from measuring results by statistical methods and technical-scientific analyses. Manufacturers of measuring instruments often prefer the term accuracy, which has a positive connotation; however, this is a purely qualitative characterisation. A quantitative characterisation, i.e. one by means of a numeric value, can only indicate by how much the values displayed on an instrument may deviate from the correct value.

    Basics according to GUM

The term measurement uncertainty is defined in the International Vocabulary of Metrology as a characteristic assigned to the result of a measurement, for example in calibration, which characterises the range of values that can sensibly be attributed to the measurement parameter on the basis of the completed measurement. Figuratively speaking, the measurement uncertainty indicates a range around the measured value in which the correct value lies with high probability, i.e. it expresses how well the result obtained reflects the value of the measurement parameter. This can be of importance, for example, when products are to be checked as to whether they meet an established specification. If, for example, a component must not be longer than 10.0 mm and the measurement gives 9.9 mm with a measurement uncertainty of ±0.2 mm, then the requirement might not be fulfilled.
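A small sketch of this kind of conformity check, using the numbers of the example above:

    # Minimal sketch: conformity statement taking the measurement uncertainty into account.
    # Numbers follow the example in the text.

    limit = 10.0       # mm, upper specification limit
    measured = 9.9     # mm, measured value
    U = 0.2            # mm, measurement uncertainty of the result

    if measured + U <= limit:
        print("conforming: the whole uncertainty interval lies within the limit")
    elif measured - U > limit:
        print("non-conforming: the whole uncertainty interval lies outside the limit")
    else:
        print("no clear statement possible: the uncertainty interval straddles the limit")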



In general, measured values are always scattered – even under identical measurement conditions – around an empirical mean value. This arithmetic mean is calculated by adding up the individual measured values and dividing the result by the number of measurements. To describe the scatter of the measured values to a good approximation, rectangular or Gaussian distributions are typically used. In a rectangular distribution (Fig. 34), the probability of measuring the parameter x is constant within the interval from a to b and zero outside it. In a Gaussian distribution with its bell-shaped density function (Gaussian curve), which is typical for random measuring deviations, the probability of obtaining a measured value x becomes smaller the farther x lies from the mean value x̃ (Fig. 35).


Fig. 34: Illustration of a rectangular distribution (probability density plotted against the measured value x): x̃ mean value, a lower limit value, b upper limit value

Fig. 35: Illustration of a Gaussian distribution (probability density plotted against the measured value x): x̃ mean value, σ standard deviation


The characteristic parameter of the Gaussian distribution is the standard deviation or its square, the variance. Both are a measure of how broad the distribution is and thus of how large the scatter of the individual measurements is around their mean value.
The measurement uncertainty u corresponds to the standard deviation. In a Gaussian distribution the standard deviation is designated σ; for a rectangular distribution with the limits a and b it amounts to 1/√3 · (b − a)/2. The expanded measurement uncertainty is obtained from the measurement uncertainty by multiplication by a factor k; in industry, this factor is 2 in most cases. In a Gaussian distribution, more than 95 per cent of all measuring results lie in the interval x̃ − 2σ to x̃ + 2σ.
To enable accredited calibration laboratories to determine the measurement uncertainty according to identical criteria, the ISO/BIPM (International Bureau of Weights and Measures) published a guideline that establishes a procedure which has meanwhile also given the guideline its name: GUM (Guide to the Expression of Uncertainty in Measurement). The GUM was translated by the German Calibration Service DKD under the name DKD-3 (German title: “Angabe der Messunsicherheit bei Kalibrierungen”) as a guideline for German calibration laboratories and was made tangible, for example in the DKD-R 6-1 guideline, by means of application examples specifically for pressure metrology.



The basic idea of the GUM is to establish a model that describes the measurement with sufficient accuracy. This model sets the measurement parameter in relation to the input parameters in order to determine the total measurement uncertainty utotal from the individual measurement uncertainty contributions ui. A simple example is the pressure as the quotient of force and area, in which the force is defined as the product of the mass (for example the applied weights of a pressure balance) and the acceleration due to gravity g. None of the three parameters (area, mass, g) can be determined exactly; this is why all of them contribute to the total measurement uncertainty. A more detailed analysis can include, as shown in Figure 10, p. 19, further ambient conditions such as the temperature.
The GUM describes two methods of determining the measurement uncertainty of the input parameters. Type A is a statistical analysis in which, under identical conditions, at least ten measurements are carried out and the arithmetic mean and standard deviation are determined from the measured values (a short sketch of this evaluation follows the list below). In type B, the measurement uncertainties of the input parameters are determined from scientific knowledge, for example from the following information:

• data from previous measurements
• general knowledge and experience regarding the properties and the performance of measuring instruments and materials
• manufacturer’s instructions
• calibration certificates or other certificates
• reference data from manuals.
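A minimal sketch of a type A evaluation is shown below. The ten readings are invented example values; the standard uncertainty of the mean is taken as the empirical standard deviation divided by the square root of the number of measurements, which is the usual treatment:

    # Minimal sketch: type A evaluation of repeated measurements (invented readings in bar).
    from math import sqrt
    from statistics import mean, stdev

    readings = [10.003, 10.005, 10.002, 10.006, 10.004,
                10.003, 10.005, 10.004, 10.002, 10.006]

    n = len(readings)
    x_mean = mean(readings)
    s = stdev(readings)        # empirical standard deviation (divisor n - 1)
    u_a = s / sqrt(n)          # standard uncertainty of the mean value

    print(f"mean = {x_mean:.4f} bar, s = {s:.5f} bar, u(type A) = {u_a:.5f} bar")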

    Measurement uncertainty budget

Possible sources of measurement uncertainties during calibration are, for example, inherent in the:
• procedure: method, instrument type, number and duration of measurements, measurement aids ...
• calibration item: repeatability, quality of calibration, evaluation software ...
• ambient conditions: temperature, atmospheric pressure, gravitational constant ...
• reference instrument: accuracy, use under application conditions ...
• operator: experience, care, handling ...



Figure 36 shows all influence quantities that play a role in the calibration of a pressure measuring instrument.
Once all sources of measurement uncertainties during calibration have been established and evaluated quantitatively, a measurement uncertainty budget is drawn up. It lists, in tabular form, the relevant standard deviations together with a note of the method used to determine them. Typical characteristics relevant for the analysis of the measurement uncertainty are the measuring deviation, hysteresis and repeatability discussed in Calibration characteristics, p. 49 ff.; they also include display fluctuations and zero offsets.
To determine the total measurement uncertainty for non-correlated (i.e. independent) input parameters, the squares of the individual measurement uncertainties are added up and the square root is taken of the sum. If individual input parameters depend on one another, more complicated analytical and numerical calculation methods are required.
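For uncorrelated input parameters, this root sum of squares can be written out in a few lines. The individual contributions in the sketch are invented example values; the expanded uncertainty uses the coverage factor k = 2 mentioned earlier:

    # Minimal sketch: combining uncorrelated standard uncertainties (invented values in bar).
    from math import sqrt

    contributions = {
        "reference standard u(N)": 0.0010,
        "repeatability u(b)":      0.0004,
        "hysteresis u(h)":         0.0006,
        "resolution u(r)":         0.0003,
        "zero offset u(f0)":       0.0002,
    }

    u_total = sqrt(sum(u ** 2 for u in contributions.values()))
    U = 2 * u_total   # expanded measurement uncertainty with k = 2

    print(f"u_total = {u_total:.4f} bar, U (k = 2) = {U:.4f} bar")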


Fig. 36: Influencing parameters during calibration of a pressure measuring instrument – from the pressure standard (e.g. local gravity, air density, temperature, drift, reference level), the piping system (separator, hoses, fittings, valves, pressure transmission medium, height difference, reference level), the sensor (zero offset, repeatability, reproducibility, hysteresis, drift) and the signal processing with converter and indicator (roundings, conversions, interpolation error, resolution) (Source: DKD-R 6-1, p. 16)


    Example calculation

The calculation of the measurement uncertainty will be discussed in more detail below, using the example of calibrating a pressure measuring instrument according to sequence B described in Calibration of pressure measuring instruments, p. 20 ff. Table 5 lists the pressure values displayed by the reference instrument and the mean values of the measurements on the test item, as well as the measuring deviation, the repeatability b and the hysteresis h. From the latter two, the individual measurement uncertainties u(b) and u(h) are obtained. For the evaluation, the uncertainties due to the limited resolution of the display u(r) and the constant zero offset u(f0) are also included. It is assumed that these four uncertainties follow a rectangular distribution. In the case of the hysteresis, this gives, for example:

u(h) = 1/√3 · h/2
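Written as a small sketch, the same treatment applies to repeatability, hysteresis, resolution and zero offset. The values for b and h are taken from the 2 bar line of Table 5; the resolution and the zero offset are assumed example values:

    # Minimal sketch: standard uncertainties of rectangularly distributed contributions,
    # u = 1/sqrt(3) * x/2. b and h from Table 5 (2 bar test point); r and f0 assumed.
    from math import sqrt

    def u_rect(x):
        """Standard uncertainty of a rectangular distribution of full width x."""
        return x / 2 / sqrt(3)

    b  = 0.0006   # repeatability in bar
    h  = 0.0000   # hysteresis in bar
    r  = 0.001    # resolution of the display in bar (assumed)
    f0 = 0.0002   # zero offset in bar (assumed)

    for name, x in [("u(b)", b), ("u(h)", h), ("u(r)", r), ("u(f0)", f0)]:
        print(f"{name} = {u_rect(x):.5f} bar")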

Other measurement uncertainty contributions to be included are those of the standard u(N) and possibly measurement uncertainties due to the procedure u(V), e.g. when a digital multimeter was used for the evaluation of a pressure


Table 5: Evaluation of the calibration parameters

Reading of calibrator in bar   Mean value µ in bar   Measuring deviation in bar   Repeatability b in bar   Hysteresis h in bar
0.0000                         0.0000                0.0000                       0.0000                   0.0000
2.0000                         2.0047                −0.0047                      0.0006                   0.0000