
General Image-Quality Equation: GIQE

Jon C. Leachtenauer, William Malila, John Irvine, Linda Colburn, and Nanette Salvaggio

A regression-based model was developed relating aerial image quality, expressed in terms of the National Imagery Interpretability Rating Scale (NIIRS), to fundamental image attributes. The General Image-Quality Equation (GIQE) treats three main attributes: scale, expressed as the ground-sampled distance; sharpness, measured from the system modulation transfer function; and the signal-to-noise ratio. The GIQE can be applied to any visible sensor and predicts NIIRS ratings with a standard error of 0.3 NIIRS. The image attributes treated by the GIQE are influenced by system design and operation parameters. The GIQE allows system designers and operators to perform trade-offs for the optimization of image quality. © 1997 Optical Society of America

Key words: Image quality, image interpretability, aerial imagery, system performance prediction, predicting image interpretability.

1. Introduction

Designing imaging systems that just meet customer image-quality requirements and satisfy size, power, weight, and cost constraints is a critical issue facing all imaging-system manufacturers. A system that delivers a higher quality than customers require is likely to be more expensive than a competitor's that just meets those requirements. Conversely, a device that is too low in quality is not likely to be purchased.

Imaging-system performance goals are increasingly being expressed in terms of the National Imagery Interpretability Rating Scale (NIIRS). For example, the Defense Airborne Reconnaissance Office has defined the performance goals for both the Predator and Global Hawk unmanned aerial vehicle sensor systems in terms of NIIRS. The NIIRS is a 10-level scale defined by interpretation tasks or criteria. The scale is used by members of the image-exploitation community to communicate the information potential of imagery as well as to express requirements for information.1


J. C. Leachtenauer and J. Irvine are with ERIM International, 1101 Wilson Boulevard, Suite 1100, Arlington, Virginia 22209-2248. W. Malila and L. Colburn are with ERIM International, P.O. Box 134001, Ann Arbor, Michigan 48113-4001. N. Salvaggio is with TASC, 12100 Sunset Hills Road, Reston, Virginia 20190.

Received 4 February 1997; revised manuscript received 7 May 1997.

0003-6935/97/328322-07$10.00/0
© 1997 Optical Society of America


Given a NIIRS requirement, it behooves the system developer to design a system that will achieve the desired level of NIIRS performance. In a competitive era, overachieving can be as bad as underachieving. Developers thus require a tool that will accurately predict NIIRS performance prior to actually building and testing a new sensor system.

The General Image-Quality Equation (GIQE) was developed to provide such predictions. The GIQE was developed under the auspices of the U.S. Government's Imagery Resolution Assessment and Reporting Standards (IRARS) Committee. The GIQE predicts NIIRS as a function of a predicted image scale, sharpness, or resolution and the signal-to-noise ratio (SNR). It was released to the unmanned aerial vehicle development community in 1994.2 The GIQE was developed initially by use of a regression-modeling approach. Ten imagery analysts provided NIIRS ratings on a large sample of electro-optical (EO) imagery. The image-quality characteristics of the imagery were used to develop a regression model that predicted NIIRS values as a function of system design (quality) and operating parameters.

Since the initial release, both the NIIRS and the GIQE have been updated. The NIIRS was updated to replace outdated criteria as well as a few criteria that had been found to cause confusion among users. The GIQE was updated with an additional sample of imagery that broadened the range of sharpness and the SNR. The total image sample was split into two halves, and a regression model was developed on the basis of one of the two samples. The other sample was then used to validate the regression model.


Table 1. Visible NIIRS Operations by Level—March 1994a

Rating Level 0
Interpretability of the imagery is precluded by obscuration, degradation, or very poor resolution.

Rating Level 1
Detect a medium-sized port facility and/or distinguish between taxiways and runways at a large airfield.

Rating Level 2
Detect large hangars at airfields.
Detect large static radars (e.g., AN/FPS-85, COBRA DANE, PECHORA, HENHOUSE).
Detect military training areas.
Identify an SA-5 site based on road pattern and overall site configuration.
Detect large buildings at a naval facility (e.g., warehouses, construction halls).
Detect large buildings (e.g., hospitals, factories).

Rating Level 3
Identify the wing configuration (e.g., straight, swept, delta) of all large aircraft (e.g., 707, CONCORD, BEAR, BLACKJACK).
Identify radar and guidance areas at a SAM site by the configuration, mounds, and presence of concrete aprons.
Detect a helipad by the configuration and markings.
Detect the presence/absence of support vehicles at a mobile missile base.
Identify a large surface ship in port by type (e.g., cruiser, auxiliary ship, noncombatant/merchant).
Detect trains or strings of standard rolling stock on railroad tracks (not individual cars).

Rating Level 4
Identify all large fighters by type (e.g., FENCER, FOXBAT, F-15, F-14).
Detect the presence of large individual radar antennas (e.g., TALL KING).
Identify, by general type, tracked vehicles, field artillery, large river crossing equipment, wheeled vehicles when in groups.
Detect an open missile silo door.
Determine the shape of the bow (pointed or blunt/rounded) on a medium-sized submarine (e.g., ROMEO, HAN, Type 209, CHARLIE II, ECHO II, VICTOR II/III).
Identify individual tracks, rail pairs, control towers, switching points in rail yards.

Rating Level 5
Distinguish between a MIDAS and a CANDID by the presence of refueling equipment (e.g., pedestal and wing pod).
Identify radar as vehicle-mounted or trailer-mounted.
Identify, by type, deployed tactical SSM systems (e.g., FROG, SS-21, SCUD).
Distinguish between SS-25 mobile missile TEL and Missile Support Van (MSV) in a known support base, when not covered by camouflage.
Identify TOP STEER or TOP SAIL air surveillance radar on KIROV-, SOVREMENNY-, KIEV-, SLAVA-, MOSKVA-, KARA-, or KRESTA-II-class vessels.
Identify individual rail cars by type (e.g., gondola, flat, box) and/or locomotive by type (e.g., steam, diesel).

Rating Level 6
Distinguish between models of small/medium helicopters (e.g., HELIX A from HELIX B from HELIX C, HIND D from HIND E, HAZE A from HAZE B from HAZE C).
Identify the shape of antennas on EW/GCI/ACQ radars as parabolic, parabolic with clipped corners, or rectangular.
Identify the spare tire on a medium-sized truck.
Distinguish between SA-6, SA-11, and SA-17 missile airframes.
Identify individual launcher covers (8) of vertically launched SA-N-6 on SLAVA-class vessels.
Identify automobiles as sedans or station wagons.

Rating Level 7
Identify fitments and fairings on a fighter-sized aircraft (e.g., FULCRUM, FOXHOUND).
Identify ports, ladders, vents on electronics vans.
Detect the mount for antitank guided missiles (e.g., SAGGER on BMP-1).
Detect details of the silo door hinging mechanism on Type III-F, III-G, and III-H launch silos and Type III-X launch control silos.
Identify the individual tubes of the RBU on KIROV-, KARA-, KRIVAK-class vessels.
Identify individual rail ties.

Rating Level 8
Identify the rivet lines on bomber aircraft.
Detect horn-shaped and W-shaped antennas mounted atop BACKTRAP and BACKNET radars.
Identify a hand-held SAM (e.g., SA-7/14, REDEYE, STINGER).
Identify joints and welds on a TEL or TELAR.
Detect winch cables on deck-mounted cranes.
Identify windshield wipers on a vehicle.

Rating Level 9
Differentiate cross-slot from single slot heads on aircraft skin panel fasteners.
Identify small light-toned ceramic insulators that connect wires of an antenna canopy.
Identify vehicle registration numbers (VRN) on trucks.
Identify screws and bolts on missile components.
Identify braid of ropes (1 to 3 inches in diameter).
Detect individual spikes in railroad ties.

aThe information in this table was previously published in Ref. 3.



This paper describes the update and validation processes.

2. National Imagery Interpretability Rating Scale

The NIIRS is a 10-level scale (0 to 9) of image interpretability. It was developed and is maintained by the IRARS committee.3 Originally released in 1974, the NIIRS has been used throughout the intelligence community since that time. Each NIIRS level from 1 through 9 is defined by a series of interpretation tasks that range from very easy (requiring low image quality) to very difficult (requiring high levels of image quality). Level 0 indicates that level 1 tasks cannot be accomplished. The tasks that define the NIIRS are related to an empirically derived perceptual image-quality scale. Table 1 lists the visible NIIRS criteria for levels 1 through 9. Similar scales have been developed for use with radar, IR, and multispectral imagery.3

The NIIRS is used by imagery analysts to define image requirements. Using image-quality equations, system operators collect imagery to satisfy the analysts' requirements. After an image has been acquired, the requester can examine it in terms of the NIIRS to determine whether the original request has been satisfied. If it has not been satisfied, the requirement can be resubmitted. When communicating with other analysts, the NIIRS of an image provides a shorthand description of its interpretability and thus its utility. It also provides a shorthand rating of the validity of an interpretation report: NIIRS 8 information derived from a NIIRS 4 image is clearly less believable than the reverse.

The NIIRS has been used extensively in studies of image quality. It is far more useful to quantify the effects of a bandwidth-compression (BWC) algorithm in terms of the NIIRS loss than in terms of, for example, the mean-square error. The NIIRS has also been used (although not specified as such) to validate image-quality prediction metrics.4,5 Image-quality prediction metrics attempt to predict image interpretability or utility as a function of physical-quality measures.

3. Interpretability Prediction

The prediction of image interpretability has evolved over a 40-year period. When aerial cameras and films were generally of the same quality, image scale (the ratio of the film-to-ground distance) served as an adequate predictor of image interpretability or information content.6 All images at the same scale were of roughly the same quality and thus interpretability. As cameras and films improved and diversified in terms of quality, scale was no longer a good predictor. Attention shifted to the use of other physical-quality measures such as resolution, contrast, and noise.7–9

For a variety of reasons these studies did not provide accurate predictions of interpretability. The nonlinear relation between physical quality and interpretability was generally not recognized. The measurement and definition of interpretability were not well established. Finally, the interactions among the physical-quality variables negated the use of a single variable as a predictor.

Attention thus shifted to more complex summary metrics such as the modulation transfer function (MTF) area,10,11 the optical and digital power spectrum,5,12 and the SNR.13 Although some of these metrics predicted more than 80% of the observed variance in interpretability, none gained immediate widespread acceptance.

In a study reported by Beaton et al.14 the MTF area accounted for 86% of the observed NIIRS variance on hard-copy images but only 50% on soft-copy images. Other MTF-based metrics accounted for more than 80% of the observed NIIRS-rating variance on both hard-copy (film) and soft-copy (CRT or other electronic-display) images. Nill and Bouzas5 showed that a digital power-spectrum metric accounted for 81% of the observed NIIRS variance on hard-copy images. In both the study by Beaton et al.14 and that by Nill and Bouzas,5 the image-quality measures were computed on the same images that were NIIRS rated. The ability to predict the MTF or power spectrum of the images was not evaluated. There is, of course, no reason why such measures could not be predicted. Simmons and Cheeseman15 have published a model (PHYSIQUE) that computes information content (in bits per unit area) on the basis of EO system design and operating parameters. To our knowledge, PHYSIQUE has not been validated in terms of the ability to predict NIIRS ratings.

A. General Image-Quality Equation

The GIQE was developed initially in the late 1980s but was not formally released until 1994. The GIQE provides NIIRS predictions as a function of perceptual-quality attributes of scale, resolution, and sharpness and of contrast and noise. The terms in the GIQE derive from earlier research relating physical image quality to interpretability. The ground-sampled distance (GSD) is a measure of both scale and resolution. The relative edge response (RER) relates to perceived sharpness or acutance. Because MTF compensation (MTFC) is commonly used with EO system postprocessing, it was necessary to account for MTFC effects. The MTFC boosts both edges and noise, hence the overshoot and noise-gain terms. Noise is modeled in terms of the noise gain derived from the MTFC and the SNR; a measure of contrast is captured in the SNR term. The terms in the GIQE thus account for all of the physical-quality parameters that have been found to affect image interpretability.

Conceptually, the GIQE model accounts for target, sensor, and processing characteristics of EO systems, as diagrammed in Fig. 1. The GIQE does not account for possible degradation resulting from BWC, and the model assumes a standardized, contrast-optimized hard-copy image output. For soft-copy (CRT) image prediction it would be necessary to account for display differences (MTF and contrast) relative to the current hard-copy prediction.


The form of the original GIQE is

NIIRS = 11.81 + 3.32 \log_{10}(RER_{GM}/GSD_{GM}) - 1.48 H_{GM} - G/SNR,    (1)

where RER_GM is the geometric mean of the normalized RER, GSD_GM is the geometric-mean GSD (in inches), H_GM is the geometric-mean height overshoot caused by edge sharpening, and G is the noise gain resulting from edge sharpening.
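As a concrete illustration of Eq. (1), the sketch below evaluates the original GIQE in Python once the four image-quality terms are in hand. The function name and the example inputs (taken loosely from the Table 2 sample means) are illustrative choices, not part of the GIQE release.

```python
import math

def niirs_original_giqe(rer_gm, gsd_gm_inches, h_gm, noise_gain, snr):
    """Evaluate the original GIQE, Eq. (1).

    rer_gm        -- geometric mean of the normalized relative edge response
    gsd_gm_inches -- geometric-mean ground-sampled distance (inches)
    h_gm          -- geometric-mean edge-overshoot height from MTFC
    noise_gain    -- noise gain G of the MTFC kernel
    snr           -- signal-to-noise ratio
    """
    return (11.81
            + 3.32 * math.log10(rer_gm / gsd_gm_inches)
            - 1.48 * h_gm
            - noise_gain / snr)

# Illustrative call, using roughly the Table 2 sample means:
print(niirs_original_giqe(rer_gm=0.92, gsd_gm_inches=20.6,
                          h_gm=1.31, noise_gain=10.66, snr=52.3))
```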

The RER is the slope of the system's edge response and is derived from the system MTF. Components of the system MTF include optics, the environmental and boundary-layer MTF's, atmospheric effects, residual motion, detector characteristics, and MTFC. The RER is measured between two points that are 0.5 pixels from the edge (Fig. 2). The edge is normalized over the range of 0 to 1. The geometric mean of the X- and Y-axis RER is used.
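A minimal sketch of the RER measurement described above: given the normalized edge response sampled at known pixel offsets from the edge, the single-axis RER is the rise between the points 0.5 pixels on either side of the edge, and the two axes are combined as a geometric mean. The sampling grid and the linear interpolation are assumptions made for this sketch.

```python
import numpy as np

def rer_1d(offsets_px, edge_response):
    """Single-axis relative edge response.

    offsets_px    -- pixel offsets from the edge (e.g., -3.0 ... +3.0, increasing)
    edge_response -- edge response normalized to the range 0 to 1
    Returns the response rise between -0.5 and +0.5 pixels from the edge.
    """
    er_minus, er_plus = np.interp([-0.5, 0.5], offsets_px, edge_response)
    return er_plus - er_minus

def rer_gm(offsets_px, edge_response_x, edge_response_y):
    """Geometric mean of the X- and Y-axis RER values."""
    return float(np.sqrt(rer_1d(offsets_px, edge_response_x)
                         * rer_1d(offsets_px, edge_response_y)))
```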

The GSD is the geometric mean of the ground-sampled distance based on a projection of the pixel-pitch distance to the ground:

GSD = \frac{(\text{pixel pitch}/\text{focal length}) \times \text{slant range}}{\cos(\text{look angle})}.    (2)

The GSD is computed in inches in both the X and Y dimensions. For systems or cases in which the along-scan and cross-scan directions are not orthogonal, the sine of the angle α between the along- and cross-scan directions must be used:

GSD_{GM} = [GSD_X \times GSD_Y \times \sin\alpha]^{1/2}.    (3)

The pixel pitch should include any aggregation or decimation that is performed.
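Under Eqs. (2) and (3), a GSD calculation might be sketched as follows. The unit handling (pixel pitch and focal length in the same units, slant range already in inches) and the function names are assumptions for illustration.

```python
import math

def gsd_inches(pixel_pitch, focal_length, slant_range_inches, look_angle_deg):
    """Single-axis ground-sampled distance, Eq. (2), projected to the ground plane.

    pixel_pitch and focal_length must be in the same units;
    slant_range_inches gives the result directly in inches.
    """
    projected_pitch = (pixel_pitch / focal_length) * slant_range_inches
    return projected_pitch / math.cos(math.radians(look_angle_deg))

def gsd_gm(gsd_x, gsd_y, scan_angle_deg=90.0):
    """Geometric-mean GSD, Eq. (3).

    scan_angle_deg is the angle between the along-scan and cross-scan
    directions (90 degrees when they are orthogonal).
    """
    return math.sqrt(gsd_x * gsd_y * math.sin(math.radians(scan_angle_deg)))
```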

The overshoot-height term H models the edge-response overshoot that is due to MTFC. It is measured over the range of 1.0 to 3.0 pixels from the edge (Fig. 3) in 0.25-pixel increments. If the edge is monotonically increasing (case 1), it is defined as the value at 1.25 pixels from the edge. Otherwise (case 2), it is the maximum value. As with the GSD and the RER, it is calculated on both the X and Y axes.
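The overshoot logic translates directly into code. The sketch below assumes the normalized edge response has already been sampled at the 0.25-pixel increments from 1.0 to 3.0 pixels past the edge described above; combining the X- and Y-axis values into H_GM is left to the caller.

```python
import numpy as np

def overshoot_h(edge_response_1_to_3):
    """Single-axis edge-overshoot height H.

    edge_response_1_to_3 -- normalized edge-response values sampled at
    1.00, 1.25, ..., 3.00 pixels from the edge (nine samples).
    Case 1: if the response is monotonically increasing, H is the value
    at 1.25 pixels. Case 2: otherwise, H is the maximum value.
    """
    er = np.asarray(edge_response_1_to_3, dtype=float)
    if np.all(np.diff(er) >= 0):      # case 1: monotonically increasing
        return float(er[1])           # sample at 1.25 pixels from the edge
    return float(er.max())            # case 2: peak of the overshoot
```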

Fig. 1. Model of the GIQE.

The noise-gain term is defined by computation of the root of the sum of the squares of the MTFC kernel values:

G = \Bigl[\sum_{i=1}^{M} \sum_{j=1}^{N} (\text{kernel}_{ij})^2\Bigr]^{1/2}.    (4)
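Eq. (4) is simply the root sum of squares of the MTFC kernel coefficients; a short sketch with a hypothetical 3 × 3 sharpening kernel:

```python
import numpy as np

def noise_gain(mtfc_kernel):
    """Noise gain G of an MTFC kernel, Eq. (4): root sum of squared weights."""
    kernel = np.asarray(mtfc_kernel, dtype=float)
    return float(np.sqrt(np.sum(kernel ** 2)))

# Hypothetical 3 x 3 sharpening kernel, for illustration only:
print(noise_gain([[0, -1, 0], [-1, 5, -1], [0, -1, 0]]))  # about 5.39
```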

Finally, the SNR term is the ratio of the dc differential scene radiance (in electrons) to the rms noise (in electrons), computed before the MTFC and after calibration. The dc differential scene radiance is the difference in detector output between two extended Lambertian surfaces differing in reflectance. Although actual values can be used, the GIQE in its current form assumes reflectances of 7% and 15%. An atmospheric model or actual measurement is required to compute scene irradiance and path radiance and transmission. Knowledge is also required of the optics (relative aperture and transmission) and detector (quantum efficiency, size, integration time, and time-delay integration stages). Noise terms include signal-dependent (photon) noise and the dark-current, readout, quantization, and nonuniformity noise sources. The SNR is calculated as

SNR = (S_1 - S_2)/N.    (5)
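A hedged sketch of the SNR term: S1 and S2 are the detector signals (in electrons) for the 15% and 7% reflectance surfaces, and the listed noise sources are combined here by root sum of squares, a common assumption for independent noise terms rather than something Eq. (5) itself prescribes.

```python
import math

def giqe_snr(signal_15pct_electrons, signal_7pct_electrons, noise_terms_electrons):
    """SNR term of the GIQE, Eq. (5).

    signal_15pct_electrons, signal_7pct_electrons -- detector outputs for the
        15% and 7% reflectance extended Lambertian surfaces.
    noise_terms_electrons -- iterable of rms noise contributions (photon,
        dark-current, readout, quantization, nonuniformity), evaluated
        before MTFC and after calibration.
    """
    # Assumed: independent noise sources combined by root sum of squares.
    rms_noise = math.sqrt(sum(n ** 2 for n in noise_terms_electrons))
    return (signal_15pct_electrons - signal_7pct_electrons) / rms_noise
```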

Fig. 2. RER measurement.

Fig. 3. Computation of the overshoot H.



LOWTRAN was used as the atmospheric model for the original GIQE development.

B. Model Validation

A sample of 359 NIIRS-rated images was used to update and validate the GIQE. The sample was split initially into two halves since the goal was to both update and validate the model. Table 2 lists the mean values for the two sets of data, as well as for the total set. Table 3 gives the range of values in the overall data set.

Initial regression analysis was performed on the development set by use of the terms in the original GIQE. Analysis showed that the RER/GSD and G/SNR coefficients were significantly lower than those in the original GIQE. Because the NIIRS was calibrated initially to a GSD relation, there was cause for concern. The original coefficient of 3.32 implies a one-NIIRS difference as a result of a doubling or halving of the GSD.16 Subsequent analysis showed that the GSD–NIIRS relation varied with the RER. The relation between the GSD and the NIIRS flattened as the RER decreased. Analysis also showed that the R^2 value improved when the RER and GSD were separated in the equation. To preserve the 3.32 GSD–NIIRS relation, analyses were performed to define the RER levels at which the relation held. It was determined that the 3.32 coefficient was valid at RER values greater than 0.9. A final equation was developed for both RER conditions. The equation was defined as

NIIRS = 10.251 - a \log_{10} GSD_{GM} + b \log_{10} RER_{GM} - 0.656 H - 0.344 G/SNR,    (6)

where the GSD, RER, H, and G/SNR are defined as for Eq. (1); a = 3.32 and b = 1.559 if RER ≥ 0.9, and a = 3.16 and b = 2.817 if RER < 0.9.

Table 2. Comparison of NIIRS Development and Test Sets

                              Mean Values
Variable                All       Development   Test
NIIRS                   5.3       5.3           5.3
NIIRS standard dev.     1.2       1.2           1.2
GSD                     20.6 in.  21.3 in.      19.9 in.
RER                     0.92      0.93          0.92
Overshoot               1.31      1.31          1.32
Noise gain              10.66     10.53         10.80
SNR                     52.3      52.7          51.8

Table 3. Range of Values in the Overall NIIRS Data Set

Parameter     Minimum   Maximum
GSD           3 in.     80 in.
RER           0.2       1.3
Overshoot     0.9       1.9
Noise gain    1         19
SNR           2         130


The equation provided an adjusted R^2 value of 0.986 and a standard error of 0.282.
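A minimal Python rendering of the revised equation, Eq. (6), with the RER-dependent coefficients given above; the function name and example call are illustrative only.

```python
import math

def niirs_revised_giqe(gsd_gm_inches, rer_gm, h, noise_gain, snr):
    """Evaluate the revised GIQE, Eq. (6).

    The coefficients a and b switch at RER = 0.9, as defined in the text.
    """
    if rer_gm >= 0.9:
        a, b = 3.32, 1.559
    else:
        a, b = 3.16, 2.817
    return (10.251
            - a * math.log10(gsd_gm_inches)
            + b * math.log10(rer_gm)
            - 0.656 * h
            - 0.344 * noise_gain / snr)

# Illustrative call using roughly the Table 2 sample means:
print(niirs_revised_giqe(gsd_gm_inches=20.6, rer_gm=0.92,
                         h=1.31, noise_gain=10.66, snr=52.3))
```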

The same equation was run on the test set to compute residuals (prediction errors). The estimated R^2 value was computed by use of

R^2 = 1 - S_{y \cdot x}^2 / S_y^2,    (7)

where S_{y·x} is the standard error of Y as estimated from X and S_y is the standard deviation of Y. The estimated value of R^2 was 0.934. The standard error was 0.307. These values should be taken as the best estimate of the performance of the revised GIQE. The relation between the predicted and the observed NIIRS is shown in Fig. 4.
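As a quick consistency check of Eq. (7), using the rounded values quoted above and the NIIRS standard deviation of 1.2 from Table 2:

R^2 ≈ 1 − (0.307)^2/(1.2)^2 = 1 − 0.094/1.44 ≈ 0.93,

which agrees with the reported value of 0.934.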

The original GIQE computed the GSD in the plane orthogonal to the line of sight. For the current version of the GIQE the GSD is computed in the ground plane.

C. Residual Analysis

As a further check of the revised GIQE, a series of residual analyses was performed. These analyses examined the relation between the prediction errors and the value of each predictor variable. None of the correlations was statistically significant. An example is shown in Fig. 5. This indicates that the equation is not biased with respect to any of the predictor variables.

D. Equation Comparison

The original GIQE was run on the same data that were used in the current study; the results are shown in Fig. 6. This figure can be compared with the results for the revised GIQE shown in Fig. 4. Unlike the revised equation, the original equation shows a slope significantly different from 1.0 and an intercept significantly different from 0. Figure 7 shows the residuals for the G/SNR term. A significant bias is evident.

Fig. 4. NIIRS values: observed versus predicted.

Page 6: General Image-Quality Equation: GIQE

Fig. 5. Residual plot for the GSD.

Fig. 6. NIIRS values for the original GIQE: observed versus predicted.

Fig. 7. Residual plot of the G/SNR for the original GIQE.

E. Model Predictions

The revised GIQE was evaluated to show the effects of each of the predictor variables. Figure 8 shows predicted NIIRS as a function of the GSD. Figure 9 shows the effect of the RER with the GSD set to provide a NIIRS-7 level of performance. The same types of data are shown in Figs. 10 and 11 for the overshoot H and the G/SNR. Comparison of Figs. 8–11 makes it evident that the GSD and RER are the dominant terms in the equation and that the overshoot H and the G/SNR have a much smaller impact.

Fig. 8. Predicted NIIRS values versus the GSD.

Fig. 9. Predicted NIIRS values versus the RER.

Fig. 10. Predicted NIIRS values versus the overshoot H.



4. Discussion and Conclusions

In a validation test the revised GIQE accounted for 93% of the observed variance in the NIIRS ratings. The standard error of prediction was 0.3 NIIRS. The mean prediction error was zero, with a standard deviation of 0.3 units. This compares with an R^2 value of 0.89 and a standard error of prediction of 0.4 for the original GIQE run on the same data set. The level of performance of the revised GIQE is better than most of the results of the earlier non-GIQE studies previously cited. Although Beaton et al.14 showed two metrics having hard-copy R^2 values of approximately 0.94, some shrinkage of these values would be expected in a cross-validation process. Further, the ability to predict values for these metrics was not demonstrated. It thus appears that, in terms of NIIRS-prediction accuracy, the GIQE is at least equal to, and probably better than, other available metrics.

The GIQE predicts NIIRS values for hard-copy imagery that has not been bandwidth compressed. Experience has shown that its performance on soft-copy systems is generally equal to or better than that for hard-copy systems,17 but the terms required to model this difference are not included in the GIQE. The effects of BWC depend on the bit rate as well as on the type (and implementation) of BWC. BWC effects can best be evaluated independently of the GIQE.

The GIQE treats edge sharpening in three terms: the RER, the overshoot, and the noise gain. Increasing the RER increases the NIIRS value, whereas increasing the overshoot and noise gain decreases the NIIRS value. Because of this complex interdependency, the use of the GIQE to evaluate edge-sharpening algorithms is not recommended.

The current version of the GIQE separates the RER and the GSD. Because of this separation, the model may not be valid for systems in which λ(f-number)/P is greater than 1.18
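As a worked example with illustrative numbers (not taken from the paper): for λ = 0.6 μm, an f/4 optic, and a 10-μm pixel pitch,

λ(f-number)/P = (0.6 μm × 4)/(10 μm) = 0.24 < 1,

which is below the threshold at which this limitation becomes a concern.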

Finally, the revised GIQE was validated over the range of conditions listed in Table 3. The accuracy and validity of the GIQE outside these bounds are unknown.

The authors acknowledge the helpful support of Michael Parcell and Robert Bell.

Fig. 11. Predicted NIIRS values versus the G/SNR.


References and Notes

1. L. A. Maver, C. D. Erdman, and K. Riehl, "Imagery interpretability rating scales," in Digest of Technical Papers: International Symposium of the Society for Information Display (Society for Information Display, Santa Ana, Calif., 1995), Vol. 26, pp. 117–120.

2. IRARS Committee, General Image Quality Equation: Users Guide, Version 3.0, High Altitude Endurance Unmanned Aerial Vehicle Tier II+ distribution (IRARS Committee, Washington, D.C., 1994).

3. J. C. Leachtenauer, "National Imagery Interpretability Rating Scales: overview and product description," in ASPRS/ASCM Annual Convention and Exhibition Technical Papers: Remote Sensing and Photogrammetry (American Society for Photogrammetry and Remote Sensing and American Congress on Surveying and Mapping, Baltimore, Md., 1996), Vol. 1, pp. 262–272.

4. H. L. Snyder, "Visual search and image quality," Rep. AMRL-TR-76-89 (Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, Ohio, 1976).

5. N. H. Nill and B. H. Bouzas, "Objective image quality measure derived from digital image power spectra," Opt. Eng. 31, 813–825 (1992).

6. U.S. Department of the Army, U.S. Department of the Navy, and U.S. Department of the Air Force, Photo Interpretation Handbook, TM30-45/NAVAER 10-35-610/AFM 200-50 (U.S. Government Printing Office, Washington, D.C., 1954).

7. C. C. Bennett, S. H. Winterstein, J. D. Taylor, and R. E. Kent, "A study of image quality and speeded intrinsic target recognition," IBM Rep. 63-535-1 (IBM Federal Systems Division, Oswego, N.Y., 1963).

8. Applied Psychology Corporation, "Performance of photographic interpreters as a function of time and image characteristics," Rep. RADC-TDR-63-313 (Rome Air Development Center, Rome, N.Y., 1963).

9. R. A. Erickson and J. C. Hemingway, "Image identification on television," Rep. NWC TP 5025 (Naval Weapons Center, China Lake, Calif., 1970).

10. H. C. Borrough, R. F. Fallis, T. H. Warnock, and J. H. Britt, "Quantitative determination of image quality," Rep. D2-114058 (Boeing Aerospace Company, Kent, Wash., 1967).

11. H. L. Task, "An evaluation and comparison of several measures of image quality for television displays," Rep. AMRL-TR-79-7 (Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, Ohio, 1979).

12. R. A. Schindler, "Optical power spectrum analysis of processed imagery," Rep. AMRL-TR-79-29 (Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, Ohio, 1979).

13. F. A. Rossell and R. H. Willson, "Recent psychophysical experiments and the display signal-to-noise ratio concept," in Perception of Displayed Information, L. M. Biberman, ed. (Plenum, New York, 1973).

14. R. J. Beaton, R. W. Monty, and H. L. Snyder, "An evaluation of system quality metrics for hard-copy and soft-copy displays of digital imagery," in Applications of Digital Image Processing VI, A. G. Tescher, ed., Proc. SPIE 432, 320–328 (1983).

15. R. E. Simmons and D. W. Cheeseman, PHYSIQUE (EOI) User's Guide, Version 3.10 (Eastman Kodak, Rochester, N.Y., 1991).

16. For example, log10 X = log2 X/3.32.

17. J. C. Leachtenauer and N. L. Salvaggio, "NIIRS prediction, use of the Briggs target," in ASPRS/ASCM Annual Convention and Exhibition Technical Papers: Remote Sensing and Photogrammetry (American Society for Photogrammetry and Remote Sensing and American Congress on Surveying and Mapping, Baltimore, Md., 1996), Vol. 1, pp. 282–291.

18. The term λ(f-number)/P is the ratio of the wavelength times the f-number to the pixel pitch. A ratio of greater than 1 denotes oversampling relative to the cutoff frequency.