
2011 GRSS DATA FUSION CONTEST: EXPLOITING WORLDVIEW-2 MULTI-ANGULAR ACQUISITIONS

Fabio Pacifici (1), Jocelyn Chanussot (2), and Qian Du (3)

(1) DigitalGlobe, U.S.A.

(2) Grenoble Institute of Technology, France

(3) Mississippi State University, U.S.A.

ABSTRACT

The multi-angle capabilities of WorldView-2 are discussed in the framework of the GRSS Data Fusion Contest.

Multi-angular surface reflectance measurements of grass, trees, and water are derived from the atmospherically corrected image sequence and compared to top-of-atmosphere (TOA) reflectance values. If the atmospheric effects are ignored, the TOA values show significant spectral distortions for higher off-nadir acquisitions that do not result solely from bidirectional reflectance effects.

The Contest will be open for approximately another month. So far, more than 700 participants from 95 different countries have downloaded the imagery.

Index Terms— Data Fusion Contest, multi-angle sequence, WorldView-2

1. INTRODUCTION

The Data Fusion Contest has been organized by the Data Fusion Technical Committee (DFTC) of the Geoscience and Remote Sensing Society of the Institute of Electrical and Electronics Engineers (IEEE). The DFTC serves as a global, multi-disciplinary network for geospatial data fusion, connecting people and resources. It aims at educating students and professionals, and at promoting best practices in data fusion applications.

The Contest has been proposed annually since 2006 and is open not only to IEEE members, but to everyone, with the aim of evaluating existing methodologies at the research or operational level for solving remote sensing problems using data from different sensors.

For the 2006 Contest, the focus was on the fusion of multi-spectral and panchromatic images. Six Pleiades simulated images were provided by CNES, the French National Space Agency. Each data set included a very high spatial resolution panchromatic image (80 cm) with corresponding multi-spectral images (3.2 m resolution). A multi-spectral very high spatial resolution airborne image was available as ground reference; it was used by the organizing committee for the evaluation of the results and not distributed to the participants. The results are reported in [1].

In 2007, the Contest's main theme was urban mapping using radar and optical data. A set of satellite radar and optical images (9 ERS amplitude data sets and 2 Landsat multi-spectral images) was available. The task was to obtain a classified map as accurate as possible with respect to the ground reference (unknown to the participants), depicting land-cover and land-use classes for the urban area under test. The results are reported in [2].

In 2008, the Contest was dedicated to the classification of very high spatial resolution hyper-spectral data. The data set was distributed to every participant, and the task was to obtain a classified map as accurate as possible with respect to the ground reference (unknown to the participants), depicting land-cover and land-use classes. The data set consisted of airborne data from the Reflective Optics System Imaging Spectrometer (ROSIS-03) optical sensor, which provides 115 bands covering the 0.43-0.86 μm spectral range. Thirteen noisy bands were removed. The spatial resolution was 1.3 m. The results are reported in [3].

In 2009, the aim of the Contest was to perform change detection using multi-temporal and multi-modal data. Two pairs of data sets were available over Gloucester, UK, before and after a flood event. The data set contained two three-band SPOT images and two ERS images (one before and one after the event). The optical and radar imagery were provided by CNES. As in the previous editions of the Contest, the ground truth used to assess the results was not provided to the participants. Individual results were first evaluated and ranked using the kappa coefficient. Then, the best five results were fused by majority voting, and the final ranking was obtained by evaluating how much each individual result improved the fused map, again in terms of the kappa coefficient.
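To make this evaluation scheme concrete, the following is a minimal sketch of how such a kappa-based ranking and majority-vote fusion could be reproduced. The function names are illustrative, and the sketch assumes classification maps stored as non-negative integer label arrays; it is not the committee's actual evaluation code.

```python
import numpy as np

def kappa_coefficient(predicted, reference):
    """Cohen's kappa between a predicted and a reference label map."""
    predicted = predicted.ravel()
    reference = reference.ravel()
    labels = np.union1d(predicted, reference)
    n = predicted.size
    # Build the confusion matrix (rows: reference, columns: predicted)
    cm = np.zeros((labels.size, labels.size), dtype=np.int64)
    for i, a in enumerate(labels):
        for j, b in enumerate(labels):
            cm[i, j] = np.sum((reference == a) & (predicted == b))
    po = np.trace(cm) / n                        # observed agreement
    pe = np.sum(cm.sum(0) * cm.sum(1)) / n**2    # chance agreement
    return (po - pe) / (1.0 - pe)

def majority_vote(maps):
    """Per-pixel majority vote over equally shaped label maps.

    Labels must be non-negative integers; ties resolve to the
    smallest label, a simplification of any real tie-breaking rule.
    """
    stacked = np.stack(maps)  # shape: (n_maps, rows, cols)
    return np.apply_along_axis(
        lambda v: np.bincount(v).argmax(), 0, stacked)
```

In the Contest setting, the five best individual maps would play the role of `maps`, and each candidate's contribution could be scored by recomputing the kappa of the fused map with that candidate included or left out.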

This year, the Contest aims at exploiting multi-angular acquisitions over the same target area. Since a large variety of applications is possible, each participant can choose the research topic to work on. By the end of May 2011, each participant must submit a paper to be considered for the Contest.


Fig. 1: Comparison of the WV2 and QB bands.


Fig. 2: (a) Azimuth and elevation of the multi-angle acquisitions (red) and the sun (yellow); (b)-(d) details from the three most nadiral images.

2. DATASET

A set of WorldView-2 (WV2) multi-angular images was provided by DigitalGlobe. WV2, launched in late 2009, is the first commercial satellite to carry a very high spatial resolution sensor with one panchromatic and eight multi-spectral bands (C=Coastal, B=Blue, G=Green, Y=Yellow, R=Red, RE=Red Edge, NIR1=Near-Infrared1, and NIR2=Near-Infrared2, with center wavelengths at 425, 480, 545, 605, 660, 725, 835, and 950 nm, respectively). By comparison, QuickBird's (QB) four spectral bands (B, G, R, NIR1) are centered at 485, 560, 660, and 830 nm, as shown in Fig. 1.
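For reference, the band centers listed above can be captured in a small lookup table; a minimal sketch with values taken directly from the text:

```python
# WorldView-2 multi-spectral band centers (nm), as listed above
WV2_BAND_CENTERS_NM = {
    "C": 425, "B": 480, "G": 545, "Y": 605,
    "R": 660, "RE": 725, "NIR1": 835, "NIR2": 950,
}

# QuickBird band centers (nm), for comparison
QB_BAND_CENTERS_NM = {"B": 485, "G": 560, "R": 660, "NIR1": 830}
```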

This unique set is composed of five Ortho Ready Standard multi-angular acquisitions, each including a 16-bit panchromatic image and an 8-band multi-spectral image.


Fig. 3: (a) Atmospheric transmittance for the most nadiral acquisition and (b) percentage of upwelled (vs. total) reflectance for the image sequence.

The imagery was collected over Rio de Janeiro (Brazil) in January 2010 within a three-minute time frame, with satellite elevation angles of 44.7°, 56.0°, and 81.4° in the forward direction, and 59.8° and 44.6° in the backward direction, as shown in Fig. 2. The multi-angular sequence contains the downtown area of the city, including a number of large buildings, commercial and industrial structures, the airport, and a mixture of community parks and private housing.

Multi-angle capabilities offer several advantages with respect to a single-shot dataset. Generally speaking, multi-angular data fusion allows:

- the exploitation/investigation of the bidirectional reflectance distribution function (BRDF) [4],
- the extraction of digital height maps [5],
- atmospheric parameter retrieval [6],
- classification improvement [7], etc.

3. ATMOSPHERIC EFFECTS IN MULTI-ANGLE ACQUISITIONS

Several issues must be considered for high off-nadir acquisitions. Specifically, the atmosphere plays a key role in producing spectral distortions that may lead to incorrect results. In Fig. 3a, the atmospheric transmittance for the most nadiral acquisition is shown along with the 8 WV2 bands: the effects of Rayleigh scattering, aerosols, and water vapor are predominant in the C, B, and NIR2 bands, respectively. MODTRAN [8] was used for these simulations.
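For intuition about the wavelength dependence visible in Fig. 3a, the Rayleigh component of the transmittance can be approximated in closed form. The sketch below uses the Hansen and Travis (1974) approximation for sea-level Rayleigh optical depth together with a plane-parallel air-mass term; this is an illustrative stand-in, not the MODTRAN configuration used for the figure.

```python
import math

def rayleigh_optical_depth(wavelength_um):
    """Sea-level vertical Rayleigh optical depth (Hansen & Travis, 1974)."""
    w = wavelength_um
    return 0.008569 * w**-4 * (1 + 0.0113 * w**-2 + 0.00013 * w**-4)

def rayleigh_transmittance(wavelength_um, elevation_deg):
    """One-way Rayleigh transmittance along the view path, using a
    plane-parallel 1/sin(elevation) air-mass approximation."""
    air_mass = 1.0 / math.sin(math.radians(elevation_deg))
    return math.exp(-rayleigh_optical_depth(wavelength_um) * air_mass)

# Example: Coastal (425 nm) vs. NIR2 (950 nm) at the 44.7 deg acquisition
for name, wl_um in [("C", 0.425), ("NIR2", 0.950)]:
    print(name, round(rayleigh_transmittance(wl_um, 44.7), 3))
```

Even this crude model reproduces the qualitative behavior of Fig. 3a: scattering losses are severe in the Coastal band and nearly negligible in NIR2, and they grow as the elevation angle decreases.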

The so-called upwelled reflectance (the energy from the sun scattered upward by the atmosphere into the sensor [9]) is shown in Fig. 3b. In general, this quantity increases as the satellite elevation angle decreases, because of the longer path through the atmosphere. As shown, the upwelled reflectance is about 10-30% of the reflectance measured at the top of the atmosphere (TOA) at shorter wavelengths, and less than 5% at longer wavelengths.
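In reflectance terms, this decomposition can be written compactly. The following is a simplified form of the image-chain model described in [9], neglecting adjacency effects and multiple scattering (an assumption made here purely for illustration):

```latex
\rho_{\mathrm{TOA}}(\lambda)
  = \tau_{\downarrow}(\lambda,\theta_s)\,\tau_{\uparrow}(\lambda,\theta_v)\,\rho_s(\lambda)
  + \rho_{\mathrm{up}}(\lambda,\theta_v)
```

where ρ_s is the surface reflectance, τ↓ and τ↑ are the transmittances along the solar and view paths (hence the dependence on the solar and view elevation angles θ_s and θ_v), and ρ_up is the upwelled path reflectance plotted in Fig. 3b. As the satellite elevation decreases, τ↑ shrinks and ρ_up grows, which is exactly the distortion pattern discussed below.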

In Fig. 4, the TOA and surface reflectance are compared for vigorous grass, tree, and water. As shown, the TOA values exhibit significant spectral distortions with respect to the surface reflectance measurements for both nadiral and non-nadiral images.

If the atmospheric effects are ignored, the spectral signatures have higher reflectance values at shorter wavelengths. For example, the grass and tree classes appear bluish in the visible bands instead of greenish. The grass class also illustrates the effect of water vapor, which drastically reduces the reflectance in NIR2 (for grass, the NIR1 and NIR2 reflectance values should be close).

On the other hand, the surface reflectance values of the grass and tree classes are higher in the green band than in the blue or red bands. Likewise, after correction, water vapor absorption produces less attenuation in NIR2, bringing its values closer to those of NIR1.


Fig. 4: (left) TOA and (right) surface reflectance of vigorous grass, tree, and water for the Rio de Janeiro multi-angular sequence.


The water class can be analyzed to better understand the atmospheric effects as well. Specifically, the pixel samples were selected from deep water, approximating a dark object (i.e., a zero-reflectance target). The TOA reflectance exhibits a behavior similar to that of the upwelled reflectance, instead of the flat, low reflectance curves obtained in the atmospherically corrected case.
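This is the premise behind classical dark-object subtraction (DOS): if a dark target such as deep water has near-zero surface reflectance, the signal measured over it estimates the upwelled component, which can then be removed band by band. The sketch below is a standard DOS heuristic given for illustration, not the atmospheric correction actually applied to the contest imagery.

```python
import numpy as np

def dark_object_subtraction(toa_reflectance, dark_mask):
    """Subtract a per-band path-reflectance estimate, taken from a
    dark target (e.g., deep water), from a TOA reflectance cube.

    toa_reflectance: array of shape (bands, rows, cols)
    dark_mask: boolean array of shape (rows, cols) marking dark pixels
    """
    corrected = np.empty_like(toa_reflectance)
    for b in range(toa_reflectance.shape[0]):
        # The dark target is assumed to have ~zero surface reflectance,
        # so a low percentile of its TOA signal approximates the
        # upwelled (path) reflectance in this band.
        path = np.percentile(toa_reflectance[b][dark_mask], 1)
        corrected[b] = np.clip(toa_reflectance[b] - path, 0.0, None)
    return corrected
```

Because it removes only an additive offset, DOS corrects the upwelled term but not the transmittance losses; a full correction, like the one used to produce the right-hand panels of Fig. 4, must account for both.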

Assuming an ideal atmospheric correction, the variations in reflectance between the different multi-angle acquisitions can be attributed to BRDF effects. For example, the tree class appears more sensitive to the different viewing conditions. This can be explained by considering that bidirectional reflectance effects in vegetated surfaces are primarily caused by the distribution of shadows observed under specific viewing-illumination geometries [10]. This also explains the lower sensitivity of grass and water to the different viewing conditions for the considered data set.
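Such view-angle dependence is commonly summarized with kernel-driven BRDF models. For illustration (this is the linear Ross-Li form used for MODIS-style BRDF products, not a model applied in this paper), the surface reflectance is expressed as:

```latex
\rho(\theta_s,\theta_v,\phi)
  = f_{\mathrm{iso}}
  + f_{\mathrm{vol}}\,K_{\mathrm{vol}}(\theta_s,\theta_v,\phi)
  + f_{\mathrm{geo}}\,K_{\mathrm{geo}}(\theta_s,\theta_v,\phi)
```

where θ_s, θ_v, and φ are the solar zenith, view zenith, and relative azimuth angles, K_vol and K_geo are fixed volumetric (RossThick) and geometric-optical (LiSparse) kernels, and the per-band coefficients f are fitted from multi-angular observations such as this five-view sequence. The shadow-driven behavior of the tree class noted above is precisely the kind of signal the geometric kernel is designed to capture.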

4. THE CONTEST SO FAR

As said, the Contest will be open until the end of May 2011. Subsequently, each participant will need to submit a paper to be considered for the competition. All papers will undergo a review process by the Data Fusion Award Committee. The winner(s) will be announced at IGARSS 2011 and awarded an IEEE Certificate of Recognition.

Up to the end of April 2011, more than 700 people have subscribed, and the dataset has been downloaded from 95 different countries, with the USA, India, and Brazil most represented. Fig. 5 shows the geographical distribution of the subscribers, where "other" aggregates the subscribers from countries with fewer than 10 participants.

5. CONCLUSIONS

The 2011 Data Fusion Contest aimed at exploiting multi-angular acquisitions over the same target area using a set of WorldView-2 images. Since a large variety of applications was possible, each participant could decide the research topic to work on.

The Contest will be open for approximately another month. So far, more than 700 participants from 95 different countries have downloaded the imagery.

6. REFERENCES

[1] L. Alparone, L. Wald, J. Chanussot, C. Thomas, P. Gamba, and L. M. Bruce, "Comparison of pansharpening algorithms: Outcome of the 2006 GRS-S data fusion contest", IEEE Transactions on Geoscience and Remote Sensing, vol. 45, no. 10, pp. 3012-3021, Oct. 2007.

[2] F. Pacifici, F. Del Frate, W. J. Emery, P. Gamba, and J. Chanussot, "Urban mapping using coarse SAR and optical data: outcome of the 2007 GRS-S data fusion contest", IEEE Geoscience and Remote Sensing Letters, vol. 5, no. 3, pp. 331-335, Jul. 2008.

[3] G. Licciardi, F. Pacifici, D. Tuia, S. Prasad, T. West, F. Giacco, J. Inglada, E. Christophe, J. Chanussot, and P. Gamba, "Decision fusion for the classification of hyperspectral data: outcome of the 2008 GRS-S data fusion contest", IEEE Transactions on Geoscience and Remote Sensing, vol. 47, no. 11, pp. 3857-3865, Nov. 2009.

[4] V. Liesenberg, L. Galvao, and F. Ponzoni, "Variations in reflectance with seasonality and viewing geometry: Implications for classification of Brazilian savanna physiognomies with MISR/Terra data", Remote Sensing of Environment, vol. 107, no. 1-2, pp. 276-286, Mar. 2007.

[5] N. Longbotham, C. Bleiler, C. Chaapel, C. Padwick, W. J. Emery, and F. Pacifici, "Spatial classification of WorldView-2 multi-angle sequence", Joint Urban Remote Sensing Event 2011, Munich, Germany, Apr. 11-13, 2011.

[6] D. Diner, J. Beckert, T. Reilly, C. Bruegge, J. Conel, R. Kahn, J. Martonchik, T. Ackerman, R. Davies, S. Gerstl, H. Gordon, J. Muller, R. Myneni, P. Sellers, B. Pinty, and M. Verstraete, "Multi-angle Imaging SpectroRadiometer (MISR) instrument description and experiment overview", IEEE Transactions on Geoscience and Remote Sensing, vol. 36, no. 4, pp. 1072-1087, Jul. 1998.

[7] N. Longbotham, C. Bleiler, C. Chaapel, C. Padwick, W. J. Emery, and F. Pacifici, "Urban classification of a WorldView-2 multi-angle sequence", submitted to IEEE Transactions on Geoscience and Remote Sensing.

[8] A. Berk, L. S. Bernstein, and D. C. Robertson, "MODTRAN: A Moderate Resolution Model for LOWTRAN 7", Spectral Sciences, Burlington, MA, 1989.

[9] J. R. Schott, Remote Sensing: The Image Chain Approach, 2nd ed., New York: Oxford Univ. Press, 2007.

[10] B. Hapke, D. Di Mucci, R. Nelson, and W. Smythe, "The cause of the hot spot in vegetation canopies and soils: shadow-hiding versus coherent backscatter", Remote Sensing of Environment, vol. 58, pp. 63-68, 1996.


Fig. 5: Geographical distribution of the Contest's subscribers (countries with at least 10 participants: USA, India, Brazil, Colombia, China, France, Italy, Canada, Russia, Indonesia, Iran, Malaysia, Pakistan, Turkey, Mexico, and the Netherlands; all remaining countries grouped as "other").