
Carnegie Airborne Observatory: in-flight fusion of hyperspectral imaging and waveform light detection and ranging (wLiDAR) for three-dimensional studies of ecosystems

Journal of Applied Remote Sensing, Vol. 1, 013536 (13 September 2007) [DOI: 10.1117/1.2794018]; received 25 May 2007; accepted 9 Sep 2007; published 13 Sep 2007. © 2007 Society of Photo-Optical Instrumentation Engineers.

Gregory P. Asner a, David E. Knapp a, Ty Kennedy-Bowdoin a, Matthew O. Jones a, Roberta E. Martin a, Joseph Boardman b, Christopher B. Field a

a Department of Global Ecology, Carnegie Institution, 260 Panama Street, Stanford, CA 94305 USA

[email protected], [email protected], [email protected], [email protected], [email protected], [email protected]

b Analytical Imaging and Geophysics LLC, 4450 Arapahoe Avenue, Boulder, CO 80303 USA; [email protected]

Abstract. Airborne remote sensing could play a more integrative role in regional ecosystem studies if the information derived from airborne observations could be readily converted to physical and chemical quantities representative of ecosystem processes and properties. We have undertaken an effort to specify, deploy, and apply a new system – the Carnegie Airborne Observatory (CAO) – to remotely measure a suite of ecosystem structural and biochemical properties in a way that can rapidly advance regional ecological research for conservation, management and resource policy development. The CAO “Alpha System” provides in-flight fusion of high-fidelity visible/near-infrared imaging spectrometer data with scanning, waveform light detection and ranging (wLiDAR) data, along with an integrated navigation and data processing approach, that results in geo-orthorectified products for vegetation structure, biochemistry, and physiology as well as the underlying topography. Here we present the scientific rationale for developing the system, and provide sample data fusion results demonstrating the potential breakthroughs that hybrid hyperspectral-wLiDAR systems might bring to the scientific community.

Keywords: airborne remote sensing, data fusion, imaging spectroscopy, LiDAR, spectrometer.

1 INTRODUCTION

Airborne remote sensing should play a key role in Earth science, land management, and conservation because neither ground-based nor satellite measurements can fully capture the spatial heterogeneity of ecosystem structural and functional changes that occur over large geographic areas. However, the information provided by airborne remote sensing depends on the technology and algorithms employed. In recent years, two advanced remote sensing technologies and sciences have matured to a point at which ecosystem structure and chemistry can now be probed and quantified in ways that are useful to conservation, management and resource policy development. Each technology provides unique data that are sufficiently rich in information to allow for highly automated analysis techniques, including accuracy and uncertainty reporting. One technology – imaging spectroscopy (also called hyperspectral imaging) – can provide detailed information on the cover, abundance and concentration of biological materials and biochemicals [1, 2]. The other technology – waveform light


detection and ranging (wLiDAR) – can provide detailed information on the cover, height, shape, and architecture of vegetation, as well as ground topography [3]. When combined, hyperspectral imaging and wLiDAR may provide one of the most powerful, and ultimately practical, sets of ecosystem observations available from the airborne vantage point.

We have undertaken an effort to develop and fully integrate imaging spectroscopy and wLiDAR technologies in a system called the Carnegie Airborne Observatory (CAO; http://cao.stanford.edu). Through in-flight fusion of these technologies, along with new automated algorithms for precise co-location and geo-orthorectification of the hyperspectral and wLiDAR data, the CAO provides an observational suite that simultaneously probes the biochemical and structural properties of ecosystems. This paper provides the background to, and details on, the CAO “Alpha System” design and aspects of its performance. Our primary intention is to demonstrate some of the unique ecological data products and analytical potential facilitated by in-flight fusion of hyperspectral and wLiDAR measurements. In doing so, we provide early results from Hawai'i, where we applied the CAO for studies of forest structure, biochemistry, and physiology.

1.1 Science Rationale

The CAO system was based on a defined set of science questions, goals and needs. The guiding science questions are:

• How are terrestrial ecosystems, and the services they provide, changing in response to climate variability and change?
• How are terrestrial ecosystems, and the services they provide, changing in response to land-use practices and resource use?
• How does diffuse disturbance affect the diversity and functioning of ecosystems?
• How do changes in ecosystem composition, including those caused by invasive species, alter the functioning of terrestrial ecosystems?
• Where are the thresholds or tipping points at which ecosystems undergo sudden transitions from one state to another (e.g., unburned versus burned, or pest free versus pest-infested), and what are the consequences of these transitions for the services ecosystems provide to humans?

Here, we put particular emphasis on “diffuse disturbances” which are small-scale, ubiquitous changes to the composition and physiognomy of ecosystems. Examples of diffuse disturbance include storms, selective logging, recreational vehicle use, and herbivory. Diffuse disturbances and their ecological effects are particularly difficult to quantify from ground- or space-based vantage points, and so airborne remote sensing plays a vital role in providing this information.

To address our scientific questions, the overarching requirement for the CAO was that the system must acquire data which simultaneously expresses the biochemical, physiological, and structural properties of ecosystems, including the vegetation, surface waters, soils, and the underlying topography. The major chemical constituents of interest are plant and microbiological pigments, water, and nitrogen, as well as soil organic carbon. The structural properties of interest are vegetation height, crown shape, vertical layering or stratification, and the underlying terrain. Another background requirement was to utilize the sensor technologies in a way that steps beyond remote sensing indices of vegetation properties, and to instead develop quantitative, physically-based measurements of the materials that comprise ecosystems in three dimensions. The CAO data also needed to be collected in a way that facilitates automated processing and analysis, thereby providing science-ready data on a timely basis, preferably within days or weeks of overflight. This requirement is particularly important for conservation and management applications, where data are often needed for


rapid decision-making. To meet the demands of many field-based research and management programs, the CAO data also needed to be collected at very high spatial resolution, preferably in the 1-5 m range of ground sample distances.

1.2 Measurement Needs

Our science questions and goals drove the measurement needs that resulted in the selection of the CAO core technologies. Whereas multispectral sensors and color-infrared photography provide quantification of vegetation cover and identification of some vegetation types, as well as certain biochemical information, hyperspectral imaging often provides a far more detailed measurement and/or broader suite of measurements that directly express the pigment, water, nitrogen, and carbon chemistry of plants, as well as the mineral and organic content of exposed soils [2]. Hyperspectral imaging also provides a way to automate the correction of water vapor – one of the most important atmospheric constituents to account for in aircraft imagery [4]. In addition, hyperspectral signatures allow flexibility of analysis among spectral features, which is critically important in the evolving effort to map plant species and functional types based on their often unique, but variable, spectral reflectance signatures [5-7]. For these reasons, the CAO design required a hyperspectral imaging sub-system.

Most LiDARs provide information on the 3-D structure of vegetation and topography, and can be collected with a scanning system that provides an image-like data set [3, 8]. Waveform digitization is a specialized form of LiDAR that partitions both the outbound and inbound portion of a laser pulse into very fine time-series “spectra” [9, 10]. Other active and passive imaging technologies, such as synthetic aperture radar (SAR), interferometric SAR, and multi-view angle optical sensing, also provide important metrics of ecosystem structure [11, 12]. However, our requirements for high-resolution, 3-dimensional detail of vegetation structure and underlying topography made small-footprint wLiDAR a high priority sub-system for the CAO.

The CAO effort is driven by the recognition of the strong synergy between hyperspectral imaging and scanning wLiDAR. For example, canopy structure is often a major contributor to hyperspectral reflectance signatures of vegetation [13], although structure alone does not determine the spectra. Moreover, the expression of canopy biochemical and physiological properties in hyperspectral signatures is greatly affected by structure and the resulting shadows that occur within and between vegetation canopies [14, 15]. In contrast, wLiDAR can provide direct measurements of canopy height and crown shape that are key determinants of structure, shadowing, and biomass, but wLiDAR cannot easily distinguish between species or plant functional types that determine the conversion of vegetation heights/shapes to biomass estimates (via species-specific wood densities and other factors) [16]. wLiDAR also cannot distinguish differences in vegetation biochemical and physiological properties, yet the water content and physiological activity of canopy foliage affect the laser returns. The CAO was therefore envisioned to provide the hyperspectral and wLiDAR measurements in very close concert to facilitate the fusion and analysis of both data types. The required precision of hyperspectral and wLiDAR integration called for hardware fusion of the instruments on board the aircraft, and this required a unique data processing and analysis structure to bring the measurements together in a robust, operational way.

1.3 Sensor Limitations and Trade-offs

There were trade-offs to be considered in the selection and deployment of either an imaging spectrometer or a wLiDAR. Furthermore, combining the technologies required an understanding of the operating requirements and limits of each instrument, so that a series of higher order trade-offs could be properly considered to allow coordinated data collection by each sensor on board an aircraft. The trade-offs are well known among instrument designers and engineers, but rarely discussed in the literature with respect to particular scientific


questions. Here, we discuss sensor limitations and trade-offs with respect to our ecological research goals and needs.

1.3.1 Imaging Spectrometer

Our scientific goals to measure ecosystem biochemical and physiological properties required an imaging spectrometer with high signal-to-noise (SNR) performance, detector uniformity, and sensor stability (Table 1) [17]. Imaging spectrometers can be designed with a variety of spectral ranges and spectral resolutions. The most common systems cover either the visible/near-infrared (VNIR) (400-1050 nm) or the VNIR plus shortwave-IR (400-2500 nm) range. VNIR instruments are usually based on silicon detector technology, whereas shortwave-IR instruments use detector materials such as indium-antimonide (InSb) or mercury-cadmium-telluride (HgCdTe) [18]. Both sensor types can use either a scanning mirror or a linear array (pushbroom) approach to populate an image with spectra. Major trade-offs here revolve around the fact that scanning systems are often much larger, and must fly much slower than pushbroom instruments, if a wide swath is to be achieved. However, pushbroom sensors are extremely difficult to design so that each element of the array behaves uniformly. For example, a 1000 pixel-wide detector is analogous to having 1000 individual spectrometers that must be inter-calibrated to exacting tolerances. Scanning mirror systems have one element that is swept across the image swath as the sensor flies in the aircraft. Our biochemical and physiological studies also required spatial resolutions of < 5m, and preferably < 1m (Table 1) to resolve individual plant canopies and vegetation clusters, and the available pushbroom designs were most suited to this task.

Another major technology trade-off involved the rate at which a spectrometer can collect photons and store the resulting electronic signal (called integration time) in comparison to aircraft speed over ground. For practical purposes, the high-fidelity VNIR and full-range spectrometers require integration times that translate to relatively low ground speeds and/or high flying altitudes. This, in turn, directly affects the spatial resolution (ground sample distance) of the image pixels. Since we needed high spatial resolution, we were limited by maximum integration times commensurate with low flying altitudes and high ground speeds.
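To make this trade-off concrete, the sketch below (a hypothetical illustration, not CAO flight-planning code) shows how altitude, instantaneous field-of-view (IFOV), ground speed, and integration time jointly set the pixel footprint of a pushbroom spectrometer; all parameter values are invented for the example.

```python
# Hypothetical illustration of the integration-time / ground-speed / altitude trade-off.
# None of the numbers below are CAO specifications.

def pushbroom_gsd(altitude_m, ifov_mrad, ground_speed_ms, integration_time_s):
    """Approximate (cross-track, along-track) ground sample distance in meters."""
    cross_track = altitude_m * ifov_mrad * 1e-3          # footprint set by the optics
    along_track = ground_speed_ms * integration_time_s   # footprint set by platform motion
    return cross_track, along_track

# Example: 1000 m AGL, 0.5 mrad IFOV, ~100 knots (about 51 m/s), 10 ms integration time
xt, at = pushbroom_gsd(1000.0, 0.5, 51.0, 0.010)
print(f"cross-track GSD ~ {xt:.2f} m, along-track GSD ~ {at:.2f} m")
```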

Table 1. Airborne imaging spectrometer requirements for measurement and monitoring of ecosystem biochemistry and physiology at high spatial resolution.

Specification: Requirement
Spectral range: 400-1050 nm minimum; 400-2500 nm optimal
Spectral resolution (FWHM): < 10 nm minimum; < 5 nm optimal
Spatial resolution: < 5 m minimum; < 1 m optimal
Range of flying altitudes: 500 to 3500 m
SNR 400-1050 nm: > 500 @ 550 nm for live vegetation targets; > 400 @ 850 nm
SNR 1050-2500 nm: > 100 @ 2100 nm for live vegetation targets
Spectral calibration: < 0.5 nm
Spectral uniformity: > 95% cross-track
Spectral IFOV shift: < 5% down spectrum

1.3.2 Waveform LiDAR

The instrument characteristics determine the type and quality of the vegetation structural information that can be derived from wLiDAR. LiDARs produce poor quality data if they: (1) operate at insufficient laser power levels, (2) have slow or unstable scanning mirrors, or


(3) have low performance signal digitizers [19]. Our scientific goals to measure top-of-canopy and ground elevations in highly vegetated areas required very high frequency laser firing and high mirror scan rates (Table 2). High pulse rates produce more laser shots per time and per area, but noise levels also increase at high pulse rates. This directly impacts the probability of canopy penetration to the ground, and the accuracy of the measurement [20]. Older < 50 kHz LiDARs are well known to have problems penetrating canopies at the typical laser wavelengths of 1060-1065 nm [20, 21]. Newer 70+ kHz systems are proving useful for reaching the ground level, even in dense vegetation [22]. However, trade-offs had to be made between pulse repetition rate, beam divergence, laser spot size (spatial resolution), waveform digitization method, and flying altitude. At their currently available power levels, state-of-the-art, small-footprint wLiDAR systems cannot fly high while maintaining a high spatial resolution or enough power in the return pulse to capture the 3-D structure of canopies and topography below.
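As a rough illustration of the pulse-rate trade-off, the sketch below estimates mean laser spot spacing assuming shots are spread evenly over the swath; the altitude, scan angle, and speed used are invented values, not CAO specifications.

```python
# Hypothetical illustration: average laser spot spacing for a scanning LiDAR,
# assuming shots are distributed evenly over the swath.
import math

def mean_spot_spacing(prf_hz, altitude_m, half_scan_angle_deg, ground_speed_ms):
    swath_m = 2.0 * altitude_m * math.tan(math.radians(half_scan_angle_deg))
    shots_per_m2 = prf_hz / (swath_m * ground_speed_ms)   # shots per square meter of ground
    return 1.0 / math.sqrt(shots_per_m2)                  # spacing of an equivalent even grid

for prf_khz in (50, 100):
    s = mean_spot_spacing(prf_khz * 1000.0, 1000.0, 22.0, 51.0)
    print(f"{prf_khz} kHz at 1000 m AGL -> ~{s:.2f} m mean spot spacing")
```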

Table 2. Airborne wLiDAR requirements for measurement and monitoring of ecosystem 3-D structure.

Specification: Requirement
Laser pulse repetition frequency: > 50 kHz minimum; > 100 kHz optimal
Discrete laser return measurement mode: > 4 laser ranges/4 laser intensities
Waveform laser return measurement mode: > 200 elevations per laser pulse
Range of flying altitudes: 500 to 3500 m
Effective spatial resolution/laser spot spacing: < 2 m minimum; < 1 m optimal
Laser point distribution: Evenly spaced across swath, across- and down-track

1.3.3 Navigation and Tracking Hardware

Navigation and tracking hardware is required to generate information on the location and orientation of the sensors onboard an aircraft. This information is essential for accurate projection of the collected data (hyperspectral radiance and laser shots) onto the ground below the airplane. The importance of this hardware cannot be overstated. For years, the efficacy of airborne remote sensing was severely limited when the sensor positioning was poorly known, leading to long and costly post-processing efforts, and often resulting in large volumes of unused data. Both the validity of the sensor measurements and their use in an operational program require high performance positioning data. The two essential technologies are global positioning systems (GPS) and inertial measurement units (IMU). IMU technology has evolved in recent years, and is now providing sub-millidegree roll, pitch and yaw accuracies for sensors onboard aircraft. Properly combined with survey-grade, in-flight GPS measurements, the IMU output data stream serves as the primary vehicle for fusion of multiple sensor technologies in flight. The GPS and IMU sub-systems had to be fully integrated with the spectrometer and wLiDAR systems.
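The sketch below is a minimal, hypothetical illustration of why the GPS and IMU streams must be used together: the antenna position only becomes a sensor position once the lever arm between them is rotated by the aircraft attitude. The lever arm, attitude, and coordinates are invented values.

```python
# Minimal sketch (hypothetical values): locate a sensor's optical center from the GPS
# antenna position, the measured lever arm, and the IMU attitude (roll, pitch, yaw).
import numpy as np

def attitude_matrix(roll_deg, pitch_deg, yaw_deg):
    """Body-to-local rotation matrix, applied as yaw * pitch * roll (ZYX convention)."""
    r, p, y = np.radians([roll_deg, pitch_deg, yaw_deg])
    Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

antenna_pos = np.array([305000.0, 2175000.0, 1000.0])  # easting, northing, altitude (m)
lever_arm   = np.array([0.8, -0.2, -1.1])              # antenna -> sensor offset, body frame (m)
sensor_pos  = antenna_pos + attitude_matrix(2.0, -1.5, 95.0) @ lever_arm
print(sensor_pos)
```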

1.3.4 Software and Computing

Hyperspectral and wLiDAR systems generate large volumes of data, which require specialized software and high-performance, distributed computing for processing and analysis of large geographic areas. The list of core software modules for in-flight and post-flight fusion of hyperspectral and LiDAR measurements is provided in Table 3.


Table 3. Basic software requirements for pre-processing and fusion of hyperspectral and wLiDAR data.

Processing requirement: Sensor trajectory analysis. Major steps: GPS processing; IMU processing. Dependencies: sensor location onboard aircraft.
Processing requirement: Hyperspectral reflectance. Major steps: radiometric correction; geo-orthorectification; atmospheric correction. Dependencies: GPS-IMU trajectory; LiDAR digital elevation model.
Processing requirement: LiDAR ranges, waveforms, intensities. Major steps: laser range analysis; timing corrections; geometric distortion modeling. Dependencies: GPS-IMU trajectory.

2 THE CAO ALPHA SYSTEM

The CAO Alpha System design addresses many of the needs and trade-offs described in Section 1 to provide an integrated hyperspectral-wLiDAR system with precise 3-D positioning (Table 4). The sensor head assembly and data control systems (Fig. 1) weigh approximately 100 kg and require 50 amps of 28v-DC power onboard the aircraft. The system is small enough to be deployed on a wide variety of single- and twin-engine aircraft.

Fig. 1. Schematic of CAO Alpha System sensor head package and onboard control systems.

2.1 Spectrometer and wLiDAR

The Alpha spectrometer is a pushbroom array with a diffraction grating and Offner spectrograph [17]. The design is a new version of the Compact Airborne Spectrographic Imager (CASI-1500) [23, 24], with custom anti-reflective lens coatings to decrease stray light and to boost SNR, a high-throughput read-out to lower integration times, and a VNIR cooling system to increase sensor stability. The spectrometer provides flexible programmability, very fine spectral resolution, a range of flight altitudes and aircraft speeds, and overall fidelity (SNR, detector uniformity, stability of electronics) (Table 4). Laboratory calibration tests


demonstrated instrument spectral smile of 0.01 pixels per 1500 cross-track elements, and spectral keystone of only 0.02 from 365 nm to 1050 nm. This design exceeded our core requirements for uniformity, and noticeably exceeded performance specifications for standard CASI-1500 systems [23, 24], but it did not meet our goal of achieving full spectral range (400-2500 nm) sampling. Knowing that imaging spectroscopy cannot achieve its potential of providing quantitative chemical analyses using an instrument that lacks high SNR, spectral uniformity or calibration, and given a limited initial budget, we opted to maintain a high degree of spectral and image quality over a shorter spectral range of 369-1052 nm. This provided a critical minimum set of observations related to canopy chemistry, physiology, and type (Table 5). It failed, however, to provide chemical measurements for some carbon constituents [1].

The Alpha wLiDAR operates at 1064 nm with laser pulse repetition rates programmable to 100 kHz (Table 4). A customized version of the Optech ALTM-3100EA was selected, along with a new waveform digitization system. Scan frequency is fully programmable, as is scan angle up to 44 degrees across the image swath. The system can be operated in both discrete-return and waveform modes simultaneously. In discrete-return mode, the system provides four ranges and four intensities per laser shot. The waveform digitizer provides the outbound and inbound waveforms in at least 220 contiguous returns per laser shot. The laser beam divergence of the system was custom-designed to precisely match the IFOV of the imaging spectrometer.

Table 4. The CAO Alpha System design specifications.

Component: Description
Spectrometer: Pushbroom array, diffraction grating, Offner design; 373-1052 nm spectral range; 1500 cross-track pixels; 40 degree field-of-view; fully programmable, up to 2.4 nm spectral resolution (288 bands); 14-bit dynamic range; spectral smile = 0.1 pixels across the entire 1500 pixel array; spectral keystone = 0.02 from 365 to 1050 nm; SNR at nadir = ~400 @ 550 nm on 10% reflectance target; SNR at nadir = ~400 @ 850 nm on 30% reflectance target; coincident downwelling radiance sensor (200-1100 nm; 2048 bands); instantaneous field-of-view = 0.56 mrad to match wLiDAR; compatible with same GPS/IMU data stream as wLiDAR
wLiDAR: Wavelength = 1064 nm; 12-bit dynamic range for LiDAR intensities; waveform digitization with nanosecond temporal resolution, up to 440 slices or elevations per laser shot; laser repetition programmable up to 100 kHz; scan angle programmable up to 44 degrees; scan frequency programmable up to 70 Hz; laser beam divergence of 0.56 mrad (1/e) to match spectrometer; compatible with same GPS/IMU data stream as spectrometer
IMU: 200 Hz high-performance FOG gyros; silicon accelerometers; performance: velocity = 0.005 m/s, roll and pitch = 0.005 deg, heading = 0.008 deg
GPS: L1/L2 compatible; 43 db; 12 channel dual frequency; 10 Hz raw data rate
Sensor Mount: Floating-plate design with six pneumatic mounts for vibration dampening
Pilot Controls: Navigation display controlled by instrument operator from rear of aircraft


2.2 Operational Limits

The CAO Alpha System requires an aircraft that can fly in the altitude range of 1000-3500 m a.g.l., and a speed of 90-120 knots over ground. This requirement is based on the spectrometer and LiDAR sensor specifications previously described. At a flight speed of 100 knots, and an altitude of 1000 m a.g.l., the CAO Alpha System can provide hyperspectral and LiDAR data at 0.45 m spatial resolution and laser spot spacing. At 3000 m a.g.l., the integrated system provides 1.35 m data. Under clear sky conditions, the system can cover more than 400 ha and 1,800 ha per hour from 1000 m and 3000 m a.g.l., respectively.

3 SYSTEM PERFORMANCE

3.1 Imaging Spectrometer

The spectrometer calibration was tested in the laboratory using a SphereOptics 20” integrating sphere traceable to the National Institute of Standards and Technology in accordance with ISO 10012-1 and MIL-STD-45662. Operational-mode imagery was collected at a range of apertures and integration times to evaluate the quality of the instrument calibration. The results showed absolute calibration errors of less than 0.015% throughout the spectrum (Fig. 2). These results demonstrated that the spectrometer sub-system of the CAO met our measurement needs, and thus the science needs, as described in section 1.3. In addition, Guanter et al. [24] recently completed a comprehensive evaluation of a standard CASI-1500 spectrometer, showing high uniformity and SNR, but a 2.3 nm spectral shift when the system is run in ultra-fine spectral resolution mode. Based on their work, we limit our operational spectral resolution to a minimum of 4.6 nm (FWHM) to minimize the effects of spectral shift on our scientific results.

Fig. 2. Spectral calibration of the CAO spectrometer in comparison to a NIST ISO 10012-1 light source, with integration time set at 25 ms and data averaged across the imaging array.

Following the laboratory tests, we flew the sensor in an area of volcanic terrain in Hawai'i that contains both bright forested and dark basalt targets. We did not take the opportunity to deploy ground calibration targets because high-fidelity imaging spectrometers theoretically no longer require spectral calibration targets [25]. That is, the spectra are of sufficiently low noise and high uniformity to facilitate automated retrieval of surface reflectance from physical


models. This has proven true on a regular basis using the JPL Airborne Visible and Infrared Imaging Spectrometer (AVIRIS), which, via a rapid increase in its performance specifications in the past six years, has been repeatedly used without field calibration to assess a range of biophysical and geophysical phenomena, starting with apparent surface reflectance [25-27]. We flew the same volcanic area with the AVIRIS and CAO spectrometers, although sun angles differed between image dates (Fig. 3). The AVIRIS data were collected on a clear-sky day at 3,200 m a.g.l. providing 3.2 m resolution imagery. The CAO was also flown on a clear-sky day, but at 2,000 m a.g.l., providing 1.0 m data. We then selected sunlit portions of large rainforest canopies and large patches of bare basalt for a comparison of the data taken by each sensor. Both images were processed using the ACORN-5 atmospheric correction model (Imspec LLC., Palmdale, CA, USA) with the same input parameters: (1) visibility set to 250 km, (2) water vapor retrieval using only the 940 nm feature, and (3) tropical atmosphere. The sun and sensor geometry were the only inputs unique to each instrument.

Fig. 3. Comparison areas for CAO (left) and AVIRIS (right) apparent surface reflectance images over volcanic terrain containing rainforest canopy (top) and shrubland/bare basalt (bottom). Image acquisition sun geometries: SZA = 52°, SAZ = 95° and SZA = 54°, SAZ = 191° (SZA = solar zenith angle; SAZ = solar azimuth angle).

The resulting spectra from each sensor showed very similar overall fidelity; the derived vegetation reflectance spectra of common sunlit canopies differed by 1-3% absolute, as the example in Fig. 4a shows. Airborne reflectance measurements of dark basalt were also highly comparable, with maximum differences of less than 1% absolute (Fig. 4b). This vicarious test of the CAO spectrometer against what is arguably the highest fidelity sensor currently in operation – AVIRIS – demonstrated the overall, end-user performance of the CAO spectrometer. Future studies focused solely on sensor performance will be carried out using a wider range of targets with both the CAO and AVIRIS, and with tighter control over sun-sensor geometries, sky conditions, and other factors required for a more complete analysis and inter-calibration of the two sensors. Until then, this initial test indicated the overall performance of the CAO spectrometer.

Fig. 4. Example comparisons of CAO and AVIRIS apparent surface reflectance. Shown are mean spectra taken from 36 co-located, sunlit pixels of (a) forest canopy and (b) bare basalt rock targets.

3.2 wLiDAR

The wLiDAR was also calibrated in the lab and then tested during flight from altitudes of 750-3,500 m a.g.l. Flights were conducted over an airport runway (3 km length) and flat-roof building (120 m length), both of which were surveyed at high spatial resolution (< 5 m) using standard ground-based survey techniques. The wLiDAR was operated at 33, 50, 70 and 100 kHz laser pulse repetition rates, depending upon flying altitude on each overpass. Repeat wLiDAR overpasses showed absolute horizontal accuracies of ± 0.05-0.08 m (n = 1,684 test points), and vertical accuracies of ± 0.06-0.14 m (n = 20,821 test points).


4 CAO PROCESSING AND DATA FUSION

4.1 Processing Stream

The critical first steps in developing the fusion products listed in Table 6 involve high-precision co-location and ortho-georectification of LiDAR discrete return and waveform ranges and intensities with spectral radiance measurements, in an effective data “cube” of up to 540 bands (up to 288 hyperspectral, 220 LiDAR waveform intensities, 4 discrete-return LiDAR ranges and intensities, plus an additional 15-20 bands of ephemeris data on solar angle, view angle, and other data set properties). We developed a new co-location and ortho-georectification process to ray trace the imaging spectrometer pixels and LiDAR spots from the instrument optical centers onboard the aircraft down to the ground. Our ray tracing software requires high precision and accuracy GPS-IMU navigation and trajectory analysis of the instruments on board the aircraft (Fig. 5).

Fig. 5. Trajectory of a CAO flight over Hawaii, which shows the precise location and attitude of the CAO instrumentation above ground. Graph axes show distances in easting, northing and altitude in meters.

The high-precision and high-accuracy trajectory and attitude information form the common link for the detailed ray tracing of the two separate sensors. Both sensors receive the same GPS time-stamping, which allows for the common use of the GPS-IMU data stream among instruments. In addition, the lever-arms (or offsets) between the GPS antenna, IMU, LiDAR and spectrometer optical centers are precisely measured and modeled on board the aircraft; these offsets are used with the optical models of the two sensor heads to solve for the precise co-location and projected geo-location of all data in post-flight processing.
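A minimal sketch of that ray-tracing idea follows, under the simplifying assumptions of level flight and a single return (all coordinates and ranges are hypothetical): the ground coordinate of a laser shot, or of a spectrometer pixel's line of sight, is the sensor position plus the measured range along the attitude-rotated look vector.

```python
# Hedged sketch of projecting one laser range (or pixel line of sight) to the ground.
# Level flight (identity attitude) is assumed and all values are hypothetical.
import numpy as np

def ray_to_ground(sensor_pos, R_body_to_local, look_vector_body, range_m):
    los = R_body_to_local @ (look_vector_body / np.linalg.norm(look_vector_body))
    return sensor_pos + range_m * los

sensor_pos = np.array([305000.0, 2175000.0, 1000.0])   # easting, northing, altitude (m)
R = np.eye(3)                                           # roll = pitch = yaw = 0 for brevity
off_nadir = np.radians(10.0)                            # shot 10 degrees off-nadir in the scan direction
look = np.array([np.sin(off_nadir), 0.0, -np.cos(off_nadir)])
print(ray_to_ground(sensor_pos, R, look, 1015.0))       # approximate ground (x, y, z)
```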

For both the wLiDAR system and the hyperspectral imagery, each pixel is individually ray traced using a full optical sensor model and the trajectory data. The pixels of the passive imagery are ray traced to the surface defined by the first returns of the wLiDAR data. The point-for-point alignment of the wLiDAR and passive image data is complicated by the inherent differences in the scanning geometries of the two systems and the further distortions

Downloaded From: https://www.spiedigitallibrary.org/journals/Journal-of-Applied-Remote-Sensing on 9/11/2017 Terms of Use: https://spiedigitallibrary.spie.org/ss/TermsOfUse.aspx

Journal of Applied Remote Sensing, Vol. 1, 013536 (2007) Page 12

of the ground sampling grid due to topography, aircraft motion and attitude, and non-nadir look directions. Our approach is to recover best-estimates for each pixel center location in three dimensions for both the wLiDAR and hyperspectral data. These actual locations can be used in detailed spectroscopic modeling and also for rendering of the two data sets into a common grid for overlay, comparison and product generation. Fig. 6 shows a subset of hyperspectral imagery and wLiDAR first returns for the Institute of Pacific Islands Forestry, US Forest Service facility near Hilo, HI, rendered on a square pixel grid of 0.5 m spacings. The precise and accurate co-alignment of the two data sets is demonstrated by the match between the wLiDAR and passive imagery and the three-dimensional draping of the imagery over the wLiDAR data. In this example, we calculated a root-mean-square position uncertainty of < 0.15 m, or less than 1/3 of the 0.5 m pixel size of these rendered results. The stability of these solutions has proven very high, and inter-swath boundaries between adjacent flightlines are usually only manifest by bidirectional reflectance differences with no relative positional errors visible at the pixel level. The high accuracy and high precision of the trajectory solution, coupled with detailed optical models for the sensors, permits sub-pixel collocation of the two data sets and opens the door to multi-angular, multi-temporal, and multi-resolution studies.
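The sketch below shows one simple way such a rendering step can work, assuming scattered first returns and a 0.5 m cell size; it keeps the highest return per cell and is only a stand-in for the CAO production gridding, with synthetic return coordinates.

```python
# Illustrative stand-in (not CAO production code): rasterize scattered LiDAR first
# returns onto a regular 0.5 m grid, keeping the highest return per cell.
import numpy as np

def grid_first_surface(x, y, z, cell=0.5):
    cols = np.floor((x - x.min()) / cell).astype(int)
    rows = np.floor((y - y.min()) / cell).astype(int)
    grid = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    for r, c, h in zip(rows, cols, z):
        if np.isnan(grid[r, c]) or h > grid[r, c]:
            grid[r, c] = h                    # highest return per cell = first surface
    return grid

rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 5000), rng.uniform(0, 10, 5000)   # synthetic return locations (m)
z = 20.0 * np.exp(-((x - 5)**2 + (y - 5)**2) / 8.0)          # synthetic canopy heights (m)
surface = grid_first_surface(x, y, z)
print(surface.shape, round(float(np.nanmax(surface)), 1))
```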

Fig. 6. Example ray tracing results for in-flight fusion of CAO hyperspectral imaging and LiDAR data.

The CAO processing stream is outlined in Fig. 7, and starts with the trajectory and ray tracing analysis described above. All other data products rely upon this step. Throughout the remainder of this section, we highlight key aspects of the processing stream, providing some examples from a pilot study in a forest and agriculture landscape in Hawaii. The steps described here are intended to show how in-flight data fusion can be used to develop a series of related data products for ecosystem analysis. We did not intend to validate each product here, as that will be the focus of specific upcoming studies. However, all products have specific heritage in the literature, as cited.

A series of core and synthetic fusion data products are derived from the CAO processing stream and include upper-canopy pigment concentrations and indices, canopy water content and indices, canopy height and architecture, and ground topography. These products were


selected based on our scientific requirements (section 1.1) as well as the overall maturity of the published algorithms [2-4, 8, 28-36]. Higher-order synthetic products from the CAO include aboveground biomass (carbon) stocks in vegetation, canopy light-use efficiency (g C per MJ absorbed photosynthetically active radiation), gross primary production (GPP), and species dominance and diversity. Some of these synthetic products require ecosystem and physically-based remote sensing models [36-39].

Following trajectory and ray tracing analysis, the spectrometer data are converted to at-sensor radiance values using laboratory calibration spectra as shown in Fig. 2. Apparent surface reflectance is then derived from the radiance data using an automated atmospheric correction model, ACORN 5LiBatch (Imspec LLC, Palmdale, CA). Inputs to the atmospheric correction algorithm include ground elevation (from LiDAR), aircraft altitude (from GPS-IMU), solar and viewing geometry, atmosphere type (e.g., tropical), and estimated visibility (in kilometers). The code uses a MODTRAN look-up table to correct for Rayleigh scattering and aerosol. Water vapor is estimated directly from the spectral signatures using the 940 nm water vapor feature in the radiance data [4]. Additional atmospheric corrections can be made using downwelling spectral irradiance data collected by a sensor mounted on the top of the aircraft fuselage, with data logged and embedded in the imaging spectrometer data collected every ~30 lines of image data.

Fig. 7. The CAO data processing stream.


Sample output from the radiance conversion and atmospheric correction is shown in Fig. 8, demonstrating the overall spatial and spectral fidelity of the data taken at 0.45 m spatial resolution (ground sample distance). These images are used in a sequence of example data fusion analysis steps shown below.

Fig. 8. Sample spectral images (0.45 m pixel size) of (a) radiance and (b) atmospherically-corrected reflectance. Insets show example radiance and reflectance spectra from the indicated circle for wavelengths spanning 367-1052 nm.

While the spectrometer data are radiance- and atmospherically-corrected, the LiDAR laser shots are modeled to retrieve ground terrain (bald Earth), canopy surface, and canopy profile information. Very high pulse repetition rates of 70-100 kHz increase the probability of vegetation penetration to retrieve ground elevation, and thus canopy height.
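A minimal sketch of the height retrieval follows, assuming the ground DEM and the first-return surface have already been gridded to a common raster (the arrays below are synthetic placeholders): canopy height is simply their difference.

```python
# Minimal sketch with synthetic rasters: canopy height model = first-return surface - ground DEM.
import numpy as np

rng = np.random.default_rng(1)
ground_dem    = np.full((100, 100), 120.0)                       # hypothetical terrain elevation (m)
first_surface = ground_dem + rng.uniform(0.0, 25.0, (100, 100))  # hypothetical top-of-canopy surface (m)

canopy_height = first_surface - ground_dem                       # canopy height model (m)
canopy_height[canopy_height < 0.5] = 0.0                          # treat near-ground returns as bare surface
print(f"max canopy height ~ {canopy_height.max():.1f} m")
```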

Fig. 9. Sample LiDAR first surface (vegetation and ground) image of the same area shown in Fig. 8. Tall trees are shown in blue with progressively shorter vegetation in green and red. Also shown are the extracted waveform and a pseudo-waveform from the canopy pixel in the black circle.


Canopy vertical profiling is provided using the LiDAR waveforms or as a simulated pseudo-waveform derived from spatially-aggregated, discrete-return data (Fig. 9). These pseudo-waveforms are created by binning the discrete returns over a defined horizontal spatial scale (e.g., 2 x 2 m), then vertically by elevation (e.g., 0-0.5 m, 0.5-2 m, 2-4 m, 4-6 m…).
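A minimal sketch of that binning step for a single 2 m x 2 m column is shown below, using synthetic return heights and the bin edges quoted above.

```python
# Sketch of a pseudo-waveform for one horizontal cell: counts of discrete returns per height bin.
import numpy as np

def pseudo_waveform(return_heights_m, bin_edges_m=(0.0, 0.5, 2, 4, 6, 8, 10, 15, 20, 30)):
    counts, _ = np.histogram(return_heights_m, bins=np.asarray(bin_edges_m, dtype=float))
    return counts

# Synthetic heights above ground for returns falling within one 2 m x 2 m cell
heights = np.random.default_rng(2).uniform(0.0, 24.0, 60)
print(pseudo_waveform(heights))
```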

Following the single-instrument analysis steps with the spectrometer and LiDAR data, a series of integrated fusion analyses are then undertaken. First, the lighting conditions in each spectrometer pixel are precisely determined using a combination of the LiDAR canopy-surface digital elevation model (DEM; Fig. 9), vegetation height product, and solar and viewing geometry. This facilitates the critical step of selecting spectral reflectance signatures from the spectrometer data categorized under specific illumination conditions (Fig. 10). For example, fully sunlit portions of vegetation canopies exceeding a given height above ground can be flagged for additional biochemical and physiological analysis.
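The full CAO shadow model ray traces the solar beam against the LiDAR canopy surface (Fig. 10); the sketch below is a simplified stand-in that flags pixels by comparing canopy-surface normals against the sun direction and applying a height cutoff, with synthetic inputs and illustrative thresholds.

```python
# Simplified stand-in for the illumination flagging step (the CAO approach ray traces
# the solar beam against the LiDAR surface). Inputs and thresholds here are synthetic.
import numpy as np

def sunlit_mask(surface_dem, canopy_height, sun_zenith_deg, sun_azimuth_deg,
                cell=0.5, min_height=1.0, min_cos=0.5):
    dzdy, dzdx = np.gradient(surface_dem, cell)                  # slopes along the two grid axes
    normals = np.dstack((-dzdx, -dzdy, np.ones_like(surface_dem)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    zen, az = np.radians(sun_zenith_deg), np.radians(sun_azimuth_deg)
    sun_vec = np.array([np.sin(zen) * np.sin(az), np.sin(zen) * np.cos(az), np.cos(zen)])
    cos_illum = np.tensordot(normals, sun_vec, axes=([2], [0]))  # modeled illumination per pixel
    return (cos_illum > min_cos) & (canopy_height > min_height)

dem = 120.0 + np.random.default_rng(3).uniform(0.0, 25.0, (200, 200))   # synthetic canopy surface (m)
mask = sunlit_mask(dem, dem - 120.0, sun_zenith_deg=52.0, sun_azimuth_deg=95.0)
print(f"fraction of pixels retained: {mask.mean():.2f}")
```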

Fig. 10. Within-canopy and inter-crown shadow (black), plus all vegetation lower than 1 m (black), determined and masked by ray tracing of incoming solar radiation to LiDAR vegetation shots back to the sensor. The remaining pixels (gray) are retained for subsequent biochemical and physiological analyses.

The sunlit spectrometer pixels are then passed to a modified version of the AutoMCU code [35] for automated analysis of photosynthetic and non-photosynthetic canopy fractions (Fig. 11). The importance of shade-canopy masking (from LiDAR) is illustrated in Fig. 11, where errors in spectral mixture analysis caused by intra-canopy, inter-canopy, and canopy-to-ground shade are removed altogether from the analysis. This allows for a more quantitative and reliable determination of photosynthetic and non-photosynthetic fractional cover in and among canopies. The non-photosynthetic fraction, above a given height threshold determined by the LiDAR data, is then extracted as a final product for standing dead material (blue in Fig. 11).
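The AutoMCU code itself is a Monte Carlo unmixing approach [35]; the sketch below is a simplified stand-in that illustrates the underlying idea by decomposing one pixel spectrum into photosynthetic, non-photosynthetic, and bare-substrate fractions with non-negative least squares. The endmember spectra are synthetic placeholders.

```python
# Simplified stand-in for the spectral mixture analysis step (not the AutoMCU algorithm):
# decompose a pixel spectrum into photosynthetic vegetation (PV), non-photosynthetic
# vegetation (NPV), and bare-substrate fractions with non-negative least squares.
import numpy as np
from scipy.optimize import nnls

wl   = np.linspace(400, 1050, 100)
pv   = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 720.0) / 15.0))   # fake green-vegetation spectrum
npv  = 0.10 + 0.0004 * (wl - 400.0)                          # fake dry/dead-vegetation spectrum
soil = 0.08 + 0.0002 * (wl - 400.0)                          # fake bare-substrate spectrum
E = np.column_stack((pv, npv, soil))                         # endmember matrix (bands x endmembers)

pixel = 0.6 * pv + 0.3 * npv + 0.1 * soil                    # synthetic mixed-pixel reflectance
fractions, _ = nnls(E, pixel)
fractions /= fractions.sum()                                  # normalize fractions to sum to one
print(dict(zip(("PV", "NPV", "substrate"), np.round(fractions, 2))))
```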

Following the determination of sunlit pixels dominated by live foliage, these portions of the canopies can then be analyzed for expressed pigments, water content and nutrients (Fig. 7). Three types of algorithms are currently employed based on their overall maturity and general applicability, as highlighted throughout the literature [36, 40, 41]. First, a set of vegetation indices (VI) are computed on an automated basis, providing metrics of chlorophyll concentration [28], carotenoid and anthocyanin concentrations [29], light-use efficiency (PRI; [42]), canopy water content [4], and leaf area index (LAI; [43]). Second, partial least squares analysis (PLS) is employed to estimate upper-canopy leaf nitrogen (N) and pigment concentrations [31, 32]. Third, we are testing canopy radiative transfer model inversion

Downloaded From: https://www.spiedigitallibrary.org/journals/Journal-of-Applied-Remote-Sensing on 9/11/2017 Terms of Use: https://spiedigitallibrary.spie.org/ss/TermsOfUse.aspx

Journal of Applied Remote Sensing, Vol. 1, 013536 (2007) Page 16

approach to simultaneously estimate leaf N, multiple pigments, and canopy leaf area index (LAI) [36]. The key point here is that a suite of approaches can be brought to bear on the biochemical analyses, and their accuracy is greatly improved by prior isolation of sunlit canopy pixels of a given vegetation height class [14]. Fig. 12 provides examples of the expressed canopy light-use efficiency (LUE) [42] and leaf N derived from canopy radiative transfer model inversion [36]. The LUE image (left) shows canopies of apparent high CO2 uptake per unit of absorbed photosynthetically active radiation in red, with progressively lower values in yellow, green and blue. Notice that portions of tree crowns with very high sunlit exposure to the south have lower apparent LUE (yellows), as would be expected from leaf-level studies of LUE [44]. The leaf N image (right) shows a slightly different pattern with surprising heterogeneity within individual tree crowns. Highest N concentrations (> 1.7%) are shown as orange colors, with yellows, greens and blue showing progressively lower N values.

Fig. 11. (a) Photosynthetic (green), non-photosynthetic (blue), and shade/soil fractions (red) using the AutoMCU algorithm [27]. (b) Same as left image, but with all shadows, intra-crown shade and vegetation < 1 m masked by LiDAR. Notice the standing dead vegetation in blue in the right image.

Fig. 12. (a) PRI and (b) leaf N concentration derived from the reflectance image shown in Fig. 8.
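A minimal sketch of two of the narrow-band indices mentioned above is shown below, computed from an apparent-reflectance spectrum; the band positions follow the cited index definitions, while the spectrum and band sampling are synthetic.

```python
# Sketch of two narrow-band indices from a (synthetic) canopy reflectance spectrum:
# the photochemical reflectance index (PRI) [42] and a red-edge chlorophyll index [28].
import numpy as np

def band(reflectance, wavelengths_nm, center_nm):
    """Reflectance at the band nearest the requested wavelength."""
    return reflectance[np.argmin(np.abs(wavelengths_nm - center_nm))]

wl   = np.linspace(369, 1052, 288)                          # hypothetical 288-band sampling
refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 720.0) / 15.0))   # synthetic canopy reflectance

r531, r570 = band(refl, wl, 531), band(refl, wl, 570)
pri = (r531 - r570) / (r531 + r570)                          # light-use efficiency proxy
ci_red_edge = band(refl, wl, 800) / band(refl, wl, 720) - 1.0   # chlorophyll proxy
print(f"PRI = {pri:.3f}, red-edge CI = {ci_red_edge:.2f}")
```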


The sunlit canopy spectra, and derived biochemical metrics, can then be analyzed in the context of species dominance and richness [6, 39]. For example, the first derivative of reflectance among sunlit portions of canopies can depict the overall species richness within a prescribed area such as a 0.1 ha kernel [39]. Additionally, the LiDAR canopy heights and crown diameters can be combined with either general or species-specific allometric equations to estimate aboveground live biomass (Fig. 13a) [3, 45, 46]. This image shows aboveground biomass from highest values (white) to progressively lower values in green and black. In addition, the LAI, PRI, and N results can be ingested into models, such as PnET [37] and CASA-2 [47] to derive estimates of gross and net primary production (Fig. 13b). Here we used CASA-2 to estimate instantaneous gross primary production (GPP) across the forest canopies and in the partial clearing.
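A hedged sketch of the biomass step follows: a generic power-law allometry converting LiDAR canopy height to aboveground biomass. The coefficients are placeholders; as noted above, operational applications rely on general or species-specific allometric equations [3, 45, 46] informed by the spectroscopic species/type maps.

```python
# Hedged sketch: generic power-law allometry from LiDAR canopy height to aboveground biomass.
# Coefficients a and b are illustrative placeholders, not calibrated values.
import numpy as np

def biomass_from_height(mean_canopy_height_m, a=15.0, b=1.3):
    """Aboveground biomass (Mg/ha) as a * H**b for illustration only."""
    return a * np.power(mean_canopy_height_m, b)

plot_heights = np.array([5.0, 12.0, 20.0, 28.0])         # hypothetical mean canopy heights (m)
print(np.round(biomass_from_height(plot_heights), 1))     # Mg/ha per plot
```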

Fig. 13. (a) Aboveground live tree biomass and (b) GPP derived from the products shown in Figs. 8-12.

These example data products of varying scientific maturity benefit directly from the full fusion of the hyperspectral and wLiDAR measurements: wLiDAR provides the detailed structural information for shadow determination in the spectroscopy data, while the spectroscopy provides information on live-dead fraction for partitioning biomass estimates derived from the wLiDAR. Furthermore, the complete fusion of spectrometer and wLiDAR measurements narrows the solution domain for a variety of integrative products such as growth rates. Future efforts will explore these interactions and the potential benefit of in-flight data fusion.

5 CONCLUSIONS

Regional ecology – and its connection to conservation, management and resource policy development – is often limited by a lack of detail in large-scale observations. The observations often fall short in delivering biophysical information, and/or cannot be acquired at sufficient spatial resolution while maintaining substantial geographic coverage. Imaging spectroscopy and wLiDAR are complementary technologies that could rapidly advance regional ecological science by providing a suite of measurements inter-relating vegetation biochemistry, physiology, and structure as well as topography. Coordinated use of imaging spectrometer and wLiDAR data is complicated by stark differences in the design of the instrumentation; i.e., pushbroom spectrometers and scanning-mirror wLiDARs. Co-location of spectrometer and wLiDAR data is also extremely difficult to achieve from two separate aircraft. Even co-registration to within small 0.5-2.0 m offsets between the two data sets is difficult to achieve, yet these seemingly small offsets can render a combined data set nearly unusable for many ecological studies that require a


simultaneous understanding of the functional and structural properties of vegetation and ecosystems. Here we presented the background scientific rationale, system specifications, and fusion approaches for bringing imaging spectroscopy and waveform LiDAR into a truly integrated, in-flight data fusion framework deployed on a single aircraft platform. Basic calibration and example data products derived from the fusion show great promise in making 3-D imaging of ecosystems a reality for regional ecological science. In particular, the potential for multi-scale analysis of how structural, biochemical and physiological properties of vegetation co-vary and respond to environmental change will be rapidly advanced by in-flight fusion approaches.

Applications for integrated imaging spectroscopy and waveform LiDAR observations extend far beyond regional ecological science. Conservation, management and resource policy efforts need regional-scale, spatially-explicit information on the goods and services provided by ecosystems. For example, carbon sequestration, water quantity and quality, and cultural resources are often defined by the structure, physiology, and diversity of vegetation, and its relation to hydrological networks and terrain. The CAO Alpha System provides detailed measurements of these contributors to ecosystem goods and services in a way that could clearly impact planning and decision-making. The Carnegie Airborne Observatory is serving as a new pathfinder for future programs seeking detailed ecological and geophysical data at the regional scale.

Acknowledgments

The Carnegie Airborne Observatory is made possible by the generous support of the W.M. Keck Foundation and William Hearst III. We thank Susan Ustin for her sustained advice, scientific support and editorial comments, as well as three anonymous reviewers for their helpful suggestions. We also thank our engineering partners ITRES Research Ltd., Optech Inc., Applanix Inc., Analytical Imaging and Geophysics LLC, and ImSpec LLC.

References

[1] P. J. Curran, "Remote sensing of foliar chemistry," Rem. Sens. Environ. 30, 271-278 (1989) [doi:10.1016/0034-4257(89)90069-2].

[2] S. L. Ustin, D. A. Roberts, J. A. Gamon, G. P. Asner, and R. O. Green, "Using imaging spectroscopy to study ecosystem processes and properties," Bioscience 54, 523-534 (2004) [doi:10.1641/0006-3568(2004)054[0523:UISTSE]2.0.CO;2].

[3] M. A. Lefsky, W. B. Cohen, G. G. Parker, and D. J. Harding, "Lidar Remote Sensing for Ecosystem Studies," BioScience 52, 19-30 (2002) [doi:10.1641/0006-3568(2002)052[0019:LRSFES]2.0.CO;2].

[4] B.-C. Gao and A. F. H. Goetz, "Retrieval of equivalent water thickness and information related to biochemical components of vegetation canopies from AVIRIS data," Rem. Sens. Environ. 52, 155-162 (1995) [doi:10.1016/0034-4257(95)00039-4].

[5] M. L. Clark, D. A. Roberts, and D. B. Clark, "Hyperspectral discrimination of tropical rain forest tree species at leaf to crown scales," Rem. Sens. Environ. 96, 375-398 (2005) [doi:10.1016/j.rse.2005.03.009].

[6] P. A. Townsend and J. R. Foster, "Comparison of EO-1 Hyperion to AVIRIS for mapping forest composition in the Appalachian Mountains, USA," Int. Geosci. Rem. Sens. Symp. 2, 793-795 (2002) [doi:10.1109/IGARSS.2002.1025688].

[7] M. A. Cochrane, "Using vegetation reflectance variability for species level classification of hyperspectral data," Int. J. Rem. Sens. 21, 2075-2087 (2000) [doi:10.1080/01431160050021303].

[8] J. B. Drake, R. O. Dubayah, D. B. Clark, R. G. Knox, J. B. Blair, M. A. Hofton, R. L. Chazdon, J. F. Weishampel, and S. D. Prince, "Estimation of tropical forest structural characteristics using large-footprint lidar," Rem. Sens. Environ. 79, 305-319 (2002) [doi:10.1016/S0034-4257(01)00281-4].

[9] J. E. Means, S. A. Acker, D. J. Harding, J. B. Blair, M. A. Lefsky, W. B. Cohen, M. E. Harmon, and W. A. McKee, "Use of large-footprint scanning airborne lidar to estimate forest stand characteristics in the Western Cascades of Oregon," Rem. Sens. Environ. 67, 298-308 (1999) [doi:10.1016/S0034-4257(98)00091-1].

[10] G. Q. Sun and K. J. Ranson, "Modeling lidar returns from forest canopies," IEEE Trans. Geosci. Rem. Sens. 38, 2617-2626 (2000) [doi:10.1109/36.885208].

[11] D. J. Diner, G. P. Asner, R. Davies, Y. Knyazikhin, J.-P. Muller, A. W. Nolin, B. Pinty, C. B. Schaaf, and J. Stroeve, "New directions in Earth observing: scientific applications of multi-angle remote sensing," Bull. Am. Meteorol. Soc. 80, 2209-2228 (1999) [doi:10.1175/1520-0477(1999)080<2209:NDIEOS>2.0.CO;2].

[12] R. N. Treuhaft, B. E. Law, and G. P. Asner, "Forest attributes from radar interferometric structure and its fusion with optical remote sensing," Bioscience 54, 561-571 (2004) [doi:10.1641/0006-3568(2004)054[0561:FAFRIS]2.0.CO;2].

[13] G. P. Asner, "Biophysical and biochemical sources of variability in canopy reflectance," Rem. Sens. Environ. 64, 134-153 (1998) [doi:10.1016/S0034-4257(98)00014-5].

[14] P. J. Zarco-Tejada, J. R. Miller, T. L. Noland, G. H. Mohammed, and P. H. Sampson, "Scaling-up and model inversion methods with narrowband optical indices for chlorophyll content estimation in closed forest canopies with hyperspectral data," IEEE Trans. Geosci. Rem. Sens. 39, 1491-1507 (2001) [doi:10.1109/36.934080].

[15] G. A. Blackburn, "Remote sensing of forest pigments using airborne imaging spectrometer and LIDAR imagery," Rem. Sens. Environ. 82, 311-321 (2002) [doi:10.1016/S0034-4257(02)00049-4].

[16] T. W. Gillespie, J. Brock, and C. W. Wright, "Prospects for quantifying structure, floristic composition and species richness of tropical forests," Int. J. Rem. Sens. 25, 707-715 (2004) [doi:10.1080/01431160310001598917].

[17] R. O. Green, "Spectral calibration requirement for Earth-looking imaging spectrometers in the solar-reflected spectrum," Applied Optics 37, 683-690 (1998).

[18] G. Vane and A. F. H. Goetz, "Terrestrial imaging spectroscopy," Rem. Sens. Environ. 24, 1-29 (1988) [doi:10.1016/0034-4257(88)90003-X].

[19] E. P. Baltsavias, "Airborne laser scanning: existing systems and firms and other resources," ISPRS J. Photogram. Rem. Sens. 54, 164-198 (1999) [doi:10.1016/S0924-2716(99)00016-7].

[20] N. R. Goodwin, N. C. Coops, and D. S. Culvenor, "Assessment of forest structure with airborne LiDAR and the effects of platform altitude," Rem. Sens. Environ. 103, 140-152 (2006) [doi:10.1016/j.rse.2006.03.003].

[21] T. Takahashi, K. Yamamoto, Y. Miyachi, Y. Senda, and M. Tsuzuku, "The penetration rate of laser pulses transmitted from a small-footprint airborne LiDAR: a case study in closed canopy, middle-aged pure sugi (Cryptomeria japonica D. Don) and hinoki cypress (Chamaecyparis obtusa Sieb. et Zucc.) stands in Japan," J. For. Res. 11, 117-123 (2006) [doi:10.1007/s10310-005-0189-0].

[22] C. Hopkinson, "The influence of flying altitude, beam divergence, and pulse repetition frequency on laser pulse return intensity and canopy frequency distribution," Can. J. Rem. Sens. 33, 312-324 (2007).

[23] Itres Research Ltd., "Compact Airborne Spectrographic Imager-1500 (CASI-1500) Hyperspectral Imager," <http://www.itres.com/CASI_1500> (6 September 2007).

[24] L. Guanter, V. Estelles, and J. Moreno, "Spectral calibration and atmospheric correction of ultra-fine spectral and spatial resolution remote sensing data: Application to CASI-1500 data," Rem. Sens. Environ. 109, 54-65 (2007) [doi:10.1016/j.rse.2006.12.005].

[25] R. O. Green, M. L. Eastwood, C. M. Sarture, T. G. Chrien, M. Aronsson, B. J. Chippendale, J. A. Faust, B. E. Pavri, C. J. Chovit, M. S. Solis, M. R. Olah, and O. Williams, "Imaging spectroscopy and the Airborne Visible Infrared Imaging Spectrometer (AVIRIS)," Rem. Sens. Environ. 65, 227-248 (1998) [doi:10.1016/S0034-4257(98)00064-9].

[26] R. O. Green, B. E. Pavri, and T. G. Chrien, "On-orbit radiometric and spectral calibration characteristics of EO-1 Hyperion derived with an underflight of AVIRIS and in situ measurements at Salar de Arizaro, Argentina," IEEE Trans. Geosci. Rem. Sens. 41, 1194-1203 (2003) [doi:10.1109/TGRS.2003.813204].

[27] G. P. Asner, A. J. Elmore, F. R. Hughes, A. S. Warner, and P. M. Vitousek, "Ecosystem structure along bioclimatic gradients in Hawai'i from imaging spectroscopy," Rem. Sens. Environ. 96, 497-508 (2005) [doi:10.1016/j.rse.2005.04.008].

[28] A. A. Gitelson, Y. Gritz, and M. N. Merzlyak, "Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves," J. Plant Physiol. 160, 271-282 (2003) [doi:10.1078/0176-1617-00887].

[29] A. A. Gitelson, Y. Zur, O. B. Chivkunova, and M. N. Merzlyak, "Assessing carotenoid content in plant leaves with reflectance spectroscopy," Photochem. Photobiol. 75, 272-281 (2002) [doi:10.1562/0031-8655(2002)075<0272:ACCIPL>2.0.CO;2].

[30] D. A. Roberts, M. O. Smith, and J. B. Adams, "Green vegetation, nonphotosynthetic vegetation, and soils in AVIRIS data," Rem. Sens. Environ. 44, 255-269 (1993) [doi:10.1016/0034-4257(93)90020-X].

[31] M. L. Smith, M. E. Martin, L. Plourde, and S. V. Ollinger, "Analysis of hyperspectral data for estimation of temperate forest canopy nitrogen concentration: comparison between an airborne (AVIRIS) and spaceborne (Hyperion) sensor," IEEE Trans. Geosci. Rem. Sens. 41, 1332-1337 (2003) [doi:10.1109/TGRS.2003.813128].

[32] M. L. Smith, S. V. Ollinger, M. E. Martin, J. D. Aber, R. A. Hallett, and C. L. Goodale, "Direct estimation of aboveground forest productivity through hyperspectral remote sensing of canopy nitrogen," Ecol. Appl. 12, 1286-1302 (2002) [doi:10.2307/3099972].

[33] D. A. Roberts, B. W. Nelson, J. B. Adams, and F. Palmer, "Spectral changes with leaf aging in Amazon caatinga," Trees 12, 315-325 (1998) [doi:10.1007/s004680050157].

[34] D. A. Roberts, S. L. Ustin, S. Ogunjemiyo, J. Greenberg, S. Z. Dobrowski, J. Q. Chen, and T. M. Hinckley, "Spectral and structural measures of northwest forest vegetation at leaf to landscape scales," Ecosystems 7, 545-562 (2004) [doi:10.1007/s10021-004-0144-5].

[35] G. P. Asner and K. B. Heidebrecht, "Spectral unmixing of vegetation, soil and dry carbon cover in arid regions: comparing multispectral and hyperspectral observations," Int. J. Rem. Sens. 23, 3939-3958 (2002) [doi:10.1080/01431160110115960].

[36] G. P. Asner and P. M. Vitousek, "Remote analysis of biological invasion and biogeochemical change," Proc. Nat. Acad. Sci. 102, 4383-4386 (2005) [doi:10.1073/pnas.0500823102].

[37] S. V. Ollinger and M.-L. Smith, "Net primary production and canopy nitrogen in a temperate forest landscape: An analysis using imaging spectroscopy, modeling and field data," Ecosystems 8, 760-778 (2005) [doi:10.1007/s10021-005-0079-5].

[38] S. V. Ollinger, M. L. Smith, M. E. Martin, R. A. Hallett, C. L. Goodale, and J. D. Aber, "Regional variation in foliar chemistry and soil nitrogen status among forests of diverse history and composition," Ecology 83, 339-355 (2002) [doi:10.2307/2680018].

[39] K. M. Carlson, G. P. Asner, F. R. Hughes, R. Ostertag, and M. E. Martin, "Hyperspectral remote sensing of canopy biodiversity in Hawaiian lowland rainforests," Ecosystems (2007) [doi:10.1007/s10021-007-9041-z].

[40] D. A. Sims and J. A. Gamon, "Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages," Rem. Sens. Environ. 81, 337-354 (2002) [doi:10.1016/S0034-4257(02)00010-X].

[41] P. Ceccato, S. Flasse, S. Tarantola, S. Jacquemoud, and J. M. Gregoire, "Detecting vegetation leaf water content using reflectance in the optical domain," Rem. Sens. Environ. 77, 22-33 (2001) [doi:10.1016/S0034-4257(01)00191-2].

[42] J. A. Gamon, C. B. Field, W. Bilger, A. Björkman, A. L. Fredeen, and J. Peñuelas, "Remote sensing of the xanthophyll cycle and chlorophyll fluorescence in sunflower leaves and canopies," Oecologia 85, 1-7 (1990) [doi:10.1007/BF00317336].

[43] D. A. Roberts, R. O. Green, and J. B. Adams, "Temporal and spatial patterns in vegetation and atmospheric properties from AVIRIS," Rem. Sens. Environ. 62, 223-240 (1997) [doi:10.1016/S0034-4257(97)00092-8].

[44] J. A. Gamon, L. Serrano, and J. S. Surfus, "The photochemical reflectance index: An optical indicator of photosynthetic radiation use efficiency across species, functional types, and nutrient levels," Oecologia 112, 492-501 (1997) [doi:10.1007/s004420050337].

[45] J. B. Drake, R. G. Knox, R. O. Dubayah, D. B. Clark, R. Condit, J. B. Blair, and M. Hofton, "Above-ground biomass estimation in closed canopy neotropical forests using lidar remote sensing: Factors affecting the generality of relationships," Global Ecol. Biogeogr. 12, 147-159 (2003) [doi:10.1046/j.1466-822X.2003.00010.x].

[46] M. A. Lefsky, D. Harding, W. B. Cohen, G. Parker, and H. H. Shugart, "Surface lidar remote sensing of basal area and biomass in deciduous forests of eastern Maryland, USA," Rem. Sens. Environ. 67, 83-98 (1999) [doi:10.1016/S0034-4257(98)00071-6].

[47] G. P. Asner, D. Nepstad, G. Cardinot, and D. Ray, "Drought stress and carbon uptake in an Amazon forest measured with spaceborne imaging spectroscopy," Proc. Nat. Acad. Sci. 101, 6039-6044 (2004) [doi:10.1073/pnas.0400168101].
