
Course Title: IEQ-05 Earthquake Geology and Geoinformatics

by Dr. J. D. Das

(Dept. of Earthquake Engineering, IIT Roorkee)

Remote Sensing Studies

Remote sensing is the science of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and then processing, analyzing, and applying that information. The remote sensing process involves an interaction between incident radiation and the targets of interest. This is exemplified by imaging systems, in which the following seven elements are involved.

With the advent of, and tremendous technological progress in, remote sensing techniques it has become possible to study many phenomena on the earth's surface. Earth-observing satellites can produce a synoptic view of the earth and generate a wealth of information. Remote sensing offers this perspective and allows a researcher to examine other reference and ancillary data simultaneously and synergistically. The nature and pattern of deformation that the earth has undergone are clearly displayed on satellite images, enabling us to study them in detail.

Deformation mapping is the identification and characterization of structural expression. Structures include faults, folds (synclines and anticlines) and lineaments. Understanding structures is the key to interpreting the crustal movements that have shaped the present terrain. Structures can indicate potential locations of oil and gas reserves by characterizing both the underlying subsurface geometry of rock units and the amount of crustal deformation and stress experienced in a certain locale. Structures are also examined for clues to crustal movement and potential hazards, such as earthquakes, landslides, and volcanic activity. Identification of fault lines can facilitate land-use planning by limiting construction over potentially dangerous zones of seismic activity. Certain remote sensing devices offer unique information regarding structures, such as the relief expression offered by radar sensors. A benefit of side-looking radar is that the illumination conditions can be controlled, and the most appropriate geometry used for the type of terrain being examined.


1. Energy Source or Illumination (A) – the first requirement for remote sensing is an energy source which illuminates or provides electromagnetic energy to the target of interest.
2. Radiation and the Atmosphere (B) – as the energy travels from its source to the target, it will come into contact and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
3. Interaction with the Target (C) – once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
4. Recording of Energy by the Sensor (D) – after the energy has been scattered by, or emitted from, the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) – the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) – the processed image is interpreted, visually and/or digitally, to extract information about the target which was illuminated.
7. Application (G) – the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.


Electromagnetic Radiation: The first requirement for remote sensing is to have an energy source to illuminate the target (unless the sensed energy is being emitted by the target). All electromagnetic radiation has fundamental properties and behaves in predictable ways according to the basics of wave theory. Two characteristics of electromagnetic radiation are particularly important for understanding remote sensing: wavelength and frequency. Electromagnetic radiation consists of an electrical field (E) which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electrical field. Both these fields travel at the speed of light (c). Wavelength is measured in metres (m) or some factor of metres such as nanometres (nm, 10⁻⁹ metres), micrometres (µm, 10⁻⁶ metres) or centimetres (cm, 10⁻² metres).
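Since both wavelength and frequency are mentioned together with the speed of light, a minimal Python sketch of the standard wave relation c = wavelength × frequency may help; the relation is textbook physics, not quoted from these notes, and the example wavelengths are illustrative.

```python
# Minimal sketch: the standard wave relation c = wavelength * frequency.
# The formula is standard physics; the example wavelengths are illustrative.

C = 3.0e8  # speed of light, m/s (approximate)

def frequency_hz(wavelength_m):
    """Frequency (Hz) of EM radiation with the given wavelength (m)."""
    return C / wavelength_m

# Red light (0.7 um) vs. X-band microwaves (3 cm):
print(frequency_hz(0.7e-6))  # ~4.3e14 Hz
print(frequency_hz(0.03))    # ~1.0e10 Hz (10 GHz)
```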

Electromagnetic Spectrum:

For most purposes, the ultraviolet or UV portion of the spectrum has the shortest wavelengths. The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum. The visible wavelengths cover a range from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the shortest is violet.


Common wavelengths of what we perceive as particular colours from the visible portion of the spectrum are listed below. It is important to note that this is the only portion of the spectrum we can associate with the concept of colour.

Violet: 0.400 - 0.446 µm
Blue: 0.446 - 0.500 µm
Green: 0.500 - 0.578 µm
Yellow: 0.578 - 0.592 µm
Orange: 0.592 - 0.620 µm
Red: 0.620 - 0.700 µm

The next portion of the spectrum of interest is the infrared (IR) region, which can be divided into two categories based on their radiation properties: the reflected IR, and the emitted or thermal IR. Radiation in the reflected IR region is used for remote sensing purposes in ways very similar to radiation in the visible portion. The portion of the spectrum of more recent interest to remote sensing is the microwave region, from about 1 mm to 1 m. This covers the longest wavelengths used for remote sensing. The shorter wavelengths have properties similar to the thermal infrared region, while the longer wavelengths approach the wavelengths used for radio broadcasts.

Interactions with the Atmosphere

Before radiation used for remote sensing reaches the Earth's surface it has to travel through some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.

Rayleigh scattering occurs when particles are very small compared to the wavelength of the radiation. These could be particles such as small specks of dust or nitrogen and oxygen molecules. Rayleigh scattering causes shorter wavelengths of energy to be scattered much more than longer wavelengths, and is the dominant scattering mechanism in the upper atmosphere.

Mie scattering occurs when the particles are just about the same size as the wavelength of the radiation. Dust, pollen, smoke and water vapour are common causes of Mie scattering, which tends to affect longer wavelengths than those affected by Rayleigh scattering. Mie scattering occurs mostly in the lower portions of the atmosphere, where larger particles are more abundant, and dominates when cloud conditions are overcast.

Nonselective scattering is the final scattering mechanism of importance. It occurs when the particles are much larger than the wavelength of the radiation. Water droplets and large dust particles can cause this type of scattering. Nonselective scattering gets its name from the fact that all wavelengths are scattered about equally.
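The notes state that Rayleigh scattering affects shorter wavelengths much more strongly; the standard quantitative result, not derived in these notes, is that scattered intensity varies as wavelength to the minus fourth power. A minimal sketch under that assumption:

```python
# Minimal sketch of the standard Rayleigh lambda^-4 dependence
# (the lambda^-4 law is textbook physics; it is not derived in these notes).

def rayleigh_relative(wavelength_um, reference_um=0.7):
    """Scattering intensity relative to a reference wavelength (red, 0.7 um)."""
    return (reference_um / wavelength_um) ** 4

# Blue light (0.45 um) is scattered several times more strongly than red:
print(rayleigh_relative(0.45))  # ~5.9, which is why a clear sky looks blue
```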


Absorption is the other main mechanism at work when electromagnetic radiation interacts with the atmosphere. In contrast to scattering, this phenomenon causes molecules in the atmosphere to absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric constituents which absorb radiation. Carbon dioxide is referred to as a greenhouse gas because it tends to absorb radiation strongly in the far-infrared portion of the spectrum (the area associated with thermal heating), which serves to trap this heat inside the atmosphere. Water vapour in the atmosphere absorbs much of the incoming longwave infrared and shortwave microwave radiation (between 22 µm and 1 m).

Atmospheric Windows

Those areas of the spectrum which are not severely influenced by atmospheric absorption, and are thus useful to remote sensors, are called atmospheric windows.

The visible portion of the spectrum, to which our eyes are most sensitive, corresponds to both an atmospheric window and the peak energy level of the sun. Energy emitted by the Earth corresponds to a window around 10 µm in the thermal IR portion of the spectrum. The large window at wavelengths beyond 1 mm is associated with the microwave region.

Radiation - Target Interactions

There are three (3) forms of interaction that can take place when energy strikes, or is incident (I) upon, the surface. These are:


Absorption (A)
Transmission (T)
Reflection (R)

The total incident energy will interact with the surface in one or more of these three ways. The proportions of each will depend on the wavelength of the energy and on the material and condition of the feature. In remote sensing, we are most interested in measuring the radiation reflected from targets. We refer to two types of reflection, which represent the two extreme ends of the way in which energy is reflected from a target: specular reflection and diffuse reflection. When a surface is smooth we get specular, or mirror-like, reflection, where all (or almost all) of the energy is directed away from the surface in a single direction. Diffuse reflection occurs when the surface is rough and the energy is reflected almost uniformly in all directions. If the wavelengths are much smaller than the surface variations or the particle sizes that make up the surface, diffuse reflection will dominate. Most earth surface features lie somewhere between perfectly specular and perfectly diffuse reflectors.

Leaves strongly absorb radiation in the red and blue wavelengths but reflect green wavelengths, producing their green appearance. Water absorbs more of the longer-wavelength radiation than the shorter visible wavelengths, so water typically looks blue or blue-green due to stronger reflectance at these shorter wavelengths.

Passive vs. Active Sensing

Passive Sensors: Remote sensing systems which measure energy that is naturally available are called passive sensors. For all reflected energy, this can only take place while the sun is illuminating the Earth. Energy that is naturally emitted (such as thermal infrared) can be detected day or night, as long as the amount of energy is large enough to be recorded.


Active sensors, on the other hand, provide their own energy source for illumination. Examples of active sensors are lasers and synthetic aperture radar (SAR).

Satellite Orbits and Swaths

Geostationary Orbits: The orbit of a satellite is elliptical in shape, but remote sensing satellites are usually put in orbits that are very close approximations to a circle. Satellites at very high altitudes which view the same portion of the Earth's surface at all times have geostationary orbits. These geostationary satellites, at altitudes of approximately 36,000 kilometres, revolve at speeds which match the rotation of the Earth, so they seem stationary relative to the Earth's surface. This allows the satellites to observe and collect information continuously over specific areas. Weather and communications satellites commonly have these types of orbits.

Geostationary Satellite
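The "approximately 36,000 km" figure can be checked from Kepler's third law; a minimal sketch, where Earth's gravitational parameter, the sidereal day, and Earth's radius are standard physical constants rather than values from these notes:

```python
# Minimal check of the ~36,000 km geostationary altitude using Kepler's
# third law, r = (GM * T^2 / (4 pi^2))^(1/3). GM, the sidereal day and
# the Earth radius are standard constants, not taken from the notes.
import math

GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
T = 86164.0          # sidereal day, s
R_EARTH = 6.378e6    # equatorial radius, m

r = (GM * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (r - R_EARTH) / 1000.0
print(round(altitude_km))  # ~35,786 km, i.e. "approximately 36,000 km"
```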

Near-polar Orbits: Many remote sensing platforms are designed to follow an orbit (basically north-south) which, in conjunction with the Earth's rotation (west-east), allows them to cover most of the Earth's surface over a certain period of time. These are near-polar orbits, so named for the inclination of the orbit relative to a line running between the North and South poles.


Sun-synchronous orbits: Many of these satellite orbits are also sun-synchronous such that they cover each area of the world at a constant local time of day called local sun time. At any given latitude, the position of the sun in the sky as the satellite passes overhead will be the same within the same season.

Swath : As a satellite revolves around the Earth, the sensor "sees" a certain portion of the Earth's surface. The area imaged on the surface, is referred to as the swath. Imaging swaths for spaceborne sensors generally vary between tens and hundreds of kilometers wide.

Sensor Technology

Most remote sensing instruments (sensors) are designed to measure photons. The fundamental principle underlying sensor operation is the photoelectric effect. The magnitude of the electric current produced (number of photoelectrons per unit time) is directly proportional to the light intensity. Thus, changes in the electric current can be used to measure changes in the number (intensity) of photons that strike the detector during a given time interval.

CCD Detector

An individual CCD is an extremely small, light-sensitive silicon (micro) detector. Many individual detectors are placed on a chip side by side, either in a single row as a linear array or in stacked rows of linear arrays in X-Y (two-dimensional) space. When photons strike a CCD detector, electronic charges develop whose magnitudes are proportional to the intensity of the impinging radiation during a short time interval (exposure time). The number of elements per unit length, along with the optics, determines the spatial resolution of the instrument. Using integrated circuits, each linear array is sampled very rapidly in sequence, producing an electrical signal that varies with the radiation striking the array. Each individual CCD corresponds to a "pixel". The size of the CCD is one factor in setting spatial resolution (smaller sizes represent smaller areas on the target surface); another factor is the height of the observing platform (satellite or aircraft); a third factor can be the use of a telescopic lens.
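The three factors just listed (detector size, platform height, optics) combine in the standard pinhole-camera relation, ground pixel size = altitude × detector pitch / focal length. A minimal sketch; the relation is standard camera geometry and the numbers are illustrative, neither is quoted from these notes:

```python
# Minimal sketch of how detector size, platform height and optics set the
# ground pixel size: GSD = altitude * detector_pitch / focal_length.
# Standard camera geometry; the example numbers are illustrative.

def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Ground footprint (m) of one detector element."""
    return altitude_m * pixel_pitch_m / focal_length_m

# A hypothetical satellite camera: 700 km altitude, 10 um CCD pitch, 1 m lens
print(ground_sample_distance(700e3, 10e-6, 1.0))  # 7.0 m per pixel
```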


Resolution

Four types of resolution are considered: spatial resolution, spectral resolution, radiometric resolution and temporal resolution.

Spatial Resolution: Spatial resolution is the capability of distinguishing closely spaced objects on an image or a photograph. Image resolution is determined by the size and number of picture elements, or pixels, used to form an image. The smaller the pixel size, the greater the resolution. In photography, resolution is limited primarily by the film grain size, but lenses and other technical considerations play important roles.

Spatial resolution of passive sensors depends primarily on their Instantaneous Field of View (IFOV). The IFOV is the angular cone of visibility of the sensor (A) and determines the area on the Earth's surface which is "seen" from a given altitude at one particular moment in time (B). The size of the area viewed is determined by multiplying the IFOV by the distance from the ground to the sensor (C). This area on the ground is called the resolution cell and determines a sensor's maximum spatial resolution.
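A minimal sketch of the relation just stated, that the ground resolution cell is the IFOV multiplied by the distance from the sensor to the ground; the IFOV and altitude values are illustrative, not from the notes:

```python
# Minimal sketch of the relation in the notes: the ground resolution cell
# equals the IFOV (a small angle, in radians) times the sensor-to-ground
# distance. The numbers are illustrative only.

def resolution_cell_m(ifov_rad, altitude_m):
    """Ground resolution cell size (m) for a small IFOV in radians."""
    return ifov_rad * altitude_m

# A hypothetical scanner with a 0.1 mrad IFOV at 800 km altitude:
print(resolution_cell_m(0.1e-3, 800e3))  # 80 m ground cell
```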

Spectral Resolution: Spectral resolution describes the ability of a sensor to define fine wavelength intervals. The finer the spectral resolution, the narrower the wavelength range for a particular channel or band. Black and white film records wavelengths extending over much, or all, of the visible portion of the electromagnetic spectrum. Colour film has higher spectral resolution, as it is individually sensitive to the reflected energy at the blue, green, and red wavelengths of the spectrum. Many remote sensing systems record energy over several separate wavelength ranges at various spectral resolutions.

Radiometric Resolution: Radiometric characteristics describe the actual information content in an image. Every time an image is acquired on film or by a sensor, its sensitivity to the magnitude of the electromagnetic energy determines the radiometric resolution. The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy. Imagery data are represented by positive digital numbers which vary from 0 to (one less than) a selected power of 2. This range corresponds to the number of bits used for coding numbers in binary format. Each bit records an exponent of power 2 (e.g. 1 bit = 2¹ = 2). The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded. Thus, if a sensor used 8 bits to record the data, there would be 2⁸ = 256 digital values available, ranging from 0 to 255.
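A minimal sketch of the bits-to-levels relation described above (an n-bit sensor records 2ⁿ digital numbers, from 0 to 2ⁿ - 1); the bit depths chosen for the loop are illustrative:

```python
# Minimal sketch of the bits-to-brightness-levels relation in the notes:
# an n-bit sensor records 2**n digital numbers, from 0 to 2**n - 1.

def brightness_levels(bits):
    """Number of digital values available for an n-bit sensor."""
    return 2 ** bits

for bits in (1, 6, 8, 11):
    n = brightness_levels(bits)
    print(f"{bits} bits -> {n} levels (0 to {n - 1})")
# 8 bits -> 256 levels (0 to 255), as in the example above
```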


Temporal Resolution: Temporal resolution refers to the length of time it takes for a satellite to complete one entire orbit cycle. The revisit period of a satellite sensor is usually several days. Therefore the absolute temporal resolution of a remote sensing system, i.e. the time needed to image the exact same area at the same viewing angle a second time, is equal to this period.

Thermal Imaging

All matter at the earth's surface radiates energy at thermal infrared wavelengths (3 µm to 15 µm), both day and night. Thermal sensors use photo detectors, sensitive to the direct contact of photons on their surface, to detect emitted thermal radiation. The detectors are cooled to temperatures close to absolute zero (0 K) in order to limit their own thermal emissions. Thermal sensors essentially measure the surface temperature and thermal properties of targets. Thermal IR images generally record broad spectral bands, typically 8.0 µm to 14.0 µm for images from aircraft and 10.5 µm to 12.5 µm for images from satellites. To interpret thermal IR images, one must understand the basic physical processes that control the interactions between thermal energy and matter, as well as the thermal properties of matter that determine the rate and intensity of the interactions.

Heat, Temperature and Radiant Flux

Kinetic heat is the energy of particles of matter in random motion. The random motion causes particles to collide, resulting in changes of energy state and the emission of electromagnetic radiation from the surface of materials. The internal, or kinetic, heat energy of matter is thus converted into radiant energy. The amount of heat is measured in calories. Temperature is a measure of the concentration of heat, commonly expressed in degrees centigrade. On the Kelvin, or absolute, temperature scale, 0 K is absolute zero, the point at which all molecular motion ceases. The Kelvin and Celsius scales correlate as follows: 0°C = 273 K, and 100°C = 373 K.

Heat energy is transferred from one place to another by three means:

Conduction transfers heat through a material by molecular contact, as when heat is transferred from a pan to vegetables during cooking. Convection transfers heat through the physical movement of heated matter. The circulation of heated water and air are examples of convection.


Radiation transfers heat in the form of electromagnetic waves. Heat from the sun reaches the earth by radiation, and radiation can transfer heat through a vacuum. Materials at the surface of the earth receive thermal energy primarily in the form of radiation from the sun. To a much lesser extent, heat from the interior of the earth also reaches the surface, primarily by conduction. The atmosphere does not transmit all wavelengths of thermal IR radiation uniformly. Carbon dioxide, ozone, and water vapor absorb energy in certain wavelength regions, called absorption bands. The atmosphere transmits wavelengths of 3 to 5 µm and 8 to 14 µm. These bands are called atmospheric windows.


Radiant Energy

For an object at a constant kinetic temperature, the radiant energy, or flux, varies as a function of wavelength. As temperature increases, the total amount of radiant energy increases and the radiant energy peak shifts to shorter wavelengths. This shift, or displacement, to shorter wavelengths with increasing temperature is described by Wien's displacement law, which states that

λ_max = 2897 µm·K / T_rad

where T_rad is the radiant temperature in kelvins and 2897 µm·K is a physical constant. The wavelength of the radiant peak of an object may be determined by substituting the value of T_rad.
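A minimal check of Wien's law as stated above; the two temperatures (sun ~6000 K, earth surface ~300 K) are standard illustrative values, not taken from these notes:

```python
# Minimal check of Wien's displacement law from the notes:
# lambda_max = 2897 um*K / T_rad. The two temperatures are standard
# illustrative values, not taken from the notes.

WIEN_CONSTANT = 2897.0  # um * K

def peak_wavelength_um(t_rad_kelvin):
    """Wavelength (um) of peak radiant energy at a given temperature."""
    return WIEN_CONSTANT / t_rad_kelvin

print(peak_wavelength_um(6000.0))  # ~0.48 um: the sun peaks in visible light
print(peak_wavelength_um(300.0))   # ~9.7 um: within the 8-14 um thermal window
```

Note that the earth-surface result agrees with the ~10 µm thermal IR window mentioned earlier in these notes.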

On most thermal IR images, the brightest tones represent the warmest radiant temperatures, and the darkest tones represent the coolest ones. Clouds typically show a patchy warm-and-cool pattern.

The thermal inertia of water is similar to that of soils and rocks, but in the daytime water bodies have a cooler surface temperature than soils and rocks. At night the relative surface temperatures are reversed, so that water is warmer than soils and rocks.

If water bodies have warm signatures relative to the adjacent terrain, the image was acquired at night, whereas relatively cool water bodies indicate daytime imagery. Damp soil is cooler than dry soil, both day and night.

Green deciduous vegetation has a cool signature on daytime images and a warm signature on nighttime images. During the day, transpiration of water vapor lowers leaf temperature, causing vegetation to have a cool signature relative to the surrounding soil. At night the insulating effect of leafy foliage and the high water content retain heat, which results in warm nighttime temperatures.

Applications of Thermal IR Sensing

Land cover classification and mapping
Estimate sea surface temperatures


Estimate soil moisture
Monitor plant stress
Detect ground water and geological structures and materials
Detect and map thermal discharges
Measure heat loss of buildings
Assess urban heat island effects
Map forest fires
Monitor volcanic activity

Microwave Remote Sensing

Microwave sensing encompasses both active and passive forms of remote sensing. The microwave portion of the spectrum covers the range from approximately 1 cm to 1 m in wavelength. Because of their long wavelengths, compared to the visible and infrared, microwaves have special properties that are important for remote sensing. Longer-wavelength microwave radiation can penetrate through cloud cover, haze, dust, and all but the heaviest rainfall, as the longer wavelengths are not susceptible to the atmospheric scattering which affects shorter optical wavelengths. This property allows detection of microwave energy under almost all weather and environmental conditions, so that data can be collected at any time.

A passive microwave sensor detects the naturally emitted microwave energy related to the temperature and moisture properties of the emitting object or surface within its field of view. Active microwave sensors provide their own source of microwave radiation to illuminate the target. The most common form of imaging active microwave sensor is RADAR. RADAR is an acronym for RAdio Detection And Ranging, which essentially characterizes the function and operation of a radar sensor.

RADAR Basics

A radar is essentially a ranging or distance-measuring device. It consists fundamentally of a transmitter, a receiver, an antenna, and an electronics system to process and record the data. The transmitter generates successive short bursts (or pulses) of microwave energy (A) at regular intervals which are focused by the antenna into a beam (B). The radar beam illuminates the surface obliquely at a right angle to the motion of the platform.


The antenna receives a portion of the transmitted energy reflected (or backscattered) from various objects within the illuminated beam (C). By measuring the time delay between the transmission of a pulse and the reception of the backscattered "echo" from different targets, their distance from the radar, and thus their location, can be determined. As the sensor platform moves forward, recording and processing of the backscattered signals builds up a two-dimensional image of the surface. The microwave region of the spectrum is quite large relative to the visible and infrared, and there are several wavelength ranges or bands commonly used. Irrespective of wavelength, radar signals can transmit horizontal (H) or vertical (V) electric-field vectors, and receive either horizontal (H) or vertical (V) return signals, or both. There are thus four possible combinations of transmit and receive polarizations:

HH - horizontal transmit and horizontal receive
VV - vertical transmit and vertical receive
HV - horizontal transmit and vertical receive
VH - vertical transmit and horizontal receive

Viewing Geometry and Spatial Resolution


Ground range is the horizontal distance between the emitter and its target; its calculation requires knowledge of the target's elevation. Since the waves travel to a target and back, the round-trip time is divided by two in order to obtain the time the wave took to reach the target. Spatial resolution in both the range (look) direction and the azimuth (flight) direction is determined by the engineering characteristics of the radar system. The depression angle is defined as the angle between a horizontal plane and a beam from the antenna to a target on the ground. The depression angle is steeper at the near-range side of an image strip and shallower at the far-range side. The average depression angle is measured for a beam to the midline of an image strip. The incidence angle is defined as the angle between a radar beam and a line perpendicular to the surface.
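A minimal sketch of the range measurement just described: halve the round-trip echo delay and convert to distance at the speed of light. The 66.7 µs delay is an illustrative value, not from the notes:

```python
# Minimal sketch of the range measurement in the notes: the round-trip
# echo delay is halved, then converted to distance at the speed of light.
# The 66.7-microsecond delay is an illustrative value.

C = 3.0e8  # speed of light, m/s

def slant_range_m(round_trip_s):
    """Slant range (m) from the two-way travel time of a radar pulse."""
    return C * round_trip_s / 2.0

print(slant_range_m(66.7e-6) / 1000.0)  # ~10 km slant range
```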

Spatial Resolution

The spatial resolution of a radar image is determined by the dimensions of the ground resolution cell, which are controlled by the combination of range resolution and azimuth resolution.

Range Resolution

Range resolution (Rr), or resolution in the radar look direction, is determined by the depression angle and by the pulse length. Pulse length (τ) is the duration of the transmitted pulse and is measured in microseconds. It is converted from time into distance by multiplying by the speed of electromagnetic radiation.

Range resolution is determined by the relationship

Rr = τ · c / (2 cos γ)

where c is the speed of electromagnetic radiation and γ is the depression angle.

Targets A and B, spaced 20 m apart, are imaged with a depression angle of 50° and a pulse length of 0.1 µsec. For these targets the range resolution is 23.4 m. Targets A and B are not resolved because they are closer together (20 m) than the range resolution distance; they are within a single ground resolution cell and cannot be separated on the image.


Targets C and D are also spaced 20 m apart but are imaged with a depression angle of 35°. For this depression angle the range resolution is calculated as 18 m. Targets C and D are resolved because they are more widely spaced than the ground resolution cell. One method of improving range resolution is to shorten the pulse length, but this reduces the total amount of energy in each transmitted pulse.
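A minimal sketch of the range resolution formula from these notes, reproducing both worked examples (small differences from 23.4 m and 18 m are rounding):

```python
# Minimal sketch of the range resolution formula from the notes,
# Rr = tau * c / (2 cos(gamma)), reproducing the two worked examples.
import math

C = 3.0e8  # speed of electromagnetic radiation, m/s

def range_resolution_m(pulse_s, depression_deg):
    """Range resolution (m) for a given pulse length and depression angle."""
    return pulse_s * C / (2.0 * math.cos(math.radians(depression_deg)))

print(range_resolution_m(0.1e-6, 50.0))  # ~23.3 m (23.4 m in the notes): A, B merge
print(range_resolution_m(0.1e-6, 35.0))  # ~18.3 m (18 m in the notes): C, D resolve
```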

Azimuth Resolution

Azimuth resolution (Ra), or resolution in the azimuth direction, is determined by the width of the terrain strip illuminated by the radar beam. To be resolved, targets must be separated in the azimuth direction by a distance greater than the beam width as measured on the ground. The fan-shaped beam is narrower in the near range than in the far range, causing azimuth resolution to be smaller in the near-range portion of the image.

The equation for azimuth resolution (Ra) is

Ra = 0.7 · S · λ / D

where S is the slant-range distance, λ is the wavelength and D is the antenna length.

For a typical X-band system (λ = 3.0 cm; D = 500 cm) with a slant-range distance of 8 km, Ra is calculated to be 33.6 m. Therefore targets in the near range, such as A and B, must be separated by approximately 35 m to be resolved. At the far-range position the slant-range distance is 20 km and Ra is calculated as 84 m; thus targets C and D, also separated by 35 m, are not resolved.
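A minimal sketch of the azimuth resolution formula from these notes, reproducing the X-band example above:

```python
# Minimal sketch of the azimuth resolution formula from the notes,
# Ra = 0.7 * S * lambda / D, reproducing the X-band example
# (lambda = 3 cm, antenna length D = 5 m).

def azimuth_resolution_m(slant_range_m, wavelength_m, antenna_m):
    """Azimuth resolution (m) of a real-aperture radar."""
    return 0.7 * slant_range_m * wavelength_m / antenna_m

print(azimuth_resolution_m(8e3, 0.03, 5.0))   # 33.6 m in the near range
print(azimuth_resolution_m(20e3, 0.03, 5.0))  # 84.0 m in the far range
```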

Slant Range Distortion

Slant-range distortion occurs because the radar is measuring the distance to features in slant range rather than the true horizontal distance along the ground. This results in a varying image scale, moving from near to far range.


Foreshortening

Foreshortening occurs when the radar beam reaches the base of a tall feature tilted towards the radar (e.g. a mountain) before it reaches the top. Because the radar measures distance in slant-range, the slope (a to b) will appear compressed and the length of the slope will be represented incorrectly (a' to b') at the image plane.

Layover

Layover occurs when the radar beam reaches the top of a tall feature (b) before it reaches the base (a). The return signal from the top of the feature will be received before the signal from the bottom. As a result, the top of the feature is displaced towards the radar from its true position on the ground, and “lays over” the base of the feature (b' to a').

Shadowing

The shadowing effect increases with greater incidence angle θ, just as our shadows lengthen as the sun sets.

SAR Image

A digital SAR image can be seen as a mosaic (i.e. a two-dimensional array formed by columns and rows) of small picture elements (pixels). Each pixel is associated with a small area of the Earth's surface (called a resolution cell). Each pixel holds a complex number that carries amplitude and phase information about the microwave field backscattered by all the scatterers (rocks, vegetation, buildings etc.) within the corresponding resolution cell projected on the ground. Different rows of the image are associated with different azimuth locations, whereas different columns indicate different slant-range locations.


The radiation transmitted from the radar has to reach the scatterers on the ground and then come back to the radar in order to form the SAR image (two-way travel). Scatterers at different distances from the radar (different slant ranges) introduce different delays between transmission and reception of the radiation.

Due to the almost purely sinusoidal nature of the transmitted signal, this delay τ is equivalent to a phase change φ between the transmitted and received signals. The phase change is proportional to the two-way travel distance 2R of the radiation divided by the transmitted wavelength λ. In other words, the phase of the SAR signal is a measure of just the last fraction of the two-way travel distance that is smaller than the transmitted wavelength.
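A minimal sketch of this phase-distance relation: phase proportional to 2R/λ, with only the fractional-wavelength part surviving modulo 2π. The sign convention, the C-band wavelength, and the example numbers are illustrative assumptions, not from the notes:

```python
# Minimal sketch of the phase-distance relation in the notes: phase is
# proportional to the two-way path 2R divided by the wavelength, and only
# the fractional-wavelength part survives (mod 2*pi). Sign convention,
# wavelength and example numbers are illustrative.
import math

def sar_phase_rad(r_m, wavelength_m):
    """Received phase (radians, 0..2pi) for slant range R and wavelength."""
    return (2.0 * math.pi * 2.0 * r_m / wavelength_m) % (2.0 * math.pi)

WAVELENGTH = 0.056  # ~5.6 cm, a typical C-band value (illustrative)
print(sar_phase_rad(850_000.0, WAVELENGTH))
print(sar_phase_rad(850_000.0 + 0.014, WAVELENGTH))
# A quarter-wavelength range change (14 mm) shifts the phase by ~pi:
# tiny range changes are measurable from phase alone.
```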

Real-Aperture and Synthetic-Aperture

Real-aperture radar uses an antenna of the maximum practical length to produce a narrow angular beam width in the azimuth direction.

A Synthetic Aperture Radar (SAR) is a coherent side-looking airborne system which utilizes the flight path of the aircraft to simulate an extremely large antenna or aperture electronically, and thereby generates high-resolution remote sensing imagery.


The signal processing uses the magnitude and phase of the received signals over successive pulses from elements of a synthetic aperture. After a given number of cycles, the stored data are recombined (taking into account the Doppler effects inherent in the different transmitter-to-target geometry in each succeeding cycle) to create a high-resolution image of the terrain being overflown.

Interferometry

Interferometric SAR, or InSAR, allows accurate measurements of the radiation travel path because it is coherent. Measurements of travel-path variations as a function of the satellite position and time of acquisition allow generation of Digital Elevation Models (DEMs) and measurement of centimetric surface deformations of the terrain. A satellite SAR can observe the same area from slightly different look angles. This can be done either simultaneously (with two radars mounted on the same platform) or at different times by exploiting repeated orbits of the same satellite.

The distance between the two satellites (or orbits) in the plane perpendicular to the orbit is called the interferometer baseline and its projection perpendicular to the slant range is the perpendicular baseline.

The SAR interferogram is generated by cross-multiplying, pixel by pixel, the first SAR image with the complex conjugate of the second. Thus, the interferogram amplitude is the amplitude of the first image multiplied by that of the second, whereas its phase (the interferometric phase) is the phase difference between the images.
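A minimal sketch of interferogram formation exactly as described above; the two tiny random arrays are synthetic stand-ins for real co-registered SAR scenes:

```python
# Minimal sketch of interferogram formation as described in the notes:
# multiply the first complex SAR image by the complex conjugate of the
# second, pixel by pixel. The two tiny arrays are synthetic stand-ins
# for real co-registered SAR scenes.
import numpy as np

rng = np.random.default_rng(0)
shape = (3, 3)
img1 = rng.normal(size=shape) + 1j * rng.normal(size=shape)
img2 = rng.normal(size=shape) + 1j * rng.normal(size=shape)

interferogram = img1 * np.conj(img2)

amplitude = np.abs(interferogram)  # = |img1| * |img2|
phase = np.angle(interferogram)    # = phase(img1) - phase(img2)
print(np.allclose(amplitude, np.abs(img1) * np.abs(img2)))  # True
```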


Geological applications of remote sensing include the following:

Surficial deposit / bedrock mapping
Lithological mapping
Structural mapping
Sand and gravel (aggregate) exploration / exploitation
Mineral exploration
Hydrocarbon exploration
Environmental geology
Geobotany
Baseline infrastructure
Sedimentation mapping and monitoring
Event mapping and monitoring
Geo-hazard mapping
Planetary mapping