
D3.3: Radar sensing (sunnyproject.eu)

Grant Agreement number: 313243
Project acronym: SUNNY
Project title: Smart UNattended airborne sensor Network for detection of vessels used for cross border crime and irregular entrY
Funding Scheme: Collaborative project

D3.3: Radar sensing

Due date of deliverable: 29/04/2016
Actual submission date: dd/mm/yyyy
Start date of project: 01/01/2014
Duration: 42 months
Organisation name of lead contractor for this deliverable: MetaSensing
Participating: MetaSensing, CNIT

Project co-funded by the European Commission within the Seventh Framework Programme (2007-2013)

Dissemination Level
PU  Public
PP  Restricted to other programme participants (including the Commission Services)
RE  Restricted to a group specified by the consortium (including the Commission Services)
CO  Confidential, only for members of the consortium (including the Commission Services)


Document Title: Radar Sensing
WP: 3
Document number: WPx Tx.x Dx.x

Main Authors / Org:
Alex Coccia, MetaSensing

Contributing Authors / Org:
Sonia Tomei, CNIT

Doc. History:
Date        Version  Comments                                    Authorised by
27/6/2016   R01      1st version
1/9/2016    R02      2nd version, applied comments of TECNALIA
5/9/2016    R03      Paragraph 5.2 added

Number of pages: 44
Number of annexes: -


Contents

1. EXECUTIVE SUMMARY
2. INTRODUCTION
   2.1 SUNNY Project Objectives
   2.2 This Version
   2.3 Document Conventions
3. OPTIMIZATION DESIGN OF HIGH RESOLUTION RADAR SENSORS
   3.1 About MetaSensing's technology
   3.2 Optimized design for UAV platforms
4. 2D ISAR IMAGING MODE IN SAR SCENES
   4.1 Theory about 2D ISAR imaging
   4.2 SW guide
   4.3 Output example
5. 3D IMAGING
   5.1 Literature review of 3D imaging techniques
   5.2 Conclusions
6. RADAR SYSTEM REALIZATION AND INSTALLATION
   6.1 MiniSAR sub-systems for the SUNNY demonstrator
   6.2 Actual Mechanical Installation
   6.3 Radar Data Processing Algorithms
   6.4 Radar performance
7. PERFORMANCE AND QUALIFICATION TESTS
   7.1 MiniSAR subsystems lab tests
   7.2 MiniSAR qualification test
   7.3 Results
   7.4 Conclusions


1. Executive Summary The SUNNY project aims to develop and integrate a new tool for collecting real-time information using heterogeneous sensors carried by multiple UAV platforms, and for analyzing this information automatically to provide situational awareness for the system operators to monitor the EU maritime borders more effectively and efficiently.

Different sensor technologies will be used for this purpose, including infrared, hyperspectral and radar sensing. This document reviews in detail the radar sensing technology used within the SUNNY system. The general requirements for the sensor technologies, including radar, were discussed in previous deliverables of this project [1]. Here, the methodologies and hardware equipment related to the radar technology to be employed during the SUNNY demonstrator are discussed: the main customizations of MetaSensing's MiniSAR sensor, introduced to match the SUNNY specifications, are summarized; the processing chain that generates ISAR images from radar data is reviewed; the equipment that will actually be used during the SUNNY demonstrator is listed; and, finally, the main results of the qualification tests are presented, assessing the expected performance of the radar system.


2. Introduction

2.1 SUNNY Project Objectives
The SUNNY project aims to contribute to EUROSUR by defining a new tool for collecting real-time information in operational scenarios. SUNNY represents a step beyond existing research projects due to the following main features:

• A two-tier intelligent heterogeneous Unmanned Aerial Vehicle (UAV) sensor network will be considered in order to provide both large-field and focused surveillance capabilities. The first-tier sensors, carried by medium-altitude, long-endurance autonomous UAVs, are used to patrol large border areas to detect suspicious targets and provide global situation awareness. Fed with the information collected by the first-tier sensors, the second-tier sensors will be deployed to provide more focused surveillance capability by tracking the targets and collecting further evidence for more accurate target recognition and threat evaluation.
• Novel algorithms will be developed to analyse the data collected by the sensors for robust and accurate target identification and event detection.
• A new generation of sensors and on-board processing, integrated on the UAV system, will focus on low weight, low cost and high resolution, and on operation under variable conditions such as darkness, snow and rain. In particular, SUNNY will develop sensors that generate RGB, Near Infrared (NIR) and hyperspectral images, and that use radar information to detect, discriminate and track objects of interest inside complex environments, with focus on the sea borders. This allows sensor processing and preliminary (on-board) detection results to be coupled with local UAV control, leading to innovative active sensing techniques that replace low-level sensor data communication with a higher abstraction level of information communication [1].

2.2 This Version
This document represents the second version of this deliverable, namely R02. The document addresses the radar sensing technology which will be used for the SUNNY demonstrator. In particular, it shows how the design of MetaSensing's MiniSAR sensor has been customized to meet the SUNNY requirements in terms of integration in the UAV platform, and how the radar processing will lead to the generation of ship images from the collected radar data.

2.3 Document Conventions

Term  Definition
2D  2-dimensional
3D  3-dimensional
ADC  Analog-to-Digital Converter
AIS  Automatic Identification System
ATC  Air Traffic Control
ATR  Automatic Target Recognition
a-Si  amorphous Silicon
AWG  Arbitrary Waveform Generator
CPI  Coherent Processing Interval
CW  Continuous Wave
DOA  Direction Of Arrival


FOV  Field Of View
GPS  Global Positioning System
GPU  Graphic Processing Unit
GS  Ground Station
HRI  High Resolution Imaging
HW  Hardware
IC  Image Contrast
ICBA  Image Contrast Based Autofocus
IE  Image Entropy
IFMCW  Interrupted Frequency-Modulated Continuous-Wave mode
IF  Intermediate Frequency
InSAR  Interferometric SAR
InISAR  Interferometric ISAR
IMU  Inertial Measurement Unit
I/O  Input / Output
IPP  Image Projection Plane
IR  Infrared
ITR  Integrate Then Read mode
IWR  Integrate While Read mode
LoS  Line of Sight
LPFT  Local Polynomial Fourier Transform
LSE  Least Square Error
MTI  Moving Target Indication
NA  Not Applicable
PSU  Power Supply Unit
Radar Image  2D spatial distribution of the electromagnetic features of the target
RCS  Radar Cross Section
RFA  Radio Frequency Antenna
RFE  Radar Front End
SaR  Search and Rescue
SAR  Synthetic Aperture Radar
SAR PCU  SAR Processing and Control Unit
SNR  Signal-to-Noise Ratio
SW  Software
Swath  Size of the imaging area along the range dimension
TBD  To Be Decided
UAV  Unmanned Aerial Vehicle
UAV CU  UAV Control Unit
UAV PWR  UAV Power

Table 2-1 Acronyms and definitions used in this document


3. Optimization design of high resolution radar sensors

3.1 About MetaSensing's technology
MetaSensing B.V. is a high-tech company located in the Netherlands, operating in the radar and SAR field. It is one of the few companies in the world offering airborne and ground-based radar sensors and services for mapping, monitoring and surveillance, for both commercial and scientific applications. All MetaSensing products are based on the combination of innovative radar technology, developed in more than a decade of advanced research, and SAR techniques [2]. MetaSensing's radar sensors cover a whole range of frequency bands, from the 400 MHz of P-band up to the 17 GHz of Ku-band, and a new Ka-band radar instrument at 35 GHz is currently under development. In particular, MetaSensing has developed a large number of different X-band sensors for multiple applications. Among them:

• Traditional, high resolution geo-located radar imaging;
• Polarimetry for classification purposes;
• Single-pass interferometry with high accuracy (in the cm range) for 3D DEM or DSM generation;
• Large swath, medium resolution radars for patrolling and detection of small objects at sea;
• Sea waves and currents observation (height, velocity) for harbour management;
• MTI.

Examples of radar-based images achieved with MetaSensing technology at X-band are shown in Figure 3-1. The top image is an intensity image recently acquired in China: coverage is 10 km in azimuth and 3.5 km in ground range [3]. The bottom image is a crop from a DSM over Delft, The Netherlands. The good resolution of the images can be appreciated, as smaller man-made structures and natural elements are easily recognized. In order to further show the level of detail that can be achieved by MetaSensing radar sensors, a crop of a radar image acquired at X-band over the harbour of Rotterdam is given in Figure 3-2, together with optical images for comparison purposes.

Figure 3-1 Examples of SAR images acquired and processed by MetaSensing.


Figure 3-2 Along-Track Interferometric SAR image of Rotterdam Harbor by MetaSensing X-band sensor compared to optical pictures shot during the acquisitions.

Not only the bigger elements, like big vessels and silos, but also little buoys and similarly small elements can be recognized by comparing the radar and optical images, showing the high resolution of MetaSensing radar sensors. The developed airborne radar instruments are operationally used by MetaSensing and its customers for data acquisition campaigns. Different radar sensors have been mounted in the past on diverse airborne platforms. Figure 3-3 shows some of the smallest aircraft platforms which have hosted MetaSensing radar products for diverse applications. Most of them are manned and small. Still, they are big enough to accommodate the MetaSensing radar sensors on board with a moderate engineering effort, thanks to the sensors' small dimensions and limited power consumption, combined with their high customization level.

Figure 3-3 Examples of aircraft platforms on which MetaSensing radar sensors have been installed and successfully operated.


3.2 Optimized design for UAV platforms MetaSensing’s MiniSAR is an end-to-end, multimode X-band SAR system with real-time capabilities, designed for military applications like tactical surveillance and target reconnaissance. The MetaSensing MiniSAR sensor is shown in Figure 3-4.

Figure 3-4 MetaSensing’s MiniSAR, a versatile X-band SAR system

Despite its moderate size and weight, the MiniSAR as such can hardly be accommodated on board most common small and medium size UAVs, due to their limited payload and space constraints. Therefore, the main objective of the activities described in this document is an innovative MiniSAR sensor with reduced dimensions and even lighter weight, with advanced signal and radar imaging processing for enhanced ship identification. In particular, the following specific objectives are pursued:

1) Optimized design of very high resolution MetaSensing sensors for the specific application of maritime surveillance from small-medium UAVs, through weight reduction and re-engineering;
2) Design and analysis of 2D ISAR imaging of moving targets from sea SAR scenes;
3) Installation of the system in the UAV platform envisioned for the SUNNY demonstrator.


4. 2D ISAR imaging mode in SAR scenes
SAR systems were originally employed in Earth observation applications, such as ocean, land, ice, snow and vegetation monitoring. Nevertheless, the ability to form high resolution images from remote platforms in all-day/all-weather conditions has rapidly made SAR systems very attractive for military and homeland security applications. In particular, in homeland security scenarios such as the one depicted in the SUNNY project, imaging of man-made non-cooperative moving targets becomes the main interest, rather than the observation of natural phenomena. Many SAR processors designed to form highly focused images with very high resolution are based on the assumption that the illuminated area is static during the synthetic aperture formation [5], [6], [7]. As a consequence, standard SAR techniques are typically unable to focus moving targets while forming a focused image of the static scene, leading to blurred and displaced images of any object which is not static within the observation time. Since the objective of the SUNNY system is the detection and identification of non-cooperative moving targets associated with irregular immigrants and drug smugglers, a solution to this problem based on ISAR processing is implemented in the SUNNY system, according to the approaches proposed in [8], [9]. ISAR has indeed been suggested as another way to look at the problem of forming a synthetic aperture and obtaining high resolution images of non-cooperative moving targets: ISAR techniques do not assume that the target is static during the observation interval, but exploit the unknown relative motion between the radar and the target to form the synthetic aperture.

Obviously, all the issues related to the ISAR image formation problem must be taken into account, such as the cross-range scaling, the identification of the image projection plane and the fact that the imaging system performance is not entirely predictable. The resolution of the ISAR images is not only imposed by the acquisition parameters, which can be fixed a priori, but also by the unknown target motion. The range resolution Δr is inversely proportional to the bandwidth B of the transmitted signal and is defined as

Δr = c / (2B)    (4.1)

where c denotes the speed of light in vacuum. The cross-range resolution Δcr is expressed as

Δcr = c / (2 f0 Ω T_obs)    (4.2)

where Ω is the modulus of the effective rotation vector that produces the aspect angle variation, T_obs is the observation time and f0 is the carrier frequency.
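As a quick numerical illustration of Eqs. (4.1) and (4.2), the following Python sketch computes both resolutions; the values of B, f0, Ω and T_obs are made up for illustration and are not the actual SUNNY radar settings.

```python
import numpy as np

# Hypothetical acquisition parameters (illustrative only)
c = 3e8          # speed of light [m/s]
B = 150e6        # transmitted bandwidth [Hz]
f0 = 9.6e9       # X-band carrier frequency [Hz]
T_obs = 1.0      # observation time (CPI) [s]
Omega = 0.02     # modulus of the effective rotation vector [rad/s]

# Eq. (4.1): range resolution, set by the transmitted bandwidth only
delta_r = c / (2 * B)

# Eq. (4.2): cross-range resolution, set by carrier, rotation rate and CPI
delta_cr = c / (2 * f0 * Omega * T_obs)

print(delta_r)    # 1.0 m
print(delta_cr)   # 0.78125 m
```

Note that only Δcr depends on the (unknown) target rotation, which is why cross-range scaling requires an estimate of Ω.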

4.1 Theory about 2D ISAR imaging
The functional block diagram of the ISAR image generation implemented in the SUNNY system is shown in Figure 4-1. The input of this processing can be either a SAR image or a Range-Doppler (RD) map, while the outputs consist of a set of refocused sub-images of the moving targets present within the monitored scene. The main steps of the algorithm are:

1. Target detection. Every target to be refocused is firstly detected within the RD map. This step is performed in real time by the internal processing of the MetaSensing’s MiniSAR, as described in chapter 6.


Figure 4-1 Functional block diagram for ISAR images from SAR processing, implemented in the SUNNY system.

2. Target sub-image selection. A small crop is taken from the RD map, one for each detected target, separated from clutter and other targets. In this way, a number of sub-images equal to the number of detected targets is obtained. This task is also performed in real time by the internal processing of the MetaSensing's MiniSAR (chapter 6).

3. Sub-image inversion. The ISAR processing takes as input raw data, so an inversion mapping from target image to the target raw data is performed. It is worth pointing out that raw data obtained from the whole SAR image is not a suitable input for the ISAR processing because it usually contains the returns from several moving targets, each one with its own motion. ISAR processing must be applied to a single target raw data at each time in order to properly compensate each target’s motion and provide well focused images.

4. ISAR processing. It performs motion compensation and image formation for each detected target.

It has been shown in [9] that equivalent raw data can be obtained by simply applying a 2D Inverse Fourier Transform (2D-IFT) to the considered crop extracted from the SAR image (or the range-Doppler map). When the 2D-IFT is applied, a new grid in the frequency/slow-time domain must be defined for each SAR sub-image. In fact, since the resolution of the whole SAR image and of the sub-image are the same, the observation time and the bandwidth must be the same as for the original SAR image. On the other hand, the PRF and the frequency spacing of the back-projected data are different and can be evaluated as follows:

Δf_eq = B / M_crop,    PRF_eq = N_crop / T_obs    (4.3)

where M_crop and N_crop are the crop sizes in the range and cross-range dimensions respectively, B is the transmitted signal bandwidth and T_obs is the observation time.

The ISAR processing is the core of the refocusing algorithms. Its functional block is depicted in Figure 4-2 while each sub-block will be described in the following sub-sections.


Figure 4-2 Functional block diagram of the ISAR processing

4.1.1 Time Window Selection
The RD image formation technique can be successfully applied only when the effective rotation vector Ω does not change significantly during the observation time. However, the target's own motion may induce a non-uniform rotation vector, especially in the case of small manoeuvring targets. In this case, only sub-sections of the whole Coherent Processing Interval (CPI), within which the effective rotation vector is constant, should be considered. This approach can provide a sequence of images of the moving target, or only the best frame. An automatic procedure to select the best time window will be implemented in the SUNNY system based on appropriate image quality criteria.

4.1.2 Motion Compensation
This step aims at compensating the unknown target own motion, which is responsible for the blurring of the target image. The autofocus technique adopted in the SUNNY system is the Image Contrast Based Autofocus (ICBA) algorithm [11]. The ICBA is a parametric technique based on Image Contrast (IC) maximization. Under the assumption that the target can be modelled as the superposition of N_s ideal and independent scatterers, the signal after the inversion process and after platform motion compensation can be expressed as

S_M(f, t) = Σ_{n=1..N_s} σ_n exp(−j 4π (f/c) R_n(t)) rect[t / T_obs] rect[(f − f0) / B]    (4.4)

where B and f0 are the signal bandwidth and the carrier frequency of the transmitted signal, respectively. T_obs is the CPI exploited to form the ISAR image; it can be the same as for the original data or it can be reduced by the time window selection. R_n(t) is the residual distance between the radar and the n-th scatterer at the time instant t. The residual distance is approximately the difference between the actual radar-scatterer distance and the distance between the radar and the SAR scene centre. It is worth pointing out that such a distance accounts for the relative motion between the target and the stationary scene centre, which must be compensated before forming the image of the moving target.
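The signal model of Eq. (4.4) can be simulated directly. The Python sketch below builds S_M(f, t) for two hypothetical point scatterers with a linearised residual distance R_n(t); all parameters and scatterer values are made up for illustration.

```python
import numpy as np

# Illustrative system parameters (not the SUNNY radar settings)
c, f0, B = 3e8, 9.6e9, 100e6
T_obs = 0.5
M, N = 64, 64                               # frequency / slow-time samples
f = f0 + np.linspace(-B / 2, B / 2, M)      # frequency axis [Hz]
t = np.linspace(-T_obs / 2, T_obs / 2, N)   # slow-time axis [s]
ff, tt = np.meshgrid(f, t, indexing="ij")

# Two hypothetical scatterers: (reflectivity, range offset [m], radial velocity [m/s])
scatterers = [(1.0, 3.0, 0.4), (0.7, -2.0, -0.1)]

# Eq. (4.4): sum of scatterer returns with phase -4*pi*(f/c)*R_n(t)
S = np.zeros((M, N), dtype=complex)
for sigma_n, r_n, v_n in scatterers:
    R_n = r_n + v_n * tt                    # linearised residual distance R_n(t)
    S += sigma_n * np.exp(-1j * 4 * np.pi * ff / c * R_n)

print(S.shape)   # (64, 64)
```

The rect windows of Eq. (4.4) are implicit here, since the simulation only generates samples inside the observation time and bandwidth.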

Under the hypothesis that the straight iso-range approximation holds true, the term R_n(t) can be expressed as

R_n(t) ≈ R_0(t) + x_n^T · i_LoS(t)    (4.5)

where R_0(t) is the residual distance between the radar and an arbitrary reference point on the target, x_n is the column vector that identifies the position of the n-th scatterer on the target and i_LoS(t) is the unit vector that identifies the radar LoS at time t.

The ICBA autofocus technique aims at removing the term R_0(t) via an iterative estimation of the motion parameters. For a relatively small value of T_obs and relatively smooth target motions, the radar-target distance can be expressed by exploiting an L-th order polynomial model as


R_0(t) = Σ_{l=0..L} (α_l / l!) t^l    (4.6)

in which the α_l are the above-mentioned motion parameters, commonly denoted focusing parameters, which can be stacked in vector form as α = [α_1, α_2, …, α_L]^T.

The estimation of R_0(t) resorts to the estimation of the target motion parameters via the maximization of the IC with respect to α, as follows:

α̂ = argmax_α IC(α)    (4.7)

where the IC is defined as

IC(α) = √( A{ [I(τ, ν; α) − A{I(τ, ν; α)}]² } ) / A{I(τ, ν; α)}    (4.8)

and where A{·} indicates the average operation over the variables τ and ν, and I(τ, ν; α) is the ISAR image intensity after motion compensation with α as focusing parameter vector. In particular,

I(τ, ν; α) = | 2D-FT[ S_M(f, t) · exp(j 4π (f/c) R_0(t; α)) ] |    (4.9)

where R_0(t; α) is the radial motion evaluated with the motion parameters in α.
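A minimal Python sketch of the IC cost of Eq. (4.8) and of an ICBA-style search is given below. It estimates a single quadratic focusing parameter by brute-force grid search (the actual ICBA optimises a full polynomial model with a proper optimiser); a purely linear term is not used here because it only displaces, rather than defocuses, the image. All data and parameters are synthetic.

```python
import numpy as np

def image_contrast(I):
    """IC = std/mean of the image intensity, as in Eq. (4.8)."""
    return np.sqrt(np.mean((I - I.mean()) ** 2)) / I.mean()

c, f0, B, T_obs = 3e8, 9.6e9, 100e6, 0.5
M, N = 64, 64
f = f0 + np.linspace(-B / 2, B / 2, M)
t = np.linspace(-T_obs / 2, T_obs / 2, N)
ff, tt = np.meshgrid(f, t, indexing="ij")

# Synthetic single-scatterer data with an unknown quadratic (acceleration)
# focusing parameter alpha2, i.e. R0(t) = 2.0 + (alpha2/2) t^2
alpha2_true = 2.5
S = np.exp(-1j * 4 * np.pi * ff / c * (2.0 + 0.5 * alpha2_true * tt ** 2))

def ic_of(a):
    # Compensate the trial quadratic term (Eq. 4.9) and score the image (Eq. 4.8)
    comp = S * np.exp(1j * 4 * np.pi * ff / c * 0.5 * a * tt ** 2)
    return image_contrast(np.abs(np.fft.fft2(comp)) ** 2)

alpha2_hat = max(np.linspace(0.0, 4.0, 81), key=ic_of)
print(alpha2_hat)   # close to 2.5
```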

4.1.3 Image Formation
The ISAR image is formed via the Range-Doppler (RD) approach. The signal after motion compensation can be expressed as

S_c(f, t) = Σ_{n=1..N_s} σ_n exp(−j 4π (f/c) x_n^T · i_LoS(t)) rect[t / T_obs] rect[(f − f0) / B]    (4.10)

Range-Doppler images are formed by applying a 2D-FT to the motion compensated data. The complex ISAR image I(τ, ν) is mathematically defined as

I(τ, ν) = 2D-FT[ S_c(f, t) ]    (4.11)

where τ and ν are the time delay and the Doppler coordinate, respectively [10].

4.1.4 Scaling
ISAR generates two-dimensional high resolution images of targets in the delay-time/Doppler domain. In order to determine the size of the target, it is preferable to have a fully scaled image. The range scaling can be performed by using the well-known relationship x_1 = c τ / 2. On the other hand, the cross-range scaling requires the estimation of the modulus Ω of the target effective rotation vector:

x_2 = c ν / (2 f0 Ω T_obs)    (4.12)
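Reading ν in Eq. (4.12) as a Doppler bin index, the axis scaling can be sketched as follows; all parameters, including the estimated Ω, are illustrative.

```python
import numpy as np

# Illustrative parameters: delay bins are mapped to metres via x1 = c*tau/2,
# Doppler bins via x2 = c*nu / (2*f0*Omega*T_obs) (Eq. 4.12), once an
# estimate Omega_hat of the rotation-vector modulus is available.
c, f0, B, T_obs = 3e8, 9.6e9, 100e6, 0.5
Omega_hat = 0.05                      # estimated rotation modulus [rad/s]
M, N = 128, 128                       # range / Doppler bins

tau = np.arange(M) / B                # time-delay axis [s]
nu = np.arange(N) - N // 2            # Doppler bin index (zero-centred)

x1 = c * tau / 2                      # range axis [m]
x2 = c * nu / (2 * f0 * Omega_hat * T_obs)  # cross-range axis [m]

print(x1[1] - x1[0])                  # range pixel = c/(2B) = 1.5 m
```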


Recently, a novel algorithm has been proposed to solve this problem [12]. This algorithm is based on the assumption of quasi-constant target rotation. In fact, when the target rotation vector can be assumed constant within the coherent integration time, the chirp rate produced by each scattering centre can be related to the modulus of the target effective rotation vector by means of an analytical expression. Therefore, each scattering centre carries information about the modulus of the target rotation vector through its chirp rate. As a consequence, the signal components of the scattering centres are processed by means of a polynomial Fourier transform in order to estimate the chirp rates and, hence, the target rotation vector. This method, namely the Local Polynomial Fourier Transform (LPFT), requires the solution of an optimization problem in which the IC is used as the cost function to be maximized for the estimation of the scattering centres' chirp rates. It has been widely shown in the literature that the image contrast (IC) and the image entropy (IE) are good parameters to assess the quality of the reconstructed ISAR image. The cross-range scaling algorithm can be summarized by the following steps:

• The ISAR image of a single target is segmented in order to extract K sub-images I^(k)(τ, ν) associated with the K brightest scattering centres. Their scattering centre locations C_k are also calculated.
• Each sub-image is inversely Fourier transformed in the Doppler domain in order to obtain K signals S^(k)(τ, t).
• The K signals S^(k)(τ, t) are then used for scatterer chirp rate estimation, by applying the second-order LPFT to each of them.
• The chirp rates and the scattering range locations are used for the estimation of the modulus of the effective rotation vector of the target, Ω, via an LSE approach.
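The chirp-rate estimation idea behind the LPFT step can be illustrated on a synthetic single-scatterer chirp: a second-order trial de-chirping is swept and the rate that best compresses the spectrum is kept. This is a simplified stand-in for the LPFT/IC optimisation described above, with made-up parameters.

```python
import numpy as np

# A scatterer return in a range cell behaves as a chirp s(t) = exp(j*pi*k*t^2);
# the trial rate that maximally sharpens the de-chirped spectrum estimates k.
T_obs, N = 0.5, 256
t = np.linspace(-T_obs / 2, T_obs / 2, N)
k_true = 40.0                                  # chirp rate [Hz/s]
s = np.exp(1j * np.pi * k_true * t ** 2)       # synthetic scatterer signal

def peak_after_dechirp(k):
    # De-chirp with trial rate k and measure the spectral peak height:
    # maximal when k matches the true chirp rate.
    return np.abs(np.fft.fft(s * np.exp(-1j * np.pi * k * t ** 2))).max()

k_hat = max(np.linspace(0.0, 80.0, 161), key=peak_after_dechirp)
print(k_hat)   # 40.0
```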

4.2 SW guide
The ISAR-from-SAR processing software is implemented in MATLAB; therefore, a general purpose PC with MATLAB installed is sufficient to run the algorithms. The following sections provide further details about the MATLAB software developed by CNIT. The ISAR processing is documented in the following order: crop inversion, autofocusing, image formation and scaling. Details about radar target detection and the related sub-image selection processing are given in paragraph 6.3.

4.2.1 Crop inversion
As mentioned in the previous paragraphs, ISAR-from-SAR processing must be applied to an image in which a single target is present, in order to properly perform the autofocusing. The first operation to be performed is the crop selection and inversion, which is made via a 2D inverse Fourier Transform. The input of this operation is the complex matrix of the image crop containing only the target of interest. This crop is inverted in order to obtain a complex matrix of the same size representing the raw data. A schematic representation of the input and output of this part of the ISAR-from-SAR processing is given in Figure 4-3.

Figure 4-3 Image crop inversion (input: SAR/RD image crop provided by MetaSensing; output: crop raw data)
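Although the actual software is implemented in MATLAB, the crop inversion step can be sketched in a few lines of Python: a 2D inverse FFT maps the complex image crop to equivalent raw data, and the new grid of Eq. (4.3) is computed from the crop size. Sizes and system parameters below are illustrative.

```python
import numpy as np

B, T_obs = 100e6, 0.5
# Stand-in for a complex image crop containing a single target (M_crop x N_crop)
crop = np.random.randn(48, 32) + 1j * np.random.randn(48, 32)

raw = np.fft.ifft2(crop)              # equivalent single-target raw data

M_crop, N_crop = crop.shape
df_eq = B / M_crop                    # frequency spacing of the inverted data
prf_eq = N_crop / T_obs               # equivalent PRF, Eq. (4.3)

print(raw.shape)                      # (48, 32): same size as the input crop
```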

4.2.2 Time window selection
Time windowing is performed in order to get the portion of data with the maximum CPI for which the effective rotation vector can be assumed constant. This operation is performed on the crop raw data obtained from the crop inversion operation. In order to evaluate the maximum CPI, the crop is firstly divided into fixed-length temporal windows. For each temporal window, the autofocusing process is applied and the image is evaluated via a 2D Fourier transform; then the image contrast is calculated. The image contrast is used as the criterion to define the image quality: the higher the image contrast, the better the image quality. As a consequence, the window with the highest contrast is selected. In order to get the longest possible CPI, temporal samples adjacent to the selected time window are added to the selected data portion and the image contrast is evaluated again. The procedure is repeated until the maximum CPI is found. It must be said that time windowing is not always necessary, unless the motion of the target is complicated, as for manoeuvring targets.

Figure 4-4 Time windowing. Block diagram: crop raw data → [time window] → crop raw data with maximum CPI.

Once the maximum CPI has been selected, the crop is resized accordingly. This means that the output of the time windowing is a complex data matrix in which only the temporal samples corresponding to the selected time window are preserved.
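A minimal sketch of this contrast-driven window search (illustrative NumPy code with a simple fixed-step extension; the project implementation may differ):

```python
import numpy as np

def image_contrast(img: np.ndarray) -> float:
    """Image contrast: std of the image intensity normalized by its mean."""
    p = np.abs(img) ** 2
    return float(np.sqrt(np.mean((p - p.mean()) ** 2)) / p.mean())

def select_time_window(raw: np.ndarray, win_len: int, step: int = 8):
    """Pick the fixed-length slow-time window with the highest image
    contrast, then grow it with adjacent samples while the contrast improves."""
    n = raw.shape[0]
    ic = lambda a, b: image_contrast(np.fft.fft2(raw[a:b]))
    # 1) best fixed-length window
    starts = range(0, n - win_len + 1, win_len)
    s = max(starts, key=lambda s0: ic(s0, s0 + win_len))
    e = s + win_len
    best = ic(s, e)
    # 2) extend with adjacent samples until the contrast stops improving
    improved = True
    while improved:
        improved = False
        for s2, e2 in ((max(s - step, 0), e), (s, min(e + step, n))):
            if (s2, e2) != (s, e) and ic(s2, e2) > best:
                s, e, best = s2, e2, ic(s2, e2)
                improved = True
    return s, e
```

The returned interval `(s, e)` is the selected time window; the crop is then resized to `raw[s:e]`. A perfectly flat image has zero contrast, while a sharply focused one has high contrast, which is why the contrast is a usable focus-quality criterion.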

4.2.3 Motion Compensation
The motion compensation step takes as input the system parameters and the crop with the maximum CPI that allows the effective rotation vector to be assumed constant during the observation time. The output consists of the raw data in which the unknown motion of the target has been compensated by means of autofocus techniques. In particular, the input and the output are both complex matrices of the same size. A schematic representation of the motion compensation processing block is given in Figure 4-5. The motion compensation is an iterative procedure based on the following steps:

1. evaluation of the target motion parameters
2. motion compensation with the motion parameters evaluated at step 1
3. image formation by means of a 2D Fourier transform
4. evaluation of the image contrast
5. repetition of steps 1 to 4 until the maximum image contrast is found.

The motion parameters corresponding to the maximum image contrast are used to perform the motion compensation and obtain the output of the motion compensation processing block.

Figure 4-5 Motion compensation. Block diagram: crop raw data with maximum CPI + system parameters (range spacing, column spacing) → [motion compensation] → focused crop raw data.
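The iterative loop can be sketched as follows: a toy parametric autofocus assuming a quadratic radial-motion model and a simple grid search over one motion parameter. This is only an illustration of the contrast-maximization principle; the actual autofocus techniques used in the project may differ.

```python
import numpy as np

def image_contrast(img):
    p = np.abs(img) ** 2
    return float(np.sqrt(np.mean((p - p.mean()) ** 2)) / p.mean())

def autofocus(raw, fc=9.6e9, prf=500.0, accel_grid=np.linspace(-5.0, 5.0, 41)):
    """Steps 1-5 of the text: for each candidate motion parameter (here a
    hypothetical radial acceleration a), compensate the phase, form the
    image with a 2D FFT, evaluate the contrast, and keep the best result."""
    c0 = 3e8
    t = np.arange(raw.shape[0]) / prf  # slow time [s]
    best_ic, best_raw = -np.inf, raw
    for a in accel_grid:
        # conjugate of the phase induced by a radial motion 0.5 * a * t^2
        comp = raw * np.exp(-1j * 4 * np.pi * fc / c0 * 0.5 * a * t**2)[:, None]
        ic = image_contrast(np.fft.fft2(comp))
        if ic > best_ic:
            best_ic, best_raw = ic, comp
    return best_raw
```

Because the grid includes a = 0 (no compensation), the output image contrast can never be worse than that of the input, which mirrors the stopping criterion of the iterative procedure.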


4.2.4 Image formation
The image formation step takes as input the compensated data and the system bandwidth and performs a 2D Fourier transform to obtain the ISAR image. In addition, the image formation process includes a range scaling operation, performed according to Eq.( 4.1), which exploits the system bandwidth information. As a consequence, the output of the image formation is the RD map of the refocused crop. It is worth pointing out that even if a scaling operation in the range domain has been performed, no information on the target size in the cross-range domain can be extrapolated yet: a cross-range scaling procedure is needed.

Figure 4-6 Image formation. Block diagram: focused crop raw data + system bandwidth → [image formation] → focused RD image crop.
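A sketch of the image-formation step (illustrative NumPy code; the range-bin spacing uses the standard relation Δr = c/(2B), which Eq.( 4.1) of the deliverable is assumed to express):

```python
import numpy as np

C0 = 3e8  # speed of light [m/s]

def form_image(focused_raw: np.ndarray, bandwidth_hz: float):
    """2D FFT of the motion-compensated crop, plus range-axis scaling."""
    img = np.fft.fftshift(np.fft.fft2(focused_raw))
    n_rng = focused_raw.shape[1]
    dr = C0 / (2.0 * bandwidth_hz)                   # range bin spacing [m]
    range_axis = (np.arange(n_rng) - n_rng // 2) * dr
    return img, range_axis

img, rng_axis = form_image(np.ones((8, 16), dtype=complex), bandwidth_hz=150e6)
print(rng_axis[1] - rng_axis[0])  # 1.0 m per bin for a 150 MHz bandwidth
```

Only the range axis is scaled here: as noted above, scaling the cross-range axis needs an estimate of the effective rotation vector, which is the job of the next processing block.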

4.2.5 Cross range scaling
The image scaling process allows the range/cross-range bins to be scaled in length units. This operation is fundamental to obtain an estimate of the target size in range and cross-range. The inputs of this process are the crop image and the system parameters, which include the carrier frequency and the crop observation time, used to estimate the effective rotation vector. Once the estimation is performed, the cross-range scaled ISAR image is obtained by means of Eq.( 4.12). At this point, both axes of the image are expressed in length units (meters), so the target size can be estimated in both the range and cross-range dimensions.

Figure 4-7 Cross range scaling. Block diagram: focused image crop + system parameters → [range/cross range scaling] → scaled image crop.

It is worth pointing out that a fine cross-range resolution requires a sufficiently long observation time. This assumption may not be verified in surveillance mode, in which the radar antenna moves to scan the surveillance area.
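As a numeric illustration of this point, using the standard ISAR relation δcr = λ/(2·Ω·T) for the cross-range bin size (Eq.( 4.12) is assumed consistent with it; carrier frequency and rotation rate below are assumed example values):

```python
def cross_range_spacing(wavelength_m: float, omega_eff_rad_s: float, t_obs_s: float) -> float:
    """Cross-range bin size: delta_cr = lambda / (2 * Omega_eff * T_obs)."""
    return wavelength_m / (2.0 * omega_eff_rad_s * t_obs_s)

# Assumed values: 9.6 GHz carrier, 0.05 rad/s effective rotation, 1 s CPI
wl = 3e8 / 9.6e9
print(cross_range_spacing(wl, 0.05, 1.0))  # 0.3125 m
```

Halving the observation time doubles the bin size, which is why surveillance-mode scanning, with its short dwell per target, may not allow a fine cross-range resolution.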

4.3 Output example
Figure 4-8 and Figure 4-9 show two examples of the capabilities of the ISAR-from-SAR processing.


Figure 4-8 Example of ISAR output: on the left, image of a defocused crop; on the right, the focused scaled crop.

Figure 4-9 Example of ISAR output: on the left, image of a defocused crop; on the right, the focused scaled crop.


5. 3D Imaging

5.1 Literature review of 3D imaging techniques
Conventional two-dimensional (2D) radar images can be interpreted as projections of the target's electromagnetic (e.m.) reflectivity function onto the image projection plane (IPP). Such a plane is usually defined by two coordinates, namely the range coordinate and the cross-range coordinate. As any other type of image, radar images are characterized by their resolution, which is the ability to separate two closely spaced scattering centers along the range or the cross-range direction. A fine range resolution can be achieved by transmitting wide-bandwidth signals. On the other hand, fine cross-range resolution is strictly related to the variation of the aspect angle under which the target is observed and, hence, to the observation time. More in general, the cross-range resolution is determined by the aperture of the radar antenna. SAR and ISAR allow a small antenna to synthesize a large aperture, hence achieving fine cross-range resolution by exploiting the coherent processing of the received echoes.

The core idea of ISAR systems relies on the extension of the SAR concept to a geometry in which the synthetic aperture is formed by exploiting the relative motion between the target and the radar platform. The synthetic aperture in SAR systems is generated by exploiting the motion of the radar platform, while in ISAR systems it is formed by exploiting the motion of the target. It is worth pointing out that in ISAR systems the targets are usually non-cooperative and, hence, the target motion is unknown, leading to a challenging motion compensation task. The unknown target motion leads to the key problem of not knowing the acquisition geometry and dynamics of the radar-target system during the coherent integration time.
Such a limitation causes difficulties in the interpretation of the ISAR images and, hence, in the fundamental tasks of target recognition and classification. The ISAR image obtained by the usual processing is, in fact, a 2D projection of the true three-dimensional target reflectivity onto an image plane [13] [14] [15]. The orientation of the image plane strongly depends on the radar-target geometry and on the target motion, which is typically unknown. The result is that the target projection onto the image plane cannot be predicted; the interpretation of the ISAR images becomes complicated and Automatic Target Recognition (ATR) turns into a hard problem. Three-dimensional (3D) ISAR imaging can therefore be seen as a logical solution to this problem, as it produces 3D target reconstructions and thereby avoids the target's projection onto planes. 3D reconstruction algorithms allow for an estimation of the target shape and consequently of its size. This helps the interpretation of the ISAR images and the recognition of the target. 3D ISAR imaging has been a subject of research for many years, and the several approaches proposed in the literature can be grouped into the following main categories:

1. ISAR movies: a set of consecutive ISAR images is exploited. Each frame of the ISAR movie is an ISAR image with a different image plane; thus, the 3D coordinates of the point scatterers can be derived from the 2D coordinates detected in each frame.

2. Interferometric ISAR (InISAR): a 3D reconstruction is achieved by exploiting the phase differences between 2D ISAR images obtained at sensors located at different spatial positions in a multi-channel configuration.

3. Tomographic ISAR: a 2D ISAR image is generated for each transmit/receive element of two orthogonal arrays and the inter-element phase differences are coherently combined using beamforming techniques to estimate the 3D shape of the target.


4. MIMO ISAR: orthogonal waveforms transmitted from different antenna elements in a MIMO configuration are exploited to obtain the 3D target reconstruction by taking advantage of the virtual receiving array.

In the first approach, the three-dimensional shape of the target is obtained from a set of consecutive ISAR images acquired by a single sensor [16]-[20]. This approach exploits the changes of the view angle produced by the 3D target motion in order to calculate the three-dimensional position of each scattering center that composes the target. The sequence of ISAR images is used to track the range histories of the scattering centers. The 3D location of the scattering centers is then estimated by using such tracks and the geometrical constraints induced by rigid bodies. Within this approach, all the methods rely on a long observation of the target and their performance strongly depends on the target's motion. Furthermore, the relative target motion must be known a priori. The use of ISAR image snapshots to form 3D images of fast-rotating objects, when the spinning velocity is much higher than the precession velocity, has been proposed in [20]. The effectiveness of this method was proven in the case of ballistic missile 3D imaging. For this and the other methods within the first approach, the exact aspect angle has to be known, which makes this approach unusable for non-cooperative targets, since that information is not available. In general, since the relative target motion cannot be known a priori and it might not be possible to obtain long observation times for a single target, the approach based on a set of ISAR images is not a suitable solution for the SUNNY system. A second approach is based on the use of multi-channel sensors and interferometry to obtain 3D ISAR images [21]-[24]. Interferometric techniques are based on multistatic radar configurations and obtain the height of each scatterer that composes the target by exploiting the phase differences between the signals received at the different sensors.
The use of multiple radar configurations can overcome some of the geometrical limitations that exist when obtaining radar images of a target using ISAR techniques. In general, ISAR imaging depends on the variation of the target aspect angle with respect to the radar within the observation time. Insufficient rotational motion of the target can cause insufficient Doppler separation and, therefore, a set of geometrical cases that do not enable a suitable ISAR image formation, even if the target is moving with respect to the radar. On the other hand, a variation of the aspect angle produces a variation of the visible scattering centers, making it difficult to compare them to a database for the purpose of target classification. These problems can be solved by integrating signals obtained from different aspect angles into a single image, therefore using a multistatic configuration. Furthermore, the interferometric technique has the advantage that no a priori knowledge of the target's rotational motion is required and its effectiveness does not depend on such a motion. This approach allows for the estimation of the height with respect to the image plane; the obtained 3D shape cannot, then, be considered a full 3D image. In fact, the scatterer's elevation with respect to the IPP is obtained as the result of an estimation problem and not as the output of an imaging algorithm. In [25], an interferometric ISAR image processing technique for 3D target altitude image formation was proposed. This method allows an estimate of the target altitude to be obtained by using only two frames of 2D ISAR images, which simplifies the data collection and the signal processing. Nevertheless, this technique has to deal with several constraints regarding the data collection and with the effect of multiple scattering centers for targets located at the same range and azimuth but at different altitudes (layover problems).
In [26], an ISAR system with three receivers that exploits interferometric principles in order to obtain three-dimensional images of the target was proposed. This technique has the advantage of not requiring knowledge of the target motion. However, errors can occur in the estimation of the scatterers' positions if an image cell contains more than one scatterer, because scatterers that have the same range-Doppler position cannot be separated.


Problems related to the baseline length in Interferometric ISAR (InISAR) for 3D target reconstruction were discussed in [27]. A method for compensating the baseline length variation was also proposed in the same paper. In any case, in order to avoid phase wrapping, the baseline must satisfy the following constraint:

b ≤ λ R0 / D ( 5-1)

where b is the baseline length, λ is the wavelength of the transmitted waveform, R0 is the radar-target distance at the phase center and D is the maximum size of the target. However, the baseline length should be set close to the above-mentioned upper bound, since the longer the baseline, the finer the accuracy, as described in [28]. A technique for co-registering ISAR images prior to the application of interferometry was discussed in [28], whereas the use of two perpendicular antenna arrays was proposed in [24] to form 3D target images. In the latter paper, a 3D ISAR imaging method based on an antenna array configuration exploiting interferometric principles was proposed; the three-dimensional imaging was achieved using array Direction Of Arrival (DOA) estimates. Target motion effects in InISAR were investigated by using a realistic 3D target model [29]. The problem of 3D target reconstruction was formulated as a new estimation problem, involving the joint estimation of the target's effective rotation vector (modulus and phase) and of the height of the scattering centers by using a dual interferometric ISAR system, in [30], [31]. The estimation of the effective rotation vector allows the ISAR image plane to be estimated and the cross-range scaling to be performed. The author of [32] proposed a novel algorithm for producing 3D ISAR images of targets by combining ISAR image sequences and interferometry; however, such a technique requires several conditions on the observation geometry. The above-mentioned methods exploiting multichannel sensors and interferometric techniques cannot be used for systems installed on small UAVs because of the limited space available for antenna installation. Let us assume that the radar transmits a waveform with a carrier frequency in the X band (8-12 GHz) and is carried on a platform at a height ranging between 1 and 1.5 km.
If the dimension D of the target is about 10 m, the receivers, generally two or three depending on the method, have to be spaced roughly 3-4.5 meters apart according to Eq. ( 5-1). This makes the installation of such interferometric sensors on UAVs impractical. Moreover, assumptions on the target's size have to be made in order to set up such baseline lengths. Within the third approach, two 3D imaging algorithms based on tomography have been developed that extend the synthetic aperture both along azimuth and elevation [33]-[34]. A key factor for achieving accurate 3D SAR images with the first method is precise control over the radar LoS incident angle, which is not the case for radars installed on UAVs; such accuracy can only be achieved in controlled experiments. The second method exploits two perpendicular antenna arrays and allows multiple scatterers belonging to the same range-Doppler cell to be resolved. The large size of the described cross-shaped array makes it impossible to install on UAVs. The last approach is based on three-dimensional ISAR imaging using a MIMO configuration. Three-dimensional MIMO ISAR imaging techniques combine the growing demand for 3D images of the target with the advantages provided by multiple radar configurations. MIMO radar techniques evolved from MIMO wireless communications. Different from traditional radar systems, the MIMO radar adopts orthogonal waveforms transmitted from different antenna elements, which helps avoid transmit waveform interference and ensures that the signal channels are independent from each


other. This is a new, challenging approach, and only a few papers attempt to deal with 3D MIMO ISAR imaging. There are several potential advantages in radar systems that can be achieved by making use of MIMO configurations. For given system design choices, a MIMO radar system allows for an improvement in target detection performance and angle estimation accuracy. In addition, the minimum detectable velocity can be decreased [35]. MIMO distributed ISAR imaging techniques have been used to increase the cross-range resolution, which depends on the intrinsic motion characteristics of the target (and, specifically, on its overall change of aspect angle). The use of a MIMO ISAR configuration can also reduce the transmitted power [36]. Whilst methods to obtain three-dimensional images of the target using MIMO ISAR systems have been proposed in the last few years [37],[38], the theoretical framework of MIMO ISAR systems is still developing. MIMO radar systems can be categorized into two main configurations: statistical MIMO radar and coherent MIMO radar. Statistical MIMO radar configurations consist of a number of widely spaced transmitting/receiving elements. In coherent MIMO configurations, the antenna elements are co-located, i.e. closely spaced. Despite all the advantages of the MIMO technique, this technology cannot be used if the radar system is carried by UAVs. In fact, for statistical MIMO, to obtain path diversity for an extended target, spatial decorrelation needs to be guaranteed and the MIMO radar antennas need to be sufficiently separated. Two channels are uncorrelated if at least one of the following conditions is met:

|x_tk − x_ti| > λ d(T_k, X_0) / D_x
|y_tk − y_ti| > λ d(T_k, X_0) / D_y
|x_rl − x_rj| > λ d(R_l, X_0) / D_x
|y_rl − y_rj| > λ d(R_l, X_0) / D_y ( 5-2)

where (x_tk, y_tk) and (x_ti, y_ti) are the coordinates of the two transmit antennas, and D_x and D_y are the target dimensions along the x and y axes, respectively. Furthermore, (x_rl, y_rl) and (x_rj, y_rj) are the coordinates of the two receiving antennas. The parameter d(T_k, X_0) is the distance between the transmit antenna at T_k and the target at X_0, while d(R_l, X_0) is the distance between the receive antenna at R_l and the target at X_0. Taking the example used earlier, with a distance to the target of R0 = 1-1.5 km, the condition for decorrelation is that the required separation between the elements of the MIMO radar is of the order of 3-4.5 m. Even when the MIMO radar antennas are co-located, decorrelation is still possible for large D, specifically of the order of twice the distance to the target. In this case, D cannot be the aperture of a single extended target; rather, two point targets separated by D can be viewed as an array with two elements [35]. Furthermore, for 3D applications, the baseline length constraint expressed in Eq. ( 5-1) must be followed when using interferometry.
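A quick numeric check of the required element separation for the example above (assumed values: λ ≈ 3 cm at X band, D = 10 m; illustrative only):

```python
# Required element separation (also the Eq. (5-1) baseline upper bound): lambda * R0 / D
wavelength = 0.03   # ~X-band wavelength [m] (assumed)
D = 10.0            # maximum target size [m]
for R0 in (1000.0, 1500.0):  # platform-target distance [m]
    print(wavelength * R0 / D)  # 3.0 m and 4.5 m: impractical on a small UAV
```

The resulting 3-4.5 m spacing is why both the interferometric and the statistical MIMO configurations are judged unsuitable for the SUNNY UAV platform.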

5.2 Conclusions
The previous section focused on the study of 3D ISAR imaging techniques in order to investigate whether or not 3D imaging could be performed in the SUNNY system. Four different 3D imaging methods have been reviewed which, for different reasons, cannot be applied to the SUNNY system as such, as summarized in the following. 3D imaging based on ISAR movies requires the target to be observed for a long time and its motion to be perfectly known in order to evaluate the aspect angle needed to form the 3D image. Such requirements


cannot be met in the SUNNY application, since targets are non-cooperative and the surveillance mode does not allow illuminating the target for a sufficient amount of time. The InISAR method, besides requiring a multistatic configuration with multiple radar sensors, must comply with baseline restrictions which cannot be satisfied in the SUNNY system, due to the very limited space for radar antennas on the UAV platform. The tomographic ISAR method cannot be used in SUNNY because it requires precise control over the radar LoS, which is not the case for radars installed on UAVs. In addition, such a method requires the use of an antenna array, which cannot be installed on a UAV. The last reviewed method, based on MIMO technology, has baseline requirements very similar to those of the InISAR method, and for this reason cannot be applied to the SUNNY system either.


6. Radar System Realization And Installation
MetaSensing developed the MiniSAR system as a compact radar sensor that combines high-quality radar performance with flexible system characteristics (see Figure 3-4). It is the result of several years of experience by MetaSensing in creating and developing SAR system solutions [40]. The sensor operates at X-band and can be mounted on a wide range of moving platforms, including the ANTEX, the platform designed for the demonstration of the SUNNY concept. High Resolution Imaging (HRI) and Moving Target Indication (MTI) capabilities permit all-weather airborne observation and surveillance. Figure 6-1 shows a typical example of a high-resolution radar image acquired by MetaSensing with a previous version of its X-band airborne radar sensor.

Figure 6-1 Example of SAR image acquired by MetaSensing's MiniSAR.

The functionalities of the MiniSAR are described in previous project documentation [39]. For convenience, the block diagram is reproduced in Figure 6-2. The black arrows indicate control and data connections, the blue arrows represent RF connections and the red lines show the DC power distribution lines. The acronyms used in the figure are defined in the document conventions (paragraph 2.3).

6.1 MiniSAR sub-systems for the SUNNY demonstrator
Most of the hardware installed on the ANTEX for the SUNNY demonstrator corresponds to what was described in the project deliverable about sensor specification [39]. There, some design choices were still left open, as potential alternatives remained to be assessed. Moreover, the acquisition requirements had to be further consolidated with respect to the demonstrator scenario. In order to explicitly mention any final choice or modification with respect to the initial design, the different MiniSAR subsystems actually used for the SUNNY demonstrator are summarized in the following paragraphs.

6.1.1 Sensor processing and control unit (PCU)
The MiniSAR PCU is the brain of the radar system. It takes care of controlling and synchronizing the operation of all sub-systems, generating the waveform, processing the data, storing the results and providing the outputs. The main constituents of the control unit are a processing computer, an arbitrary waveform generator (AWG) and an analog-to-digital converter (ADC).


Figure 6-2 Block diagram of the MetaSensing MiniSAR radar sensor. The black arrows indicate control and data connections (serial and/or Ethernet), the blue arrows represent RF coaxial connections and the red lines show the DC power distribution lines.

In order to comply with the payload specification of the ANTEX, the lighter of the two possible SAR PCU enclosure solutions (standard vs. ruggedized), i.e. the standard one, was chosen. The main physical characteristics of the MiniSAR PCU are summarized in Table 6-1.

Parameter   Value
Weight      2.9 kg
Height      80 mm
Depth       250 mm
Width       200 mm

Table 6-1 Physical characteristics of the standard MiniSAR PCU

6.1.2 Radar front-end (RFE)
The radar front-end (RFE) is the radio-frequency module of the radar, taking care of transmitting and receiving the microwave signals. It accepts the IF waveform as input from the MiniSAR PCU and up-converts it to the desired operating frequency (9.6 GHz) before sending it to the connected RF antenna. The radar echoes received by the antenna are then conveyed back to the RF unit for down-conversion: the IF signal is delivered again to the SAR control unit for digitization, possible storage and, mostly, for target detection and image cropping processing. A photograph of the actual RFE is displayed in Figure 6-3 and its physical specifications are given in Table 6-2. It is to be noted that this module is a more powerful version (HPA of 5 W) than the originally designed one (HPA of 2 W) [39]. A more powerful amplifier allows for a higher SNR, translating into improved detection of smaller targets, also at longer ranges.


Figure 6-3 The actual MiniSAR RFE installed in the ANTEX

Parameter   Value
Weight      4 kg
Height      65 mm
Depth       200 mm
Width       180 mm

Table 6-2 Physical characteristics of the MiniSAR RFE for the SUNNY demonstrator. Dimensions do not include connectors and cabling

6.1.3 GPS-IMU
The GPS-IMU is used by the radar to derive the position of the sensor and the attitude of the aircraft when emitting its waveforms. The MiniSAR sensor is equipped with its own GPS-IMU, shown in Figure 6-4. The physical characteristics of the GPS-IMU are given in Table 6-3. A dedicated firmware interfaces the SAR processing and control unit with the navigation unit.

Figure 6-4 The actual MiniSAR navigation unit installed in the ANTEX


A GPS antenna is installed on the roof of the ANTEX to receive the GPS satellite signals. The one used by the MiniSAR is shown in Figure 6-5. It is an active antenna designed to operate at the GPS L1 and L2 frequencies, 1575.42 and 1227.60 MHz, and across the L band from 1525 to 1560 MHz. The GPS receiver provides the necessary power through the antenna RF connector (the top one of the two in Figure 6-4). A coaxial cable with a male TNC connector is used. The antenna is aircraft-certified for navigation.

Figure 6-5 The actual MiniSAR GPS antenna installed in the ANTEX

Parameter                        Value
Weight (including GPS antenna)   2.4 kg
Height                           89 mm
Depth                            127 mm
Width                            106 mm

Table 6-3 Physical characteristics of the MiniSAR GPS-IMU


6.1.4 Radar RF antenna
The MiniSAR RF antenna is a flat panel based on microstrip array technology, which offers good electromagnetic performance with limited dimensions and weight. An image of the radar RF antenna is provided in Figure 6-6. The physical characteristics and performance of the antenna are described in Table 6-4.

Figure 6-6 The MiniSAR RF antenna installed in the ANTEX

Parameter                              Value
Frequency                              9.5 - 9.7 GHz
Gain                                   26 dBi
Azimuth Half Power Beamwidth (HPBW)    4.4 deg
Elevation HPBW                         10.6 deg
Sidelobe level                         -13 dB
Polarization                           V
Dimensions                             450 x 225 x 15 mm
Connector                              N female
Weight                                 1.3 kg

Table 6-4 Physical characteristics of the MiniSAR RFA


6.1.5 Pan-and-Tilt Unit (PTU)
The pan-and-tilt unit (PTU), displayed in Figure 6-7, is a device that compensates for possible unwanted aircraft attitude variations, according to the navigation data coming from the GPS-IMU and processed in real time by the MiniSAR PCU. In this way, the desired area can be imaged with high precision and reliability, making sure that the illuminated scene is the desired one and is not affected by the unstable motion of the mounting platform. Additionally, the PTU is the element by which the antenna scanning is performed, so that the area to be monitored can be chosen according to the user requirements. The connector on the base of the PTU is used to power and control the unit. The connector halfway up the PTU, visible in Figure 6-7, is left unused. The key parameters of the Pan-and-Tilt Unit are summarized in Table 6-5.

Figure 6-7 The MiniSAR PTU with flat panel microstrip RFA mounted on it

Parameter        Value
Weight           4.6 kg
Height           200 mm
Depth            180 mm
Width            450 mm
Rotating radius  620 mm

Table 6-5 Physical characteristics of the MiniSAR PTU.


6.2 Actual Mechanical Installation
In order to comply with the space and weight limitations imposed by the ANTEX platform (see Figure 6-8), the different MiniSAR subsystems are installed directly on the internal walls of the platform without using any enclosure, which would normally be advisable to protect the electronics and minimize EMI with other systems. The non-moving parts fit in a ~27 x 34 x 27.5 cm volume, while the moving parts (pan-and-tilt head and RF antenna) are accommodated inside a radome (see Figure 6-9) under the belly of the ANTEX. The scan function of the radar antenna is implemented mechanically with the PTU. The PTU allows a 350° (non-continuous) rotation in azimuth and 60° in elevation below the horizon. However, because the landing gear could obstruct the radar signals, the scanning sector is further limited to a smaller operational range. The total weight, including all parts and cables, is about 16 kg. At the time of writing this document, the MiniSAR system is being installed on board the ANTEX at the facilities of the Academia da Força Aérea in Portugal. The environmental specifications described in [39] are still considered valid as such.

Figure 6-8 Space reserved for MiniSAR on board of the ANTEX (courtesy of the Portuguese Academia da Força Aérea)


Figure 6-9 CAD representation of the MiniSAR PTU with mounted RFA inside the radome to be placed under the belly of the ANTEX (courtesy of the Portuguese Academia da Força Aérea)

6.3 Radar Data Processing Algorithms
Most of the radar processing (maritime MTI and ISAR) is done in real time on board the MiniSAR. However, since in the SUNNY project the ISAR image generation is tasked to CNIT and since the ISAR processing has been widely described in the previous chapters, this section is dedicated to a high-level description of the algorithms implemented to process the radar data up to target detection and image cropping. The algorithms are implemented in MATLAB®. A block diagram of the complete radar data processing is shown in Figure 6-10.

[Block diagram: RD map / SAR image -> Target Detection -> targets #1 ... #N -> Target crop -> ISAR processing -> ISAR image / target parameters]

Figure 6-10 Radar data processing block diagram.

A deep dissertation on the radar processing procedures is beyond the scope of the present paragraph; the reader can find all the required information in the literature on the topic, for example in [3],[4],[5],[42]-[45]. A general overview follows.
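The chain sketched in Figure 6-10 (detection on the RD map, per-target cropping, then ISAR processing of each chip) can be illustrated with a minimal sketch. The array layout, window size, and function names below are assumptions for illustration, not the MiniSAR implementation.

```python
import numpy as np

def crop_targets(rd_map, detections, half=8):
    """Cut a window around each detected (range, Doppler) cell.

    rd_map     : 2-D array (range gates x Doppler bins), hypothetical layout
    detections : list of (row, col) indices produced by the detector
    half       : half-width of the crop window, in cells (assumed value)
    """
    crops = []
    n_rows, n_cols = rd_map.shape
    for r, c in detections:
        r0, r1 = max(r - half, 0), min(r + half, n_rows)
        c0, c1 = max(c - half, 0), min(c + half, n_cols)
        crops.append(rd_map[r0:r1, c0:c1])   # chip handed to ISAR processing
    return crops

# Toy RD map with two bright cells standing in for ship returns
rd = np.zeros((64, 64))
rd[10, 20] = rd[40, 50] = 30.0
chips = crop_targets(rd, [(10, 20), (40, 50)])
```

Each chip would then be processed independently by the ISAR branch, matching the N parallel paths of Figure 6-10.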


Briefly, SAR focusing aims at synthesizing an image from the radar backscattered information. The echo energy received back at the radar antenna from a certain target is dispersed in range (by the duration of the transmitted pulse) and in azimuth (by the time the target is illuminated by the radar antenna beam during the motion of the host platform). To obtain SAR images, the signal processing therefore has to perform the dual task of range compression and azimuth compression, as explained in the following.

The transmitted signal is received after being backscattered from the illuminated scene. A matched filter is applied, meaning that the echo signal is mixed with a replica of the transmitted waveform, yielding the intermediate-frequency (beat) signal, whose frequency is proportional to the time delay of the targets and therefore to their distances. Range compression is then obtained by a Fourier transform; to reduce the influence of strong scatterers on neighbouring weaker pixels, a window is applied before the transform. The echo signals collected along a data-take trajectory have to be coherently combined in order to synthesize a larger antenna and obtain a finer along-track resolution. To do this, the position of the platform carrying the radar is precisely derived from the navigation unit and synchronized with the received signal. A Fourier transform is then performed in the azimuth direction on each range gate to transform the data into the range-Doppler domain.

From a range-Doppler (RD) map it is possible to estimate the range of each detected target and how fast it is approaching or receding from the radar platform; targets moving at various speeds at various ranges can therefore be distinguished. As an example, Figure 6-11 shows what the radar processing outputs look like in practice: an RD map processed from radar data acquired with the MiniSAR system for SUNNY during the qualification flight of 20 April 2016 (see chapter 7). Two targets are clearly visible at different distances from the radar antenna, about 10 km and 13 km, and they are characterized by different velocities. These, however, are not the actual velocities of the targets, as the compensation for the aircraft velocity has not been applied in that image.

Within the MiniSAR processing chain, ship detection is achieved by a Constant False Alarm Rate (CFAR) algorithm applied to the RD maps. Many CFAR detection algorithms have been proposed in the literature, for example in [46]-[48]. Essentially, the algorithm relies on the homogeneity of the image to be analyzed, which holds for sea-clutter scenarios. From the signal values in a cluster of samples, statistical parameters are estimated and used to fit the histogram to the probability density function of a statistical distribution previously assumed to characterize the clutter (the most popular hypothesis is the K-distribution). A threshold is then calculated, and its application leads directly to a binary image indicating the presence or absence of a target. An example of how this is implemented in the MiniSAR is shown in Figure 7-10. For each detected target, a crop of the RD image is generated and passed on for the further ISAR processing steps.
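The range-compression step described above (mixing with a replica of the transmitted waveform, windowing, then a Fourier transform, with the beat frequency proportional to the target delay) can be sketched for a single point target. All waveform parameters below are assumed for illustration and are not the MiniSAR settings.

```python
import numpy as np

c = 3e8
B, T = 200e6, 1e-3            # sweep bandwidth [Hz] and duration [s] (assumed)
K = B / T                     # chirp rate [Hz/s]
fs = 20e6                     # beat-signal sampling rate [Hz] (assumed)
t = np.arange(int(fs * T)) / fs

# After mixing with the transmit replica, a point target at range R appears
# as a beat tone at f_b = 2 * R * K / c.
R = 5000.0                    # target range [m]
beat = np.cos(2 * np.pi * (2 * R * K / c) * t)

# Windowing before the FFT tames the sidelobes of strong scatterers that
# would otherwise mask neighbouring weaker pixels, as noted in the text.
win = np.hanning(len(beat))
spectrum = np.abs(np.fft.rfft(beat * win))
freqs = np.fft.rfftfreq(len(beat), 1.0 / fs)

# The peak beat frequency maps back to the target range.
R_est = freqs[np.argmax(spectrum)] * c / (2 * K)
```

With these assumed parameters the range bin spacing is c/(2B) = 0.75 m, so the estimate recovers the 5 km target to within one bin.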
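For the detection step, a minimal one-dimensional cell-averaging CFAR sketch is given below. The MiniSAR detector operates on 2-D RD maps and assumes K-distributed clutter; this toy version instead uses a simple local-mean threshold on exponentially distributed clutter, so names and parameters are illustrative only.

```python
import numpy as np

def ca_cfar(x, guard=2, train=8, scale=5.0):
    """1-D cell-averaging CFAR: estimate the clutter level from training
    cells around the cell under test (guard cells excluded) and declare a
    detection when the cell exceeds scale * local mean."""
    n = len(x)
    hits = np.zeros(n, dtype=bool)
    for i in range(n):
        lo = max(i - guard - train, 0)
        hi = min(i + guard + train + 1, n)
        # training cells on both sides, skipping the guard band
        cells = np.r_[x[lo:max(i - guard, 0)], x[min(i + guard + 1, n):hi]]
        if cells.size and x[i] > scale * cells.mean():
            hits[i] = True
    return hits

rng = np.random.default_rng(0)
profile = rng.exponential(1.0, 200)   # homogeneous sea-clutter stand-in
profile[60] += 40.0                   # injected target return
det = ca_cfar(profile)                # binary detection map, as in the text
```

The output is the binary image the text describes: ones where a target is declared, zeros elsewhere, with the threshold adapting to the local clutter level.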


Figure 6-11 Example of RD map derived from data acquired with the MiniSAR sensor for SUNNY during the qualification flight test performed on 20 April 2016 (see chapter 7)


6.4 Radar performance
The main parameters of the MiniSAR system installed in the ANTEX platform during the SUNNY demonstration are reviewed in Table 6-6.

Parameter                                     Value             Unit
Pulse repetition frequency (PRF) (*)          0.5 < PRF < 10    kHz
Wavelength                                    0.03125           m
Antenna 3 dB beamwidth (azimuth, elevation)   4.4, 10.6         deg
Antenna scan width                            +/-175            deg
Pulse coherent integration                    Yes               N/A
Number of pulses integrated                   1024              N/A
Detection threshold (min S/N)                 10                dB
Radar min-max slant range                     3-25              km
Radar resolution                              4                 feet/px

Table 6-6 Radar performance specifications. (*) Tunable, depending on acquisition scenario, operational range, flight altitude, etc.

(**) Due to the pulsed approach, the actual transmitted power of the MiniSAR for SUNNY is ~20-30% of the peak value.
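As a quick sanity check on Table 6-6, the tabulated wavelength and PRF bounds can be related to the carrier frequency and to the maximum unambiguous range through the generic pulsed-radar relation c/(2·PRF); this is a textbook relation, not a statement about the MiniSAR timing design.

```python
c = 299_792_458.0  # speed of light [m/s]

# Carrier frequency implied by the tabulated wavelength of 0.03125 m
f_carrier_ghz = c / 0.03125 / 1e9          # ~9.59 GHz, i.e. X-band

def unambiguous_range_km(prf_hz):
    """Maximum unambiguous range of a pulsed radar: c / (2 * PRF)."""
    return c / (2.0 * prf_hz) / 1e3

r_at_10khz = unambiguous_range_km(10e3)    # ~15 km at the highest PRF
r_at_500hz = unambiguous_range_km(0.5e3)   # ~300 km at the lowest PRF
```

This trade-off is consistent with the table's footnote: at the highest PRF the unambiguous range (~15 km) covers only part of the 3-25 km slant-range window, while lower PRF settings extend it well beyond, which is presumably why the PRF is tunable per acquisition scenario.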


7. Performance and Qualification Tests
In order to assess the overall radar performance of the MiniSAR for SUNNY, the different subsystems discussed in paragraph 6.1 have been subjected to measurements at the MetaSensing labs, and the main results are reported in paragraph 7.1. Additionally, as a further qualification test of the overall system to be used on the ANTEX, the MiniSAR has been installed on a (manned) flying platform and some acquisitions have been performed over actual maritime scenarios, as described in paragraph 7.2.

7.1 MiniSAR subsystems lab tests
This paragraph presents the results of the lab tests concerning some of the subsystems composing the MiniSAR (see paragraph 6.1). It is to be noted, however, that the correct functionality of each subsystem has been verified, without necessarily being documented in this paragraph, as a deep investigation of all performance parameters of each subsystem is considered beyond the purpose of this document.

7.1.1 RFE
As part of the conventional procedure and quality control at MetaSensing, the radar front end has been subjected to extensive lab measurements after its production. Some of the results are summarized in Table 7-1.

Parameter                           Specification     Test result
TX frequency range                  9450-9750 MHz     TESTED
Output power vs frequency [dBm]     +38.0...38.4
Spurious level                      <-80
Input noise figure                                    TESTED
Pulsed operation mode                                 TESTED
IF attenuator control (RX chain)    RS232             OK
RF attenuator control (TX chain)                      OK
DC input current [A] @ +12V         5.5

Table 7-1 Measured parameters for the MiniSAR RFE for SUNNY

7.1.2 Radar RFA
The radiation patterns of the MiniSAR antenna have been measured in an anechoic chamber; the results are plotted in Figure 7-1. The parameters listed in Table 6-6 are confirmed.

Figure 7-1 Radiation patterns measured at 9.6 GHz (operating frequency of the MiniSAR for SUNNY)
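The measured beamwidths can be turned into an approximate beam footprint on the sea surface. The small-angle arc approximation below is a generic sketch, with the 10 km slant range chosen only as an example, not a MiniSAR design figure.

```python
import math

def footprint_width(slant_range_m, beamwidth_deg):
    """Approximate 3 dB footprint width: arc length subtended by the
    beamwidth angle at the given slant range (small-angle approximation)."""
    return slant_range_m * math.radians(beamwidth_deg)

# Beamwidths from Table 6-6: 4.4 deg in azimuth, 10.6 deg in elevation
az_width = footprint_width(10_000.0, 4.4)    # azimuth extent at 10 km
el_width = footprint_width(10_000.0, 10.6)   # elevation extent at 10 km
```

At a 10 km slant range the beam thus spans roughly 0.77 km across-track by 1.85 km along the elevation cut, which indicates the patch of sea interrogated per antenna dwell.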


7.2 MiniSAR qualification test
Before the installation of the MiniSAR system on board the ANTEX platform, an assessment of the acquisition performance based on the detection of actual targets (ships) in a maritime environment was planned. Therefore, the MiniSAR equipment has been flown on a manned platform and acquisitions have been performed over some areas of the Dutch sea in front of the Rotterdam harbour, as described in detail in the following paragraphs.

7.2.1 Aircraft platform
As part of the planning, the radar qualification test had to be performed on a manned aircraft platform, as the integration of the radar sensor on board the ANTEX platform was not yet finalized. The chosen aircraft is a Cessna 208 Caravan, the one usually operated by MetaSensing for the prototyping phase of its airborne sensors; it is shown in Figure 7-2. This kind of aircraft allows the complete radar system, including the RF antennas, to be installed inside the fuselage: the antennas can radiate the radar signals and receive the echoes through the wide lateral cargo door. In this way, the high costs and the long time required to obtain certification for installing and operating the radar system are contained.

Figure 7-2 C208 Grand Caravan, the (manned) aircraft platform used for the qualification test of the MiniSAR system for SUNNY.

7.2.2 On-board installation
As mentioned, the complete X-band radar system to be used on board the ANTEX has been installed inside the fuselage of the C208, including the radar antenna mounted on the PTU. These were mounted on an aluminium interface structure, so that the same configuration envisioned for the ANTEX could be realized (PTU installed with its base on top, as shown in Figure 6-9). The aluminium structure has been fastened in proximity of the cargo door, as shown in Figure 7-3. Because of this arrangement, the MiniSAR resulted in a left-side-looking system, and antenna scanning (in azimuth) was possible only within ±30° of the direction perpendicular to the flight direction, as other angles would have resulted in radiation inside the cabin of the aircraft. In elevation, the antenna pointing was not limited by the installation constraints, so detections could be attempted at the desired look angle (30°-90°). The remaining subsystems (RF, processor, navigation unit, DC/DC converters) have been arranged inside a metallic enclosure fastened to the cabin floor. It is to be noted that, even if the system configuration used for the acquisitions did not exactly match the one envisioned for the ANTEX, it was still sufficient to test the functionality of the assembled hardware and the reliability of the algorithms developed for the detection of ships at sea.


Figure 7-3 Installation of the radar antenna and PTU on board the C208 Caravan

Figure 7-4 Radar antenna and PTU on board the C208 Caravan during an actual acquisition.


7.2.3 Acquisition site
The aircraft used for the qualification test is based in Teuge, a town close to the city of Apeldoorn, in the inner part of the Netherlands, as shown in Figure 7-5. The figure is a screenshot taken from a marine traffic website showing real-time Automatic Identification System (AIS) data [49]; it represents a typical vessel situation on an average spring day in the sea in front of the Dutch coast. The use of a collaborative target was not planned for these acquisitions; therefore, it was chosen to fly over the area of Hoek van Holland, near the city of Rotterdam. There, the river Maas flows into the sea, forming the main access route to the Rotterdam harbour, one of the biggest in Europe. By flying in that area, the probability of having differently sized ships in the radar-illuminated scene could be maximized, thanks to the usually high traffic. Additionally, it was decided to fly parallel to the coast up to the town of IJmuiden, also known as a ship-congested spot.

Figure 7-5 An average maritime traffic scenario between Hoek van Holland and IJmuiden, on an average day in the spring period, a few days before the acquisitions with the MiniSAR.

7.2.4 Measurement campaigns
A first measurement flight was performed on 1 April 2016. During the ferry flight from Teuge to Rotterdam, the Air Traffic Control (ATC) of Schiphol withdrew the previously granted authorization to fly over the designated site, due to airliners transiting in the area. Therefore, only acquisitions over land scenarios were performed, in particular over highways, where moving vehicles could be found. A second flight was performed on 20 April 2016, and this time no ATC limitations were imposed, except that the flight altitude had to be kept below 1000 ft. Radar data have been successfully acquired over the designated maritime scenario; detections were obtained at different observation ranges, from 3 km to more than 10 km, concerning ships characterized by different dimensions and velocity regimes (including standing still). No collaborative target was employed during the acquisitions, so AIS information on the ships and vessels in the scene has been obtained from external sources.


Figure 7-6 Picture taken during the acquisitions, showing differently sized ships in the monitored area

Figure 7-7 AIS information for some of the vessels in the area of interest during the radar acquisitions of 20/4/2016, performed as a qualification test


7.3 Results
As a first step, RD plots have been obtained from the acquired data. As an example, Figure 7-8 shows RD maps belonging to the same acquisition, in which a target (presumably a big vessel) is clearly visible at ~3 km range. The radar antenna was scanning within a 60° sector, that is, ±30° from the direction perpendicular to the direction of flight. The target was therefore within the antenna beam several times while the antenna was scanning and the aircraft moving, which is why it can be seen in several RD maps processed from the data of the same acquisition. The relative position and velocity between radar and target vary during the flight, and this can be seen in the plots, which have not been compensated for aircraft motion.

Figure 7-8 Range Doppler maps showing the confirmed detection of a vessel target [dB].


Figure 7-9 shows an RD map in which a target is detected at ~10 km distance. Despite the relatively long range, a good SNR level (~20 dB) can be appreciated.

Figure 7-9 Range Doppler map [dB] showing a detection of a target at ~10 km distance.

The parameters estimated by the radar system, i.e. the position and velocity of the detected and confirmed targets, are being validated by comparison with the available AIS data. Figure 7-10 shows an example of the CFAR output: from the RD map on the left to the detection map on the right. From knowledge of the aircraft position, the antenna pointing direction, and the range to the target, it is possible to unambiguously locate the position of the target.
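The geolocation step just described can be sketched with a flat-earth approximation. The function name, parameters, and the spherical-earth radius below are illustrative assumptions, not the MiniSAR implementation.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean earth radius (assumed, flat-earth sketch)

def locate_target(ac_lat, ac_lon, heading_deg, antenna_az_deg,
                  slant_range_m, altitude_m=300.0):
    """Combine aircraft position, antenna pointing, and slant range into a
    target latitude/longitude (hypothetical helper, small-offset model)."""
    # project the slant range onto the ground
    ground = math.sqrt(max(slant_range_m**2 - altitude_m**2, 0.0))
    # absolute bearing of the beam = aircraft heading + antenna azimuth
    bearing = math.radians(heading_deg + antenna_az_deg)
    north, east = ground * math.cos(bearing), ground * math.sin(bearing)
    lat = ac_lat + math.degrees(north / EARTH_RADIUS_M)
    lon = ac_lon + math.degrees(
        east / (EARTH_RADIUS_M * math.cos(math.radians(ac_lat))))
    return lat, lon
```

For example, a target at 10 km slant range due north of an aircraft at 52°N, 4°E would be placed about 0.09° of latitude further north, which is the kind of fix that can then be cross-checked against the AIS tracks.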

Figure 7-10 The CFAR algorithm turns an RD map (left plot) into a detection map (right plot)


7.4 Conclusions
This chapter addressed the performance of the radar that will be used in the SUNNY demonstrator, by briefly discussing the tests performed at MetaSensing's facilities on the main radar subsystems. In particular, the tests performed on the most critical subsystems from a radar point of view, the radio-frequency equipment and the radar antenna, are reported in paragraph 7.1. As a functional test of the overall system, an actual airborne campaign with the complete system has been performed on 20 April 2016 over a maritime scenario in the Netherlands. A manned platform has been used to accomplish the radar acquisitions and to show the correct functioning of the system. Different vessels have been detected at various ranges, as shown in the plots of paragraph 7.3.


References
[1] SUNNY, "Description of Work," 2013.
[2] www.metasensing.com
[3] A. Meta et al., "Simultaneous L- and X-band fully polarimetric and interferometric airborne SAR campaigns in China," IGARSS 2016, July 10-15, 2016, Beijing, China.
[4] F. Ulaby, R. Moore, and A. Fung, Microwave Remote Sensing: Radar Remote Sensing and Surface Scattering and Emission Theory, Addison-Wesley Publishing Company, Advanced Book Program/World Science Division, 1981.
[5] J. C. Curlander, Synthetic Aperture Radar: Systems and Signal Processing, Wiley, 1991.
[6] W. Carrara, R. Goodman, and R. Majewski, Spotlight Synthetic Aperture Radar: Signal Processing Algorithms, Artech House Signal Processing Library, Artech House, 1995.
[7] M. Soumekh, Synthetic Aperture Radar Signal Processing with MATLAB Algorithms, Wiley-Interscience, Wiley, 1999.
[8] M. Martorella, E. Giusti, F. Berizzi, A. Bacci, and E. Dalle Mese, "ISAR based techniques for refocusing non-cooperative targets in SAR images," IET Radar, Sonar & Navigation, vol. 6, no. 5, pp. 332-340, June 2012.
[9] M. Martorella, E. Giusti, F. Berizzi, A. Bacci, and E. Dalle Mese, "An ISAR Technique for Refocusing Moving Targets in SAR Images," chapter 14 of Signal and Image Processing for Remote Sensing, 2nd edition, CRC Press (Taylor and Francis Group), edited by C. H. Chen, 2012.
[10] V. C. Chen and M. Martorella, Inverse Synthetic Aperture Radar Imaging: Principles, Algorithms and Applications, Institution of Engineering and Technology, 2014.
[11] M. Martorella, F. Berizzi, and B. Haywood, "Contrast maximization based technique for 2-D ISAR autofocusing," IEE Proceedings - Radar, Sonar and Navigation, vol. 152, no. 4, pp. 253-262, Aug. 2005.
[12] M. Martorella, "Novel approach for ISAR image cross-range scaling," IEEE Transactions on Aerospace and Electronic Systems, vol. 44, no. 1, pp. 281-294, January 2008.
[13] D. Ausherman, A. Kozma, J. L. Walker, H. M. Jones, and E. C. Poggio, "Developments in Radar Imaging," IEEE Transactions on Aerospace and Electronic Systems, vol. AES-20, no. 4, pp. 363-400, July 1984.
[14] J. L. Walker, "Range-Doppler Imaging of Rotating Objects," IEEE Transactions on Aerospace and Electronic Systems, vol. AES-16, no. 1, pp. 23-52, Jan. 1980.
[15] M. Martorella, "Introduction to Inverse Synthetic Aperture Radar," in Elsevier Academic Press Library in Signal Processing: Communications and Radar Signal Processing, vol. 2, 1st ed., Sept. 2013.
[16] M. Stuff, M. Biancalana, G. Arnold, and J. Garbarino, "Imaging moving objects in 3D from single aperture Synthetic Aperture Radar," in Proceedings of the IEEE Radar Conference 2004, pp. 94-98, April 2004.
[17] M. Iwamoto and T. Kirimoto, "A novel algorithm for reconstructing three-dimensional target shapes using sequential radar images," IGARSS 2001, vol. 4, pp. 1607-1609, 2001.
[18] K. Suwa, K. Yamamoto, M. Iwamoto, and T. Kirimoto, "Reconstruction of 3-D Target Geometry Using Radar Movie," 7th European Conference on Synthetic Aperture Radar (EUSAR) 2008, pp. 1-4, 2-5 June 2008.
[19] T. Cooke, "Scatterer labeling estimation for 3D model reconstruction from an ISAR image sequence," Proceedings of the International Radar Conference 2003, pp. 315-320, 3-5 Sept. 2003.
[20] T. Cooke, "Ship 3D model estimation from an ISAR image sequence," Proceedings of the International Radar Conference 2003, pp. 36-41, 3-5 Sept. 2003.
[21] X. Xu and R. M. Narayanan, "Three-dimensional interferometric ISAR imaging for target scattering diagnosis and modeling," IEEE Transactions on Image Processing, vol. 10, no. 7, pp. 1094-1102, Jul. 2001.
[22] J. A. Given and W. R. Schmidt, "Generalized ISAR - part I: an optimal method for imaging large naval vessels," IEEE Transactions on Image Processing, vol. 14, no. 11, pp. 1783-1791, 2005.
[23] J. A. Given and W. R. Schmidt, "Generalized ISAR - part II: interferometric techniques for three-dimensional location of scatterers," IEEE Transactions on Image Processing, vol. 14, no. 11, pp. 1792-1797, Nov. 2005.


[24] C. Z. Ma, T. S. Yeo, Q. Zhang, H. S. Tan, and J. Wang, "Three-dimensional ISAR imaging based on antenna array," IEEE Transactions on Geoscience and Remote Sensing, vol. 46, no. 2, pp. 504-515, 2008.
[25] J. Mayhan, M. Burrows, K. Cuomo, and J. Piou, "High resolution 3D snapshot ISAR imaging and feature extraction," IEEE Transactions on Aerospace and Electronic Systems, vol. 37, no. 2, pp. 630-642, 2001.
[26] G. Wang, X.-G. Xia, and V. Chen, "Three-dimensional ISAR imaging of maneuvering targets using three receivers," IEEE Transactions on Image Processing, vol. 10, pp. 436-447, Mar. 2001.
[27] N. Shen, Z. P. Yin, and W. D. Chen, "A compensation method for baseline length variation in InISAR imaging," pp. 1034-1038, 2009.
[28] Q. Zhang, T. S. Yeo, G. Du, and S. Zhang, "Estimation of three-dimensional motion parameters in interferometric ISAR imaging," IEEE Transactions on Geoscience and Remote Sensing, vol. 42, no. 2, pp. 292-300, Feb. 2004.
[29] D. Felguera-Martin et al., "Interferometric inverse synthetic aperture radar experiment using an interferometric linear frequency modulated continuous wave millimetre-wave radar," IET Radar, Sonar & Navigation, vol. 5, no. 1, pp. 39-47, 2011.
[30] M. Martorella et al., "3D interferometric ISAR imaging of noncooperative targets," IEEE Transactions on Aerospace and Electronic Systems, vol. 50, no. 4, pp. 3102-3114, October 2014.
[31] M. Martorella et al., "3D target reconstruction by means of 2D-ISAR imaging and interferometry," IEEE Radar Conference (RADAR) 2013, pp. 1-6, April 2013.
[32] K. Suwa, T. Wakayama, and M. Iwamoto, "Three-dimensional target geometry and target motion estimation method using multistatic ISAR movies and its performance," IEEE Transactions on Geoscience and Remote Sensing, vol. 49, no. 6, pp. 2361-2373, 2011.
[33] K. Knaell and G. Cardillo, "Radar tomography for the generation of three-dimensional images," IEE Proceedings - Radar, Sonar and Navigation, vol. 142, no. 2, p. 54, 1995.
[34] F. Salvetti et al., "Joint Use of Two-dimensional Tomography and ISAR Imaging for Three-dimensional Image Formation of Non-cooperative Targets," 10th European Conference on Synthetic Aperture Radar (EUSAR) 2014, pp. 1-4, 3-5 June 2014.
[35] J. Li and P. Stoica, MIMO Radar Signal Processing, Wiley, Hoboken, 2009.
[36] D. Pastina, M. Bucciarelli, and P. Lombardo, "Multistatic and MIMO distributed ISAR for enhanced cross-range resolution of rotating targets," IEEE Transactions on Geoscience and Remote Sensing, vol. 48, no. 8, pp. 3300-3317, 2010.
[37] C. Ma et al., "MIMO Radar 3D Imaging with Improved Rotation Parameters," PIERS Proceedings, Kuala Lumpur, Malaysia, March 27-30, 2012.
[38] X. Xie and Y. Zhang, "3D ISAR imaging based on MIMO radar array," 2nd Asian-Pacific Conference on Synthetic Aperture Radar (APSAR) 2009, pp. 1018-1021, 26-30 Oct. 2009.
[39] SUNNY project, "Sensor Specification," D2.2.
[40] A. Meta et al., "A selection of MetaSensing Airborne campaigns at L-, X- and Ku-band," Proceedings of the 2012 IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2012), July 22-27, 2012, Munich, Germany.
[41] A. Meta et al., "MetaSensing compact, high resolution Interferometric SAR sensor for commercial and scientific applications," European Radar Conference (EuRAD) 2010, Sept. 30 - Oct. 1, 2010.
[42] M. I. Skolnik, Introduction to Radar Systems, 3rd edn., McGraw-Hill Book Company, 2001.
[43] A. G. Stove, "Linear FMCW radar techniques," IEE Proceedings F - Radar and Signal Processing, vol. 139, no. 5, pp. 343-350, 1992.
[44] A. Meta, Signal Processing of FMCW Synthetic Aperture Radar Data, PhD Dissertation, August 2006.
[45] I. G. Cumming and F. H. Wong, Digital Processing of Synthetic Aperture Radar Data, Artech House, 2005.
[46] N. N. Liu, J. Li, and Y. Cui, "A new detection algorithm based on CFAR for radar image with homogeneous background," Progress In Electromagnetics Research C, vol. 15, pp. 13-22, 2010.


[47] P. P. Gandhi and S. A. Kassam, "Analysis of CFAR processors in nonhomogeneous background," IEEE Transactions on Aerospace and Electronic Systems, vol. 24, no. 4, pp. 427-445, 1988.
[48] M. Barkat, S. D. Himonas, and P. K. Varshney, "CFAR detection for multiple target situations," IEE Proceedings, vol. 136, no. 5, 1989.
[49] https://www.vesselfinder.com/