
Design and Development of a UAV System for Multispectral Imaging and Remote Sensing Applications

by

Jorge Marin Marcano

A thesis submitted in partial fulfillment of the requirements for the degree of

Master of Science

in

Computer Engineering

Department of Electrical and Computer Engineering

University of Alberta

© Jorge Marin Marcano, 2018


Abstract

The increasing use of Unmanned Aerial Vehicles (UAVs) over the past few years has opened the door to a large number of applications in research, commercial, and industrial areas. UAVs have become an efficient, cost-effective, and environmentally friendly (low or zero CO2 emissions) platform for remote sensing compared to manned vehicles. One major field in this category is multispectral sensing. Commercial UAV-compatible multispectral cameras are available but tend to be expensive and, in most cases, offer limited flexibility for custom development. This thesis is organized in three main parts. It begins with the development of a modular fixed-wing UAV that includes a customized airframe, integrated avionics built around a custom UAV system board with redundant power supply, and a single-board computer (SBC) for payload interfacing. The other two parts of this thesis focus on the design, development, and testing of two revisions of a custom multispectral imaging system.

The customized fixed-wing airframe was redesigned for modularity and reliability. The open-source UAV system board integrates all essential sensors and components into a small and compact board, reducing the amount of wiring and connectors, which in turn reduces weight and increases reliability. The UAV system board includes a detachable and novel redundant power supply board that provides main and backup power regulation. The complete UAV system is presented as an efficient and flexible platform.

The multispectral imaging system is based on multiple USB 2.0 camera modules and optical filters. The first iteration, or proof of concept, introduces a 3-band multispectral imaging system based on three rolling-shutter cameras synchronized by software, with optical filters for the green, red, and near-infrared bands. The cameras were controlled by the SBC on board the UAV. The imaging system was tested over a grass field, and the collected images were post-processed. The resulting images were aligned and Normalized Difference Vegetation Index (NDVI) images were produced, in which it was possible to detect vegetation and identify different levels of vegetation stress. This system proved to be a low-cost and simple, yet effective, approach to custom UAV multispectral imaging.

The next evolution of the system is a 12-band multispectral imaging platform based on USB 2.0 camera modules. The cameras used in this revision have a global shutter and a trigger mode that allows hardware synchronization. A novel bridge board was designed to integrate all USB communications via USB hubs and to include a microcontroller unit (MCU) in charge of triggering the cameras. The USB hub network has selectable routes for grouping cameras onto different upstream USB ports depending on the bandwidth requirements of the application. The system was tested and characterized for the maximum number of simultaneous cameras per root USB port depending on camera configuration. With the highly integrated bridge board, the overall size and weight of the system were reduced, making it compatible with most UAVs. Software based on third-party tools was also developed, allowing extensive configuration of the cameras and control over the imaging system.

This work presents a detailed methodology for UAV system development, introduces a comprehensive UAV system board, and explores the capabilities of USB 2.0 camera modules as part of complex imaging systems. The hardware and software platforms developed throughout this thesis are cost-effective and efficient solutions that allow customization and expansion, making them an ideal development tool for UAV systems and UAV multispectral imaging applications.


Preface

The work in this thesis, including the conceptualization, design, implementation, data processing, and analysis, was carried out by me under the supervision of Dr. Duncan Elliott from the Department of Electrical and Computer Engineering at the University of Alberta.

The work was done in working space provided by the University of Alberta Aerial Robotics Group (UAARG), and later in Dr. Michael Lipsett's laboratory in the Department of Mechanical Engineering at the University of Alberta.


Acknowledgements

I would first like to thank my supervisor Dr. Duncan Elliott from the Electrical and Computer Engineering Department of the University of Alberta for his insights and guidance, and Dr. Michael Lipsett from the Mechanical Engineering Department at the University of Alberta for his support.

I would also like to thank the University of Alberta Aerial Robotics Group (UAARG) for their help during the first steps of the project. Many thanks to Andrew Jowsey, Sebastian Werner, Cindy Xiao, and Brian Hinrichsen. Their knowledge and experience were of great importance. I would especially like to thank Rijesh Augustine for his vital contributions throughout the project. I would also like to acknowledge Stream Technologies Inc. for their valuable support in the final stages of the project.

Finally, I must express my profound gratitude to my family, friends, and my loving spouse for their continuous support and encouragement throughout my years of study and through the process of conducting research and writing this thesis. This accomplishment would not have been possible without them. Thank you.

Jorge Marin.


Contents

List of Tables x

List of Figures xii

List of Abbreviations xvii

Chapter 1: Introduction 1

1.1 Objective of the Thesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

1.2 Organization of the Thesis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

Chapter 2: Literature Review 4

2.1 Imaging Systems and Cameras . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

2.1.1 Fundamentals of Optics and Imaging Systems . . . . . . . . . . . . . . . . 4

2.1.2 Image Sensor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

2.1.2.1 CCD vs. CMOS Technologies . . . . . . . . . . . . . . . . . . 8

2.1.2.2 Rolling vs. Global Shutter . . . . . . . . . . . . . . . . . . . . . 9

2.1.2.3 Performance Metrics of Image Sensors . . . . . . . . . . . . . . 9

2.1.3 Lenses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

2.1.4 Modulation Transfer Function . . . . . . . . . . . . . . . . . . . . . . . . 11

2.1.4.1 Resolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

2.1.4.2 Contrast . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

2.1.4.3 Slanted-Edge MTF Measurement . . . . . . . . . . . . . . . . . 17


2.1.5 USB Camera Modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

2.1.5.1 USB Video Class (UVC) . . . . . . . . . . . . . . . . . . . . . 19

2.1.5.2 Video for Linux Two (V4L2) API . . . . . . . . . . . . . . . . . 20

2.1.5.3 GStreamer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

2.2 Spectral Imaging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

2.2.1 Vegetation, Soil and Water Reflectance . . . . . . . . . . . . . . . . . . . 22

2.2.2 Landsat Thematic Mapper . . . . . . . . . . . . . . . . . . . . . . . . . . 23

2.2.3 Lightweight UAV-Compatible Multispectral Imaging Systems . . . . . . . 24

2.3 UAV-Based Spectral Remote Sensing . . . . . . . . . . . . . . . . . . . . . . . . 27

2.3.1 Crop Inspection and Precision Agriculture . . . . . . . . . . . . . . . . . . 27

2.3.2 Surface and Soil Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

2.3.3 Surface Hydrocarbon Detection . . . . . . . . . . . . . . . . . . . . . . . 30

2.3.4 Environmental Monitoring . . . . . . . . . . . . . . . . . . . . . . . . . . 31

2.3.5 Archaeology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

Chapter 3: UAV Development 33

3.1 UAV Electronic System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

3.1.1 UAV System Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

3.1.1.1 Microcontroller Unit (MCU) . . . . . . . . . . . . . . . . . . . 35

3.1.1.2 Inertial Measurement Unit (IMU) . . . . . . . . . . . . . . . . . 35

3.1.1.3 GPS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

3.1.1.4 XBee and Data-logger . . . . . . . . . . . . . . . . . . . . . . . 36

3.1.1.5 PWM Outputs . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

3.1.1.6 Power Regulation . . . . . . . . . . . . . . . . . . . . . . . . . 36

3.1.1.7 Additional Features . . . . . . . . . . . . . . . . . . . . . . . . 37

3.1.2 Redundant Power Supply Board . . . . . . . . . . . . . . . . . . . . . . . 38

3.1.2.1 Principle of Operation . . . . . . . . . . . . . . . . . . . . . . . 40

3.1.3 Firmware and Ground Control Station (GCS) . . . . . . . . . . . . . . . . 43

3.2 Payload Computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44


3.3 UAV Avionics Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

3.4 Airframe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

3.4.1 Falcon I . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

Chapter 4: Proof of Concept Multispectral Imaging System 53

4.1 USB Camera Modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53

4.1.1 Selection and Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . 54

4.1.2 Software Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55

4.1.3 Lens Testing - Modulation Transfer Function . . . . . . . . . . . . . . . . 57

4.2 3-Band Multispectral Imaging System . . . . . . . . . . . . . . . . . . . . . . . . 58

4.2.1 Cameras and Lenses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59

4.2.2 Optical Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61

4.2.3 Software Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63

4.2.4 UAV Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65

4.2.5 Flight Campaigns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66

4.2.6 Image Alignment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67

4.2.7 Vegetation Indices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

Chapter 5: 12-Band Multispectral Imaging Platform 73

5.1 System Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73

5.2 USB Camera Modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

5.2.1 Optical Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76

5.3 Camera Bridge Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76

5.3.1 USB HUB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79

5.3.2 Trigger . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80

5.4 Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81

5.5 Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81

5.6 Costs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85

5.7 Focusing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

5.8 Image Alignment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89


Chapter 6: Conclusions and Future Work 91

6.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93

Bibliography 95

Appendix A: UAV System Board User Guide 105

Appendix B: Camera Bridge Board Schematics 154

Appendix C: Camera Bridge Board Bill-of-Materials 167

Appendix D: 3D Printer Settings 171


List of Tables

2.1 Features comparison between CCD and CMOS image sensors. . . . . . . . . . . . 9

2.2 Spatial frequency unit conversion chart. . . . . . . . . . . . . . . . . . . . . . . . 15

2.3 Landsat Thematic Mapper Bands for Vegetation Monitoring. . . . . . . . . . . . . 24

2.4 Comparison of common multispectral imaging systems. . . . . . . . . . . . . . . 26

3.1 UAV System Board Features . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

3.2 Technical Specifications of the ODROID-C2 . . . . . . . . . . . . . . . . . . . . . 44

4.1 Features of the ELP-USB500W02M camera. . . . . . . . . . . . . . . . . . . . . 54

4.2 Settings of the ELP-USB500W02M camera. . . . . . . . . . . . . . . . . . . . . 55

4.3 Frame Rates for the ELP-USB500W02M camera based on output video format and resolution. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56

4.4 Results of different M12 lenses tested using the slanted-edge MTF method. . . . . 57

4.5 Features of the ELP-USB130W01MT-B/W camera. . . . . . . . . . . . . . . . . 59

4.6 Settings of the ELP-USB130W01MT-B/W camera. . . . . . . . . . . . . . . . . . 60

4.7 Frame Rates for the ELP-USB130W01MT-B/W camera based on output video format and resolution. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

4.8 Technical details of the optical filters. . . . . . . . . . . . . . . . . . . . . . . . . 62

4.9 Flight campaigns details. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67

5.1 Specifications of the DMM 42BUC03-ML camera. . . . . . . . . . . . . . . . . . 74

5.2 Video formats and frame rates available in the DMM 42BUC03-ML camera. . . . 76


5.3 Settings of the DMM 42BUC03-ML camera. . . . . . . . . . . . . . . . . . . . . 76

5.4 Test results of USB traffic with different configurations. Cameras were connected to a single USB upstream port and triggered simultaneously. Data transferred is based on a single frame from each camera. . . . . . . . . . . . . . . . . . . . . . 82

5.5 Comparison of common multispectral imaging systems with the system presented in this work. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

5.6 General costs of components for the multispectral imaging system hardware. . . . 85

5.7 Costs per set/unit based on order quantity for Bridge Board bare PCB and BOM. . 85

D.1 3D printer settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171


List of Figures

2.1 Illustration of the fundamental parameters of an imaging system. . . . . . . . . . . 5

2.2 Geometry of a basic optics system using the thin-lens model. . . . . . . . . . . . . 5

2.3 Angle of view and its horizontal, vertical and diagonal components [18]. . . . . . 6

2.4 Focal length and angle of view of a “full frame” (35mm) image sensor [19]. . . . . 7

2.5 Conversion and charge transferring principles between CCD and CMOS technologies [20]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

2.6 Comparison between rolling shutter and global shutter captures on a moving target [21]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

2.7 Perfectly clear pattern before (left) and after (right) passing through a low resolution lens [25]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

2.8 Resolving two black stripes. If the space between the stripes is too small (a) the camera sensor will not be able to resolve them as separate stripes [25]. . . . . . . . 13

2.9 Different levels of contrast plotted as a square wave [25]. . . . . . . . . . . . . . . 14

2.10 Contrast comparison in resulting images at low (a) and high (b) frequency, and MTF plot (c) [27]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

2.11 SFR target with horizontal and vertical (highlighted) slanted edges. . . . . . . . . 17

2.12 Flowchart of the operations in the edge gradient analysis. . . . . . . . . . . . . . . 18

2.13 Different MTF shapes and their meaning. . . . . . . . . . . . . . . . . . . . . . . 18

2.14 Diagram of GStreamer components [38]. . . . . . . . . . . . . . . . . . . . . . . 21

2.15 Illustration of hyperspectral imaging. Each pixel in the image forms the spectrum through the imaged wavebands. . . . . . . . . . . . . . . . . . . . . . . . . . . . 21


2.16 Comparison of band classifications in multispectral (few wide bands) and hyperspectral (several narrow bands) models. Units are in micrometers for the multispectral ranges. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

2.17 Reflectance spectrum of water, soil and vegetation. . . . . . . . . . . . . . . . . . 22

2.18 Reflectance spectrum of water, soil and vegetation together with the Landsat TM bands in the 0-2.5 µm range. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

2.19 (a) Tetracam Mini MCA6. (b) Sentek GEMS. (c) Sentera QUAD. (d) Parrot Sequoia. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

2.20 Comparison of spectral bands used in each camera. . . . . . . . . . . . . . . . . . 26

3.1 Block diagram of the UAV system board hardware. . . . . . . . . . . . . . . . . . 35

3.2 UAV system board. 90x52mm 4-layer PCB with a weight of 29 grams. A detailed manual and user guide for the board can be found in Appendix A. . . . . . . . . . 37

3.3 Redundant power supply module. 41x46mm 4-layer PCB with a weight of 14 grams. There are four voltage regulators, from bottom left and clockwise: autopilot main, payload, servo backup, and autopilot backup. . . . . . . . . . . . . . . . . . 39

3.4 Redundant power supply module block diagram. Greyed blocks and dashed lines are inactive. (a) Normal power operation mode: all systems are on and powered from the main battery. (b) Backup power operation mode: autopilot and servos are powered from the backup battery; motor, payload and Electronic Speed Controller (ESC) / Battery Eliminator Circuit (BEC) are off. . . . . . . . . . . . . . . . . . . 40

3.5 Main and backup power supplies connection diagram. Both power supplies are connected in parallel and there is a voltage difference between the two, where V_MAIN is slightly higher than V_BACKUP when in normal operation mode. . . . . . . . . 41

3.6 Variation of the current sunk by the backup voltage regulator circuit (i_S) with the voltage difference ∆V = V_MAIN − V_BACKUP, including the point of current inversion for a low ∆V in which i_S becomes negative. . . . . . . . . . . . . . . . . . 41

3.7 Variation of the current sunk by the backup voltage regulator circuit (i_S) with the voltage difference ∆V when both voltage regulators are on (∆V = V_MAIN − V_BACKUP). 42

3.8 Variation of the current sunk by the backup voltage regulator circuit (i_S) with the output voltage level (V_OUT) when the backup voltage regulator is off (output in high impedance). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

3.9 Paparazzi-UAV Ground Control Station (GCS) interface during a flight. . . . . . . 43

3.10 Hardkernel ODROID-C2. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44


3.11 UAV avionics module fully assembled with all its components. . . . . . . . . . . . 45

3.12 HobbyKing EPP FPV airframe. . . . . . . . . . . . . . . . . . . . . . . . . . . . 46

3.13 Inside of the HobbyKing EPP FPV fuselage after splitting. Inside cuts are marked and the nose was removed. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

3.14 Side of the fuselage with cuts and some plastic parts glued already (right). . . . . . 47

3.15 (a) Original EPP FPV fuselage. (b) Modified fuselage (Falcon I design). . . . . . . 48

3.16 (a) Top fuselage module. (b) Assembled wing centre module. (c) Assembly attachment glued to the wings. (d) Wings assembly. . . . . . . . . . . . . . . . . . . . . 49

3.17 (a) Original EPP FPV tail. (b) Modified tail (Falcon I design). . . . . . . . . . . . 50

3.18 Internal wiring of the fuselage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50

3.19 (a) Default foam nose. (b) Custom plastic nose for installing a USB camera board. 51

3.20 Rendering cut of the fuselage with internal components. . . . . . . . . . . . . . . 51

3.21 3D model of the Falcon I full assembly. . . . . . . . . . . . . . . . . . . . . . . . 52

3.22 Falcon I full assembly on the ground (a) and during flight (b). . . . . . . . . . . . 52

4.1 USB camera module ELP-USB500W02M from ELP Ailipu Technology Co. . . . 54

4.2 MTF plot for lens 5 (6mm focal length). . . . . . . . . . . . . . . . . . . . . . . . 58

4.3 Simple block diagrams of a single camera (a), and the 3-band multispectral imaging system (b). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59

4.4 Quantum Efficiency of the Aptina / ON Semiconductor AR0130CS image sensor. 61

4.5 Spectral response or transmissivity of the green, red, and NIR band-pass filters chosen for the system. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62

4.6 Comparison of chosen bands with bands used in other commercial multispectral cameras. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62

4.7 From left to right; green, red, and NIR filters mounted on lens cap adapters. . . . . 63

4.8 Back of a lens with IR cut filter (left), and after IR cut filter removal (right). . . . 63

4.9 Diagram of the GStreamer pipeline application for three ELP-USB130W01MT-B/W cameras. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64

4.10 3D model of custom plane nose with 3 cameras. (a) Isometric view. (b) Side cross-section view. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65

4.11 (a) Custom plane nose with 3 cameras and filters. (b) Nose installed in the plane. . 65


4.12 Map of Bremner Field indicating the delimited area for flight operations. . . . . . 66

4.13 Correlation plot of alignment algorithm indicating maxima. . . . . . . . . . . . . 68

4.14 Example of pixel offset and cropping of images for alignment. . . . . . . . . . . . 69

4.15 Superposition of red, green, and NIR band images before and after alignment. . . . 69

4.16 Result of image alignment algorithm. (a) Superposition before alignment. (b) Su-perposition after alignment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

4.17 Example NDVI test of leaves with different levels of stress. (a) RGB image. (b) False-colored NDVI image. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71

4.18 Sample pictures of a flight at Bremner Field showing the NIR and red band images and the resulting NDVI image. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72

5.1 DMM 42BUC03-ML camera module from The Imaging Source, including S-Mount lens holder and lens. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

5.2 Quantum efficiency of the ON Semiconductor AR0134CS image sensor. . . . . . 75

5.3 Timing diagram of the trigger, exposure, and image readout processes. . . . . . . . 75

5.4 Camera board with S-Mount (M12) lens and optical filter installed using the 3D printed adapter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77

5.5 Diagram of the basic connections between the camera array and the bridge board. The bridge board routes all USB communications into selectable upstream ports and sends trigger signals to the cameras. . . . . . . . . . . . . . . . . . . . . . . . 77

5.6 Altium Designer layout view of the bridge board with top layer (red), bottom layer (blue), and silkscreen (yellow). . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78

5.7 CAD 3D model of the bridge board. . . . . . . . . . . . . . . . . . . . . . . . . . 78

5.8 Diagram of the configurable USB HUB network. SW 1, SW 2, and SW 3 are 2-to-1 bidirectional USB 2.0 switches with inputs controlled by user-selectable jumpers. USB 1, USB 2, and USB 3 are the isolated upstream ports, and USB M is the master upstream USB port. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79

5.9 Diagram of the trigger system based on a microcontroller unit (MCU). Trigger signals are generated via GPIOs. A UART connection via USB is provided for interactively interfacing with the MCU via serial commands. . . . . . . . . . . . . 80

5.10 Multispectral imaging system hardware with 12 cameras installed. . . . . . . . . . 84

5.11 Example of a sample grayscale image (left) after applying the positive Laplacian operator (right). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87


5.12 Graph of the focus measure calculated using the above algorithm as the focus is adjusted from out of focus (too far) to out of focus (too close), passing through the in-focus point. Any point outside the in-focus point is out of focus. The graph highlights two out-of-focus points for reference, one too far and one too close. . . 88

5.13 Sample images of the three highlighted points in the graph of Figure 5.12. (a) Out of focus (too far). (b) In-focus. (c) Out of focus (too close). . . . . . . . . . . . . . 88

5.14 Result of aligning two images. (a) Template or reference image. (b) Test image aligned to template image. It is possible to see the black borders filling the empty spaces after alignment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90

5.15 Result of image alignment algorithm. (a) Superposition before alignment. (b) Su-perposition after alignment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90


List of Abbreviations

AGL Above Ground Level

BEC Battery Eliminator Circuit

CCD Charge-Coupled Devices

CMOS Complementary Metal–Oxide–Semiconductor

CPI Camera Parallel Interface

CSI Camera Serial Interface

DOF Degrees of Freedom

DOF Depth of Field

DOM Degrees of Measurement

DSC Digital Still Camera

EPP Expanded Polypropylene

ESC Electronic Speed Controller

FOV Field of View

FPS Frames Per Second

GNDVI Green Normalized Difference Vegetation Index

GPS Global Positioning System

GSD Ground Sampling Distance

GUI Graphical User Interface

IMU Inertial Measurement Unit


MCU Microcontroller Unit

MTF Modulation Transfer Function

NDRE Normalized Difference Red Edge Index

NDVI Normalized Difference Vegetation Index

NIR Near Infrared

PCB Printed Circuit Board

PLA Polylactic Acid

PMAG Primary Magnification

ROI Region of Interest

SBC Single Board Computer

SFR Spatial Frequency Response

SNR Signal-to-Noise Ratio

UAARG University of Alberta Aerial Robotics Group

UAV Unmanned Aerial Vehicle

USB Universal Serial Bus

UVC USB Video Class

V4L2 Video for Linux Two

WD Working Distance


Chapter 1

Introduction

Aerial and satellite photography are remote sensing techniques that have contributed to numerous developments due to the extensive coverage and large amount of data that can be collected in a short period of time. Remote sensing is based on collecting data about an object or phenomenon from a distance, without making physical contact with the object. Multispectral imaging is a major remote sensing method in the field of spectroscopy that studies the amount of light emitted by or reflected from materials and how that energy varies with wavelength. In the context of aerial and satellite multispectral imaging, data is collected by detecting the energy that is reflected from the Earth's surface. Multispectral imaging focuses on a small number of carefully selected bands, usually 3 to 10, that are not necessarily adjacent and tend to be 30-40 nm wide. These bands are usually selected for a specific application by knowing the type of materials that will be studied and their spectral response. Aerial and satellite multispectral imaging has a wide range of applications, from research to commercial and industrial uses.

The Landsat Thematic Mapper (TM) [1] is a satellite-mounted whisk-broom multispectral scanning sensor from NASA that has been used for extensive characterization and identification in multiple applications [2, 3, 4, 5]. The TM has a total of 7 spectral bands, 3 in the visible spectrum and 4 in the infrared spectrum. Each band has been carefully selected for the reflectance properties of certain materials of interest, such as vegetation, soil, and water. Manned aircraft are also used for flying multispectral imaging equipment [6, 7]. While imaging payloads on satellites and manned aircraft have been useful and effective, the costs and complexity associated with these implementations make them unfeasible for small-scale and low-cost applications. With the increasing use of Unmanned Aerial Vehicles (UAVs) in recent years, and the advances and miniaturization in imaging system technologies, costs and complexity have been considerably reduced.

UAVs have been successfully used for flying multispectral cameras in a variety of applications. Experiments have shown that UAV-based spectral data has a high correlation with equivalent ground-level measurements. Nebiker et al. [8] validated the use of a UAV-mounted high-end multispectral sensor, showing high correlation with respect to ground-level reference measurements, together with a low-cost commercial near-infrared (NIR) camera that showed reasonable performance.

A major application of UAV-based multispectral cameras is crop inspection and precision agriculture. Part of the experiments by Nebiker et al. [8] included plant vitality assessment and plant disease detection. Roosjen et al. [9] used a UAV-mounted spectrometer to gather spectral anisotropy data from several agricultural crops as a way to measure crop growth. Candiago et al. [10] also demonstrated the use of UAV-based multispectral imaging for vegetation mapping and crop assessment.

Multispectral images taken from UAVs have also been successfully used for surface and terrain studies, detecting soil and vegetation marks, and surface modeling. Themistocleous et al. [11] combined UAV and ground spectral data to study the formation of crop and soil marks for archaeological applications, while Turner et al. [12] used a multispectral camera together with a visible camera and a thermal infrared camera to study the physiological state of Antarctic moss ecosystems. Hassan-Esfahani et al. [13] reported multispectral tests from a UAV that successfully estimated surface soil moisture over a large field irrigated by a center-pivot sprinkler system.

UAV-based multispectral imaging is a growing field that continues to have a major impact on vegetation assessment, crop inspection, and precision agriculture. With recent advances in multispectral sensing technologies, small and lightweight commercial solutions have found their way into the market of UAV remote sensing tools. Nonetheless, these tools are still relatively expensive solutions that, in some cases, lack flexibility for further development. Popular UAV-compatible multispectral cameras such as the Sentek Systems GEMS Multispectral Sensor [14], the Sentera QUAD [15], and the Parrot Sequoia [16] offer small, lightweight, and highly integrated solutions. Conversely, their fixed optical and functional properties negatively affect the flexibility and potential of these solutions.

This work aims to introduce the design and development, and to demonstrate the use and performance, of two main systems. The first is a comprehensive UAV system board with a redundant power supply that integrates most UAV functions and components. The second is a custom-made multispectral imaging system based on USB camera modules. The imaging system is a flexible hardware and software platform that can be used to develop custom UAV-compatible multispectral imaging solutions.

1.1 Objective of the Thesis

This thesis attempts to demonstrate the feasibility of a small-scale, lightweight, low-cost UAV platform for multispectral imaging applications. The work presented here contributes a practical design and implementation methodology for developing UAV systems, focused on an integrated UAV system board and multi-camera imaging systems for aerial remote sensing. First, a basic conceptual 3-band multispectral imaging system is developed. The presented system introduces the use of multiple USB 2.0 camera modules and different optical filters for capturing simultaneous images in different spectral bands. Commercial UAV-compatible multispectral cameras are analyzed and compared to the proposed system. To test and validate the system, a fixed-wing UAV was designed and built, based on a semi-custom airframe, UAV system board hardware with a novel redundant power supply system, and a single-board computer (SBC) for the payload. The UAV is used to evaluate the effectiveness and performance of the system in real field multispectral imaging applications. A second, more complex revision of the multispectral imaging system is then developed, with a higher number of cameras and more advanced features. This second evolution serves as a development platform for UAV-compatible multispectral imaging applications.

1.2 Organization of the Thesis

Following this introduction is a literature review in Chapter 2, where a basic overview of imaging systems, remote sensing, and UAV-based multispectral applications is presented, together with previous work pertaining to this thesis. Chapter 3 covers the development of the UAV platform, from the avionics to the airframe, including the development of a UAV system board with a redundant power supply. Chapters 4 and 5 elaborate on the design, development, and testing of the basic 3-band multispectral imaging system and the more advanced 12-band multispectral imaging system, respectively. Finally, Chapter 6 summarizes the conclusions of this work for future development and application.


Chapter 2

Literature Review

This chapter presents a review of previously published literature relevant to the major topics covered in this work, including: imaging systems, image sensors, USB camera modules, spectral imaging, remote sensing, and the integration of unmanned aerial vehicles (UAVs) into aerial imaging remote sensing solutions with applications in multiple areas. Previous system solutions and their achievements and limitations are discussed.

2.1 Imaging Systems and Cameras

An imaging system is an arrangement of components that allows images to be captured and handled. Parts of an imaging system can optically modify the capture, or alter, store, process, and/or transmit the images. The most essential stage of any imaging system is the capturing device, which is the camera. Cameras are optical devices that capture images, either as individual still photographs or as a sequence of images constituting videos. Technically speaking, cameras are remote sensing instruments as they sense subjects without any contact.

A camera consists of a combination of optical, mechanical, and, in the case of digital cameras, electronic components. Traditional cameras captured light onto a photographic plate or photographic film. Digital cameras use an electronic image sensor to capture images, and additional electronics to transfer those images to storage media. The present work is focused on the use of digital cameras.

2.1.1 Fundamentals of Optics and Imaging Systems

The design and understanding of an imaging system involves many optical properties, dimensions, and calculations, but there are some basic fundamental parameters and concepts (Figure 2.1) that must be understood prior to any further study.

Figure 2.1: Illustration of the fundamental parameters of an imaging system.

The two main components of any camera are the image sensor and the lens. All calculations related to optics are based on the dimensions and characteristics of these two elements. The thin-lens model [17] is used here for the purpose of covering basic optics and imaging concepts. The thin-lens model simplifies ray tracing calculations by ignoring the optical effects due to the thickness of the lens. While it is an approximation, this model (Figure 2.2) illustrates lens and optics principles and introduces the fundamental terms with which to discuss imaging systems.

Figure 2.2: Geometry of a basic optics system using the thin-lens model.


Sensor Size The dimensions of the camera sensor's active area. This parameter is typically specified as horizontal and vertical physical dimensions. The sensor size can be calculated by knowing the number of horizontal and vertical pixels in the sensor and the size of a pixel. Sensor size is an important parameter in determining the characteristics of a lens for a desired field of view.

Focal Length ( f ) The distance from the lens's optical center to the image sensor plane when the lens is focused at infinity (Figure 2.2). Focal Length, usually stated in millimeters, is the principal metric of a lens, and gives a measure of how strongly light converges in the optics system, assuming a positive (convex) lens.

Aperture The size of the opening through which light enters the camera. Together with the focal length, it defines the cone angle of the light rays that come to a focus at the sensor plane.

Working Distance (WD) The distance from the front of the lens to the object. For a fixed Focal Length, changing the Working Distance proportionally changes the Field of View.

Field of View (FOV) The total viewable area of the object that is imaged by the lens onto the image sensor. This is the portion of a given scene that is captured by the camera's sensor. It can be expressed in terms of total area or by its horizontal, vertical, or diagonal components. FOV is sometimes expressed as the angular size of the view cone, in which case it is called the angle of view. The angle of view can also be measured horizontally, vertically, or diagonally (Figure 2.3).

Figure 2.3: Angle of view and its horizontal, vertical and diagonal components [18].

The horizontal and vertical components of the angle of view can be calculated in degrees by:

\theta_{H,V} = 2\arctan\left(\frac{s_{H,V}}{2f}\right)\cdot\frac{180}{\pi} \qquad (2.1)


where s_{H,V} is the horizontal or vertical size of the sensor and f is the Focal Length. Similarly, the relationship between Focal Length, Sensor Size, Working Distance, and Field of View is defined by:

FOV_{H,V} = \frac{s_{H,V}\cdot WD}{f} \qquad (2.2)

It can be seen from (2.2) that a greater Focal Length leads to a smaller Field of View as the angle becomes smaller. Figure 2.4 illustrates examples of different Focal Lengths, and how the angle of view and Field of View change accordingly.

Figure 2.4: Focal length and angle of view of a “full frame” (35mm) image sensor [19].
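As a quick illustration, equations (2.1) and (2.2) can be evaluated directly. The following Python sketch computes the angle of view and the ground coverage for illustrative sensor dimensions, focal length, and working distance; these example values are not taken from any camera used in this thesis.

```python
import math

def angle_of_view_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """Horizontal or vertical angle of view in degrees, per equation (2.1)."""
    return 2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)) * 180 / math.pi

def field_of_view(sensor_dim_mm: float, working_distance: float, focal_length_mm: float) -> float:
    """Field of view in the same units as the working distance, per equation (2.2)."""
    return sensor_dim_mm * working_distance / focal_length_mm

# Illustrative values only: a 4.8 mm x 3.6 mm sensor, a 6 mm lens, and a 100 m working distance.
print(angle_of_view_deg(4.8, 6.0))     # ~43.6 degrees (horizontal angle of view)
print(field_of_view(4.8, 100.0, 6.0))  # 80.0 m of ground coverage (horizontal FOV)
```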

Resolution The smallest feature size of the object that can be visibly resolved in the image. Resolution quantifies how close lines can be to each other and still be distinguished in the final image. This is a simplified definition for a concept that is very important and widely misunderstood. A broader and more detailed explanation of resolution will be covered later on. Concepts like sensor resolution, Modulation Transfer Function (MTF), and effective resolution of an imaging system will be introduced.

Depth of Field (DOF) The range between the maximum and minimum working distances where it is possible to maintain acceptable focus. Depth of Field is also the total amount of object movement along the optical axis allowable while maintaining focus.

Primary Magnification (PMAG) The Primary Magnification is defined as the ratio between the Sensor Size and the Field of View:

PMAG = \frac{s_{H,V}}{FOV_{H,V}} \qquad (2.3)

2.1.2 Image Sensor

As previously mentioned, the image sensor is the principal element of a camera, in charge of capturing light and transferring the information that constitutes an image. Image sensors have a sensitive cell array that converts light waves into electrical signals that are measured and interpreted in order to produce a matrix of values that form the image.

2.1.2.1 CCD vs. CMOS Technologies

Most modern image sensors use one of two types of technologies: semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal–oxide–semiconductor (CMOS).

Regardless of the technology used, image sensors must first convert light into electrons that are accumulated as charge, and then read the value of each cell. In CCD sensors, the charge is transferred from one pixel to another across the chip until it ends up at the periphery of the array, where it is converted into voltage. Then, an analog-to-digital converter (ADC) turns each pixel's voltage into a digital value. In CMOS image sensors, there are several transistors integrated into each pixel that amplify and transfer the charge using traditional wires. CMOS sensors are more flexible because pixels can be read individually. Figure 2.5 shows a visual comparison of the working principles of CCD and CMOS sensors.

Figure 2.5: Conversion and charge transferring principles between CCD and CMOS technologies [20].

There are a number of features in which CCD and CMOS image sensors are very different. CCD sensors create a higher-quality, low-noise image, whereas CMOS sensors are more susceptible to noise. Because each pixel on a CMOS sensor has several transistors next to it, many of the photons will inevitably hit these transistors and the overall light sensitivity will be lower. CCD sensors use a special manufacturing process that guarantees the transportation of charge across the chip without distortion, leading to sensors with high fidelity. The downside is that this process consumes considerably more power. CCD sensors can consume up to 100 times more power than an equivalent CMOS sensor. Conversely, CMOS chips use traditional manufacturing processes, so they can be fabricated on just about any standard silicon production line. Therefore, CMOS sensors tend to be inexpensive compared to CCD sensors. Table 2.1 summarizes the comparison of some of the most important features between CCD and CMOS image sensors.

Table 2.1: Features comparison between CCD and CMOS image sensors.

Feature              CCD        CMOS
System Noise         Low        Moderate
Responsivity         Moderate   Slightly Better
Power Consumption    High       Low
Sensitivity (*)      High       Moderate
Resolution           High       High
Cost                 High       Low

(*) Visible and infrared bands up to approximately 1100 nm.

2.1.2.2 Rolling vs. Global Shutter

The shutter is the mechanism for exposing the sensor to light. There are two types of shutter, rolling and global, where the difference is how each controls the scanning of the image. In a rolling shutter mechanism, the image is scanned sequentially, where the pixels of the sensor are exposed to light line by line. It usually starts from top to bottom and is more commonly found in CMOS sensors. While each pixel line collects light during the same period of time, the light collection start and end times for each line differ.

In a global shutter mechanism, on the other hand, all the pixels are exposed at the same time and for the same duration. Since light collection happens simultaneously for all pixels, the scene is "frozen" in time, provided that the time window for light collection is short enough and that no motion occurs during this window. If this is the case, there will be no motion blur in the resulting image. Motion blur is likely to be present in images captured with a rolling shutter if there is any movement occurring between the start of the first line and the end of the last line. Figure 2.6 shows a comparison of a moving object being captured by a rolling shutter sensor and a global shutter sensor.

Figure 2.6: Comparison between rolling shutter and global shutter captures on a moving target [21].

2.1.2.3 Performance Metrics of Image Sensors

Essentially, the function of an image sensor is the conversion of light into a digital image. There are some common metrics used for comparing how well image sensors can perform their task. This section elaborates on some of these metrics and presents them in three categories: metrics related to the pixel layout, metrics related to the pixel physics, and metrics related to the pixel readout [22].

Metrics Related to Pixel Layout The most common metric used for comparing image sensors is the pixel count, usually expressed in megapixels. The number of pixels in a sensor depends on the overall size of the chip and the area the designer has available for the pixels versus the area used for the rest of the circuitry. When it comes to pixel count, the higher the better.

Another metric that also relates to pixel count is the pixel pitch, usually expressed in micrometers. This metric defines how much area each pixel occupies in the array. Generally, a smaller pitch means more pixels can be fitted. However, smaller pixels will not collect as much light, and thus might not be suitable for fast exposure captures or low illumination applications.

Finally, another widely used metric is the fill factor, which deals with how pixels are physically laid out. The fill factor is the percentage of the pixel area that is occupied by the actual photodiode. A 100% fill factor would be ideal, as all the pixel area would be used for light collection. Most sensors do not have a 100% fill factor, and designers need to decide how much area is used for photodiodes. Nonetheless, some chips can achieve a 100% fill factor by using a backside-illuminated design, where readout circuitry is placed below the photodiode, allowing it to occupy the entire pixel area [23, 24].

Metrics Related to Pixel Physics Another set of metrics by which image sensors are compared is related to the physics of how photons are converted into charges. When photons hit the silicon with enough energy, an electron-hole pair is formed. However, this does not happen for every single photon, and the effect also differs depending on the wavelength. Quantum efficiency is a percentage measure of how many photons are converted into electron-hole pairs.

Another important metric is the dynamic range, given in decibels, which measures how accurately a sensor can capture a signal from low light conditions all the way up to full capacity. A higher dynamic range is desired as it will allow the sensor to operate in a wider range of light conditions.

Metrics Related to Pixel Readout A third set of metrics that is useful for image sensor comparison has to do with the features and performance of the readout process and circuitry. One of the most important metrics here is the Signal-to-Noise Ratio (SNR), measured in decibels. This is a ratio of how much of the original signal is captured and present in the output versus how much noise is present. A higher SNR means there is more of the original signal present in the output compared to the noise level. SNR varies with light intensity, as it is typically lower for dim light and higher for bright light.

Other important metrics are the percentage of linearity, which is how linear the pixel output is with respect to the captured photovoltage, and the frame rate, which is a measure of how many times per second the full pixel array can be exposed and read. Power consumption is also important. The choice of power supply, operating frequency, and capacitance affect power consumption. Typically, a higher frequency allows a higher frame rate at the cost of higher power consumption.

Finally, the output bit-depth is another important feature. The photodiode signal, which is later converted into voltage and read, will eventually be sampled by an analog-to-digital converter to output a digital value. The bit-depth defines the number of bits used for representing this value. The higher the number of bits, the higher the value resolution and the more distinct values can be represented.
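As a brief illustration of these readout metrics, the Python sketch below expresses an SNR and a dynamic range in decibels and counts the output levels for a given bit-depth. The numeric values are arbitrary examples, and the full-well/read-noise form of the dynamic range is one common convention rather than a definition given in this thesis.

```python
import math

def snr_db(signal_rms: float, noise_rms: float) -> float:
    """Signal-to-noise ratio expressed in decibels."""
    return 20 * math.log10(signal_rms / noise_rms)

def dynamic_range_db(full_well_electrons: float, read_noise_electrons: float) -> float:
    """Dynamic range in decibels from full-well capacity and read noise (a common convention)."""
    return 20 * math.log10(full_well_electrons / read_noise_electrons)

def adc_levels(bit_depth: int) -> int:
    """Number of distinct digital values available for a given ADC bit-depth."""
    return 2 ** bit_depth

# Arbitrary example numbers, not measurements of any sensor discussed in this thesis.
print(snr_db(1000, 10))            # 40.0 dB
print(dynamic_range_db(20000, 5))  # ~72.0 dB
print(adc_levels(10))              # 1024 distinct output values
```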

2.1.3 Lenses

Lenses are optical transmission devices used for focusing or dispersing light by means of refraction. Lenses are made out of transparent glass or plastic, usually with a circular shape. The two sides of the lens are polished surfaces, either or both of which have a concave or convex curve.

A lens can achieve its focusing effect thanks to light changing direction as it passes through the lens, crossing the interfaces between air and the lens material. This effect is called refraction, and it occurs both where light enters the lens and where it emerges from the lens into the air.

Lenses are important because they have the valuable property of forming images of objects situated in front of them, and they are an indispensable component of any camera or imaging system. Lenses also have drawbacks, such as non-linear distortion (which can be measured and corrected with image processing) and limits to optical resolution.

2.1.4 Modulation Transfer Function

The Modulation Transfer Function (MTF) is an important and common metric for evaluating the performance of imaging systems. MTF measures the ability of the imaging system to transfer contrast at a particular resolution from the object to the image. In order to understand the importance and significance of the MTF, the concepts of resolution and contrast must be defined.

2.1.4.1 Resolution

A simplified definition of resolution was given in a previous subsection. A more detailed and thorough explanation of this concept is presented here. Resolution is a measurement of how well an imaging system can reproduce object detail. It is often expressed in line-pairs per millimeter (lp/mm), which is a unit of frequency that originates from the testing method used to measure resolution. A line-pair is a sequence of two lines with opposing contrast, that is, one black and one white. A basic target with a sequence of alternating, equally spaced black and white stripes is usually used for performing MTF testing. This target presents line-pairs with increasing frequency in order to test the imaging system's resolving capabilities. When an imaging system captures this pattern, with perfectly clear edges, the lines will have a degree of blurriness in the final image due to imaging system limitations (Figure 2.7). Increasing the frequency will also increase the blur. High resolution images have a high level of detail and minimal blur, whereas low resolution images lack fine detail and exhibit higher blur.

Figure 2.7: Perfectly clear pattern before (left) and after (right) passing through a low resolution lens [25].

As the frequency in the target increases, the line-pairs become thinner and fewer pixels are required for imaging each line-pair. If the stripes are imaged onto neighboring pixels, then they will appear to be one single long stripe in the image (Figure 2.8a) rather than two separate stripes (Figure 2.8b). Therefore, there is a minimum line-pair width for which the image sensor is capable of resolving the black and white stripes. At least the distance equivalent to one pixel must be between two black stripes. Following this reasoning, a minimum of 2 pixels is required for imaging a line-pair, one pixel for the black stripe and one for the white stripe.


Figure 2.8: Resolving two black stripes. If the space between the stripes is too small (a), the camera sensor will not be able to resolve them as separate stripes [25].

Correspondingly, this limitation is defined by the size of the pixel in the image sensor. The sensor resolution can be calculated with (2.4).

Sensor Resolution (lp/mm) = 1000 / (2 × pixel size (µm))    (2.4)

This is also known as the Nyquist limit. It can be seen that as pixel sizes get smaller, the corresponding sensor resolution or Nyquist limit in lp/mm increases proportionally. Additionally, the object resolution is calculated using the sensor resolution and the primary magnification (PMAG) of the lens (equation 2.5). It is important to note that the equation for object resolution assumes the lens has no resolution loss. Object resolution can also be converted to µm using (2.6).

Object Resolution (lp/mm) = PMAG × Sensor Resolution (lp/mm)    (2.5)

Object Resolution (µm) = 1000 / (2 × Object Resolution (lp/mm))    (2.6)
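The following short sketch implements equations (2.4)-(2.6); the pixel size and PMAG values are illustrative assumptions, not parameters of a specific camera discussed in this thesis.

```python
# Sketch of equations (2.4)-(2.6).
def sensor_resolution_lp_mm(pixel_size_um: float) -> float:
    """Nyquist-limited sensor resolution, equation (2.4)."""
    return 1000.0 / (2.0 * pixel_size_um)

def object_resolution_lp_mm(pmag: float, sensor_res_lp_mm: float) -> float:
    """Object resolution assuming a lossless lens, equation (2.5)."""
    return pmag * sensor_res_lp_mm

def object_resolution_um(object_res_lp_mm: float) -> float:
    """Convert object resolution from lp/mm to micrometers, equation (2.6)."""
    return 1000.0 / (2.0 * object_res_lp_mm)

sensor_res = sensor_resolution_lp_mm(3.75)          # 3.75 um pixels -> ~133 lp/mm
obj_res = object_resolution_lp_mm(0.1, sensor_res)  # hypothetical PMAG of 0.1
print(sensor_res, obj_res, object_resolution_um(obj_res))
```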


2.1.4.2 Contrast

Contrast can be understood as the relation between low and high luminance in a scene or image. In a black and white bar stripe test target, minimum luminance will be black and maximum luminance will be white. Any value between these two limits will be a mix of black and white (gray) luminance values. A sample plot (Figure 2.9) of these values will produce a square wave with different amplitudes according to the luminance value.

Figure 2.9: Different levels of contrast plotted as a square wave [25].

There are different ways to calculate and measure contrast depending on the target used for testing. In this particular case the Michelson Contrast [26] is used. The Michelson Contrast, also known as visibility, peak-to-peak contrast, or modulation, measures the relation of luminance in periodic patterns where both bright and dark features are equivalent and take up similar fractions of the total area (e.g. sinusoidal gratings). The Michelson Contrast is defined as:

Contrast = (Imax − Imin) / (Imax + Imin)    (2.7)

where Imax and Imin are the maximum and minimum luminance values, respectively. The numerator is the luminance difference, while the denominator is twice the luminance average.
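As a small illustration of equation (2.7), applied to an arbitrary patch of luminance values (example data, not measurements from this work):

```python
import numpy as np

def michelson_contrast(patch: np.ndarray) -> float:
    """Michelson contrast, equation (2.7), of an array of luminance values."""
    i_max, i_min = float(patch.max()), float(patch.min())
    return (i_max - i_min) / (i_max + i_min)

print(michelson_contrast(np.array([30.0, 200.0, 35.0, 190.0])))  # ~0.74
```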

Considering the effect of a lens and the differences between the resulting image and the real object, as shown in Figure 2.7, the contrast plot of the resulting image will look more like a sinusoidal wave due to the black and white transitions in the target not being resolved perfectly (Figure 2.10a). Moreover, as the line-pair frequency is increased, the contrast will decrease, which is seen in the amplitude of the resulting sinusoidal wave (Figure 2.10b). This effect is always present due to the limitations of the imaging lens. For the image to appear well defined, black and white must be clearly separated with a minimal amount of grayscale in the transition.


With the concepts of resolution and contrast properly introduced, the MTF can be seen as the way to integrate these two metrics into a single specification. As the frequency of line pairs on the test target increases, it becomes more difficult for a lens to transfer contrast effectively, and the contrast decreases correspondingly. In other words, the MTF describes the attenuation in contrast as a function of the spatial frequency. Figure 2.10c shows the general shape of an MTF plot, where the modulation or contrast, expressed as a percentage, decreases as the spatial frequency (in lp/mm) increases.

Spatial Frequency Units This section briefly covers the different units that are used for expressing spatial frequency. There are six units commonly used:

- LW/PH = Line widths per picture height
- LP/mm = Line pairs per millimeter
- L/mm = Lines per millimeter
- Cycles/mm = Cycles per millimeter
- Cycles/pixel = Cycles per pixel
- LP/PH = Line pairs per picture height

Table 2.2 shows the conversion operations for spatial frequency units. To convert from the left column units to the top row units, use the operation in the corresponding row/column intersection.

Table 2.2: Spatial frequency unit conversion chart.

              LW/PH      LP/mm     L/mm      Cycles/mm  Cycles/pixel  LP/PH
LW/PH         × 1        / (2×h)   / h       / (2×h)    / (2×v)       / 2
LP/mm         × (2×h)    × 1       × 2       × 1        × p           × h
L/mm          × h        × 0.5     × 1       × 0.5      × (p/2)       × (h/2)
Cycles/mm     × (2×h)    × 1       × 2       × 1        × p           × h
Cycles/pixel  × (2×v)    / p       × (2/p)   / p        × 1           × v
LP/PH         × 2        / h       × (2/h)   / h        / v           × 1

where p is the pixel pitch, h is the picture height, and v is the number of vertical pixels.
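A few of these conversions written out for a hypothetical sensor (the pixel pitch, picture height, and vertical pixel count below are example values consistent with each other, not the parameters of a specific camera):

```python
# Example conversions from Table 2.2. Pixel pitch p and picture height h are in mm.
p = 3.75e-3   # 3.75 um pixel pitch expressed in mm
h = 3.6       # 3.6 mm picture (sensor) height
v = 960       # number of vertical pixels (h / p)

lp_mm = 100.0                   # starting value in line pairs per millimeter
cycles_per_pixel = lp_mm * p    # LP/mm -> Cycles/pixel (x p)
lp_ph = lp_mm * h               # LP/mm -> LP/PH        (x h)
lw_ph = lp_mm * (2 * h)         # LP/mm -> LW/PH        (x [2 x h])
print(cycles_per_pixel, lp_ph, lw_ph)  # 0.375, 360.0, 720.0
```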


Figure 2.10: Contrast comparison in resulting images at low (a) and high (b) frequency, and MTF plot (c) [27].


2.1.4.3 Slanted-Edge MTF Measurement

The ISO 12233 standard (Photography -- Electronic still picture imaging -- Resolution and spatial frequency responses) [28] defines a method for measuring the Modulation Transfer Function (MTF), also known as Spatial Frequency Response (SFR), based on a slanted-edge target. The slanted-edge MTF method was developed by Burns in [29] as a modified version of traditional edge-gradient analysis methods that used scanning microdensitometers for digital image characterization [30]. Burns' method allows the use of actual image data taken from the device, and consists of a slanted, or skewed, edge target (Figure 2.11) in conjunction with data processing algorithms. This method reduces aliasing and is particularly useful for digital still cameras (DSC). It has been widely and successfully applied to MTF measurements of not only digital still cameras, but also film and print scanners, and CRT displays.

Work done by Williams in [31] evaluated a software implementation of the ISO procedure and found it to provide a robust, accurate, and precise tool for quantifying MTF from a digital capture device. Williams reported the tool to be largely insensitive to edge angle variations, capable of delivering accurate MTF estimates for peak SNRs as low as 10:1, and capable of handling ill-behaved frequency morphologies caused by FIR sharpening. Refined practices in the use of this method have also been reported to improve performance by reducing and identifying conditions that can cause measurement errors [32].

Figure 2.11: SFR target with horizontal and vertical (highlighted) slanted edges.


The slanted-edge MTF method consists of three basic operations: acquiring an edge profile from the (image) data, computing the derivative in the direction of the edge, and computing the discrete Fourier transform of this derivative array (Figure 2.12). The edge profile is also referred to as the region of interest (ROI), and it can have different sizes and orientations (horizontal or vertical edge). The result is a data set that represents the MTF. A typical graph of a balanced MTF can be seen in Figure 2.10c.

Figure 2.12: Flowchart of the operations in the edge gradient analysis.
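A minimal sketch of these three operations is shown below. It omits the sub-pixel binning that the slanted edge enables in the full ISO 12233 algorithm, so it only illustrates the edge-gradient idea; roi is a hypothetical grayscale region of interest containing a near-vertical edge.

```python
import numpy as np

def simple_mtf(roi: np.ndarray) -> np.ndarray:
    esf = roi.mean(axis=0)          # 1) edge spread function (edge profile)
    lsf = np.diff(esf)              # 2) derivative across the edge
    mtf = np.abs(np.fft.rfft(lsf))  # 3) magnitude of the DFT of the derivative
    return mtf / mtf[0]             # normalize so that MTF(0) = 1

# Synthetic test: 32 identical rows of a blurred vertical edge
x = np.linspace(-5.0, 5.0, 64)
roi = np.tile(1.0 / (1.0 + np.exp(-2.0 * x)), (32, 1))
print(simple_mtf(roi)[:5])
```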

An MTF graph can have different shapes (Figure 2.13) that indicate characteristics of the imaging system, environment, and conditions [33]. However, for the purposes of direct performance comparison between cameras and imaging systems, a single value that summarizes MTF results has been defined. Burns and Williams in [34] introduced the concept of sampling efficiency, which was later added to the ISO 12233 standard. Sampling efficiency is a measure of limiting resolution, defined as the normalized frequency at which the MTF falls to 10%.

Figure 2.13: Different MTF shapes and their meaning.

The sampling efficiency is used to calculate the resolution efficiency and the effective resolution of an imaging system. The process starts by determining the frequency of the first 10% MTF occurrence for both vertical and horizontal components, clipping at 0.5 cycles/pixel. Each of these values is normalized against the digital limit (Nyquist criterion) of 0.5 cycles/pixel to obtain the sampling efficiency of each direction (equation 2.8).


Sampling Efficiency_{H,V} (%) = Frequency(0.1 MTF)_{H,V} (cycles/pixel) / 0.5 (cycles/pixel)    (2.8)

The resolution efficiency is calculated as the product of the horizontal and vertical sampling efficiencies (equation 2.9).

Resolution Efficiency (%) = Sampling Efficiency_H (%) × Sampling Efficiency_V (%)    (2.9)

Finally, the effective resolution of the imaging system can be found by multiplying the delivered resolution (usually the image sensor resolution) by the resolution efficiency (equation 2.10).

Effective Resolution (Mpx) = Resolution Efficiency (%) × Delivered Resolution (Mpx)    (2.10)
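Equations (2.8)-(2.10) combine into a few lines; the MTF10 frequencies and sensor size below are hypothetical example inputs, not measurements reported in this thesis.

```python
def sampling_efficiency(freq_at_10pct_mtf: float) -> float:
    """Equation (2.8): clip at and normalize against 0.5 cycles/pixel."""
    return min(freq_at_10pct_mtf, 0.5) / 0.5

def effective_resolution_mpx(freq_h: float, freq_v: float, delivered_mpx: float) -> float:
    """Equations (2.9) and (2.10)."""
    resolution_efficiency = sampling_efficiency(freq_h) * sampling_efficiency(freq_v)
    return resolution_efficiency * delivered_mpx

# Example: MTF falls to 10% at 0.35 (H) and 0.30 (V) cycles/pixel on a 1.2 Mpx sensor
print(effective_resolution_mpx(0.35, 0.30, 1.2))  # 0.504 Mpx
```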

2.1.5 USB Camera Modules

USB camera board modules are based on an image sensor, a processor that controls and reads the sensor, and an interchangeable lens. The onboard processor has a USB interface to communicate with an external computer using the USB Video Class (UVC) standard driver. These types of cameras can be compared to webcams in terms of components, settings, and interface. Given the relatively small number of components, USB camera board modules can achieve a small form factor, making them a lightweight alternative to higher-end professional cameras.

The segment of board-level cameras that is of interest in this project carries image sensors with a ¼"-½" optical format and S-Mount (M12) lenses. These lenses are small and available with a wide variety of characteristics (focal length, resolution, etc.). Interchangeable lenses increase the overall flexibility of the camera system.

2.1.5.1 USB Video Class (UVC)

The USB Video Class (UVC) is a standard that groups certain devices into a special class of USB device. UVC encompasses devices capable of streaming video, such as webcams, digital camcorders, transcoders, analog video converters and still-image cameras. The advantage of UVC is that it provides native "universal" and cross-platform drivers to interface with devices without depending on manufacturer-specific drivers.


2.1.5.2 Video for Linux Two (V4L2) API

The Video for Linux Two (V4L2) API is a kernel interface that includes a collection of device drivers and an API for analog radio and video capture on Linux systems [35]. V4L2 was developed by Bill Dirks as an improved version of its predecessor, Video for Linux (V4L). V4L2 provides higher flexibility, a new structure, and the possibility of interfacing with multiple applications/processes/devices at the same time [36].

V4L2 is a high-level driver that interfaces with the lower-level UVC driver, therefore acting as a middle layer between applications and the UVC driver.

2.1.5.3 GStreamer

GStreamer1 [37] is an open source, cross-platform, pipeline-based multimedia framework capable of linking multiple media processing systems to complete complex workflows. GStreamer provides a lot of flexibility for constructing related series of multimedia elements arranged in parallel flows called pipelines. These pipelines are assembled with automatic negotiation between their elements, making the construction of the workflow easy and robust.

Pipelines in GStreamer consist of a set of connected elements that can be divided into three major groups: data producers (sources), processing steps (e.g. formats, filters, codecs), and data consumers (sinks). The framework is based on plugins that provide codecs and functionalities.

GStreamer is written in the C programming language, but bindings are available for various languages such as Python, Vala, C++, Perl, GNU Guile, C# and Ruby. It also provides a command line tool option.

1https://gstreamer.freedesktop.org/
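As a minimal sketch of how such a pipeline can be driven from Python (an illustrative example, not the capture code developed in this thesis; the device path /dev/video0 and the requested resolution are assumptions):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# v4l2src -> caps -> videoconvert -> jpegenc -> filesink:
# a source, two processing steps, and a sink, joined by automatic negotiation.
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 num-buffers=1 ! "
    "video/x-raw,width=1280,height=960 ! "
    "videoconvert ! jpegenc ! filesink location=frame.jpg"
)
pipeline.set_state(Gst.State.PLAYING)

# Block until the pipeline reports end-of-stream or an error, then clean up.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```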


Figure 2.14: Diagram of GStreamer components [38].

2.2 Spectral Imaging

Spectral imaging is a common technique in the field of spectroscopy that studies the amount of light that is emitted by or reflected from materials and how it varies with wavelength. The measurements obtained using this technique produce simultaneous image data in multiple bands.

Two major categories within this technique are hyperspectral imaging and multispectral imaging. The main difference between the two is the number of bands and how narrow the bands are. Hyperspectral imaging consists of hundreds or thousands of narrow (10-20 nm), adjacent spectral bands. This comprehensive image data set makes it possible to derive a continuous spectrum for each image cell (Figure 2.15) that can be used to identify materials.

Figure 2.15: Illustration of hyperspectral imaging. Each pixel in the image forms a spectrum through the imaged wavebands.


Conversely, multispectral imaging uses a smaller number of bands, usually 3 to 10, that are not necessarily adjacent and tend to be wider (30-40 nm). These bands are usually selected for a specific application by knowing the type of materials that will be studied and their spectral response.

Figure 2.16 shows a graphical comparison between hyperspectral and multispectral imaging examples.

Figure 2.16: Comparison of band classifications in multispectral (few wide bands) and hyperspectral (severalnarrow bands) models. Units are in micrometers for the multispectral ranges.

2.2.1 Vegetation, Soil and Water Reflectance

Three of the major natural materials of interest when looking at landscapes are vegetation, soil and water. Figure 2.17 shows the reflectance spectrum of each of these materials. It can be seen that the three reflectance curves are fairly close in the visible spectrum (390-700 nm). Therefore, differentiation of these materials must be done at longer wavelengths.

Figure 2.17: Reflectance spectrum (%) of water, dry soil and green vegetation over the 0.5-2.5 µm wavelength range.


Generally, water only reflects light in the visible range of the spectrum, making it very distinct from soil and vegetation in the infrared region. The reflectance of soil increases from the visible range to the infrared range. Different types of soil have slightly different reflectance characteristics depending on their composition and their content of water and minerals; the graph above shows the average curve for bare ground soil. Finally, the spectral reflectance of green vegetation is very characteristic. Visible light is strongly absorbed by plants, especially blue and red light, as they are used during photosynthesis. The curve increases abruptly in the red-edge region, and infrared light is then strongly reflected.

2.2.2 Landsat Thematic Mapper

The Thematic Mapper (TM) [1] is an advanced multispectral scanning sensor that is part of the NASA Landsat program. It is designed for higher image resolution, sharper spectral separation, improved geometric fidelity and greater radiometric accuracy and resolution. It features a total of 7 spectral bands, 3 in the visible spectrum and 4 in the infrared spectrum. Band 6 senses thermal (heat) infrared radiation in the 10.41-12.5 µm range [2]. Figure 2.18 shows the graph from Figure 2.17 with the addition of the Landsat TM bands up to 2.5 µm (band 6 is not shown in the graph).

Figure 2.18: Reflectance spectrum of water, soil and vegetation together with the Landsat TM bands in the 0-2.5 µm range.

TM is a whisk broom scanner which takes multispectral images across its ground track. Most bands have a 30 m spatial resolution, with the exception of band 6, which has a 120 m spatial resolution. Table 2.3 shows the wavelength ranges and a brief description of the Landsat TM bands.


Table 2.3: Landsat Thematic Mapper Bands for Vegetation Monitoring.

Band   Wavelength Range (nm)   Description
TM1    450 - 520       [blue-green] Maximum chlorophyll absorption. Used for monitoring aquatic ecosystems. Noisy, susceptible to scatter.
TM2    520 - 600       [green] Green vegetation reflection peak. Slight sensitivity to plant chlorophyll concentration.
TM3    630 - 690       [red] Strong chlorophyll absorption. Useful for distinguishing between vegetation and soil and in monitoring vegetation health.
TM4    760 - 900       [NIR] Strong water absorption. Bright reflectance for soil and vegetation. Good for defining the water/land interface.
TM5    1550 - 1750     [short-IR] Very sensitive to moisture. Used to monitor vegetation and soil moisture.
TM7    2080 - 2350     [mid-IR] Also used for vegetation moisture, although band 5 is generally preferred for that application.
TM6    10410 - 12500   [long-IR] Thermal (heat) infrared radiation.

2.2.3 Lightweight UAV-Compatible Multispectral Imaging Systems

With the increasing number of applications using multispectral remote sensing, a number of multispectral imaging systems have been developed and introduced to the market. This section presents a brief overview of some popular commercially available multispectral imaging systems.

The Tetracam ADC Micro [39] is a single-sensor camera with an Aptina CMOS rolling shutter sensor. The camera captures the red, green and near-infrared (NIR) bands using a single wide bandpass filter that covers the 520-920 nm range. The camera has fixed optical parameters: an 8.43 mm focal length and an optical aperture of f/3.2. Tetracam also offers the Mini MCA4 [40], which is a camera array system. It features four cameras, 1.2 megapixels each, covering the blue, green, red, and NIR bands. There are versions of this system with 6 and 12 cameras.

The Sentek Systems GEMS Multispectral Sensor [14] is a dual-camera system. It features an RGB camera and a NIR camera, both with 1.2-megapixel global shutter sensors. Both cameras have a fixed focal length of 7.7 mm. This system can be acquired as an OEM PCB module or as a final product with an enclosure.

The Sentera QUAD [15] is a compact multispectral imaging system with four 1.2-megapixel global shutter cameras. It features one RGB camera and three monochromatic single-band cameras covering the red, red-edge, and NIR bands. This unit integrates all four sensors into a small package.


Figure 2.19: (a) Tetracam Mini MCA6. (b) Sentek GEMS. (c) Sentera QUAD. (d) Parrot Sequoia.

Finally, the Parrot Sequoia [16] is a very popular and small unit, somewhat similar to the Sentera QUAD. It has four 1.2-megapixel global shutter single-band sensors covering the green, red, red-edge, and NIR bands, in addition to a 16-megapixel rolling shutter RGB camera.

Table 2.4 presents a detailed comparison of the features of each camera. At this point, it is important to introduce a term that is used to compare the level of detail present in an image. In remote sensing, the ground sampling distance (GSD) of an image taken from the air or space is the distance between the centers of two adjacent pixels measured on the ground. This measure gives a sense of how much distance and area are covered by each pixel in the image; a lower GSD corresponds to higher spatial resolution. The GSD values presented in the table below correspond to a working distance, or altitude above ground level (AGL), of 400 ft (~122 m).


Table 2.4: Comparison of common multispectral imaging systems.

Camera               Tetracam       Tetracam        Sentek        Sentera       Parrot
                     ADC Micro      Mini-MCA4       GEMS          QUAD          Sequoia
Dimensions (mm)      75 x 59 x 33   131 x 78 x 87   127 x 89      76 x 62 x 48  59 x 41 x 28
Weight (g)           90             600             320           170           72
Number of Bands      4              4               4             4             4
Focal Length (mm)    8.43           9.6             7.7           5.0           3.98
FOV (H° x V°)        37.6 x 28.7    38.2 x 30.9     34.6 x 26.3   51.2 x 39.6   61.9 x 48.5
Sensor size (mm)     6.55 x 4.92    6.66 x 5.32     4.8 x 3.6     4.8 x 3.6     4.8 x 3.6
Resolution (pixels)  2048 x 1536    1280 x 1024     1280 x 960    1280 x 960    1280 x 960
Pixel size (µm)      3.12           5.2             3.75          3.75          3.75
GSD (cm) @ 120 m     4.6            6.6             5.1           9.1           11.0
Price (USD)          2,995.00       9,995.00        3,500.00      4,599.00      3,500.00
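For a nadir-pointing camera, GSD values such as those in Table 2.4 can be estimated with the common approximation GSD = pixel size × altitude / focal length (a standard rule of thumb, not a formula taken from the manufacturers' documentation):

```python
def gsd_cm(pixel_size_um: float, focal_length_mm: float, altitude_m: float) -> float:
    """Approximate ground sampling distance in cm for a nadir view."""
    pixel_size_m = pixel_size_um * 1e-6
    focal_length_m = focal_length_mm * 1e-3
    return (pixel_size_m * altitude_m / focal_length_m) * 100.0  # m -> cm

# Tetracam Mini-MCA4 parameters from Table 2.4 at ~122 m AGL
print(round(gsd_cm(5.2, 9.6, 122.0), 1))  # ~6.6 cm, matching the table
```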

Figure 2.20 shows a comparison of the bands used in each camera. It is important to note that the Tetracam Mini MCA and the Sentera QUAD offer the possibility of changing the optical filters upon request when ordering. Nonetheless, the comparison is done based on the default configuration of the cameras.

Figure 2.20: Comparison of spectral bands used in each camera.


2.3 UAV-Based Spectral Remote Sensing

Remote sensing is the science of collecting data about an object or phenomenon from a distance, without making physical contact with the object. In the spectral scope, remote sensing refers to the collection of data by detecting the energy that is reflected from the Earth's surface, with instruments and sensors usually mounted on large-scale satellites and aircraft.

A major area in remote sensing is the field of spectroscopy, in both hyperspectral and multispectral remote sensing, which measures the energy reflectance and absorption properties of vegetation, soils, rocks, and water bodies in the natural environment, generally under solar illumination. As mentioned before, the "traditional" approach has been to use large-scale satellites and aircraft to collect data. However, the increasing use of UAV systems, together with the miniaturization and cost reduction of spectroscopy electronics, has opened the door to UAV-mounted spectrometers, which are currently being used to quickly and easily obtain spectral data of a target area.

Spectral data obtained with lightweight and compact spectrometers mounted on UAVs can achieve a close match and a high correlation with the data obtained from similar ground-level equipment [41, 8, 42]. While ground-level measurements can generally achieve higher sensitivity and accuracy, UAV-based measurements have shown a lower standard deviation due to the larger field of view [41]. Therefore, it is possible to obtain reliable and accurate spectral data from UAV-mounted spectrometers without considerable losses, making this technique a faster and lower-cost alternative to "traditional" methods of spectral data collection. The following sub-sections cover some major applications in spectral measurements and remote sensing, with a larger focus on UAV-based hyperspectral and multispectral imaging systems.

2.3.1 Crop Inspection and Precision Agriculture

A critical element in agriculture is the inspection of crops to determine their health and quality. This information is crucial for farmers as a way to maximize crop growth, increase yields and reduce crop damage. Precision agriculture is the use of measured characteristics of soil, and the factors that affect it, in farming and the management of crop fields. In this regard, the most important parameters related to crop growth are nutrient status, weed infestation, disease affectation and water management [43].

Healthy vegetation, and more specifically its leaves, exhibits optical characteristics such as (a) low spectral reflectance in visible light (from 400 to 700 nm) due to strong absorption of these wavelengths by chlorophyll (a pigment found in plant leaves); (b) high spectral reflectance in near-infrared light (from 700 to 1100 nm) due to multiple scattering at the air-cell interfaces in the leaf internal tissue [44]; and (c) low spectral reflectance in the short-wave infrared (SWIR) band due to strong absorption by leaf constituents such as water, proteins, and carbon components [45].


For remote sensing purposes, spectral images can be used to extract important information on vegetation parameters, such as canopy density, height, and clumping [46], nitrogen content [47], leaf area index [48], and foliar water content, and on soil properties, such as soil surface roughness [49] and soil moisture content [50].

In evaluating and interpreting remote sensing measurements of vegetation, several indices are used as graphical indicators of the existence of live green vegetation. The first one is the Normalized Difference Vegetation Index (NDVI), calculated by

NDVI = (NIR − R) / (NIR + R)    (2.11)

where R and NIR are the spectral reflectance measurements acquired in the visible (red) and near-infrared regions, respectively. Other indices like the Green Normalized Difference Vegetation Index (GNDVI) and the Normalized Difference Red Edge Index (NDRE) are also used for the same purpose while providing different levels of information. In the GNDVI (2.12), the red part of the standard NDVI is replaced with a very specific band of light in the green range (G).

GNDVI = (NIR − G) / (NIR + G)    (2.12)

Similarly, in the NDRE (2.13), the red part of the standard NDVI is replaced with a red edge (RE) spectrum centered around 715 nm.

NDRE = (NIR − RE) / (NIR + RE)    (2.13)
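These three indices reduce to the same normalized-difference operation applied pixel-wise to co-registered band images; a short sketch with hypothetical reflectance arrays:

```python
import numpy as np

def normalized_difference(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pixel-wise (a - b) / (a + b), guarding against division by zero."""
    denom = a + b
    return np.divide(a - b, denom, out=np.zeros_like(denom), where=denom != 0)

def ndvi(nir, red):       return normalized_difference(nir, red)       # (2.11)
def gndvi(nir, green):    return normalized_difference(nir, green)     # (2.12)
def ndre(nir, red_edge):  return normalized_difference(nir, red_edge)  # (2.13)

nir = np.array([[0.45, 0.50], [0.40, 0.05]])
red = np.array([[0.08, 0.06], [0.10, 0.04]])
print(ndvi(nir, red))  # vegetated pixels approach +1, bare pixels stay low
```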

There is a large number of additional indices that have been designed to extract specific information. It is important to note that crop spectral measurements, and therefore vegetation indices, in field conditions are affected by interference factors such as mixture of vegetation, soil brightness, environmental effects, shadow, soil color and moisture [51].

Hyperspectral imaging systems have been successfully used in the past to study crops using vegetation indices (VI). Part of the experiments by Nebiker et al. [8] included plant vitality assessment using a high-end spectrometer and a low-cost consumer-grade NIR camera mounted on a UAV. In this work, a first study found correlations between rape crop yield and NDVI values of 0.78 for treated plots and 0.35 for untreated plots. Moreover, vitality differences among all the crops were identified and it was found that the use of fungicides prolonged plant activity up to harvest time. A second part of the vitality assessment study was done using different species of barley crops. In this study, correlations varied considerably depending on the index used (NDVI, GNDVI and NDRE) and the barley species. Some species showed a low correlation with one or even two of the three indices, while other species showed a high correlation with all three indices. These results indicate that indices and species under study must be correctly matched in order to get the most accurate measurement.


Vegetation indices extracted from multispectral and hyperspectral images can be used with a high level of accuracy in estimating vegetation vitality and density [10], and by performing successive measurements over a period of time it is possible to estimate the growth rate of crops [52, 9].

Previous works have also demonstrated that it is possible to measure the level of nitrogen in crops using hyperspectral imaging systems. Most of the nitrogen in plant leaves is found in chlorophyll molecules; therefore, there is a strong relationship between leaf nitrogen content and leaf chlorophyll content [53]. An increase in visual reflectance of plants is related to the decrease in chlorophyll content as a result of low nitrogen supply, whereas an increase in near-infrared (NIR) reflectance is associated with an increase in leaf area and green biomass. Therefore, by measuring leaf reflectance it is possible to predict crop nitrogen levels and thus estimate the level of vegetation, health and crop growth.

In experiments done by Zheng et al. [54] and Agüera et al. [43], UAV-mounted spectrometers were used to collect data from rice and sunflower crop fields, finding strong relationships between vegetation index values and the levels of nitrogen in the crop fields. Pölönen et al. [55] were also able to estimate biomass and nitrogen content of wheat and barley fields using a UAV-mounted hyperspectral camera. In this study, four vegetation indices were tested, with FVA (linear) giving the best correlations of 0.814 and 0.72 for biomass and nitrogen content, respectively.

It is also possible to use hyperspectral images and vegetation indices to analyze, detect and track plant stress levels. One of the most common sources of plant stress is diseases and infestations. At early stages, plants react to the presence of a pathogen with physiological mechanisms such as a decrease in the photosynthesis rate, which leads to an increase of fluorescence and heat emission [44]. At a later stage, pathogens cause a reduction of chlorophyll content in leaves due to necrotic or chlorotic lesions [56]. A decrease in chlorophyll increases reflectance in the visible light band and causes a shift of the red-edge position in the spectrum. At the canopy scale, infections can change canopy density and leaf area. While these effects do not provide an unambiguous indication of a specific stress, they can provide information for early detection and anticipation of abnormal conditions.

Nebiker et al. [8] captured spectral images of an onion crop field using a UAV on two different dates and found a substantial difference in NDVI values that was used to identify an infestation of thrips. Similarly, an infestation of blight was also detected by analyzing the changes in NDVI values from two flights over a field of potato crops. Näsi et al. [57] developed a processing approach based on UAV-based hyperspectral imagery that was used to identify spruce trees suffering from an infestation of bark beetle, and achieved an accuracy of 90% in separating healthy and dead trees. Similarly, Garcia-Ruiz et al. [58] achieved an accuracy of 85% in the detection and classification of Huanglongbing-infected citrus trees using high resolution hyperspectral images taken from a UAV. It was also found that early stage branch-level disease could be detected by performing low altitude flights.


Another commonly found form of plant stress is due to lack of water, which directly affects photosynthesis and chlorophyll content in leaves [59, 60], and therefore affects the spectral reflectance signature of plants. Zarco-Tejada et al. [61] were able to track stress levels in a citrus crop using thermal and hyperspectral imagery obtained with a UAV, identifying areas where water was deficient.

The successful deployment of UAV-mounted hyperspectral cameras and the extraction of accurate information for the assessment of plant vitality and growth and the detection of plant stress conditions make it possible to use these systems for advanced precision agriculture and remote sensing of vegetation and crops.

2.3.2 Surface and Soil Study

Hyperspectral images taken from UAVs have also been successfully used for surface and terrain studies, detecting soil and vegetation marks, and surface modeling. Themistocleous et al. [11] combined UAV and ground hyperspectral data to study the formation of crop and soil marks for archaeological applications, while Honkavaara et al. [62] were able to measure the 3-D surface model and surface moisture of a peat production area using lightweight hyperspectral frame format cameras from a UAV. Turner et al. [12] also used a multispectral camera together with a visible camera and a thermal infrared camera to study the physiological state of Antarctic moss ecosystems.

Dry soil presents a higher albedo due to the high reflectance of its constituents. Moist soil will exhibit a low albedo due to water absorbing light. Bryant et al. [63] evaluated hyperspectral measurements for monitoring surface soil moisture, and found a strong correlation between reflectance and soil moisture within 3 cm of depth. Similarly, Finn et al. [64] compared soil moisture measurements between airborne hyperspectral data and equivalent ground readings. A strong correlation (R² greater than 0.7) was found in the 1000-1600 nm band for a 5.08 cm (2 inch) depth. Multispectral tests from a UAV were reported by Hassan-Esfahani et al. [13] in the successful estimation of surface soil moisture over a large field irrigated by a center pivot sprinkler system. The multispectral data, combined with an artificial neural network developed to quantify the effectiveness of the method, reported a high correlation (R²) of 0.77.

2.3.3 Surface Hydrocarbon Detection

The use of hyperspectral and multispectral imaging for the detection and identification of hydrocarbons on the surface can be a considerable improvement for environmental and industry monitoring. The petroleum industry alone is responsible for a large number of leaks and spills that commonly occur during transportation, storage, and manipulation of not just petroleum, but also its related products. Therefore, it is of great interest to be able to detect and prevent these events. Allen and Krekeler [65] performed a series of experiments with light, intermediate and heavy crude, and also some common refined products (E85 fuel, gasoline, diesel, and motor oil), over a range of substrates that include concrete, asphalt, gravel, grass, sands, and clays. They were able to detect these hydrocarbons and distinguish them from each other and from water using an ASD FieldSpec Pro-FR2 spectrometer. Airborne hyperspectral data has also been collected and used for the detection of hydrocarbons. Hörig et al. [66] flew a HyMap hyperspectral sensor and detected hydrocarbons on the ground surface. It was shown that hydrocarbons are characterized by typical absorption maxima at 1730 nm and 2310 nm.

Analysis of data collected using the Airborne Visible Infrared Imaging Spectrometer (AVIRIS) during the Deepwater Horizon spill provided a quantitative way of estimating oil thickness and oil-water ratio [67]. Oil thickness and emulsion ratio maps were generated by Clark et al. [68] using the oil absorption bands, which were later used to estimate oil volume.

Work has also been done in detecting and estimating the bitumen content of oil sands using infrared spectra. Rivard et al. [69] demonstrated that hydrocarbons in Athabasca oil sands have distinct features in the infrared spectral region. In another experiment, they were able to estimate the bitumen content of oil sand ore samples using hyperspectral reflectance spectra in the 300-2500 nm range [70].

2.3.4 Environmental Monitoring

The general study of environmental phenomena and the effect of human activity on the environment can also be assessed using spectral imagery techniques. Hausamann et al. [71] performed studies oriented towards the detection of gas leaks from pipeline installations using UAV spectral remote sensors. This type of monitoring is of particular importance in order to detect problems early and prevent negative environmental impacts. Tuominen et al. [72] reported using hyperspectral remote sensing techniques for monitoring nuclear fuel repository sites in search of environmental changes.

Experiments have also been performed using UAV spectral measurements in volcanology. McGonigle et al. [73] used a UAV-mounted spectrometer to study a volcano in Italy and were able to successfully measure the carbon dioxide flux in the volcanic gas.

2.3.5 Archaeology

Spectral imaging is also an important data collection method used in archaeology. Studies have been done on ways to extract archaeological information using vegetation indices and crop marks [74, 75]. Furthermore, the increasing use of UAVs for archaeological remote sensing provides a documentation, monitoring, and visualization tool that is easy, quick, and comes at a low cost. Cowley et al. [76] put together a comprehensive list of UAV-based surveying methods for archaeological studies and highlighted the usefulness and beneficial impacts of this technology in the field. A case study in this publication was done by Moriarty [77], in which a UAV-based multispectral camera was used for identifying and monitoring crop marks matched with archaeological regions of interest. The tracking of changes in the crop marks allowed the classification of all the regions across the area of study. Similarly, Themistocleous et al. [11] acquired spectral data from a UAV and were able to identify the formation of crop and soil marks tied to archaeological remains. With this method, it was possible to attain a greater level of accuracy in delimiting the areas of interest.


Chapter 3

UAV Development

This chapter presents the development of a fixed-wing UAV as part of the setup stage for testing UAV payloads. The intention was to develop a reliable UAV that could serve as a testing platform for long-term research applications. While off-the-shelf airframes provide a certain level of flexibility, in most cases it is not enough and modifications or adaptations are usually needed. The decision to redesign an existing off-the-shelf UAV platform comes from the need for a high level of modularity, so that the platform is flexible enough to be used in different applications. The goal was to study, modify, and improve the fixed-wing UAV platform used by the University of Alberta Aerial Robotics Group (UAARG). Making this the starting point helped leverage existing knowledge and experience, thus reducing the development time. The airframe is complemented with a customized UAV system board that integrates all essential UAV navigation and control functions, together with a redundant power supply board that provides main and backup power regulation.

The following sections describe the design and development of the airframe of a fixed-wing UAV and a fully integrated UAV system board with redundant power supply. The UAV platform described in this chapter, both airframe and electronics, is used for testing the payload systems developed in the subsequent chapters, and is part of the complete UAV solution presented in this thesis.

3.1 UAV Electronic System

A fundamental feature of many UAVs is the autonomous navigation system, which is based on an onboard autopilot computer that controls and guides the UAV in flight without assistance from human operators. Increasing efforts have been made in both hardware and software in order to produce reliable and robust autopilot platform solutions. Currently, there is a large number of hardware development boards available that can be used with a wide range of fixed-wing and rotary-wing airframes. Besides a hardware platform, these solutions bring a full software package for tasks like programming, calibration, tuning, simulation, and monitoring. Chao et al. [78] surveyed the autopilot platforms existing at the time, comparing the main features of both proprietary and open-source solutions. More recently, Lim et al. [79] and Ebeid et al. [80] made extensive comparisons of available open-source autopilot hardware and software platforms. These complete solutions can make the task of building and flying a UAV relatively easy and fast.

Some hardware platforms are designed to be a minimal solution, relying on additional dedicated boards or modules that are wired together, giving some flexibility to the user for assembling a customized final system [81]. Other approaches are focused on full integration of components into a single board, aiming for simplicity and reduction of wiring [82, 83]. A minimal autopilot system includes an onboard processor and attitude sensors (accelerometer, gyroscope and magnetometer). Additional standard components that are usually needed (depending on the airframe and application) and connected to the autopilot base board are: motor(s), servo(s), a Global Positioning System (GPS) unit, a pressure sensor, an RF telemetry transceiver, and a remote control (RC) receiver for manual flight control.

Another feature that is not commonly found in a UAV's electronic system is a redundant power supply that allows the UAV to recover from a main power loss. This feature has been explored in some concept development designs [83] and is found in some high-end proprietary solutions [84].

This section presents a comprehensive autopilot system board for UAVs [85] that integrates all essential sensors and components. The features and parts of the board are described in detail, emphasizing the high level of integration that simplifies the assembly, reduces wiring and weight, and increases reliability.

A secondary, fully compatible, detachable, and novel board that provides power regulation and redundancy is also presented. The architecture of this module is presented in detail, highlighting the parallel power supply approach used, which results in minimal power loss and instant switching from the main to the backup power supply.

3.1.1 UAV System Board

The first module is the system board, which incorporates all common UAV electronics, including basic sensors, the Microcontroller Unit (MCU) and communication ports. Additionally, the board features an on-board data logger with a microSD card port, gates to enable/disable Pulse Width Modulation (PWM) control outputs (typically wired to an enable switch), and an expansion port that gives access to most MCU pins. Important features are described in more detail in the following sections. A detailed user guide for this board can be found in Appendix A.

The UAV system board was designed to be fully compatible with the existing Lisa MX v2.1 class autopilot board designed by 1BitSquared, which is similar to the Lisa M autopilot board [86] with an upgraded MCU. These boards are supported by the Paparazzi-UAV project [87, 86, 88], which is an open-source ground and airborne software architecture UAS platform. This compatibility avoids any need to develop custom software and gives the possibility of reusing the existing firmware files designed for the Lisa MX board. The Lisa MX was selected mainly because of its powerful MCU. Figure 3.1 shows the block diagram of the autopilot board hardware.

Figure 3.1: Block diagram of the UAV system board hardware.

3.1.1.1 Microcontroller Unit (MCU)

The UAV system board is built around the STMicroelectronics STM32F405VGT6, a 32-bit ARM® Cortex® M4 MCU running at 168 MHz with 192 KB of RAM, 1024 KB of Flash, and a Floating Point Unit (FPU). The MCU includes 82 I/O pins, 10 timers, and CAN, I2C, I2S, SPI and UART communication interfaces in an LQFP-100 package.

3.1.1.2 Inertial Measurement Unit (IMU)

The 10 Degrees-of-Freedom (DOF) or 10 Degrees-of-Measurement (DOM) Inertial Measurement Unit (IMU) comprises an integrated 3-axis accelerometer and gyroscope (Invensense MPU6000), a 3-axis magnetometer (Honeywell HMC5883L) and a barometer (Measurement Specialties MS5611-01BA03). The accelerometer/gyroscope and barometer are connected to the MCU via an SPI interface, and the magnetometer is connected to the accelerometer/gyroscope via an I2C interface. The accelerometer/gyroscope acts as a master to the magnetometer, pulling the data and saving it locally, which allows the MCU to extract data from these three sensors directly from the MPU6000 via SPI.


3.1.1.3 GPS

The system includes a uBlox MAX-7Q GPS module with a super-capacitor for hot start and the corresponding antenna matching circuitry. The GPS works in the 1575.42 MHz band, and the antenna circuitry includes a SAW filter and an optional Low Noise Amplifier (LNA) that can be bypassed if used with an active antenna. The board supports antennas with either uFL or SMA connectors. The connectors can be selected using zero-ohm jumper resistors. When power is off, the super-capacitor is capable of keeping the GPS module in backup battery mode for approximately 20 minutes to allow for a hot start [81]. The GPS is connected to the MCU via a UART communication interface. Optionally, when an external GPS unit is used, the on-board GPS can be powered off and disconnected from the MCU by means of on-board jumpers. The external GPS can then be powered and connected to the MCU UART port using the corresponding connector.

3.1.1.4 XBee and Data-logger

A socket is included for connecting an XBee RF transceiver module for telemetry. The XBee is connected to the MCU via a UART communication interface. A secondary MCU based on an 8-bit 16 MHz Atmel AVR ATMEGA328P logs the telemetry data from the serial port to a micro-SD card. This data-logger uses a modified version of the open-source "OpenLog" firmware that was adapted for Paparazzi by the Paparazzi team. The data-logger starts running and saving data automatically after power up. The board includes a jumper for powering the data-logger on/off.

3.1.1.5 PWM Outputs

There are eight PWM outputs that can be used to control motors and servos. Output signals run through gates, and each signal can be enabled/disabled by connecting a switch or a jumper to one of two user control inputs that can be accessed from an on-board connector. This functionality provides an easy way to have kill-switches for turning off PWM signals to the motors and servos independently. This feature is useful for debugging, programming and tuning output signals while keeping the motors off. It also serves as a safety feature that can prevent human injury, damage to servos while programming, accidental motor spinning and fly-away incidents. PWM outputs can be assigned to either the motor or the servo group by using solder jumpers (one for each output).

3.1.1.6 Power Regulation

The on-board power supply is split into two 3.3 V Low Drop-Out (LDO) linear voltage regulators with a +5 V input. The first voltage regulator (Texas Instruments LP2992AIM5-3.3) powers the accelerometer, gyroscope, magnetometer and barometer sensors, so as to reduce noise in the measurements, while the second voltage regulator (Texas Instruments LM3940IMPX-3.3) powers the rest of the board. The latter is capable of delivering up to 1 A.

3.1.1.7 Additional Features

Finally, the UAV system board has additional I2C, SPI, CAN, and UART ports for external communications, analog inputs, GPIOs, a bind button, status LEDs, and connections for two radio control (RC) receivers. Some communication ports have dedicated connectors, and most MCU pins are accessible through a 48-pin expansion header. The expansion header provides the opportunity for the design and use of daughter boards and "shields", increasing flexibility and expanding functionality. The features of the UAV system board are summarized in Table 3.1.

Figure 3.2: UAV system board. 90 x 52 mm 4-layer PCB with a weight of 29 grams. A detailed manual and user guide for the board can be found in Appendix A.


Table 3.1: UAV System Board Features

Characteristic    Specifications
Microcontroller   STM32F405VGT6
10 DOF IMU        3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, barometer
Remote Control    Two UART ports for remote control receivers
GPS               One uBlox MAX-7Q GPS unit. The GPS on-board circuitry includes: SAW filter, optional LNA, selectable uFL or SMA connector for the GPS antenna
Connectivity      8 PWM, 1 JTAG, 1 SPI, 2 I2C, 1 CAN, 3 GPIO, 3 ADC, 1 UART
Data-logger       Secondary MCU and micro-SD card socket for telemetry data-logging
Power             3.3 V @ 1 A voltage regulator
User Interface    Bind button, eight status LEDs
Size and weight   90 x 52 mm, 29 grams, 4-layer PCB

3.1.2 Redundant Power Supply Board

The second module is a redundant power supply board comprised of main and backup voltage regulators, shown in Figure 3.3. This module was designed as a secondary and optional board that provides main power for regular operation and backup power for emergency operation. The main and backup supplies run on separate battery packs. The purpose of the backup power supply is to provide power to the UAV system and servos in case the main battery or main power supply fails, so that a controlled emergency landing of a fixed-wing UAV can be performed. If the backup power is not required, the board can easily be replaced by any 5 V power supply with enough current to power the UAV system board.


Figure 3.3: Redundant power supply module. 41 x 46 mm 4-layer PCB with a weight of 14 grams. There are four voltage regulators, from bottom left and clockwise: autopilot main, payload, servo backup, and autopilot backup.

Each voltage regulator is based on the Texas Instruments LM43603PWPT synchronous step-down DC-DC buck converter, which is capable of delivering up to 3 A from an input voltage of up to 36 V, suitable for up to an 8-cell series lithium polymer battery. Furthermore, each voltage regulator has over-temperature thermal shutdown protection, output short circuit protection, and an adjustable switching frequency. There is also a low battery voltage protection feature that shuts down the regulator if the input voltage goes under a certain threshold, thus protecting the battery from over-discharge damage. This threshold is adjustable by means of an on-board voltage divider. The output voltage is also variable and can be adjusted by replacing a single resistor. This design uses backup voltage regulators placed in parallel with the main voltage supply, taking advantage of the high impedance of the output of the LM43603PWPT when it is off. A small difference in regulator set point (e.g. 100 mV) keeps the backup power supply in standby, as described below.

Figure 3.4 shows the connection diagram of the main and backup power system with the rest of the airplane electronics in the two possible operation modes. Under normal power operation (Figure 3.4a), only the main battery is used, all systems are on, and the two backup regulators (autopilot and servo) are inactive.


Figure 3.4: Redundant power supply module block diagram. Greyed blocks and dashed lines are inactive. (a) Normal power operation mode: all systems are on and powered from the main battery. (b) Backup power operation mode: autopilot and servos are powered from the backup battery; motor, payload and Electronic Speed Controller (ESC) / Battery Eliminator Circuit (BEC) are off.

In the event of a main battery or main power supply failure, the system enters backup power operation (Figure 3.4b), where the backup battery is used. The system board and servos are powered by the backup regulators. The payload, ESC and motor(s) remain off in this mode. When running on backup power for the system board (including telemetry radio and RC radio receiver) and servos, an emergency glide landing of a fixed-wing UAV can be performed without motor power. In the case of a main power supply failure in a rotary aircraft, telemetry, including situational awareness and GPS location for recovery, would be maintained.

3.1.2.1 Principle of Operation

The operation of the redundant power supply is based on the use of two voltage regulators connected in parallel (Figure 3.5), with the main regulator having a slightly higher voltage (VMAIN) than the backup regulator (VBACKUP), making the latter stay inactive while the former sources all the current. In the case when the main regulator shuts down, its output goes to high impedance and the backup regulator starts to source the current. VOUT will always be the higher of the two supplies' voltages.

When both voltage regulators are on (normal operation mode), meaning the backup regulator is inactive due to the main regulator having a higher output voltage, there is a current flowing from the main regulator into the backup regulator. There is also a current drawn from the backup battery. Let us define these two currents as iS, the current sunk by the backup voltage regulator while inactive, and iB, the current drawn by the inactive backup voltage regulator from the backup battery (battery drain current). It is desired for these two currents to be as small as possible so that neither the main nor the backup battery capacity is affected in a considerable way.


Figure 3.5: Main and backup power supplies connection diagram. Both power supplies are connected in parallel and there is a voltage difference between the two, where VMAIN is slightly higher than VBACKUP when in normal operation mode.

Current iB is the result of the constant current drawn by the LM43603PWPT circuit when it is inactive (approx. 65 µA) plus the current dissipated in the voltage divider, which depends on the value of the resistors and the battery voltage. The value of the resistors should be in the range of 0.1-1 MΩ in order to keep iB under 100 µA. Current iS varies depending on the voltage difference ∆V = VMAIN − VBACKUP. As shown in Figure 3.6, there is a threshold in the values of ∆V under which iS becomes negative, meaning that the backup voltage regulator becomes active and starts sourcing current together with the main voltage regulator. It is desired to keep ∆V above this threshold in order to keep the backup voltage regulator inactive and iS positive.
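A back-of-the-envelope check of iB under assumed values (the 65 µA quiescent draw is from the text above; the backup battery voltage and divider resistance are illustrative assumptions only):

```python
# Rough arithmetic for the backup-battery drain current iB described above.
I_QUIESCENT = 65e-6        # LM43603PWPT inactive-state current draw (A), from the text
v_backup_battery = 12.6    # hypothetical fully charged 3-cell LiPo backup pack (V)
r_divider_total = 1.0e6    # assumed total resistance of the threshold divider (ohm)

i_divider = v_backup_battery / r_divider_total
i_b = I_QUIESCENT + i_divider
print(f"iB ~= {i_b * 1e6:.1f} uA")  # ~77.6 uA, below the 100 uA target
```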


Figure 3.6: Variation of the current sunk by the backup voltage regulator circuit (iS) with the voltage difference ∆V = VMAIN − VBACKUP, including the point of current inversion for a low ∆V in which iS becomes negative.


Therefore, for any voltage difference VMAIN − VBACKUP > 70 mV the backup voltage regulator will remain inactive and the main voltage regulator will source all the current. Figure 3.7 shows that for a wide range of voltage differences ∆V, from 75 mV to 130 mV, the value of iS varies slightly with a linear behavior from 72.6 µA to 73.2 µA.


Figure 3.7: Variation of the current sunk by the backup voltage regulator circuit (iS) with the voltage difference ∆V when both voltage regulators are on (∆V = VMAIN − VBACKUP).

In the case of the backup voltage regulator being off (battery depleted or disconnected, or regulator forced off), its output goes to high impedance and the current iS drops while still varying depending on the voltage seen at the output. Note that this scenario is the same as when the main battery is depleted and the main voltage regulator shuts down, enabling the backup voltage regulator (in this case iS would be negative according to the direction defined in Figure 3.5). Figure 3.8 shows the relation between iS and the output voltage VOUT.


Figure 3.8: Variation of the current sunk by the backup voltage regulator circuit (iS) with the output voltage level (VOUT) when the backup voltage regulator is off (output in high impedance).


3.1.3 Firmware and Ground Control Station (GCS)

As mentioned before, the UAV system board is compatible with the Paparazzi-UAV platform, and the same drivers, configuration files and code used for the Lisa MX can be used with this board. Flashing of the MCU can be done using any ARM Cortex MCU programmer, although the Black Magic Probe is supported by the software. The Black Magic Probe is a JTAG and SWD adapter designed by 1BitSquared and Black Sphere Technologies for programming and debugging ARM Cortex MCUs.

The Paparazzi-UAV software platform offers Ground Control Station (GCS), airborne code, flight plans, mission control, and communications [88] segments integrated in an open-source, fully customizable distribution. The GCS (Figure 3.9) is an essential development, debugging, and visualization tool that allows interfacing easily with the UAV, both on the ground and during flight. It allows continuous and real time monitoring and logging of all variables including raw sensor readings, remote commands, and control outputs.

Figure 3.9: Paparazzi-UAV Ground Control Station (GCS) interface during a flight.


3.2 Payload Computer

The UAV is equipped with a single board computer (SBC) used for interfacing with the payload and processing real-time payload data. The ODROID-C2 was chosen for this task. The ODROID-C2, from South Korean manufacturer Hardkernel, is a small and lightweight 64-bit quad-core SBC.

Figure 3.10: Hardkernel ODROID-C2.

The ODROID-C2 was loaded with Ubuntu 16.04. USB ports were used for interfacing with the USB camera modules. A UART port accessible from the side male header was connected to the telemetry output of the system board for internal logging of data. The ODROID-C2 provides access to the terminal console via a UART link. This link was used to control the ODROID via command line from an external computer. Table 3.2 summarizes the technical specifications of the ODROID-C2.

Table 3.2: Technical Specifications of the ODROID-C2

- Amlogic ARM® Cortex®-A53 (ARMv8) 1.5 GHz quad-core CPU

- Mali™-450 GPU (3 Pixel-processors + 2 Vertex shader processors)

- 2 Gbyte DDR3 SDRAM

- Gigabit Ethernet

- HDMI 2.0 4K/60Hz display

- H.265 4K/60FPS and H.264 4K/30FPS capable VPU

- 40pin GPIOs + 7pin I2S

- eMMC5.0 HS400 Flash Storage slot / UHS-1 SDR50 MicroSD Card slot

- USB 2.0 Host x 4, USB OTG x 1 (power + data capable)

- Infrared(IR) Receiver

- Ubuntu 16.04 or Android 6.0 Marshmallow based on Kernel 3.14LTS
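The UART console mentioned above can be opened from an external computer with a standard serial terminal program. A minimal sketch (the USB-to-serial device path is an assumption, and 115200 baud 8N1 is the console setting typically used by the ODROID-C2):

    screen /dev/ttyUSB0 115200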


3.3 UAV Avionics Module

The final avionics module for the Falcon I airframe (covered in the next section) is a single assembly that can easily slide in and out of the plane (Figure 3.11). It contains all the main electronics including the UAV system board with the redundant power supply, a control switches board used to power on/off the boards and enable/disable the motor and servos, and the payload computer (ODROID). Minimal wiring is necessary for connections between the ODROID and the UAV system board, and USB cables from the ODROID are used for payload connection.

Figure 3.11: UAV avionics module fully assembled with all its components.

3.4 Airframe

The starting point for the airframe design was set based on the work done by the University of Alberta Aerial Robotics Group (UAARG). UAARG is a student vehicle project in the Faculty of Engineering of the University of Alberta. Their project is to design, build, test, and take to competition a fully autonomous UAV.

At the time, UAARG was using the HobbyKing EPP FPV airframe for their planes (Figure 3.12). The EPP FPV is a lightweight fixed-wing airframe made mostly out of expanded polypropylene (EPP). The fully assembled airframe has a total length of 1320 mm, a wingspan of 1800 mm, and weighs approximately 1000 grams for the airframe alone (no electronics, batteries, motors, etc.).

UAARG would build the airframe, and then install all the avionics (autonomous navigation and control electronics) and payload systems. The first step was analyzing the system design, construction, setup and deployment procedures. This analysis, together with the experience of the team members, produced a list of problems and potential improvements.


Figure 3.12: HobbyKing EPP FPV airframe.

One of the major issues that was noted was inherent to the airframe design. The assembly process involves putting together the left and right wings, then placing the wings on top of the fuselage, aligning them visually, and finally securing them with rubber bands. This setup is not reliable since the alignment of the wings with respect to the fuselage is done visually. It is almost impossible to do the placement the exact same way every time. While this might not represent a major problem during manual flight, where the pilot can compensate for these inconsistencies by trimming the plane and adjusting to it, it does represent a considerable problem when it comes to autonomous navigation. One of the first steps in setting up the autonomous navigation system is to tune it according to the behavior and response of the plane. Once it has been tuned, the autonomous navigation system will be able to accurately control the plane. Tuning parameters are specific to each system. Therefore, if the assembly of the plane is not constant and repeatable, the tuning would have to be performed every time; otherwise the autonomous navigation system would underperform. Tuning involves time and effort, and it should be a one-time-only step for as long as the plane is kept the same.

Another problem found in the airframe was the accessibility of the space inside the fuselage. With only two access points, top and front, installation and handling of the electronics is limited.

Considering that UAARG had been using the EPP FPV for 3 years at that time and that it had proven to be stable and easy to control, the plan was to keep using this airframe and take advantage of the existing knowledge and experience. With the aforementioned issues in mind, a redesign of the plane was done. The aim was to solve the problems and produce a customized version of the airframe with a reliable and repeatable assembly, and to optimize the space inside the fuselage for better use.


3.4.1 Falcon I

The Falcon I is the first iteration of the modified EPP FPV airframe. Measurements were taken from the original airframe and drawn in SolidWorks. Then, modifications were introduced on top of the base model. While the EPP material is very forgiving when it comes to hard landings and minor crashes, as it absorbs impacts very well, a more rigid material is needed for some parts like joints, attachments, etc. Therefore, some parts of the plane were replaced with plastic parts so as to strengthen and reinforce key structural points. Some of the plastic parts were designed to allow extra space in which to put electronics, wiring, etc.

One of the most important aspects of the redesign was to split the fuselage in two, giving access to the inside space. This way, it was possible to work on, modify, and optimize the inside of the fuselage to accommodate the electronics efficiently (Figure 3.13). Some cuts were made in the inside of the fuselage to create more space (Figure 3.14), without affecting the overall structural strength of the body.

Figure 3.13: Inside of the HobbyKing EPP FPV fuselage after splitting. Inside cuts are marked and the nose was removed.

Figure 3.14: Side of the fuselage with cuts and some plastic parts glued already (right).


Figure 3.15 shows a comparison of the unmodified EPP FPV fuselage with the redesigned version. The parts shown in blue in the redesign are made out of plastic. All plastic parts were designed to be 3D printed using polylactic acid (PLA) as the printing material. PLA is a biodegradable and bioactive thermoplastic aliphatic polyester commonly used in 3D printers. The 3D printer settings used for these parts can be found in Appendix D.


Figure 3.15: (a) Original EPP FPV fuselage. (b) Modified fuselage (Falcon I design).

A top fuselage module (Figure 3.16) was designed to fit the wings centre module and also be able to use the space inside of it to fit the GPS/GPS antenna and external magnetometer. The wings centre module (Figure 3.16) is a middle structure that goes between the wings and holds them together using rods for alignment and an attachment piece that is glued to the wings. It attaches to the top of the fuselage by means of the top fuselage module and two bolts at the back. The wings centre module is one of the major and most important additions to the design. It firmly holds the wings in place, keeps them aligned, and secures them to the fuselage. It is a repeatable and reliable assembly. This setup extends the wingspan from 1800 mm to 1910 mm, which increases the lift force. Moreover, the interior of the wings centre module provides additional space and still gives access to the interior of the fuselage.



Figure 3.16: (a) Top fuselage module. (b) Assembled wing centre module. (c) Assembly attachment glued to the wings. (d) Wings assembly.

The tail was also redesigned (Figure 3.17). The default tail has wooden supports at the bottom that considerably increase drag. This structure was replaced with a tail centre module that connects to the tail boom and clears the way under the horizontal stabilizer. The vertical stabilizer is supposed to be glued to the horizontal stabilizer, but the new design allows the vertical stabilizer to be bolted onto the tail centre module, thus allowing disassembly for easy storage and transportation. The default design has elevator and rudder servos mounted on the tail boom close to the fuselage, with external wires and long rods. The new design incorporates servos embedded into the foam in both stabilizers. This reduces the length of the rods that connect to the rudder and elevator. Moreover, the servo wires travel inside the tail boom instead of outside. This new design has a cleaner setup that helps reduce drag, improves aerodynamics, allows for easy disassembly, and is more reliable with protected wires and short rods.



Figure 3.17: (a) Original EPP FPV tail. (b) Modified tail (Falcon I design).

The wiring inside the fuselage is more efficient in the new design, as cables are embedded and secured into the foam (Figure 3.18a). For ease of assembly, wires are kept on just one side of the fuselage.

Figure 3.18: Internal wiring of the fuselage.

The new design uses the volume available in the nose of the plane for payload. A plastic nose (Figure 3.19b) was designed for integrating any payload that could fit in such a volume. With this approach, it is possible to have multiple noses, one for each type of payload, depending on the needs. The base design of the plastic nose can be modified to carry any piece of payload, provided volume and weight permit. The nose attaches to the fuselage via six alignment pins and two screws. This system provides a solid and reliable attachment while making it easy to install and uninstall the nose. The default foam nose (Figure 3.19a) can be used when no payload is needed. Additionally, the new design has the battery sliding into the fuselage from the front, thus the nose has to be taken out for inserting and removing the battery. Figure 3.20 shows a rendering cut of the interior of the fuselage with the electronics installed.


Figure 3.19: (a) Default foam nose. (b) Custom plastic nose for installing a USB camera board.

Figure 3.20: Rendering cut of the fuselage with internal components.

The main electronics (system board, power board, switches board, ODROID single board computer) were installed on a base board that slides into the upper half of the fuselage, while the battery pack slides into the bottom half. Switches in the front allow easy access for the user, and connections to the UAV system board from the top contribute to keeping the wiring clean and organized. The centre of gravity of the redesigned airframe was kept within the same range as the original.

The redesign of the airframe provides a more efficient use of volume in the fuselage, which allows for better allocation and organization of the components inside. Even though plastic parts are heavier than foam, thanks to the reduction in weight of the electronics assembly and imaging payload, the overall weight of the fully assembled plane was reduced compared to the previous design. All plastic parts were designed aiming for light weight and high structural strength.

Previous designs had payload mounted on the wings, while the wings are kept intact in this redesign, with no holes or modification except for the attachment piece on the inside face, thus preserving their aerodynamic capabilities. All the electronics were moved inside the fuselage, bringing all heavy components closer to the centre of gravity and making it easier to balance the plane.

The result is a lighter plane with an efficient use of space inside the fuselage. The balance between the original foam and the added plastic parts makes for a reliable plane that achieves repeatability and can be assembled in a short time. The clean wiring and reduced-size electronics make it easier for the user to understand, learn, and handle. The plane has been tested in multiple successful missions, and the University of Alberta Aerial Robotics Group (UAARG) adopted this redesigned version for their 2017 competition, and plans to continue working on top of this model. Figures 3.21 and 3.22 show the final assembly of the plane.

Figure 3.21: 3D model of the Falcon I full assembly.


Figure 3.22: Falcon I full assembly on the ground (a) and during flight (b).


Chapter 4

Proof of Concept Multispectral Imaging System

This chapter covers the development and experiments carried out with the initial prototype of a multispectral imaging system. The approach used in the development of the multispectral imaging system follows a sequence of incremental steps, with each step prototyped and validated before the subsequent one. Several iterations of the system were produced throughout the project and are detailed in this chapter and the next. The first section of this chapter describes the initial work done with USB camera modules, which led to the use of these cameras in the development of subsequent multi-camera imaging systems. The second section of this chapter details the development of a 3-band multispectral imaging system based on USB camera modules.

4.1 USB Camera Modules

There is a large variety of USB camera modules on the market targeted at areas that range from research to industrial use. The first step in the process of working with USB cameras was doing the necessary research about this technology and becoming familiar with its features and capabilities. The main features, as well as the advantages and disadvantages of this type of camera, are covered in section 2.1.5. In that same section it is stated that the interest is focused on USB 2.0 cameras compatible with the UVC driver.


4.1.1 Selection and Evaluation

The first step was to acquire a prototyping camera that could be used to validate the information found in the literature, as well as test the camera and evaluate its performance. With this in mind a first selection was made. Camera ELP-USB500W02M from China-based manufacturer ELP Ailipu Technology Co.1 was selected as the first prototyping device. Table 4.1 shows the main features of this camera module.

Table 4.1: Features of the ELP-USB500W02M camera.

Manufacturer: ELP Ailipu Technology Co.
Part Number: ELP-USB500W02M
Sensor: Omnivision OV5640
Sensor Type: CMOS
Maximum Resolution: 2,592 x 1,944 pixels
Pixel Size: 1.4 µm
Image Area: 3,629 x 2,722 µm
Shutter: Rolling
Power Consumption: ~200 mA @ 5 VDC
Module Size: 38 x 38 mm

This camera was selected mainly because it provides a variety of settings that can be used to test multiple configurations. It provides 10 different resolution configurations, both raw (YUYV) and compressed (MJPG) output formats, and a variety of settings like brightness, contrast, hue, saturation, gamma, exposure, etc. In addition to its low cost, it is an excellent choice for performing all the initial testing and evaluation. Figure 4.1 shows a picture of the ELP-USB500W02M.

Figure 4.1: USB camera module ELP-USB500W02M from ELP Ailipu Technology Co.

1http://www.elpcctv.com/


4.1.2 Software Development

Software development with this camera was done in a Linux environment and based on the use of the native UVC driver (section 2.1.5.1) through the Video for Linux Two (V4L2) API (section 2.1.5.2). This type of camera can be compared to webcams in terms of components and interface. Thanks to the UVC driver the camera works as a plug-and-play device. It is possible to connect the camera to a computer via USB and open it using a video capture program (e.g. Cheese, Guvcview, VLC). For example, Guvcview2 will provide a Graphical User Interface (GUI) with a first window showing live streaming video captured from the camera, and a second window with the settings accessible in the camera.

Further interest was in finding a way to programmatically control the settings of the camera without having to open a GUI. v4l2-ctl is a tool to access V4L2 drivers from the command line. This tool is part of the v4l-utils library, which is a bundle of packages for handling media devices. Using v4l2-ctl it is possible to obtain information from a connected camera and control its settings. Information such as a list of all the settings with details of the type of field (e.g. int, bool), the range of accepted values, and the default value can be read from the camera. It is also possible to obtain a list of the possible video output formats supported by the camera, as well as the combination of resolution and frame rate options for each. Tables 4.2 and 4.3 list the camera settings and output formats respectively for the ELP-USB500W02M.
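A minimal sketch of these queries, and of changing a setting, from the command line (assuming the camera enumerates as /dev/video0; control names are those reported by the driver):

    # List all controls with their type, range, step, and default value
    v4l2-ctl --device=/dev/video0 --list-ctrls
    # List supported output formats with their resolution and frame rate options
    v4l2-ctl --device=/dev/video0 --list-formats-ext
    # Read and change an individual setting
    v4l2-ctl --device=/dev/video0 --get-ctrl=brightness
    v4l2-ctl --device=/dev/video0 --set-ctrl=brightness=8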

Table 4.2: Settings of the ELP-USB500W02M camera.

Setting                          Type     Min    Max    Step  Default
Brightness                       integer  0      15     1     8
Contrast                         integer  0      15     1     8
Hue                              integer  -10    10     1     0
Saturation                       integer  0      15     1     7
Sharpness                        integer  0      15     1     6
Gamma                            integer  1      10     1     7
White Balance Temperature        integer  2,800  6,500  1     2,800
Backlight Compensation           integer  0      1      1     0
PowerLine Freq. (Anti Flicker)   menu     0      2      -     2
Focus                            integer  0      21     1     16
Exposure Auto                    menu     0      3      -     3
Exposure                         integer  4      5,000  1     625

A useful feature of the camera is the automatic exposure setting (Exposure Auto), where the camera takes care of adjusting the exposure and some internal gain setting in order to keep frames with the right amount of light.

2http://guvcview.sourceforge.net/

Table 4.3: Frame Rates for the ELP-USB500W02M camera based on output video format and resolution.

Resolution    Frame Rate @ MJPG   Frame Rate @ YUYV
2592 x 1944   15                  3
2048 x 1536   15                  3
1920 x 1080   15                  3
1600 x 1200   15                  3
1280 x 1024   30                  7.5
1280 x 720    30                  7.5
1024 x 768    30                  15
800 x 600     30                  30
640 x 480     30                  30
320 x 240     30                  30

A second software tool was needed for capturing and saving frames from the camera. Several programs were tested, including FFmpeg3, fswebcam4, Guvcview, uvccapture, VLC5, and GStreamer6 as command line tools. OpenCV7 was also tested as it is a popular library of programming functions mainly aimed at real-time computer vision. Initial development was done using OpenCV, but some problems were encountered when trying to configure the camera and synchronize processes.

With all of the command line tools being able to select an output format and resolution from the camera, capture frames, and save them as image files at a certain rate, the decisive factor for a final selection was performance.

GStreamer was selected as the best software tool because of its speed, low processing load, ease of use, and flexibility. More importantly, as covered in section 2.1.5.3, GStreamer has the ability to interface with multiple cameras simultaneously, which was a feature of interest for further development throughout this project.

A series of shell scripts were assembled using v4l2-ctl for configuring the camera and GStreamer for capturing frames.

3https://www.ffmpeg.org/
4http://www.sanslogic.co.uk/fswebcam/
5https://www.videolan.org/vlc/index.html
6https://gstreamer.freedesktop.org/
7https://opencv.org/


4.1.3 Lens Testing - Modulation Transfer Function

A group of lenses with different focal lengths were tested to find the resolution efficiency of each. Due to the lack of a part number to identify them, a simple ID number was assigned to each. A first set of lenses (Lens ID 1-5) was acquired from the same China-based manufacturer ELP Ailipu Technology Co. Lenses 1-4 did not have any pixel count specification, whereas lens 5 was advertised as a megapixel lens rated for 5 MP. Finally, one more lens (Lens ID 6) was acquired from the Germany-based manufacturer The Imaging Source8. Lens 6 was also advertised as a megapixel lens rated for 5 MP.

Lenses were tested following the ISO 12233 slanted-edge MTF method covered in section 2.1.4.3. The target shown in the same section (Figure 2.11) was printed with dimensions of 60x60 cm on a matte white sheet at 300 ppi quality using a laser printer. The ELP-USB500W02M camera running at full resolution (2592x1944 pixels) with RAW (YUYV) output was used for taking the pictures. The camera was focused manually to the best focus possible. Pictures were taken, visually checked for focus, and processed using a software implementation of the ISO 12233 slanted-edge MTF method. The software was developed by Burns in the form of a MATLAB source code package called sfrmat3: SFR analysis for digital cameras and scanners [89]. Each image was processed at least twice using different regions of interest in order to validate the results. As expected, the method is robust and stable, giving consistent results through repeated processing.

The distance from the camera to the target was set so that the target would fill the FOV. Table 4.4 shows the results of the tests. Lenses 1-4 showed a low resolution efficiency, with the lowest at 32% for lens 1, and the highest at 42% for lens 2. Lenses 5 and 6 showed a higher resolution efficiency, at 71% and 77% respectively.

Table 4.4: Results of different M12 lenses tested using the slanted-edge MTF method.

Lens ID   Focal Length (mm)   Distance (cm)   Sampling Eff. Vert. (%)   Sampling Eff. Horiz. (%)   Resolution Efficiency (%)   Effective Resolution (MP)
1         3.6                 77              59                        54                         32                          1.59
2         6.0                 115             64                        66                         42                          2.11
3         8.0                 158             63                        58                         37                          1.83
4         12.0                261             61                        60                         37                          1.83
5         6.0                 120             71                        100                        71                          3.49
6         6.0                 120             79                        97                         77                          3.83

8https://www.theimagingsource.com/


Figure 4.2 shows the resulting MTF plots of lens 5 for the horizontal and vertical components. In both cases it can be seen that the plots present an increase in sharpness, evidenced by the positive bump that takes the MTF above 1.0 before dropping. This effect is more evident in the horizontal component.


Figure 4.2: MTF plot for lens 5 (6mm focal length).

4.2 3-Band Multispectral Imaging System

A multispectral imaging system was developed using USB camera modules. Most basic multispectral imaging systems for vegetation and terrain analysis use 3-4 monochromatic cameras covering the Landsat Thematic Mapper bands TM2 (green), TM3 (red), and TM4 (NIR) as the most important bands, and adding the TM1 (blue-green) or the narrow red-edge bands as complementary.

Due to space constraints, a 3-band multispectral imaging system was designed, developed, installed on the Falcon I UAV (section 3.4) and flown. The system is based on three USB camera modules and optical filters. The cameras are connected to an ODROID SBC (section 3.2) via USB, which takes care of controlling the cameras and saving frames into its local memory. The ODROID was also connected to the telemetry output of the autopilot board (section 3.1) to record avionics data stamps relevant to the images. Figure 4.3 shows the block diagrams of a camera subsystem and the complete imaging system.



Figure 4.3: Simple block diagrams of a single camera (a), and the 3-band multispectral imaging system (b).

4.2.1 Cameras and Lenses

In section 2.2.3 a list of popular lightweight multispectral imaging systems currently on the market was presented. Based on an analysis of these systems, it was concluded that a resolution of 1.2 MP (1280x960 pixels) was appropriate and reasonable for achieving a good balance between image detail, directly translated into GSD, and amount of data. It is important to keep in mind that USB 2.0 has a theoretical maximum bandwidth of 480 Mbit/s or 60 MByte/s, and the more pixels a camera has, the more data will travel via the USB connection. The system must be able to capture images from all cameras simultaneously, therefore care must be taken in making sure not to exceed the USB 2.0 capacity. Camera ELP-USB130W01MT-B/W from ELP Ailipu Technology Co. was selected. Tables 4.5, 4.6, and 4.7 present the detailed features, settings, and output formats of this camera.

Table 4.5: Features of the ELP-USB130W01MT-B/W camera.

Manufacturer: ELP Ailipu Technology Co.
Part Number: ELP-USB130W01MT-B/W
Sensor: ON Semiconductor AR0130CS
Sensor Type: CMOS
Maximum Resolution: 1,280 x 960 pixels
Pixel Size: 3.75 µm
Image Area: 4,800 x 3,600 µm
Shutter: Rolling
Power Consumption: ~160 mA @ 5 VDC
Module Size: 38 x 38 mm


Table 4.6: Settings of the ELP-USB130W01MT-B/W camera.

Setting                           Type     Min     Max      Step  Default
Brightness                        integer  -64     64       1     0
Contrast                          integer  0       95       1     32
Hue                               integer  -2,000  2,000    1     0
Saturation                        integer  0       128      1     0
Sharpness                         integer  1       7        1     2
Gamma                             integer  100     300      1     0
White Balance Temperature Auto    bool     0       1        -     1
White Balance Temperature         integer  2,800   6,500    1     4,600
Backlight Compensation            integer  0       3        1     1
Gain                              integer  0       100      1     0
PowerLine Freq. (Anti Flicker)    menu     0       2        -     1
Exposure Auto                     menu     0       3        -     3
Exposure                          integer  1       5,000    1     625
Exposure Priority                 bool     0       1        -     0

Table 4.7: Frame Rates for the ELP-USB130W01MT-B/W camera based on output video format and resolution.

Resolution    Frame Rate @ MJPG   Frame Rate @ YUYV
1280 x 960    30/15               9
1280 x 720    30/15               10
800 x 600     30/15               20
640 x 480     30/15               30
320 x 240     30/15               30

With 1280x960 pixels and a pixel size of 3.75 µm, the image sensor in this camera has the same resolution capabilities as the Sentek GEMS, Sentera QUAD, and Parrot Sequoia cameras (section 2.2.3). The major difference is that the ELP-USB130W01MT-B/W has a rolling shutter instead of a global shutter. This poses a challenge, as imaging issues like skew or blur could be present in the final images under certain conditions. Another challenge is synchronizing multiple ELP-USB130W01MT-B/W cameras. Since there is no way of doing this via hardware, synchronization would have to be done through software.

It is important to take into consideration that this type of CMOS image sensor has a high quantum efficiency (a measure of the sensitivity of the photodiodes) in the visible wavelength range (above 70%), but it decays at higher wavelengths (Figure 4.4). Therefore, the NIR range will have a lower quantum efficiency that will have to be compensated with a longer exposure time or a higher gain.


Figure 4.4: Quantum Efficiency of the Aptina / ON Semiconductor AR0130CS image sensor.

Lens 5 (6 mm focal length, 5 MP) was selected for the imaging system. The higher resolution efficiency (71%) gives better imaging performance. Going with a 6 mm focal length results in a FOV of 43.60° x 33.39° and a GSD of 7.5 cm @ 400 ft (~122 m). Compared to the values presented in Table 2.4 (section 2.2.3) for commercial cameras, this configuration is close to the ones adopted by the Sentek GEMS, Sentera QUAD, and Parrot Sequoia cameras. The average of the focal lengths of these cameras gives 5.56 mm, close to the 6 mm focal length selected for our system. This selection offers a good balance between FOV and GSD.
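As a check, these values follow directly from the sensor geometry in Table 4.5 (4,800 x 3,600 µm image area, 3.75 µm pixels) and the pinhole camera model:

    FOV (horizontal) = 2 · arctan(4.8 mm / (2 · 6 mm)) ≈ 43.6°
    FOV (vertical)   = 2 · arctan(3.6 mm / (2 · 6 mm)) ≈ 33.4°
    GSD = altitude · pixel size / focal length ≈ 120 m · 3.75 µm / 6 mm ≈ 7.5 cm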

4.2.2 Optical Filters

Filters for the imaging system were selected following the general convention of bands used in commercial multispectral cameras (section 2.2.3), which in turn is oriented towards the calculation of vegetation indices. Given that the system has three cameras, the green, red, and NIR bands were selected. Filters were acquired from Andover Corporation9. Table 4.8 and Figure 4.5 present the technical details and spectral response of the filters. Figure 4.6 shows a comparison of the bands chosen for this system and the bands used in other commercial multispectral imaging solutions.

9https://www.andovercorp.com/


Table 4.8: Technical details of the optical filters.

Parameter                 Green Band Filter   Red Band Filter   NIR Band Filter
Center Wavelength (nm)    550 +10/-0          650 +10/-0        800 +10/-0
Bandwidth (nm)            40 +8/-8            40 +8/-8          40 +8/-8
Transmission (%)          50                  50                50
Size (mm)                 12.5 +0/-0.25       12.5 +0/-0.25     12.5 +0/-0.25
Thickness (mm)            5.9                 5.9               5.9
Refractive Index          2.05                2.05              2.05


Figure 4.5: Spectral response or transmissivity of the green, red, and NIR band-pass filters chosen for the system.

Figure 4.6: Comparison of chosen bands with bands used in other commercial multispectral cameras.


Filters came as discs of 12.5 mm diameter. A small adapter was designed to hold the filter and fit on top of the lens as a cap (Figure 4.7). The adapter was 3D printed, the filter would fit tightly inside, and the whole cap would sit on top of the lens, secured by a screw.

Figure 4.7: From left to right; green, red, and NIR filters mounted on lens cap adapters.

Lens 5 comes with an IR cut filter glued to the back of the lens (Figure 4.8). This IR cut filter was removed from the lens where the NIR filter was installed, so as to be able to capture the desired band. The filter is a small glass disc with glue in three points. The filter was removed using a small sharp knife to cut through the points of glue.

Figure 4.8: Back of a lens with IR cut filter (left), and after IR cut filter removal (right).

4.2.3 Software Development

As previously mentioned in section 4.1.2, a series of shell scripts were created in order to interface with the camera. The scripts are based on v4l2-ctl for configuring the camera and GStreamer for capturing frames. The v4l2-ctl configuration sets the values of the camera settings shown in Table 4.6. Default settings were used with automatic exposure enabled.

After configuration, three identical pipelines were set up to run in parallel using GStreamer. Each pipeline starts by opening a multimedia source, which in this case is a V4L2 source (camera). The second step is to select the video output option. In this case, raw video (YUYV) with 1280x960 resolution and a 9 fps frame rate was selected. The next step is to set the rate at which frames are going to be extracted from the source. A 1 fps frame rate was selected, which means that the 9 fps source is downsampled to 1 fps. In other words, one frame is kept and 8 frames are skipped every second. The reason for selecting this frame rate depends on the conditions of the flights and is explained in more detail in section 4.2.5. The final steps are to convert and encode the frame as a PNG file and save it in local memory. By using a multifilesink sink the pipeline will run in an endless loop. Figure 4.9 shows the diagram of the GStreamer application.

Figure 4.9: Diagram of the Gstreamer pipeline application for three ELP-USB130W01MT-B/W cameras.
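A minimal gst-launch-1.0 sketch of this setup, with one pipeline per camera launched in the background (device nodes and output paths are illustrative; the actual scripts take these values from configuration files):

    #!/bin/bash
    for CAM in 0 1 2; do
        gst-launch-1.0 v4l2src device=/dev/video${CAM} \
            ! video/x-raw,format=YUY2,width=1280,height=960,framerate=9/1 \
            ! videorate ! video/x-raw,framerate=1/1 \
            ! videoconvert ! pngenc \
            ! multifilesink location="cam${CAM}_%05d.png" &
    done
    wait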

While the above was the optimal configuration defined for the system, the software package was designed to keep settings in external configuration files that are given as input arguments to the main script. This way it is possible to define multiple configuration files with settings that are easy to change, thus making the overall software flexible.

The JPEG, TIFF, and RAW codecs were also tested for the encoder section. JPEG is a compressed format that leads to faster processing and a smaller file size, but because it is compressed some information is lost. TIFF and RAW are complex lossless formats that keep all the information but take longer to process and lead to larger file sizes. PNG is a simple format that doesn't have most of the native features of TIFF or RAW while still being lossless and maintaining the image data. PNG was selected as the optimal format for being lossless and resulting in relatively small file sizes.


4.2.4 UAV Setup

In section 3.4.1 the interchangeable payload nose was introduced. This concept gives flexibility to the airframe design as multiple noses can be designed using the base model. The purpose of the noses is to hold the payload systems, which in this case are the cameras. A custom nose was designed to hold the three ELP-USB130W01MT-B/W cameras with filters attached. Given the space available in the nose base model, the cameras were arranged in three different levels (Figure 4.10). This arrangement introduces a distance difference of 13 mm between the sensor planes of the cameras, which is negligible given the intended range of working distances. The upside of this arrangement is that the optical centres of the cameras are closer to each other (29 mm apart), reducing the amount of misalignment between the cameras. Cameras are secured to the nose using screws and connected to the onboard ODROID SBC via USB cables. The figure shows how the location of the filters was assigned. The bottom part of the nose includes a transparent acrylic window that protects the cameras and preserves aerodynamics.


Figure 4.10: 3D model of custom plane nose with 3 cameras. (a) Isometric view. (b) Side cross-section view.


Figure 4.11: (a) Custom plane nose with 3 cameras and filters. (b) Nose installed in the plane.


The 3D printed custom nose with the 3 cameras, lenses, filters, and all the hardware has a total weight of 154 grams. This doesn't include the USB cables and the ODROID computer that are necessary for interfacing with the cameras. Nonetheless, it is a relatively lightweight setup considering the use of 3 USB camera modules. The addition of this payload to the UAV does not increase the overall weight considerably and has a minor effect on the overall center of gravity.

4.2.5 Flight Campaigns

A series of flight campaigns were performed in order to test the system. Flights were performed at Bremner Field, a grass field in the North Saskatchewan River Valley, north of Sherwood Park, in Alberta, Canada. It is located at coordinates 53.6380961°, -113.2895888°, with an altitude of 610 m above sea level. This field offers north-south, east-west, and northwest-southeast runways with clear and unobstructed approaches from all directions. Bremner Field is managed by the Edmonton Radio Control Society (ERCS)10. Figure 4.12 shows a map of Bremner Field with the delimited area allowed for flight operations.

Figure 4.12: Map of Bremner Field indicating the delimited area for flight operations.

There was a total of four flights done with the 3-band multispectral imaging system on board the Falcon I UAV. Table 4.9 presents the details of each flight. Flights 3 and 4 were divided into 2 and 3 segments respectively, each flown at a different altitude. Actual speeds and altitudes are reported as averages in the table.

10https://www.ercs.ab.ca/

Table 4.9: Flight campaigns details.

Flight   Date          Start Time   Duration (mm:ss)   Avg. Ground Speed (m/s)   Avg. Altitude (m)   Image Sets
1        2017-AUG-03   12:13:30     03:08              17                        60                  188
2        2017-AUG-03   12:32:20     08:13              17                        100                 493
3        2017-AUG-24   14:36:46     15:27              21                        50 / 90             927
4        2017-SEP-15   12:34:13     19:49              18                        60 / 80 / 100       1189

Flights involved manual take-off and landing, and a mix of assisted and autonomous flying. Assisted flying allows the pilot to control the direction and speed of the UAV but limits the amount of roll, pitch, and yaw. Autonomous flight gives total control of the UAV to the onboard autopilot system, which executes a predefined flight plan. An operator in charge of the GCS monitors the behavior of the UAV at all times and manages the flight plan when in autonomous mode.

The imaging scripts were activated manually via terminal in the ODROID before taking off, and terminated in the same way after landing. Given the ground speed and altitude of the UAV, the image capture frequency was set to 1 frame per second. At this capture frequency there is a 33.33% overlap between two consecutive frames flying at 50 m AGL and 20 m/s ground speed, and a 66.67% overlap for 100 m AGL at the same speed. Cameras were used at maximum resolution (1280x960 pixels) and focus was set for a range of 50 - 100 meters, using the principle of hyperfocal distance, which is the closest focusing distance that allows objects at infinity to be acceptably sharp. Under these conditions, images did not show any visible skew or blur during normal flight (take-off and landing excluded).
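These overlap figures can be reproduced from the camera geometry, taking the 33.39° FOV from section 4.2.1 as the along-track direction:

    along-track footprint ≈ 2 · AGL · tan(33.39°/2) ≈ 0.6 · AGL, i.e. ≈ 30 m at 50 m AGL and ≈ 60 m at 100 m AGL
    frame spacing = ground speed / capture rate = (20 m/s) / (1 fps) = 20 m
    overlap = (footprint − spacing) / footprint ≈ 33.3% at 50 m AGL and ≈ 66.7% at 100 m AGL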

4.2.6 Image Alignment

Due to the use of multiple cameras and the fact that their optical centers are misaligned, the resulting images will also be misaligned. Alignment of the images is necessary in order to operate on them and obtain useful information like vegetation indices. In the UAV setup presented above the cameras are horizontally aligned (horizontal being the direction perpendicular to the direction of flight) but vertically misaligned (vertical being the direction of flight), with a distance between optical centers of 29 mm. This means that alignment of the images can be done operating only on one axis.

A MATLAB program was developed in order to run an alignment algorithm over the images. The algorithm takes the image from the middle camera (with the NIR filter) as the reference and aligns the other two (red and green) with respect to it. The algorithm is based on correlation between the images, where one image is the reference and the other is the subject.
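One way to write the search performed by this algorithm (notation introduced here only for illustration: H is the image height in pixels, d the candidate vertical offset, and corr(·,·) the 2-D correlation coefficient between two equally sized image regions):

    d* = argmax over d in [0, 50] of corr( reference[1 : H−d, :], subject[1+d : H, :] )

The offset d* is then cropped from the top of one image and the bottom of the other, leaving two aligned images.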


The algorithm is based on the assumption that there is a single maximum in the pixel-by-pixel correlation between two misaligned images as one of the images is translated a certain number of pixels in a certain direction. Since the images in this case are misaligned only in one axis, and the direction of misalignment is known, the problem is simplified as the algorithm runs only in one axis and one direction (1D translation). Figure 4.13 shows the correlation plot of two images as one of them is kept fixed and the other is displaced. Displacement was done in 1 pixel increments over a range of 50 pixels. The maximum for this example occurs at 22 pixels, with a correlation of 0.84312. At this point, the two images have the highest correlation between them. If the pixel offset, 22 pixels in this case, is cropped from both images, one at the top and the other at the bottom, the result is two aligned images. The images used for this test were taken approximately 60 meters away from the target. At larger working distances the misalignment is less and the offset in pixels will be less.


Figure 4.13: Correlation plot of alignment algorithm indicating maxima.

The process for aligning a set of three images uses the same principle. Two images are aligned first to produce a first aligned subset, then the third image is aligned with respect to this subset to produce the final set. In this case, the middle image and one of the side images are aligned first. After alignment and cropping, the new middle image is used as the reference to align the other side image. Then, the offsets from the first pair and the second pair are compared, taking the greater one and adjusting in order to make sure all three images have the same size. The final images will end up with a smaller height due to the cropping (Figure 4.14). The areas of the images, top and bottom, that are not present in all three images are discarded, leaving three images accurately aligned.


Figure 4.14: Example of pixel offset and cropping of images for alignment.

Figure 4.15 shows an example of three images (red, green, and NIR bands) superposed before and after alignment. In this example the misalignment between the red and NIR band images was 19 pixels, and between the NIR and green band images was 21 pixels.

Figure 4.15: Superposition of red, green, and NIR band images before and after alignment.


Figure 4.16 shows a closer comparison of the superposed images before and after alignment.


Figure 4.16: Result of the image alignment algorithm. (a) Superposition before alignment. (b) Superposition after alignment.

While this method might seem simple compared to more advanced algorithms [90], it gives accurate results and takes less processing time. The algorithm was improved by making it exit the correlation comparison loop as soon as the maximum was found. It takes the algorithm an average of 1.79 seconds to align a set of three images taken at a working distance of 80 meters, where the misalignment is on average 19 pixels.

The resulting images after alignment end up being shorter in height as only common areas of the images are kept; the rest is removed. The final superposed image is clear and well defined. Since lens distortion was not corrected, the alignment of pixels is not perfect, and some offset can be seen. The offset goes from zero at the center of the image and grows moving outwards. It is possible to see misaligned areas at the edges and corners of the superposed image. Nonetheless, the results are fast and accurate enough for basic pixel-level operations on the images.

4.2.7 Vegetation Indices

After capturing images of the same scene using different spectral bands, and aligning the images, it was possible to operate over the images and calculate vegetation indices. Although there are several vegetation indices, the most widely used is the Normalized Difference Vegetation Index (NDVI), introduced in section 2.3.1. The NDVI is a simple numerical indicator used for analyzing remote sensing imagery and determining whether the target contains live green vegetation or not. NDVI images were calculated from the NIR and red band images using equation 2.11 from section 2.3.1 on a pixel by pixel basis. Resulting images have a [-1,1] range and are false colored in order to obtain a clear vegetation map.
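For reference, the NDVI of equation 2.11 is computed per pixel as

    NDVI = (NIR − Red) / (NIR + Red)

where NIR and Red are the pixel values of the aligned NIR and red band images.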


Figure 4.17 shows a test performed with the 3-band multispectral imaging system presented in this work, using six leaves of the same family, each with a different level of stress, going from a perfectly healthy leaf to a dead leaf. The experiment was done using natural sunlight as the incident source.


Figure 4.17: Example NDVI test of leaves with different levels of stress. (a) RGB image. (b) False colored NDVI image.

Classification within the range of NDVI values is important in order to correctly identify the areas of interest in the image. Barren areas of rock, sand, or snow have a low NDVI value (0.1 and below). Sparse vegetation like shrubs and grasslands shows a moderate NDVI value (approximately 0.2 to 0.5). High NDVI values (approximately 0.6 to 0.9) correspond to dense vegetation commonly found in temperate and tropical forests or crops at their peak growth stage. Bare soil is represented by NDVI values that are closer to 0, and water bodies are represented by negative NDVI values [91]. Although these are general thresholds, in practice it is usually necessary to perform some real sample calibration to adjust the ranges by up to ±0.1.

In the experiment with the leaves, the color scale was adjusted so that moderately healthy vegetation starts at an NDVI value of 0.25. It is evident in the image, when compared to the equivalent RGB image, that healthy leaves can be clearly differentiated from highly stressed (dry and dead) leaves. Furthermore, clear levels can be identified in the three healthy leaves on the NDVI image.

The entire NDVI processing algorithm (calculation and false coloring) was implemented in MATLAB and packed in a single processing script. The same script was later used for processing the images of the flights after alignment. Figure 4.18 shows samples of NIR band, red band, and NDVI images of pictures captured from the UAV at Bremner Field. In this field, the main runways have live green grass that is regularly watered and kept in maintained condition, whereas the surrounding sections of the runways have grass and bushes that live in natural conditions. The surrounding sections exhibit less green vegetation and higher levels of dryness to the eye. The NDVI images show a clear differentiation between the maintained and apparently healthy live grass in one of the main runways and the surrounding unmaintained grass.


Figure 4.18: Sample pictures from a flight at Bremner Field showing the NIR and red band images and the resulting NDVI image.


Chapter 5

12-Band Multispectral Imaging Platform

This chapter covers the design, development, and experiments carried out with the more advanced prototype of a 12-band multispectral imaging system. The first section introduces the concept and key aspects of the system. The second section covers the USB camera modules chosen for the system, explaining the factors and reasons for their selection. Subsequently, the custom camera bridge board is introduced, covering the integrated and configurable USB HUB and automatic trigger circuits. Finally, results from testing are presented.

5.1 System Concept

The proposed system will follow the approach presented in the previous chapter. In this revision, the system is based on an array of up to 12 cameras with global shutter and synchronized trigger. The cameras continue to have a USB 2.0 interface, and will be connected to a custom designed PCB that will include USB HUBs to integrate USB routes and minimize cabling. The USB routes will be configurable to a degree, to allow for grouping cameras depending on configuration and USB bandwidth requirements. The custom camera bridge board will also include a triggering circuit for the cameras based on a microcontroller unit (MCU).

The goal is to create and test an integrated multi-camera system based on USB camera modules that is flexible and configurable, so that it can be used with optical filters as a development platform for multispectral imaging applications.


5.2 USB Camera Modules

The imaging system was designed to work with the DMM 42BUC03-ML USB 2.0 board-level camera from the manufacturer The Imaging Source, LLC1. Selection of this camera was made based on a balance between size, cost, and capabilities. The requirements were set for a camera with global shutter and trigger as the main specifications. The usual maximum resolution for global shutter sensors is 1.2 Megapixels. The use of S-Mount lenses and no packaging or enclosure reduces the cost, and the size was selected as the minimum possible to allow for miniaturization and light weight. The DMM 42BUC03-ML (Figure 5.1) is a small 30 mm x 30 mm monochromatic board-level camera module with an S-Mount (M12) lens, based on the ON Semiconductor AR0134CS CMOS image sensor. This sensor comes with a global shutter [92], which is ideal for imaging a target or scene in movement as it does not suffer from imaging issues associated with rolling shutter such as skew and wobble [93]. Table 5.1 summarizes the technical specifications of this camera.

Figure 5.1: DMM 42BUC03-ML camera module from The Imaging Source, including S-Mount lens holder and lens.

Table 5.1: Specifications of the DMM 42BUC03-ML camera.

Manufacturer: The Imaging Source
Part Number: DMM 42BUC03-ML
Sensor: ON Semiconductor AR0134CS
Sensor Type: CMOS
Maximum Resolution: 1,280 x 960 pixels
Pixel Size: 3.75 µm
Image Area: 4,800 x 3,600 µm
Shutter: Global
Power Consumption: ~250 mA @ 5 VDC
Module Size: 30 x 30 mm

1https://www.theimagingsource.com/


The ON Semiconductor AR0134CS has 1.2 Megapixels (1,280 x 960 pixels) and a pixel size of 3.75 µm, which makes it equivalent to the image sensors used in the Parrot Sequoia, MicaSense RedEdge-M, Sentera QUAD, and Sentek Systems GEMS Multispectral Sensor. The quantum efficiency (a measure of the sensitivity of the photodiodes) of this sensor (Figure 5.2) is also very similar to the ones reported for the commercial cameras.


Figure 5.2: Quantum efficiency of the ON Semiconductor AR0134CS image sensor.

The DMM 42BUC03-ML camera also features a working mode based on an external trigger input. When a trigger signal, defined as a positive pulse of at least 10 microseconds, is received by the camera on the trigger input pin, the camera will start the exposure and readout process of a single frame (Figure 5.3). The length of the exposure can be set by software. The duration of the image readout is the reciprocal of the current frame rate. After image readout is finished, the camera is able to accept another trigger. It is possible to use this feature to send the same trigger signal to multiple cameras and have them synchronized. The trigger input on the camera is opto-coupled and admits positive trigger pulses with any amplitude between 3.3 V and 12 V.

Figure 5.3: Timing diagram of the trigger, exposure, and image readout processes.
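As a worked example of this timing, the following minimal Python sketch estimates the highest sustainable trigger rate under the assumption, consistent with Figure 5.3, that exposure and readout do not overlap; the numbers used are only illustrative.

def max_trigger_rate(exposure_s, frame_rate_fps):
    # Readout time is the reciprocal of the current frame rate; a new trigger is
    # only accepted after readout finishes, so the shortest trigger period is
    # exposure + readout.
    readout_s = 1.0 / frame_rate_fps
    return 1.0 / (exposure_s + readout_s)

# Example: a 2 ms exposure at the 10 fps readout setting allows roughly
# 9.8 synchronized captures per second.
print(max_trigger_rate(0.002, 10))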

This camera has a monochromatic raw output video format (Y800) and can operate at three different resolution options, with several frame rate options for each resolution (Table 5.2). When selecting a lower resolution, contrary to pixel binning [94], the DMM 42BUC03-ML only reads the corresponding number of pixels referenced to the top left corner of the sensor. Therefore, lowering the resolution reduces image data and USB bandwidth requirements, which makes it possible for more cameras to share the USB resource.

Table 5.2: Video formats and frame rates available in the DMM 42BUC03-ML camera.

Resolution     Frame Rate (fps) @ GREY (Y800)
1280 x 960     30 / 25 / 15 / 10
640 x 480      60 / 30 / 25 / 15
320 x 240      60 / 30 / 25 / 15

The USB interface in this camera is compatible with the USB Video Class (UVC) standard driver. This camera offers a range of settings that are accessible and configurable via software (Table 5.3). The “Privacy” value is the trigger enable/disable setting. When disabled, the camera is in free-running mode. When enabled, the camera is in trigger mode.

Table 5.3: Settings of the DMM 42BUC03-ML camera.

Setting      Type      Min    Max        Step    Default
Brightness   integer   0      255        1       12
Gain         integer   34     255        1       36
Exposure     integer   1      300,000    1       127
Privacy      bool      0      1          -       0
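As an illustration of how these settings can be reached from the host, the sketch below drives the standard v4l2-ctl utility from Python. The device path and control names are assumptions: the exact mapping depends on the kernel/UVC driver version, and whether the “Privacy” (trigger enable) control is writable through the stock driver should be verified with --list-ctrls first.

import subprocess

dev = "/dev/video0"                                                    # assumed device node
subprocess.run(["v4l2-ctl", "-d", dev, "--list-ctrls"], check=True)    # inspect the available controls
subprocess.run(["v4l2-ctl", "-d", dev, "--set-ctrl=gain=36"], check=True)
subprocess.run(["v4l2-ctl", "-d", dev, "--set-ctrl=exposure_absolute=127"], check=True)
subprocess.run(["v4l2-ctl", "-d", dev, "--set-ctrl=privacy=1"], check=True)  # enable trigger mode, if writable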

5.2.1 Optical Filters

The use of S-Mount (M12) lenses in these camera modules allows the use of optical filters with the same approach presented in section 4.2.2. The 12.5 mm diameter filter discs are easily installed in the lenses using a small 3D printed adapter piece that holds the filter and fits in front of the lens, secured by a screw (Figure 5.4).

Figure 5.4: Camera board with S-Mount (M12) lens and optical filter installed using the 3D printed adapter.

5.3 Camera Bridge Board

The camera bridge board (Figure 5.5) acts as a motherboard for the camera modules and is the main integration unit of the system. Cameras connect directly to the bridge board, which routes all USB communications via USB HUBs into selectable upstream USB ports. This board includes a step-down voltage regulator that provides power to all components including the cameras. It also integrates a microcontroller unit (MCU) that generates the pulses used for triggering the cameras.

Figure 5.5: Diagram of the basic connections between the camera array and the bridge board. The bridge board routes all USB communications into selectable upstream ports and sends trigger signals to the cameras.

The goal was to design a compact system and avoid the use of USB cables, as they tend to be bulky and take up considerable space and mass. The bridge board was designed with straight mini USB plug connectors that match the mini USB receptacle connectors found on the DMM 42BUC03-ML camera modules. The trigger input pins of the camera modules are found on a PicoBlade connector. The bridge board has cutouts underneath each camera location, and individual trigger connectors on the other side of the board. Mounting holes are also included for securing the cameras to the bridge board using standoffs.

The bridge board Printed Circuit Board (PCB) was designed using the PCB CAD tool Altium Designer. Schematics and Bill-of-Materials of the camera bridge board can be found in Appendices B and C. Figure 5.6 shows a layout view of the board with the top and bottom layers.

Figure 5.6: Altium Designer layout view of the bridge board with top layer (red), bottom layer (blue), and silkscreen (yellow).

The PCB is a 4-layer board. External layers are used for routing signals and internal layers are planes. Layer 1 (top) contains most of the components and USB routing. Layer 2 is a ground plane that provides reference for the USB differential pairs so as to keep the desired impedance. Layer 3 is a power plane, and layer 4 (bottom) contains the trigger connectors and additional routing. The bridge board alone weighs 95 grams, and is 150 mm long by 138 mm wide. Figure 5.7 shows a CAD 3D model of the bridge board.

Figure 5.7: CAD 3D model of the bridge board.

It is important to note that the bridge board itself is a USB 2.0 multi-camera platform that can be modified and adapted to work with a wide range of cameras. While this board was designed for the DMM 42BUC03-ML camera modules, it is perfectly possible to connect other cameras. The Imaging Source provides other USB 2.0 board-level camera modules with different features but the same form factor. Therefore, it is possible to replace the selected DMM 42BUC03-ML camera module with other cameras from the same manufacturer and still utilize the direct connection to the mini USB connector, the mounting holes, and the trigger connection. Alternatively, it is also possible to use any other camera module as long as it communicates via USB 2.0. Additional cables might be needed to complete the USB and trigger connections between the cameras and the bridge board, and extra work might be required for attaching the cameras.

The bridge board provides all necessary electronics for the connection of multiple USB 2.0 cameras to different upstream USB ports via configurable USB HUBs. It also provides a configurable trigger system handled by a reprogrammable MCU. The following sections cover specific features of the bridge board in detail.

5.3.1 USB HUB

A configurable USB 2.0 HUB network in the board allows for the connection of up to 12 USB 2.0 camera modules (Figure 5.8). The cameras are organized in groups of four, going through a first level of USB HUBs and then USB switches that offer two different routing options. The first option is a second-level USB HUB that connects all cameras to a master upstream USB port. The second option routes each group to its own dedicated upstream USB port. The USB switches are controlled with user-selectable jumpers. The flexibility of this approach allows the user to choose between isolating and merging USB communications depending on the USB bandwidth requirements.

Figure 5.8: Diagram of the configurable USB HUB network. SW 1, SW 2, and SW 3 are 2-to-1 bidirectional USB 2.0 switches with inputs controlled by user-selectable jumpers. USB 1, USB 2, and USB 3 are the isolated upstream ports, and USB M is the master upstream USB port.

By isolating a first-level HUB, an independent group of 4 cameras is routed to its own upstream mini USB port (USB 1, 2, or 3). A lower number of cameras reduces the load and bandwidth requirements on a USB port. This setup is used for cameras that require a high USB bandwidth. Alternatively, when using cameras with a lower USB bandwidth requirement, it is possible to merge some or all USB communication in the bridge board into a single USB port (USB M). This approach simplifies the connection of the bridge board to an external computer.

5.3.2 Trigger

The bridge board includes a trigger system based on a microcontroller unit (MCU). The MCU is an 8-bit 16 MHz ATMEGA328P-AU compatible with the Arduino framework and development environment [95, 96]. The MCU uses general purpose input/output (GPIO) pins to generate the pulses that trigger the cameras. A total of 6 GPIO pins from the same MCU port are used, and each pin connects to a group of 2 cameras. By having 6 independent trigger pins it is possible to define custom triggering sequences for each pair of cameras. Conversely, by having the 6 pins on the same MCU port it is possible to trigger all of them at the same time, thus allowing accurate synchronization of all cameras. Trigger signals go through a jumper section in the board, allowing the user to break the connection between the MCU and the cameras. By means of this it is possible to connect the MCU pins to a different location, and/or connect trigger signals from an external system. This approach brings more flexibility to the design, making it easy to modify according to the requirements. Figure 5.9 shows a diagram of the trigger system.

Figure 5.9: Diagram of the trigger system based on a microcontroller unit (MCU). Trigger signals are generated via GPIOs. A UART connection via USB is provided for interactively interfacing with the MCU via serial commands.

The MCU is also connected via UART to a UART-to-USB transceiver, which in turn connects to a USB connector. This addition provides an easy way to flash the MCU as long as the corresponding bootloader is present. More importantly, this connection provides a way to communicate interactively with the MCU over USB serial. A set of commands has been defined in the default MCU program to make it possible to change some settings of the trigger system on the go, without needing to modify the program and re-flash it. Features such as changing the period of the automatic trigger, starting or stopping the automatic trigger, and sending trigger signals on command are included in the MCU program and can be controlled via serial commands. Having multiple functions and modes of operation of the trigger system integrated in the same program makes it easy to use, flexible, and versatile.

5.4 Software

Similar to the previous imaging system (Chapter 4), a software package was developed for interfacing with the imaging system. The core of the software remained the same two elements: the Video for Linux Two (V4L2) API, used to interface with the lower-level UVC driver, and GStreamer, used for simultaneously interfacing with multiple cameras.

Similarly, the software package is based on a collection of shell scripts that take care of configuring the cameras and building the acquisition, processing, and storage pipelines for each camera. The script runs according to settings specified in a configuration file that is given to the script as an argument. A series of external configuration files stores different variants of the system and allows for customization of the program.
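As an illustration of the kind of per-camera pipeline these scripts build, the following is a minimal Python/GStreamer sketch for a single camera; the device path, caps (Y800 maps to GRAY8 in GStreamer), frame rate, and output file pattern are assumptions rather than the exact pipelines used in this work.

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
# Acquire GRAY8 frames from one UVC camera and store them as numbered PNG files.
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,format=GRAY8,width=1280,height=960,framerate=10/1 ! "
    "videoconvert ! pngenc ! multifilesink location=cam0_%05d.png"
)
pipeline.set_state(Gst.State.PLAYING)
# ... capture for as long as needed, then shut the pipeline down:
# pipeline.set_state(Gst.State.NULL)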

The trigger system MCU runs a program written in C++. The program sends simultaneous pulsesto all cameras periodically. The triggering period is set to 1 second by default. This value is loadedfrom the MCU EEPROM. The program also has a serial terminal-like interface that accepts customdefined AT commands. The commands allow to stop and restart the periodic trigger, change thetriggering period (any new value is saved in the EEPROM), and send a single trigger.
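Because the exact command strings are defined in the MCU program (see the project repository), the following pyserial session is only a hypothetical example of this interface; the port name, baud rate, and AT-command syntax are placeholders.

import serial  # pyserial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as mcu:
    mcu.write(b"AT+STOP\r\n")          # hypothetical: stop the periodic trigger
    mcu.write(b"AT+PERIOD=500\r\n")    # hypothetical: set a 500 ms period (stored in EEPROM)
    mcu.write(b"AT+TRIG\r\n")          # hypothetical: fire a single synchronized trigger
    print(mcu.readline().decode(errors="ignore"))  # read any acknowledgement from the MCU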

5.5 Testing

An experiment was designed to test the system in the lab with different configurations of the cameras, and to determine the effect on the USB traffic load as well as the maximum number of cameras that can work simultaneously for a given configuration. The platform was tested with DMM 42BUC03-ML camera modules synchronized using the trigger system. The system was connected through a single upstream USB port to an external computer running Linux. USB traffic was analyzed using the sniffing tool Wireshark2 [97]. Resolution, frame rate, and the number of simultaneous cameras were the controlled variables. The total throughput in the USB 2.0 channel was monitored. Table 5.4 summarizes the scenarios that worked successfully, with images captured simultaneously from all cameras and saved locally.

2 https://www.wireshark.org/

Table 5.4: Test results of USB traffic with different configurations. Cameras were connected to a single USB upstream port and triggered simultaneously. Data transferred is based on a single frame from each camera.

Test #   Resolution   Frame Rate (fps)   # Cam   Data Transferred (Bytes)   Throughput (MB/s)
1        1280 x 960   30                 1       1,238,530                  38.98
2        1280 x 960   10                 1       1,238,530                  12.93
3        1280 x 960   10                 2       2,477,060                  25.90
4        1280 x 960   10                 3       3,715,590                  38.95
5        640 x 480    30                 2       619,268                    20.87
6        640 x 480    15                 2       619,268                    10.43
7        640 x 480    15                 4       1,238,536                  20.87
8        640 x 480    15                 8       2,477,072                  41.61
9        320 x 240    15                 8       619,280                    10.32
10       320 x 240    15                 12      928,920                    15.51
11       320 x 240    30                 12      928,920                    29.44

Images from the DMM 42BUC03-ML are transmitted over USB in chunks (packets) of 16,448 bytes. First, the host (computer) sends a packet request, and then the peripheral (camera) sends the response packet (which includes overhead and payload). This process continues until the whole frame has been transferred. The total data transferred reported in Table 5.4 takes into account all traffic, which includes request packets and response data packets. As mentioned before, the maximum throughput in practice for USB 2.0 is around 40 MB/s. Tests 1, 4, and 8 are working configurations at the limit of the USB bandwidth capacity. Adding an additional camera to these configurations would exceed the USB 2.0 bandwidth and would result in dropped packets, representing permanently lost fractions of camera frames.

It is evident that decreasing the resolution leads to less data being transferred, and lowering the frame rate gives more time for data to transfer. Both of these conditions cause the throughput to decrease and allow the connection of more cameras. The resolution and frame rate settings define the maximum number of cameras that can work at the same time.

Based on the results of these tests it is possible to estimate the maximum number of cameras that can work simultaneously with a certain configuration. The estimation uses the maximum practical USB bandwidth of the system and an estimate of the total throughput.

\text{Max \# Cameras} = \frac{\text{USB BW}}{\text{Throughput}} \qquad (5.1)

The USB BW is set to 40 × 10^6 Bytes/s, and the Throughput is given by

\text{Throughput} = \frac{\text{Data Frame}}{\text{Transfer Time}} \qquad (5.2)

\text{Data Frame} = SF \times (\text{Width} \times \text{Height}) \qquad (5.3)

\text{Transfer Time} = \frac{1}{\text{Frame rate}} \qquad (5.4)

Since the Y800 format used by the cameras assigns one byte per pixel, Data Frame is estimated as the total number of bytes required by the image, multiplied by a scaling factor (SF = 1.01) to account for the response packet overhead and the request packets. This scaling factor was calculated from the size of the request packets, response packets, and response packet overhead. Each request packet is 64 bytes, and each response packet is 16,448 bytes, out of which 100 bytes are overhead and the rest is payload. The scaling factor is the ratio of the total number of bytes exchanged via USB to the actual number of payload bytes (16,512 / 16,348 = 1.01). The transfer time is equivalent to the readout time, which is the reciprocal of the frame rate. After substituting (5.2), (5.3), and (5.4) in (5.1) we get

\text{Max \# Cameras} = \frac{\text{USB BW}}{SF \times \text{Width} \times \text{Height} \times \text{Frame rate}} \qquad (5.5)

The integer part of this number gives an accurate estimation of the maximum number of cameras that can work simultaneously for a certain resolution and frame rate.
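A minimal Python sketch of Eq. (5.5) is shown below; the constants come from the text (40 × 10^6 Bytes/s and SF = 1.01), while the function name and example configurations are illustrative.

USB_BW = 40e6   # practical USB 2.0 bandwidth, Bytes/s
SF = 1.01       # packet-overhead scaling factor

def max_cameras(width, height, frame_rate):
    # Eq. (5.2)-(5.4): one byte per pixel (Y800), transfer time = 1 / frame rate.
    throughput_per_camera = SF * width * height * frame_rate  # Bytes/s
    # Eq. (5.5): keep the integer part of the bandwidth ratio.
    return int(USB_BW // throughput_per_camera)

for w, h, fps in [(1280, 960, 30), (1280, 960, 10), (640, 480, 15), (320, 240, 30)]:
    print(f"{w} x {h} @ {fps} fps -> up to {max_cameras(w, h, fps)} cameras")
# Prints 1, 3, 8, and 17 (the last limited to 12 by the board), consistent with
# Tests 1, 4, 8, and 11 in Table 5.4.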

The final system, shown in Figure 5.10, has been successfully tested with an ODROID C2 single board computer (SBC). Given the low CPU load of the software, it can run with no issues on many relatively inexpensive SBCs. The imaging board is 150 mm long by 138 mm wide, and 43 mm tall including the camera modules and lenses, although the size and focus point of the lenses affect the overall height. The board weighs 323 grams with 12 DMM 42BUC03-ML cameras, standoffs, and fasteners.

Figure 5.10: Multispectral imaging system hardware with 12 cameras installed.

Finally, Table 5.5 shows an extension of Table 2.4 presented previously in section 2.2.3, with the addition of the 12-band multispectral imaging system presented in this section for comparison.

Table 5.5: Comparison of common multispectral imaging systems with the system presented in this work.

Camera               This Work        Tetracam         Tetracam         Sentek          Sentera         Parrot
                                      ADC Micro        Mini-MCA4        GEMS            QUAD            Sequoia
Dimensions (mm)      150 x 138 x 43   75 x 59 x 33     131 x 78 x 87    127 x 89        76 x 62 x 48    59 x 41 x 28
Weight (g)           323              90               600              320             170             72
Number of Bands      12               4                4                4               4               4
Focal Length (mm)    6.00             8.43             9.6              7.7             5.0             3.98
FOV (H° x V°)        43.6 x 33.4      37.6 x 28.7      38.2 x 30.9      34.6 x 26.3     51.2 x 39.6     61.9 x 48.5
Sensor size (mm)     4.8 x 3.6        6.55 x 4.92      6.66 x 5.32      4.8 x 3.6       4.8 x 3.6       4.8 x 3.6
Resolution (pixels)  1280 x 960       2048 x 1536      1280 x 1024      1280 x 960      1280 x 960      1280 x 960
Pixel size (µm)      3.75             3.12             5.2              3.75            3.75            3.75
GSD (cm) @ 120 m     7.5              4.6              6.6              5.1             9.1             11.0
Price (USD)          See Table 5.6    2,995.00         9,995.00         3,500.00        4,599.00        3,500.00

5.6 Costs

Table 5.6 presents a list of general costs for different configurations of the multispectral imaging system hardware. Table 5.7 presents the costs per unit or set for the bridge board bare PCB and BOM based on order quantity. All prices are in US Dollars (USD).

Table 5.6: General costs of components for the multispectral imaging system hardware.

Item                     Unit Cost    3 bands @ 1.2 MP    8 bands @ 0.3 MP    9 bands @ 1.2 MP    12 bands @ 0.3 MP
                                      Qty.    Cost        Qty.    Cost        Qty.    Cost        Qty.    Cost
Bridge board bare PCB    25.80        1       25.80       1       25.80       1       25.80       1       25.80
Bridge board BOM         47.94        1       47.94       1       47.94       1       47.94       1       47.94
Camera                   289.00       3       867.00      8       2,312.00    9       2,601.00    12      3,468.00
Lens holder              16.80        3       50.40       8       134.40      9       151.20      12      201.60
Lens                     6.00         3       18.00       8       24.00       9       27.00       12      72.00
Optical filter           56.00        3       168.00      8       448.00      9       504.00      12      672.00
ODROID                   46.00        1       46.00       1       46.00       3       138.00      2       92.00
Miscellaneous (*)        10.00        3       30.00       8       80.00       9       90.00       12      80.00
TOTAL                                         1,253.14            3,118.14            3,584.94            4,659.34

(*) fasteners, standoffs, connectors, cables, etc.

Table 5.7: Costs per set/unit based on order quantity for Bridge Board bare PCB and BOM.

Item                     Cost per set/unit    Cost per set/unit    Cost per set/unit
                         @ Qty = 1            @ Qty = 10           @ Qty = 100
Bridge board bare PCB    25.80                25.80                15.20
Bridge board BOM         47.94                39.71                31.35

The costs presented in the tables above do not account for full PCB assembly (PCBA). Nonetheless, compared to the prices of the commercial cameras presented in Table 5.5, a 3-band multispectral imaging system can be built using the approach and design presented in this thesis for a total cost that is less than half the price of the cheapest camera (USD 2,995.00).

5.7 Focusing

A fundamental feature of this flexible imaging system is the possibility of changing lenses and focusing at different distances. However, focusing at a certain distance can be a challenge when working with multiple cameras. Since each camera has to be focused manually and individually, it is important to achieve the best possible focus for all the cameras. Custom software was developed to provide a live stream of the camera capture and to quantify focus, so as to provide a metric that can be used for achieving the best possible focus on a given target.

Pertuz et al. [98] presented an analysis and comparison of focus measure operators. They concluded that the Laplacian-based operators presented the best overall performance for normal imaging conditions (i.e., without addition of noise, contrast reduction or image saturation). While all the operators present different strengths and weaknesses, and they can perform differently depending on the application, the Laplacian-based operators provide a good balance between speed and robustness.

The variance of Laplacian operator [99] was chosen for its mathematical simplicity and fast computational speed, and because it gave consistent and precise results during testing with the image stream from the cameras. The variance of Laplacian is based on the use of a second-derivative operator that passes the high spatial frequencies associated with sharp edges. The second-derivative operator used in this method is the Laplacian operator, which can be approximated using the mask

L = \begin{bmatrix} 0 & 1 & 0 \\ 1 & -4 & 1 \\ 0 & 1 & 0 \end{bmatrix} \qquad (5.6)

The above is a positive Laplacian operator mask, in which the center element is negative and the corner elements are zero. This mask is used to extract outward edges in an image. The convolution of a grayscale image with this mask highlights gray-level discontinuities. The result of this operation is an image with grayish edge lines and other discontinuities on a dark background. Figure 5.11 shows a comparison of a sample image before and after applying the positive Laplacian operator.

Figure 5.11: Example of a sample grayscale image (left) after applying the positive Laplacian operator (right).

The focus metric calculates the variance of the absolute values, which leads to the focus measure given by

\text{LAP\_VAR}(I) = \sum_{m}^{M} \sum_{n}^{N} \left( \left| \Delta I(m,n) \right| - \overline{\Delta I} \right)^{2} \qquad (5.7)

where ΔI(m,n) is the convolution of the input image I(m,n) with the mask L, and the mean of the absolute values is given by

\overline{\Delta I} = \frac{1}{NM} \sum_{m}^{M} \sum_{n}^{N} \left| \Delta I(m,n) \right| \qquad (5.8)
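A minimal Python/OpenCV sketch of this metric is given below; cv2.Laplacian with ksize=1 applies exactly the mask L of (5.6), and the capture device index is an assumption.

import cv2
import numpy as np

def lap_var(gray):
    # Convolve with the 3x3 Laplacian mask of Eq. (5.6).
    lap = cv2.Laplacian(gray.astype(np.float64), cv2.CV_64F, ksize=1)
    abs_lap = np.abs(lap)
    # Eq. (5.7): sum of squared deviations of |ΔI| from its mean, Eq. (5.8).
    return float(np.sum((abs_lap - abs_lap.mean()) ** 2))

cap = cv2.VideoCapture(0)                      # first UVC camera (assumed index)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print("focus measure:", lap_var(gray))
cap.release()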

Figure 5.12 shows a graph of the focus measure of a camera as the focus distance is adjusted by bringing the lens closer, going from out of focus (too far) to out of focus (too close), passing through the point of best focus (in-focus). Three points are highlighted in the graph: two random points in the out-of-focus regions (too far and too close), and the in-focus point, which is the maximum in the graph. Figure 5.13 shows the three reference images that correspond to the highlighted points in the graph.

Figure 5.12: Graph of the focus measure calculated using the above algorithm as the focus is adjusted from out of focus (too far) to out of focus (too close), passing through the in-focus point. Any point outside the in-focus point is out of focus. The graph highlights two random out-of-focus points for reference, one too far and one too close.

Figure 5.13: Sample images of the three highlighted points in the graph of Figure 5.12. (a) Out of focus (too far). (b) In-focus. (c) Out of focus (too close).

A test was also performed to find the standard deviation of the focus measurement with and without external vibrations of the lens. More than 100 frames were captured for each case, and the focus metric was calculated. With the lens fixed (no vibrations) the standard deviation of the focus measurement was 3.0302. With the lens under vibrations (held in place by hand) the standard deviation of the focus measurement was 14.2966.

In practice, each camera was focused using the same target scene and the real-time focus measurement software. The reported real-time focus measure was used as an indicator to adjust the lenses to the best possible focus. An acceptable threshold for this scale was set at the maximum value (in-focus) minus 5. This means, for example, that in the case of the graph shown in Figure 5.12, where the maximum value (in-focus) is 85.63, a value above 80.63 will still produce an acceptably focused image.

5.8 Image Alignment

Alignment of the images for this system is more complex since cameras are organized in a 3 x 4 array, which means 2D translation is necessary. For this purpose, a subpixel registration algorithm based on intensity [100] was used. The algorithm minimizes the mean square intensity difference between a reference image and a test image. The transformation of the test image, in order to achieve alignment to the reference image, can be done by rigid-body motion (rotation and translation) only, or combined with isometric scaling.

The software platform ImageJ3, together with the image registration plugin StackReg4, was used for aligning the image sets. ImageJ [101] is a public Java-based image processing program developed at the National Institutes of Health. ImageJ offers multiple distributions with different features; the Fiji [102] distribution is the recommended, updated, general-purpose option to date, and it was the one used in this project. StackReg is a software implementation of the aforementioned algorithm [100], written as a Java plugin for ImageJ by the Biomedical Imaging Group of the École Polytechnique Fédérale de Lausanne.

The algorithm uses each image of a stack (all the images or channels that belong to the same instant) as the template for aligning the next image. The first image of the stack is used as the anchor to begin the alignment process. The algorithm was implemented using rigid-body motion (rotation and translation) combined with isometric scaling. Scaling is used to account for any zooming effect that results from differences in focus. Performing these transformations on the images requires access to data outside the image frame. As such data is not available, it is replaced by zeroes in the output image. Figure 5.14 shows an example of two images being aligned. Transformations in the test image result in edges and empty spaces filled with zeroes, which translates to black borders framing the output image.
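For reference, the sketch below illustrates the same idea (intensity-based registration of a test band to a reference band) in Python with OpenCV's ECC algorithm. It is an alternative to the StackReg plugin actually used in this work: the Euclidean motion model shown handles rotation and translation only (cv2.MOTION_AFFINE would also absorb scaling), and the OpenCV 4.x signature of findTransformECC is assumed.

import cv2
import numpy as np

def align_to_reference(ref, test):
    # Estimate a rigid-body (Euclidean) warp that maximizes the ECC intensity
    # similarity between the test band and the reference band.
    ref32, test32 = ref.astype(np.float32), test.astype(np.float32)
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    _, warp = cv2.findTransformECC(ref32, test32, warp,
                                   cv2.MOTION_EUCLIDEAN, criteria, None, 5)
    h, w = ref.shape
    # Warp the test band onto the reference grid; uncovered pixels become zeros,
    # which appear as the black borders described above.
    return cv2.warpAffine(test, warp, (w, h),
                          flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP,
                          borderValue=0)

In a stack-wise loop, each band (loaded with cv2.imread(path, cv2.IMREAD_GRAYSCALE)) would be registered against the previously aligned band, starting from the first (anchor) image, mirroring the procedure described above.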

3 https://imagej.net
4 http://www.epfl.ch/thevenaz/stackreg/

Figure 5.14: Result of aligning two images. (a) Template or reference image. (b) Test image aligned to the template image. It is possible to see the black borders filling the empty spaces after alignment.

ImageJ allows for the creation of scripts that help automate operations. A script was written in Java to run in ImageJ that takes a folder with multiple stacks of multiple images each and runs a loop aligning all the stacks. The program creates a folder where it saves all the aligned images using the same names. Figure 5.15 shows a comparison of a composite of 3 superposed images before and after alignment.

Figure 5.15: Result of the image alignment algorithm. (a) Superposition before alignment. (b) Superposition after alignment.

A second version of the script was created in which it is possible to export the transformation matrices of one stack and use them to align the rest of the stacks in the same folder. This implementation calculates the transformation matrices once and reuses them for the rest of the stacks, thus reducing processing time.

Chapter 6

Conclusions and Future Work

The work presented in this thesis introduces the development process of a UAV-compatible multispectral imaging system, detailing the design, construction, testing, and validation steps. The work covered the development of a customized fixed-wing UAV, and then focused on the development of the imaging system in all its revisions.

The UAV platform that was being used by the University of Alberta Aerial Robotics Group (UAARG) served as the starting point. The airframe was redesigned so as to improve construction, installation, maintenance, and, more importantly, repeatability and reliability. The onboard autonomous navigation system hardware was redesigned to integrate all necessary components in a compact package, and a novel redundant power supply board was added. This UAV system board solution was designed to be fully compatible with the Paparazzi platform, and it can be used with any fixed-wing or rotary-wing airframe. The integration of all necessary components saves time during assembly, debugging, and deployment. Additionally, the system board gives access to expansion ports for further development. Overall, better wiring management inside the plane, more efficient use of internal volume, and the adoption of modularity allowed the new version of the fixed-wing UAV (Falcon I) to be a robust and reliable platform for research and development.

Imaging testing started with the evaluation of a small USB 2.0 CMOS camera module, with features similar to those found in webcams. USB 2.0 was chosen for its popularity and relative hardware/software simplicity compared to other interfaces such as USB 3.0, FireWire, or Gigabit Ethernet, which are faster but would increase weight and cost. The camera module was successfully studied and tested across different configurations. USB camera modules are relatively simple imaging systems based on an image sensor, a microcontroller, and a USB 2.0 interface. With a relatively wide variety of image sensor sizes, which in turn provide several resolution options, USB camera modules are a compact, simple, and efficient solution for aerial imaging. These types of cameras are an alternative to higher-end consumer cameras, offering simpler hardware that incorporates only the necessary components. This approach provides higher flexibility when it comes to developing custom imaging systems.

With the successful results of the USB camera module testing, a custom 3-band multispectral imaging system was developed based on similar USB camera modules and optical filters. This proof-of-concept system used 3 monochromatic camera modules with a 1.2 megapixel CMOS image sensor. The selection of this resolution was based on basic market research that looked at commercial UAV-targeted multispectral cameras. The spectral bands used in these cameras were also studied, and a selection of three bands was made for this custom system. While the filters are easily replaceable, the green, red, and near-infrared (NIR) bands were selected. A key factor in the selection of the cameras was the shutter. The cameras used in this system had a rolling shutter, in contrast with the global shutter that is used in commercial multispectral cameras. It was shown that using a rolling shutter in a 1.2 megapixel camera under standard fixed-wing UAV flying conditions did not present negative effects. Common imaging problems associated with rolling shutter, such as skew and motion blur, were not visible in any of the images during normal flight. Another key factor was the use of three free-running cameras capturing images at 30 frames per second each with no hardware synchronization between them. Custom software was written utilizing third-party tools to interface with the cameras for configuration and image extraction. The developed software was optimized for low CPU usage and multiple-camera interfacing, allowing software synchronization between the three cameras.

After successfully flying the 3-band multispectral imaging system, images were aligned using custom alignment software written in MATLAB and based on matrix correlation. The software is a simplified implementation of one-dimensional translation that looks for a correlation maximum between two images. While simpler than more advanced image registration and alignment algorithms, it proved effective and fast. After alignment, the Normalized Difference Vegetation Index (NDVI) was calculated using the red and NIR images. Results showed successful and accurate detection of vegetation, with clear discrimination between different levels of vegetation stress.

The subsequent evolution of the project was to develop a 12-band multispectral imaging system. In this case, the same approach of multiple USB cameras was maintained. Monochromatic camera modules with 1.2 megapixel CMOS image sensors were also used. The cameras selected for this system have a global shutter and an external trigger. The external trigger is a key feature of the system, allowing the cameras to be accurately synchronized. A custom base board was designed containing a configurable USB HUB network and an onboard microcontroller that takes care of triggering the cameras. The system was built and tested with different configurations, and the maximum number of simultaneous cameras for a certain configuration was determined. A way of estimating the maximum number of cameras based on resolution, frame rate, and the actual practical throughput of USB 2.0 (35-40 MB/s) was provided. This customizable multi-camera imaging system provides a flexible and versatile development platform for multispectral imaging applications. A method for achieving proper focus on all cameras was presented, based on custom software that quantifies focus in real time. Using this focus metric it was possible to bring all cameras to the best possible focus using a reference target scene. Finally, an alignment method was also introduced based on the ImageJ image processing software. The alignment algorithm uses subpixel registration based on intensity, and can operate using two-dimensional translation, rotation, and isometric scaling.

In terms of cost, the 12-band multispectral imaging hardware presented in this work can achieve a considerably lower cost when compared to similar UAV-compatible commercial multispectral cameras for the same number of bands. The custom bridge board is a multifunctional and important piece of hardware with a relatively low cost, which drops further as order quantities increase. A lower overall cost, higher development flexibility and control, and capacity for expansion make this system a viable alternative to commercial off-the-shelf systems.

The whole process presented in this work followed a careful methodology that started with the testing of a single USB 2.0 camera module. After validation, this type of camera became the core of a multi-camera imaging system. The first revision, or proof of concept, consisted of a combination of 3 cameras and optical filters for the development of a 3-band multispectral imaging system. After successful testing, the design evolved into a 12-band multispectral imaging system with an integrated USB HUB and an automatic camera triggering system. The work done included hardware and software development, and provides a full imaging solution for custom applications. All the software development and code used in this project and referenced in this thesis can be found in the following GitHub repository: https://github.com/JLMARIN/imaging.git.

6.1 Future Work

Future work on this thesis would contemplate the integration of small single board computers (SBCs) into the 12-band multispectral imaging system. With the considerably low cost of SBCs, it should be possible to find a small but powerful board with minimal components that would connect via USB to a group of cameras through the camera bridge board. The SBC would configure and extract images from the cameras, which are all synchronized thanks to the trigger system. With a network of several SBCs connected via a high-bandwidth channel such as Ethernet or Wi-Fi, a master SBC could gather all images from the other computers and have them stored on an SD card or transferred to an external system. The imaging platform and processing computer could then be packaged together into a compact device.

A further hardware development step for the multi-camera imaging system would be to design a PCB with image sensors interfaced directly with an FPGA. This approach would allow costs to drop considerably, and would remove the relatively slow and limited USB 2.0 interface. With the FPGA talking directly to the sensors it would be possible to retrieve images from all of them simultaneously and store the images on an SD card.

Finally, it would be interesting and useful to make the system capable of integration with near-infrared (NIR) and short-wave infrared (SWIR) cameras, expanding the spectral coverage of the overall system. If such cameras provide a trigger input, they can be synchronized with the trigger system that controls the bridge board, and the interface computer can retrieve and save the images.

Bibliography

[1] J. Wilford and J. Creasey, “Landsat thematic mapper,” Geophysical and Remote Sensing Methods for Regolith Exploration, pp. 6–12, 2002.

[2] B. L. Markham and J. L. Barker, “Spectral characterization of the Landsat thematic mapper sensors,” International Journal of Remote Sensing, vol. 6, no. 5, pp. 697–716, 1985.

[3] J. Vogelmann, D. Helder, R. Morfitt, M. Choate, and J. Merchant, “Characterization of Landsat Thematic Mapper radiometry and geometry for land cover analysis,” IGARSS '98. Sensing and Managing the Environment. 1998 IEEE International Geoscience and Remote Sensing Symposium Proceedings (Cat. No.98CH36174), vol. 5, pp. 2405–2407, 1998. [Online]. Available: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=702228

[4] D. L. Williams, J. R. Irons, B. L. Markham, R. F. Nelson, D. L. Toll, R. S. Latty, and M. L. Stauffer, “A statistical evaluation of the advantages of Landsat Thematic Mapper data in comparison to Multispectral Scanner data,” IEEE Transactions on Geoscience and Remote Sensing, vol. GE-22, no. 3, pp. 294–302, 1984.

[5] R. D. Lees, W. R. Lettis, and R. Bernstein, “Evaluation of Landsat Thematic Mapper Imagery for Geologic Applications,” Proceedings of the IEEE, vol. 73, no. 6, pp. 1108–1117, 1985.

[6] D. Lamb, “The use of qualitative airborne multispectral imaging for managing agricultural crops - a case study in south-eastern Australia,” Australian Journal of Experimental Agriculture, vol. 40, no. 5, pp. 725–738, 2000.

[7] Y. Huang, S. J. Thomson, Y. Lan, and S. J. Maas, “Multispectral imaging systems for airborne remote sensing to support agricultural production management,” International Journal of Agricultural and Biological Engineering, vol. 3, no. 1, pp. 50–62, 2010.

[8] S. Nebiker, N. Lack, M. Abächerli, and S. Läderach, “Light-weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases,” International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 2016-Janua, no. July, pp. 963–970, 2016.

[9] P. P. J. Roosjen, J. M. Suomalainen, H. M. Bartholomeus, and J. G. P. W. Clevers, “Hyperspectral reflectance anisotropy measurements using a pushbroom spectrometer on an unmanned aerial vehicle - results for barley, winter wheat, and potato,” Remote Sensing, vol. 8, no. 10, pp. 1–16, 2016.

[10] S. Candiago, F. Remondino, M. D. Giglio, M. Dubbini, and M. Gattelli, “Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images,” Remote Sensing, vol. 7, no. Vi, pp. 4026–4047, 2015.

[11] K. Themistocleous, A. Agapiou, B. Cuca, and D. G. Hadjimitsis, “Unmanned Aerial Systems and Spectroscopy for Remote Sensing Applications in Archaeology,” ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XL-7/W3, no. May, pp. 1419–1423, 2015. [Online]. Available: http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-7-W3/1419/2015/

[12] D. Turner, A. Lucieer, Z. Malenovský, D. H. King, and S. A. Robinson, “Spatial co-registration of ultra-high resolution visible, multispectral and thermal images acquired with a micro-UAV over Antarctic moss beds,” Remote Sensing, vol. 6, no. 5, pp. 4003–4024, 2014.

[13] L. Hassan-Esfahani, A. Torres-Rua, A. Jensen, and M. McKee, “Assessment of surface soil moisture using high-resolution multi-spectral imagery and artificial neural networks,” Remote Sensing, vol. 7, no. 3, pp. 2627–2646, 2015.

[14] Sentek Systems, “GEMS Multispectral Sensor,” 2017. [Online]. Available: http://precisionaguavs.com/sensors/. [Accessed: 2017-06-23].

[15] Sentera LLC, “Sentera Quad Sensor,” 2017. [Online]. Available: https://sentera.com/product/sentera-quad-sensor/. [Accessed: 2017-06-23].

[16] Parrot SA, “Parrot Sequoia,” 2017. [Online]. Available: https://www.parrot.com/ca/business-solutions/parrot-sequoia#parrot-sequoia-. [Accessed: 2017-06-22].

[17] B. E. A. Saleh and M. C. Teich, Fundamentals of Photonics, 2nd ed. Hoboken, N.J.: Wiley-Interscience, 2007.

[18] Wikimedia Commons, “Angle of view,” 2012. [Online]. Available: en.wikipedia.org/wiki/File:Angle_of_view.svg. [Accessed: 2017-06-11].

[19] Panasonic, “How Focal Length Affects Viewing Angle.” [Online]. Available: http://av.jpn.support.panasonic.com/support/global/cs/dsc/knowhow/knowhow12.html. [Accessed: 2017-06-11].

[20] LEMKE, “Do you know the difference between the CCD and CMOS chip technology?” 2014. [Online]. Available: http://www.lemkevision.com/cn/news/. [Accessed: 2017-06-11].

[21] FLIR Knowledge Base, “Key differences between rolling shutter and frame (global) shutter,” 2015. [Online]. Available: www.ptgrey.com/kb/10028#. [Accessed: 2017-11-06].

[22] T. York and R. Jain, “Fundamentals of Image Sensor Performance,” pp. 1–8, 2011. [Online]. Available: http://www.cse.wustl.edu/~jain/cse567-11/ftp/imgsens/index.html

[23] P. Swain and D. Cheskis, “Back-Illuminated Image Sensors Come to the Forefront,” 2008. [Online]. Available: www.photonics.com/Article.aspx?AID=34685

[24] V. Suntharalingam, R. Berger, S. Clark, J. Knecht, A. Messier, K. Newcomb, D. Rathman, R. Slattery, A. Soares, C. Stevenson, K. Warner, D. Young, P. Ang, Lin, B. Mansoorian, and D. Shaver, “A 4-side tileable back illuminated 3d-integrated Mpixel CMOS image sensor,” Digest of Technical Papers - IEEE International Solid-State Circuits Conference, pp. 38–40, 2009.

[25] Edmund Optics, “Introduction to Modulation Transfer Function.” [Online]. Available: www.edmundoptics.com/resources/application-notes/optics/introduction-to-modulation-transfer-function/. [Accessed: 2017-11-16].

[26] A. A. Michelson, Studies in Optics. Chicago, Ill: The University of Chicago Press, 1927.

[27] Dxomark, “MTF.” [Online]. Available: https://www.dxomark.com/glossary/mtf-2/. [Accessed: 2017-11-27].

[28] ISO, “ISO 12233:2017, Photography – Electronic still picture imaging – Resolution and spatial frequency responses,” 2017.

[29] P. D. Burns, “Slanted-Edge MTF for Digital Camera and Scanner Analysis,” in Proc. IS&T 2000 PICS Conference, vol. 3, 2000, pp. 135–138.

[30] S. E. Reichenbach, S. K. Park, and R. Narayanswamy, “Characterizing digital image acquisition devices,” Optical Engineering, vol. 30, no. 2, pp. 170–177, 1991.

[31] D. Williams, “Benchmarking of the ISO 12233 Slanted-edge Spatial Frequency Response Plug-in,” in IS&T's 1998 PICS Conference, 1998.

[32] P. D. Burns, “Refined slanted-edge measurement for practical camera and scanner testing,” IS&T PICS Conference, pp. 191–195, 2002.

[33] P. D. Burns and D. Williams, “Digital Camera and Scanner Performance Evaluation: Standards and Software,” in SPIE/IS&T Electronic Imaging Symposium, 2014, pp. 1–41.

[34] P. D. Burns and D. Williams, “Sampling Efficiency in Digital Camera Performance Standards,” in Proc. SPIE, vol. 6808, 2008, pp. 1–5.

[35] M. H. Schimek, B. Dirks, H. Verkuil, and M. Rubli, Video for Linux Two API Specification - Revision 0.24, 2008.

[36] L. Yinli, Y. Hongli, and Z. Pengpeng, “The Implementation of Embedded Image Acquisition Based on V4L2,” 2011 International Conference on Electronics, Communications and Control (ICECC), pp. 549–552, 2011.

[37] S. D. Burks and J. M. Doe, “GStreamer as a framework for image processing applications in image fusion,” Proc. SPIE 8064, Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2011, vol. 8064, pp. 80640M–80640M-7, 2011. [Online]. Available: http://dx.doi.org/10.1117/12.884797

[38] Gstreamer.freedesktop.org, “What is GStreamer?” [Online]. Available: https://gstreamer.freedesktop.org/documentation/application-development/introduction/gstreamer.html. [Accessed: 2018-01-18].

[39] Tetracam Inc., “Tetracam ADC Micro,” 2017. [Online]. Available: http://tetracam.com/Products-ADC_Micro.htm. [Accessed: 2017-06-22].

[40] Tetracam Inc., “Tetracam Mini MCA,” 2017. [Online]. Available: http://www.tetracam.com/Products-Mini_MCA.htm. [Accessed: 2017-06-22].

[41] A. Burkart, S. Cogliati, A. Schickling, and U. Rascher, “A novel UAV-Based ultra-lightweight spectrometer for field spectroscopy,” IEEE Sensors Journal, vol. 14, no. 1, pp. 62–67, 2014.

[42] J. Suomalainen, N. Anders, S. Iqbal, G. Roerink, J. Franke, P. Wenting, D. Hünniger, H. Bartholomeus, R. Becker, and L. Kooistra, “A lightweight hyperspectral mapping system and photogrammetric processing chain for unmanned aerial vehicles,” Remote Sensing, vol. 6, no. 11, pp. 11013–11030, 2014.

[43] F. Agüera, F. Carvajal, and M. Pérez, “Measuring Sunflower Nitrogen Status From an Unmanned Aerial Vehicle-Based System and an on the Ground Device,” ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXVIII-1/, no. September, pp. 33–37, 2011.

[44] J. S. West, C. Bravo, R. Oberti, D. Lemaire, D. Moshou, and H. A. McCartney, “The Potential of Optical Canopy Measurement for Targeted Control of Field Crop Diseases,” Annual Review of Phytopathology, vol. 41, no. 1, pp. 593–614, 2003. [Online]. Available: http://www.annualreviews.org/doi/10.1146/annurev.phyto.41.121702.103726

[45] S. Jacquemoud and S. L. Ustin, “Leaf optical properties: a state of the art,” Proc. 8th International Symposium Physical Measurements & Signatures in Remote Sensing, no. January, pp. 223–232, 2001. [Online]. Available: citeulike-article-id:7215646

[46] Y. Wang, G. Li, J. Ding, Z. Guo, S. Tang, R. Liu, and J. Chen, “A combined GLAS and MODIS estimation of the global distribution of mean forest canopy height,” Remote Sensing of Environment, vol. 174, pp. 24–43, 2016. [Online]. Available: http://dx.doi.org/10.1016/j.rse.2015.12.005

[47] Y. Knyazikhin, M. A. Schull, P. Stenberg, M. Mõttus, M. Rautiainen, Y. Yang, A. Marshak, P. Latorre Carmona, R. K. Kaufmann, P. Lewis, M. I. Disney, V. Vanderbilt, A. B. Davis, F. Baret, S. Jacquemoud, A. Lyapustin, and R. B. Myneni, “Hyperspectral remote sensing of foliar nitrogen content,” Proceedings of the National Academy of Sciences of the United States of America, vol. 110, no. 3, pp. E185–92, 2013. [Online]. Available: http://www.pnas.org/content/110/3/E185.short

[48] M. Kneubühler, B. Koetz, S. Huber, M. E. Schaepman, and N. E. Zimmermann, “Space-based spectrodirectional measurements for the improved estimation of ecosystem variables,” Canadian Journal of Remote Sensing, vol. 34, no. 3, pp. 192–205, 2008.

[49] Z. Wang, C. A. Coburn, X. Ren, and P. M. Teillet, “Effect of soil surface roughness and scene components on soil surface bidirectional reflectance factor,” Canadian Journal of Soil Science, vol. 92, no. 2, pp. 297–313, 2012.

[50] P. P. J. Roosjen, H. M. Bartholomeus, and J. G. P. W. Clevers, “Effects of soil moisture content on reflectance anisotropy - Laboratory goniometer measurements and RPV model inversions,” Remote Sensing of Environment, vol. 170, pp. 229–238, 2015. [Online]. Available: http://dx.doi.org/10.1016/j.rse.2015.09.022

[51] A. Bannari, D. Morin, F. Bonn, and A. R. Huete, “A review of vegetation indices,” Remote Sensing Reviews, vol. 13, no. 1, pp. 95–120, 1995. [Online]. Available: http://dx.doi.org/10.1080/02757259509532298

[52] A. Matese, F. Capraro, J. Primicerio, G. Gualato, S. Gennaro, and G. Agati, “Mapping of vine vigor by UAV and anthocyanin content by a non-destructive fluorescence technique,” in Stafford J.V. (eds) Precision agriculture '13. Wageningen: Wageningen Academic Publishers, 2013, pp. 201–208.

[53] A. A. Gitelson, Y. Gritz, and M. N. Merzlyak, “Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves,” Journal of Plant Physiology, vol. 160, no. 3, pp. 271–82, 2003.

[54] H. Zheng, X. Zhou, T. Cheng, X. Yao, Y. Tian, W. Cao, and Y. Zhu, “Evaluation of a UAV-based Hyperspectral Frame Camera for Monitoring the Leaf Nitrogen Concentration in Rice,” in Geoscience and Remote Sensing Symposium (IGARSS), 2016, pp. 7350–7353.

[55] I. Pölönen, H. Saari, J. Kaivosoja, E. Honkavaara, and L. Pesonen, “Hyperspectral imaging based biomass and nitrogen content estimations from light-weight UAV,” in Proceedings of SPIE - The International Society for Optical Engineering, 2013.

[56] V. P. Polischuk, T. M. Shadchina, T. I. Kompanetz, I. G. Budzanivskaya, A. L. Boyko, and A. A. Sozinov, “Changes in reflectance spectrum characteristic of nicotiana debneyi plant under the influence of viral infection,” Archives of Phytopathology and Plant Protection, vol. 31, no. 1, pp. 115–119, 1997. [Online]. Available: http://dx.doi.org/10.1080/03235409709383221

[57] R. Näsi, E. Honkavaara, P. Lyytikäinen-Saarenmaa, M. Blomqvist, P. Litkey, T. Hakala, N. Viljanen, T. Kantola, T. Tanhuanpää, and M. Holopainen, “Using UAV-based photogrammetry and hyperspectral imaging for mapping bark beetle damage at tree-level,” in Remote Sensing, no. 11, 2013, pp. 15467–15493.

[58] F. Garcia-Ruiz, S. Sankaran, J. M. Maja, W. S. Lee, J. Rasmussen, and R. Ehsani, “Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees,” Computers and Electronics in Agriculture, vol. 91, pp. 106–115, 2013. [Online]. Available: http://dx.doi.org/10.1016/j.compag.2012.12.002

[59] J. Flexas, J. M. Escalona, and H. Medrano, “Water stress induces different levels of photosynthesis and electron transport rate regulation in grapevines,” Plant, Cell and Environment, vol. 22, no. 1, pp. 39–48, 1999.

[60] J. Flexas, J. M. Escalona, S. Evain, J. Gulías, I. Moya, C. B. Osmond, and H. Medrano, “Steady-state chlorophyll fluorescence (Fs) measurements as a tool to follow variations of net CO2 assimilation and stomatal conductance during water-stress in C3 plants,” European Space Agency, (Special Publication) ESA SP, no. 527, pp. 26–29, 2002.

[61] P. J. Zarco-Tejada, V. González-Dugo, and J. A. J. Berni, “Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera,” Remote Sensing of Environment, vol. 117, pp. 322–337, 2012. [Online]. Available: http://dx.doi.org/10.1016/j.rse.2011.10.007

[62] E. Honkavaara, M. A. Eskelinen, I. Polonen, H. Saari, H. Ojanen, R. Mannila, C. Holmlund, T. Hakala, P. Litkey, T. Rosnell, N. Viljanen, and M. Pulkkanen, “Remote Sensing of 3-D Geometry and Surface Moisture of a Peat Production Area Using Hyperspectral Frame Cameras in Visible to Short-Wave Infrared Spectral Ranges Onboard a Small Unmanned Airborne Vehicle (UAV),” IEEE Transactions on Geoscience and Remote Sensing, vol. 54, no. 9, pp. 5440–5454, 2016.

[63] R. Bryant, D. Thoma, S. Moran, C. Holifield, D. Goodrich, T. Keefer, G. Paige, D. Williams, and S. Skirvin, “Evaluation of Hyperspectral, Infrared Temperature and Radar Measurements for Monitoring Surface Soil Moisture,” First Interagency Conference in the Watersheds, pp. 528–533, 2003.

[64] M. P. Finn, M. D. Lewis, D. D. Bosch, M. Giraldo, K. Yamamoto, D. G. Sullivan, R. Kincaid, R. Luna, G. K. Allam, C. Kvien, and M. S. Williams, “Remote Sensing of Soil Moisture Using Airborne Hyperspectral Data,” GIScience & Remote Sensing, vol. 48, no. 4, pp. 522–540, 2011.

[65] C. S. Allen, M. P. S. Krekeler, C. Scott Allen, and M. P. S. Krekeler, “Reflectance spectra of crude oils and refined petroleum products on a variety of common substrates,” Proceedings of SPIE, vol. 7687, no. May 2012, pp. 76870L-1–76870L-13, 2010.

[66] B. Hörig, F. Kühn, F. Oschütz, and F. Lehmann, “HyMap hyperspectral remote sensing to detect hydrocarbons,” International Journal of Remote Sensing, vol. 22, no. 8, pp. 1413–1422, 2001.

[67] I. Leifer, W. J. Lehr, D. Simecek-Beatty, E. Bradley, R. Clark, P. Dennison, Y. Hu, S. Matheson, C. E. Jones, B. Holt, M. Reif, D. A. Roberts, J. Svejkovsky, G. Swayze, and J. Wozencraft, “State of the art satellite and airborne marine oil spill remote sensing: Application to the BP Deepwater Horizon oil spill,” Remote Sensing of Environment, vol. 124, pp. 185–209, 2012. [Online]. Available: http://dx.doi.org/10.1016/j.rse.2012.03.024

[68] R. N. Clark, G. A. Swayze, I. Leifer, K. E. Livo, R. Kokaly, T. Hoefen, S. Lundeen, M. Eastwood, R. O. Green, N. Pearson, C. Sarture, I. Mccubbin, D. Roberts, E. Bradley, D. Steele, T. Ryan, R. Dominguez, and Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Team, “A method for quantitative mapping of thick oil spills using imaging spectroscopy,” Tech. Rep. 1167, 2010.

[69] B. Rivard, J. Feng, V. Bushan, and M. Lipsett, “Infrared reflectance hyperspectral features of Athabasca oil sand ore and froth,” Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing, pp. 0–3, 2011.

[70] B. Rivard, D. Lyder, J. Feng, A. Gallie, E. Cloutis, P. Dougan, S. Gonzalez, D. Cox, and M. G. Lipsett, “Bitumen content estimation of Athabasca oil sand from broad band infrared reflectance spectra,” Canadian Journal of Chemical Engineering, vol. 88, no. 5, pp. 830–838, 2010.

[71] D. Hausamann, W. Zirnig, G. Schreier, and P. Strobl, “Monitoring of gas pipelines - a civil UAV application,” Aircraft Engineering and Aerospace Technology, vol. 77, no. 5, pp. 352–360, 2005. [Online]. Available: http://www.emeraldinsight.com/doi/10.1108/00022660510617077

[72] J. Tuominen and T. Lipping, “Detection of Environmental Change Using Hyperspectral Re-mote Sensing at Olkiluoto Repository Site,” POSIVA OY, Eurajoki, Finland, Tech. Rep.,2011.

[73] A. J. S. McGonigle, A. Aiuppa, G. Giudice, G. Tamburello, A. J. Hodson, and S. Gurrieri,“Unmanned aerial vehicle measurements of volcanic carbon dioxide fluxes,” Geophysical

Research Letters, vol. 35, no. 6, pp. 3–6, 2008.

[74] A. Agapiou, D. G. Hadjimitsis, and D. D. Alexakis, “Evaluation of broadband and narrow-band vegetation indices for the identification of archaeological crop marks,” Remote Sensing,vol. 4, no. 12, pp. 3892–3919, 2012.

[75] M. Doneus, G. Verhoeven, C. Atzberger, M. Wess, and M. Rus, “New ways to extract archae-ological information from hyperspectral pixels,” Journal of Archaeological Science, vol. 52,pp. 84–96, 2014.

[76] D. C. Cowley, C. Moriarty, G. Geddes, G. L. Brown, T. Wade, and C. J. Nichol, “UAVsin Context: Archaeological Airborne Recording in a National Body of Survey and Record,”Drones, vol. 2, no. 2, pp. 1–16, 2018.

[77] C. Moriarty, “Deploying Multispectral Remote Sensing for Multitemporal Analysis of Ar-chaeological Crop Stress at Ravenshall, Fife,” Master’s Thesis, University of Edinburgh,2017.

[78] H. Chao, Y. Cao, and Y. Chen, “Autopilots for small unmanned aerial vehicles: A survey,”International Journal of Control, Automation and Systems, vol. 8, no. 1, pp. 36–44, 2010.

[79] H. Lim, J. Park, D. Lee, and H. J. Kim, “Build your own quadrotor: Open-source projectson unmanned aerial vehicles,” IEEE Robotics and Automation Magazine, vol. 19, no. 3, pp.33–45, 2012.

[80] E. Ebeid, M. Skriver, and J. Jin, “A Survey on Open-Source Flight Control Platforms ofUnmanned Aerial Vehicle,” in Euromicro Symposium on Digital Systems Design. IEEE,2017.

[81] B. Remes, P. Esden-Tempski, F. Van Tienen, E. Smeur, C. De Wagter, and G. De Croon,“Lisa-S 2.8g autopilot for GPS-based flight of MAVs,” IMAV 2014, International Micro Air

Vehicle Conference and Competition 2014, pp. 280–285, 2014.

[82] R. W. Beard, D. Kingston, M. Quigley, D. Snyder, R. Christiansen, W. Johnson, T. McLain,and M. Goodrich, “Autonomous Vehicle Technologies for Small Fixed-Wing UAVs,” Journal

of Aerospace Computing, Information, and Communication, vol. 2, no. 1, pp. 92–108, 2005.

Page 121: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

103

[83] D. Stojcsics and A. Molnár, “AirGuardian - UAV hardware and software system for smallsize UAVs,” International Journal of Advanced Robotic Systems, vol. 9, pp. 1–8, 2012.

[84] MicroPilot Company, “MP2128-3X MicroPilot’s Triple Redundant UAV Autopilot,” Tech.Rep. [Online]. Available: https://www.micropilot.com/pdf/white-papers/mp21283x.pdf

[85] J. Marin, R. Augustine, M. Lipsett, and Duncan G. Elliott, “An Open Source ComprehensiveSystem Board for UAVs with Redundant Power,” in Unmanned Canada 2016 Conference,Edmonton, 2016.

[86] B. Gati and Paparazzi Community, “Open Source Autopilot for Academic Research - ThePaparazzi System,” Proceeding of the American Control Conference 2013, pp. 1478–1481,2013.

[87] P. Brisset, A. Drouin, M. Gorraz, P.-s. Huard, and J. Tyler, “The Paparazzi Solution,” MAV

2006, 2nd US-European Competition and Workshop on Micro Air Vehicles, pp. 1–15, 2006.

[88] G. Hattenberger, M. Bronz, and M. Gorraz, “Using the Paparazzi UAV System for ScientificResearch,” IMAV 2014, International Micro Air Vehicle Conference and Competition 2014,pp. 247–252, 2014.

[89] P. Burns, “sfrmat3: SFR evaluation for digital cameras and scanners,” 2015. [Online].Available: http://losburns.com/imaging/software/SFRedge/. [Accessed: 2017-04-10].

[90] J. P. Jhan, J. Y. Rau, N. Haala, and M. Cramer, “Investigation of parallax issues for multi-lensmultispectral camera band co-registration,” International Archives of the Photogrammetry,

Remote Sensing and Spatial Information Sciences - ISPRS Archives, vol. 42, no. 2W6, pp.157–163, 2017.

[91] A. Bhandari, A. Kumar, and G. Singh, “Feature Extraction using Normalized DifferenceVegetation Index (NDVI): A Case Study of Jabalpur City,” Procedia Technology, vol. 6, pp.612–621, 2012.

[92] QImaging, “Global Shutter vs. Rolling Shutter Readouts,” pp. 1–9, 2014. [Online]. Available:https://www.qimaging.com/ccdorscmos/pdfs/RollingvsGlobalShutter.pdf

[93] W. Hong, D. Wei, and A. U. Batur, “Video stabilization and rolling shutter distortion re-duction,” in Proceedings - International Conference on Image Processing, ICIP, 2010, pp.3501–3504.

[94] Z. W. Pan, H. L. Shen, C. Li, S. J. Chen, and J. H. Xin, “Fast Multispectral Imaging by SpatialPixel-Binning and Spectral Unmixing,” IEEE Transactions on Image Processing, vol. 25,no. 8, pp. 3612–3625, 2016.

Page 122: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

104

[95] Arduino, “Arduino - Home Page,” 2018. [Online]. Available: https://www.arduino.cc/.[Accessed: 2018-02-05].

[96] J. C. Martínez-Santos, O. Acevedo-Patino, and S. H. Contreras-Ortiz, “Influence of Arduinoon the Development of Advanced Microcontrollers Courses,” Revista Iberoamericana de Tec-

nologias del Aprendizaje, vol. 12, no. 4, pp. 208–217, 2017.

[97] A. Orebaugh, G. Ramirez, and J. Beale, Wireshark & Ethereal Network Protocol Analyzer

Toolkit. Elsevier, 2006.

[98] S. Pertuz, D. Puig, and M. A. Garcia, “Analysis of focus measure operators forshape-from-focus,” Pattern Recognition, vol. 46, no. 5, pp. 1415–1432, 2013. [Online].Available: http://dx.doi.org/10.1016/j.patcog.2012.11.011

[99] J. Pech-Pacheco, G. Cristobal, J. Chamorro-Martinez, and J. Fernandez-Valdivia, “Diatomautofocusing in brightfield microscopy: a comparative study,” Proceedings 15th International

Conference on Pattern Recognition. ICPR-2000, vol. 3, pp. 314–317, 2000. [Online].Available: http://ieeexplore.ieee.org/document/903548/

[100] P. Thévenaz, U. E. Ruttimann, and M. Unser, “A pyramid approach to subpixel registrationbased on intensity,” IEEE Transactions on Image Processing, vol. 7, no. 1, pp. 27–41, 1998.

[101] C. T. Rueden, J. Schindelin, M. C. Hiner, B. E. DeZonia, A. E. Walter, E. T. Arena, andK. W. Eliceiri, “ImageJ2: ImageJ for the next generation of scientific image data,” BMC

Bioinformatics, vol. 18, no. 1, pp. 1–26, 2017.

[102] J. Schindelin, I. Arganda-Carreras, E. Frise, V. Kaynig, M. Longair, T. Pietzsch, S. Preibisch,C. Rueden, S. Saalfeld, B. Schmid, J. Y. Tinevez, D. J. White, V. Hartenstein, K. Eliceiri,P. Tomancak, and A. Cardona, “Fiji: An open-source platform for biological-image analysis,”Nature Methods, vol. 9, no. 7, pp. 676–682, 2012.

Page 123: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

105

Appendix A

UAV System Board User Guide


Autopilot Embedded System Board - Falcon I
User Guide

Jorge Marin and Duncan G. Elliott

Rev. 1.5 - March 29, 2018

Introduction

This user guide introduces the Autopilot Embedded System Board, developed at the University of Alberta as part of the Falcon I project, and describes its use and development capabilities for Unmanned Aerial Vehicle (UAV) applications. This project was developed with the support and assistance of Rijesh Augustine and the University of Alberta Aerial Robotics Group (UAARG).


Scope

This guide provides details on the Falcon I Autopilot system. It is made up of four main sections:

• Section 1 - describes the Falcon I Autopilot main features.

• Section 2 - provides instructions to power up the Falcon I Autopilot board.

• Section 3 - provides an overview of the Falcon I Autopilot board.

• Section 4 - describes the Falcon I Autopilot board components.

Related Items

• STMicroelectronics STM32F405 Datasheet (http://www.st.com/content/ccc/resource/technical/document/datasheet/ef/92/76/6d/bb/c2/4f/f7/DM00037051.pdf/files/DM00037051.pdf/jcr:content/translations/en.DM00037051.pdf)


Contents

1 Specifications
  1.1 Electrostatic Warning
  1.2 Power Supply Warning

2 Power Up
  2.1 Power Up the Board
  2.2 Programming

3 Hardware Introduction
  3.1 Introduction
  3.2 Equipment List
  3.3 Board Features

4 Board Components
  4.1 Board Overview
  4.2 Function Blocks
    4.2.1 Microcontroller Unit (MCU)
    4.2.2 Power Supply
    4.2.3 Inertial Measurement Unit (IMU)
    4.2.4 GPS
    4.2.5 XBee
    4.2.6 Data-Logger
    4.2.7 CAN Transceiver
    4.2.8 Motor/Servo Outputs (PWM)
    4.2.9 Connectors
    4.2.10 Jumpers
  4.3 Board Pictures
  4.4 Autopilot Board Schematics
  4.5 Autopilot Board Bill-of-Materials (BOM)

5 Revision History


1 Specifications

Table 1.1: Board Specifications.

Operating temperature: 0°C to +70°C
Storage temperature: -40°C to +85°C
Relative humidity: 0 to 90% (non-condensing)
RoHS status: Compliant

1.1 Electrostatic Warning

Warning: ESD-Sensitive Electronic Equipment!

The board must not be subjected to high electrostatic potentials.

It is strongly recommended to use a grounding strap or similar ESD protective device when handling the board in hostile ESD environments (for example, places with synthetic carpets). Avoid touching the component pins or any other metallic element on the board.

1.2 Power Supply Warning

Warning: Hardware Power Supply Limitation

The power supply must be 5 VDC. Using a power adapter greater than 5 VDC may damage the board.

Warning: Hardware Power Budget

Using the USB port as the main power source (max. 500 mA) is acceptable when using the on-board peripherals only.

When connecting external peripherals or add-on boards, it is recommended to use an external power source connected to the J9 POWER header (which can provide up to 1 A on the 3.3V node).


2 Power Up

Several power source options are available to power up the Falcon I Autopilot board.

The board can be:

• USB-powered through the Micro USB connector (USB1 connector).

• Powered through an external 5V source connected via the J9 POWER header. The external power source must be able to supply at least 200 mA for the board alone in its basic configuration; additional peripherals or boards may require more supply current.

• Powered through an external 5V source connected via the J5 expansion header (pins 1 and 2), under the same conditions as above.

Warning: The Falcon I Autopilot board runs at 3.3V. The maximum voltage that the I/O pins can tolerate is 3.3V. Applying higher voltages (e.g., 5V) to an I/O pin could damage the board.

2.1 Power Up the Board

Take the board and connect 5V and GND through the J9 POWER header (check the pinout information for J9 to see which pins to use).

Table 2.1: Electrical Characteristics.

Input voltage: 5 VDC
Max. DC current available on the 3.3V rail: 1 A
I/O voltage: 3.3V only

2.2 Programming

The Falcon I Autopilot board is fully compatible with the Lisa MX from 1 Bit Square, and therefore can be programmed using the same external programmer. This programmer is called the “Black Magic Probe”, and connects directly to the Falcon I Autopilot board through the JTAG port. Connect the other end to a free USB port of your PC using a mini-USB to USB cable.

Open PaparazziUAV, then configure and program the board using the same source files used with the Lisa MX.


3 Hardware Introduction

3.1 Introduction

The Falcon I Autopilot board is a fully-featured development platform for UAV applications. It integrates all major components needed for UAV autopilot control.

3.2 Equipment List

The Falcon I Autopilot board is built around the integration of a Cortex®-M4-based microcontroller with floating-point unit, a 3-axis accelerometer, gyroscope, and magnetometer, a barometer, a GPS unit, a secondary microcontroller with SD card for data-logging, 8 PWM outputs for motors/servos, and expansion headers and connectors.

3.3 Board Features

Table 3.1: Board Specifications.

PCB characteristics: 90 x 52 x 11 mm (4 layers)
Microcontroller: STM32F405VGT6 - 168 MHz 32-bit ARM® Cortex® M4 MCU with 192 KB RAM, 1024 KB Flash, and floating point unit (FPU)
USB: One micro-USB connector
10-DOF IMU: 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, barometer
Remote Control: Two UART ports for remote control receivers
Telemetry: One socket for an XBee radio
GPS: One uBlox MAX-7Q GPS unit. The on-board GPS circuitry includes a SAW filter, an optional LNA, and a selectable uFL or SMA connector for the GPS antenna
Debug Port: One JTAG interface connector
PWM Outputs: Eight PWM outputs for motors/servos
Data-logger: Secondary MCU and micro-SD card socket for telemetry data-logging
Expansion Connectors: One high-speed SPI interface for hardware expansion; two I2C interfaces for actuators and sensors; one CAN interface (with on-board transceiver) for actuators and sensors; three general-purpose I/O pins; three analog inputs; one UART port with adjustable level shifter; one 48-pin header with access to multiple MCU pins
Board Supply Voltage: 5V from USB or the J9 POWER header; on-board power regulation to 3.3V is performed by an LDO linear voltage regulator
User Interface: Bind button; eight indicator LEDs for status check


4 Board Components

4.1 Board Overview

The Falcon I Autopilot board integrates several peripherals and interface connectors, as shown in Figures 4.1 and 4.2.

Figure 4.1: Falcon I Autopilot Board Overview (Top).

Figure 4.2: Falcon I Autopilot Board Overview (Bottom).


The Falcon I Autopilot board is equipped with the following interface connectors:

• J3: PWM outputs for motors/servos

• J5: Expansion 48-pin header with access to power, GPIO, and communication ports

• J6: UART3 port (connected to GPS)

• J7: Control switches connector

• J9: Main power supply

• J10: Expansion connector with two UART ports (UART1 and UART5) for RC receivers and one I2C port (I2C1)

• J11: I2C 3.3V port (I2C2)

• J12: I2C 5V port (I2C2)

• J13: UART1 port (for RC receiver)

• J14: UART5 port (for RC receiver)

• J15: ISP connector for data-logger MCU

• J16: FTDI connector for data-logger MCU

• J17: uFL connector for GPS antenna

• J18: SMA connector for GPS antenna

• J19: SPI port

• J20: UART2 port (connected to XBee)

• J21: UART port with voltage level converter

• J22: Secondary 5V supply for external system

• JTAG1: JTAG 10-pin connector

• USB1: micro-USB connector

• EX1, EX2: Female sockets for XBee

• Various test points located throughout the board


4.2 Function Blocks

Figure 4.3: Falcon I Autopilot Board Block Diagram.

4.2.1 Microcontroller Unit (MCU)

The Falcon I Autopilot board is built around the STM32F405VGT6, a 32-bit ARM® Cortex® M4 microcontroller running at 168 MHz with 192 KB RAM, 1024 KB Flash, and a floating point unit (FPU). The microcontroller includes CAN, I2C, I2S, SPI, and UART communication interfaces, 82 I/O pins, and 10 timers in an LQFP-100 package.

4.2.2 Power Supply

The on-board power supply is split into two 3.3V Low Drop-Out (LDO) voltage regulators. The first voltage regulator (LP2992AIM5-3.3) supplies power to the accelerometer, gyroscope, magnetometer, and barometer sensors, so as to reduce noise in the measurements, while the second voltage regulator (LM3940IMPX-3.3) supplies power to the rest of the board. The latter is capable of delivering up to 1 A.

4.2.3 Inertial Measurement Unit (IMU)

The board features a 10 Degrees-of-Freedom (DOF) Inertial Measurement Unit (IMU) comprising a 3-axis accelerometer and 3-axis gyroscope (MPU6000), a 3-axis magnetometer (HMC5883L), and a barometer (MS5611-01BA03).

The accelerometer/gyroscope and the barometer are connected to the MCU via the SPI interface, and the magnetometer is connected to the accelerometer/gyroscope via the I2C interface.

The accelerometer/gyroscope acts as an I2C master to the magnetometer, pulling its data and saving it locally, which allows the MCU to read all three sensors directly from the MPU6000 via SPI.
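Because the magnetometer readings are mirrored into the MPU6000's external-sensor registers, the MCU can collect accelerometer, temperature, gyroscope, and magnetometer samples in a single SPI burst read. The following is a minimal sketch of that idea, assuming the MPU6000 has already been configured as an I2C master for the HMC5883L; spi2_select_mpu(), spi2_transfer(), and spi2_deselect_mpu() are hypothetical HAL helpers (the autopilot firmware provides its own drivers), and the register addresses come from the MPU6000 register map (ACCEL_XOUT_H = 0x3B, read flag = bit 7).

#include <stdint.h>

/* Hypothetical SPI helpers; the real firmware has its own SPI2 driver. */
extern void    spi2_select_mpu(void);      /* pull the MPU6000 chip-select low  */
extern void    spi2_deselect_mpu(void);    /* release the chip-select           */
extern uint8_t spi2_transfer(uint8_t out); /* clock one byte out, return byte in */

#define MPU_REG_ACCEL_XOUT_H  0x3B  /* first output register (MPU6000 datasheet) */
#define MPU_SPI_READ_FLAG     0x80  /* bit 7 set = read access over SPI          */

typedef struct {
    int16_t accel[3];  /* raw accelerometer counts                         */
    int16_t temp;      /* raw temperature counts                           */
    int16_t gyro[3];   /* raw gyroscope counts                             */
    int16_t mag[3];    /* magnetometer counts mirrored into EXT_SENS_DATA  */
} imu_raw_t;

/* Burst-read accel, temperature, gyro, and the externally fetched magnetometer
 * data (EXT_SENS_DATA_00..05) in one SPI transaction. */
void imu_read_raw(imu_raw_t *out)
{
    uint8_t buf[20];

    spi2_select_mpu();
    spi2_transfer(MPU_REG_ACCEL_XOUT_H | MPU_SPI_READ_FLAG);
    for (int i = 0; i < 20; i++) {
        buf[i] = spi2_transfer(0x00);           /* registers auto-increment */
    }
    spi2_deselect_mpu();

    for (int i = 0; i < 3; i++) {
        out->accel[i] = (int16_t)((buf[2 * i] << 8) | buf[2 * i + 1]);
        out->gyro[i]  = (int16_t)((buf[8 + 2 * i] << 8) | buf[8 + 2 * i + 1]);
        /* Note: the HMC5883L outputs its axes in X, Z, Y order; reorder as needed. */
        out->mag[i]   = (int16_t)((buf[14 + 2 * i] << 8) | buf[14 + 2 * i + 1]);
    }
    out->temp = (int16_t)((buf[6] << 8) | buf[7]);
}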


4.2.4 GPS

The board includes a uBlox MAX-7Q GPS module with a super-capacitor for hot start and antenna matching circuitry. The GPS works in the 1575.42 MHz band. The antenna circuitry includes a SAW filter and an optional Low Noise Amplifier (LNA) that can be enabled/disabled by means of on-board 0 ohm jumper resistors. The board also includes uFL and SMA connectors for the GPS antenna. The connectors can also be selected using 0 ohm jumper resistors.

The default setup has the SMA connector selected and the LNA disabled, so as to use an active antenna. The use of the on-board LNA is advised only when using passive antennas, since active antennas already have an LNA.

The GPS communication interface is serial and is connected to the MCU using the UART3 port. The board also features jumpers for powering the GPS on/off and for connecting/disconnecting the GPS from the UART3 port, in case an external GPS is used. Details on these jumpers can be found in Section 4.2.10.
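By default the MAX-7Q streams standard NMEA sentences on its serial port, so a minimal consumer on the MCU (or on an external system connected through J6) only needs to frame complete sentences and pick out the ones of interest. The sketch below is illustrative only and is not the autopilot's GPS driver; uart3_getc() is a hypothetical blocking read from the UART3 port.

#include <string.h>

/* Hypothetical blocking read of one byte from UART3 (the port wired to the GPS). */
extern char uart3_getc(void);

/* Collect one NMEA sentence ('$' ... CR/LF) into buf and NUL-terminate it. */
static int nmea_read_line(char *buf, int maxlen)
{
    int len = 0;
    char c;

    do { c = uart3_getc(); } while (c != '$');   /* wait for start of a sentence */
    buf[len++] = c;
    while (len < maxlen - 1) {
        c = uart3_getc();
        if (c == '\r' || c == '\n')
            break;
        buf[len++] = c;
    }
    buf[len] = '\0';
    return len;
}

void gps_poll(void)
{
    char line[96];

    nmea_read_line(line, sizeof(line));
    if (strncmp(line, "$GPGGA", 6) == 0) {
        /* $GPGGA carries UTC time, latitude, longitude, fix quality and altitude;
         * split on ',' and convert the fields of interest here. */
    }
}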

4.2.5 XBee

The board includes a set of female headers for connecting an XBee RF transceiver module for telemetry. It is important to connect the XBee with the right orientation. Figure 4.4 shows an example of an XBee module with an SMA connector plugged into the Falcon I Autopilot board. The XBee communication interface is serial and is connected to the MCU using the UART2 port.

Figure 4.4: Falcon I Autopilot Board Overview with XBee connected.

4.2.6 Data-Logger

A secondary MCU, an 8-bit 16 MHz Atmel AVR ATMEGA328P, takes care of listening to the telemetry serial port (UART2) and saving the data to a micro-SD card inserted in the micro-SD card holder located on the bottom of the board. The MCU uses a modified version of the “OpenLog” firmware that was officially adapted by PaparazziUAV to work with their system. Once the data-logging firmware is running there is no need to change any settings, as it starts running and saving data automatically after power up.

The board includes a jumper for powering the data-logger on/off. Details on this jumper can be found in Section 4.2.10.
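Conceptually, the data-logger is a store-and-forward loop: bytes arriving on the telemetry UART are buffered and periodically appended to a file on the micro-SD card. The sketch below only illustrates that loop and is not the OpenLog code itself; uart_available(), uart_read(), and sd_append() are hypothetical helpers standing in for the AVR UART driver and the SD/FAT library.

#include <stdint.h>

/* Hypothetical helpers standing in for the AVR UART driver and SD/FAT library. */
extern int     uart_available(void);                     /* bytes waiting on UART2      */
extern uint8_t uart_read(void);                          /* pop one byte                */
extern void    sd_append(const uint8_t *data, int len);  /* append to the open log file */

#define LOG_BUF_SIZE 512   /* write in SD-sector-sized chunks to limit card latency */

int main(void)
{
    static uint8_t buf[LOG_BUF_SIZE];
    int fill = 0;

    for (;;) {
        while (uart_available() && fill < LOG_BUF_SIZE) {
            buf[fill++] = uart_read();   /* mirror the telemetry stream into the buffer */
        }
        if (fill == LOG_BUF_SIZE) {
            sd_append(buf, fill);        /* flush one full block to the card            */
            fill = 0;                    /* a real logger also flushes on a timeout     */
        }
    }
}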


4.2.7 CAN Transceiver

The board features a Microchip MCP2551 CAN transceiver supporting operation at up to 1 Mb/s. The CANH and CANL pins are accessible through connector J5. Details on this connector can be found in Section 4.2.9.
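From the firmware's point of view the transceiver is transparent: frames are handed to the MCU's CAN controller and the MCP2551 only drives the CANH/CANL bus levels. The following is a minimal sketch of sending one frame, assuming hypothetical can_init() and can_transmit() driver calls (the autopilot firmware supplies the actual CAN peripheral driver).

#include <stdint.h>

/* Hypothetical CAN driver calls; the STM32's CAN peripheral and the autopilot
 * firmware provide the real implementation. */
extern void can_init(uint32_t bitrate_bps);
extern int  can_transmit(uint16_t std_id, const uint8_t *data, uint8_t len);

int main(void)
{
    can_init(1000000u);                    /* 1 Mb/s, the transceiver's rated maximum */

    uint8_t payload[2] = { 0x12, 0x34 };   /* e.g., a 16-bit actuator command         */
    can_transmit(0x321, payload, sizeof(payload));   /* standard 11-bit identifier    */

    for (;;) { /* application loop */ }
}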

4.2.8 Motor/Servo Outputs (PWM)

The Falcon I Autopilot board has eight PWM outputs that can be used to control motors and servos. The outputs run through tri-state buffers, and each signal can be enabled/disabled using one of two user control signals. This functionality is intended to allow external kill-switches for the motor and servo signals.

The PWM outputs and control signals are accessible through connectors J3 and J7 respectively. Details on these connectors can be found in Section 4.2.9.
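For reference, hobby servos and ESCs expect a pulse of roughly 1-2 ms repeated at about 50 Hz, so each output is normally driven with a 20 ms period and a commanded pulse width. The sketch below only illustrates that mapping; pwm_set_period_us() and pwm_set_pulse_us() are hypothetical timer HAL calls, as the actual timer configuration is handled by the autopilot firmware.

#include <stdint.h>

/* Hypothetical timer HAL; the autopilot firmware configures the STM32 timers itself. */
extern void pwm_set_period_us(uint8_t channel, uint32_t period_us);
extern void pwm_set_pulse_us(uint8_t channel, uint32_t pulse_us);

#define SERVO_PERIOD_US  20000u   /* 50 Hz frame, standard for hobby servos/ESCs  */
#define SERVO_MIN_US      1000u   /* ~full deflection one way / motor off         */
#define SERVO_MAX_US      2000u   /* ~full deflection the other way / full power  */

/* Map a normalized command in [-1.0, +1.0] onto one of the eight outputs. */
void servo_write(uint8_t channel, float command)
{
    if (command < -1.0f) command = -1.0f;
    if (command >  1.0f) command =  1.0f;

    uint32_t pulse = SERVO_MIN_US +
        (uint32_t)((command + 1.0f) * 0.5f * (SERVO_MAX_US - SERVO_MIN_US));

    pwm_set_period_us(channel, SERVO_PERIOD_US);
    pwm_set_pulse_us(channel, pulse);
}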

4.2.9 Connectors

J3 Header

Figure 4.5 shows the location and pinout of the motor/servo headers. There are eight 3-pin headers, each one with GND, Power, and PWM output. The Power pins are all tied together and have a 4.8-5.5V range. When an Electronic Speed Controller (ESC) is connected, the power coming from it is used to power the servos.

Each of the 8 PWM outputs can be assigned to one of the two user control switches (as a motor or as a servo), as explained in Section 4.2.8. This is done via solder jumpers SB6-SB13. Details on these jumpers can be found in Section 4.2.10.

Figure 4.5: J3 Header Pinout.


J5 Header

Figure 4.6 shows the location and pinout of the 48-pin expansion header, which can be used to access communication ports, GPIO, analog inputs, etc.

Figure 4.6: J5 Header Pinout.

J6 Header

Figure 4.7 shows the location and pinout of the pico-blade connector for UART3. This serial port can be used to connect an external GPS module. If an external GPS module is used, the on-board GPS module can be disconnected from power using jumper JP10, and the TX and RX lines can be disconnected from the MCU using jumpers JP2 and JP4. Details on these jumpers can be found in Section 4.2.10.

Figure 4.7: J6 Header Pinout.


J7 Header

Figure 4.8 shows the location and pinout of the user control switches. This header can also be used with jumpers, but it is primarily intended to be used with SPST toggle switches, each one connected between pins 1-2, 3-4, 5-6, 7-8, and 9-10.

Figure 4.8: J7 Header Pinout.

From top to bottom, the first three pairs are used for enabling/disabling the main power supply, the external power supply, and the backup power supply, respectively. When the switch (or jumper) is on, those pins are shorted to GND and power is off for the corresponding function. The last two pairs are used for enabling/disabling the servo and motor signals, respectively. When the switch (or jumper) is on, those pins are shorted to 3.3V and the servo/motor signals are disabled (PWM signal disabled and output pulled down to GND). Jumpers SB6-SB13 are used to select which PWM outputs are assigned to motors and which to servos. Details on these jumpers can be found in Section 4.2.10.


J9 Header

Figure 4.9 shows the location and pinout of the power header. This header is used for powering the board, and also carries the external and backup power supplies as well as the power enable signals. Table 4.1 shows a description of each pin of this header.

Figure 4.9: J9 Header Pinout.

Table 4.1: J9 Header Pinout Description.

Pin  Signal                 Description
1    VIN                    Battery voltage for voltage measurement
2    GND                    Ground
3    Backup Power Enable    Enable signal for backup power
4    Servo Power            Backup power for servos
5    External Power Enable  Enable signal for external power
6    External Supply        External power (5V) for secondary system
7    Main Power Enable      Enable signal for main power
8    5V                     Main power (5V)


J10 Header

Header J10 includes 3 communication ports: 1 x I2C and 2 x UART.

Figure 4.10: J10 Header Pinout.

J11 Header

Figure 4.11: J11 Header Pinout.


J12 Header

Figure 4.12: J12 Header Pinout.

J13 Header

Figure 4.13 shows the location and pinout of the pico-blade connector for UART1. This serial port is used to connect an RC receiver.

Figure 4.13: J13 Header Pinout.


J14 Header

Figure 4.14 shows the location and pinout of the pico-blade connector for UART5. This serial port is used to connect an RC receiver.

Figure 4.14: J14 Header Pinout.

J15 Header

Figure 4.15 shows the location and pinout of the ISP connector used for programming the data-logger MCU (ATMEGA328P).

Figure 4.15: J15 Header Pinout.


J16 Header

Figure 4.16 shows the location and pinout of the FTDI connector used for communicating with the data-logger MCU ATMEGA328P.

Figure 4.16: J16 Header Pinout.

J17 Header

Figure 4.17 shows the location of the uFL connector for the GPS antenna. This connector can be enabled by setting the jumper resistor R62 instead of R67.

Figure 4.17: J17 Header Pinout.


J18 Header

Figure 4.18 shows the location of the SMA connector for the GPS antenna. This connector can be enabled by setting the jumper resistor R67 instead of R62.

Figure 4.18: J18 Header Pinout.

J19 Header

Figure 4.19 shows the location and pinout of the SPI interface connector.

Figure 4.19: J19 Header Pinout.


J20 Header

Figure 4.20 shows the location and pinout of the UART2 connector. This serial port is connected to the XBee module used for telemetry. If an external telemetry module is used, it can be connected to this port while leaving the XBee sockets empty.

Figure 4.20: J20 Header Pinout.

J21 Header

Figure 4.21 shows the location and pinout of the UART_LV connector. This UART port has a voltage level shifter and can be used for communication with an external system that uses a different voltage level (e.g., an ODROID Linux mini-PC using 1.8V logic).

Figure 4.21: J21 Header Pinout.


This serial port can be connected to UART2 (currently used for the XBee telemetry module) in case the secondary system only needs to listen to telemetry information, or it can be connected to UART6, which can be used as a dedicated serial link. This selection is made using the SB1 and SB2 solder jumpers. Details on these jumpers can be found in Section 4.2.10.

J22 Header

Figure 4.22 shows the location and pinout of the External Power connector. This connector provides power to an external system. Power comes from pin 6 of header J9.

Figure 4.22: J22 Header Pinout.

JTAG1 Header

Figure 4.23 shows the location of the JTAG connector used for programming the STM32F4 MCU.

Figure 4.23: JTAG1 Header Pinout.


USB1 Header

Figure 4.24 shows the location of the micro-USB connector.

Figure 4.24: USB1 Header Pinout.

EX1, EX2 Headers

Figure 4.25 shows the location of the XBee socket connectors used for plugging in the XBee module. Refer to Section 4.2.5 for the correct orientation of the XBee module.

Figure 4.25: EX1, EX2 Headers Pinout.


4.2.10 Jumpers

Solder Jumpers SB1-SB13

Figure 4.26 shows the location of the solder jumpers SB1 to SB13 on the bottom side of the Falcon I Autopilot board, and Figure 4.27 shows the three sub-groups of solder jumpers. All solder jumpers have the same 3-pad footprint, where the middle pad is the common pin and the two side pads are the options. To make a connection, place a solder bridge between the middle pad and one of the side pads. Some jumpers already come with a default connection by means of a small exposed track (no solder mask) joining the middle pad with one of the side pads. If the default option is not desired, the track needs to be cut so as to disconnect the pads.

Figure 4.26: Location of all solder jumpers.

(a) Solder jumpers SB1-SB2. (b) Solder jumpers SB3-SB5. (c) Solder jumpers SB6-SB13.

Figure 4.27: Solder jumper sub-groups.

Table 4.2 shows the description of each solder jumper and its two options. The location reference for the options (Left, Right, Top, Bottom) is based on the orientation of the jumpers shown in Figure 4.27. Note that SB4 and SB5 have a default option (option 1) already selected by means of a trace connection, so there is no need to place a solder bridge if this option is desired. If option 2 is desired, the trace must be carefully cut using a sharp blade (e.g., a scalpel) before placing the solder bridge.

Warning: For SB4 and SB5, do not solder bridge option 2 without removing the option 1 default connection. Powering the board like this might damage it.


Table 4.2: Solder jumpers descriptions.

SB   Option 1               Option 2               Description
1    (Left) UART6_TX        (Right) UART2_TX       Selects UART (Low Voltage) port (TX pin)
2    (Left) UART2_RX        (Right) UART6_RX       Selects UART (Low Voltage) port (RX pin)
3    (Top) 3.3V             (Bottom) 5V            UART3 power rail
4    (Top) 3.3V (default)   (Bottom) 5V            UART1 power rail
5    (Top) 3.3V (default)   (Bottom) 5V            UART5 power rail
6    (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 1)
7    (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 2)
8    (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 3)
9    (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 4)
10   (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 5)
11   (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 6)
12   (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 7)
13   (Top) Servo Group      (Bottom) Motor Group   PWM signal enable/disable (Output 8)

GPS Signal Jumpers

Figure 4.28 shows the location of the jumpers used for configuring the GPS signal circuitry. There are three GPS signal setup jumpers used for configuring the signal path. Each jumper consists of a 0402 capacitor/resistor that can be soldered vertically or horizontally. The orientation of these components defines the route of the signal. Table 4.3 shows the description of each setup based on these jumpers.

Figure 4.28: Location of GPS signal jumpers.

Table 4.3: GPS signal jumpers descriptions.

Setup                Description
C84 on and R63 on    On-board LNA enabled
C86 on and R66 on    On-board LNA disabled
R62 on               uFL antenna connector enabled
R67 on               SMA antenna connector enabled


GPS and Data-logger Power Jumpers

Figure 4.29 shows the location of the two power jumpers. When JP7 is closed the data-logger is powered, and when JP10 is closed the GPS is powered.

Figure 4.29: Location of power jumpers.

GPS Serial Jumpers

Figure 4.30 shows the location of the GPS serial jumpers. These jumpers form a "T" connection for the UART3 port between the on-board GPS module, the MCU, and the J6 header. JP1 and JP3 connect the RX and TX lines, respectively, of the on-board GPS module to header J6. JP2 and JP4 connect the TX and RX lines, respectively, of the MCU to header J6.

Figure 4.30: Location of GPS serial jumpers.


During normal operation (using the on-board GPS), all 4 jumpers are closed. There are two other cases:

1. JP1 and JP3 can be left open for using an external GPS module connected directly to J6.

2. JP2 and JP4 can be left open for talking directly to the GPS module from an external system through J6. An example of this is connecting to the GPS from a PC in order to configure it.


4.3 Board Pictures

Figure 4.31: Falcon I Autopilot Board (Top).

Figure 4.32: Falcon I Autopilot Board (Bottom).


4.4 Autopilot Board Schematics

This section contains the following schematics:

• Index

• UAARG Autopilot

• Connectors

• MCU

• IMU

• GPS

• CAN, JTAG, LDO, USB

• Telemetry, I2C

• Switches, UI

• Data Logger

• Power

• Main Power

• Backup Power

• Notes

• Documentation

• Revision History

133

Page 152: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

Shee

t Titl

e:

Proj

ectT

itle:

Size

:N

umbe

r:R

ev:

Dat

e:Sh

eet

of

Dra

wn

by:

AIn

dex.

SchD

oc2.

020

-DEC

-201

51

16En

g. J

orge

Mar

in (m

arin

mar

@ua

lber

ta.c

a)

UA

AR

G A

utop

ilot.P

rjPc

b

U_D

ocum

enta

tion

Doc

umen

tatio

n.Sc

hDoc

U_R

evis

ion

His

tory

Rev

isio

n H

isto

ry.S

chD

oc

UA

AR

G A

utop

ilot.P

rjPc

b2.

0Pr

ojec

t Titl

e:R

evis

ion:

Dat

e:20

-DEC

-201

5

Blo

cks

Page

Nam

e...

......

.....

......

......

......

......

......

......

......

......

......

......

......

......

......

.....

1In

dex

2U

AA

RG

Aut

opilo

t3 4 5 6 7 8 9 18

......

......

......

......

......

......

......

......

......

......

..17

......

......

......

......

......

......

......

......

......

......

..16

......

......

......

......

......

......

......

......

......

......

..1514 23

......

......

......

......

......

......

......

......

......

......

..22

......

......

......

......

......

......

......

......

......

......

..21

......

......

......

......

......

......

......

......

......

......

..20

......

......

......

......

......

......

......

......

......

......

..19

......

......

......

......

......

......

......

......

......

......

..

25...

......

......

......

......

......

......

......

......

......

.....

24...

......

......

......

......

......

......

......

......

......

.....

11

Not

es

10 13

Rev

isio

n H

isto

ry

12

Doc

umen

tatio

n

27...

......

......

......

......

......

......

......

......

......

.....

26...

......

......

......

......

......

......

......

......

......

.....

28...

......

......

......

......

......

......

......

......

......

.....

U_U

AA

RG

Aut

opilo

tU

AA

RG

Aut

opilo

t.Sch

Doc

Dat

a Lo

gger

Tele

met

ry, I

2C

Mai

n Po

wer

U_N

otes

Not

es.S

chD

oc

Uni

vers

ity o

f Alb

erta

Aer

ial R

obot

ics G

roup

MC

U

CA

N, J

TAG

, LD

O, U

SB

IMU

Switc

hes,

UI

Bac

kup

Pow

er

U_P

ower

Pow

er.S

chD

oc

GPS

Pow

er

134

Page 153: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

55

66

DD

CC

BB

AA

1400

x900

Shee

t Titl

e:

Proj

ectT

itle:

Size

:N

umbe

r:Re

v:D

ate:

Shee

t

o

fD

raw

n by

:

UA

RT_I

N

U_D

ata

Logg

erD

ata

Logg

er.S

chD

oc

TP3

TP4

TP1

GN

D

+3V

3+3

V3_

SEN

S

S1 S2 S3 S4 S5 S6 S7 S8

S1 S2 S3 S4 S5 S6 S7 S8

LED

s

Serv

os

BIN

D

BU

FF1_

ENB

UFF

2_EN

U_S

witc

hes,

UI

Switc

hes,

UI.S

chD

oc

PB0

PB2

PB5

PA11

PA12

PC0

PC1

PC3

PC4

PC14

NR

STB

OO

T0Se

rvos

JTA

G

CA

N

I2C

1

I2C

2

LED

s

SPI1

SPI2

UA

RT

USB

U_M

CU

MCU

.Sch

Doc

Serv

os

LED

s

BIN

D

MA

G_I

NT

MPU

_IN

T

GPS

_TX

GPS

_RX

JTA

G

CA

NCA

N_B

US

USB

U_C

AN

, JTA

G, L

DO

, USB

CAN

, JTA

G, L

DO

, USB

.Sch

Doc

JTA

G

CA

N

XB

EE_R

XX

BEE

_TX

LV_R

XLV

_TX

I2C

1

I2C

2

I2C1

_5V

I2C2

_5V

UA

RT_T

XU

ART

_RX

U_T

elem

etry

, I2C

Tele

met

ry, I

2C.S

chD

ocI2

C1

I2C

2

SPI2

UA

RT.

UA

RT2.

RX

UA

RT

UA

RT.

UA

RT2.

TX

UA

RT_T

XU

ART

_RX

LV_T

XLV

_RX

SPI1

I2C1

_5V

I2C2

_5V

CAN

_BU

S

AD

C_1

AD

C_2

AD

C_3

BO

OT0

NR

ST

GPI

O1

GPI

O2

GPI

O3

9.1K

R2

2.2K

R3

GN

D

VB

AT

UA

RT.

UA

RT2.

TX

+5V

UA

RT.

UA

RT1.

TXU

AR

T.U

ART

1.R

XU

AR

T.U

ART

2.TX

UA

RT.

UA

RT2.

RX

UA

RT3_

TX_E

UA

RT3_

RX_E

UA

RT.

UA

RT5.

TXU

AR

T.U

ART

5.R

X

LV_T

XLV

_RX

LV+5

V_A

LT

BO

OT0

NR

ST

GPI

O1

GPI

O2

GPI

O3

AD

C_1

AD

C_2

AD

C_3

UA

RT.

UA

RT5.

TX

SPI1

.MO

SISP

I1.M

ISO

SPI1

.SC

LKSP

I1.C

SSP

I1.D

RDY

CAN

_BU

S.CA

NH

CAN

_BU

S.C

AN

LI2

C1_5

V.S

DA

I2C1

_5V

.SC

L

I2C2

.SD

AI2

C2.S

CL

I2C2

_5V

.SD

AI2

C2_5

V.S

CL

GN

D

GN

D

SET1

_EN

SET2

_EN

12

34

56

78

910

1112

1314

1516

1718

1920

2122

2324

2526

2728

2930

3132

3334

3536

3738

3940

4142

4344

4546

4748

J5 Mai

n Po

rt

GN

D

UA

RT.

UA

RT6.

TXU

AR

T.U

ART

6.R

X

+3V

3+5

V+3

V3

UA

RT.

UA

RT6.

TX

UA

RT.

UA

RT6.

RX

UA

RT.

UA

RT2.

RX

UA

RT.

UA

RT2.

TX3

1

2

SB1

UA

RT_T

X

UA

RT_R

X

UAR

T (L

ow V

olta

ge) S

elec

tion

VB

AT

MH

1

Mou

ntin

g H

oles

3.1

75m

m (1

25m

il) d

rill f

or 4

-40/

M2.

5/M

3 sc

rew

s

FID

1FI

D2

FID

4FI

D5

FID

3FI

D6

Fidu

cial

s

Mec

hani

cal

TP2

+5V

USB

31

2

SB2

MH

2M

H3

MH

4

0R1

MPU

_IN

TM

AG

_IN

T

SPI2

U_I

MU

IMU

.Sch

Doc

GPS

_RX

GPS

_TX

U_G

PSG

PS.S

chD

oc

GN

D

SET1

_EN

SET2

_EN

+3V

3

12

34

56

78

910

J7 Switc

hes

BC

KP_

EN

VIN

_REF

MA

IN_E

NA

LT_E

NB

CK

P_EN

M_E

XT_

EN

12

34

56

78

J9 Pow

er

S1 S2 S3 S4 S5 S6 S7 S8 M_E

XT_

EN

I2C

2

I2C2

_5V

UA

RT

GPS

_TX

GPS

_RX

SPI1

LV_T

XLV

_RX

UA

RT3_

RX_E

UA

RT3_

TX_E

U_C

onne

ctor

sCo

nnec

tors

.Sch

Doc

I2C

2

M_E

XT_

EN

UA

RT1_

VC

C

UA

RT5_

VC

CU

ART

3_V

CC

UA

AR

G A

utop

ilot.S

chD

oc2.

020

-DEC

-201

52

16En

g. Jo

rge

Mar

in (m

arin

mar

@ua

lber

ta.c

a)

UA

AR

G A

utop

ilot.P

rjPc

bU

nive

rsity

of A

lber

ta A

eria

l Rob

otic

s Gro

up

SERV

O_P

WR

GN

D

+5V

+5V

_ALT

VIN_REFM

AIN

_EN

ALT

_EN

UA

RT3_

RX_E

UA

RT3_

TX_E

COFID1

COFI

D2COFID3

COFID4

COFI

D5CO

FID6

PIJ501

PIJ502

PIJ503

PIJ504

PIJ505

PIJ506

PIJ507

PIJ508

PIJ509

PIJ5010

PIJ5011

PIJ5012

PIJ5

013

PIJ5014

PIJ5015

PIJ5016

PIJ5017

PIJ5018

PIJ5019

PIJ5020

PIJ5021

PIJ5022

PIJ5023

PIJ5024

PIJ5025

PIJ5026

PIJ5

027

PIJ5028

PIJ5029

PIJ5030

PIJ5

031

PIJ5032

PIJ5

033

PIJ5034

PIJ5035

PIJ5036

PIJ5

037

PIJ5038

PIJ5039

PIJ5040

PIJ5041

PIJ5042

PIJ5043

PIJ5044

PIJ5045

PIJ5046

PIJ5047

PIJ5048

COJ5

PIJ701

PIJ702

PIJ703

PIJ704

PIJ705

PIJ706

PIJ707

PIJ708

PIJ709

PIJ7010

COJ7

PIJ901

PIJ902

PIJ903

PIJ904

PIJ905

PIJ906

PIJ907

PIJ908

COJ9

PIMH101COMH1

PIMH201COMH2

PIMH301COMH

3

PIMH401COMH4

PIR101PIR102 COR1

PIR201PIR202 COR2

PIR301PIR302 COR3

PISB101

PISB102PISB103

COSB1

PISB201

PISB202PISB203

COSB2

PITP101 COTP

1

PITP201 COTP

2

PITP301 COTP3

PITP401 COTP4

PIJ503

PIJ504

PIJ707

PIJ709 PITP301

PITP401

PIJ501

PIJ502

PIJ908

PITP201

PIJ5047

PIJ906

PIJ5019

NLAD

C01

PIJ5021

NLAD

C02

PIJ5023

NLAD

C03

PIJ704

PIJ905

NLALT0EN

PIJ706

PIJ903

NLBC

KP0E

N

NLBI

ND

PIJ509

NLBOOT0

NLCA

NNL

CAN

PIJ5035

NLCA

N0BU

S

NLCAN0BUS0CANH

PIJ5

037

NLCA

N0BU

S

NLCAN0BUS0CANL

PIJ505

PIJ506

PIJ5048

PIJ701

PIJ703

PIJ705

PIJ902

PIR301

PITP101

PIJ5011

NLGP

IO1

PIJ5

013

NLGP

IO2

PIJ5015

NLGP

IO3

NLGPS0RX

NLGPS0TX

NLI2

C1NL

I2C1

PIJ5041

NLI2

C105

V

NLI2

C105

V0SC

LPIJ5039

NLI2

C105

VNL

I2C1

05V0

SDA

PIJ5036

NLI2

C2

NLI2

C20S

CLPIJ5034

NLI2

C2

NLI2C20SDA

PIJ5040

NLI2

C205

VNL

I2C2

05V0

SCL

PIJ5038

NLI2

C205

V

NLI2

C205

V0SD

A

NLJT

AGNL

JTAG

NLJT

AGNL

JTAG

NLJT

AG

NLLE

DsNL

LEDs

NLLE

DsNL

LEDs

NLLE

Ds

PIJ5046

PIJ5042

NLLV0RX

PIJ5044

NLLV0TX

PIJ5043

NLM0EXT0EN

NLMA

G0IN

T

PIJ702

PIJ907

NLMA

IN0E

N

NLMP

U0IN

T

PIMH101PIMH201

PIMH301PIMH401

PIR101PIR201 PIR302

PIJ507

NLNR

STNLS

1NLS

2NLS

3NLS

4NLS

5NLS

6NLS

7NLS

8

PIJ904

NLSe

rvos

NLSe

rvos

NLSe

rvos

NLSe

rvos

NLSe

rvos

NLSe

rvos

NLSe

rvos

NLSe

rvos

PIJ708

NLSE

T10E

N

PIJ7010

NLSE

T20E

N

PIJ5

031

NLSP

I1

NLSP

I10C

S

PIJ5

033

NLSP

I1

NLSPI10DRDY

PIJ5

027

NLSP

I1

NLSPI10MISO

PIJ5025

NLSP

I1

NLSPI10MOSI

PIJ5029

NLSP

I1

NLSPI10SCLK

NLSP

I2NL

SPI2

NLSP

I2NL

SPI2

NLSP

I2

PIJ508

PIJ5024

NLUART30RX0E

PIJ5022

NLUART30TX0E

PIJ5010

PIJ5012

PIJ5016

NLUA

RT

NLUA

RT0U

ART1

0RX

PIJ5014

NLUA

RT

NLUART0UART10TX

PIJ5020

PISB203

NLUA

RT

NLUA

RT0U

ART2

0RX

PIJ5018

PISB103

NLUA

RT

NLUART0UART20TX

NLUA

RTNL

UART

PIJ5028

NLUA

RT

NLUA

RT0U

ART5

0RX

PIJ5017

PIJ5026

NLUA

RT

NLUART0UART50TX

PIJ5032

PISB201

NLUA

RT

NLUART0UART60RX

PIJ5030

PISB101

NLUA

RT

NLUART0UART60TX

PISB202

NLUA

RT0R

X

PISB102

NLUA

RT0T

X

NLUS

BNL

USB

NLUS

B

PIJ5045

PIR102

NLVBAT

PIJ901

PIR202NLVIN0REF

135

Page 154: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

55

66

DD

CC

BB

AA

1400

x900

Shee

t Titl

e:

Proj

ectT

itle:

Size

:N

umbe

r:Re

v:D

ate:

Shee

t

o

fD

raw

n by

:

S_G

ND

SERV

O_P

WR

S1

S_G

ND

SERV

O_P

WR

S2

S_G

ND

SERV

O_P

WR

S3

S_G

ND

SERV

O_P

WR

S4

S_G

ND

SERV

O_P

WR

S5

S_G

ND

SERV

O_P

WR

S6

S_G

ND

SERV

O_P

WR

S7

S_G

ND

SERV

O_P

WR

S8

Serv

os/M

otor

(s)

+3V

3+5

V

JP1

UA

RT3_

TX_E

UA

RT3_

RX_E

GPS

_TX

GPS

_RX

10K

DN

PR

4

GN

D

+3V

3+5

V

UA

RT1_

TX10

KR

7

UA

RT1_

RX

UA

RT5_

TXU

ART

5_R

X

31

2

SB4

UAR

T1 &

UAR

T5 (R

C)

UAR

T3 (G

PS)

GN

D

+3V

3

UA

RT1_

VC

C

GN

D

+3V

3+5

V

10K

R8

UA

RT5_

VC

C

I2C2

31

2

SB3

31

2

SB5

1 2 3

J3A 4 5 6

J3B 7 8 9

J3C 10 11 12

J3D 13 14 15

J3E 16 17 18

J3F 19 20 21

J3G 22 23 24

J3H

1 2 3 4

J10A

UA

RT1

UA

RT5

I2C

2

Serv

o 1

Serv

o 2

Serv

o 3

Serv

o 4

Serv

o 5

Serv

o 6

Serv

o 7

Serv

o 8

5 6 7 8

J10B

9 10 11 12

J10C

81

1KRN1A

72

1KRN1B

63

1KRN1C

54

1KRN1D

81

1KRN2A

72

1KRN2B

63

1KRN2C

54

1KRN2D

1 2 3 4

J13

PB-U

AR

T1

1 2 3 4

J14

PB-U

AR

T5

1 2 3 4

J6 PB-U

AR

T3

iPC

B R

ule

iPC

B R

ule

Mot

or(s

) Ext

erna

l Shu

tdow

n

C1

B1

E1MMDT5551-7-F

Q1A

GN

D C2

B2

E2MMDT5551-7-F

Q1B

GN

D

0R5

0R9

S8_O

UT

S7_O

UT

S8_O

UT

S7_O

UT

M_E

XT_

EN1KR

6

1KR10

S_G

ND

GN

D

JP3

JP2

JP4

Con

nect

ors.S

chD

oc2.

020

-DEC

-201

53

16En

g. Jo

rge

Mar

in (m

arin

mar

@ua

lber

ta.c

a)

UA

AR

G A

utop

ilot.P

rjPc

bU

nive

rsity

of A

lber

ta A

eria

l Rob

otic

s Gro

up

S1 S2 S3 S4 S5 S6 S7 S8

M_E

XT_

ENSD

ASC

L

I2C

2

I2C

2

SDA

SCL

I2C2

_5V

I2C2

_5V

I2C2

_SC

LI2

C2_S

DA

I2C2

_5V

_SC

LI2

C2_5

V_S

DA

UA

RT1_

TXU

ART

1_R

XU

ART

5_TX

UA

RT5_

RX

TX RX

UART2

UA

RT

TX RX

UART3TX RX

UART1 TX RX

UART5 TX RX

UART6

UA

RT1

UA

RT2

UA

RT3

UA

RT5

UA

RT6

UA

RT

UA

RT1_

TXU

ART

1_R

X

UA

RT5_

TXU

ART

5_R

X

UA

RT3_

TX

UA

RT3_

RX

UA

RT3_

TXU

ART

3_R

X

GPS

_TX

GPS

_RX

I2C2

_SC

LI2

C2_S

DA

GN

D

+3V

3 I2C2

_SC

LI2

C2_S

DA

1 2 3 4

J12

PB-I2

C2_5

VG

ND

+5V

I2C2

_5V

_SC

LI2

C2_5

V_S

DA

1 2 3 4 5 6 7

J19

PB-S

PI1

SCLK

MO

SIM

ISO CS

DRD

Y

SPI1

SPI1

SPI1

_MO

SISP

I1_C

SSP

I1_D

RDY

SPI1

_MIS

OSP

I1_S

CLK

+3V

3

GN

D

1 2 3 4

J20

PB-U

AR

T2

UA

RT2_

TXU

ART

2_R

X

GN

DG

ND

GN

D

+3V

3

UA

RT2_

TXU

ART

2_R

X

1 2 3 4 5

J21

PB-U

AR

T_LV

LV_T

XLV

_RX

M_E

XT_

EN

LV

GN

D GN

D

LV_T

XLV

_RX

UAR

T (E

xter

nal V

olta

ge L

evel

)

SERV

O_P

WR

SERV

O_P

WR

SPI1

_MO

SI

SPI1

_CS

SPI1

_DRD

Y

SPI1

_MIS

OSP

I1_S

CLK

UA

RT1_

VC

CU

ART

5_V

CC

UA

RT3_

VC

C

1 2 3 4

J11

PB-I2

C2

UAR

T2 (T

elem

etry

)

SPI1

UA

RT3_

TX_E

UA

RT3_

TX_E

UA

RT3_

RX_E

UA

RT3_

RX_E

1 2

J22

Ext.

Pow

er

Exte

rnal

Pow

erGN

D

+5V

_ALT

PIJ301

PIJ302

PIJ303

COJ3

A

PIJ304

PIJ305

PIJ306

COJ3

B

PIJ307

PIJ308

PIJ309

COJ3

C

PIJ3010

PIJ3011

PIJ3012CO

J3D

PIJ3013

PIJ3014

PIJ3015CO

J3E

PIJ3016

PIJ3017

PIJ3018CO

J3F

PIJ3019

PIJ3020

PIJ3021CO

J3G

PIJ3022

PIJ3023

PIJ3024CO

J3H

PIJ601

PIJ602

PIJ603

PIJ604COJ6

PIJ1001

PIJ1002

PIJ1003

PIJ1004

COJ1

0APIJ1005

PIJ1006

PIJ1007

PIJ1008

COJ1

0B

PIJ1009

PIJ10010

PIJ10011

PIJ10012

COJ1

0C

PIJ1101

PIJ1102

PIJ1103

PIJ1104

COJ1

1

PIJ1201

PIJ1202

PIJ1203

PIJ1204

COJ1

2

PIJ1301

PIJ1302

PIJ1303

PIJ1304

COJ1

3

PIJ1401

PIJ1402

PIJ1403

PIJ1404

COJ1

4

PIJ1901

PIJ1902

PIJ1903

PIJ1904

PIJ1905

PIJ1906

PIJ1907

COJ19

PIJ2001

PIJ2002

PIJ2003

PIJ2004

COJ20

PIJ2101

PIJ2102

PIJ2103

PIJ2104

PIJ2105

COJ21

PIJ2201

PIJ2202

COJ22

PIJP

101

PIJP

102

COJP1

PIJP

201

PIJP

202COJP2

PIJP

301

PIJP

302

COJP3

PIJP

401

PIJP

402 COJP4

PIQ10B1

PIQ10C1 PIQ10E1CO

Q1A

PIQ10B2

PIQ10C2 PIQ10E2

COQ1

B

PIR4

01PI

R402

COR4

PIR501

PIR5

02COR5

PIR601

PIR6

02COR6

PIR7

01PIR702

COR7

PIR8

01PI

R802

COR8

PIR901

PIR9

02COR9

PIR1

001

PIR1002COR

10

PIRN101

PIRN

108

CORN

1A

PIRN102

PIRN10

7

CORN

1B

PIRN103

PIRN

106

CORN

1C

PIRN104

PIRN10

5

CORN

1D

PIRN201

PIRN20

8

CORN

2A

PIRN202

PIRN20

7

CORN

2B

PIRN203

PIRN20

6

CORN

2C

PIRN204

PIRN20

5

CORN

2D

PISB301

PISB302PISB303

COSB3

PISB401

PISB402PISB403

COSB4

PISB501

PISB502PISB503

COSB5

PIJ1009

PIJ1101

PIJ1902

PIJ2001

PISB301

PISB401

PISB501

PIJ1201

PISB303

PISB403

PISB503

PIJ2201

PIJ303

PIJ306

PIJ309

PIJ3012

PIJ3015

PIJ3018

PIJ3021

PIJ3024

PIJ604

PIJ1004

PIJ1008

PIJ10012

PIJ1104

PIJ1204

PIJ1304

PIJ1404

PIJ1901

PIJ2004

PIJ2105

PIJ2202

PIQ10E1 PIQ10E2

PIJP

401

NLGPS0RX

POGPS0RX

PIJP

201

NLGPS0TX

POGPS0TX

PIJ1203

NLI2C205V0SCL

POI2

C205

V

PIJ1202

NLI2C205V0SDA

POI2

C205

V

PIJ10011

PIJ1103

NLI2C20SCL

POI2C2

PIJ10010

PIJ1102

NLI2

C20S

DAPOI2C2

PIJ2101

PIJ2103NL

LV0RX

POLV0RX

PIJ2102NLLV0TX

POLV0TX

PIJ2104

PIR6

02

PIR1002

NLM0EXT0EN

POM0EXT0EN

POUART

POUART

PIJ301

PIRN10

5

PIJ304

PIRN

106

PIJ307

PIRN10

7

PIJ3010

PIRN

108

PIJ3013

PIRN20

5

PIJ3016

PIRN20

6

PIQ10B1

PIR601

PIQ10B2

PIR1

001

PIQ10C1PIR501

PIQ10C2PIR901

PIRN104

NLS1

POS1

PIRN103

NLS2

POS2

PIRN102

NLS3

POS3

PIRN101

NLS4

POS4

PIRN204

NLS5

POS5

PIRN203

NLS6

POS6

PIRN202

NLS7

POS7

PIJ3019

PIR5

02

PIRN20

7NLS70OUT

PIRN201

NLS8

POS8

PIJ3022

PIR9

02

PIRN20

8NLS80OUT

PIJ302

PIJ305

PIJ308

PIJ3011

PIJ3014

PIJ3017

PIJ3020

PIJ3023

NLSERVO0PWR

PIJ1906

NLSP

I10C

SPOSPI1

PIJ1907

NLSP

I10D

RDY

POSPI1

PIJ1904

NLSPI10MISO

POSPI1

PIJ1903

NLSPI10MOSI

POSPI1

PIJ1905

NLSPI10SCLK

POSPI1

PIJ1003

PIJ1303

PIR7

01

NLUART10RX

POUART

PIJ1002

PIJ1302

NLUART10TX

POUART

PIJ1001

PIJ1301

PIR702

PISB402NLUART10VCC

PIJ2003

NLUART20RX

POUART

PIJ2002

NLUART20TX

POUART

PIJP

101

NLUART30RX

POUART

PIJ603

PIJP

102

PIJP

202

PIR4

01

NLUART30RX0E

POUART30RX0E

PIJP

301

NLUART30TX

POUART

PIJ602

PIJP

302

PIJP

402

NLUART30TX0E

POUART30TX0E

PIJ601

PIR4

02

PISB302

PIJ1007

PIJ1403

PIR8

01

NLUART50RX

POUART

PIJ1006

PIJ1402

NLUART50TX

POUART

PIJ1005

PIJ1401

PIR8

02

PISB502NLUART50VCC

POGPS0RX

POGPS0TX

POI2C2

POI2C20SCL

POI2C20SDA

POI2

C205

VPO

I2C2

05V0

SCL

POI2

C205

V0SD

A

POLV0RX

POLV0TX

POM0EXT0EN

POS1

POS2

POS3

POS4

POS5

POS6

POS7

POS8

POSPI1

POSP

I10C

SPO

SPI1

0DRD

YPO

SPI1

0MIS

OPO

SPI1

0MOS

IPO

SPI1

0SCL

K

POUART

POUART30RX0E

POUART30TX0E

POUART0UART1

POUART

0UART1

0RXPOU

ART0UA

RT10TX

POUART0UART2

POUART

0UART2

0RXPOU

ART0UA

RT20TX

POUART0UART3

POUART

0UART3

0RXPOU

ART0UA

RT30TX

POUART0UART5

POUART

0UART5

0RXPOU

ART0UA

RT50TX

POUART0UART6

POUART

0UART6

0RXPOU

ART0UA

RT60TX

136

Page 155: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

Shee

t Titl

e:

Proj

ectT

itle:

Size

:N

umbe

r:R

ev:

Dat

e:Sh

eet

of

Dra

wn

by:

AM

CU

.Sch

Doc

2.0

20-D

EC-2

015

416

Eng.

Jor

ge M

arin

(mar

inm

ar@

ualb

erta

.ca)

UA

AR

G A

utop

ilot.P

rjPc

bU

nive

rsity

of A

lber

ta A

eria

l Rob

otic

s Gro

up

PA0-

WK

UP(

PA0)

14

PA1

15

PA2

16

PA3

17

PA4

20

PA5

21

PA6

22

PA7

23

PB0

26

PB1

27

PB2-

BO

OT1

(PB

2)28

PB10

29

PB11

30

PB12

33

PB13

34

PB14

35

PB15

36

PA8

41

PA9

42

PA10

43

PA11

44

PA12

45

PA13

(JTM

S-SW

DIO

)46

PA14

(JTC

K-S

WC

LK)

49

PA15

(JTD

I)50

PB3(

JTD

O/T

RA

CES

WO

)55

PB4(

NJT

RST

)56

PB5

57

PB6

58

PB7

59

PB8

61

PB9

62

PC13

2

PC14

-OSC

32_I

N(P

C14

)3

PC15

-OSC

32_O

UT(

PC15

)4

PC0

8

PC1

9

[Autopilot board schematic (continued): STM32F405RGT6 microcontroller (U1A power section, U1B I/O section) with its external crystal (Y1), decoupling capacitors, a BLM15AX102SN1D ferrite bead (L1) filtering VDDA, and 10 kΩ pull resistors on NRST and BOOT0. Port assignments shown on this sheet: Servo_1 to Servo_8 on PC6-PC9, PA0, PA1, PB6 and PB7; JTAG on PA13-PA15, PB3 and NRST; CAN on PB8/PB9; I2C1 on PB6/PB7 and I2C2 on PB10/PB11; status LEDs on PA8, PB4, PC2, PC5 and PC15; SPI1 on PA4-PA7 with DRDY on PB1; SPI2 on PB12-PB15 with the barometer chip select on PC13; USB on PA11/PA12 with VBUS sensing shared with PA9; and UART1/2/3/5/6 on PA9/PA10, PA2/PA3, PC10/PC11, PC12/PD2 and PC6/PC7.]

[Autopilot board schematic, sheet 5 of 16, IMU.SchDoc (UAARG Autopilot.PrjPcb, rev 2.0, 20-DEC-2015, drawn by Eng. Jorge Marin, University of Alberta Aerial Robotics Group; the same title block applies to the remaining autopilot sheets): MPU-6000 accelerometer and gyroscope (U3) and MS5611-01BA03 barometer (U2) share the SPI2 bus (SCLK, MOSI, MISO) with independent chip selects CS_MPU and CS_BARO. The HMC5883L magnetometer (U4) sits on the MPU-6000 auxiliary I2C bus (MPU_SCL/MPU_SDA) with 2.2 kΩ pull-ups (R13, R14). Interrupt lines MPU_INT and MAG_INT are routed to the MCU, all three sensors are powered from the filtered +3V3_SENS rail, and test points TP5-TP9 are provided on the SPI2 signals.]

[Autopilot board schematic, sheet 6 of 16, GPS.SchDoc: u-blox MAX-7Q GPS receiver (U23) with a UART interface (GPS_TX/GPS_RX) and an 11 mF backup capacitor (C87) on V_BCKP; SF1186K-3 SAW filter (U24) and MAX2659ELT+ low-noise amplifier (U25) in the RF front end; an active antenna supply (D4, R61, R65); and an antenna selection circuit (0 Ω links and jumper JP10) feeding either the U.FL (J17) or the SMA (J18) antenna connector. Sheet note [1]: antenna traces require 50 Ω impedance all the way through, from the GPS module to the antenna connector.]

[Autopilot board schematic, sheet 7 of 16, CAN, JTAG, LDO, USB.SchDoc: MAX3051EKA+ CAN transceiver (U5) with 120 Ω termination (R16) driving the CANH/CANL bus; 10-pin JTAG header (JTAG1); LM3940IMPX-3.3 main 3.3 V LDO regulator (U6) with a green power LED (LED6); LP2992AIM5-3.3 sensor-supply LDO (U7) generating the +3V3_SENS rail; and a Micro-USB B connector (USB1) with a 1N5819HW Schottky diode (D3) on VBUS and CG0603MLC varistors (Z1, Z2) for ESD protection on the data lines.]

[Autopilot board schematic, sheet 8 of 16, Telemetry, I2C.SchDoc: XBP9B-DMST-002 XBee telemetry module (U12) with an RSSI indicator (yellow LED7) and two 10-pin 2 mm breakout headers (EX1, EX2); two PCA9306 bidirectional level shifters (U13 for I2C1, U14 for I2C2) translating the 3.3 V MCU I2C buses to 5 V through 2.2 kΩ pull-up networks; and a BSS138PS-based level shifter (Q2) for a low-voltage UART (LV_TX/LV_RX).]

[Autopilot board schematic, sheet 9 of 16, Switches, UI.SchDoc: two SN74LVC125A buffers (U15, U16) gate the eight servo/motor output signals S1-S8 (driven through 1 kΩ resistor networks RN8/RN9), with per-channel solder bridges SB6-SB13 and the BUFF1_EN/BUFF2_EN lines controlling the output enables; five status LEDs (red LED1, orange LED2, green LED3, yellow LED4, red LED5) driven through 470 Ω resistor networks; and a bind button (S1) with a 10 kΩ pull-up.]

[Autopilot board schematic, sheet 10 of 16, Data Logger.SchDoc: ATmega328P-AU data-logger MCU (U18) with a 16 MHz resonator (Y2), a micro SD card socket (U17) on the SPI bus (SPI_SCK, SPI_MISO, SPI_MOSI, SPI_CS, card-detect CD), an ICSP header (J15), an FTDI serial header (J16, with DTR reset coupling and a 10 kΩ reset pull-up R45), and two green activity LEDs (LED8, LED9). Sheet notes: "LED10 toggles on/off every time a new character is received"; "LED9 is connected to the SPI SCK line. It flashes very fast when writing to the SD card."]
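The two LED notes describe data-logger firmware behaviour rather than the hardware itself. As a minimal sketch of the first behaviour (toggle an LED every time a character arrives on the UART), here is an AVR-GCC snippet for the ATmega328P. The 16 MHz clock matches the resonator on this sheet, but the 57600 baud rate and the choice of PD5 for the LED are assumptions, not taken from the board documentation.

```c
/* Toggle an LED on every received UART character (ATmega328P).
 * Illustrative only: F_CPU matches the 16 MHz resonator on the sheet,
 * but BAUD and the LED pin (PD5) are assumptions. */
#define F_CPU 16000000UL
#define BAUD  57600UL

#include <avr/io.h>
#include <avr/interrupt.h>

ISR(USART_RX_vect)
{
    (void)UDR0;          /* read the byte to clear the RX flag            */
    PIND = (1 << PD5);   /* writing 1 to a PINx bit toggles the output pin */
}

int main(void)
{
    uint16_t ubrr = (F_CPU / (16UL * BAUD)) - 1;

    DDRD  |= (1 << PD5);                     /* LED pin as output */
    UBRR0H = (uint8_t)(ubrr >> 8);           /* set baud rate     */
    UBRR0L = (uint8_t)ubrr;
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);  /* 8N1 frame         */
    UCSR0B = (1 << RXEN0) | (1 << RXCIE0);   /* RX + RX interrupt */

    sei();
    for (;;) { /* received data would be buffered and written to the SD card here */ }
}
```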

[Autopilot board schematic, sheet 11 of 16, Power.SchDoc: top-level power sheet instantiating Main Power.SchDoc (inputs VIN/GND, enables DC1_EN/DC2_EN, outputs +5V_MAIN and +5V_EXT) and Backup Power.SchDoc (input VIN_BCKP, enables DC3_EN/DC4_EN/BACKUP_EN, outputs +5V_BCKP and +5V_SERVO_BCKP). The backup outputs are diode-ORed onto the main +5 V and +5 V servo rails through CMS04 Schottky diodes D1 and D2, power enters on the 4-pin POWER connector J1, and 4x2 headers J2, J4 and J8 break out the supply rails and enable lines.]

[Autopilot board schematic, sheet 12 of 16, Main Power.SchDoc: two LM43603PWPT step-down converters, U8 (local, net VOUT_5V_MAIN_LOCAL) and U10 (external, net VOUT_5V_MAIN_EXT), each with a 6.8 µH IHLP3232DZER6R8M11 inductor (L2, L3) and input/output filtering, plus footprints for plug-in D24V50Fx regulator modules (U9, U11). Jumpers JP5 and JP6 select between the on-board regulator and the external regulator for each output. Sheet notes: R25 and R30 (and likewise R32 and R37) form a voltage divider used for the cutoff voltage; if the voltage on the ENABLE pin is lower than 2.2 V the LM43603 will shut down, and R25 (R32) can be changed to a higher value to increase the cutoff voltage.]
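For reference, a quick estimate of the cutoff point this gives. Assuming the divider is connected in the usual way (VIN through R25 = 453 kΩ to the ENABLE pin, with R30 = 100 kΩ from ENABLE to ground; the topology is inferred from the values shown, not stated explicitly on the sheet), the converter shuts down when

\[
V_{\mathrm{IN,cutoff}} \;=\; V_{\mathrm{EN}}\,\frac{R_{25}+R_{30}}{R_{30}}
\;=\; 2.2\ \mathrm{V}\times\frac{453\ \mathrm{k\Omega}+100\ \mathrm{k\Omega}}{100\ \mathrm{k\Omega}}
\;\approx\; 12.2\ \mathrm{V}.
\]

The backup-power sheet uses the same 453 kΩ / 100 kΩ values (R48/R53 and R55/R60), so its cutoff voltage is the same.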

[Autopilot board schematic, sheet 13 of 16, Backup Power.SchDoc: mirror of the main-power sheet for the redundant supply. Two LM43603PWPT step-down converters, U19 (backup local, net VOUT_5V_BACKUP_LOCAL to +5V_BCKP) and U21 (backup servo, net VOUT_5V_BACKUP_SERVO to +5V_SERVO_BCKP), with 6.8 µH inductors (L4, L5), footprints for plug-in D24V50Fx modules (U20, U22), selection jumpers JP8/JP9, and enable inputs DC3_EN/DC4_EN. The same 453 kΩ / 100 kΩ enable dividers are used (R48/R53 and R55/R60), with the same 2.2 V shutdown-threshold notes as on the main-power sheet.]

[Autopilot board schematic, sheet 14 of 16, Notes.SchDoc: general notes relating to the design.

PCB Layout Requirements: layout notes referring to specific components or groups of components are indicated on the schematics by the note number, enclosed in braces, adjacent to the component (e.g. [1]). Note [1]: GPS antenna traces require 50 ohm impedance.

PCB Assembly Variations: list of assembly variations. Variants are designated with a letter starting from A, and the description shows the variant name in brackets and then any comments.
Variant A [w/o headers]: all SMT components soldered; through-hole headers are not included.
Variant B [w/ headers]: all SMT components soldered, including through-hole headers.
Variant C [Assembled]: all components soldered and external components plugged; fully assembled board.]

[Autopilot board schematic, sheet 15 of 16, Documentation.SchDoc.

SPI slaves and chip-select pin mapping (SPI2):
MPU-6000: MCU CS pin PB12
MS5611-01BA03: MCU CS pin PC13

UART ports mapping (default set):
UART1: TX PA9, RX PA10 (RC Receiver 1)
UART2: TX PA2, RX PA3 (XBee)
UART3: TX PC10, RX PC11 (GPS)
UART4: TX PA0, RX PA1 (Servo_5 / Servo_6 pins)
UART5: TX PC12 (GPIO PC12), RX PD2 (RC Receiver 2)
UART6: TX PC6, RX PC7 (Servo_1 / Servo_2 pins)]
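For firmware work it can be convenient to have these assignments collected in one place. The header below is only an illustrative restatement of the two tables above; the macro names and the string encoding of the pins are mine, not part of the board documentation, and only the assignments themselves come from the sheet.

```c
/* falcon_pinmap.h -- illustrative pin-assignment constants restating the
 * Documentation.SchDoc tables of the autopilot board.  Macro names and the
 * "Pxy" string encoding are assumptions; only the assignments themselves
 * come from the schematic. */
#ifndef FALCON_PINMAP_H
#define FALCON_PINMAP_H

/* SPI2 chip selects */
#define AP_SPI2_CS_MPU6000  "PB12"
#define AP_SPI2_CS_MS5611   "PC13"

/* UART default mapping: TX pin, RX pin, attached device */
#define AP_UART1_TX  "PA9"   /* RC receiver 1 */
#define AP_UART1_RX  "PA10"
#define AP_UART2_TX  "PA2"   /* XBee telemetry */
#define AP_UART2_RX  "PA3"
#define AP_UART3_TX  "PC10"  /* GPS */
#define AP_UART3_RX  "PC11"
#define AP_UART4_TX  "PA0"   /* on Servo_5 / Servo_6 pins */
#define AP_UART4_RX  "PA1"
#define AP_UART5_TX  "PC12"  /* PC12 also usable as GPIO; RX from RC receiver 2 */
#define AP_UART5_RX  "PD2"
#define AP_UART6_TX  "PC6"   /* on Servo_1 / Servo_2 pins */
#define AP_UART6_RX  "PC7"

#endif /* FALCON_PINMAP_H */
```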

[Autopilot board schematic, sheet 16 of 16, Revision History.SchDoc. Revision history: rev 1.0, 09-DEC-2015, Initial Release.]


4.5 Autopilot Board Bill-of-Materials (BOM)

This section contains the list of components used in the design of the board, including manufacturers and part numbers.


Designator Quantity Value Manufacturer ManufacturerPartNumber

C1,C2 2 10pF Yageo CC0402JRNPO9BN100

C3,C4,C5,C23,C36,C49,C67,C80 8 2.2uF Yageo CC0402MRX5R5BB225

C6,C24 2 4.7uF Yageo CC0603KRX5R6BB475
C7,C8,C9,C10,C12,C13,C15,C16,C19,C20,C22,C35,C48,C52,C53,C54,C55,C56,C66,C79,C83 21 100nF Yageo CC0402KRX7R6BB104

C11 1 1uF Yageo CC0402KRX5R5BB105

C14 1 2.2nF Yageo CC0402KRX7R9BB222

C17,C38,C51,C69,C82 5 4.7uF Yageo CC0402MRX5R6BB475

C18 1 220nF Yageo CC0402KRX5R5BB224

C21 1 33uF Kemet T491A336K010AT

C25,C37,C50,C68,C81 5 10nF Yageo CC0402KRX7R7BB103

C27,C40,C58,C71 4 0.47uF Yageo CC0402ZRY5V7BB474

C28,C41,C59,C72 4 10uF SamsungElectro-MechanicsAmerica,Inc. CL32B106KBJNNNE

C29,C42,C60,C73 4 4.7uF SamsungElectro-MechanicsAmerica,Inc. CL32B475KBUYNNE

C30,C43,C61,C74 4 100pF Yageo CC0402JRNPO9BN101

C31,C32,C44,C45,C62,C63,C75,C76 8 47uF MurataElectronicsNorthAmerica GRM32ER71A476KE15L

C33,C46,C64,C77 4 10uF Yageo CC0805KKX5R7BB106

C34,C47,C65,C78 4 100nF Yageo CC0603KPX7R7BB104

C85,C86 2 8.2pF Yageo CC0402BRNPO9BN8R2

C87 1 11mF SeikoInstruments CPH3225A

C88 1 33nF SamsungElectro-MechanicsAmerica,Inc. CL05B333JO5NNNC

C89 1 470pF TDKCorporation C1005C0G1H471J050BA

D1,D2 2 ToshibaSemiconductorandStorage CMS04(TE12L,Q,M)

D3 1 DiodesIncorporated 1N5819HW-7-F

D4 1 MicroCommercialCo 1N4148WX-TP

EX1,EX2 2 SullinsConnectorSolutions NPPN101BFCN-RC

J6,J11,J12,J13,J14,J20 6 Molex,LLC 0533980471

J16 1 3M 961106-5500-AR-PR

J17 1 HiroseElectricCoLtd U.FL-R-SMT(10)

J18 1 CinchConnectivitySolutionsJohnson 142-0701-801

J19 1 Molex,LLC 0533980771

J21 1 Molex,LLC 0533980571

JTAG1 1 SamtecInc. FTSH-105-01-F-DV-K

L1 1 MurataElectronicsNorthAmerica BLM15AX102SN1D

L2,L3,L4,L5 4 6.8uH VishayDale IHLP3232DZER6R8M11

L6 1 6.8nH MurataElectronicsNorthAmerica LQW15AN6N8H00D

L7 1 27nH MurataElectronicsNorthAmerica LQW15AN27NJ00D

LED1,LED5 2 Lite-OnInc. LTST-C190KRKT

LED2 1 Lite-OnInc. LTST-C190AKT

LED3,LED6,LED8,LED9 4 Lite-OnInc. LTST-C190KGKT

LED4,LED7 2 Lite-OnInc. LTST-C190YKT

Q1 1 DiodesIncorporated MMDT5551-7-F

Q2 1 NXP Semiconductors BSS138PS,115
R1,R5,R9,R17,R26,R33,R40,R41,R46,R49,R56,R66,R67 13 0 Yageo RC0402JR-070RL

R2 1 9.1K Yageo RC0603FR-079K1L

R3 1 2.2K Yageo RC0603FR-072K2L

R6,R10 2 1K Yageo RC0402FR-071KL

R7,R8,R11,R12,R15,R23,R42,R43,R44,R45,R65 11 10K Yageo RC0402FR-0710KL

R13,R14 2 2.2K Yageo RC0402FR-072K2L

R16 1 120 Yageo RC0402FR-07120RL

R18 1 470 Yageo RC0402JR-07470RL


R22,R27,R34,R50,R57 5 100K Yageo RC0402FR-07100KL

R24,R31,R47,R54 4 1Meg RohmSemiconductor MCR01MRTF1004

R25,R32,R48,R55 4 453K Yageo RC0603FR-07453KL

R28,R35,R51,R58 4 249K RohmSemiconductor MCR01MRTF2493

R30,R37,R53,R60 4 100K Yageo RC0603FR-07100KL

R38,R39 2 200K Yageo RC0402FR-07200KL

R61 1 100 Yageo RC0402FR-07100RL

R68 1 10 Yageo RC0402FR-0710RL

RN1,RN2,RN8,RN9 4 1K Yageo YC164-JR-071KL

RN3 1 10K Yageo YC164-JR-0710KL

RN4,RN5 2 2.2K Yageo YC164-JR-072K2L

RN6,RN7 2 470 Yageo YC164-JR-07470RL

S1 1 C&KComponents KMR221GLFS

U1 1 STMicroelectronics STM32F405RGT6

U2 1 MeasurementSpecialties,Inc. MS5611-01BA03

U3 1 InvenSense MPU-6000

U4 1 HoneywellMicroelectronics&PrecisionSensors HMC5883L-TR

U5 1 MaximIntegrated MAX3051EKA+T

U6 1 TexasInstruments LM3940IMPX-3.3/NOPB

U7 1 TexasInstruments LP2992AIM5-3.3/NOPB

U8,U10,U19,U21 4 TexasInstruments LM43603PWPT

U13,U14 2 TexasInstruments PCA9306DCTR

U15,U16 2 TexasInstruments SN74LVC125APWR

U17 1 GLOBALCONNECTORTECHNOLOGY MEM2051-00-195-00-A

U18 1 Atmel ATMEGA328P-AU

U23 1 u-blox MAX-7Q

U24 1 MurataElectronicsNorthAmerica SF1186K-3

U25 1 MaximIntegrated MAX2659ELT+T

USB1 1 Molex,LLC 0473460001

Y1 1 AbraconLLC ABM8G-12.000MHZ-4Y-T3

Y2 1 MurataElectronicsNorthAmerica CSTCE16M0V53-R0

Z1,Z2 2 BournsInc. CG0603MLC-05LE


5 Revision History

Table 5.1: Falcon I Autopilot Board User Guide Revision History.

Rev.  Date         Changes
1.0   04-Aug-2016  First issue.
1.1   29-Nov-2016  Changed board name, updated introduction and updated jumper table to indicate default jumpers. Added note on solder jumpers section.
1.2   04-Jan-2017  Fixed typographic errors and updated IMU and GPS paragraphs.
1.3   25-Jan-2017  Updated cover page picture and added board pictures section.
1.4   15-Mar-2018  Added text link to STM32 MCU datasheet. Updated block diagram.
1.5   29-Mar-2018  Updated authors. Added BOM.


Appendix B

Camera Bridge Board Schematics
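The camera bridge board summarized below includes a trigger controller that receives TXD, RXD and DTR from an on-board FTDI USB-to-serial interface, so the camera triggers can be commanded over USB, presumably by the single-board computer that also reads the cameras. The exact trigger protocol lives in the trigger-controller firmware and is not reproduced here; the C sketch below only illustrates how such a link could be exercised from a Linux host, and the device path /dev/ttyUSB0, the 115200 baud rate and the 'T' command byte are all assumptions.

```c
/* Minimal example of driving the camera-bridge FTDI serial link from a
 * Linux host.  /dev/ttyUSB0, 115200 baud and the 'T' trigger byte are
 * assumptions; the real protocol is defined by the trigger-controller
 * firmware. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>
#include <sys/ioctl.h>

int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 /* raw 8N1, no echo             */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);

    int dtr = TIOCM_DTR;             /* assert DTR (reset/attention) */
    ioctl(fd, TIOCMBIS, &dtr);

    unsigned char cmd = 'T';         /* hypothetical trigger command */
    if (write(fd, &cmd, 1) != 1) perror("write");

    close(fd);
    return 0;
}
```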

[Camera USB HUB schematics, project Camera USB HUB.PrjPcb, rev 1.0, 14-DEC-2017, drawn by Eng. Jorge Marin (jlmarinmarcano@gmail.com). The project contains 12 sheets; the sheets reproduced in this extract are summarized below.

Sheet 1 of 12, Index.SchDoc: project index listing the documentation sheets (Index, Notes, Revision History, Documentation) and the design blocks of the board: USB HUB, Camera Array, USB Switch, Camera Connection, Power, Trigger Controller and FTDI.

Sheet 2 of 12, Camera USB HUB.SchDoc: top-level sheet. Four upstream Mini-USB B connectors (P1-P4), each with a resettable fuse (MF-NSMF200-2, F1-F4) and a CMS04 Schottky diode (D1-D4) on its VBUS line, feed a tree of four USB HUB blocks and three USB Switch blocks (switches A, B and C) that fans out to the twelve camera ports USBC1-USBC12 of the Camera Array block. A Trigger Controller block, driven through the FTDI block (TX, RX, DTR, CTS), generates the six shared trigger signals MSTTRG1-MSTTRG6 that are distributed to the twelve camera trigger inputs TRIGGER1-TRIGGER12, and a Power block provides the +5 V supply.

Sheet 3 of 12, USB HUB.SchDoc: FE1.1s four-port USB 2.0 hub controller (U1) with a 12 MHz crystal (Y1), 2.7 kΩ 1 % REXT resistor (R6), an optional 24LC02B configuration EEPROM (U2), power and port-activity LEDs (LED1-LED5), and the upstream (USBU) and downstream (USB1-USB4) differential pairs.

Sheet 4 of 12, USB Switch.SchDoc (only the beginning of this sheet appears in the extract): a switch block that routes one upstream USB port (USB) to one of two downstream ports (USB1, USB2), with a 2-pin selection header (P5), a 10 kΩ resistor (R10) and 100 nF decoupling (C10) on the +3V3 rail.]

D

1D+

11

D-2

2D+

32

D-4

GN

D5

OE

6

D-7

D+

8

S9

VC

C10

EP

11

U3

TS

3USB2

21E

DR

CR

US

B_P

US

B_

NUS

B1_P

US

B1_

N

US

B2_P

US

B2_

N

PIC1001 PIC1002 C

OC

10

PI

P5

01

PI

P5

02

CO

P5

PIR1001 PIR1002 CO

R1

0

PI

U3

01

PI

U3

02

PI

U3

03

PI

U3

04

PI

U3

05

PI

U3

06

PI

U3

07

PI

U3

08

PI

U3

09

PIU3010

PIU3011

CO

U3

PIC1001 PIR1001

PIU3010

PIC1002

PI

P5

02

PI

U3

05

PI

U3

06

PIU3011

PI

P5

01

PIR1002

PI

U3

09

PI

U3

02

NLUS

B10N

P

OU

SB

1

PI

U3

01

NLUS

B10P

P

OU

SB

1

PI

U3

04

NL

USB2

0N

PO

US

B2

P

IU

30

3

NLUS

B20P

P

OU

SB

2

PI

U3

07

NL

USB0

N P

OU

SB

P

IU

30

8

NLUS

B0P

PO

US

B

PO

US

B

PO

US

B1

PO

USB1

0D0

PO

US

B2

PO

USB2

0D0

POUS

B0D0

1 5 8

Page 177: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

Ca

mera

Arra

y.S

ch

Doc

1.0

14-

DEC

-201

75

12Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

Repe

at(

USB

_N)

Repe

at(

USB

_P)

Repe

at(

TRI

GG

ER)

Repe

at(

Came

ra,1

,12

)Ca

mera

Con

necti

on.

Sch

Doc

USP

_N[

1..12

]US

B_

N

USP

_P[1..1

2]

USB_P

US

B_

N1

US

B_

N2

US

B_

N3

US

B_

N4

US

B_

N5

US

B_

N6

US

B_

N7

US

B_

N8

US

B_

N9

US

B_

N10

US

B_

N11

US

B_

N12

US

B_P1

US

B_P2

US

B_P3

US

B_P4

US

B_P5

US

B_P6

US

B_P7

US

B_P8

US

B_P9

US

B_P

10

US

B_P

11

US

B_P

12

D+ D-

US

BUS

B_P1

US

B_

N1

US

B1

D+ D-

US

BUS

B_P2

US

B_

N2

US

B2

D+ D-

US

BUS

B_P3

US

B_

N3

US

B3

D+ D-

US

BUS

B_P4

US

B_

N4

US

B4

D+ D-

US

BUS

B_P5

US

B_

N5

US

B5

D+ D-

US

BUS

B_P6

US

B_

N6

US

B6

D+ D-

US

BUS

B_P7

US

B_

N7

US

B7

D+ D-

US

BUS

B_P8

US

B_

N8

US

B8

D+ D-

US

BUS

B_P9

US

B_

N9

US

B9

D+ D-

US

BUS

B_P

10US

B_

N10

US

B10

D+ D-

US

BUS

B_P

11US

B_

N11

US

B11

D+ D-

US

BUS

B_P

12US

B_

N12

US

B12

TRI

GG

ER

TRIGGER[1..12]

TRI

GG

ER1

TRI

GG

ER2

TRI

GG

ER3

TRI

GG

ER4

TRI

GG

ER5

TRI

GG

ER6

TRI

GG

ER7

TRI

GG

ER8

TRI

GG

ER9

TRI

GG

ER1

0TRI

GG

ER1

1TRI

GG

ER1

2

TRI

GG

ER1

TRI

GG

ER2

TRI

GG

ER3

TRI

GG

ER4

TRI

GG

ER5

TRI

GG

ER6

TRI

GG

ER7

TRI

GG

ER8

TRI

GG

ER9

TRI

GG

ER1

0TRI

GG

ER1

1TRI

GG

ER1

2

NLUS

B0N

NLUS

B0N1

PO

US

B1

NLUS

B0N

NLUS

B0N2

PO

US

B2

NLUS

B0N

NLUS

B0N3

PO

US

B3

NLUS

B0N

NLUS

B0N4

PO

US

B4

NLUS

B0N

NLUS

B0N5

PO

US

B5

NLUS

B0N

NLUS

B0N6

POUSB6

NLUS

B0N

NLUS

B0N7

PO

US

B7

NLUS

B0N

NLUS

B0N8

PO

US

B8

NLUS

B0N

NLUS

B0N9

PO

US

B9

NLUS

B0N

NLUS

B0N1

0

POUSB10

NLUS

B0N

NLUS

B0N1

1 POUSB11

NLUS

B0N

NLUS

B0N1

2

POUSB12

NLUS

B0P

NLUS

B0P1

PO

US

B1

NLUS

B0P

NLUS

B0P2

P

OU

SB

2

NLUS

B0P

NLUS

B0P3

PO

US

B3

NLUS

B0P

NLUS

B0P4

PO

US

B4

NLUS

B0P

NLUS

B0P5

PO

US

B5

NLUS

B0P

NLUS

B0P6

POUSB6

NLUS

B0P

NLUS

B0P7

PO

US

B7

NLUS

B0P

NLUS

B0P8

PO

US

B8

NLUS

B0P

NLUS

B0P9

PO

US

B9

NLUS

B0P

NLUS

B0P1

0

POUSB10

NLUS

B0P

NLUS

B0P1

1

POUSB11

NLUS

B0P

NLUS

B0P1

2

POUSB12

NLTRIGGER0100120 NLTR

IGGE

R1

NLTR

IGGE

R

PO

TR

IG

GE

R1

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R2

PO

TR

IG

GE

R2

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R3

PO

TR

IG

GE

R3

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R4

PO

TR

IG

GE

R4

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R5

PO

TR

IG

GE

R5

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R6

PO

TR

IG

GE

R6

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R7

PO

TR

IG

GE

R7

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R8

PO

TR

IG

GE

R8

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R9

PO

TR

IG

GE

R9

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R10

PO

TR

IG

GE

R1

0

NLTRIGGER0100120 NLTR

IGGE

R

NLTR

IGGE

R11

POTRIGGER11

NLTRIGGER0100120 NLTR

IGGE

R

NLTRIGGER12

POTRIGGER12

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0N0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

NLUS

P0P0

1001

20

PO

TR

IG

GE

R1

P

OT

RI

GG

ER

2

PO

TR

IG

GE

R3

P

OT

RI

GG

ER

4

PO

TR

IG

GE

R5

P

OT

RI

GG

ER

6

PO

TR

IG

GE

R7

P

OT

RI

GG

ER

8

PO

TR

IG

GE

R9

P

OT

RI

GG

ER

10

POTRIGGER11

POTRIGGER12

PO

US

B1

PO

USB1

0D0

PO

US

B2

PO

USB2

0D0

PO

US

B3

PO

USB3

0D0

PO

US

B4

PO

USB4

0D0

PO

US

B5

PO

USB5

0D0

POUSB6

POUS

B60D

0

PO

US

B7

PO

USB7

0D0

PO

US

B8

PO

USB8

0D0

PO

US

B9

PO

USB9

0D0

POUSB10

POUS

B100

D0

POUSB11

POUS

B110

D0

POUSB12

POUS

B120

D0

1 5 9

Page 178: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

Ca

mera

Conn

ecti

on.

Sch

Doc 1.0

14-

DEC

-201

76

12Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

125

3

64U4

US

BL

C6-

2S

C6

VB

US

1

D-2

D+

3

ID

4

GN

D5

SH

SH

P11

US

B Mi

ni-

B

+5

V

GN

D

100uF

C11

GN

D

+5

VG

ND

US

B_

N

US

B_P

US

B Conne

ctor

Tri

gger

Co

nne

ctor

TRI

GG

ER

1 2 3 4

P6

Hea

der

4

GN

D

US

BIN

_P

US

BIN

_N

US

B_P

US

B_

NPIC1101 PIC1102 C

OC

11

PI

P6

01

PI

P6

02

PI

P6

03

PI

P6

04

CO

P6

PI

P1

10

1

PI

P1

10

2

PI

P1

10

3

PI

P1

10

4

PI

P1

10

5

PI

P1

10

SH

CO

P1

1

PI

U4

01

PI

U4

02

PI

U4

03

P

IU

40

4

PI

U4

05

PI

U4

06

CO

U4

PIC1101

PI

P1

10

1

PI

U4

05

PIC1102

PI

P6

01

PI

P1

10

4

PI

P1

10

5

PI

P1

10

SH

PI

U4

02

PI

P6

03

PI

P6

04

PI

P6

02

P

OT

RI

GG

ER

PI

P1

10

2

PI

U4

04

NL

USB0

N

PI

P1

10

3

PI

U4

06

NLUS

B0P

PI

U4

03

NL

USBI

N0N

PO

US

B0

N

PI

U4

01

NL

USBI

N0P

PO

US

B0

P

PO

TR

IG

GE

R

PO

US

B0

N

PO

US

B0

P

1 6 0

Page 179: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

Trigge

r Con

trolle

r.Sc

hDo

c 1.0

14-

DEC

-201

77

12Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

PC6

(R

ES

ET/

PCI

NT

14)

29

PD0

(R

XD/

PCI

NT1

6)30

PD1

(T

XD/

PCI

NT

17)

31

PD2

(IN

T0/PCI

NT

18)

32

PD4

(PC

IN

T20

/XC

K/T

0)2

VC

C6

GN

D5

PB6

(PC

IN

T6/

XT

AL1

/T

OSC1

)7

PB7

(PC

IN

T7/

XT

AL2

/T

OSC2

)8

PD5

(PC

IN

T21

/OC0

B/T

1)9

PD6

(PC

IN

T22

/OC0

A/A

IN0

)10

PD7

(PC

IN

T23

/AI

N1)

11

PB0

(PC

IN

T0/

CL

KO/

ICP1

)12

PB1

(PC

IN

T1/

OC1

A)13

PB2

(PC

IN

T2/

SS/

OC1

B)14

PB3

(PC

IN

T3/

OC2

A/M

OS

I)15

PB4

(PC

IN

T4/

MIS

O)16

PB5

(S

CK/

PCI

NT

5)17

AV

CC

18

AR

EF

20

GN

D21

PC0

(A

DC0/

PCI

NT

8)23

PC1

(A

DC1/

PCI

NT

9)24

PC2

(A

DC2/

PCI

NT

10)

25

PC3

(A

DC3/

PCI

NT

11)

26

PC4

(A

DC4/

SD

A/P

CIN

T12

)27

PC5

(A

DC5/

SC

L/P

CIN

T13

)28

GN

D3

VC

C4

AD

C619

AD

C722

PD3

(PC

IN

T19

/OC2

B/I

NT1

)1

U5

AT

mega

328P-

AU

GN

D

GN

D

SPI_SC

KSPI_

MIS

OSPI_

MOSI

SPI_

CS

RES

ET

TX

DR

XD

MC

U

DT

R

12

16

MHz

Y2

GN

D

12

34

56

P8

ICSP

RES

ET

SPI_SC

KSPI_

MIS

OSPI_

MOSI

GN

D

+5

V

+5

V

+5

V

+5

V

ICS

P

10

KR1

1

MS

TT

RG1

MS

TT

RG2

MS

TT

RG3

MS

TT

RG4

MS

TT

RG5

MS

TT

RG6

Ju

mpe

rs

12

34

56

78

910

1112

P7

Hea

der 6

X2

MS

TT

RG1

MS

TT

RG2

MS

TT

RG3

MS

TT

RG4

MS

TT

RG5

MS

TT

RG6

1K

R12 LE

D6

Gre

en

GN

D

100nF

C13

100nF

C15

100nF

C14

100nF

C12

MS

TT

RG1

MS

TT

RG2

MS

TT

RG3

MS

TT

RG4

MS

TT

RG5

MS

TT

RG6

TX

DR

XD

DT

RPI

C120

1 PI

C120

2

COC12

PIC1301 PIC1302 COC13

PIC1401 PIC1402 COC14

PIC1501 PIC1502 COC15

PILED601 PILED602 CO

LED6

PI

P7

01

P

IP

70

2

PI

P7

03

P

IP

70

4

PI

P7

05

P

IP

70

6

PI

P7

07

P

IP

70

8

PI

P7

09

PIP7010

PIP7011

PIP7012

COP7

PI

P8

01

P

IP

80

2

PI

P8

03

P

IP

80

4

PI

P8

05

P

IP

80

6

COP8

PIR1101 PIR1102 COR11

PIR1201 PIR1202 COR12

PI

U5

01

PI

U5

02

PI

U5

03

PI

U5

04

PI

U5

05

PI

U5

06

PI

U5

07

PI

U5

08

PI

U5

09

PIU5010

PIU5011

PIU5012

PIU5013

PIU5014

PIU5015

PIU5016

PIU5017

PIU5018

PIU5019

PIU5020

PIU5021

PIU5022

PIU5023

PIU5024

PIU5025

PIU5026

PI

U5

02

7

PIU5028

PI

U5

02

9

PI

U5

03

0

PIU5031

PIU5032

COU5

PIY2

01

PIY202 PIY203

COY2

PIC1301 PIC1401

PIC1501

PI

P8

02

PIR1101

PI

U5

04

PI

U5

06

PIU5018

PIU5020

PIC1

202

NLDT

R P

OD

TR

PIC1302 PIC1402

PIC1502

PILED602

PI

P8

06

PI

U5

03

PI

U5

05

PIU5021

PIY203

PI

P7

02

P

IP

70

1

PIU5032

POMSTTRG1

NLMS

TTRG

1

PI

P7

04

P

IP

70

3

PI

U5

01

PO

MS

TT

RG

2

NLMS

TTRG

2

PI

P7

06

P

IP

70

5

PI

U5

02

PO

MS

TT

RG

3

NLMS

TTRG

3

PI

P7

08

P

IP

70

7

PI

U5

09

PO

MS

TT

RG

4

NLMS

TTRG

4

PIP7010

PI

P7

09

PIU5010

PO

MS

TT

RG

5

NLMS

TTRG

5

PIP7012

PIP7011

PIU5011

PO

MS

TT

RG

6

NLMS

TTRG

6

PILED601 PIR1201

PI

U5

07

PIY2

01

PI

U5

08

PIY202

PIU5012

PIU5013

PIU5019

PIU5022

PIU5023

PIU5024

PIU5025

PIU5026

PI

U5

02

7

PIU5028

PIC1

201

PI

P8

05

PIR1102

PI

U5

02

9

NLRE

SET

PI

U5

03

0

NLRX

D P

OR

XD

PIU5014

NLSP

I0CS

PI

P8

01

PIU5016

NLSP

I0MI

SO

PI

P8

04

PIU5015

NLSP

I0MO

SI

PI

P8

03

PIR1202 PIU5017

NLSP

I0SC

K

PIU5031

NLTX

D P

OT

XD

PO

DT

R

POMSTTRG1

PO

MS

TT

RG

2

PO

MS

TT

RG

3

PO

MS

TT

RG

4

PO

MS

TT

RG

5

PO

MS

TT

RG

6

PO

RX

D

PO

TX

D

1 6 1

Page 180: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et:

Pro

ject:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

FT

DI.

Sch

Doc

1.0

14-

DEC

-201

78

12Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

VC

CIO

4

VC

C20

US

BDP

15US

BD

M16

NC

8

RES

ET

19

NC

24

OS

CI27

OS

CO

28

3V3

OU

T17

AGND25

GND7

GND18

GND21

TEST26

TX

D1

RX

D5

RTS

3

CTS

11

DT

R2

DS

R9

DC

D10

RI6

CB

US0

23

CB

US1

22

CB

US2

13

CB

US3

14

CB

US4

12

U8

FT23

2R

L SS

OP

GN

D

GN

D

GN

DG

ND

US

B5

V

+5

V

+5

V

+5

V+5

VDT

R

RX

TX

1.5

KR2

11.

5K

R20

CTS

FT

DI_3

V3

LE

D8

Red

LE

D9

Red

10uF

C30

100nF

C31

100nF

C32

VB

US

1

D-2

D+

3

ID

4

GN

D5

SH

SH

P12

Mini

USB

B_2

CMS

04(

TE1

2L,

Q,M)

D5

US

B5

V+5

V

+5

V

PIC3001 PIC3002 COC30

PIC3101 PIC3102 COC31

PIC3201 PIC3202 COC32

PI

D5

01

P

ID

50

2

COD5

PILED801 PILED802 CO

LED8

PILED901 PILED902 CO

LED9

PI

P1

20

1

PI

P1

20

2

PI

P1

20

3

PI

P1

20

4

PI

P1

20

5

PI

P1

20

SH

COP12

PIR2001 PIR2002 COR20

PIR2101 PIR2102 COR21

PI

U8

01

PI

U8

02

PI

U8

03

PI

U8

04

PI

U8

05

PI

U8

06

PIU807

PI

U8

08

PI

U8

09

PI

U8

01

0

PI

U8

01

1

PI

U8

01

2

PI

U8

01

3

PI

U8

01

4

PI

U8

01

5

PI

U8

01

6

PI

U8

01

7

PIU8018

PI

U8

01

9

PI

U8

02

0

PIU8021

PI

U8

02

2

PI

U8

02

3

PI

U8

02

4

PIU8025 PIU8026

PI

U8

02

7

PI

U8

02

8 COU8

PIC3002 PIC3101

PI

D5

02

PIR2002 PIR2102

PI

U8

04

PI

U8

02

0

PI

U8

01

1

PO

CT

S

PI

U8

02

P

OD

TR

PIC3201 P

IU

80

17

PIC3001 PIC3102

PIC3202

PI

P1

20

5

PI

P1

20

SH

PIU807 PIU8018

PIU8021 PIU8025

PIU8026

PILED801 PIR2001 PILED802 P

IU

80

22

PILED901 PIR2101 PILED902 P

IU

80

23

PI

P1

20

2

PI

U8

01

6

PI

P1

20

3

PI

U8

01

5

PI

P1

20

4

PI

U8

03

PI

U8

06

PI

U8

08

PI

U8

09

PI

U8

01

0

PI

U8

01

2

PI

U8

01

3

PI

U8

01

4

PI

U8

01

9

PI

U8

02

4

PI

U8

02

7

PI

U8

02

8

PI

U8

05

P

OR

X

PI

U8

01

P

OT

X

PI

D5

01

PI

P1

20

1

PO

CT

S

PO

DT

R

PO

RX

P

OT

X

1 6 2

Page 181: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

Po

wer.

Sch

Doc

1.0

14-

DEC

-201

79

12Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

SW

1

SW

2

CB

OO

T3

VC

C4

BIAS

5

SY

NC

6

RT

7

PG

OO

D8

FB

9

AG

ND

10

SS/

TR

K11

EN

AB

LE

12

VI

N13

VI

N14

PG

ND

15

PG

ND

16

PA

D

U6

LM

43603P

WP

T

GN

DG

ND

10uF

C17

453

KR1

4

100

KR1

8

GN

D

4.7uF

C18

GN

DG

ND

GN

D

VI

N

TP2

0.47uF

C16

10nF

C26

GN

D

IH

LP3

232

DZER

6R8

M11

L1 6.8u

H

4.7uF

C27

GN

D

1Me

g

R13

100pF

C20

249

KR1

7

GN

D

100

K

R16

47uF

C21

10uF

C23

100nF

C24

GN

DG

ND

GN

DG

ND

+5

V

2.2uF

C25

0R15

47uF

C22

2.2

V Th

resho

ld

DC1

_E

N

TP1

100nF

C19

GN

D

GN

DG

ND

GN

D

IN

3

1

OU

T2

GN

D

U7

NCP

1117

LPS

T33

T3G

+3

V3

+5

V+5

V

10uF

C28

10uF

C29

1K

R19

LE

D7

Gre

en

Buc

k Co

nver

ter

LD

O 3.

3V

Regul

ator

1 2

P10

Hea

der

2

1 23

J1

PWR

JA

CK

VI

N

GN

DG

ND

VIN

Po

wer

Jac

k and

Hea

ders

1 2

P9

Hea

der

2

GN

D

+5

V

PIC1

601

PIC1

602

COC16

PIC1701 PIC1702 COC17

PIC1801 PIC1802 COC18

PIC1901 PIC1902 COC19

PIC2001 PIC2002 COC20

PIC2101 PIC2102 COC21

PIC2201 PIC2202 COC22

PIC2301 PIC2302 COC23

PIC2401 PIC2402 COC24

PIC2501 PIC2502 COC25

PIC2601 PIC2602 CO

C26

PIC2701 PIC2702 CO

C27

PIC2801 PIC2802 COC28

PIC2901 PIC2902 COC29

PIJ1

01

PIJ1

02

PIJ1

03

COJ1

PIL101

PIL102

COL1

PILED701 PILED702 CO

LED7

PI

P9

01

PI

P9

02

COP9

PIP1001

PIP1002

COP10

PIR1301 PIR1302 COR13

PIR1401 PIR1402 COR14

PIR1

501

PIR1

502 COR15

PIR1

601

PIR1

602 COR16

PIR1701 PIR1702 COR17

PIR1801 PIR1802 COR18

PIR1901 PIR1902 COR19

PITP101 COTP1

PITP

201

COTP2

PI

U6

01

PI

U6

02

PI

U6

03

PI

U6

04

PI

U6

05

PI

U6

06

PI

U6

07

PI

U6

08

PI

U6

09

PI

U6

01

0

PI

U6

01

1

PI

U6

01

2

PI

U6

01

3

PI

U6

01

4

PI

U6

01

5

PI

U6

01

6

PI

U6

0E

P COU6

PIU701

PI

U7

02

P

IU

70

3

PI

U7

04

COU7

PIC2901

PI

U7

02

P

IU

70

4

PIC2001 PIC2101

PIC2201 PIC2301

PIC2401

PIC2801 PIL102

PI

P9

01

PIR1301

PIR1

501

PIR1

601

PIR1902 P

IU

70

3

PIR1401 PIR1802

PITP

201

PI

U6

01

2

NLDC

10EN

PIC1702

PIC1802 PIC1902

PIC2102 PIC2202

PIC2302 PIC2402

PIC2501

PIC2602 PIC2702

PIC2802 PIC2902

PIJ1

02

PIJ1

03

PILED702

PI

P9

02

PIP1002

PIR1702

PIR1801

PI

U6

06

PI

U6

01

0

PI

U6

01

5

PI

U6

01

6

PI

U6

0E

P

PIU701

PIC1

601

PI

U6

03

PI

C160

2 PIL

101

PI

U6

01

PI

U6

02

PIC2002

PIR1302 PIR1701

PITP101 P

IU

60

9

PIC2502

PI

U6

04

PIC2601 P

IU

60

11

PIC2701

PIR1

502

PI

U6

05

PILED701 PIR1901

PIR1

602

PI

U6

08

PI

U6

07

PIC1701 PIC1801

PIC1901

PIJ1

01

PIP1001

PIR1402

PI

U6

01

3

PI

U6

01

4

1 6 3

Page 182: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

MH1

MH2

MH3

MH4

Mou

ntin

g Hol

es 3.

175

mm

(125

mil) p

ad,

5.714

mm

(225

mil) d

rill

BO

AR

D M

OU

NTI

NG

HO

LES

Mounting

Hol

es

Fiduc

ials

FI

D1

FID2

FID

UCI

ALS

3X

TOP

FI

D3

Gene

ral

Note

s

Gene

ral no

tes

rela

ting to

the

de

sign.

Not

e No.

Desc

riptio

n

PC

B La

yout

Requi

reme

nts

Layo

ut no

tes

refe

rrin

g to

spe

cifi

c co

mpo

nent

s or gr

oup

of

[Nu

mbe

r]De

scriptio

n

compo

nent

s ar

e in

dica

ted

on th

e sc

hema

tics

by th

e no

te

PC

B A

sse

mbl

y Va

riations

List

of a

sse

mbl

y va

riatio

ns.

Vari

ants

are

des

igna

ted

with

a l

ette

r

Vari

ant

Desc

riptio

n

star

ting

fro

m A.

The

desc

riptio

n sho

ws t

he

Vari

ant

name

in

nu

mbe

r, e

nclo

sed in

br

aces

, ad

jace

nt t

o th

e co

mpo

nent.

Eg [1].

Mecha

nica

l

brac

kets

and th

en a

ny

com

ment

s.

A[Pr

otot

ype

A]

Stra

ight

mini

USB

con

nect

ors.

[1]

90oh

m i

mpe

danc

e for

USB

di

ffer

enti

al pa

irs.

Copy

righ

t se

ctio

n (if ap

plic

able

)

Not

es.Sch

Doc

1.0

14-

DEC

-201

710

12

Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

[Pr

otot

ype

B]

R/A

mini

USB

con

nect

ors.

B

COFI

D1

COFI

D2

COFI

D3

PIMH101

COMH

1

PIMH201

COMH

2

PIMH301

COMH

3

PIMH401

COMH

4

PIMH101 PIMH201

PIMH301 PIMH401

1 6 4

Page 183: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

Doc

ume

ntation.

Sch

Doc

1.0

14-

DEC

-201

711

12

Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

1 6 5

Page 184: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over

11

22

33

44

DD

CC

BB

AA

She

et

Titl

e:

Pro

ject

Titl

e:

Size

:N

umbe

r:Re

v:Da

te:

She

et of

Dra

wn by

:

Co

mpa

ny:

A

14-

DE

C-101

7

DA

TE

1.0

RE

VISI

ON

RE

VISI

ON

HIS

TO

RY

Firs

t re

leas

e.

DES

CR

IP

TIO

N

Revi

sion

Hist

ory.

Sch

Doc

1.0

14-

DEC

-201

712

12

Eng. J

orge

Ma

rin

(jl

mari

nma

rcan

o@g

mail.c

om)

Ca

mera

US

B H

UB.

Prj

Pcb

1 6 6

Page 185: Design and Development of a UAV System for Multispectral … · iii infrared bands. The cameras were controlled by the SBC on board of the UAV. The imaging system was tested over
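The Trigger Controller and FTDI sheets together give the payload computer a serial path into the camera trigger lines: the SBC talks to the FT232RL over USB, and the ATmega328P drives the MSTTRG outputs. The firmware itself is not reproduced in this appendix, so the following Arduino-style C++ sketch is only a minimal illustration of that path; the pin numbers, baud rate, pulse width, and single-byte command protocol are assumptions for illustration, not the design actually flashed on the board.

```cpp
// Hypothetical firmware sketch for the trigger-controller MCU (ATmega328P, 16 MHz)
// shown in Trigger Controller.SchDoc. Pin mapping and serial protocol are assumed.
#include <Arduino.h>

// Assumed mapping of the six MSTTRG nets to digital pins.
const uint8_t TRIGGER_PINS[6] = {2, 3, 4, 5, 6, 7};
const unsigned int PULSE_US = 1000;   // assumed trigger pulse width (1 ms)

void setup() {
  Serial.begin(115200);               // UART to the FT232RL bridge
  for (uint8_t i = 0; i < 6; i++) {
    pinMode(TRIGGER_PINS[i], OUTPUT);
    digitalWrite(TRIGGER_PINS[i], LOW);
  }
}

void loop() {
  // Raise all trigger lines (a few microseconds apart) when a 'T' command
  // arrives, hold them high for PULSE_US, then release them.
  if (Serial.available() && Serial.read() == 'T') {
    for (uint8_t i = 0; i < 6; i++) digitalWrite(TRIGGER_PINS[i], HIGH);
    delayMicroseconds(PULSE_US);
    for (uint8_t i = 0; i < 6; i++) digitalWrite(TRIGGER_PINS[i], LOW);
    Serial.write('K');                // acknowledge back to the SBC
  }
}
```

Under these assumptions the SBC only needs to write one byte to the FTDI serial port to fire every jumper-selected trigger line on the board.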


Appendix C

Camera Bridge Board Bill-of-Materials


Designator | Quantity | Value | Manufacturer | Manufacturer Part Number
C1_USBHUB1, C1_USBHUB2, C1_USBHUB3, C1_USBHUB4, C5_USBHUB1, C5_USBHUB2, C5_USBHUB3, C5_USBHUB4, C6_USBHUB1, C6_USBHUB2, C6_USBHUB3, C6_USBHUB4, C8_USBHUB1, C8_USBHUB2, C8_USBHUB3, C8_USBHUB4, C9_USBHUB1, C9_USBHUB2, C9_USBHUB3, C9_USBHUB4, C30 | 21 | 10uF | Yageo | CC0603MRX5R5BB106
C2_USBHUB1, C2_USBHUB2, C2_USBHUB3, C2_USBHUB4, C3_USBHUB1, C3_USBHUB2, C3_USBHUB3, C3_USBHUB4 | 8 | 20pF | Yageo | CC0603JRNPO9BN200
C4_USBHUB1, C4_USBHUB2, C4_USBHUB3, C4_USBHUB4 | 4 | 10nF | Yageo | CC0603KRX7R8BB103
C7_USBHUB1, C7_USBHUB2, C7_USBHUB3, C7_USBHUB4, C10_USBSwitchA, C10_USBSwitchB, C10_USBSwitchC, C12, C13, C14, C15, C24, C31, C32 | 14 | 100nF | Yageo | CC0603KRX7R9BB104
C11_Camera1, C11_Camera2, C11_Camera3, C11_Camera4, C11_Camera5, C11_Camera6, C11_Camera7, C11_Camera8, C11_Camera9, C11_Camera10, C11_Camera11, C11_Camera12 | 12 | 100uF | Murata Electronics | GRM31CR60J107ME39K
C16 | 1 | 0.47uF | Yageo | CC0402ZRY5V7BB474
C17 | 1 | 10uF | Samsung Electro-Mechanics America, Inc. | CL32B106KBJNNNE
C18 | 1 | 4.7uF | Samsung Electro-Mechanics America, Inc. | CL32B475KBUYNNE
C19 | 1 | 100nF | Yageo | CC0402KRX7R6BB104
C20 | 1 | 100pF | Yageo | CC0402JRNPO9BN101
C21, C22 | 2 | 47uF | Murata Electronics North America | GRM32ER71A476KE15L
C23, C28, C29 | 3 | 10uF | Yageo | CC0805ZKY5V6BB106
C25 | 1 | 2.2uF | Yageo | CC0402MRX5R5BB225
C26 | 1 | 10nF | Yageo | CC0402KRX7R7BB103
C27 | 1 | 4.7uF | Yageo | CC0402MRX5R6BB475
D1, D2, D3, D4, D5 | 5 | | Toshiba Semiconductor and Storage | CMS04(TE12L,Q,M)
F1, F2, F3, F4 | 4 | | Bourns Inc. | MF-NSMF200-2
J1 | 1 | | CUI Inc. | PJ-075CH-SMT-TR
L1 | 1 | 6.8uH | Vishay Dale | IHLP3232DZER6R8M11
LED1_USBHUB1, LED1_USBHUB2, LED1_USBHUB3, LED1_USBHUB4, LED2_USBHUB1, LED2_USBHUB2, LED2_USBHUB3, LED2_USBHUB4, LED3_USBHUB1, LED3_USBHUB2, LED3_USBHUB3, LED3_USBHUB4, LED4_USBHUB1, LED4_USBHUB2, LED4_USBHUB3, LED4_USBHUB4, LED6, LED7 | 18 | | Kingbright | APT1608SGC
LED5_USBHUB1, LED5_USBHUB2, LED5_USBHUB3, LED5_USBHUB4, LED8, LED9 | 6 | | Kingbright | APT1608SURCK
P1, P2, P3, P4, P12 | 5 | | TE Connectivity AMP | 1734510-1
P6_Camera1, P6_Camera2, P6_Camera3, P6_Camera4, P6_Camera5, P6_Camera6, P6_Camera7, P6_Camera8, P6_Camera9, P6_Camera10, P6_Camera11, P6_Camera12 | 12 | | Molex, LLC | 530480410
P11_Camera1, P11_Camera2, P11_Camera3, P11_Camera4, P11_Camera5, P11_Camera6, P11_Camera7, P11_Camera8, P11_Camera9, P11_Camera10, P11_Camera11, P11_Camera12 | 12 | | Hirose Electric | UX20-MB-5P
R1_USBHUB1, R1_USBHUB2, R1_USBHUB3, R1_USBHUB4, R3_USBHUB1, R3_USBHUB2, R3_USBHUB3, R3_USBHUB4, R10_USBSwitchA, R10_USBSwitchB, R10_USBSwitchC, R11 | 12 | 10K | Yageo | AC0603FR-0710KL
R2_USBHUB1, R2_USBHUB2, R2_USBHUB3, R2_USBHUB4, R4_USBHUB1, R4_USBHUB2, R4_USBHUB3, R4_USBHUB4, R5_USBHUB1, R5_USBHUB2, R5_USBHUB3, R5_USBHUB4, R18 | 13 | 100K | Yageo | RC0603FR-07100KL
R6_USBHUB1, R6_USBHUB2, R6_USBHUB3, R6_USBHUB4 | 4 | 2.7K | Yageo | RC0603FR-072K7L
R7_USBHUB1, R7_USBHUB2, R7_USBHUB3, R7_USBHUB4, R8_USBHUB1, R8_USBHUB2, R8_USBHUB3, R8_USBHUB4, R9_USBHUB1, R9_USBHUB2, R9_USBHUB3, R9_USBHUB4, R12, R19 | 14 | 1K | Yageo | RC0603FR-071KL
R13 | 1 | 1Meg | Rohm Semiconductor | MCR01MRTF1004
R14 | 1 | 453K | Yageo | RC0603FR-07453KL
R15 | 1 | 0 | Yageo | RC0402JR-070RL
R16 | 1 | 100K | Yageo | RC0402FR-07100KL
R17 | 1 | 249K | Rohm Semiconductor | MCR01MRTF2493
R20, R21 | 2 | 1.5K | Yageo | RC0603FR-071K5L
U1_USBHUB1, U1_USBHUB2, U1_USBHUB3, U1_USBHUB4 | 4 | | Terminus Tech | FE1.1s
U3_USBSwitchA, U3_USBSwitchB, U3_USBSwitchC | 3 | | Texas Instruments | TS3USB221EDRCR
U4_Camera1, U4_Camera2, U4_Camera3, U4_Camera4, U4_Camera5, U4_Camera6, U4_Camera7, U4_Camera8, U4_Camera9, U4_Camera10, U4_Camera11, U4_Camera12 | 12 | | STMicroelectronics | USBLC6-2SC6
U5 | 1 | | Atmel | ATMEGA328P-AU
U6 | 1 | | Texas Instruments | LM43603PWPT
U7 | 1 | | ON Semiconductor | NCP1117LPST33T3G
U8 | 1 | | FTDI, Future Technology Devices International Ltd | FT232RL-REEL
Y1_USBHUB1, Y1_USBHUB2, Y1_USBHUB3, Y1_USBHUB4 | 4 | | ECS International | ECS-120-20-33-TR
Y2 | 1 | | Murata Electronics North America | CSTCE16M0V53-R0


Appendix D

3D Printer Settings

The following table summarizes the settings used for 3D printing the PLA parts.

Table D.1: 3D printer settings.

Setting | Units | Value
Layer Height | mm | 0.18
First Layer Height | mm | 0.27
Perimeter Shells | | 2
Top Solid Layers | | 3
Bottom Solid Layers | | 3
Fill Density | % | 15-20
Fill Pattern | | Hexagonal (Honeycomb)
Combine Infill | | Every 2 Layers
Print Speed | mm/s | 60
Travel Speed | mm/s | 80
Extruder Temperature | °C | 200
Platform Temperature | °C | 50