

Universidad Politécnica de Madrid
Escuela Técnica Superior de Ingenieros Industriales

Doctorado en Automática y Robótica

Soft-Computing Based Visual Control for Unmanned Vehicles

A thesis submitted for the degree of Doctor of Philosophy in Robotics and Automation

Miguel Ángel Olivares Méndez
BS Computer Engineer

2013


Escuela Técnica Superior de Ingenieros Industriales
Doctorado en Automática y Robótica

Universidad Politécnica de Madrid

Soft-Computing Based Visual Control for Unmanned Vehicles

A thesis submitted for the degree of Doctor of Philosophy in Robotics and Automation

Author: Miguel Ángel Olivares Méndez

BS Computer Engineer

Director: Dr. Pascual Campoy Cervera

PhD, Full Professor, Industrial Engineer

2013


Title: Soft-Computing Based Visual Control for Unmanned Vehicles

Author: Miguel Ángel Olivares Méndez
BS Computer Engineer

Thesis committee appointed by the Magnificent and Most Excellent Rector of the Universidad Politécnica de Madrid on the day ......... of ................... of 2011

Committee
Chair: ......................................., ............

Secretary: ......................................., ............

Member: ......................................., ............

Member: ......................................., ............

Member: ......................................., ............

Substitute: ......................................., ............

Substitute: ......................................., ............

The reading and defense of the thesis took place on the day ........... of ................ of 2011.

Grade of the Thesis: .........................................

The Chair: The Members:

The Secretary:


To my brother and my cousin Patri, for their support,

To my parents, for having taught me to believe in myself,

To Eduard Punset, for having taught me to believe in science,

To Pascual Campoy, for believing so much in me,

To my Lu-z, for the daily impulse; without her energy it would have been impossible,

To Marco.

Miguel A. Olivares Méndez



Acknowledgements

Many people have made it possible for me to get to the point of writing these lines in this document.

First of all, I would like to thank the great majority of the members of the Computer Vision Group, among others: Sergio Domínguez, for bringing a bit more joy to the day-to-day; Iván Mondragón, for always being available to help at any moment; Carol Martínez, for her help and great willingness; Ignacio Mellado, for his support and collaboration during the difficult moments of the Siemens project; José Luis, for always being ready to help with a smile; David Galindo, for his collaboration and help; Paloma, for helping me so much in the final stage of the thesis paperwork; Ramón, for his help on the Omniworks project; and especially Changhong Fu, for his invaluable help and motivation during the last year of great effort. Thanks to all of them for making my doctorate so special and enjoyable.

I would also like to thank the people of the department, fellow sufferers of the day-to-day in the laboratory: Lisandro, Carlos, Iñaki, Roberto, Gonzalo, Gabriel, Alberto, João, and those who went back home, Héctor, Isela and Alberto Trasloceros.

I would also like to thank the DISAM staff, especially Rosa, Tere and Carlos, for their help and their infinite patience with me. And the professors who keep the department running and who made my stay here so satisfying: Ricardo Sanz, Manuel Ferre, Claudio Rossi, Roque Saltarén, Antonio Barrientos, Enrique Pinto, Jaime, Ángel, Rafael and Agustín.

Thanks also to the Ministerio de Educación y Ciencia for its help in funding the project DPI2007-66156 "Visión por computador para UAV: Navegación, seguimiento e inspección". To the Universidad Politécnica de Madrid for its help in financing the international and European doctoral stays, as well as to the European Community for funding the project "IRSES FP7-IRSES IPCUAS".


I would also like to thank the Australian Research Centre for Aerospace Automation (ARCAA, Australia) and the Laboratory of Intelligent Systems at EPFL, Lausanne, for hosting me during my doctoral stays, as well as Professor Felix Schill, and especially Luis Mejías, for his support, his company and the good times spent in Australia.

To Andrades and Clara, for so many good times and their inspiring wines. To Mr. Victrol, for his support and encouragement.

To the many friends I have met who have helped and encouraged me throughout this time: Mónica, Evelina, César, Valeria, Rafa, Filippo, Laura, little Lydia, and especially Raúl (Baloo), for so many enriching conversations.

And finally to all my family: to my second parents Puri and Manolo, for their constant encouragement; to my cousins Julia, Ana and Alba; to my grandfather Celestino; to Rafa, Nieves and Paola. I also want to express my gratitude in a very special way to Mirella, for her support, her unconditional encouragement and for having given me the greatest gift. And to those who have already become part of the family, Pascual, Ana, Pablo, Alicia and Miguelito: thank you for all the good moments, especially Pascual, for having believed in me all this time, for giving me a new "vision" of research and for helping me through all the difficult moments of the doctorate.

My heartfelt thanks to everyone.

Miguel A. Olivares Méndez


Resumen

The main objective of this Thesis is to extend the use of Soft Computing to the control of unmanned vehicles using vision. This work goes beyond the typical control systems used in highly controlled environments, demonstrating the strength and versatility of Fuzzy Logic to control aerial and ground vehicles in a wide range of applications. For this Thesis a large number of real tests have been carried out, in which the fuzzy controllers have managed a pan-and-tilt visual platform, a helicopter, a commercial car and up to two different types of quadrotors. The Cross-Entropy optimization method has been used to tune some parameters of the controllers, thus improving their behavior for collision avoidance with aerial vehicles.

All the fuzzy controllers presented in this Thesis have been implemented using the C++ library developed by the candidate for this purpose, called MOFS (Miguel Olivares' Fuzzy Software). Different visual algorithms have been used to acquire the visual information of the environment: CamShift, homography decomposition and augmented-reality marker detection, among others. This visual information has been used as input to the fuzzy controllers to command the vehicles in the different autonomous applications.

The steering wheel of a commercial vehicle has been controlled to perform autonomous driving tests in traffic conditions similar to those of a city. The system has successfully completed tests of more than 6 km without any human interaction by following a line painted on the ground. The limited field of view of the system (50 × 30 cm) has not been an impediment to reaching speeds of up to 48 km/h and being guided autonomously through curves of small radius.

Static and moving objects have been tracked from an unmanned helicopter by controlling a pan-and-tilt visual platform. This same helicopter has been fully controlled for autonomous landing, by controlling its lateral (roll) and horizontal (pitch) movements and its altitude. The tracking of flying objects from a safe distance has been solved by controlling the horizontal (pitch) movement and the orientation (heading) of a quadrotor. For obstacle avoidance tasks, a fuzzy controller has been implemented to manage the orientation (heading) of a quadrotor.

In the field of controller optimization, an extension of the use of the Cross-Entropy method has been contributed to the state of the art. This Thesis presents a novel implementation of this method for optimizing the gains, the position and size of the fuzzy sets of the membership functions, and the weight of the rules, in order to improve the behavior of the fuzzy controllers. These optimization processes have been carried out using ROS and Matlab Simulink, obtaining great improvements in collision avoidance with unmanned aerial vehicles.

This Thesis demonstrates that controllers based on fuzzy logic are highly capable of controlling systems without taking into account the model of the vehicle to be controlled, in highly disturbed environments, with a low-cost sensor such as a camera. The noise present in the image acquisition caused by illumination changes, and the high uncertainty of the visual detection, have been handled satisfactorily by this Soft Computing technique in different applications with both aerial and ground vehicles.


Abstract

The aim of this Thesis is to exploit the use of Soft Computing to control unmanned vehicles using vision. This work goes beyond the typical control systems used in highly controlled environments, demonstrating the power of Fuzzy Logic Controllers (FLCs) to command aerial and ground vehicles in a range of different tasks. A large number of real tests are presented in which the implemented fuzzy controllers manage a visual pan-and-tilt platform, a helicopter, a commercial car and two different types of quadcopters. The use of the Cross-Entropy method to optimize the behavior of these controllers is also shown.

All the visual servoing controllers presented in this Thesis were implemented using the self-developed C++ software tool called MOFS (Miguel Olivares' Fuzzy Software). Different visual algorithms were used to acquire information about the vehicles' surrounding environment: CamShift, homography decomposition and augmented-reality marker detection, among others. This visual information was used as input to the fuzzy controllers to command the vehicles in the different autonomous tasks.
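As an illustration of the kind of controller implemented with MOFS, below is a minimal, self-contained sketch of a two-input fuzzy controller (error and error derivative, triangular membership functions, product inference, weighted-average defuzzification). It is a hypothetical example written for this summary: the function names, the 3x3 rule table and the normalization ranges are placeholders, not the actual MOFS API.

```cpp
#include <algorithm>
#include <array>
#include <iostream>

// Triangular membership function defined by (left, peak, right).
double tri(double x, double l, double p, double r) {
    if (x <= l || x >= r) return 0.0;
    return (x < p) ? (x - l) / (p - l) : (r - x) / (r - p);
}

// Hypothetical fuzzy controller: the pixel error and its derivative are each
// partitioned into {Negative, Zero, Positive}; the output singletons are
// fractions of the maximum command. Rule table: output index = i + j mapped
// onto {-1, -0.5, 0, 0.5, 1} (a plausible rule base, not the thesis' one).
double fuzzyStep(double error, double dError, double maxErr, double maxCmd) {
    // Normalize inputs to [-1, 1] (requires C++17 for std::clamp).
    double e  = std::clamp(error  / maxErr, -1.0, 1.0);
    double de = std::clamp(dError / maxErr, -1.0, 1.0);

    // Membership degrees for the Negative / Zero / Positive sets.
    std::array<double, 3> me  = { tri(e,  -2, -1, 0), tri(e,  -1, 0, 1), tri(e,  0, 1, 2) };
    std::array<double, 3> mde = { tri(de, -2, -1, 0), tri(de, -1, 0, 1), tri(de, 0, 1, 2) };

    const double out[5] = { -1.0, -0.5, 0.0, 0.5, 1.0 }; // output singletons

    double num = 0.0, den = 0.0;
    for (int i = 0; i < 3; ++i) {
        for (int j = 0; j < 3; ++j) {
            double w = me[i] * mde[j];   // product inference of the rule firing
            num += w * out[i + j];       // weighted singleton (centroid-like)
            den += w;
        }
    }
    return (den > 0.0) ? maxCmd * num / den : 0.0;
}

int main() {
    // Example: 50-pixel error, small derivative, 160-pixel half-image range,
    // commands limited to +/- 30 degrees of steering-wheel velocity.
    std::cout << fuzzyStep(50.0, -5.0, 160.0, 30.0) << " deg\n";
    return 0;
}
```

The same structure scales to the three-input, PID-like controllers (error, derivative and integral of the error) that are optimized in Chapter 6.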

The steering wheel of a commercial car was controlled to implement a driverless vehicle for inner-city tests. Distances of more than 6 km were covered without a driver on a closed circuit using a vision-based line-following algorithm. The limited field of view of the system (50 × 30 cm) was not an impediment to reaching a top speed of 48 km/h and guiding the vehicle through small-radius curves.
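Since the camera observes a ground patch of known metric size, pixel errors convert directly to metric ones. Assuming, hypothetically, that the 50 cm dimension of the field of view spans an image width of W pixels, the lateral error in centimeters follows as:

```latex
e_{\mathrm{cm}} = e_{\mathrm{px}} \cdot \frac{50\ \mathrm{cm}}{W}
```

For instance, with an assumed W = 320 px, a 50-pixel step would correspond to roughly 7.8 cm.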

Static and moving objects such as cars were tracked from an unmanned helicopter by controlling an onboard pan-and-tilt visual platform. Full control of altitude, lateral and forward movements was implemented for an autonomous landing task with a helicopter. Pitch and heading controllers were implemented to command a quadrotor in an object-following task. The heading was also controlled for a See and Avoid task with this type of UAV.
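The pose estimates that feed the landing controllers come from homography decomposition. As a reminder of the standard relation (a generic statement of the model, not this thesis' exact derivation), two calibrated views of a plane with unit normal n at distance d from the first camera are related by the homography

```latex
\mathbf{H} = \mathbf{K}\left(\mathbf{R} + \frac{\mathbf{t}\,\mathbf{n}^{\top}}{d}\right)\mathbf{K}^{-1}
```

so that decomposing an estimated H recovers the relative rotation R and the translation t up to the plane-distance scale, which is the kind of 3D estimate the altitude, roll and pitch controllers consume.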

The Cross-Entropy optimization method is not widely used for control in the literature. This Thesis presents a way to optimize the gains, the position and size of the membership function sets, and the rules' weights to improve the behavior of a fuzzy controller. This optimization process was carried out using ROS and Matlab Simulink, obtaining better results in See and Avoid tests with UAVs.
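For reference, the skeleton of such a Cross-Entropy loop is sketched below: each iteration draws candidate gain vectors from a Gaussian, scores them, and refits the Gaussian to the elite samples. The sample count (30), elite count (5) and iteration budget (12) mirror the numbers quoted in the Chapter 6 figure captions; the cost function here is a toy placeholder standing in for the closed-loop ITAE/ITSE score returned by the ROS-Gazebo or Simulink simulation.

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>
#include <random>
#include <vector>

// Placeholder cost: in the thesis this would be the ITAE/ITSE score of a
// simulated See and Avoid trial run with the candidate controller gains.
double evaluateCost(const std::vector<double>& gains) {
    double c = 0.0;
    for (double g : gains) c += (g - 0.5) * (g - 0.5); // toy quadratic bowl
    return c;
}

int main() {
    const int dim = 3, nSamples = 30, nElite = 5, maxIter = 12;
    std::vector<double> mean(dim, 0.5), sigma(dim, 0.5);
    std::mt19937 rng(42);

    for (int it = 0; it < maxIter; ++it) {
        // Sample a population of candidate gain vectors from the Gaussian.
        std::vector<std::pair<double, std::vector<double>>> pop;
        for (int s = 0; s < nSamples; ++s) {
            std::vector<double> g(dim);
            for (int d = 0; d < dim; ++d)
                g[d] = std::normal_distribution<>(mean[d], sigma[d])(rng);
            pop.push_back({evaluateCost(g), g});
        }
        // Keep the elite (lowest-cost) samples and refit mean and sigma.
        std::sort(pop.begin(), pop.end(),
                  [](const auto& a, const auto& b) { return a.first < b.first; });
        for (int d = 0; d < dim; ++d) {
            double m = 0.0, v = 0.0;
            for (int e = 0; e < nElite; ++e) m += pop[e].second[d];
            m /= nElite;
            for (int e = 0; e < nElite; ++e)
                v += (pop[e].second[d] - m) * (pop[e].second[d] - m);
            mean[d]  = m;
            sigma[d] = std::sqrt(v / nElite);
        }
        std::cout << "iter " << it << " best cost " << pop[0].first << "\n";
    }
    return 0;
}
```

When the per-dimension sigma collapses (e.g., to the 0.0028-0.0159 range reported for the input gains), the mean is taken as the tuned parameter value for the real flights.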

This Thesis demonstrates that Fuzzy Logic Controllers are highly capable of commanding model-free systems in high-disturbance environments with a low-cost sensor. The noisy effects of illumination changes and the high uncertainty of the visual detection were handled gracefully by this Soft Computing technique in different tasks with different aerial and ground vehicles.


Nomenclature

The conventions used in this document are defined as follows (a brief illustration follows the list):

Matrices are represented by bold capital letters, such as H or R.

Vectors are represented by bold lower-case letters, with or without an upper bar, such as x or t.

A vector is a column vector, unless indicated otherwise.

Scalar variables are normal italic letters such as i or k.

Images are normally represented by capital italic letters such as I or T.
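As a brief made-up illustration of these conventions (not an equation from the body of the thesis), a planar point transfer would be written as

```latex
\bar{\mathbf{x}}' = \mathbf{H}\,\bar{\mathbf{x}}, \qquad \bar{\mathbf{x}} = (x,\; y,\; 1)^{\top}
```

with H a bold capital matrix, the point a bold column vector carrying an upper bar, and x, y italic scalars.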


Acronyms

The acronyms most commonly referenced in this work are described below; the error metrics among them (ITSE, ITAE, RMSE) are given explicit definitions after the list:

ATC: Air Traffic Control

ATM: Air Traffic Management

CAA: United Kingdom Civil Aviation Authority

CASA: Australian Civil Aviation Safety Authority

CPU: Central Processing Unit

DOF: Degrees of freedom

EASA: European Aviation Safety Agency

FAA: Federal Aviation Administration

FOV: Field Of View

GPS: Global Positioning System

GPU: Graphics Processing Unit

HOT: High Order Terms

IBVS: Image-Based Visual Servoing

ICIA: Inverse Compositional Image Alignment

IMU: Inertial Measurement Unit

ITSE: Integral of Time-weighted Squared Error

ITAE: Integral of Time-weighted Absolute Error

KF: Kalman Filter

Page 18: Escuela Tecnica Superior De Ingenieros Industriales ...oa.upm.es/15082/1/03_MIGUEL_ANGEL_OLIVARES_MENDEZ.pdfpor hacer tan especial y agradable mi docotrado. ... Antonio Barrientos,

XVI ACRONYMS

LKT: Lucas-Kanade Tracker

MAV: Micro-Aerial Vehicle

MTOW: Maximum Take-Off Weight

NAAs: National Aviation Authorities

NAS: National Airspace System

PBVS: Position-Based Visual Servoing

PID: Proportional-Integral-Derivative controller

RANSAC: RANdom SAmple Consensus

RC: Radio Controlled

RMSE: Root Mean Square Error

RPM: Revolutions per Minute

R&D: Research and Development

SIFT: Scale-Invariant Feature Transform

SLAM: Simultaneous Localization And Mapping

SURF: Speeded Up Robust Features

UAS: Unmanned Aircraft System

UAV: Unmanned Aerial Vehicle

VTOL: Vertical Take-Off and Landing
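Three of the entries above are the performance metrics used to score controllers throughout this work. Their standard continuous-time definitions, stated here for reference (the experiments necessarily use discrete-time approximations of these integrals), are:

```latex
\mathrm{ITSE} = \int_{0}^{T} t\, e^{2}(t)\, \mathrm{d}t, \qquad
\mathrm{ITAE} = \int_{0}^{T} t\, \lvert e(t) \rvert\, \mathrm{d}t, \qquad
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} e_{i}^{2}}
```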


Contents

Acknowledgements VII

Resumen IX

Abstract XI

Nomenclature XIII

Acronyms XV

List of Figures XXI

List of Tables XXXI

1. Introduction . . . 1
1.1. Motivation and Problem Statement . . . 1
1.2. Main Contributions . . . 3
1.3. Outline . . . 4

2. State of Art . . . 7
2.1. Introduction . . . 7
2.2. Unmanned Vehicles . . . 8
2.2.1. Perception . . . 8
2.2.2. Localization . . . 9
2.2.3. Planning and Navigation . . . 11
2.3. Visual Control . . . 11
2.4. Fuzzy Control . . . 13

3. Fuzzy Logic Control for Ground Vehicles . . . 17
3.1. Related Works . . . 18
3.2. Unmanned Ground Vehicle Description . . . 20
3.3. Visual Localization Algorithms . . . 22
3.3.1. Lateral Estimation Error . . . 22
3.3.2. Global Localization by Visual Marks . . . 23
3.4. Autonomous Driving with Local Localization . . . 25
3.4.1. Fuzzy Control for Lateral Estimation . . . 25
3.4.2. Experiments . . . 29
3.5. Autonomous Driving with Global Localization . . . 35
3.5.1. Fuzzy Control for Lateral Estimation and Global Localization . . . 36
3.5.2. Experiments . . . 36
3.6. Conclusions . . . 51

4. Fuzzy Logic Control for Unmanned Helicopters . . . 53
4.1. Related Works . . . 54
4.2. Unmanned Helicopter Description . . . 55
4.3. Eye in the sky: Autonomous Visual Servoing for Object Tracking . . . 57
4.3.1. Visual Tracking . . . 57
4.3.2. Fuzzy Control of a Pan and Tilt Visual Platform and UAV's Heading . . . 57
4.3.3. Experiments . . . 61
4.3.3.1. Laboratory Experiments . . . 61
4.3.3.2. Real Tests with the UAV . . . 65
4.4. Autonomous Landing . . . 71
4.4.1. 3D Estimation Based on Homographies . . . 71
4.4.2. Fuzzy Control of UAV's Pitch, Roll and Altitude . . . 75
4.4.3. Experiments . . . 80
4.4.3.1. 3D Positioning Experiments . . . 80
4.4.3.2. Autonomous Descend Experiments . . . 84
4.4.3.3. Fully Autonomous Landing . . . 85
4.5. Conclusions . . . 90

5. Fuzzy Logic Control for Unmanned Quadrotors . . . 91
5.1. Related Works . . . 92
5.2. Unmanned Quadrotors Description . . . 93
5.2.1. Basic Quadrotor Mechanics . . . 93
5.2.2. Pelican Ascending Technologies Quadcopter . . . 94
5.2.3. AR.Drone Parrot Quadrotor . . . 94
5.3. Color-Based Tracking . . . 95
5.4. Object Following . . . 96
5.4.1. Fuzzy Control of Quadrotor's Pitch and Heading . . . 98
5.4.2. Experiments . . . 101
5.5. Avoiding Obstacles . . . 106
5.5.1. Fuzzy Control of Quadrotor Heading . . . 106
5.5.2. Experiments . . . 109
5.6. Conclusions . . . 112

6. Optimization of Fuzzy Logic Controllers using the Cross-Entropy method . . . 113
6.1. Related Works . . . 114
6.2. Cross-Entropy optimization method . . . 115
6.3. Fuzzy Control Optimization General Approach . . . 117
6.4. Gains optimization of Fuzzy controllers: Applied to UAV See and Avoid task . . . 118
6.4.1. Color-Based Tracking . . . 118
6.4.2. Scaling Factor Optimized Fuzzy control of Quadrotors' Heading for See and Avoid . . . 119
6.4.3. Optimization process using the ROS-Gazebo simulation environment . . . 120
6.4.3.1. Software Implementation . . . 120
6.4.3.2. Simulator Experiments . . . 122
6.4.4. Experiments . . . 125
6.5. Membership Functions' sets and Rules' Weight optimization of Fuzzy controllers: Applied to UAV See and Avoid task . . . 131
6.5.1. Object detection and Positioning Estimation using QR codes . . . 132
6.5.2. Membership Functions and Rules Optimized Fuzzy control of Quadrotors' Heading for See and Avoid . . . 132
6.5.3. Optimization process using the Matlab-Simulink simulation environment . . . 136
6.5.3.1. Software Implementation . . . 136
6.5.3.2. Simulator Experiments . . . 140
6.5.4. Experiments . . . 152
6.6. Conclusions . . . 159

7. Thesis Conclusions . . . 163

A. Publications derived from this thesis . . . 167
A.1. Journals with JCR . . . 167
A.2. Peer-Reviewed Congress . . . 168
A.3. Book Chapters . . . 171
A.4. Digital media . . . 172
A.5. Participation on Technology Transfer Projects . . . 172
A.6. Research Positions . . . 173

B. MOFS: Miguel Olivares' Fuzzy Software . . . 175
B.0.1. How to use MOFS . . . 177

C. UAV specifications . . . 181
C.1. Gas and Electric powered UAV helicopters . . . 181
C.1.1. Helicopter Autopilot API . . . 181
C.2. UAV quadcopter . . . 185
C.2.1. AscTec Pelican . . . 185
C.2.1.1. Quadcopter autopilot SDK . . . 186

D. Research Groups on UAS and Vision Systems 189


List of Figures

1.1. Thesis Structure . . . 5

2.1. Position-Based Visual Control. . . . 12
2.2. Image-Based Visual Control. . . . 13

3.1. Driverless Car developed by Sebastian Thrun et al. . . . 18
3.2. Driverless Car projects before 2000. . . . 19
3.3. SARTRE Volvo project. . . . 19
3.4. VisLab driverless van, tested from Parma to Shangai. . . . 20
3.5. The high level of traffic nowadays in every city makes the detection of the lane almost impossible. . . . 20
3.6. Automated Citroen C3 Pluriel. . . . 21
3.7. Metal structure of the visual system. It limits the field of view of the camera to 30 x 50 cm. . . . 22
3.8. Representation of the circuit at the INSIA installations, taken with the Google Earth tool. . . . 22
3.9. Line and mark detection by the computer vision algorithm. . . . 25
3.10. Control loop of the visual servoing system for lateral control. FuzzyPD+I schema. . . . 26
3.11. First input variable of the Fuzzy controller: the error between the center of the line and the center of the image, in pixels. . . . 26
3.12. Second input variable of the Fuzzy controller: the difference between the last error and the current one, in pixels. . . . 27
3.13. Output variable of the Fuzzy controller: the steering wheel angle, in degrees. . . . 27
3.14. Representation of the steering wheel response. The delay from when the command is sent until the steering wheel reaches the commanded angle position is 0.4 seconds. . . . 29
3.15. Step response to a 50 pixels step at 10 km/h on a straight. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 7.1666 cm. . . . 30
3.16. Step response to a 50 pixels step at 15 km/h on a straight. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 7.0592 cm. . . . 30
3.17. Step response to a 50 pixels step at 10 km/h in a curve. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 7.8007 cm. . . . 31
3.18. Step response to a 50 pixels step at 15 km/h in a curve. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 7.2187 cm. . . . 31
3.19. Evolution of the system during 18 laps inside the circuit without mark detection. Constant speed was applied. . . . 32
3.20. Zoom on one lap of the circuit. . . . 33
3.21. Zoom of a 170 pixels step at the beginning of the second curve. . . . 33
3.22. Evolution of the system during the emergency stop test. . . . 34
3.23. Evolution of the system during 4 laps inside the circuit without mark detection. Variable speed was applied. . . . 35
3.24. Control loop of the visual servoing system with global localization. Fuzzy+I+Offset schema. . . . 36
3.25. Step response to a 50 pixels step at 10 km/h on a straight with the mark approach system. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 6.8402 cm. . . . 37
3.26. Step response to a 50 pixels step at 15 km/h on a straight with the mark approach system. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 6.719 cm. . . . 37
3.27. Step response to a 50 pixels step at 10 km/h in a curve with the mark approach system. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 5.1034 cm. . . . 38
3.28. Step response to a 50 pixels step at 15 km/h in a curve with the mark approach system. The error measurement, the steering wheel movements and the Fuzzy controller output are shown. The RMSE value of the test is 5.5429 cm. . . . 38
3.29. Evolution of the system during 9 laps inside the circuit with mark detection. Speed is set to 15 km/h. . . . 40
3.30. Evolution of the system during this test: 13 laps inside the circuit with mark detection. Variable speed with an average of 20 km/h. . . . 40
3.31. Evolution of the system during 13 laps inside the circuit with mark detection. Speed is set to 15 km/h. . . . 41
3.32. Evolution of the system during 21 laps inside the circuit with mark detection. Speed is set to 14 km/h. An emergency stop was applied. . . . 42
3.33. Evolution of the system during 30 laps inside the circuit with mark detection. Speed is set to 15 km/h. . . . 43
3.34. Evolution of the system with mark detection and speed changes from 0 to 48 km/h. . . . 44
3.35. Evolution of the system during 17 laps inside the circuit with mark detection. Speed changes from 15 to 25 km/h. . . . 45
3.36. Evolution of the system during 4 laps inside the circuit with mark detection. Speed changes from 6 to 38 km/h. . . . 46
3.37. Evolution of the system during 6 laps inside the circuit with mark detection. Speed changes from 14 to 42 km/h. . . . 47
3.38. Evolution of the system during 4 laps inside the circuit with mark detection. Speed changes from 14 to 42 km/h. . . . 48
3.39. Evolution of the system during 14 laps inside the circuit with mark detection. Speed changes from 10 to 23 km/h. . . . 49
3.40. Evolution of the system during 17 laps inside the circuit with mark detection. Speed changes from 14 to 24 km/h. . . . 50

4.1. Autonomous electric helicopter SR20. . . . 55
4.2. Autonomous gas-powered helicopter SRA1. . . . 56
4.3. Visual platform yaw controller. . . . 59
4.4. Visual platform pitch controller. . . . 59
4.5. Heading controller. . . . 60
4.6. Different rule-base and output-definition fuzzy systems used in laboratory tests. a) Without a more significant sector defined in the output. b) With an output range of action smaller than the one shown in Figure 4.5(c), with sets at -10, -7.5, -5, -2.5, 0, 2.5, 5, 7.5 and 10. c) The selected one, shown in Figure 4.5(c). . . . 62
4.7. A sample window of the helicopter Simulator. . . . 63
4.8. Error between the static object tracked and the center of the image, running with the UAV simulator. . . . 63
4.9. Response of the Fuzzy control for the Pitch axis of the visual platform tracking a static object with the simulator of the UAV control. . . . 63
4.10. Response of the Fuzzy control for the Yaw axis of the visual platform tracking a static object with the simulator of the UAV control. . . . 64
4.11. Response of the Fuzzy control for the heading of the helicopter. . . . 64
4.12. Heading movement. . . . 64
4.13. 3D flight reconstruction from the GPS and IMU data of the UAV. The 'X' axis represents the NORTH axis of the plane tangent to the surface of the earth, the 'Y' axis represents the EAST axis of the earth, 'Z' is the altitude of the helicopter, and the red arrows show the pitch angle of the helicopter. . . . 65
4.14. Different pitch and yaw movements of the UAV. . . . 66
4.15. Error between the center of the image and the center of the object to track. . . . 66
4.16. Output from the Fuzzy Controller. . . . 67
4.17. Input and Output Variables for the Pitch and Yaw controllers. . . . 68
4.18. 3D flight reconstruction using the GPS and IMU data of the UAV. The 'X' axis represents the NORTH axis of the plane tangent to the surface of the earth, the 'Y' axis represents the EAST axis of the earth, 'Z' is the altitude of the helicopter, and the red arrows show the pitch angle of the helicopter. . . . 69
4.19. Velocity changes of the UAV. . . . 69
4.20. Axis changes of the UAV in degrees. . . . 70
4.21. Error between the center of the image and the center of the object to track. . . . 70
4.22. Error between the center of the image and the center of the dynamic object (a van) to track. . . . 71
4.23. Response of the Fuzzy control for the Yaw axis of the visual platform tracking a dynamic object (a van). . . . 71
4.24. Response of the Fuzzy control for the Pitch axis of the visual platform tracking a dynamic object (a van). . . . 71
4.25. Projection model of a moving camera and frame-to-frame homography induced by a plane. . . . 73
4.26. Fuzzy controller for the UAV's altitude. . . . 76
4.27. Fuzzy controller for the UAV's roll. . . . 77
4.28. Fuzzy controller for the UAV's pitch. . . . 78
4.29. Helipad, camera and UAV coordinate systems. . . . 80
4.30. Measures of the X axis of the UAV (Roll) based on the homography estimation. . . . 81
4.31. Measures of the Y axis of the UAV (Pitch) based on the homography estimation. . . . 82
4.32. Measures of the altitude of the UAV based on the homography estimation. . . . 83
4.33. Measures of the heading of the UAV (yaw) based on the homography estimation. . . . 83
4.34. 3D pose estimation based on helipad tracking using Robust Homography estimation during a UAV landing process. Landing flight test beginning at an altitude of 5.2 m. For all images, the reference image I0 is in the small rectangle in the upper left corner. Left: the current frame; right: the Optical Flow between the current and last frames. Superimposed are the projection of the original rectangle, the translation vector and the Tait-Bryan angles. . . . 84
4.35. 3D flight and heading reconstruction for the flight sequence shown in Figure 4.34 (flight test 1). . . . 85
4.36. Comparison between the Z axis displacement from the homography estimation and the IMU data during the test. . . . 85
4.37. 3D flight reconstruction of the first test of fully autonomous landing. . . . 86
4.38. Homography estimation for pitch, roll and altitude of the second test of fully autonomous landing. . . . 87
4.39. 3D flight reconstruction of the second test of fully autonomous landing. . . . 87
4.40. Homography estimation for pitch, roll and altitude of the second test of fully autonomous landing. . . . 88
4.41. 3D flight reconstruction of the third test of fully autonomous landing. . . . 88
4.42. Homography estimation for pitch, roll and altitude of the third test of fully autonomous landing. . . . 88
4.43. 3D flight reconstruction of the fourth test of fully autonomous landing. . . . 89
4.44. Homography estimation for pitch, roll and altitude of the fourth test of fully autonomous landing. . . . 89

5.1. Quadrotor mechanics. Motors 1 and 3 spin clockwise and motors 2 and 4 spin counter-clockwise. Picture from www.wikipedia.org. . . . 94
5.2. CVG-UPM (CVG, 2012) Pelican Quadrotor UAV used for the Aerial Object Following tests. . . . 95
5.3. AR.Drone Parrot. . . . 95
5.4. Camshift tracking of a red colored target in an image sequence. The white circle corresponds to the boundaries of the tracked colored area. . . . 96
5.5. Image captured with the onboard camera. . . . 97
5.6. Aerial Object Following application described with the onboard and external images of a real situation. . . . 97
5.7. Definition of the Yaw controller: First input. Estimation of the deviation of the object from the centre of the image captured from the UAV. . . . 98
5.8. Definition of the Yaw controller: Second input. Difference between the last two measurements. . . . 98
5.9. Definition of the Yaw controller: Output. Velocity commands to change the heading of the UAV. . . . 99
5.10. Definition of the Pitch controller: First input. Estimation of the deviation of the object from the centre of the image captured from the UAV. . . . 99
5.11. Definition of the Pitch controller: Second input. Difference between the last two measurements. . . . 99
5.12. Definition of the Pitch controller: Output. Velocity commands to change the heading of the UAV. . . . 99
5.13. Object Following Dynamic image-based look-and-move fuzzy servoing system architecture. . . . 101
5.14. Trajectory of the Pelican-UAV during the First Aerial Object Following Test. . . . 102
5.15. Measurements for the distance between the UAV and the aerial object and response of the Pitch controller of the first test. . . . 103
5.16. Measurements for the orientation of the UAV and response of the Yaw controller of the first test. . . . 103
5.17. Trajectory of the Pelican-UAV during the Second Aerial Object Following Test. . . . 104
5.18. Measurements for the distance between the UAV and the aerial object and response of the Pitch controller of the second test. . . . 104
5.19. Measurements for the orientation of the UAV and response of the Yaw controller of the second test. . . . 105
5.20. Trajectory of the Pelican-UAV during the Third Aerial Object Following Test. . . . 105
5.21. Measurements for the distance between the UAV and the aerial object and response of the Pitch controller of the third test. . . . 106
5.22. Measurements for the orientation of the UAV and response of the Yaw controller of the third test. . . . 106
5.23. Definition of the Yaw controller. . . . 107
5.24. Schematic diagram of the control loop. . . . 109
5.25. 3D flight reconstruction of the flight path. The obstacle to avoid is an orange traffic cone located at position (5,0,1.1). . . . 110
5.26. Onboard images taken during the execution of the test. Figures 5.26(a) and 5.26(b) are previous to the control activation. Figures 5.26(c) and 5.26(d) are during the control process, and Figure 5.26(e) is when the obstacle has been overtaken. . . . 111
5.27. Evolution of the error during a real test. . . . 111

6.1. Image captured with the onboard camera. . . . 119
6.2. PID-Fuzzy Controller: Membership function of the first input, the error. . . . 119
6.3. PID-Fuzzy Controller: Membership function of the second input, the derivative of the error. . . . 120
6.4. PID-Fuzzy Controller: Membership function of the third input, the integral of the error. . . . 120
6.5. PID-Fuzzy Controller: Membership function of the output, heading degrees to turn. . . . 120
6.6. Flowchart of the optimization process. . . . 122
6.7. Interaction between the ROS-Gazebo 3D simulator and the two other processes developed for this work. . . . 122
6.8. Control loop with the optimization of the Cross-Entropy method. . . . 123
6.9. Evolution of the probability density function for the first input gain. The standard variance converges in 12 iterations to a value of 0.0028, so the obtained mean of 0.9572 can be used in the real tests. . . . 123
6.10. Evolution of the probability density function for the second input gain. The standard variance converges in 12 iterations to a value of 0.0159, so the obtained mean of 0.4832 can be used in the real tests. . . . 124
6.11. Evolution of the probability density function for the third input gain. The standard variance converges in 12 iterations to a value of 0.0015, so the obtained mean of 0.4512 can be used in the real tests. . . . 124
6.12. Evolution of the ITAE error during the 12 Cross-Entropy iterations. The ITAE value of each iteration corresponds to the mean of the first 5 of the 30 controllers of each iteration. . . . 124
6.13. Evolution of the gains of each input. The value of the gain corresponds to the first 5 of the 30 controllers of each iteration. . . . 125
6.14. Parrot AR.Drone, the platform used for the real tests. . . . 125
6.15. Control loop with the optimization of the Cross-Entropy method. . . . 125
6.16. Explanation of the avoiding task approach. The quadcopter starts at point 0.0 (1. Motor ignition) and flies 0.5 meters keeping the obstacle to avoid in the center of the image (2. Avoiding task: start). Then the reference to one of the edges of the image is added to the position of the obstacle in the image plane until 3.5 meters (3. Avoiding task: finish). The quadrotor continues. The last yaw command is sent after the avoiding task is finished. The obstacle to avoid is at point (0,4.5). . . . 126
6.17. Onboard images during the execution of the test. . . . 127
6.18. Evolution of the error during a real test at 0.04 m/s forward speed using the non-optimized fuzzy controller. An RMSE of 9.0081 was obtained. . . . 129
6.19. Evolution of the error during a real test at 0.04 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method. An RMSE of 5.271 was obtained, more than a 40% reduction. . . . 129
6.20. 2D reconstruction of the trajectory defined during a real test at 0.04 m/s forward speed using the non-optimized fuzzy controller. . . . 129
6.21. 2D reconstruction of the trajectory defined during a real test at 0.04 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method. . . . 130
6.22. Evolution of the error during a real test at 0.08 m/s forward speed using the non-optimized fuzzy controller. An RMSE of 9.0081 was obtained. . . . 130
6.23. 2D reconstruction of the trajectory defined during a real test at 0.08 m/s forward speed using the non-optimized fuzzy controller. . . . 130
6.24. Evolution of the error during a real test at 0.08 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method. An RMSE of 5.271 was obtained, more than a 40% reduction. . . . 131
6.25. 2D reconstruction of the trajectory defined during a real test at 0.08 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method. . . . 131
6.26. Detection of an AR marker using the ArUco software. . . . 132
6.27. See-and-avoid task for a UAV. . . . 133
6.28. Initial definition of the variables of the Fuzzy Logic Controller before any optimization. . . . 134
6.29. Quadcopter's Matlab Simulink block. . . . 136
6.30. Matlab Simulink block of the flying object to avoid. . . . 137
6.31. Onboard UAV camera's Matlab Simulink block. . . . 137
6.32. Fuzzy Logic Controller Matlab Simulink block. . . . 137
6.33. Seeing and Avoiding Task Chart module. . . . 137
6.34. Real-time dynamic training process in Matlab Simulink. . . . 138
6.35. Interaction between the three parts of the Matlab process: the Cross-Entropy optimization method, the Simulink block structure and the main process that manages the evolution of the simulations. . . . 139
6.36. Representation of the software implementation in a flowchart. . . . 139
6.37. 25 degrees step response of the best controller of each optimization phase using the error estimation RMSE (Table 6.10). . . . 141
6.38. 25 degrees step response of the best controller of each optimization phase using the error estimation ITSE (Table 6.11). . . . 142
6.39. Control loop of the Cross-Entropy method for the membership functions optimization. . . . 143
6.40. Evolution of the probability function of the inputs' membership functions. . . . 144
6.41. Evolution of the probability density function of the output's membership functions. . . . 145
6.42. Evolution of the sigma value for the membership functions optimization phase. . . . 146
6.43. Evolution of the values for the membership functions optimization phase. . . . 146
6.44. ITSE evolution in the first part of the optimization process. . . . 147
6.45. Variable definition of the Fuzzy logic controller after the two optimization phases of the membership functions. . . . 148
6.46. Control loop of the Cross-Entropy method for the rules' weight optimization. . . . 149
6.47. Evolution of the probability density function of three selected rules' weights. . . . 150
6.48. Evolution of the sigma value for the selected rules and the mean of all the rules. . . . 151
6.49. Evolution of the values of the selected rules during the optimization process. . . . 151
6.50. ITSE evolution in the first part of the optimization process. . . . 151
6.51. Description of the avoiding task test. . . . 152
6.52. Significant processed images from the avoiding task states described in Figure 6.51. . . . 153
6.53. Control loop for the real tests. . . . 153
6.54. Real test results of the three controllers for 0.2 m/s pitch speed. . . . 156
6.55. Real test results of the three controllers for 0.4 m/s pitch speed. . . . 157
6.56. Real test results of the three controllers for 0.6 m/s pitch speed. . . . 158
6.57. Real test results of the three controllers for 0.8 m/s pitch speed. . . . 158
6.58. Real test results of the three controllers for 1.0 m/s pitch speed. . . . 159
6.59. Fully Optimized controller with 1.2 m/s pitch speed. . . . 159
6.60. Fully Optimized controller with 1.4 m/s pitch speed. . . . 159

B.1. Software definition. . . . 176
B.2. Example of a variable with a most important sector defined in its center. . . . 177

C.1. UAV rotary-wing helicopters used in this thesis. . . . 182
C.2. UAV rotary-wing quadcopter used in this thesis. . . . 185


List of Tables

2.1. Classification of sensors used in unmanned vehicles. A, active; P, passive; P/A, passive/active; PC, proprioceptive; EC, exteroceptive. Table from (Siegwart and Nourbakhsh, 2004). . . . 9

3.1. Base of rules of the Fuzzy controller. . . . 28
3.2. Results of the 50 pixels step perturbation. . . . 32
3.3. Results obtained with the driverless car without mark detection. . . . 35
3.4. Results of the 50 pixels step perturbation for the system with mark detection. . . . 39
3.5. Results obtained with the driverless car with mark detection. . . . 45

4.1. Meaning of the acronyms of the linguistic values of the fuzzy variables' inputs and the output. . . . 60
4.2. Rules' base for the Yaw and Pitch controllers. . . . 61
4.3. Rules' base for the Heading controller. . . . 61
4.4. Measures of the most relevant Fuzzy system. . . . 62
4.5. Data from sections of the flight with big attitude changes. . . . 67
4.6. Rules' base of the altitude Fuzzy controller. . . . 79
4.7. Rules' base of the roll Fuzzy controller. . . . 79
4.8. Rules' base of the pitch Fuzzy controller. . . . 79
4.9. Comparison between the homography estimation of X, Y, Z and the IMU data. . . . 82
4.10. Comparison between the homography estimation of the heading and the IMU data. . . . 82
4.11. Autonomous landing tests. . . . 89

5.1. Rules' base of the Heading Fuzzy logic controller. . . . 100
5.2. Rules' base of the Pitch Fuzzy logic controller. . . . 100
5.3. RMSE of the Pitch and Yaw controllers for the three presented tests. . . . 107
5.4. Base of rules of the Fuzzy controller. . . . 108

6.1. Base of rules with the value for the third input (Integral of the Error) equal to Zero. . . . 121
6.2. Base of rules with the value for the third input (Integral of the Error) equal to Negative. . . . 121
6.3. Base of rules with the value for the third input (Integral of the Error) equal to Positive. . . . 121
6.4. Comparison between the non-optimized and the Cross-Entropy optimized Fuzzy controllers at different speeds. . . . 128
6.5. Base of rules with the value for the third input (integral of the error) equal to zero, without CE optimization. . . . 134
6.6. Base of rules with the value for the third input (integral of the error) equal to negative, without CE optimization. . . . 135
6.7. Base of rules with the value for the third input (integral of the error) equal to big negative, without CE optimization. . . . 135
6.8. Base of rules with the value for the third input (integral of the error) equal to positive, without CE optimization. . . . 135
6.9. Base of rules with the value for the third input (integral of the error) equal to big positive, without CE optimization. . . . 135
6.10. Optimization using RMSE. . . . 141
6.11. Optimization using ITSE. . . . 142
6.12. Optimization results for the membership functions. The different values of the sets' locations of the different membership functions after the different optimization processes are presented. . . . 147
6.13. Base of rules with the value for the third input (integral of the error) equal to negative, after CE optimization of the membership functions. . . . 150
6.14. Base of rules with the value for the third input (integral of the error) equal to zero, after CE optimization of the membership functions. . . . 150
6.15. Base of rules with the value for the third input (integral of the error) equal to positive, after CE optimization of the membership functions. . . . 151
6.16. Comparison between the non-optimized controller (Non-Opt) and two Cross-Entropy optimized controllers at different stages: one with the membership functions optimized (Half-Opt), and the other including both the membership functions and the rules' weight optimization (Full-Opt). The three controllers were tested at different speeds. Four tests were done to check the behavior of each controller at each speed. The best test is highlighted in bold. . . . 155

C.1. Rotomotion UAVs technical specifications. . . . 183
C.2. Pelican AscTec UAV technical specifications. . . . 186


computer HAL: "I'm sorry, Frank, I think you missed it: Queen to bishop three, bishop takes queen, knight takes bishop, mate."

cosmonaut Dr. Frank Poole: "Uh. Yeah, looks like you're right. I resign."

HAL: "Thank you for a very enjoyable game."

the cosmonaut: "Yeah, thank you."

2001: A Space Odyssey, Arthur C. Clarke, 1968.


Chapter 1

Introduction

Humans have to interact with the environment they are in. To do this, we have to detect things, measure distances and sizes, evaluate the situation, and act. Robotics has been trying to replicate these abilities for decades. For the first two actions we use our senses, while robots use sensors. To evaluate situations, we use the brain and our previously learned knowledge; for robots, a set of different artificial intelligence techniques and strategies has been developed. Finally, we use our limbs, commanded by the brain with continuous feedback, to interact with the environment. Robots have to use their own actuators to accomplish this interaction with the environment.

We can use robots to accomplish certain missions by substituting for a person, in situations that are inaccessible or involve risk to life, or by helping him/her to perform the action in a better way. In this Thesis we try to cover both cases in different situations with ground and aerial vehicles. The sensor used to interact is vision, because of the rich information it provides, and the artificial intelligence technique is Fuzzy control, because of its resemblance to human reasoning.

1.1. Motivation and Problem Statement

Nowadays there are a large number of vehicles inside and outside cities. Car accidents are one of the primary causes of death in Europe. In order to reduce the number of accidents, robotics can replace the driver with a fully autonomous driving system, or give extra help to the driver to increase the safety of the car occupants, other vehicles and pedestrians. There are research projects around the world trying to automate cars for autonomous driving using a large number of expensive sensors. Those systems are focused on driving outside cities or in places with low traffic. Inside cities, the high level of traffic, the occlusion of the street lanes and GPS accuracy problems, among other difficulties, make this task harder to solve. The principal car companies are including more and more sensors in their vehicles every day, helping the driver to park, to brake in dangerous situations, etc. A specific case inside city traffic is public transport services, which follow the same path every day at every hour. This problem can be faced as a guided vehicle at large scale. Autonomous guided vehicles have usually been used in very restricted areas, such as company warehouses. Here we present a driving-help solution focused on public transport services based on autonomous guided vehicles.

Aerial vehicles are widely used for more than passenger transport. A view from a certain altitude can give us a different perspective on a problem. The inspection of big structures, large areas or places without ground access cannot be done using ground vehicles. Some of these tasks can involve high risk to the lives of the people onboard the aerial vehicle, as in catastrophe or fire areas. In other cases, using an unmanned aerial vehicle to access confined spaces or to do inspection tasks can imply a large cost reduction for a company. The UAV is currently a highlighted topic in robotics because of the range of tasks it makes possible. Here we present a set of research works using different unmanned aerial vehicles to accomplish different inspection and autonomous navigation tasks.

As previously mentioned, there is a set of sensors to measure the environment of the robot. Each one has its own pros and cons and its different cost. In aerial vehicles we also have a weight restriction on the choice of the sensor. For both types of vehicles (ground and aerial) we use a vision sensor to get information about the robot's surroundings. This sensor has the benefit that the information obtained can be processed in different ways. A large number of algorithms has been developed for different purposes, like object detection, motion measurement, or image segmentation. Because of this, and because of its low price, it is the most used sensor in robotics. In this Thesis, a set of different visual algorithms is used to accomplish different tasks.

When a robot is used in an uncontrolled environment, unexpected disturbances increase the difficulty of the problem. Robots are non-linear machines whose models are hard to identify, and the model can be affected by small mechanical mismatches from one day to another. Non-linear control techniques have to be used to control robots in highly dynamic environments. Soft-Computing techniques are taking the leading position in this robotics field, thanks to their model-free controllers and the way they manage the uncertainty of the sensors. One of the most important Soft-Computing techniques is Fuzzy logic. This technique is used in the present Thesis to control ground and aerial vehicles for different tasks using vision.

1.2. Main Contributions

This Thesis contributes new solutions for an extended number of applications of Fuzzy control for unmanned vehicles using visual control. These are distributed in chapters depending on the type of vehicle to control (ground, helicopter and multicopter). The high capability of cameras to measure the environment in real-time applications is also presented. A final chapter presents the contributions of this dissertation on Fuzzy control optimization.

There is a wide range of sensors to measure the robot's surroundings. Each one has its pros and cons, and visual sensors are no exception. Their weak resistance to lighting changes is offset by a good trade-off between efficiency and low cost, which is one of the big advantages of this sensor, together with the large number of algorithms presented in the literature for different visual applications. In this dissertation, four different visual algorithms are used to extract characteristics from the environment, giving the Fuzzy control enough information to operate the unmanned vehicle in different applications.

The choice of control strategy is always hard work when a process has to be automated. When the process is a vehicle in an outdoor environment, the number of possible solutions is reduced drastically. The different possible perturbations present in this type of test, added to the high difficulty of modeling a real vehicle, make Soft-Computing techniques one of the best options. The Fuzzy control technique is used in this research work to control four different models of three kinds of vehicles: a driverless car, an RC helicopter and two quadcopters. All the presented Fuzzy controllers are implemented using the self-developed software (MOFS). The different control systems are used for five different applications, showing a set of real tests in every case.

The final contribution of this Thesis to the literature is the novel approach of using the Cross-Entropy optimization method to tune different parameters of a Fuzzy controller. This method was previously used in the literature for the optimization of the inputs' gains. In this research work a complementary optimization is presented: the use of this method for the optimization/tuning of the membership function sets and the rules' weights, tested with a quadrotor on a collision avoidance task.

It has to be mentioned that all the software developed during this research work, the MOFS for designing and configuring Fuzzy controllers and the Cross-Entropy implementation for optimizing them, is available under an open-source policy for whoever wants to use it.


1.3. Outline

Chapter 2 presents a review of the control of unmanned ground and aerial vehicles and the solutions presented in the literature. There are many possibilities to face this problem, and in this Thesis we focus on visual control because of the efficiency of this sensor. This chapter also presents how a camera sensor can be used to close the control loop, and the selected methodology of image-based visual control. Since the principal aim of this Thesis is to contribute novel Fuzzy control systems for different robots without knowing their models, the last section of this chapter presents this technique and mentions the most relevant works on unmanned vehicle control and other industrial control systems.

Chapter 3 presents a new Fuzzy control system for an unmanned ground vehicle. A driverless car is presented using just one camera with a limited field of view. The visual algorithm for detecting the painted line on the road, and the design and detection of visual marks for car localization, are also presented. The lateral Fuzzy control of the vehicle is presented and tested with a large number of experiments, both with and without localization information.

Chapter 4 presents novel Fuzzy control systems using visual information to control an autonomous helicopter. Firstly, a Fuzzy control of a visual pan and tilt platform and of the vehicle's heading is used to track moving targets, as an eye in the sky. Secondly, the control of the pitch, roll and altitude of the helicopter is presented for an autonomous landing task. The different visual algorithms used in both applications are also explained.

Chapter 5 presents a contribution in visual Fuzzy control for unmanned multicopters. A color-based tracking is used for two different applications with two different quadrotors. First, a Fuzzy control of the pitch and heading of an AscTec Pelican quadcopter is used to follow a flying object. Second, a Fuzzy control of the heading is used to control an AR.Drone for a see-and-avoid task.

Chapter 6 presents a novel approach of the Cross-Entropy optimization method for tuning different parameters of a Fuzzy controller. A color-based and a QR code detection visual algorithm are used to avoid collisions. Firstly, the optimization of the inputs' gains using the robotics operating system (ROS) is presented. Secondly, the optimization of the membership function sets' size and location and of the rules' weights is shown.

Chapter 7 presents the Thesis conclusions and the possible lines for futurework.

A visual outline of the presented Thesis is shown in Figure 1.1.


Figure 1.1: Thesis Structure.


Chapter 2

State of the Art

This chapter presents the state of the art of unmanned vehicles, visual control and Fuzzy control. More detailed information about the related works presented in the literature for each application explained in this Thesis is given in the corresponding chapter.

2.1. Introduction

Unmanned vehicle is a term that refers to a vehicle without a human pilot onboard. In the specific case of aerial robots, they are commonly known as drones. Ground vehicles are also called driverless or pilotless. In both cases the vehicle has to measure its environment to know its local and/or global position and to execute the desired actions.

The UAV and UGV history, development, architectures and navigation control theory are outside the focus of this thesis. The aerial robotics chapter of the Handbook of Robotics (Siciliano and Khatib, 2008a) gives a general review of UAV history, flight concepts and applications, as does (Puri et al., 2007). Regarding UGVs, interesting information about historical facts and the evolution of mobile robots can be found in (Siegwart and Nourbakhsh, 2004), (Nehmzow, 2003) and the mobile robotics chapter of (Siciliano and Khatib, 2008a).


2.2. Unmanned Vehicles

The interaction between a robot and its surroundings is generally divided into perception, localization, and planning and navigation. A brief description of each problem is presented next. More detailed information can be found in (Siegwart and Nourbakhsh, 2004), (Siciliano and Khatib, 2008a).

2.2.1. Perception

One of the most important tasks of an autonomous system or vehicle of any kind is to acquire knowledge about its environment. This is done using one or more sensors to take measurements and extract the meaningful information. There is a wide variety of sensors used in unmanned vehicles. Some sensors are used to measure simple values like the rotational speed of the motors. Other, more sophisticated sensors can be used to acquire information about the robot's environment or even to directly measure the robot's global position. Sensors can be classified as proprioceptive/exteroceptive and passive/active.

Proprioceptive sensors measure the robot's internal values, like motor speed or battery voltage.

Exteroceptive sensors acquire information from the robot's environment, like distances or light intensity. This information must be interpreted to extract its meaning.

Passive sensors measure ambient environmental energy entering the sensor, like microphones and cameras.

Active sensors emit energy into the environment and measure its reaction, like lasers and sonars.

Table 2.1 shows a classification of the most useful sensors for unmanned vehicle applications. This classification is arranged in ascending order of complexity and descending order of technological maturity. Tactile sensors and proprioceptive sensors are critical to almost all mobile robots, and are well understood and easily implemented. At the other extreme, visual interpretation by means of one or more cameras provides a broad set of potential functionalities.

The most important sensors for unmanned vehicles are the exteroceptive ones. The choice between passive and active sensors depends on the task to solve and on the battery limitations of the vehicle. Active sensors can be more accurate than passive sensors but imply extra energy consumption; furthermore, the sensor's signal may suffer interference. Since ground vehicles have larger battery capacities than aerial robots, this type of sensor is more commonly used on the former.

The most promising sensor for the future of unmanned vehicles is vision, and Section 2.3 presents an overview of visual control in robotics. This sensor provides a large amount of data about the robot's surroundings, and a wide range of different kinds of algorithms has been developed for it in the last two decades.


General classification (typical use)          Sensor / Sensor system          PC/EC  A/P
Tactile sensors                               Contact switches, bumpers       EC     P
(detection of physical contact or             Optical barriers                EC     A
closeness; security switches)                 Noncontact proximity sensors    EC     A
Wheel/motor sensors                           Brush encoders                  PC     P
(wheel/motor speed and position)              Potentiometers                  PC     P
                                              Synchros, resolvers             PC     A
                                              Optical encoders                PC     A
                                              Magnetic encoders               PC     A
                                              Inductive encoders              PC     A
                                              Capacitive encoders             PC     A
Heading sensors                               Compass                         EC     P
(orientation of the robot in relation         Gyroscopes                      PC     P
to a fixed reference frame)                   Inclinometers                   EC     A/P
Ground-based beacons                          GPS                             EC     A
(localization in a fixed                      Active optical or RF beacons    EC     A
reference frame)                              Active ultrasonic beacons       EC     A
                                              Reflective beacons              EC     A
Active ranging                                Reflectivity sensors            EC     A
(reflectivity, time-of-flight,                Ultrasonic sensor               EC     A
and geometric triangulation)                  Laser rangefinder               EC     A
                                              Optical triangulation (1D)      EC     A
                                              Structured light (2D)           EC     A
Motion/speed sensors                          Doppler radar                   EC     A
(speed relative to fixed                      Doppler sound                   EC     A
or moving objects)
Vision-based sensors                          CCD/CMOS camera(s)              EC     P
(visual ranging, whole-image                  Visual ranging packages
analysis, segmentation,                       Object tracking packages
object recognition)

Table 2.1: Classification of sensors used in unmanned vehicles. A, active; P, passive; A/P, active/passive; PC, proprioceptive; EC, exteroceptive. Table from (Siegwart and Nourbakhsh, 2004).


2.2.2. Localization

Navigation is one of the most challenging competences required of an unmanned robot. Success in navigation depends on success in each of its four components: perception, the robot has to extract the meaningful information acquired by its sensors; localization, the robot has to determine its position in the environment; cognition, the robot has to decide what the next step is to accomplish its mission; and motion control, the robot must modulate its motor or servo outputs to modify its trajectory towards the desired one.

Among these four blocks, localization has received the greatest research attention in the past decade, and significant advances have been obtained on this front.

If one could attach an accurate GPS sensor to an unmanned vehicle, much of the localization problem would be solved. Unfortunately, this sensor is not currently practical for this specific task. The accuracy provided by the current GPS networks is within several meters, which is unacceptable for human-scale robots, and needless to say for minirobots. Beyond GPS limitations, localization implies more than knowing one's absolute position in the Earth's reference frame; it means building a map, and then identifying the robot's position in relation to that map. Clearly, the robot's sensors and effectors play an integral role in all forms of localization. Given the inaccuracy and incompleteness of these sensors, a set of sensors and strategies has been developed to face this problem.

To achieve high localization accuracy, the most widely used sensor is the laser. The problem used to be that lasers were quite big, heavy and expensive, and provided only 2D plane information. Nowadays there are scaled-down versions of laser sensors that can be used onboard small quadcopters. There are also other strategies to localize robots using other sensors, like vision, landmarks and odometry.

Laser & encoders localization. Encoders onboard mobile robots provide an estimation of the position of the robot. This information is not reliable enough because of the presence of errors. The use of other sensors to detect objects from a previous map can help, but this information is not reliable enough either. The laser is the most used sensor for simultaneous localization and map building (SLAM). The most effective and popular solutions to SLAM have been based on probabilistic techniques, such as Fast-SLAM, EKF-SLAM and Graph-SLAM. The main differences between them concern the environment representation and the uncertainty description. An extended review of map building techniques can be found in (Thrun, 2002) and (de la Puente Yusty, 2012).

Visual localization. Visual SLAM can be done using landmarks or based on feature detection. The early works of Davison (Davison, 1998), (Davison, 2003) present a single-camera SLAM algorithm based on a Kalman filter. The features are initialized by using a particle filter which estimates their depth. Montiel extended this framework in (Montiel et al., 2006) by proposing an inverse depth parameterization of the landmarks. Chekhlov used SIFT-like feature descriptors and tracked the 3D motion of a single camera by using an unscented Kalman filter (Chekhlov, 1998). More recently, Klein and Murray (Klein and Murray, 2007) developed parallel tracking and mapping (PTAM), which introduced a new strategy based on keyframes for robust visual SLAM.

Landmark-based navigation. Landmarks are generally defined as passive objects in the environment that provide a high degree of localization accuracy when they are within the robot's field of view. The control system for a landmark-based navigator consists of two different phases. When a landmark is detected, the robot updates its position without cumulative error. But when the robot is in a no-landmark "zone", only the action update occurs, and the robot accumulates position uncertainty until the next landmark is detected. The robot is thus effectively dead-reckoning from landmark zone to landmark zone. This in turn means the robot must consult its map carefully, ensuring that each motion between landmarks is sufficiently short, given its motion model, that it will be able to localize successfully upon reaching the next landmark. Some of the first works on robot navigation with landmarks are (Lazanas and claude Latombe, 1992), (Lazanas and Latombe, 1995), (Nourbakhsh et al., 1999).

2.2.3. Planning and Navigation

Cognition generally represents the purposeful decision-making and execution that a system uses to achieve its highest-order goals. In the case of unmanned vehicles, the specific concept of cognition directly linked to robust mobility is navigation competence. Navigation encompasses the ability of the robot to act based on its knowledge and sensor values so as to reach its goal positions as efficiently and as reliably as possible.

In the artificial intelligence community, planning and reacting are often viewed as contrary approaches or even opposites. In fact, when applied to physical systems such as unmanned robots, planning and reacting have a strong complementarity. This holds in most applications with unmanned vehicles, but when the goal position depends on the dynamic position of something in the environment, it is possible to base the navigation strategy on the sensor information alone. This is the case in applications like object following or sense and avoid.

There is a wide range of control architectures and techniques used to manage unmanned vehicles, which are explained extensively in the literature; some of these works are (Siegwart and Nourbakhsh, 2004), (Nehmzow, 2003). In this Thesis we use Fuzzy logic to develop the control systems that command the different unmanned vehicles. Section 2.4 presents a brief description of this technique and some of the works presented in the literature.

2.3. Visual Control

Vision is a useful robotic sensor since it mimics the human sense of vision and allows non-contact measurement of the environment. The use of vision with robots has a long history, starting with the work of Shirai and Inoue (Shirai and Inoue, 1973). They describe how a visual feedback loop can be used to correct the position of a robot to increase task accuracy. Today, vision systems are available from major vendors and are highly integrated into robot systems. Typically, visual sensing and manipulation are combined in an open-loop fashion, "looking" and then "moving". The accuracy of the results depends directly on the accuracy of the visual sensor, the robot end-effector and the controller. An alternative to increase the overall accuracy of the system is to use a visual feedback control loop. The term visual servoing was first mentioned by Hill and Park (Hill and Park, 1979) to distinguish their approach from the earlier "blocks world" experiments, where the system alternated between picture taking and moving.


Before this concept was introduced, the less specific term visual feedback was generally used. Visual servoing is no more than the use of vision at the lowest level, with simple image processing to provide reactive or reflexive behavior to servo-position a robotic system. Classical visual servo control was developed for serial-link robotic manipulators with the camera typically mounted on the end-effector, a configuration also called eye-in-hand. For the purpose of this Thesis, the task in visual servoing is to use the visual information to control the unmanned vehicle's pose with respect to objects or landmarks present in the environment.

Since the first work on visual servoing systems, presented by Sanderson (Sanderson and Weiss, 1983) in the 1980s, progress in the visual control of robots was fairly slow. It was the increase in computing power that made it possible to analyze images at a sufficient rate to "servo" a robot manipulator. A huge number of applications appeared in the last two decades. A comprehensive review of the literature in this field, as well as its history and robotic applications, is given by Corke (Corke, 1994) and includes a large bibliography.

Vision-based robot control using an eye-in-hand system is classified into two groups (Hamel and Mahony, 2002), (Hutchinson et al., 1996): position-based and image-based control systems. In the literature these are also called PBVS, for Position-Based Visual Servoing, and IBVS, for Image-Based Visual Servoing.

PBVS involves reconstruction of the target pose with respect to the robot and leads to a Cartesian motion planning problem. This kind of control is based on three-dimensional information from the scene (Soatto and Perona, 1994), (Ma et al., 2000). It therefore needs the geometric model of the object to track and a calibrated model of the camera, with which it obtains an estimation of the position and orientation of the object (Figure 2.1). The sensitivity of PBVS designs to camera calibration is particularly worrying when low-quality cameras are used.

Figure 2.1: Position-Based Visual Control.

In contrast, for IBVS the control task is defined in terms of image features. A controller is designed to maneuver the image features towards a desired configuration, so the original Cartesian motion planning problem is sidestepped. The approach is inherently robust to camera calibration and target modeling errors, and reduces the computational cost (Figure 2.2).

Figure 2.2: Image-Based Visual Control.

However, this configuration implies extra complexity in the control design problem. The approach used in classical IBVS control (Espiau et al., 1992) cannot be used for the integrated control of non-linear, fully dynamic systems. Recent approaches to robustly extending IBVS to dynamic systems include dissipative control design in the image space (Maruyama and Fujita, 1999), robust backstepping (Zergeroglu et al., 1999), (Mahony and Hamel, 2005a), and optimal control techniques (Zhang and Ostrowski, 1999). In particular, the dynamics of aerial robotic vehicles have proved difficult to overcome, and very few rigorous developments of IBVS control exist: the simulated helicopter stabilization with robust non-linear control shown in (Isidori et al., 2003); a helicopter kinematics simplified as a Liouvillian system (Sira-Ramirez et al., 2000); path planning simulation examples involving a ground robot and a small autonomous helicopter (Frazzoli et al., 2001); and simulation experiments with a UAV tracking parallel lines (Mahony and Hamel, 2005b). On the other hand, there are more applications using the position-based visual control configuration, such as the trajectory tracking with a helicopter presented in (Frazzoli et al., 2000), autonomous helicopter landing (Shakernia et al., 1999), (Saripalli et al., 2002), and (Altug et al., 2002), in which the control of an unmanned quadcopter is presented.
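As a point of reference for the IBVS discussion above, the classical proportional law of (Espiau et al., 1992), as summarized in the tutorial of (Hutchinson et al., 1996), can be sketched for point features. This is a generic illustration, not code from this Thesis; the feature depths and the gain are assumed to be available.

    import numpy as np

    def interaction_matrix(x, y, Z):
        # Interaction (image Jacobian) matrix of a normalized image point
        # (x, y) at depth Z, relating feature velocity to camera velocity.
        return np.array([
            [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
            [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
        ])

    def ibvs_velocity(features, desired, depths, gain=0.5):
        # Stack the interaction matrices of all points and apply the
        # classical law: v = -lambda * pinv(L) * (s - s*).
        L = np.vstack([interaction_matrix(x, y, Z)
                       for (x, y), Z in zip(features, depths)])
        error = (np.asarray(features) - np.asarray(desired)).reshape(-1)
        return -gain * np.linalg.pinv(L) @ error

    # Example: two tracked points slightly off their desired positions.
    v = ibvs_velocity([(0.1, 0.0), (-0.1, 0.05)],
                      [(0.0, 0.0), (0.0, 0.0)],
                      depths=[1.0, 1.0])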

2.4. Fuzzy Control

At the beginning of the 1960s, artificial intelligence (AI) found its way into industrial applications. Expert knowledge-based decision making was the most attractive area for the design and monitoring of industrial processes using this technique. The advances in computer technology allowed many applications to be developed. The invention of Fuzzy chips in the 1980s was the milestone for this technique in industrial applications, especially in Japan. Neural networks and evolutionary computation also received a strong boost in industry and academia. As a result of these events, Soft-Computing was born.

Nowadays, Soft-Computing continues to play an important role in modeling, system identification, and control systems. The principal techniques of Soft-Computing are Fuzzy logic, neural networks, genetic algorithms, genetic programming, chaos theory and probabilistic reasoning. One of the principal components is Fuzzy logic. From a control-theoretical point of view, Fuzzy logic has been used for all the important aspects of systems theory, that is, modeling, identification, analysis, stability, synthesis, filtering, and estimation. Fuzzy control provides a formal methodology for representing, manipulating, and implementing a human's heuristic knowledge about how to control a system.

The Fuzzy control design methodology can be used to construct Fuzzy controllers for challenging real-world applications. As opposed to classical control approaches, like proportional-integral-derivative (PID), lead-lag, and state feedback control, where the focus is on modeling and on using this model to construct a controller described by differential equations, in Fuzzy control we focus on gaining an intuitive understanding of how to best control the process, and then load this information directly into the Fuzzy controller. Basically, the motivation for using Fuzzy control is to avoid the difficult task of modeling and simulating complex real-world systems for control system development.

There is a huge number of uses of Fuzzy control in different research fields. In the early 1970s, during the world's first big energy crisis, the concept of building energy management systems (BEMS) was introduced. Fuzzy control was used to manage "smart" buildings by simultaneously sensing, controlling, and monitoring the internal environment in response to the external climate factors (Talebu-Daryani, 1999). In (Talebu-Daryani, 1995) the system structure was established. A control application for managing air conditioning systems is presented in (Lygouras et al., 2008), (Eftekhari et al., 2003). An extended review of Fuzzy control applications for BEMS is presented in (Talebu-Daryani and Plass, 1998), (Talebu-Daryani and Luther, 1998).

Several applications of FCSs with Mamdani FCs are reported in manufacturing. They include control of industrial weigh belt feeders (Zhao and Collins, 2003), the realization of specific controllers (Dvorak et al., 2003), (Galichet and Foulloy, 2003), control of machining processes (Haber et al., 2003), (Nandi and Davim, 2009), (Haber et al., 2009), (E.Haber et al., 2010), laser tracking systems (Bai et al., 2005), plastic injection molding (Chen et al., 2008) and vibration suppression (Marinaki et al., 2010). The manufacturing area is related to robotics, and Mamdani FCs also concern the control of manipulators.

The automotive industry is one especially successful area for Mamdani FCs. Problems and practical issues related to suspension control are discussed in (Caponetto et al., 2003). The control of hybrid electric vehicles is treated in (Schouten et al., 2002), and the complexity of all the related control strategies is emphasized in (Salmasi, 2007). The control of anti-lock braking systems is analyzed in (Mirzaei et al., 2006), (Zhao et al., 2006). Process industries also include Mamdani fuzzy control. The applications reported in this context tackle the control of furnaces (Moon and Lee, 2003), filtration processes (Onat and Dogruel, 2004), heat exchangers (Maidi et al., 2008) and forging machines (Lee and Kopp, 2001). The control of the temperature inside a liquid helium cryostat is presented in (Santos and Dexter, 2001), (Santos and Dexter, 2002). Fuzzy control has recently been applied to a variety of servo systems and actuators in mechatronics (Ahmed et al., 2001), (Cho and Lee, 2002), (Precup et al., 2003), (Kalyoncu and Haydim, 2009).

A broad overview of different control applications using Soft-Computing techniques is presented in (Zilouchian and Jamshidi, 2000), (Malhotra et al., 2011a), (Precup and Hellendoorn, 2011a), (Lee, 1990b), (Lee, 1990a).

Regarding the topics related to this Thesis, some of the Fuzzy logic approaches for mobile robots or unmanned ground vehicles are mentioned next. Navigation tasks are implemented with a fuzzy controller in (P.Moustris and G.Tzafestas, 2011), and in (Pradhan et al., 2009), in which a large number of mobile robots are controlled. A hybrid Fuzzy and Petri-potential controller is used for the same purpose in (Parhi and Mohanta, 2011). Onieva et al. present in (E.Onieva et al., 2011) a lateral control for unmanned vehicles using a Fuzzy controller optimized with genetic algorithms. Diagonal parking with a car-like autonomous vehicle is presented in (Baturone et al., 2004), and the control of the steering wheel of a nonholonomic mobile robot in (Yang et al., 2004). A fuzzy path planning for a tracked mobile robot (an excavator) is presented in (Saeedi et al., 2005), and a motion control of multiple vehicles for passenger comfort in (Raimondi and Melluso, 2008). A system-on-chip fuzzy control for path planning with a Pioneer mobile robot is presented in (Tzafestas et al., 2010).

Because of the complexity of the dynamics of aerial robots, the control of this type of robot is one of the most fruitful fields for Fuzzy control. The way Fuzzy logic manages the uncertainty caused by noisy sensor information allows it to work with different types of sensors, like radar, laser and GPS. The first work on Fuzzy control of an unmanned aerial vehicle was presented by Sugeno in (Sugeno et al., 1995); after that, a lot of research was done in this field. An automated altitude hold for a UAV was developed in a NASA project and presented in (Dufrene, 2004). In (Kadmiry and Driankov, 2004) the simulation results of the altitude, pitch, roll and yaw control of an unmanned helicopter are presented. (Phillips et al., 1996) presents a fuzzy control of a US Army UH-1 helicopter. The stabilization of an unmanned helicopter is also presented in (Antequera et al., 2006), and hovering using visual information is shown in (Shin and Oh, 1993). The control of an unmanned Kiteplane is presented in (Kumon et al., 2006), the control of a fixed-wing aircraft in (Gomez and Jamshidi, 2010), and quadrotor control in (Nicol et al., 2011), (Santos et al., 2010). A close formation flight control of multiple UAVs is presented in (Smith, 2007), fire detection using an unmanned helicopter in (Nikolos et al., 2004), and aerial river searching and tracking in (Rathinam et al., 2007). Simulated obstacle avoidance and 3D path planning for UAVs are presented in (Hrabar, 2008). Control, navigation and collision avoidance without vision is presented in (Chee and Zhong, 2013).


Chapter 3

Fuzzy Logic Control for Ground Vehicles

This chapter presents the works using Fuzzy control with a commercial car, a "Citroen C3 Pluriel". These works were part of an industrial project with Siemens Spain. The aim of this project was to transform a commercial car into a driverless car using just vision. Based on the requirements of the project, we had to create a system that can follow a line painted on the road under inner-city conditions of speed and low-radius curves. The car must drive inside a circuit and its position must be known, but in case the system loses any location information, the car must keep working. For this reason, two system approaches were developed. The first one does not use any information about the circuit: no curve or straight characteristics are known, and the only information used is the line to follow. The second one uses the line information and a visual mark detection. Using these visual marks, the system is able to know its exact location inside the circuit and the characteristics of the curves and straights. For both approaches, a Fuzzy controller was developed to control the steering wheel of the vehicle. The speed of the vehicle was managed manually and fixed at a constant value. A hard set of tests was done and supervised by Siemens staff to check the correct behavior of the system, with excellent results.

Publications related to this chapter:
- "A Visual AGV-urban Car using Fuzzy Control", IEEE-ICARA'11
- "Autonomous Guided Car using a Fuzzy Controller", Recent Advances in Robotics and Automation, Springer Studies in Computational Intelligence, 2013
- "Inner-City Driverless Vehicle: A Low Cost Approach with Vision for Public Transport inside Cities", Robotics and Automation Magazine (under review)

This chapter is organized as follows. Section 3.1 shows the related works for this specific part of the present Thesis. The car system used is explained in Section 3.2. The visual algorithms are divided in two parts: the line detection, shown in Section 3.3.1, and the mark detection in Section 3.3.2. The detailed information on the implemented Fuzzy logic controllers and the tests done for each approach is presented in Sections 3.4 (without mark detection) and 3.5 (using mark information). Finally, Section 3.6 presents the conclusions of the work presented in this chapter.

3.1. Related Works

Driverless cars are a popular topic nowadays. The advances during the last decade in the hard testbed of DARPA (DARPA, 2011) have given a strong push to this research line, but it was the recent work of Google (Inc., 2010) that gave the final shot to popular interest (Figure 3.1(a)). This project is managed by Sebastian Thrun, who obtained successful results in the DARPA challenges of 2005 (Thrun et al., 2006) and 2008 (Montemerlo et al., 2008) (Figure 3.1(b)). The excellent results of this approach forced legal changes to grant the first license to a self-driving car. Google has a fleet of 8 cars, each with at least 4 different kinds of sensors. The information acquired by these sensors is merged with a map database. But before this popular event happened, a number of important projects had been carried out around the world using rather different approaches.

Figure 3.1: Driverless cars developed by Sebastian Thrun et al. (a) Google Car (2012). (b) Junior: DARPA winner 2008.

In Europe, the PROMETHEUS project (PROgraM for a European Traffic with Highest Efficiency and Unprecedented Safety) started in 1986. The outstanding results of this project are regarded as milestones in vehicular robotics history. Two of them were ARGO by VisLab (Bertozzi et al., 1998), (Broggi et al., 2000) and VaMoRs (Behringer and Muller, 1998), tested in 1998 (Figure 3.2(a)). Both use two cameras to detect road lanes and to avoid obstacles, but implement different algorithms and strategies. In 1995, the NAHSC project (National Automated Highway System Consortium) started in the United States inside the California PATH (Partners for Advanced Transit and Highways) program (Shladover, 2006) (Figure 3.2(b)). 1997 saw the important Demo'97 in San Diego, in which a set of cars was guided by a magnetic line inside the asphalt. An array of different sensors was installed in those cars to execute self-driving tests and automated platoon formations of 8 cars.

Figure 3.2: Driverless car projects before 2000. (a) ARGO (1998-2000). (b) NAHSC: California PATH (1997).

In 2010, the multidisciplinary European project SARTRE used new approaches in platoon formations and leader systems to successfully present an autonomous platooning demo covering 120 miles (project, 2012). The caravan was composed of one human-driven truck followed by 4 cars equipped with cameras, laser sensors, radar, and GPS technology (Figure 3.3).

Figure 3.3: SARTRE Volvo project.

A complete test of different systems for leader following, lane and obstacle detection and terrain mapping has been done by VisLab. In 2010, the laboratory directed by Alberto Broggi covered 15,926 km from Parma to Shanghai with a convoy of 4 motor homes (Vis, 2012). All of them were equipped with 5 cameras and 4 laser scanners. No road maps were used. The first vehicle drove autonomously in selected sections of the trip, and the other vehicles were 100% autonomous, using their sensors and the GPS waypoints sent by the leader vehicle (Figure 3.4).


Figure 3.4: VisLab driverless van, tested from Parma to Shanghai.

Most of the previous approaches use vision algorithms or sophisticated sensors to detect road lanes. But in city environments nowadays, traffic and cars parked on both sides of the streets make it almost impossible to automate this kind of approach, as shown in Figure 3.5. Furthermore, the streets are full of shadows produced by trees, buildings, and other city structures, causing big variations in brightness in the images. These kinds of visual effects are usually detected as occlusions by visual algorithms. Public transport takes the same path every hour of every day. This work presents a low-cost approach to a vision-based driverless vehicle for public transport that is robust against brightness variations. The presented system has just one camera and does not use any GPS information. A visual system has been developed to follow a predefined path and to get information about the vehicle position. The developed control system has been tested in different city situations with excellent results.

Figure 3.5: The high level of traffic nowadays in every city makes the detection of the lane almost impossible.

3.2. Unmanned Ground Vehicle Description

For the presented approach, the vehicle used is a "Citroen C3 Pluriel" (Fig. 3.6). The power-assisted motor of the steering wheel has been modified to move this wheel with the controller commands. This assistance system consists of an electric DC motor attached to the steering rack through a gear. This motor couples the steering to the action of the driver on the steering wheel. This action is measured through a torque sensor located in the steering bar. The signal from the sensor is received by a control/power unit that sends a PWM signal to the motor to assist the steering movement. This device allows a fast modification for automation, since the mechanical and electrical elements were already installed in the car. For our purpose, the connections of the motor were cut, and it was attached to a servo amplifier with 240 Watts of peak power at 12 V. This card is able to generate a PWM signal whose duty cycle is proportional to an analog ±10 V input signal. This input signal is generated by an acquisition card that is connected to an onboard computer. The necessary feedback is provided by an absolute encoder, which gives the angular positions at a rate of 100 Hz. An internal control loop was previously designed to transform the desired turn angle into a PWM signal at different speeds. This controller is explained in depth in (Naranjo et al., 2012).

Figure 3.6: Automated Citroen C3 Pluriel

The vehicle system was designed to follow a painted line on the floor. To keep this line safe from possible vandalism, we used an invisible paint that looks blue under ultraviolet light. An ultraviolet bulb and a camera were installed inside a metal structure at the front of the vehicle. The principal aim of this structure is to control the illumination. The structure limits the field of view of the camera, whose size is just 30 x 50 cm.

All the tests were done inside the infrastructure of INSIA-UPM. The painted circuit has a length of 345 meters and is composed of two curves of 10 and 20 meters of radius. In order to test the system at speeds over 40 km/h, a straight section with a length of 100 meters was designed to go inside the circuit. A virtual reconstruction of the circuit using a satellite image is shown in Figure 3.8.


Figure 3.7: Metal structure of the visual system. It limits the field of view of the camera to 30 x 50 cm.

Figure 3.8: Representation of the circuit at the INSIA installations, taken with the Google Earth tool.

3.3. Visual Localization Algorithms

The implementation of the image processing related to this work has two parts: the line detection, and the mark detection and decoding. The first is used in both approaches, while the mark detection is used only in the second one, in which the system has information about its location inside the circuit.

3.3.1. Lateral Estimation Error

For the line detection, a custom real-time computer vision algorithm was designed. The algorithm is able to detect the line's centroid and orientation under harsh conditions, such as a partially occluded and poorly painted line on rough terrain, coping with non-compact line shapes.

On the front end of the visual system, the camera captures the scene, which is lit with UV light, at 30 fps. First, a color-based segmentation is performed in YUV space. Although some other color spaces were tested, YUV was found to be the best performer under different light conditions. A rectangular prism inside the YUV color space is defined, so that only the pixel values inside this volume are considered to be part of the line. The result is a binary image where only the line pixels are set. This method proved to be robust at detecting lines of different blue tones and brightness.
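A minimal sketch of this segmentation step is shown below, assuming OpenCV; the YUV bounds are illustrative placeholders, not the values used in the real system.

    import cv2
    import numpy as np

    # Illustrative YUV bounds defining the rectangular prism; the real
    # values were tuned for the blue tone of the UV-lit paint.
    LOWER = np.array([40, 110, 90], dtype=np.uint8)
    UPPER = np.array([220, 160, 120], dtype=np.uint8)

    def segment_line(frame_bgr):
        # Convert to YUV and keep only the pixels inside the prism,
        # producing a binary image where only line pixels are set.
        yuv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YUV)
        return cv2.inRange(yuv, LOWER, UPPER)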

In the binary image, every 8-connected pixel group is marked as a blob. In the first step, to reduce noise, blobs having an area outside a defined range are discarded. Then, for every survivor, the centroid, dominant direction and maximal length are calculated, and those being too short are ignored. The remaining blobs are clustered according to proximity and parallelism, so each cluster becomes a candidate line. The centroid and dominant direction of each candidate line are calculated from the weighted sum of the features of its component blobs, the weight of each blob being proportional to its relative area. In this way, the algorithm is able to accurately detect lines that are fragmented because of ageing. A sketch of this blob stage is given below.
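The sketch assumes OpenCV; the area limits are illustrative, and the clustering by proximity and parallelism is left out for brevity.

    import cv2

    MIN_AREA, MAX_AREA = 50, 5000  # illustrative area range, in pixels

    def extract_blobs(binary):
        # Label 8-connected pixel groups and drop those outside the valid
        # area range; return (centroid, area) for each survivor.
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(
            binary, connectivity=8)
        blobs = []
        for i in range(1, n):  # label 0 is the background
            area = stats[i, cv2.CC_STAT_AREA]
            if MIN_AREA <= area <= MAX_AREA:
                blobs.append((centroids[i], float(area)))
        return blobs

    def candidate_line_centroid(cluster):
        # Centroid of a candidate line as the area-weighted sum of the
        # centroids of its component blobs.
        total = sum(area for _, area in cluster)
        cx = sum(c[0] * area for c, area in cluster) / total
        cy = sum(c[1] * area for c, area in cluster) / total
        return cx, cy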

Finally, from the whole set of candidate lines, a detected line must be selected for the current frame. In order to do that, the distance between the centroids of every candidate line in the current frame and the detected line in the previous frame is measured. If the smallest distance is higher than a certain threshold, the detected line will be the leftmost or rightmost candidate line, depending on the user-defined criterion. Otherwise, the closest candidate line is taken as the detected line. This mechanism avoids switching to "fake" lines when there are traces of old paintings along the circuit, even when it is deteriorated. A sketch of this selection logic follows.
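The distance threshold below is illustrative, and candidates are represented only by their centroids.

    import math

    THRESHOLD = 80.0  # pixels, illustrative

    def select_line(candidates, previous, prefer_leftmost=True):
        # Pick the candidate closest to the previously detected line; if
        # every candidate is too far (or there is no previous line), fall
        # back to the leftmost/rightmost one per the user-defined criterion.
        if previous is not None:
            dist = lambda c: math.hypot(c[0] - previous[0], c[1] - previous[1])
            best = min(candidates, key=dist)
            if dist(best) <= THRESHOLD:
                return best
        x_of = lambda c: c[0]
        return min(candidates, key=x_of) if prefer_leftmost else max(candidates, key=x_of)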

The algorithm outputs whether the line is detected or not; if it is, it also outputs the error of the line in the x-axis with respect to the center of the image and the direction of the line, expressed as an angle.

3.3.2. Global Localization by Visual Marks

The algorithm is able to detect and recognize visual marks painted next to the line to follow, even when they are rotated in the image because of vehicle turns. Each visual mark represents a binary-encoded number, which is the mark's unique identifier. Mark bits are drawn as bars that are parallel to the line. Because of the reduced visual space, instead of painting a bar for each bit, wider or narrower depending on the bit's value as in common barcodes, a more compact encoding scheme was chosen. All bits are bars with the same width, with no spacing between them. When a bit is 1, the bar is painted; when it is 0, the bar space is left unpainted. In the image, a mark appears as a set of yellow rectangles, where every rectangle is a bit with value 1. A start bar is added at the furthest bit slot from the line to designate the beginning of the mark. The mark specification also defines the number of bits per mark, a fixed bit width, a minimum bit length, and valid ranges for the line-mark angle and the line-to-start-bit distance. According to the specification, the algorithm will only spot marks that lie to the right of the line in the direction of movement of the vehicle.


As in the line detection phase, the potential mark pixels are segmented by color in YUV space. The color space boundaries are set so that the segmentation admits several yellow tones, ranging from tangerine yellow to bright green, as seen in tests with multiple paints. This makes the color-based filter less restrictive, avoiding false negatives. On the other hand, the probability that any yellow noise has a valid mark structure is low. The next step of the algorithm seeks this valid structure in the color-segmented pixels, thus reducing the false positives.

The 8-connected pixels from the color segmentation are grouped into blobs. The blobs not meeting the following criteria are considered noise and are discarded: the blob must lie to the right of the line, the blob area must be in a valid range (computed from the visual mark specification), the angular distance between the dominant blob direction and the line must be in the specified range, and the blob length in the dominant direction must be larger than the minimum bit length.

The blobs that pass the filters correspond to a set of bits with value 1. The specific number of bits in each blob is determined by pattern matching. Assuming each mark has a total of N bits (including the start bit), N pattern tests are carried out for each blob, one test for each number of bits in [1, N]. For i bits, the pattern P(i,B) is a rectangle with the direction and length of the blob B and a width equal to i times the bit width in the specification (in pixels). The number of bits in blob B will be the value i that minimizes the cost function in Equation 3.1.

c(i, B) = f(i, B) + ḡ(i, B)    (3.1)

where

f(i, B) = 1 − a(B ∩ P(i,B)) / a(B)

ḡ(i, B) = g(i, B) / g_max,    g(i, B) = | a(B) − a(P(i,B)) |

and a(S) is the area in pixels inside the shape S. Function f indicates how much the pattern covers the blob. Function g evaluates the similarity between the blob and pattern areas. Patterns whose f or ḡ are above a threshold are discarded. This forces the best solution to have a minimum quality in both indicators; the minimization process then favors patterns that cover the blob while having a similar size. The threshold for g is g_max, and ḡ is a normalized version that stays in [0, 1], like f does.
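A sketch of this minimization, derived directly from Equation 3.1; the blob and pattern areas are assumed to be precomputed, and the thresholds are illustrative.

    def count_bits(blob_area, overlap_areas, pattern_areas,
                   f_max=0.3, g_max=400.0):
        # overlap_areas[i-1]: area of blob ∩ pattern for i bits;
        # pattern_areas[i-1]: area of the i-bit pattern rectangle.
        # f_max and g_max are illustrative threshold values.
        best_i, best_cost = None, float("inf")
        for i, (overlap, p_area) in enumerate(zip(overlap_areas, pattern_areas), 1):
            f = 1.0 - overlap / blob_area      # how much the pattern covers the blob
            g = abs(blob_area - p_area)        # raw area difference
            if f > f_max or g > g_max:
                continue                       # enforce minimum quality in both
            cost = f + g / g_max               # c(i, B) = f + g_bar, Equation 3.1
            if cost < best_cost:
                best_i, best_cost = i, cost
        return best_i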

After assigning a number of 1s to all blobs, the rightmost 1 is interpreted as the start bit. If its distance to the line is in the range allowed by the specification, the mark is decoded. The decoding starts by finding the mark's dominant direction as the average over all its blobs. The vector orthogonal to this direction defines a baseline that is divided into N consecutive fixed-width bit slots, with the start bit on the first slot. All bit slots are assigned a default value of 0. Then, the blob centroids are projected onto the baseline and each projection falls into one of the slots, which is filled with a 1. Adjacent slots are also filled with 1s according to the blob's number of bits. The slot values define the binary-encoded mark identifier, whose least significant bit is the closest to the start bit.
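Reading the slot values into an identifier is then a simple binary decoding; a minimal sketch, assuming the slots are already filled:

    def decode_identifier(slots):
        # slots[0] is the start bit; the least significant data bit is
        # the one closest to the start bit (slots[1]).
        identifier = 0
        for k, bit in enumerate(slots[1:]):
            identifier |= bit << k
        return identifier

    # Example: start bit plus data bits 1,1,0,0,1 -> binary 10011 = 19,
    # the mark shown in Figure 3.9.
    assert decode_identifier([1, 1, 1, 0, 0, 1]) == 19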

The final detected mark identifier that is passed to the control system is selected by a voting process among the last M frames. Large values of M produce higher detection delays but increase detection robustness, as more image samples are taken into account. In our experiments, M = 3 has given good results. Besides the detected mark identifier, the algorithm provides an estimation of the number of frames and the time elapsed since the mark was last seen. This is especially useful at high speeds and high values of M, when the decision is deferred until M frames have been captured but the mark was only seen on the first few frames. Moreover, an estimation of the mark quality is given based on its average luminance. Figure 3.9 shows the detection of the line and of the mark that represents the number 19.
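The voting step can be sketched with a fixed-length buffer, using M = 3 as in the experiments; the class name is illustrative.

    from collections import Counter, deque

    class MarkVoter:
        def __init__(self, m=3):
            # Keep the identifiers decoded in the last m frames.
            self.window = deque(maxlen=m)

        def update(self, identifier):
            # Add the current frame's decoded identifier (or None when no
            # mark is seen) and return the majority vote over the window.
            self.window.append(identifier)
            votes = Counter(i for i in self.window if i is not None)
            return votes.most_common(1)[0][0] if votes else None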

Figure 3.9: Line and mark detection by the computer vision algorithm.

3.4. Autonomous Driving with Local Localization

As previously mentioned, this approach just uses the line detection algorithm. No location information is given to the system, so the vehicle drives inside the circuit without knowing whether it will find a straight or a curve in the next meters. The final system has to run with location information about the circuit, but it also has to keep working if the location information is lost. This approach is the one that must be active in that kind of situation until new location information is found.

3.4.1. Fuzzy Control for Lateral Estimation

The steering control of the car includes two components. The first one is theFuzzy controller and the other one is the integral of the error. The latter is added


Figure 3.10: Control loop of the visual servoing system for lateral control. FuzzyPD + I schema.

at the end of the control loop to the output of the controller, making a FuzzyPD + I structure, as shown in Figure 3.10.

The Fuzzy controller was implemented using MOFS (Miguel Olivares' Fuzzy Software). The controller has two inputs and one output, all fuzzified using triangular membership functions. The first input is defined as the error between the center of the image and the center of the line to follow (Figure 3.11). The second input is the difference between the current and previous error (Figure 3.12). The output of the controller is the absolute turn of the steering wheel to correct this error, in degrees (Figure 3.13). To obtain this output, 49 if-then rules were defined. The developed fuzzy system is a Mamdani type that uses a height-weight defuzzification model with the product inference model in Equation 3.2.

y = [ ∑_{l=1}^{M} y^l ∏_{i=1}^{N} µ_{x_i^l}(x_i) ] / [ ∑_{l=1}^{M} ∏_{i=1}^{N} µ_{x_i^l}(x_i) ]    (3.2)

where N and M represent the number of input variables and the total number of rules, respectively; µ_{x_i^l} denotes the membership function of the lth rule for the ith input variable, and y^l represents the output of the lth rule.
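For concreteness, the weighted-average defuzzification of Equation 3.2 can be written in a few lines (an illustrative sketch, not the MOFS code):

import numpy as np

def defuzzify(rule_outputs, memberships):
    # rule_outputs: y^l for each of the M rules.
    # memberships:  M x N array with mu_{x_i^l}(x_i) for rule l and input i.
    w = np.prod(memberships, axis=1)            # product inference over inputs
    return float(np.dot(w, rule_outputs) / w.sum())

# Toy example with N = 2 inputs and M = 3 active rules:
y = defuzzify(np.array([-5.0, 0.0, 5.0]),
              np.array([[0.2, 0.8], [0.6, 0.9], [0.1, 0.3]]))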

Figure 3.11: First input variable of the Fuzzy controller: the error between the center of the line and the center of the image, in pixels.

The calculation of the integrator value is shown in Equation 3.3.

I_t = I_{t−1} + e × (1/t) × K_i    (3.3)


Figure 3.12: Second input variable of the Fuzzy controller: the difference between the last error and the current one, in pixels.

Figure 3.13: Output variable of the Fuzzy controller: the steering wheel angle, in degrees.

where e is the current error between the center of the line and the center of the image, t is the framerate, and K_i is a constant that appropriately weights the effect of the integrator; in this case it is equal to 0.6.

The initial idea of this work was to develop a controller for a circuit with short-radius curves. In such conditions, the speed of the car cannot be very high. Thus, the actual velocity of the car was not included as an input of the Fuzzy controller; instead, it is taken into account by multiplying the fuzzy output by 10/v, where v is the current velocity of the vehicle in km/h. The numerator of this factor is based on the velocity, in km/h, during a skilled human driving session, in which data was acquired to tune the rule base of the fuzzy controller. It is practically impossible for a human to drive faster than 10 km/h while keeping the line-following error low enough to meet the requirements of the application. This is because the driver only sees 30 cm forward and, at that speed, the contents of this area change completely every 0.108 seconds.
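Putting Equation 3.3 and the speed factor together, one control step of the lateral controller could look like the following sketch; the names are hypothetical, and applying the 10/v factor before adding the integrator is our reading of the description above, not a statement about the original code.

def lateral_control_step(fuzzy_pd_deg, error_px, integrator, framerate,
                         speed_kmh, ki=0.6):
    # Equation 3.3: I_t = I_{t-1} + e * (1/t) * Ki, with t the framerate.
    integrator = integrator + error_px * (1.0 / framerate) * ki
    # The fuzzy output is scaled by 10/v, v being the current speed in km/h.
    steering_deg = fuzzy_pd_deg * (10.0 / speed_kmh) + integrator
    return steering_deg, integrator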

The driving session performed by the human at 10 km/h provided the training data needed to modify the initial rule base of the controller and the size of the fuzzy sets of its variables. For the definition of the fuzzy sets, a heuristic method was used based on the extraction of statistical measures from the training data. For the initial rule base, we used a supervised learning algorithm implemented in MOFS. This algorithm evaluates the situation (the values of the input variables) and looks for the rules involved in it (active rules). Then, according to the steering command given by the human driver, the weights of these rules are changed, as sketched below. Each time the output of an active rule coincides with the human command, its weight is increased. Otherwise, when the output differs from the human command, its weight is decreased by a constant. Any time the weight of a rule becomes negative, the system sets the output of the rule to the one given by the human driver.
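The weight-update scheme can be sketched as follows (names and the update constant are hypothetical; the MOFS internals may differ):

from dataclasses import dataclass

@dataclass
class Rule:
    output: str          # consequent (steering linguistic set)
    weight: float = 1.0

def update_active_rules(active_rules, human_output, step=0.1):
    # active_rules: rules fired by the current input values.
    for rule in active_rules:
        if rule.output == human_output:
            rule.weight += step              # matches the driver: reinforce
        else:
            rule.weight -= step              # differs from the driver: penalize
            if rule.weight < 0:              # overruled: adopt the human command
                rule.output = human_output
                rule.weight = 0.0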

Table 3.1 shows the resulting rule base, which encloses the knowledge of the controller.


Dot \ Error      BigLeft     Left        LittleLeft  Center      LittleRight Right       BigRight
BigNegative      GreatLeft   GreatLeft   BigLeft     BigLeft     Left        LittleLeft  Zero
Negative         GreatLeft   BigLeft     BigLeft     Left        LittleLeft  Zero        LittleRight
LittleNegative   BigLeft     BigLeft     Left        LittleLeft  Zero        LittleRight Right
Zero             BigLeft     Left        LittleLeft  Zero        LittleRight Right       BigRight
LittlePositive   Left        LittleLeft  Zero        LittleRight Right       BigRight    BigRight
Positive         LittleLeft  Zero        LittleRight Right       BigRight    BigRight    GreatRight
BigPositive      Zero        LittleRight Right       BigRight    BigRight    GreatRight  GreatRight

Table 3.1: Base of rules of the Fuzzy controller.


3.4.2. Experiments

This subsection presents the results obtained. First, we present the system's behavior against step perturbations. These reference changes were applied at different velocities and circuit curvatures. Subsequently, results for the tests with autonomous driving inside the circuit are presented.

Step perturbation test series.

First of all, it must be taken into account that the car has 0.4 seconds, or 11 frames, of response delay. This is the time elapsed from the moment the controller sends the command until the wheels reach the commanded angle. This relation is shown in Figure 3.14.

Figure 3.14: Representation of the steering wheel response. The delay from the command being sent until the steering wheel reaches the commanded angle position is 0.4 seconds.

In order to measure how good the fuzzy controller is, a set of step tests was made. The step value is 50 pixels, equivalent to 6.85 cm (1 pixel = 0.137 cm). The step was applied by adding a reference of 50 pixels to the current error value. For this reason, the red line representing the applied step is not always on the −50 or 50 pixels line. In order to obtain a good representation of the response of the fuzzy controller, tests were done at different speeds and at different parts of the circuit (straights and curves).
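The RMSE values reported below follow directly from the pixel-to-centimeter conversion above; a small helper makes the computation explicit:

import numpy as np

PX_TO_CM = 0.137                     # 1 pixel = 0.137 cm, from the setup above

def rmse_cm(error_px):
    # RMSE of a line-following error trace, converted to centimeters.
    e = np.asarray(error_px, dtype=float) * PX_TO_CM
    return float(np.sqrt(np.mean(e ** 2)))

# A 50-pixel step therefore corresponds to 50 * 0.137 = 6.85 cm.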

Figure 3.15(a) shows the error measured when a +50 pixels step perturbation is applied to the system at 10 km/h. The resulting RMSE value is 7.166 cm. As shown, the system corrects the error in just 27 frames, which is about 1 second at an average rate of 28 frames per second during the test. The angle of the steering wheel versus the controller commands is shown in Figure 3.15(b). The previously mentioned delay of 11 frames in the steering wheel action can be noticed. Ignoring this delay, the system corrects the error in 17 frames, or 0.6 seconds.

Figure 3.16 shows the results for a step perturbation test at 15 km/h on a straight. This test had an RMSE value of 7.0592 cm. The settling time is less than half a second (25 − 11 frames).


(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.15: Step response to a 50 pixels step at 10 km/h on a straight. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 7.1666 cm.

(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.16: Step response to a 50 pixels step at 15 km/h on a straight. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 7.0592 cm.

To test the robustness of the controller against step perturbations, similar tests were done while the vehicle was inside a curve. Figure 3.17(a) shows the step command and the evolution of the error at 10 km/h. In this case the curve is to the left and the step was applied towards the internal part of the curve (to the left). The action of the controller and the response of the steering are shown in Figure 3.17(b).

The test at 15 km/h inside the curve was done by applying a perturbation in the direction against the curve, trying to move the car out of the curve. Figure 3.18(a) shows the evolution of this test, comparing the step command and the error at each frame. As in previous tests, Figure 3.18(b) shows a comparison between the commands sent by the Fuzzy controller and the steering wheel


(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.17: Step response to a 50 pixels step at 10 km/h in a curve. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 7.8007 cm.

(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.18: Step response to a 50 pixels step at 15 km/h in a curve. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 7.2187 cm.

position frame by frame.

Table 3.2 shows the results of all the step perturbation tests. In all the step tests, the response of the controller was around 1 second. Taking into account the response delay of the system and the complexity of this non-linear system, the controller behavior is better than the initial expectations.

Constant vehicle speed

Two tests of the car covering a long distance inside the circuit are presented next. The first one was done with a constant vehicle speed and the second one was done changing the speed during the test.

First, a test in which the car covered 18 laps of the circuit is presented.


Step size (pixels)   Circuit section   Speed (km/h)   RMSE (cm)
50                   straight          10             7.1666
50                   straight          10             7.4257
50                   straight          10             8.3321
50                   straight          15             6.9314
50                   straight          15             7.0592
50                   curve             10             7.8007
50                   curve             15             6.8574
50                   curve             20             7.2187

Table 3.2: Results of the 50 pixels step perturbation tests.

Figure 3.19 shows the evolution of the vehicle and the controller in this test. In Figure 3.19(b) the measured error during the whole test is shown. In this case, the RMSE value is 5.0068 cm. Figure 3.19(c) shows the comparison between the controller commands and the measured angle of the steering wheel. In the Figure, the changes between straight lines and curves can be appreciated. In the straight lines, the steering wheel stays around 0 degrees, while it turns between −100 and −150 degrees in the first curve, and between −150 and −300 in the second one. This is easily appreciated in Figure 3.20, in which the plot is scaled to show only one lap. The evolution of the vehicle speed is depicted in Figure 3.19(a), which covers speeds between 12 and 13 km/h.

(a) Vehicle’s speed. (b) Error measured. The value of RMSE forthis test is 5.0015 cm.

(c) Steering wheel versus the Fuzzy controller's commands.

Figure 3.19: Evolution of the system during 18 laps inside the circuit without mark detection. Constant speed was applied.


Figure 3.20: Zoom to one lap of the circuit.

In Figure 3.19(b), large error peaks of up to 170 pixels appear at every curvature change. However, they are reduced in a few frames by the controller. These errors appear because the circuit was not designed with clothoids. Therefore, curvature discontinuities happen when changing from straight line to curve and vice versa. Figure 3.21 shows a zoom of one of these instants, in which a peak of −171 pixels occurs. The evolution of the error is plotted in Figure 3.21(a), while the output of the controller and the steering wheel angle are in Figure 3.21(b).

(a) Zoom of the error. (b) Zoom of the steering wheel angle and controller commands.

Figure 3.21: Zoom of 170 pixels step at the beginning of the second curve.

Variable vehicle speed

Some tests in which the vehicle speed is not fixed to a constant value are presented here. First, a test checking the robustness of the system against emergency stops and big speed changes is presented. Then, a test of continuous autonomous driving inside the circuit covering several laps is shown.

This first test shows how the system responds to big speed changes. In order to check this behavior, an emergency stop was applied to the vehicle, as shown in Figure 3.22. Figure 3.22(a) shows the speed of the vehicle during this test. At the end of this Figure, the radical step from 10 to 0 km/h is shown. Figure 3.22(b) shows the evolution of the error during this test. To increase the complexity of the test, the speed reduction was done while the vehicle was inside a curve. At the beginning of the test the vehicle was located on a straight, going into a curve at the end of the test. The transition from straight to curve is appreciable in the interval between frames 280 and 420. The response of the


controller and the evolution of the steering wheel are shown in Figure 3.22(c).

(a) Vehicle’s speed during the emergencystop’s test.

(b) Error measured during the emergencystop’s test.

(c) Comparison between the steering wheeland the commands sent by the Fuzzy controllerduring the emergency stop’s test.

Figure 3.22: Evolution of the system during the emergency stop test.

Another test was done to check the correct behavior of the controller when the speed is not fixed (Figure 3.23). In this case some peaks of 25 km/h can be appreciated. These speed changes are shown in Figure 3.23(a). The evolution of the error is shown in Figure 3.23(b), with an RMSE value of 5.8328 cm. In this Figure, the changes between straight and curve and vice versa can also be appreciated where the error peaks appear. These transitions are also visible in the variations of the steering wheel value in Figure 3.23(c), as explained for Figure 3.20.

Finally, Table 3.3 shows the results of these two tests, covering 1 and 3.5 km.


(a) Vehicle’s speed. (b) Error measured. The RMSE during thistest was 5.8328

(c) Measurement of the steering wheel of the vehicle during the 4 laps test with speed variations.

Figure 3.23: Evolution of the system during 4 laps inside the circuit without mark detection. Variable speed was applied.

Number of Laps   Kms   Min Speed (km/h)   Top Speed (km/h)   RMSE (cm)
18               3.5   13                 13                 5.0068
4                1     11                 26                 5.8328

Table 3.3: Results obtained with the driverless car without mark detection.

3.5. Autonomous Driving with Global Localization

Once the system was successfully tested following the line painted on the road as the only input, a mark codification was added so that the system knows the position of the car inside the circuit. The integration of the mark detection and decoding algorithm in the control system is explained next. Finally, a set of tests is presented.

After developing a system able to work without location information, an update was made to take the decoded mark information about the circuit into account. As previously mentioned, visual marks were painted on the road to give information about the length and type of the different circuit sectors. The adjustments in the control system are presented next, as well as the large number


of tests carried out to validate the system's behavior.

3.5.1. Fuzzy Control for Lateral Estimation and Global Localization

The Fuzzy controller is almost the same as the one presented in Section 3.4.1. In this case an offset is added at the end of the controller's output. The Fuzzy + I + Offset control structure is shown in Figure 3.24.

Figure 3.24: Control loop of the visual servoing system with global localization. Fuzzy + I + Offset schema.

The offset component behaves like a feed-forward controller that compensates the effect of the change in the curvature of the circuit each time a new mark is detected. It is calculated theoretically using the equations of the Frenet-frame kinematic model of a car-like mobile robot (Siciliano and Khatib, 2008b). More detailed information about this control component is given in (Sanchez-Lopez et al., 2012).
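Although the exact formulation is given in (Sanchez-Lopez et al., 2012), the idea can be sketched with the kinematic bicycle model: the offset is the steering angle that would hold the car on an arc of the sector's curvature. The wheelbase and steering_ratio below are assumed parameters, not values from this work.

import math

def curvature_offset_deg(curvature, wheelbase, steering_ratio):
    # Road-wheel angle that keeps a car-like robot on an arc of the given
    # curvature (1/radius): delta = atan(L * kappa).
    delta_rad = math.atan(wheelbase * curvature)
    # Convert to steering-wheel degrees through the steering ratio
    # (steering-wheel degrees per road-wheel degree).
    return math.degrees(delta_rad) * steering_ratio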

The integral gain K_i is reduced from 0.6 in the previous approach to 0.3.

3.5.2. Experiments

The behavior of the Fuzzy controller with mark recognition has been checked with a set of tests on the real circuit. First, a set of step responses is presented. Secondly, the system was tested covering a number of laps inside the circuit at a constant vehicle speed. Finally, a set of tests with speed variations is shown.

Step response

As with the previous approach, the new approach was also tested against 50 pixels step perturbations. The most relevant tests are presented next.


Figure 3.25(a) shows the error measured when a +50 pixels step perturbation is applied to the system at 10 km/h. The resulting RMSE value is 6.8402 cm. The angle of the steering wheel versus the controller commands is shown in Figure 3.25(b).

(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.25: Step response to a 50 pixels step at 10 km/h on a straight with the mark approach system. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 6.8402 cm.

Figure 3.26 shows the results for a step perturbation test at 15 km/h on a straight. For this test the RMSE value is 6.719 cm.

(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.26: Step response to a 50 pixels step at 15 km/h on a straight with the mark approach system. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 6.719 cm.

To test the robustness of the controller against step perturbations, similar tests were done while the vehicle was inside a curve. Figure 3.27(a) shows the step command and the evolution of the error at 10 km/h. In this case the curve is to


the left and the step was applied towards the internal part of the curve (to the left).

The action of the controller and the response of the steering are shown in Figure 3.27(b).

(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.27: Step response to a 50 pixels step at 10 km/h in a curve with the mark approach system. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 5.1034 cm.

The test at 15 km/h inside the curve was done by applying a perturbation in the direction against the curve, trying to move the car out of the curve. Figure 3.28(a) shows the evolution of this test, comparing the step command and the error at each frame. As in previous tests, Figure 3.28(b) shows a comparison between the commands sent by the Fuzzy controller and the steering wheel position frame by frame.

(a) Error measured (in pixels). (b) Evolution of the steering wheel angle versus the controller commands.

Figure 3.28: Step response to a 50 pixels step at 15 km/h in a curve with the mark approach system. The error measurement, the steering wheel movements, and the Fuzzy controller output are shown. The RMSE value of the test is 5.5429 cm.


Table 3.4 shows the results of all the step perturbation tests. Comparing this table with the previous approach (Table 3.2), it is possible to appreciate that the change of the integral gain (from 0.6 to 0.3) has a small effect on the controller's behavior. In the curve tests, the offset added in the second approach improves the controller behavior, reducing the RMSE value.

Step size (pixels)   Circuit section   Speed (km/h)   RMSE (cm)
50                   straight          10             7.5051
50                   straight          10             6.8402
50                   straight          15             7.8274
50                   straight          15             6.7190
50                   curve             10             6.9561
50                   curve             10             5.1034
50                   curve             15             5.5429
50                   curve             15             6.5174
50                   curve             15             6.2648
50                   curve             20             6.9676

Table 3.4: Results of the 50 pixels step perturbation for the system with mark detection.

Constant vehicle speed

As with the previous approach, a set of tests was done to check the behavior of the controller when the system has to cover a long distance. First, a set of tests in which the speed is set to a constant value is presented. Then, a set of tests with variable vehicle speed is presented.

In the first test the vehicle covered 9 laps of the circuit (Figure 3.29). The distance covered is almost 2 km. Figure 3.29(a) shows that the vehicle speed was set to 15 km/h for this test. Figure 3.29(b) shows the evolution of the car inside the circuit through the steering wheel movement. As previously shown in Figure 3.20, when the car is on the bigger-radius curve the steering wheel is turned around 120 degrees. A steering wheel turn of around 220 degrees indicates that the vehicle is on the short-radius curve. Otherwise the car is on the circuit straights. Figure 3.29(c) shows the error during the whole test. The RMSE value obtained in this test is 4.8270 cm.

In the second test the vehicle covered 13 laps of the circuit (Figure 3.30). The distance covered is almost 2 km. Figure 3.30(a) shows the vehicle speed set to 20 km/h with small variations. Figure 3.30(b) shows the error during the whole test. The RMSE value obtained in this test is 3.5256 cm.

In the third test the vehicle covered 13 laps of the circuit (Figure 3.31). The distance covered is more than 2 km. Figure 3.31(a) shows the vehicle speed set to 15 km/h. Figure 3.31(b) shows the evolution of the car inside the circuit through


(a) Vehicle speed.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 4.820 cm.

Figure 3.29: Evolution of the system during 9 laps inside the circuit with mark detection. Speed is set to 15 km/h.

(a) Vehicle speed. (b) Error measured (in pixels).

Figure 3.30: Evolution of the system during 13 laps inside the circuit with mark detection. Variable speed with an average of 20 km/h.

the movement of the steering wheel. Figure 3.31(c) shows the error during all


(a) Vehicle speed.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 5.8098 cm.

Figure 3.31: Evolution of the system during 13 laps inside the circuit with mark detection. Speed is set to 15 km/h.

the test. The RMSE value obtained in this test is 5.8098 cm.

In the fourth test the vehicle covered 21 laps of the circuit (Figure 3.32). The distance covered is more than 4 km. Figure 3.32(a) shows the vehicle speed set to 14 km/h. An emergency stop was tested here. After that, the test was finished covering one more lap at 16 km/h. Figure 3.32(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.32(c) shows the error during the whole test. The RMSE value obtained in this test is 3.41006 cm.

Finally, a long test of 30 laps of the circuit (Figure 3.33) was done. The distance covered is around 6 km. Figure 3.33(a) shows the vehicle speed set to 15 km/h. Figure 3.33(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.33(c) shows the error during the whole test. The RMSE value of this test is 3.6874 cm.


(a) Vehicle speed.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 3.41006 cm.

Figure 3.32: Evolution of the system during 21 laps inside the circuit with mark detection. Speed is set to 14 km/h. An emergency stop was applied.

Variable vehicle speed

A set of tests in which the speed is not set to a constant value is presented next. In these cases the speed was controlled by a human.

The first one tested the behavior of the controller against a big acceleration and deceleration, plus two stop-and-go tests inside curves (Figure 3.34). Here a top speed of 48 km/h was reached, as shown in Figure 3.34(a). After this, a strong speed reduction was applied. The vehicle covered 2 laps of the circuit. Figure 3.34(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.34(c) shows the error during the whole test. The RMSE value of this test is 3.0313 cm. The limited length of the circuit requires a big acceleration to reach the top inner-city speed.

The second test shows the behavior of the system covering 17 laps at different speeds (Figure 3.35). The vehicle speed changes from 15 to 25 km/h, as shown in Figure 3.35(a). Figure 3.35(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.35(c) shows the error during the whole


(a) Vehicle speed.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 3.6874 cm.

Figure 3.33: Evolution of the system during 30 laps inside the circuit with mark detection. Speed is set to 15 km/h.

the test. The RMSE value of this test is 5.5535 cm.

The third test shows the behavior of the system covering 4 laps with big speed changes (Figure 3.36). Here the robustness of the system after line loss is also checked. The vehicle speed changes from 6 to 38 km/h, as shown in Figure 3.36(a). Figure 3.36(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.36(c) shows the error during the whole test. The RMSE value of this test is 8.6494 cm. The line losses show up as constant error values.

The fourth test shows the behavior of the system covering 6 laps with big speed changes (Figure 3.37). The vehicle speed changes from 14 to 42 km/h, as shown in Figure 3.37(a). Figure 3.37(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.37(c) shows the error during the whole test. The RMSE value of this test is 10.7916 cm.


(a) Vehicle speed. A top speed of 48 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 3.0313 cm.

Figure 3.34: Evolution of the system with mark detection and speed changes from 0 to 48 km/h.

The fifth test shows the behavior of the system covering 4 laps with big speed changes (Figure 3.38). The vehicle speed changes from 14 to 42 km/h, as shown in Figure 3.38(a). Figure 3.38(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.38(c) shows the error during the whole test. The RMSE value of this test is 9.1397 cm.

The following test shows the behavior of the system covering 14 laps at different speeds (Figure 3.39). The vehicle speed changes from 10 to 23 km/h, as shown in Figure 3.39(a). Figure 3.39(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.39(c) shows the error during the whole test. The RMSE value of this test is 7.5486 cm.

The last test shows the behavior of the system covering 6 laps at different speeds (Figure 3.40). The vehicle speed changes from 14 to 24 km/h, as shown in Figure 3.40(a). Figure 3.40(b) shows the evolution of the car inside the circuit through the movement of the steering wheel. Figure 3.40(c) shows the error during the whole test. The RMSE value of this test is 8.4839 cm.

Table 3.5 gathers the most important information of the presented


(a) Vehicle speed. A top speed of 25 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 5.5535 cm.

Figure 3.35: Evolution of the system during 17 laps inside the circuit with mark detection. Speed changes from 15 to 25 km/h.

Number of Laps   Min Speed (km/h)   Top Speed (km/h)   RMSE (cm)
9                15                 15                 4.8270
13               20                 20                 3.5256
13               15                 15                 5.8098
21               14                 14                 3.4100
30               15                 15                 3.6874
17               15                 25                 5.5535
4                6                  38                 8.6494
6                14                 42                 10.7916
4                14                 42                 9.1397
14               10                 23                 7.5486
6                14                 24                 8.4839
2                10                 48                 3.0313

Table 3.5: Results obtained with the driverless car with mark detection.


(a) Vehicle speed. A top speed of 38 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 8.6494 cm.

Figure 3.36: Evolution of the system during 4 laps inside the circuit with mark detection. Speed changes from 6 to 38 km/h.

tests. Videos of the system working can be found in (web, 2011).


(a) Vehicle speed. A top speed of 42 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 10.7916 cm.

Figure 3.37: Evolution of the system during 6 laps inside the circuit with mark detection. Speed changes from 14 to 42 km/h.


(a) Vehicle speed. A top speed of 42 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 9.1397 cm.

Figure 3.38: Evolution of the system during 4 laps inside the circuit with mark detection. Speed changes from 14 to 42 km/h.


(a) Vehicle speed. A top speed of 23 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 7.5486 cm.

Figure 3.39: Evolution of the system during 14 laps inside the circuit with mark detection. Speed changes from 10 to 23 km/h.


(a) Vehicle speed during the test. A top speed of 24 km/h was reached.

(b) Evolution of the steering wheel.

(c) Error measured (in pixels). The RMSE value for this test is 8.4839 cm.

Figure 3.40: Evolution of the system during 6 laps inside the circuit with mark detection. Speed changes from 14 to 24 km/h.


3.6. Conclusions

This chapter presented a reliable control system for a low-cost visual driverless car in city streets. In comparison with the important research results obtained by highly funded projects, this approach uses just one camera to follow a line on the street. No GPS information has been used to position the vehicle. A Fuzzy controller was developed to control the steering wheel of the vehicle. The vehicle uses a visual algorithm to follow a line painted on the street. To control the illumination of the environment, a metal structure holding the camera and the UV bulb was designed and installed on the front of the car. The field of view was limited by this structure to just 30 x 50 cm. A mark detection system was developed to give the control system location information and extra data about the curvature and length of the sector in which the vehicle is. A large number of tests in different conditions were presented. A maximum speed of 48 km/h was reached with excellent controller response. A distance of more than 6 km was covered with a test RMSE value of 3.6874 cm. The system was also tested without mark detection at a lower speed range; in that configuration it was able to cover a distance of 3 km and respond to emergency stops. A set of step response tests was done with and without mark detection to check the behavior of the system.

The characteristics of the system make it ideal for autonomous public transport inside a city, where it is hard to use lane detection because of the high density of occlusions caused by other vehicles and the shadows of buildings, trees and more.

The big limitation in the field of view of this approach, together with the robustness shown by the system, warrants further work to create a system that does not depend on a painted line and can be used outside of cities. The developed control system could be adjusted to use more visual information, for example with one or more cameras on the top of the vehicle and a road lane detector. More sensors could be added to increase the capabilities of the developed system, such as a laser scanner to accomplish the task of obstacle avoidance, or a GPS to get the information that is currently obtained by mark detection.

Chapter 4

Fuzzy Logic Control for Unmanned Helicopters

This chapter presents the work done with Fuzzy control and a sophisticated radio-controlled helicopter. Two applications are presented to prove the power of the Fuzzy logic technique for control. Both applications use the visual information acquired by onboard cameras. The helicopter used for these tests is a commercial RC helicopter adapted to be controlled autonomously by an onboard computer. The helicopter description is presented in Section 4.2. The first application concerns the control of an onboard pan and tilt visual platform and the heading of the helicopter. This application is intended to be an eye in the sky for object tracking and surveillance tasks, and is presented in Section 4.3. The second application presents the Fuzzy control of the helicopter for autonomous

Publications related to this chapter:
- "A pan-tilt camera fuzzy vision controller on an unmanned aerial vehicle", IEEE-IROS 2009
- "Unmanned aerial vehicles UAVs attitude, height, motion estimation and control using visual systems", Journal of Autonomous Robots, 2010
- "Fuzzy controller for UAV-landing task using 3d-position visual estimation", IEEE-FUZZ 2010 (WCCI 2010)
- "Visual servoing using fuzzy controllers on an unmanned aerial vehicle", EUROFUSE 2010
- "Visual Servoing for UAVs", InTech, Book chapter, 2010
- "Non-symmetric membership function for Fuzzy-based visual servoing on-board a UAV", Computational Intelligence Foundations and Applications, Book chapter, 2010


landing tasks. First, the control of the thrust of the aircraft is presented, and secondly the control of pitch, roll and thrust for a fully autonomous landing. These works are presented in Section 4.4.

4.1. Related Works

Unmanned aerial vehicles (UAVs) have quickly and decisively made their way to the forefront of current aviation technology. Opportunities exist in a broadening number of fields for the application of UAV systems as the components of these systems become increasingly lighter and more powerful. Of particular interest are those occupations that require the execution of missions involving dull, dirty, or dangerous work; UAVs provide a cheap, safe alternative to manned systems and often provide a far greater magnitude of capability.

Many applications have been developed and shown in the literature, as previously mentioned in Chapter 2. The works related to the presented applications are discussed here.

The control of an onboard pan and tilt visual platform increases the possibilities to detect and track objects. In all of the related visual control works, the UAV must change its position to track the object, as shown in (Mejias et al., 2006a), (Campoy et al., 2009b). The use of a visual platform reduces the limitations on the UAV's movements, because the aircraft can change its position according to the surrounding limitations while the servo-assisted camera continues tracking the object. Related to this work, there are some laboratory test works, like (Zou et al., 2006), based on tracking with a biomimetic eye. (Dobrokhodov et al., 2006) performed real tests of target tracking and motion estimation for a moving target. (Chitrakaran et al., 2006) also performed real tests of vision-assisted autonomous path following. The visual algorithm used to track is also an important topic related to this work. In the presented tests we use a pyramidal Lucas-Kanade-based algorithm, as presented in Section 4.3.1. There is a large diversity of tracking methods, among which can be mentioned approaches based on features (Mikolajczyk and Smid, 2005), direct methods (correlation) (Irani and Anandan, 2000), and color and shape algorithms, among others. The goal of any visual tracking system is to be able to identify a reference pattern correctly and continuously on the image plane, independently of the variations presented in the image sequence with respect to parameters like camera or scene rotations and translations, object occlusions, illumination changes and noise, among others.

Different works have also been done where a vision system was used for low-altitude position estimation and autonomous landing. In the work presented in (De Wagter and Mulder, 2005), the authors evaluated the use of visual information at different stages of a UAV control system, including a visual controller and pose estimation for autonomous landing using a chessboard pattern. Merz (Merz et al., 2004), (Merz et al., 2006) uses a method that fuses visual and inertial information in order to control an autonomous VTOL aircraft landing on


known landmarks. Saripalli et al. have proposed an experimental method for autonomous landing on a moving target, (Saripalli and Sukhatme, 2007), (Saripalli et al., 2003), by tracking a known helipad and using it to complement the controller's IMU+GPS state estimation. In (Fucen et al., 2009), a visual system is used to detect and identify a landing zone (helipad) and confirm the landing direction of the vehicle.

Vision-based landing for multi-rotor UAVs has been an actively studied field in recent years. Some examples are the work presented by Lange in (Lange et al., 2008), where the visual system is used to estimate the vehicle position relative to a landing place. Recently, Nonami (Nonami et al., 2010) and then Wenzel (Wenzel et al., 2011) presented two different methods for small UAV autonomous takeoff, tracking and landing on a moving platform. The first is based on optical flow; the second uses visual tracking of IR landmarks to estimate the aircraft position.

In contrast with the autonomous landing works presented in the literature, the work presented in Section 4.4 focuses on the control system developed, rather than on the visual algorithm and speed estimation of those works. Nevertheless, an extended explanation of the visual algorithm used and the position estimation is given in Section 4.4.1.

4.2. Unmanned Helicopter Description

The two unmanned aerial vehicles used for the works presented in this chapter are an electric helicopter, the SR20, shown in Figure 4.1, and a gas-powered helicopter, the SRA1, shown in Figure 4.2. The first one is a modified Xcell electric RC helicopter. The second one is a modified Bergen Industrial Twin gas-powered helicopter. More detailed information can be found in Appendix C.

Figure 4.1: Autonomous electric helicopter SR20.

Both aircraft are equipped with an Xscale-based flight computer augmented with sensors (GPS, IMU, magnetometer, fused with a Kalman filter for state estimation). Additionally, they include a pan and tilt servo-controlled platform


Figure 4.2: Autonomous gas-powered helicopter SRA1.

for many different cameras and sensors. In order to enable it to perform vision processing, it also has a VIA mini-ITX 1.5 GHz onboard computer with 2 GB RAM, a wireless interface, and support for many Firewire cameras, including mono (BW), RAW Bayer, color, and stereo heads. It is possible to use IP and analog cameras as well. The system runs in a client-server architecture using TCP/UDP messages. The computers run Linux OS, working in a multi-client wireless 802.11g ad-hoc network, allowing the integration of vision systems and visual tasks with flight control. This architecture allows embedded applications to run onboard the autonomous helicopter while it interacts with external processes through a high-level switching layer. The visual control system and additional external processes are also integrated with the flight control through this layer using TCP/UDP messages. The layer is based on a communications API where all messages and data types are defined.

The flight control system is composed of three control loops arranged in a cascade formation, allowing it to perform tasks at different levels depending on the workspace of the task. The first control loop is in charge of the attitude of the helicopter. It is based on a decoupled PID control in which each degree of freedom is controlled separately, based on the assumption that the helicopter dynamics are decoupled. The attitude control stabilizes the helicopter during hovering by maintaining the desired roll, pitch and heading. It is implemented as a proportional-plus-derivative (PD) control. The second control loop is the one that has the visual signal feedback. This control loop is composed of the visual pan-tilt platform and the heading controllers. All of them are designed with Fuzzy Logic techniques, as explained in more detail in the next subsection. With this loop, the UAV is capable of receiving external references (from the visual process) to keep it aligned with a selected target, and it leaves the stability of the aircraft to the innermost loop in charge of the attitude. The third controller (position-based control) is at the highest level of the system, and is designed to receive GPS coordinates. The control scheme allows different modes of operation, one of which is to take the helicopter to a desired position (position control). Integration of the visual references uses the TCP/UDP and API architecture. More detailed information on how those messages are integrated in the server process running onboard the UAV is given in (Mejias, 2006).
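A minimal sketch of one axis of the decoupled attitude loop, assuming generic PD gains (the actual flight code is not reproduced in this document):

def pd_step(ref, meas, prev_err, dt, kp, kd):
    # One decoupled degree of freedom (roll, pitch or heading): each axis
    # runs its own PD loop under the decoupled-dynamics assumption.
    err = ref - meas
    u = kp * err + kd * (err - prev_err) / dt
    return u, err

# e.g. u_roll, e_roll = pd_step(roll_ref, roll_meas, e_roll, dt, kp_r, kd_r)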


4.3. Eye in the Sky: Autonomous Visual Servoing for Object Tracking

In this Section we describe a control system that can follow static and moving objects in real time, despite the helicopter's movements. This visual servoing system improves on previous works (Campoy et al., 2009b), (Mejias et al., 2006b) through the inclusion of the control of a pan and tilt visual platform. This gives a quicker response than helicopter movements and also gives total freedom of movement to the UAV, making it possible to track objects not only during hovering flight, as in the previous work, but also during a path-planned flight or when flying under manual control.

4.3.1. Visual Tracking

The visual tracking system is based on the Lucas-Kanade algorithm (Lucas and Kanade, 1981), which is a Gauss-Newton gradient descent non-linear optimization algorithm. Optical flow with a pyramidal implementation of this algorithm is used. It relies on two premises: first, intensity constancy in the vicinity of each pixel considered as a feature; second, minimal change in the position of the features between two consecutive frames. Given these restrictions, to ensure the performance of the algorithm, it can be expressed in the following form: if we have a feature position p_i = (x,y) in the image I_k, the objective of the tracker is to find the position of the same feature in the image I_{k+1} that fits the expression p'_i = (x,y) + t, where t = (t_x, t_y). The vector t is known as the optical flow, and it is defined as the visual velocity that minimizes the residual function e(t) defined as:

e(t) = ∑_{W} ( I_k(p_i) − I_{k+1}(p_i + t) )² w(W)    (4.1)

where w(x) is a function that assigns different weights to the comparison window W. This equation can be solved for each tracked feature, but since all features on physical objects are expected to move in solidarity, the summation can be done over all features, obtaining the movement of the object on the image plane, which is used as the input to the Fuzzy controller.
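As an illustration, the same idea can be reproduced with the pyramidal Lucas-Kanade tracker available in OpenCV; the thesis uses its own implementation, so the parameters and frame variables below are assumptions.

import cv2
import numpy as np

# prev_frame and frame are two consecutive BGR images (assumed available).
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Features selected on the tracked object in the previous frame.
p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                             qualityLevel=0.01, minDistance=7)
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                         winSize=(21, 21), maxLevel=3)

# Features on a rigid object move in solidarity, so the object's motion t
# on the image plane is the mean flow of the tracked features; this
# displacement is what feeds the Fuzzy controller.
good = status.ravel() == 1
t = (p1[good] - p0[good]).reshape(-1, 2).mean(axis=0)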

4.3.2. Fuzzy Control of a Pan and Tilt Visual Platform andUAV’s Heading

To control the visual platform, two Fuzzy logic controllers were developed. Another Fuzzy logic controller was developed to command the heading of the helicopter. The combined control of the visual platform and the heading of the helicopter allows a suitable automatic control to be applied in different situations. In addition to overcoming difficulties in the surroundings or adapting to the needs of


the pilot, it provides the possibility of tracking static and moving objects in these situations:

1. With a preprogrammed series of way points.

2. When the pilot is controlling the helicopter or when the flight commands are sent from the ground station, making it easier to control the UAV by allowing it to control the heading automatically.

3. Staying in hovering position in a safe place.

Also, fuzzy logic provides a more versatile solution for controlling the platform and helicopter because it is easier to tune and to adapt to the real world, since it can represent non-linear problems. This gives a better solution for overcoming the helicopter's own vibrations and other perturbation signals from the environment. In this chapter, we first describe the control configuration of the UAV, continue with an explanation of our fuzzy software implementation, and finish by presenting the various fuzzy controllers that we implemented.

The three Fuzzy logic controllers were developed using the C++ library MOFS (Miguel Olivares' Fuzzy Software). The fuzzification of the inputs and the outputs is defined using triangular membership functions for the platform controllers, and trapezoidal and triangular membership functions for the heading. The controllers are presented in Figures 4.3, 4.4, 4.5, and have two inputs and one output. The first input is the error between the center of the object and the center of the image (Figures 4.5(a) and 4.4(a)) and the second is its derivative (Figures 4.5(b), 4.4(b)). The platform controller output represents the required movement of the servomotor in the two axes to minimize the error, on a scale from 0 to 255 (Figures 4.3(c), 4.4(c)). The heading controller takes the same inputs as the yaw controller (Figures 4.5(a), 4.5(b)), and the output represents how many radians the UAV must rotate to line up with the object (Figure 4.5(c)).

The three controllers work in parallel, providing redundant operation on the yaw axis. The servos have a movement limitation from −90 to 90 degrees; the helicopter heading controller cancels this limitation. The third controller also serves to eliminate the turn limitations of the platform when the tracked object moves to the back part of the UAV. The rule base is based on heuristic knowledge and contains 49 rules. The output has a more important sector defined in the section near zero, as shown in Figures 4.3(c), 4.4(c). This option gives the controller more sensitivity when the error is small and the object is very near the center of the image, and a very quick controller response when the object is far away. The heading controller has a central trapezoidal section in the output definition. This special form sends no command when the error is small. In that way, this controller lets the yaw servo controller work when the error is not too big. When the error increases, the heading controller starts to work, helping the yaw


(a) Yaw Error.

(b) Derivative of the Yaw error.

(c) Output of the Yaw and the Pitch Platform con-troller.

Figure 4.3: Visual platform yaw controller.

(a) Pitch Error.

(b) Derivative of the Pitch error.

(c) Output of the Yaw and the Pitch Platform con-troller.

Figure 4.4: Visual platform pitch controller.

servo when it is close to its physical limits. This special definition also increases the helicopter's stability in situations where the tracked object is near the center.
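The coordination between the two yaw-axis controllers can be sketched as a dead zone (the width below is an assumed value; in the real controller it is implicit in the trapezoidal zero set):

def heading_command(error_px, derror_px, fuzzy_heading, dead_zone_px=15):
    # Small errors are absorbed by the pan servo alone; the helicopter
    # heading only reacts when the error is big enough.
    if abs(error_px) <= dead_zone_px:
        return 0.0
    return fuzzy_heading(error_px, derror_px)   # radians to rotate the UAV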

The linguistic values used in the variable definitions are represented with acronyms in Figures 4.3, 4.4, 4.5. The meanings of these acronyms are shown in


(a) Heading Error.

(b) Derivative of the heading error.

(c) Output of the Heading controller.

Figure 4.5: Heading controller.

Variable Name      Acronym   Meaning
Error              VBL       Very Big to the Left
                   BL        Big to the Left
                   LL        Little to the Left
                   C         Center
                   LR        Little to the Right
                   BR        Big to the Right
                   VBR       Very Big to the Right
Derivative Error   VBN       Very Big Negative
                   BN        Big Negative
                   LN        Little Negative
                   Z         Zero
                   LP        Little Positive
                   BP        Big Positive
                   VBP       Very Big Positive
Output: Turn       VBL       Very Big to the Left
                   BL        Big to the Left
                   L         Left
                   LL        Little to the Left
                   C         Center
                   LR        Little to the Right
                   R         Right
                   BR        Big to the Right
                   VBR       Very Big to the Right

Table 4.1: Meaning of the acronyms of the linguistic values of the fuzzy variable inputs and the output.


Table 4.1.

Dot \ Error   VBL   BL    LL    C     LR    BR    VBR
VBN           VBL   VBL   VBL   BL    L     LL    Z
BN            VBL   VBL   BL    L     LL    Z     LR
LN            VBL   BL    L     LL    Z     LR    R
Z             BL    L     LL    Z     LR    R     BR
LP            L     LL    Z     LR    R     BR    VBR
BP            LL    Z     LR    R     BR    VBR   VBR
VBP           Z     LR    R     BR    VBR   VBR   VBR

Table 4.2: Rule base for the Yaw and Pitch controllers.

Dot \ Error   VBL   BL    LL    C     LR    BR    VBR
VBN           BL    BL    BL    BL    L     LL    Z
BN            BL    BL    BL    L     LL    Z     LR
LN            BL    BL    L     LL    Z     LR    R
Z             BL    L     LL    Z     LR    R     BR
LP            L     LL    Z     LR    R     BR    BR
BP            LL    Z     LR    R     BR    BR    BR
VBP           Z     LR    R     BR    BR    BR    BR

Table 4.3: Rule base for the Heading controller.

4.3.3. Experiments

The laboratory experiments done to tune the Fuzzy controllers and the real experiments done with the helicopter are presented next.

4.3.3.1. Laboratory Experiments

In order to tune the Fuzzy controllers, some tests were done in the laboratory. First, the controllers of the visual platform were tuned. After that, a helicopter simulator was used to tune the heading controller. In both tuning processes, the variable ranges and the sets were modified.

For the tuning of the pan and tilt controllers, a step response test was defined. The initial conditions were equal for all the tests. The platform was fixed on a tripod, and the object to track was located at the image position (−100, 30) pixels. To compare the results of each controller, the stabilization time and the type, value and number of oscillations were taken into account. The most relevant tests are shown in Figure 4.6. Table 4.4 shows a comparison of the stabilization times for each controller.

In order to test and fit the heading controller, some tests were done with a real pan and tilt platform and a helicopter simulator. A captured frame of this simulator is shown in Figure 4.7. In it, it is possible to appreciate the position of


Figure 4.6: Different rule-base and output definitions of the fuzzy systems used in laboratory tests. a) Without a more significant sector defined in the output. b) With an output range of action smaller than the one shown in Figure 4.5(c), with sets at -10, -7.5, -5, -2.5, 0, 2.5, 5, 7.5 and 10. c) The selected definition, shown in Figure 4.5(c).

Fuzzy system   Axis    Frames to stabilization
a)             Yaw     26
a)             Pitch   35
b)             Yaw     10
b)             Pitch   10
c)             Yaw     5
c)             Pitch   5

Table 4.4: Measures of the most relevant fuzzy systems.

the UAV on a real map and the 3D rendered image of the helicopter, as well as a telemetry screen where all the axis, velocity and position changes of the helicopter can be seen. In these tests the heading controller of the simulated helicopter and the real pan and tilt platform worked together. The platform was moved chaotically for the duration of the test to simulate the helicopter movements. The object to track was a mark set in a static position. One of these tests is presented next.

The error measured in both axes is shown in Figure 4.8. In this figure the big changes of the pan and tilt position in both axes can be appreciated. The controllers manage these changes, reducing the error in a few frames.

Figure 4.9 shows the response of the fuzzy controller of the visual platform for the pitch angle. The fast response of the controller reduces the error, as shown in Figure 4.8. Figure 4.10 shows the controller response of the platform's yaw axis. A big and rapid movement can be appreciated near frame 1600, reaching an error of almost 100 pixels. This big error is canceled in


Figure 4.7: A sample window of the helicopter simulator.

Figure 4.8: Error between the static tracked object and the center of the image, running with the UAV simulator.

less than 10 frames. It must be taken into account that the visual platform is in continuous movement.

Figure 4.9: Response of the fuzzy control for the pitch axis of the visual platform tracking a static object with the UAV control simulator.

The response of the heading controller is shown in Figure 4.11. As previously mentioned, the controller only responds when the error is big enough. Figure 4.12 shows the heading changes of the helicopter.


Figure 4.10: Response of the fuzzy control for the yaw axis of the visual platform tracking a static object with the UAV control simulator.

Figure 4.11: Response of the Fuzzy control for the heading of the helicopter.

Figure 4.12: Heading movement.


4.3.3.2. Real Tests with the UAV

The results obtained using the helicopter are presented next. In these tests only the platform controllers were active. To check the behavior of the developed control system, two different tests were done. First, the tracking of a static object is presented. Then, the tracking of a moving van is shown.

Tracking Static Objects The static object tracked was a high-contrast mark. The tracking process starts before the ignition of the helicopter's motor and finishes when it has landed. This flight was made by sending set-points from the ground station. Figure 4.13 shows a 3D reconstruction of the flight using the GPS and IMU data on three axes: North (X), East (Y) and Altitude (Z). The UAV is positioned over the north axis, looking to the east, where the mark to track is located. The frame rate is 15 frames per second, so those 2500 frames represent a full flight of almost 3 minutes. This first test of static object tracking was done with a gas-powered helicopter.

Figure 4.13: 3D flight reconstruction from the GPS and the IMU data of the UAV, where the 'X' axis represents the NORTH axis of the local tangent plane of the earth, the 'Y' axis represents the EAST axis, 'Z' is the altitude of the helicopter and the red arrows show the pitch angle of the helicopter.

Figure 4.14 shows the UAV's pitch, yaw and roll movements. Figure 4.16 shows


Figure 4.14: Pitch, yaw and roll movements of the UAV. (a) Pitch angle movements; (b) yaw angle movements; (c) roll angle movements.

the output of the two fuzzy controllers. The big changes in the helicopter position and the wind perturbations generate big errors in both axes. These errors were canceled in a few frames by the visual platform's controllers.

Figure 4.15: Error between the center of the image and the center of the object to track.

A summary of the most important changes in both axes is presented in Table 4.5. The perturbations of different sizes were managed by the controllers, reducing the error very fast. This table is divided by its first column into 4 sections. These sections are related to the big changes shown in Figures 4.14 and 4.15. The maximum error value was obtained after the motor ignition. During this specific part of the test the helicopter experienced a big lateral oscillation. For this reason the error reached a value of +100 pixels in the frame interval (540−595). During the initial


Figure 4.16: Output from the fuzzy controllers.

Section   Frames interval   Attitude angle   Degrees   Frames num.   Time    Degrees/s   Error (pixels)
1         540-595           Yaw              +8        55            3.6s    +2.28       +100 (Yaw)
1         590-595           Roll             -5        5             0.33s   -15         +100 (Yaw)
1         570-595           Pitch            -4        25            1.6s    -2.5        +40 (Pitch)
1         595-620           Yaw              -22       25            1.6s    -13.75      +50 (Yaw)
1         595-660           Roll             +10       65            4.3s    +2.35       +50 (Yaw)
1         620-660           Yaw              +20       40            2.6s    +15.38      -75 (Yaw)
2         1460-1560         Yaw              -40       100           6.6s    -6.06       +52 (Yaw)
2         1560-1720         Yaw              +28       160           10.6s   +2.64       48 (Yaw)
3         2170-2260         Yaw              -35       90            6s      -5.8        55 (Yaw)
4         2375-2450         Yaw              -27       75            5s      -5.4        48 (Yaw)

Table 4.5: Data from big attitude changes sections of the flight.

elevation of the aircraft, big error values were obtained as well. The big changes experienced in the roll and pitch angles caused errors of +100 pixels for the yaw controller and +40 pixels for the pitch controller. This is the only part of the flight in which the helicopter experienced big changes in the three helicopter axes at the same time.

Another kind of problem occurs when the helicopter is not undergoing big changes in its attitude angles. In those phases of the flight with light movements, we have an error of 5 pixels in the pitch axis of the platform and a +5, -15 pixel error in the yaw angle. In degrees, this amounts to 0.6562 degrees of pitch and 0.6562 to -1.96875 degrees of yaw. Notice that, in this section of light movements, there are continued small yaw changes, as shown in Figure 4.14(b) between sections 1 and 2.

The second test of static object tracking was done with the electric helicopter. In this case some modifications were made in the membership functions of the input variables. A more significant sector was defined in the center of both inputs of the pan and tilt platform. Figures 4.17(a) and 4.17(b) show the first input of each controller, and Figure 4.17(c) the second input of both. Figure 4.17(d) shows the definition of the output for each controller. In those figures, the definition of the non-symmetric membership functions can be noticed.

A real flight test was carried out. Figure 4.18 shows a 3D reconstruction of the flight using the GPS and the IMU data.

In this figure, it is possible to see the trajectory of the UAV during the tracking task. In order to increase the difficulty of the test, the selection of the object to


Figure 4.17: Input and output variables for the pitch and yaw controllers. (a) First input variable of the yaw controller; (b) first input variable of the pitch controller; (c) second input of the yaw and pitch controllers; (d) output of the yaw and pitch controllers.

track was made during the flight from the start point (as shown in Figure 4.18). At first, some rapid movements were made in the three axes (Figures 4.19(a), 4.19(b) and 4.19(c), before frames 180-200), making big changes in the different angles of the helicopter, as shown in Figures 4.20(a), 4.20(b) and 4.20(c). Then, a side movement with an increasing velocity was made (from 0.0 m/s to more than 1 m/s), as shown in Figure 4.19(c), from frame 180 to frame 300. Later, a rapid descent with a maximum of 1 m/s was made, which can be seen from frame 300 to frame 450 in Figures 4.19(a) and 4.20(a). The manual landing introduced big changes in all the angles when the helicopter was close to the floor. The test finishes with the vibrations of the helicopter when it was in contact with the terrain at the end point.

This flight is faster than the one presented in the previous test, in order to increase the difficulty of following the tracked object.

In Figure 4.21 the error in the two axes can be seen. It also shows that there are no error peaks during the flight, i.e. a very good response of the controller. Furthermore, the results show correct tracking control despite the increase of the movements and velocities in comparison with the previous work. Thus, based on these results and on the behavior of the con-


Figure 4.18: 3D flight reconstruction using the GPS and the IMU data of the UAV, where the 'X' axis represents the NORTH axis of the local tangent plane of the earth, the 'Y' axis represents the EAST axis, 'Z' is the altitude of the helicopter and the red arrows show the pitch angle of the helicopter.

Figure 4.19: Velocity changes of the UAV. (a) Velocity changes in the Z axis (altitude); (b) velocity changes in the X axis (forward); (c) velocity changes in the Y axis (side).

trollers, it is possible to say that the improvements in the membership functions successfully overcome the vibration problems and the high


Figure 4.20: Axis changes of the UAV in degrees. (a) Pitch angle changes; (b) yaw angle changes; (c) roll angle changes.

non-linearity of the VTOL-UAV.

Figure 4.21: Error between the center of the image and the center of the object to track.

Tracking Moving Objects Here the tracking of a van is presented, with continuous movements of the helicopter increasing the difficulty of the test. Figure 4.22 shows the error in pixels in the two axes of the image. It also shows the moments where we deselected the template and re-selected it, in order to increase the difficulty for the controller. These intervals show up as the error remaining fixed at one value for a long time.

Figures 4.23 and 4.24 show the response of the two controllers, including the large movements sent by the controller to the servos when the mark is re-selected. Notice that in all of the figures that show the controller responses, there are no data registered when the mark selection is lost, because no motion is tracked. Figure 4.22 shows the data from the flight log, the


Figure 4.22: Error between the center of the image and the center of the dynamic object (a van) to track.

black box of the helicopter. We can see that the larger responses of the controllers are almost ±10 degrees for the yaw controller and almost 25 degrees for the pitch controller, corresponding to the control correction over a period of fewer than 10 frames.

Figure 4.23: Response of the fuzzy control for the yaw axis of the visual platform tracking a dynamic object (a van).

Figure 4.24: Response of the fuzzy control for the pitch axis of the visual platform tracking a dynamic object (a van).

These test videos and more can be viewed at (url, 2012a).

4.4. Autonomous Landing

4.4.1. 3D Estimation Based on Homographies

Next it is explained how the frame-to-frame homography is estimated using matched points and robust model-fitting algorithms. For this, the pyramidal Lucas-Kanade optical flow (Bouguet Jean Yves, 1999), on corners detected using the method of Shi and Tomasi (Shi and Tomasi, 1994), is used to generate a set of


corresponding points; then, a RANSAC (Fischer and Bolles, 1981) algorithm is used to robustly estimate the projective transformation between the reference object and the image. The next section explains how this frame-to-frame homography is used to obtain the 3D pose of the object with respect to the camera coordinate system.

On images with high motion, good matched features can be obtained using the well-known pyramidal modification of the Lucas-Kanade algorithm (Bouguet Jean Yves, 1999). It solves the problem that arises when large and non-coherent motion is present between consecutive frames by first tracking features over large spatial scales in the image pyramid, obtaining an initial motion estimation, and then refining it by downsampling the levels of the image pyramid until it arrives at the original scale.

The set of corresponding or matched points between two consecutive images, $(x_i, y_i) \leftrightarrow (x'_i, y'_i)$ for $i = 1 \dots n$, obtained using the pyramidal Lucas-Kanade optical flow, is used to compute the $3 \times 3$ matrix $H$ that takes each $x_i$ to $x'_i$, i.e. $x'_i = H x_i$: the homography that relates both images. The matched points often have two error sources. The first one is the measurement of the point position, which follows a Gaussian distribution. The second one is the outliers to the Gaussian error distribution, i.e. the mismatched points given by the selected algorithm. These outliers can severely disturb the estimated homography, and consequently alter any measurement based on homographies. In order to select a set of inliers from the total set of correspondences, so that the homography is estimated employing only the pairs considered as inliers, robust estimation using the Random Sample Consensus (RANSAC) algorithm (Fischer and Bolles, 1981) is used. It achieves its goal by iteratively selecting a random subset of the original data points, using it to obtain the model, and evaluating the model consensus, i.e. the total number of original data points that best fit the model. In the case of a homography, four correspondences are enough to obtain an exact or minimal solution using the inhomogeneous method (Criminisi et al., 1999). This procedure is repeated a fixed number of times, each time producing either a model which is rejected because too few points are classified as inliers, or a refined model. When the total number of trials is reached, the algorithm returns the homography with the largest number of inliers.
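For clarity, a minimal sketch of this matching-plus-RANSAC pipeline is given below, using OpenCV's implementations of the Shi-Tomasi detector, the pyramidal Lucas-Kanade tracker and RANSAC homography fitting. The function name and the parameter values are illustrative assumptions, not the thesis' actual settings.

import cv2
import numpy as np

def frame_to_frame_homography(prev_gray, curr_gray):
    """Estimate the 3x3 homography H with x' = H x between two frames."""
    # Shi-Tomasi corners on the previous frame (Shi and Tomasi, 1994)
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                       qualityLevel=0.01, minDistance=7)
    # Pyramidal Lucas-Kanade optical flow (Bouguet, 1999)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, pts_prev, None,
        winSize=(21, 21), maxLevel=3)
    good_prev = pts_prev[status.ravel() == 1]
    good_curr = pts_curr[status.ravel() == 1]
    # RANSAC rejects the outlier matches and fits H to the inlier set
    H, inlier_mask = cv2.findHomography(good_prev, good_curr,
                                        cv2.RANSAC, ransacReprojThreshold=3.0)
    return H, inlier_mask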

World Plane Projection onto the Image Plane In order to align the planar object in the world space and the camera axis system, we consider the general pinhole camera model and the homogeneous camera projection matrix, which maps a world point $x_w$ in $\mathbb{P}^3$ to a point $x^i$ on the ith image in $\mathbb{P}^2$, defined by equation 4.2:

$$s\,x^i = P^i x_w = K[R^i|t^i]\,x_w = K[r^i_1 \; r^i_2 \; r^i_3 \; t^i]\,x_w \qquad (4.2)$$

where the matrix $K$ is the camera calibration matrix, $R^i$ and $t^i$ are the rotation and translation that relate the world coordinate system and the camera coordinate system, and $s$ is an arbitrary scale factor. Figure 4.25 shows the relation between


Figure 4.25: Projection model on a moving camera and frame-to-frame homography induced by a plane.

a world reference plane and two images taken by a moving camera, showing the homography induced by a plane between these two frames.

If the point $x_w$ is restricted to lie on a plane $\Pi$, with a coordinate system selected in such a way that the plane equation of $\Pi$ is $Z = 0$, the camera projection matrix can be written as equation 4.3:

$$s\,x^i = P^i x_\Pi = P^i \begin{bmatrix} X \\ Y \\ 0 \\ 1 \end{bmatrix} = \langle P^i \rangle \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} \qquad (4.3)$$

where $\langle P^i \rangle$ denotes that this matrix is deprived of its third column, i.e. $\langle P^i \rangle = K[r^i_1 \; r^i_2 \; t^i]$. The deprived camera projection matrix is a $3 \times 3$ projection matrix which transforms points on the world plane (now in $\mathbb{P}^2$) to the ith image plane (likewise in $\mathbb{P}^2$); it is none other than a planar homography $H^i_w$, defined up to a scale factor as equation 4.4 shows.

$$H^i_w = K[r^i_1 \; r^i_2 \; t^i] = \langle P^i \rangle \qquad (4.4)$$

Equation 4.4 defines the homography which transforms points on the world plane to the ith image plane. Any point on the world plane $x_\Pi = [x_\Pi, y_\Pi, 1]^T$ is projected on the image plane as $x = [x, y, 1]^T$. Because the world plane coordinate system is not known for the ith image, $H^i_w$ cannot be directly evaluated. However, if the position of the world plane for a reference image is known, a homography $H^0_w$ can be defined. Then, the ith image can be related to the reference image to obtain the homography $H^i_0$. This mapping is obtained using sequential frame-to-frame homographies $H^i_{i-1}$, calculated for any pair of frames $(i-1, i)$ and used to relate the ith frame to the first image $H^i_0$ using equation 4.5:

$$H^i_0 = H^i_{i-1}\, H^{i-1}_{i-2} \cdots H^1_0 \qquad (4.5)$$


This mapping and the alignment between the initial frame and the world plane reference are used to obtain the projection between the world plane and the ith image, $H^i_w = H^i_0 H^0_w$. In order to relate the world plane and the ith image, we must know the homography $H^0_w$. A simple method to obtain it requires matching four points on the image with the corresponding corners of the rectangle in the scene, forming the matched points $(0,0) \leftrightarrow (x_1,y_1)$, $(0,\Pi_{Width}) \leftrightarrow (x_2,y_2)$, $(\Pi_{Length},0) \leftrightarrow (x_3,y_3)$ and $(\Pi_{Length},\Pi_{Width}) \leftrightarrow (x_4,y_4)$. This process can be done either by a helipad frame and corners detector or by an operator through a ground station interface. The helipad point selection generates a world plane defined in a coordinate frame in which the plane equation of $\Pi$ is $Z = 0$. With these four correspondences between the world plane and the image plane, the minimal solution for the homography $H^0_w = [h_1 \; h_2 \; h_3]$ is obtained.
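As an illustration, the reference homography could be obtained from the four correspondences as sketched below; the image pixel coordinates are hypothetical, and the helipad dimensions anticipate the 910 mm x 1190 mm values used in the experiments of Section 4.4.3.

import cv2
import numpy as np

# World coordinates of the helipad corners on the plane Z = 0 (in mm)
world_pts = np.float32([[0, 0], [910, 0], [0, 1190], [910, 1190]])
# Matching image corners, e.g. clicked by the ground-station operator
# (hypothetical pixel values, for illustration only)
image_pts = np.float32([[210, 140], [420, 150], [200, 360], [430, 370]])

# Minimal (four-point) solution for the world-plane-to-image homography H0w
H0w = cv2.getPerspectiveTransform(world_pts, image_pts)

# The world plane is then related to the ith frame by chaining the
# frame-to-frame homographies of equation 4.5: Hiw = Hi0 @ H0w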

Translation Vector and Rotation Matrix The rotation matrix and the translation vector are computed from the plane-to-image homography using the method described in (Zhang, 2000).

From equation 4.4 and defining the scale factor $\lambda = 1/s$, we have that
$$[r_1 \; r_2 \; t] = \lambda K^{-1} H^i_w = \lambda K^{-1} [h_1 \; h_2 \; h_3]$$
where
$$r_1 = \lambda K^{-1} h_1, \quad r_2 = \lambda K^{-1} h_2, \quad t = \lambda K^{-1} h_3 \qquad (4.6)$$
The scale factor is calculated as $\lambda = 1/\|K^{-1} h_1\|$.

Because the columns of the rotation matrix must be orthonormal, the third vector of the rotation matrix, $r_3$, could be determined by the cross product $r_1 \times r_2$. However, the noise in the homography estimation causes the resulting matrix $R = [r_1 \; r_2 \; r_3]$ not to satisfy the orthonormality condition, and we must find a new rotation matrix $R'$ that best approximates the given matrix $R$ according to the smallest Frobenius norm for matrices (the root of the sum of squared matrix coefficients) (Sturm, 2000), (Zhang, 2000). As demonstrated in (Zhang, 2000), this problem can be solved by forming the matrix $R = [r_1 \; r_2 \; (r_1 \times r_2)] = USV^T$ and using singular value decomposition (SVD) to form the new optimal rotation matrix $R' = UV^T$.

The solution for the camera pose problem is defined as $x^i = P^i X = K[R'|t]X$.

The translation vector obtained is already scaled based on the dimensions defined for the reference plane during the alignment between the helipad and image $I_0$, so if the dimensions of the world rectangle are defined in mm, the resulting vector $t^i_w = [x, y, z]^T$ is also in mm. In (Mondragon et al., 2010) it is shown how the rotation matrix can be decomposed in order to obtain the Tait-Bryan or Cardan angles, which is one of the preferred rotation sequences in flight and vehicle dynamics. Specifically, these angles are formed by the sequence: (1) $\psi$ about the $z$ axis (yaw, $R_{z,\psi}$), (2) $\theta$ about the $y_a$ axis (pitch, $R_{y,\theta}$), and (3) $\phi$ about the final $x_b$ axis (roll, $R_{x,\phi}$), where $a$ and $b$ denote the second and third stages in a three-stage sequence of axes.
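A minimal numpy sketch of this decomposition follows; the function layout is an assumption, since the thesis gives only the equations and not an implementation.

import numpy as np

def pose_from_homography(K, Hiw):
    """Recover R and t from a world-plane-to-image homography (Zhang, 2000)."""
    Kinv = np.linalg.inv(K)
    h1, h2, h3 = Hiw[:, 0], Hiw[:, 1], Hiw[:, 2]
    lam = 1.0 / np.linalg.norm(Kinv @ h1)          # scale factor, eq. 4.6
    r1 = lam * (Kinv @ h1)
    r2 = lam * (Kinv @ h2)
    t = lam * (Kinv @ h3)
    # Enforce orthonormality: project [r1 r2 r1xr2] onto a rotation via SVD
    R_approx = np.column_stack((r1, r2, np.cross(r1, r2)))
    U, _, Vt = np.linalg.svd(R_approx)
    R = U @ Vt                                      # optimal R' = U V^T
    return R, t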


Estimation Filtering. An extended Kalman filter (EKF) has been incorporated in the 3D pose estimation algorithm in order to smooth the position and correct the errors caused by the homography drift along time. The state vector is defined as the position $[x_k, y_k, z_k]$ and velocity $[\Delta x_k, \Delta y_k, \Delta z_k]$ of the helipad at time step $k$, expressed in the onboard camera coordinate system. We consider the dynamic model as a linear system with constant velocity, as presented in the following equations:

$$\mathbf{x}_k = \mathbf{F}\mathbf{x}_{k-1} + \mathbf{w}_k \qquad (4.7)$$

$$\begin{bmatrix} x_k \\ y_k \\ z_k \\ \Delta x_k \\ \Delta y_k \\ \Delta z_k \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & \Delta t & 0 & 0 \\ 0 & 1 & 0 & 0 & \Delta t & 0 \\ 0 & 0 & 1 & 0 & 0 & \Delta t \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_{k-1} \\ y_{k-1} \\ z_{k-1} \\ \Delta x_{k-1} \\ \Delta y_{k-1} \\ \Delta z_{k-1} \end{bmatrix} + \mathbf{w}_{k-1} \qquad (4.8)$$

where $\mathbf{x}_{k-1}$ is the state vector (position and velocity), $\mathbf{F}$ is the system matrix, $\mathbf{w}$ the process noise, and $\Delta t$ represents the time step.

Because the visual system only estimates the position of the helipad, the measurements are expressed as follows:

$$\mathbf{z}_k = \begin{bmatrix} x_k \\ y_k \\ z_k \end{bmatrix} + \mathbf{v}_k \qquad (4.9)$$

where $\mathbf{z}_k$ is the measurement vector, $[x_k, y_k, z_k]^T$ is the position of the helipad with respect to the camera coordinate system, and $\mathbf{v}_k$ is the measurement noise. With the previous definitions, the two phases of the filter, prediction and correction, can be formulated as presented in (Welch and Bishop, 1995), assuming that the process noise $\mathbf{w}_k$ and the measurement noise $\mathbf{v}_k$ are white, zero-mean, Gaussian noises with covariance matrices $\mathbf{Q}$ and $\mathbf{R}$, respectively.

The output of the filter is the smoothed position of the helipad, which will be used as input for the control system.

This method is similar to the one proposed by Simon et al. (Simon et al., 2000), (Simon and Berger, 2002) and is described in depth in (Mondragon et al., 2010).
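The two phases of the filter for the models of equations 4.7-4.9 can be sketched as below; the noise covariances Q and R and the time step are placeholder values, not the ones used onboard.

import numpy as np

dt = 1.0 / 12.0                      # approx. 12 FPS of the vision system
F = np.eye(6)
F[0, 3] = F[1, 4] = F[2, 5] = dt     # constant-velocity model, eq. 4.8
H = np.hstack((np.eye(3), np.zeros((3, 3))))   # only position is measured, eq. 4.9
Q = np.eye(6) * 1e-3                 # process noise covariance (assumed value)
R = np.eye(3) * 1e-2                 # measurement noise covariance (assumed value)

def predict(x, P):
    """Prediction phase: propagate state and covariance."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def correct(x, P, z):
    """Correction phase: fuse the measured helipad position z."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P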

4.4.2. Fuzzy Control of UAV's Pitch, Roll and Altitude

Three controllers were designed to control the aircraft for the autolanding task.

All of these controllers were developed using the MOFS software (Miguel Olivares' Fuzzy Software). The three developed controllers receive the homography estimation of the altitude, the lateral error and the front/back error. The controllers command the thrust, roll and pitch of the UAV. As shown in Section 4.4.3,


the altitude controller was designed and tested first. After checking the correct behavior of this controller, we designed the roll and pitch controllers for a fully autonomous landing.

Figure 4.26: Fuzzy controller for the UAV's altitude. (a) Estimation of the altitude (mm), based on the homography of the helipad; (b) derivative of the altitude estimation (mm/s); (c) output of the fuzzy controller: velocity commands for the UAV's thrust in m/s.

The thrust controller was implemented to control the altitude of the UAV during the autolanding task (Figure 4.26). It was designed with two inputs and one output. The definition of the variables and the rule base is based on heuristic information. This data was acquired from different manual and hover flight tests over the helipad. The two inputs are the estimation of the altitude, which is made by the homography (Figure 4.26(a)), and the variation of this value between the last two frames (Figure 4.26(b)). For this last input, possible variations of the frame rate of the camera in the last second are taken into account, in order to keep it robust against frame-rate changes caused by other computer operations. The output of the controller is the velocity command, in meters per second, that is executed by the aircraft in order to approach the helipad location (Figure 4.26(c)). The rule base of this controller is shown in Table 4.6.

The roll and pitch controllers are quite similar; the only thing that changes is the linguistic values of the membership functions' sets. Like the thrust controller, these controllers have a PD-like definition. The roll controller is shown in Figure 4.27. The first input is the lateral error estimation using the 3D position estimation of the homography (Figure 4.27(a)). The second input is the derivative of the first input value (Figure 4.27(b)). The output of the controller is the roll


Figure 4.27: Fuzzy controller for the UAV's roll. (a) Estimation of the lateral error (m), based on the homography of the helipad; (b) derivative of the lateral error (m/s); (c) output of the fuzzy controller: velocity commands for the UAV's roll in m/s.

command in m/s to be sent to the UAV (Figure 4.27(c)). The rule base of this controller is shown in Table 4.7.

The pitch controller is shown in Figure 4.28. The first input is the front/back error estimation using the 3D position estimation of the homography (Figure 4.28(a)). The second input is the derivative of the first input value (Figure 4.28(b)). The output of the controller is the pitch command in m/s to be sent to the UAV (Figure 4.28(c)). The rule base of this controller is shown in Table 4.8. A sketch of this kind of PD-like fuzzy inference is given below.
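The sketch reduces the inference to three sets per input and weighted-average defuzzification over output singletons; the set ranges, output values and rule table are illustrative assumptions, not the actual MOFS definitions.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Three illustrative sets per input; ranges in meters and meters/second
ERR_SETS = {'Left': (-2.0, -1.0, 0.0), 'Zero': (-1.0, 0.0, 1.0),
            'Right': (0.0, 1.0, 2.0)}
DOT_SETS = {'Neg': (-2.0, -1.0, 0.0), 'Zero': (-1.0, 0.0, 1.0),
            'Pos': (0.0, 1.0, 2.0)}
# Output singletons (m/s) and the PD-like rule table
OUT = {'Big Left': -1.0, 'Left': -0.5, 'Zero': 0.0, 'Right': 0.5, 'Big Right': 1.0}
RULES = {('Left', 'Neg'): 'Big Left', ('Left', 'Zero'): 'Left', ('Left', 'Pos'): 'Zero',
         ('Zero', 'Neg'): 'Left', ('Zero', 'Zero'): 'Zero', ('Zero', 'Pos'): 'Right',
         ('Right', 'Neg'): 'Zero', ('Right', 'Zero'): 'Right', ('Right', 'Pos'): 'Big Right'}

def fuzzy_pd(error, d_error):
    """PD-like fuzzy inference: weighted average of the fired rule outputs."""
    num, den = 0.0, 0.0
    for e_name, e_mf in ERR_SETS.items():
        for d_name, d_mf in DOT_SETS.items():
            w = min(tri(error, *e_mf), tri(d_error, *d_mf))  # firing strength
            num += w * OUT[RULES[(e_name, d_name)]]
            den += w
    return num / den if den > 0 else 0.0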


Figure 4.28: Fuzzy controller for the UAV's pitch. (a) Estimation of the front/back error (m), based on the homography of the helipad; (b) derivative of the front/back error (m/s); (c) output of the fuzzy controller: velocity commands for the UAV's pitch in m/s.


Dot \ Error   Very Close   Close       Near     Far      Very Far
Very Soft     Very Slow    Slow        Normal   Fast     Very Fast
Soft          Very Slow    Slow        Normal   Fast     Very Fast
Hard          Very Slow    Very Slow   Slow     Slow     Very Fast
Very Hard     Very Slow    Very Slow   Slow     Slow     Very Fast

Table 4.6: Rules' base of the altitude fuzzy controller.

Dot \ Error       Big Left      Left          Little Left    Zero           Little Right   Right          Big Right
Big Negative      Great Left    Great Left    Great Left     Big Left       Left           Little Left    Zero
Negative          Great Left    Great Left    Big Left       Left           Little Left    Zero           Little Right
Little Negative   Great Left    Big Left      Left           Little Left    Zero           Little Right   Right
Zero              Big Left      Left          Little Left    Zero           Little Right   Right          Big Right
Little Positive   Left          Little Left   Zero           Little Right   Right          Big Right      Great Right
Positive          Little Left   Zero          Little Right   Right          Big Right      Great Right    Great Right
Big Positive      Zero          Little Right  Right          Big Right      Great Right    Great Right    Great Right

Table 4.7: Rules' base of the roll fuzzy controller.

Dot \ Error       Big Forward      Forward          Little Forward    Zero              Little Backward   Backward          Big Backward
Big Negative      Great Forward    Great Forward    Great Forward     Big Forward       Forward           Little Forward    Zero
Negative          Great Forward    Great Forward    Big Forward       Forward           Little Forward    Zero              Little Backward
Little Negative   Great Forward    Big Forward      Forward           Little Forward    Zero              Little Backward   Backward
Zero              Big Forward      Forward          Little Forward    Zero              Little Backward   Backward          Big Backward
Little Positive   Forward          Little Forward   Zero              Little Backward   Backward          Big Backward      Great Backward
Positive          Little Forward   Zero             Little Backward   Backward          Big Backward      Great Backward    Great Backward
Big Positive      Zero             Little Backward  Backward          Big Backward      Great Backward    Great Backward    Great Backward

Table 4.8: Rules' base of the pitch fuzzy controller.


4.4.3. Experiments

For these tests a monochrome CCD FireWire camera with a resolution of 640x480 pixels is used. The camera is calibrated before each test, so the intrinsic parameters are known. The camera is installed in such a way that it is looking downwards with respect to the UAV. A known rectangular helipad is used as the reference object to estimate the UAV's 3D position. It is aligned in such a way that its axes are parallel to the local-plane North and East axes. This helipad was designed so that it produces many distinctive corners for the visual tracking. Figure 4.29 shows the helipad used and the coordinate systems involved in the pose estimation.

Figure 4.29: Helipad, camera and UAV coordinate systems.

4.4.3.1. 3D positioning experiments

Before performing the tests of the altitude controller, some tests were done to check the correct detection and the estimation of the 3D positioning based on the homography.


The test begins when the UAV is hovering over the helipad. Then the user (through the ground station interface) manually selects four points on the image that correspond to four corners of the helipad, forming the matched points $(0,0) \leftrightarrow (x_1,y_1)$, $(910\,\mathrm{mm},0) \leftrightarrow (x_2,y_2)$, $(0,1190\,\mathrm{mm}) \leftrightarrow (x_3,y_3)$ and $(910\,\mathrm{mm},1190\,\mathrm{mm}) \leftrightarrow (x_4,y_4)$. This manual selection generates a world plane defined in a coordinate frame in which the plane equation of $\Pi$ is $Z = 0$, also defining the scale for the 3D results. With these four correspondences between the world plane and the image plane, the minimal solution for the homography $H^0_w$ is obtained.

Then, the UAV is moved, in this case manually using an RC station, making changes on all the axes. The helipad is constantly tracked by estimating the frame-to-frame homographies $H^i_{i-1}$, which are used to obtain the homographies $H^i_0$ and $H^i_w$, from which $R^i_w$ and $t^i_w$ are estimated. The 3D pose estimation process runs at an average of 12 frames per second (FPS).

A wide number of tests were done to check the estimation of the homography. Two of these tests are presented next. All the figures show the estimation of the homography compared with the IMU data. The error during the test is measured by the RMSE value, sketched below. Figure 4.30 shows the estimation of the X axis, which corresponds to the lateral movement of the helicopter (roll). Figure 4.31 shows the estimation of the Y axis, which corresponds to the front/back movement of the aircraft (pitch). Figure 4.32 shows the estimation of the Z axis, which corresponds to the altitude of the vehicle. Finally, Figure 4.33 shows the estimation of the heading of the UAV.
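The RMSE metric used in these comparisons, sketched for clarity:

import numpy as np

def rmse(visual_estimate, imu_reference):
    """RMSE between the homography-based estimate and the IMU data."""
    visual_estimate = np.asarray(visual_estimate, dtype=float)
    imu_reference = np.asarray(imu_reference, dtype=float)
    return float(np.sqrt(np.mean((visual_estimate - imu_reference) ** 2)))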

Figure 4.30: Measures of the X axis of the UAV (roll) based on the homography estimation, compared with the IMU data (flight 1: RMSE = 171 mm; flight 2: RMSE = 492.5 mm).


Figure 4.31: Measures of the Y axis of the UAV (pitch) based on the homography estimation, compared with the IMU data (flight 1: RMSE = 82.7 mm; flight 2: RMSE = 163.5 mm).

The successful estimation of the altitude, pitch, roll and heading of the UAV by the homographies allows continuing with the autonomous landing tests.

Table 4.9 shows the RMSE values of the comparison between the homography estimation and the data obtained by the IMU for the X, Y and Z axes. Table 4.10 shows the comparison for the estimation of the UAV's heading.

Axis   Flight   RMSE (millimeters)
X      1        171
X      2        492.5
Y      1        82.7
Y      2        163.5
Z      1        161
Z      2        857.3

Table 4.9: Comparison between the homography estimation of X, Y, Z and the IMU data.

Angle   Flight   RMSE (degrees)
Yaw     1        2.548
Yaw     2        4.935

Table 4.10: Comparison between the homography estimation of the heading and the IMU data.


Figure 4.32: Measures of the altitude of the UAV based on the homography estimation, compared with the IMU data (flight 1: RMSE = 161 mm; flight 2: RMSE = 857.3 mm).

Figure 4.33: Measures of the heading of the UAV (yaw) based on the homography estimation, compared with the IMU data (flight 1: RMSE = 2.548 degrees; flight 2: RMSE = 4.935 degrees).


4.4.3.2. Autonomous Descent Experiments

Once the homography estimation was tested, we decided to test the altitude controller.

Figure 4.34 shows a landing sequence in which the 3D pose based on a reference helipad is constantly estimated. The landing test begins when the UAV is hovering at an altitude of 5.2 m; the figure shows the approach sequence as the helicopter descends onto the helipad. This figure also shows the original reference image, the current frame, the optical flow between the last and current frames, the helipad coordinates in the current frame's camera coordinate system, and the Tait-Bryan angles obtained from the rotation matrix. Figure 4.35 shows the reconstruction of flight test 1 using the IMU data. During this test the X and Y axes are controlled by the autopilot, maintaining a hovering condition while the helicopter is descending.

Figure 4.34: 3D pose estimation based on helipad tracking using robust homography estimation during a UAV landing process. The landing flight test begins at an altitude of 5.2 m. For all images, the reference image I0 is in the small rectangle in the upper left corner; left is the current frame and right the optical flow between the current and last frames. Superimposed are the projection of the original rectangle, the translation vector and the Tait-Bryan angles.

The 3D pose estimated using the visual system is compared with the helicopter position estimated by the controller, with reference to the takeoff point (center of the helipad). Because the local tangent plane of the helicopter is defined in such a way that the X axis is the North position, the Y axis is the East position and the Z axis is the Down position (negative), the measured X and Y values must be rotated according to the helicopter heading or yaw angle, in order to be comparable with the estimated values obtained from the homographies, as sketched below. Figure


Figure 4.35: 3D flight and heading reconstruction for the flight sequence shown in Figure 4.34 (flight test 1).

4.36 shows the estimated distance from the landmark to the UAV during the descending approach.
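A sketch of the heading rotation just described; the exact sign convention is an assumption, since the thesis does not give the formula explicitly.

import numpy as np

def ned_to_helipad_frame(x_north, y_east, yaw_rad):
    """Rotate the local-tangent-plane X (North) and Y (East) measurements by
    the helicopter heading so they are comparable with the homography values."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_body = c * x_north + s * y_east
    y_body = -s * x_north + c * y_east
    return x_body, y_body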

Figure 4.36: Comparison between the Z axis displacement from the homography estimation and the IMU data during the test.

4.4.3.3. Fully Autonomous Landing

Next, 4 tests of autonomous landing controlling the roll, pitch and thrust of the UAV are presented. These tests were done in similar conditions to the tests presented in the previous subsections. Once the helicopter is flying over the helipad, and the helipad is selected in the homography estimation program, the control system is


activated and the helicopter starts to descend. The altitude controller is activated only when the error in X and Y is minimal, i.e. when the object is centered inside the image. Working in this way, we reduce the possibility of losing the helipad because of a wind perturbation. As in the autonomous descent test, when the helicopter is close enough to the helipad, between 2 and 1 meters, the thrust motor's power is reduced to accomplish the landing. The sketch below summarizes this gating logic.
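In the sketch, the 1-2 m power cut-off band comes from the text, while the 0.2 m centering tolerance and the controller callables are assumptions.

def landing_step(x_err, y_err, altitude, roll_ctrl, pitch_ctrl, thrust_ctrl):
    """One control cycle of the landing logic described above."""
    XY_TOL = 0.2                          # hypothetical centering tolerance (m)
    roll_cmd = roll_ctrl(x_err)
    pitch_cmd = pitch_ctrl(y_err)
    centered = abs(x_err) < XY_TOL and abs(y_err) < XY_TOL
    # Descend only once the helipad is centered in the image
    thrust_cmd = thrust_ctrl(altitude) if centered else 0.0
    # Between 2 and 1 meters, reduce motor power for the final touchdown
    cut_power = centered and 1.0 <= altitude <= 2.0
    return roll_cmd, pitch_cmd, thrust_cmd, cut_power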

Figure 4.37 shows the 3D reconstruction of the first flight using the GPS data. The altitude estimation is not very accurate, so we adjusted this measure to have a more realistic 3D flight reconstruction. Figure 4.38 shows the measurements obtained using the 3D positioning based on the homography estimation for pitch, roll and altitude. In this flight the autonomous landing starts at 3 meters. The RMSE value for the pitch in this experiment is 0.8126 meters and for the roll 0.6792 meters. In this figure it can be appreciated that the altitude is not reduced until the pitch and roll controllers have reduced their error values close to zero.

Figure 4.37: 3D flight reconstruction of the first test of fully autonomous landing.

Figure 4.39 shows the 3D reconstruction of the second flight using the GPS data. Figure 4.40 shows the measurements obtained using the 3D positioning based on the homography estimation for pitch, roll and altitude. In this flight the autonomous landing starts at 4 meters. The RMSE value for the pitch in this experiment is 0.7344 meters and for the roll 0.7199 meters.

Figure 4.41 shows the 3D reconstruction of the third flight using the GPS data. Figure 4.42 shows the measurements obtained using the 3D positioning based on the homography estimation for pitch, roll and altitude. In this flight the autonomous


Figure 4.38: Homography estimation for pitch, roll and altitude of the first test of fully autonomous landing.

Figure 4.39: 3D flight reconstruction of the second test of fully autonomous landing.

landing starts at 4 meters. The RMSE value for the pitch in this experiment is 0.5254 meters and for the roll 0.6817 meters.

Figure 4.43 shows the 3D reconstruction of the fourth flight using the GPS data. Figure 4.44 shows the measurements obtained using the 3D positioning based on the homography estimation for pitch, roll and altitude. In this flight the autonomous landing starts at 8 meters. The RMSE value for the pitch in this experiment is 0.3632 meters and for the roll 1.4302 meters.

Table 4.11 shows the most relevant information about the autonomous landing tests.


Figure 4.40: Homography estimation for pitch, roll and altitude of the second test of fully autonomous landing.

Figure 4.41: 3D flight reconstruction of the third test of fully autonomous landing.

Figure 4.42: Homography estimation for pitch, roll and altitude of the third test of fully autonomous landing.


Figure 4.43: 3D flight reconstruction of the fourth test of fully autonomous landing.

Figure 4.44: Homography estimation for pitch, roll and altitude of the fourth test of fully autonomous landing.

Test number   Altitude (meters)   Pitch RMSE (meters)   Roll RMSE (meters)
1             3                   0.8126                0.6792
2             4                   0.7344                0.7179
3             4                   0.5254                0.6817
4             8                   0.3632                1.4302

Table 4.11: Autonomous landing tests.


4.5. Conclusions

In this chapter two applications controlling different degrees of freedom of a helicopter and an onboard visual platform were presented. Both approaches use visual information acquired by an onboard camera. The first one presents fuzzy control of a pan and tilt visual platform onboard a UAV and of the UAV heading. These controllers were developed to track static and moving objects, using just the visual information acquired by a visual tracker based on the Lucas-Kanade algorithm. The results presented show that the controllers have an excellent behavior tracking static and moving objects, despite the wind perturbations and the helicopter's vibrations. The use of the pan and tilt visual platform gives the helicopter freedom of movement, as well as a faster response when tracking moving objects, since visual servoing implementations on UAVs without a pan and tilt platform are more limited and slower than the platform servos' response. The heading controller eliminates the physical limitation of the pan servo. The pan and tilt control system contributes by adding an extra degree of freedom to the helicopter. The aircraft is now able to follow moving targets from a static hover position, while flying a previously set path, or under manual control. The number of possible applications is also enlarged by dealing with environment limitations, such as visual inspection of big structures like dams or power plants and of moving structures like wind turbines or offshore met masts.

The second application presented is an autonomous landing using only vision. The 3D position estimation of the UAV is acquired using the homography of a specific helipad. Results of the homography estimation show a good performance of the visually estimated values compared with the IMU data. The altitude controller was tested first with excellent results. The fully autonomous landing task was tested 4 times, controlling the pitch, roll and altitude. The helicopter landed successfully in all the tests done, from 3 to 8 meters over the landmark. The excellent results corroborate the good behavior of the three fuzzy controllers, despite the high difficulty of the system to control. The visual information allows carrying out the landing task in scenarios where GPS dropouts make this task impossible to accomplish. Furthermore, the obtained estimated information is more precise than that of the GPS-IMU onboard system, as shown in the real tests.


Chapter 5

Fuzzy Logic Control for Unmanned Quadrotors

This chapter presents the work done with fuzzy control and unmanned quadrotors. Two different aircraft were used for different tasks. The specific related works regarding this chapter are presented in Section 5.1. A brief introduction to quadrotors and the system description are presented in Section 5.2. The color-based tracking algorithm for object detection used in both applications is presented in Section 5.3. The first application, object following using the Pelican quadcopter, is presented in Section 5.4. The second application, presented in Section 5.5, shows how to solve the see-and-avoid task with an AR.Drone quadrotor. Finally, Section 5.6 presents the conclusions on using fuzzy logic control with unmanned quadcopters.

Publications related to this chapter:
- "Aerial Object Following Using Fuzzy Servoing", RED-UAS 2011.
- "Quadcopter See and Avoid Using a Fuzzy Controller", World Scientific Proceedings Series on Computer Engineering and Information Science, book chapter.
- "Vision-Based PD-type Fuzzy Servoing of a Quadrotor UAV: Apply for Aerial Object Following", Journal of Intelligent & Robotic Systems, 2013.



5.1. Related Works

The main objective of the first work presented in this chapter is to evaluate the performance of fuzzy logic (FL) based controllers using just visual information to follow a flying aerial object. In the literature, similar works have been presented for the object following task. Different vision-based algorithms have been used to follow a moving car autonomously with a UAV (Campoy et al., 2009a), (Teuliere et al., 2011), (Gomez-Balderas et al., 2012) and (Ding et al., 2006). A cooperative strategy was presented in (Zengin and Dogan, 2011) for multiple UAVs pursuing a moving target in an adversarial environment. The low-altitude road following problem for UAVs using computer vision technology was addressed in (Egbert and Beard, 2011). A people-following method based on the Parallel Tracking and Mapping (PTAM) algorithm was developed in (Rodrıguez-Canosa et al., 2012).

In the second work presented in this chapter, we propose the use of fuzzy control to address the sense-and-avoid problem (USd, 2010) for unmanned aerial systems (UAS). Before UASs are allowed to routinely fly in civil airspace, several technological hurdles need to be addressed. For example, sense-and-avoid or safe termination systems are some of the technologies that UAS require before they can share the airspace and fly over populated areas (USd, 2010). The UAS sector is gaining considerable predominance among researchers nowadays. Industry, academia and the general public are paying more attention to UASs to understand the potential benefits they could provide to society.

The onboard sense-and-avoid capability can be provided by the use of single or multiple onboard sensors (Meister et al., 2008), (Moses et al., 2011), (kin, 2012). De Wagter et al. present in (De Wagter and Mulder, 2005) the advantages of using visual sensors onboard UAVs in order to increase their capabilities. They also present the advantages of using vision as a redundant system for other classical sensors like GPS and IMU. In (He et al., 2006) the motion field of UAV images is employed to compute a range map of the objects in the camera FOV, using this information for navigation and obstacle avoidance. A visual system inspired by flying insects for detecting and avoiding static structures was proposed by Beyeler (Beyeler et al., 2009). Lai (Lai et al., 2011a) presented a real-time system for sense-and-avoid using parallel image processing on commercial graphics processing units (GPUs). Furthermore, self-contained and passive (electro-optical, EO) sense-and-avoid systems have the capability to address non-cooperative scenarios and at the same time provide an alternative to the size, weight and power (SWaP) limitations of many small- and medium-size UAS. Onboard EO sensors or cameras not only have the capability to perform sense-and-avoid (Lai et al., 2011b), (Mejias et al., 2011), (Mejias et al., 2010), but can also be used for state estimation (Shabayek et al., 2012), (Martinez et al., 2011), (Dusha and Mejias, 2012), among other applications.


5.2. Unmanned Quadrotors Description

A quadrotor, also called quadrotor helicopter or quadcopter, is a vertical take-off and landing (VTOL) system lifted and propelled by four rotors. Unlike most helicopters, quadrotors generally use symmetrically pitched blades. Control of vehicle motion is achieved by altering the pitch and/or rotation rate of one or more rotor discs, thereby changing their torque load and thrust/lift characteristics.

Early in the history of flight, quadrotor configurations were seen as a possible solution to some of the persistent problems in vertical flight; torque-induced control issues can be eliminated by counter-rotation, and the relatively short blades are much easier to construct. A number of manned designs appeared in the 1920s and 1930s. These vehicles were among the first successful heavier-than-air VTOL vehicles. However, early prototypes suffered from poor performance, and later prototypes required too much pilot workload, due to poor stability augmentation and limited control authority.

There are several advantages of quadrotors over comparably-scaled helicopters. First, quadrotors do not require mechanical linkages to vary the rotor blade pitch angle as they spin. This simplifies the design and maintenance of the vehicle. Second, the use of four rotors allows each individual rotor to have a smaller diameter than the equivalent helicopter rotor, allowing them to possess less kinetic energy during flight. This reduces the damage caused if a rotor hits anything. The piloting skills required for a quadrotor are significantly lower than those required for a helicopter.

5.2.1. Basic Quadrotor Mechanics

A quadrotor is controlled by the angular speeds of four electric motors, as shown in Figure 5.1. Each motor produces a thrust and a torque, whose combination generates the main thrust, the yaw torque, the pitch torque and the roll torque acting on the quadrotor. Two opposite rotors rotate clockwise, while the other two rotors rotate counter-clockwise, canceling out their respective torques. The following actions can be taken to maneuver the quadrotor:

Vertical accelerations: The four rotors have to turn at the same speed; increasing the speed the quadrotor will ascend, and if the rotation speed decreases the aircraft will descend. Keeping the four rotors at a constant speed, the quadrotor will stay in the air without movement.

Horizontal movements: Increasing the speed of one rotor and decreasing the speed of the opposite one will move the quadrotor in the direction of the first one, changing the pitch or roll angle of the aircraft.

Yaw rotation: Increasing the speed of the clockwise-rotating motors will rotate the quadcopter in this direction; otherwise, increasing the speed of the


Figure 5.1: Quadrotor mechanics. Motors 1 and 3 spin clockwise and motors 2and 4 spin counter-clockwise. Picture from www.wikipedia.org

other rotors will affect the rotation movement in the counter-clockwise direction.

Accurate sensors and advanced control routines are required to control the engine speed of each rotor and thus the quadrotor's movements and accelerations. The mapping from the commanded thrust and torques to the four motor speeds is sketched below.
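In the sketch, the motor placement and sign conventions are assumptions consistent with Figure 5.1 (motors 1 and 3 clockwise, 2 and 4 counter-clockwise), and the gains are normalized to 1 for illustration.

import numpy as np

def motor_speeds(thrust, roll_torque, pitch_torque, yaw_torque):
    """Map the four commanded efforts to the angular speeds of motors 1-4."""
    mix = np.array([
        # thrust  roll   pitch  yaw
        [ 1.0,    0.0,   1.0,  -1.0],   # motor 1 (front, CW) - placement assumed
        [ 1.0,   -1.0,   0.0,   1.0],   # motor 2 (right, CCW)
        [ 1.0,    0.0,  -1.0,  -1.0],   # motor 3 (rear, CW)
        [ 1.0,    1.0,   0.0,   1.0],   # motor 4 (left, CCW)
    ])
    # Squared speeds are proportional to thrust/torque contributions
    w_sq = mix @ np.array([thrust, roll_torque, pitch_torque, yaw_torque])
    return np.sqrt(np.clip(w_sq, 0.0, None))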

5.2.2. Pelican Ascending Technologies Quadcopter

The Pelican is a quadrotor developed by Ascending Technologies (Asc, 2010). It is built in a modular fashion, allowing boards to be changed quickly and easily. The main core is designed like a tower, making it plug-and-play. This testbed, shown in Figure 5.2, has a low-level stability controller based on PID that uses information from GPS, IMU, pressure altimeter and magnetometer, fused using a Kalman filter. This controller is embedded, closed and unmodifiable, but its gains are tunable. Onboard vision processing is achieved using a dual-core Atom 1.6 GHz processor with 1 GB RAM, a wireless interface and support for several types of USB cameras (mono or stereo). This computer runs Linux, working in a multi-client wireless 802.11(a,b,g) ad-hoc network, allowing it to communicate with a ground station PC used for monitoring and supervision. It has a maximum payload of 650 g. More detailed information can be found in Appendix C.2.1.

5.2.3. AR.Drone Parrot Quadrotor

The AR.Drone is a low-cost quadrotor developed by Parrot (par, 2010). Figure 5.3 shows the AR.Drone Parrot. It is very popular because of the available apps for controlling it using a smartphone or a tablet PC. It was originally designed as a sophisticated toy for augmented reality games. It has two different styrofoam hulls for indoor and outdoor flights. Depending on the mounted hull, the quadrotor


Figure 5.2: CVG-UPM (CVG, 2012) Pelican quadrotor UAV used for the aerial object following tests.

has a payload of 80 to 120 g. The drone is equipped with two cameras (forward and downward), an ultrasound altimeter, a 3-axis accelerometer, a 2-axis gyroscope (for roll and pitch) and a 1-axis precision yaw gyroscope. The onboard controller is composed of an ARM9 468 MHz processor with 128 MB DDR RAM, on which a BusyBox-based GNU/Linux distribution runs. It has a USB service port and is controlled via wireless LAN.

Figure 5.3: AR.Drone Parrot

5.3. Color-Based Tracking

Real-time processing performance in extracting visual information from images, together with tolerance to disturbances such as wind and illumination changes, are the key requirements for a UAV that uses a camera sensor to carry out outdoor tasks.

The target is detected by pre-defining a color and then designing an algorithm to highlight this color. The color is then tracked along the image sequence. The tracking is performed using the Continuously Adaptive Mean Shift algorithm (CamShift) (Bradski, 1998). This algorithm is based on the mean shift originally introduced by Fukunaga and Hostetler (Fukunaga and Hostetler, 1975). It accounts for the dynamic nature of lighting conditions by dynamically adapting to changes in the probability distribution of color.

Using the CamShift algorithm we track and estimate the centre of the color region that describes the object, obtaining the information about the object's position in the image frame. Figure 5.4 shows an example of the tracking process on a red colored object.
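As an illustration, one tracking step with OpenCV's C++ API could look like the following sketch. The color histogram hist of the pre-defined color and the initial trackWindow are assumed to be prepared beforehand, and the function name is introduced here for the example; this is not the exact onboard code of this Thesis.

// Sketch of one CamShift tracking step with OpenCV.
#include <opencv2/opencv.hpp>

cv::RotatedRect trackTarget(const cv::Mat& frame, const cv::Mat& hist,
                            cv::Rect& trackWindow) {
    cv::Mat hsv, backproj;
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

    // Back-project the hue histogram: each pixel receives the probability
    // of belonging to the pre-defined target color distribution.
    int channels[] = {0};
    float hueRange[] = {0, 180};
    const float* ranges[] = {hueRange};
    cv::calcBackProject(&hsv, 1, channels, hist, backproj, ranges);

    // CamShift adapts the search window size and orientation every frame,
    // which makes it robust to illumination and scale changes.
    return cv::CamShift(backproj, trackWindow,
        cv::TermCriteria(cv::TermCriteria::EPS | cv::TermCriteria::COUNT, 10, 1));
}

The returned rotated rectangle directly provides the center and size of the tracked color region used by the controllers described next.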


Figure 5.4: CamShift tracking of a red colored target on an image sequence. The white circle corresponds to the boundaries of the tracked colored area

In the first application presented in this chapter we use the red color to represent the flying object to follow. The size and center of this colored object are obtained from real-time images via the CamShift algorithm operating on the adjusted color probability distribution. Figure 5.4 shows an example of color object tracking sequences using the CamShift technique.

For the second application we select the orange color for the detection of the object to avoid. The color is tracked along the image sequence using the previously mentioned CamShift technique. Using the location of the target in the image, we generate the desired yaw commands (while keeping the forward velocity constant), which in turn modify the trajectory of the vehicle in order to keep the object at a constant relative bearing.

Figure 5.5 shows an example of a processed image for the collision avoidance task, in which the detection of the orange traffic cone can be appreciated.

5.4. Object Following

This application aims to solve the problem of object following using a UAV. The moving target object T flies with an unknown trajectory in the 3D world space. The UAV has an onboard forward-looking calibrated camera to follow it. The control strategy is to command the quadcopter to track the target in its Field of View (FOV). The UAV must follow the target object at a fixed safe distance while maintaining the centroid of the object at the center of the real-time camera image


Figure 5.5: Image captured with the onboard camera.

Figure 5.6: Aerial Object Following application described with the onboard and external images of a real situation.

plane.

Figure 5.6 describes the principle of object following: the target object is a red balloon tied to a rope and randomly guided by a person. It has a 3D spherical structure, so once the camera detects it, its projection on the camera image plane is a circular region, as shown in the onboard image. This region can be defined by its center $x_t = [x_t, y_t]^T$ and its circumference diameter ø_t. Since the UAV always wants to keep the target in the center of the image, the error measured between the current horizontal coordinate and the image center coordinate is the input sent to the yaw controller. Similarly, the detected circumference diameter on the image plane is inversely proportional to the distance from the camera to the aerial object, so the error in the circumference diameter is used as distance information and sent to the pitch controller to keep the distance L.
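A minimal sketch of how these two visual measurements could be derived from the tracked region is shown next. The field of view and reference diameter are assumed parameters, and the names are illustrative rather than the MOFS or onboard code.

// Illustrative computation of the two controller inputs: the horizontal
// bearing error fed to the yaw controller and the diameter error fed to
// the pitch controller.
#include <algorithm>
#include <opencv2/opencv.hpp>

struct VisualErrors {
    double yawErrorDeg;  // angle between image center and target center
    double sizeErrorPx;  // deviation of the projected diameter from the reference
};

VisualErrors measure(const cv::RotatedRect& target, int imageWidth,
                     double horizontalFovDeg, double refDiameterPx) {
    VisualErrors e;
    // Approximate linear pixel-to-angle mapping across the field of view.
    double offsetPx = target.center.x - imageWidth / 2.0;
    e.yawErrorDeg = offsetPx * horizontalFovDeg / imageWidth;
    // The projected diameter is inversely proportional to the distance, so
    // its deviation from the reference diameter encodes the range error.
    double diameterPx = std::max(target.size.width, target.size.height);
    e.sizeErrorPx = diameterPx - refDiameterPx;
    return e;
}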


5.4.1. Fuzzy Control of Quadrotor’s Pitch and Heading

This section presents the implementation of two Mamdani fuzzy controllers. Both are based on the visual information previously described, generating yaw and pitch velocity commands for the UAV (in degrees per second and centimeters per second, respectively).

For this work two fuzzy controllers were defined. As in previous works presented in this Thesis, the fuzzy logic controllers were implemented using MOFS (Miguel Olivares' Fuzzy Software). All the variables of the two controllers are defined using triangular membership functions. This and the remaining design decisions for the controllers were made based on the excellent results obtained in the previously mentioned works.

Both controllers have two inputs and one output. The design of the controller for the yaw, or heading, of the UAV is shown next. The first input of this controller is the estimated angle, in degrees, between the center of the image and the center of the object to follow (Figure 5.7). The second input is the difference between the last angle estimation and the current one (Figure 5.8). This controller sends velocity commands (degrees per second) to change the heading of the aircraft (Figure 5.9).

Figure 5.7: Definition of the Yaw controller: First input. Estimation of the deviation of the object from the centre of the image captured from the UAV.

Figure 5.8: Definition of the Yaw controller: Second input. Difference between the last two measurements.

Table 5.1 shows the rules' base of the heading fuzzy controller.

The second controller acts on the pitch of the UAV, as shown next.

The controller takes the size of the object to follow, in pixels, and uses it to estimate the distance. The first input is the current size of the object (Figure 5.10), and the second one is the difference between the last size measurement and the current one (Figure 5.11). The controller outputs velocity commands (Figure 5.12) to go forward


Figure 5.9: Definition of the Yaw controller: Output. Velocity commands to change the heading of the UAV.

when the object is far away, to stay in the same position when the object is near, at a predefined safety distance, or to go back when it is very close to the UAV.

Figure 5.10: Definition of the Pitch controller: First input. Estimation of the size of the object in the image captured from the UAV.

Figure 5.11: Definition of the Pitch controller: Second input. Difference between the last two measurements.

Figure 5.12: Definition of the Pitch controller: Output. Velocity commands to change the pitch of the UAV.

Table 5.2 shows the rules’ base for the Pitch Fuzzy controller.


Dot/Error       | Big Left    | Left         | Little Left  | Zero         | Little Right | Right        | Big Right
Big Negative    | Great Left  | Great Left   | Big Left     | Big Left     | Left         | Little Left  | Zero
Negative        | Great Left  | Big Left     | Big Left     | Left         | Little Left  | Zero         | Little Right
Little Negative | Big Left    | Big Left     | Left         | Little Left  | Zero         | Little Right | Right
Zero            | Big Left    | Left         | Little Left  | Zero         | Little Right | Right        | Big Right
Little Positive | Left        | Little Left  | Zero         | Little Right | Right        | Big Right    | Big Right
Positive        | Little Left | Zero         | Little Right | Right        | Big Right    | Big Right    | Great Right
Big Positive    | Zero        | Little Right | Right        | Big Right    | Big Right    | Great Right  | Great Right

Table 5.1: Rules' base of the Heading Fuzzy logic controller.

Dot/Error    | Very Small     | Small          | Normal         | Big            | Very Big       | Great
Much Smaller | Great Reverse  | Big Reverse    | Reverse        | Little Reverse | Zero           | Little Forward
Smaller      | Big Reverse    | Reverse        | Little Reverse | Zero           | Little Forward | Forward
Equal        | Reverse        | Little Reverse | Zero           | Little Forward | Forward        | Big Forward
Bigger       | Little Reverse | Zero           | Little Forward | Forward        | Big Forward    | Great Forward
Much Bigger  | Zero           | Little Forward | Forward        | Big Forward    | Great Forward  | Great Forward

Table 5.2: Rules' base of the Pitch Fuzzy logic controller.


Figure 5.13: Object Following dynamic image-based look-and-move system architecture.

The fuzzy controllers developed are implemented on the Pelican UAV using a dynamic look-and-move servoing architecture, as presented in Figure 5.13. In this scheme, the velocity references generated by the controller (running onboard the aircraft) are used as input references for the Pelican low-level controller. This low-level controller, when operated in position control, accepts velocity commands as well as direct control actions as inputs. For this test, the velocity commands generated by the visual system are sent directly as input velocities to the autopilot. This autopilot also allows X, Y, Z and yaw to be controlled independently. The roll and pitch angles are not directly controllable by velocity commands, but they can be controlled by means of direct motor control. Thus, in this control architecture, and considering that the camera is looking forward, the quadrotor X and Z axes as well as the yaw angle are controlled by the generated references ($V_{X_q} = V_{Z_c}$, $V_{Z_q} = V_{Y_c}$, $\omega_{Z_q} = \omega_{Y_c}$). The Y axis and the pitch and roll angles are controlled by the low-level autopilot.

The defuzzification process uses the product model of inference and the height-weight defuzzification model, as shown in Eq. 5.1.

\[
y = \frac{\sum_{l=1}^{M} y^{l} \prod \mu_{B'}(y^{l})}{\sum_{l=1}^{M} \prod \mu_{B'}(y^{l})} \qquad (5.1)
\]
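As a minimal sketch (not the MOFS implementation), Eq. 5.1 amounts to the following weighted average, where w[l] is the product of the membership degrees of the fired rule l and y[l] its output center:

// Height-weight defuzzification of Eq. 5.1 (illustrative sketch).
#include <vector>

double defuzzify(const std::vector<double>& y, const std::vector<double>& w) {
    double num = 0.0, den = 0.0;
    for (size_t l = 0; l < y.size(); ++l) {
        num += y[l] * w[l];  // weighted sum of rule output centers
        den += w[l];         // total firing strength
    }
    return den > 0.0 ? num / den : 0.0;  // guard against no fired rules
}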

5.4.2. Experiments

A large number of tests have been carried out with a red balloon as the target object. The balloon is moved along a random trajectory in 3D space. The visual system, composed of the CamShift color-tracking module and the fuzzy controller, runs on the Pelican quadcopter (Figure 5.2) on an Intel Atom board PC. The velocity commands are sent to the low-level autopilot through a serial interface.

This section presents the three most relevant tests. For each test, a 2D flight reconstruction is presented using the onboard GPS information. The GPS data was plotted over a Google Earth image of the place where the tests were done. The black arrows represent the heading of the aircraft. Furthermore, the measurements of the yaw angle with respect to the balloon and of the balloon's size are presented.


The responses of the heading and pitch controllers are also shown. All tests lasted more than 2 minutes without ever losing the object being followed.

Figure 5.14 shows the trajectory followed by the UAV together with some captures from the onboard camera. In this case a red balloon was followed for almost two minutes. The initial point of the aircraft is marked with the number 1. The aircraft then follows the trajectory marked by the consecutive numbers.

Figure 5.14: Trajectory of the Pelican-UAV during the First Aerial Object Following Test.

Figure 5.15(a) shows the measurement of the balloon's size during the test. The size of the balloon gives the controller the information to go forward, when the object is far away, or to go back, when it is too near. Captures of the most representative movements are included in this Figure. The first picture shows the beginning of the test, before motor ignition. The second one shows the normal situation, in which the object has the predefined size. To test the reverse movements of the aircraft, the balloon was moved towards the UAV, as shown in the third picture. The fourth picture shows how the UAV recovered from the previous situation in a few seconds.

Figure 5.15(b) shows the response of the pitch controller following the object at a safe fixed distance.

Figure 5.16(a) shows the estimation of the angle between the red balloon, the UAV and the center of the image. This measurement is obtained by the visual algorithm and is used as input for the heading controller. As shown in this Figure, the controller keeps the object in the center of the image despite the effects of the wind and the random trajectory. Captures of the most critical


(a) Measurement of the size of the red balloon during the whole test.

(b) Output of the Pitch Fuzzy Controller

Figure 5.15: Measurements for the distance between the UAV and the aerial object and response of the Pitch controller of the first test.

moments are included in the Figure.

Figure 5.16(b) shows the output values of the yaw controller.

(a) Measurements for the orientation of the UAV

(b) Output of the Yaw Fuzzy Controller.

Figure 5.16: Measurements for the orientation of the UAV and response of the Yaw controller of the first test.

The second test was done in similar conditions, but unexpected wind gusts made it totally different from the previous one. Figure 5.17 shows the 2D flight reconstruction of the aircraft's trajectory. Some of the most representative captures of this flight are included. As in the previous test, the number 1 marks the initial position of the flight.

The measurements of the balloon's size during this test are shown in Figure 5.18(a). The most significant moments of the test are shown with four processed images. The responses of the pitch controller are shown in Figure 5.18(b).

The changes in the angle estimation of the balloon's position are shown in Figure


Figure 5.17: Trajectory of the Pelican-UAV during the Second Aerial Object Following Test.

(a) Measurement of the size of the red balloon. (b) Output of the Pitch Fuzzy Controller.

Figure 5.18: Measurements for the distance between the UAV and the aerial object and response of the Pitch controller of the second test.

5.19(a). Some of the most relevant moments are shown with four processed images. The responses to this estimation are shown in Figure 5.19(b).

Finally, the third test is presented in Figure 5.20. In this case we check the behavior of the controllers on a trajectory with a small-radius turn. Four images are shown to illustrate the most relevant parts of the flight. The initial position is marked by the number 1.

The pitch commands are shown in Figure 5.21(b). The changes in the size


(a) Measurements for the orientation of the UAV.

(b) Output of the Yaw Fuzzy Controller.

Figure 5.19: Measurements for the orientation of the UAV and response of the Yaw controller of the second test.

Figure 5.20: Trajectory of the Pelican-UAV during the Third Aerial Object Following Test.

estimation of the balloon are shown in Figure 5.21(a). The image processed at the initial position is marked with the number 1. Three other processed images are shown, representing the moments of smallest and biggest size estimation.

The heading commands sent to the aircraft are shown in Figure 5.22(b). The angle estimation of the balloon's position in the image plane is shown in Figure 5.22(a). The moments in which the error was biggest on both sides of the image are shown with four processed images.

Finally, Table 5.3 shows the results of the three tests. The root mean-square


(a) Measurement of the size of the red balloon. (b) Output of the Pitch Fuzzy Controller.

Figure 5.21: Measurements for the distance between the UAV and the aerial object and response of the Pitch controller of the third test.

(a) Measurements for the orientation of the UAV.

(b) Output of the Yaw Fuzzy Controller.

Figure 5.22: Measurements for the orientation of the UAV and response of the Yaw controller of the third test.

error (RMSE) was used to evaluate each one. The pitch error is considered when the object to follow appears bigger or smaller than 40 pixels. The yaw controller's efficiency is shown by the error measured in degrees. The obtained errors are very small taking into account that the object to follow (a balloon) is highly affected by wind disturbances.

Videos of these tests can be found at (CVG, 2012).

5.5. Avoiding Obstacles

5.5.1. Fuzzy Control of Quadrotor Heading

To control the aircraft in this task, a heading controller was designed. The controller was implemented using the Miguel Olivares' Fuzzy Software (MOFS). The yaw controller has two inputs and one output. The first input is


Test | RMSE Pitch (pixels) | RMSE Yaw (degrees)
1    | 18.8495             | 9.3198
2    | 13.0741             | 9.0133
3    | 14.5949             | 6.9412

Table 5.3: RMSE of Pitch and Yaw controllers for the three presented tests.

the angle between the quadcopter, the object to avoid and the right or left side of the image, as shown in Figure 5.23(a). The second input measures the evolution of this angle between the last two frames, as shown in Figure 5.23(b). The output of the controller is the desired yaw angle that the quadcopter needs to turn to keep the object at the desired position; see Figure 5.23(c).

(a) First input. Estimation of the deviation of the object from the centre of the image captured from the MUAV.

(b) Second input. Difference between the last two measures.

(c) Output. Velocity commands to change the heading of the MUAV.

Figure 5.23: Definition of the Yaw controller.


Dot/Error       | Big Left    | Left         | Little Left  | Center       | Little Right | Right        | Big Right
Big Negative    | Great Left  | Great Left   | Big Left     | Big Left     | Left         | Little Left  | Zero
Negative        | Great Left  | Big Left     | Big Left     | Left         | Little Left  | Zero         | Little Right
Little Negative | Big Left    | Big Left     | Left         | Little Left  | Zero         | Little Right | Right
Zero            | Big Left    | Left         | Little Left  | Zero         | Little Right | Right        | Big Right
Little Positive | Left        | Little Left  | Zero         | Little Right | Right        | Big Right    | Big Right
Positive        | Little Left | Zero         | Little Right | Right        | Big Right    | Big Right    | Great Right
Big Positive    | Zero        | Little Right | Right        | Big Right    | Big Right    | Great Right  | Great Right

Table 5.4: Base of rules of the Fuzzy controller.


Table 5.4 shows the rules’ base of this controller. The selection of the ruleswas done using heuristic methods.

Figure 5.24 shows the control loop of the external visual control of thequadrotor.

Figure 5.24: Schematic diagram of the control Loop.

5.5.2. Experiments

Visual awareness is achieved by using an onboard forward-looking camera. Images from the camera are sent for off-board processing to a laptop ground station. The outcome of the visual processing (and the servoing commands) is then sent back to the vehicle over a Wi-Fi link.

The avoidance task aims to keep the target in the image plane at a constant bearing, either right or left (as seen from the image centre). When the object is first detected, it is pushed to the edge of the image (far left or right side) and kept at a fixed position that represents a constant relative bearing.

Flight tests were conducted using the Parrot AR.Drone platform. Communication routines were developed to send and receive information from the vehicle. A typical orange traffic cone was selected as the object to avoid. We used a VICON motion-tracking system (VICON, 2012) to record the trajectory of the vehicle with maximum precision. This information was used only for 3D plotting; no VICON data was used for the control of the aircraft. As mentioned before in this Thesis, the only information used by the yaw controller is the visual information.

The flight tests were performed with constant forward speed (constant pitch angle). No roll commands were sent during the experiments. The altitude was set to a constant value of 0.8 m and was controlled by the internal altitude controller of the AR.Drone.

The position of the quadcopter is calibrated at the beginning of the test, the initial position being the point (0,0,0) meters. The obstacle to avoid is located in front of the initial position of the quadcopter, at (5,0,1.1) meters, i.e. 5 meters ahead and 1.1 meters above the floor. Small variations of no more than 10 cm in the initial position of the quadcopter were observed during the execution of the different tests. Figure 5.25 shows the 3D flight reconstruction. These tests were made at the indoor flying laboratory of the Australian Research Centre for Aerospace Automation (ARCAA).

Once the quadrotor takes off, it flies 1 meter towards the obstacle in open


Figure 5.25: 3D flight reconstruction of the flight path. The obstacle to avoid is an orange traffic cone located at the position (5,0,1.1).

loop. Then the control process is activated, and during the next 5 seconds the controller sends commands to the aircraft. The image processing and control tasks finish after the quadrotor reaches its maximum allowable yaw angle. After this point the aircraft goes forward without any yaw commands. Figure 5.26 shows some images captured from the onboard camera during the execution of this test. Figure 5.26(a) shows the moment when the motors have not yet been ignited. Figure 5.26(b) shows the beginning of the test, during the first meter without control. Figures 5.26(c) and 5.26(d) show two frames during the control process, and Figure 5.26(e) shows the quadrotor overtaking the obstacle. A full video of the test can be found at (url, 2012a) and (url, 2012b).

The behavior of the controller is represented in Figure 5.27, which shows the evolution of the error during the test. The red step line represents the moment at which the image processing starts. The magnitude of the step is 25 degrees, but at the moment the step was applied the aircraft was looking to the opposite side, increasing the step command to 35 degrees. To evaluate the behavior of the controller we used the root mean-square error (RMSE) estimator, obtaining a low value of RMSE = 9.57 degrees. The quick response of the controller shown in this Figure corroborates its excellent behavior.


(a) (b)

(c) (d)

(e)

Figure 5.26: Onboard images taken during the execution of the test. Figures 5.26(a) and 5.26(b) are previous to the control activation. Figures 5.26(c) and 5.26(d) are during the control process, and Figure 5.26(e) is when the obstacle has been overtaken.

Figure 5.27: Evolution of the error during a real test.


5.6. Conclusions

In this chapter, two applications using fuzzy control with quadrotors were presented. Both applications use the visual information obtained from the onboard forward-looking camera. The proposed visual algorithm is based on color detection and can be used to detect objects of different colors and shapes.

In the first application, aerial object following was presented. The fuzzy controllers act on the yaw and the pitch of a UAV in order to keep the aircraft at a safe distance from the object being followed while maintaining it at the centre of the image plane. Real tests in outdoor scenarios were carried out to validate the proposed method and the fuzzy controllers. The presented results proved the excellent behavior of the fuzzy controllers, performing the following of the target object at a safe distance.

The second application presents a see-and-avoid task. A fuzzy logic controller was developed to automate collision avoidance. This controller acts by changing the heading of the aircraft, keeping the obstacle to avoid at the right (or left) side of the image until the object can be overtaken. Excellent results were obtained in real tests using the commercial quadcopter AR.Drone Parrot, with a quick response and a low error.

The excellent results in both applications demonstrated the robustness of the developed fuzzy control against noisy visual information caused by light variations.


Chapter 6

Optimization of Fuzzy Logic Controllers using the Cross-Entropy method

Previous chapters of this Thesis presented the implementation of several fuzzy logic controllers for different applications. All of these fuzzy controllers were designed using expert knowledge and heuristics. To finish this Thesis, we decided to go a little further in the design of fuzzy controllers by using the Cross-Entropy method to tune them. This method was motivated by an adaptive algorithm for estimating probabilities of rare events in complex stochastic networks, which involves variance minimization. This novel optimization method has few control applications in the literature. We present the use of this optimization method to tune different parts of PID-like fuzzy controllers. These controllers were used to manage the AR.Drone Parrot for a see-and-avoid task.

Publications related to this chapter:
- "See-and-avoid quadcopter using fuzzy control optimized by cross-entropy", IEEE-FUZZ 2012 (WCCI 2012)
- "UAS See-and-Avoid using two different approaches of Fuzzy Control", ICUAS 2012
- "Cross-Entropy Optimization for Scaling Factors of a Fuzzy Controller: A See-and-Avoid Approach for Unmanned Aerial Systems", Journal of Intelligent & Robotic Systems 2013


This chapter presents the related works on fuzzy control tuning techniques and on the uses of the Cross-Entropy method in Section 6.1. The Cross-Entropy method is explained in detail in Section 6.2. The general approach to tune fuzzy logic controllers is presented in Section 6.3. Then, Section 6.4 presents the work done using the Cross-Entropy optimization method to optimize the scaling factors of a PID-like fuzzy controller. For this work, a software implementation of the CE method was done in C++ to connect it with ROS and the 3D simulator Gazebo. The optimization of the membership functions and the rules' weights is presented in Section 6.5. For that work, a software implementation of the CE method was done in Matlab to connect it with the simulation environment developed in Matlab-Simulink. Section 6.6 presents the conclusions of this chapter.

6.1. Related Works

As previously mentioned in this Thesis, Soft-Computing (SC) techniques are one of the most suitable options to face real-world problems. The two most common SC techniques are fuzzy logic and neural networks. Hunt et al. (Hunt et al., 1992) present a survey of neural networks for control systems. In the same way, the work of Precup and Hellendoorn (Precup and Hellendoorn, 2011b) presents a survey of industrial control applications with fuzzy control. Similar to other types of controllers, SC controllers need to be tuned or optimized, manually or automatically. In 1992, Zheng (Zheng, 1992) defined a tuning sequence and classification for manual tuning of fuzzy controllers (FC). This process can be performed at three different scales based on the effects caused on the controller's behavior: macroscopic effects are caused by the modification of the scaling factors (SF), which are defined as gains of the inputs and/or outputs; medium-size effects impact the controller when the membership functions (MF) are modified; and microscopic effects appear when the output or the weight of each rule is modified. This sequence of effects can be easily understood if we visualize the rule base of the FC as a rule table: a modification of one scaling factor affects the entire rule table; a modified set of membership functions affects one row, one column, or one diagonal of the table; and a modified rule only affects one cell of the rule table.

The autonomous optimization approaches presented in the literature follow this classification too. Malhotra et al. (Malhotra et al., 2011b) present a macroscopic optimization of PID- and PI-like fuzzy controllers using genetic algorithms. In (Bonissone et al., 1996), Bonissone presents the use of genetic algorithms for macroscopic and medium-size optimization of a PI fuzzy controller. Wei Li (Li, 1994) presents a medium-size scale optimization using neural networks. In (shing Roger Jang, 1993), Jang presents an adaptive neural-based fuzzy inference system (ANFIS) that was used to refine the fuzzy if-then rules, being in this case a microscopic optimization. The learning algorithm is based on gradient descent and the chain rule proposed by Werbos (Werbos, 1974) in


the 1970s. Also of interest to the reader is the work of Bonissone in (Bonissone et al., 1999), who presents a deep discussion of SC hybrid systems and optimization methodologies with very clear examples of industrial and commercial applications.

This chapter presents a novel optimization approach using the Cross-Entropy method to tune fuzzy logic controllers. This optimization method is a general Monte Carlo approach to combinatorial and continuous multi-extremal optimization and importance sampling. It was motivated by an adaptive algorithm for estimating probabilities of rare events in complex stochastic networks (Rubinstein, 1996), which involves variance minimization. A simple modification of the initial algorithm allows it to be applied to solve difficult combinatorial optimization problems. This method derives its name from the cross-entropy (or Kullback-Leibler) distance, which is a fundamental concept of modern information theory. In a nutshell, the CE method involves an iterative procedure where each iteration can be broken down into two phases. In the first phase, a random data sample (e.g. a set of membership functions or rule weights for a fuzzy logic controller) is generated according to a specified mechanism. Then, the parameters of the random mechanism are updated based on the data in order to produce a "better" sample in the next iteration. This optimization method has been used in several domains: for solving complicated (e.g. NP-hard) combinatorial optimization problems (COPs) (Helvik and Wittner, 2001) (Keith and Kroese, 2002) (Rubinstein, 2002); to provide an easy and effective way to tackle the buffer allocation problem (BAP) (Allon et al., 2005) (Spinellis and Papadopoulos, 1999); in industry, for power system reliability evaluation (Belmudes et al., 2008), for antenna selection to improve radio channel capacity (Zhang et al., 2007), and for queuing models of telecommunication systems (de Boer et al., 2000) (Boer et al., 2004); and in robotics, to find optimal path plans (Celeste et al., 2006) and for motion planning (Kobilarov, 2011). The use of this optimization method in control is reduced to two works in the literature. In (Bodur, 2007), Bodur presents the use of the CE method to optimize the gains of a classic PID controller to manage the inverted pendulum problem in a simulated environment. Haber et al. (E.Haber et al., 2010) use the CE method to optimize the scaling factors of a fuzzy PD controller for cutting-force regulation of a drilling process, presenting experimental results for this highly controlled process. In this Thesis the method is used to optimize a PID-like fuzzy logic controller for a see-and-avoid task. The controller manages the heading of the aircraft to avoid collisions with an object. According to the tuning classification presented before, two different works are presented next: a macroscopic and a medium-size + microscopic optimization.

6.2. Cross-Entropy optimization method

The Cross-Entropy (CE) method is a new approach to stochastic optimization and simulation. It was developed as an efficient method for the estimation of


rare-event probabilities. The CE method has been successfully applied to a number of difficult combinatorial optimization problems. We present an application of this method for the optimization of the gains of a fuzzy controller. Next, we present the method and the fuzzy controller optimization approach. A deeper explanation of the Cross-Entropy method is presented in (R.Y.Rubinstein and D.P.Kroese, 2004) and (Boer et al., 2002).

The CE method is iterative and based on the generation of a random data sample $(x_1, \ldots, x_N)$ in the space $\chi$ according to a specified random mechanism. A reasonable option is to use a probability density function (pdf) such as the normal distribution. Let $g(\cdot,v)$ be a family of probability density functions on $\chi$ parameterized by a real-valued vector $v \in \Re$: $g(x,v)$. Let $\phi$ be a real function on $\chi$; the aim of the CE method is to find the minimum (as in our case) or maximum of $\phi$ over $\chi$, and the corresponding state $x^{*}$ satisfying this minimum/maximum: $\gamma^{*} = \phi(x^{*}) = \min_{x \in \chi} \phi(x)$.

In each iteration the CE method generates a sequence $(x_1, \ldots, x_N)$ and levels $\gamma_1, \ldots, \gamma_N$ such that $\gamma$ converges to $\gamma^{*}$ and $x$ to $x^{*}$. We are concerned with estimating the probability $l(\gamma)$ of the event $E_v = \{x \in \chi \mid \phi(x) \geq \gamma\}$, $\gamma \in \Re$.

Define a collection of indicator functions for $x \in \chi$, $\gamma \in \Re$:

\[
I_v(x,\gamma) = I_{\{\phi(x) \leq \gamma\}} =
\begin{cases}
1 & \text{if } \phi(x) \leq \gamma \\
0 & \text{if } \phi(x) > \gamma
\end{cases}
\qquad (6.1)
\]

\[
l(\gamma) = P_v(\phi(x) \geq \gamma) = E_v \, I_v(x,\gamma) \qquad (6.2)
\]

where $E_v$ denotes the corresponding expectation operator.

In this manner, Equation 6.2 transforms the optimization problem into a stochastic problem with a very small probability. The variance-minimization technique of importance sampling is used, in which the random sample is generated based on a pdf $h$. For a sample $x_1, \ldots, x_N$ drawn from an importance-sampling density $h$ on $\phi$, the probability is evaluated by:

\[
\hat{l} = \frac{1}{N} \sum_{i=1}^{N} I_{\{\phi(x_i) \geq \gamma\}} \cdot W(x_i) \qquad (6.3)
\]

where $\hat{l}$ is the importance-sampling estimator and $W(x) = g(x,v)/h(x)$ is the likelihood ratio. The search for the optimal sampling density $h^{*}(x)$ is not an easy task, because the estimation of $h^{*}(x) = I_{\{\phi(x) \geq \gamma\}} \cdot g(x,v)/l$ requires $l$ to be known. So the reference parameter $v^{*}$ must be selected such that the distance between $h^{*}$ and $g(x,v)$ is minimal, thereby reducing the problem to a scalar case. A way to measure the distance between two densities is the Kullback-Leibler distance, also known as cross-entropy:

\[
D(g,h) = \int g(x) \ln g(x)\,dx - \int g(x) \ln h(x)\,dx \qquad (6.4)
\]

The minimization of $D(g(x,v), h^{*})$ is equivalent to maximizing $\int h^{*} \ln[g(x,v)]\,dx$, which implies that $\max_v D(v) = \max_v E_p\left(I_{\{\phi(x) \geq \gamma\}} \cdot \ln g(x,v)\right)$; in terms of importance sampling it can be rewritten as:


\[
\max_v D(v) = \max_v \frac{1}{N} \sum_{i=1}^{N} I_{\{\phi(x_i) \geq \gamma\}} \cdot \frac{p(x_i)}{h(x_i)} \cdot \ln g(x_i, v) \qquad (6.5)
\]

Note that $h$ is still unknown; therefore the CE algorithm overcomes this problem by constructing an adaptive sequence of the parameters $(\gamma_t \mid t \geq 1)$ and $(v_t \mid t \geq 1)$.

6.3. Fuzzy Control Optimization General Approach

This section presents the interpretation of the Cross-Entropy method focused on tuning fuzzy logic controllers. The CE method generates a set of $N$ fuzzy controllers $x_i = (x_{i1}, x_{i2}, \ldots, x_{ih})$ with $g(x,v) = (g(x_1,v), g(x_2,v), \ldots, g(x_h,v))$ and calculates the objective function value for each controller. The controller parameters $x_{i1}, x_{i2}, \ldots, x_{ih}$ may correspond to the gain values and/or to the positions of the membership function sets in the first optimization, and/or to the rule weights in the second one. After all the generated controllers have been simulated, $g(x,v)$ is updated using the set of best controllers. The number of best controllers used to update the pdf is denoted by $N_{elite}$. Then a new set of controllers is generated to be tested. The process finishes when the minimum value of the cost function or the maximum number of iterations is reached. A generic version of the optimization process for fuzzy controllers is presented in Algorithm 1.

Algorithm 1 Cross-Entropy Algorithm for Fuzzy controller optimization
1. Initialize $t = 0$ and $v(t) = v(0)$.
2. Generate a sample of $N$ controllers $(x_i(t))_{1 \leq i \leq N}$ from $g(x, v(t))$, each $x_i = (x_{i1}, x_{i2}, \ldots, x_{ih})$.
3. Compute $\phi(x_i(t))$ and order $\phi_1, \phi_2, \ldots, \phi_N$ from smallest ($j = 1$) to biggest ($j = N$). Keep the first $N_{elite}$ controllers, with $\gamma(t) = \phi_{[N_{elite}]}$.
4. Update $v(t)$ with
\[
v(t+1) = \arg\min_{v} \frac{1}{N_{elite}} \sum_{j=1}^{N_{elite}} I_{\{\phi(x_j(t)) \geq \gamma(t)\}} \cdot \ln g(x_j(t), v(t))
\]
5. Repeat from step 2 until convergence or an ending criterion is met.
6. Assuming that convergence is reached at $t = t^{*}$, an optimal value for $\phi$ can be obtained from $g(\cdot, v(t^{*}))$.

For both optimization processes a Normal (Gaussian) distribution function was used. The mean $\mu$ and the variance $\sigma$ of each of the $h$ parameters are calculated at each iteration $t$ as

\[
\mu_{t,h} = \frac{1}{N_{elite}} \sum_{j=1}^{N_{elite}} x_{jh}, \qquad
\sigma_{t,h} = \frac{1}{N_{elite}} \sum_{j=1}^{N_{elite}} \left(x_{jh} - \mu_{t,h}\right)^{2}.
\]

The mean vector $\bar{\mu}$ should converge to $\gamma^{*}$ and the standard deviation $\bar{\sigma}$ to zero.

In order to obtain a smooth update of the mean and the variance, we use a set of parameters $(\beta, \alpha, \eta)$, where $\alpha$ is a constant value used for the mean, $\eta$ is a variable value applied to the variance to avert the occurrence of 0s and


1s in the parameter vectors, and $\beta$ is a constant value which modifies the value of $\eta(t)$:

\[
\begin{aligned}
\eta(t) &= \beta - \beta \left(1 - \frac{1}{t}\right)^{q} \\
\mu(t) &= \alpha \cdot \hat{\mu}(t) + (1 - \alpha) \cdot \mu(t-1) \\
\sigma(t) &= \eta(t) \cdot \hat{\sigma}(t) + (1 - \eta(t)) \cdot \sigma(t-1)
\end{aligned}
\qquad (6.6)
\]

where $\hat{\mu}(t)$ and $\hat{\sigma}(t)$ are the mean and standard deviation estimated from the elite samples of iteration $t$.
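For illustration, one iteration of this Gaussian CE scheme, including the smoothed update of Eq. 6.6, could be sketched as follows. This is an assumption-laden sketch, not the exact C++/ROS implementation used in this Thesis; cost stands for the simulated evaluation of one controller (e.g. its ITAE), and eta is assumed to be computed by the caller from Eq. 6.6.

// One Cross-Entropy iteration over per-parameter normal distributions.
#include <algorithm>
#include <cmath>
#include <random>
#include <vector>

struct Pdf { std::vector<double> mu, sigma; };

void ceIteration(Pdf& pdf, int N, int Nelite, double alpha, double eta,
                 double (*cost)(const std::vector<double>&)) {
    std::mt19937 rng(std::random_device{}());
    size_t h = pdf.mu.size();
    std::vector<std::pair<double, std::vector<double>>> scored;

    // 1) Sample N candidate controllers and evaluate the cost of each.
    for (int i = 0; i < N; ++i) {
        std::vector<double> x(h);
        for (size_t k = 0; k < h; ++k)
            x[k] = std::normal_distribution<double>(pdf.mu[k], pdf.sigma[k])(rng);
        scored.push_back({cost(x), x});
    }
    // 2) Keep the Nelite lowest-cost samples (minimization, as in our case).
    std::sort(scored.begin(), scored.end(),
              [](const auto& a, const auto& b) { return a.first < b.first; });

    // 3) Re-estimate mean/deviation from the elite set and smooth (Eq. 6.6).
    for (size_t k = 0; k < h; ++k) {
        double m = 0.0, s = 0.0;
        for (int j = 0; j < Nelite; ++j) m += scored[j].second[k];
        m /= Nelite;
        for (int j = 0; j < Nelite; ++j)
            s += std::pow(scored[j].second[k] - m, 2);
        s = std::sqrt(s / Nelite);
        pdf.mu[k]    = alpha * m + (1.0 - alpha) * pdf.mu[k];
        pdf.sigma[k] = eta   * s + (1.0 - eta)   * pdf.sigma[k];
    }
}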

6.4. Gains optimization of Fuzzy controllers: Applied to UAV See and Avoid task

This section presents the optimization of the input gain values (scaling factors) of a PID-like fuzzy controller. This optimization causes macroscopic effects on the controller's behavior, as previously mentioned. The optimization process was carried out using the ROS-Gazebo simulation environment. An implementation of the Cross-Entropy method was written in C++ and connected with ROS. To check the improvements obtained after the optimization, more than 30 tests were done comparing the non-optimized controller against the optimized one.

6.4.1. Color-Based Tracking

Visual awareness is achieved by using the onboard forward-looking camera of the AR.Drone Parrot. Images from the camera are sent for off-board processing to a laptop ground station. The outcome of the visual processing (and the servoing commands) is then sent back to the vehicle using a Wi-Fi link.

The avoidance task aims to keep the target in the image plane at a constant bearing, either right or left (as seen from the image centre). When the object is first detected, it is pushed to the edge of the image (far left or right side) and kept at a fixed position that represents a constant relative bearing.

The target is detected by pre-defining a color and then designing an algorithm to highlight this color. The color is tracked along the image sequence. The tracking is performed using the Continuously Adaptive Mean Shift algorithm (CamShift) (Bradski, 1998). This algorithm is based on the mean shift originally introduced by Fukunaga and Hostetler (Fukunaga and Hostetler, 1975). It accounts for the dynamic nature of lighting conditions by dynamically adapting to changes in the probability distribution of color.

Using the CamShift algorithm we track and estimate the centre of the color region that describes the object. Figure 6.1 shows an example of the tracking process on a red colored object. Using the location of the target in the image we generate the desired yaw commands (while keeping the forward velocity constant), which in turn modify the trajectory of the vehicle in order to keep the object at a constant relative bearing.


Figure 6.1: Image captured with the onboard camera.

6.4.2. Scaling Factor Optimized Fuzzy control of Quadrotors' Heading for See and Avoid

The fuzzy PID controller was implemented using our self-developed C++ library MOFS (Miguel Olivares' Fuzzy Software) (Mondragon et al., 2010). The inputs and the outputs were defined using triangular membership functions. The product t-norm is used for the conjunction of the rules, and the height-weight method has been selected for the defuzzification phase (Equation 6.7).

\[
y = \frac{\sum_{l=1}^{M} y^{l} \prod \mu_{B'}(y^{l})}{\sum_{l=1}^{M} \prod \mu_{B'}(y^{l})} \qquad (6.7)
\]

The fuzzy controller was defined using three inputs and one output. The first input measures the error, in degrees, between the heading of the quadrotor and the object to avoid, minus the reference (Figure 6.2). The second is the derivative of the error, as shown in Figure 6.3, and the third input, shown in Figure 6.4, represents the integral of the error. The output is the commanded yaw that the vehicle needs to turn to keep the object at the desired relative bearing; see Figure 6.5. The first and second inputs are equivalent to the inputs of the first approach.

Figure 6.2: PID-Fuzzy Controller: Membership function of the first input, the error.

The definition of the fuzzy variables uses 45 rules. Because the system has 3 inputs, the rule base has a cube disposition of 5×3×3. To make the rule base easy for the reader to understand, we present three 5×3 matrices


Figure 6.3: PID-Fuzzy Controller: Membership function of the second input, the derivative of the error.

Figure 6.4: PID-Fuzzy Controller: Membership function of the third input, the integral of the error.

Figure 6.5: PID-Fuzzy Controller: Membership function of the output, heading degrees to turn.

showing the relation between the first two inputs, the error and the dot (time derivative) of the error. Each matrix corresponds to a fixed value of the third input, the integral of the error. Table 6.1 shows the output values for the variables error and dot when the integral of the error is equal to Zero. Table 6.2 shows the output values when the third variable is equal to Negative. Finally, Table 6.3 shows the output values when the third variable is equal to Positive. A compact encoding of these three tables is sketched below.
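As a side illustration of the cube disposition, Tables 6.1-6.3 can be encoded as a single lookup array indexed by the linguistic labels of the three inputs (here arranged as 3×3×5, i.e. integral × dot × error, which holds the same 45 cells); the enum abbreviations are introduced only for this sketch.

// Rule cube of the PID-like fuzzy controller, transcribed from
// Tables 6.1-6.3. GL = Great Left, BL = Big Left, L = Left,
// LL = Little Left, Z = Zero, LR = Little Right, R = Right,
// BR = Big Right, GR = Great Right.
enum Out { GL, BL, L, LL, Z, LR, R, BR, GR };

// rules[integral][dot][error]; integral: Negative, Zero, Positive;
// dot: Negative, Zero, Positive; error: Big Left ... Big Right.
const Out rules[3][3][5] = {
    // integral = Negative (Table 6.2)
    {{L,  LL, Z,  LR, R }, {LL, Z,  LR, R,  BR}, {Z,  LR, R,  BR, GR}},
    // integral = Zero (Table 6.1)
    {{BL, L,  LL, Z,  LR}, {L,  LL, Z,  LR, R }, {LL, Z,  LR, R,  BR}},
    // integral = Positive (Table 6.3)
    {{GL, BL, L,  LL, Z }, {BL, L,  LL, Z,  LR}, {L,  LL, Z,  LR, R }},
};

Reading rules[i][d][e] then gives the output label fired by one combination of activated input labels, which is how a modified rule affects exactly one cell of the cube.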

6.4.3. Optimization process using the ROS-Gazebo simulation environment

6.4.3.1. Software Implementation

The simulation tests were performed using ROS (Robot Operating System) and the 3D simulation environment Gazebo (ros, 2012). In the simulations, a quadcopter model from the starmack ros-pkg developed by Berkeley University (sta, 2012) was used. The obstacle to avoid was defined by a virtual yellow


Dot/Error | Big Left    | Left        | Zero         | Right        | Big Right
Negative  | Big Left    | Left        | Little Left  | Zero         | Little Right
Zero      | Left        | Little Left | Zero         | Little Right | Right
Positive  | Little Left | Zero        | Little Right | Right        | Big Right

Table 6.1: Base of rules with the value of the third input (Integral of the Error) equal to Zero

Dot/Error | Big Left    | Left         | Zero         | Right        | Big Right
Negative  | Left        | Little Left  | Zero         | Little Right | Right
Zero      | Little Left | Zero         | Little Right | Right        | Big Right
Positive  | Zero        | Little Right | Right        | Big Right    | Great Right

Table 6.2: Base of rules with the value of the third input (Integral of the Error) equal to Negative

Dot/Error | Big Left   | Left        | Zero        | Right        | Big Right
Negative  | Great Left | Big Left    | Left        | Little Left  | Zero
Zero      | Big Left   | Left        | Little Left | Zero         | Little Right
Positive  | Left       | Little Left | Zero        | Little Right | Right

Table 6.3: Base of rules with the value of the third input (Integral of the Error) equal to Positive

balloon.

Two external software routines in C++ were developed to accomplish these

tests. One is the implementation of the Cross-Entropy method, which is responsible for the optimization process. This program generates a set of controllers, selects the controller to test and, when all the controllers have been tested, updates the pdf with the results to obtain new values of the mean and variance of each pdf, generating the new set of controllers to test in the next iteration. The other routine is responsible for executing the ROS-Gazebo system iteratively. In order to test all the controllers under the same conditions, ROS-Gazebo is restarted for each test, obtaining the same initial state for all the tests. In each iteration the program sends a kill command to close ROS-Gazebo and starts it again, loading all the initial parameters needed by the simulator. Figure 6.6 shows the test flowchart, in which the tasks performed by the Cross-Entropy program are represented with green boxes. The tasks performed by the iteration program are represented by blue diamonds. The blue box titled Simulation represents the ROS-Gazebo process.

Additionally, two nodes were added to ROS-Gazebo: the visual algorithm and the fuzzy controller nodes. The visual algorithm node gets the image from the simulated camera onboard the quadcopter and converts it to an OpenCV image for further processing. After the image is processed, the information obtained is sent to the fuzzy controller node. The controller evaluates this data to obtain the correct yaw value. Finally, this command is sent to the simulated aircraft in the 3D simulator. One advantage of this simulation environment


Figure 6.6: Flowchart of the optimization process.

is that the detection algorithm used in this phase is the same one used in the real tests.

Figure 6.7: Interaction between the ROS-Gazebo 3D simulator and the two other processes developed for this work.

6.4.3.2. Simulator Experiments

In order to obtain optimal parameters for a controller, a large number of different controllers must generally be tested. Testing these controllers under the same conditions is challenging. To do this we defined a type of test based on some fixed parameters, such as a fixed time for each simulation cycle and the quadcopter positioned in front of the object to avoid at a defined starting location; each test is performed sending a constant pitch command to the aircraft of 0.03 m/s. To evaluate the performance of each test we used the Integral Time Absolute


Error (ITAE) criterion. We also used the Root Mean Square Error (RMSE) criterion, with similar results. Our choice of the ITAE error estimator is motivated by the penalization it imposes on errors at the end of the test, which makes it a more meaningful estimator during an optimization process. The RMSE criterion was used for the real tests because it makes the performance of a test easier to understand, since the result is given in the same unit used by the first input of the controller (the error). The cross-entropy system generates N = 30 controllers per iteration based on the last update of the probability density function of each gain. From this set of controllers, the five with the lowest ITAE value are selected (Nelite = 5) to update the next pdf parameters. The initial values for the pdfs of all the gains are µ(0) = 0.5, σ(0) = 0.5. The rest of the parameters of the cross-entropy method are q = 2, η(0) = 0, β = 0.92, α(0) = 0. These values are based on values reported in (E.Haber et al., 2010) and (Botev and Kroese, 2004a).
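As a reference for how the two criteria differ, a minimal sketch of both estimators over a sampled error signal is shown below; the names are illustrative and dt is the sample period. ITAE integrates the time-weighted absolute error, so late errors are penalized more strongly, while RMSE weights all samples equally.

// Illustrative ITAE and RMSE estimators over a sampled error trace e[i].
#include <cmath>
#include <vector>

double itae(const std::vector<double>& e, double dt) {
    double sum = 0.0;
    for (size_t i = 0; i < e.size(); ++i)
        sum += (i * dt) * std::fabs(e[i]) * dt;  // integral of t*|e(t)|
    return sum;
}

double rmse(const std::vector<double>& e) {
    double sum = 0.0;
    for (double v : e) sum += v * v;
    return e.empty() ? 0.0 : std::sqrt(sum / e.size());
}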

Figure 6.8: Control loop with the optimization of the Cross-Entropy method.

Figure 6.9: Evolution of the probability density function for the first input gain. The standard deviation converges in 12 iterations to a value of 0.0028, so the obtained mean of 0.9572 can be used in the real tests.

A total of 330 tests were performed to obtain the optimal controller. This process corresponds to 11 updates of the pdfs of the gains. Figure 6.9 shows the evolution of the probability density function of the first input of the controller. The initial mean and sigma for the three gains were 0.5 for both parameters. The


Figure 6.10: Evolution of the probability density function for the second input gain. The standard deviation converges in 12 iterations to a value of 0.0159, so the obtained mean of 0.4832 can be used in the real tests.

Figure 6.11: Evolution of the probability density function for the third input gain. The standard deviation converges in 12 iterations to a value of 0.0015, so the obtained mean of 0.4512 can be used in the real tests.

Figure 6.12: Evolution of the ITAE error during the 12 Cross-Entropy iterations. The ITAE value of each iteration corresponds to the mean of the best 5 of the 30 controllers of that iteration.

final values of the pdf were mean = 0.9572 and sigma = 0.0028. Figure 6.10 shows the evolution for the second input, with final values of mean = 0.4832 and sigma = 0.0159. In the same way, Figure 6.11 shows the evolution of the pdf for the third input, which finishes with mean = 0.4512 and sigma = 0.0015. Figure 6.12 presents the evolution of the mean ITAE value of the 5 winners from each set of 30 controllers. Figure 6.13 shows the evolution of the different gains of the controller during the 330 tests.


Figure 6.13: Evolution of the gains of each input. The value of each gain corresponds to the best 5 of the 30 controllers of each iteration.

6.4.4. Experiments

In order to validate and compare the behavior of both controllers we conducted real flight tests. We used an AR.Drone Parrot (par, 2010) platform with our own software routines developed for this purpose (url, 2012a). A typical orange traffic cone was used as the object to avoid. We recorded the quadrotor trajectory with maximum precision using the VICON position detection system (vic, 2012). The VICON system was used for data logging only; no VICON data was used for the control of the quadrotor.

The quadcopter used in these tests is a commercial off-the-shelf Parrot AR.Drone. This aircraft has two cameras onboard: a forward-looking one, which has been used in this work, and a downward-looking one. The aircraft is connected to a ground station via a WiFi connection. An extended explanation of this platform is presented in (par, 2010).

Figure 6.14: Parrot-AR.Drone, the platform used for the real tests.

Figure 6.15 shows the control loop of the system once the Cross-Entropy process has been removed.

Figure 6.15: Control loop once the Cross-Entropy optimization process has been removed.


For both controllers, the flight tests were performed with a predefined constant forward speed (constant pitch angle) during the test. No roll commands were sent during the experiments. The altitude was set to a constant value of 0.8 m and was controlled by the internal altitude controller of the aircraft. The position of the quadcopter is calibrated at the beginning of the test, the initial position being the point (0,0,0) meters in the VICON system. The obstacle to avoid was located in front of the initial position of the quadcopter, at a distance of 4.5 meters and at 1.1 meters from the floor, i.e. at (4.5,1.1) meters.

Once the quadrotor takes off, it flies 0.5 meters towards the obstacle using the same controller with a reference value equal to 0, keeping the obstacle in the centre of the image. Once the aircraft has flown the first half meter, the reference of the control system changes to keep the obstacle at one of the edges of the image, in order to avoid the obstacle. Until the aircraft reaches 3.5 meters in the forward direction, it continues trying to avoid the obstacle. Once this distance is reached by the aerial vehicle, a constant yaw (the last yaw commanded) is sent. In this way we can compare how the optimization improves the behavior of the controller. Keeping the obstacle at one of the edges of the image, tracking it with yaw commands, implies a lateral deviation of the trajectory of less than 2 meters while keeping the same direction when the test finishes successfully. It must be taken into account that the aim of this work is the optimization of the controller and not the way the obstacle avoidance task starts and ends, as shown in Figure 6.16. Figure 6.17 shows some images captured from the onboard camera during the execution of one of these tests. Figure 6.17(a) shows the beginning of the test, during the first 0.5 meters, keeping the obstacle in the center of the image. Figure 6.17(b) shows the image captured at the middle of the test, and in Figure 6.17(c) the quadrotor can be seen overtaking the obstacle.

Figure 6.16: Explanation of the obstacle-avoidance approach. The quadcopter starts at point (0, 0) (1. Motor ignition) and flies 0.5 meters keeping the obstacle to avoid in the center of the image (2. Avoiding task: start). Then the reference towards one of the edges of the image is added to the position of the obstacle in the image plane until 3.5 meters are covered (3. Avoiding task: finish). The quadrotor then continues; the last yaw command keeps being sent after the avoiding task is finished. The obstacle to avoid is at point (0, 4.5).


Figure 6.17: Onboard images during the execution of the test.

To assess the improvements of the optimization process we tested both controllers at different speeds. Table 6.4 shows all the tests done. We tested speeds from 0.02 m/s up to 0.14 m/s, avoiding the obstacle by keeping it on the right side and on the left side of the image. The Table also shows the Root Mean Squared Error (RMSE) of each test. When no number is shown, the aircraft could not keep the obstacle on one edge of the image, losing it before covering the distance of 3 meters. Such situations imply that the quadrotor changed its trajectory too much or flew too close to the obstacle. The optimized controller clearly obtained better results: more tests were finished successfully by this controller, and its RMSE is lower in every test that both controllers finished. In order to make the comparison more reliable, no tests were done at the speed used in the optimization process (0.03 m/s).

The most significant tests are presented next. The first test shown keeps the obstacle on the left edge of the image at 0.04 m/s. In this test both controllers passed successfully, with an RMSE of 9.0081 for the non-optimized controller versus an RMSE of 5.271 for the optimized controller; these values represent a reduction of 41.5%. Figure 6.18 shows the evolution of the error during this test for the non-optimized controller, and Figure 6.19 shows the same test for the controller optimized with the Cross-Entropy method. In both Figures the red step represents the interval in which the avoiding task is active. Once the obstacle is out of the image no more error information is obtained. The black circle with the white cross represents the position of the obstacle to avoid. In the first case the aircraft has to modify its trajectory to a greater extent.

A 2D reconstruction of the flight performed by the aircraft is presented in the next Figures. For these Figures we use the information obtained with the VICON system, which was not used to control the vehicle during the obstacle-avoidance task. Figure 6.20 shows the trajectory defined by the non-optimized controller, and Figure 6.21 shows the trajectory defined by the aircraft using the Cross-Entropy optimized controller. Comparing both Figures, it is possible to appreciate that the non-optimized controller is slower than the optimized one, as is also shown in


Type of controller | RMSE (degrees) | Speed (m/s) | Obstacle position
Non-Optimized Fuzzy controller | 7.848 | 0.02 | Left
CE-Fuzzy controller | 6.4048 | 0.02 | Left
Non-Optimized Fuzzy controller | 9.0081 | 0.04 | Left
CE-Fuzzy controller | 5.2714 | 0.04 | Left
Non-Optimized Fuzzy controller | -- | 0.06 | Left
CE-Fuzzy controller | 7.4886 | 0.06 | Left
Non-Optimized Fuzzy controller | -- | 0.08 | Left
CE-Fuzzy controller | 9.8207 | 0.08 | Left
Non-Optimized Fuzzy controller | -- | 0.1 | Left
CE-Fuzzy controller | 11.3606 | 0.1 | Left
Non-Optimized Fuzzy controller | -- | 0.12 | Left
CE-Fuzzy controller | 9.4459 | 0.12 | Left
Non-Optimized Fuzzy controller | -- | 0.14 | Left
CE-Fuzzy controller | -- | 0.14 | Left
Non-Optimized Fuzzy controller | -- | 0.14 | Right
CE-Fuzzy controller | -- | 0.14 | Right
Non-Optimized Fuzzy controller | -- | 0.12 | Right
CE-Fuzzy controller | 10.3514 | 0.12 | Right
Non-Optimized Fuzzy controller | -- | 0.1 | Right
CE-Fuzzy controller | 11.4794 | 0.1 | Right
Non-Optimized Fuzzy controller | -- | 0.08 | Right
CE-Fuzzy controller | 10.5684 | 0.08 | Right
Non-Optimized Fuzzy controller | -- | 0.06 | Right
CE-Fuzzy controller | 8.1564 | 0.06 | Right
Non-Optimized Fuzzy controller | 12.7498 | 0.04 | Right
CE-Fuzzy controller | 8.6037 | 0.04 | Right
Non-Optimized Fuzzy controller | 7.1514 | 0.02 | Right
CE-Fuzzy controller | 6.3117 | 0.02 | Right

Table 6.4: Comparison between the non-optimized and the Cross-Entropy optimized Fuzzy controllers at different speeds.


Figure 6.18: Evolution of the error during a real test at 0.04 m/s forward speed using the non-optimized fuzzy controller. An RMSE of 9.0081 was obtained.

Figure 6.19: Evolution of the error during a real test at 0.04 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method. An RMSE of 5.271 was obtained, a reduction of more than 40%.

Figure 6.20: 2D reconstruction of the trajectory defined during a real test at 0.04 m/s forward speed using the non-optimized fuzzy controller.

the error evolution Figures.

Because of this slowness two different situations can happen: the aircraft hits the obstacle, or the trajectory deviates too much from the initial one. The latter is what happens in the next test, in which the speed doubles that of the previous one. Figure 6.22 shows how the non-optimized controller cannot keep the obstacle on the edge of the image and loses it. This can also be appreciated in Figure 6.23, in which the trajectory defined by the aircraft is totally different from the previous test. The evolution of the error finishes


Figure 6.21: 2D reconstruction of the trajectory defined during a real test at 0.04 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method.

when the detected obstacle is out of the image. In less than 1.5 seconds the controller loses the obstacle. However, the optimized controller finished this test successfully. Figure 6.24 shows the evolution of the error, and Figure 6.25 shows the movements of the aircraft to avoid the obstacle.

Figure 6.22: Evolution of the error during a real test at 0.08 m/s forward speed using the non-optimized fuzzy controller.

Figure 6.23: 2D reconstruction of the trajectory defined during a real test at 0.08 m/s forward speed using the non-optimized fuzzy controller.


Figure 6.24: Evolution of the error during a real test at 0.08 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method.

Figure 6.25: 2D reconstruction of the trajectory defined during a real test at 0.08 m/s forward speed using the fuzzy controller optimized with the Cross-Entropy method.

6.5. Membership Functions' sets and Rules' Weight optimization of Fuzzy controllers: Applied to UAV See and Avoid task.

This section presents the optimization of the location and size of the membership functions' sets and of the rules' weight values of a Fuzzy controller. As in the previous section, the controller commands the heading of an aircraft for the see-and-avoid task. Following the previously explained classification of Zhang, this optimization process has medium-size and microscopic effects on the controller's behavior. The membership functions were optimized first, and the resulting optimal controller was used for the rules' weight optimization. The controller optimization was done using Matlab: a simulator was implemented in Matlab-Simulink, and an implementation of the Cross-Entropy method focused on Fuzzy control optimization was developed in Matlab. Optimization processes using different cost functions were compared. More than 80 tests were done with a real aircraft, comparing the non-optimized controller and the optimal controllers obtained after the membership function and rules' weight optimizations. A large reduction in the number of rules was obtained.


6.5.1. Object detection and Positioning Estimation using QR codes

Visual awareness is achieved by using an onboard forward-looking camera. Images from the camera are sent for off-board processing to a laptop Ground Station (GS). The outcome of the visual processing (and the servoing commands) is then sent back to the vehicle using a local WiFi link. Detailed information on this GS software can be found in (Mellado-Bataller et al., 2012).

To detect and avoid the obstacle, an AR marker detection algorithm was used. This algorithm has a better response than the color detection used in previous works (Olivares-Mendez et al., 2011), (Olivares-Mendez et al., 2012b), and illumination changes do not affect it much. Extra information about the orientation of the obstacle and a distance estimation were also obtained using the AR marker. The algorithm used in the real tests is based on an augmented reality library built on OpenCV named ArUco (ArU, 2012), which encodes the marker with a kind of improved Hamming code with error detection. Using the AR marker detection algorithm, we track and estimate the center of the marker region that describes the object. Figure 6.26 shows an example of the tracking process on a known AR marker. Using the location of the target in the image, we generate desired yaw commands (while keeping the forward velocity constant), which in turn modify the trajectory of the vehicle in order to keep the object at a constant relative bearing.
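As a rough illustration of this step, the sketch below detects a marker and converts its horizontal offset in the image into a bearing error in degrees, which is what the heading controller consumes. It assumes the classic opencv-contrib ArUco API (pre-4.7); the dictionary choice and the camera field of view are assumed placeholders, not values taken from the thesis.

```python
import cv2

# Hedged sketch: detect an ArUco marker and estimate the horizontal
# bearing of its centre relative to the image centre, in degrees.
HFOV_DEG = 60.0  # assumed horizontal field of view of the forward camera

DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_ARUCO_ORIGINAL)

def marker_bearing(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is None:
        return None                        # marker lost: no error information
    center_x = corners[0][0][:, 0].mean()  # mean x of the 4 marker corners
    half_width = gray.shape[1] / 2.0
    # Linear pixel-to-angle mapping: 0 degrees at the image centre.
    return (center_x - half_width) / half_width * (HFOV_DEG / 2.0)
```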

Figure 6.26: Detection of an AR marker using the ArUco software.

Figure 6.27 shows the see and avoid task in four images.

6.5.2. Membership Functions and Rules Optimized Fuzzy control of Quadrotors' Heading for See and Avoid

To control the orientation of the aircraft a Fuzzy controller was designed. We decided to use a controller based on Fuzzy logic because of the non-linearity of


Figure 6.27: See-and-avoid task for UAV: (a) detection, (b) avoiding (1), (c) avoiding (2), (d) overtaking.

the system and the highly dynamic environment. Our extensive experience with this kind of controller was another good reason to use this Soft-computing technique. As with the previously developed controllers, this controller was implemented using MOFS (Miguel Olivares' Fuzzy Software).

The controller is a PID-like controller, so it has three inputs. The first one is the error, in degrees, of the estimated angle between the reference point on one side of the image, the object, and the aircraft. The other inputs are the derivative of this angle estimate and its integral over time. The output is a command in degrees per second to change the heading of the aircraft. The heading FLC has a symmetric definition of the inputs, the output, and the rule base, because the avoiding task is identical whether the obstacle is avoided on the right side or on the left side. The aircraft also has a symmetric design, with the same behavior for left and right heading movements.
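A minimal sketch of how the three PID-like inputs could be assembled from the measured bearing error in each frame follows; the sample time DT is an assumed value tied to the image-processing rate, which this section does not specify.

```python
# Minimal sketch: build the error / derivative / integral inputs of the
# heading FLC from the bearing error measured in each processed image.
DT = 1.0 / 15.0  # assumed image-processing period, not stated in the text

class HeadingFLCInputs:
    def __init__(self):
        self.prev_error = 0.0
        self.integral = 0.0

    def update(self, error_deg):
        derivative = (error_deg - self.prev_error) / DT
        self.integral += error_deg * DT
        self.prev_error = error_deg
        return error_deg, derivative, self.integral
```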

Figure 6.28 shows the initial definition of the inputs and the output. Each input has 5 sets and the output has 9 sets. The symmetry of the FLC implies that any modification of the right side of each variable (input or output) is also applied to


(a) First input membership functions, Yaw Error, without CE optimization

(b) Second input membership functions, Derivative of Yaw Error, without CE optimization

(c) Third input membership functions, Integral of Yaw Error, without CE optimization

(d) Output membership functions, Yaw Command, without CE optimization

Figure 6.28: Initial definition of the variables of the Fuzzy Logic Controller before any optimization.

the left side.

The rule base was designed using heuristic information based on expert knowledge. Each rule without CE optimization has a default weight equal to one; in that way every rule has the same importance and affects the FLC behavior in the same way. The three inputs of the controller imply that the rule base has a cube disposition of 5x5x5. In order to show the rule base, 5 tables of 5x5 are presented, each one related to one of the 5 linguistic values of the third variable, the integral of the error. Table 6.5 shows the rule-base slice for the zero value of the error's integral over time. Table 6.6 shows the slice for the negative value, Table 6.7 for the big negative value, Table 6.8 for the positive value, and finally Table 6.9 for the big positive value.

Dot/error | Big Left | Left | Zero | Right | Big Right
Big Negative | Great Left | Big Left | Left | Little Left | Zero
Negative | Big Left | Left | Little Left | Zero | Little Right
Zero | Left | Little Left | Zero | Little Right | Right
Positive | Little Left | Zero | Little Right | Right | Big Right
Big Positive | Zero | Little Right | Right | Big Right | Great Right

Table 6.5: Base of rules with value for the third input (integral of the error) equal to zero, without CE optimization


Dot/error | Big Left | Left | Zero | Right | Big Right
Big Negative | Big Left | Left | Little Left | Zero | Little Right
Negative | Left | Little Left | Zero | Little Right | Right
Zero | Little Left | Zero | Little Right | Right | Big Right
Positive | Zero | Little Right | Right | Big Right | Great Right
Big Positive | Little Right | Right | Big Right | Great Right | Great Right

Table 6.6: Base of rules with value for the third input (integral of the error) equal to negative, without CE optimization

Dot/error | Big Left | Left | Zero | Right | Big Right
Big Negative | Left | Little Left | Zero | Little Right | Right
Negative | Little Left | Zero | Little Right | Right | Big Right
Zero | Zero | Little Right | Right | Big Right | Great Right
Positive | Little Right | Right | Big Right | Great Right | Great Right
Big Positive | Right | Big Right | Great Right | Great Right | Great Right

Table 6.7: Base of rules with value for the third input (integral of the error) equal to big negative, without CE optimization

Dot/error | Big Left | Left | Zero | Right | Big Right
Big Negative | Great Left | Great Left | Big Left | Left | Little Left
Negative | Great Left | Big Left | Left | Little Left | Zero
Zero | Big Left | Left | Little Left | Zero | Little Right
Positive | Left | Little Left | Zero | Little Right | Right
Big Positive | Little Left | Zero | Little Right | Right | Big Right

Table 6.8: Base of rules with value for the third input (integral of the error) equal to positive, without CE optimization

Dot/error | Big Left | Left | Zero | Right | Big Right
Big Negative | Great Left | Great Left | Great Left | Big Left | Left
Negative | Great Left | Great Left | Big Left | Left | Little Left
Zero | Great Left | Big Left | Left | Little Left | Zero
Positive | Big Left | Left | Little Left | Zero | Little Right
Big Positive | Left | Little Left | Zero | Little Right | Right

Table 6.9: Base of rules with value for the third input (integral of the error) equal to big positive, without CE optimization

The product t-norm is used for rule conjunction. Since the rule weights will be optimized with the CE method, the defuzzification method used in this approach is a modification of the Height Weighted method: we introduce the value of the weight assigned to each rule into the defuzzification process. Equation 6.8 shows the defuzzification method.

\[
y = \frac{\sum_{l=1}^{M} y^{l}\, w_{l} \prod_{i=1}^{N} \mu_{x_{i}^{l}}(x_{i})}
         {\sum_{l=1}^{M} w_{l} \prod_{i=1}^{N} \mu_{x_{i}^{l}}(x_{i})}
\qquad (6.8)
\]

where $N$ and $M$ represent the number of input variables and the total number of rules, respectively; $\mu_{x_{i}^{l}}$ denotes the membership function of the $l$th rule for the $i$th input variable; $y^{l}$ represents the output of the $l$th rule; and $w_{l}$ corresponds to the weight of the $l$th rule, which can take values from 0 to 1.
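A compact numerical sketch of Equation 6.8 is given below, assuming each rule stores its crisp output, its weight, and the membership degrees of the N inputs; it is only an illustration of the weighted defuzzification, not the MOFS implementation.

```python
import numpy as np

# Sketch of the weighted height defuzzification of Eq. 6.8.
# y[l]: crisp output of rule l; w[l]: weight of rule l in [0, 1];
# mu[l][i]: membership degree of input i under rule l.
def defuzzify(y, w, mu):
    y = np.asarray(y, dtype=float)        # shape (M,)
    w = np.asarray(w, dtype=float)        # shape (M,)
    mu = np.asarray(mu, dtype=float)      # shape (M, N)
    activation = w * np.prod(mu, axis=1)  # product t-norm scaled by the weight
    return float(np.sum(y * activation) / np.sum(activation))
```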


6.5.3. Optimization process using the Matlab-Simulink simulation environment

6.5.3.1. Software Implementation

This section presents in detail the simulator used to optimize the heading Fuzzy controller, as well as the optimization process itself. A simulation environment was defined to test each controller generated by the Cross-Entropy method.

The optimization process is done using Matlab Simulink. A structure made of a set of Simulink blocks was implemented. The main block of this structure is the quadcopter model, based on the work by Bouabdallah in (Bouabdallah et al., 2004), (Bouabdallah, 2007), with some modifications applied by Pestana et al. as shown in (Pestana et al., 2012). For this work other blocks were added: the flying object to avoid, a virtual camera, the fuzzy logic controller, and the flowchart that manages the tests.

Figure 6.29 shows the UAV model block. The inputs of this block are the different speed commands of the vehicle. The first one is the UAV pitch command; for this work a constant speed of 0.3 m/s was selected for all the tests. The roll and altitude speed inputs were set to 0, because for the presented avoiding task only the modification of the aircraft heading was required. The fourth input is the yaw command generated by the FLC. This signal is filtered with a Zero-Order Hold to give one command to the UAV per processed image. The block's outputs are the current 3D position and the current pitch, roll, and yaw of the quadcopter.

Figure 6.29: Quadcopter’s Matlab Simulink block.

Figure 6.30 shows the implemented block for the flying object to avoid. The inputs of this block are its initial position in (x, y, z), its linear and angular speeds, and its orientation. An initial position of (10, 0, 1) was set for all tests. This block outputs the current 3D position and yaw of the flying obstacle to avoid. In the optimization phase no speed commands were sent to the object, which in this case is therefore a static flying object. In the real tests an augmented reality code detection was used, and one of these codes plays the role of this block.

The quadcopter has an onboard forward camera. The information obtained from this camera is used to detect the object and to feed the Fuzzy logic controller. Figure 6.31 shows the onboard virtual camera block implemented in Matlab Simulink. This block gets as inputs the current position of


Figure 6.30: Matlab Simulink block of the flying object to avoid.

the object to avoid and that of the quadcopter. The output is the horizontal angle, measured at the camera position, between the center of the image and the position of the object inside the image. This information is transformed to degrees to be used by the FLC.

Figure 6.31: Onboard UAV camera’s Matlab Simulink block.

Figure 6.32: Fuzzy Logic Controller Matlab Simulink block.

Figure 6.33: Seeing and Avoiding Task Chart module.

Figure 6.32 represents the Fuzzy logic controller block. As previously


Figure 6.34: Real-time dynamic training process in Matlab Simulink

mentioned, the heading FLC has 3 inputs and 1 output. The inputs are the error, the derivative of the error, and its integral. The FLC generates the yaw command sent to the UAV to avoid the collision with the obstacle. A flag was included to determine exactly when the collision avoidance task starts and stops. This flag is controlled by the flowchart implemented to manage the simulation, presented in Figure 6.33. This block has as inputs the distance between the quadcopter and the obstacle, the yaw orientation, the estimated angle, and the reference. The reference represents the side on which to avoid the obstacle. Detailed information on the process inside this flowchart is shown in Figure 6.34. To manage the tests, four states were defined. The first one is the initialization of the test, in which no control is applied. The second one is the Avoiding Obstacle state, in which the controller works to avoid the obstacle, as commanded by the variable ActivateControl = 1; the controller input testError gets the value of the error previously calculated using the virtual camera. From this state two possible transitions were defined. The first is Object Lost, activated when the object leaves the visual field of the camera (errorInfo < -30.0 degrees or errorInfo > 30.0 degrees); in this state the controller is deactivated and a big constant value is assigned to the error of the test to penalize the controller behavior. The second transition of the Avoiding Obstacle state is the end of the test, activated when the quadrotor is less than 2 meters from the obstacle and its current yaw is very small (> -0.05 and < 0.05 degrees); in this state the controller is deactivated and the error is set to 0 to reward the controller behavior. Such yaw values allow the quadrotor to fly ahead without colliding with the object.
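The transition logic of this flowchart can be summarised in a short sketch. The thresholds (±30 degrees of visual field, 2 m end distance, ±0.05 yaw) are those given in the text; the penalty constant is an assumed stand-in for the "big constant value" mentioned above.

```python
# Sketch of the four-state test manager described above.
PENALTY = 1000.0  # assumed stand-in for the "big constant value"

def step(state, error_deg, distance_m, yaw_deg):
    """Returns (next_state, test_error, controller_active)."""
    if state == "INIT":
        return "AVOIDING", 0.0, False          # initialization: no control yet
    if state == "AVOIDING":
        if abs(error_deg) > 30.0:              # object left the camera field
            return "LOST", PENALTY, False      # penalize this controller
        if distance_m < 2.0 and abs(yaw_deg) < 0.05:
            return "DONE", 0.0, False          # reward: test ends successfully
        return "AVOIDING", error_deg, True     # ActivateControl = 1
    return state, 0.0, False                   # LOST / DONE are terminal
```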

Outside of Simulink, a Matlab function was implemented to carry out the Cross-Entropy optimization process. At the beginning of the process this function sets the initial parameters for the Cross-Entropy optimization method. Furthermore, this function generates a set of controllers and selects the controller to test. Once all the controllers of the current iteration have been tested, this function updates the optimization parameters to generate another set of controllers to test. Another Matlab function was implemented to manage the optimization process and the Simulink structure, what in a programming environment is usually called the main program. This function sets the initial values for the simulation process and is responsible for iteratively executing the simulation of each CE-generated controller.
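The generate-test-update cycle carried out by these two functions follows the generic Cross-Entropy pattern. The sketch below reproduces that pattern under the settings stated later in this section (N = 100 samples, Nelite = 5 elite controllers per iteration); evaluate_controller is a hypothetical stand-in for one Simulink run returning the cost of a parameter vector, and the smoothed parameter update is omitted for brevity.

```python
import numpy as np

# Schematic Cross-Entropy loop: sample controllers from Gaussian pdfs,
# keep the elite, refit the pdfs, repeat. evaluate_controller is a
# hypothetical stand-in for one simulation run returning a cost (e.g. ITSE).
def cross_entropy(mean, sigma, evaluate_controller,
                  n_samples=100, n_elite=5, iterations=125):
    best_params, best_cost = None, np.inf
    for _ in range(iterations):
        samples = np.random.normal(mean, sigma,
                                   size=(n_samples, mean.size))
        costs = np.array([evaluate_controller(s) for s in samples])
        if costs.min() < best_cost:
            best_cost = costs.min()
            best_params = samples[np.argmin(costs)]
        elite = samples[np.argsort(costs)[:n_elite]]
        mean, sigma = elite.mean(axis=0), elite.std(axis=0)  # refit the pdfs
    return best_params, best_cost
```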

Figure 6.35: Interaction between the three parts of the Matlab process: the Cross-Entropy optimization method, the Simulink block structure, and the main process that manages the evolution of the simulations.

Figure 6.35 shows the Matlab structure of the implemented software. The blue box represents all the Simulink blocks explained before. The green box represents the implemented function responsible for the optimization process using the Cross-Entropy method. Finally, the orange box is the main program that manages each simulation done by the Simulink box.

Figure 6.36: Representation of the software implementation in a flowchart.

Figure 6.36 shows the software implementation as a flowchart. As in Figure 6.35, the orange color is used to represent what the main process does, the green boxes and arrows show what the Cross-Entropy method performs, and the blue box represents the simulation process executed in Simulink.


6.5.3.2. Simulator Experiments

To avoid the obstacle we follow the strategy of keeping it on one of the two sides of the image until it is close enough that it can be overtaken without being hit. At the beginning of the test the object is in front of the quadcopter. The same constant pitch speed of 0.3 m/s was sent during the whole test. The test is divided into three parts: flying towards the object, avoiding it, and overtaking it. Once the test starts the controller must keep the object in the center of the virtual image of the forward camera. After 1 meter the heading controller must try to keep the obstacle on the right side of the virtual image; this change of reference corresponds to a 25-degree step. The avoiding phase of the test finishes when the aircraft is closer than 2 meters to the obstacle. After that no heading commands are sent, and the test concludes with the aircraft overtaking the obstacle, crashing, or going far away. The full test length is 6 seconds. The heading modifications imply changes in the trajectory of the UAV, because of the constant pitch speed command and the zero roll command.

The optimization process is done using Matlab Simulink, as explained in section 6.5.3. This process can be divided into three phases, listed below. The first two phases tune the position and size of the membership functions' sets, affecting all the inputs and the output of the controller. As mentioned previously, the modification of the membership functions causes medium-size effects on the Fuzzy logic controller behavior. In the first phase of the membership function optimization we try to minimize the time response of the system; the duration of the test was reduced by 66% and is called the short-length test (2 seconds). In the second phase the tests were done using the full time length (full-length test), trying to optimize the steady-state error. After these optimization phases, a modification of the weight of each rule was done, with weight values defined from 0 to 1. The rules' weight modification causes minimal effects on the controller, because a change in the weight of one rule affects only a very restricted part of the controller's behavior. For this phase the full time length of the test was also used.

Phase 1: Optimization of the membership function sets in a short-length test. The time of the test is reduced to improve the quick response of the system. The obstacle is not overtaken in this phase.

Phase 2: Optimization of the membership function sets in a full-length test.

Phase 3: Optimization of the rules' weights in a full-length test.

The initial parameters of the Cross-Entropy optimization are presented next. In each iteration of the optimization process a set of 100 (N) controllers was generated based on the probability density functions. Only the 5 (Nelite) best controllers were used to update each pdf at each iteration of the optimization process. In the first optimization phase, the mean and the sigma of the probability density function of each membership function's sets depend on the number of


sets of the variable and on its range. Both were set before the optimization process starts and are fixed during it. The initial position of the sets of the membership functions is defined by dividing the total range of the variable by the number of sets, so that each set has the same size, as shown in Figure 6.28. The initial mean value is 15 for the first input and 5 for the second and the third ones. The initial mean values for the output are 22.5, 45, and 62.5. The initial sigma depends on the size of the set, being 2/3 of the set size. The update of the mean and the sigma values uses some smoothing parameters based on (Botev and Kroese, 2004b). The values of these parameters are q = 2, β = 0.92, η(0) = 0, α(0) = 0. Those values are based on values reported in (Olivares-Mendez et al., 2013a) and (E.Haber et al., 2010).
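For reference, a common form of the dynamically smoothed update that these parameters refer to, following the scheme of Kroese and co-authors, is given below; the hatted quantities are the values refitted from the elite samples at iteration $t$, and the exact variant used in this work is in the cited references:

\[
\mu_t = \alpha\,\hat{\mu}_t + (1-\alpha)\,\mu_{t-1},
\qquad
\sigma_t = \beta_t\,\hat{\sigma}_t + (1-\beta_t)\,\sigma_{t-1},
\qquad
\beta_t = \beta - \beta\left(1-\frac{1}{t}\right)^{q}
\]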

To evaluate the performance of each controller's test, two kinds of objective functions were tried. The first one is the Root Mean Square Error (RMSE) and the second one is the Integral Time Square Error (ITSE). A full optimization of the Fuzzy controller (membership functions and rules' weights) was done with each. Table 6.10 presents the results of the optimization done using the RMSE error estimator. The results of the simulation of an avoiding task keeping the object at 25 degrees on the right side of the image, using the best controller of each RMSE optimization phase, are shown in Figure 6.37.
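These two cost functions are standard; over a test of duration $T$ with tracking error $e(t)$ they take the textbook form (the thesis does not restate them, so the expressions below are the usual definitions):

\[
\mathrm{RMSE} = \sqrt{\frac{1}{T}\int_{0}^{T} e(t)^{2}\,dt},
\qquad
\mathrm{ITSE} = \int_{0}^{T} t\,e(t)^{2}\,dt
\]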

Table 6.10: Optimization using RMSE.

Optimization | Objective function | Iterations | Winner iteration | Initial sigma | Final sigma | ITSE value | RMSE value (degrees)
MF1 (MF short length) | ITSE | 200 | 142 | 4.2792 | 0.0298 | 100840 | 5.306
MF2 (MF full length) | RMSE | 125 | 26 | 0.298 | 0.0215 | 40658 | 5.039
Rules1 (Rules full length) | RMSE | 300 | 285 | 0.32 | 0.0611 | 35759 | 4.73

Figure 6.37: 25-degree step response of the best controller of each optimization phase using the RMSE error estimator (Table 6.10).

The results of the optimization done using the ITSE objective function are presented in Table 6.11. The first part of the optimization is common to both optimization processes. As for the previous optimization process, Figure


6.38 shows the response to the same type of test for the best controller of each ITSE optimization phase.

Table 6.11: Optimization using ITSE.

Optimization | Objective function | Iterations | Winner iteration | Initial sigma | Final sigma | ITSE value | RMSE value (degrees)
MF1 (MF short length) | ITSE | 200 | 142 | 4.2792 | 0.0298 | 100840 | 5.306
MF3 (MF full length) | ITSE | 125 | 99 | 0.298 | 0.0153 | 36281 | 5.1588
Rules2 (Rules full length) | ITSE | 300 | 222 | 0.32 | 0.0711 | 22692 | 4.73

Figure 6.38: 25-degree step response of the best controller of each optimization phase using the ITSE error estimator (Table 6.11).

In both Figures 6.37 and 6.38 the black line represents the evolution of the error of the non-optimized controller. The first optimization phase is also common to both Figures: MF1 is the short-time optimization phase used to optimize the response time of the controller. The controller optimized in this phase reduces the response time, but exhibits a big oscillation in the steady state; the response oscillates too much to use this controller on a real UAV. In order to reduce this oscillation two kinds of optimization were done: MF2, an optimization of the membership functions using the RMSE objective function; and MF3, an optimization of the membership functions of each variable as well, but using the ITSE objective function. For these two optimization phases the values obtained in the previous phase are used as initial means. The sigma value was initialized with the sigma obtained in the previous optimization phase, multiplied by 10. Both optimizations reduce the oscillation of the previous optimization phase, with the ITSE optimization achieving a slightly bigger reduction.

As previously mentioned, the final phase is the optimization of the rules' weight. In this case too, the optimization was done using the two objective functions (RMSE and ITSE). Figure 6.37 shows the result of the optimization with RMSE, and Figure 6.38 shows the result of the ITSE optimization. In both cases the response of the best controller is represented by the red line. The response of the controller trained using the ITSE error estimator is notably better


than the one obtained with the RMSE. The ITSE-optimized controller not only reduces the response oscillation to zero but also reduces the time response of the system. These results are not clearly visible in Tables 6.10 and 6.11, but they show clearly in the response Figures.

The most relevant results obtained during the different optimization phases are shown next. First, the membership optimization corresponding to MF1 + MF3 (MF13) is presented; as previously mentioned, this is the phase in which the membership functions are optimized over the full length of the test using the ITSE objective function. After that, the results obtained after the rules' weight optimization using the ITSE objective function are presented (MF1 + MF3 + rules' weight (ITSE)).

Membership functions optimization. Figure 6.39 shows the control loop during the membership functions optimization phase. In this Figure it can be seen that the error information is measured using the ITSE objective function. This information is given to the Cross-Entropy optimization method, which changes the locations of the sets in the membership functions of the inputs and the output. The dotted line indicates that the modification is not done in real time during the test: as previously mentioned, the membership function sets' values are modified after all the generated controllers have been tested.

Figure 6.39: Control loop of the Cross-Entropy method for the membership functions optimization.

The evolution of each value to optimize can be visualized through the mean and the sigma associated with each membership function. Both values define the probability density function used to generate the N controllers at each iteration. Figure 6.40 shows the evolution of the pdf of the value to optimize for the different inputs. The first input is shown in Figure 6.40(a), the second in Figure 6.40(b), and finally the evolution of the third input is represented in


(a) Evolution of the probability density function for the first input of the Fuzzy logic controller.

(b) Evolution of the probability density function for the second input of the Fuzzy logic controller.

(c) Evolution of the probability density function for the third input of the Fuzzy logic controller.

Figure 6.40: Evolution of the probability density functions of the inputs' membership functions.

Figure 6.40(c).

As for the input variables, the three sets to optimize for the output variable are presented in Figure 6.41. Figure 6.41(a) shows the evolution for the first output's set, the second one is shown in Figure 6.41(b), and Figure 6.41(c) shows the evolution of the probability density function of the third output's set. The values of the output were divided by 10 in the optimization process.

The mean and the sigma can also be shown separately to check the system's evolution. Figure 6.42 shows the evolution of the sigma in this optimization phase: Figure 6.42(a) shows the evolution of the sigma value of each input, and Figure 6.42(b) shows the evolution of the sigma value of the three sets of the output variable. In both Figures it is possible to appreciate that the big


(a) Evolution of the probability density function for the first output's set membership function.

(b) Evolution of the probability density function for the second output's set membership function.

(c) Evolution of the probability density function for the third output's set membership function.

Figure 6.41: Evolution of the probability density function of the output's membership functions.

reduction of the value is done in the first 30 iterations.

Figure 6.43 shows the evolution of the best controller's values for the inputs and the output. As shown in Figure 6.42, it is in the first 30 iterations when the values change the most, after which a fine optimization of these values takes place.

The evolution of the ITSE value for this optimization phase is presented in Figure 6.44, where it is possible to appreciate that the system achieved the biggest reduction of the error in just 20 iterations. This is consistent with the evolution of the sigma and mean values in Figures 6.42 and 6.43. In this optimization the lowest value was obtained at the 99th iteration.


(a) Evolution of the sigma value of the 3 input sets to optimize.

(b) Evolution of the sigma value of the 3 sets of the output.

Figure 6.42: Evolution of the sigma value for the membership functions optimization phase.

(a) Evolution of the values of the 3 input sets during the optimization process.

(b) Evolution of the values of the 3 sets of the output during the optimization process.

Figure 6.43: Evolution of the values for the membership functions optimization phase.


Figure 6.44: ITSE evolution during the first part of the optimization process.

Figure 6.45 shows the Fuzzy logic controller after the two optimization phases of the membership functions. Here it can be appreciated how the positions of the sets were changed. Furthermore, in the second and third inputs (Figures 6.45(b), 6.45(c)) two sets were totally canceled. The output and the first input (Figures 6.45(a), 6.45(d)) keep the same number of sets as in the non-optimized controller, but with a totally different disposition. The cancellation of the sets in inputs 2 and 3 affects the size of the rule base: in this case a reduction of 64% is obtained. At the beginning the Fuzzy controller's rule base was composed of 125 rules (Figure 6.28); after the membership functions' optimization it has 45 rules.

Table 6.12 shows the final values for each membership function of the different inputs and the output after the different optimization phases. As shown in Figure 6.45, in some cases the new position of the set produces a set with a very reduced size. This table also presents the sets canceled and the size of the rule base after each membership optimization phase.

Variable | Set number | Initial value | After MF1 opt | After MF12 opt | After MF13 opt
input 1 | 1 | 15.0 | 0.9126 | 2.6241 | 3.1850
input 2 | 1 | 5 | 8.152 | 9.8066 | 9.9948
input 3 | 1 | 5 | 9.9877 | 9.9804 | 9.3674
output 1 | 1 | 22.5 | 2.31 | 6.19 | 15.54
output 1 | 2 | 45.0 | 67.34 | 66.99 | 56.40
output 1 | 3 | 67.0 | 79.38 | 83.22 | 74.74
sets canceled | | | input 3, output 1.3 | input 2, input 3 | input 2, input 3
number of rules | | 125 | 75 | 45 | 45

Table 6.12: Optimization results for the membership functions, presenting the values of the sets' locations of the different membership functions after each optimization phase.


(a) PID-fuzzy controller: membership function of the first input, Yaw Error, with CE optimization

(b) PID-fuzzy controller: membership function of the second input, Derivative of Yaw Error, with CE optimization

(c) PID-fuzzy controller: membership function of the third input, Integral of Yaw Error, with CE optimization

(d) PID-fuzzy controller: membership function of the output, Yaw Command, with CE optimization

Figure 6.45: Variable definition of the Fuzzy logic controller after the two optimization phases of the membership functions.

Rules' weight optimization. The last optimization phase is the tuning of each rule's weight. Figure 6.46 shows the control loop for this phase. As in Figure 6.39, the dotted line indicates that the Cross-Entropy method does not act in real time: it just selects the controller to use in each test and updates the parameters to generate the next iteration's set of N controllers based on the ITSE information of the current iteration. After the rule base's size reduction, the dimension of the problem had already dropped from 125 to 45. Furthermore, as mentioned before, the controller is symmetric, and this also affects the rule base.


Thanks to this symmetry, only 50% + 1 of the rules must be optimized; the other 50% − 1 take the same value as their symmetric counterparts. Finally, the dimension of this problem is reduced to 23 values to optimize (the 45-rule base splits into 22 symmetric pairs plus the central rule).

Figure 6.46: Control loop of the Cross-Entropy method for the rules' weight optimization.

As previously mentioned, after the optimization of the membership functions the rules' weights have to be optimized. As in the previous optimization phase, the evolution of the pdfs is presented here. In this case the number of rules to optimize is 23, so it is not possible to show the evolution of all the pdfs; the evolution of three of these rules, numbers 7, 13, and 21, is presented instead. Figures 6.47(a), 6.47(b), and 6.47(c) show the evolution of the pdfs for rules 7, 13, and 21, respectively.

The evolution of the mean of the sigma values of the 23 rules' weights is presented in Figure 6.48(a). The evolution of the sigma of the three selected rules is shown in Figure 6.48(b).

The evolution of the best controller's value of the selected rules is shown in Figure 6.49.

The evolution of the ITSE value during the optimization process is shown in Figure 6.50. As in the previous optimization phase, the big reduction of the ITSE value was obtained in the first 30 iterations. In this case a further notable reduction of this objective function is obtained after 200 iterations; the lowest value, i.e. the best controller, is obtained at iteration number 222.

Finally, Tables 6.13, 6.14, and 6.15 show the final value of the weight of each rule. It can be appreciated that in this optimization phase some rules have a final weight equal to 0 or below 0.01; these rules were canceled too. In this case 14 rules were canceled. The final controller has 31 rules, which is a reduction of 75.2%


(a) Evolution of the probability density function for the Rule 7 weight.

(b) Evolution of the probability density function for the Rule 13 weight.

(c) Evolution of the probability density function for the Rule 21 weight.

Figure 6.47: Evolution of the probability density function of three selected rules' weights.

from the initial controller.

Dot/error | Big Left | Left | Zero | Right | Big Right
Negative | L (0.3879) | LL (0.2291) | Z (0.1201) | LR (0.2289) | R (0.9734)
Zero | LL (0.4700) | Z (0.0805) | LR (0.0) | R (0.0) | BR (0.7584)
Positive | Z (0.4679) | LR (0.0) | R (0.0076) | BR (0.0673) | GR (0.7808)

Table 6.13: Base of rules with value for the third input (integral of the error) equal to negative, after CE optimization of the membership functions.

Dot/error | Big Left | Left | Zero | Right | Big Right
Negative | BL (0.1833) | L (0.0) | LL (0.0298) | Z (0.0) | LR (0.0722)
Zero | L (0.0) | LL (0.0831) | Z (0.1292) | LR (0.0831) | R (0.0)
Positive | LL (0.0722) | Z (0.0) | LR (0.0298) | R (0.0) | BR (0.1833)

Table 6.14: Base of rules with value for the third input (integral of the error) equal to zero, after CE optimization of the membership functions.


(a) Evolution of the mean of the sigma values of all the rules' weights to optimize.

(b) Evolution of the sigma value of the three selected rules' weights.

Figure 6.48: Evolution of the sigma value for the selected rules and the mean over all the rules.

Figure 6.49: Evolution of the values of the selected rules during the optimization process.

Figure 6.50: ITSE evolution during the rules' weight optimization phase.

Dot/error | Big Left | Left | Zero | Right | Big Right
Negative | GL (0.7808) | BL (0.0673) | L (0.0076) | LL (0.0) | Z (0.4679)
Zero | BL (0.7584) | L (0.0) | LL (0.0) | Z (0.0805) | LR (0.4700)
Positive | L (0.9734) | LL (0.2289) | Z (0.1201) | LR (0.2291) | R (0.3879)

Table 6.15: Base of rules with value for the third input (integral of the error) equal to positive, after CE optimization of the membership functions.


6.5.4. Experiments

After all the optimization processes were finalized in the simulator, we had to compare some of the optimized controllers on a real quadcopter. The aerial vehicle used in this work is the Parrot AR.Drone (par, 2010) with our own software routines developed for this purpose (Mellado-Bataller et al., 2012). As previously mentioned in section 6.5.1, an ArUco marker was used as the object to avoid. As in the simulated tests, after the ignition of the motors the FLC has to keep the obstacle in the center of the image for the first seconds of the test. After that the avoiding task starts and a reference is added to the location of the code in the image; in this way the FLC passes from keeping the obstacle in the center of the image to keeping it on the right side. The heading FLC works in this way until the obstacle is very close. By this moment the FLC commands must have been enough to place the quadcopter at a safe distance and with a correct orientation to overtake the obstacle without colliding with it, as shown in Figure 6.51. Figure 6.52 shows the onboard images of the different states of the collision avoidance task. In these Figures it is possible to appreciate the code detection using the ArUco visual algorithm.

Figure 6.51: Description of the avoiding task test.

Figure 6.53 shows the control loop used for the real tests. In this loop there is no optimization method modifying the controller; instead, the optimal controllers obtained in the different phases of the two optimization processes are used.

In order to check the changes in the behavior of the controller along the optimization phases, a big set of tests was done. Here are presented the results of


(a) Initial detection of the obstacle to avoid, corresponding to the first state of Figure 6.51, in which the obstacle is centered in the image.

(b) Starting point of the avoiding task, corresponding to the second state of Figure 6.51, in which the obstacle must be located on the right side of the image.

(c) Close to the end of the avoiding task, corresponding to the third state of Figure 6.51, in which the quadrotor is getting close to the obstacle to avoid.

(d) Overtaking the obstacle to avoid, corresponding to the fourth state of Figure 6.51, in which the quadrotor is close enough to the obstacle and the current heading of the aircraft is adequate to overtake the obstacle without colliding and without a big change from the initial trajectory.

Figure 6.52: Significant processed images from the avoiding task states described in Figure 6.51.

Figure 6.53: Control loop for the real tests.


84 tests. Three controllers were tested, with 28 tests per controller. Seven different pitch speeds were selected for these tests, from 0.2 to 1.4 m/s; the pitch speed is constant during each test. In order to have more meaningful results, the speed used in the optimization phase (0.3 m/s) was not used. The three selected controllers are the non-optimized one (Non-Opt); the optimal controller resulting from the optimization phase MF13 (Half-Opt), in which two optimizations were done to improve the location of the membership functions' sets using the ITSE objective function; and the one obtained after MF13 + Rules2 (Full-Opt), in which the MF13 optimization was followed by an optimization of the rules' weights, as explained in detail in the previous subsection. Table 6.16 shows the results of the 84 tests of the three controllers. For each pitch speed, 4 tests were done with each controller. The last column of this table shows the mean value of the 4 tests of each controller, making it easier to compare the results for each speed. When the quadcopter was not able to overtake the obstacle, hitting it or going far away, the symbol "-" is included. When at least 50% of the tests were not successful, no mean was calculated.

The most significant tests are presented next. One test of each controller and each speed is shown in the following Figures. Figure 6.54 shows the results of the three controllers at a pitch speed of 0.2 m/s. In this case the three controllers overtake the obstacle successfully, without colliding with it. The red line represents the change of reference in the image frame. At the beginning of the test the controller has to keep the obstacle in the center of the image (reference equal to 0). Next, the reference changes to 20 to keep the obstacle on the right side of the image. Finally, the step reference returns to 0 when no control is active, during the overtake phase of the task. The higher the speed, the shorter the step. Comparing with Figure 6.38, it is possible to appreciate that the response of the different controllers in the real tests is quite similar to the simulated one. The non-optimized controller response shown in Figure 6.54(a) is similar to the black line of Figure 6.38. The response of the half-optimized controller shown in Figure 6.54(b) is similar to the blue line of Figure 6.38, corresponding to the MF1 + MF3 (ITSE) optimization. Finally, the response in Figure 6.54(c) shows a big reduction of the oscillation, like the red line in Figure 6.38, this last controller being the one with the best response, as shown in Table 6.16.

Figure 6.55 shows the response of the three controllers for a 0.4 m/s pitch speed. The results are quite similar to the previous test, but an increase of the response time can be appreciated in the responses of the non-optimized controller and the half-optimized controller, as shown in Figures 6.55(a) and 6.55(b). As in the previous test, the fully optimized controller has an excellent response, with a very small oscillation in the steady state of the test.

Figure 6.56 shows the results of two controllers for a 0.6 m/s pitch speed. The reduced duration of the test caused by this speed implies that no successful


Speed (m/s) | Type of controller | RMSE of 4 tests (degrees) | Mean
0.2 | Non-Opt | 7.2879, 7.0467, 10.8141, 6.8253 | 7.9935
0.2 | Half-Opt | 6.0440, 5.8709, 6.3980, 5.7861 | 6.0248
0.2 | Full-Opt | 5.6600, 5.2283, 5.4412, 5.6539 | 5.4959
0.4 | Non-Opt | 9.2584, 7.5010, 6.8580, 11.9268 | 8.8861
0.4 | Half-Opt | 8.2417, 5.5976, 6.7325, 8.3705 | 7.2356
0.4 | Full-Opt | 5.8313, 6.5556, 7.4925, 6.7448 | 6.6561
0.6 | Non-Opt | -, -, -, - | -
0.6 | Half-Opt | 9.8469, 8.0216, 7.7419, 9.3541 | 8.7411
0.6 | Full-Opt | 6.9058, 6.1400, 5.5173, 6.4143 | 6.2443
0.8 | Non-Opt | -, -, -, - | -
0.8 | Half-Opt | 13.3745, 11.1343, 9.8743, 11.8977 | 11.5702
0.8 | Full-Opt | 8.8235, 10.4854, 9.7342, 8.4219 | 9.3663
1.0 | Non-Opt | -, -, -, - | -
1.0 | Half-Opt | 9.5725, 11.0677, -, - | -
1.0 | Full-Opt | 7.9680, 7.8514, 8.1845, 10.7737 | 8.6944
1.2 | Non-Opt | -, -, -, - | -
1.2 | Half-Opt | -, -, -, - | -
1.2 | Full-Opt | 9.6387, 11.6800, 7.4338, 10.0082 | 9.6902
1.4 | Non-Opt | -, -, -, - | -
1.4 | Half-Opt | -, -, -, - | -
1.4 | Full-Opt | 10.8667, -, -, - | -

Table 6.16: Comparison between the non-optimized controller (Non-Opt) and two Cross-Entropy optimized controllers at different stages: one with the membership functions optimized (Half-Opt), and the other including both the membership functions and the rules' weight optimization (Full-Opt). The three controllers were tested at different speeds; four tests were done to check the behavior of each controller at each speed.

tests of the non-optimized controller could be obtained. In this case the half-optimized controller has a peak at the beginning of the reference change, caused by the difficulty this controller has in keeping the obstacle in the center of the image in the first part of the test. In this test an oscillation can also be appreciated when the controller tries to keep the obstacle at the reference point in the image, as shown in Figure 6.56(a). Figure 6.56(b) shows that the fully optimized controller has an excellent response with both reference values; a little oscillation can be appreciated at the middle of the test (time equal to 8 seconds).

Figure 6.57 shows the response of the half- and fully optimized controllers for a pitch speed equal to 0.8 m/s. Figure 6.57(a) shows how the half-optimized controller could not reduce the error to zero. The controller manages to make the quadcopter overtake the obstacle, but it does not have enough authority to keep a trajectory parallel to the initial one, drifting a little far away; the response keeps an error of 10 degrees from the reference value. Figure 6.57(b) shows the response of the fully


(a) Non Optimized controller with 0.2 m/s pitch speed.

(b) Half Optimized controller with 0.2 m/s pitch speed.

(c) Full Optimized controller with 0.2 m/s pitch speed.

Figure 6.54: Real test results of the three controllers for 0.2 m/s pitch speed.


Figure 6.58 shows the response of the half- and full-optimized controllers for a 1.0 m/s pitch speed. As in the previous test, the half-optimized controller does not have enough power to reduce the error to zero, as shown in Figure 6.58(a). At this speed, for the first time, the full-optimized controller could not reduce the error to zero either, as shown in Figure 6.58(b). The steady-state error at the end of the step is lower than 3 degrees, this response being better than the one shown in the previous test by the half-optimized controller (Figure 6.57(a)). As shown in Table 6.16, at this speed just 50% of the half-optimized tests finished successfully.

Figure 6.59 shows the response of the full-optimized controller for a 1.2 m/s pitch speed. For this speed, no successful tests of the half-optimized controller were obtained.


(a) Non Optimized controller with 0.4 m/s pitch speed.

(b) Half Optimized controller with 0.4 m/s pitch speed.

(c) Full Optimized controller with 0.4 m/s pitch speed.

Figure 6.55: Real test results of the three controllers for 0.4 m/s pitch speed.

In this case the steady-state error increased with respect to the previous test, but it continues to be smaller than in the half-optimized tests at 0.8 and 1.0 m/s. It has to be taken into account that this pitch speed is 4 times higher than the speed used in the optimization processes.

Finally, Figure 6.60 shows the only successful test of the full-optimized controller for a 1.4 m/s pitch speed. In this case the controller does not have the power to reduce the error even to 50% of the reference value, this response being equal to the one obtained with the half-optimized controller at 0.8 and 1.0 m/s.

The tests’ results presented in this section shows that the half optimized con-troller double the speed of the non optimized controller with four successful testswith a 0.8 ms pitch speed. A big reduction of the oscillation is obtained too. Butthis oscillation is only reduced to almost zero with the full optimized controller.The optimization of the rules’ weight accomplish four successful tests with an


(a) Half Optimized controller with 0.6 m/s pitch speed.

(b) Full Optimized controller with 0.6 m/s pitch speed.

Figure 6.56: Real test results of the two controllers for 0.6 m/s pitch speed.

(a) Half Optimized controller with 0.8 m/s pitch speed.

(b) Full Optimized controller with 0.8 m/s pitch speed.

Figure 6.57: Real test results of the two controllers for 0.8 m/s pitch speed.


The most relevant videos of these tests are shown in (CVG, 2012) and (You,2012).


(a) Half Optimized controller with 1.0 m/s pitch speed.

(b) Full Optimized controller with 1.0 m/s pitch speed.

Figure 6.58: Real test results of the two controllers for 1.0 m/s pitch speed.

Figure 6.59: Full Optimized controller with 1.2 m/s pitch speed.

Figure 6.60: Full Optimized controller with 1.4 m/s pitch speed.

6.6. Conclusions

In this chapter a novel approach of the Cross-Entropy optimization method to tune Fuzzy Logic controllers is presented. This method was motivated by an adaptive algorithm for estimating probabilities of rare events in


complex stochastic networks, which involves variance minimization. The CE method has been successfully applied to a number of difficult combinatorial optimization problems, including the maximal cut problem, the traveling salesman problem (TSP), the quadratic assignment problem, different types of scheduling problems, the clique problem, the buffer allocation problem (BAP) for production lines, and combinatorial optimization problems associated with the genome. Its uses in control are reduced to two works presented in the literature. A comparison with other optimization methods is out of the scope of this work, but this work contributes to the literature with a new method to adjust the gains, the size and location of the membership functions' sets, and the rules' weights of a Fuzzy controller.

Following the tuning definition and classification proposed by Zheng, this chapter presents a macroscopic optimization, and a medium-size and microscopic optimization of a Fuzzy PID-like controller. The macroscopic optimization is the modification of the scaling factors or input gains of the controllers. The medium-size optimization corresponds to the update of the size and location of the membership functions' sets. Finally, the microscopic optimization affects the rules of the controller; in the presented work, a modification of each rule's weight has been done. Both optimization processes were focused on optimizing the heading controller of an aerial vehicle for the collision avoidance task. Different simulation environments were used for each work.

The optimization of the scaling factors of the controller was done using ROS-Gazebo (Robot Operating System). An implementation of the Cross-Entropy software to optimize Fuzzy controllers created using MOFS was developed and included in the ROS environment. The optimization process was done using a fixed vehicle speed. The optimization method needed just 11 iterations to obtain results good enough to be used in a real environment. The low number of controllers tested, just 330, remarks the power of this optimization method. A big number of tests at different speed values were done to determine the improvement of the controller. The optimized controller could finish the avoidance task successfully in more situations than the non-optimized controller. The use of the Cross-Entropy method to optimize the controller allows doubling the speed of the non-optimized controller. The fastest test of the optimized controller was 4 times faster than the speed used to train it.
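For readers implementing a similar tuning loop, the following is a minimal C++ sketch of the Cross-Entropy update: candidate gain vectors are sampled from a Gaussian, each one is scored by a user-supplied cost (e.g. the RMSE or ITSE of a simulated avoidance run), and the Gaussian is refitted to the elite samples. The function and parameter names are illustrative assumptions, not the implementation used in this chapter.

// Illustrative Cross-Entropy tuning loop (not the thesis implementation).
#include <algorithm>
#include <cmath>
#include <functional>
#include <random>
#include <utility>
#include <vector>

void crossEntropyTune(std::vector<double>& mean, std::vector<double>& stddev,
                      const std::function<double(const std::vector<double>&)>& cost,
                      int iterations, int samples, int elite) {
    std::mt19937 rng(42);
    const std::size_t dim = mean.size();
    for (int it = 0; it < iterations; ++it) {
        // 1. Draw candidate parameter vectors from the current Gaussian.
        std::vector<std::pair<double, std::vector<double>>> population;
        for (int s = 0; s < samples; ++s) {
            std::vector<double> p(dim);
            for (std::size_t d = 0; d < dim; ++d) {
                std::normal_distribution<double> gauss(mean[d], stddev[d]);
                p[d] = gauss(rng);
            }
            population.emplace_back(cost(p), p);  // one simulated run per sample
        }
        // 2. Keep the elite (lowest-cost) candidates.
        std::sort(population.begin(), population.end(),
                  [](const auto& a, const auto& b) { return a.first < b.first; });
        // 3. Refit the Gaussian to the elite set.
        for (std::size_t d = 0; d < dim; ++d) {
            double m = 0.0, v = 0.0;
            for (int e = 0; e < elite; ++e) m += population[e].second[d];
            m /= elite;
            for (int e = 0; e < elite; ++e) {
                const double diff = population[e].second[d] - m;
                v += diff * diff;
            }
            mean[d] = m;
            stddev[d] = std::sqrt(v / elite);
        }
    }
}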

The optimization of the membership functions and the rules' weights was done using Matlab-Simulink. As in the previous work, an implementation of the Cross-Entropy method to optimize Fuzzy controllers was developed. In this case a full implementation was done, in which it is possible to select from one to three factors to optimize at the same time, as well as the sequence of the optimization process. A simulator was developed using the Matlab-Simulink environment, which involves a model of the quadrotor, the object to avoid, and a virtual camera. Two different objective functions (RMSE and ITSE) were used and compared for the optimization process. The most relevant optimal controllers obtained in the different optimization phases were tested with a real


quadcopter and compared against the non-optimized controller. Different speed values were used to check the behavior of each controller. The tuned controller solves the avoidance task reducing the error, and with a speed increment of 3 times over the last successful test of the initial controller. The most relevant tests were shown in detail. This novel use of the Cross-Entropy optimization method not only shows an improvement in the behavior of the controller, but also a big reduction of the rules' base of 75%: the initial rules' base was composed of 125 rules and, at the end of the optimization process, the tuned controller uses just 31 rules.


Chapter 7
Thesis Conclusions

In this chapter the general conclusions of this Thesis are presented. Specific conclusions of the different novel approaches on visual Fuzzy control presented in this Thesis are exposed in the corresponding chapters.

This Thesis presents an extended research work on Fuzzy control for unmanned vehicles using visual information. The outcome of this dissertation is to demonstrate the versatility of this Soft-Computing technique for control systems with highly complex dynamics. The capacity to design control systems with this technique without having the model of the vehicle let us design a big set of controllers that have been implemented for different tasks using different kinds of vehicles. The novel visual Fuzzy control systems presented in this Thesis were developed using the self-developed software called Miguel Olivares' Fuzzy Software (MOFS). An important aim of this Thesis is to be a guide for other researchers who want to design Fuzzy controllers for similar or different tasks. For this reason the 12 developed controllers presented in this Thesis are explained in detail in a ready-to-use mode. In the same way, all the code developed is available under an open source policy (CVG, 2012).

To corroborate the correct behavior of each Fuzzy control system developed, an exhaustive set of experiments was done in outdoor and indoor environments with real unmanned vehicles.


A novel approach of visual Fuzzy control was developed to manage a driverless car in inner city environments. This low cost autonomous system was implemented using just a camera with a reduced field of view of 30 × 50 cm. A visual algorithm was implemented to follow a line painted on the road and to detect visual landmarks. The system was tested in a wide range of situations, with a top speed of 48 km/h, driving more than 5 km without any human intervention, and being robust against hard slowdowns, sudden accelerations, and lateral disturbances. The publications (Olivares-Mendez et al., Dec) and (Olivares-Mendez et al., 2013b) show some of the results obtained with this visual Fuzzy control. Other results are still waiting for the reviewers' response.

A contribution in visual Fuzzy control was presented to command an unmanned helicopter for different autonomous solutions. The control of an onboard pan and tilt visual platform and the helicopter's heading was implemented to track ground objects from the sky, with total freedom of movement for the helicopter, using visual information. The Fuzzy control of the helicopter's pitch, roll and altitude was developed to command this aircraft for an autonomous landing based on the visual pose estimation. The specification of the first visual Fuzzy control and the obtained results are published in (Olivares-Mendez et al., Oct), (Olivares-Mendez et al., 2009), (Mondragon et al., 2010), (Olivares-Mendez et al., 2010). The autonomous landing control system is published in (Olivares-Mendez et al., July).

A new visual Fuzzy control system was developed to control two different kinds of quadrotors. The input of the control system was the information acquired by an onboard camera. The pitch and heading of a quadcopter were controlled to follow an aerial object from a safe distance. The visual Fuzzy control of the heading was used for see and avoid tasks. The autonomous control for aerial object following is published in (Olivares-Mendez et al., 2011), (Olivares-Mendez et al., 2013c). The visual Fuzzy control for the see and avoid task is published in (Olivares-Mendez et al., 2012a).

Auto-tuning of Fuzzy control is also a topic involved in this Thesis. A novel approach using the Cross-Entropy method is presented. This optimization method was used to modify the scaling factors of the controller, as well as the membership function sets' size and location and the rules' weights. The controllers were optimized for a see and avoid task using a quadcopter. Comparisons with the non-optimized controller and intermediate optimized controllers were done to corroborate the behavior improvements of the final optimized Fuzzy control. Excellent results were obtained, with a reduction of 75% of the size of the rules' base. The optimization process of the FLC's input gains for the see and avoid task with a quadrotor is published in (Olivares-Mendez et al., 2012b), (Olivares-Mendez et al., 2013a). The optimization of the membership functions'


sets and the rules’ weight is still under review.

The most relevant visual algorithms presented in the literature are used for these applications (e.g. Lucas-Kanade optical flow, blob extraction, homography estimation, CamShift and QR code detection). Depending on the task to be achieved, image-based visual servoing or position-based visual servoing was used.

Summarizing, this Thesis involves works on control contributions with highly complex robots, a novel optimization approach for tuning the principal parameters of Fuzzy controllers, and software development to design Fuzzy controllers and to auto-tune controllers for specific tasks and vehicles.

This research work establishes the base for continuing to work with unmanned vehicles and increasing the complexity of the tasks using the presented optimization approach. Sensor fusion must be another research line to continue: adding information from onboard sensors like lasers or sonars, or from external ones, will improve the controllers' behavior.


Appendix A
Publications derived from this thesis

A.1. Journals with JCR

Below is a list that summarizes the Journal papers derived from the works presented in this thesis. Related information about the ranking given for the category Robotics by the Journal Citation Reports (JCR) (ThomsonReuters, 2010) for the year 2010 is provided.

Inner-City Driverless Vehicle: A Low Cost Approach with Vision forPublic Transport inside Cities: Miguel A. Olivares-Mendez, Pascual Campoy,Ignacio Mellado-Bataller, Jose Luis Sanchez-Lopez, Felipe Jimenez. Roboticsand Automation Magazine. Under review. JCR: 2.173, first third.

Hierarchical Tracking Strategy for Vision-Based Applications On-BoardUAVs: Carol Martinez, Pascual Campoy, Jose Luis Sanchez-Lopez, IvanMondragon, Miguel A. Olivares-Mendez. Journal of Intelligent and RoboticSystems, Accepted. JCR: 0.757, last third.

Vision-Based PD-type Fuzzy Servoing of a Quadrotor UAV: Apply forAerial Object Following: Miguel A. Olivares-Mendez, Changhong Fu, IvanF. Mondragon, Pascual Campoy, Luis Mejias, Carol Martinez. Journal ofIntelligent and Robotic Systems, Accepted. JCR: 0.757, last third.


Cross-Entropy Optimization for Scaling Factors of a Fuzzy Controller:A See-and-Avoid Approach for Unmanned Aerial Systems: Miguel A.Olivares-Mendez, Luis Mejias, Pascual Campoy, Ignacio Mellado-Bataller.Journal of Intelligent and Robotic Systems, January 2013, Volume 69, Issue 1-4, pp 189-205, DOI: 10.1007/s10846-012-9791-5, 2013. JCR: 0.757, last third.

On-board and Ground Visual Pose Estimation Techniques for UAVControl: Carol Martinez, Ivan F. Mondragon, Pascual Campoy, Miguel A.Olivares, Journal of Intelligent and Robotic Systems, ISSN: 0921-0296 (Print)1573-0409 (Online). Volume 61, Numbers 1-4, 301-320, DOI: 10.1007/s10846-010-9505-9, 2011. JCR: 0.757, last third.

Unmanned aerial vehicles UAVs attitude, height, motion estimation and control using visual systems: Ivan F. Mondragon, Pascual Campoy, Carol Martinez, Miguel A. Olivares, Luis Mejias, Journal of Autonomous Robots, Volume 29, Number 1, 17-34, DOI: 10.1007/s10514-010-9183-2, 2010. JCR: 2.011, first third.

Omnidirectional vision applied to unmanned aerial vehicles (UAVs)attitude and heading estimation: Ivan F. Mondragon, Pascual Campoy, CarolMartinez, Miguel A. Olivares, Robotics and Autonomous Systems, Volume 58,Issue 6, Omnidirectional Robot Vision, 30 June 2010, Pages 809-819, ISSN0921-8890, DOI: 10.1016/j.robot.2010.02.012. JCR: 1.313, second third.

Visual 3-D SLAM from UAVs: Jorge Artieda, Jose Maria Sebastian,Pascual Campoy, Juan Correa, Ivan Mondragon, Carol Martinez, MiguelOlivares. Journal of Intelligent and Robotic Systems, ISSN: 0921-0296 (Print)1573-0409 (Online). DOI 10.1007/s10846-008-9304-8 January 2009. JCR:0.757, last third.

Computer Vision Onboard UAVs for civilian tasks: Pascual Campoy, Juan Correa, Ivan Mondragon, Carol Martinez, Miguel Olivares, Jorge Artieda, Luis Mejias. Journal of Intelligent and Robotic Systems, ISSN: 0921-0296 (Print) 1573-0409 (Online). DOI 10.1007/s10846-008-9256-z. August 2008. JCR: 0.757, last third.

A.2. Peer-Reviewed Congress

See-and-avoid quadcopter using fuzzy control optimized by cross-entropy:

Miguel A. Olivares-Mendez, Luis Mejias, Pascual Campoy, Ignacio Mellado-Bataller. IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2012. 10-15 June 2012. Brisbane, Australia.


Rapid Prototyping Framework for Visual Control of Autonomous Mi-cro Aerial Vehicles: Ignacio Mellado-Bataller, Pascual Campoy, Miguel A.Olivares-Mendez, Luis Mejias. The 12th International Conference on Intelli-gent Autonomous Systems (IAS-12), June 26-29 2012, Jeju Island, South Korea.

UAS See-and-Avoid using two different approaches of Fuzzy Control:Miguel A. Olivares-Mendez, Luis Mejias, Pascual Campoy, Ignacio Mellado-Bataller. ICUAS 2012 : 2012 International Conference on Unmanned AircraftSystems, Philadelphia, PA, USA, Jun 12, 2012 - Jun 15, 2012.

Hierarchical Strategy for Real-Time Tracking On-Board UAVs: CarolMartinez, Pascual Campoy, Ivan F. Mondragon, Jose Luis Sanchez-Lopez,Miguel A. Olivares-Mendez. ICUAS 2012 : 2012 International Conference onUnmanned Aircraft Systems, Philadelphia, PA, USA, Jun 12, 2012 - Jun 15,2012.

Adaptive Control System based on Lineal Control Theory for the Path-Following Problem of a Car-Like Mobile Robot: Jose Luis Sanchez-Lopez,Pascual Campoy, Miguel A. Olivares-Mendez, Ignacio Mellado-Bataller, DavidGalindo. IFAC Conference on Advances in PID Control PID’12, 28-30 March2012, Brescia, Italy.

A Visual AGV-urban Car using Fuzzy Control: Miguel A. Olivares-Mendez, Ignacio Mellado, Pascual Campoy, Ivan Mondragon, Carol Martinez.Proceedings for IEEE International Conference on Automation, Robotics andApplications ICARA2011. Wellington, New Zealand, 6-8 December, 2011.

Aerial Object Following Using Fuzzy Servoing: Miguel A. Olivares-Mendez, Luis Mejias, Pascual Campoy, Ivan F. Mondragon, Carol Martinez. Workshop on Research, Development and Education on Unmanned Aerial Systems RED-UAS 2011, Seville, Spain, November 30 - December 1, 2011.

3D Object following based on visual information for Unmanned AerialVehicles: Ivan F. Mondragon, Pascual Campoy, Carol Martinez, MiguelA. Olivares. Accepted for Proceedings of IEEE Latin American RoboticsCompetition (LARC) and The Latin American Robotics Symposium (LARS)LARC & LARS & CCAC 2011. Bogota, Colombia. Oct. 1-4, 2011.

3D Pose estimation based on planar object tracking for UAVs control:Ivan F. Mondragon, Pascual Campoy, Carol Martinez, Miguel A. Olivares.Proceedings of IEEE international conference on robotics and automationICRA2010. Anchorage, Alaska, USA. May 3-8, 2010.


Fuzzy-4D/RCS for Unmanned Aerial Vehicles: Miguel A. Olivares, PascualCampoy, Ivan F. Mondragon, Carol Martinez. International Congress of BrainInspired Cognitive Systems BICS 2010, Madrid, Spain, 14-16 July, 2010.

Fuzzy controller for UAV-landing task using 3d-position visual estimation:Miguel A. Olivares, Ivan F. Mondragon, Pascual Campoy, Carol Martinez.proceedings of IEEE world congress on computational intelligence (IEEEWCCI 2010-IEEEFUZZ2010). Barcelona, Spain, July 18-23, 2010.

A Robotic Eye Controller Based on Cooperative Neural Agents: Oscar Chang, Pascual Campoy, Carol Martinez, Miguel A. Olivares-Mendez. Proceedings of IEEE world congress on computational intelligence (IEEE WCCI 2010-IJCNN2010). Barcelona, Spain, July 18-23, 2010.

Onboard and ground visual pose estimation techniques for UAV control: Pascual Campoy, Ivan F. Mondragon, Carol Martinez, Miguel A. Olivares. 3rd international symposium on unmanned aerial vehicles (UAV'10). Dubai, United Arab Emirates, June 21-23, 2010.

A pan-tilt camera fuzzy vision controller on an unmanned aerial vehicle: Miguel A. Olivares, Ivan F. Mondragon, Pascual Campoy, Carol Martinez. 2009 IEEE/RSJ international conference on intelligent robots and systems (IROS), St. Louis, USA. Oct 11-15, 2009.

Trinocular ground system to control UAVs: Carol Martinez, Ivan F. Mondragon, Pascual Campoy, Miguel A. Olivares. 2009 IEEE/RSJ international conference on intelligent robots and systems (IROS), St. Louis, USA. Oct 11-15, 2009.

Visual servoing using fuzzy controllers on an unmanned aerial vehicle: Miguel A. Olivares, Pascual Campoy, Ivan F. Mondragon, Carol Martinez. EUROFUSE 2009: Workshop on Preference Modelling and Decision Analysis. Pamplona, Spain, September 16-18, 2009.

Computer Vision onboard UAVs for civilian tasks: Pascual Campoy, IvanF. Mondragon, Miguel A. Olivares-Mendez, Carol Martinez. UAV WorldConference at AIRTECH. Frankfurt, Germany, November 12-14, 2008.

Vision for guidance and control of UAVs in civilian tasks: PascualCampoy, Ivan F. Mondragon, Carol Martinez, Miguel A. Olivares. UAV’08International Symposium On Unmanned Aerial Vehicles Orlando, Florida,USA. June 23-24, 2008.


Visually Guiding Autonomous Helicopters for Civilian Tasks: PascualCampoy, Ivan Mondragon, Juan Correa, Carol Martinez, Miguel Olivares.Innovation in Unmanned Aerial Vehicles, Madrid, Spain, November 2007.

Fuzzy Logic User Adaptive Navigation Control System For Mobile Robotsin Unknown environments: Miguel Olivares Mendez, Javier Madrigal. 5thIEEE International Symposium on Intelligent Signal Processing, WISP-07,Alcala de Henares, Spain, October 2007.

A.3. Book Chapters

Autonomous Guided Car using a Fuzzy Controller: Miguel A. Olivares-Mendez, Pascual Campoy, Ignacio Mellado-Bataller, Ivan F. Mondragon, Carol Martinez. Recent Advances in Robotics and Automation. Springer Studies in Computational Intelligence. Accepted.

Quadcopter See and Avoid Using a Fuzzy Controller: Miguel A. Olivares-Mendez, Luis Mejias, Pascual Campoy, Ignacio Mellado-Bataller. WorldScientific Proceedings Series on Computer Engineering and Information Sci-ence: Uncertainty Modeling in Knowledge Engineering and Decision Making.Volume 7, pages 1239-1244. ISBN: 978-981-4417-73-0. 2012.

MAVwork: a Framework for Unified Interfacing between Micro Aerial Vehicles and Visual Controllers: Ignacio Mellado-Bataller, Jesus Pestana, Miguel A. Olivares-Mendez, Pascual Campoy, Luis Mejias. Frontiers of Intelligent Autonomous Systems. Springer Studies in Computational Intelligence. Volume 466, 2013, pages 165-179. ISBN: 978-3-642-35484-7 (Print). 2012.

Non-symmetric membership function for Fuzzy-based visual servoingon-board a UAV: Miguel A. Olivares, Pascual Campoy, Ivan F. Mondragon,Carol Martinez. Computation Intelligence Foundations and Applications ISBN:978-981-4324-69-4 981-4324-69-8, pages 300-307. 2010.

Visual Servoing for UAVs: Pascual Campoy, Ivan F. Mondragon, Carol Martinez, Miguel A. Olivares. In: Visual Servoing. ISBN 978-953-307-095-7, Publisher: InTech. Publication date: April 2010.

Fuzzy control system navigation using priority areas: Miguel A. Olivares,Pascual Campoy, Ivan F. Mondragon, Carol Martinez, Juan Correa. Computa-tional Intelligence in Decision and Control. ISBN: 978-981-279-947-0(ebook)981-279-947-8(ebook), pages 987-996. 2008.


A.4. Digital media

Digital media related to this thesis is available at the vision4uav project web page (CVG, 2012), in the section dedicated to this thesis: http://www.vision4uav.eu/?q=phdthesis

A.5. Participation in Technology Transfer Projects

Industry Technology Transfer

On board Control and Visual Guidance of a New Prototype of Urban Vehicle. SIEMENS S.A. and supported by the CDTI. (April 2010 - Dec 2011). Project Leader until June 2011.

Visual Guidance of a Commercial Compact Car. SIEMENS S.A. (Oct2008-Dec 2010). Project Leader from Dec 2009.

Visual Guidance Feasibility Study for an Outdoor Urban Vehicle. SIEMENS S.A. (Nov 2007 - Oct 2008)

International R&D programs

OMNIWORKS: Omnidirectional vision for human-UAV co-working. ECHORD Project in the European FP7. (Feb 2012 - Aug 2013). Project Leader.

UECIMUAVS: USA and Europe Cooperation in Mini UAVs. IRSESproject - Marie Curie Program FP7. (May 2012 - April 2016).

ICPUAS: International Cooperation Program for Unmanned Aerial Sys-tems Research and Development. IRSES project - Marie Curie ProgramFP7. (March 2009 - Sept 2012)

National R&D projects

Computer Vision for UAV, from visual information to visual guidance.Funded by the Spanish Ministry of Science MICYT #DPI2010-20751-C02-01. (Jan 2011 - Dec 2013)

Computer Vision for UAV. Guidance, Control, Cooperation, and Inspec-tion. Funded by the Spanish Ministry of Science MICYT. (Jan 2008 - Dec2010)


A.6. Research Positions

Computer Vision Group at Universidad Politecnica de Madrid (UPM) - Centro de Automatica y Robotica (CAR): 2007 - present.

Australian Research Centre for Aerospace Automation (ARCAA) at Queensland University of Technology (QUT). Brisbane - Australia: June - Dec 2011.

Laboratory of Intelligent Systems (LIS) at Ecole Polytechnique Federale de Lausanne (EPFL). Lausanne - Switzerland: March - May 2012.


Appendix B
MOFS: Miguel Olivares' Fuzzy Software

The Miguel Olivares’ Fuzzy Software (MOFS) is a C++ library developeddeveloped by Miguel A. Olivares-Mendez to create Fuzzy controllers. Thelibrary has a classes definition in which a class is defined for each part of theFuzzy control environment, as is shown in Figure B.1. This class structurefacilitate the way to work whit it and future software updates. There are differentclasses for variables, rules, membership functions and defuzzification modes.Depending on the system that the user wants to create, he can chose the numberof inputs and outputs. More than one controller could be used in parallel or serialdisposition. In the same way the user can chose the different characteristics ofthe variables, the fuzzification inference type or the defuzzification mode.


Figure B.1: Software definition.


This Fuzzy software lets the user introduce a most important sector in each fuzzy variable. In that way, a bigger number of sets with a smaller size can be defined in the selected part of the variable, as shown in Figure B.2. This improvement reduces the size of the rules' base compared to having sets of the same size over the whole variable range. The improvements of this specific characteristic of this software are clearly presented in (Olivares-Mendez et al., 2008), (Olivares-Mendez et al., 2010), (Mondragon et al., 2010). A minimal sketch of how such a non-uniform layout can be generated is shown after Figure B.2.

Figure B.2: Example of a variable with a most important sector defined in its center.
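As announced above, the following sketch generates the X-axis break-points of a variable with a denser most-important sector. The function name and the even-spacing scheme are illustrative assumptions, not the MOFS implementation.

// Illustrative sketch (not the MOFS code): break-points of fuzzy sets for a
// variable with a denser "most important sector" [miMin, miMax] inside the
// full range [min, max]. Sets outside the sector are spread evenly over the
// remaining range; sets inside it are smaller and evenly spaced.
#include <vector>

std::vector<double> sectorBreakpoints(double min, double max,
                                      double miMin, double miMax,
                                      int nSetsOutside, int nSetsInside) {
    std::vector<double> x;
    // Left part of the variable: coarse sets.
    for (int i = 0; i <= nSetsOutside / 2; ++i)
        x.push_back(min + i * (miMin - min) / (nSetsOutside / 2 + 1));
    // Most important sector: fine sets.
    for (int i = 0; i <= nSetsInside; ++i)
        x.push_back(miMin + i * (miMax - miMin) / nSetsInside);
    // Right part of the variable: coarse sets.
    for (int i = 1; i <= nSetsOutside / 2 + 1; ++i)
        x.push_back(miMax + i * (max - miMax) / (nSetsOutside / 2 + 1));
    return x;
}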

This C++ library also includes a supervised learning algorithm. It is based on the idea of the synapses' weights of the brain neurons. Each rule has an assigned weight value. Initially all the weights are set to a default value (0.5), and they change depending on whether the control output is equal to the user decision or not. There is a flag definition to activate/deactivate the learning algorithm. Once the learning algorithm is activated, the user has to command the vehicle or system. In each situation evaluation the Fuzzy control checks the situation and gives an output based on the rules' base; this linguistic value is compared to the user's output. If both outputs belong to the same linguistic tag, the weight of the rules involved in this evaluation is increased by the membership function value (in (0,1)) divided by 3. If the outputs do not belong to the same linguistic tag, the weight value of all the rules involved is decreased by a constant value equal to 0.3. Once the weight of a rule goes below zero, the rule gets as output the linguistic value with the highest membership function value of the user decision. In that way the controller modifies its rules' base based on the user decisions, adapting its behavior to the different users, as presented in (Olivares and Madrigal, 2007).
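The weight update just described can be condensed into the following sketch; the struct and function names are illustrative stand-ins for the corresponding MOFS classes, and the reset value after a rule rewrite is an assumption.

// Illustrative sketch of the rule-weight update described above.
struct Rule {
    double weight = 0.5;   // default initial weight
    int outputTag;         // index of the rule's output linguistic tag
};

// fired: rules active in this evaluation; degree[i] in (0,1) is their firing
// value. controllerTag / userTag: linguistic tags of controller and user
// outputs. userBestTag: user's output tag with the highest membership value.
void updateWeights(Rule* fired[], const double degree[], int n,
                   int controllerTag, int userTag, int userBestTag) {
    for (int i = 0; i < n; ++i) {
        if (controllerTag == userTag)
            fired[i]->weight += degree[i] / 3.0;   // reinforce agreement
        else
            fired[i]->weight -= 0.3;               // penalize disagreement
        if (fired[i]->weight < 0.0) {
            // Rule is rewritten with the user's dominant output tag.
            fired[i]->outputTag = userBestTag;
            fired[i]->weight = 0.5;  // assumption: restart at the default value
        }
    }
}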

B.0.1. How to use MOFS

As shown in Figure B.1, the main class of the library is MOFSModel. There are 2 constructors defined in this class, using 3 different configuration parameters. The parameters are:

mode


0: Supervised learning algorithm activated in manual mode. In each evaluation the system asks the user if the output is correct or not.

1: Supervised learning algorithm activated in autonomous mode. In each evaluation the system compares its output with the user's output.

2: Supervised learning algorithm deactivated. The Fuzzy logic controlevaluates the situation and sends the output command to the system.

genRules

0: Read the rules’ base from a file. The antecedent, the consequent(output) and the weight.

1: Generate the rules’ base output randomly.

option4CreateVars

0: Read the range of the variable, the number of sets, the range of themost important sector and the number of sets in this sector.

1: Read all the X axis intersections.

The constructors are:

MOFSModel(int mode): Reads the variables from keyboard input. After that, the rules' base output is generated randomly (genRules = 1).

MOFSModel(char *fileVars, char *fileRules, int mode, int genRules, int option4CreateVars): The variables and rules' base files are passed as parameters. The way to create the controller depends on the other parameters.

The input files to create a controller are defined in the following way. To read the variables:

option4CreateVars = 0

numVars: Number of variables in the system (inputs + outputs).

Var type: Type of the variable I = input, O = output

name: The variable name

max x: Maximum value of the variable’s range

min x: Minimum value of the variable’s range

n sets: Number of sets

max MI sector: Maximum value of the most important sector range

min MI sector: Minimum value of the most important sector range

n sets MI sector: Number of sets of the most important sector range


- Repeat from Var type to n sets MI sector for each variable

option4CreateVars = 1

numVars: Number of variables in the system (inputs + outputs).

Var type: Type of the variable I = input, O = output

name: The variable name

n sets: Number of sets

1stX axis intersection: Value of the first X axis intersection

2ndX axis intersection: Value of the second X axis intersection

3rdX axis intersection: Value of the third X axis intersection

...

nthX axis intersection: Value of the n-th X axis intersection

- Repeat from Var type to nthX axis intersection for each variable

To read the rules’ base:

Antecedents

Consequents

Weight

- Repeat all of them per each rule

For both the variables file and the rules' base file, each value is on its own line, so an end of line is added after each value.
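As a usage illustration, a hypothetical variables file (option4CreateVars = 0) for a one-input, one-output controller and the corresponding constructor call could look as follows; the file contents and values are invented for the example.

// vars.txt (one value per line, option4CreateVars = 0):
//   2        <- numVars (1 input + 1 output)
//   I        <- Var type
//   error    <- name
//   50       <- max_x
//   -50      <- min_x
//   5        <- n_sets
//   10       <- max_MI_sector
//   -10      <- min_MI_sector
//   3        <- n_sets_MI_sector
//   O ...    <- the output variable is defined in the same way
//
// Creating the controller from files, with the learning algorithm
// deactivated (mode = 2) and the rules' base read from file (genRules = 0):
MOFSModel controller("vars.txt", "rules.txt", 2, 0, 0);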


Appendix C
UAV specifications

In this appendix there is a brief description of the UAVs employed in the development of this thesis. This description includes some aircraft specifications, the visual system employed and the communication structures used for information exchange between the aircraft low level controller and the high level on-board visual system.

C.1. Gas and Electric powered UAV helicopters

The vision4uav project (CVG, 2012) has two Rotomotion UAV vehicles (Figure C.1). The first one is a modified Bergen Industrial Twin RC helicopter, the SRA1; it is a gas powered helicopter. The second one is a modified Xcell Electric RC helicopter, the SR20. Table C.1 summarizes the principal technical characteristics.

C.1.1. Helicopter Autopilot API

External processes (ground clients and the on-board system) interact with the Low Level AFCS autopilot through a UDP socket over an ethernet network. The API consists of a series of messages that are used to send and get information


Figure C.1: UAV rotary wing helicopters used in this thesis: (a) Rotomotion SRA1, modified Bergen Industrial Twin, gas powered. (b) Rotomotion SR20, modified Xcell Electric RC helicopter.

from the autopilot. The most used messages are summarized below.

Aircraft State: The state object encapsulates almost all of the state of the helicopter and the autopilot system. It is sent at the highest rate of all the messages, since it is the most useful for the ground station:

typedef struct {
    /* Euler angles relative to the ground */
    double phi;
    double theta;
    double psi;
    /* Body frame rotational rates */
    double p;
    double q;
    double r;
    /* Position relative to the ground */
    double x;
    double y;
    double z;
    /* Velocity over the ground */
    double vx;
    double vy;
    double vz;
    /* Raw magnetometer readings */
    double mx;
    double my;
    double mz;
    /* Body frame linear accelerations */
    double ax;
    double ay;
    double az;
    /* Raw body frame rotational rates */
    double raw_p;
    double raw_q;
    double raw_r;
    /* Miscellaneous */
    double trace;
    double voltage;
    uint16_t resets;
    uint16_t rpm0;
    uint16_t rpm1;
    uint16_t rpm2;
} state_t;


Table C.1: Rotomotion UAVs technical specifications

                          SRA1 Gas powered             SR20 Electric powered
Length (mm)               1700                         1700
Height (mm)               750                          650
Main Blade Size (mm)      880                          880
Tail Blade Size (mm)      130                          130
Power Unit                Gasoline motor @52 HP        Electric motor @1400 W
Max Payload (kg)          14                           4
Endurance (min)           45                           15
Autopilot                 AFCS 2.5                     AFCS 3.4
Sensors                   GPS, IMU and Magnetometer    GPS, IMU and Magnetometer
Communications            WiFi/Ethernet                WiFi/Ethernet
Classification            Light UAV                    Light UAV
Visual System
CPU                       VIA mini-itx 1.0 GHz         VIA nano-itx 1.5 GHz
RAM (GB)                  1.0                          2.0
Camera interfaces         USB 2.0, 1394a, ethernet     USB 2.0, 1394a, ethernet, analog capture
OS                        linux                        linux
Communication with
Autopilot and Ground      WiFi/Ethernet                WiFi/Ethernet


GPS state: The GPS state message encodes information about the quality of the position solution and the raw Lat/Lon coordinates for applications that are not satisfied with solely the local tangent plane. This allows geo-referencing of the sensor data and payloads, should it be required. The raw velocity and position values are run through a Kalman filter that uses the vdop and pdop to weight the readings. It uses the integrated velocity as a position state estimate, then aids that with the position measurements. This allows a higher quality solution for the local tangent plane.

typedef struct {
    int16_t numsv;
    int16_t sacc;
    int16_t pacc;
    int16_t fixtype;
    double vel_n;
    double vel_e;
    double vel_d;
    double pos_n;
    double pos_e;
    double pos_d;
    double lat;   /* raw_pos_n in AFCS 2.5 */
    double lon;   /* raw_pos_e in AFCS 2.5 */
    double alt;   /* raw_pos_d in AFCS 2.5 */
    double trace;
} msg_gps_t;
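To make the filtering idea concrete, here is a minimal 1-D sketch of the predict/correct cycle described above: velocity integration gives the position prediction, and a position measurement, weighted by its reported accuracy in the spirit of the pdop/vdop weighting, corrects it. This is a generic illustration, not the AFCS implementation.

// Generic 1-D Kalman predict/correct sketch (not the AFCS code).
struct Kalman1D {
    double x = 0.0;   // position estimate
    double P = 1.0;   // estimate variance
    double q = 0.01;  // process noise (assumed value)

    void predict(double velocity, double dt) {
        x += velocity * dt;   // integrated velocity as the state estimate
        P += q * dt;
    }
    void correct(double z, double r) {  // z: measured position, r: meas. variance
        double K = P / (P + r);         // low reported accuracy -> small gain
        x += K * (z - x);
        P *= (1.0 - K);
    }
};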

Position Control commands: This message tells the AFCS where the operator wants the UAV to go:

typedef struct {
    int16_t pos_mode;
    int16_t hdg_mode;
    int16_t transit_time;
    uint16_t pad1;
    double x;
    double y;
    double z;
    double hdg;
} msg_flyto_t;

Velocity teleoperation commands: Teleop messages are sent by custom ground stations with their own joysticks or controllers, or by ground based software that is handling targeting and/or navigation control. The velocities are roughly in m/s, but depend on the actual steady state offsets. Very small values may actually cause the aircraft to move backwards.

typedef struct {
    double fore;
    double side;
    double alt;
    double hdg;
} msg_teleop_t;
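Since the API is message-based over UDP, a ground client can command velocities by filling a msg_teleop_t (defined above) and sending it as a datagram. The sketch below assumes the message is framed as the raw struct and uses a placeholder address and port; the actual AFCS framing (headers, message IDs, byte order) is not specified here.

// Hedged sketch: sending a velocity teleop message over UDP (POSIX sockets).
// Address, port and raw-struct framing are placeholder assumptions.
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

int main() {
    msg_teleop_t msg = {1.0, 0.0, 0.0, 0.0};  // 1 m/s forward, hold the rest

    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in autopilot{};
    autopilot.sin_family = AF_INET;
    autopilot.sin_port = htons(5123);                         // placeholder port
    inet_pton(AF_INET, "192.168.1.10", &autopilot.sin_addr);  // placeholder IP

    sendto(sock, &msg, sizeof(msg), 0,
           reinterpret_cast<sockaddr*>(&autopilot), sizeof(autopilot));
    close(sock);
    return 0;
}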

Helicopter desired position and heading confirmation: The desired position message is sent as a response to a flyto packet. It indicates the position in the NED frame to which the UAV will try to fly:

typedef struct {
    double n;
    double e;
    double d;
    double hdg;
} msg_desired_pos_t;



Pan and Tilt platform control: If the UAV is outfitted with a camera platform, it will respond to this message. Only the fields relevant to the selected mode need to be filled in by the ground station; the AFCS will fill in all the fields in its reply:

typedef struct {
    int16_t mode;
    int16_t pad0;
    int16_t pad1;
    int16_t pad2;
    double relative_roll;
    double relative_pitch;
    double relative_yaw;
    double absolute_roll;
    double absolute_pitch;
    double absolute_yaw;
    double camera_n;
    double camera_e;
    double camera_d;
} msg_desired_pos_t;

C.2. UAV quadcopter

C.2.1. AscTec Pelican

The vision4uav project (CVG, 2012) has one AscTec Pelican quadcopter (Figure C.2). Table C.2 summarizes the principal technical characteristics.

Figure C.2: UAV rotary wing quadcopter used in this thesis: Ascending Technologies Pelican UAV.


Table C.2: Pelican AscTec UAV technical specifications

                          AscTec Pelican
Rotors                    4
Area (mm2)                600
Blade Size (mm)           250
Power Unit                4 electric motors @120 W
Max Payload (kg)          0.5
Endurance (min)           15
Autopilot                 AscTec AutoPilot
Sensors                   GPS, IMU, Magnetometer, and pressure altimeter
Communications            Serial Xbee RF
Classification            Light UAV
Visual System
CPU                       AscTec 1.6 GHz Atom Board
RAM (GB)                  1.0
Camera interfaces         USB 2.0
OS                        linux
Communication with
Autopilot and Ground      Serial Interface / Wifi-Ethernet

C.2.1.1. Quadcopter autopilot SDK

External processes (ground clients and the on-board system) interact with the Low Level AscTec autopilot through a serial interface (running at 57600 bps) and a series of messages that are used to send and get information from the autopilot. The most used messages are summarized below.

struct LL_STATUS {
    // battery voltages in mV
    short battery_voltage_1;
    short battery_voltage_2;
    short status;
    short cpu_load;
    char compass_enabled;
    char chksum_error;
    char flying;
    char motors_on;
    short flightMode;
    short up_time;
};

struct IMU_RAWDATA {
    int pressure;
    short gyro_x;
    short gyro_y;
    short gyro_z;
    short mag_x;
    short mag_y;
    short mag_z;
    short acc_x;
    short acc_y;
    short acc_z;
    unsigned short temp_gyro;
    unsigned int temp_ADC;
};



struct IMU_CALCDATA {
    int angle_nick;
    int angle_roll;
    int angle_yaw;
    int angvel_nick;
    int angvel_roll;
    int angvel_yaw;
    short acc_x_calib;
    short acc_y_calib;
    short acc_z_calib;
    short acc_x;
    short acc_y;
    short acc_z;
    int acc_angle_nick;
    int acc_angle_roll;
    int acc_absolute_value;
    int Hx;
    int Hy;
    int Hz;
    int mag_heading;
    int speed_x;
    int speed_y;
    int speed_z;
    int height;
    int dheight;
    int dheight_reference;
    int height_reference;
};

struct GPS_DATA_ADVANCED {
    int latitude;
    int longitude;
    int height;
    int speed_x;
    int speed_y;
    int heading;
    unsigned int horizontal_accuracy;
    unsigned int vertical_accuracy;
    unsigned int speed_accuracy;
    unsigned int numSV;
    int status;
    int latitude_best_estimate;
    int longitude_best_estimate;
    int speed_x_best_estimate;
    int speed_y_best_estimate;
};



struct CTRL_INPUT {
    short pitch;
    short roll;
    short yaw;
    short thrust;
    short ctrl;
    short chksum;
};
struct CTRL_INPUT CTRL_Input;
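As a usage illustration, the sketch below fills a CTRL_INPUT command and writes it to the serial port. The device path, the termios configuration and, in particular, the checksum formula are placeholder assumptions; the actual checksum definition must be taken from the AscTec SDK documentation.

// Hedged sketch: sending a CTRL_INPUT command over the 57600 bps serial link.
// Device path and checksum formula are placeholders, not the AscTec SDK spec.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main() {
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);  // placeholder device
    termios tio{};
    tcgetattr(fd, &tio);
    cfsetispeed(&tio, B57600);            // 57600 bps, as stated in the text
    cfsetospeed(&tio, B57600);
    tio.c_cflag |= CS8 | CLOCAL | CREAD;  // 8 data bits, local line, enable RX
    tcsetattr(fd, TCSANOW, &tio);

    CTRL_INPUT cmd{};
    cmd.pitch = 0;
    cmd.roll = 0;
    cmd.yaw = 0;
    cmd.thrust = 2000;   // illustrative value
    cmd.ctrl = 0x0F;     // illustrative: enable all axes
    // Placeholder checksum: the real formula is defined by the AscTec SDK.
    cmd.chksum = static_cast<short>(cmd.pitch + cmd.roll + cmd.yaw +
                                    cmd.thrust + cmd.ctrl);

    write(fd, &cmd, sizeof(cmd));
    close(fd);
    return 0;
}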


Appendix D
Research Groups on UAS and Vision Systems

Several teams from MIT, Stanford, Berkeley, ARCAA, USC and UPM, among others, have had ongoing UAV projects. They are multidisciplinary groups working on different areas related to UAS, some of them actively researching computer vision for aerial robotics. Below is a review of some of the most important institutions and their UAV projects. Finally, a comprehensive list of UAV manufacturers and research groups around the world can be found at http://uav.eas.gatech.edu.

Universidad Politecnica de Madrid UPM - Spain

1. Computer Vision Group CVG - Universidad Politecnica de Madrid (UPM)
http://vision4uav.com
The main goal of the CVG in the field of aerial robotics is the design and construction of Unmanned Aerial Vehicles (UAVs) guided and controlled by computer vision systems for applications in civilian spaces. Research includes vision techniques for pose and attitude estimation, visual enhancement, and visual control and servoing, among others.


2. Robotics and Cybernetics Group - Universidad Politecnica de Madrid (UPM)
http://www.robcib.etsii.upm.es/
The Robotics and Cybernetics research group works on several topics within the fields of robotics, flexible manufacturing and biomedical engineering. In the field of UAVs, they are currently focused on topics from UAV system modeling and low-level control to mission planning, supervision and collision avoidance, going from vehicle constraints to mission constraints, as well as Micro Aerial Vehicle indoor/outdoor navigation.

Universitat Politecnica de Catalunya UPC - Spain
ICARUS
http://icarus.upc.edu
The research areas include fields like intelligent communications and avionics for robust Unmanned Aerial Systems, or the evaluation of new strategies for the smooth integration of civil mission oriented UAVs in non-segregated airspace. The ICARUS research project focuses on the development of new hardware and software architectures to control the mission and payload in Unmanned Aerial Vehicles (UAVs).

Centro Avanzado de Tecnologias Aeroespaciales CATEC - Spain
http://www.catec.aero
CATEC works on several topics in the industrial aviation sector, including R&D and technology transfer in the fields of avionics and on-board systems, UAV platforms, and automation and robotics, among others. In the field of aerial robotics they have a fleet of UAV platforms used for systems integration and testing.

Technische Universitat Berlin - Germany
Laboratory for autonomous flying robots
http://pdv.cs.tu-berlin.de/lfafr/index.html
They work on different subjects connected with practical applications of autonomous aerial robots with a certain level of on-board intelligence. The main research areas include: mathematical modeling of small scale aerial robots, control of small scale aerial robots, sensors and sensor data processing for autonomous navigation, collision detection/avoidance for small scale aerial robots, control of multiple coupled helicopters, and distributed real-time systems.

German Aerospace Center (DLR) - Germany
Institute of Flight Systems
Autonomous Rotorcraft Testbed for Intelligent Systems - ARTIS
http://www.dlr.de/ft/
Research focused on the development and integration of technologies and


components for autonomous flight. It includes works on controller design, testing of human-machine interfaces, implementation of intelligent behavior, sensor fusion, and image processing for collision avoidance or optical navigation, among others.

Linkoping University - Sweden
Unmanned Aircraft System Technologies Lab UASTech
http://www.ida.liu.se/divisions/aiics/aiicssite/uastech/
The UASTech lab is a multi-disciplinary group with backgrounds in control theory, aeronautical engineering, signal processing, computer science, artificial intelligence, and software engineering. Basic and applied research goals include the design and implementation of integrated control modes for UAVs, including autonomous take-off and landing, trajectory following and vehicle following.

Universidade do Minho - Portugal
AIVA project
http://aiva.dei.uminho.pt/aiva_ing/index.htm
The purpose of this project is the construction and automation of a small airplane, capable of flying autonomously and guided by vision, which allows it to operate in any atmospheric condition relying (only) on the on-board systems and processing capabilities, to perform various kinds of applications, such as: monitoring of forests and natural parks (including fire detection), beaches (coastal zones), rivers, roads and highways, and railway lines, among others; support to telecommunications and data transmission; and acquisition of aerial images for territory mapping. It also includes the design and implementation of sensor platforms for integrated perception capability in their UAV platforms, including image processing components and sensor and information fusion techniques.

Instituto Superior de Engenharia do Porto - Autonomous Systems Laboratory LSA - Portugal
http://www.lsa.isep.ipp.pt
LSA conducts research in autonomous systems and related areas such as navigation, control and coordination of multiple robots, and has developed multiple land, air and sea autonomous robots. In the field of UAVs their research is focused on systems development for low altitude applications such as forest fire prevention, security, environmental monitoring or aerial imagery.

University of Bristol - Dynamics and Systems Research Group - UK
http://www.bristol.ac.uk/engineering/research/dynamicscontrol/
The dynamics and control research activity is concerned with research problems relating to modeling, simulation and control of civil, mechanical


and aerospace engineering systems. In the field of aerial robotics, research involves autonomous landing on moving platforms and aerial refueling, among others.

Ecole Polytechnique Federale de Lausanne - Laboratory of Intelligent Systems LIS - Switzerland
http://lis.epfl.ch/
The Laboratory of Intelligent Systems conducts research on bio-inspired UAVs and visual sensors. They have works on micro UAVs, fast networking using aerial vehicles and bio-inspired vision for direct flight control.

Universite de Technologie de Compiegne - Heuristic and Diagnosis for Complex Systems Joint Research Unit HEUDIASYC - France
http://www.hds.utc.fr
The objective of this research group is the development and control of autonomous aerial vehicles or UAVs, focusing on designing, modeling and controlling mini-helicopters such as multi-rotors. Studies have recently been devoted to convertible drones capable of vertical take-off and landing as VTOL vehicles and of forward flight like planes. The works include technological supports and developments for the design of embedded architectures.

Swiss Federal Institute of Technology, Zurich - Switzerland

1. UAV group - Measurement and Control Laboratory - ETH
http://www.uav.ethz.ch/
Research activities on flight control for fixed wing aircraft and airships, as well as integrated navigation algorithms and computer board developments.

2. AIRobots, Skysailor, sFly, Flying Reel and myCopter Projects - Autonomous Systems Laboratory - ETH
http://www.asl.ethz.ch/
Their interest is in the development of intelligent products and systems with special emphasis on autonomous mobile robots. The AIRobots goal is to develop a new generation of aerial service robots capable of supporting human beings in all those activities which require the ability to interact actively and safely with environments not constrained to the ground but, indeed, freely in the air. The goal of the Skysailor project is to design and build a solar powered micro airplane for autonomous exploration.

3. UAV Photogrammetry - Institute of Geodesy and Photogrammetry - ETH
http://www.igp.ethz.ch/photogrammetry/
The main goal of this project is the development and implementation of processing methods of GPS/INS and image data for updating the


position and attitude of the UAV.

Shenyang Institute of Automation, Chinese Academy of Sciences (SIA, CAS)
http://www.sia.ac.cn/en/

Korea Advanced Institute of Science and Technology (KAIST)
Unmanned System Research Group (FDCL)
http://unmanned.kaist.ac.kr/
Interested in the research and development of highly advanced autonomous aerial robots by combining various principles of control theory, aerospace engineering, and computer science.

The Australian Research Centre for Aerospace Automation (ARCAA) - Australia
http://www.arcaa.aero/
Conducts research into all aspects of aviation automation, with a particular research focus on autonomous technologies which support the more efficient and safer utilization of airspace, and on the development of autonomous aircraft and on-board sensor systems for a wide range of commercial applications.

University of California at Berkeley

1. BEAR: Berkeley Aerobot Team
http://robotics.eecs.berkeley.edu/bear/
The BErkeley AeRobot (BEAR) project is a collective, interdisciplinary research effort at UC Berkeley that has encompassed the disciplines of hybrid systems theory, navigation, control, computer vision, communication, and multi-agent coordination since 1996. They currently operate six fully instrumented helicopters, in addition to many fixed- and rotary-wing vehicles under development, equipped with GPS/INS, camera, and other sensors on board, which they have been using to validate their control systems design algorithms for UAVs.

2. Center for Collaborative Control of Unmanned Vehicles
http://c3uv.berkeley.edu/
An interdisciplinary group focused on the fundamental theoretical developments necessary to allow teams of unmanned vehicles to operate autonomously, without extensive monitoring and intervention by human operators.

Massachusetts Institute of Technology (MIT) - USA
Aerospace Controls Laboratory
http://acl.mit.edu/
The Aerospace Controls Laboratory (ACL) researches topics related to autonomous systems and control design for aircraft, spacecraft, and ground vehicles. Theoretical research is pursued in areas such as decision making under uncertainty; path planning, activity and task assignment; estimation and navigation; sensor network design; and robust, adaptive, and model predictive control. Of particular interest:

1. UAV SWARM Health Management Project
http://vertol.mit.edu/prjinfo.html
Investigating techniques that will enable the execution of continuous (24/7) mission operations using multiple autonomous vehicles (i.e., vehicle swarms) in a dynamic environment.

2. Autonomous UAV Aerobatics Project
http://aerobatics.mit.edu
Uses Vicon motion capture sensing to enable rapid prototyping of aerobatic flight controllers for helicopters and aircraft; robust coordination algorithms for multiple helicopters; and vision-based sensing algorithms for indoor flight.

University of Pennsylvania - USA
The General Robotics, Automation, Sensing and Perception (GRASP) Laboratory
http://www.grasp.upenn.edu/
GRASP researchers are building autonomous vehicles and robots, developing self-configuring humanoids, and making robot swarms a reality. Some of the GRASP projects are:

1. Micro Autonomous Systems Technologies (MAST)
http://alliance.seas.upenn.edu/~kumar/wiki/index.php
Investigating techniques that will enable the execution of continuous (24/7) mission operations using multiple autonomous vehicles (i.e., vehicle swarms) in a dynamic environment.

2. Autonomous Aerial Vehicles
http://uav-planning.no-ip.org:8080/uav-planning/wiki/WikiStart
This research project is mainly focused on the autonomous navigation of unmanned air vehicles. The challenge is to design systems which exhibit goal-driven behavior while sensing and reacting to a changing environment. This project is a collaboration between students and faculty from the University of Pennsylvania and industry experts from Dragonfly Pictures, Inc.

Georgia Institute of Technology - USA

1. Georgia Tech Aerial Robotics Team (GTAR)
http://controls.ae.gatech.edu/wiki/gtar/
The Georgia Tech Aerial Robotics Team (GTAR) is a team of students, faculty, and staff of the Georgia Institute of Technology who represent the Institute in the International Aerial Robotics Competition (IARC). The GTAR team has been participating in the IARC since its debut in 1991, and also holds the record for winning the most missions. In 2010, the team participated in the IARC's 6th mission, which is focused on GPS-denied indoor flight. Finishing as the leading entry in 2010, the team developed an indoor flight vehicle that can autonomously enter and explore an indoor environment. The team plans to complete the mission in the 2011 competition.

2. UAV Research Facility (UAVRF)
http://controls.ae.gatech.edu/wiki/uavrf
The UAV Research Facility (UAVRF) at Georgia Tech does research related to Unmanned Aerial Vehicles (UAVs). The UAVRF operates several different vehicles and conducts flight tests to validate research findings.

North Carolina State University - USA
NCSU Aerial Robotics Club
http://art1.mae.ncsu.edu/
The NCSU Aerial Robotics Club (ARC) is a student organization that constructs airplanes and puts computers on them to make them intelligent and autonomous. The club aims to participate in the AUVSI and IARC competitions. Common to both competitions is the ability of the airplane to fly autonomously, take high-resolution surveillance pictures, and send these pictures back to the ground station.

University of Maryland - USA
Autonomous Vehicle Laboratory
http://www.avl.umd.edu/index.html
The Autonomous Vehicle Laboratory (AVL) conducts research and development in the area of biologically inspired robotics. They seek to distill the fundamental sensing and feedback principles that govern locomotive behavior in small organisms and that will enable the next generation of autonomous microsystems. Capabilities include rapid-prototyping facilities for microsystem fabrication and development, a VICON marker-based visual tracking system that provides direct measurements of 6-DOF vehicle position and orientation for system identification and real-time feedback, a low-speed wind tunnel with a specialized high-speed camera system for insect tracking and wing kinematics measurement, and advanced hardware and software tools for vision-based simulation of flight systems.

University of Southern California - USA
Autonomous Flying Vehicle Project
http://www-robotics.usc.edu/~avatar/
The USC Autonomous Flying Vehicle Project was initiated in 1991. Since then, the Robotic Embedded Systems Laboratory has designed, built and conducted research with four robot helicopters, the latest being the 3rd-generation AVATAR (Autonomous Vehicle Aerial Tracking And Reconnaissance). Since the beginning of the project, a guiding design philosophy has been to create flying robots with high levels of autonomy. Initially, the research focus was on creating a reliable control mechanism for a model helicopter. Once that had been achieved, they focused on performing higher-level tasks with the helicopter. Besides stable autonomous flight, they are able to perform tasks such as GPS waypoint navigation, autonomous vision-based landing and autonomous sensor deployment. Research areas include autonomous landing on a moving target, deployment on a moving target, stealthy target pursuit and vision-based obstacle avoidance in 3D.

Oakland University - USA
Embedded Systems Research Laboratory
https://sites.google.com/a/oakland.edu/oar/Home
Research on methodologies, design, verification, and implementation of embedded systems. Current work focuses on online reconfiguration as a means to fault tolerance for real-time safety-critical distributed embedded systems. Other interests and experience are in the areas of rapid prototyping, product development, data acquisition and control, and biomedical and autonomous aerial vehicle systems.

Virginia Tech - USA
Unmanned Systems Lab
http://www.me.vt.edu/unmanned/index.html
Research dedicated to autonomous and remotely operated systems development and integration. Areas of expertise include air and ground vehicle design, ground control stations, vision and LIDAR systems, image and signal processing, communications, vehicle testing and acoustics, among others.


Bibliography

(2010). AR.Drone Parrot. http://ardrone.parrot.com.

(2010). Ascending Technologies. http://www.asctec.de.

(2010). United States Department of Defense.

(2011). Computer Vision Group, Polytechnic University of Madrid. Driverless vehicle test.

(2012). ArUco: a minimal library for augmented reality applications based on OpenCV. [Online]. Available: http://www.uco.es/investiga/grupos/ava/node/26.

(2012a). Computer Vision Group-UPM. http://www.vision4uav.eu.


(2012). Hybrid Systems Laboratory, UC Berkeley. Quadrotor and Kinect.

(2012). Motion capture system from Vicon.

(2012). Robot Operating System (ROS).

(2012). STARMAC-ROS package. Hybrid Systems Laboratory, UC Berkeley.

(2012). VisLab, University of Parma.

(2012b). YouTube channel, Computer Vision Group-UPM. http://www.youtube.com/colibriprojectUAV.



Ahmed, M., Bhatti, U., Al-Sunni, F., and El-Shafei, M. (2001). Design of a fuzzy servo-controller. Fuzzy Sets and Systems, 124(2):231–247.

Allon, G., Raviv, T., and Rubinstein, R. Y. (2005). Application of the cross-entropy method for the buffer allocation problem in a simulation based environment. In Annals of Operations Research. Springer Science and Business Media.

Altug, E., Ostrowski, J., and Mahony, R. (2002). Control of a quadrotor helicopter using visual feedback. In Robotics and Automation, 2002. Proceedings. ICRA '02. IEEE International Conference on, volume 1, pages 72–77.

Antequera, N., Santos, M., and de la Cruz, J. (2006). A helicopter control based on eigenstructure assignment. In Emerging Technologies and Factory Automation, 2006. ETFA '06. IEEE Conference on, pages 719–724.

Bai, Y., Zhuang, H., and Roth, Z. (2005). Fuzzy logic control to suppress noises and coupling effects in a laser tracking system. Control Systems Technology, IEEE Transactions on, 13(1):113–121.

Baturone, I., Moreno-Velo, F., Sanchez-Solano, S., and Ollero, A. (2004). Automatic design of fuzzy controllers for car-like autonomous robots. Fuzzy Systems, IEEE Transactions on, 12(4):447–465.

Behringer, R. and Muller, N. (1998). Autonomous road vehicle guidance from autobahnen to narrow curves. Robotics and Automation, IEEE Transactions on, 14(5):810–815.

Belmudes, F., Ernst, D., and Wehenkel, L. (2008). Cross-entropy based rare-event simulation for the identification of dangerous events in power systems. In Proceedings of the 10th IEEE International Conference on Probabilistic Methods Applied to Power Systems, pages 1–7.

Bertozzi, M., Broggi, A., Conte, G., Fascioli, A., and Fascioli, R. (1998). Vision-based automated vehicle guidance: the experience of the ARGO vehicle.

Beyeler, A., Zufferey, J.-C., and Floreano, D. (2009). Vision-based control of near-obstacle flight. Autonomous Robots, 27(3):201–219.

Bodur, M. (2007). An adaptive cross-entropy tuning of the PID control for robot manipulators. In World Congress on Engineering.

Boer, P. T. D., Kroese, D., Mannor, S., and Rubinstein, R. (2002). A tutorial on the cross-entropy method. Annals of Operations Research, 134.


Boer, P. T. D., Kroese, D. P., and Rubinstein, R. Y. (2004). A fast cross-entropy method for estimating buffer overflows in queueing networks. Management Science, 50:213–223.

Bonissone, P., Khedkar, P., and Chen, Y. (1996). Genetic algorithms for automated tuning of fuzzy controllers: a transportation application. In Fuzzy Systems, 1996. Proceedings of the Fifth IEEE International Conference on, volume 1, pages 674–680.

Bonissone, P. P., Chen, Y.-T., Goebel, K., and Khedkar, P. S. (1999). Hybrid soft computing systems: Industrial and commercial applications. In Proceedings of the IEEE.

Botev, Z. and Kroese, D. P. (2004a). Global likelihood optimization via the cross-entropy method with an application to mixture models. In Proceedings of the 36th conference on Winter simulation, pages 529–535.

Botev, Z. and Kroese, D. P. (2004b). Global likelihood optimization via the cross-entropy method with an application to mixture models. In Proceedings of the 36th conference on Winter simulation, WSC '04, pages 529–535. Winter Simulation Conference.

Bouabdallah, S. (2007). Design and control of quadrotors with application to autonomous flying. PhD thesis, Ecole Polytechnique Federale de Lausanne (EPFL), Lausanne.

Bouabdallah, S., Murrieri, P., and Siegwart, R. (2004). Design and control of an indoor micro quadrotor. In ICRA, pages 4393–4398.

Bouguet, J.-Y. (1999). Pyramidal implementation of the Lucas-Kanade feature tracker. Technical report, Intel Corporation, Microprocessor Research Labs, Santa Clara, CA 95052.

Bradski, G. R. (1998). Computer vision face tracking for use in a perceptual user interface. Intel Technology Journal, 1(Q2).

Broggi, A., Bertozzi, M., and Fascioli, A. (2000). Architectural issues on vision-based automatic vehicle guidance: The experience of the ARGO project. Real-Time Imaging, 6(4):313–324.

Campoy, P., Correa, J., Mondragon, I., Martinez, C., Olivares, M., Mejias, L., and Artieda, J. (2009a). Computer vision onboard UAVs for civilian tasks. Journal of Intelligent and Robotic Systems, pages 105–135.

Campoy, P., Correa, J. F., Mondragon, I., Martinez, C., Olivares, M., Mejias, L., and Artieda, J. (2009b). Computer vision onboard UAVs for civilian tasks. Journal of Intelligent and Robotic Systems, (1-3).


Caponetto, R., Diamante, O., Fargione, G., Risitano, A., and Tringali, D. (2003). A soft computing approach to fuzzy sky-hook control of semiactive suspension. Control Systems Technology, IEEE Transactions on, 11(6):786–798.

Celeste, F., Dambreville, F., and Le Cadre, J.-P. (2006). Optimal path planning using cross-entropy method. In Information Fusion, 2006 9th International Conference on, pages 1–8.

Chee, K. and Zhong, Z. (2013). Control, navigation and collision avoidance for an unmanned aerial vehicle. Sensors and Actuators A: Physical, 190:66–76.

Chekhlov, D. (1998). Towards Robust Data Association in Real-time Visual SLAM. PhD thesis, Faculty of Engineering, Department of Computer Science, University of Bristol, England.

Chen, M.-Y., Tzeng, H.-W., Chen, Y.-C., and Chen, S.-C. (2008). The application of fuzzy theory for the control of weld line positions in injection-molded part. ISA Transactions, 47(1):119–126.

Chitrakaran, V., Dawson, D., Kannan, H., and Feemster, M. (2006). Vision assisted autonomous path following for unmanned aerial vehicles. Decision and Control, 2006 45th IEEE Conference on, pages 63–68.

Cho, S.-K. and Lee, H.-H. (2002). A fuzzy-logic antiswing controller for three-dimensional overhead cranes. ISA Transactions, 41(2):235–243.

Corke, P. I. (1994). Visual control of robot manipulators – a review. In Visual Servoing, pages 1–31. World Scientific.

Criminisi, A., Reid, I. D., and Zisserman, A. (1999). A plane measuring device. Image Vision Comput., 17(8):625–634.

DARPA (2011). DARPA Grand Challenge. http://www.darpagrandchallenge.com.

Davison, A. J. (1998). Mobile Robot Navigation Using Active Vision. PhD thesis, Robotics Research Group, Department of Engineering Science, University of Oxford, England.

Davison, A. J. (2003). Real-time simultaneous localisation and mapping with a single camera. In Proceedings of the Ninth IEEE International Conference on Computer Vision - Volume 2, ICCV '03, pages 1403–, Washington, DC, USA. IEEE Computer Society.

de Boer, P.-T., Nicola, V. F., and Rubinstein, R. Y. (2000). Techniques for simulating difficult queueing problems: adaptive importance sampling simulation of queueing networks. In Winter Simulation Conference, pages 646–655.


de la Puente Yusty, P. (2012). Probabilistic mapping with mobile robots in structured environments. PhD thesis, Escuela Tecnica Superior de Ingenieros Industriales, Universidad Politecnica de Madrid, Spain.

De Wagter, C. and Mulder, J. (2005). Towards vision-based UAV situation awareness. AIAA Guidance, Navigation, and Control Conference and Exhibit.

Ding, W., Gong, Z., Xie, S., and Zou, H. (2006). Real-time vision-based object tracking from a moving platform in the air. In Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on, pages 681–685.

Dobrokhodov, V., Kaminer, I., Jones, K., and Ghabcheloo, R. (2006). Vision-based tracking and motion estimation for moving targets using small UAVs. American Control Conference, 2006, 6 pp.

Dufrene, W. R., Jr. (2004). AI techniques in uninhabited aerial vehicle flight. Aerospace and Electronic Systems Magazine, IEEE, 19(8):8–12.

Dusha, D. and Mejias, L. (2012). Error analysis and attitude observability of a monocular GPS/visual odometry integrated navigation filter. The International Journal of Robotics Research, 31(6):714–737.

Dvorak, A., Habiballa, H., Novak, V., and Pavliska, V. (2003). The concept of LFLC 2000: its specificity, realization and power of applications. Computers in Industry, 51(3):269–280.

Eftekhari, M., Marjanovic, L., and Angelov, P. (2003). Design and performance of a rule-based controller in a naturally ventilated room. Computers in Industry, 51(3):299–326.

Egbert, J. and Beard, R. W. (2011). Low-altitude road following using strap-down cameras on miniature air vehicles. Mechatronics, pages 831–843.

Haber, R. E., del Toro, R. M., and Gajate, A. (2010). Optimal fuzzy control system using the cross-entropy method. A case study of a drilling process. Information Sciences, 180:2777–2790.

Onieva, E., Naranjo, J., Milanes, V., Alonso, J., Garcia, R., and Perez, J. (2011). Automatic lateral control for unmanned vehicles via genetic algorithm. Applied Soft Computing, 11:1303–1309.

Espiau, B., Chaumette, F., and Rives, P. (1992). A new approach to visual servoing in robotics. Robotics and Automation, IEEE Transactions on, 8(3):313–326.

Fischler, M. A. and Bolles, R. C. (1981). Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381–395.


Frazzoli, E., Dahleh, M., and Feron, E. (2000). Trajectory tracking control design for autonomous helicopters using a backstepping algorithm. In American Control Conference, 2000. Proceedings of the 2000, volume 6, pages 4102–4107.

Frazzoli, E., Dahleh, M., and Feron, E. (2001). Real-time motion planning for agile autonomous vehicles. In American Control Conference, 2001. Proceedings of the 2001, volume 1, pages 43–49.

Fucen, Z., Haiqing, S., and Hong, W. (2009). The object recognition and adaptive threshold selection in the vision system for landing an unmanned aerial vehicle. In Information and Automation, 2009. ICIA '09. International Conference on, pages 117–122.

Fukunaga, K. and Hostetler, L. (1975). The estimation of the gradient of a density function, with applications in pattern recognition. Information Theory, IEEE Transactions on, 21(1):32–40.

Galichet, S. and Foulloy, L. (2003). Integrating expert knowledge into industrial control structures. Computers in Industry, 52(3):235–251. Special issue on Soft Computing in Industrial Applications.

Gomez, J. and Jamshidi, M. (2010). Fuzzy logic control of a fixed-wing unmanned aerial vehicle. In World Automation Congress (WAC), 2010, pages 1–8.

Gomez-Balderas, J., Flores, G., Garcia Carrillo, L., and Lozano, R. (2012). Tracking a ground moving target with a quadrotor using switching control. Journal of Intelligent and Robotic Systems, pages 1–14.

Haber, R., Alique, J., Alique, A., Hernandez, J., and Uribe-Etxebarria, R. (2003). Embedded fuzzy-control system for machining processes: Results of a case study. Computers in Industry, 50(3):353–366.

Haber, R. E., Haber-Haber, R., Jimenez, A., and Galon, R. (2009). An optimal fuzzy control system in a network environment based on simulated annealing. An application to a drilling process. Applied Soft Computing, 9(3):889–895.

Hamel, T. and Mahony, R. (2002). Visual servoing of an under-actuated dynamic rigid-body system: an image-based approach. Robotics and Automation, IEEE Transactions on, 18(2):187–198.

He, Z., Iyer, R. V., and Chandler, P. R. (2006). Vision-based UAV flight control and obstacle avoidance. In Proceedings of the American Control Conference, 5 pp.


Helvik, B. E. and Wittner, O. (2001). Using the cross-entropy method to guide/govern mobile agent's path finding in networks. In Proceedings of the 3rd International Workshop on Mobile Agents for Telecommunication Applications, pages 14–16. Springer Verlag.

Hill, J. and Park, W. T. (1979). Real time control of a robot with a mobile camera. In Proc. 9th ISIR, Washington, D.C., pages 233–246.

Hrabar, S. (2008). 3D path planning and stereo-based obstacle avoidance for rotorcraft UAVs. Intelligent Robots and Systems, 2008. IROS 2008. IEEE/RSJ International Conference on, pages 807–814.

Hunt, K., Sbarbaro, D., Zbikowski, R., and Gawthrop, P. (1992). Neural networks for control systems: A survey. Automatica, 28(6):1083–1112.

Hutchinson, S., Hager, G., and Corke, P. (1996). A tutorial on visual servo control. Robotics and Automation, IEEE Transactions on, 12(5):651–670.

Google Inc. (2010). http://googleblog.blogspot.com.es/2010/10/what-were-driving-at.html.

Irani, M. and Anandan, P. (2000). About direct methods. In Triggs, B., Zisserman, A., and Szeliski, R., editors, Vision Algorithms: Theory and Practice, volume 1883 of Lecture Notes in Computer Science, pages 267–277. Springer Berlin Heidelberg.

Isidori, A., Marconi, L., and Serrani, A. (2003). Robust nonlinear motion control of a helicopter. Automatic Control, IEEE Transactions on, 48(3):413–426.

Kadmiry, B. and Driankov, D. (2004). A fuzzy gain-scheduler for the attitude control of an unmanned helicopter. Fuzzy Systems, IEEE Transactions on, 12(4):502–515.

Kalyoncu, M. and Haydim, M. (2009). Mathematical modelling and fuzzy logic based position control of an electrohydraulic servosystem with internal leakage. Mechatronics, 19(6):847–858.

Keith, J. and Kroese, D. (2002). Sequence alignment by rare event simulation. In Simulation Conference, 2002. Proceedings of the Winter, volume 1, pages 320–327.

Klein, G. and Murray, D. (2007). Parallel tracking and mapping for small AR workspaces. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR '07, pages 1–10, Washington, DC, USA. IEEE Computer Society.

Kobilarov, M. (2011). Cross-entropy randomized motion planning. In Proceedings of Robotics: Science and Systems, Los Angeles, CA, USA.


Kumon, M., Udo, Y., Michihira, H., Nagata, M., Mizumoto, I., and Iwai, Z. (2006). Autopilot system for kiteplane. Mechatronics, IEEE/ASME Transactions on, 11(5):615–624.

Lai, J., Mejias, L., and Ford, J. J. (2011a). Airborne vision-based collision-detection system. Journal of Field Robotics, 28(2):137–157.

Lai, J. S., Mejias, L., and Ford, J. J. (2011b). Airborne vision-based collision-detection system. Journal of Field Robotics, 28(2):137–157.

Lange, S., Sunderhauf, N., and Protzel, P. (2008). Autonomous landing for a multirotor UAV using vision. In Workshop Proceedings of SIMPAR International Conference on Simulation, Modeling and Programming for Autonomous Robots, pages 482–491, Venice, Italy.

Lazanas, A. and Latombe, J.-C. (1992). Landmark-based robot navigation. In Algorithmica, pages 816–822.

Lazanas, A. and Latombe, J.-C. (1995). Motion planning with uncertainty: a landmark approach. Artificial Intelligence, 76:287–317.

Lee, C. (1990a). Fuzzy logic in control systems: fuzzy logic controller, part II. Systems, Man and Cybernetics, IEEE Transactions on, 20(2):419–435.

Lee, C. C. (1990b). Fuzzy logic in control systems: fuzzy logic controller, part I. IEEE Transactions on Systems, Man and Cybernetics, 20(2):404–418.

Lee, Y.-H. and Kopp, R. (2001). Application of fuzzy control for a hydraulic forging machine. Fuzzy Sets and Systems, 118(1):99–108.

Li, W. (1994). Optimization of a fuzzy controller using neural network. In Fuzzy Systems, 1994. IEEE World Congress on Computational Intelligence. Proceedings of the Third IEEE Conference on, pages 223–227.

Lucas, B. D. and Kanade, T. (1981). An iterative image registration technique with an application to stereo vision. In Proc. of the 7th IJCAI, pages 674–679, Vancouver, Canada.

Lygouras, J., Kodogiannis, V., Pachidis, T., Tarchanidis, K., and Koukourlis, C. (2008). Variable structure TITO fuzzy-logic controller implementation for a solar air-conditioning system. Applied Energy, 85(4):190–203.

Ma, Y., Kosecka, J., and Sastry, S. (2000). Linear differential algorithm for motion recovery: A geometric approach. International Journal of Computer Vision, 36:71–89.

Mahony, R. and Hamel, T. (2005a). Image-based visual servo control of aerial robotic systems using linear image features. Robotics, IEEE Transactions on, 21(2):227–239.


Mahony, R. and Hamel, T. (2005b). Image-based visual servo control of aerial robotic systems using linear image features. Robotics, IEEE Transactions on, 21(2):227–239.

Maidi, A., Diaf, M., and Corriou, J.-P. (2008). Optimal linear PI fuzzy controller design of a heat exchanger. Chemical Engineering and Processing: Process Intensification, 47(5):938–945.

Malhotra, R., Singh, N., and Singh, Y. (2011a). Soft computing techniques for process control applications. International Journal on Soft Computing, 2(3):32–44.

Malhotra, R., Singh, N., and Singh, Y. (2011b). Soft computing techniques for process control applications. International Journal on Soft Computing, 2:32–44.

Marinaki, M., Marinakis, Y., and Stavroulakis, G. E. (2010). Fuzzy control optimized by PSO for vibration suppression of beams. Control Engineering Practice, 18(6):618–629.

Martinez, C., Mondragon, I., Olivares-Mendez, M., and Campoy, P. (2011). On-board and ground visual pose estimation techniques for UAV control. Journal of Intelligent and Robotic Systems, 61:301–320. Springer Netherlands. doi:10.1007/s10846-010-9505-9.

Maruyama, A. and Fujita, M. (1999). Visual feedback control of rigid body motion based on dissipation theoretical approach. In Decision and Control, 1999. Proceedings of the 38th IEEE Conference on, volume 4, pages 4161–4166.

Meister, O., Frietsch, N., Ascher, C., and Trommer, G. (2008). Adaptive path planning for a VTOL-UAV. In Position, Location and Navigation Symposium, 2008 IEEE/ION, pages 1252–1259.

Mejias, L. (2006). Control visual de un vehiculo aereo autonomo usando deteccion y seguimiento de caracteristicas en espacios exteriores [Visual control of an autonomous aerial vehicle using feature detection and tracking in outdoor environments]. PhD thesis, Escuela Tecnica Superior de Ingenieros Industriales, Universidad Politecnica de Madrid, Spain.

Mejias, L., McNamara, S., Lai, J. S., and Ford, J. J. (2010). Vision-based detection and tracking of aerial targets for UAV collision avoidance. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Taipei International Convention Center, Taipei. IEEE.

Mejias, L., Mondragon, I., and Campoy, P. (2011). Omnidirectional bearing-only see-and-avoid for small aerial robots. In The 5th International Conference on Automation, Robotics and Applications, James Cook Hotel, Wellington, New Zealand. IEEE.


Mejias, L., Saripalli, S., Campoy, P., and Sukhatme, G. (2006a). Visual servoing approach for tracking features in urban areas using an autonomous helicopter. In Proceedings of IEEE International Conference on Robotics and Automation, pages 2503–2508, Orlando, Florida.

Mejias, L., Saripalli, S., Campoy, P., and Sukhatme, G. (2006b). Visual servoing of an autonomous helicopter in urban areas using feature tracking. Journal of Field Robotics, 23(3-4):185–199.

Mellado-Bataller, I., Mejias, L., Campoy, P., and Olivares-Mendez, M. A. (2012). Rapid prototyping framework for visual control of autonomous micro aerial vehicles. In 12th International Conference on Intelligent Autonomous Systems (IAS-12).

Merz, T., Duranti, S., and Conte, G. (2004). Autonomous landing of an unmanned helicopter based on vision and inertial sensing. In International Symposium on Experimental Robotics, Singapore.

Merz, T., Duranti, S., and Conte, G. (2006). Autonomous landing of an unmanned helicopter based on vision and inertial sensing. In Ang, M. and Khatib, O., editors, Experimental Robotics IX, volume 21 of Springer Tracts in Advanced Robotics, pages 343–352. Springer Berlin / Heidelberg.

Mikolajczyk, K. and Schmid, C. (2005). A performance evaluation of local descriptors. IEEE Trans. Pattern Analysis and Machine Intelligence, 27(10):1615–1630.

Mirzaei, A., Moallem, M., Dehkordi, B., and Fahimi, B. (2006). Design of an optimal fuzzy controller for antilock braking systems. Vehicular Technology, IEEE Transactions on, 55(6):1725–1730.

Mondragon, I., Olivares-Mendez, M., Campoy, P., Martinez, C., and Mejias, L. (2010). Unmanned aerial vehicles UAVs attitude, height, motion estimation and control using visual systems. Autonomous Robots, 29:17–34. doi:10.1007/s10514-010-9183-2.

Mondragon, I. F., Campoy, P., Martinez, C., and Olivares-Mendez, M. (2010). 3D pose estimation based on planar object tracking for UAVs control. In Robotics and Automation (ICRA), 2010 IEEE International Conference on, pages 35–41.

Montemerlo, M., Becker, J., Bhat, S., Dahlkamp, H., Dolgov, D., Ettinger, S., Haehnel, D., Hilden, T., Hoffmann, G., Huhnke, B., Johnston, D., Klumpp, S., Langer, D., Levandowski, A., Levinson, J., Marcil, J., Orenstein, D., Paefgen, J., Penny, I., Petrovskaya, A., Pflueger, M., Stanek, G., Stavens, D., Vogt, A., and Thrun, S. (2008). Junior: The Stanford entry in the Urban Challenge. Journal of Field Robotics.


Montiel, J., Civera, J., and Davison, A. (2006). Unified inverse depth parametrization for monocular SLAM. In Proceedings of Robotics: Science and Systems, Philadelphia, USA.

Moon, U.-C. and Lee, K. (2003). Hybrid algorithm with fuzzy system and conventional PI control for the temperature control of TV glass furnace. Control Systems Technology, IEEE Transactions on, 11(4):548–554.

Moses, A., Rutherford, M., and Valavanis, K. (2011). Radar-based detection and identification for miniature air vehicles. In Control Applications (CCA), 2011 IEEE International Conference on, pages 933–940.

Nandi, A. K. and Davim, J. P. (2009). A study of drilling performances with minimum quantity of lubricant using fuzzy logic rules. Mechatronics, 19(2):218–232.

Naranjo, J. E., Jimenez, F., Gomez, O., and Zato, J. G. (2012). Low level control layer definition for autonomous vehicles based on fuzzy logic. Intelligent Automation and Soft Computing, pages 333–348.

Nehmzow, U. (2003). Mobile Robotics: A Practical Introduction.

Nicol, C., Macnab, C., and Ramirez-Serrano, A. (2011). Robust adaptive control of a quadrotor helicopter. Mechatronics, 21(6):927–938.

Nikolos, I. K., Tsourveloudis, N. C., and Valavanis, K. P. (2004). A UAV vision system for airborne surveillance. In Robotics and Automation, 2004. Proceedings. ICRA '04. 2004 IEEE International Conference on, pages 77–83, New Orleans, LA, USA.

Nonami, K., Kendoul, F., Suzuki, S., Wang, W., and Nakazawa, D. (2010). Guidance and navigation systems for small aerial robots. In Autonomous Flying Robots, pages 219–250. Springer Japan.

Nourbakhsh, I. R., Bobenage, J., Grange, S., Lutz, R., Meyer, R., and Soto, A. (1999). An affective mobile robot educator with a full-time job.

Olivares, M. and Madrigal, J. (2007). Fuzzy logic user adaptive navigation control system for mobile robots in unknown environments. Intelligent Signal Processing, 2007. WISP 2007. IEEE International Symposium on, pages 1–6.

Olivares-Mendez, M., Campoy, P., Martinez, C., and Mondragon, I. (2009). A pan-tilt camera fuzzy vision controller on an unmanned aerial vehicle. In Intelligent Robots and Systems, 2009. IROS 2009. IEEE/RSJ International Conference on, pages 2879–2884.


Olivares-Mendez, M., Mejias, L., Campoy, P., and Mellado-Bataller, I. (2013a). Cross-entropy optimization for scaling factors of a fuzzy controller: A see-and-avoid approach for unmanned aerial systems. Journal of Intelligent & Robotic Systems, 69:189–205.

Olivares-Mendez, M., Mellado, I., Campoy, P., Mondragon, I., and Martinez, C. (2011). A visual AGV-urban car using fuzzy control. In Automation, Robotics and Applications (ICARA), 2011 5th International Conference on, pages 145–150.

Olivares-Mendez, M., Mondragon, I., Campoy, P., and Martinez, C. (2010). Fuzzy controller for UAV-landing task using 3D-position visual estimation. In Fuzzy Systems (FUZZ), 2010 IEEE International Conference on, pages 1–8.

Olivares-Mendez, M. A., Campoy, P., Mellado-Bataller, I., Mondragon, I., and Martinez, C. (2013b). Autonomous guided car using a fuzzy controller. In Recent Advances in Robotics and Automation. Springer Studies in Computational Intelligence. (Accepted).

Olivares-Mendez, M. A., Fu, C., Mondragon, I., Campoy, P., Mejias, L., and Martinez, C. (2013c). Vision-based PD-type fuzzy servoing of a quadrotor UAV: Application to aerial object following. Journal of Intelligent & Robotic Systems (Accepted), pages 189–205.

Olivares-Mendez, M. A., Mejias, L., Campoy, P., and Mellado-Bataller, I. (2012a). Quadcopter see and avoid using a fuzzy controller. In World Scientific Proceedings Series on Computer Engineering and Information Science, 2012.

Olivares-Mendez, M. A., Mejias, L., Campoy, P., Mellado-Bataller, I., and Mondragon, I. (2012b). UAS see-and-avoid using two different approaches of fuzzy control. In 2012 International Conference on Unmanned Aircraft Systems (ICUAS'12).

Olivares-Mendez, M. A., Mondragon, I., Campoy, P., and Martinez, C. (2010). Non-symmetric membership function for fuzzy-based visual servoing onboard a UAV. In Computational Intelligence in Foundations and Applications, 2010, pages 300–307.

Olivares-Mendez, M. A., Mondragon, I., Campoy, P., Martinez, C., and Correa, J. F. (2008). Fuzzy control system navigation using priority areas. In Computational Intelligence in Decision and Control, 2008, pages 987–996.

Olivares-Mendez, M. A., Mondragon, I., Campoy, P., Martinez, C., and Correa, J. F. (2009). Visual servoing using fuzzy controllers on an unmanned aerial vehicle. In EUROFUSE 2009, Workshop on Preference Modelling and Decision Analysis, 2009.


Olivares-Mendez, M. A., Mondragon, I., Cervera, P. C., Mejias, L., and Martinez, C. (2011). Aerial object following using visual fuzzy servoing. In First Workshop on Research, Development and Education on Unmanned Aerial Systems (RED-UAS 2011).

Onat, M. and Dogruel, M. (2004). Fuzzy plus integral control of the effluent turbidity in direct filtration. Control Systems Technology, IEEE Transactions on, 12(1):65–74.

Parhi, D. R. and Mohanta, J. C. (2011). Navigational control of several mobile robotic agents using petri-potential-fuzzy hybrid controller. Applied Soft Computing, 11:3546–3557.

Pestana, J., Sanchez, J. L., Mellado, I., Fu, C., and Campoy, P. (2012). AR Drone identification and navigation control at CVG-UPM. In XXXIII Jornadas Nacionales de Automatica, 2012.

Phillips, C., Karr, C. L., and Walker, G. (1996). Helicopter flight control with fuzzy logic and genetic algorithms. Engineering Applications of Artificial Intelligence, 9(2):175–184.

Moustris, G. P. and Tzafestas, S. G. (2011). Switching fuzzy tracking control for mobile robots under curvature constraints. Control Engineering Practice, 19:45–53.

Pradhan, S. K., Parhi, D. R., and Panda, A. K. (2009). Fuzzy logic techniques for navigation of several mobile robots. Applied Soft Computing, 9:290–304.

Precup, R.-E. and Hellendoorn, H. (2011a). A survey on industrial applications of fuzzy control. Computers in Industry, 62(3):213–226.

Precup, R.-E. and Hellendoorn, H. (2011b). A survey on industrial applications of fuzzy control. Computers in Industry, 62(3):213–226.

Precup, R.-E., Preitl, S., and Faur, G. (2003). PI predictive fuzzy controllers for electrical drive speed control: methods and software for stable development. Computers in Industry, 52(3):253–270. Special issue on Soft Computing in Industrial Applications.

SARTRE project (2012). http://www.sartre-project.eu/en/Sidor/default.aspx.

Puri, A., Valavanis, K., and Kontitsis, M. (2007). Statistical profile generation for traffic monitoring using real-time UAV based video data. In Control and Automation, 2007. MED '07. Mediterranean Conference on, pages 1–6.

Raimondi, F. M. and Melluso, M. (2008). Fuzzy motion control strategy for cooperation of multiple automated vehicles with passengers comfort. Automatica, 44(11):2804–2816.


Rathinam, S., Almeida, P., Kim, Z., Jackson, S., Tinka, A., Grossman, W., and Sengupta, R. (2007). Autonomous searching and tracking of a river using an UAV. American Control Conference, 2007. ACC '07, pages 359–364.

Rodriguez-Canosa, G. R., Thomas, S., del Cerro, J., Barrientos, A., and MacDonald, B. (2012). A real-time method to detect and track moving objects (DATMO) from unmanned aerial vehicles (UAVs) using a single camera. Remote Sensing, pages 1090–1111.

Rubinstein, R. Y. (1996). Optimization of computer simulation models with rare events. European Journal of Operational Research, 99:89–112.

Rubinstein, R. Y. (2002). Cross-entropy and rare events for maximal cut and partition problems. ACM Trans. Model. Comput. Simul., 12(1):27–53.

Rubinstein, R. Y. and Kroese, D. P. (2004). The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation, and Machine Learning. Springer, Berlin, Germany.

Saeedi, P., Lawrence, P., Lowe, D., Jacobsen, P., Kusalovic, D., Ardron, K., and Sorensen, P. (2005). An autonomous excavator with vision-based track-slippage control. Control Systems Technology, IEEE Transactions on, 13(1):67–84.

Salmasi, F. (2007). Control strategies for hybrid electric vehicles: Evolution, classification, comparison, and future trends. Vehicular Technology, IEEE Transactions on, 56(5):2393–2404.

Sanchez-Lopez, J. L., Campoy, P., Olivares-Mendez, M. A., Mellado-Bataller, I., and Galindo-Gallego, D. (2012). Adaptive control system based on linear control theory for the path-following problem of a car-like mobile robot. In IFAC Conference on Advances in PID Control, PID'12.

Sanderson, A. C. and Weiss, L. E. (1983). Adaptive visual servo control of robots. In Robot Vision (A. Pugh, ed.), pages 107–116.

Santos, M. and Dexter, A. (2001). Temperature control in liquid helium cryostat using self-learning neurofuzzy controller. Control Theory and Applications, IEE Proceedings, 148(3):233–238.

Santos, M. and Dexter, A. (2002). Control of a cryogenic process using a fuzzy PID scheduler. Control Engineering Practice, 10(10):1147–1152.

Santos, M., Lopez, V., and Morata, F. (2010). Intelligent fuzzy controller of a quadrotor. In Intelligent Systems and Knowledge Engineering (ISKE), 2010 International Conference on, pages 141–146.


Saripalli, S., Montgomery, J., and Sukhatme, G. (2002). Vision-based autonomous landing of an unmanned aerial vehicle. In Robotics and Automation, 2002. Proceedings. ICRA '02. IEEE International Conference on, volume 3, pages 2799–2804.

Saripalli, S., Montgomery, J. F., and Sukhatme, G. S. (2003). Visually-guided landing of an unmanned aerial vehicle. IEEE Transactions on Robotics and Automation, 19(3):371–381.

Saripalli, S. and Sukhatme, G. S. (2007). Landing a helicopter on a moving target. In Proceedings of IEEE International Conference on Robotics and Automation, pages 2030–2035, Rome, Italy.

Schouten, N., Salman, M., and Kheir, N. (2002). Fuzzy logic control for parallel hybrid vehicles. Control Systems Technology, IEEE Transactions on, 10(3):460–468.

Shabayek, A. E. R., Demonceaux, C., Morel, O., and Fofi, D. (2012). Vision based UAV attitude estimation: Progress and insights. Volume 65, pages 295–308.

Shakernia, O., Ma, Y., Koo, T. J., John, T., and Sastry, S. (1999). Landing an unmanned air vehicle: Vision based motion estimation and nonlinear control. Asian Journal of Control, 1:128–145.

Shi, J. and Tomasi, C. (1994). Good features to track. In 1994 IEEE Conference on Computer Vision and Pattern Recognition (CVPR'94), pages 593–600.

Shin, K.-K. and Oh, J.-H. (1993). A study on hovering of model helicopter using visual information and fuzzy controller. In Takamori, T. and Tsuchiya, K., editors, Robotics, Mechatronics and Manufacturing Systems, pages 737–742. Elsevier, Amsterdam.

Jang, J.-S. R. (1993). ANFIS: Adaptive-network-based fuzzy inference system. IEEE Transactions on Systems, Man, and Cybernetics, 23:665–685.

Shirai, Y. and Inoue, H. (1973). Guiding a robot by visual feedback in assembling tasks. Pattern Recognition, 5(2):99–108.

Shladover, S. (2006). PATH at 20: history and major milestones. In Intelligent Transportation Systems Conference, 2006. ITSC '06. IEEE.

Siciliano, B. and Khatib, O., editors (2008a). Springer Handbook of Robotics. Springer, Berlin, Heidelberg.

Siciliano, B. and Khatib, O., editors (2008b). Springer Handbook of Robotics. Springer.


Siegwart, R. and Nourbakhsh, I. R. (2004). Introduction to Autonomous Mobile Robots. Bradford Company, Scituate, MA, USA.

Simon, G. and Berger, M.-O. (2002). Pose estimation for planar structures. Computer Graphics and Applications, IEEE, 22(6):46–53.

Simon, G., Fitzgibbon, A., and Zisserman, A. (2000). Markerless tracking using planar structures in the scene. In Augmented Reality, 2000 (ISAR 2000). Proceedings. IEEE and ACM International Symposium on, pages 120–128.

Sira-Ramirez, H., Castro-Linares, R., and Liceaga-Castro, E. (2000). A Liouvillian systems approach for the trajectory planning-based control of helicopter models. International Journal of Robust and Nonlinear Control, 10(4):301–320.

Smith, III, J. F. (2007). Fuzzy logic planning and control for a team of UAVs. In Proceedings of the Eleventh IASTED International Conference on Artificial Intelligence and Soft Computing, pages 286–294.

Soatto, S. and Perona, P. (1994). Structure-independent visual motion control on the essential manifold. In Proc. of the IFAC Symposium on Robot Control (SYROCO), pages 869–876.

Spinellis, D. and Papadopoulos, H. T. (1999). A simulated annealing approach for buffer allocation in reliable production lines. Annals of Operations Research, 93:365–375.

Sturm, P. (2000). Algorithms for plane-based pose estimation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Hilton Head Island, South Carolina, USA, pages 1010–1017.

Sugeno, M., Hirano, I., Nakamura, S., and Kotsu, S. (1995). Development of an intelligent unmanned helicopter. In Fuzzy Systems, 1995. International Joint Conference of the Fourth IEEE International Conference on Fuzzy Systems and The Second International Fuzzy Engineering Symposium, volume 5, pages 33–34.

Talebi-Daryani, R. (1995). Digitale Gebaeudeautomation und Fuzzy Control [Digital building automation and fuzzy control]. University of Applied Sciences, Cologne.

Talebi-Daryani, R. (1999). Intelligent building for integrated building automation and building energy management system. KEIO University, Yokohama, Japan.

Talebi-Daryani, R. and Luther, C. (1998). Application of fuzzy control for intelligent building, part II: Fuzzy control of a chilling system. In Proc. of the World Auto. Conf., TSI Press Series, Albuquerque, NM, USA, pages 751–756.


Talebi-Daryani, R. and Plass, H. (1998). Application of fuzzy control for intelligent building, part I: Fuzzy control for an AC system. In Proc. of the World Auto. Conf., TSI Press Series, Albuquerque, NM, USA, pages 745–750.

Teuliere, C., Eck, L., and Marchand, E. (2011). Chasing a moving target from a flying UAV. In IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2011, pages 4929–4934.

Thomson Reuters (2010). 2010 Journal Citation Reports (JCR), Science Edition. Technical report, Thomson Reuters.

Thrun, S. (2002). Robotic mapping: A survey. In Exploring Artificial Intelligence in the New Millennium. Morgan Kaufmann.

Thrun, S., Montemerlo, M., Dahlkamp, H., Stavens, D., Aron, A., Diebel, J., Fong, P., Gale, J., Halpenny, M., Hoffmann, G., Lau, K., Oakley, C., Palatucci, M., Pratt, V., Stang, P., Strohband, S., Dupont, C., Jendrossek, L.-E., Koelen, C., Markey, C., Rummel, C., van Niekerk, J., Jensen, E., Alessandrini, P., Bradski, G., Davies, B., Ettinger, S., Kaehler, A., Nefian, A., and Mahoney, P. (2006). Winning the DARPA Grand Challenge. Journal of Field Robotics. Accepted for publication.

Tzafestas, S. G., Deliparaschos, K. M., and Moustris, G. P. (2010). Fuzzy logic path tracking control for autonomous non-holonomic mobile robots: Design of system on a chip. Robot. Auton. Syst., 58(8):1017–1027.

VICON (2012). www.vicon.com.

Welch, G. and Bishop, G. (1995). An introduction to the Kalman filter. Technical report, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA.

Wenzel, K., Masselli, A., and Zell, A. (2011). Automatic take off, tracking and landing of a miniature UAV on a moving carrier vehicle. Journal of Intelligent and Robotic Systems, 61:221–238.

Werbos, P. (1974). Beyond regression: New tools for prediction and analysis in the behavioral sciences. PhD thesis, Harvard University, Cambridge, MA.

Yang, S., Li, H., Meng, M.-H., and Liu, P. (2004). An embedded fuzzy controller for a behavior-based mobile robot with guaranteed performance. Fuzzy Systems, IEEE Transactions on, 12(4):436–446.

Zengin, U. and Dogan, A. (2011). Cooperative target pursuit by multiple UAVs in an adversarial environment. Robotics and Autonomous Systems, pages 1049–1059.

Zergeroglu, E., Dawson, D., de Queiroz, M., and Nagarkatti, S. (1999). Robust visual-servo control of robot manipulators in the presence of uncertainty. In Decision and Control, 1999. Proceedings of the 38th IEEE Conference on, volume 4, pages 4137–4142.

Zhang, H. and Ostrowski, J. (1999). Visual servoing with dynamics: control of an unmanned blimp. In Robotics and Automation, 1999. Proceedings. 1999 IEEE International Conference on, volume 1, pages 618–623.

Zhang, Y., Ji, C., Malik, W., O'Brien, D., and Edwards, D. (2007). Cross-entropy optimisation of multiple-input multiple-output capacity by transmit antenna selection. Microwaves, Antennas & Propagation, IET, pages 1131–1136.

Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330–1334.

Zhao, Y. and Collins, E. G., Jr. (2003). Fuzzy PI control design for an industrial weigh belt feeder. Fuzzy Systems, IEEE Transactions on, 11(3):311–319.

Zhao, Z., Yu, Z., and Sun, Z. (2006). Research on fuzzy road surface identification and logic control for anti-lock braking system. In Vehicular Electronics and Safety, 2006. ICVES 2006. IEEE International Conference on, pages 380–387.

Zheng, L. (1992). A practical guide to tune of proportional and integral (PI) like fuzzy controllers. In Fuzzy Systems, 1992. IEEE International Conference on, pages 633–640.

Zilouchian, A. and Jamshidi, M., editors (2000). Intelligent Control Systems Using Soft Computing Methodologies. CRC Press, Inc., Boca Raton, FL, USA, 1st edition.

Zou, H., Gong, Z., Xie, S., and Ding, W. (2006). A pan-tilt camera control system of UAV visual tracking based on biomimetic eye. Robotics and Biomimetics, 2006. ROBIO '06. IEEE International Conference on, pages 1477–1482.