Obstacle Detection and Avoidance for an Autonomous Farm Tractor



    OBSTACLE DETECTION AND AVOIDANCE

    FOR AN AUTONOMOUS FARM TRACTOR

    by

    Keith W. Gray

A thesis submitted in partial fulfillment of the requirements for the degree

    of

    MASTER OF SCIENCE

    in

    Electrical Engineering

    Approved:

_____________________                    _____________________
Kevin Moore                              Nick Flann
Major Professor                          Committee Member

_____________________                    _____________________
Kay Baker                                Noelle E. Cockett
Committee Member                         Interim Dean of Graduate Studies

UTAH STATE UNIVERSITY
Logan, Utah

    2000


    Copyright Keith W. Gray 2000

    All Rights Reserved


    Abstract

    Obstacle Detection and Avoidance

    for an Autonomous Farm Tractor

    by

    Keith W. Gray, Master of Science

    Utah State University, 2000

Major Professor: Dr. Kevin L. Moore
Department: Electrical and Computer Engineering

    The Center for Self-Organizing and Intelligent Systems (CSOIS) is developing

    autonomous ground vehicles for use in farming applications. CSOIS has targeted

    obstacle detection and avoidance as key challenges in real-world implementation of

    autonomous farming vehicles. A range sensor, giving real-time updates of the

    surrounding environment, performs obstacle detection. Obstacle avoidance is

    accomplished through a combination of global and local avoidance subsystems that deal

    with both known and unknown obstacles in the operating area of the vehicle. The global

    avoidance subsystem is a mission-level path planner that preplans paths around all known

    obstacles while the local avoidance subsystem maneuvers the tractor around unknown

    obstacles. The local avoidance subsystem consists of an obstacle filter and an obstacle

    avoidance algorithm. The obstacle filter reports unknown obstacles to the path planner

    and enables the avoidance algorithm if the preplanned path is blocked. The avoidance


    algorithm learns of all the known and unknown obstacles from the obstacle filter so that

    all obstacles can be safely avoided. The task of the avoidance algorithm is to maneuver

    the vehicle around the obstacle with the goal of returning to the preplanned path as

    quickly as possible. This thesis describes the unique challenges to autonomous farm

    vehicle navigation and the solutions reached that enabled success.

    (98 Pages)


    Acknowledgments

    I would like to thank the many people that helped me on this project. First and

foremost I would like to thank the John Deere project team: Mel, Mitch, Sarah, and Don.

    You four are the real reason the obstacle detection and avoidance worked on the tractor.

Your input, suggestions, and patience helped me see past the obstacles in my thinking

    and really shaped the final product. I would also like to thank my professors at Utah

    State University, Dr. Moore and Dr. Flann, who gave valuable input and suggestions as

    well as the opportunity to be part of this great team of engineers. I would like to thank

    John Deere Inc. for their belief in our team and for the growth that I received because of

the project. I want to thank my wife, Malia, who supported me through this whole ordeal and always encouraged me when I was burned out; even though I rarely saw her that summer, I love her deeply. Finally, I want to thank my God, who has blessed me with the

    desire to learn and grow and the ability to understand that I am nothing without him.

    Keith W. Gray


    Contents

    Page

    Abstract ............................................................................................................................. iii

    Acknowledgments.............................................................................................................. v

    List of Tables...................................................................................................................viii

    List of Figures ................................................................................................................... ix

    1 Introduction ................................................................................................................... 1

1.1 Motivation ............................................................................................................... 1
1.2 Summary of Results ................................................................................................ 2
1.3 Thesis Outline ......................................................................................................... 4

    2 Overall System............................................................................................................... 6

2.1 Base Station ............................................................................................................. 8
2.2 Master Node ............................................................................................................ 8
2.3 Path Planner Node ................................................................................................. 11
2.4 Sensor Node .......................................................................................................... 12

    3 Obstacle Detection Technology ................................................................................... 16

3.1 Obstacle Detection Sensors ................................................................................... 17
    3.1.1 CCD Camera ............................................................................................. 17
    3.1.2 Ultrasonic Sensors .................................................................................... 20
    3.1.3 Scanning Laser .......................................................................................... 21
    3.1.4 3D Scanning Lasers .................................................................................. 23
    3.1.5 Millimeter Wave Radar ............................................................................ 23
3.2 Comparison of Sensors .......................................................................................... 24
    3.2.1 Weather ..................................................................................................... 24
    3.2.2 Light .......................................................................................................... 24
    3.2.3 Detection Distance .................................................................................... 25
    3.2.4 Response Time .......................................................................................... 25
    3.2.5 Cost ........................................................................................................... 26
    3.2.6 Summary ................................................................................................... 27

    3.3 Detection Sensor Used on Tractor ........................................................................ 28


    4 Obstacle Avoidance ..................................................................................................... 31

    4.1 Global and Local Obstacle Avoidance.................................................................. 31

4.2 Vision of Obstacle Avoidance ............................................................................... 32
4.3 Obstacle Avoidance Used in Research .................................................................. 33
    4.3.1 Wall-Following ......................................................................................... 34
    4.3.2 Black Hole ................................................................................................ 36
    4.3.3 Path Computing ........................................................................................ 38
    4.3.4 Potential Fields ......................................................................................... 40
    4.3.5 Histogram ................................................................................................. 41

    5 Obstacle Detection and Avoidance ............................................................................ 52

    5.1 Obstacle Detection and Avoidance Data Flow ..................................................... 52

5.2 Obstacle Detection ................................................................................................ 53
5.3 Obstacle Filter ....................................................................................................... 58
    5.3.1 Obstacle Filter Flow of Control ................................................................ 60
    5.3.2 Dynamic Obstacles Overview .................................................................. 62

    5.4 Avoidance.............................................................................................................. 64

    6 Simulations and Vehicle Tests.................................................................................... 77

    7 Conclusions and Recommendations .......................................................................... 81

    References ........................................................................................................................ 84

    Appendix A ...................................................................................................................... 87


    List of Tables

    Table Page

    5.1 LMS Continuous Data Report Packet. .................................................................54

    5.2 PC to LMS Reset Packet. .....................................................................................56

    5.3 PC to LMS Request for Continuous Data. ...........................................................56

List of Figures

    5.12 Stop state. ............................................................................................................. 75

    5.13 Block diagram of steering control during avoidance. .......................................... 76

    6.1 Map of VFH+ simulation. .................................................................................... 78

    6.1 Map of VFH+ simulation (cont.). ........................................................................ 79


    Chapter 1

    Introduction

    This thesis presents the design and implementation of an obstacle detection and

    avoidance system for use on an automated tractor. The research described in this thesis

    was carried out as part of a larger project aimed at demonstrating obstacle detection and

    avoidance for an autonomous tractor operating in a typical farming environment.

    Specific contributions of the thesis include an assessment of the different obstacle

    detection sensors and obstacle avoidance algorithms used in autonomous vehicle

    research, and development of a new obstacle avoidance system for the tractor, including a

    unique obstacle avoidance algorithm. Simulation and testing results show the

    effectiveness of the approach.

    1.1 Motivation

    The idea of an autonomous tractor is not a new one. Every child who lived on a

    farm had dreams of a farm vehicle that could take over the plowing and harvesting

    chores, saving them from the boredom of driving long monotonous hours. With

    advances in GPS technology and computerization of farm equipment this dream is now

closer to reality. Now the simple, yet tedious, tasks of plowing and harvesting a field can be turned over to an autonomous tractor that will never tire. While an autonomous farm tractor is a realizable dream, there are still

several concerns that need to be addressed before such tractors become common on the family

    farm. Two of the biggest concerns are first, how the tractor will see its environment and,

    second, how the tractor will react to its environment.


    The ability to sense the surrounding environment is an important issue for any

    autonomous vehicle. As human beings we have the most powerful set of sensors and the

    best possible computer available for interpreting our environment. For an autonomous

    tractor, the question becomes: is it possible to find sensors that can gather sufficient

    environmental data for safe vehicle navigation? Then, once the environmental data has

    been captured, is there a fast and effective way to interpret the data and determine what it

    means to the tractor? In order to take the farmer out of the tractor these two questions

    must be answered in the affirmative.

    Once the environment around the tractor has been determined then the vehicle

    must react accordingly. A farmers reaction to an unexpected object in his way is nearly

instant and yet thought out. He knows that if he needs to swerve around a big rock that has become unearthed, he cannot cross over any hay that has been cut, or travel over

a fence line or ditch. As with sensing the environment, if the farmer is to be taken out of the tractor, the tractor's reaction must be fast and precise. It is

    necessary to be able to make correct decisions about the appropriate action needed in

    response to the data about the environment.

    1.2 Summary of Results

    The unique obstacle avoidance algorithm as developed by the author and presented

in this thesis was demonstrated before the project sponsors in a farm field west of Logan, Utah, in August 1999. Different scenarios were used to illustrate the need for obstacle

    detection and avoidance as well as demonstrate the effectiveness of the avoidance

    algorithm presented in this thesis. The different scenarios included an obstacle on the


    path of travel, obstacles to the right or left of the path of travel, known obstacles close to

    the path of travel, and dynamic, or moving, obstacles.

In the first scenario, an obstacle on the path of travel, the tractor was effective at determining the environment around the obstacle and deciding whether to

    avoid to the right or left of the obstacle. If the path of travel was close to the border of

the field, then the avoidance algorithm would avoid towards the center of the field. In the next scenario, obstacles to the right or left of the path, the avoidance algorithm correctly

    drove the tractor the shortest distance around the obstacle and back toward the path.

    The next scenario, known obstacle close to the path of travel, demonstrated the

    obstacle filter portion of the obstacle avoidance algorithm. When the obstacle detection

    sensors detected a known obstacle, such as the fence line surrounding the field, the

    obstacle filter did not inform the avoidance algorithm because the obstacle did not block

    the path of travel. If, however, the tractor was avoiding an unknown obstacle and a

    known obstacle was encountered in the process, that obstacle was also effectively

    avoided.

    The final scenario, dynamic obstacles, was demonstrated using a remote controlled

    truck. The truck was fitted to be tall enough and wide enough to be seen by the detection

    sensor. Three different scenarios with the dynamic obstacle showed the robustness of the

    avoidance algorithm. The first scenario had the dynamic obstacle approach the tractor

    from the right or left. As the tractor avoided the dynamic obstacle the obstacle continued

to move towards the tractor. When the obstacle came too close to the tractor, the avoidance

    algorithm stopped the tractor. The next scenario involving the dynamic obstacle had the


obstacle move in front of the tractor; as the tractor started avoiding the obstacle, the obstacle quickly moved out of sight of the tractor and normal driving resumed.

    The final scenario with the dynamic obstacle had the obstacle approach the vehicle

    causing the tractor to avoid. As the tractor attempted to maneuver around the obstacle the

    obstacle moved with the tractor, pushing the tractor further from the desired path of

    travel. When the tractor was more than one swath width away from the path of travel the

    avoidance algorithm stopped the tractor.
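The two stop conditions demonstrated above, a dynamic obstacle coming too close and the tractor being pushed more than one swath width off its path, can be sketched as a simple predicate. The clearance threshold and the function itself are hypothetical illustrations; the thesis does not give the actual values or code here.

```python
# Hypothetical sketch of the two stop conditions described above.
# MIN_CLEARANCE_M is an illustrative value, not the one used on the tractor.
MIN_CLEARANCE_M = 2.0

def should_stop(obstacle_range_m: float,
                cross_track_error_m: float,
                swath_width_m: float) -> bool:
    """Stop if a dynamic obstacle is too close, or if avoidance has
    pushed the tractor more than one swath width off its path."""
    too_close = obstacle_range_m < MIN_CLEARANCE_M
    too_far_off_path = abs(cross_track_error_m) > swath_width_m
    return too_close or too_far_off_path
```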

    1.3 Thesis Outline

    The thesis is organized as follows. Chapter 2 discusses the tractor system,

    illustrating how this work fits into the overall project. In Chapter 3 we discuss various

    obstacle detection technologies used for autonomous vehicles. We also describe the

    obstacle detection sensor used in this project. In Chapter 4 we examine obstacle

    avoidance algorithms other researchers have used for autonomous vehicles. Section 4.1

    discusses the difference between global and local or reactive avoidance. Section 4.2 lists

    the obstacle avoidance requirements that were specified for the overall system developed

    in the larger project. Section 4.3 then considers different obstacle avoidance algorithms

    and compares them against the requirements in Section 4.2. In particular, subsection

4.3.5 is a detailed description of one obstacle avoidance algorithm that served as the

    model for the avoidance algorithm developed by the author. Chapter 5 gives a detailed

    explanation of the obstacle detection and avoidance system that was developed for the

autonomous tractor system. This chapter also describes how the subsystems work together

    to successfully detect and interpret the environment surrounding the vehicle and then


    react according to this environment. Section 5.3 and Section 5.4 detail the unique

    approach to the obstacle avoidance algorithm. Chapter 6 presents the simulation and

    actual testing of the detection and avoidance system. Chapter 7 concludes the thesis with

    suggestions for future work.


    Chapter 2

    Overall System

    In this chapter we describe the overall autonomous tractor system for which the

    obstacle detection and avoidance algorithm was developed. Each subsystem in the

    system is discussed and we explain how the various subsystems interact with each other

    to create a fully autonomous farm tractor. This architecture was developed by the

    engineering team as part of the larger project [1].

    As seen in Fig. 2.1 the autonomous tractor system is made up of four main

    subsystems: the base station, the master node, the path planner node, and the sensor node.

    The interaction of each of the subsystems will be illustrated by describing a typical task

[Figure: tractor system architecture, showing the base station (User GUI, emergency stop button, wireless LAN modem) connected via TCP/IP and RS-232 links to the master node, path planner node, and sensor node on the tractor, with coax connections to the sensors and tractor control.]

Fig. 2.1: Overall system architecture.


    for the autonomous tractor. The goal of the mission is to perform a simple sweep

    pattern in a field. From the base station we use the User GUI to select the field of

    operation. The sensor node conveys to the path planner the exact location of the tractor

    and the current heading of the tractor. The path planner uses the position and attitude

    data and the desired field of operation to plan a sweep path. The path planner takes into

    account the turning radius of the vehicle as well as the size of the farming implement the

    tractor may be using while planning the sweep pattern. Once the path planner has

planned a mission, the mission is displayed on the User GUI at the base station. We then select the start-mission command on the GUI. At this point

    the path information generated by the path planner is forwarded to the master node and

    the sensor node. The master node converts the path information into driving and steering

    commands. These commands control the tractor along the path, thus performing the

    desired task. The sensor node provides feedback to the master node controller on the

tractor's position and attitude along the path. If the obstacle detection sensor senses an obstacle that will impede the tractor's travel, then the obstacle avoidance algorithm

    calculates driving and steering commands that will safely avoid the obstacle. These

    commands are relayed to the master node, which carries out the commands. If for some

    reason we feel that the tractor is behaving in an unsafe manner we have an emergency

    stop button at the base station. This button allows us to shut off the tractor engine and

    engage the emergency brake.


    The remainder of this chapter will discuss each of these different subsystems and

    how they contribute to the vehicle operation as well as to obstacle detection and

    avoidance.

    2.1 Base Station

The base station's contribution to vehicle operation and obstacle detection and avoidance is perhaps the most important of all the subsystems. The reason for this

is the simple old cliché that "seeing is believing." If a farmer is to turn an

    autonomous tractor loose in his field where the liabilities are high if something goes

    wrong, then he will want accurate information about the tractor and he will want to stop it

    immediately if something goes wrong. This is what the base station provides. The base

    station is the communication link between the farmer and the tractor. The base station

    consists of a graphical user interface (GUI) and a tractor kill button. The GUI allows the

    farmer to task the tractor and then displays in real-time where the tractor is and what it is

    doing. The base station also provides a graphical display of the instrument panel from

    the tractor cockpit. The GUI also displays in real-time the environment around the

    tractor by displaying the obstacle detection sensor information. If the farmer feels the

    need to stop the tractor at any time during the course of the mission he can flip a switch

and instantly stop the tractor's engine and lock the brakes.

    2.2 Master Node

    The master node performs all the functions needed to drive and steer the tractor.

    During obstacle avoidance the avoidance algorithm calculates the proper heading for


[Figure: master node architecture, showing the master controller's components (path tracker, motion exception handler, drive controller, status broadcaster, listeners, script parser and trajectory generator, watchdogs, position and attitude processor, path store) and the vehicle state, drive vector, and path segment data exchanged with the tractor, nav/sensor node, mission planner, and joystick.]

Fig. 2.2: Master node architecture.

    collision-free travel, but it must communicate this heading to the master node whose task

    is to steer the tractor to that heading. In the remainder of this section we describe how

    the master node controls the tractor by discussing the architecture shown in Fig. 2.2 [1].

    In order to operate the tractor the master node receives data from four different

    sources. The tractor itself has an on-board computer that interfaces with the master node.

    Through this link the master node can monitor engine RPM, throttle, and other

    parameters necessary for driving. This same link also allows the master node to make

any changes to the tractor's driving, such as increasing or decreasing the throttle and

    steering angle. The joystick is the second source of input data into the master node. The

    input from the joystick is interpreted as steering angle and throttle speed. All of the path

    information computed by the path planner is the third data input into the master node.


    The path information is sent as start points and end points of the desired path along

with the radius of the path. If the radius is zero, then the path is a straight line; if the radius is greater than zero, then the path is a circle. The final input to the master node is

from the sensor node. The sensor node communicates the vehicle's position and

    heading. If the sensor node is avoiding obstacles then a motion exception is also

    communicated to the master, indicating the speed and steering angle needed to avoid the

    obstacle.

In addition to receiving data from the other nodes, the master node transmits data to the planner and sensor nodes. In order to keep the other nodes up to date with the current vehicle position and heading (the heading and position from the sensor are augmented in the master node with vehicle odometry for more accurate readings) and other status

    information, the master node has a status broadcaster. The status broadcaster updates

tractor parameters every 250 ms and can be accessed by the sensor and planner nodes

    through a master listener function.
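The broadcast/listener exchange described above can be sketched as follows. The 250 ms period is taken from the text, but the class, method, and field names are hypothetical, not the project's actual interfaces.

```python
BROADCAST_PERIOD_S = 0.25  # the 250 ms update interval described above

class StatusBroadcaster:
    """Minimal sketch of the master node's status broadcaster.

    Listeners (e.g. the sensor and planner nodes) register a callback
    and receive a copy of the latest vehicle status each cycle.
    """
    def __init__(self):
        self.listeners = []
        self.status = {"x": 0.0, "y": 0.0, "heading_deg": 0.0}

    def register(self, callback):
        self.listeners.append(callback)

    def broadcast_once(self):
        # In the real node this would fire on a 250 ms timer.
        for callback in self.listeners:
            callback(dict(self.status))
```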

    The reason for the inter-node communication is so the master node knows exactly

    where the tractor needs to be and which direction it needs to be heading. It is then the

master node's responsibility to drive the tractor to where it needs to be and steer it in the direction it needs to be headed. This is accomplished through the drive controller,

    which resolves path commands from the path planner node and position and attitude

    information from the sensor node. In the case of obstacle avoidance the path commands

    from the planner node are disregarded as the sensor node provides the needed driving

    information to the drive controller.


    2.3 Path Planner Node

The path planner's contribution to obstacle detection and avoidance is to provide

mission information to the master and sensor nodes and to give a global understanding of the tractor's environment. In the remainder of this section we discuss how the path

    planner creates missions and informs the other nodes of this mission. We also discuss

how the planner gives a global understanding of the tractor's environment.

    Path planning is done using a terrain map, a possibility graph, and a search. The

    terrain map stores terrain information of the area where the vehicle will be driven,

    including locations of roads, fields, obstacles, etc. The possibility graph, which

    represents possible paths through the map, is made up of nodes and edges. Nodes can be

    compared to street intersections and edges to streets. Traveling from node to node is

    done by following an edge or a combination of edges that connects the two nodes.

    Adding nodes and edges to the graph extends the graph and allows travel to previously

    unattainable areas. Each edge is assigned a cost based on terrain information stored in

    the map. An A* search algorithm, as described in [2], uses edge costs to find optimal

    paths from one point to another in the graph.
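Conceptually, the planner's search over the possibility graph can be sketched as a generic, textbook A* along the lines of [2]; the graph representation (an adjacency dict of weighted edges) and the heuristic are illustrative assumptions, not the project's implementation.

```python
import heapq

def a_star(edges, start, goal, heuristic):
    """Generic A* over a possibility graph.

    edges: dict mapping node -> list of (neighbor, edge_cost) pairs.
    heuristic: admissible estimate of remaining cost to `goal`.
    Returns the lowest-cost node path, or None if goal is unreachable.
    """
    # Each frontier entry: (estimated total cost, cost so far, node, path).
    frontier = [(heuristic(start), 0.0, start, [start])]
    best_cost = {start: 0.0}
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        for neighbor, edge_cost in edges.get(node, []):
            new_cost = cost + edge_cost
            if new_cost < best_cost.get(neighbor, float("inf")):
                best_cost[neighbor] = new_cost
                estimate = new_cost + heuristic(neighbor)
                heapq.heappush(frontier,
                               (estimate, new_cost, neighbor, path + [neighbor]))
    return None
```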

    In past projects, obstacle avoidance was done by rejecting edges that passed

    through a known obstacle. This simple technique ensures that obstacles will be avoided;

    however, it reduces the connectivity of the graph and may unnecessarily prevent access to

    certain areas of the map. The path planning for the tractor uses edge-building techniques

that take obstacle shape into account and build edges that circumvent obstacles.


    Once the mission has been constructed it is broken down into path segments.

    Each path segment gives a start and endpoint of the path and a radius for the path. A

    radius of zero indicates a straight path. Then, there are effectively two types of paths:

lines and curves. All the path segments for the entire mission are sent to the master and sensor nodes and stored in order of execution.
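The segment format described above, a start point, an end point, and a radius where zero means a straight line, can be illustrated with a minimal sketch. The field and method names here are hypothetical, not the project's actual message format.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class PathSegment:
    """One mission segment: endpoints plus a radius, as described above.
    Field names are illustrative, not the project's actual format."""
    x0: float
    y0: float
    x1: float
    y1: float
    radius: float  # 0 => straight line, > 0 => circular arc

    def is_line(self) -> bool:
        return self.radius == 0.0

    def chord_length(self) -> float:
        """Straight-line distance between the segment endpoints."""
        return hypot(self.x1 - self.x0, self.y1 - self.y0)
```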

    Planning a mission is not the only function of the path planner. The other

    function is to provide the sensor node with a global understanding of the environment in

which the tractor operates. This allows the sensor node to react during obstacle avoidance not only to the objects the detection sensor sees, but also to objects the sensor might not see but that are still present. An example of this might be if the tractor is

    avoiding an obstacle that is on the desired path of travel with no other obstacles in view

of the detection sensor. From the obstacle detection and avoidance algorithm's perspective, avoiding to the right or left of the obstacle makes no difference. However, if

    the map of the field indicated a pond was to the left of the obstacle then the obstacle

should be avoided by leaving the path to the right. Thus the global understanding provided by the path planner assists the avoidance algorithm by indicating obstacles other than those seen

    by the detection sensor.
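The side-selection idea in this paragraph, consulting the planner's map before deciding which way to leave the path, can be sketched roughly as follows. The function and its inputs are illustrative assumptions, not the tractor's actual avoidance logic.

```python
def choose_avoidance_side(left_blocked: bool, right_blocked: bool):
    """Pick a side to leave the path on, using the planner's map of
    known obstacles (a sketch of the idea, not the actual algorithm).
    Returns 'left', 'right', or None if both sides are blocked."""
    if left_blocked and right_blocked:
        return None          # no safe side: the tractor must stop
    if left_blocked:
        return "right"       # e.g. a mapped pond lies to the left
    if right_blocked:
        return "left"
    return "right"           # arbitrary default when both sides are clear
```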

    2.4 Sensor Node

    Having discussed how the path planner and master nodes contribute to obstacle

    detection and avoidance we now focus on the node where obstacle detection and

    avoidance are performed. In this section we take a closer look at how obstacle detection

    and avoidance are performed by examining the architecture of the sensor node shown in


    correct for bias drift from the integration error.

    Recall the two questions raised in the introduction: (1) is there a sensor that can perceive the environment around the tractor well enough to provide data for collision-free driving while accomplishing the predetermined mission as quickly as possible, and (2) how can we interpret these data and make correct decisions about the actions needed to avoid obstacles? The question of reporting the environment is addressed in Chapter 3. After the sensor reports the environment we must then interpret the data as quickly as possible so that avoidance headings can be computed.

    This leads us to the obstacle filter. The obstacle filter is a way of interpreting the

    data from the detection sensor and combining that data with the map of known obstacles

    from the path planner to give an up-to-date map of the environment surrounding the

    vehicle. This up-to-date map is constructed by combining all known obstacles with any

    new, previously unknown obstacles that the detection sensor sees. The detection sensor will see many obstacles, but to avoid dealing with the same obstacle twice, the obstacle filter ignores any obstacle reported by the detection sensor that was already known by

    the path planner. An up-to-date map allows the tractor to make effective collision-free

    heading calculations.
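    As a sketch, the filtering rule described above amounts to a duplicate check against the known-obstacle list. The point representation and matching tolerance below are illustrative assumptions, not the thesis implementation:

```python
def filter_obstacles(known, sensed, tol=1.0):
    """Merge known obstacles with newly sensed ones.

    known, sensed: lists of (x, y) obstacle positions in meters.
    Any sensed obstacle within `tol` meters of a known one is treated
    as a re-detection and ignored, so each obstacle is handled only
    once. The returned list is the up-to-date local map.
    """
    def is_known(p):
        return any((p[0] - k[0]) ** 2 + (p[1] - k[1]) ** 2 <= tol ** 2
                   for k in known)

    new = [p for p in sensed if not is_known(p)]
    return known + new

known = [(10.0, 0.0)]
sensed = [(10.2, 0.1), (20.0, 3.0)]   # first reading re-sees the known obstacle
assert filter_obstacles(known, sensed) == [(10.0, 0.0), (20.0, 3.0)]
```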

    With the obstacle filter providing a map of the surrounding environment, the

    obstacle controller can calculate proper heading corrections to avoid striking obstacles.

    The algorithm for computing these heading corrections is the subject of Chapter 5. While the

    tractor is avoiding obstacles the obstacle controller also ensures that the master node is

    steering the vehicle to the proper avoidance heading. It does this by monitoring the


    vehicle heading and issuing steering commands to the master node that will bring the

    tractor to the calculated avoidance heading.
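    The monitoring loop described above is, in essence, a feedback loop on heading. A minimal proportional sketch follows; the gain and the command format are hypothetical, not the master node's actual interface:

```python
def steering_command(current_heading, target_heading, gain=0.8):
    """Return a steering correction (degrees) that turns the tractor
    toward the calculated avoidance heading, taking the shorter way
    around the circle."""
    # Wrap the heading error into (-180, 180] so the vehicle never
    # turns the long way around.
    error = (target_heading - current_heading + 180.0) % 360.0 - 180.0
    return gain * error

# Heading 350 deg, avoidance heading 10 deg: command a +16 deg
# correction (toward +20 deg of error), not a -272 deg turn.
assert abs(steering_command(350.0, 10.0) - 16.0) < 1e-9
```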

    In this chapter we have discussed the tractor subsystems and how they all work

    together to operate the vehicle and to aid in the detection and avoidance of obstacles. In

    the next chapter we describe in more detail one of the subsystems, the obstacle detection

    sensor.


    Chapter 3

    Obstacle Detection Technology

    Having discussed the tractor system as a whole, in this chapter we discuss the first

    topic of this thesis, the obstacle detection sensor. First we consider why the detection

    sensor is so vital to autonomous vehicles, whether they be tractors, airplanes, or robots.

    We then examine what other researchers have used for detection sensors and what they

    have to say about the various technologies that are available. Finally, we briefly describe

    the obstacle detection sensor that was used on the tractor throughout the course of the

    project.

    For people to truly be taken out of the loop in the field of autonomous vehicles,

    all of the functions that man performs must be mimicked. A microprocessor takes the

    place of the brain; complex algorithms and programs take the place of decision-making;

    actuators take the place of feet to accelerate or brake, and hands to steer. Unfortunately,

    the most important feature of a person is the hardest to mimic. The ability of a person to

    see and take in so much with his eyes is key. Without eyes the microprocessor crunches

    the complex algorithms and programs blindly. The program expects a leisurely cruise in

    the country, but the vehicle may actually be travelling on perilous terrain. This is the

    biggest limitation for autonomous vehicles, with no single perfect solution. Before deciding which detection sensor to place on the tractor, we examined many different

    options. Section 3.1 discusses the different detection sensors that other researchers are

    using in this field and then Section 3.2 discusses the detection sensor used for the tractor

    project.


    3.1 Obstacle Detection Sensors

    An examination of various research studies on autonomous vehicles and robots shows that there are only five or six different types of effective obstacle detection sensors. These sensors range in price from inexpensive to very expensive, and each has its own advantages and disadvantages for different applications. The sensors are not limited to obstacle detection. Some sensors are used for vehicle localization (the detection of objects at known locations to determine the vehicle's location in a map, as explained in [3] and [4]). Other sensors may be used to extract features of plants for plant characterization, allowing an autonomous robot to apply the proper fertilizer in the proper amounts to different plants, as explained by Harper [5]. Still other sensors are used to create maps of passable and impassable terrain, as explained by Howard [6] and by Matsumoto [7]. While not all of these applications address our problem of obstacle detection directly, they do give us an idea of how these different sensors might work in our setup. If a sensor can be used effectively to create accurate maps of the vehicle's environment, then it should also be possible to use it to detect obstacles in a farming environment. What follows is a discussion of each of the different sensors in use today; their advantages and disadvantages are highlighted, and the possibility of use on the tractor project is examined.

    3.1.1 CCD Camera

    The first type of detection sensor we consider is the CCD camera. The camera is considered a passive sensor since it requires ambient light to illuminate its field of view. The camera also happens to be similar, in a crude sense, to the human eye.


    The background is often the same color as the plants Harper [5] wants to detect, making cameras useless. The human brain has been trained to recognize so many different patterns and color schemes that we can instantly decipher the data from our eyes and distinguish backgrounds from obstacles. The computing power required for this is far beyond our capabilities. If the computer can be trained to recognize the background, then a determination of where obstacles are on that background can be accomplished. Howard [6] used visual data obtained from cameras to build maps. To do this he assumes the floor is a constant color, thus making the distinction between passable and impassable areas of the map. Navigation was accomplished in [7] by placing patterns on the floors. These patterns are easily deciphered by the camera, and the robot then travels along the pattern. Soto [13] used five color cameras for outdoor surveillance. He assumes that all background is grass and is therefore green, which means anything not green is considered an intruder. Another big problem with stereo vision is dust, rain, and snow. Foessel [15] described the problems that cameras have in blowing snow; his research was conducted in polar environments where there is an abundance of blowing snow. Corke [16] described the problems associated with cameras and the dust typical of mining environments. To the sensor these conditions appear as obstacles when they are not. Again, human eyes adjust to these problems and filter them out. Indoor environments, where lighting and coloring can be controlled, seem the best place to use stereo vision. The farming environment offers too many challenges for stereo vision to be used effectively as an obstacle detection sensor.


    3.1.2 Ultrasonic Sensors

    A second type of detection sensor is the ultrasonic sensor, or sonar. Sonar is the most widely used sensor for obstacle detection because it is cheap and simple to operate. Its use ranges from the robot hobbyist to the serious researcher. In [3] and [11] sonar is used in vehicle localization and navigation, respectively. In [13], [17], [18], and [20] sonar is used for obstacle detection. Borenstein [21-28] used a sonar ring around his robot for obstacle detection, which allowed him to develop an obstacle avoidance algorithm. Harper [5] states that sonar works by blanketing a target with ultrasonic energy; the resultant echo contains information about the geometric structure of the surface, in particular the relative depth information. Unfortunately, a key drawback of sonar is one sensor for one distance reading; that is, in order to obtain an adequate picture of the environment around the vehicle, many sensors must be used together. It is not uncommon to have several rings of sonar sensors on omni-directional vehicles, because the vehicle can travel in any direction at any time, as is the case with Borenstein [22], [23], [26]. The number of sensors required for an adequate field of view is not the only drawback to sonar. Borenstein [28] explains that "even though ultrasonic ranging devices play a substantial role in many robotics applications, only a few researchers seem to pay attention to (or care to mention) their limitations." He goes on to explain that for ultrasonic sensors to be effective they must be as perpendicular to the target as possible to receive correct range data. This is because the reflected sound energy will not be deflected toward the sensor if the two are not perpendicular to each other. Borenstein


    [25] goes on to list three reasons why ultrasonic sensors are poor sensors when accuracy is required. These reasons are:

    1. Poor directionality that limits the accuracy in determination of the spatial position of an edge to 10-50 cm, depending on the distance to the obstacle and the angle between the obstacle surface and the acoustic beam.

    2. Frequent misreadings that are caused by either ultrasonic noise from external sources or stray reflections from neighboring sensors ("crosstalk"). Misreadings cannot always be filtered out, and they cause the algorithm to "see" nonexistent edges.

    3. Specular reflections that occur when the angle between the wave front and the normal to a smooth surface is too large. In this case the surface reflects the incoming ultrasound waves away from the sensor, and the obstacle is either not detected at all or (since only part of the surface is detected) seen as much smaller than it is in reality.

    Perhaps the biggest limitation of sonar for the tractor application is its limited range of sight. Given a one-ton tractor traveling at 5-8 mph, by the time the sonar detected an obstacle it would already be too late to begin safe avoidance. However, despite all the limitations of ultrasonic sensors, the technique can still have a good use: sonar makes a good safety-net sensor. If the other detection sensors miss an obstacle, or for some reason the vehicle gets close enough for the sonar to be triggered, then a halt state can be executed and the vehicle stopped immediately.

    3.1.3 Scanning Laser

    A third type of detection sensor is a scanning laser. Scanning lasers use a laser

    beam reflected off a rotating mirror. The beam reflects off the mirror and out to a target

    and then is returned to the sensor for range calculations. Two main types of scanning

    lasers are used. The first emits a continuous beam and from the return of that beam range


    data are calculated. A laser of this type, provided by Acuity Research Inc., was used by Torrie in a different CSOIS project [20]. The Acuity Research laser is a Class 3 laser and is not recommended because it is not eye safe. The second type is a pulsed laser that sends out many laser pulses and averages the range data over the pulses to determine the range to an object. This type of laser is a Class 1 laser and is preferable because it is eye safe. Another advantage of the pulsed laser is that measurement errors can be filtered out better than with continuous-beam lasers. Scanning lasers, as they are called, are starting to make their presence known in autonomous vehicle research. Bailey [4] uses laser scanners to determine the position of his robots. He knows the exact location of all obstacles and, by using the laser scanner, can accurately calculate the robot's position based on the distance to the obstacles and the angle with respect to the obstacles. Apostolopoulos [10] uses scanning lasers along with cameras to find meteorites in the Antarctic. In [11], [16], and [20] scanning lasers are used to detect obstacles, which in turn aids safe navigation. Scanning lasers give better range data with far fewer computational demands than camera detection, and the resolution is considerably better than that of ultrasonic sensors. The angular resolution on a Sick Optic Inc. Laser Measurement System (LMS) laser range finder can be as small as 0.25° within a field of view of 180°.

    Much of the research that places autonomous vehicles outdoors has moved away from

    stereo vision to scanning lasers [16, 20] because no matter the amount of light, range data

    are available. The scanning laser does have its drawbacks. Most notable is that the scans

    are planar, which means that if an obstacle is above or below the scanning plane then

    nothing is detected. The sensors also suffer from dust, rain, and blowing snow, causing


    false readings, as was noted by Foessel [15], who operated his robots in polar

    environments.

    3.1.4 3D Scanning Lasers

    3D scanning lasers make up a fourth type of detection sensor. The reason they are

    classified differently than 2D scanning lasers is the jump in price and complexity.

    Langer [14] describes a 3D-scanning laser his company has developed and how it might

    be used. On the surface a 3D scanning laser looks attractive, but for real-time systems it is infeasible: Langer states the time required to scan 8000 pixels is 80 seconds. The other drawback is the price of a unit. The author priced out another 3D scanning unit with the same specifications as [14] and it was over $150,000.00, too much to put on any farm vehicle even if the scanning speed were real-time.

    3.1.5 Millimeter Wave Radar

    The final type of detection sensor we consider is millimeter wave radar. This

    sensor also suffers from a high price tag and not many researchers have the means to use

    it. One place it has been used is at Carnegie Mellon University, on their Antarctic rover.

    Foessel [15] explains why the harsh polar environment renders stereo vision and scanning lasers useless: the blowing snow causes too many false objects. When millimeter wave radar was tested in this environment, blowing snow had almost no effect on millimeter-wave range estimation.

    angular resolution. When properly gimbaled the radar can collect a 3D image of the

    environment in the path of the autonomous vehicle.


    3.2 Comparison of Sensors

    We have examined five different types of obstacle detection sensors and have

    discussed some of their advantages and disadvantages separately. Now we will compare

    each sensor side-by-side for our application. Given the farming environment we can set

    five criteria for selecting the best obstacle detection sensor: operation in any weather,

    operation in any light, detection out to at least 15 meters (50 feet), fast response time, and

    cost that is significantly less than that of the tractor. Ideally the best sensor will meet our

    five criteria, but if this is not possible the one that comes closest will be selected.

    3.2.1 Weather

    The farming environment brings out the whole spectrum where weather is

    concerned. The sensor must be able to handle dust, rain, and snow, all of which can be

    blowing and swirling at any given time. Because of this the camera, sonar, and scanning

    laser can give false readings. The only suitable detection sensor is the millimeter wave

    radar.

    3.2.2 Light

    On the farm the day starts in the darkness of early morning; the light then gradually grows more intense until its noonday peak, and it recedes much as it began as the day winds down. If, however, the weather changes throughout the day, the amount of light may also change. This means that our detection sensor must operate in any light. From the start this excludes CCD cameras. They

    function on ambient light and any changes in the lighting can affect the interpretation of


    their data. Sonar, scanning lasers, and millimeter wave radar are relatively unaffected by changing light conditions and would all be good choices on this criterion.

    3.2.3 Detection Distance

    For a tractor that travels at 8.8 ft/sec it only takes 3.4 seconds for the tractor to

    travel 30 feet. The obstacle detection sensor must determine the presence of an obstacle

    and the avoidance algorithm must calculate an avoidance heading in under 3.4 seconds,

    preferably well under 3.4 seconds. This means that the detection sensor must be able to

    detect obstacles soon enough that avoidance is performed safely. With this limitation we

    determined that the obstacle detection sensor used on the tractor must have a maximum

    detectable range of at least 15 meters (50 feet). This gives plenty of time for obstacle

    identification and avoidance heading calculations. This requirement excludes most

    ultrasonic sensors, which have a maximum range of about 5 meters. CCD cameras,

    scanning laser, 3D scanning laser, and millimeter wave radar all have maximum

    detectable ranges of at least 15 meters or more.
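    The timing budget above can be verified with a few lines of arithmetic:

```python
SPEED_FT_S = 8.8           # tractor speed in ft/s (roughly 6 mph)
FT_PER_M = 3.2808          # feet per meter

def time_to_cover(distance_ft, speed=SPEED_FT_S):
    """Seconds available before the tractor reaches an object
    first seen at the given distance."""
    return distance_ft / speed

# At 8.8 ft/s the tractor covers 30 ft in about 3.4 s ...
assert round(time_to_cover(30.0), 1) == 3.4
# ... while a 15 m (~49 ft) sensing range buys roughly 5.6 s,
# and a 5 m sonar range only about 1.9 s.
assert round(time_to_cover(15 * FT_PER_M), 1) == 5.6
assert round(time_to_cover(5 * FT_PER_M), 1) == 1.9
```

    This makes the margin concrete: the 15-meter requirement roughly triples the reaction window that a typical sonar would provide.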

    3.2.4 Response Time

    In order for a detection sensor to be adequate, the time required to see an obstacle and cause the tractor to respond is critical. In the case of the 3D-scanning laser the time

    required to scan one image in front of the vehicle is 80 seconds, which is obviously

    unacceptable for a real-time system. Sonar can determine fast enough if an obstacle is

    present, but sonar can only see a short distance. That is, the tractor will already be too


    close to an obstacle before the sonar can sense it. A CCD camera's response time depends on the image processing speeds and capabilities. With faster computers this is not a problem, and a camera's response time is fast enough to detect obstacles at a safe

    distance. 2D-scanning lasers have a very fast response time, on the order of 17 ms to 1 s

    depending on the resolution and the speed that data are being output. This allows

    scanning lasers to detect obstacles at a safe distance, allowing for proper obstacle

    detection and avoidance. Millimeter wave radar also has a fast enough response time.

    As with cameras, the higher speed computers allow for fast computation of the radar

    signal. Radar can detect obstacles at safe distances.

    3.2.5 Cost

    For most research projects the cost of sensors may not be a prime issue, but with the autonomous tractor tested in this project there is a desire to keep cost down because of possible future production considerations. If cost were the only issue, then the sonar or CCD cameras would be the hands-down choice. The 2D scanning laser costs significantly more than the sonar and the CCD camera, but it would not break the bank to put it on a production-type tractor. In fact, looking at our requirement of costing significantly less than the tractor, the scanning laser is acceptable. The 3D scanning laser, on the other hand, costs as much as if not more than most tractors and would not be feasible. The millimeter wave radar also has a high cost; while not as high as that of the 3D-scanning laser, it is still significant enough to warrant the choice of a cheaper technology.


    3.2.6 Summary

    After examining the five criteria we must settle on the optimum sensor. Because radar is the only sensor that works well in all weather conditions, it would be a natural choice. However, due to the high cost and the relative newness of the technology, we opted not to use radar, hoping that within a few years the cost and technology would be more practical. Based on the remaining four criteria the 2D scanning laser was the most

    practical choice. The amount of ambient light has little effect on the sensor, the

    maximum detection range is 30 meters, the response time is adequate, and the cost for a

    unit is justifiable. The comparisons of each sensor with the five criteria are shown

    graphically in Fig. 3.1.

    In this section we have discussed five different sensors used for obstacle detection. We

    investigated research that uses each of the five different sensors and we listed their

    advantages and disadvantages.

                           Operation  Operation  Detection    Fast      Cost less
                           in any     in any     of at least  response  than
                           weather    light      15 meters    time      tractor

    CCD Cameras                                      x           x          x
    Ultrasonic                            x                      x          x
    Scanning Laser                        x          x           x          x
    3D Scanning Laser                     x          x
    Millimeter Wave Radar      x          x          x           x

    Fig. 3.1: Comparison of obstacle detection sensors.

    Finally we set forth five criteria that the


    detection sensor needed to meet in order to be used on the tractor. With each sensor

    evaluated against these five criteria we decided that a 2D-laser scanner was the optimum

    sensor at the present time for the autonomous tractor. The next section discusses in detail

    the laser scanner that was chosen and used on the project.

    3.3 Detection Sensor Used on Tractor

    To detect obstacles on the tractor project, an LMS 220-laser range finder was

    mounted on the front of the tractor (see Fig. 3.2). This sensor, from Sick Optic Inc.,

    scans obstacles out to 30 meters in a 180° planar scan window (see Appendix A for detailed specifications). The laser measures distance in meters with a range resolution of 0.01 m, from 0° to 180°, in increments of 0.5°. The angular resolution is configurable, with possible resolutions being measurements every 0.25°, 0.5°, or 1.0°. With a resolution of 0.5° there are 361 distance measurements that the laser reports during continuous-output mode. Each measurement requires 2 bytes of information, with the low byte reported first and the high byte reported last. This makes a total of 722 data bytes that the laser reports to indicate the distance measured every 0.5°. In addition to the 722 bytes of distance data there are also 10 additional bytes, 7 before the distance data and 3 after it, making the total number of bytes in one report packet 732. In Section 5.1 we explain how these data are used for avoidance.
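    Under the framing just described (7 header bytes, 361 two-byte low-byte-first range readings, 3 trailer bytes, 732 bytes in all), one report can be decoded as sketched below. Treating the raw units as centimeters follows from the stated 0.01 m resolution but is otherwise an assumption, and the header contents are ignored here:

```python
HEADER_BYTES = 7
TRAILER_BYTES = 3
NUM_READINGS = 361          # 0 to 180 degrees at 0.5 degree steps
PACKET_LEN = HEADER_BYTES + 2 * NUM_READINGS + TRAILER_BYTES   # 732 bytes

def parse_scan(packet: bytes):
    """Return 361 ranges in meters from one continuous-output report
    (each reading is 16 bits, low byte first)."""
    assert len(packet) == PACKET_LEN
    data = packet[HEADER_BYTES:HEADER_BYTES + 2 * NUM_READINGS]
    ranges_cm = [data[i] | (data[i + 1] << 8) for i in range(0, len(data), 2)]
    return [r / 100.0 for r in ranges_cm]

# Build a dummy packet in which every reading is 1234 cm (12.34 m).
pkt = (bytes(HEADER_BYTES)
       + bytes([1234 & 0xFF, 1234 >> 8]) * NUM_READINGS
       + bytes(TRAILER_BYTES))
scan = parse_scan(pkt)
assert len(scan) == 361 and scan[0] == 12.34
```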

    In Section 3.2.6 we selected a sensor based on four of the five criteria. The only criterion that the scanning laser did not meet was the problem of false readings due to inclement



    Fig. 3.2: Laser range finder mounted on tractor.

    weather. In fact, after two months of driving in a dirt field we saw that the dust created by the tractor tires caused many false readings from the laser. To remedy this problem, all range data closer than seven feet were ignored. This was not a

    desirable solution, but without another detection sensor to back up the laser range finder

    it was the only practical solution.
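    The near-range rejection described above reduces to a one-line mask over the scan; representing discarded returns as None is an illustrative choice, not the project's actual convention:

```python
MIN_RANGE_M = 7 * 0.3048    # seven feet expressed in meters (~2.13 m)

def reject_dust(ranges):
    """Drop returns closer than the minimum range by treating them as
    'no reading' (None). Dust kicked up by the tires shows up as
    spurious close returns, so anything inside seven feet is ignored."""
    return [r if r >= MIN_RANGE_M else None for r in ranges]

assert reject_dust([1.0, 2.5, 10.0]) == [None, 2.5, 10.0]
```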

    In this chapter we have discussed obstacle detection and how it relates to the

    project. We first indicated why an obstacle detection sensor was so important to

    autonomous vehicle research. Next we looked at the detection sensors being used by

    researchers in the field of autonomous vehicles and described the basis for the selection

    of a sensor for the autonomous tractor. Finally we briefly described the obstacle

    detection sensor that was used on the tractor. Now we can focus our attention on how the


    vehicle will react to measurements of the surrounding environment that are obtained by

    the detection sensor.


    Chapter 4

    Obstacle Avoidance

    In this chapter we present two different approaches to obstacle avoidance: global and

    local avoidance. We then examine the desired attributes of the path planner/obstacle

    avoidance algorithm used for the autonomous tractor project. We look at several local

    obstacle avoidance techniques described in research literature, one of which is selected as

    the model for the avoidance algorithm ultimately used on the tractor.

    4.1 Global and Local Obstacle Avoidance

    In autonomous vehicle research two levels of planning occur. One level of

    planning, called global planning, examines the whole world an autonomous vehicle can

    travel in and plans paths from one point to the next dependent upon this world. Obstacle

    avoidance on a global level is accomplished by routing all paths away from potential

    obstacles. If the vehicle encounters an unexpected obstacle during a mission then the

    global planner reexamines the whole map with the added obstacle and adjusts either a

    portion of the affected path or the rest of the path based on the newer map data.

    Local, or reactive, planning plans a short path for the vehicle to traverse based only on the environment surrounding the vehicle, typically using only the detection sensor output. Local avoidance is aware of the desired path of travel but, when that path is obstructed, plots a path around the obstruction until the vehicle can safely rejoin the desired path of travel.


    Global and local obstacle avoidance can be illustrated by my drive to work. Each morning I drive to work along the same route. If one day a child jumps in front of my car, I do not drive a different route to work; instead I swerve around the child and continue on my way in the usual fashion. This is analogous to reactive obstacle avoidance. If, however, one day while driving to work I come across a roadblock, I choose a different road, or set of roads, that will get me to work. This is analogous to global obstacle avoidance.

    4.2 Vision of Obstacle Avoidance

    At the start of the project, the team examined different approaches to obstacle avoidance. We knew we wanted knowledge of known obstacles in the

    global map, but we wanted avoidance to be performed by a reactive algorithm. To

    accomplish this we decided that the path planner would contain a list of positions for all

    known obstacles in the map. These obstacles would determine the paths to be planned;

    that is, no known obstacle would block a planned path. If an unknown obstacle were

    encountered during the mission then a reactive algorithm would use the detection sensor

    data as well as the list of known obstacles to decide the best route around the obstacle or

    obstacles and back onto the desired path of travel.

    Once the relationship between the path planner and the avoidance algorithm was clarified, we decided on five criteria that the reactive obstacle avoidance algorithm needed to meet in order to fit into our design architecture:

    1. The algorithm must compute vehicle-heading changes based on a map of the obstacles in the tractor's environment. This map would represent all obstacles seen by the detection sensor as well as known obstacles that may not be seen by the detection sensor.

    2. The algorithm must consider the vehicle to have length and width, not simply be a point source.

    3. To avoid excessive computations the algorithm cannot examine the entire map where the tractor may travel, but must examine only the portion of the map in the vicinity of the tractor.

    4. The algorithm must halt the tractor if an obstacle is too close to the vehicle or the algorithm cannot compute a safe direction of travel around the obstacle.

    5. The algorithm must handle dynamic, or moving, obstacles.
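    The five requirements can be collected into an interface sketch. Every name here is hypothetical, and the body is only a placeholder showing how requirements 3 and 4 might enter; it is not the algorithm eventually used on the tractor:

```python
from typing import List, Optional, Tuple

Obstacle = Tuple[float, float]   # (x, y) in vehicle-local meters, x forward

def avoidance_heading(local_map: List[Obstacle],
                      vehicle_length: float,
                      vehicle_width: float,
                      window_radius: float,
                      current_heading: float) -> Optional[float]:
    """Return a safe heading in degrees, or None to command a halt.

    Requirement 1: decisions come from `local_map`, which merges sensed
    and known-but-unseen obstacles.
    Requirement 2: the vehicle footprint (length, width) is explicit.
    Requirement 3: only obstacles inside `window_radius` are examined.
    Requirement 4: returning None signals that no safe heading exists.
    Requirement 5: re-calling this each sensor cycle handles moving obstacles.
    """
    nearby = [(x, y) for (x, y) in local_map
              if x * x + y * y <= window_radius ** 2]
    if any(abs(y) < vehicle_width / 2 and 0 < x < vehicle_length
           for (x, y) in nearby):
        return None              # obstacle dangerously close: halt
    return current_heading       # placeholder: no correction needed

assert avoidance_heading([(2.0, 0.0)], 4.0, 2.0, 15.0, 90.0) is None
assert avoidance_heading([(30.0, 0.0)], 4.0, 2.0, 15.0, 90.0) == 90.0
```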

    This list served as a guide when evaluating the many different obstacle avoidance algorithms that we found in the literature. If an avoidance algorithm were lacking in

    any of the five requirements then it would not be considered for use on the tractor. The

    next section compares several different avoidance algorithms against the five

    requirements and then describes in detail the algorithm chosen as the model for the

    algorithm that was eventually used on the tractor.

    4.3 Obstacle Avoidance Used in Research

    In the previous section we outlined five criteria that our obstacle avoidance

    algorithm needed to have to be compatible with our design architecture. We did not want

    to completely reinvent the wheel so we did an extensive literature search on different

    obstacle avoidance algorithms. This section examines several of the predominant

    algorithms being used today. Several of the references that will be discussed do not

    outline a formal algorithm, but by reading the material a general understanding of the

    algorithm is achieved. These papers usually refer to other works that explain in greater


    detail the algorithm being used. The remainder of the section is organized as follows.

    Subsection 4.3.1 examines wall-following techniques of avoidance. Subsection 4.3.2

    discusses black hole techniques. Subsection 4.3.3 presents the idea of path-computing

    techniques. Subsection 4.3.4 considers potential field techniques. Finally Subsection

    4.3.5 describes histogram techniques.

    4.3.1 Wall-Following

    Wall-following avoidance is a technique of following the contours of an obstacle until the

    obstacle no longer blocks the desired path to the goal. Kamon [29] describes an

    algorithm used to navigate a robot to a goal location in a real world environment. The

    algorithm uses two basic modes of motion: motion towards the target and obstacle-

    boundary following [29]. The robot proceeds to the goal until an obstacle is

    encountered. Once the obstacle is reached it then moves along the obstacle boundary

    until it determines that it is close enough to the goal to break away from the obstacle.

    The robot then proceeds towards the goal until another obstacle is encountered. Fig. 4.1

    gives an illustration of what this algorithm might do, and is similar to pictures found in

    [29]. Kamon explains that in their experiments the algorithm worked quite well in most

    situations, producing minimal path distances around obstacles compared with other wall-

    following algorithms.
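The two-mode behavior described above can be sketched in a small grid world. This is a simplified illustration, not Kamon's actual algorithm: the grid representation, the right-hand boundary-following rule, the greedy motion-to-goal step, and the leave condition (break away once the robot is closer to the goal than where it first hit the obstacle) are all assumptions made for the sketch.

```python
# Two-mode "bug"-style navigator on a 4-connected grid, sketching the
# motion-to-goal / boundary-following behavior described by Kamon [29].
# All names and rules here are illustrative assumptions, not from [29].

def turn_right(d):
    return (d[1], -d[0])

def turn_left(d):
    return (-d[1], d[0])

def navigate(grid, start, goal, max_steps=200):
    """Return the list of visited cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def free(p):
        return 0 <= p[0] < rows and 0 <= p[1] < cols and grid[p[0]][p[1]] == 0

    def dist(p):  # Manhattan distance to the goal
        return abs(goal[0] - p[0]) + abs(goal[1] - p[1])

    def greedy_step(p):  # one step along the largest remaining axis
        dr, dc = goal[0] - p[0], goal[1] - p[1]
        if abs(dr) >= abs(dc) and dr != 0:
            return (p[0] + (1 if dr > 0 else -1), p[1])
        return (p[0], p[1] + (1 if dc > 0 else -1))

    pos, path = start, [start]
    wall_following, heading, hit_dist = False, None, 0
    for _ in range(max_steps):
        if pos == goal:
            return path
        want = greedy_step(pos)
        # Mode 1: head straight toward the goal while the way is clear.
        if free(want) and (not wall_following or dist(pos) < hit_dist):
            wall_following, pos = False, want
        else:
            # Mode 2: follow the obstacle boundary (wall on the right).
            if not wall_following:
                wall_following, hit_dist = True, dist(pos)
                heading = turn_left((want[0] - pos[0], want[1] - pos[1]))
            for d in (turn_right(heading), heading,
                      turn_left(heading), turn_right(turn_right(heading))):
                nxt = (pos[0] + d[0], pos[1] + d[1])
                if free(nxt):
                    heading, pos = d, nxt
                    break
            else:
                return None  # boxed in completely
        path.append(pos)
    return None
```

Running this on a map with a U-shaped obstacle shows the robot hug the boundary and break away only once it is past the obstruction, producing the kind of contour-hugging path that Fig. 4.1 illustrates.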

    Even though wall-following is an effective means of obstacle avoidance, when

    graded against our five criteria it is obviously not the technique we want to use. This

    algorithm does not compute vehicle heading based on a map of the obstacle environment,

    as the first criterion states; rather, heading is determined as the detection sensor senses the


    Start

    Goal

    Wall Following Technique

    Fig. 4.1: Example of wall following technique as presented in [30].

    contour of the obstacle. This effectively eliminates known obstacles that may not be

    sensed by the detection sensor. This is a problem if the vehicle senses more than one

    obstacle. The algorithm assumes the vehicle is a point source, which does not follow our

    second criterion, although this could be changed in the algorithm such that non-point

    source vehicles could be used. The third criterion does not factor into this evaluation

    because the algorithm uses no map for obstacle avoidance. This is impossible to change

    because the algorithm is purely reactive: it responds only to what the detection

    sensor sees and nothing else. The fourth criterion, stating that the vehicle must halt if

    obstacles are too close or if the obstacle is impassable, is not realized in this algorithm.

    The very nature of the algorithm dictates that the vehicle positions itself very close to the

    obstacle in order to follow the contours properly. The algorithm could be made to stop


    however, if the obstacle is impassable, by forcing the vehicle to stop if it has traveled too far off the desired path. The final criterion, handling dynamic obstacles, is also not realizable. Kamon states the algorithm "navigates a point robot in a planar unknown environment populated by stationary obstacles with arbitrary shape."

    4.3.2 Black Hole

    The black hole technique of obstacle avoidance is a way of examining all of the

    obstacles in front of the vehicle and heading towards the largest opening, or hole.

    Bischoff [9] discusses how the black hole technique can be applied to humanoid service

    robots. He explains that wandering around is a basic behavior that can be used by a

    mobile robot to navigate, explore and map unknown working environments. When the

    robot wanders it has no knowledge of its surroundings and must perform obstacle

    avoidance. Bischoff explains that the onboard sensors segment the images around the

    robot into obstacle-free and occupied areas and

    After having segmented the image, the robot can pan its camera head towards those obstacle-free regions that appear to be the largest. If such a region is classified as large enough, the steering angle of both wheels is set to half of the camera head's pan angle and the robot moves forward while continuously detecting, tracking and fixating this area.

    In this wandering state the robot travels and avoids obstacles until a user stops the state.

    If the robot becomes stuck in a corner or dead-end hallway then it backtracks until it can

    wander freely again.

    Chahl [8] describes a similar black hole avoidance algorithm. The algorithm is

    different from Bischoff's, but the general principle of finding the most open tunnel in front


    of the autonomous vehicle is the same. In the case of Chahl an autonomous airplane is

    used. Here the vehicle can sense obstacles in three dimensions and has three constraints.

    The constraints are:

    1 continual progress towards a goal

    2 avoiding collisions

    3 maintaining a low altitude

    With these constraints the goal is to find a free path in the panoramic range image

    provided by the on-board sensors. A free path is described as a tunnel of at least a given

    minimum diameter. The diameter of the tunnel is determined by the safe minimum

    distances between the vehicle and the nearest obstacle. These parameters would be

    determined by the wingspan and controllability of the aircraft. This algorithm looks for

    black holes, but unlike [9] the black hole that gets the airplane closest to the target is

    taken.
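The gap-finding idea common to both approaches can be sketched as follows. This is an illustrative sketch, not the algorithm of [8] or [9]: the beam layout, the clearance threshold, and the function names are assumptions. It finds the angular gaps in a range scan whose readings all exceed a clearance distance, then steers either toward the widest gap (as in [9]) or toward the free gap closest to the goal bearing (as in [8]).

```python
# "Black hole" steering sketch: locate contiguous runs of range readings
# deeper than a clearance threshold, then pick a gap to drive toward.
# Beam spacing, threshold, and names are illustrative assumptions.

def free_gaps(ranges, clearance):
    """Return (start, end) index pairs of contiguous free sectors."""
    gaps, start = [], None
    for i, r in enumerate(ranges):
        if r > clearance and start is None:
            start = i
        elif r <= clearance and start is not None:
            gaps.append((start, i - 1))
            start = None
    if start is not None:
        gaps.append((start, len(ranges) - 1))
    return gaps

def steer_widest(bearings, ranges, clearance):
    """Bearing of the widest free gap ([9]'s wandering), or None to halt."""
    gaps = free_gaps(ranges, clearance)
    if not gaps:
        return None
    s, e = max(gaps, key=lambda g: g[1] - g[0])
    return bearings[(s + e) // 2]

def steer_toward_goal(bearings, ranges, clearance, goal_bearing):
    """Center of the free gap nearest the goal ([8]'s variant), or None."""
    gaps = free_gaps(ranges, clearance)
    if not gaps:
        return None
    centers = [bearings[(s + e) // 2] for s, e in gaps]
    return min(centers, key=lambda b: abs(b - goal_bearing))
```

The halt case (no gap at all) corresponds to our fourth criterion: if no black hole of sufficient size exists, the vehicle must stop.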

    Black hole obstacle avoidance is an effective means of using obstacle data in front

    of the vehicle to determine the best course to take. When compared with our five criteria

    for possible use on the tractor all but one of the criteria are acceptable. The black hole

    approach does not meet the first criterion. The safest black hole to travel is calculated with

    only the current detection sensor readings and therefore known obstacles not seen by the

    detection sensor would go unnoticed. Even in [9] where the environment is mapped and

    obstacle-free regions are established the robot must still examine each obstacle free zone

    with its sensors to determine passability. In [9] there is a possibility of placing known

    obstacles into the map before obstacle-free regions are established. The second criterion,

    however, is met because for any black hole to be passable it must be at least the


    dimension of the vehicle. The third criterion is also met because computations for

    avoidance heading are made only on the obstacles within the immediate environment of

    the vehicle. The fourth criterion of stopping the vehicle if obstacles are too close or if

    there is no possible path of travel is also achievable in this type of avoidance. Finally, the

    ability to handle dynamic obstacles is achievable because black holes are determined for

    each sensor scan of the surrounding environments. If an obstacle moves then the next

    sensor scan will indicate different black hole locations. An algorithm that is similar to

    this approach is discussed in section 4.3.5 except black holes are found after obstacles are

    placed into a map of the environment.

    4.3.3 Path Computing

    Path computing techniques limit the possible directions of travel by the robot to a

    certain number of headings. Usually three directions of travel are possible: right, left,

    and straight-ahead. Every time the vehicle travels towards the goal these three directions

    are determined to be free or blocked. Based upon the free or blocked nature of the paths

    the vehicle navigates toward the goal. Fujimori [30] describes an adaptive obstacle

    avoidance algorithm that assumes the following about the robot:

    1 It moves only in the forward direction

    2 It turns to the right or left with a minimum rotation radius rmin

    3 Its velocity is constant except near the goal

    Three distance sensors are used to determine if each of the three directions of travel is

    free or blocked and the vehicle navigates safely to the goal.
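The decision logic of a three-direction scheme like this can be sketched in a few lines. This is an illustrative sketch, not Fujimori's algorithm: the priority ordering (goal side first, then straight, then the remaining direction) and the names are assumptions.

```python
# Path-computing sketch: only three candidate headings (left, straight,
# right) are considered, each sensed as free or blocked; the free heading
# nearest the goal side is taken. Priority order is an assumption.

def choose_heading(free, goal_side):
    """free: dict mapping 'left'/'straight'/'right' to True (free) or
    False (blocked). goal_side: which of the three sides the goal lies on.
    Returns the chosen direction, or None to halt the vehicle."""
    order = {"left":     ["left", "straight", "right"],
             "straight": ["straight", "left", "right"],
             "right":    ["right", "straight", "left"]}[goal_side]
    for direction in order:
        if free[direction]:
            return direction
    return None  # all three directions blocked: stop
```

The None case again maps onto our fourth criterion: when every candidate heading is blocked, the only safe choice is to halt.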

    Schwartz [31] describes another variation on this technique in an avoidance algorithm used on a medium-sized tracked vehicle called PRIMUS. To


    calculated the most recent obstacle map [31] or the most recent sensor reading [30] is

    used.

    4.3.4 Potential Fields

    The potential field method of obstacle avoidance is used extensively in robotic

    research. This method examines repulsion vectors around obstacles and determines the

    desired heading and velocity of the autonomous robot needed to get around the obstacle.

    Chuang [32] describes this method as "collision avoidance in path planning is achieved using repulsion between object and obstacle resulting from the Newtonian potential-based model." Several examples of potential-based obstacle avoidance follow.

    Prassler [17] describes MAid, a robotic wheelchair that roams in a railway station. This

    method first places all obstacles into a time-stamp map. This map is similar to occupancy

    grids (discussed in subsection 4.3.5) except that the time an obstacle resided in a

    particular cell in the grid is recorded. With this information obstacles can be tracked and

    an obstacle vector can be predicted. With this obstacle vector and the heading vector of

    the robot a collision cone can be constructed. The collision cone represents the heading

    and velocity vectors that would cause collision with the obstacle. The avoidance

    maneuver consists of a one-step change in velocity to avoid a future collision within a

    given time horizon. The new velocity must be achievable by the moving robot [17]. This

    technique was demonstrated very successfully in a moderately crowded railway station in

    Germany.

    Another robot that is similar to the one described above was designed by Carnegie

    Mellon University and used as an automated tour guide in the Smithsonian Museum [18].


    As stated by Thrun, this algorithm generates collision-free motion that maximizes two criteria: progress towards the goal, and the robot's velocity. As a result the robot can slip around smaller obstacles (e.g., a person) without decelerating.

    Chuang [32] and Soto [13] describe two different algorithms for computing the

    potential fields in the environment around autonomous vehicles. Each of these

    algorithms has been simulated and tested on robots in real-world environments with

    good success.
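The core repulsion computation that these potential field methods share can be sketched as a vector sum. This is a minimal illustration, not the algorithm of [13], [17], [18], or [32]: the inverse-square force law, the gains, and the names are assumptions made for the sketch.

```python
# Potential field sketch: the goal attracts the robot, each obstacle
# repels it with a force falling off with squared distance, and the
# resultant vector gives the suggested heading. Gains and the force
# law are illustrative assumptions.
import math

def potential_heading(pos, goal, obstacles, k_att=1.0, k_rep=4.0):
    """Return a suggested heading (radians) for the robot at pos."""
    # Attractive force pulls straight toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Each obstacle repels along the line away from it.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if d > 1e-9:
            fx += k_rep * dx / d**3  # (k_rep/d^2) times unit vector (dx/d, dy/d)
            fy += k_rep * dy / d**3
    return math.atan2(fy, fx)
```

With no obstacles the heading points straight at the goal; an obstacle ahead and to the left bends the heading to the right, which is exactly the behavior the papers above exploit.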

    The four papers described above all base their algorithms on the potential field

    method. Even though all four algorithms are different, the results are the same, and this

    allows us to compare the potential field method against our five criteria. The first

    criterion is satisfied because the avoidance heading of the vehicle is calculated based on

    obstacles placed in a map. The second criterion is also met because the potential fields

    are computed based on the vehicle size. The computations for the avoidance heading are

    made based on the portion of the map in the immediate environment of the vehicle, thus

    satisfying the third criterion. The fourth criterion is met when the vehicle is stopped if

    the potential fields indicate no collision-free path is possible. Finally the fifth criterion is

    met because the algorithm can handle dynamic obstacles, as is the case for the tour guide

    robot and the robot wandering in the railway station.

    4.3.5 Histogram

    Having discussed each of the different obstacle avoidance techniques, we see that each approach has desirable aspects. The histogram approach to obstacle avoidance combines many of these aspects in one algorithm. Johann Borenstein


    [22-27] developed an obstacle avoidance algorithm based on histograms. Each of

    these papers explains the progression of an algorithm that he finally calls the vector field

    histogram plus (VFH+) algorithm in [22]. This VFH+ algorithm served as the model for

    the algorithm that was developed for the tractor, as explained in Chapter 5. Here we give

    a brief explanation of the algorithm found in [22]. For a more in-depth understanding of

    the algorithm the reader is encouraged to read [22-27].

    The VFH+ algorithm consists of four stages, with each stage reducing the amount

    of obstacle data to the point of calculating an avoidance-heading vector. The four stages

    are the primary polar histogram, the binary polar histogram, the masked polar histogram,

    and the selection of the steering direction.

    Before any data reduction takes place a preliminary understanding of the algorithm is

    needed. The world where the vehicle travels is resolved into a map of N x M cells. The

    coordinates of this map are always fixed and the vehicle travels with respect to the map.

    This map is called the certainty grid and values associated with each cell represent the

    probability of obstacles residing at the cell locations in the grid. In Borenstein's algorithm [22-27], when the vehicle mission is first started each cell in the certainty grid is initialized to zero, indicating zero probability that an obstacle resides in any particular cell. In our implementation described in Chapter 5, all cells in the certainty grid

    are initialized to zero except cells that contain known obstacles. Cells that contain known

    obstacles are set to the highest possible certainty value, indicating that an obstacle resides

    at that cell location (because of the prior knowledge of the path planner). As the vehicle

    progresses in its mission the obstacle detection sensor detects obstacles at various grid


    locations. Each time an obstacle is sensed the value of the corresponding grid location is incremented by one. Thus, the higher a cell's certainty value, the higher the probability

    that an obstacle resides at that cell location. As per the third criterion, avoiding excess

    calculations by using only a portion of the map in the immediate vicinity of the vehicle,

    the VFH+ algorithm is compliant. Another map, called the active window, overlays that

    portion of the certainty grid in the immediate vicinity of the vehicle. It is from all the

    cells of the certainty grid overlaid by the active window that the avoidance heading is

    calculated. Fig. 4.2 illustrates the certainty grid with an active window and several

    known and unknown obstacles that have been placed into the grid. The avoidance

    heading is calculated by reducing the obstacle data until the masked polar

    [Figure: a certainty grid in which cells containing known obstacles hold the maximum certainty value (50), cells where unknown obstacles have been sensed hold intermediate values (10, 25, 35), empty cells hold 0, and the active window marks the laser scan area around the vehicle.]

    Fig. 4.2: Example of certainty grid.


    histogram is calculated. The masked polar histogram indicates the driving directions that are open and those that are blocked. The driving direction closest to the goal will be selected and the vehicle will steer in that direction. This is illustrated in Fig. 4.3. From the figure we see the vehicle is trying to track to the desired path of travel, but an obstacle resides to its left. The masked polar histogram indicates that the vehicle cannot choose a steering direction between 180° and 270°, but since the path direction is less than 180° the vehicle chooses the steering direction that will return it to the path.

    With this preliminary understanding of the VFH+ algorithm we can discuss the four data

    reduction stages.
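The certainty grid bookkeeping described above can be sketched as follows. The cap value of 50 follows Fig. 4.2; the grid-building and update helpers are illustrative assumptions, not code from [22-27] or from our Chapter 5 implementation.

```python
# Certainty grid sketch: known obstacle cells start at the maximum
# certainty value (prior knowledge from the path planner), all other
# cells start at zero, and each sensor detection increments a cell.
# The cap of 50 follows Fig. 4.2; names are illustrative assumptions.

CMAX = 50  # highest certainty value, meaning an obstacle surely resides there

def make_grid(rows, cols, known_obstacles):
    """Build the certainty grid, seeding cells that hold known obstacles."""
    grid = [[0] * cols for _ in range(rows)]
    for r, c in known_obstacles:
        grid[r][c] = CMAX
    return grid

def record_hit(grid, r, c):
    """Increment a cell's certainty when the detection sensor sees an obstacle."""
    grid[r][c] = min(CMAX, grid[r][c] + 1)
```

Seeding known obstacles at CMAX is what lets the avoidance heading account for obstacles the detection sensor never sees, satisfying the first criterion.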

    [Figure: the vehicle with heading angles marked at 0°, 90°, 180°, and 270°, following an avoidance path toward a point being tracked on the desired path.]

    Fig. 4.3: Vehicle tracking a point on desired path.


    4.3.5.1 The Primary Polar Histogram

    As stated by Borenstein, "the first data reduction stage maps the active region of the certainty grid onto the primary polar histogram." The active window (Aij) is a circle of

    diameter Ws with the vehicle always at its center. The content of each cell in the active

    window contains a vector direction and magnitude from the cell Aij to the vehicle center

    point (VCP). The vector is given by:

    B_ij = tan^-1[ (y_j − y_0) / (x_i − x_0) ]

    where:

    x_0, y_0: Present VCP coordinates.

    x_i, y_j: Active cell, A_ij, coordinates.

    The magnitude is given by:

    m_ij = c_ij^2 (a − d_ij^2)

    where:

    cij: Certainty value of active cell Aij.

    dij: Distance from active cell Aij to VCP.

    a: A constant based on the diameter of the active circle.

    The parameter a is chosen according to:

    a − ((W_s − 1)/2)^2 = 1

    For the vehicle to navigate safely around obstacles a certain distance from the obstacles

    must be maintained. Enlarging each cell in the active window by the vehicle radius Rr

    and some safe distance Ds (rr+s = Rr + Ds) creates a cushion between the vehicle and


    obstacles. Ds is the minimum allowable distance between the vehicle and the obstacle.

    Each cell has an enlargement angle vij given by:

    v_ij = sin^-1( r_r+s / d_ij )

    The primary polar histogram H^p is created with an angular resolution α chosen so that n = 360/α is an integer. In Borenstein's implementation, α is specified to be 5°, making n = 72 discrete angle sectors k. For each sector k, the polar histogram is created by:

    H^p_k = Σ_{i,j ∈ A} m_ij · h'_ij

    with:

    h'_ij = 1 if k ∈ [B_ij − v_ij, B_ij + v_ij]

    h'_ij = 0 otherwise

    It is important to note that the values B_ij, a − d_ij^2, and v_ij are properties of the active window and never change; they can be calculated once and stored in a matrix of size W_s × W_s for quick reference.
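The formulas above can be combined into a short sketch of this first reduction stage. This is an illustrative implementation under stated assumptions, not the code of [22]: the active window is given as a flat list of cells, the precomputation of B_ij, a − d_ij^2, and v_ij is skipped for clarity, and cells closer than the enlarged radius are simply skipped rather than triggering a halt.

```python
# Primary polar histogram sketch: each active cell contributes its
# magnitude m_ij to every 5-degree sector covered by the enlarged
# obstacle arc [B_ij - v_ij, B_ij + v_ij]. Layout and names are
# illustrative assumptions; see [22] for the full treatment.
import math

ALPHA = 5                 # angular resolution in degrees
N_SECTORS = 360 // ALPHA  # n = 72 sectors

def primary_polar_histogram(active, x0, y0, ws, r_enlarged):
    """active: list of (x_i, y_j, c_ij) cells in the active window.
    (x0, y0): VCP; ws: window diameter in cells; r_enlarged: r_r+s."""
    a = 1 + ((ws - 1) / 2.0) ** 2  # so m_ij = c_ij^2 at the window edge
    hist = [0.0] * N_SECTORS
    for (xi, yj, cij) in active:
        d = math.hypot(xi - x0, yj - y0)
        if d < r_enlarged:
            continue  # cell under the vehicle; a real system would halt
        beta = math.degrees(math.atan2(yj - y0, xi - x0)) % 360  # B_ij
        m = cij ** 2 * (a - d ** 2)                              # m_ij
        v = math.degrees(math.asin(min(1.0, r_enlarged / d)))    # v_ij
        lo = int((beta - v) // ALPHA)
        hi = int((beta + v) // ALPHA)
        for k in range(lo, hi + 1):
            hist[k % N_SECTORS] += m  # add m_ij to every covered sector
    return hist
```

A single high-certainty cell produces a peak spanning several sectors, because the enlargement angle v_ij spreads the obstacle by the vehicle radius plus the safety distance.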

    4.3.5.2 The Binary Polar Histogram

    Once the primary polar histogram has been calculated for each sector k, the binary

    polar histogram is easily constructed. The binary polar histogram, for each sector k, is

    calculated by using two defined thresholds, τ_low and τ_high, the primary polar histogram,

    and the following rules:


    H^b_k,i = 1           if H^p_k,i > τ_high

    H^b_k,i = 0           if H^p_k,i < τ_low

    H^b_k,i = H^b_k,i-1   otherwise

    A sector k in the binary polar histogram is blocked if it has a value of 1 and is unblocked

    if it has a value of 0. For example, if sectors k = 90° to k = 180° have values of 1, then from 90° to 180° an obstacle is present.
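The hysteresis rule above is short enough to sketch directly. The threshold values in the usage below are illustrative assumptions; the two-threshold structure follows the rules just given, and its point is that sectors hovering near a single threshold do not flicker between blocked and free on successive scans.

```python
# Binary polar histogram sketch: a sector becomes blocked (1) only when
# its primary value rises above tau_high, becomes free (0) only when it
# falls below tau_low, and otherwise keeps its previous state.

def binary_polar_histogram(primary, previous, tau_low, tau_high):
    """primary: H^p_k values for this scan; previous: last binary histogram."""
    binary = []
    for hp, prev in zip(primary, previous):
        if hp > tau_high:
            binary.append(1)      # sector blocked
        elif hp < tau_low:
            binary.append(0)      # sector free
        else:
            binary.append(prev)   # hysteresis band: keep previous value
    return binary
```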

    4.3.5.3 The Masked Polar Histogram

    The masked polar histogram takes into account the vehicle turning radius when

    deciding if an obstacle blocks the vehicle path. The turning radius creates a trajectory

    circle, and if an enlarged obstacle cell overlaps the trajectory circle, then all directions

    from the obstacle to the backwards direction of motion are blocked. Borenstein, in [22,

    Fig. 4] gives an example of blocked directions. Fig. 4.4 is a similar example of blocked

    directions, but with a difference in the x and y directions. The figure shows the vehicle

    near two enlarged active cells. Assuming that the active cells contain obstacles (labeled

    obstacle 1 and obstacle 2), the vehicle must decide which direction to travel so as not to

    strike the obstacles. When obstacle 2 is enlarged, it overlaps the right trajectory circle

    making travel to the right of obstacle 2 impossible. When obstacle 1 is enlarged, it does

    not overlap the left trajectory circle; thus travel to the left of obstacle 1 is possible. The

    possible directions of travel then become between the two obstacles or to the left of

    obstacle 1.

    The locations of the right and left trajectory centers presented here are different from those presented in [22], by a sign change on the y, due to the different x and y


    [Figure: the vehicle with left and right trajectory circles of radii r_l and r_r centered at (x_l, y_l) and (x_r, y_r), and two enlarged obstacle cells of radius r_r+s; directions to the left of obstacle 1 and between the obstacles are free, while directions to the right of obstacle 2 and behind the vehicle are blocked.]

    Fig. 4.4: Example of blocked directions.

    orientations. The locations of the right and left trajectory centers, relative to the current vehicle position and orientation θ, are given by:

    x_r = r_r sin θ        y_r = −r_r cos θ

    x_l = −r_l sin θ       y_l = r_l cos θ

    Thus the masked polar histogram is updated to reflect the right and/or left trajectory

    circles being blocked. This is done by finding the distances from each active cell Aij to

    the two trajectory centers as given by the formulas:

    d_r^2 = (A_i − x_r)^2 + (A_j − y_r)^2

    d_l^2 = (A_i − x_l)^2 + (A_j − y_l)^2

    where A_i and A_j are the coordinates of the active cell A_ij.

    An obstacle blocks the vehicle direction to the right if:

    d_r ≤ r_r + r_r+s