

Special Engineering Project

Robotic Sensors & Control

Final Report

Edward Cornish

[email protected]


University of Surrey

School of Electronics & Physical Sciences

Department of Electronic Engineering

Final Year Project Dissertation

I confirm that the project dissertation I am submitting is entirely my own work and that any material used from other sources has been clearly identified and properly acknowledged and referenced. In submitting this final version of my report to the JISC anti-plagiarism software resource, I confirm that my work does not contravene the university regulations on plagiarism as described in the Student Handbook. In so doing I also acknowledge that I may be held to account for any particular instances of uncited work detected by the JISC anti-plagiarism software, or as may be found by the project examiner or project organiser. I also understand that if an allegation of plagiarism is upheld via an Academic Misconduct Hearing, then I may forfeit any credit for this module or a more severe penalty may be agreed.

Project Title: Robotic Sensors and Control System

Student Name: Edward Cornish

Supervisor: Dr Richard Bowden

Date: 6 May 2007


Abstract

The Special Engineering Project is based around the development of a robotic platform for use in the University of Surrey CVSSP. This report deals with the development of the Sensing and Control system that allows the robot to move around its environment, using ultrasound sensors to detect obstacles. The Sensing and Control system receives instructions from the robot's Decision System, and passes sensor data back over the same channel. The project was a success, in that the Sensing and Control System met the specifications given; however, there is significant room for improvement of the system.


Table of Contents

1 INTRODUCTION TO THE PROJECT .... 1
  1.1 OVERALL AIMS OF THE PROJECT (ENUMERATED) .... 2
  1.2 SPECIFICATION FOR SENSING AND CONTROL SYSTEM .... 2
  1.3 CONCLUSION .... 2
2 RESEARCH .... 3
  2.1 INTRODUCTION .... 3
  2.2 SENSOR TECHNOLOGIES .... 3
    2.2.1 Infra-Red .... 3
    2.2.2 RADAR .... 4
    2.2.3 Inductive, Magnetic, Capacitive .... 4
    Figure 2.1: Operation of a Hall-effect sensor in conjunction with a permanent magnet © [Fu, 1987] pp279 .... 5
    2.2.4 Sonar .... 5
    Figure 2.2: Differentiating between Walls and Corners using RCDs © [Nehmzow, 2000] pp28 .... 6
    Figure 2.3: 360 degree sonar scan, with two RCDs © [Nehmzow, 2000] pp27 .... 6
    2.2.5 Laser Range Finders .... 7
    2.2.6 Shaft Encoders .... 7
  2.3 CONTROL OF MOTORS .... 7
  2.4 CONTROL INTERFACE .... 8
  2.5 TECHNOLOGIES SELECTED .... 9
    Figure 2.4: Position of Sensors on Chassis .... 10
3 DESIGN OVERVIEW .... 11
  3.1 INTRODUCTION .... 11
  3.2 HARDWARE DESIGN HIERARCHY .... 11
  3.3 SOFTWARE DESIGN HIERARCHY .... 12
  3.4 CONCLUSION .... 12
4 HARDWARE DESIGN .... 13
  4.1 INTRODUCTION .... 13
  4.2 PIC SENSOR CONTROLLER .... 13
    4.2.1 Important features of the circuit .... 14
    Figure 4.1: Circuit Diagram of Sensor Controller .... 14
    4.2.2 Brief explanation of I2C bus protocol .... 15
  4.3 MOTOR CONTROLLER .... 16
  4.4 LAPTOP CRADLE AND SENSOR PLATFORM .... 17
    Figure 4.2: Laptop Cradle Design (dimensions in mm) .... 17
    Figure 4.3: Sensor Bracket Design .... 17
  4.5 TESTING (HARDWARE) .... 18
    4.5.1 Motor Control Tests .... 18
      Equipment Used .... 18
      Figure 4.4: Motor Controller Test Circuit .... 18
      Test Circuit .... 18
      Test Procedure .... 19
      Results & Analysis .... 19
    4.5.2 Speed Tests .... 19
      Aim .... 19
      Equipment Used .... 19
      Methodology .... 19
      Results & Analysis .... 20
      Table 4.5.1: Average speeds for speed settings 6 to 18 .... 20
      Figure 4.5: Graph showing approximate speed values (averaged over three readings and taking into account uncertainties) .... 20
    4.5.3 Oscilloscope testing of I2C bus .... 21
      Aim .... 21
      Equipment Used .... 21
      Methodology .... 21
      Results & Analysis .... 22
      Modifications .... 22
    4.5.4 Sensor Tests .... 22
      Equipment Used .... 23
      Test Circuit .... 23
      Figure 4.6: Sensor Test Circuit .... 23
      Methodology .... 23
      Results .... 24
      Analysis of Results .... 24
      Table 4.5.2: Sensor Test Results .... 24
    4.5.5 Sensor Testing – Conclusive .... 25
      Figure 4.7: Graph showing results of initial sensor tests .... 25
      Points to be investigated .... 25
      Equipment Used .... 25
      Assumptions .... 26
      Methodology & Results .... 26
      Test 1 .... 26
      Test 2 .... 26
      Test 3 .... 26
      Test 4 .... 27
      Test 5 .... 27
      Test 6 .... 27
      Analysis of Results .... 28
      Table 4.5.3: Results for Test 6 .... 28
      Figure 4.8: Graph showing results for Test 6, with Y axis error shown .... 28
    4.5.6 Power Consumption Tests .... 29
      Aim .... 29
      Methodology .... 29
      Results & Analysis .... 29
    4.5.7 Override Relay Tests .... 30
      Aim .... 30
      Methodology .... 30
      Results & Analysis .... 30
      Modifications .... 30
      Results & Analysis of Modified Circuit .... 30
  4.6 CONCLUSION .... 31
5 SOFTWARE DESIGN .... 32
  5.1 INTRODUCTION .... 32
  5.2 SENSOR CONTROLLER PIC CODE .... 32
    Figure 5.1: PIC code Flowchart .... 33
  5.3 HARDWARE INTERFACE SOFTWARE ('MIDDLEMAN') .... 34
    Table 5.3.1: Command Format for Interface with Hardware .... 34
    Table 5.3.2: Acknowledgement Format for Interface with Hardware .... 34
  5.4 HARDWARE INTERFACE CLASS ('BOT') .... 35
    Figure 5.2: Hardware Interface class - Invocation hierarchy diagram .... 35
  5.5 REMOTE CONTROL SOFTWARE ('ROBOTERM') .... 36
  5.6 SOFTWARE TESTING .... 37
    5.6.1 Sensor Tests on Robot Chassis .... 37
      Aim .... 37
      Methodology .... 37
      Results .... 37
      Table 5.6.1: Apparent Distances (in cm) when mounted on Chassis .... 37
      Analysis of Results .... 37
    5.6.2 Serial Communications Test Program .... 38
      Test Procedure .... 38
      Results & Analysis .... 38
  5.7 CONCLUSION .... 38
6 OTHER TESTING .... 40
  6.1 INTRODUCTION .... 40
  6.2 INTEGRATION WORK – FLOOR RUNNING TESTS WITH CAMERA .... 40
    6.2.1 Aim .... 40
    6.2.2 Methodology .... 40
    6.2.3 Results .... 40
  6.3 INTEGRATION WORK – COLLECTION OF DATA FOR DECISION SYSTEM .... 40
    6.3.1 Aim .... 40
    6.3.2 Methodology .... 41
    6.3.3 Results .... 41
7 PROBLEMS AND ISSUES ENCOUNTERED .... 42
  7.1 INTRODUCTION .... 42
  7.2 PROJECT TASK OVERRUNS .... 42
    Figure 7.1: Gantt Chart created at beginning of project .... 42
    Figure 7.2: Gantt Chart created half-way through the Project .... 43
    Figure 7.3: Retrospective Gantt Chart based on schedule of whole project .... 43
  7.3 COMPLETION OF PIC SENSOR INTERFACE .... 44
  7.4 HARDWARE CONSTRUCTION .... 44
  7.5 MANOEUVRABILITY ISSUES .... 44
  7.6 HARDWARE INTERFACE CLASS TIME-OUT .... 45
8 CONCLUSION .... 46
9 REFERENCES .... 48
APPENDIX A – CODE ANALYSIS .... 50
  PIC SENSOR CONTROLLER CODE .... 50
    Header Files .... 50
    Global Variables .... 50
    Variables in main .... 50
    Function Prototypes .... 50
    Preprocessor Directives .... 51
    Program Flow .... 51
    I2C interrupt .... 52
  HARDWARE INTERFACE SOFTWARE .... 52
    Header Files .... 52
    Definitions .... 52
    Objects and Variables .... 53
    Program flow .... 53
  HARDWARE INTERFACE CLASS .... 54
    Preprocessor directives .... 54
    Header Files .... 54
    Member Functions .... 54
APPENDIX B – USER GUIDE – SENSORS AND CONTROL SYSTEM .... 56
  INTRODUCTION .... 56
  SET-UP .... 56
    Sensor Controller Circuit Board .... 56
    Sensors .... 56
    USB-I2C interface .... 57
    MD22 Motor Controller .... 57
    Hardware Interface Software .... 57
    Remote Control Software .... 57
  CONTROLLING THE ROBOT .... 58
    Using Roboterm.exe .... 58
    Table 1: Command Format for Interface with Hardware .... 58
    Table 2: Acknowledgement Format for Interface with Hardware .... 58
    Using other software .... 58
    Hardware Proximity Override .... 59
  USING THE HARDWARE INTERFACE CLASS .... 59
  TROUBLESHOOTING .... 59
APPENDIX C – C++ SOURCE CODE .... 60
  HARDWARE INTERFACE SOFTWARE – MIDDLEMAN.EXE .... 60
  HARDWARE INTERFACE CLASS – BOT.H .... 61
  REMOTE CONTROL SOFTWARE – ROBOTERM.EXE .... 66
  KEYS CLASS – KEYS.H AND KEYS.CPP .... 69
    keys.h .... 69
    keys.cpp .... 70



1 Introduction To The Project

This report outlines the research and initial work performed on a sensing and control system for a robot. Research into existing technologies is outlined, with appropriate references, and the testing performed to validate the technologies used is described. The overall design, at the time of writing, is then described, along with a functional description of each module.

The sensing and control system discussed in this report is intended to be used on a robotic platform for use in the University of Surrey Centre for Vision, Speech and Signal Processing (CVSSP). The robot is being developed by a team of four Electronics undergraduate students, of which the author is one. Each member of the team has been assigned a specific role within the project. The team members are responsible for individually documenting their progress, and therefore this report shall not discuss the design particulars of the other parts of the robotic platform, except where such details affect the sensing and control system. The roles assigned to the members of the team are as follows:

● Martin Nicholson – Project Management and Artificial Intelligence.

● Peter Helland – Video Processing and Acquisition.

● Ahmed Aichi – Networking and Inter-System Communications.

● Edward Cornish (author) – Robotic Locomotion Control and Collision Avoidance.

The Project is being academically supervised by Dr Richard Bowden, a specialist in Computer Vision Systems. Dr Bowden's role is to oversee the project and to provide guidance where necessary; however, all management and design decisions are made by the Undergraduate Team.

The robotic platform (hereafter referred to as 'the robot') is controlled by one of two laptop computers supplied by the CVSSP. It is hoped that, once one platform design has been finalised and tested, a duplicate system can be constructed that will allow the robots to be used for Artificial Intelligence experiments in team working and emergent behaviours. The laptop computers are the only technology supplied for use on the robot; all other equipment must be purchased from a Project Budget of £1000. The University does provide support facilities in the form of the Engineering Workshop (where bespoke mechanical items can be constructed, usually free of charge), the CVSSP computing system ('Brother', which allows Video Processing work to be carried out), and the laboratory and library facilities available to all Electronics undergraduates. The robot(s), once completed, will be maintained at the CVSSP for use in future projects.

At the earliest stage of the project, a pre-built robotic chassis was chosen as the basis for the project. This chassis is a Lynxmotion model 4WD3 [Lynx, 2006], chosen for its large size and flexibility, which allow many devices to be mounted and supported. The 4WD3 and the attached non-electronic components (mounting frames etc.) are hereafter referred to as 'the chassis'. The chassis includes electric motors for locomotion.

The four project areas are interlinked, yet each is a distinct system in it's own right. The Artificial

Intelligence of the robot will make decisions related to navigation based on the information it receives through

1

Page 8: Robotic Sensors & Control - Final Project Report - Electronic

Robotic Sensors & Control - Final Project Report

the interfaces provided to it (largely software based) [Nicholson, 2007] . The interfaces between the various

software programs composing the robot control system will communicate using a custom Network API, which

can be used to communicate between programs running on the same computer, as well as between

computers[Aichi, 2007] . This allows decentralised processing to take place; video data captured on the robot

can be passed over a wireless link to the CVSSP network, where the powerful vision processing systems in place

can relieve the processor load placed on the laptop computer.

The vision processing algorithms developed as part of the project will be used to determine the velocity of

the robot, the rough position, and the presence of objects of interest (fixed obstacles, people, goals, etc.) [Helland, 2007]. The Sensing and Control system will allow the robot to move around its environment in order to

complete the goals assigned to it, based on instructions received over the Networking API from the Artificial

Intelligence system, which will issue instructions based on the information it receives from the Vision System

and the Sensing and Control system.

1.1 Overall Aims of the Project

1. To produce a robot platform for indoor use.

2. To design and build a sensor system allowing the robot to navigate around its environment.

3. To develop a Vision system allowing moving objects to be identified.

4. To design and implement a Networking API in the control system of the robot, allowing interprocess and peer-to-peer communications.

5. To develop Artificial Intelligence (AI) routines that allow the robot to be used for future projects in the CVSSP. Also referred to as the Decision System.

6. To duplicate the robotic platform if possible, and investigate the potential for cooperative behaviours.

1.2 Specification for Sensing and Control System

– The Sensor System must detect objects within 1m of the robot (the approximate width of a CVSSP corridor), and allow the measured distance to the objects to be passed to other systems in the robot.

– The System must allow a high level of control over the speed and direction of the robot.

– There must be a collision avoidance mechanism that will prevent the robot from driving into obstacles.

1.3 Conclusion

The Sensors and Control System is a critical part of the proposed robot platform. It is responsible for

providing a flexible, modular system for controlling the movement of the robot, retrieving sensor data about the

surroundings, and preventing collisions.

The Sensors and Control System is intended to work in parallel with the Vision System to provide

information to the Decision System. The Network API is to provide communication links between the various

processes in the other three Systems.


2 Research

2.1 Introduction

In this section, potential technologies are examined and discussed. The technology areas to be discussed are:

– Sensor Technologies

– Control of Motors

– Control of Sensors

The Sensor Technologies are discussed in order to determine the most appropriate means of detecting

obstacles, considering the intended operation environment of the robot, and the space, weight, and power

constraints in effect.

The chassis used in the robot's design incorporates four DC electric motors. A means to control these motors

was considered an integral part of the project, especially considering the fixed nature of the motors; the wheels

cannot be rotated to steer the robot, and so a 'skid-steer' system must be used (where the speed of the motors on

one side of the robot is increased, to cause the robot to turn towards the opposing side).

The information collected by the sensors on the robot must be captured and passed back to the higher level processes running on the laptop computer. Some processing ability (in hardware or software) was needed to perform this.

The interfaces available on the laptop computer needed to be considered, both hardware (the physical ports), and

the software capabilities (the communications protocols available). The laptop computer is running Windows XP™ Professional.

Research was performed using Robotics Textbooks in the University of Surrey Library, and a selection of

Internet Resources. These sources are referenced in Section 9.

2.2 Sensor Technologies

When considering sensor technologies for the robot, the operation environment must be taken into account. The

robot is intended to operate indoors, on conventional carpeted floors. The obstacles encountered will primarily

be furniture, architectural features, or people. The first two types can be assumed to be stationary,

whereas people are likely to move around, and may provide fluctuating sensor returns. It is expected that the

Vision System will be able to differentiate between people and non-living objects.

2.2.1 Infra-Red

IR proximity ranging has the disadvantage of only realistically providing detect/non-detect information, since the reflectivity of objects to IR is much more variable in an indoor environment [Schur, 2006]. The

components, however, are widely available and compact.

IR sensors use reflected IR light to detect surfaces. Low frequency modulation of the emitted beam is usually

used to eliminate interference from unvarying sources, such as electric lights or the sun. Distance measurements

are only possible if the environment has uniform colour and surface structure, and the sensors must be calibrated


to do this. This is rarely practical in most scenarios, however. Black or dark surfaces, for instance, are practically invisible to IR sensors, so even simple proximity detection is not wholly dependable. It is because of this that IR sensors are generally only effective for object detection, and not distance measuring. Furthermore, since the intensity of IR light decreases quadratically with distance (proportional to d⁻²), typical maximum ranges are 50 to 100 cm, which may prove too small for the purposes of the project. [Nehmzow, 2000]

2.2.2 RADAR

RADAR provides an accurate picture of the surroundings, and is a well understood technology. The majority of objects in the indoor environment have high radar reflectivity; however, there may be significant potential for interference from other radio sources in the CVSSP, due to the Wireless Networking systems in place. It is also uncertain how much power would be needed to operate a RADAR antenna effectively over the distances involved.

2.2.3 Inductive, Magnetic, Capacitive

In the field of proximity sensing, inductive sensors may be used to detect the proximity of ferromagnetic materials. However, this method is unsuitable for use in the specified environment, as ferromagnetic materials are unlikely to be encountered in great quantities, making this technology more suitable for industrial and manufacturing robots. In addition to this, the sensor requires motion of a ferromagnetic object to generate an output voltage; objects stationary relative to the sensor have no effect. The inductive proximity sensor also has an extremely short range, typically fractions of a millimetre. This range limitation is another reason why this technology is mainly confined to assembly-line robots. [Fu, 1987]

A technology with a potentially greater detection range is the Hall-effect sensor. This device works on the principle of a Hall-effect element located between the poles of a permanent magnet. When a ferromagnetic surface or object is brought close to the magnetic poles, the magnetic field across the sensor is reduced (see Figure 2.1).


This method has similar disadvantages to the Inductive sensor method described above; only ferromagnetic

materials can be detected, and the range of detection is reduced. [Fu, 1987]

If a sensor is required to detect proximity to a non-ferromagnetic surface, a capacitive sensor may be used.

These sensors are capable (with varying degrees of sensitivity) of reacting to all non-gaseous materials. As the

name implies, the sensor works by detecting a change in capacitance between two electrodes, effectively using

the sensed object (and the air around it) as part of the capacitor's dielectric. [Fu, 1987]

Capacitance-based sensors are once again subject to a limited range. Also, whilst non-ferrous materials will give rise to a response, the level will be markedly less than that of a ferrous material; for example, iron can cause a response 2.5 times greater than that caused by PVC at the same distance (see [Fu, 1987] p. 281).

2.2.4 Sonar

A great deal of work has been done on (ultrasound) sonar sensing in the field of Robotics.

In a typical ultrasound sensor system, a 'chirp' of ultrasound is emitted periodically from a reasonably

narrow-beam acoustic transducer. This burst of ultrasound will be reflected from nearby surfaces and can be

detected at the sensor after a time T. This time interval is the out-and-back time. Since the speed of sound in air is

known, it is a simple matter to calculate the distance to the reflecting surface using the relationship between

velocity and time.
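The out-and-back relationship above can be sketched as a small helper function. This is an illustrative calculation only, not the project's firmware; the 343 m/s figure assumes air at roughly 20 °C.

```c
#include <assert.h>

/* Illustrative sketch: convert a sonar "out-and-back" echo time into a
 * range. The speed of sound in air is roughly 343 m/s at 20 degrees C;
 * the pulse travels to the surface and back, so the one-way distance is
 * half the total path length. */
static double sonar_range_cm(double echo_time_us)
{
    const double speed_of_sound_cm_per_us = 0.0343; /* 343 m/s = 0.0343 cm/us */
    return (echo_time_us * speed_of_sound_cm_per_us) / 2.0;
}
```

For instance, an obstacle 1 m away produces an out-and-back time of roughly 5.8 ms.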

A major advantage of ultrasound sensing methods is that the dependency of the sensor response upon the

material being sensed is reduced, when compared to methods such as Opto-sensing and RADAR. This is clearly

of benefit in an indoor environment, where a variety of obstacles having different surface


Figure 2.1: Operation of a Hall-effect sensor in conjunction with a permanent magnet. © [Fu, 1987] p. 279


compositions may be encountered. A contrasting disadvantage is that the sensor field is in the shape of a cone;

the detected object could be anywhere within the sensor cone at the measured distance. The accuracy of the

position measurement is dependent on the width of the sensor beam. Also, a phenomenon called Specular

Reflections can cause inaccuracies in the measurements. If an ultrasound beam strikes a smooth surface at a

sufficiently shallow angle, the beam will be reflected away from the receiver instead of back towards it. This

may cause a larger range than actually exists to be read by the sensor.

There are methods that have been developed to combat specular reflections. One method uses so-called “Regions of Constant Depth”. If a 360° sonar scan is performed (for example), a significant section of arc where the ranges measured are constant is termed a Region of Constant Depth (RCD, see Figure 2.3). These regions can be interpreted by taking two (or more) sensor scans from two differing locations and comparing the arcs of the RCDs. If the arcs intersect, a corner is indicated at the point of intersection. If the arcs are caused by a flat wall, they will be at a tangent to the reflecting plane (see Figure 2.2). [Nehmzow, 2000]

A third issue to be overcome relates to arrays of ultrasound sensors. If one sensor detects the reflected pulse

from another, so-called crosstalk arises. Solutions to this include coding the sensor signals somehow, or

controlling the timing of the sensors to prevent erroneous detections. [Nehmzow, 2000]

Ultrasound sensors are effective at much greater distances than the proximity sensing methods mentioned

above, even taking into account the increased atmospheric attenuation of sound waves at high frequencies. This

means that the robot would have more freedom of movement, and would be able to sense obstacles at a greater

range, allowing more time for path-planning computations to be performed.

An experiment performed by Mitsubishi Electric Corporation showed that a mechanically scanned ultrasound

sensor was able to detect the locations of standing persons within a room ([Pugh, 1986] p. 271). Investigations

were also made into the practicality of an electronic scanning system.

The advantage of the electronic scanning system over the mechanical system is that the mechanical system's servos (used to pan and tilt the sensor beam) contribute vibrational noise, and its assembly is by necessity quite large. An electronic scanning system can instead deflect the beam by adjusting the relative phases of the emitter elements so that the beam is steered in the desired direction. The study performed by Mitsubishi highlighted the problems with resolution, reliability, and

processing time that must be overcome in the implementation of this form of sensor.


Figure 2.3: 360 degree sonar scan, with two RCDs. © [Nehmzow, 2000] p. 27

Figure 2.2: Differentiating between Walls and Corners using RCDs. © [Nehmzow, 2000] p. 28


A fixed sensor will not have the flexibility of the scanning sensor, but will be simpler to mount and utilise. Multiple sensors are needed to provide all-round coverage.

2.2.5 Laser Range Finders

These sensors are also referred to as Laser Radar or 'Lidar'. They are common in robotics, and function in the same manner as the sonar sensors detailed above; instead of emitting a pulse of ultrasound, a pulse of near-infrared light is emitted. The out-and-back time is again used to determine the range to the detected object. However, since the speed of light is much faster than the speed of sound through air at room temperature (of the order of 10⁶ times higher), the means of measuring the out-and-back time must be proportionately more accurate.

Since the wavelength is also much shorter, the probability of total reflection off a smooth surface is reduced,

so specular reflections are less of an issue. Accuracy of commercial Laser sensors is typically in the millimetre

range. [Nehmzow, 2000]

2.2.6 Shaft Encoders

In order to determine the robot's position, some form of odometry is useful. Sensors known as shaft encoders are used to measure the rotations of the robot's wheels. If the circumference of the wheels is known, the distance travelled (and possibly the direction) can be determined.

For measuring distance travelled, Incremental encoders are most suitable. The alternative, Absolute encoders,

are more suitable for measuring the current position of a rotating shaft. Incremental encoders are suited for

summing movement over time. In a typical set-up, two photoreceptors are used to read a disc affixed to the shaft. The disc is encoded with two tracks of information, one for each receptor, in such a way that one will always lag in the case of clockwise rotation, and always lead in the case of anti-clockwise rotation (for example). The number of times each receptor is triggered then indicates the number of revolutions achieved.
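The two-track (quadrature) scheme described above can be sketched as a small decoding function. This is a hypothetical illustration, not the report's code; the direction convention is arbitrary.

```c
#include <assert.h>

/* Illustrative quadrature-decoding sketch (hypothetical, not from the
 * report). The two tracks A and B are read as a 2-bit state (A<<1)|B.
 * In the forward Gray-code sequence 00 -> 01 -> 11 -> 10 -> 00 each
 * valid transition counts +1; the reverse sequence counts -1. */
static int quad_step(int prev_state, int new_state)
{
    /* fwd[s] = the state that follows s when rotating "forward". */
    static const int fwd[4] = { 1, 3, 0, 2 };
    if (new_state == fwd[prev_state])
        return +1;                 /* one step in the forward direction */
    if (prev_state == fwd[new_state])
        return -1;                 /* one step in the reverse direction */
    return 0;                      /* no movement, or a missed transition */
}
```

Summing the returned steps over time gives a signed count of shaft movement.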

Using shaft encoders to provide odometry, and in turn an estimate of position, is known as dead reckoning. It

has been observed in practice that dead reckoning is very unreliable over any significant distance. This is due to

motions of the shaft that are not due to locomotive rotation, such as skidding or slipping on a surface. Such

issues would be of particular concern in a skid steer system. [Nehmzow, 2000]
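The count-to-distance relationship underlying dead reckoning can be sketched as follows. The wheel dimensions and counts-per-revolution used in the test are illustrative placeholders, not values from the report, and the calculation assumes the wheel rolls without slipping.

```c
#include <assert.h>
#include <math.h>

/* Illustrative dead-reckoning sketch (hypothetical values, not from the
 * report). If the encoder produces counts_per_rev counts per wheel
 * revolution, the distance rolled is the number of revolutions times the
 * wheel circumference. Skidding or slipping wheels break this assumption,
 * which is why dead reckoning drifts over any significant distance. */
static double rolled_distance_cm(long counts, long counts_per_rev,
                                 double wheel_diameter_cm)
{
    const double pi = 3.14159265358979;
    double circumference_cm = pi * wheel_diameter_cm;
    return ((double)counts / (double)counts_per_rev) * circumference_cm;
}
```
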

When conducting preliminary research for this project, it was noted that Optical (IR and Laser) and Acoustic

sensors (ultrasound) are common products available for amateur Roboticists. This may be taken as a reasonable

indication of their ease of use and manufacture, and of their suitability for indoor robotic sensing applications.

2.3 Control of Motors

The robot uses DC electric motors, and in order to control and drive them, a system incorporating a power converter/regulator is needed. The power from the chassis battery must be translated to the 7.2 V needed by the DC motors, and regulated in such a way as to provide speed, acceleration, and directional control to the robot.

A range of off-the-shelf controllers are available from Devantech Ltd [Devan, 2006], which perform exactly

the task outlined in the above paragraph. These controllers are highly modular, requiring only power and control


inputs, and can be controlled using a variety of methods, including analogue voltage inputs, and Radio Control

Model systems. Of particular note is the I2C capability built into many of these products, considering the

availability of USB-I2C interface devices from the same manufacturer, although analogue signals can also be

used to control the speed/direction.

It was possible that a custom circuit could have been designed, incorporating the power regulation and communications capabilities desired. This, however, would have been a major design undertaking, requiring significant time and effort to develop a working device, and was considered infeasible within the time constraints of this project.

2.4 Control Interface

The motor controllers used in the above example robot act as slaves on an I2C bus. This communications standard was developed by Philips as a means of communicating between Integrated Circuits, using a minimum number of data lines. A range of sensors are available from the company supplying the motor controller and the chassis that also act as I2C slaves. Elsewhere in the product range, there are sensors based on the same principle, but triggered by logic levels, with no bus communication functionality. It was felt that the logic-triggered sensors should be combined with a processing interface, as this would allow for more flexibility, and would foster greater understanding of the technology.

The laptop computer has several USB ports available (due to an installed USB expansion card), and a USB to

I2C translation device is available from Devantech Ltd [Devan, 2006]. This makes the I2C bus a viable choice

for a sensor/motor controller interface, as the aforementioned translation device is treated as a serial (COM) port

by Windows (through the use of the freely available drivers), and the writing of Windows programs to access

serial ports is a simple task.

Another option for the connection of sensors to the laptop computer is an RS232 interface to one of the serial

communications ports. This would have required more time to implement, however, and in order to achieve a

working solution quickly, the USB-I2C interface was deemed to be the best choice. In addition to this, RS232 is

an older technology, which may not be supported on future laptops. Therefore, in order to include an element of

'future-proofing' to the system, USB is a better choice.

If the bus-enabled sensors were chosen, they would have been connected to the laptop and controlled directly

via the Hardware Interface Software. If the logic-triggered sensors were chosen, then some intermediate device

needed to be in place to govern communications between the sensors and the laptop PC. There also needed to be

some provision for the possibility of using other types of sensors, and sensors from other manufacturers. The I2C

enabled sensors appear to be unique to Devantech Ltd [Devan, 2006], and this should be considered with regard

to the long term maintainability of the system.

Such an intermediate device needs to either have inbuilt I2C functionality, or be sufficiently customisable

that an I2C interface can be implemented. Devantech Ltd offer a range of I2C to IO devices that can do this job

very simply.

A more sophisticated solution is to use a PIC (Peripheral Interface Controller) microcontroller, manufactured by Microchip Inc. These chips come with a wide range of features (including in-built I2C functionality) and are a


very popular and widely understood product range. Many of Devantech Ltd's products are based around PIC microcontrollers, which suggests that the PIC product family is trusted and well-supported by the robotics community. In addition to this, facilities for programming PICs are available in the Undergraduate Labs. Such facilities include MPLAB™ [Micro, 2000] software, which allows programs to be composed in assembly

language, and, with the installed C18 compiler, in C. Since the author is familiar with C from a level 1

programming course, this does not require learning a new language. The MPLAB software includes

sophisticated debugging tools, allowing code execution to be 'stepped through', whilst displaying the values of

any program variables. Debugging can be done in hardware, if an In-Circuit Debugger tool is connected to the

correct pins on the PIC. The undergraduate lab has a number of these tools, as well as PICStart Plus

programmers, which are simply used to program PICs before they are installed into a circuit. Also useful are

development boards which provide a variety of tools for testing programs and concepts (switches, keypads,

displays, etcetera).

2.5 Technologies Selected

Based on the available products, and the literature researched (see References), Ultrasound Sensors were selected for collision avoidance and obstacle detection. The model chosen was the Devantech Ltd SRF05. This sensor is simple to use, and has a range of 1 cm to 4 m, suitable for the distances encountered in the CVSSP.

The motors are controlled by an MD22 Motor Controller. As mentioned above, this was shown to work well with the chassis and motors in a demonstration video. In addition, buying as many components as possible from the same manufacturer was intended to keep postage costs down.

A PIC18F2220 microcontroller is used as a sensor controller, and communicates with the Laptop using the

built-in I2C module, through a USB to I2C interface (USB-I2C). It should be noted that the I2C module is part

of a configurable serial port system on the PIC, allowing the use of other serial protocols in the future. The

configurable logic outputs are used to trigger the SRF05 sensors (see below), and the internal timers are used to

measure the length of the return pulse.

The sensors used in the design are triggered by logic signals applied to their control pins. A range of sensors

are also available with I2C bus functionality, allowing them to act as slave devices and respond to commands in

the same fashion as the MD22 motor controller. The simpler sensors were selected to keep the Hardware

Interface Software as simple as possible, and in order to provide a wider range of learning opportunities, such as

PIC programming.

It was reasoned that should the PIC based solution prove unworkable, the I2C-ready sensors could be used

with a minimal number of changes to the Hardware Interface Software. In addition, a possible future project

could be based on constructing Ultrasound rangers from scratch, based on the commercially available model. If

this was the case, the rangers would have an interface already in place, although this was not necessary.


The sensors are mounted at positions on the chassis of the robot, at equal angular spacing (see Figure 2.4). This will allow a model of the robot's surroundings to be produced, by frequently polling each sensor. In this way, a constantly updating navigational map can be constructed. The eight sensors, mounted as shown in Figure 2.4, should allow for maximum information about the environment to be collected, as well as the possibility of using Regions of Constant Depth (see 2.2.4).

It was hoped to add a magnetic sensor to the robot to allow orientation to be determined. However, due to time constraints this was not implemented, and no significant research was done on this type of sensor.


Figure 2.4: Position of Sensors on Chassis


3 Design Overview

3.1 Introduction

This section examines the overall structure of the Sensors and Control System. The Hardware and Software aspects of the system are discussed in turn, with brief justifications for different aspects of the design.

3.2 Hardware Design Hierarchy

The laptop computer is the control centre for the robot, and is the platform for the software needed to control the robot. The laptop has Wireless LAN capability, allowing the robot to be controlled from another computer on the same network. This prevents the decision system being constrained by the specifications of the laptop computer.

The robot sensors and motors are interfaced using a USB/I2C conversion device. This allows commands to

be sent to the sensor controller/motor controller through a standard serial port software interface.

The motor controller and sensor controller reside on the I2C bus, as mentioned above. The MD22 is a self-contained prefabricated unit, requiring only control and power inputs, whereas the sensor controller is a bespoke

circuit consisting of:

● A PIC chip

● A 24MHz oscillator

● Sensor connections

● Emergency Override Circuit

The power source for the Sensor Controller and MD22 is drawn from the USB/I2C interface device. This

negates the need for a DC-DC converter or other 5V supply.

The control outputs of the MD22 are connected to the chassis motors in a 'skid-steer' configuration. This

means that the controller drives a pair of motors on each side of the chassis. If a right hand turn is desired, for

example, the power to the left hand side motors is increased, and the robot will turn to the right. This method

allows for differential control of direction, which may facilitate simpler computations for the high level AI.
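The skid-steer mixing described above can be sketched as a small function. This is an illustrative calculation, not the MD22's actual register interface, and the -100..+100 demand range is an assumption made for the example.

```c
#include <assert.h>

/* Illustrative skid-steer mixing sketch (hypothetical, not the MD22's
 * real interface). A forward demand and a turn demand, each in the range
 * -100..+100, are mixed into per-side motor speeds: turning right means
 * driving the left-hand pair of motors faster than the right-hand pair.
 * Results are clamped to the valid range. */
static int clamp100(int v)
{
    if (v > 100)  return 100;
    if (v < -100) return -100;
    return v;
}

static void skid_steer_mix(int forward, int turn, int *left, int *right)
{
    *left  = clamp100(forward + turn);  /* turn > 0: right-hand turn */
    *right = clamp100(forward - turn);
}
```

A pure turn (forward = 0) drives the two sides in opposite directions, spinning the robot on the spot.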

The SRF05 Ultrasound rangers are mounted at eight points on the frame of the robot; each sensor requires the

following connections:

● +5V

● Ground

● Trigger logic input

● Echo logic output

The Echo outputs of each sensor are connected to a single pin on the control PIC, as each sensor is

polled separately. Each Trigger input must be driven from a separate PIC output pin, however.

It was found during the design process that, when two or more sensors were connected to the PIC, the


echo pulse would not be received by the PIC. An oscilloscope was used to determine that the sensors were

functioning correctly and were being triggered. It was reasoned that when two or more sensors are connected to

the same node in the circuit, the echo outputs of the other sensors load that node with their impedance, causing

the voltage at the PIC to be less than the logic 1 voltage. In order to remedy this, diodes were connected between

each echo pin connection and the circuit node where they were joined. This was observed to remedy the

problem.

3.3 Software Design Hierarchy

Data is exchanged between the Robot Hardware and the controlling decision system through an intermediate software program (Hardware Interface Software). This program must be running for the robot to respond to commands. It exchanges commands and data over a TCP/IP connection, using a network API developed for this project [Aichi, 2007]. The TCP/IP link transmits and receives over the CVSSP Wireless LAN.

The command set used is flexible enough that the robot can be controlled either by a software decision

system, or by a human operator using a Remote Control program. A piece of software performing this function

was written to aid in development and testing. This software is referred to as the Remote Control Software.

3.4 Conclusion

The Sensors and Control System can be broadly divided into the Hardware aspect, consisting of the Sensor Controller (with Sensors) and MD22 Motor Controller, and the Software aspect, consisting of the Hardware Interface Software, which is built around the Hardware Interface Class and is the point of control for the user and/or decision system.


4 Hardware Design

4.1 Introduction

This section aims to discuss in greater depth the various Hardware features of the Sensors and Control System. Each component is described in terms of its function and capabilities. Concepts necessary for a practical understanding of the Hardware are explained.

The tests that were performed on the Hardware aspects of the Sensors and Control System are described,

along with the results of the tests, and the conclusions drawn (and actions taken, if any).

4.2 PIC Sensor Controller

The SRF05 sensors are controlled by a PIC18F2220 device, configured as an I2C bus slave. The PIC is programmed using MPLAB software released by Microchip Technology Incorporated, who also manufacture the PIC range of devices. The features built into the PIC18F2220 include:

– On-board and external Oscillator modes

– Pulse Width Modulation (PWM) dedicated inputs and outputs

– 10-bit Analogue-to-Digital Converter

– On-board EEPROM flash memory

– Serial Communications module, with USART and I2C capabilities

The features listed above provide a good deal of flexibility to the user, allowing the PIC to control more

features of the robot, should they prove necessary. This flexibility extends in scope to future projects. See Figure

4.1 for the circuit diagram.

The PIC triggers each sensor in turn, measures the out and back time, and stores the result in a register entry

reserved for that sensor. The sensor data can be requested at any time by the I2C bus master (the laptop

computer), and the PIC will respond with the data. It is also intended for the PIC to perform ranging constantly

upon power being applied.
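The per-sensor result registers described above might be modelled as follows. The register layout and function names here are hypothetical, chosen only to illustrate how a 16-bit timing result can be served to the I2C master one byte at a time on an 8-bit device.

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative sketch (hypothetical register layout, not the actual
 * firmware): the controller stores one 16-bit timing result per sensor
 * and serves it to the I2C master a byte at a time, split into high and
 * low bytes as is natural on an 8-bit PIC. */
#define NUM_SENSORS 8
static uint16_t result_reg[NUM_SENSORS];

static void store_result(int sensor, uint16_t ticks)
{
    result_reg[sensor] = ticks;
}

/* Register map: register 2n = high byte of sensor n, 2n+1 = low byte. */
static uint8_t read_register(int reg)
{
    uint16_t v = result_reg[reg / 2];
    return (reg % 2 == 0) ? (uint8_t)(v >> 8) : (uint8_t)(v & 0xFF);
}
```
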

The PIC is programmed to interrupt power to the motors should an obstacle be detected within a small radius

of the robot. This is achieved by using a relay through which the motor power is carried. If the relay control line

(connected to the PIC) is at logic 0, the relay shall prevent power from reaching the motors, whereas a logic 1

will allow power to flow. This has the added benefit of preventing the robot from moving when the sensor

module is un-powered. A simple switch is fitted in order that this mechanism can be bypassed if necessary.
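The relay-gating decision described above might be sketched as a pure function. The safety radius and all names here are hypothetical; only the logic-1-enables-power convention is taken from the text.

```c
#include <assert.h>

/* Illustrative collision-avoidance sketch (threshold and names are
 * hypothetical). The relay control line is held at logic 1 (motor power
 * allowed) only while every sensor reports a range beyond the safety
 * radius; any reading inside it drops the line to logic 0 and cuts the
 * motor power. As described in the text, an unpowered or idle controller
 * then leaves the motors disabled by default. */
#define NUM_SENSORS 8

static int relay_enable(const unsigned ranges_cm[NUM_SENSORS],
                        unsigned safety_radius_cm)
{
    for (int i = 0; i < NUM_SENSORS; i++)
        if (ranges_cm[i] < safety_radius_cm)
            return 0; /* obstacle too close: cut motor power */
    return 1;         /* all clear: allow power to the motors */
}
```
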

The PIC clock is driven from an external crystal, across pins RA6 and RA7, running at 24 MHz. This high

clock speed minimises delays in processing, since responsiveness is important when making decisions based on

changing sensor readings.

The oscillator module used for timing the sensor pulses is automatically run through a ¼ prescale, with an

additional ¼ prescale selected in the Timer options, giving an effective frequency of 1.5 MHz. This allows


sufficiently high resolution (better than 1 cm accuracy) whilst allowing the maximum pulse width of the sensor (~30 ms) to be measured. The timer uses a 16-bit register to store its counter value; this register is read in two byte-read operations.
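The timing figures quoted above can be checked with a short calculation. This is an illustrative sketch only, not the PIC firmware; it assumes the two divide-by-four stages described in the text and sound at roughly 343 m/s.

```c
#include <assert.h>

/* Illustrative sketch checking the quoted timing figures: a 24 MHz clock
 * through two divide-by-four stages gives a 1.5 MHz tick rate, so each
 * timer tick is 2/3 us, and a full 16-bit count spans about 43.7 ms,
 * comfortably more than the sensor's ~30 ms maximum pulse width. */
static double tick_period_us(double fosc_hz)
{
    return 1e6 / (fosc_hz / 4.0 / 4.0); /* two 1:4 prescale stages */
}

/* Convert a raw 16-bit timer count into a range using the out-and-back
 * relation (343 m/s = 0.0343 cm per microsecond). */
static double ticks_to_range_cm(unsigned ticks, double fosc_hz)
{
    double echo_us = ticks * tick_period_us(fosc_hz);
    return echo_us * 0.0343 / 2.0;
}
```
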

4.2.1 Important features of the circuit

1 µF decoupling capacitors are connected between the 5 V (Vdd) and 0 V (GND) supplies, to help reduce

voltage fluctuations and spikes. Capacitors are also connected between GND and circuit nodes that undergo high

speed voltage transitions to reduce ringing.

Connection points are included for connecting a reset switch for the PIC. The PIC's MCLR pin should be

pulled up to Vdd for the PIC to operate correctly. A switch closed between GND and MCLR will pull this pin

low, resetting the PIC. This feature is intended to be used in the future if the PIC code is expanded; it may be

possible for the program execution to become 'stuck', if future code is not fully implemented.

The heartbeat LED (used to give an indication of program status) is connected between Vdd and the controlling output on the PIC (with a series resistor to limit the current). When the output is driven high, the LED turns off (there is no potential difference across it); when the output is low, the LED will be on.

An RJ12 socket is included for In-Circuit Debugging/Programming. The socket connects to the Vdd and

GND lines, the MCLR reset pin, and the RB7 and RB6 pins. It is important that the sensors are disconnected

when using the RJ12 socket, as the sensors will disrupt the signals on the RB7 and RB6 pins.


Figure 4.1: Circuit Diagram of Sensor Controller


4.2.2 Brief explanation of I2C bus protocol

In the I2C bus protocol, a device is either a Slave or a Master. Each Slave device has a 7-bit address that must be unique on the bus. Slave devices only respond to signals from the Master; they never initiate a transaction. Generally there is only one Master on the bus; however, it is possible to work with multiple Masters, though this is beyond the scope of this report.

The I2C bus consists of two physical signal lines, SCL and SDA. SCL is an active-high clock line, and SDA

is a Data/Address line. When a device puts a signal onto the bus, it does so by pulling one or both of these lines

low. This has the effect of automatically reserving the bus for that device until it allows the line to be pulled

high. Both the SCL and SDA lines must have pull-up resistors (in this case these are included on the USB-I2C

device).

In the system described in this report, there is one bus Master (the laptop PC) and two slaves (the MD22 and

the PIC sensor controller). The slaves have addresses 0xB0 and 0xE0 respectively.

When the Master wishes to start a transaction, it puts a START condition onto the bus. This consists of the

Master pulling the SDA line low, then pulling SCL low after a short delay. Upon receiving this START

condition, all slave devices will immediately listen for their address. The Master will send the address of the

device it wishes to communicate with; the 7-bit address will be sent as part of a byte, with the last bit indicating

a read or write operation (read = 1 / write = 0). For example, if the Master wished to write to the Sensor

Controller, it would send the address 0xE0, and for a read it would send 0xE1.

The Master controls the SCL line for this phase of the transaction. Data on the SDA line is valid when the SCL line is high. If a Slave detects its address, it will respond with an ACK condition: after the Master pulls SCL low at the end of the eighth bit of the address byte, the Slave will pull the SDA line low. The Master will then release SCL high for a ninth (acknowledge) clock period, and the Slave will then release the SDA line to complete the ACK sequence. The Master can then proceed with the next byte of the transaction (if it is writing data) or wait for a response (if it is reading).

For the purposes of the PIC sensor controller, the following sequence will occur for a read (the only data to

be written to the PIC is the register address to read from):

1. The Master will send a START condition.

2. The Master will send the ADDRESS 0xE0 (write mode).

3. The PIC will respond with an ACK.

4. The Master will send a byte with the value of the register it wishes to start reading from.

5. The PIC responds with an ACK.

6. The Master sends another START condition (repeated start), followed by the ADDRESS 0xE1 (read

mode).

7. The PIC responds with an ACK. It will then hold the SCL line low until it is ready to transmit data.

This is referred to as Clock Stretching, and allows slower processors to communicate with faster ones

by allowing the Slave to decide when transmission starts.


8. The Master releases the SDA line, and generates 8 clock pulses on the SCL line (once it has been

released by the Slave). The Slave will change the SDA line according to each bit of the byte to be

transmitted.

9. Upon successful reception of the byte, the Master will generate an ACK condition by pulling the

SDA line low (after it is released by the Slave) and triggering a clock pulse.

10. The Slave is now free to send another byte, unless the Master has read the required number of bytes.

In this case, immediately after the ACK, the Master will send a STOP condition (first release SCL, then

SDA). This tells the Slave to go back to waiting for a START condition.
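The address arithmetic in steps 2 and 6 can be sketched as follows (`address_byte` is a hypothetical helper, not code from the project):

```python
def address_byte(base_addr: int, read: bool) -> int:
    """Compose an I2C address byte: the upper seven bits carry the
    slave address, and bit 0 selects read (1) or write (0)."""
    return (base_addr & 0xFE) | (1 if read else 0)

# Sensor controller: 0xE0 for writes, 0xE1 for reads.
assert address_byte(0xE0, read=False) == 0xE0
assert address_byte(0xE0, read=True) == 0xE1
# MD22 motor controller at 0xB0.
assert address_byte(0xB0, read=True) == 0xB1
```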

4.3 Motor Controller

As the MD22 motor controller is a self-contained device, it is simply connected to the I2C bus, as well as the requisite power and control points (for details of the MD22 operation, see the relevant section of the Devantech Ltd website [Devan, 2006]).

The MD22 is operated in Mode 1; this mode has a separate speed register for each motor (left and right), and interprets the contents of these registers as signed values (127 (0x7F) is full forward, -128 (0x80) is full reverse).

The MD22 also has an acceleration register, which allows the rate of power stepping of the motors to be

controlled. The acceleration value is changed to prevent over driving the motors. For example, if the robot is

travelling at maximum forward speed, and a command is received to travel at maximum reverse speed, there is

the possibility of over driving the motors if the power steps are of minimum size. A section of code in the

Hardware Interface Software is used to prevent this (see 5.3).
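The signed encoding used in Mode 1 can be illustrated with a small sketch (a hypothetical helper; the register write itself travels over the I2C bus as described in 4.2.2):

```python
def speed_to_register(speed: int) -> int:
    """Encode a signed Mode 1 speed (-128..127) as the byte value
    written to an MD22 speed register (two's complement)."""
    if not -128 <= speed <= 127:
        raise ValueError("speed out of range")
    return speed & 0xFF

assert speed_to_register(127) == 0x7F    # full forward
assert speed_to_register(-128) == 0x80   # full reverse
assert speed_to_register(0) == 0x00      # stop
```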


4.4 Laptop Cradle and Sensor Platform

It was necessary to design and construct a hardware fixture to support the laptop when the robot was being

operated. Also, mounting points were needed for the sensors and camera, which could not easily be attached to

the purchased chassis whilst at the same time having a wide coverage. The decision was taken to combine a

laptop cradle and sensor platform into a one-piece construction. Discussions were conducted with the University

Engineering Workshop to establish the construction methods available, and a design was produced (see Figure

4.2). This design was made from aluminium, making it light and strong, and allowing mounting points to be

drilled for the sensors anywhere on the frame. Right-angle mounting brackets were produced in order to attach


Figure 4.2: Laptop Cradle Design (dimensions in mm)

Figure 4.3: Sensor Bracket Design


the Ultrasound sensors to the frame (see Figure 4.3). The cradle also incorporated a support for the laptop screen,

allowing the display to be read when the robot was operating without the screen being swung backwards by the

robot's inertia.

NOTE: The designs in Figure 4.2 and Figure 4.3 were developed in conjunction with the rest of the project

team, and with the assistance and guidance of staff in the University of Surrey Mechanical Workshop. The

concepts involved did not originate only with the author.

4.5 Testing (Hardware)

4.5.1 Motor Control Tests

The aim of this set of tests was to characterise the performance of the MD22 in conjunction with the robot motors, and to assemble a list of commands that could be used to control the robot's direction and speed. In order to do this, the MD22 documentation was used as a reference in testing the different modes of operation.

Equipment Used

– Laptop Computer

– Serial link test Software

– USB/I2C interface

– Laboratory Power Supply

– MD22 motor controller

– Robot Chassis inc. motors.

Test Circuit

In order to test the functionality of the Motor Control System, the chassis and motors were assembled, and the MD22 connected to the motors and a laboratory power supply, set to provide 7.2 V DC (see Figure 4.4). The MD22 was accessed using a serial port test program downloaded from the Internet [Ser, 2004], and connected using the USB/I2C link. The 5 V DC power for the MD22 control circuitry was drawn from the supply pins on the USB/I2C link, for the sake of simplicity. The chassis was placed with the wheels off the ground, and various instructions were input to the MD22, and the results observed.


Figure 4.4: Motor Controller Test Circuit

[Circuit: the laptop computer drives the MD22 motor controller via the USB/I2C interface; a 7.2 V PSU powers the MD22, which drives the left and right motors.]


Test Procedure

The establishment of serial communications with the MD22 was done using a free Serial Port Test Program [Ser, 2004]. The parameters for communication with the USB/I2C device are (from the manufacturer's documentation [Devan, 2006]):

● Baud Rate of 19200

● 8 data bits

● No Parity bits

● Two Stop bits.

The commands are sent as Hex characters.
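For example, a command typed into the test program is simply the bytes of the I2C transaction rendered as hex characters (the register and value below are illustrative only):

```python
def to_hex_chars(cmd: bytes) -> str:
    """Render command bytes as the space-separated hex characters
    entered into the serial port test program."""
    return " ".join(f"{b:02X}" for b in cmd)

# Illustrative only: address 0xB0, register 0x01, data 0x7F.
assert to_hex_chars(bytes([0xB0, 0x01, 0x7F])) == "B0 01 7F"
```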

The modes of operation were tested in turn, with key speeds being applied (i.e. full forward, full reverse, half forward, half reverse, etc.). Some initial testing was done to ensure that the motors were connected correctly. The reason for this was that the polarities of the motor power outlets on the controller are not labelled.

Results & Analysis

It was found that for correct operation, the positive leads should be on the two outermost outlets. This will

result in the motors turning in the same direction when a “Full-speed forward” command is received.

The modes of operation all performed as expected, and the response of the controller was immediate. The

motor speeds were separated into 18 discrete levels for forward and backwards, to simplify the instruction set. It

is deemed unlikely that 127 levels of forward and backward speed will be required.
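One plausible mapping from the 18 discrete levels onto the MD22's signed range is linear scaling (a hypothetical sketch; the report does not give the exact level spacing used in the project software):

```python
LEVELS = 18

def level_to_speed(level: int) -> int:
    """Map a level in -18..18 onto the signed -127..127 speed range."""
    if abs(level) > LEVELS:
        raise ValueError("level out of range")
    return round(level * 127 / LEVELS)

assert level_to_speed(18) == 127
assert level_to_speed(-18) == -127
assert level_to_speed(0) == 0
```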

4.5.2 Speed Tests

Aim

To characterise the speed of the robot fully loaded with laptop, camera, chassis, and eight sensors.

Equipment Used

– Digital Stopwatch

– Robot (complete chassis with sensors and sensor controller)

Methodology

The robot was set up with the items mentioned above installed. A second laptop computer, owned by the author, was used to issue commands, using the Hardware Interface and Remote Control programs. The Remote Control program was operated in terminal mode (see 5.5), as it was not possible to steer the robot remotely and maintain a consistent speed whilst measuring it.

The robot was placed on the floor of the laboratory, and markers were placed at a 6 metre interval. A


stopwatch was used to time the robot's passage between the markers, running at various speeds. The information obtained was used to calculate the speed of the robot in metres per second (ms-1), with three runs at each speed setting being made, then averaged.

The speed settings were incremented in steps of 3, from 6 to 18. The lower speeds were not tested, as it was

expected that the speed would increase linearly, allowing the lower equivalent speeds to be extrapolated.

Results & Analysis

It was observed that the robot did not travel in a straight line, even when the same speed levels were input to each pair of motors. The robot would always pull to the right, and so it was necessary to place the robot to the right side of the test course, angled to the left, so that it would not collide with furniture before reaching the end marker. This naturally made the distance travelled between the markers difficult to determine, and was a source of uncertainty.

Table 4.5.1: Average speeds for speed settings 6 to 18

Speed Setting | Lower Limit (ms-1) | Higher Limit (ms-1) | Average Value (ms-1)
6             | 0.2324             | 0.2498              | 0.24106
9             | 0.3990             | 0.4314              | 0.41519
12            | 0.5576             | 0.6061              | 0.58186
16            | 0.7622             | 0.8342              | 0.79822
18            | 0.8432             | 0.9254              | 0.88430

Figure 4.5: Graph showing approximate speed values (averaged over three readings and taking into account uncertainties)
[Plot: robot speed (ms-1) against speed setting (out of 18), showing Lower Limit, Higher Limit and Average Value series.]

Another source of uncertainty was the nature of the timing method. The author used a stopwatch to time the interval between the start and finish markers. This relied on the author's judgement of when the robot had crossed the marker, and so an uncertainty of ± 0.1 seconds was assumed.

In order to account for the sources of error present, upper and lower bounds for the speed were calculated,

using the recorded time + 0.1 seconds and distance of 6 metres for the lower limit (the 'worst-case'); and the

recorded time – 0.1 seconds and distance of 6.4 metres for the higher limit, since the curved path of the robot

would have increased the distance travelled. An average of these two limits was calculated.
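The bound calculation can be sketched as follows (the 7.0 s run time is illustrative, not a recorded value):

```python
def speed_bounds(time_s: float) -> tuple[float, float, float]:
    """Lower and upper speed estimates for a timed run, using the
    +/- 0.1 s timing uncertainty and a 6 m to 6.4 m path length."""
    lower = 6.0 / (time_s + 0.1)   # worst case: shortest path, longest time
    upper = 6.4 / (time_s - 0.1)   # best case: longest path, shortest time
    return lower, upper, (lower + upper) / 2

lo, hi, avg = speed_bounds(7.0)
```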

It can be seen from Figure 4.5 that the trend is quite linear until the higher speed values are reached. It is

likely that the readings for speed level 16 are anomalous in some way, perhaps due to a skewed average of times.

If the linear progression of the lower speed values is continued, ignoring the level 16 data, it should intersect the

level 18 data.

It is considered that the speed of the robot increases linearly with the speed levels, with a top speed of

approximately 0.884 metres per second.

4.5.3 Oscilloscope testing of I2C bus

Aim

To examine the signal lines of the I2C bus during communications with functional devices, and to use that

information to inform the writing of I2C slave routines for the PIC sensor controller.

Equipment Used

– TDS3032 Digital Oscilloscope

– TTi EL30T Power Supply Unit (PSU)

– PIC evaluation board

Methodology

In order to determine the source of the problem with I2C communications on the PIC, a digital oscilloscope

was used to examine the signals on the bus. One channel each was used to monitor the SCL (clock) and SDA

(data/address) lines. The oscilloscope was set to trigger on a falling edge, on the channel attached to SCL (the I2C lines idle high, so bus activity begins with a falling edge).

The bus was first tested using a Serial Port Test Program to send commands to the MD22 motor controller.

This was done to establish that the methodology was sound, since the address and data patterns were explicitly

known. A variety of messages were sent to the MD22, setting the mode, left and right speed registers to various

values.

The PIC (programmed with simple I2C code, allowing a constant value of 0x33 to be read) was connected to

the same Test Program using the Interface device. Attempts to read back the constant character were made,


whilst monitoring the SCL and SDA lines.

Results & Analysis

When using the MD22 to test the bus, the bit-patterns were observed to be as expected.

Attempts to read back the constant value from the PIC were made, and were unsuccessful. The oscilloscope trace showed that the SCL and SDA lines were not being pulled down to ground successfully. It was realised

that the 5V line for the I2C interface was connected to the 5V line powering the PIC, which was fed from a

separate source. The 5V lines were separated, whilst keeping the ground lines connected. The read tests were

repeated, and the oscilloscope trace clearly showed that the SCL and SDA lines were being pulled down

correctly. The test character, however, was not being received by the test program correctly. This indicated that

the code being used was not correct.

Modifications

When writing the I2C slave code for the PIC, an Application Note for PIC devices published by Microchip was consulted. This document is referred to as AN734 [Micro, 2000]. It details the different states that

an I2C slave can be in at any stage during a bus transaction. An error has been pointed out in this application

note, where the CKP bit is not set in the event of a Master NACK condition [I2C Slave, 2002] . This would

cause the PIC to stop responding after the first read.

The PIC code was re-written, based closely on AN734 and taking into account the error that was identified. The five possible states were incorporated into a C switch statement, based on the flag bits relating to the PIC serial

communications module. This was separated into read and write versions of the code, discarding those states that

would not be used. The same test character (0x33) was used for the read tests. The success of the write tests

would be based on whether or not an acknowledgement of 0x01 was sent back to the test program (this was

generated by the I2C interface, not the PIC chip).

This test code was found to work successfully, and the oscilloscope traces showed the correct characters

being sent across the bus, with appropriate responses being received.
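A sketch of the state decoding that the switch statement performs is shown below. The flag names follow AN734's use of the SSPSTAT bits (S, D_A, R_W, BF), but this is a simplified model rather than the project's actual interrupt routine, which must also manage the CKP clock-stretch bit (including the missing CKP set after a Master NACK).

```c
#include <assert.h>

/* Simplified AN734-style state detection from the SSPSTAT flag bits:
   S = START seen, D_A = last byte was data (1) or address (0),
   R_W = master read (1) or write (0), BF = receive buffer full. */
struct ssp_flags { int s, d_a, r_w, bf; };

int i2c_slave_state(struct ssp_flags f)
{
    if (f.s && !f.d_a && !f.r_w && f.bf)  return 1; /* write: address byte just received   */
    if (f.s &&  f.d_a && !f.r_w && f.bf)  return 2; /* write: data byte just received      */
    if (f.s && !f.d_a &&  f.r_w)          return 3; /* read: address byte, load first byte */
    if (f.s &&  f.d_a &&  f.r_w && !f.bf) return 4; /* read: last byte taken, load next    */
    return 5;                                       /* master NACK: end of the read        */
}
```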

4.5.4 Sensor Tests

The accuracy of the SRF05 rangers was assessed in the laboratory. A single ranger was used, suspended from a clamp stand approximately 1 m above the surface of the bench, so that vibrational interference would be minimised and false returns from the bench top negated. Three readings were taken at each distance, in order to determine the variability of the sensor readings at a constant distance, and to

account for minor changes in the sensor return between measurements (objects moving in the background). A

range of distances between 5cm and 1m was measured, as this was considered by the author to be the most

critical range for the sensors to operate at (the manufacturer states the maximum range as 3-4 m, depending on

application).


Equipment Used

– TDS3032 Digital Oscilloscope

– TTi EL30T Power Supply Unit (PSU)

– Push-to-make switch

– Breadboard

– Tape Measure

Test Circuit

The circuit used to test the sensor was a simple one (see Figure 4.6). A laboratory PSU was used to provide

+5V and Ground levels, which were also used to supply the sensor with power (not shown in circuit). A 1uF

capacitor was connected between the trigger input of the sensor and ground. This was included to counteract the

effect of 'bounce' on the push-to-make switch. If any spurious voltage spiking occurred due to switch bounce, the

capacitor would short-circuit the high-frequency spikes to ground, and provide a good DC level for the logic

trigger.

The trigger input must be held at +5V for 10 microseconds for the sensor to initiate ranging. The ranging will

not start until the trigger input goes low again, allowing the sensor to be triggered by hand using the circuit in

Figure 4.6.

Methodology

A flat upright surface was placed at various distances from the sensor aperture. The surface was the largest

face of a plastic component storage box found in the laboratory. This object was chosen as it had large flat

surfaces, and was of a rigid material, and was expected to have a high Ultrasound reflectivity. The distance

between the sensor and the surface was measured using a tape measure.

Figure 4.6: Sensor Test Circuit
[Circuit: a push-to-make switch pulls the SRF05 Trigger line up to +5 V, with a 1uF capacitor to ground for debouncing; the Echo line is monitored on the oscilloscope.]

The box was placed between 1 m and 10 cm away from the sensor aperture, in 10 cm increments, and finally at 5 cm. At each discrete distance, the push-to-make switch was pressed, and the resulting echo pulse captured on

the oscilloscope. The width of the pulse in microseconds (us) was measured using the scale on the oscilloscope

display. This value was recorded. Each measurement was repeated three times at each distance.

Results

The results obtained show that the length of the echo pulse varies linearly with the distance of the reflecting

surface (see Table 4.5.2). This corresponds to the expected performance of the Ultrasound Ranger, based on the

manufacturer's documentation. The method of manually reading the pulse width from the oscilloscope screen is

inaccurate, and a different method should be used to determine the uncertainty inherent in the Ultrasound

measurements.
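The linear relationship follows from the out-and-back flight time of the pulse; a sketch of the conversion, assuming ~343 m/s for the speed of sound in air:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate, at room temperature

def pulse_to_range_cm(pulse_us: float) -> float:
    """Convert an echo pulse width to range: the pulse spans the
    out-and-back flight time, so the one-way distance is half."""
    return (pulse_us * 1e-6) * SPEED_OF_SOUND_M_S / 2 * 100

# The 5800 us pulses recorded at 100 cm come out at roughly 99.5 cm.
```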

Analysis of Results

The results obtained show a strong correlation between the range measured from the sensor and the actual

range, see Figure 4.7. This is as expected.

The method used for measuring the pulse width was not highly accurate, and its inaccuracy is very hard to quantify. It was decided that further tests should be done, using a more accurate method of measuring the

pulse width, with an accuracy that can be quantified. This will allow a value to be quoted for the accuracy of the

sensors when connecting to other systems.


Table 4.5.2: Sensor Test Results

Range (cm) | Reading 1 (us) | Reading 2 (us) | Reading 3 (us) | Average (us) | Measured range (cm) | Deviation (cm)
100        | 5800           | 5800           | 5800           | 5800.00      | 100.00              | 0.00
90         | 5100           | 5150           | 5120           | 5123.33      | 88.33               | -1.67
80         | 4550           | 4550           | 4560           | 4553.33      | 78.51               | -1.49
70         | 4080           | 4090           | 4090           | 4086.67      | 70.46               | 0.46
60         | 3450           | 3440           | 3440           | 3443.33      | 59.37               | -0.63
50         | 2900           | 2990           | 2995           | 2961.67      | 51.06               | 1.06
40         | 2360           | 2380           | 2380           | 2373.33      | 40.92               | 0.92
30         | 1760           | 1730           | 1750           | 1746.67      | 30.11               | 0.11
20         | 1200           | 1210           | 1215           | 1208.33      | 20.83               | 0.83
10         | 620            | 620            | 620            | 620.00       | 10.69               | 0.69
5          | 320            | 322            | 320            | 320.67       | 5.53                | 0.53


4.5.5 Sensor Testing – Conclusive

Points to be investigated:

– The maximum angle at which the sensor will detect an object

– The response of differently shaped objects at different distances

– The accuracy of the range measurements at ranges up to 3m, down to 1cm, or the minimum distance at which the sensor is effective, whichever is the greater.

Equipment Used

– TDS3032 Digital Oscilloscope

– EL302T Triple Power Supply

– Standard Ranger test circuit consisting of:

– Capacitor

– Wires

– Push to make switch

– SRF05 Mounting Bracket (see Figure 4.3)

– A range of test objects, intended to simulate potential obstacles to the robot:

– A plastic equipment case, with roughly rectangular sides

– A plastic water bottle, empty (22.4cm ± 0.1cm high, radius of 3.1cm ± 0.1cm).


Figure 4.7: Graph showing results of initial sensor tests

[Plot: measured range against actual range (cm).]


– A small plastic pencil lead case with rhombic cross-section

– An extendible tape measure, with metal casing (width 1.8 cm ± 0.1 cm, height 5.3 cm ± 0.1 cm)

Assumptions:

It is assumed that the beam pattern of the SRF05 is symmetrical, and that it extends perpendicular to the plane of the circuit board.

Methodology & Results:

Test 1

An SRF05 sensor was bolted to a mounting bracket using M3 nuts, bolts, and washers. The right-angled mounting surface was secured by resting the edge of a laboratory breadboard on top of it, so that the sensor's transducers were pointing down the length of the laboratory bench. The sensor was upright.

A piece of A4 sized squared paper was placed on the bench top, with one edge secured under the sensor

bracket. This was so that the positions of objects, and the alignments thereof, could be accurately recorded, and

measurements performed.

The plastic water bottle was placed on the bench, 10 cm ± 0.1 cm away from the edge of the breadboard (the

actual distance to the sensor was 9.5 cm ± 0.1 cm, as measured to the circuit board of the SRF05). The bottle

was positioned by eye to be as central to the sonar beam of the sensor as possible (lines were marked on the

paper to facilitate this). The position of the bottle was marked at this point.

Using the pulse width measurement facility of the TDS3032, the out and back time of the Ultrasonic Pulse

was read four times, and the results recorded. Triggering of the sensor was accomplished using the push to make

switch to raise the SRF05 trigger line to 5 Volts.

Radial lines were marked out from the centre of the SRF05, at 10 degree intervals, using a protractor. The

measurement procedure was repeated, with the bottle moved 10 degrees to one side of the marked centre line,

ensuring that the radial distance between the bottle and the sensor was maintained at 9.5 cm ± 0.1 cm. The bottle

was moved to 20 degrees off of centre, and the measurements were taken again. The bottle was then moved to 25

degrees off of centre, where it was found not to be detected by the sensor (the out and back time for all four

measurements was the maximum allowed for by the sensor specification).

In order to discern the threshold at which the sensor beam did not reflect from the bottle, the angle of

deflection was measured incrementally between 20 and 25 degrees. It was found that a deflection angle of 23

degrees was approximately the threshold at which the bottle could not be detected.

Test 2

The procedure of locating the detectability threshold was performed in the same manner as in Test 1,

substituting the metal tape measure for the plastic bottle. This time, the threshold of detectability was measured

to be slightly wider, at 27.5 degrees off of centre.


Test 3

The plastic pencil lead case was used for the third set of measurements, proceeding as outlined in the above

two sections. A problem was encountered, in that the lead case was undetectable beyond a range of 4.8 cm ± 0.1

cm, when facing its widest aspect towards the sensor, and beyond a range of 6.3 cm ± 0.1 cm, when facing its

narrower, more sharply angled aspect toward the sensor.

Test 4

A different procedure was adopted for this test; the plastic equipment case was placed at a distance of 25 cm

± 0.1 cm away from the sensor, centrally to the sensor beam. With its widest flat face toward the sensor, the out

and back period was recorded four times.

The equipment case was rotated on its vertical axis in increments of 10 degrees. With a rotation of 30

degrees, the out and back period was at the sensor's maximum, indicating no return pulse was received. The out

and back period was then measured at 25, 26, 26.5, 27 and 27.5 degrees, in order to determine the threshold

beyond which the box would not be detected.

Test 5

The same procedure as in the above Test was performed, substituting the metal tape measure for the plastic

equipment case. It was found that the tape measure could not be detected if it was rotated more than a few

degrees (< 5).

Test 6

The plastic equipment case was placed 1m ± 0.1 cm away from the sensor, in line with the beam centre. The

distance was measured from the edge of the case, to the centre of the SRF05 circuit board. The largest flat side of

the case was oriented towards the sensor, and it was kept as perpendicular as possible to the line of the beam

centre. The out and back period was measured four times, and the results were recorded. This was repeated at the

following distances measured from the case to the sensor (error of ± 0.1 cm): 80 cm, 60 cm, 40 cm, 20 cm, 10 cm, 5 cm, 4 cm, 3 cm, 2 cm.

In order to test the consistency of the sensor beyond 1m, the case was placed at a range of 2m, ± 1cm, and the

out and back period was measured and recorded four times.


Analysis of Results

It can be observed that increasing the angle of deflection from the centre line increased the out and back time,

even though the straight line distance to the object was the same. This may be caused by a smaller amount of

energy being reflected back to the transducer of the sensor. If a smaller amount of energy is reflected back, and

assuming that the ultrasonic intensity takes a finite time to rise to its highest level, the time at which the received

energy crosses the detection threshold will be delayed, and therefore the out and back time will be seen to

increase, until eventually the received energy will not cross the detection threshold. Assuming that the ultrasonic

beam intensity falls off in a non-linear fashion from the centre of the beam (as is indicated by the manufacturers

specifications), then noticeable increases in the out and back time would be apparent for objects deflected from

that centre.

Figure 4.8: Graph showing results for Test 6, with Y axis error shown
[Plot: distance measured using the SRF05 (cm) against actual distance (cm).]

Table 4.5.3: Results for Test 6

Distance (cm) | OBT 1 (us) | OBT 2 (us) | OBT 3 (us) | OBT 4 (us) | Average (us) | Distance measured (cm) | Error (cm) | Ratio (measured/actual)
100.00        | 5784.00    | 5752.00    | 5780.00    | 5752.00    | 5767.00      | 99.43                  | 0.57       | 0.99
80.00         | 4628.00    | 4640.00    | 4636.00    | 4636.00    | 4635.00      | 79.91                  | 0.09       | 1.00
60.00         | 3480.00    | 3480.00    | 3476.00    | 3480.00    | 3479.00      | 59.98                  | 0.02       | 1.00
40.00         | 2321.00    | 2321.00    | 2321.00    | 2321.00    | 2321.00      | 40.02                  | -0.02      | 1.00
20.00         | 1152.00    | 1152.00    | 1152.00    | 1152.00    | 1152.00      | 19.86                  | 0.14       | 0.99
10.00         | 555.60     | 555.60     | 555.20     | 555.60     | 555.50       | 9.58                   | 0.42       | 0.96
5.00          | 276.80     | 276.80     | 276.80     | 276.80     | 276.80       | 4.77                   | 0.23       | 0.95
4.00          | 219.60     | 219.60     | 219.60     | 219.20     | 219.50       | 3.78                   | 0.22       | 0.95
3.00          | 162.40     | 162.40     | 162.40     | 162.40     | 162.40       | 2.80                   | 0.20       | 0.93
2.00          | 212.40     | 212.70     | 212.40     | 212.40     | 212.48       | 3.66                   | -1.66      | 1.83

The results gained using the metal tape measure indicate a much better ultrasound return from a flat surface

than one that has a convex curve in relation to the sensor. The higher rigidity of the metal may also contribute to

a better return. The tests involving the pencil lead case indicated that a small, sharply angled object will not give

a good sensor return, even if it has rigid surfaces.

Expanding on this result, the tests involving the angling of the equipment case in the centre of the beam show

that the maximum angle of the box was 26.5 degrees (to the nearest half degree) before the return became erratic.

An explanation for this is that an angle of greater than 26.5 degrees results in all of the ultrasound energy being reflected away from the sensor, instead of back toward it. This may result in a false return at a greater

perceived distance, if the ultrasound energy reflects from another object, then back towards the angled surface

and then towards the sensor. This may cause problems with detecting some obstacles; however, it will also

prevent the robot from perceiving a gentle ramp or slope as an obstacle. With a smaller object, the tolerance to

rotation was much lower, although such objects are unlikely to be serious obstacles.

The tests of the accuracy of the sensor up to a distance of 1 metre showed an average error of ± 2%, some of

which is due to the oscilloscope accuracy (see Table 4.5.3 & Figure 4.8). When using the received sensor

readings, the accuracy should be taken as ± 2%, or ± 1cm, whichever is greater (the sensor readings are only

calculated to the nearest cm, to keep the data compact). This maintains accuracy at acceptable levels.
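The accuracy rule above can be captured in a small helper; this is an illustrative sketch, not code from the project.

```cpp
#include <algorithm>

// Tolerance on a sensor reading per the rule above: +/- 2% of the reading
// or +/- 1 cm, whichever is greater. Illustrative helper, not project code.
double reading_tolerance_cm(double reading_cm) {
    return std::max(0.02 * reading_cm, 1.0);
}
```

For readings under 50 cm the 1 cm floor dominates; beyond that, the 2% term takes over.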

4.5.6 Power Consumption Tests

Aim

To determine the viability of powering eight sensors, the PIC Sensor Control Circuit, and the MD22 from the

5V line of the USB-I2C device.

Methodology

The power consumption of the various devices to be powered was measured using a laboratory multimeter in

series with the power connections of each device. The total was then compared to the maximum supply current

of the USB-I2C device. The devices were then connected to this power supply, and functional tests were

performed. These tests involved polling for sensor data at the same time as issuing drive commands.

Results & Analysis

By measuring the currents drawn by each of the devices, it was found that the sensors drew

approximately 2 mA each, and the MD22 drew approximately 50 mA. This gave a total of 66 mA, 4 mA less

than the maximum current capacity of the USB-I2C device.

The devices were connected in an operational configuration, and all performed as expected, even when

repeated commands were being sent to the sensor controller and MD22. It was concluded that there was no issue

powering the devices from the USB-I2C interface.


4.5.7 Override Relay Tests

Aim

To test the performance of the emergency override relay on the Sensor Controller.

Methodology

The Sensor Controller was installed on the robot, with all eight sensors connected, and the battery power to

the motors running through the override circuit. The robot was supported with the wheels off the floor, and the

motors were set turning at 1/3 speed. The author initially moved his hand close to each sensor in turn, and

observed whether or not the wheels stopped turning. The Udar sensor polling program was used to monitor the

sensor readings. The readings were observed as the author moved his hand closer to the sensor, in order to

determine if the relay was switching at the correct distance.

Results & Analysis

The motors were observed to stop turning, accompanied by an audible click from the relay, as the author

occluded any sensor closer than a certain distance. This distance was observed to be just under 6 cm, which is

the expected cut-off threshold (5.88 cm).

It was observed that the motors did not stop immediately upon the sensor being occluded. There was a

noticeable delay between the author moving his hand in front of a sensor and the relay switching. Once the relay

had switched, however, the motors stopped almost instantaneously.

The delay before switching is due to the fact that the sensors are fired sequentially. If a sensor is occluded

immediately after being triggered, the instruction to switch the relay will not be reached until the next time it is

triggered, which, based upon simulations of the PIC code, could be up to 593.29 ms. A delay of half a second is

clearly too great, as the robot can cover a 6cm distance within that time even on low speed settings.
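The claim above is easily checked numerically; the sketch below is illustrative (the speed values are assumptions, not measured project figures).

```cpp
// Worst-case distance covered before the relay can react, assuming the full
// sensor polling loop (up to 593.29 ms, per the PIC simulations) must
// complete before the cut-off instruction is reached.
double distance_before_cutoff_cm(double speed_cm_per_s) {
    const double worst_case_delay_s = 0.59329;  // 593.29 ms
    return speed_cm_per_s * worst_case_delay_s;
}
```

Even at a modest 10 cm/s the robot covers roughly 5.9 cm in the worst case, which is essentially the whole 6 cm cut-off distance.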

In addition to the issue mentioned above, it was observed that if the sensor was occluded at an extremely

close distance (< 1cm) then the relay switched back to allowing motor movement. This 'dead-zone' is caused by

the ultrasonic burst from the transmitting transducer not reaching the receptor, instead being reflected back into

the emitter.

Modifications

The PIC code was modified so that the cut-off distance was twice that used previously: around 12 cm. The

above tests were repeated, and similar performance was observed.

Results & Analysis of Modified Circuit

It is felt that the current arrangement, where the sensor polling loop must complete before the relay is

switched, will be effective only at very low speeds (speed level < 6), even with the cut-off distance extended to

12 cm. At speed level 6, the robot can cover approximately 14.8 cm in 593.29 ms (using the upper speed limit at

that level – see 4.5.2). The mechanism could be controlled by using an interrupt service routine to trigger the


relay, so that the relay is switched immediately upon a sufficiently low sensor reading being recorded.

Another solution, which could be implemented in parallel to that mentioned above, would be to alter the

sequence in which the sensors are triggered. At present, each sensor is triggered sequentially, in order. It may be

possible, due to the high directionality of the sensors, to trigger sensors on opposite sides of the robot at the same

time without them interfering. This would require significant testing to be done on the reflective properties of

surfaces in the CVSSP, as well as some means by which the sensor controller can 'know' which sensors are

connected in which position; at present, any sensor in any position on the robot can be connected to any set of

control pins on the Controller.

It would also be possible to exploit prior knowledge of which directions collisions are most likely to occur in.

If the robot is moving forward the majority of the time, it is sensible to poll the forward facing sensors more

regularly than the side and rear sensors. This reduces the time between the sensors triggering, making it more

likely that collisions will be avoided in time.
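One way to realise the biased polling suggested above is a fixed sequence that fires the forward sensor every other slot; the index convention (0 = forward) is an assumption for illustration.

```cpp
#include <vector>

// A possible biased polling order: the forward sensor (index 0, an assumed
// convention) is fired every other slot, while the remaining seven sensors
// take turns in between.
std::vector<int> make_polling_sequence() {
    std::vector<int> seq;
    for (int other = 1; other < 8; ++other) {
        seq.push_back(0);      // forward sensor, every other slot
        seq.push_back(other);  // next side/rear sensor in rotation
    }
    return seq;  // 0,1, 0,2, 0,3, ..., 0,7
}
```

This halves the worst-case interval for the forward sensor at the cost of a longer cycle for the side and rear sensors.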

The PIC code can be edited easily in circuit, using a Microchip ICD 2® in conjunction with MPLAB®

software [Micro, 2000]. This leaves the possibility of future improvements open.

4.6 Conclusion

The Hardware aspect of the Sensors and Control System is capable of moving the robot at a variety of

speeds, and can accurately detect obstacles. A feature is present to cut power to the motors if an obstacle is

detected closer than 12 cm; however, this is only effective at low speeds. Use of the interrupt capability of the

PIC18F2220 could help overcome this, as could triggering more than one sensor at once, or regularly polling

only the forward sensor.

The Hardware can be powered from the USB port on the laptop computer, excluding the 7.2V supply for the

motors, which is drawn from a battery within the chassis.

The sensors are able to detect a wide variety of objects, including those with soft or curved surfaces, although

detection of angled surfaces is problematic in the current configuration. Work could be done using Regions of

Constant Depth (see 2.2.4) to negate this problem.


5 Software Design

5.1 Introduction

This section is intended to describe the Software aspects of the Sensors and Control System, as well as how

they interact with each other, and with the Hardware being controlled.

The source code for the various pieces of Software is not featured here: it is included on a CD-ROM (along

with the latest executables) available with this report, and in Appendix C – C++ Source Code. Appendix A –

Code Analysis contains analysis of the source code.

5.2 Sensor Controller PIC code

The PIC code is written (procedurally) in C, using the Microchip® C18 compiler. The main part of the

program is a loop where each sensor is triggered in turn, and the return pulse width measured and recorded. If an

I2C command is received, the program jumps to an interrupt service routine, where the appropriate response to

the I2C transaction is resolved. The code also contains conditional statements to determine whether any sensor

reading is less than 11.76 cm, and instructions to cut power to the motors should this be the case. See Figure 5.1

for a flowchart of the code. The actual source can be found on the attached CD-ROM.
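The loop described above can be modelled in a few lines. The sketch below is illustrative C++ (the real code is C for the PIC18F2220), with all hardware access stubbed out; the 11.76 cm threshold mirrors the text.

```cpp
#include <array>

// Compressed model of the polling behaviour described above. A reading
// under 11.76 cm sets that sensor's proximity flag, and any set flag opens
// the override relay (cutting motor power).
struct SensorLoopModel {
    std::array<int, 8> prox{};  // one proximity flag per sensor
    bool relay_open = false;    // true = motor power cut

    // Record one (stubbed) measurement and update the relay state.
    void record(int sensor, double distance_cm) {
        prox[sensor] = (distance_cm < 11.76) ? 1 : 0;
        relay_open = false;
        for (int p : prox)
            if (p) relay_open = true;
    }
};
```

The relay re-closes only once every sensor reads clear again, matching the behaviour of the flowchart's proximity check.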


Figure 5.1: PIC code Flowchart

[The flowchart shows two parts. Main loop: instantiate variables and set up I2C; toggle the heartbeat LED on
timer overflow; open the override relay if any cell in the prox array is set, otherwise keep it closed; increment
the sensor to be polled and trigger it; when the echo line goes high, start the timer; when it goes low, store the
time; set the prox cell to '1' if the stored time is less than 0x400, otherwise '0'; loop. Interrupt Service Routine
(I2C): clear the interrupt, then, according to whether the master is reading or writing and whether the last byte
was an address or data, transmit the previously selected or current register, increment the register pointer, read
a register from the buffer, or flush the buffer on a master NACK, releasing the clock in each case before ending
the routine.]


5.3 Hardware Interface Software ('Middleman')

The software program that controls the flow of information between the Sensors/Motor controllers and the AI

is an object-orientated C++ program, called 'Middleman'. The software uses a freely distributed Serial

Communications wrapper for the Microsoft Windows API. This code was written by Ramon de Klein

[R.D.Klein, 2003] . The wrapper simply makes the Serial port functions of the Windows API easier to use. The

Source Code for the Hardware Interface Software is given on page 60, under Hardware Interface Software –

Middleman.exe.

The program contains a Hardware Interface class (called 'bot', see 5.4) that processes the incoming

instruction data and the incoming sensor data, and then makes low level decisions about the movement of the

robot, as well as performing any necessary numerical conversions. This class is entirely self-contained and will

exist as a “black box” object. The software accesses the project Networking API through the use of redirected

streams [Aichi, 2007] .

A set of proprietary commands for communication between the AI process and the Middleman Software has

been established. The primary form of information received from the AI will be motor speed information, whilst

the data transmitted will be sensor readings and override warning flags [Nicholson, 2007] .

The command format is as follows:

Command (AI to HIS)       | Char 1 | Char 2 | Char 3 | Char 4 | Char 5 | Char 6 | Char 7 | Char 8 | Char 9 | Char 10
Motor Speeds              | "D"    | num    | num    | "/0"   | -      | -      | -      | -      | -      | -
Sensor Data Request       | "S"    | "0"    | "0"    | "/0"   | -      | -      | -      | -      | -      | -
LED off                   | "L"    | "0"    | "0"    | "/0"   | -      | -      | -      | -      | -      | -
Quit                      | "Q"    | "0"    | "0"    | "/0"   | -      | -      | -      | -      | -      | -

Table 5.3.1: Command Format for Interface with Hardware

Ack. (HIS to AI)          | Char 1 | Char 2 | Char 3 | Char 4 | Char 5 | Char 6 | Char 7 | Char 8 | Char 9 | Char 10
Valid command (D, L or Q) | "A"    | num    | num    | "/0"   | -      | -      | -      | -      | -      | -
Valid command (S)         | "A"    | num    | num    | num    | num    | num    | num    | num    | num    | "/0"
Override Notification (S) | "O"    | num    | num    | num    | num    | num    | num    | num    | num    | "/0"
Override Notification (D) | "O"    | num    | num    | "/0"   | -      | -      | -      | -      | -      | -
Invalid Command           | "E"    | num    | num    | "/0"   | -      | -      | -      | -      | -      | -

Table 5.3.2: Acknowledgement Format for Interface with Hardware.

The Networking API allows character strings and integers to be sent between processes. The above command

format was developed with this capability in mind.
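A motor-speed command in this format can be composed as below. This is a sketch only: the encoding of the numeric speed characters is an assumption made for illustration (the report does not specify it), as is treating the "/0" in the tables as a terminating character.

```cpp
#include <string>

// Sketch of composing a 'D' command in the four-character format of
// Table 5.3.1. The offset encoding of -18..+18 into single printable
// characters is hypothetical.
std::string make_drive_command(int left, int right) {
    std::string cmd;
    cmd += 'D';                                  // motor speed command
    cmd += static_cast<char>('A' + left + 18);   // hypothetical: -18..+18 -> 'A'..'e'
    cmd += static_cast<char>('A' + right + 18);
    cmd += '\0';                                 // terminator ("/0" in the table)
    return cmd;                                  // always four characters
}
```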


5.4 Hardware Interface Class ('bot')

This class was written to simplify interaction with the robot. Anyone writing software to control the robot

(instead of using the Hardware Interface Software) need simply incorporate this class into their program, and

pass the check_command member function any of the commands in Table 5.3.1. An appropriate

acknowledgement character will be returned from this function. In the case of a sensor data request, the readings

can be obtained by reading the public array of integers named 'sensors'. See Hardware Interface Class – Bot.h on

page 61 for source code.

The Hardware Interface Class enumerates all available COM ports on the host platform upon instantiation. It

does this using a Serial Port Enumeration Class written by Zach Gorman, [Gorman, 2002] .

An additional feature that is useful in a scenario where Wireless LAN signal is intermittent is the time-out

feature. Upon instantiating the Hardware Interface Class, the user will be asked to enter a value in milliseconds.

This value is the time after a movement instruction before the motors are stopped. This measure is to ensure that

if the Wireless signal is lost, the robot will not continue with the last speed instruction indefinitely. This, in

conjunction with the Hardware proximity override feature (see 4.2) should prevent collisions.
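The time-out mechanism can be sketched as a small watchdog; the names and structure here are illustrative, not the class's actual members.

```cpp
#include <chrono>

// Sketch of the time-out behaviour described above: if no movement
// instruction arrives within the configured window, the motors should be
// stopped by the caller.
class MotionWatchdog {
    std::chrono::steady_clock::time_point last_instruction_;
    std::chrono::milliseconds window_;
public:
    explicit MotionWatchdog(int timeout_ms)
        : last_instruction_(std::chrono::steady_clock::now()),
          window_(timeout_ms) {}

    // Call whenever a valid movement command is processed.
    void instruction_received() {
        last_instruction_ = std::chrono::steady_clock::now();
    }

    // Poll regularly; stop the motors when this returns true.
    bool should_stop() const {
        return std::chrono::steady_clock::now() - last_instruction_ > window_;
    }
};
```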

A Software proximity override feature is also implemented in the Hardware Interface Class; when a sensor

data request is received (command 'S'), if any sensor reading is less than 6 cm, then the motors are stopped. This

feature can be enabled or disabled when the Hardware Class is instantiated.

NOTE: There exists no provision for checking which sensor on the chassis is connected to which connector

position on the Sensor Controller. The user should be aware of which sensor position is connected to which
connector position. The correlation between cells in the sensor array variable and the connector positions on the
sensor controller is as follows:

Figure 5.2: Hardware Interface class - Invocation hierarchy diagram

Figure 5.2 shows the Invocation Hierarchy for the Hardware Interface class, indicating the relationships

between the public and private member functions. For example, the public member function check_command

will invoke the private member function check, which, if it returns a boolean value of 'true', will invoke the

private member function convert.

The private member function moderate is used to adjust the acceleration of the robot. If the difference

between the robot's current speed and a new speed instruction is greater than 6, the acceleration of the robot

is reduced in proportion with the difference in speed. This feature is intended to prevent overloading the motors,

for example if the robot was travelling at full reverse speed, and received an instruction to travel at full forward

speed.

As mentioned in 4.5.1, the speed levels are split into 18 forward speed increments and 18 reverse speed

increments. The Hardware Interface Class uses the check and convert private member functions to convert

the received speed value (a signed integer between -18 and +18) into a signed char value between -128 and +127.

If the speed value is not valid or outside of this range, an error acknowledgement will be returned.

The sense and rangetocm private member functions are used to poll for sensor readings and convert the

received two-byte value to an integer, respectively. This is done simply by multiplying the 'high' byte by 256 and

adding the two bytes as integers.
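The two conversions can be sketched as below. The range conversion follows the text exactly; the scale factor mapping -18..+18 onto the signed char range is an assumption (7 per level, giving roughly -126..+126), as the report does not state the exact factor.

```cpp
// Two-byte sensor range to integer centimetres, as described above:
// 'high' byte times 256 plus the 'low' byte.
int range_to_cm(unsigned char high_byte, unsigned char low_byte) {
    return high_byte * 256 + low_byte;
}

// Hypothetical mapping of a -18..+18 speed level onto a signed char.
signed char speed_to_char(int speed_level) {
    return static_cast<signed char>(speed_level * 7);  // assumed scale factor
}
```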

The instruct private member function is used to send motor speeds to the MD22 motor controller,

dependent on the speeds being valid.

The led private member function is used for testing purposes, to turn off the LED on the USB-I2C interface.

The flush_tx and flush_rx private member functions are used to clear old data from the serial port

buffers, after transmission and reception respectively.

5.5 Remote Control Software ('Roboterm')

A piece of interface software was written to aid in testing of the different robotic systems. This software is

designated 'Roboterm' and utilises the networking API code written by Ahmed Aichi [Aichi, 2007] (see Remote

Control Software - Roboterm.exe, on page 66). Roboterm communicates with the Hardware Interface Software

using the instruction set mentioned in Table 5.3.1. Roboterm allows remote control of the robot's systems,

providing a keyboard driving interface to motor control; in this mode, Roboterm constantly adds to a text file

containing sets of sensor measurements (polling takes place automatically), with the times and dates that the

readings were taken. This feature was used to aid in testing and development, and is a useful tool for

development of any decision system.
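A timestamped log line of the kind described might be formatted as below; the actual file layout used by Roboterm is not given in the report, so this format is purely illustrative.

```cpp
#include <array>
#include <sstream>
#include <string>

// Hypothetical log-line format: a timestamp followed by one reading (in cm)
// per sensor, space-separated.
std::string format_log_line(const std::string& timestamp,
                            const std::array<int, 8>& readings_cm) {
    std::ostringstream line;
    line << timestamp;
    for (int r : readings_cm)
        line << ' ' << r;  // one reading per connected sensor
    return line.str();
}
```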

The Remote Control Software also allows individual command characters (and the appropriate values) to be

entered in a terminal interface. This mode was originally the only method of entering commands, and was used

in the early stages of testing the MD22 and the Sensor Controller.


The keyboard control is performed using a class called 'keys'. This class is based on an example shown to the

author by Ahmed Aichi, where upon pressing a key, the corresponding keyboard code was printed to the console

[Aichi, 2007] . This was adapted so that the arrow keys caused left and right speeds to change. These speed

values are then sent to the Hardware Interface Software after each keystroke. Pressing the space bar stops the

robot. See Keys Class – keys.h and keys.cpp on page 69 for source code.

5.6 Software Testing

5.6.1 Sensor Tests on Robot Chassis

Aim

To ensure that the software interface for retrieving sensor readings works accurately and reliably, and to

determine the effect of soft surfaces on detectability.

Methodology

Six sensors were installed on the robot, and the Sensor Controller was connected in its operational

configuration. A piece of software termed Udar.exe was written (see CD-ROM), incorporating the Bot class.

This program was used to repeatedly poll the Sensor Controller from a laptop computer.

A plastic tool kit with a woollen hat covering it was placed at 10 cm ± 1 mm away from each sensor. The

woollen cover was used to reduce the high reflectivity of the plastic surface, in order to better simulate the

objects that would prove more difficult to detect in the CVSSP (i.e. soft furnishings). The apparent distance was

observed both with and without the woollen cover, averaged over at least three readings.

Results

Surface  | Forward | Forward-Left | Forward-Right | Rear-Left | Rear-Right | Rear
Woollen  | 9       | 10           | 9             | 9         | 10         | 10
Plastic  | 9       | 10           | 10            | 10        | 10         | 11

Table 5.6.1: Apparent Distances (in cm) when mounted on Chassis

Analysis of Results

It was surprising to note that the presence of the woollen surface did not increase the apparent distance in any

of the results. Rather, the opposite was observed in some cases (see Table 5.6.1). The measured readings are all

within 1 cm of the actual distance, which is in keeping with the expected accuracy of the Sensor Controller,

bearing in mind that all floating point values are rounded down to the nearest integer. At this distance, an

accuracy of ± 1cm can be expected, which is acceptable given the size of the areas the robot will have to

negotiate.


5.6.2 Serial Communications Test Program

A test program was written in C++ to test the USB/I2C interface, as well as to learn the basics of serial

communications programming. The aim of the test was to control an LED on the USB/I2C interface device. This

did not require anything to be connected to the I2C bus, merely that the device was connected to a PC USB port.

Test Procedure

The test program works by creating a file handle that accesses the virtual COM port used by the interface

device. The command to deactivate the LED is written as a sequence of BYTE variables. For details of the

USB/I2C interface device, see the relevant section of the Devantech Ltd website [Devan, 2006] .

The parameters for serial communication used by the program are identical to those mentioned in the Motor

Testing Procedure in 4.5.1. The code takes these parameters as arguments when creating the COM port link.

The parameters are as follows:

- Baud Rate of 19200

- 8 data bits

- No Parity bits

- Two Stop bits.

In order to perform the test, the USB/I2C interface was connected to a Laptop Computer with the Test

Program Executable on the hard drive, and the USB/I2C interface drivers installed. After checking that the

interface device was enumerated as expected by the operating system (on port COM3), the test executable

program was run, and the LED on the interface device was observed.

Results & Analysis

The code required a few modifications before the LED deactivated correctly, which were trivial changes to

the code syntax and are not given here. The working code is attached (see CD-ROM). The advantage of this test

is that the result can immediately be observed, and a successful result verifies the method for sending

commands to other devices on the I2C bus. It was decided to incorporate this test as a possible instruction in the

Hardware Interface Software (see Table 5.3.1), as it allows the user to test if the USB-I2C interface is

communicating properly with their software. This test program uses the Serial Communications Library

mentioned above [R.D.Klein, 2003] .

5.7 Conclusion

The Hardware Interface software can be used to control the robot over any Wireless Network, using the

proprietary Network API created by Ahmed Aichi [Aichi, 2007] . The processing relating to control commands

is done by the Hardware Interface Class, which also has a feature to stop the robot after a certain amount of time

has elapsed without an instruction being received. Sensor readings are given as integer values to the nearest

centimetre, which is deemed accurate enough for the environment the robot is expected to operate in.

The Object-Orientated nature of the Hardware Interface Class grants the Sensors and Control System a high

degree of flexibility and modifiability. For instance, should it be decided that a higher level of accuracy was


required from the sensors (millimetres rather than centimetres for example), the class could be modified without

changing the way the Hardware Interface Software uses it.

The Remote Control Software is a useful tool for testing and development of new and existing features of the

robot, and is intended to emulate the outputs of any decision system. The modular nature of the Sensors and

Control System allows the decision system to be substituted for the Remote Control Software with no changes

being made to the Hardware Interface Software.


6 Other Testing

6.1 Introduction

This section describes testing that does not definitively fall under the domain of Hardware or Software, but

incorporates elements of both. These tests involved other members of the project team, and further descriptions

are likely to be found in the Project Reports for their respective areas.

6.2 Integration work – Floor running tests with camera

6.2.1 Aim

To obtain on-board camera footage to aid development of the Vision System; specifically, footage from the

robot moving forward at varying speeds, so that motion detecting video algorithms could be tested.

6.2.2 Methodology

The laptop was secured to the chassis cradle, and a web-cam attached to the front of the chassis, in roughly

the position proposed for the final design. This, along with the MD22, was connected via USB to the laptop, and

the Serial Test Program was used to input simple movement instructions (move forward at constant speed). As

no battery had yet been sourced for the robot, a long power lead was connected to a bench-top power supply.

This gave the robot a limited movement range, however it did allow emergency shut-down of the motors by

means of deactivating the power supply.

The robot was tested moving forwards at 1/3, 2/3 and full forward speed. The video data from the web cam

was recorded for each run. Tests were performed where furniture was placed in front of the robot, to examine the

effect this had on the motion data obtained. One of the team members also walked across the camera's field of

vision to examine the effect of moving persons.

6.2.3 Results

The video feeds were obtained with no major incident. The movement of the robot was consistent and there

were many visual references for use by the Vision System. This test involved integrating the Sensors and Control

System with the Vision Processing system, in terms of the speed setting applied to the motors, and the apparent

velocity observed from the on-board camera. For full discussion of the results, see the report on the Vision

System [Helland, 2007] .

6.3 Integration Work – Collection of Data for decision system

6.3.1 Aim

To obtain sensor data that can be correlated in time with a position given by the External Vision System. This

data will be examined to determine how well it matches the values expected (based on the known layout of the


CVSSP).

6.3.2 Methodology

The Remote Control Software was used to drive the robot, over a Wireless link, with sensor readings being

polled constantly and saved in a text file, with a time stamp appended to each set of readings. The robot was

driven down one corridor of the CVSSP in clear sight of one of the fixed cameras installed in the centre. Peter

Helland's Vision Processing program was used to give a value in Cartesian coordinates of the robot's position at

regular intervals [Helland, 2007] . The position results were also saved in a separate text file with a time stamp

automatically appended by the program. The system times of the laptop running the Remote Control Software

and the PC running the Vision Processing software were synchronised, to ensure that the readings could be

correlated accurately.

The robot was driven at speed setting 6, as close to the centre of the corridor as possible. People were asked to

keep out of the corridor for the duration of the test, so that the position readings obtained by the Vision

Processing Software would not be affected.

6.3.3 Results

The data obtained was used by Martin Nicholson to assist in developing his Decision System. This test

involved integration of all aspects of the project; the Vision System, the Networking API, the Sensors and

Control System, and the Decision System. For the numerical analysis of the data, see Martin Nicholson's project

report. [Nicholson, 2007]


7 Problems and Issues Encountered

7.1 Introduction

This section describes difficulties that impeded the development of the Sensors and Control System. This is

an important section as it identifies where improvements to the author's working practices need to be made.

7.2 Project Task overruns

The Time Management for this project was very poor. The actual schedule of work had little or no relation to

the Gantt Charts drawn up at the start of each semester; in addition, these Charts were not updated regularly as

the project progressed (see Figure 7.1, Figure 7.2, and Figure 7.3).

The result of this was a tendency to get engrossed in difficult details of the development work, when efforts

could have been more usefully directed elsewhere. This caused progress to lag behind what was expected.


Figure 7.1: Gantt Chart created at beginning of project

[Gantt chart spanning Semester 1, the exams, Christmas, and Semester 2, with task groups: Chassis and Motor
Tests (test rig, register reads/writes); Sensor Trade-off Analysis; Sensor Testing (height-above-floor, hard and
soft return tests); Software Interface (module relationship design, coding, comments and annotations); PIC
Encoding (state machine diagram, coding and simulation, programming and testing); Interim Report; Circuit
Development (layout design, population and tests); Integration onto chassis; System Duplication; and Final
Report.]


Figure 7.3: Retrospective Gantt Chart based on schedule of whole project

[Gantt chart covering Semester 1, the exams, Christmas, and Semester 2 (weeks 1-15), with the same task
groups as Figure 7.1 plus Other Testing, and an expanded Final Report phase (tidy up research, integration of
new data/results, update conclusions, references, diagrams, documentation).]

Figure 7.2: Gantt Chart created half-way through the Project

[Gantt chart with the same task groups as Figure 7.1, revised at the mid-point of the project.]


Insufficient allowance was made for delivery of components, completion of PCBs and other time-factors not

under the author's control. The working practice that should have been adopted is for the Project Schedule to be

reviewed in a regular session every week, possibly as an adjunct to the project meetings. There was also no

contingency plan formulated in case areas of the project took longer than expected. The end result of these

delays was that the robot lacks a magnetic orientation sensor, which was seen as a potential addition to the

project that would have increased the capabilities of the robot. Also, no work was done on duplicating the

System until the very end of the project; at the time of writing it is still incomplete, although a Prototype

Sensors Controller is available that lacks the hardware override mechanism. This will be used on the second

robot. It would also be possible to modify the prototype to have the same functionality as the latest circuit.

7.3 Completion of PIC Sensor Interface

A significant delay was encountered when developing the PIC sensor controller. Of the initial batch of three

PIC18F2220 devices, only one could be correctly programmed using a PICStart Plus programmer. This device

was used for development work while new devices were being ordered.

Further delays were encountered when using the In-Circuit Debugger (ICD) tool. The ICD required

connection to be made to several important pins on the device, and the User Guide did not clearly label the pins

on the connector, leading to the wrong connections being made.

7.4 Hardware Construction

A Printed Circuit Board (PCB) was designed for the PIC sensor controller, using software available in

the undergraduate laboratories. The initial version of this design contained errors in the following areas:

– The ICD header pins were connected to the wrong pins on the PIC sensor controller.

– No bypass switch was connected across the relay to prevent the PIC sensor controller shutting off the

motor power.

– The Relay used in the original design was more complicated than was necessary (of the Double Pole -

Double Throw (DPDT) type, as opposed to Double Pole Single Throw (DPST)).

A new PCB layout was designed, and the existing one modified by hand using additional wires. The

new PCB used a differing relay footprint (for a SPST relay), and included a bypass switch and a series resistor

that could be used to reduce the current flowing through the relay, in case this was proved to be an issue.

A 100 ohm resistor was initially connected in series with the relay; this did not allow sufficient voltage drop

across the relay coil. The resistor was replaced with a length of wire: this allowed the relay to function as

desired.
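The failure of the series resistor can be understood as a voltage-divider problem. The sketch below is illustrative only: the supply voltage, coil resistance, and pull-in voltage are assumed typical values for a small relay, not figures taken from the project's parts list.

```cpp
#include <cassert>

// Voltage across a relay coil with a series resistor (simple voltage
// divider). All component values used below are assumptions for
// illustration, not the project's actual parts.
double coilVoltage(double vSupply, double rCoil, double rSeries) {
    return vSupply * rCoil / (rCoil + rSeries);
}
```

With an assumed 12 V supply and a 160 ohm coil, a 100 ohm series resistor leaves only about 7.4 V across the coil, well below a typical must-operate voltage of around 75% of nominal (9 V); replacing the resistor with a wire link restores the full supply voltage, consistent with the behaviour observed.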

7.5 Manoeuvrability Issues

It was observed during integration attempts, when rotating the robot on its vertical axis was necessary, that

the ability of the robot to turn was compromised. It was reasoned that this was due to the high levels of torque



needed in a skid steering system. Since the tyres of the robot are wide, a high level of power is needed to turn on

the spot, since the friction of the tyres on the ground must be overcome.

It was also noted that the wheels persistently developed instabilities, in that a tendency to wobble on the axle

was observed. This resulted in the wheels having to be periodically tightened and adjusted, otherwise the

handling of the robot was affected. Not even stringent readjustment could completely correct the problem (the

robot would consistently pull to one side during tests).

7.6 Hardware Interface Class time-out

The time-out feature of the Hardware Interface Class works as desired; however, an error will be returned if

the user enters a number fewer than 3 digits long. This is because the input validation code requires there to be 3

characters in the input string, which are checked to see if they are digits. Improvements should be made as part

of further work.
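One possible improvement is to accept any non-empty string of digits rather than exactly three characters. The following is a hedged sketch of such a validation routine; the function name and the convention of returning -1 on invalid input are assumptions for illustration, not the project's actual code.

```cpp
#include <cctype>
#include <string>

// Hypothetical replacement for the time-out input check: accept any
// non-empty all-digit string (e.g. "7" or "250"), returning -1 on
// invalid input instead of requiring exactly three characters.
int parseTimeout(const std::string& input) {
    if (input.empty()) return -1;
    for (char ch : input) {
        if (!std::isdigit(static_cast<unsigned char>(ch))) return -1;
    }
    return std::stoi(input);
}
```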



8 Conclusion

The system produced as a result of this project provides a self-contained means of controlling a robot, which

can be accessed through the networking API. Since the Hardware Interface Software (Middleman) is object-

orientated, the Hardware Interface class can be modified as a 'black-box' entity; this makes the software very

flexible, allowing for future development of the robot. The simple interface also means that the Decision System

can be modified as an individual module.

The system could be improved by adding more sensors; these would be best connected to the PIC sensor

controller, and accessed through extra registers. Since the PIC can be reprogrammed in-circuit, the code can be

updated easily. The unused PIC outputs are also brought out to connection points on the circuit board, meaning

that the PCB will not require modifications. It would also be beneficial to incorporate a level of decision-making

ability into the motor-override feature. At present, the motor commands will be overridden regardless of which

sensor is reading below the threshold. More useful would be a system that could detect the direction that would

potentially result in a collision, and override only movement commands that take the robot in that direction. This

was considered beyond the ability of the author to implement in the available time; however, a person with

experience in machine intelligence should be able to come up with a workable solution. This task is made more

complex by the interchangeability of the sensors, as any sensor can be connected to any position on the control

port. The Sensors and Control Documentation contains guidance to avoid this (see Appendix B - USER GUIDE

– SENSORS AND CONTROL SYSTEM).

Sensors that could usefully be added to the system include magnetic sensors for determining the robot's

orientation, and odometry sensors for determining the distance travelled (and possibly position relative to the

starting point).

The system meets the specifications outlined in 1.2. The sensors will detect objects up to 5 metres away,

accurate to the nearest centimetre, and the readings are updated constantly. The latest sensor readings are

available at any time, and can be sent to any program, using the Network API [Aichi, 2007] .

The speed and direction of the robot can be controlled very accurately, setting the speeds of the left and right

motors independently. The Hardware Interface Class automatically moderates the acceleration of the robot, to

prevent over-charging the motors.

There exist both software and hardware options for preventing collisions; the software will stop the robot if a

sensor reading is less than 6 cm. The sensor readings are not checked automatically, only when a sensor data

request is received. The PIC sensor controller will cut power to the motors if a sensor reading is less than 11.76

cm (to 2 d.p.); this cut-out can be bypassed by a switch, or by not connecting the motor power through the sensor

controller at all. It should be noted that the Artificial Intelligence is expected to avoid obstacles in the first case.

The cut-out is an 'emergency' measure.

The override feature mentioned above is an area of the project where there is significant scope for further

work. As mentioned in 4.5.7, the delay between an obstacle crossing the override threshold and the motors

stopping can be up to half a second. The hardware cut-off threshold has been extended to compensate for this,

however the issue is still pertinent. 4.5.7 suggests possible improvements to the hardware override mechanism.



It is also possible that the software override could be used to more effectively halt movement when

necessary, since all sensor readings are received simultaneously, and the laptop processor can execute

instructions much more quickly than the PIC (the PIC clock is 24 MHz, the laptop PC is 933 MHz). It is possible

that high frequency polling of the Sensor Controller will slow down the rate of sensor updates, however.

Another area with great potential for further work is the acknowledgements system. At present, a command

will be acknowledged based solely on whether it meets the Hardware Interface Class' definition of a valid

command. This could be expanded to include whether or not a response was received from the I2C device

being accessed (a hexadecimal value of 1 will be returned if the write was successful, a value of zero otherwise).

This would allow the user to gain a higher level of information about where in the System any problems were

occurring, particularly if different acknowledgement characters were used for invalid commands and failed

transactions.

It would also be possible to re-write the software associated with this project in a more organised fashion,

packaging code into functions, and perhaps making the whole system (including the Remote Control Software)

object orientated. There is also a minor bug in the Hardware Interface Class which should be remedied (see 7.6).

When undertaking any future projects of this nature, it will be necessary to prepare contingency plans in case

of delays. Regular time-management sessions will also be a part of the project schedule, where Gantt charts will

be updated and priorities re-assessed. This measure should help to highlight problems earlier.

In summary, this project was successful in producing a system which meets the specifications given, despite

poor time management and development difficulties. There is much room for improvement of the System,

however such modification should not be difficult to accomplish, thanks to the modular design.



9 References

● [Aichi, 2007] – Special Engineering Project: Networking and Inter-System Communications, Final

Project Report, by Ahmed Aichi 2007

● [Devan, 2006] – Devantech Ltd website, containing product details and technical documentation. Can

be accessed at the following URL: http://www.robot-electronics.co.uk/. The documentation for the products

used in the project can be accessed as follows:

Motor Controller - http://www.robot-electronics.co.uk/htm/md22tech.htm

USB/I2C interface - http://www.robot-electronics.co.uk/htm/usb_i2c_tech.htm.

Ultrasonic Ranger - http://www.robot-electronics.co.uk/htm/srf05tech.htm.

● [Fu, 1987] – Robotics: Control, Sensing, Vision, and Intelligence, by K. S. Fu, R. C. Gonzalez, and C.

S. G. Lee, published by McGraw-Hill

● [Gorman, 2002] – Serial Port Enumeration Class by Zach Gorman, © Archetype Auction

Software, Inc. 2002.

● [Helland, 2007] – Special Engineering Project: Vision Processing, Final Project Dissertation, Section 6:

Experimental work, by Peter Helland, 2007

● [I2C Slave, 2002] – Hobby website containing the only PIC I2C Slave examples found on the web (note

that the examples are not for the C18 compiler; rather, they are in PICBasic).

http://www.astrosurf.com/soubie/pic_as_an_i2c_slave.htm.

● [Lynx, 2006] – Lynxmotion Incorporated website at www.lynxmotion.com. Information on the 4WD3

robot chassis can be found at: http://www.lynxmotion.com/Category.aspx?CategoryID=59

● [Micro, 2000] - Using the PICmicro SSP for Slave I2C Communication, Application Note

734 (AN734), by Stephen Bowling. Available from the Microchip website:

http://ww1.microchip.com/downloads/en/AppNotes/00734a.pdf.

● [Nehmzow, 2000] – Mobile Robotics : A Practical Introduction, by Ulrich Nehmzow, published by

Springer-Verlag London Limited 2000, ISBN:1852331739

● [Nicholson, 2007] – Special Engineering Project: Decision System, Final Project Report, by Martin

Nicholson, 2007

● [Pugh, 1986] – Robot Sensors Volume 2: Tactile and Non-Vision, edited by Alan Pugh, published by

IFS (Publications) Ltd 1986, ISBN:0-948507-02-0

● [R.D.Klein, 2003] – Serial Communication Library written by and © Ramon de Klein, available as free

software. The author can be contacted at the following e-mail address: [email protected] . The code

can be obtained from the author's homepage at: http://home.ict.nl/~ramklein/Projects/Serial.html.



● [Schur, 2006] – Robotics Web pages by Chris and Dawn Schur,

http://www.schursastrophotography.com/roboticsmain.html, specifically the article on IR proximity

sensing.

● [Ser, 2004] - SerialComm Application Version 1.0.0.1, by Wrapbit, obtained from

www.sourceforge.net.



Appendix A – Code Analysis (see source files on attached CD-ROM)

PIC Sensor Controller Code

Header Files

• “p18F2220.h”: the header file for the model of PIC being used. This file defines the structure and names of all the registers and pins available on the device.

• “i2c.h”: defines library functions written by Microchip to assist in I2C applications.

• “stdio.h” : the C standard input/output library.

• “stdlib.h” : the C standard general purpose library.

Global Variables

• unsigned char last_byte: this byte is used to flush the Serial input buffer during the I2C interrupt service routine.

• ram volatile unsigned char results[18]: 18 bytes that act as user accessible registers, the first 16 of which contain ultrasound sensor readings, the remaining two give the compass reading.

• ram volatile unsigned char reading[2]: 2 bytes to hold the Timer values obtained from measuring an echo pulse; the values are then transferred into the appropriate results register.

• ram volatile int j = 0 : integer used to index the results array; any value written over the I2C bus overwrites this variable, allowing the user to select which registers they wish to read from. This value will loop back to 0 if it overflows past 18.

• ram volatile long c : an integer value used in conjunction with the local variable ‘i’ (found in main). The value of 'i' is incremented for every iteration of the main loop; when i exceeds the value of 'c', the status of the heartbeat bit changes. This is visible as a flashing LED on the PCB. Accessing the I2C interrupt service routine (ISR) causes the value of 'c' to change momentarily, and the LED to flash faster.

Variables in main

• unsigned char trig : A byte to store the trigger output patterns for PORTB

• long i : Heartbeat counter; this is incremented with each iteration of the main loop. When i overflows the value of c, the status of an external LED is toggled.

• int beat : The toggle variable for the heartbeat LED.

• int s : Variable to hold which sensor is being polled, used in switch statement. Set to 0 when it overflows the maximum number of sensors (8).

• int prox[8] : array of integers used to determine if one or more of the sensors is reading an object closer than 5.88 cm.

• int k : Variable used to select cells in the prox array.

• int relay : Variable used to sum the prox array. If this value is greater than 0, then the power to the motors is cut off.

Function Prototypes

• void i2c_isr(void) : The prototype for the I2C interrupt service routine. This takes no arguments and returns no values.

• void setup(void) : The prototype for the function which sets the options for I2C communications, Interrupts, and Input/Output. As above, this function takes no arguments and returns no values.


Preprocessor Directives

• #pragma code low_vector=0x18 : This directive defines the following code section to be located at the PIC low priority interrupt vector. This is a section of the PIC program memory reserved for code that executes when a low priority interrupt occurs. The code section contains inline Assembler code, indicating a jump to the i2c_isr function.

• #pragma code : This signifies the end of the code section to be located at the low priority interrupt vector.

• #pragma interruptlow i2c_isr : This is the beginning of the interrupt service routine. This #pragma is unique to the Microchip C18 compiler, and is not part of the C standard.

• #define address 0xE0 : Defines a constant address to be 0xE0. This is the address of the sensor controller on the I2C bus.

Program Flow

● At the start of the main function, variables are declared and initialised, and global variable c is set to 10 (gives a slow heartbeat).

● The setup function is called. This sets up the data direction on the input/output ports (TRIS registers), the interrupt configuration (priority off, Serial Port interrupts only), and the I2C settings (slew rate control off, 7-bit address, start/stop interrupts, clock stretching on).

● A do{...}while(1) loop is started here.

● i is incremented, and the code for the heartbeat counter is executed. This is an if statement that checks the logical condition (i > c). If true, the value of the beat variable is toggled using another if statement (if the value is 1, it is set to 0, otherwise it is set to 1). The pin acting as a pull down for the external LED is set to the value of beat.

● If c is less than 10, it is incremented until it is equal to this value. This will cause the heartbeat LED flashing to slow down after speeding up for an interrupt.

● The variable s is incremented. An if statement then checks to see if(s == 8). If so, s is reset to 0.

● The relay variable is used to sum the contents of the prox array. If the relay variable is 0, then the relay is held closed, and power can flow to the motors. If the relay variable is not 0, then the relay is opened, interrupting the power. The 17th cell in the results array is set to 1 if the power is interrupted, 0 otherwise (allowing the user to read this back as an override notification flag).

● A switch statement, taking s as an argument, is used to select the trigger output pattern for the sensor being polled.

● Following the switch statement, any interrupts on Timer1 are cleared. The Timer values are set to overflow after 10 microseconds. The Timer is set to run at 0.75 MHz, and is started. Immediately afterwards, the value of trig is assigned to PORTB. This raises the trigger pin of the appropriate sensor to logic 1.

● When the Timer overflows (after 10 microseconds) PORTB is set to 0, and the Timer is stopped. The interrupt, which was monitored with a while() statement, is cleared.

● The Timer is set for a delay of 700 microseconds (the time quoted by the sensor manufacturer between the end of the trigger pulse and the echo line being raised). The Timer is started with no prescale (at 6 MHz). A while() statement is used to monitor for the echo line going high or the timer overflowing.

● Once the sensor has raised the echo line high, the Timer is set to 0x0000, its lowest value, and the prescale is set to give an effective clock value of 1.5 MHz. This will give a maximum timer value of around 43 milliseconds, 13 milliseconds greater than the maximum pulse width of the sensor.

● The interrupt from the previous Timer operation is cleared, and the Timer is set running.

● A while() statement is used to halt until either the echo pin goes low, or the Timer overflows. Once either of these occurs, the Timer is stopped, and the interrupt cleared.

● The 2 bytes of Timer value are assigned to the reading variable. The high byte of the Timer is then checked in an if statement; if the value is greater than 512, then the appropriate cell in the prox array is set to 1, otherwise it is set to 0.

● The contents of reading are assigned to the appropriate pair of cells in the results array.

● The Timer is set up for 50 ms before overflow, running at 0.75 MHz, and set running. This is the amount of time recommended by the sensor manufacturer to allow each sensor pulse to fade and eliminate crosstalk. When the Timer has overflowed, the interrupt is cleared, and program execution jumps to the top of the do{...}while(1) loop.
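The conversion from a raw Timer1 count to a range in centimetres can be sketched as follows. This is illustrative, assuming the 1.5 MHz effective timer clock described above (one count = 2/3 microseconds) and the manufacturer's rule of thumb for the SRF05 that echo time in microseconds divided by 58 gives centimetres; the function name is hypothetical.

```cpp
// Convert a 16-bit Timer1 echo count into centimetres, assuming a
// 1.5 MHz timer clock (one count = 2/3 us) and the SRF05 rule of
// thumb: range_cm = echo_us / 58.
double countsToCm(unsigned int counts) {
    double echoUs = counts * (2.0 / 3.0);
    return echoUs / 58.0;
}
```

Note that 512 counts comes out at roughly 5.88 cm, which matches the proximity threshold figure quoted for the prox array.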

I2C interrupt

● If, at any point during program execution, the Synchronous Serial Port Interrupt Flag (SSPIF) is raised (activity takes place on the I2C bus), the PIC will save all variables currently in use to a special section of memory, and jump to the i2c_isr function at the interrupt vector.

● The global variable c is set to 2. This increases the frequency of the heartbeat counter by a factor of 5, giving a visual indication of when an interrupt has been triggered.

● The SSPIF is cleared before any other action takes place. If a series of transactions on the bus occur in quick succession, then the interrupt service routine will continue to be called; however, clearing the interrupt as a matter of course ensures that program execution can never 'lock up' due to an interrupt never being cleared.

● The global variable j, which is used as an index for the results array, is looped around to the start of the array if it has gone past the 18th value.

● The program now enters a 4-state machine composed of if/else statements. The four states that can be accessed are listed below; the state accessed is dependent on the Serial Port Control Flags.

1. If the bus Master is reading, and the last byte received was an address byte, the port buffer is read to flush the data it contains. The buffer is then loaded (for transmission to the Master) with the currently indexed cell of the results array. The Clock Release bit is set, releasing the bus clock and allowing the transaction to proceed.

2. If the Master is reading, and the last byte received was a data byte, the port buffer is read to flush it, after the value of j has been incremented. The newly indexed results cell is loaded into the buffer for transmission. The Clock Release bit is set.

3. If the Master is writing, and the last byte received was an address byte, then the value of j is set to the value held in the buffer, setting the index of the results array to the user's desired value. The clock is released.

4. If the Master has sent a NACK condition (end of transaction), then the buffer is flushed and the clock released, with no other actions taking place.

Hardware Interface Software

Header Files

● “bot.h” is the header file for the bot class created to perform the conversions for the hardware interface. Only one instance of this class occurs in Middleman.

● “stdio.h” contains the standard input/output function definitions for C/C++.

● <iostream> allows the use of streams for input and output.

● <stdlib.h> is the C/C++ standard library.

● "TCPLink.h" is the header file for Ahmed Aichi's Network API.

Definitions

● #define WIN32_LEAN_AND_MEAN: This option excludes rarely used references from the windows.h header file (included in bot.h), reducing build time.

● using namespace std: this brings all the names defined in the std namespace (including the standard library classes and functions used here) into the global namespace.

Objects and Variables

● bot pro: this creates an instance of class bot, referred to as pro. See 3.2.2 for a description of this class.

● int i: an integer variable used to count iterations in a for loop.

● long left and long right: long integers used to hold speed values received via the TCP link.

● char command: a single character which is received via the TCP link and denotes an instruction to the robot.

● char ack: a single character returned from pro, and sent via the TCP link to denote either a correct or incorrect instruction.

● int local_port: integer holding the local TCP port number for Middleman.

● int remote_port: integer holding the remote TCP port number that Middleman will accept connections from.

Program flow

● Objects and Variables are instantiated/declared.

● Information is printed to the console, informing the user what the program is and what TCP port numbers are being used.

● Two functions from the Network API are called to set-up the TCP link. The first (TCPLink::Load) launches a program called Bcastmanager. The second declares a TCP link called motorlink, accessed as a stream, which is trying to connect to a program called roboterm.

● When the connection process has begun, the user is informed that connection is in progress, and a full stop is printed to the console every 200 milliseconds, after the fashion of a progress bar. This continues until the TCP link is connected (there is no time-out).

● When the link is connected, the user is notified and execution proceeds.

● The program enters a do{…}while(…) loop, which continues until the command ‘Q’ is received.

● Every 20 milliseconds, the program checks for data received over the TCP link. When data is received, any previously read data still in the buffer is cleared.

● The command character and the left and right speed values are extracted from the motorlink buffer in the same way that characters are extracted from a cin stream.

● The command character is converted to uppercase, since a user can enter both lowercase and uppercase characters into the Roboterm program, and the pro object deals only with uppercase.

● The command is printed to the console (for debugging purposes).

● The command character and left and right speed values are passed to pro.check_command, which returns an appropriate acknowledgement code (either 'A' for a correct command, or 'E' if the command or values were unrecognised or not of the correct type), and acts upon the command (sets the motor speed, for example).

● The acknowledgement is printed to the console (for debugging purposes).

● A nested if(...)else(...)statement checks if the acknowledgement was ‘A’; if so, the appropriate response to the received command is loaded into the motorlink buffer (for example, if the command ‘S’ was received, the user is waiting for eight integer value sensor readings after the acknowledgement, whereas if the command was ‘L’, only the ack character is expected). If the acknowledgement was 'E', then the left and right values are returned as they were received.

● The information in the motorlink buffer is now sent.


● If the command character was anything other than ‘Q’, execution returns to the start of the do(...)while(...) loop to wait for the next instruction. If the command is 'Q', the loop ends, and execution continues to the exit(0) statement.
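The validation and acknowledgement logic described above can be sketched as a small dispatch function: convert the command to uppercase, check the speed values, and return 'A' or 'E'. This is a hedged illustration only; 'S', 'L', and 'Q' appear in the text, but the letter 'D' for a Drive command and the exact command set are assumptions.

```cpp
#include <cctype>

// Illustrative sketch of the acknowledgement logic: a command is
// accepted ('A') only if the letter is recognised and both speeds
// lie in the legal range -18..18; otherwise 'E' is returned.
// The command set {D, S, L, Q} is assumed for illustration.
char checkCommand(char command, long left, long right) {
    char c = static_cast<char>(std::toupper(static_cast<unsigned char>(command)));
    bool known = (c == 'D' || c == 'S' || c == 'L' || c == 'Q');
    bool speedsOk = (left >= -18 && left <= 18 && right >= -18 && right <= 18);
    return (known && speedsOk) ? 'A' : 'E';
}
```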

Hardware Interface Class

Preprocessor directives

● #pragma once: this directive instructs the compiler to only include bot.h once in any single compilation. This should increase compile speed on any implementation using more than one object of class bot. This is not a standard ANSI C directive.

Header Files

● “EnumSerial.h”: header file for a library that facilitates Enumeration of serial ports on a Windows PC [Gorman, 2002] .

● <iostream> is needed for the cout and cin streams.

● “Serial.h” is included to allow the use of the Serial Communications Class [R.D.Klein, 2003] .

Member Functions

Public Member Functions

● bot (constructor) : Since this function is the constructor, all the class variables are initialised here (if necessary). This includes the values for the I2C addresses of the various devices under the program's control. The function has several local variables, including an array vi, which holds the information about the COM ports on the local machine. This array is populated using the function Enumerate, which is defined in the EnumSerial header. Once the array is populated, the names of the ports are listed, and the user selects which port is connected to the USB-I2C interface. The serial interface is then initialised, using the CSerial class defined in Serial.h, with the appropriate parameters (Baud Rate etcetera). The user is asked to enter a time out period in milliseconds. This is the amount of time that will elapse after a Drive command is received before the robot stops moving. The final thing to occur in the constructor is the initialisation of the MD22. The left and right speed values are set to 0 and the acceleration and mode registers are set to the appropriate values.

● check_command : This function is the user's method for sending commands to the robot. The arguments consist of a command character, and a left and right speed value (long integers). If a command is being sent that is not a speed command, both speed values should be 0. The function returns an acknowledgement character. The function first checks that the speed values are appropriate (between -18 and 18), and converts them to a value between -127 and 127. The acceleration values are moderated based on the previous speed values; if the difference is above a certain threshold, the value is increased proportionally. The command character is then run through a nested if statement to determine the appropriate action to take. The functions to execute the received commands are called here. A switch statement is used to choose the response.

● timeout_check : This is called in the host program to check the time out status. The file time recorded upon reception of the previous Drive command (measured in units of 100 nanoseconds since 1st January 1601) is subtracted from the current file time. If the difference is greater than the time out the user entered, the time out flag is set and the motors are stopped.

● ~bot (destructor) : The class destructor stops the motors on the robot, and closes the serial port handle, freeing up the resources associated with it.

Private Member Functions

● rangetocm : This function takes two bytes as arguments, and returns an integer value. The first byte is converted to an integer, and multiplied by 256; the second byte is also converted and added. This concatenates the two bytes into one integer value, which is then returned.


● convert : This function converts its argument (a long integer between -18 and 18) to an integer value between -128 and 128. This value is returned by the function.

● check : This function performs a simple logical check on its argument to determine if it is outside the range -18 to 18. A logical 1 is returned if this is true, a 0 otherwise.

● moderate : This function checks whether the difference between the previous speeds sent to the robot and the new speeds received is greater than 80. If so, the acceleration is set to equal the difference between the new and old values (a higher acceleration value results in slower power stepping of the motors).

● instruct : this function is used to send speed values to the MD22 motor controller. The COM port transmission buffer is flushed, then the values are assigned to the buffer, and sent. The speed values sent are stored for the purposes of acceleration moderation.

● sense : Returns a pointer to the receive buffer of the serial port, so that the sensor readings can be extracted immediately after this function is called. If this is not done, data may be lost. The function takes two arguments: the first is the PIC register that the read operation should start at, the second is the number of bytes to read back. The read operation is preceded by a write, setting the PIC to read back from the register corresponding to the first argument. A pause of 10 milliseconds is inserted, to allow the PIC to respond. The COM port is then read into the receive buffer, a pointer to which is returned.

● led : This function sends a hard-coded set of instructions to the serial port to turn the LED on the USB-I2C interface off.

● flush_tx : This function flushes the serial transmit buffer by overwriting it with 0's.

● flush_rx : This function flushes the serial receive buffer by overwriting it with 0's.
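Two of the helpers above are simple enough to sketch directly from their descriptions: rangetocm concatenates a high and low byte into one integer reading, and check flags speed values outside -18..18. This is a minimal sketch matching the described behaviour, not the project's source.

```cpp
// rangetocm as described: first byte * 256 plus second byte gives
// the 16-bit sensor reading as one integer.
int rangetocm(unsigned char high, unsigned char low) {
    return static_cast<int>(high) * 256 + static_cast<int>(low);
}

// check as described: returns 1 if the speed is outside the legal
// range -18..18, and 0 otherwise.
int check(long speed) {
    return (speed < -18 || speed > 18) ? 1 : 0;
}
```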


Appendix B - USER GUIDE – SENSORS AND CONTROL SYSTEM

Introduction

The Sensors and Control System consists of the following components:

– A Sensor Controller Circuit Board

– Eight sensors per robot with four-way connectors

– A USB-I2C interface with USB cable

– MD22 motor controller

– Hardware Interface Software

– Remote Control Software (optional)

The following User Guide outlines the procedure for correct use of the Sensors and Control System. The set-up procedure for each component will be given, followed by instructions for sending commands to the robot using either the Remote Control Software or another program, and information on writing software to incorporate the Hardware Interface Class.

Set-Up

It is recommended that the user check the tyres of the robot, to ensure that they are firmly affixed. Unstable wheel alignment can cause steering problems.

Sensor Controller Circuit Board

The Sensor Controller board should be affixed to three of the four black vertical stand-offs projecting up from the yellow top-plate of the motor housing, with the five I2C header pins facing towards the rear of the robot, and the PIC chip uppermost. The Allen-head screws that were supplied with the robot should be used for this purpose. (Note: one stand-off is covered by the I2C interface header pins; no attempt should be made to insert a screw in this position.)

The two-way header linking the positive battery lead to the on/off switch and fuse holder should be connected across the two pins adjacent to the white relay on the Sensor Controller board. The orientation of this connector is unimportant.

The small slide switch on the side of the board should be pushed towards the rear of the robot, if hardware motor overrides are desired. To bypass the Hardware Motor Overrides, the switch should be in the forward position.

Sensors

The Sensors should be connected to the 2 * 16 header strip toward the rear of the Sensor Controller board. The sensor leads are colour coded; the red and black leads should be on the rear side of the header strip, as shown in Illustration 1. NOTE: It is important that the connection scheme in Illustration 1 is followed exactly. Failure to connect the sensors in this manner will lead to non-functional sensors.

Illustration 1: Connector orientation of sensors

(Diagram: header strip positions 1 to 16 along the rear side, with the corresponding sensor array cells 7 down to 0 along the front side.)


The user must note which sensors are connected to which parts of the header strip. Each sensor slot (group of four colours in Illustration 1) is referenced by a cell in an integer array in the Hardware Interface Class (see ). The blue numbers in Illustration 1 denote the cell that the sensor reading will be found in. It is recommended that a logical connection scheme is applied; i.e. the front sensor is connected in position 0, with connections proceeding clockwise around the chassis.

USB-I2C interface

A female-to-female adapter is included with the Sensor Controller board. This consists of a 5-track piece of veroboard with a 5-way connector on each end. One end of this should be connected to the I2C interface header on the rear of the Sensor Controller board, with the copper side uppermost.

The USB-I2C pins should be inserted into the remaining socket, with the component side of the interface circuit uppermost. The USB cable will be connected to the controlling laptop computer.

MD22 Motor Controller

The MD22 is located within the yellow motor housing, along with the battery for the motors. Ensure that the battery is connected to the 2-way connector, and that the MD22 connections are all in place. The 4 wires of the I2C interface/5 V supply should be connected from the Sensor Controller. The negative and positive battery leads should be connected in the outermost positions on the six-way connector, with the right and left motors connected in the appropriate positions (see Illustration 2).

Hardware Interface Software

The Hardware Interface Software (Middleman.exe) should be located on the laptop that will have the USB-I2C interface connected to it. The software should only be launched once the following measures have been taken:

– Bcastmanager.exe is located in the root of the laptop's C drive. A copy of this program can be obtained from Ahmed Aichi.

– The USB cable from the USB-I2C interface is connected.

When the software is launched, the user will be presented with a list of all the COM ports available on the laptop. Select the entry in the list that is referred to as a USB serial port by entering its list index number, then press Enter.

The user will be asked to enter a time out value in milliseconds. This is the amount of time that will elapse before the robot is stopped, after the Hardware Interface Software has received a motor command. Choose a suitable value based on the means of controlling the robot and the operating environment.
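The time-out comparison itself reduces to simple tick arithmetic. The sketch below is an illustrative standalone version (not the shipped code), assuming Windows FILETIME ticks of 100 nanoseconds, which is the unit the implementation uses:

```cpp
// Sketch of the movement time-out rule. FILETIME ticks are 100 ns units,
// so a millisecond time-out is scaled by 10000 before being compared
// against the elapsed tick count since the last movement command.
static bool movement_timed_out(long long lastCommandTicks,
                               long long nowTicks,
                               int timeoutMs) {
    return (nowTicks - lastCommandTicks) >
           static_cast<long long>(timeoutMs) * 10000;
}
```

With a 1000 ms time-out, the robot is stopped once more than 10,000,000 ticks have elapsed since the last drive command.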

The user will be asked if she/he wishes to enable software proximity overrides. Enter 'Y' or 'N', based on your preference.

Once these options have been selected, the Hardware Interface Software will broadcast using Bcastmanager.exe, until a connection with a controller is made.

Remote Control Software

To remote-control the robot, use the Remote Control software (Roboterm.exe) on a PC that is connected to the laptop via a Wireless or Ethernet link (both Middleman.exe and Roboterm.exe can be run on the same PC if desired). Ensure that the subnet masks on the network are identical.

When the Remote Control software is launched, the user will be asked if she/he wishes to remote-drive the robot. Choosing 'No' at this point will take the user to a terminal-style interface where commands are entered one at a time. Choosing 'Yes' will cause driving instructions to be printed to the console, and the user will be able to drive the robot using the arrow keys on the keyboard. Sensor readings will be continuously polled and saved in the same directory as Roboterm.exe, with the filename Sensor_log.txt.

Illustration 2: Connection Diagram for MD22

Six-way connector (top to bottom): Battery +, Right Motors +, Right Motors -, Left Motors -, Left Motors +, Battery -
I2C header: ground, SDA, SCL, 5 V

Controlling the Robot

Using Roboterm.exe

As mentioned above, the user may drive the robot remotely using a PC keyboard. The arrow keys are used to increase the speed of the motors forwards or backwards, with left and right changing the differential between the left and right motors. Pressing the space bar will stop the robot immediately, whilst pressing escape will exit remote drive. The user will be asked if they wish to re-enter remote drive, or use the terminal mode (see below).

Using the terminal style interface, commands from Table 1 may be entered, followed by the appropriate numerical values. The command will not be executed until the final part of the command is entered (where a numerical value is '0', it will not be asked for by Roboterm.exe).

Acknowledgements from Table 2 will be displayed in response to entered commands.

NOTE: the 'Q' command will exit both the Remote Control Software and the Hardware Interface Software, stopping the robot at the same time.

Command (AI to HIS)     Char 1  Char 2  Char 3  Char 4  Chars 5-10
Motor Speeds            "D"     num     num     "/0"    -
Sensor Data Request     "S"     "0"     "0"     "/0"    -
LED off                 "L"     "0"     "0"     "/0"    -
Quit                    "Q"     "0"     "0"     "/0"    -

Table 1: Command Format for Interface with Hardware
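The command layouts in Table 1 can be produced with ordinary stream formatting. The helper below is a hypothetical sketch, not part of the shipped programs; it builds the command character followed by the two numeric fields, terminated with "\n" as the shipped programs do when writing to the TCPLink stream:

```cpp
#include <sstream>
#include <string>

// Hypothetical helper: build a command line in the Table 1 layout
// (command character plus two numeric fields).
static std::string make_command(char cmd, long left, long right) {
    std::ostringstream out;
    out << cmd << " " << left << " " << right << "\n";
    return out.str();
}
```

For example, a drive command with left speed 5 and right speed -5 is sent as "D 5 -5" followed by the terminator.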

Ack. (HIS to AI)            Char 1  Char 2  Char 3  Char 4  Char 5  Char 6  Char 7  Char 8  Char 9  Char 10
Valid command (D, L or Q)   "A"     num     num     "/0"    -       -       -       -       -       -
Valid command (S)           "A"     num     num     num     num     num     num     num     num     "/0"
Override Notification (S)   "O"     num     num     num     num     num     num     num     num     "/0"
Override Notification (D)   "O"     num     num     "/0"    -       -       -       -       -       -
Invalid Command             "E"     num     num     "/0"    -       -       -       -       -       -

Table 2: Acknowledgement Format for Interface with Hardware.

Using other software

To send commands to the Hardware Interface Software using another program, incorporate the Network API by Ahmed Aichi into your code. The Hardware Interface Software will always be a process on TCP port 5555, using "ML" as the name for the link. The TCPLink object is accessed in the same way as the cin and cout objects; to send a sensor data request, the following code should be used:

link_object << "S" << 0 << 0 << "\n";

link_object.Send();

This code was written by Ahmed Aichi and is not the intellectual property of the author. See the documentation on the Network API for more information [Aichi, 2007].

Acknowledgements are received in a similar way, by treating the link as a stream.


Hardware Proximity Override

The hardware proximity override will cut motor power if an object is detected at less than 12 cm. A click will be heard as the relay activates. This click will still be heard if the override is bypassed, and does not indicate a malfunction.

If the robot is stopped due to the override, either the robot or the obstacle must be moved before the robot will move again, or the override must be bypassed using the bypass switch. It is recommended that the motor speeds are set to 0 before any of these courses of action are taken.

Using the Hardware Interface Class

The Hardware Interface Class can be included in any C++ software by including the bot.h header file. This class includes the EnumSerial.h and Cserial.h files; these must be in the same directory.

To pass a command to the robot, pass it as a parameter to the check_command member function, along with the appropriate numerical values from Table 1. This will return an acknowledgement character.

The sensor readings are stored in the sensors integer array, and can be read at any time.

The timeout_check function should be called regularly, in order for the time-out feature to function (if infinite timeout is desired, never call this function in your program).

NOTE: The acknowledgements in Table 2 are supplied by the Hardware Interface Software; the Hardware Interface Class only returns the acknowledgement character, and users must write their own code to produce this format if creating their own programs.

Troubleshooting

● The robot will not move, even though movement instructions are acknowledged

Ensure that the on/off switch on the rear of the robot is in the on position

Ensure a 10 amp fuse is fitted in the fuse holder

Check the connections to the MD22

Ensure that no objects are within 12 cm of the robot, and check the state of the override bypass switch

Make sure that the motor battery has sufficient charge

Ensure that the USB-I2C device is connected properly

● All sensor readings are at 753 cm constantly

The Sensor controller is not connected properly

● The sensor readings are constant at over 500 cm

The Sensors are not connected correctly

● The robot accelerated rapidly backwards or forwards when it was turned on!

This is due to the MD22 receiving spurious instructions when the Hardware Interface Software is launched. It is a known issue. It is recommended that the set-up procedure is followed exactly, as this will prevent unwanted movement. An inexperienced user should stand the robot off the floor when powering up the robot, to prevent damage.


Appendix C – C++ Source Code

Hardware Interface Software – Middleman.exe

/*Middleman Interface program by Edward Cornish, version 1.0 released on 28/04/07*/
#define WIN32_LEAN_AND_MEAN //cut out unneeded windows header files

#include "bot.h"     //Header for bot class
#include "TCPLink.h" //Include header for network API

void main (void)
{
    using namespace std; //bring everything into standard namespace

    /*Instantiate variables*/
    int i = 0;    //Incrementer variable
    long left;    //Received left speed
    long right;   //Received right speed
    char command; //Received command character
    char ack;     //Acknowledgment character for transmission

    cout << "~~~~Middleman V1.0 by Edward Cornish~~~~\n\n"; //Startup

    bot pro; //create instance of class bot

    int local_port = 5555; //Initialise ports
    int remote_port = 5556;

    cout << "Uses port 5555. Looks for port 5556.\n"; //Information for user (delete in final version)
    cout << "Local broadcast name = 'middleman'. Looks for name 'roboterm'\n";

    TCPLink::Load(local_port, //(Comments by Ahmed Aichi)
        "middleman",          //local broadcast name
        _on,                  //enable broadcasts
        "c:\\",               //Path to file BcastManager.exe, make sure the executable is in there
        true);                //Show bcastmanager console. Set to false when you release your program

    cout << "TCPLink loaded\n";

    TCPLink motorlink ("ML", remote_port, "roboterm"); //Instantiate link called 'motorlink', looks for Roboterm program

    cout << "Awaiting connection to controller...\n";
    do{ //idle while awaiting connection
        Sleep(200);
        cout << ".";
    }while(motorlink.notConnected());

    TCPLink::AllConnected(); //Called when all TCPLink objects are connected,
                             //This closes the broadcast manager when all
                             //TCPLink objects on the machine are connected

    cout << "Connected!\n"; //notify and proceed

    do{
        do{
            pro.timeout_check(); //check for speed timeout
            Sleep(20);
            if(motorlink.notConnected()){
                ack = pro.check_command('D', 0, 0); //Stop motors if connection is lost.
                cout << "Connection Lost!\n";
                do{ //idle while awaiting connection
                    Sleep(200);
                    cout << ".";
                }while(motorlink.notConnected());
            }
        }while (!motorlink.Receipt()); //Loop while awaiting instruction

        /*Commented by Ahmed Aichi*/
        motorlink.FreeInputBuf(); //You should free the input buffer on a regular basis,
                                  //see documentation for details
                                  //this basically frees the data that you have extracted
                                  //and leaves only the bit that you have not read.
                                  //if you keep old data in there, performance may be affected (Ahmed)

        motorlink >> command >> left >> right; //Receive instruction from buffer

        command = toupper(command); //Convert command to uppercase (user can enter upper or lowercase using Roboterm)

        cout << "Command: " << command << "\n"; //Display

        ack = pro.check_command(command, left, right); //get appropriate acknowledgement code

        cout << "The acknowledgment code is:" << ack << "\n"; //Print to console

        if (ack == 'A') //Relay over TCPLink as appropriate
        {
            if(command == 'D'){
                motorlink << ack << " " << left << " " << right << "\n";
            }else if (command == 'S'){
                motorlink << ack << "\n";
                for (i = 0; i < 8; i++){
                    motorlink << pro.sensors[i] << "\n"; //Send each sensor reading in turn
                }
            }else if (command == 'L' || command == 'Q'){
                motorlink << ack << "\n";
            }
        }else{
            motorlink << ack << " " << left << " " << right << "\n";
        }

        motorlink.Send(); //Send whatever has been put into buffer

    }while(command != 'Q'); //loop while quit command has not been received

    exit(0); //Exit program
}

Hardware Interface Class – Bot.h

#pragma once
#include "EnumSerial.h"
#include <iostream>
#include "Serial.h"

//4-3-07, now bot.h - speed_process was misleading name
//updated 13-04-07
//updated 15-04-07 - flush functions added, moderate function written. Made most methods and vars private.
//finalised on 4-5-07
using namespace std;

class bot{
private:
    CSerial i2cbus;          //the i2c comms link
    BYTE txbuff[16];         //Transmit Buffer
    BYTE rxbuff[18];         //Receive Buffer
    BYTE motor_address;      //Set in constructor
    BYTE sensor_address;     // ''
    BYTE motor_mode;         // ''
    BYTE motor_acceleration; // ''
    long left;               //Received left speed
    long right;              //Received right speed
    long old_left;           //Previously received left speed
    long old_right;          //Previously received right speed
    bool accel;              //flag for acceleration moderation
    char command;            //Received command character
    char ack;                //Acknowledgment character for transmission
    BYTE *pt;                //Pointer to byte, used to convert received out and back time (in two bytes) to a single integer
    signed char send_left;   //Characters to be sent over I2C to MD22
    signed char send_right;
    int timeout;             //Timeout value in ms
    bool override_flag;      //Overridden?
    bool override_control;   //Override on?
    FILETIME ft;             //Hold filetime
    LARGE_INTEGER stamp;     //time that last movement instruction was received.
    LARGE_INTEGER current;   //Current file time
    bool timed_out;          //has the drive command timed out?

    int rangetocm (BYTE high, BYTE low) //Convert two bytes to an int
    {
        int cm;                 //The int
        cm = ((int)high * 256); //Shift high byte up by one byte
        cm += (int)low;
        cm /= 87;               //Convert to CM
        return (cm);            //return int
    }

    int convert(long s) //simple number conversion, for speeds
    {
        int spd;          //eight-bit value
        spd = s*(127/18); //convert plus or minus 0-18 to plus or minus 0-127
        return spd;       //returns signed int
    }

    bool check(long v) //checks if outside acceptable range
    {
        if ( (v > 18) || (v < -18))
            return 0;
        else
            return 1;
    }

    void moderate (void) //Moderate acceleration in case of high speed differential
    {
        motor_acceleration = 0x50;
        if ((old_right - right) > 80){
            motor_acceleration = (old_right - right);
        }else if ((right - old_right) > 80) {
            motor_acceleration = (right - old_right);
        }
        if((old_left - left) > 80){
            motor_acceleration = (old_left - left);
        }else if((left - old_left) > 80){
            motor_acceleration = (left - old_left);
        }
    }

    void instruct(signed char left, signed char right) //Send speeds
    {
        flush_tx();
        txbuff[0] = 0x55;
        txbuff[1] = motor_address;
        txbuff[2] = 0x01;
        txbuff[3] = 0x03;
        txbuff[4] = right;
        txbuff[5] = left;
        txbuff[6] = motor_acceleration; //send left and right speeds to the MD22...(swapped)
        i2cbus.Write(&txbuff, 7);       //...now
        old_right = (long)right;
        old_left = (long)left;
    }

    BYTE *sense(unsigned int start, unsigned int len) //Get sensor readings
    {
        flush_rx();
        BYTE r = (BYTE)start; //Register to start from
        BYTE l = (BYTE)len;   //# of registers to read
        txbuff[0] = 0x55;
        txbuff[1] = (sensor_address + 1); //Read mode
        txbuff[2] = r;                    //Register to start from
        txbuff[3] = l;                    //# of registers to read
        i2cbus.Write(&txbuff, 5);           //Send read request
        Sleep(10);                          //Give chance to respond
        i2cbus.Read(rxbuff,sizeof(rxbuff)); //Get data - may require timeouts etc
        Sleep(10);                          //Give chance to respond
        return (&rxbuff[0]);                //Return address of array
    }

    void led(void) //Test LED on I2C interface
    {
        flush_tx();
        txbuff[0] = 0x5A;
        txbuff[1] = 0x10;
        txbuff[2] = 0x00;
        txbuff[3] = 0x00;
        i2cbus.Write(&txbuff, 4); //LED should turn off!
    }

    void flush_tx(void) //flush txbuff
    {
        for (int i = 0; i < sizeof(txbuff) ; i++){
            txbuff[i] = 0;
        }
    }

    void flush_rx(void) //flush rxbuff
    {
        for (int i = 0; i < sizeof(rxbuff) ; i++){
            rxbuff[i] = 0; //Zero buffer
        }
    }

public:
    int sensors[8]; //Array of sensor readings

    void timeout_check (void)
    {
        long time_ft;
        if(!timed_out) //Flag already set?
        {
            GetSystemTimeAsFileTime(&ft);         //Acquire
            current.LowPart = ft.dwLowDateTime;   //Assign to Large_int
            current.HighPart = ft.dwHighDateTime; //for arithmetic operations
            time_ft = timeout * 10000;            //Convert timeout to units of 100ns
            if((current.QuadPart - stamp.QuadPart) > time_ft)
            { //Is difference between current ft and prev ft greater than the timeout value?
                instruct(0,0); //Stop the robot
                timed_out = 1; //Raise flag
                cout << "Time-out! Speed set to " << old_right << " , " << old_left
                     << ", time_ft is " << time_ft << " timeout is " << timeout << "\n"; //Print info to console
            }
        }
    }

    char check_command (char command, long new_left, long new_right)
    { //Use this method to pass instructions to the robot, and request data.
        char ack;  //Acknowledgment character for transmission
        int i = 0; //Incrementer variable
        if (check(new_left) && check(new_right)) //Check that left and right speeds are valid
        {
            left = convert(new_left);   //converts the proprietary speed
            right = convert(new_right); //values to the actual values
            send_left = (char)left;
            send_right = (char)right;
            cout << "\nThe command is:" << command << "\n";
            switch (command){
            case 'D' :
                if(override_flag && override_control){
                    instruct(0,0); //Need to send S to clear override
                    ack = 'O';
                }else{ //send speeds
                    instruct(send_left, send_right);
                    cout << "The corresponding left speed value is:" << left << "\n";
                    cout << "The corresponding right speed value is:" << right << "\n";
                    ack = 'A';
                } //Move or stop the robot
                cout << send_left << "~" << send_right << "\n";
                GetSystemTimeAsFileTime(&ft);
                stamp.LowPart = ft.dwLowDateTime;
                stamp.HighPart = ft.dwHighDateTime;
                //^^Convert to LONG_INTEGER for arithmetic^^
                timed_out = 0;
                //^^ Timestamp code ^^
                break;
            case 'L' :
                led(); //Turn LED off
                cout << "LED should be off!\n";
                ack = 'A';
                break;
            case 'S' :
                pt = sense(0,16); //Read PIC registers 0-16
                cout << "Sensors responding!\n";
                for (i = 0; i < 8 ; i++)
                { //populate array
                    sensors[i] = rangetocm(pt[i*2], pt[i*2+1]); //convert to ints
                    cout << "-" << sensors[i];
                    if(sensors[i] < 6 && override_control){
                        override_flag = 1; //set override if ANY sensor is less than 6cm
                    }else if(sensors[i] > 6 && override_control){ //lower flag if sensor is greater than 6cm
                        override_flag = 0;
                    }
                    if(override_flag && override_control){ //only if override occurs and they are turned on
                        instruct(0,0); //Stop if robot gets too close
                        ack = 'O';
                    }else{
                        ack = 'A';
                    }
                }
                break;
            case 'Q' :
                ack = 'A';
                cout << "Exiting...."; //QUIT
                break;
            default:
                cout << "\nUnknown command!\n";
                ack = 'E';
                break;
            }
        }else {
            cout << "\nInvalid input!\n";
            ack = 'E';
        }
        return ack;
    }

    bot(void) //CONSTRUCTOR
    {
        CArray<SSerInfo,SSerInfo&> vi; //Array of type SSerInfo, contains details of the Ports on the machine

        EnumSerialPorts(vi,FALSE); //Enumerate ports

        int j;                    //To hold the number of ports available
        int i = 0;                //Loop var
        char prt;                 //Input Var from console
        char millisec_value[256]; //input var for timeout
        char override_choice;     //Y or N
        LPCTSTR port_name;        //To hold name of selected port, pass to create serial obj

        j = vi.GetSize(); //How many ports?

        cout << "COM ports available on this machine:\n"; //Heading

        do{ //A loop to print array of type SSerInfo - list of ports on machine
            cout << (i + 1) << " - " << vi[i].strFriendlyName << "\n"; //print names of the ports
            i++; //Increment
        }while(i != j); //loop while there are still ports in the array that have not been printed

        do{ //Loop while the user selects a port
            cout << "Please Select the COM port attached to the robot interface,\n";
            cout << "by entering the number preceding the dash:";
            cin >> prt;
            if(i > j || i < j || !isdigit(prt)){
                cout << "\nError! Invalid selection!\n"; //User entered invalid input
            }else break; //**out of loop**
        }while(1); //allow user to specify a COM port to use (since the USB-I2C dongle is enumerated differently on different systems)

        i = atoi(&prt); //convert user friendly list number to actual number (array cell)
        i--;            //Decrement i
        cout << i << "\n";                     //Print for debug
        cout << vi[i].strFriendlyName << "\n"; //Print for debug
        cout << "Opening " << vi[i].strPortName << "...\n"; //Print the port being opened

        port_name = vi[i].strPortName; //Pass over the name to create the port object - EnumSerial no longer needed

        i2cbus.Open(_T(port_name)); //Open the specified port...
        i2cbus.Setup(CSerial::EBaud19200, CSerial::EData8, CSerial::EParNone, CSerial::EStop2);
        i2cbus.SetupHandshaking(CSerial::EHandshakeOff); //...with the appropriate parameters
                                                         //(hardcoded, will not change)

        /*Setup the addresses - hardcoded*/
        motor_address = 0xB0;
        sensor_address = 0xE0;
        motor_mode = 0x01;         //Signed int, skid-steer mode
        motor_acceleration = 0x50; //Limit acceleration

        txbuff[0] = 0x55;
        txbuff[1] = motor_address;
        txbuff[2] = 0x00;
        txbuff[3] = 0x04;
        txbuff[4] = motor_mode;
        txbuff[5] = 0x00;
        txbuff[6] = 0x00;
        txbuff[7] = motor_acceleration; //Mode, stopped, acceleration

        i2cbus.Write(&txbuff, 8); //initialises MD22

        timed_out = 1; //Pretend time_out has elapsed before start up

        do{
            cout << "Please enter movement time-out value in milliseconds:";
            cin >> millisec_value;
            if(!isdigit(millisec_value[0]) || !isdigit(millisec_value[1]) || !isdigit(millisec_value[2]))
            { //Check that first three chars of string are digits
                cout << "Invalid input! Must be numeric.\n";
            }else break;
        }while(1); //loop while the user enters an appropriate timeout value

        timeout = atoi(millisec_value); //assign the entered value to an integer var

        do{
            cout << "Do you wish to activate emergency overrides? (Y or N):"; //Ask user to enter yes or no
            cin >> override_choice;
            override_choice = toupper(override_choice); //Convert input to uppercase
            if( override_choice != 'Y' && override_choice != 'N'){
                cout << "Invalid input! Must be Y or N!\n";
            }else break;
        }while(1); //loop while the user enters an appropriate value

        switch(override_choice){ //set control flag accordingly
        case 'Y' : override_control = 1;
            break;
        case 'N' : override_control = 0;
            break;
        }
    }

    ~bot(void) //DESTRUCTOR
    {
        txbuff[0] = 0x55;
        txbuff[1] = motor_address;
        txbuff[2] = 0x00;
        txbuff[3] = 0x04;
        txbuff[4] = motor_mode;
        txbuff[5] = 0x00;
        txbuff[6] = 0x00;
        txbuff[7] = motor_acceleration; //Mode, stopped, acceleration
        i2cbus.Write(&txbuff, 8); //stops all motors before shutdown
        i2cbus.Close(); //close the port
    }
};

Remote Control Software - Roboterm.exe

#include "TCPLink.h"
#include "keys.h"
#include <cctype>
#include <sstream>
#include <fstream>
#include <windows.h>

using namespace std;

void main (void)
{
    //Instantiate variables
    char cmd; //Holds command character received from AI
    char ack; //Holds ack character received from HIS

    long left; //speeds to send
    long right;

    long rxleft; //Speeds received with acknowledgement
    long rxright;

    int *spd_pt = 0; //pointer to keyboard speeds
    keys drive;      //instantiate keys object

    int dr_left = 0; //Speeds used in keyboard control mode
    int dr_right = 0;

    string left_buf; //For holding terminal speed input
    string right_buf;

    int sensors[8];     //Sensor readings
    fstream sensor_log; //File stream object to hold sensor readings
    SYSTEMTIME time;    //Used for sensor log timestamp

    cout << "~~~~~Roboterm V1.0, written by Edward Cornish~~~~~\n";
    int local_port = 5556; //Ports are hardcoded
    int remote_port = 5555;

    TCPLink::Load(local_port, //Comments on this by Ahmed Aichi
        "roboterm",           //local broadcast name
        _on,                  //enable broadcasts
        "c:\\",               //Path to file BcastManager.exe, make sure the executable is in there
        true);

    TCPLink motorlink ("ML", remote_port, "middleman"); //means connect to the program on local 5555

    do{
        Sleep(200); //Await connection
        cout << ".";
    }while(motorlink.notConnected());

    TCPLink::AllConnected(); //Called when all TCPLink objects are connected,
                             //This closes the broadcast manager when all TCPLink
                             //objects on the machine are connected (Ahmed)

    cout << "Connected!\n";
    do{
        cout << "Do you wish to remote-drive the robot? Please enter Y or N:";
        cin >> cmd;         //Use cmd to receive user's choice
        cmd = toupper(cmd); //To uppercase

        if(cmd == 'N') //User wants terminal mode
        {
            do{ //Terminal mode starts here
                cout << "\nPlease enter a command: ";
                cin >> cmd;         //Take command input
                cmd = toupper(cmd); //Uppercase

                if (cmd != 'Q' && cmd == 'D'){ //Drive command
                    cout << "\nPlease enter left speed value: ";
                    cin >> left_buf;
                    cout << "\nPlease enter right speed value: ";
                    cin >> right_buf; //Get speeds
                }else{
                    right = 0;
                    left = 0;
                    //Right and left speeds are 0 when command is not D. This DOES NOT stop the robot!
                }

                stringstream sl(left_buf); //Stringstream for string to int conversion
                stringstream sr(right_buf);
                sl >> left; //Convert
                sr >> right;

                motorlink << cmd << " " << left << " " << right << "\n";
                motorlink.Send(); //Send command and speeds
                while (!motorlink.Receipt()) //Await acknowledgement
                {
                    Sleep(10); //Idle
                    cout << "~";
                }

                if( cmd != 'Q' && cmd == 'D')
                { //Receive two speeds with acknowledgement
                    motorlink >> ack >> rxleft >> rxright;
                    cout << ack << "-" << rxleft << "-" << rxright << "\n";
                    left = 0; //reset speeds
                    right = 0;
                }else if (cmd != 'Q' && cmd == 'S')
                { //Receive eight sensor readings with acknowledgement
                    motorlink >> ack;
                    for(int x = 0; x < 8; x++){
                        motorlink >> sensors[x]; //fill sensor array with readings
                    }
                    cout << ack; //print ack
                    for(int x = 0; x < 8; x++){
                        cout << "-" << (int)sensors[x]; //print all received sensor readings to screen
                    }
                    cout << "\n";
                }else{
                    motorlink >> ack; //just receive acknowledgement
                    cout << ack;
                }

                if (cmd == 'Q') //Exit program
                    exit(0);

                motorlink.FreeInputBuf(); //Flush broadcast buffer
            }while(1);
        }else if (cmd == 'Y') //User wishes remote control
        {
            cout << "Entering remote-drive mode.\n";
            //Print Driving instructions to console
            cout << "~~~~Driving Instructions~~~~\n\n\n";
            cout << "> Accelerate = Up Arrow key\n";
            cout << "> Turn Left = Left Arrow key\n";
            cout << "> Turn Right = Right Arrow key\n";
            cout << "> Decelerate = Down Arrow key\n";
            cout << "> Full Stop = Space Bar\n";
            cout << "> Centre Steering = Enter\n";
            cout << "> Exit Remote-drive = Escape\n\n";
            cout << "~~~~Drive Safely!~~~~\n";

            sensor_log.open("Sensor_Log.txt", ios::out | ios::app); //Open Sensor log file for output

            do{
                if(_kbhit()) //Detect keystroke
                {
                    spd_pt = drive.capture(); //Get pointer to speed from keyboard function
                    dr_left = *spd_pt;        //Assign speeds
                    spd_pt = spd_pt+1;
                    dr_right = *spd_pt;
                    cmd = 'D'; //Set command to drive
                }else{
                    //Poll sensors
                    cmd = 'S';
                    left = 0; //Speed values are null
                    right = 0;
                    motorlink << cmd << " " << left << " " << right << "\n";
                    motorlink.Send(); //Send Sensor command
                    while (!motorlink.Receipt()){
                        Sleep(10); //Idle while awaiting data
                    }
                    motorlink >> ack; //Get acknowledgement

                    //Retrieve date and time, append to sensor log file.
                    GetSystemTime(&time);
                    sensor_log << time.wDay << "/" << time.wMonth << "/" << time.wYear << "\n"
                               << time.wHour << ":" << time.wMinute << ":" << time.wSecond
                               << ":" << time.wMilliseconds << "\n";

                    //Write latest sensor readings to log file
                    for(int x = 0; x < 8; x++){
                        motorlink >> sensors[x];  //receive sensor readings
                        sensor_log << sensors[x]; //Write to file
                        if (x < 7){ //Insert separation characters between readings
                            sensor_log << "|";
                        }
                    }
                    sensor_log << "\n";

                    Sleep(20); //Pause between sensor readings
                }

                if(dr_left == 666 && dr_right == 666) //666 is exit code (no such speed)
                { //Exit remote drive
                    cout << "You have pressed ESC. Exiting remote-drive...\n";
                    dr_left = 0; //Zero speed values
                    dr_right = 0;
                    sensor_log.close(); //Close file
                    break;
                }else if (cmd == 'D')
                { //Send speed values from keyboard capture
                    motorlink << cmd << " " << dr_left << " " << dr_right << "\n";
                    motorlink.Send();
                    while (!motorlink.Receipt()){
                        Sleep(10); //Await Acknowledgement
                        cout << cmd;
                    }
                    motorlink >> ack >> rxleft >> rxright; //Retrieve acknowledgement
                    cout << ack << "-" << rxleft << "-" << rxright << "\n";
                }
            }while(1);
        }else{
            cout << "\nUnrecognised Command!";
        } //User has entered an unknown command value

        motorlink.FreeInputBuf(); //Flush Broadcast buffer
    }while(1);
}

Keys Class – keys.h and keys.cpp

keys.h

#pragma once
#include <conio.h>
#include <iostream>
using namespace std;

/*The code for retrieving keyboard codes was shown to the author by Ahmed Aichi.
This class is based on an example provided by him*/

class keys{
private:
    int k1;
    int k2;
    int spd_left_right[2];
    int left_spd;
    int right_spd;
public:
    keys(void);
    ~keys(void);
    int *capture(void);
};

keys.cpp

#include "keys.h"

keys::keys(void)
{ //constructor - initialise vars
    k1 = 0;
    k2 = 0;
    spd_left_right[0] = 0; //Returned speeds
    spd_left_right[1] = 0;
    left_spd = 0;
    right_spd = 0;
}

keys::~keys(void)
{ //Nothing to do in destructor
}

int *keys::capture(void) //check kbhit
{
    k1 = _getch(); //Acquire char
    if (_kbhit())  //If second byte of code to be read
    {
        k2 = _getch();
    }

    //Speeds are changed in increments of two, to reduce the number
    //of keystrokes needed to reach full speed, and to eliminate an
    //annoying beep sound when a speed value of 1 was printed to the console

    if (k1 == 224)
    { //Arrow keys
        switch (k2)
        { //Determine which key based on second part of code
        case 72 : //up arrow
            left_spd += 2;
            right_spd += 2; //Speed up
            break;
        case 80 : //down arrow
            left_spd -= 2;
            right_spd -= 2; //Slow down
            break;
        case 75 : //left arrow
            left_spd -= 2;
            right_spd += 2; //Turn left
            break;
        case 77 : //right arrow
            left_spd += 2;
            right_spd -= 2; //Turn right
            break;
        }
    }else if (k1 == 32)
    { //Spacebar - stops robot
        left_spd = 0;
        right_spd = 0;
    }else if (k1 == 27) //ESC - Quits remote drive
    {
        spd_left_right[0] = 666;
        spd_left_right[1] = 666; //666 should never be reached as a result of keypresses (see below)
        return spd_left_right;
    }

    /*Stop speeds from exceeding limits*/
    if( left_spd > 18) left_spd = 18;
    if( left_spd < -18) left_spd = -18;
    if( right_spd > 18) right_spd = 18;
    if( right_spd < -18) right_spd = -18;

    spd_left_right[0] = left_spd;
    spd_left_right[1] = right_spd; //Assign speeds

    return &spd_left_right[0]; //And return them
}