
Design and Development of a Visual Navigation Testbed for Spacecraft Proximity Operations

Brent E. Tweddle ∗ Alvar Saenz-Otero † David W. Miller ‡

MIT Space Systems Laboratory, Cambridge, MA, 02139, USA

This paper presents the design and development of an upgrade to the Synchronized Position Hold Engage Reorient Experimental Satellites (SPHERES) that enables the research and development of computer vision based navigation algorithms for spacecraft proximity operations. This upgrade is referred to as the SPHERES Goggles and is shown in Figure 1. It includes two cameras, illuminating LED lights, a 1.0 Gigahertz (GHz) processor running a Linux operating system, an 802.11g wireless modem and a battery that can supply power for at least 90 minutes. These components were integrated in a mechanical package that weighs 865 grams (including the battery). This paper presents the concept, design and implementation of the SPHERES Goggles, which occurred over a period of one year. The design is intended to be "flight-traceable", and further iterations are being considered for operations inside the International Space Station (ISS).

I. Introduction

Figure 1. SPHERES Goggles

From the first spacecraft docking maneuver, performed on Gemini 8, to the recent technology demonstration of DARPA's Orbital Express, spacecraft proximity operations have been an integral component of a variety of space mission objectives. In order to achieve these objectives, any mission that involves proximity operations must have an accurate, reliable and robust guidance, navigation and control (GN&C) system. A representative testbed is a critical element of the maturation of any type of GN&C system. The SPHERES Goggles, which are presented in this paper, are designed to be a representative testbed for computer vision based navigation algorithms for spacecraft proximity operations.

A. Previous On-Orbit Navigation Techniques for Proximity Operations

Over the years a variety of sensor technologies have been used for proximity navigation. One of the earliest on-orbit autonomous navigation systems was the Russian Kurs, which is still used for rendezvous navigation by the Soyuz and Progress vehicles. The Kurs sensor system is based on an S-band radio transponder that measures range, range-rate, relative pitch and relative yaw.1

∗S.M. Candidate, Department of Aeronautics and Astronautics, 77 Massachusetts Avenue, Cambridge, MA, 02139. Email: [email protected], AIAA Student Member.
†Research Scientist, Department of Aeronautics and Astronautics, 77 Massachusetts Avenue, Cambridge, MA, 02139. Email: [email protected], AIAA Member.
‡Professor, Department of Aeronautics and Astronautics, 77 Massachusetts Avenue, Cambridge, MA, 02139. Email: [email protected], AIAA Senior Member.


AIAA SPACE 2009 Conference & Exposition, 14–17 September 2009, Pasadena, California

AIAA 2009-6547

Copyright © 2009 by Brent E. Tweddle. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.


More recently, ESA's Automated Transfer Vehicle (ATV) has demonstrated the use of a Relative Global Positioning System (RGPS) for the 30 kilometer to 500 meter phase of docking with the International Space Station (ISS). The RGPS system uses raw measurements from two GPS receivers (one on the target and the other on the chaser), and differences these measurements in an onboard navigation filter to produce relative range and range-rate measurements.1

The Laser Camera System (LCS) was developed by the Canadian Space Agency (CSA) to produce three dimensional maps of the Space Shuttle. It was first tested on STS-105 and has been used in all missions following the Columbia accident, beginning with STS-114.2 In order to compute the three dimensional map, the LCS uses a camera to photograph the pattern created by a scanning laser that is moved along an unknown surface.

On DARPA's Orbital Express mission, the Advanced Video Guidance Sensor was used for docking between two spacecraft. A laser diode was used to illuminate a retro-reflective visual target, which was processed by machine vision algorithms to determine the relative position and orientation between the two spacecraft.3

An on-orbit photo is shown in Figure 2.

B. Computer Vision for Proximity Operations

Figure 2. DARPA’s Orbital Express On-Orbit

We believe that using computer vision for proximity navigation provides a number of advantages over other sensing technologies. Visual navigation sensors can be made to be small and low power, they have no moving parts, and they are passive systems (i.e. they require no electrical power or data from the target spacecraft). Although visual navigation has been used on a number of space missions in the past, there are many more computer vision technologies and algorithms that could improve mission capabilities but are not yet mature enough to be implemented on-orbit. For example, in 2005 the Committee on the Assessment of Options for Extending the Life of the Hubble Space Telescope found that "...camera-based control of the grapple arm are mission-critical technologies that have not been flight-tested" and "Technologies needed for autonomous manipulation, disassembly, and assembly, and for control of manipulators based on vision and force feedback, have not been demonstrated in space".4

A number of research projects are currently investigating computer vision based navigation for spacecraft proximity operations. The Naval Research Laboratory is developing the SUMO/FREND technology demonstration that will perform on-orbit servicing of satellites with no fiducials or grapple mechanisms.5 Additionally, the Naval Research Laboratory is developing the Low Impact Inspection Vehicle (LIIVe), a small nanosatellite that is attached to a much larger spacecraft. Should a failure be detected on the larger spacecraft, the LIIVe nanosatellite would be deployed and would perform an inspection maneuver using computer vision as the primary relative navigation sensor.6

C. Previous Inspection Satellite Demonstrators and Testbeds

A number of testbeds have been developed to mature inspection satellite technologies. As was previously mentioned, the most recent high-profile demonstrator is DARPA's Orbital Express. In 2007, this satellite demonstrated robotic satellite servicing with a cooperative target that utilized interfaces designed for servicing. Previously, in 2005, the Demonstration of Autonomous Rendezvous and Docking (DART) mission attempted objectives similar to those of Orbital Express; however, an anomaly in the navigation system caused the two spacecraft to collide. The Naval Research Laboratory is currently developing SUMO/FREND, a robotic servicing demonstrator that will use vision based navigation.5 Another flight demonstrator is the Air Force Research Laboratory's XSS-11, which demonstrated proximity circumnavigation on orbit in 2005.7 The AERCam Sprint was an inspector satellite that was tested on STS-87.8 This manually controlled inspector provided video data to an astronaut onboard the Shuttle. A follow-on program has developed the Mini-AERCam testbed, which is designed to be capable of both manual and autonomous inspection;9 however, it has not yet been flown.

A number of other ground testbeds have been built to develop proximity operations technologies. For example, the SCAMP testbed was developed at the University of Maryland for semi-autonomous operation in a neutral buoyancy tank.10 Also, the AUDASS testbed was developed at the Naval Postgraduate School and demonstrated vision based navigation techniques on the ground.11,12 In 2001, Lawrence Livermore National Laboratory demonstrated formation flight based on stereo vision on their ground testbed.13

D. The Need for a Visual Based Navigation Testbed

In order to develop and mature computer vision based navigation algorithms beyond Technology Readiness Level 3 (proof-of-concept), it is important to have an operationally relevant testbed that can run a variety of vision-based navigation algorithms. Currently there is no testbed that can develop vision based navigation algorithms both on the ground and in a six degree-of-freedom microgravity environment. The Massachusetts Institute of Technology Space Systems Laboratory (MIT SSL), the Naval Research Laboratory (NRL) and Aurora Flight Sciences (AFS) have partnered to develop such a vision based navigation upgrade to the Synchronized Position Hold Engage Reorient Experimental Satellites (SPHERES) that is "flight-traceable" (i.e. minimal modifications will be required to qualify the system for microgravity operations). SPHERES is a formation flight testbed that operates in the microgravity environment of the ISS. Figure 3 shows a photo of three SPHERES performing a formation flight maneuver on the ISS. This paper discusses the design, development and testing of this upgrade.

II. SPHERES Background

Figure 3. Three SPHERES in formation on the ISS

SPHERES is a formation flight testbed that was developed by the MIT Space Systems Laboratory (SSL) to mature spacecraft GN&C algorithms in the microgravity environment of the ISS. Currently 17 ISS test sessions have been completed, in which algorithms for estimation, docking, collision avoidance, reconfiguration, fault detection and path planning have been implemented in a low-risk, yet highly representative environment.14

A. SPHERES Global Metrology System

The SPHERES Global Metrology system is a time-of-flight navigation system (similar to GPS) that is capable of measuring the position and orientation of each of the SPHERES satellites with a precision of a few millimeters for position and 1-2 degrees for orientation.15 The maximum update rate of this system is 5 Hz.

The navigation hardware consists of 5 beacons that are installed in known locations on the ISS, each of which has an ultrasonic transmitter and an infrared receiver. Every SPHERE has 24 ultrasonic receivers and 12 infrared transceivers mounted on it. Each metrology cycle begins with a single SPHERE transmitting an infrared pulse to indicate the beginning of a new metrology cycle, which synchronizes the clocks on all of the beacons and SPHERES. Once this has occurred, each of the beacons transmits an ultrasonic pulse in a pre-programmed sequence, separated by 20 ms intervals to avoid overlap. Each SPHERE then subtracts the known time of transmission from the time of arrival to calculate a time of flight for each received pulse on each ultrasonic receiver. This can result in as many as 120 time of flight measurements per satellite. All of these measurements are integrated in an Extended Kalman Filter (EKF), along with gyro measurements, to estimate the position and orientation of the SPHERE within the "global" reference frame.
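As a concrete illustration of the measurement model just described, the sketch below converts one beacon's ultrasonic time of arrival into a range measurement. This is a minimal sketch under stated assumptions: the function name, the 340 m/s speed of sound and the fixed 20 ms firing-schedule offset are ours, not the SPHERES flight code.

```c
#include <math.h>

#define SPEED_OF_SOUND 340.0f  /* m/s; assumed value for ISS cabin air */
#define BEACON_SPACING 0.020f  /* s; beacons fire 20 ms apart per the schedule */

/* Convert the time of arrival of beacon `beacon_index`'s ultrasonic pulse
 * (measured from the infrared synchronization pulse) into a range in meters.
 * Up to 5 beacons x 24 receivers = 120 such measurements feed the EKF. */
float range_from_arrival(float t_arrival, int beacon_index)
{
    float t_transmit = BEACON_SPACING * (float)beacon_index; /* known schedule */
    float tof = t_arrival - t_transmit;                      /* time of flight */
    return tof * SPEED_OF_SOUND;                             /* range to beacon */
}
```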


B. SPHERES Propulsion System

SPHERES uses a cold gas propulsion system to create both the forces and torques that are required to actuate the system. Carbon dioxide (CO2) is stored in the tanks in liquid form at approximately 860 psi. A regulator is used to step the pressure down to around 35 psi. This pressure is fed into 12 solenoid valves and nozzles that produce 110 mN each when opened. A range of forces and torques is created by open-loop pulse-width modulation, whose duty cycle is calculated in a software mixer.16 A single-axis version of this mapping is sketched below.
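The sketch maps a commanded force to a solenoid on-time for one control period; the function name and saturation behavior are our assumptions, not the actual SPHERES mixer, which also allocates torques across all 12 thrusters.

```c
#include <math.h>

#define THRUSTER_FORCE 0.110f  /* N; each nozzle produces 110 mN when open */

/* Convert a commanded force along one axis into an open-loop PWM on-time
 * for one control period `period` (s). The sign of the command selects
 * which of the two opposing thrusters fires; the magnitude sets the duty. */
float thruster_on_time(float force_cmd, float period)
{
    float duty = fabsf(force_cmd) / THRUSTER_FORCE; /* fraction of period open */
    if (duty > 1.0f)
        duty = 1.0f;                                /* saturate at full-on */
    return duty * period;
}
```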

C. SPHERES Onboard Computer and Expansion Port

The SPHERES satellites each have a floating point digital signal processor (Texas Instruments TMS320C6701) that runs at 167 MHz and can produce between 0.167 and 1.0 giga-floating-point operations per second (GFLOPS) while consuming 7 Watts (W) of power. The DSP is responsible for running all of the onboard software, including the Extended Kalman Filter, the various guidance and control algorithms, the communications protocol, and all of the high priority hardware interfacing tasks.

Figure 4. SPHERES Expansion Port

During the design phase of the SPHERES, an expansion port was incorporated to allow future hardwareto plug into and communicate with SPHERES. This expansion port has a mechanical attachment thatconsists of four 2-56 standoffs, and an electrical connector that provides (fused) power and RS232 data.Table 1 provides a list of voltages and maximum currents available through the expansion port. A photo ofthe expansion port can be seen in Figure 4.

Supply Voltage | Max Supply Current | Max Power
+5 V | 0.5 A | 2.5 W
+15 V | 0.5 A | 7.5 W
−15 V | 0.5 A | 7.5 W

Table 1. SPHERES Expansion Port Power Supply


III. Goggles System Trades

A. Primary Objective

The main objective of the SPHERES Goggles is to provide a flight-traceable platform for the development, testing and maturation of computer vision-based navigation algorithms for spacecraft proximity operations. It is important to note that although this hardware was not intended to be launched to orbit, it was required to be easily extensible to versions that can operate both inside, and ultimately outside, the ISS or any other spacecraft.

B. SPHERES Constraints

Because the Goggles must be mounted on the SPHERES expansion port, a number of constraints are introduced.

1. The Goggles must not alter the SPHERES center of mass by more than 2 cm. Therefore the Goggles must weigh less than 1 kg.

2. The Goggles must not block the SPHERES CO2 thrusters.

3. The Goggles should not block the SPHERES ultrasonic receivers.

4. The Goggles must not consume more power than the SPHERES expansion port can provide, or the Goggles must provide their own battery power.

C. Onboard versus Offboard Processing

Figure 5. Goggles Architectures

In defining the system architecture, one fundamental tradeoff was heavily debated: how much processing should be done onboard, and how much should be done offboard? These two possible architectures are shown in Figure 5.

On one hand, if all of the processing is done onboard in a 1 kg package, any algorithms developed will be immediately transferable to almost all types of spacecraft. Additionally, developers will be forced to create the most efficient implementations of their algorithms.


On the other hand, the purpose of a testbed is to allow creativity and flexibility in the algorithms that are being developed. This suggests that as much processing power should be available as possible. Additionally, many spacecraft will be capable of allowing more than 1 kg to be budgeted for the entire vision processing system.

The architecture that was selected was a compromise between these two extremes. The Goggles were required to provide a reasonable amount of onboard processing power to execute computer vision algorithms, and in addition were required to be able to transfer the raw images over a wireless link to an offboard computer at a reasonable frame rate.

D. Requirements

From the primary objective a number of requirements were derived.

1. The Goggles must include at least two flight-traceable CMOS cameras that can be mounted in varying configurations (e.g. stereo and 45° configurations).

(a) The cameras must provide 640x480 8-bit grayscale images at no less than 10 frames per second.

(b) The cameras must have software exposure control.

2. The Goggles must include onboard switchable fill lighting.

3. The Goggles must include at least one flight-traceable microprocessor capable of executing computer vision algorithms.

(a) The processor must run a Linux operating system.

4. The Goggles must include an 802.11 wireless networking card that should be capable of sending 10 frames per second from each onboard camera with lossless compression.

5. The Goggles must include wired expansion ports: at least one 100 Mbps Ethernet connection and one USB 2.0 port.

IV. Goggles Design

A. Design Approach

The schedule and budget of the project required the Goggles to be designed in approximately eight months by one graduate student and one undergraduate student. Since the Goggles would be a complex interconnected mechatronic system, we decided to design, implement and test the Goggles in an iterative process. The first step was to select the optics and electronics, and begin developing the software in a breadboard configuration. Once this was done, the electronics would be attached to a SPHERE in an "un-integrated" fashion, similar to a flat satellite. In this stage all of the electronics would be fully connected, but mechanically separated to make testing and debugging easier. After this was completed and tested, work began on the mechanical packaging to create the final Integrated SPHERES Goggles.

B. Optics and Electronics

1. Processor

The main driving factor for the electronics design was the Single Board Computer (SBC) selection. A number of options were considered according to a set of Figures of Merit (FOM), as detailed in Table 2. The processor type, power consumption and physical size FOMs are fairly self-explanatory; however, the Floating Point and Instruction Set Architecture (ISA) FOMs deserve clarification.

There are a number of common instruction sets for embedded computers (x86, ARM, PowerPC, etc.). For a research and development testbed, it is desirable to have an ISA that is very popular and already has a widely available base of code. This is a distinctly different approach from a flight spacecraft, where all of the software will be developed in house to ensure quality control and reliability.

Many embedded processors come without a hardware floating point unit (FPU), which is done to save power and space on the processor. In certain applications this is not a problem, as integer math can be used for most computations and floating point instructions can be emulated in software. However, a number of vision based navigation algorithms require floating point computations (for example, inverting a matrix in a Kalman Filter is best done with floating point numbers). Therefore, a Floating Point Unit is highly desirable for the SPHERES Goggles, though not absolutely essential.

The processors listed in Table 2 met all of the requirements outlined in the previous sections (a number of potential SBCs were eliminated because they did not have both USB 2.0 and 100 Mbps Ethernet). One interesting observation is that most of the processors listed in the table were developed for smart-phone and tablet-PC applications, items which did not exist in volume a few years ago. This implies that the aerospace industry could leverage recent developments in the commercial smart-phone industry.

Single Board Computer | Processor | Thermal Design Power | Size | Instruction Set Architecture | Floating Point Unit
Lippert CoreExpress | Intel Atom 1.6 GHz | 5 W | 5.8 cm × 6.5 cm | x86 | Yes
Via Pico-ITX | Via C7 1.0 GHz | 12 W | 10 cm × 7.2 cm | x86 | Yes
Texas Instruments Beagle Board | OMAP3530 (600 MHz Cortex-A8) | 2 W | 7.6 cm × 7.6 cm | ARM | Yes
InHand Fingertip5 | XScale PXA320 806 MHz | <1 W | 6.1 cm × 8.6 cm | ARM | No

Table 2. Single Board Computer Comparison

The processors are listed in order of how well they met the FOMs. As the table shows, the Lippert CoreExpress SBC was the clear winner in terms of these FOMs. However, at the time of development the Intel Atom was not widely available, and extra development time would have been required to build additional printed circuit boards. Given that we also had previous experience with the Via Pico-ITX, we selected that SBC for the SPHERES Goggles. Additionally, we chose to incorporate an 8 GB flash SATA drive as well as 1 GB of RAM.

2. Cameras

The selection of the cameras was a fairly simple process. The uEye LE cameras made by IDS Imaging were found to easily meet all of our requirements (an integrated frame buffer, USB interface and Linux drivers). Their specifications are summarized in Table 3. These cameras have a TTL flash output, which we used to synchronize the lights with the camera's exposure. A wide variety of lenses are available that fit into the M12 mount; we typically use fairly wide angle lenses with focal lengths between 2 and 5 mm. A sketch of configuring one of these cameras appears after Table 3.

Sensor | 1/3" CMOS with Global Shutter
Camera Resolution | 640 × 480 pixels
Lens Mount | S-Mount, M12
Frame Rate | 87 FPS (camera max), 10 FPS (typical)
Exposure | 80 µs – 5.5 s
Power Consumption | 0.65 W each
Size | 3.6 cm × 3.6 cm × 2.0 cm
Mass | 12 g

Table 3. IDS Imaging uEye LE Specifications
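For readers unfamiliar with these cameras, the sketch below shows how a uEye LE might be opened and configured for 8-bit grayscale capture using the publicly documented IDS uEye C API. This is a minimal sketch from the public SDK, not the Goggles driver code; error handling is abbreviated, and the exposure and flash configuration is only indicated.

```c
#include <ueye.h>  /* IDS uEye SDK header */

/* Open the first available uEye camera and prepare a 640x480, 8-bit
 * grayscale capture buffer. Returns 0 on success, -1 on failure. */
int setup_camera(HIDS *hCam)
{
    char *imgMem;
    int   memId;

    *hCam = 0;  /* 0 selects the first available camera */
    if (is_InitCamera(hCam, NULL) != IS_SUCCESS)
        return -1;

    is_SetColorMode(*hCam, IS_CM_MONO8);  /* 8-bit grayscale, per requirements */
    is_AllocImageMem(*hCam, 640, 480, 8, &imgMem, &memId);
    is_SetImageMem(*hCam, imgMem, memId);

    /* Software exposure control and the TTL flash output that gates the
     * LED lights would be configured here via the SDK's exposure and
     * I/O calls (details omitted). */
    return 0;
}
```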

3. Fill Lights

Similar to the selection of the cameras, one set of lights was found that met all of the requirements and provided sufficient illumination in dark environments. The Luxeon III Star LEDs were selected due to their small size, high intensity light and moderate power consumption. Their specifications are shown in Table 4. A red-orange LED was selected because this is the wavelength at which the CMOS camera is most sensitive. A collimator was also selected that focuses 85% of the light into a 50° beam width; it is shown with the LED in Figure 6.

Figure 6. Goggles Optics

In the optics design we used COTS LED drivers that regulate the LED current to 1000 mA, which corresponds to a typical power consumption of 2.95 W. Since there are 2 LEDs on the Goggles, the LEDs draw a significant amount of power. Therefore, we designed a flash system that only turns on the LEDs when the camera is capturing an image. Through tuning, we found that we were able to get good illumination up to a few meters away with a 30 ms exposure. Since the camera is capturing 10 frames per second, this is a 30% duty cycle for the lights. As a result, the net power consumption for two lights is approximately 1.8 Watts, and the switching therefore saves approximately 4.2 Watts.
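The duty-cycle arithmetic behind these figures is easy to verify; the short check below reproduces them to within rounding (it computes about 1.8 W average and about 4.1 W saved against the roughly 4.2 W quoted above).

```c
#include <stdio.h>

int main(void)
{
    const double p_led  = 2.95;         /* W per LED at 1000 mA drive */
    const double n_leds = 2.0;
    const double duty   = 0.030 * 10.0; /* 30 ms exposure at 10 fps = 30% */

    double p_avg   = p_led * n_leds * duty;         /* average power, ~1.8 W */
    double p_saved = p_led * n_leds * (1.0 - duty); /* vs. always-on, ~4.1 W */

    printf("average = %.2f W, saved = %.2f W\n", p_avg, p_saved);
    return 0;
}
```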

(a) Camera Image with Lights Off (LED Fill Lights On) (b) Camera Image with Lights On

Figure 7. Captured Camera Images

Model | Red-Orange Lambertian
Typical Luminous Flux | 190 lm at 1400 mA
Typical Dominant Wavelength | 617 nm (red-orange)
Typical Forward Voltage | 2.95 V
Diameter | 2 cm
Mass | 5.5 g

Table 4. Luxeon III Star Specifications

4. Wireless Communications

The wireless communications system needs to be able to transmit two 640×480 frames, at eight bits per pixel, ten times per second. The SPHERES Goggles uses the open source gzip lossless compression software, for which it is reasonable to assume a 50% compression ratio for each image. This requires approximately 23 Mbps to be transferred, which is at the upper limit of what 802.11g is capable of (802.11g is designed for a maximum of 22 Mbps in each direction). It would be ideal to use an 802.11n network, which is capable of 72 Mbps in each direction; however, at the time of development these devices were not readily available in small packages with Linux device drivers. It was decided that a small reduction in image quality or frame rate would be an acceptable tradeoff to use 802.11g. We selected a QCom USB 2.0 device, as specified in Table 5.

Mode | Plug-In | FTP Transfer | FTP Receive
Average Power | 0.90 W | 2.65 W | 2.60 W
Maximum Power | 2.15 W | 2.95 W | 2.85 W
Data Rate | N/A | 20.97 Mbps | 18.04 Mbps

Table 5. QCom Device Characteristics
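The bandwidth requirement quoted above can be reproduced with a short calculation; whether one reads the result as roughly 23 or 24.6 Mbps depends on using binary or decimal megabits, but either way the figure sits at the edge of the 802.11g budget.

```c
#include <stdio.h>

int main(void)
{
    const double bits_per_frame = 640.0 * 480.0 * 8.0; /* 8-bit grayscale */
    const double fps            = 10.0;
    const double cameras        = 2.0;
    const double compression    = 0.5;                 /* assumed gzip ratio */

    double raw_bps  = bits_per_frame * fps * cameras;  /* ~49.2 Mbps raw */
    double link_bps = raw_bps * compression;           /* ~24.6 Mbps (decimal) */

    printf("raw = %.1f Mbps, compressed = %.1f Mbps\n",
           raw_bps / 1.0e6, link_bps / 1.0e6);
    return 0;
}
```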

5. Battery and Power Distribution

The power budget for all of the necessary components is listed in Table 6.

Item | Average Power Consumption
Pico-ITX | 14 W
QCom Wireless Device | 3 W
2 Cameras | 1.3 W
LED Lights | 3 W
Total | 21.3 W

Table 6. Approximate Power Budget for Main Components

Given that the SPHERES expansion port can only provide 17.5 W of power over 3 different voltages, it is not possible to power the Goggles using the SPHERES onboard batteries. This means an external battery must be used. In order to store as much energy as possible while providing flight-traceability, a COTS integrated Lithium-Ion battery pack was selected; see Table 7 for details.

Nominal Voltage | 11.1 V (3 cells in series)
Capacity | 2500 mAh
Mass | 154 g
Energy Density | 180 Wh/kg
Dimensions | 10.4 cm × 5.0 cm × 1.8 cm
Built-in Protection | Over-current, over-voltage, over-drain, short-circuit, polarity

Table 7. Battery Specifications

With Lithium-Ion batteries there are a number of safety issues associated with over-draining the battery. For this reason, the battery selected comes with a protection circuit that will disconnect the battery before it can be damaged. However, if this were to occur it would not be an advisable way to power off the electronics; for example, data may be lost or corrupted if any disk activity is occurring at that moment. For this reason, we chose to incorporate a battery monitor that indicates the approximate voltage of the batteries and sounds a buzzer when the batteries are close to depleted.

In order to provide regulated power to the Pico-ITX board, a 20 Watt, 85% efficient DC-DC converter was selected. This converter supplies power to the Pico-ITX and any devices that the Pico-ITX powers (e.g. the flash drive and powered USB devices). It was determined through experimental measurement that the USB 5 V power supply on the Pico-ITX uses a linear regulator that is approximately 45% efficient (consistent with a linear regulator's efficiency of roughly Vout/Vin when dropping the ~11 V board input to 5 V). This was likely done by the manufacturer to save space on the Pico-ITX SBC that would otherwise have been needed for filtering capacitors and inductors.


As a result, the power design incorporated a 5 V DC-DC converter that provided power to the USB power pin for only the 802.11 WiFi device. This is a reasonable way to increase the power available to a USB device; however, it is not a proper design, since the USB hub device typically monitors power consumption and issues instructions to devices based on overall current draw.

6. Final Goggles Electronic Design

The final electronics and power distribution design is shown in Figure 8.

Figure 8. Electronics Architecture

C. Goggles Software

In the design of the SPHERES Goggles software architecture, an important distinction must be made. The Goggles are not intended to have the reliability of a flight system, even though they must be flight-traceable. In many flight systems, a software failure would result in the loss of a significant amount of time and money, and therefore every component of the software must be designed with the utmost care, attention to detail, testing and review. Instead, the Goggles are a testbed system. Therefore the software architecture must enable rapid development of computer vision algorithms, which can be implemented, tested and evaluated in relatively short periods of time. In this type of testbed system, ease of development may even be chosen at the expense of reliability. This is the reason that a Linux-based operating system was selected over more reliable real-time operating systems such as VxWorks or QNX.

Ubuntu 8.04 "Hardy Heron" (based on the 2.6.24 kernel) was chosen as the Goggles' Linux distribution because Ubuntu is widely adopted, well documented and designed to be as user friendly as possible. Additionally, we elected to incorporate the real-time patches to ensure proper performance. These patches are a critical element of an embedded real-time Linux implementation, since they enable kernel preemption, priority inheritance and high-resolution timers, and convert all interrupt handlers to preemptible kernel threads.
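To benefit from these patches, a time-critical process must request a real-time scheduling class. The sketch below shows the standard POSIX calls a Goggles task could use for this; the priority value and the memory locking are our assumptions, not the project's documented configuration.

```c
#include <sched.h>
#include <stdio.h>
#include <sys/mman.h>

/* Request real-time scheduling for the calling process under a
 * PREEMPT_RT-patched kernel. Returns 0 on success, -1 on failure. */
int enable_realtime(void)
{
    struct sched_param sp = { .sched_priority = 80 }; /* assumed priority */

    /* SCHED_FIFO tasks preempt all normal tasks; with PREEMPT_RT, kernel
     * code paths and interrupt handler threads are preemptible as well. */
    if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
        perror("sched_setscheduler");
        return -1;
    }

    /* Lock all pages into RAM so page faults cannot add latency. */
    if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
        perror("mlockall");
        return -1;
    }
    return 0;
}
```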

The Goggles software API was designed to provide the following functionality:

1. Capture two frames from the cameras and allow processing at a fixed interval

2. Save two frames to file

3. Send control commands to the SPHERE

4. Receive global metrology and gyroscope measurements from the SPHERE


1. Goggles Application Programmer Interface

Figure 9. Goggles Software API

In order to enable software development for the Goggles hardware, an Application Programmer Interface (API) was developed using the C programming language. The dependencies of the API are shown in Figure 9.

The primary functions in the Optics API are shown in Table 8. This API configures the cameras to run at a specific frame rate based on their own internal clocks, and an interrupt is generated when a new image is ready. The function captureTwoImages() blocks until a new frame is captured and this interrupt is generated.

The API for the SPHERES-Goggles interface is shown in Table 9. This API starts a separate thread on initialization that is dedicated to receiving SPHERES state updates (global metrology) and gyroscope measurements. These values are time-stamped and stored in a global variable so that they can be used by the control algorithms on the Goggles. One of the important operational aspects of the SPHERES-Goggles interface is that tests are started using the standard SPHERES Graphical User Interface (GUI). At the start of any Goggles program, waitGSPInitTest() must be called; it blocks until it receives a start test message from the SPHERE. When the user stops a test using the SPHERES GUI, the SPHERE sends a stop test message to the Goggles, which is checked for by checkTestTerminate(). Control commands are sent by the sendCtrl() function and can be specified in terms of targets in the inertial frame, or forces and torques in either the inertial or body frame. These commands are sent by the Goggles immediately and take effect on the next control cycle of SPHERES. A skeleton test program built from these calls is sketched after Table 9.

Function | Description
initTwoCameras() | Initializes the cameras; sets frame rates, exposure, resolution and other parameters.
closeTwoCameras() | Deallocates all structures for the cameras.
startTwoCameras() | Starts the capturing process.
captureTwoImages() | Blocks until a new image is captured, then returns a pointer to it.

Table 8. Optics API Functions

Function | Description
initSPHERES() | Initializes the SPHERES hardware, including the RS232 port.
closeSPHERES() | Deallocates all structures for the SPHERES interface.
waitGSPInitTest() | Blocks until a new SPHERES test is started.
checkTestTerminate() | Checks if a stop test message has been received.
sendCtrl() | Sends control commands for the SPHERE to execute.

Table 9. SPHERES Interface API Functions
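The skeleton below strings the functions of Tables 8 and 9 into a typical Goggles test loop. It is a hypothetical sketch: the paper names these functions but not their full signatures, so the return types and arguments shown here are our assumptions.

```c
/* Hypothetical Goggles vision-test skeleton using the API of Tables 8 and 9. */
int run_vision_test(void)
{
    if (initSPHERES() != 0 || initTwoCameras() != 0)
        return -1;

    waitGSPInitTest();   /* block until the SPHERES GUI starts a test */
    startTwoCameras();

    while (!checkTestTerminate()) {
        /* Blocks until the next image pair is captured. */
        unsigned char *frames = captureTwoImages();
        (void)frames;  /* vision-based navigation on the two frames goes here */

        /* sendCtrl(...) would then issue inertial-frame targets or
         * body/inertial forces and torques to the SPHERE. */
    }

    closeTwoCameras();
    closeSPHERES();
    return 0;
}
```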


D. Flat Goggles Design

(a) Top Down View without SPHERES (b) Front View mounted on SPHERES

Figure 10. Flat Goggles

The first mechanical design of the Goggles is referred to as the Flat Goggles. This design placed all of the electronic components on a single flat aluminum plate attached to the SPHERES air carriage. The electronics were connected in their final configuration, but this flat layout allowed flexibility in debugging and development. It also allowed us to address the development and testing of the electronics separately from the problem of how to tightly package them. The cameras, lights and SPHERES expansion port connectors were mounted on the SPHERE, since their placement and orientation are important for testing. The components of the Flat Goggles are shown on a SPHERES air carriage in Figure 10.

E. Integrated Goggles Design

Once the Flat Goggles were built, verified and tested, the next step was to package the Goggles electronics in an integrated fashion that minimized the mass and volume of the Goggles package.

A number of key design decisions were made during the initial phases of integrating the Goggles. The first was that the Integrated Goggles should be separated into an Avionics Stack and an Optics Mount. The Avionics Stack incorporates the Pico-ITX, battery, power electronics, expansion port connector and switches, while the Optics Mount consists of the cameras, LED lights and the spare Ethernet, USB and power connections. The interconnection between the Avionics Stack and Optics Mount is made with one electrical connector and four thumbnuts that connect to standoffs.

Another key decision was that the Goggles should be easy to assemble and disassemble. This was based on lessons learned from building SPHERES: when something breaks inside a complex integrated system, being designed for serviceability can save a lot of time and effort.

The third primary design decision was that the Avionics Stack should have a shell that is not structural and is not used for mounting any components. The shell's primary purpose is to protect the Goggles from handling in a lab environment. This shell was designed using a Computer Aided Design (CAD) program and manufactured on a 3-D Acrylonitrile Butadiene Styrene (ABS) plastic printer. The shell can be seen from multiple perspectives in Figure 13.

One key aspect of the Integrated Goggles design was to ensure that there was sufficient cooling for the components inside the Avionics Stack. The Pico-ITX comes with a fan built into its heat sink and is limited to a maximum temperature of 50 °C. As a result, the Pico-ITX was placed at the outermost point of the Avionics Stack, with its exhaust ports directed in the vertical direction (the direction of gravity), which prevents the exhaust from creating disturbance forces and torques. The exhaust vents are visible in Figure 13.

Additionally, the 20 Watt (85% efficient) DC-DC converter often reaches a temperature of 60 °C on the Flat Goggles platform. Although this is not a problem for the converter, we wanted to ensure it did not overheat any further inside the Avionics Stack (at 75 °C it loses efficiency, and at 100 °C the converter is operating beyond its specifications). Therefore, the DC-DC converter was thermally connected to the battery bracket, and holes were designed into the shell to allow for natural convection; these are also visible in Figure 13. If the SPHERES Goggles were to be operated inside the ISS, there would be a need to investigate and possibly redesign the thermal system for a microgravity environment.

(a) CAD Design of Avionics Stack (b) Fabricated Goggles

Figure 11. Avionics Stack and Optics Mount

(a) Integrated Goggles Design with Right Angle Optics (b) Integrated Goggles Design with Stereo Optics

Figure 12. Right Angle and Stereo CAD Models of Integrated Goggles

Figure 11 shows the final layout of the Integrated Goggles, both in CAD and as fabricated, with the shell removed and the Goggles attached to SPHERES. Figure 13 shows four views of the Integrated Goggles with the shell, separate from SPHERES. Figure 12 shows the Integrated Goggles with two different optics mounts; these mounts contain identical components, but the physical layout is changed.

Figure 13. 4 Views of the Integrated Goggles


V. Conclusion

This paper has introduced the SPHERES Goggles, an upgrade to SPHERES that enables the research, development, testing and maturation of technology for computer vision based navigation for spacecraft proximity operations. The concept, design and implementation of this upgrade were discussed, along with the motivations, design philosophies and tradeoffs that were considered in the development process. Specific details were given on the design of the electronics, the software and the two mechanical implementations (the Flat and Integrated Goggles).

Future work will include considerable development of computer vision software for the purpose of performance verification and testing. Questions remain as to how accurately, and over what ranges, the SPHERES Goggles can be effectively used for vision based navigation; research is currently underway to answer these questions.

Acknowledgments

This work was performed under contract with the Naval Research Laboratory (NRL), in collaboration with Aurora Flight Sciences (AFS). The authors would like to thank Glen Henshaw and Stephen Roderick from NRL, and Joanne Vining, James Peverill and Joe Parrish from AFS. The authors are especially grateful for the help provided by Paul Bauer, Jakob Katz, Simon Calcutt, Edward Whittemore and all of the SPHERES team at MIT.

References

1. Fehse, W., Automated Rendezvous and Docking of Spacecraft, Cambridge University Press, 2003.
2. Deslauriers, A., English, C., Bennett, C., Iles, P., Taylor, R., and Montpool, A., "3D Inspection for the Shuttle Return to Flight," Spaceborne Sensors III, edited by R. T. Howard and R. D. Richards, Vol. 6220, SPIE, 2006, p. 62200H.
3. Howard, R., Heaton, A., Pinson, R., and Carrington, C., "Orbital Express Advanced Video Guidance Sensor," Aerospace Conference, 2008 IEEE, March 2008, pp. 1–10.
4. Lanzerotti, L. J., "Assessment of Options for Extending the Life of the Hubble Space Telescope: Final Report," Tech. rep., National Research Council of the National Academies, 2005.
5. Obermark, J., Creamer, G., Kelm, B. E., Wagner, W., and Henshaw, C. G., "SUMO/FREND: Vision System for Autonomous Satellite Grapple," Sensors and Systems for Space Applications, edited by R. T. Howard and R. D. Richards, Vol. 6555, SPIE, 2007, p. 65550Y.
6. Henshaw, C. G., Henshaw, L., and Roderick, S., "LIIVe: A Small, Low-Cost Autonomous Inspection Vehicle," AIAA SPACE 2009 Conference and Exposition, AIAA 2009-6544, 2009.
7. Zimpfer, D., Kachmar, P., and Tuohy, S., "Autonomous Rendezvous, Capture and In-Space Assembly: Past, Present and Future," 1st Space Exploration Conference: Continuing the Voyage of Discovery, AIAA 2005-2523, 2005.
8. Williams, T. and Tanygin, S., "On-Orbit Engineering Tests of the AERCam Sprint Robotic Camera Vehicle," Proceedings of the AAS/AIAA Space Flight Mechanics Meeting, 1998, pp. 1001–1020.
9. Fredrickson, S. E., Abbott, L. W., Duran, S., Jochim, J. D., Studak, J. W., Wagenknecht, J. D., and Williams, N. M., "Mini AERCam: Development of a Free-Flying Nanosatellite Inspection Robot," Space Systems Technology and Operations, edited by P. Tchoryk, Jr. and J. Shoemaker, Vol. 5088, SPIE, 2003, pp. 97–111.
10. McGhan, C. L. R., Besser, R. L., Sanner, R. M., and Atkins, E. M., "Semi-Autonomous Inspection with a Neutral Buoyancy Free-Flyer," AIAA Guidance, Navigation and Control Conference and Exhibit, AIAA 2006-6800, 2006.
11. Romano, M., Friedman, D. A., and Shay, T. J., "Laboratory Experimentation of Autonomous Spacecraft Approach and Docking to a Collaborative Target," Journal of Spacecraft and Rockets, Vol. 44, Jan. 2007, pp. 164–173.
12. Romano, M., "On-the-Ground Experiments of Autonomous Spacecraft Proximity-Navigation using Computer Vision and Jet Actuators," Advanced Intelligent Mechatronics, Proceedings of the 2005 IEEE/ASME International Conference, July 2005, pp. 1011–1016.
13. Ledebuhr, A., Ng, L., Jones, M., Wilson, B., Gaughan, R., Breitfeller, E., Taylor, W., Robinson, J., Antelman, D., and Nielsen, D., "Micro-Satellite Ground Test Vehicle for Proximity and Docking Operations Development," Aerospace Conference, 2001, IEEE Proceedings, Vol. 5, 2001, pp. 2493–2504.
14. Mohan, S., Saenz-Otero, A., Nolet, S., Miller, D. W., and Sell, S., "SPHERES Flight Operations Testing and Execution," Acta Astronautica, 2009.
15. Nolet, S., "The SPHERES Navigation System: from Early Development to On-Orbit Testing," AIAA Guidance, Navigation and Control Conference and Exhibit, AIAA 2007-6354, 2007.
16. Nolet, S., Development of a Guidance, Navigation and Control Architecture and Validation Process Enabling Autonomous Docking to a Tumbling Satellite, Ph.D. thesis, Massachusetts Institute of Technology, 2007.
