

E3. TRACKING SYSTEM FOR ISAAC

Construction and Implementation of Tracking System for the ISAAC Sounding Rocket Experiment

Emil Lundkvist and Markus Silberstein Hont

Abstract—The ISAAC rocket experiment is part of the REXUS–15 payload. It has two free-falling units performing infrared spectroscopy between them. One of the free-falling units will actively track the other in the sky so as to enable continuous measurements. In this thesis report, the design of this tracking solution is described. The general concept of the tracking process is outlined, upon which the conceptual design as well as the hardware, software and firmware of each subsystem are explained. Tests of the respective subsystems are conducted and conclusions are drawn pertaining to the accuracy and suitability of each subsystem for the tracking system.

Index Terms—Tracking system, Sounding Rocket Experiment, VHDL, REXUS, I2C, FPGA, CMOS Camera Sensor, Sun Sensing, Hardware Design, Software Design

ABBREVIATIONS

ADC Analog-to-Digital Converter
CAD Computer-Aided Design
CMOS Complementary Metal–Oxide–Semiconductor
CU Common Unit; constitutes the top parts of both the ISAAC-Rx and ISAAC-Tx
DLR German Aerospace Center
ESA European Space Agency
FFU Free-Falling Unit
FPGA Field-Programmable Gate Array
I2C Inter-Integrated Circuit
ISAAC Infrared Spectroscopy to Analyze the middle Atmosphere Composition
ISAAC-Rx Receiving FFU
ISAAC-Tx Transmitting FFU
JTAG Joint Test Action Group
KTH Kungliga Tekniska Högskolan (Royal Institute of Technology)
LED Light-Emitting Diode
MEMS MicroElectroMechanical System
MUSCAT MUltiple Spheres for the Characterization of Atmospheric Temperatures
PCB Printed Circuit Board
RAM Random-Access Memory
REXUS Rocket-borne EXperiments for University Students
RMU Rocket-Mounted Unit
RxSU ISAAC-Rx Specific Unit
SNSB Swedish National Space Board
SPP Department of Space and Plasma Physics at KTH
TxSU ISAAC-Tx Specific Unit
VHDL Very High Speed Integrated Circuit Hardware Description Language

I. INTRODUCTION

ISAAC (Infrared Spectroscopy to Analyze the middle Atmosphere Composition) is a sounding rocket experiment developed at the Division of Space and Plasma Physics at the School of Electrical Engineering and the Division of Mechanics at the School of Engineering Sciences within KTH, together with the Department of Meteorology at Stockholm University (MISU).

ISAAC is scheduled to be launched in March 2014 onboard the REXUS–15 sounding rocket. REXUS/BEXUS is a student program realized under a bilateral agreement between the German Aerospace Center (DLR) and the Swedish National Space Board (SNSB) [1].

Fig. 1: Chronological experiment overview

A. Scientific Purpose of the ISAAC Project

It is widely known that CO2 in the Earth's atmosphere has an influence on our planet's climate, being a major contributor to the greenhouse effect. But less is known about its function at higher altitudes, where it is responsible for strong radiative cooling [2]. One issue is that not much information is available on the CO2 concentration in the middle atmosphere (altitudes of 12-80 km). The middle atmosphere is particularly interesting since previous research suggests a decrease in CO2 concentration starting at altitudes of around 70-75 km, and better data regarding this would help improve our understanding of the Earth's climate [3].

B. Experiment Overview

Two FFUs, referred to as ISAAC–Tx and ISAAC–Rx, are located inside the RMU, which in turn is placed on the REXUS rocket. The FFUs are ejected from the RMU when the rocket reaches above the dense atmosphere, at an altitude of approximately 60 km [4]. After ejection, both FFUs are spinning around their own cylindrical central axes. Throughout its free fall, the ISAAC–Tx is continuously and actively tracked by the ISAAC–Rx, which provides the capability of performing IR spectroscopy between them: the change in intensity between emission from the ISAAC–Tx and reception at the ISAAC–Rx is measured, and thereby the CO2 concentration can be calculated in post-processing. (Refer to Fig. 1.)


C. Scope

The goal of this thesis report is to describe the design, construction and implementation of the tracking system, which is crucial for the IR spectroscopy to take place. In Section II, a general overview of the tracking process is outlined and its different sub-systems are described, both conceptually and theoretically, along with how they interact with each other in order to achieve tracking. Furthermore, the tracking algorithm is described on a theoretical level. Section III features an in-depth description of the hardware functionality, how the specific components are chosen and how they function electronically. Section IV features the actual implementation of the tracking system. Circuit designs and layouts of the PCBs used for the tracking are described. The in-flight data processing and control algorithms which are executed by the ISAAC–Rx in order to locate the ISAAC–Tx are presented in detail in terms of the programming. Section V describes the test procedures carried out, including reasons for testing as well as the ways in which the tests were performed. Section VI presents the results from these tests and from the working process of the thesis project. Section VII contains discussion, concerning both the results from the tests and the thesis project as a whole.

II. TRACKING SYSTEM OVERVIEW

In order to perform spectroscopy, the aperture on the ISAAC–Rx must be facing the IR light source on the ISAAC–Tx. This is the sole purpose of the tracking system, and the concept of its execution is described below.

A. Tracking System Concept

The ISAAC–Tx serves as a 'passive' unit in the sense that it will be completely unaware of the whereabouts of the ISAAC–Rx, which in turn is the unit entirely responsible for carrying out the tracking. When the FFUs are ejected from the RMU, they are spinning at the rate of the rocket [1]. In order to enable the aperture on the ISAAC–Rx to be fixedly pointed at the ISAAC–Tx, the Rx unit is divided into two sub-units: the RxSU, which contains the aperture and all the tracking hardware, and the CU, which contains a GPS system for post-flight recovery, amongst other things which are not relevant for the tracking. A motor mounted between the sub-units allows the RxSU to assume a non-spinning state in the common reference frame of the FFUs, which fixes the aperture horizontally. Vertical adjustments are made by tilting a mirror which directs the IR light onto a sensor. Together, the repeated vertical and horizontal adjustments allow the spectroscopy to be carried out continuously throughout the data gathering stage in Fig. 1.

The magnitudes of these adjustments are calculated using data from three different kinds of sensors mounted onto the RxSU:

1) Sun Sensors: The initial stage after the ejection of the FFUs is for the ISAAC–Rx to roughly approximate the location of the Tx unit in the sky. This is done by relating the axis of rotation to the position of the sun using a sun sensing system (the electronics of which are described in further detail in Section III).

2) CMOS Camera: Mounted (evenly spaced) onto the skin of the ISAAC–Tx are 20 LEDs (placed in pairs). Their purpose is to increase the visibility of the Tx unit in the sky. Once the approximate location of the ISAAC–Tx has been calculated using the sun sensing system, the camera more accurately determines where to aim the aperture. Images of the LEDs are captured and analyzed (refer to Section IV), after which the rotation speed and mirror tilt angle can be appropriately adjusted.

3) Angular Rate Sensor: In order to assume the non-spinning state, the control algorithm needs input data concerning the total spinning rate of the RxSU, and it is from the angular rate gyro that these data come. Furthermore, the image processing algorithm (refer to Section II-C) uses data from this sensor to predict the movement of the LEDs across the field of view.

Figure 2 describes the overall concept of the tracking system.

Fig. 2: Conceptual diagram of the tracking system

B. Sub-System Theory

1) Sun Sensors: The FFUs are ejected from the REXUS rocket at a given time, and therefore they follow a certain known trajectory relative to the sun. This fact matters to ISAAC for two reasons. Firstly, the position of the sun is important when, after ejection, the ISAAC-Rx attempts to make a first rough approximation as to the whereabouts of the ISAAC-Tx using its onboard sun sensors. Secondly, the performance of the tracking camera would be severely degraded should it be 'blinded' by the sun; this also needs to be accounted for. There are four sun sensors mounted evenly spaced onto the skin of the ISAAC-Rx. They measure the intensity of incident sunlight at any given sampling time. When ejection occurs, the FFUs are spinning at a rate of approximately 4 Hz [1], which gives rise to a periodic behavior in the intensity readouts from each sun sensor. If all the sensors are sampled in phase with each other, one can compare differences in intensity between them and thus interpolate the relative position of the sun. These data are then used to control the rate of the de-spinning motor mounted between the RxSU and its CU in such a way that the optical system is horizontally aligned with the ISAAC-Tx.


Fig. 3: Conceptual diagram of the sun sensing system

The goal of the sun sensing system is to control the spinning in such a way that one sun sensor (referred to as SS1 in Fig. 3) is constantly aligned with the sun in the plane of the RxSU. This is achieved by ensuring that the intensity readouts from SS1 are as large as possible. The two sensors situated closest to SS1 (SS2 and SS4 in Fig. 3) serve to enable 'stereo vision', i.e. they provide information about in which direction to spin the RxSU in order to achieve maximum intensity at SS1. For example, if the intensity difference ISS4 − ISS2 is positive (by some margin where noise is taken into account), then the FFU needs to spin clockwise (with a negative value of θ), and vice versa.
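A rough sketch of this decision rule is given below in C; the noise margin value is invented for illustration, and the flight logic is implemented in FPGA firmware rather than C:

```c
#include <stdio.h>

/* 'Stereo vision' despin decision: compare the SS4 and SS2 intensities
 * and pick a spin direction so that SS1 ends up facing the sun. */
#define NOISE_MARGIN 0.05   /* volts; hypothetical value */

/* Returns -1 for clockwise (negative theta), +1 for counter-clockwise,
 * 0 when the difference is within the noise margin (SS1 aligned). */
int spin_direction(double i_ss2, double i_ss4)
{
    double diff = i_ss4 - i_ss2;
    if (diff > NOISE_MARGIN)  return -1;   /* spin clockwise         */
    if (diff < -NOISE_MARGIN) return +1;   /* spin counter-clockwise */
    return 0;                              /* SS1 aligned with sun   */
}

int main(void)
{
    printf("%d\n", spin_direction(1.2, 1.6));   /* prints -1 */
    return 0;
}
```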

The fourth sun sensor (SS3 in Fig. 3) compensates for albedo effects from the Earth, which are due to sunlight being reflected off its surface. These are typically around 35% of the incident sun intensity and cannot be neglected [5]. Were SS3 not present, the sun sensor system might erroneously lock onto the 'light source' stemming from the albedo effect, because it would be unaware of a stronger light source being present, namely the sun.

When deciding where to mount the four sensors, one needs to take into account, in addition to the fact that they should be evenly spaced, what is considered the optimal position of the sun in relation to the IR sensor aperture. Because SS1 is, ideally, always facing the sun, there exists a polar angle, θa, between the aperture and SS1, which corresponds to the optimal aperture-to-sun angle and can be predetermined. One part of determining this angle is knowing in what direction the FFUs are ejected from the REXUS rocket. This is handled by mounting yet another sun sensing system onto the RMU, which keeps track of the RMU-to-sun relation and ejects the FFUs once this relation is optimal. Using this method, parameters such as the time of day become irrelevant, because the only thing that matters is the angle of the sun within the reference frames of the ISAAC–Rx and the RMU, respectively.

2) Distance Dependency of LED Intensity: The intensity of the LEDs will, of course, not be the same at every distance between the FFUs. In fact, the intensity of one LED will be 'spread out' over a spherically shaped surface according to

I(rd) = Φ0 / ∫Ω rd² dΩ, (1)

where Φ0 [W] is the radiant flux of the LED and Ω is the total solid angle across which the light radiates. rd is the separation distance between the FFUs, and therefore the integral in the denominator equals the area of the surface over which the light spreads at rd.

In the particular case of the LEDs used for the ISAAC project, Φ0 = 900 mW per LED and dΩ = sin θ dθ dφ with θ : −θv → θv and φ : −θv → θv, where θv = 65° is the angular interval from the center axis within which 90% of the total flux is radiated [6]. This yields:

I(rd) = 900 mW / (∫Ω rd² sin θ dθ dφ m²) ≈ 304/rd² mW/m² (2)

The intensity at the aperture then equals I(rd) multiplied by the area of the aperture.
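As a small numerical illustration of (2), the snippet below evaluates the intensity at a few separation distances and multiplies by an aperture area; the 1 cm² aperture is a made-up illustrative value, not a figure from the ISAAC design:

```c
#include <stdio.h>

/* Numerical check of eq. (2): received LED intensity vs. distance. */
int main(void)
{
    const double k = 304.0;          /* mW*m^2, numerator of eq. (2)    */
    const double aperture = 1e-4;    /* m^2, hypothetical 1 cm^2 value  */

    for (double rd = 100.0; rd <= 500.0; rd += 100.0) {
        double intensity = k / (rd * rd);         /* mW/m^2 at rd       */
        double power_mw  = intensity * aperture;  /* mW at the aperture */
        printf("rd = %3.0f m: I = %.3e mW/m^2, P = %.3e mW\n",
               rd, intensity, power_mw);
    }
    return 0;
}
```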

3) Angular Rate Sensor: The angular rate sensor system is implemented using heritage from MUSCAT, a previous REXUS team [7]. It utilizes the Coriolis effect to measure angular rates. Coriolis forces on a rotating rigid body like the RxSU are of the following form [8]:

Fcor = −2m ω × v (3)

where m is the mass of the object and ω is the angular rate of the object. Hence, Fcor is proportional to the angular rate of the object. The angular rate sensor consists of a MEMS tuning fork which, upon being subjected to Coriolis forces (the most dominant ones in a rotating system such as the RxSU), vibrates at its eigenfrequency with a velocity, v, of

v = a sin(ω0 t) (4)

where ω0 is the eigenfrequency of the tuning fork and a is the amplitude. Then, using (3), (4) and the fact that ω ⊥ v in the case of the tuning fork on the FFU, the Coriolis force becomes

Fcor = −2m a ω sin(ω0 t) (5)

The tuning fork is deformed, periodically and elastically, by the Coriolis force, thus producing a charge, Q, which can be measured electronically [9]. It takes the form

Q = A ω sin(ω0 t) (6)

where A is an amplitude constant, dependent on the material and the geometry. By knowing A, ω can be found.


C. Algorithm Concept

1) Finding hotspots: A crucial part of the tracking algorithm is being able to find the LEDs in the images captured by the tracking camera. Noise is expected and makes it more difficult to find the LEDs. In order to narrow down the image to a set of possible areas, there needs to exist an algorithm which locates hotspots: bright areas that represent prospective locations of the LEDs. Firstly, one needs to define a threshold. An area containing pixels with brightness above this threshold will be considered a hotspot. The challenge is to determine an appropriate value of this threshold brightness: the fewer presumptive LED locations the better, but selecting too high a threshold increases the risk of not finding any LEDs at all.

The algorithm begins by receiving the pixel information from one captured image; every pixel is visited and compared to the intensity threshold. It is also checked whether or not the current pixel already belongs to a previously encountered hotspot. If the pixel intensity is above the threshold and the pixel does not belong to a previously encountered hotspot, then a new hotspot is considered to be found and the pixel is added to it. When a new hotspot has been defined, the algorithm begins expanding it by recursively assessing whether the neighboring pixels reach above the threshold. If a neighbor qualifies, that pixel is also added to the hotspot, after which all of its neighbors are checked as well. This continues until all pixels belonging to the hotspot have been found. When all neighboring pixels in the hotspot have been found, the algorithm continues with the next pixel along the row which does not belong to the hotspot and starts looking for the next hotspot as described earlier. When the image has been completely searched through, a number of hotspots has been found.
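A minimal sketch of this scan-and-grow procedure in C follows; the image size, sample values and the 4-connectivity choice are illustrative assumptions, and a flight version would also need to bound the recursion depth:

```c
#include <stdio.h>

#define W 8
#define H 6

/* Recursively grow a hotspot from (x, y): every connected pixel above
 * the threshold is claimed by writing its hotspot id into `label`. */
static void grow(const unsigned char *img, int *label,
                 int x, int y, unsigned char thr, int id)
{
    if (x < 0 || x >= W || y < 0 || y >= H) return;   /* off the image  */
    int i = y * W + x;
    if (label[i] != 0 || img[i] < thr) return;        /* visited / dark */
    label[i] = id;
    grow(img, label, x + 1, y, thr, id);              /* 4-connectivity */
    grow(img, label, x - 1, y, thr, id);
    grow(img, label, x, y + 1, thr, id);
    grow(img, label, x, y - 1, thr, id);
}

/* Scan the image row by row; each unlabeled bright pixel starts a new
 * hotspot. Returns the number of hotspots found. */
static int find_hotspots(const unsigned char *img, int *label,
                         unsigned char thr)
{
    int id = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (img[y * W + x] >= thr && label[y * W + x] == 0)
                grow(img, label, x, y, thr, ++id);
    return id;
}

int main(void)
{
    unsigned char img[W * H] = {0};
    img[1 * W + 2] = 250; img[1 * W + 3] = 240;   /* one bright blob */
    img[4 * W + 6] = 230;                         /* a second blob   */
    int label[W * H] = {0};
    printf("hotspots: %d\n", find_hotspots(img, label, 200)); /* 2 */
    return 0;
}
```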

2) Tracking modes: In order to be able to perform the tracking described in Section II-A, a control algorithm is needed. This algorithm takes a sun angle, an angular rate and a list of possible hotspots as input. The output is the desired mirror angle and the despin velocity. The algorithm consists of three modes, referred to as the Despin, Lock and Search modes.

When the ISAAC–Rx is ejected from the rocket it will spin with an angular rate of 4 Hz. At this stage, the algorithm is in the Despin mode. In this mode the main inputs used are the sun angle and the angular rate. The goal is to stop the RxSU from spinning and to set the angle to the sun in such a way that the aperture is focusing on the ISAAC–Tx (as described in Section II-B1). In total, this corresponds to

θ̇ → 0 ∧ θ → θdesired, (7)

where θ is the angle between SS1 (in Fig. 3) and the sun, and θ̇ is its rate of change. When these requirements are met, within a defined error margin, the algorithm proceeds to Lock mode.

In Lock mode, the algorithm starts receiving input from the camera images, which have previously been pre-processed according to Section II-C1. This information consists of a list of possible hotspots in which the LEDs might be found. In the first iteration, the algorithm presumes that the most likely hotspot in the first image (the brightest one) actually corresponds to the LEDs. In the following iterations it uses information from earlier hotspots to better predict where the LEDs are. It maps hotspots from previously taken images to the current one and is therefore able to track the movement of individual hotspots. These movements are compared to the movement inferred from the sensors, which makes it possible to discard some hotspots as noise. If the angular rate increases so much that the LEDs might get out of the field of view, or if the sun angle error becomes too large, it means that the ISAAC–Rx has begun to rotate out of the optimal frame of view, and the algorithm enters Despin mode once again.

If no hotspots at all are found in Lock mode, it most likely means that the LEDs are not in the field of view of the camera. This triggers the Search mode. In Search mode, the ISAAC–Rx is commanded to start a slow rotation in which it still receives input from the camera. It then begins searching for possible hotspots, and as soon as one has been found (i.e. when a hotspot has been encountered and condition (7) is fulfilled) it again enters Lock mode. A sketch of the mode switching follows below.
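The C sketch below models this mode switching; the thresholds and the input structure are invented for illustration, since the actual controller runs as FPGA firmware:

```c
#include <math.h>

/* Despin/Lock/Search mode switching, modeled after the description above. */
typedef enum { MODE_DESPIN, MODE_LOCK, MODE_SEARCH } track_mode_t;

typedef struct {
    double spin_rate;      /* measured angular rate [rad/s]      */
    double sun_angle_err;  /* theta - theta_desired [rad]        */
    int    n_hotspots;     /* hotspots found in the latest image */
} inputs_t;

#define RATE_EPS  0.05     /* hypothetical error margins */
#define ANGLE_EPS 0.05
#define RATE_MAX  1.0

track_mode_t step(track_mode_t m, const inputs_t *in)
{
    switch (m) {
    case MODE_DESPIN:                 /* eq. (7): despin and align */
        if (fabs(in->spin_rate) < RATE_EPS &&
            fabs(in->sun_angle_err) < ANGLE_EPS)
            return MODE_LOCK;
        return MODE_DESPIN;
    case MODE_LOCK:
        if (fabs(in->spin_rate) > RATE_MAX ||
            fabs(in->sun_angle_err) > ANGLE_EPS)
            return MODE_DESPIN;       /* LEDs may leave the field of view */
        if (in->n_hotspots == 0)
            return MODE_SEARCH;       /* no LEDs visible: start searching */
        return MODE_LOCK;
    case MODE_SEARCH:
        if (in->n_hotspots > 0 &&
            fabs(in->sun_angle_err) < ANGLE_EPS)
            return MODE_LOCK;         /* hotspot found and (7) fulfilled  */
        return MODE_SEARCH;
    }
    return m;
}
```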

The complete concept is shown in Fig. 4.

Fig. 4: Conceptual diagram of the tracking modes

III. ELECTRONICS OVERVIEW

A. FPGA

FPGAs are integrated circuits with programmable logic which are able to direct and process digital signals. Voltages at the terminal pads of the FPGA are interpreted relative to some given interval (commonly between 0 and 3.3 V). If the voltage is below the midpoint of this interval, the signal is considered LOW (i.e. a digital '0' bit), and if it is above the midpoint it is considered HIGH (i.e. a digital '1' bit). Digital information, consisting of sequences of these bits, is transferred between different instances of the FPGA and/or different devices in relation to one or several clocks, which are often generated by physical oscillators with known frequencies. The signals from this oscillator can be divided into lower frequencies, should this be necessary. Table I shows how the oscillator frequency is divided on the tracking PCB.

System Clock Vector   Frequency     Description
M_TIME[0]             32.768 MHz    Oscillator frequency
M_TIME[1]             16.384 MHz
M_TIME[2]             8.192 MHz
M_TIME[3]             4.096 MHz
M_TIME[4]             2.048 MHz
...                   ...
M_TIME[25]            1 Hz          Lowest defined frequency

TABLE I: Clock reference vector [10]
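Each element of the vector thus toggles at half the frequency of the previous one, i.e. f_n = f_osc / 2^n. The short check below reproduces the Table I entries (note that 32.768 MHz / 2^25 is about 0.98 Hz, which the table lists as 1 Hz):

```c
#include <stdio.h>

/* Print the divided clock frequencies of the M_TIME vector. */
int main(void)
{
    const double f_osc = 32.768e6;   /* Hz, oscillator frequency */
    for (int n = 0; n <= 25; n++)
        printf("M_TIME[%2d]: %14.3f Hz\n", n, f_osc / (double)(1u << n));
    return 0;
}
```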

Page 5: E3. TRACKING SYSTEM FOR ISAAC Construction and

E3. TRACKING SYSTEM FOR ISAAC

1) FPGA Programming: For the ISAAC tracking system, ProASIC3 FPGAs from Microsemi are used. The particular model chosen, the A3P250, has 250,000 logic gates and 68 user-customizable I/O ports [11]. Logic is programmed in the VHDL hardware description language, written in the Libero IDE suite provided by Actel. The code is designed using a set of modules: a unit class within the code where the firmware designer defines input and output ports and signals, which are then routed in relation to the clock. Signals are both handled internally within a module and sent between modules. When the VHDL code has been written, the development environment synthesizes a file containing the logic gate setup which is to be used on the FPGA. In the last step, a JTAG interface is used between the computer and the FPGA, which can then be programmed according to the synthesized file.

2) I2C Communication: In order to enable the tracking PCB to communicate digitally with the different sensors, I2C is used. It is a serial data interface which is used to handle communication between digital circuits. It consists of two signals, the Serial Clock (SCL) and the Serial Data (SDA). One of the devices (the FPGA chips in the case of the ISAAC tracking system) is said to be the master, which is responsible for generating the SCL and initiating communication with the slave(s). The slaves receive the SCL and respond to signals sent by the master. Data are transferred over the SDA in relation to the square wave shape of the SCL. Events, i.e. sequences of bits transferred between the devices, occur according to pre-defined firmware routines in which SCL and SDA depend on each other. The I2C interface proves useful when, for example, writing to and reading from the registers of a digital circuit. Registers store bits of information pertaining to the functionality of the device (such as write speeds, hardware configurations, etc.). (Refer to Section III-B4 for the configuration of registers in the case of the CMOS camera.)

B. Tracking Camera

The MT9P031 sensor from Aptina is used for the tracking system. It is a 1/2.5 inch CMOS active-pixel digital image sensor and has an active imaging pixel array of 2592 by 1944 pixels [12]. The sensor is able to perform column and row skipping and pixel binning, which enables the user to read images of preferred sizes (at the expense of resolution). It is also possible to control in detail the way in which the exposure is carried out. Table II shows the pin configuration of the sensor header board.

Pin name     Type     Pin description
SCLK         I/O      I2C Serial Clock
SDATA        I/O      I2C Serial Data
TRIGGER      Input    Initiates triggering
STROBE       Output   Initiates data readout
EXTCLK       Input    Master Clock
DOUT[11:0]   Output   12 bits of pixel data
PIXCLK       Output   Pixel readout clock
FV           Output   Frame Valid
LV           Output   Line Valid

TABLE II: Pin layout of the camera

Fig. 5: The full pixel array [12]

Fig. 6: Row and column skipping

1) Blanking: In addition to the active imaging pixels (i.e. the pixels that are part of the actual output image), there are two blanking areas and two dark columns, which serve as offsets from the sensor edges. Blanked pixels are indeed exposed when the camera is triggered (refer to Section III-B3), but they are not included when the data are read out. Dark columns are read but not exposed, and function as a black reference for the internal ADC of the sensor. Fig. 5 illustrates this.

2) Skipping and Binning: Skipping of rows and columns is used to make the resolution of the output images smaller without reducing the field of view of the camera. An example of this is shown in Fig. 6, where every second row and column is skipped and displayed grey in the image. Skipping can, however, introduce undesired aliasing effects, which result in poor, distorted images [13]. These effects can be reduced by performing row and column binning. This method combines pixels over some pre-defined number of adjacent rows and columns. It can be done either by averaging or by summing over the intensities of the pixels. The concept of binning is shown in Fig. 7.
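As an illustration of binning by averaging, the sketch below collapses 2x2 blocks of a toy image into single output pixels; the sensor does this on-chip, so the code only mirrors the arithmetic:

```c
#include <stdio.h>

#define W 4
#define H 4

int main(void)
{
    unsigned char in[H][W] = {
        { 10,  20,  30,  40 },
        { 50,  60,  70,  80 },
        { 90, 100, 110, 120 },
        {130, 140, 150, 160 },
    };
    unsigned char out[H / 2][W / 2];

    /* Every output pixel is the mean of a 2x2 block of input pixels,
     * halving the resolution in both directions. */
    for (int y = 0; y < H / 2; y++)
        for (int x = 0; x < W / 2; x++) {
            int s = in[2 * y][2 * x]     + in[2 * y][2 * x + 1]
                  + in[2 * y + 1][2 * x] + in[2 * y + 1][2 * x + 1];
            out[y][x] = (unsigned char)(s / 4);   /* block average */
        }

    for (int y = 0; y < H / 2; y++, puts(""))
        for (int x = 0; x < W / 2; x++)
            printf("%4d", out[y][x]);
    return 0;
}
```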

Fig. 7: Pixel binning

3) Exposure mode: The default exposure mode for the CMOS sensor is to continuously output frames at a constant, configurable frame rate. In order to better control the data flow for the tracking system, non-continuous triggering may be desired. This is made possible in the Snapshot mode, which allows the user to capture one image at a time, and is set by writing '0100 0001 0000 1111' to the R0x01E register [14].

Fig. 8: The Snapshot exposure mode [12]

The sensor has two triggering pins, TRIGGER and STROBE: TRIGGER enables the exposure, and STROBE enables the actual readout of the data from the exposure. These pins are used in essentially the same way for all the available exposure modes [12]:

1) Wait for TRIGGER to occur, then start the exposure
2) Wait for STROBE to occur, then start the readout

It is the time interval and/or the pixel distance between these two triggerings that differs between the exposure modes. In Snapshot mode, the enabling TRIGGER is issued manually and initiates the exposure of the first row (with some delay because of the dark columns which, as mentioned previously, are non-exposed pixels that can be used to fine-tune the black level but are not part of the output image). After a small period of time, tROW (which depends on binning settings, clock speeds, etc.), typically 10 µs, the second row begins exposing. After an additional tROW, the third row is exposed, and so on. When nrows · tROW has passed (where nrows is the number of rows on the array), all rows are being exposed, upon which STROBE triggers automatically. This marks the beginning of the data readout sequence. Once the first row has been read out, TRIGGER can be enabled again, thus making it possible to begin another exposure even before the previous image has been fully read out. Figure 8 shows the timing of the Snapshot mode.

4) Register Configuration: Settings like binning and exposure modes are communicated to the camera by setting the sensor registers to desired values. Registers are written to the sensor using the I2C protocol. The sensor is the slave and the control FPGA (see Section IV-A) is the master.

A write sequence is initiated by the master setting the SDA line LOW while the SCL line is HIGH. This is called a start bit. The start bit is then followed by the specific 8-bit slave address of the CMOS sensor, sent on the SDA line with one bit per SCL cycle while SCL is HIGH. During the next clock cycle, the master releases the data line and the slave pulls it LOW in order to communicate that the information was received. This is called an acknowledge bit. Thereafter, the register address is sent to the slave, followed by the actual register data, which consist of two 8-bit sequences transferred in the same way as described above, each followed by another acknowledge bit from the sensor.

An example of this process is given in Fig. 9 (in which the clock line is called SCL and the data line is called SDATA). In the example, the R0x09 register is written to with the value '0000 0010 1000 0100'.

Fig. 9: Camera register write sequence [12]
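A bit-banged software model of this write sequence is sketched below. The pin helpers are stand-ins that only log line states (on ISAAC the sequence is implemented in VHDL on the FPGA); the 0xBA write address and the example register value come from the text above:

```c
#include <stdint.h>
#include <stdio.h>

/* Stand-in pin helpers: log the line states; a stub slave always acks. */
static void sda_set(int v) { printf("SDA=%d ", v); }
static void scl_set(int v) { printf("SCL=%d ", v); }
static int  sda_read(void) { return 0; }   /* slave pulls SDA LOW: ack */

static void i2c_start(void)
{
    sda_set(1); scl_set(1);
    sda_set(0);                 /* SDA falls while SCL is HIGH: start bit */
    scl_set(0); printf("\n");
}

static int i2c_write_byte(uint8_t b)
{
    for (int i = 7; i >= 0; i--) {     /* MSB first, one bit per cycle */
        sda_set((b >> i) & 1);
        scl_set(1); scl_set(0);
    }
    scl_set(1);                        /* 9th cycle: acknowledge slot  */
    int ack = (sda_read() == 0);
    scl_set(0); printf("| byte 0x%02X %s\n", b, ack ? "ACK" : "NAK");
    return ack;
}

static void i2c_stop(void)
{
    sda_set(0); scl_set(1);
    sda_set(1); printf(": stop\n");    /* SDA rises while SCL is HIGH */
}

/* Write a 16-bit camera register; 0xBA is the sensor's write address [12].
 * camera_write_reg(0x09, 0x0284) reproduces the Fig. 9 example
 * ('0000 0010 1000 0100' = 0x0284). */
int camera_write_reg(uint8_t reg, uint16_t value)
{
    int ok = 1;
    i2c_start();
    ok &= i2c_write_byte(0xBA);            /* slave address + write bit */
    ok &= i2c_write_byte(reg);             /* register address          */
    ok &= i2c_write_byte((uint8_t)(value >> 8));   /* upper data byte   */
    ok &= i2c_write_byte((uint8_t)(value & 0xFF)); /* lower data byte   */
    i2c_stop();
    return ok;
}

int main(void) { return !camera_write_reg(0x09, 0x0284); }
```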

Table III shows the registers that are relevant for controllingskipping, binning and exposure modes.

Register Address   Register Name          Register Description
R0x022             Row Address Mode       Controls row binning and skipping
R0x023             Column Address Mode    Controls column binning and skipping
R0x01E             Read Mode 1            Controls exposure modes

TABLE III: Particularly relevant camera registers

5) Capturing: Images are captured from the camera sensor in relation to its pixel clock (PIXCLK) output pin: the 12 data bits of each pixel are captured in parallel with every cycle of PIXCLK. The frame valid (FV) and line valid (LV) output pins give information about whether the sensor is transferring valid pixel information or reading from one of the blanking areas. When FV is HIGH, which it will be an integral number of times equal to the number of rows that are read out, the sensor is outputting pixels on rows containing active pixels (though these do not necessarily belong to columns within the active image area). Similarly, when LV is HIGH, the pixels that are output are horizontally aligned with the active image pixels. Thus, when both FV and LV are HIGH simultaneously, the sensor outputs pixel data belonging to a valid image [12]. The 12 bits of data in each pixel are synchronized with the pixel clock and are sent in parallel through DOUT (12 pins), with 12 bits every clock cycle, as seen in Fig. 10.

Fig. 10: Image data readout sequence
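A C model of this gating is sketched below; the sample struct stands in for signals that are really latched by FPGA logic on each PIXCLK edge:

```c
#include <stdint.h>
#include <stdio.h>

/* A 12-bit value on DOUT is kept only when both FV and LV are HIGH. */
typedef struct {
    int      fv, lv;   /* Frame Valid, Line Valid */
    uint16_t dout;     /* DOUT[11:0] */
} pixclk_sample_t;

static int capture(const pixclk_sample_t *s, int n, uint16_t *img)
{
    int k = 0;
    for (int i = 0; i < n; i++)
        if (s[i].fv && s[i].lv)             /* valid active-image pixel */
            img[k++] = s[i].dout & 0x0FFF;  /* keep the 12 data bits    */
    return k;                               /* number of pixels stored  */
}

int main(void)
{
    pixclk_sample_t s[4] = {
        {1, 0, 0x111},   /* blanking column: dropped */
        {1, 1, 0x222},   /* active pixel: kept       */
        {1, 1, 0x333},   /* active pixel: kept       */
        {0, 0, 0x444},   /* between frames: dropped  */
    };
    uint16_t img[4];
    printf("captured %d pixels\n", capture(s, 4, img));   /* prints 2 */
    return 0;
}
```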

The default operational frequency of the CMOS camera is 96 MHz. At the full resolution of 2592x1944 pixels (5 megapixels), the frame rate becomes 14 fps (frames per second). In the ISAAC tracking system, the highest master clock frequency of the oscillator being considered for the experiment is 32.768 MHz (refer to Table I), and thus the frame rate at maximum resolution would be approximately 4.7 fps, i.e. just above one image per revolution of the FFUs at their initial spinning rate. However, in order to increase the frame rate, the resolution may be lowered by performing skipping, binning or a reduction of the field of view. For example, at a resolution of 640x480, the frame rate becomes approximately 41 fps [12].
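Since the frame rate scales linearly with the master clock, the scaled rate can be checked with a few lines of arithmetic; the 14 fps reference value is the data sheet figure quoted above:

```c
#include <stdio.h>

/* Frame rate scaling: data sheet rates are specified at 96 MHz, so at
 * 32.768 MHz each rate is multiplied by 32.768/96 (about 0.34). */
int main(void)
{
    const double f_ref   = 96.0e6;   /* data sheet reference clock [12] */
    const double f_isaac = 32.768e6; /* highest clock considered here   */
    const double fps_full_ref = 14.0;/* 2592x1944 at 96 MHz [12]        */

    printf("scale factor : %.3f\n", f_isaac / f_ref);
    printf("full-res rate: %.1f fps\n", fps_full_ref * f_isaac / f_ref);
    return 0;   /* prints roughly 4.8 fps, matching the text's ~4.7 */
}
```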

6) Color Filter: The LEDs emit light with a wavelength peak within [655, 670] nm [6]. This means that a vast part of the spectrum contributes to the noise in the camera images. Therefore, a bandpass color filter with its peak at 660 nm and a 10 nm bandwidth is used [15]. It is mounted together with the camera optics on the RxSU.

C. Sun Sensor

The main component of the sun sensing system is a Silonex SLCD-61N2 photodiode. When subjected to incident light, it generates a current proportional to the intensity of the light, in accordance with the photovoltaic effect [16]. This generated current is converted to a voltage change over a load resistor, which is then amplified using the LM7321 operational amplifier from Texas Instruments and digitized in the MAX11617 ADC from Maxim Integrated [17]. The acquired data are then used in the tracking algorithm in every stage of the data gathering process. Figure 11 shows the schematics of one such sun sensor circuit.

The particular photodiode used has a spectral sensitivity of λs = 0.55 A/W and an area of A = 21.4 mm² [18]. Together with the fact that the sun radiates approximately Isun = 1300 W/m² at the Earth, and that the current is converted to a voltage drop over a 100 Ω resistor, one would expect a maximum voltage change of

ΔU = λs · A · Isun · Rload = 0.55 · 21.4 · 10⁻⁶ · 1300 · 100 ≈ 1.53 V (8)
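A trivial numerical check of (8):

```c
#include <stdio.h>

/* Expected peak voltage over the load resistor when the photodiode
 * faces the sun directly, per eq. (8). */
int main(void)
{
    const double lambda_s = 0.55;      /* A/W, spectral sensitivity [18] */
    const double area     = 21.4e-6;   /* m^2, photodiode area [18]      */
    const double i_sun    = 1300.0;    /* W/m^2, solar irradiance        */
    const double r_load   = 100.0;     /* ohm, load resistor             */

    printf("dU = %.2f V\n", lambda_s * area * i_sun * r_load); /* 1.53 V */
    return 0;
}
```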

Fig. 11: Sun sensor circuit

D. Angular Rate Sensor

The L3G4200D three-axis digital gyroscope from STMicroelectronics is used. It has a customizable measuring range of ±250/±500/±2000 dps and a selectable bandwidth of 100/200/400/800 Hz. The sampling rate can be set to 100/200/400/800 Hz. The gyroscope is controlled using the I2C interface [19].

IV. IMPLEMENTATION

A. Schematics and Layout

The tracking PCB on the RxSU needs to handle several different interfaces: communication with the three sensors (the CMOS camera header board and the two ADCs which convert the analog sun and angular rate sensor signals into digital signals), writing of data from every instance of the tracking sequence to flash memories, execution of the image pre-processing and control algorithms, and actuation of the motors which control the RxSU rotation and mirror tilt angle. Due to physical limitations in the number of I/O pads on the FPGAs, and because of their limited computational capacity, four FPGA chips are used, configured as follows:

• FPGA 1a is responsible for communicating with and collecting data from the CMOS camera. It writes to and reads from the registers in order to change configurations regarding binning, row skipping and the digital gain of the pixels, for example. It also writes each captured image to a flash memory, where it is saved for post-flight processing. Lastly, it sends the image data to FPGA 2.

• FPGA 2 executes the pre-control processing of the captured image. Each image has a certain number of areas-of-interest (hotspots) in which the LED source might be visible. FPGA 2 locates and takes note of these particular areas and sends information about them to FPGA 3. The purpose of this is to reduce the amount of data being handled in FPGA 3. Because this algorithm cannot be run continuously as the pixel data are being received from FPGA 1a, there needs to be a buffer from which FPGA 2 can read once the entire image has been received. This buffer is a RAM connected to FPGA 2.

• FPGA 1b is responsible for communicating with and collecting data from the sun and angular rate sensors and for writing these data to a flash memory. It also pre-processes the data from the sun sensors as described in Section II-B1. Once finished, the angular rate and sun location data are sent to FPGA 3.

• FPGA 3 uses the hotspot data from FPGA 2 and the sensor data from FPGA 1b to execute the control algorithm that was described conceptually in Section II-C. The results are stored on a flash memory. Also, logic pulses are sent to current-controlled circuits which power the rotation and mirror tilt angle stepper motors.

Fig. 12 shows a conceptual layout of the FPGA configuration of the RxSU tracking PCB.

The schematics of the PCB are drawn in Mentor Graphics, a CAD software. Once finished, the actual layout of the PCB is created, i.e. the specific components and their wiring are routed on the board. The PCB designs are then sent to an external company which manufactures the board, with wiring and solder pads. The components are then soldered by hand at SPP.

Fig. 12: FPGA configuration of the RxSU tracking PCB. The FPGAs with dashed borders are those executing algorithms, and those with non-dashed borders acquire data.

B. FPGA Programming

The communication with the camera is handled by two modules called Camera Core and Camera Controller, as seen in Fig. 13. The Core is responsible for the direct communication with the camera sensor through the I2C protocol, while the Controller gives information to the Core about what data to send and when to send them. The arrows indicate in which direction each signal is sent internally, and Table IV explains them.

Fig. 13: Camera Core and Controller

Initially, the Core and the Controller are in an idle state in which the serial output pins (SCL and SDA) connected to the camera are both HIGH. The Controller then initiates the communication by setting the CoreReturn signal (viewed as a trigger) to HIGH. The Controller also sends the slave address (SlaveAddress), the number of bytes to write (NumberOfBytes) and then the first byte to write (through DataOut). The slave address holds information about which component to communicate with and also about whether a read or a write sequence is to be executed. For example, in the case of a write sequence to the camera sensor, the 0xBA slave address is used [12]. The NumberOfBytes signal indicates how many bytes are to be written, and because the camera registers consist of two bytes, this is set to 0x02 (refer to Section III-B4). The Controller then waits for the Core to communicate with the camera sensor, and because it possesses information about which state the Core is in (through the I2CStateIn signal), it knows when the Core is finished. The Controller then sends the second byte to write to the register through DataOut. Because the Core knows that only two bytes are to be written, it writes the last byte and thereupon ends the I2C communication.

Signal Name           Signal Description
CLK                   External clock
Reset                 External reset signal
CoreReturn            Signals to the Core when to start
SlaveAddress          Gives the slave address of the camera to the Core
DataIn/DataOut        Gives data to send to the camera to the Core
NumberOfBytes         Sets the number of bytes which are to be written or read by the Core
StateOut/I2CStateIn   Information from the Core about which state it is in
SCL                   Serial clock to the camera
SDA                   Serial data to the camera

TABLE IV: Camera Core and Controller signals

C. Sun Sensor

Using the I2C interface, data are read out from the MAX11617 ADC to FPGA 1b. The ADC has 10 independent input pins, of which four are used for the photodiode circuits while the others are not connected. All sensors are sampled simultaneously, upon which the data are sent into the control system, which attempts to align the RxSU with the sun in the ways described in Section II-B1.

D. Angular Rate Sensor

Again, using the I2C interface, data are read out from the gyroscope and into FPGA 1b. Once the sensor is powered on, its settings are configured by the FPGA. Then the gyro begins sampling data at a rate which can be configured in a register in the sensor. The FPGA (which is the I2C master in this case) enters a loop in which it monitors a status register for a 'data-ready' bit. If such a bit is encountered, all 7 bytes of angular rate data are read out. An eighth byte, containing IDs and such, is appended to the data. After this, the data are complete and can be written to the flash memory and passed on to FPGA 2 [10].
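A sketch of this polling loop in C is shown below; the slave address, register addresses and ID byte are placeholders rather than L3G4200D data sheet values, and the I2C helpers are stubs faking a sensor (the real loop is FPGA firmware):

```c
#include <stdint.h>
#include <stdio.h>

#define GYRO_ADDR  0x68   /* hypothetical 7-bit slave address */
#define REG_STATUS 0x27   /* hypothetical status register     */
#define DATA_READY 0x08   /* hypothetical 'data-ready' bit    */
#define REG_DATA   0x28   /* hypothetical first data register */

static uint8_t i2c_read_reg(uint8_t slave, uint8_t reg)
{
    static int polls = 0;
    (void)slave; (void)reg;
    return (++polls >= 3) ? DATA_READY : 0;   /* ready on the 3rd poll */
}

static void i2c_read_burst(uint8_t slave, uint8_t reg, uint8_t *buf, int n)
{
    (void)slave; (void)reg;
    for (int i = 0; i < n; i++) buf[i] = (uint8_t)i;  /* dummy rate data */
}

void gyro_poll(uint8_t frame[8])
{
    /* Busy-wait until the sensor flags new data... */
    while (!(i2c_read_reg(GYRO_ADDR, REG_STATUS) & DATA_READY))
        ;
    /* ...read the 7 bytes of angular rate data... */
    i2c_read_burst(GYRO_ADDR, REG_DATA, frame, 7);
    /* ...and append the eighth byte containing IDs before storage. */
    frame[7] = 0xA5;   /* placeholder ID byte */
}

int main(void)
{
    uint8_t frame[8];
    gyro_poll(frame);
    printf("frame[7] = 0x%02X\n", frame[7]);
    return 0;
}
```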

V. TEST PROCEDURES

Below are descriptions of three tests conducted in order to assess the performance of the different components of the ISAAC tracking system.

A. Sun Sensors

One desires information about the performance of the sun sensing system when subjected to sunlight from different angles, θ, in the plane of the RxSU. Therefore, a quantitative test was performed. The test was set up to simulate the real scenario of the ISAAC experiment as closely as possible: four photodiode circuits of the kind described in Fig. 11 were placed on the circumference of a circle with a radius of 120 mm (which matches the dimensions of the ISAAC-Rx) [4]. The sensors were placed evenly spaced according to Fig. 3, each sensor offset by 90° from its neighbors. The reference frame was set so that θ was zero in the direction of the sun. A screen of plain, white paper was installed at θ = 180° at a distance of 500 mm from the center of the circle, so as to somewhat imitate albedo effects.

Measurements were carried out by noting the voltage output from each of the sensors at angles θ : 0° → 350° with increments of 10°.

B. LED Distance

The tracking solution of the RxSU depends almost entirely on how well it is possible to capture images of the LED mounted onto the TxSU. The goal of this test was to give an overview of, and to investigate, at what distances one LED was visible to the camera.

Using demonstration PC software from Aptina Imaging, which enables simple image capturing through a USB header board, a sequence of images was captured at various distances.

Fig. 14: Hotspot tracking test image

The full pixel array was read out, as opposed to using the binning and skipping methods discussed in Section III-B2. The PC software was set to convert all pixel intensities to digital signals using the same gain setting. The LED was then located manually in each image, upon which the brightnesses of its pixels were noted and the brightness of the background was subtracted, so as not to skew the comparison between the different images.

C. Hotspot Algorithm

In order to understand how well the concept of finding hotspots in images (described in Section II-C1) functions, a test program was developed. The program was written in C# and implements the concept, which is then tested on sample images, both generic ones and images captured from the camera. The program scans through the images and outputs information about which hotspots it has found, along with the pixels belonging to them.

Fig. 14 shows an example image that was created using Paint.NET. This image has some bright areas which, depending on the threshold chosen, could be considered to be hotspots.

Fig. 15 shows an image captured with the camera sensor, containing an LED at a distance of 340 m. The white circle indicates the location of this LED.

VI. RESULTS

A. Test Results

1) Sun Sensors: The plot in Fig. 16 illustrates the results of the measurements described in Section V-A. In this figure, θ corresponds to the angle between the sun and SS1 in Fig. 3.


Fig. 15: Hotspot tracking test image. The white circle indicates thelocation of the LED.

Fig. 16: Sun sensor intensity readouts

2) LED Distance: Fig. 17 shows a fitted graph of the measured data points. The fitted curve has the following equation:

f(x) = a · x^b (9)

The results of the curve fitting were the following:

a = 7.363 · 10⁹, b = −2.851 (10)
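For reference, evaluating the fitted model at a few distances shows how quickly the measured LED brightness (in the test's uncalibrated pixel-brightness units) falls off:

```c
#include <stdio.h>
#include <math.h>

/* Evaluate the fitted brightness model f(x) = a * x^b from (9)-(10). */
int main(void)
{
    const double a = 7.363e9, b = -2.851;   /* fit results, eq. (10) */
    for (double x = 100.0; x <= 500.0; x += 100.0)
        printf("f(%3.0f m) = %.1f\n", x, a * pow(x, b));
    return 0;
}
```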

3) Hotspot Algorithm: Table V shows how the number of hotspots found in each image (Fig. 14 and Fig. 15) depends on the threshold value chosen, using the test program described in Section V-C. The threshold value ranges from 0 to 1, where 0 corresponds to black and 1 to white.

Image                   Threshold value   Hotspots found
Generic (Fig. 14)       0.8               3 (all white)
                        0.7               5 (2 white, 2 grey, 1 mixed)
From camera (Fig. 15)   0.9               1 (the LED)
                        0.85              2
                        0.8               271

TABLE V: Number of hotspots found depending on threshold value

Fig. 17: LED distance dependency results

B. Project Outcome

The following sections outline the final outcome of this bachelor thesis project in terms of its results.

1) Hardware: As for the hardware, the goal of the thesis project was to construct a sun sensing system and a tracking PCB that could handle the incoming data from the sensors. Also, the two existing sensor systems, namely the angular rate sensor and the CMOS camera, were to be tested and incorporated into the tracking system.

The sun sensor system has been constructed and its functionality has been tested (refer to Section VII-A1 for a discussion of the results). A test PCB was designed according to Fig. 11 and has been used throughout the course of the project. A more space-efficient PCB will be designed to fit better on the skin of the RxSU.

As for the tracking PCB, the schematics have been designed as specified in Section IV-A. Each component has been chosen and all the schematic wiring has been routed. However, the actual layout has not been drawn, and because of this, no PCB has been ordered.

The angular rate sensor inherited from the MUSCAT team has been studied on a conceptual and theoretical level. Its incorporation into the control algorithm has been developed. However, it has not actually been tested for the purposes of the ISAAC tracking system.

2) Software and Firmware: Included in the scope of the software development part of the thesis project was developing the signal communication between the onboard sensors and the FPGAs. Also, a tracking algorithm was to be designed to control the tracking.

The software that has been developed is:
• The I2C register communication with the camera sensor, which controls settings like binning, skipping, exposure modes and such. (Refer to Section III-B.)
• Storage of image data from the camera sensor to the on-chip FPGA memory buffer.
• Reading of data from the on-chip FPGA memory buffer and sending them through a COM interface for viewing on a connected computer.
• A test program to demonstrate the concept of finding hotspots.


• A conceptual design of the control algorithm.

The software that has yet to be developed is:
• An optimized hotspot finding algorithm, written in C, that can be executed by an embedded FPGA core.
• Data gathering procedures for the sun sensors and the angular rate sensor.
• The implementation of the tracking algorithm as described by the concept in Section II-C2.

VII. DISCUSSION

A. Test Results and Performance

1) Sun Sensor: As stated in Section II-B1, the intensity readout should assume a periodic behavior according to Fig. 18.

Fig. 18: Theoretical sun sensor voltage readouts

Again, θ is the angle between the sun and SS1 as described in Fig. 3. The graph shows the intensity readouts in the case of no albedo effects being present. The calculations were carried out using the circuit design shown in Fig. 11 and eq. (8).

As can be seen in this figure, one expects readouts between 0.4 V and 2.9 V. The lower limit was chosen so as not to get signals too close to the supply rail of the operational amplifier, because such signals would not be amplified at all. This rail distance is amplifier-specific (0.3 V in the case of the LM7321 amplifier, which was used for the sun sensors). However, the margin to the upper rail was underestimated: it should also have been at least 0.3 V, as specified by the LM7321 data sheet. In the setup used for the test, the rail-to-rail distance was 3.2 V, with ground being the lower limit. Therefore, with a maximum output of 2.9 V, one would expect the amplifier to clip somewhat. This phenomenon can indeed be seen in Fig. 16, where the signal clips at just about 2.9 V. This means that the signal would have reached a higher peak than the figure shows, had the setup been made more correctly. The results are therefore quite poor in that sense (but not completely useless in others, as can be seen below). They are, however, easily corrected, time allowing, by either increasing the supply voltage of the operational amplifier or lowering the feedback gain.

From Fig. 18, one expects there to be a 'gap' between the intensity peaks, in which the intensity maximum of the sun falls precisely in between two sensors. This would present problems with data processing because of the closeness to the noise (i.e. albedo) floor in these 'blind spots'. This appears to be confirmed by Fig. 16: had the signal not been saturated, it seems that the gaps would have been present as per the theoretical supposition.

One way of circumventing this problem would be to use a larger number of photodiodes (at the expense of geometrical space on the FFU as well as increased complexity in the processing) so as to narrow these gaps.

At the other end of the readouts in Fig. 16, the albedo effects do indeed affect the results at low sun intensities by increasing the noise floor. This was expected, but seems compensable given the relatively strong signals at greater intensities.

2) LED Distance: The curve in Fig. 17 predicts visibility at a distance of well over 500 meters (depending, of course, on how one chooses to define an LED being visible, in terms of numbers of bright pixels and such). However, the method is rough, and effort has to be put into constructing an algorithm which detects the LED in a more accurate manner. Furthermore, weather and other measuring conditions were not optimal on the occasion: sunlight fell upon the LEDs in such a way that the brightness of the background was unnecessarily elevated (which it would not be in the real experiment, due to the sun sensor system and the direction of ejection). Also, the camera lens did not perform very well in focusing on the LED at the furthermost distances, which also affects the results. The latter is solved in the ISAAC experiment by using a different lens.

Geographical conditions did not allow for measurements at greater distances than those shown in Fig. 17. Such measurements would have provided useful information for the tracking solution, because the maximum operating distance between the FFUs has yet to be definitively decided upon by the ISAAC team. Should the spectroscopy require a distance larger than 500 meters, then the only indicator of such a distance being suitable for the tracking is the (perhaps somewhat poorly) fitted curve in Fig. 17.

3) Hotspot Algorithm: The testing on the generic image (Fig. 14) shows that the developed program, following the concept described in Section II-C1, is indeed able to successfully find all the hotspots in the image (above a certain threshold). It also considers differently bright but geometrically connected areas that are above the threshold to belong to the same hotspot (the grey and the white), as desired.

The same result is achieved when testing the image from the camera (Fig. 15). If the threshold is high enough, only the hotspot corresponding to the LED is found. If the threshold is set to a lower value, other bright areas are also considered to be hotspots.


When using the concept of this algorithm in the real tracking system (according to the theory in Section II-C), there are a number of problems to address. Since the computational power will be limited (refer to Section IV-A) and the localization of the LEDs has to be quick, the algorithm may have to be optimized. It also needs to be rewritten (in a programming language other than C#) and compiled in order for it to be executable on an FPGA.

Furthermore, the threshold cannot realistically be set so as to only find the hotspot which corresponds to the actual LED, because the risk of not finding any LED at all would be considerable. A more sensible way of processing the hotspots is to set the threshold slightly lower, thus finding more hotspots, which can then be processed by the control algorithm as described in Section II-C.

B. Project Outcome

1) Hardware: As discussed in Section VII-A1, the test conducted concerning the sun sensing system was perhaps not as conclusive as one would desire. This does indeed affect the outcome of the hardware implementation of said system, because it has not yet been definitively determined whether the four photodiodes that were decided upon will suffice (even though, as mentioned, the results are reassuring).

The tracking PCB was, as mentioned, not completely designed as far as the actual layout is concerned. This is, of course, a shortcoming from the perspective of the thesis project. However, one must take into account the fact that the tracking system constitutes a small part of ISAAC as a whole. When it comes to the PCB layouts, the Mechanical team at ISAAC had yet to specify the amount of space available on the RxSU for the tracking, power and IR spectroscopy PCBs. This led to the progress of the Electrical team coming to a halt at the point of designing layouts. The schematics are the only part which has been completed.

What remains to be done with respect to the hardware goals stated in Section VI-B1 mainly concerns the characteristics of the angular rate sensor. One desires to determine its responsiveness to changes in rotation speed as well as the accuracy of its readouts. Another important thing to determine, which serves as an interface between hardware and software, is the sampling rate at which the sensor collects angular rate data. This would be important in several aspects of the control algorithm described in Section II-C.

2) Software and Firmware: The part of the software development scope concerning the signal communication with the onboard sensors has proceeded quite well, both in the sense that the camera can be controlled by setting registers and in that data have been successfully read out from the pixel array to an external computer. As for the data gathering from the sun and angular rate sensors, which has yet to be developed, much of the heritage code from previous projects can be used. This is why less effort has been put into this work, in contrast to the camera sensor, which is a new kind of hardware for the REXUS teams at KTH.

The tracking algorithms (the hotspot locating as well as the control algorithm) are still at the concept stage. The hotspot algorithm is written in C# for the x86 architecture. However, it would have been preferable to write it directly in C and then test it on the actual FPGA hardware. There are two reasons why this was not done. Firstly, the time needed to develop the software and then test it would have been out of range for the project. It was therefore decided to develop it in C#, which provides a much shorter development time and enabled a proof of concept of the functionality. Secondly, executing C code on an FPGA requires an embedded core, which has yet to be implemented by ISAAC.

The main reason that the algorithms were not finished was that the E3 group did not possess enough expertise and experience at the beginning of the project. Much time was spent learning about hardware configuration and design, as well as about the development tools and the programming languages needed for developing the software.

VIII. CONCLUSIONS

The goal of the ISAAC project has been briefly explained. The specifics of the ISAAC tracking system have been presented; the functionality of the different subsystems, as well as the concepts of the tracking algorithms, have been described. The electronics of each subsystem have been described in further detail. Choices of hardware and their electronic implementation have been outlined. Strong emphasis has been put on describing the CMOS camera, as far as its electronic characteristics and its signal interfaces are concerned. The actual implementation has been described in terms of layouts and data communication.

Tests concerning the sun sensing system, the distance dependency of the LED intensity, and the image processing have been presented. Despite being somewhat problematic in their execution in some cases, they serve as proofs of concept for three important subsystems in the tracking system.

At the end of the project, it remains to put together a fully functioning tracking system. However, the different sub-systems used for the tracking system have been thoroughly explored. The hardware implementation needs finalizing, and the control algorithm needs coding and testing.

ACKNOWLEDGEMENTS

We would like to thank our supervisor, Dr. Nickolay Ivchenko, Associate Professor at the Department of Space and Plasma Physics at KTH, for his invaluable support and knowledge. Also, our warmest thanks go to the rest of the students in the ISAAC team, together with whom we have struggled so hard to make a reality of our dreams of space (or, at least, of the middle atmosphere).

REFERENCES

[1] M. Fittock, H. Hellmann, A. Stamminger, O. Persson, M. Inga, H. Page, M. Pinzer and A. Kinnaird, "REXUS User Manual," 2012.

[2] R. A. Akmaev and V. I. Fomichev, "Cooling of the mesosphere and lower thermosphere due to doubling of CO2," Ann. Geophysicae, vol. 16, pp. 1501–1512, 1998.

[3] V. S. Kostsov and Y. M. Timofeyev, "Mesospheric carbon dioxide content as determined from the CRISTA-1 experimental data," Atmospheric and Oceanic Physics, vol. 39, no. 3, pp. 322–332, 2003.

[4] G. Balmer, A. Berquand, S. Bidgol, V. De Roos, V. Granberg, V. Grigore, J. Jansson, E. Lundkvist, M. New-Tolley, J. Pacheco Labrador, M. Silberstein Hont and Y. Yuan, "ISAAC Student Experiment Documentation (SED)," 2013.

[5] K. Krogh and E. Schreder, Attitude Determination for AAU CubeSat. Department of Control Engineering, Aalborg University, 2002.

[6] LED Engin, Inc., "High Efficiency Deep Red LED Emitter, LZ1-00R200," 2012.

[7] O. Larusdottir, D. Kristmundsson, A. Myleus, E. Lozano, M. Bordogna, L. Fidjeland, A. Haponen, A. Hou, B. Oakes, M. Lindh and M. Fjallid, "MUSCAT Student Experiment Documentation (SED)," 2012.

[8] N. Apazidis, Rigid Body and Analytical Mechanics. Department of Mechanics, KTH, 2010.

[9] H. Takenaka et al., "Angular rate detecting device," patent US 5 239 868, Aug. 24, 1992.

[10] M. Fjallid and M. Lindh, "Data Acquisition System in the MUSCAT Experiment," in Bachelor Thesis Project Proceedings, School of Electrical Engineering, KTH, 2012, pp. 227–240.

[11] Microsemi Corporation, "Automotive ProASIC3 Flash Family FPGAs," 2013.

[12] Aptina Imaging Corporation, "MT9P031 - 1/2.5-Inch 5Mp CMOS Digital Image Sensor," 2005.

[13] D. P. Mitchell and A. N. Netravali, "Reconstruction filters in computer graphics," SIGGRAPH Comput. Graph., vol. 22, no. 4, pp. 221–228, Jun. 1988.

[14] Aptina Imaging Corporation, "MT9P031 - Register Reference," 2010.

[15] Edmund Optics, Inc., "Filter Capabilities, Part No. 67-076," 2010.

[16] C. Kittel, Introduction to Solid State Physics, 8th ed. John Wiley & Sons, Inc., 2005.

[17] Maxim Integrated Products, Inc., "Low-Power, 4-/8-/12-Channel, I2C, 12-Bit ADCs in Ultra-Small Packages," 2012.

[18] Silonex, Inc., "SLCD-61N2 Solderable Planar Photodiode," 2012.

[19] STMicroelectronics, "L3G4200D: three-axis digital output gyroscope," 2011.