
UNIVERSITY OF CALIFORNIA, DAVIS

UAV Camera Stabilization for Multispectral Crop Imaging

MAE 276 Final Project

Kevin Brouwers, Kellen Crawford, Matthew Klein

3/18/2013

Spectral imaging of plants allows one to establish a relationship between leaf temperature (gathered via proximal infrared sensing) and Stem Water Potential (SWP). Using this information, one may be able to analyze the water demand of a field of crops. Dr. Shrinivasa Upadhyaya's research group in the Biological Systems Engineering Department at UC Davis has developed a UAV outfitted with a spectral camera to allow remote collection of spectral images for near real-time analysis of crop irrigation demand. Here, a team of three students from Dr. Michael Hill's Data Acquisition and Analysis course has experimentally investigated the current issues with the camera stabilization control algorithms in order to maximize the number of usable images captured.


INTRODUCTION

LEARNING OBJECTIVES:

1. Develop hands-on ability to use computers for data acquisition and control.

2. Develop a technical understanding of computer architecture, characteristics of transducers, hardware for laboratory applications of computers, and fundamentals of interfaces between computers and experimental equipment.

3. Develop a technical understanding of programming techniques for data acquisition and control, and basic data analysis.

VISION:

Technological advancements continue to increase productivity and efficiency across the commercial market. Humans can now do more and produce more with fewer resources than ever before. Agriculture is one field which, while generally slower to respond to technology, has both incredible potential and necessity for advancements in this area. Though many farmers are portrayed as stubborn and resistant to change, advancements such as GPS-operated combines, a myriad of fruit harvesters, and a host of other implements replacing human hands have been wholeheartedly embraced and have driven production to new levels. As a result, the agricultural sector continues to feed an exponentially increasing population with fewer and fewer hands dedicated to that work.

Current research and development is paving the way for the next groundbreaking advancement in agriculture: the many applications of unmanned aerial vehicles (UAVs). UAVs are notorious for their surveillance capabilities and have drawn plenty of criticism for their domestic application in that field. Not surprisingly, the Federal Aviation Administration (FAA) has placed vast restrictions on the operation of UAVs in U.S. airspace. A growing awareness of the potential applications of UAVs, however, has begun to open some of that airspace, and the agricultural sector will be one of the biggest beneficiaries.

Much of the current information in the agricultural database, including domestic crop acreage, productivity, irrigation resources, and the effect of weather patterns on crops, comes from remote sensing data derived from either satellites or high-altitude aircraft. Such assets depend largely on multispectrometers to measure reflectance data in the visible and infrared spectra. Simple indices derived from ratios of different reflectance bands contain much more information about the status of a crop than any visual inspection ever could.


In its current utilization, however, this information is at too large a scale, too infrequent, and too dependent on weather conditions for most farmers to act on. By deploying this multispectral capability on a small, easily operable UAV, this wealth of information can be made available to a vast array of farmers at a very precise scale, with real-time information, opening up enormous possibilities in precision agriculture.

BACKGROUND:

Much of the work in this field is being done at the University of California, Davis, under the direction of Dr. Shrini Upadhyaya. The current model under development is an octocopter, about two feet in diameter, with a multispectral camera mounted on a platform below the guts of the aircraft, as pictured in Figure 1 and Figure 2.

Figure 1 - UC Davis UAV

Figure 2 - Camera platform


The platform has two axes of rotation, pitch and roll, and is designed to maintain a constant attitude, pointed straight down at the ground, independent of the attitude of the UAV frame. It does so using an inertial measurement unit (IMU), in this case an ArduIMU version 3. The IMU has its own gyroscopes and accelerometers, in three axes, which feed into a software-driven control loop.

The UAV is designed to simply fly over a field of interest, using pre-programmed GPS coordinates, and take images every couple of seconds for the duration of the flight. After the flight, all of the images (typically a set of several hundred) are fed into a software image-stitching program to create a mosaic image of the entire field. Before these images are used in the mosaic, however, a lab technician must examine each image and toss out any that are distorted. This is necessary because many of the images, around 20% in most sets, are distorted in a way that warps what the field actually looks like. A typical distorted image is depicted in Figure 3.

Figure 3 - Typical distorted image

If the above image were fed into the mosaic, it would end up distorting the entire mosaic. The analysis of these images relies heavily on the spatial information of the vegetation, so distortions such as this one can't be tolerated. The result is a lab technician spending hours sifting through hundreds of images obtained during a ten-minute flight, a very inefficient use of time.

PROPOSAL:

The purpose of this inquiry is to determine the cause of these distorted images and, if possible, to fix it. This will save hours of post-processing time and allow for a much more streamlined analysis of the images, resulting in more immediate feedback to the end user. The desired final product will be a system as close to the original as possible, with the distorted-image issue resolved. The end product needs to be as similar to the original as possible for a smooth transition back to the end user, who will be using this system very shortly to take more images during this year's growing season.

APPROACH:

EXPERIMENTAL INVESTIGATION, SOFTWARE:

Going into this project, the only thing the team really knew about the problem was that some of the pictures taken during these flights are distorted. Upon inspection of the setup, there are three possible sources of the distortion: the camera is faulty and does not properly scan some images; the flight conditions are simply too dynamic, with the entire airframe being moved during the brief time of exposure; or the control loop of the camera platform is inadequate. The control loop contains both hardware components and a software proportional-integral (PI) control algorithm.

Roughly 20% of the previous year's images were characterized as distorted. A more concrete rate of distortion is unknown, because the user simply tossed out the distorted images without making a detailed record of each data set. As a result, the team first had to quantify the problem by establishing a "distortion rate" for the original setup. Even before that could happen, however, the team had to identify the conditions under which the distorted images appear. If the distortion can be reproduced in a lab environment, it allows for much closer control of the variables affecting the images.

After the problem is quantitatively defined, each potential source of the distortion will be isolated following the basic logic illustrated in Figure 4.


Figure 4 - Approach to determining source of distortion

EXPERIMENTAL INVESTIGATION, HARDWARE:

The IMU controller runs C code created in the standard Arduino Integrated Development Environment (IDE). C libraries and the architecture laid out in the IDE provide access to the onboard gyroscopes and accelerometers as well as the output signal lines from the board. For our purposes, a loop in the code checks the platform orientation from the gyroscopes and accelerometers and, through a proportional-integral (PI) control algorithm, adjusts the positions of the servo motors. As a first step in characterizing the performance, we developed a diagnostic procedure to examine the control board's output signals to the servos.
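For context, a minimal sketch of this kind of PI attitude-hold loop is shown below. The gains, pin assignment, loop period, and readRollAngle() helper are hypothetical stand-ins, not the team's actual ArduIMU code:

```cpp
// Minimal sketch of a PI attitude-hold loop (hypothetical values).
#include <Servo.h>

Servo rollServo;
float integral = 0.0;               // accumulated roll error (deg*s)
const float KP = 8.0, KI = 2.0;     // illustrative PI gains
const unsigned long LOOP_MS = 25;   // control period (rate discussed later in the report)
unsigned long lastRun = 0;

// Hypothetical fused gyro/accelerometer roll estimate, in degrees.
float readRollAngle() { return 0.0; }

void setup() {
  rollServo.attach(9);              // assumed servo signal pin
}

void loop() {
  if (millis() - lastRun < LOOP_MS) return;  // pace the control loop
  lastRun += LOOP_MS;

  float error = 0.0 - readRollAngle();       // target: camera level
  integral += error * (LOOP_MS / 1000.0);
  float cmd = KP * error + KI * integral;    // PI correction, degrees

  // Map roughly +/-45 deg of correction onto the 1000-2000 us range.
  int us = constrain(1500 + (int)(cmd * 500.0 / 45.0), 1000, 2000);
  rollServo.writeMicroseconds(us);
}
```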

MODELING AND SIMULATION:

In addition to the experimental investigation conducted in the lab under static conditions, a dynamic system model was developed to study the dependence of the camera stabilization quality on the physical design parameters. The roll axis is of primary interest in this dynamic model, due to a resonance between the camera platform and the upper UAV body through the rubber bushings designed to damp vibrations between the UAV frame and the camera platform.

Model development is critical to creating a controller for dynamic systems. A model allows for controller design through simulations that can be performed very quickly, making efficient use of resources by minimizing design time and the cost of building prototypes. Here a multi-body dynamic model was developed to model the roll and pitch control of the camera gimbal. In the interest of saving time, two planar-motion models were developed to study the dynamics of the roll and pitch control independently, as opposed to creating a three-dimensional model. The servo that controls the pitch angle of the camera is a direct-drive connection and acts about the actual pitch axis of the camera's inertia. The roll control servo, on the other hand, controls the gimbal at a fixed distance from the roll axis of the camera's inertia, and this causes a reaction moment felt back at the upper UAV body. This moment must pass through the bushings that attach the camera platform to the UAV frame. Due to the low stiffness of these bushings, a particular control input can cause a resonance between the UAV body and the Landing Gear body. Attention will be paid to the effect of changing the bushing stiffness in order to improve the controller performance. Additionally, a design revision could be made so that the roll servo acts on the camera's roll axis inertia directly; in the current configuration, a long arm connects that servo to the camera platform and acts as a moment arm. This alteration could improve performance while allowing the bushings to be kept for their intended cushioning. Figure 5 portrays the roll axis model that was developed and how the physical system was reduced to a simple multi-body system of three main components.

Figure 5 - Roll axis model development. Overlaid on the actual UAV studied here.

Figure 7 is a sketch of the model developed for the roll axis. System parameters such as distances, masses, and moments of inertia are applied. They were measured by creating a CAD model of the pertinent components, an orthogonal view of which is provided in Figure 6.

Figure 6 - CAD model of UAV

The 3-D modeling, using estimates of material densities, allowed for an estimate of the mass and moment of inertia of the various components of the system. Creating the CAD model also yielded all of the necessary model dimensions.

There are three main bodies in this model: 1) the UAV body, which has a prescribed input angular velocity; 2) the Landing Gear body, which hangs from the UAV body and is attached through an angular spring modeled with a cubic stiffness profile, giving the spring low stiffness for about 10 degrees of displacement and then becoming relatively stiff at full bushing compression; and 3) the Camera body, which consists of the hanging arm and the camera mounted at the bottom. In Figure 7 the blue dots show the C.o.G. of each body and the white dots portray the joints of rotation. The upper joint is the bushing joint and the lower joint is the pivot between the Landing Gear body and the Camera body.


Figure 7 - Sketch of the model with proper coordinates and parameters applied.


RESULTS

IDENTIFYING THE PROBLEM:

Initial observations of the camera platform, with its control loop powered on to keep it pointed straight down, immediately identified a couple of potential sources of distortion. Even while the UAV was sitting stationary, the control loop was making slight adjustments to the attitude of the platform. Common sense suggests that, since the UAV is not moving, the IMU should not be adjusting for anything and should remain motionless once it finds its proper attitude. These adjustments were being made with relative consistency at a frequency on the order of 1 Hz or less. Seeing as the camera was programmed to take pictures about every two seconds, it was highly possible that the servo made one of these adjustments in the brief moment an image was being scanned. Furthermore, the servos had a tendency to shudder every few seconds, which caused the entire platform to shake very slightly. This too was a possible source of distortion. These were the initial observations with the UAV in a static configuration.

In the dynamic world, a couple of other potential problems were discovered. To begin with, the roll servo had a much longer moment arm transferring its motion to the platform than did the pitch servo. As a result, the roll servo's movements, including the shuddering issue mentioned above, were amplified through that moment arm. Additionally, there are four rubber bushings connecting the frame of the UAV to the lower camera platform which, when subjected to a moment from the roll servo's long moment arm, induced a substantial oscillation in the camera platform that took about 1.5 seconds to damp out.

REPRODUCING DISTORTED IMAGES IN LAB ENVIRONMENT:

The first step in the diagnosis process was to attempt to reproduce distorted images in a static lab environment with no flight disturbances. It's important to remember that the image in Figure 3 was taken during flight, when the vehicle was subjected to a much more dynamic environment than would be created in a lab setting. Again, for the sake of simplicity and control of experimental variables, the system was set up in a static lab environment with the main body of the UAV stationary. Set up at a height of about one foot, using a piece of engineering paper as a visual target to help identify image distortion, the team powered on the UAV in its original configuration. Capturing images about every two seconds, 50 images were taken and analyzed, looking for the same distortion that was prevalent in the field data. Of these 50 images, nine were clearly distorted, a distortion rate of 18%. An example of one of these distorted images, compared to a normal image, is provided in Figure 8.

Figure 8 - Distorted image on left compared to a normal image on right.

The waviness seen at the bottom of the left image is almost certainly the result of a movement of the camera during the brief moment the image was being scanned. Being a digital camera, it scans the image from left to right, top to bottom. As a result, the camera is much more sensitive to sideways movements, because there is a comparatively large time difference between the top pixels and the bottom pixels. In the current configuration, this sideways movement corresponds to the pitch axis. By tying into the IMU via its serial port, the acceleration and gyroscopic data that drive the PI control loop of the platform were made available; they are presented in Figure 9.


Figure 9 - Original system's IMU output: roll and pitch acceleration (ay, ax), angular rate (gyroy, gyrox, degrees/sec), and roll and pitch servo signals (microseconds), each plotted against time.

This data provided a baseline of the camera platform motion and may be used for comparison when changes are implemented. A few key observations to note in this data are the relative differences in magnitude between the roll and pitch acceleration and angular velocity, the amount of noise in the acceleration and gyroscopic data, and the large number of seemingly erroneous signals being sent to the servos. In the original code, the noise is canceled out by taking an average of every 32 data points. There is clearly much more noise in the roll acceleration than in the pitch, which is probably a manifestation of the longer moment arm of the roll servo mentioned previously. The servo signals generated by the software range from 1000 to 2000 microseconds, 1000 being fully clockwise and 2000 being fully counterclockwise. Since the UAV is stationary, the ideal output for both servos should be a straight line with a slope of zero. Depending on the attitude at which the UAV happens to be sitting, both signals should also be right around 1500. The general slope of each signal is a manifestation of the drift correction in the original code. This aspect of the code warrants attention, since there is a marked drift in the roll direction, but it will not be part of this analysis, because the team is chiefly concerned with image distortion. What is worth mentioning is the apparent fluctuation in the servo signals, as if the code can't quite decide between two different servo positions and ends up jumping back and forth around the general value it needs to hold. At about 60 and 70 seconds, for example, the signals take a fairly large dip before recovering to a value closer to where they should be. This is a telltale sign of over-control.

CAMERA ISOLATION:

The team determined there was enough of an issue with distorted images in a static lab environment to take the first step in diagnosing the problem: isolating the camera. This simply consisted of cutting power to the servos, effectively removing all control aspects of the system. In the same environment in which the distorted images were recorded, the team again took 50 pictures for visual inspection and found no distorted images in the set. This was conclusive enough to rule out the camera as a major source of distortion.

SERVO ISOLATION:

As mentioned previously, one of the first observations made was the tendency of the servos to shudder every couple of seconds, with enough movement to noticeably move the platform. To determine whether this shuddering was causing the distorted images, the control code was hardwired to keep the servos stationary. This removed the PI control aspect of the system but kept power to the servos, allowing them to shudder. Again, 50 images were taken in the same environment, and, though the shuddering was quite apparent, there were no distorted images in the set. This ruled out the servo shuddering as a major contributor to the distorted images.

CONTROL LOOP:

Eliminating the camera and the servos as major sources of distortion left the control loop of the platform as the main culprit. As previously mentioned, the control loop consists of several hardware components and a PI control software loop. In the static test environment, the hardware components identified as potential problems were immediately eliminated from the equation. To begin with, the soft bushings connecting the UAV frame to the platform were not receiving any force inputs from a moving UAV frame, so they were not part of the transfer function of the control loop. Also, the distortion in the images is a function of pitch movement, as presented in Figure 8, so the amplifying effect of the long moment arm of the roll servo can be ignored for this analysis. That leaves the software of the control loop, and the signals it was sending to the servos, as the most likely source of distortion. Since the UAV was stationary, there was no issue of lag in the system, so it really came down to a matter of over-control.

The first concern with the code was the highly irregular and erroneous signals visible in Figure 9. These are clearly outliers from the rest of the signals, perhaps generated by a glitch in the software, and should be disregarded. To address this, a simple filter was written into the code to ignore signals that fell outside the range of the servos.
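A minimal sketch of such a range filter, assuming the out-of-range command is simply replaced by the last valid one (the report does not say how rejected samples were handled):

```cpp
// Sketch of the out-of-range servo command filter (hypothetical names).
const int SERVO_MIN_US = 1000;
const int SERVO_MAX_US = 2000;
int lastGoodCmd = 1500;   // start at the neutral position

// Pass a commanded pulse width through only if it is physically
// meaningful for the servo; otherwise repeat the last valid command.
int filterServoCommand(int us) {
  if (us < SERVO_MIN_US || us > SERVO_MAX_US) {
    return lastGoodCmd;   // discard the erroneous sample
  }
  lastGoodCmd = us;
  return us;
}
```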

Figure 10 - Filtered signals

Though the incomprehensible signals were successfully filtered out, analysis of the IMU data, shown in Figure 10, did not reveal any stark differences in the acceleration and gyro data, and the camera trial did not yield any lower distortion than the original code.

The next aspect of the control code inspected was its operating frequency. The code is really just a continuous loop calling several functions, set to repeat every 5 ms, or 200 Hz. In reality, each iteration of the loop took about 10 ms to execute, so the code was actually running at right around 100 Hz. As a result, the code printed out acceleration, gyroscopic, and servo signal data for analysis about every 10 ms. Analysis of the servos, however, revealed that the servos had a constant refresh rate of 20 ms. This meant the control code was sending commands to the servos at twice the rate they were being executed, causing unnecessary digital noise. As a result of this finding, the frequency of the code was lowered closer to the servo refresh frequency. After trying several different frequencies, including 50 Hz, 40 Hz, 30 Hz, and 25 Hz, 40 Hz was found to be optimal, resulting in an image distortion rate of just over 5%. Compared to the original code's distortion rate of 18%, this simple change yielded an almost four-fold improvement. Figure 11 lays out the IMU data from that frequency trial, with the erroneous-signal filter still in place.

Figure 11 - Frequency changed to 40 Hz (same panel layout as Figure 9).

Though a greater difference in the IMU data between the 100 Hz code and the 40 Hz code was expected, the significance of the change lies in the lowered distortion rate.

To get to the bottom of the distortion, the team needed a way to tie the IMU data to the time when the image was being scanned. That way, the acceleration and gyro data during a distorted image scan could be analyzed instead of having to look at general system behavior. In its original configuration, the camera capture and control loop were independent of each other, so there was no way to synchronize the two sources of data. All that was known was a window of 2-3 seconds in the IMU data where each image was being captured.


When dealing with a code that updated every 25 ms, and a camera that took about 5 ms to scan, a 2-3 second window is relatively large. To solve this dilemma and allow for better analysis of the distortion, the team retrofitted the camera to be triggered by the control loop. Due to the limitations of the camera, an image could only be taken at a 4-second interval. As the purpose of this alteration was to examine distorted images, the original parameters of the code were run, at 100 Hz, with the only addition being the few lines of code having to do with triggering the camera. The resulting image set yielded zero distorted images. Though this was not expected and removed the possibility of performing the originally planned analysis of the distorted images, a simple solution was stumbled upon, one which allowed for greater insight into the mechanics of the system, as discussed in the conclusion.
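A minimal sketch of such a loop-driven trigger; the pin, pulse length, and counter values are assumptions, since the report only says a few lines of code were added:

```cpp
// Sketch of triggering the camera from inside the control loop.
const int CAMERA_PIN = 8;                 // assumed trigger output pin
const unsigned int LOOPS_PER_SHOT = 400;  // 4 s at a 100 Hz loop rate
unsigned int loopCount = 0;

void triggerCameraIfDue() {
  if (++loopCount >= LOOPS_PER_SHOT) {
    loopCount = 0;
    digitalWrite(CAMERA_PIN, HIGH);       // brief pulse fires the shutter
    delayMicroseconds(500);
    digitalWrite(CAMERA_PIN, LOW);
  }
}
```

Because the trigger now fires at a fixed point in the control loop, the IMU samples surrounding each image capture are known exactly.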

CONTROL LOOP DYNAMICS

Though there was not quite enough time to continue the project, the next step is the fine-tuning of the system in its dynamic environment. The equations of motion for this dynamic system were generated by creating a Bond Graph model of the system. A similar model was found in the text Advances in Computational Multibody Systems by Jorge A. Ambrosio: on page 138 (Figure 9 of that text), a double-jointed robot arm is modeled using the Bond Graph technique. Figure 12 shows the Bond Graph drawn for the roll-axis model of Figure 7, representing the UAV camera gimbal.


Figure 12 - Bond graph representing the roll axis model of the UAV and camera gimbal. The actual MTF parameters will be shown in equations later. A program for LaTeX was used to generate this figure.

The 1-junctions with inertial elements (appended with a J) model the rotational dynamics of the three bodies. Connected to those junctions via Modulated Transformers are the 1-junctions that model the translational velocities of each body in both the x- and y-directions. Attached to those are the masses m_d, m_h, and m_b. In order to maintain proper integral causality for all energy-storing elements, additional states were added in the form of capacitances that link the translational inertias to the respective rotational ones. These can be thought of as modeling the translational joint stiffness, and their stiffnesses were calculated by selecting a high natural frequency for the joint. The joint displacements were then monitored and the frequency was modified iteratively until the displacements were at sufficiently low levels relative to the system dimensions. Specifically, the UAV body dimension here is approximately 5 cm long, so the joint displacements were considered acceptable if they were below 0.5 mm, or about 1/100 of the smallest system dimension. Resistive elements were paired with the joint stiffnesses to reduce computational chatter; their values were computed by selecting a value that ensured a critically damped system given the previously computed stiffness. These are all shown as the 1/k_KMM and R_KMM elements in Figure 12. The notation KMM was used because this method of introducing state variables to ensure proper causality is commonly referred to as the Karnopp-Margolis Method, developed by Dr. Dean Karnopp and Dr. Donald Margolis at the University of California, Davis.
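As a sketch of this parameterization (standard second-order relations, with m the effective mass carried by the joint and f_n the chosen natural frequency; these formulas are inferred, not quoted from the report):

$$k_{KMM} = m\,(2\pi f_n)^2, \qquad R_{KMM} = 2\sqrt{k_{KMM}\,m},$$

where R_KMM is the critical damping value corresponding to the stiffness just computed.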

The input flight disturbance is modeled as the flow input Sf: θ1(t), and the KMM was applied here as well, since there would be a causality conflict if a flow input were placed on a 1-junction that also had an inertial element attached. Again, this angular displacement was designed to be small so that the input velocity is very close to the actual velocity of the UAV body at the point where it is applied. The relative velocity between the UAV body and the Landing Gear body is modeled by the 0-junction with the C: 1/k_slop and R: R_slop elements attached; this stiffness/damper pair models the bushings between the UAV body and the Landing Gear. The 0-junction that models the relative velocity between the Landing Gear and Camera bodies has an effort source attached, which models the angular control servo motor. A resistive element was also placed there to model bearing friction; if this is not a realistic component, the friction coefficient can be set to a small enough relative value that it does not significantly contribute to the system.

The coefficients for the Modulated Transformers are not shown in Figure 12, but are listed below in Table 1. The equations relate the translational velocities to the angular velocities of the bodies and the forces to the torques.

Table 1 - Equations that describe the translational velocities as a function of the angular velocities. These are used to derive the Modulated Transformer coefficients for the Bond Graph.
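The table's equations did not survive extraction. Purely as an illustration of the form such relations take (assumed symbols, not the report's actual equations): for a body whose center of gravity lies a distance r from its pivot, at absolute angle θ measured from the downward vertical,

$$\dot{x}_{cg} = r\,\dot{\theta}\cos\theta, \qquad \dot{y}_{cg} = r\,\dot{\theta}\sin\theta,$$

so the MTF moduli are $r\cos\theta$ and $r\sin\theta$; the same moduli map the joint forces back into torques, which keeps each transformer power-conserving.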


A key component of the model development process is performing a proper diagnosis of the model's performance in order to understand whether the results being produced are physically sensible. Determining how to test the model for this "conceptual" validity was a new process learned here. A good first step was to compare the model, in some constrained operation if necessary, to a previously developed analytical model. In this case the system is really just a complicated pendulum with multiple bodies and unusual joints. Thus, the first and second bodies were fixed in position while the Camera body was started from an initially displaced angle, to test whether the results matched those of a simple pendulum with the same system specifications. Figure 13 shows that the model does in fact act as expected: the two lines overlap each other almost exactly, with a small difference due to a small amount of bearing friction left in the system. Figure 14 shows the free response of the Camera body as a function of the bearing friction on the Camera joint. A value of 0.015 N-m-s/rad was chosen, as this provided the closest response to the actual system. This parameterization was performed qualitatively rather than against recorded response data: the physical system was given an input and the number of oscillations before it came to rest was observed. Figure 15 is a plot of the joint forces and displacements that result from the Karnopp-Margolis method. The displacements should be small, and they are below 10^-5 m, which, as discussed earlier, was deemed adequate. This was based on selecting a frequency of 150 Hz for the joint stiffness calculation. Figure 16 is a plot of the free response of the entire system starting at an initial displacement of 10 degrees. The UAV body stays fixed at 10 degrees due to the KMM spring applied relative to the input angular velocity, which in this case is zero.
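The analytical benchmark for that constrained test is the compound-pendulum equation; with m the Camera body mass, r the pivot-to-C.o.G. distance, and J its moment of inertia about the pivot (symbols assumed here, not taken from the report):

$$J\,\ddot{\theta} + m g r \sin\theta = 0, \qquad \omega_n = \sqrt{\frac{mgr}{J}} \ \ \text{for small angles}.$$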


Figure 13 - Comparison between the analytical model for a simple pendulum and a constrained version of the camera gimbal roll axis model.

Figure 14 - Testing the base oscillations as a function of the bearing friction coefficient.


Figure 15 - Testing an initial 10 degree displacement on all three components to see the response. (a) Restraint forces in the joints, (b) joint displacements, (c) relative angular displacement of the bushing joint.

Figure 16 - Plot of the UAV body, Landing Gear body and Camera body angles when starting from a 10 degree displacement.


After completing the initial conceptual validation of the model, PID control gain tuning was started. This was tested for step, ramp, and sinusoidal inputs. Figure 17 plots the response of the bodies after starting from an initial angular displacement of 10 degrees. The right plot of Figure 17 shows the torque required by the servo in order to achieve the response shown on the left. The servo torque is modeled to saturate at a maximum of 5.2 kg-cm in each direction, per the manufacturer's specifications. The left plot in Figure 17 is the same case as Figure 16, except that here the servo torque is applied to the system. The Camera body ("Base Angle" in the plot legend) drops to zero degrees in about 0.25 seconds, as opposed to the free response's 2.5 seconds, a ten-fold improvement.
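As an illustration of how such a saturation limit enters the simulated controller (gains hypothetical; the 0.51 N-m bound is simply 5.2 kg-cm converted to SI):

```cpp
// Sketch of a saturating PID control torque, as used in the simulation.
#include <algorithm>

const double TAU_MAX = 0.51;   // N-m; 5.2 kg-cm converted to SI units
double integ = 0.0, prevErr = 0.0;

double controlTorque(double err, double dt,
                     double kp, double ki, double kd) {
  integ += err * dt;                      // integral of the angle error
  double deriv = (err - prevErr) / dt;    // derivative of the error
  prevErr = err;
  double tau = kp * err + ki * integ + kd * deriv;
  return std::clamp(tau, -TAU_MAX, TAU_MAX);  // servo saturation limit
}
```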

Figure 17 - Response of the system with the servo motor controlling at the Camera body joint, from an initial angular displacement as in Figure 16. (a) Angular response of the three bodies. (b) The control torque required to achieve the response.

Figure 18 shows the response to a constantly ramping input angular velocity on the UAV body, starting from zero initial angular displacement for all of the bodies. At about 0.4 seconds the control torque reaches its peak limit, which corresponds to a jump in displacement for the Camera body, as seen in the angle plot of Figure 18. Figure 19 shows the response to a sinusoidal input angular velocity, and Figure 20 the response to a constant angular velocity step input.


Figure 18 - Response for a ramping angular velocity input. Notice that the control torque saturates briefly at the maximum of 5.2 kg-cm, a limit placed on the control based on the manufacturer's specifications for the servos used here.

Figure 19 - Response to a varying sinusoidal input angular velocity.


Figure 20 - Response to a step input angular velocity.

A frequency analysis was performed in order to understand the sensitivity of the controller performance to the flight disturbance input frequency. Input frequencies of 1, 10, 100, and 1000 radians per second were applied. It is observed in Figure 22 that the controller performs well for the first two input frequencies and begins having trouble above 100 radians per second. The controller can properly stabilize the camera up to about a 6 Hz input frequency before the platform accelerations and velocities exceed the levels at which a non-distorted image can be captured.


Figure 21 - Camera Body Response as a function of the bushing stiffness.

Figure 22 - A frequency analysis was performed to test the control capability as a function of frequency. An arrow highlights the angular response of the Camera body for four different frequencies. The angular displacement stays low for all inputs; the velocities, however, do not.


HARDWARE DIAGNOSTICS

The other essential task in optimizing the system is characterizing the communication lines inside the control loop. The servo motors, as mentioned before, are mostly a "black box" transfer function: they are simply fed a signal and put out an angular displacement. The lower servo is connected directly to the camera base plate, controlling the pitch movement, and the upper servo is connected through the linkages shown in Figure 23, controlling the roll movement. Using the National Instruments ELVIS DAQ, both servos were monitored simultaneously to assess the performance of the system.

It was found that the UAV uses standard servos positioned in proportion to the duty cycle of a PWM signal. The length of the high pulse translates into a servo command that sets the motor shaft angle: a 1.5 ms pulse held high at 5 V corresponds to the motor's neutral 90° shaft position, while 1 ms and 2 ms pulses correspond to the 0° and 180° maximum shaft angles, respectively. Figure 24 reflects the way the servo commands are refreshed. The data showed that a deeper look into the programming code was needed to explain why the pulse refresh rate was independent of the timing of changes in servo commands within the control code. It turns out that the servo library driving the motors is hard-coded to deliver the signals on its own timer.
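For reference, the pulse-width convention above maps onto the standard Arduino Servo library like this (the pin and usage are hypothetical; as noted, the library refreshes the pulse on its own ~20 ms timer regardless of when commands change):

```cpp
// Sketch of the angle-to-pulse mapping described above.
#include <Servo.h>

Servo pitchServo;

void setup() {
  pitchServo.attach(10);   // assumed signal pin
}

// 0 deg -> 1000 us, 90 deg -> 1500 us, 180 deg -> 2000 us
void commandAngle(int deg) {
  int us = map(constrain(deg, 0, 180), 0, 180, 1000, 2000);
  pitchServo.writeMicroseconds(us);
}

void loop() {
  commandAngle(90);        // hold neutral, as in the "hold" tests
}
```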

Figure 23 – CAD model of the UAV for model parameterization purposes (Servo 1: pitch; Servo 2: roll).


Figure 24 - Servo motor input control signal.

Figure 24 is a slice of the data from the servo signal ("Pulse Stream"). The pulses are spaced at a constant 20 ms period with about a 1.5 ms high time (> 5 V).

Figure 25 - Servo Pulse Signal Integrity.

Figure 25 ("Typical Pulse Characterization") shows the quality of the sampling for a single 1.5 ms pulse. The data typically reflected a fluctuation of around 3 mV for a high signal, which remained high for the duration of the 1-2 ms it was commanded to.


The servo data was sampled for both the roll and pitch signals simultaneously, and data was acquired from tests with the vehicle in various modes of operation. The servo signals were tested with the control loop bypassed within the software and the motors held in their neutral positions, supplied with steady 1.5 ms pulses at the 20 ms pulse interval. Additionally, an open-loop sweep-motion algorithm, in which the motors were driven back and forth through 45° rotations, was used to check that the input and output matched. Closed-loop operation tests were also run, in which the system was forced to correct the camera angle as the vehicle was tilted in the pitch and roll directions.

To better utilize the acquired data and get an idea of how the system was functioning, Visual Basic scripts were written to perform post-processing analysis of the servo signals recorded in LabVIEW. The script scans through the LabVIEW .lvm data files and stores pulse lengths and the corresponding trailing-edge times in spreadsheet columns for plotting. This allows for analysis of the commanded angular positions of the motors versus time. This process provided a diagnostic method for assessing how well the control loop programming was working and gave insight into the functioning of the Arduino code as the study of the system was initiated.
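The team's scripts were written in Visual Basic; purely as an illustration of the same pulse-extraction logic (threshold crossing on the sampled voltage, recording pulse length and trailing-edge time), a C++ sketch might look like the following. The 2.5 V threshold and the sample structure are assumptions, not details from the report:

```cpp
#include <vector>
#include <utility>

// One DAQ sample: time in ms, voltage in V.
struct Sample { double t, v; };

// Walk the sampled waveform and emit (trailingEdgeTime, pulseLength)
// pairs wherever the signal crosses a logic threshold and falls back.
std::vector<std::pair<double, double>>
extractPulses(const std::vector<Sample>& wave, double threshold = 2.5) {
  std::vector<std::pair<double, double>> pulses;
  bool high = false;
  double riseT = 0.0;
  for (const Sample& s : wave) {
    if (!high && s.v >= threshold) {        // rising edge
      high = true;
      riseT = s.t;
    } else if (high && s.v < threshold) {   // falling (trailing) edge
      high = false;
      pulses.push_back({s.t, s.t - riseT}); // edge time, pulse width
    }
  }
  return pulses;
}
```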

Figure 26 - Bypassed control servo commands (Motor Signal Data, Bypassed Control Loop, 1500_hold_Kellen_2channels.lvm: commanded angle in degrees versus time in seconds, Channels 1 and 2).


Figure 26 shows the commanded position of the servo with the system in the "hold" state. For this data, the signal was refreshing a 1.5 ms pulse every 20 ms to hold the servo at its 90° neutral position. The data clearly reflects substantial scatter in the pulse lengths over time. This was suspected as a possible contributing factor to the image distortion problem, since it could reflect significant instability in the accuracy of the servo commands. It was later determined that, regardless of how well the servo motors were or were not being driven, good images could be obtained in this hold state, so further investigation into the source of the scatter was set aside for the purposes of the image quality study.

Figure 27 - Open loop control servo motor command signal (Motor Signal Data, Open Loop Controlled Sweep, JedCode-40kSamples-RollThenPitch.lvm: commanded angle versus time, Channels 1 and 2).

Figure 27 shows the same post-processing method applied to an open-loop sweep of the motors back and forth through 45°.


Figure 28 - Controlled displacement response (Motor Signal Data, Controlled Displacement Response, SS50-40kS.lvm: commanded angle versus time, Channels 1 and 2).

Figure 28 shows the closed-loop response of the original code as the vehicle frame was tilted, forcing the control code to correct the camera position. As in Figure 26, in both Figure 27 and Figure 28 there is considerable scatter and fluctuation in the command signals about the line of action, even though the motors should be receiving smooth motion commands. As with the hold signal, this deficiency in the motion command signal data was not isolated: it could be a problem with the sampling, it could quite possibly originate in the control software, and it could be hardware related. Short of a hardware limitation, acquiring what should be smooth motion curves from the command signals may require additional signal conditioning, a higher sampling rate, or modifications to the motion control programming code. Since it was determined that the motor control was not affecting the image quality, this question was deferred for future investigation.


CONCLUSIONS

Static System Image Distortion

The original system had too much random error, which ended up manifesting itself in the form of distorted images. The largest uncontrolled variable was the rate at which the camera captured images. In its original continuous capture mode, the time difference between images was nominally about 2 seconds; because the system waited for each image to be compiled and stored on the onboard memory card, the time between images actually varied between 2 and 2.5 seconds. By assigning a concrete time difference between images, through the incorporation of the camera trigger in the control loop, that randomness was eliminated. With the code running at 40 Hz and the servo refresh rate a constant 50 Hz, the servo movement and the part of the code triggering the camera are in phase every 4th control loop. Of those control loops in phase with the servo movement, every 160th actually triggers the camera. This bounds the likelihood that the servo will be moved during the time of image capture. Making the assumption that servo movement during the time of capture accounts for the image distortion, this equates to a distortion rate of 0.16%. With neither the time nor the computing power to record and analyze the number of images it would take to verify this figure, the results of the final variation of the code, with the camera trigger encoded, will have to be sufficient.
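The expression for that likelihood was lost in transcription. One reconstruction that does reproduce the stated figure, assuming the ~5 ms scan time and 20 ms servo period quoted earlier, is

$$\left(\frac{5\ \text{ms}}{20\ \text{ms}}\right) \times \frac{1}{160} \approx 0.16\%,$$

though this is an inference from the surrounding numbers rather than the report's own derivation.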

Dynamic Model

The primary interest in creating the model for the roll axis was to understand whether an actual camera stabilization improvement could be made by stiffening the bushings on the UAV. Figure 21 shows the Camera body response as a function of the bushing stiffness. A bushing stiffness corresponding to a natural frequency above 30 Hz provides adequate resistance against camera disturbance for the ramp-type input placed on the UAV body at about t = 4 s; for the earlier sinusoidal input it does help, but not dramatically.

Another important capability of this model is verifying that the selected servos can provide the torques necessary for proper camera stabilization. Figure 17, Figure 18, Figure 19, and Figure 20 show that imposing a maximum control torque has little effect on the requested torque, as most of the time the necessary torque is well below the maximum.


COLOPHON

The project supported the course learning objectives through its computer-driven approach to a real-life control system.

LEARNING OBJECTIVE 1:

Develop a hands-on ability to use computers for data acquisition and control.

One of the most crucial aspects of this project involved characterizing the electrical signals sent between the various components of a control loop. The team learned how to interface with the ArduIMU to examine and manipulate variables in the control algorithm, as well as how to analyze the generated signals sent to the servos for integrity. In doing so, a great deal was learned about the Arduino coding environment and the LabVIEW software.

LEARNING OBJECTIVE 2:

Develop a technical understanding of computer architecture, characteristics of transducers, hardware for laboratory applications of computers, and fundamentals of interfaces between computers and experimental equipment.

Perhaps the largest benefit from this project was the experience gained with the Arduino environment. Arduino and other open-source interfaces really drive the learning experience through the ability to control what's going on behind the scenes. The only "black boxes" in this project that proved difficult to examine at a fundamental level were the servos. Though their internal control loop was closed off, the team was still able to characterize the signals coming out of them, which proved invaluable.

LEARNING OBJECTIVE 3:

Develop a technical understanding of programming techniques for data acquisition and control, and basic data analysis.

There were two separate methods of data acquisition in this project: IMU data on command through the Arduino script, and non-intrusive analysis of the servo signals. Though very few data acquisition techniques are truly non-intrusive, tapping the signals with a voltage pin did not significantly degrade the signals being sent to the servos and, more importantly, did not involve any alteration to the system. The IMU data, on the other hand, involved manipulating the control code to print those values, and so was a relatively intrusive technique. The minute amount of time added to the control loop to print those values, however, did not push the loop near any threshold of operating outside its original parameters.


APPENDIX A:

ADDITIONAL IMU DATA

Each figure below uses the same six-panel layout as Figure 9: roll and pitch acceleration, angular rate, and servo signal plotted against time.

Figure 1 – Servos hardwired at 1500 (neutral)


Figure 2 – 50 Hz frequency


Figure 3 – 33 Hz frequency


Figure 4 – Camera trigger in control loop – 100 Hz


Figure 5 – Camera trigger in control loop – 40 Hz


APPENDIX B:

CAD MODELS OF THE UAV FOR MODEL PARAMETERIZATION