Pixhawk camera review

Paper: An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications (https://pixhawk.org/_media/modules/px4flow_paper.pdf)

1. Provide a brief summary of the paper.

This paper presents an open source hardware and software design for an optical flow sensor that estimates the motion of a robot. It builds on recent work in autonomous GPS-denied navigation using optical mouse sensors. The authors built the sensor around an ARM Cortex-M4 microcontroller, a CMOS image sensor, a gyroscope for angular compensation, and an ultrasonic sensor for distance scaling. The system achieves a high computation rate by subsampling the image and calculating optical flow over a 64x64 pixel region. A block matching algorithm is used to compute the flow. The ultrasonic sensor measures the distance to the ground, which is used to scale the flow values from pixels to metric units. Rotation also induces optical flow, and this component is removed using the gyroscope readings so that only the flow due to translation remains. A wide-angle lens is incorporated to capture more of the surroundings. A particularly interesting implementation detail is the use of a dedicated imager bus and Direct Memory Access (DMA) with double buffering to speed up computation. A sum-of-absolute-differences (SAD) block matching algorithm estimates the flow at 64 sample points, and each match is refined with half-pixel steps in each direction to find the best match.
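To make the core of this pipeline concrete, the following sketch (in C, the language of the firmware's flow loop) shows SAD block matching for a single sample point, plus the metric conversion using the ultrasonic height and gyro compensation. The block size, search radius, function names, and focal-length parameter are illustrative assumptions, not the actual PX4FLOW firmware code.

```c
#include <stdint.h>
#include <stdlib.h>

#define BLOCK   8   /* assumed block size in pixels        */
#define SEARCH  4   /* assumed +/- search radius in pixels */
#define IMG_W  64   /* width of the subsampled image       */

/* SAD between a BLOCK x BLOCK patch of the previous frame at (x, y)
 * and a candidate patch of the current frame shifted by (dx, dy).
 * Frames are row-major 8-bit grayscale. */
static uint32_t sad(const uint8_t *prev, const uint8_t *cur,
                    int x, int y, int dx, int dy)
{
    uint32_t sum = 0;
    for (int j = 0; j < BLOCK; j++)
        for (int i = 0; i < BLOCK; i++)
            sum += abs(prev[(y + j) * IMG_W + (x + i)] -
                       cur[(y + dy + j) * IMG_W + (x + dx + i)]);
    return sum;
}

/* Full search over the +/-SEARCH window: store the integer-pixel
 * displacement with the lowest SAD in (*best_dx, *best_dy). The
 * half-pixel refinement described in the paper would interpolate the
 * current frame around this minimum and repeat with 0.5 px steps. */
static void match_block(const uint8_t *prev, const uint8_t *cur,
                        int x, int y, int *best_dx, int *best_dy)
{
    uint32_t best = UINT32_MAX;
    for (int dy = -SEARCH; dy <= SEARCH; dy++)
        for (int dx = -SEARCH; dx <= SEARCH; dx++) {
            uint32_t s = sad(prev, cur, x, y, dx, dy);
            if (s < best) { best = s; *best_dx = dx; *best_dy = dy; }
        }
}

/* Convert an averaged pixel flow to a metric velocity. flow_px is
 * the mean displacement over the frame interval dt, rate_rad_s the
 * gyro rate about the axis perpendicular to the flow direction,
 * h_m the ultrasonic height, and f_px an assumed focal length in
 * pixels. Subtracting the gyro term removes rotation-induced flow
 * (small-angle approximation) before scaling by height. */
static float metric_velocity(float flow_px, float rate_rad_s,
                             float h_m, float f_px, float dt)
{
    float rot_px = rate_rad_s * dt * f_px;
    return (flow_px - rot_px) * h_m / (f_px * dt);
}
```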

2. What are its strong points and main contributions?

For real-time applications, a high update rate is needed to track fluctuations in the robot's motion. The low-level block matching algorithm is written in C, with the heaviest computations optimised in assembly, producing a frame rate of 250 Hz at the time the paper was written. Because the software is open source, it is continuously updated and improved; the update rate now stands at 400 Hz, which helps in the tightest manoeuvres. The system presented in the paper has made its mark in the RC hobbyist community: since the hardware and software are open source, the design was picked up by many manufacturers for large-scale production, and it features in the toolkits of many RC pilots. It has brought GPS-denied flight, over a limited range, within reach of the masses. The open source architecture remains a very strong point of the paper, as the hardware and software designs are kept up to date and made available to anyone interested in building them.
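One ingredient of this speed, the double-buffered DMA capture noted in the summary above, can be sketched as a ping-pong scheme in which the DMA engine fills one buffer while the CPU computes flow on the other. The skeleton below is illustrative only; dma_start_transfer and compute_flow are assumed placeholder names, not a real vendor API.

```c
#include <stdint.h>
#include <stdbool.h>

#define FRAME_BYTES (64 * 64)

/* Assumed placeholder prototypes, not a real HAL API. */
void dma_start_transfer(uint8_t *dst, uint32_t len);
void compute_flow(const uint8_t *frame);

static uint8_t frame_buf[2][FRAME_BYTES];  /* ping-pong buffers      */
static volatile int capture_idx = 0;       /* buffer DMA writes into */
static volatile bool frame_ready = false;

/* Hypothetical DMA-complete interrupt: flag the finished frame and
 * immediately restart capture into the other buffer, so the imager
 * never stalls while flow is being computed. */
void dma_complete_isr(void)
{
    frame_ready = true;
    capture_idx ^= 1;
    dma_start_transfer(frame_buf[capture_idx], FRAME_BYTES);
}

void main_loop(void)
{
    for (;;) {
        if (frame_ready) {
            frame_ready = false;
            /* Process the buffer DMA is NOT currently writing to. */
            compute_flow(frame_buf[capture_idx ^ 1]);
        }
    }
}
```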

3. What are its weak points?

The weak points are those traditionally inherent in optical flow based systems, i.e. errors that accumulate over time. Although the system runs at a very high update rate, integration errors still build up, so any system that relies solely on the optical flow sensor will drift over time. This can, however, be offset by adding another sensor (e.g., a vision-based pose estimator) and fusing the readings to get a better pose estimate. The initial version also suffered from strong electromagnetic interference from onboard components. In practice, there are issues when using the sensor in low light and over smooth, reflective surfaces. It is also optimised for use as a downward-facing camera and does not work as well when mounted facing forward. Finally, the operating altitude is limited by the range of the ultrasonic sensor.
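As a rough illustration of such fusion, the sketch below dead-reckons position from the flow velocity at a high rate and blends in an occasional absolute fix with a simple complementary filter; the one-dimensional setup, gain value, and function names are hypothetical, not taken from the paper.

```c
/* Hypothetical 1-D example: integrating flow velocity alone drifts,
 * so a (noisier but drift-free) absolute position fix, e.g. from a
 * vision system, is blended in. ALPHA is an assumed filter gain. */
#define ALPHA 0.98f

static float pos_est = 0.0f;  /* fused position estimate, metres */

/* Called at the flow sensor's high update rate. */
void predict(float flow_velocity_m_s, float dt)
{
    pos_est += flow_velocity_m_s * dt;  /* dead reckoning: drifts */
}

/* Called whenever an absolute fix arrives (much lower rate). */
void correct(float abs_pos_m)
{
    pos_est = ALPHA * pos_est + (1.0f - ALPHA) * abs_pos_m;
}
```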

4. Are the approaches technically sound?

Yes, the approach is technically sound. The authors have taken a long-standing problem in autonomous navigation and brought it to the fore. They have exploited the benefits of low-level microcontroller code well to speed up computation, and this, coupled with a solid algorithm (block matching), makes for a very good sensor architecture. A lot of effort has gone into actually building the hardware rather than just presenting the idea.

5. Comment on the experimental methodology used in the paper (if applicable).

The authors tested their sensor hardware both indoors and outdoors. The sensor was first benchmarked indoors against a Vicon motion tracking system as ground truth. It was then tested indoors for hovering and waypoint navigation under low-light conditions, and finally on outdoor flights over a distance of around 200 m, where its performance was compared to that of a standard optical mouse sensor.

6. How is the organization and presentation of the paper?

The paper takes a methodical, well-organised approach. The authors start with an overview of previous work in the field and its limitations, then go on to explain the theory, the system diagram, the implementation, applications, and results. The paper is written fluidly, explaining each aspect in turn, and there is adequate comparison with other sensors to demonstrate its merit.

7. Describe an idea on how you might improve upon the work reported in this paper, or how it might inspire you to apply the findings into another application, or generalize it to a broader formulation, etc.

This sensor could serve as part of a larger system in which measurements from several sources are combined to get a better estimate of the vehicle's position. I envision an aerial vehicle that can traverse a forest, avoiding trees and other obstacles. Optical flow readings would provide one source of motion estimates, to be fused with other sensor readings and fed to a learning algorithm. For this I would want a front-facing optical flow camera to measure motion relative to obstacles, and would therefore work on making the algorithm perform better in a forward-facing configuration.