
Multispectral Camera
Simon Belkin, Audrey Finken, Grant George, Matthew Walczak
Faculty Advisor: Prof. Mario Parente

Department of Electrical and Computer Engineering

ECE 415/ECE 416 – SENIOR DESIGN PROJECT 2013

College of Engineering - University of Massachusetts Amherst
SDP13

Abstract

Block Diagram

System Overview

Results

Specifications

Acknowledgements


Multi-spectral cameras capture images through special optical filters, essentially band-pass filters, which allow only a certain range of wavelengths to pass through while blocking out the rest.

The multi-spectral camera system is operated by a Raspberry Pi single-board computer. The Pi positions an individual filter in front of a monochrome camera, after which an image is captured, processed, and displayed to the operator.
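A minimal sketch of this capture loop is shown below, assuming the stepper driver's STEP/DIR inputs are wired to two GPIO pins and that a hypothetical capture_image() wrapper exists around the Mightex camera's USB interface; pin numbers, step counts, and function names are illustrative, not the team's actual implementation.

```python
# Illustrative sketch of the filter-wheel control loop (not the team's code).
# Assumes the stepper driver's STEP/DIR inputs sit on GPIO 17/27 and that
# capture_image() wraps the Mightex USB camera API; all names are hypothetical.
import time
import RPi.GPIO as GPIO

STEP_PIN, DIR_PIN = 17, 27           # assumed wiring to the stepper motor driver
STEPS_PER_FILTER = 200               # assumed steps between adjacent filter slots
FILTERS_NM = [425, 436, 450, 510, 750, 860, 990]

GPIO.setmode(GPIO.BCM)
GPIO.setup([STEP_PIN, DIR_PIN], GPIO.OUT)

def advance_one_filter():
    """Pulse the stepper driver to rotate the wheel to the next filter."""
    GPIO.output(DIR_PIN, GPIO.HIGH)
    for _ in range(STEPS_PER_FILTER):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(0.001)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(0.001)

def capture_image(wavelength_nm):
    """Hypothetical wrapper around the Mightex USB camera capture call."""
    return None  # replace with the vendor API call

for wavelength in FILTERS_NM:
    advance_one_filter()             # move the next band-pass filter into place
    time.sleep(0.5)                  # let the wheel settle before exposing
    image = capture_image(wavelength)
    # the image would then be registered, processed, and displayed
```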

We would like to thank:
• Seahorse Bioscience for donating the filter wheel used in this project.
• Akshaya Shanmugam for her help in spectrometer testing.

The system electronically commands and controls the filter wheel assembly and the monochrome camera.

It differentiates between various rocks based on the spectra provided.

• Aligns the pixels of images taken through the different filters
• Project budget is set at $500
• The filter wheel's thickness required optical engineering to determine the necessary focal lengths

The primary goal of the SDP13 Multi-Spectral Camera System is to develop an affordable multi-spectral imaging system that can be installed on a Mars-rover-style vehicle and perform imagery analysis at close to medium distances (0.3-10 m). A secondary goal is to create a useful, relatively affordable multi-spectral imaging system that lets amateur scientists view and learn about their surroundings in a simple, inexpensive way.

Figure: result images captured with no filter and through the 425nm, 436nm, 450nm, 510nm, 750nm, 860nm, and 990nm filters.


Cost Accounting

Raspberry Pi

Getting Images via Image Registration

Item                        Development (R&D) Cost    Production Cost
Filter Wheel                $0.00                     $350.00
Raspberry Pi                $0.00                     $35.00
Mightex USB Camera          $219.00                   $127.02
Pentax Lens                 $99.95                    $57.97
Bi-Convex Lens              $4.00                     $2.32
Adaptor - C-Mount to SM1    $19.75                    $11.46
Lens Tube 0.5"              $12.59                    $7.30
Lens Tube 3.0"              $25.75                    $14.94
Stepper Motor Driver        $14.95                    $8.67
Adaptor - C-Mount M/M       $25.00                    $14.50
425nm Filter                $0.00                     $57.42
436nm Filter                $37.00                    $20.00
670nm Filter                $35.00                    $20.30
750nm Filter                $99.00                    $57.42
860nm Filter                $99.00                    $57.42
990nm Filter                $99.00                    $57.42
Total Part Cost             $785.99                   $899.16

Optics and Ray Tracing

Filter Selection

Shown above are the important wavelengths at which the reflectance values differentiate the rocks of interest; the filters were selected to cover these wavelengths.
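The poster does not state how the band values are compared, but one common approach is to score each pixel's spectrum against reference spectra by spectral angle; the sketch below is illustrative only, and the reference reflectance values are placeholders rather than measured data.

```python
# Sketch of differentiating rocks from the registered band stack using the
# spectral angle to reference spectra; reference values are placeholders.
import numpy as np

BANDS_NM = [425, 436, 450, 510, 750, 860, 990]
# Hypothetical reference reflectances, one entry per band
REFERENCES = {
    "basalt":   np.array([0.05, 0.06, 0.06, 0.08, 0.10, 0.11, 0.12]),
    "hematite": np.array([0.04, 0.05, 0.07, 0.15, 0.30, 0.28, 0.27]),
}

def classify_pixel(spectrum):
    """Return the reference whose spectral angle to the pixel is smallest."""
    spectrum = np.asarray(spectrum, dtype=float)
    def angle(ref):
        cos = np.dot(spectrum, ref) / (np.linalg.norm(spectrum) * np.linalg.norm(ref))
        return np.arccos(np.clip(cos, -1.0, 1.0))
    return min(REFERENCES, key=lambda name: angle(REFERENCES[name]))

print(classify_pixel([0.05, 0.05, 0.07, 0.14, 0.29, 0.27, 0.26]))  # -> "hematite"
```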

Camera calibration results (MATLAB calibration toolbox):
Pixel error = [1.654, 0.9432]
Focal length = (2034.19, 2087.65) +/- [1811, 1886]
Principal point = (405.59, 402.928) +/- [74.77, 235.8]
Skew = 0 +/- 0
Radial coefficients = (2.092, -15.64, 0) +/- [3.722, 54.88, 0]
Tangential coefficients = (0.1224, -0.002745) +/- [0.2361, 0.03374]
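The radial and tangential coefficients above correspond to the standard Brown-Conrady lens distortion model used by the MATLAB calibration toolbox; the short sketch below shows how those coefficients map an undistorted normalized image point to its distorted location (coefficient values copied from the results above, everything else illustrative).

```python
# Sketch of the Brown-Conrady distortion model implied by the coefficients
# reported above (radial k1, k2, k3 and tangential p1, p2); illustrative only.
def distort(x, y, k=(2.092, -15.64, 0.0), p=(0.1224, -0.002745)):
    """Map an undistorted normalized image point (x, y) to its distorted position."""
    k1, k2, k3 = k
    p1, p2 = p
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3     # radial component
    dx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)            # tangential component
    dy = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x * radial + dx, y * radial + dy

# Example: distortion of a normalized point partway toward the image corner
print(distort(0.1, 0.1))
```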

Figure: Complete Distortion Model.
Figure: Image 2 - Image points (+) and reprojected grid points (o).

• Extension tubes were added to accommodate the longer focal length required.
• The thin lens equation, 1/d_i = 1/f - 1/d_o, determines the image distance d_i from the focal length f and the object distance d_o, as in the worked example below.
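A quick numerical check of the thin lens relation at the near end of the 0.3-10 m working range; the 50 mm focal length here is illustrative, not the actual lens specification.

```python
# Worked example of the thin lens equation 1/d_i = 1/f - 1/d_o.
# The 50 mm focal length is an assumed value, not the system's actual lens.
f = 0.050      # focal length in meters (assumed)
d_o = 0.30     # object distance in meters, near end of the 0.3-10 m range
d_i = 1 / (1 / f - 1 / d_o)
print(f"image distance: {d_i * 1000:.1f} mm")   # prints 60.0 mm
```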

Image registration is the process of estimating an optimal transformation between two images. Each picture taken through a filter was transformed to align with a reference image taken without a filter. The software was written in Python so that it runs on the Raspberry Pi.
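The poster does not specify the registration algorithm, so the following is only a sketch of one common feature-based approach in Python with OpenCV (ORB keypoints plus a RANSAC homography) for warping a filtered image onto the unfiltered reference.

```python
# Minimal feature-based registration sketch (OpenCV); the actual algorithm
# used on the Pi is not specified on the poster, so this is illustrative.
import cv2
import numpy as np

def register(filtered_img, reference_img):
    """Warp a filtered image so it aligns with the unfiltered reference."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(filtered_img, None)
    kp2, des2 = orb.detectAndCompute(reference_img, None)

    # Match descriptors and keep the strongest correspondences
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Estimate the optimal transformation (homography) with RANSAC
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = reference_img.shape[:2]
    return cv2.warpPerspective(filtered_img, H, (w, h))
```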


Figure: Radial Component of the Distortion Model.


Figure: Tangential Component of the Distortion Model.

Geometric calibration was performed in MATLAB to remove the geometric distortions introduced by the various filters. The images above were produced by MATLAB during camera calibration; they show the distortion model for images taken through the filters and the estimated camera parameters at each filter position.
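As a sketch of how the MATLAB-estimated intrinsics and distortion coefficients could then be applied to captured frames on the Pi, OpenCV's undistort routine accepts the same parameters; the coefficient ordering below follows OpenCV's (k1, k2, p1, p2, k3) convention and the file name is hypothetical, not part of the team's pipeline.

```python
# Sketch: undistort a captured frame using the MATLAB-estimated parameters
# reported above (illustrative; not necessarily the team's actual pipeline).
import cv2
import numpy as np

K = np.array([[2034.19, 0.0, 405.59],     # fx, 0, cx
              [0.0, 2087.65, 402.928],    # 0, fy, cy
              [0.0, 0.0, 1.0]])
dist = np.array([2.092, -15.64, 0.1224, -0.002745, 0.0])  # k1, k2, p1, p2, k3

frame = cv2.imread("filtered_capture.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
undistorted = cv2.undistort(frame, K, dist)
cv2.imwrite("filtered_capture_undistorted.png", undistorted)
```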