Research Article

Calibration of a Stereo Radiation Detection Camera Using Planar Homography

Seung-Hae Baek, Pathum Rathnayaka, and Soon-Yong Park

School of Computer Science and Engineering, Kyungpook National University, Daegu 702-701, Republic of Korea

Correspondence should be addressed to Soon-Yong Park; sypark@knu.ac.kr

Received 20 February 2015; Revised 3 May 2015; Accepted 4 May 2015

Academic Editor: Hocine Cherifi

Copyright © 2016 Seung-Hae Baek et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
This paper proposes a calibration technique for a stereo gamma detection camera. Calibration of the internal and external parameters of a stereo vision camera is a well-known research problem in the computer vision society. However, few or no stereo calibration has been investigated in radiation measurement research. Since no visual information can be obtained from a stereo radiation camera, it is impossible to use a general stereo calibration algorithm directly. In this paper, we develop a hybrid-type stereo system which is equipped with both radiation and vision cameras. To calibrate the stereo radiation cameras, stereo images of a calibration pattern captured from the vision cameras are transformed into the view of the radiation cameras. The homography transformation is calibrated based on the geometric relationship between the visual and radiation camera coordinates. The accuracy of the stereo parameters of the radiation camera is analyzed by distance measurements to both visual light and gamma sources. The experimental results show that the measurement error is about 3%.
1 Introduction
Detection of radioactive sources is one of the important topics in many research areas, particularly in nuclear instrumentation, nuclear physics, nuclear medicine, and many nondestructive tests [1]. Radiation detectors, also known as particle detectors, are used to detect, track, and identify the presence of radioactive sources within a given area or environment. As a generic definition, radiation is energy that comes from a source and travels through some material or space. It can be found in many different forms, such as alpha, beta, gamma, and X-rays. The detection of sources that emit radiation plays a very significant role in many areas of science and industry. Gamma radiation in particular has attracted the interest of most scientists since the advance of the Anger camera [2].
Not only the detection of radioactive sources is crucial, but the accurate estimation of their three-dimensional (3D) information is also important. If the 3D position information of radioactive (such as gamma) sources is available, their activity (strength) can be estimated more precisely. However, detection of gamma sources which are widely distributed in a 3D environment is not as easy a task to accomplish as it may seem. Determining or estimating the 3D coordinates of such widely spread gamma sources using a single camera is almost impossible. Therefore, in this paper, we use a stereo gamma detection device instead of a mono device to estimate their coordinates precisely.
When detecting and extracting metric information from two-dimensional (2D) radiation images using a stereo gamma camera (SGC), we must consider the calibration of the camera. Proper calibration methods have to be implemented in order to obtain more accurate 3D coordinate measurements. Since accurate calibration of vision cameras is of interest in computer vision, many related works have been published throughout the last few decades [3–7].
Although much work related to the calibration of stereo and mono vision cameras has been introduced, a proper calibration method for an SGC has not been introduced yet. Considering this, we propose a new simplified method to calibrate such SGC devices using planar homography relationships. The experimental system consists of a 1D gamma sensor and a 2D vision sensor, which are mounted on a pan-tilt motion device. Because the gamma sensor is a 1D point
Hindawi Publishing Corporation, Journal of Sensors, Volume 2016, Article ID 8928096, 11 pages, http://dx.doi.org/10.1155/2016/8928096
Figure 1: Obtaining a gamma image by using a gamma detector and a pan-tilt module. (Labels: area scan; vision image (I_L); radiation image (J_R); vision camera; radiation detector; radioactive material; pan; tilt.)
sensor, the pan-tilt device scans the sensor in horizontal and vertical directions to acquire 2D gamma images. In addition, the pan-tilt device also rotates both the gamma sensor and the vision sensor into symmetric positions so that each sensor can obtain stereo images with the same baseline. Therefore, the gamma sensor combined with the pan-tilt device works as a 2D radiation camera, and the vision camera is used to acquire the corresponding 2D visual information.
To measure 3D information from the stereo gamma images, the stereo gamma camera must be calibrated. However, no visual information can be obtained from this camera, so we transform the visual images obtained from the vision camera into the view of the gamma camera. In this way, we can obtain virtual calibration images of the gamma camera and calibrate its stereo geometry. Once the devices are fully calibrated, we can measure the parallax in the stereo gamma images and the distance to the radioactive sources.
The paper is organized as follows. Section 2 introduces the panning and tilting method implemented to generate radiation images. Section 3 describes the method used to calculate the planar homography relationships between gamma camera images and vision camera images. The proposed calibration method is described in Section 4. Section 5 presents the experiments we have performed to measure 3D distances to both visual light and gamma sources. In this section, we evaluate the accuracy of the proposed calibration method by comparing the acquired results with real data. Section 6 concludes the paper.
2 Obtaining Stereo Gamma Images Using a Gamma Detector
The approach to obtaining gamma images is based on scanning a 1D gamma detector using a pan-tilt motion module. Figure 1 shows a simplified diagram of this concept. A 1D gamma detector is mounted on a pan-tilt module which is connected to a general purpose PC. The rotation of the pan-tilt module in the panning and tilting directions is controlled by a terminal program on the PC. The sensing directions of the pan-tilt module are determined according to the resolution of the gamma image. At every sensing direction, the detector obtains the grey value of the corresponding pixel of the image. By obtaining all pixel values, a 2D gamma image is then generated. Therefore, we can consider that there is a virtual 2D gamma camera which can capture a gamma image.
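The scanning loop described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: `aim` and `count_pulses` are stand-ins for the pan-tilt controller and the SPMT pulse counter, and the symmetric field of view is an assumption.

```python
import numpy as np

def scan_gamma_image(rows, cols, fov_deg, aim, count_pulses):
    """Build a 2D gamma image by panning/tilting a 1D detector.

    `aim(pan_deg, tilt_deg)` points the detector; `count_pulses()` reads the
    SPMT pulse count for the current direction. Both are hypothetical
    callbacks standing in for the real hardware interface.
    """
    image = np.zeros((rows, cols), dtype=np.float64)
    # One sensing direction per pixel; the angular step follows the
    # requested image resolution.
    pans = np.linspace(-fov_deg / 2, fov_deg / 2, cols)
    tilts = np.linspace(-fov_deg / 2, fov_deg / 2, rows)
    for r, tilt in enumerate(tilts):
        for c, pan in enumerate(pans):
            aim(pan, tilt)
            image[r, c] = count_pulses()
    # Map pulse counts linearly onto 8-bit grey values.
    if image.max() > 0:
        image = image * (255.0 / image.max())
    return image.astype(np.uint8)
```

In the real system one scan direction per pixel makes acquisition slow, which is why the detector is treated as a *virtual* 2D camera rather than a live one.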
As a gamma detector, an SPMT (small photomultiplier tube) is used with a NaI(Tl) type scintillator. The detection sensitivity of the SPMT is at its maximum at 420 nm. The output of the detector is the pulse count of the photons generated by the SPMT; for more information, see reference [8]. The pulse counts from the SPMT are recorded and mapped to the grey value of a pixel in the gamma image.
A 2D vision camera is also mounted on the opposite side of the gamma detector. The vision camera captures two views of visual information: the left and the right images of the scanning area. The viewing angle of the vision camera is determined by the lens of the camera, and it is usually different from the scanning area of the detector. However, it can be
Figure 2: (a) CAD model and (b) real view of the pan-tilt module and two enclosures for mounting the gamma and vision cameras. (Labels: radiation detector; vision camera; gamma detector; pulse counter; pan part; tilt part.)
Figure 3: Panning and tilting of the sensing devices for obtaining stereo gamma and vision images. (Labels: pan (180°); tilt (180°); gamma camera; vision camera; origins O_L, O_R; vision images I_L, I_R; gamma images J_L, J_R.)
transformed to match the scanning area of the detector by using the calibration parameters of the camera.
Figure 2 shows a CAD model and a real view of the pan-tilt module equipped with two enclosures for mounting the gamma and vision cameras. The pan part of the module rotates the tilt part and the two sensing devices more than 180 degrees. In addition, the tilt part also rotates the gamma detector and the vision camera more than 180 degrees. Figure 3 shows diagrams that explain how stereo images are captured with the two sensing devices. In the beginning, suppose the gamma camera is on the right side of the pan-tilt module and the vision camera is on the left side. Therefore, we can obtain the right gamma image (J_R) from the virtual gamma camera and the left vision image (I_L) from the vision camera. Next, let us rotate the pan and tilt parts by 180 degrees each. Then the origins of the sensor coordinate systems are exactly exchanged, while the image planes of the cameras are rotated 180 degrees with respect to the image center. The images can easily be rotated back to the correct orientation; finally, we obtain the left gamma image (J_L) and the right vision image (I_R) in the final position.
In order to rotate the pan-tilt module accurately, we control the module using very accurate DC motors and a motion controller. The motors are equipped with encoders with a resolution of 2048 pulses per revolution. In addition, the rotation of the pan and tilt module is reduced by a 100:1 gear ratio. Therefore, the rotation accuracy is very high, and we assume the error is less than 0.01 degrees. Table 1 shows the specifications of the pan-tilt module.
Table 1: Specifications of the pan-tilt module.

Category                       Specification
Baseline (|O_L − O_R|)         280 mm
Pan/tilt encoder resolution    2048 pulses/rev
Pan/tilt gear ratio            100:1
Rotation accuracy              <0.01°
Mechanical backlash            <0.1°
Dimensions (W × H × D)         360 × 270 × 200 mm
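A quick arithmetic check of the figures in Table 1: with a 2048 pulses/rev encoder behind a 100:1 gear reduction, a single encoder pulse corresponds to an output rotation far below the stated 0.01° accuracy bound.

```python
# Values from Table 1: encoder 2048 pulses/rev, gear ratio 100:1.
pulses_per_rev = 2048
gear_ratio = 100

# Output-shaft rotation per encoder pulse, in degrees.
deg_per_pulse = 360.0 / (pulses_per_rev * gear_ratio)
print(deg_per_pulse)  # ≈ 0.00176 degrees per pulse

# Consistent with the <0.01 degree rotation-accuracy claim.
assert deg_per_pulse < 0.01
```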
3 Homography Relationship between Gamma and Vision Cameras
The first step of the proposed method is calculating the homography relationship between the two different image planes of the gamma and vision cameras. Currently, two types of gamma detectors are used: pin-hole and coded aperture types [9]. As described in the previous section, our gamma sensor is a pin-hole detector which consists of a cylindrical collimator and an SPMT. Assuming that the gamma and vision cameras follow the pinhole camera model, the image formation between the gamma and vision cameras is related by a homography. In the previous section, we showed that the coordinate origins of the gamma and vision cameras are the same because of the symmetric motion of the pan-tilt module. There are two homography relations between gamma and vision images: one is between the left gamma and
Figure 4: Homography relations between gamma and vision images. (Labels: stereo radiation images J_L, J_R; stereo vision images I_L, I_R; homographies H_L, H_R; radiation detector; vision camera; switching left and right devices.)
Figure 5: Gamma and vision images of four LED lights: (a) calibration setup, (b) gamma images (top: J_L, bottom: J_R), and (c) vision images (top: I_L, bottom: I_R).
vision images (J_L and I_L), and the other is between the right gamma and vision images (J_R and I_R).

Let H_L and H_R be the left and right homographies between the gamma and vision cameras, respectively. As shown in Figure 4, H_L is the homography from I_L to J_L, and H_R is the homography from I_R to J_R. To transform the vision images into gamma images, I_L and I_R are multiplied by H_L and H_R, respectively, as follows; the results are then homogeneously equal to the gamma images:
J_L ≅ H_L I_L,
J_R ≅ H_R I_R.                                      (1)
Because the two sensors' images captured from the same capture position (for example, I_L and J_L) have the same coordinate origin, a 2D homography relation is enough to transform one image plane to the other.
To calculate the planar homography matrix H, we need at least four corresponding image points between the gamma and vision images. This means that we need to know at least four sets of 2D image coordinates of an object which can be captured by both the gamma and vision cameras. Fortunately, the gamma detection sensor is designed to respond to visible light as well. Thus, we can use four point-light sources to find the homography.
Figure 5 shows an example of capturing four corresponding image points between the gamma and vision cameras. Four LED light sources are fixed in front of the cameras as calibration objects, and gamma and vision images of the light sources are obtained. In both the gamma and vision images, the image intensity of the LED sources is higher than that of their neighboring pixels. We apply the Gaussian fitting method [10] to get the exact coordinates of the corresponding points and calculate the homographies H_L and H_R using a linear algebra equation.
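The linear-algebra step can be sketched as a standard DLT (direct linear transform) solve. The paper does not specify its exact solver, so the following is a minimal sketch under that assumption; with H[2,2] fixed to 1, four point pairs determine the remaining eight entries.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 homography H (with H[2,2] = 1) that maps each
    src point (x, y) to its dst point (u, v), from >= 4 correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11 x + h12 y + h13) / (h31 x + h32 y + 1), and similarly v.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)[0]
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pt):
    """Map one 2D point through H (homogeneous normalization included)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

With exactly four non-degenerate correspondences the system is square and solved exactly; with more points (as in the eight-LED setup later) the same call returns the least-squares fit.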
H_L and H_R are 3 × 3 matrices, but their scale is unknown. Therefore, using four pairs of matching points between the gamma and vision images, we can compute the eight unknown
Figure 6: Example of generating calibration pattern images from the view of the stereo gamma camera: (a) left and right vision images and (b) left and right gamma images.
parameters of the homography. Equation (2) shows the homographies H_L and H_R computed from the images in Figure 5:

H_L = [  0.53197       −0.02558       −164.897
        −0.02713        0.57747        −90.443
        −1.7225×10⁻⁴   −1.1717×10⁻⁴      1.0   ],

H_R = [  0.52141       −0.023501      −165.795
        −0.04064        0.52391        −82.434
        −2.0699×10⁻⁴   −2.3451×10⁻⁴      1.0   ].                (2)
Figure 6 shows an example of applying H_L and H_R to the vision images. In Figure 6(a), the left and right vision images I_L and I_R are captured at 640 × 480 resolution. Using the two homography matrices, virtual gamma images J_L and J_R are obtained at 400 × 400 resolution. Each pair of left and right images shares the same coordinate origin; thus, there are only scale and rotation transformations between the vision and gamma images (ideally speaking, there is no rotation transformation if the coordinate axes of the two sensors are parallel).
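Generating a virtual gamma image is a perspective warp by H. In practice a library routine such as OpenCV's `warpPerspective` (with interpolation) would be used; the nearest-neighbour stand-in below only illustrates the inverse mapping, and is not the authors' implementation.

```python
import numpy as np

def warp_to_gamma_view(vision_img, H, out_shape=(400, 400)):
    """Inverse-map each virtual-gamma pixel through H^-1 into the vision
    image (nearest neighbour), since J is homogeneously equal to H I."""
    Hinv = np.linalg.inv(H)
    rows, cols = out_shape
    out = np.zeros(out_shape, dtype=vision_img.dtype)
    # Homogeneous coordinates of every output pixel, row-major order.
    ys, xs = np.mgrid[0:rows, 0:cols]
    pts = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    src = Hinv @ pts
    sx = np.rint(src[0] / src[2]).astype(int)
    sy = np.rint(src[1] / src[2]).astype(int)
    # Keep only the pixels whose preimage falls inside the vision image.
    ok = (0 <= sx) & (sx < vision_img.shape[1]) & \
         (0 <= sy) & (sy < vision_img.shape[0])
    out.reshape(-1)[ok] = vision_img[sy[ok], sx[ok]]
    return out
```

Inverse mapping (looping over output pixels rather than input pixels) is the usual choice because it leaves no holes in the warped image.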
In Figure 7, more light sources are used to calibrate the homography between the two stereo cameras. Images of eight LED light points are captured from both the vision and gamma cameras. The images in Figure 7(a) show the original captured images of the light points from the vision and gamma cameras. The homography relations between the left and right sensors are shown in (3). Figure 7(c) shows the virtual gamma images obtained from the homography relation.
H_L = [ 0.951368      0.012253     −387.186
        0.043855      0.905885     −230.053
        3.1221×10⁻⁴   4.2492×10⁻⁵     1.0  ],

H_R = [ 0.826863      0.025824     −307.124
        0.017759      0.821441     −163.538
        1.9185×10⁻⁴   2.0055×10⁻⁵     1.0  ].                (3)
4 Stereo Calibration of Vision and Gamma Cameras
In the field of computer vision, calibration of a stereo vision camera is a general but important task. Many calibration algorithms have been introduced; currently, Zhang's method [5] is the most widely used. To use Zhang's calibration method, it is recommended to capture at least six images of a calibration pattern, which usually consists of checkerboard patterns.
Figure 7: Example of generating calibration pattern images from the view of the stereo gamma camera using eight LED lights: (a) left and right vision images, (b) left and right gamma images, and (c) virtual gamma images.
Zhang's calibration algorithm is available in several computer vision libraries such as OpenCV and MATLAB. In this paper, we also use the OpenCV computer vision library for calibrating the vision and gamma cameras.
Obtaining images of the calibration pattern from the gamma camera is a main problem of this paper, because the gamma camera cannot capture images of the printed calibration pattern. To solve this problem, we use the homography relationship between the gamma and vision cameras. Once several images I_L and I_R of the calibration board are obtained from the vision camera, these images are transformed by H_L and H_R to generate virtual calibration images J_L and J_R from the view of the left and right gamma cameras. Then the transformed images are used
as input to Zhang's algorithm to obtain the stereo calibration parameters of the stereo gamma camera. The whole stereo calibration process is shown in Figure 8. The intrinsic and extrinsic parameters of both the left and right radiation detectors are given in Tables 2 and 3. Here R_x, R_y, and R_z represent the rotation components, and t_x, t_y, and t_z represent the translation components.
5 Distance Measurement to Radioactive Sources
In order to evaluate the accuracy of the proposed calibration method, we performed distance measurement experiments using the calibrated stereo gamma camera. The main
Table 2: Intrinsic parameters of the left and right gamma cameras.

Camera         Focal length (f_x)   Focal length (f_y)   Principal point (C_x)   Principal point (C_y)
Left camera    1048.01              1044.52              302.31                  297.64
Right camera   1048.92              1044.95              310.61                  302.64
Figure 8: The complete process of stereo gamma camera calibration. Step 1: capture LED lights with the gamma and vision cameras and find the homographies. Step 2: obtain several visual images of a calibration pattern. Step 3: apply H_L and H_R to the left and right vision images to obtain left and right virtual gamma images. Step 4: apply Zhang's method to the virtual gamma images to obtain the left and right gamma camera parameters.
Table 3: Extrinsic parameters of the right gamma camera with respect to the left camera.

R_x        R_y        R_z        t_x        t_y
0.00783    −0.04690   0.00209    −510.3     −11.4
approach here is to calculate distances to the previously used bright LED spots, which are installed at known distances. Two kinds of experimental setups, "P4" and "P8", are used in the distance measurement; they were already shown in Figures 5 and 7. In the first experimental setup, four LEDs are placed at different positions as shown in Figure 5(a). In the second experimental setup, eight LEDs are placed at different positions as shown in Figure 7(a).
The positions of the light spots are measured using a Bosch GLM 250 VF Professional laser rangefinder. After gamma camera calibration, the intrinsic and extrinsic camera parameters can be obtained, and they are used to calculate the left and right perspective projection matrices (PPM). Figure 9
shows left and right stereo gamma images where image processing methods such as adaptive thresholding, contour detection, and ellipse fitting are applied to find the center points ((u, v) and (i, j)) of each LED light spot of the "P4" setup. The essential and fundamental matrices can also be calculated from the camera parameters according to (4), and epipolar geometry is applied to identify the corresponding matching points in the left and right gamma images. These correspondences are represented with identical colors in Figure 9.
E = R [t]_×,
F = K′⁻ᵀ E K⁻¹.                                     (4)

E and F represent the essential and fundamental matrices of the stereo gamma camera, respectively. R [t]_× is the product of the rotation matrix and the skew-symmetric (cross-product) matrix of the translation vector from the extrinsic camera parameters, whereas K′ and K represent the intrinsic parameters of the left and right cameras, respectively. The 3D coordinates W = (X, Y, Z) for
Figure 9: Corresponding matching points in the left and right gamma images of the "P4" setup.
each matching point set can be calculated using the linear equation (5):

A W = Y.                                            (5)

Here A represents a 4 × 3 matrix and Y represents a 4 × 1 matrix, as given below. The inverse of matrix A is calculated using the singular value decomposition (SVD) method:
PPM_L = [ (a^1)^T  a_14
          (a^2)^T  a_24
          (a^3)^T  a_34 ],

PPM_R = [ (b^1)^T  b_14
          (b^2)^T  b_24
          (b^3)^T  b_34 ],

A = [ (a^1 − u a^3)^T
      (a^2 − v a^3)^T
      (b^1 − i b^3)^T
      (b^2 − j b^3)^T ],

Y = [ u a_34 − a_14
      v a_34 − a_24
      i b_34 − b_14
      j b_34 − b_24 ].                              (6)
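Given the two PPMs, the triangulation of (5)-(6) reduces to a small least-squares solve. A minimal sketch (NumPy's `pinv` computes the SVD-based pseudo-inverse the text refers to); the projection-matrix values in the usage below are hypothetical:

```python
import numpy as np

def triangulate(ppm_l, ppm_r, uv, ij):
    """Recover W = (X, Y, Z) from its left/right gamma-image projections
    (u, v) and (i, j) by solving the 4x3 system A W = Y of (5)-(6)."""
    (u, v), (i, j) = uv, ij
    # Rows (a^k)^T are the first three columns of each PPM row; a_k4 the last.
    A = np.vstack([
        ppm_l[0, :3] - u * ppm_l[2, :3],
        ppm_l[1, :3] - v * ppm_l[2, :3],
        ppm_r[0, :3] - i * ppm_r[2, :3],
        ppm_r[1, :3] - j * ppm_r[2, :3],
    ])
    Y = np.array([
        u * ppm_l[2, 3] - ppm_l[0, 3],
        v * ppm_l[2, 3] - ppm_l[1, 3],
        i * ppm_r[2, 3] - ppm_r[0, 3],
        j * ppm_r[2, 3] - ppm_r[1, 3],
    ])
    return np.linalg.pinv(A) @ Y  # pinv is computed via SVD
```

With four equations and three unknowns the system is overdetermined, so the pseudo-inverse returns the least-squares 3D point; with noise-free projections it reproduces the true point exactly.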
Figure 10 shows the "P4" experimental setup. In this setup, we performed two experiments by changing the positions of the four light points, as shown in the figure. Figure 10(a) is the same setup used in the homography calibration. In Figure 10(b), the four LED lights are placed at the same distance from the gamma camera. Using the two kinds of setup, two
Table 4: Distance measurement results of the "P4" experimental setup.

Experiment   Quantity        Point 1   Point 2   Point 3   Point 4   Avg. error (%)   Std. dev.
P4E1         Distance (cm)   210       230       250       270       1.13             1.13
             Measured (cm)   207       228       249       274
P4E2         Distance (cm)   300       300       300       300       2.41             1.59
             Measured (cm)   298       305       287       309
distance measurement experiments, "P4E1" and "P4E2", were performed. As shown in Table 4, the distance measurement is very accurate; the average percentage error is about 1.13% in the "P4E1" and 2.41% in the "P4E2" experiments. For each point light in Figure 10(a), the measurement error is about 1–4 cm at camera distances of 210–270 cm. If the distance to the LED light increases, the measurement error also increases.
We used more LED light points to analyze the accuracy of the distance measurement. Figure 11 shows the second experimental setup, "P8", using eight LED lights fixed at different distances. As shown in the figure, "point1", "point2", "point6", and "point7" are on the same plane. The four other light points are fixed at different distances. With this setup, we performed two experiments, "P8E1" and "P8E2", by placing the gamma camera at different distances from the setup. The ground truth distance to each LED light is shown in Table 5.
Figure 12 shows the left and right gamma images of the "P8E1" experiment. Stereo correspondences between the left and right images are shown with the same colored ellipse models. The center coordinates of the corresponding ellipse models are used to compute the distance to each LED light point. The calibration parameters of these experiments are also derived using the same setup as described in Section 3.
Table 5 shows the results of experiments "P8E1" and "P8E2". The average percentage distance error is about 2%
Table 5: Distance measurement results of the "P8" experimental setup.

Experiment   Quantity        1     2     3     4     5     6     7     8     Avg. error (%)   Std. dev.
P8E1         Distance (cm)   320   320   295   310   330   320   320   345   2.09             1.63
             Measured (cm)   315   314   291   324   328   320   312   360
P8E2         Distance (cm)   300   300   310   325   300   300   275   290   7.71             5.83
             Measured (cm)   312   291   261   272   300   324   298   308
Figure 10: The first experimental setup of the distance measurement using four LED lights: (a) P4E1 and (b) P4E2. (Labels: point1–point4.)
Figure 11: Experimental setup of the distance measurement using 8 LED lights (point1–point8). Two experiments, "P8E1" and "P8E2", are done using the same setup.
in "P8E1" and about 8% in "P8E2", respectively. The measurement error is a little higher than that obtained using four LEDs. This may be caused by using more light points in the
computation of the homography relationship. We assume that the centers of the vision and gamma cameras are the same, so that light points which are not on the plane can also be used for the homography computation. However, in reality, the alignment error between the centers of the two cameras can yield higher error not only in the homography computation but also in the distance measurement.
As the final experiment, we use a real radioactive source to obtain stereo gamma images and measure the distance to the source. Figure 13(a) shows a facility for general purpose experiments using a single radioactive source. In this facility, a Cs-137 radioactive source produces a gamma radiation dose of 100 mSv/h at a 2 m distance. As shown in the figure, the gamma source appears as a point source. We change the position of the stereo gamma camera module from 150 cm to 400 cm and measure the distance to the source using the stereo gamma images captured at each distance.
Figure 13(b) shows the gamma images captured from the stereo gamma camera. The position difference in the stereo images becomes smaller as the distance to the radioactive source becomes farther, due to the parallel stereo geometry. At each distance, the center coordinates of the gamma source are
Table 6: Distance measurement results to the real gamma source.

Distance (cm)   150     200     250     300     350     400     Avg. error (%)   Std. dev.
Measured (cm)   149.9   194.1   249.8   300.4   352.2   410.9   3.29             3.98
Figure 12: Corresponding points in the left and right gamma images of the 8 LED points.
Figure 13: Stereo gamma image acquisition using a real radiation source: (a) experimental facility (gamma detector and point gamma source) and (b) left and right stereo gamma images at measurement distances of 150, 200, 250, 300, 350, and 400 cm.
extracted, and the real distances to the source are calculated as in Table 6. The average percentage error is about 3.29%, which is a reasonably accurate result.
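The shrinking position difference noted above follows from the parallel-stereo relation Z = f·B/d, i.e. disparity d = f·B/Z. The sketch below uses the 280 mm baseline from Table 1; the focal length in pixels is a hypothetical value for illustration only.

```python
# Parallel stereo geometry: disparity d = f * B / Z falls as the source
# moves away, which is why the farther captures in Figure 13(b) show a
# smaller left/right position difference.
f_px = 1000.0   # assumed gamma-camera focal length, in pixels (illustrative)
B_cm = 28.0     # baseline from Table 1: 280 mm

for Z_cm in (150, 200, 250, 300, 350, 400):
    d_px = f_px * B_cm / Z_cm
    print(Z_cm, round(d_px, 1))
```

The computed disparities fall monotonically over the tested range, so a fixed centre-detection error costs a larger fraction of the disparity (and hence a larger distance error) at the far end of the range.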
6 Conclusion
In this paper, we proposed a new method to calibrate stereo radiation detection devices using the planar homography relationship. Because it is impossible to capture calibration pattern images with a gamma camera, we use the planar homography relationship between the gamma and vision cameras to generate virtual calibration images. Bright LED point lights are used as calibration objects, and a gamma camera is used as the radiation detection device. A set of calibration pattern images captured from a vision camera is converted into gamma images by applying the homography relationships. Zhang's calibration method is then applied to calibrate the gamma camera. We performed distance measurement experiments using the calibrated gamma camera to check the accuracy of the proposed method. The 3D coordinates of LED point lights and a real gamma radiation source are obtained with high accuracy using stereo gamma images.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Civil Military Technology Cooperation Center and the Korea Atomic Energy Research Institute.
References
[1] W. Lee and D. Wehe, "3D position of radiation sources using an automated gamma camera and ML algorithm with energy-dependent response functions," Nuclear Instruments and Methods in Physics Research A, vol. 531, no. 1-2, pp. 270–275, 2004.
[2] T. E. Peterson and L. R. Furenlid, "SPECT detectors: the Anger Camera and beyond," Physics in Medicine and Biology, vol. 56, no. 17, pp. R145–R182, 2011.
[3] H. Kwon, J. Park, and A. C. Kak, "A new approach for active stereo camera calibration," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3180–3185, April 2007.
[4] H. Yu and Y. Wang, "An improved self-calibration method for active stereo camera," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 5186–5190, June 2006.
[5] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[6] S.-Y. Park and G. G. Park, "Active calibration of camera-projector systems based on planar homography," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 320–323, August 2010.
[7] G.-Q. Wei and S. D. Ma, "Complete two-plane camera calibration method and experimental comparisons," in Proceedings of the IEEE 4th International Conference on Computer Vision, pp. 439–445, May 1993.
[8] Y. G. Hwang, N. H. Lee, J. Y. Kim, and S. H. Jeong, "The study for the performance analysis of the stereo radiation detector," in Proceedings of the International Conference on Electronics, Information and Communication (ICEIC '15), 2015.
[9] P. T. Durrant, M. Dallimore, I. D. Jupp, and D. Ramsden, "The application of pinhole and coded aperture imaging in the nuclear environment," Nuclear Instruments and Methods in Physics Research A, vol. 422, no. 1–3, pp. 667–671, 1999.
[10] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2004.
2 Journal of Sensors
Area scan
Vision image (IL)
Tilt Tilt
Vision camera
Pan
Radiation detector
Radioactive material
Radiation image (JR)
Figure 1 Obtaining a gamma image by using a gamma detector and a pan-tile module
sensor the pan-tilt device scans the sensor in horizontal andvertical directions to acquired 2Dgamma images In additionthe pan-tilt device also rotates both the gamma sensor andvision sensor in a symmetric position so that each sensor canobtain stereo images with the same baseline Therefore thegamma sensor combined with the pan-tilt device works as a2D radiation camera and the vision camera is used to acquirethe corresponding 2D visual information
To measure 3D information from the stereo gammaimages the stereo gamma camera must be calibrated How-ever no visual information can be obtained from the cameraand we transform the visual images obtained from the visioncamera to the view of the gamma camera In this way wecan obtain virtual calibration images of the gamma camerato calibrate the stereo geometry of the gamma camera Oncethe devices are fully calibrated we can measure the parallaxin the stereo gamma images and distance to the radioactivesources
The paper is constructed as follows Section 2 introducesthe panning and tilting method implemented to generateradiation images Section 3 describes the method used to cal-culate the planar homography relationships between gammacamera images and vision camera images The proposed cal-ibration method is described in Section 4 Section 5 presentsthe experiments we have performed to measure 3D distanceto both visual light and gamma sources In this section weevaluate the accuracy of the proposed calibration methodby comparing the acquired results with real data Section 6consists with the conclusion
2 Obtaining Stereo Gamma Images Using a Gamma Detector
The approach of obtaining gamma images is based on scanning a 1D gamma detector using a pan-tilt motion module. Figure 1 shows a simplified diagram of this concept. A 1D gamma detector is mounted on a pan-tilt module which is connected to a general-purpose PC. The rotation of the pan-tilt module in the panning and tilting directions is controlled by a terminal program on the PC. The sensing directions of the pan-tilt module are determined according to the resolution of the gamma image. At every sensing direction, the detector obtains the corresponding pixel's grey value of the image. By obtaining all pixel values, a 2D gamma image is then generated. Therefore, we can consider that there is a virtual 2D gamma camera which can capture a gamma image.
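The scan loop above can be sketched as follows. The detector interface (`read_pulse_count`) and the synthetic point source are illustrative stand-ins, not the paper's actual driver code; only the scan-and-fill structure follows the text.

```python
import numpy as np

def read_pulse_count(pan_deg, tilt_deg):
    """Stub for the SPMT pulse counter: a synthetic source near (10 deg, -5 deg)."""
    return 1000.0 * np.exp(-((pan_deg - 10.0) ** 2 + (tilt_deg + 5.0) ** 2) / 8.0)

def scan_gamma_image(width=40, height=40, fov_deg=40.0):
    """Scan the 1D detector over a pan/tilt grid and fill a 2D gamma image."""
    image = np.zeros((height, width))
    for row in range(height):
        for col in range(width):
            pan = (col / (width - 1) - 0.5) * fov_deg    # horizontal direction
            tilt = (row / (height - 1) - 0.5) * fov_deg  # vertical direction
            image[row, col] = read_pulse_count(pan, tilt)
    # Map pulse counts to 8-bit grey values, as the paper describes.
    return (255.0 * image / image.max()).astype(np.uint8)

gamma_image = scan_gamma_image()
```

The image resolution (here 40 × 40) directly sets the number of pan-tilt sensing directions, which is why acquisition time grows with resolution.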
As a gamma detector, an SPMT (small photomultiplier tube) is used with a NaI(Tl) type scintillator. The detection sensitivity of the SPMT is maximum at 420 nm. The output of the detector is the pulse count of the photons generated by the SPMT; for more information, see reference [8]. The pulse counts from the SPMT are recorded and mapped to the grey value of a pixel in the gamma image.
A 2D vision camera is also mounted on the opposite side of the gamma detector. The vision camera captures only two kinds of visual information: the left and the right images of the scanning area. The viewing angle of the vision camera is determined by the lens of the camera and is usually different from the scanning area of the detector. However, it can be
Figure 2: (a) CAD model and (b) real view of the pan-tilt module and the two enclosures for mounting the gamma and vision cameras.
Figure 3: Panning (180°) and tilting (180°) of the sensing devices for obtaining stereo gamma images (J_L, J_R) and vision images (I_L, I_R).
transformed to be matched with the scanning area of the detector by using the calibration parameters of the camera.
Figure 2 shows a CAD model and a real view of the pan-tilt module equipped with two enclosures for mounting the gamma and vision cameras. The pan part of the module rotates the tilt part and the two sensing devices by more than 180 degrees. In addition, the tilt part also rotates the gamma detector and the vision camera by more than 180 degrees. Figure 3 shows diagrams explaining how the stereo images of the two sensing devices are captured. In the beginning, suppose the gamma camera is on the right side of the pan-tilt module and the vision camera is on the left side. We can then obtain the right gamma image (J_R) from the virtual gamma camera and the left vision image (I_L) from the vision camera. Next, let us rotate the pan and tilt parts by 180 degrees each. The origins of the sensor coordinate systems are then exactly exchanged, while the image planes of the cameras are rotated 180 degrees with respect to the image center. The images can easily be rotated back to the correct orientation; finally, we obtain the left gamma image (J_L) and the right vision image (I_R) in the final position.
In order to rotate the pan-tilt module accurately, we control the module using highly accurate DC motors and a motion controller. The motors are equipped with encoders with a resolution of 2048 pulses per revolution. In addition, the rotation of the pan and tilt parts is geared down through a 100:1 reduction. Therefore, the rotation accuracy is very high, and we assume the error is less than 0.01 degrees. Table 1 shows the main specifications of the pan-tilt module.
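As a quick sanity check on these specifications, the smallest commandable rotation implied by the encoder resolution and gear ratio is well under the claimed 0.01-degree error bound:

```python
# Angular resolution implied by Table 1: 2048 encoder pulses per motor
# revolution, reduced through a 100:1 gear ratio.
pulses_per_rev = 2048
gear_ratio = 100
deg_per_pulse = 360.0 / (pulses_per_rev * gear_ratio)
print(deg_per_pulse)  # 0.0017578125 degrees per pulse
assert deg_per_pulse < 0.01  # consistent with the <0.01 degree accuracy claim
```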
Table 1: Specifications of the pan-tilt module.

Category                    | Specification
Baseline (|O_L − O_R|)      | 280 mm
Pan/tilt encoder resolution | 2048 pulses/rev
Pan/tilt gear ratio         | 100:1
Rotation accuracy           | <0.01°
Mechanical backlash         | <0.1°
Dimension (W × H × D)       | 360 × 270 × 200 mm
3 Homography Relationship between Gamma and Vision Cameras
The first step of the proposed method is calculating the homography relationship between the two different image planes of the gamma and vision cameras. Currently, two types of gamma detectors are used: pin-hole and coded-aperture types [9]. As described in the previous section, our gamma sensor is a pin-hole detector which consists of a cylindrical collimator and an SPMT. Assuming that the gamma and vision cameras follow the pinhole camera model, the image formation between the gamma and vision cameras is related by a homography. In the previous section, we showed that the coordinate origins of the gamma and vision cameras are the same because of the symmetric motion of the pan-tilt module. There are two homography relations between gamma and vision images: one is between the left gamma and
Figure 4: Homography relations between gamma and vision images (H_L maps I_L to J_L and H_R maps I_R to J_R after switching the left and right devices).
Figure 5: Gamma and vision images of four LED lights: (a) calibration setup, (b) gamma images (top: J_L, bottom: J_R), and (c) vision images (top: I_L, bottom: I_R).
vision images (J_L and I_L), and the other is between the right gamma and vision images (J_R and I_R).

Let H_L and H_R be the left and the right homographies of the gamma and vision cameras, respectively. As shown in Figure 4, H_L is the homography from I_L to J_L, and H_R is the homography from I_R to J_R. To transform the vision images to gamma images, I_L and I_R are multiplied by H_L and H_R, respectively, as follows; the results are then homogeneously equal to the gamma images:
J_L ≅ H_L I_L,
J_R ≅ H_R I_R.    (1)
Because the two sensors' images captured from the same capture position, for example I_L and J_L, have the same coordinate origin, a 2D homography relation is enough to transform one image plane to the other.
To calculate the planar homography matrix H, we need at least four corresponding image points between the gamma and vision images. This means that we need to know at least four sets of 2D image coordinates of an object which can be captured by both the gamma and vision cameras. Fortunately, the gamma detection sensor is designed to respond to visible light. Thus, we can use four point light sources to find the homography.
Figure 5 shows an example of capturing four corresponding image points between the gamma and vision cameras. Four LED light sources are fixed in front of the cameras as calibration objects, and gamma and vision images of the light sources are obtained. In both the gamma and vision images, the image intensity of the LED sources is higher than that of their neighboring pixels. We apply the Gaussian fitting method [10] to get the exact coordinates of the corresponding points and calculate the homographies H_L and H_R using a linear algebraic equation.
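The four-point estimation can be sketched with a standard direct linear transform (DLT). The LED coordinates below are made-up illustrative values, not the measured centers from Figure 5:

```python
import numpy as np

def homography_from_points(src, dst):
    """Direct linear transform: estimate H (3x3, h33 = 1) from >= 4 point
    pairs so that dst ~ H * src in homogeneous coordinates."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)          # null vector of A gives H up to scale
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]                   # fix the free scale so h33 = 1

# Illustrative sub-pixel LED centres (e.g. from Gaussian peak fitting).
vision_pts = [(120.0, 90.0), (500.0, 95.0), (510.0, 380.0), (130.0, 370.0)]
gamma_pts = [(60.0, 50.0), (330.0, 55.0), (335.0, 345.0), (65.0, 340.0)]
H = homography_from_points(vision_pts, gamma_pts)
```

With exactly four point pairs the eight homography parameters are determined exactly; additional pairs (as in the eight-LED setup of Figure 7) turn the same system into a least-squares fit.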
H_L and H_R are 3 × 3 matrices, but their scale is unknown. Therefore, using four pairs of matching points between the gamma and vision images, we can compute the eight unknown
Figure 6: Example of generating calibration pattern images from the view of the stereo gamma camera: (a) left and right vision images and (b) left and right gamma images.
parameters of the homography. Equation (2) shows the homographies H_L and H_R computed from the images in Figure 5:
H_L = [  0.53197         −0.02558         −164.897
        −0.02713          0.57747          −90.443
        −1.7225 × 10^−4  −1.1717 × 10^−4    1.0    ],

H_R = [  0.52141         −0.023501        −165.795
        −0.04064          0.52391          −82.434
        −2.0699 × 10^−4  −2.3451 × 10^−4    1.0    ].    (2)
Figure 6 shows an example of applying H_L and H_R to the vision images. In Figure 6(a), the left and right vision images I_L and I_R are captured at 640 × 480 resolution. Using the two homography matrices, virtual gamma images J_L and J_R are obtained at 400 × 400 resolution. Each pair of left and right images shares the same coordinate origin; thus, there are only scale and rotation transformations between the vision and gamma images (ideally speaking, there is no rotation transformation if the coordinate axes of the two sensors are parallel).
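Generating such a virtual gamma image from a vision image can be sketched as a NumPy-only inverse-mapping warp (a minimal stand-in for a library routine such as OpenCV's `warpPerspective`; the identity-homography example at the end is purely illustrative):

```python
import numpy as np

def warp_to_gamma_view(vision_img, H, out_size=(400, 400)):
    """Resample a vision image into the virtual gamma view J ~ H * I using
    inverse mapping with nearest-neighbour lookup."""
    h_out, w_out = out_size
    Hinv = np.linalg.inv(H)  # maps a gamma pixel back to a vision pixel
    out = np.zeros((h_out, w_out), dtype=vision_img.dtype)
    for v in range(h_out):
        for u in range(w_out):
            x, y, w = Hinv @ np.array([u, v, 1.0])
            xi, yi = int(round(x / w)), int(round(y / w))
            if 0 <= yi < vision_img.shape[0] and 0 <= xi < vision_img.shape[1]:
                out[v, u] = vision_img[yi, xi]
    return out

# Example: with the identity homography the image is simply copied.
img = (np.arange(2500) % 251).astype(np.uint8).reshape(50, 50)
same = warp_to_gamma_view(img, np.eye(3), out_size=(50, 50))
```

In the paper's setting the input would be a 640 × 480 vision image and the output a 400 × 400 virtual gamma image, with H from equation (2).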
In Figure 7, more light sources are used to calibrate the homography between the two stereo cameras. Images of eight LED light points are captured by both the vision and gamma cameras. The images in Figure 7(a) show the original captured images of the light points from the vision and gamma cameras. The homography relations between the left and right sensors are shown in (3). Figure 7(c) shows the virtual gamma images obtained from the homography relation:
H_L = [  0.951368        0.012253        −387.186
         0.043855        0.905885        −230.053
         3.1221 × 10^−4  4.2492 × 10^−5     1.0   ],

H_R = [  0.826863        0.025824        −307.124
         0.017759        0.821441        −163.538
         1.9185 × 10^−4  2.0055 × 10^−5     1.0   ].    (3)
4 Stereo Calibration of Vision and Gamma Cameras
In the field of computer vision, calibration of a stereo vision camera is a common but important task. Many calibration algorithms have been introduced; currently, Zhang's method [5] is the most widely used. To use Zhang's calibration method, it is recommended to capture at least six images of a calibration pattern, which usually consists of checkerboard patterns.
Figure 7: Calibrating the homography using eight LED light points: (a) left and right vision images, (b) left and right gamma images, and (c) left and right virtual gamma images.
Zhang's calibration algorithm is available in several computer vision libraries such as OpenCV and MATLAB. In this paper, we also use the OpenCV computer vision library for calibrating the vision and gamma cameras.
Obtaining images of the calibration pattern from the gamma camera is a main problem of this paper, because the gamma camera cannot capture images of the printed calibration pattern. To solve this problem, we use the homography relationship between the gamma and vision cameras. Once several images I_L and I_R of the calibration board are obtained from the vision camera, these images are transformed by H_L and H_R to generate virtual calibration images J_L and J_R from the views of the left and right gamma cameras. The transformed images are then used as input to Zhang's algorithm to obtain the stereo calibration parameters of the stereo gamma camera. The whole stereo calibration process is shown in Figure 8. The intrinsic and extrinsic parameters of the left and right radiation detectors are listed in Tables 2 and 3. Here, R_x, R_y, and R_z represent the rotation components, and t_x, t_y, and t_z represent the translation components.
5 Distance Measurement to Radioactive Sources
In order to evaluate the accuracy of the proposed calibration method, we performed distance measurement experiments using the calibrated stereo gamma camera. The main
Table 2: Intrinsic parameters of the left and right gamma cameras.

Camera       | Focal length (f_x) | Focal length (f_y) | Principal point (C_x) | Principal point (C_y)
Left camera  | 1048.01            | 1044.52            | 302.31                | 297.64
Right camera | 1048.92            | 1044.95            | 310.61                | 302.64
Figure 8: The complete process of stereo gamma camera calibration. Step 1: capture LED lights and find the homographies between the gamma and vision cameras. Step 2: obtain several visual images of a calibration pattern (left and right images). Step 3: apply H_L and H_R to the left and right vision images to obtain left and right virtual gamma images. Step 4: apply Zhang's method to the virtual gamma images to obtain the left and right gamma camera parameters.
Table 3: Extrinsic parameters of the right gamma camera with respect to the left camera.

R_x     | R_y      | R_z     | t_x    | t_y
0.00783 | −0.04690 | 0.00209 | −510.3 | −11.4
approach is to calculate the distances to the previously used bright LED spots, which are installed at known distances. Two kinds of experimental setups, "P4" and "P8", are used in the distance measurements; they were already shown in Figures 5 and 7. In the first experimental setup, four LEDs are placed at different positions, as shown in Figure 5(a). In the second experimental setup, eight LEDs are placed at different positions, as shown in Figure 7(a).
The positions of the light spots are measured using a Bosch GLM 250 VF Professional laser rangefinder. After gamma camera calibration, the intrinsic and extrinsic camera parameters can be obtained, and they are used to calculate the left and right perspective projection matrices (PPM). Figure 9 shows left and right stereo gamma images in which image processing methods such as adaptive thresholding, contour detection, and ellipse fitting are applied to find the center points ((u, v) and (i, j)) of each LED light spot of the "P4" setup. The essential and fundamental matrices can also be calculated from the camera parameters according to (4), and the epipolar geometry is applied to identify the corresponding matching points of the left and right gamma images. These correspondences are represented with identical colors in Figure 9:
E = R [t]_x,
F = K′^−T E K^−1.    (4)
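Equation (4) is straightforward to evaluate numerically. In the sketch below, R, t, K, and K′ are illustrative values (identity rotation, the 280 mm baseline, and a plausible focal length), not the calibrated parameters of Tables 2 and 3:

```python
import numpy as np

def skew(t):
    """Cross-product (skew-symmetric) matrix [t]_x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

R = np.eye(3)                      # relative rotation (near-identity here)
t = np.array([-280.0, 0.0, 0.0])   # translation, e.g. the 280 mm baseline
K = np.array([[1048.0, 0.0, 200.0],
              [0.0, 1048.0, 200.0],
              [0.0, 0.0, 1.0]])    # assumed intrinsics of one gamma camera
K_prime = K.copy()                 # assumed intrinsics of the other camera

E = R @ skew(t)                                      # essential matrix
F = np.linalg.inv(K_prime).T @ E @ np.linalg.inv(K)  # fundamental matrix
```

For any 3D point visible in both views, the pixel correspondence (x′, x) then satisfies x′ᵀ F x = 0, which is the epipolar constraint used to match the LED spots between the left and right gamma images.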
E and F represent the essential and fundamental matrices of the stereo gamma camera, respectively. R is the rotation matrix and [t]_x is the cross-product (skew-symmetric) matrix of the translation from the extrinsic camera parameters, whereas K′ and K represent the intrinsic parameters of the left and right cameras, respectively. The 3D coordinates W(X, Y, Z) for
Figure 9: Corresponding matching points in the left and right gamma images of the "P4" setup.
each matching point set can be calculated using the linear equation (5):
AW = Y.    (5)
Here, A is a 4 × 3 matrix and Y is a 4 × 1 vector, as given below. The pseudo-inverse of matrix A is calculated using the singular value decomposition (SVD) method:
PPM_L = [ (a^1)^T  a_14
          (a^2)^T  a_24
          (a^3)^T  a_34 ],

PPM_R = [ (b^1)^T  b_14
          (b^2)^T  b_24
          (b^3)^T  b_34 ],

A = [ (a^1 − u a^3)^T
      (a^2 − v a^3)^T
      (b^1 − i b^3)^T
      (b^2 − j b^3)^T ],

Y = [ u a_34 − a_14
      v a_34 − a_24
      i b_34 − b_14
      j b_34 − b_24 ].    (6)
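The triangulation of equations (5) and (6) can be sketched as follows; the projection matrices and the test point are synthetic illustrations, and the system is solved by least squares (SVD-based, as the text describes):

```python
import numpy as np

def triangulate(ppm_left, ppm_right, uv_left, uv_right):
    """Linear triangulation: build the 4x3 system A W = Y of equation (6)
    from the two perspective projection matrices and the matched pixels,
    then solve it in the least-squares sense."""
    (u, v), (i, j) = uv_left, uv_right
    a, b = ppm_left, ppm_right
    A = np.array([a[0, :3] - u * a[2, :3],
                  a[1, :3] - v * a[2, :3],
                  b[0, :3] - i * b[2, :3],
                  b[1, :3] - j * b[2, :3]])
    Y = np.array([u * a[2, 3] - a[0, 3],
                  v * a[2, 3] - a[1, 3],
                  i * b[2, 3] - b[0, 3],
                  j * b[2, 3] - b[1, 3]])
    W, *_ = np.linalg.lstsq(A, Y, rcond=None)  # pseudo-inverse via SVD
    return W

# Synthetic demo: two parallel cameras separated by a 280 mm baseline.
K = np.array([[1048.0, 0.0, 200.0], [0.0, 1048.0, 200.0], [0.0, 0.0, 1.0]])
P_L = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_R = K @ np.hstack([np.eye(3), np.array([[-280.0], [0.0], [0.0]])])

def project(P, W):
    p = P @ np.append(W, 1.0)
    return (p[0] / p[2], p[1] / p[2])

W_true = np.array([100.0, 50.0, 2000.0])
W_rec = triangulate(P_L, P_R, project(P_L, W_true), project(P_R, W_true))
```

The distance to each LED spot or gamma source is then simply the norm (or Z component) of the recovered W.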
Figure 10 shows the "P4" experimental setup. Using this setup, we performed two experiments by changing the positions of the four light points, as shown in the figure. Figure 10(a) is the same setup used in the homography calibration. In Figure 10(b), the four LED lights are placed at the same distance from the gamma camera. Using these two kinds of setup, two
Table 4: Distance measurement results of the "P4" experimental setup.

Experiment          | Point 1 | Point 2 | Point 3 | Point 4 | Avg. error (%) | Std. dev.
P4E1 distance (cm)  | 210     | 230     | 250     | 270     | 1.13           | 1.13
P4E1 measured (cm)  | 207     | 228     | 249     | 274     |                |
P4E2 distance (cm)  | 300     | 300     | 300     | 300     | 2.41           | 1.59
P4E2 measured (cm)  | 298     | 305     | 287     | 309     |                |
distance measurement experiments, "P4E1" and "P4E2", were done. As shown in Table 4, the distance measurement is very accurate: the average percentage error is about 1.13% in the "P4E1" experiment and 2.41% in the "P4E2" experiment. For each point light in Figure 10(a), the measurement error is about 1 to 4 cm over camera distances of 210 to 270 cm. As the distance to the LED light increases, the measurement error also increases.
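The average-percentage-error figure of merit used in Tables 4 to 6 can be reproduced from the P4E1 row; the result differs slightly from the reported 1.13%, presumably because the tabulated measurements are rounded to whole centimetres:

```python
# Ground-truth vs measured distances (cm) from the P4E1 row of Table 4.
true = [210, 230, 250, 270]
measured = [207, 228, 249, 274]
errors = [abs(m - d) / d * 100.0 for d, m in zip(true, measured)]
avg_error = sum(errors) / len(errors)
print(round(avg_error, 2))  # 1.04 -- near the 1.13% reported for P4E1
```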
We use more LED light points to analyze the accuracy of the distance measurement. Figure 11 shows the second experimental setup, "P8", using eight LED lights which are fixed at different distances. As shown in the figure, "point1", "point2", "point6", and "point7" are on the same plane. The four other light points are fixed at different distances. With this setup, we performed two experiments, "P8E1" and "P8E2", by placing the gamma camera at different distances from the setup. The ground-truth distance to each LED light is shown in Table 5.
Figure 12 shows the left and right gamma images of the "P8E1" experiment. Stereo correspondences between the left and right images are shown as identically colored ellipse models. The center coordinates of the corresponding ellipse models are used to compute the distance to each LED light point. The calibration parameters for these experiments are also derived using the same setup as described in Section 3.
Table 5 shows the results of experiments "P8E1" and "P8E2". The average percentage distance error is about 2%
Table 5: Distance measurement results of the "P8" experimental setup.

Experiment          | 1   | 2   | 3   | 4   | 5   | 6   | 7   | 8   | Avg. error (%) | Std. dev.
P8E1 distance (cm)  | 320 | 320 | 295 | 310 | 330 | 320 | 320 | 345 | 2.09           | 1.63
P8E1 measured (cm)  | 315 | 314 | 291 | 324 | 328 | 320 | 312 | 360 |                |
P8E2 distance (cm)  | 300 | 300 | 310 | 325 | 300 | 300 | 275 | 290 | 7.71           | 5.83
P8E2 measured (cm)  | 312 | 291 | 261 | 272 | 300 | 324 | 298 | 308 |                |
Figure 10: The first experimental setup of the distance measurement using four LED lights: (a) P4E1 and (b) P4E2.
Figure 11: Experimental setup of the distance measurement using eight LED lights. Two experiments, "P8E1" and "P8E2", are done using the same setup.
in "P8E1" and about 8% in "P8E2", respectively. The measurement error is a little higher than when using four LEDs. This may be caused by using more light points in the computation of the homography relationship. We assume that the centers of the vision and gamma cameras are the same, so that light points which are not in the plane can be used for the homography computation. In reality, however, the alignment error between the centers of the two cameras can yield higher errors, not only in the homography computation but also in the distance measurement.
As the final experiment, we use a real radioactive source to obtain stereo gamma images and measure the distance to the source. Figure 13(a) shows a facility for general-purpose experiments using a single radioactive source. In this facility, a Cs-137 radioactive source exposes gamma radiation of 100 mSv/h at a 2 m distance. As shown in the figure, the gamma source appears as a point source. We change the position of the stereo gamma camera module from 150 cm to 400 cm and measure the distance to the source using the stereo gamma images captured at each distance.
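The shrinking parallax mentioned below follows the parallel-stereo relation d = fB/Z. The focal length in this sketch is an assumed plausible value, so the disparities are only indicative of the trend, not measured results:

```python
# Disparity d = f * B / Z for a parallel stereo rig: the position difference
# between the left and right gamma images shrinks as the source moves away.
f_px = 1048.0        # assumed gamma camera focal length, pixels
baseline_cm = 28.0   # 280 mm baseline from Table 1
distances_cm = (150, 200, 250, 300, 350, 400)
disparities = [f_px * baseline_cm / Z for Z in distances_cm]
for Z, d in zip(distances_cm, disparities):
    print(Z, round(d, 1))  # disparity falls monotonically with distance
```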
Figure 13(b) shows gamma images captured from the stereo gamma camera. Due to the parallel stereo geometry, the position difference in the stereo images becomes smaller as the distance to the radioactive source increases. At each distance, the center coordinates of the gamma source are
Table 6: Distance measurement results to a real gamma source.

Distance (cm)  | 150   | 200   | 250   | 300   | 350   | 400   | Avg. error (%) | Std. dev.
Measured (cm)  | 149.9 | 194.1 | 249.8 | 300.4 | 352.2 | 410.9 | 3.29           | 3.98
Figure 12: Corresponding points in the left and right gamma images of the eight LED points.
Figure 13: Stereo gamma image acquisition using a real radiation source: (a) experimental facility (gamma detector and point gamma source) and (b) left and right stereo gamma images at measurement distances of 150 cm to 400 cm.
extracted, and the real distances to the source are calculated, as shown in Table 6. The average percentage error is about 3.29%, which is a reasonably accurate result.
6 Conclusion
In this paper, we proposed a new method to calibrate stereo radiation detection devices using a planar homography relationship. Because it is impossible to capture calibration pattern images with a gamma camera, we use the planar homography relationship between the gamma and vision cameras to generate virtual calibration images. Bright LED point lights are used as calibration objects, and a gamma camera is used as the radiation detection device. A set of calibration pattern images captured from a vision camera is converted into gamma images by applying the homography relationships. Zhang's calibration method is then applied to calibrate the gamma camera. We performed distance measurement experiments using the calibrated gamma camera to check the accuracy of the proposed method. The 3D coordinates of LED point lights and of a real gamma radiation source are obtained with high accuracy using stereo gamma images.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Civil Military Technology Cooperation Center and the Korea Atomic Energy Research Institute.
References
[1] W. Lee and D. Wehe, "3D position of radiation sources using an automated gamma camera and ML algorithm with energy-dependent response functions," Nuclear Instruments and Methods in Physics Research A, vol. 531, no. 1-2, pp. 270–275, 2004.
[2] T. E. Peterson and L. R. Furenlid, "SPECT detectors: the Anger Camera and beyond," Physics in Medicine and Biology, vol. 56, no. 17, pp. R145–R182, 2011.
[3] H. Kwon, J. Park, and A. C. Kak, "A new approach for active stereo camera calibration," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3180–3185, April 2007.
[4] H. Yu and Y. Wang, "An improved self-calibration method for active stereo camera," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 5186–5190, June 2006.
[5] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[6] S.-Y. Park and G. G. Park, "Active calibration of camera-projector systems based on planar homography," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 320–323, August 2010.
[7] G.-Q. Wei and S. D. Ma, "Complete two-plane camera calibration method and experimental comparisons," in Proceedings of the IEEE 4th International Conference on Computer Vision, pp. 439–445, May 1993.
[8] Y. G. Hwang, N. H. Lee, J. Y. Kim, and S. H. Jeong, "The study for the performance analysis of the stereo radiation detector," in Proceedings of the International Conference on Electronics, Information, and Communication (ICEIC '15), 2015.
[9] P. T. Durrant, M. Dallimore, I. D. Jupp, and D. Ramsden, "The application of pinhole and coded aperture imaging in the nuclear environment," Nuclear Instruments and Methods in Physics Research A, vol. 422, no. 1–3, pp. 667–671, 1999.
[10] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2004.
Journal of Sensors 7
Table 2 Intrinsic parameters of the left and right gamma cameras
Camera Focal length (119891119909
) Focal length (119891119910
) Principal point (119862119909
) Principal point (119862119910
)Left camera 104801 104452 30231 29764Right camera 104892 104495 31061 30264
Step 1 Capture LED light and find homography
Gammacamera
Visioncamera
Step 2 Obtain several visual images of a calibration pattern
Left images Right images
Step 4 Apply Zhangrsquos method to virtual gamma images
Left gammacamera
parameters
Right gammacamera
parameters
Zhangrsquosmethod
Step 3 Apply HL and HR to the left and right vision images
Left virtual gamma images
Right virtual gamma images
Left vision images
Right vision images
HL
HR
Figure 8 The complete processes of stereo gamma camera calibration
Table 3 Extrinsic parameters of the right gamma camera withrespect to the left camera
119877119909
119877119910
119877119911
119905119909
119905119910
000783 minus004690 000209 minus5103 minus114
approach of this is to calculate distances to the previouslyused bright LED spots which are installed at already knowndistance Two kinds of experimental setup ldquoP4rdquo and ldquoP8rdquo areused in distance measurement which are shown already inFigures 5 and 7 In the first experimental setup four LEDs areplaced on different positions as shown in Figure 5(a) In thesecond experimental setup eight LEDs are placed on differentpositions as shown in Figure 7(a)
The positions of the light spot are measured using aBosch GLM 250 VF Professional laser rangefinder Aftergamma camera calibration the intrinsic and extrinsic cameraparameters can be obtained and they are used to calculateleft and right perspective projectionmatrices (PPM) Figure 9
shows left and right stereo gamma images where imageprocessing methods such as adaptive thresholding contourdetection and ellipse fitting are applied to find the centerpoints ((119906 V) and (119894 119895)) of each LED light spot of theldquoP4rdquo setup Essential and fundamental matrices can alsobe calculated using the camera parameters according to(4) and the epipolar geometry is applied to identify thecorresponding matching points of left and right gammaimagesThese correspondences are representedwith identicalcolors in Figure 9
119864 = 119877 [119905]119909
119865 = 1198701015840119879minus1
119864119870minus1
(4)
119864 and 119865 represent the essential and fundamental matricesof the stereo gamma camera respectively 119877[119905]
119909is the cross
product of the extrinsic camera parameters whereas 1198701015840 and119870 represent the intrinsic parameters of the left and rightcameras respectively The 3D coordinates 119882(119883119884 119885) for
8 Journal of Sensors
(a) (b)
Figure 9 Representing corresponding matching points in left and right gamma images of ldquoP1rdquo setup
each matching point set can be calculated using a linearequation (5)
119860119882 = 119884 (5)
Here 119860 represents a 4 times 3 matrix and 119884 represents a 4 times 1matrix as below The inverse of matrix 119860 is calculated usingthe singular value decomposition (SVD) method
PPML=
[[[[
[
(1198861)119879
11988614
(1198862)119879
11988624
(1198863)119879
11988634
]]]]
]
PPMR=
[[[[
[
(1198871)119879
11988714
(1198872)119879
11988724
(1198873)119879
11988734
]]]]
]
119860 =
[[[[[[[
[
(1198861minus 1199061198863)119879
(1198862minus V1198863)119879
(1198871minus 1198941198873)119879
(1198872minus 1198951198873
)119879
]]]]]]]
]
119884 =
[[[[[
[
(11988614
minus 11990611988634)
(11988624
minus V11988634)
(11988714
minus 11989411988734)
(11988724
minus 11989511988734
)
]]]]]
]
(6)
Figure 10 shows ldquoP4rdquo experimental setup In this setupwe have done two experiments by changing the position ofthe four light points as shown in the figure Figure 10(a) is thesame one used in the homography calibration Figure 10(b)shows four LED lights are placed at the same distance tothe gamma camera Using the two kinds of setup two
Table 4 Distance measurement results of ldquoP4rdquo experimental setup
Experiments Test point Avg error () Std Dev1 2 3 4
P4E1Distance (cm) 210 230 250 270 113 113Measured (cm) 207 228 249 274P4E2Distance (cm) 300 300 300 300 241 159Measured (cm) 298 305 287 309
distance measurement experiments ldquoP4E1rdquo and ldquoP4E2rdquo aredone As shown in Table 4 the distance measurement is veryaccurate where the average percentage error is about 113in ldquoP4E1rdquo and 241 in ldquoP4E2rdquo experiments To each pointlight in Figure 10(a)measurement error is about 1sim4 cm fromcamera distance of 210sim270 cm If the distance to the LEDlight increases the measurement error also increases
We use more LED light points to analyze the accuracyof distance measurement Figure 11 shows the second exper-imental setup ldquoP8rdquo using eight LED lights which are fixedin the different distances As shown in the figure ldquopoint1rdquoldquopoint2rdquo ldquopoint6rdquo and ldquopoint7rdquo are on the same plane Fourother light points are fixed at the different distancesWith thissetup we have done two experiments ldquoP8E1rdquo and ldquoP8E2rdquo byplacing the gamma camera at the different distance from thissetup The ground truth distance to each LED light is shownin Table 5
Figure 12 shows the left and the right gamma images ofthe ldquoP8E1rdquo experiments Stereo correspondences between theleft and right images are shown in the same colored ellipsemodels The center coordinates of the corresponding ellipsemodels are used to compute the distance of each LED lightpoint The calibration parameters of these experiments arealso derived by using the same setup as described in Section 3
Table 5 shows the results of experiments ldquoP8E1rdquo andldquoP8E2rdquo The average percentile distance error is about 2
Journal of Sensors 9
Table 5 Distance measurement results of ldquoP8rdquo experimental setup
Experiments Test point Avg error () Std Dev1 2 3 4 5 6 7 8
P8E1Distance (cm) 320 320 295 310 330 320 320 345 209 163Measured (cm) 315 314 291 324 328 320 312 360P8E2Distance (cm) 300 300 310 325 300 300 275 290 771 583Measured (cm) 312 291 261 272 300 324 298 308
Point1
Point2
Point3Point4
(a)
Point1
Point2
Point3
Point4
(b)
Figure 10 The first experimental setup of the distance measurement using four LED lights (a) P4E1 and (b) P4E2
Point1Point2
Point3Point4
Point5Point6Point7
Point8
Figure 11 Experimental setup of the distance measurement using 8LED lights Two experiments ldquoP8E1rdquo and ldquoP8E2rdquo are done using thesame setup
in ldquoP8E1rdquo and 8 in ldquoP8E2rdquo experiment respectively Themeasurement error is a little higher than that of using fourLEDs This may be caused by using more light points in the
computation of the homography relationshipWe assume thatthe centers of the vision and gamma camera are the same sothat light points which are not in the plane can be used forhomography computation However in reality the alignmenterror of the centers of the two cameras can yield higher errornot only in the homography computation but also in thedistance measurement
As the final experiment we use a real radioactive sourceto obtain stereo gamma images and measure the distance tothe source Figure 13(a) shows a facility for general purposeexperiments using a single radioactive source In this facilitya radioactive source of Cs-137 exposes gamma radiation of100mSvh at 2m distance As shown in the figure the gammasource is exposed as a point source We change the positionof the stereo gamma camera module from 150 cm to 400 cmand measure the distance to the source using stereo gammaimages captured in each distance
Figure 13(b) shows gamma images captured from thestereo gamma camera The position difference in stereoimages becomes smaller when the distance to the radioactivesource becomes farther due to the parallel stereo geometry Ineach distance the center coordinates of the gamma source are
10 Journal of Sensors
Table 6 Distance measurement results to real gamma
Distance (cm) 150 200 250 300 350 400 Avg error () Std DevMeasured (cm) 1499 1941 2498 3004 3522 4109 329 398
(a) (b)
Figure 12 Corresponding points in left and right gamma images of 8 LED points
Gammadetector
Gamma source(point)
(a)
Left image
Right image
150 cm 200 cm 250 cm 300 cm 350 cm 400 cm
(b)
Figure 13 Stereo gamma image acquisition using a real radiation source (a) experimental facility and (b) stereo gamma image at differentmeasurement distances
Journal of Sensors 11
extracted and the real distances to the source are calculated asin Table 6 Average percentage error is about 329 and it isreasonably accurate results
6 Conclusion
In this paper we proposed a new method to calibrate stereoradiation detection devices using planar homography rela-tionship Because it is impossible to capture the calibrationpattern images from a gamma camera we use the planarhomography relationship between gamma vision camerasto generate virtual calibration images Bright LED pointlights are used as calibration objects and a gamma camerais used as a radiation detection device A set of calibrationpattern images captured from a vision camera are convertedinto gamma images by applying homography relationshipsZhangrsquos calibration method is applied to calibrate the gammacamera afterwards We performed distance measurementexperiments using the calibrated gamma camera to check theaccuracy of the proposedmethodThe 3D coordinates of LEDpoint lights and a real gamma radiation source are obtainedwith high accuracy using stereo gamma images
Conflict of Interests
The authors declare that there is no conflict of interestsregarding the publication of this paper
Acknowledgments
This work was supported by the Civil Military TechnologyCooperation Center and the Korea Atomic Energy ResearchInstitute
References
[1] W Lee and D Wehe ldquo3D position of radiation sources usingan automated gamma camera and ML algorithm with energy-dependent response functionsrdquo Nuclear Instruments and Meth-ods in Physics Research A vol 531 no 1-2 pp 270ndash275 2004
[2] T E Peterson and L R Furenlid ldquoSPECT detectors the AngerCamera and beyondrdquo Physics in Medicine and Biology vol 56no 17 pp R145ndashR182 2011
[3] H Kwon J Park and A C Kak ldquoA new approach foractive stereo camera calibrationrdquo in Proceedings of the IEEEInternational Conference on Robotics and Automation (ICRArsquo07) pp 3180ndash3185 April 2007
[4] H Yu and Y Wang ldquoAn improved self-calibration method foractive stereo camerardquo in Proceedings of the 6th World Congresson Intelligent Control and Automation (WCICA rsquo06) vol 1 pp5186ndash5190 June 2006
[5] Z Zhang ldquoA flexible new technique for camera calibrationrdquoIEEE Transactions on Pattern Analysis andMachine Intelligencevol 22 no 11 pp 1330ndash1334 2000
[6] S-Y Park and G G Park ldquoActive calibration of camera-projector systems based on planar homographyrdquo in Proceedingsof the 20th International Conference on Pattern Recognition(ICPR rsquo10) pp 320ndash323 August 2010
[7] G-Q Wei and S D Ma ldquoComplete two-plane camera calibra-tion method and experimental comparisonsrdquo in Proceedings of
the IEEE 4th International Conference on Computer Vision pp439ndash445 May 1993
[8] Y G Hwang N H Lee J Y Kim and S H Jeong ldquoThe studyfor the performance analysis of the stereo radiation detectorrdquoin Proceedings of the International Conference on ElectronicsInformation and Communication (ICEIC rsquo15) 2015
[9] P T Durrant M Dallimore I D Jupp and D RamsdenldquoThe application of pinhole and coded aperture imaging inthe nuclear environmentrdquo Nuclear Instruments and Methods inPhysics Research A vol 422 no 1ndash3 pp 667ndash671 1999
[10] R Hartley and A Zisserman Multiple View Geometry inComputer Vision Cambridge University Press Cambridge UK2004
International Journal of
AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
RoboticsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Active and Passive Electronic Components
Control Scienceand Engineering
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
RotatingMachinery
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporation httpwwwhindawicom
Journal ofEngineeringVolume 2014
Submit your manuscripts athttpwwwhindawicom
VLSI Design
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Shock and Vibration
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Civil EngineeringAdvances in
Acoustics and VibrationAdvances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Electrical and Computer Engineering
Journal of
Advances inOptoElectronics
Hindawi Publishing Corporation httpwwwhindawicom
Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
SensorsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Chemical EngineeringInternational Journal of Antennas and
Propagation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Navigation and Observation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
DistributedSensor Networks
International Journal of
Journal of Sensors
Figure 4. Homography relations between gamma and vision images. H_L maps the left vision image I_L to the left gamma (radiation) image J_L, and H_R maps the right vision image I_R to the right gamma image J_R; the vision camera and radiation detector are switched between the left and right positions.
Figure 5. Gamma and vision images of four LED lights: (a) calibration setup, (b) gamma images (top J_L, bottom J_R), and (c) vision images (top I_L, bottom I_R).
vision images (J_L and I_L), and the other is between the right gamma and vision images (J_R and I_R).

Let H_L and H_R be the left and right homographies between the gamma and vision cameras, respectively. As shown in Figure 4, H_L is the homography from I_L to J_L, and H_R is the homography from I_R to J_R. To transform the vision images to gamma images, I_L and I_R are multiplied by H_L and H_R, respectively, as follows; the results are then homogeneously equal to the gamma images:

J_L ≅ H_L I_L,
J_R ≅ H_R I_R.   (1)

Because the two sensors' images are captured from the same position (for example, I_L and J_L have the same coordinate origin), a 2D homography relation is enough to transform one image plane to the other.
To calculate the planar homography matrix H, we need at least four corresponding image points between the gamma and vision images. This means that we need to know at least four sets of 2D image coordinates of an object that can be captured by both the gamma and vision cameras. Fortunately, the gamma detection sensor is designed to respond to visible light, so we can use four point light sources to find the homography.

Figure 5 shows an example of capturing four corresponding image points between the gamma and vision cameras. Four LED light sources are fixed in front of the cameras as calibration objects, and gamma and vision images of the light sources are obtained. In both the gamma and vision images, the image intensity of the LED sources is higher than that of their neighboring pixels. We apply the Gaussian fitting method [10] to get the exact coordinates of the corresponding points and calculate the homographies H_L and H_R using a linear algebra equation.
Figure 6. Example of generating calibration pattern images from the view of the stereo gamma camera: (a) left and right vision images and (b) left and right gamma images.

H_L and H_R are 3 × 3 matrices, but their scale is unknown. Therefore, using four pairs of matching points between the gamma and vision images, we can compute the eight unknown parameters of each homography. Equation (2) shows the homographies H_L and H_R computed from the images in Figure 5:

H_L = [  0.53197        −0.02558       −164.897
        −0.02713         0.57747        −90.443
        −1.7225 × 10⁻⁴  −1.1717 × 10⁻⁴    1.0    ],

H_R = [  0.52141        −0.023501      −165.795
        −0.04064         0.52391        −82.434
        −2.0699 × 10⁻⁴  −2.3451 × 10⁻⁴    1.0    ].   (2)
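The linear solve behind (2) can be sketched with a direct linear transform: fixing the bottom-right entry of H to 1 leaves eight unknowns, and each point pair contributes two equations, so four pairs determine H exactly. The point values below are made up for illustration; in practice OpenCV's cv2.findHomography does the same job.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 8 unknown entries of H (with H[2,2] = 1) from four or
    more correspondences src -> dst via least squares."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                            rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

# Four hypothetical LED correspondences (vision pixels -> gamma pixels).
src = [(100, 100), (500, 120), (480, 400), (120, 380)]
dst = [(-110, -35), (101, -24), (92, 122), (-99, 111)]
H = homography_from_points(src, dst)
```

With exactly four correspondences the 8 × 8 system is solved exactly; with more points the least-squares solution averages out localization noise.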
Figure 6 shows an example of applying H_L and H_R to the vision images. In Figure 6(a), the left and right vision images I_L and I_R are captured at 640 × 480 resolution. Using the two homography matrices, virtual gamma images J_L and J_R are obtained at 400 × 400 resolution. Each pair of left and right images shares the same coordinate origin; thus there are only scale and rotation transformations between the vision and gamma images (ideally, there is no rotation transformation if the coordinate axes of the two sensors are parallel).
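Generating a virtual gamma image then amounts to warping each vision image with its homography. In OpenCV this is a single call, cv2.warpPerspective(I, H, (400, 400)); a dependency-free nearest-neighbour sketch of the same inverse mapping is:

```python
import numpy as np

def warp_to_gamma(img, H, out_shape=(400, 400)):
    """Map each output (gamma-view) pixel back through H^-1 and sample
    the vision image: the inverse warping cv2.warpPerspective performs."""
    Hinv = np.linalg.inv(H)
    v, u = np.mgrid[0:out_shape[0], 0:out_shape[1]]        # output grid
    q = Hinv @ np.stack([u.ravel(), v.ravel(), np.ones(u.size)])
    x = np.round(q[0] / q[2]).astype(int)                  # source column
    y = np.round(q[1] / q[2]).astype(int)                  # source row
    ok = (x >= 0) & (x < img.shape[1]) & (y >= 0) & (y < img.shape[0])
    out = np.zeros(out_shape, img.dtype)
    out.reshape(-1)[ok] = img[y[ok], x[ok]]                # fill valid pixels
    return out

# Toy example: a pure 0.5x scaling homography on a synthetic 640x480 image.
img = (np.arange(480 * 640).reshape(480, 640) % 251).astype(np.uint8)
H = np.array([[0.5, 0, 0], [0, 0.5, 0], [0, 0, 1.0]])
J = warp_to_gamma(img, H)
```

Pixels whose preimage falls outside the vision frame are left black, matching the border behavior of the library call.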
In Figure 7, more light sources are used to calibrate the homography between the two stereo cameras. Images of eight LED light points are captured from both the vision and gamma cameras. The images in Figure 7(a) are the originally captured images of the light points from the vision and gamma cameras. The homography relations for the left and right sensors are shown in (3). Figure 7(c) shows the virtual gamma images obtained from the homography relation.
H_L = [ 0.951368        0.012253       −387.186
        0.043855        0.905885       −230.053
        3.1221 × 10⁻⁴   4.2492 × 10⁻⁵     1.0    ],

H_R = [ 0.826863        0.025824       −307.124
        0.017759        0.821441       −163.538
        1.9185 × 10⁻⁴   2.0055 × 10⁻⁵     1.0    ].   (3)
4. Stereo Calibration of Vision and Gamma Cameras
In the field of computer vision, calibration of a stereo vision camera is a common but important task. Many calibration algorithms have been introduced; currently, Zhang's method [5] is the most widely used. To use Zhang's calibration method, it is recommended to capture at least six images of a calibration pattern, which usually consists of checkerboard patterns.
Figure 7. Example of generating calibration pattern images from the view of the stereo gamma camera: (a) left and right vision images, (b) left and right gamma images, and (c) left and right virtual gamma images.
Zhang's calibration algorithm is available in several computer vision libraries, such as OpenCV and MATLAB. In this paper, we use the OpenCV computer vision library to calibrate the vision and gamma cameras.
Obtaining images of the calibration pattern from the gamma camera is a main problem of this paper, because the gamma camera cannot capture images of the printed calibration pattern. To solve this problem, we use the homography relationship between the gamma and vision cameras. Once several images I_L and I_R of the calibration board are obtained from the vision cameras, they are transformed by H_L and H_R to generate virtual calibration images J_L and J_R from the view of the left and right gamma cameras. The transformed images are then used as input to Zhang's algorithm to obtain the stereo calibration parameters of the stereo gamma camera. The whole stereo calibration process is shown in Figure 8. The intrinsic and extrinsic parameters of the left and right radiation detectors are listed in Tables 2 and 3. Here, R_x, R_y, and R_z represent the rotation components, and t_x, t_y, and t_z represent the translation components.
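If the three rotation values in Table 3 are Rodrigues rotation-vector components (an assumption; OpenCV's stereo calibration reports rotation in this form, but they could also be Euler angles), the full 3 × 3 rotation matrix is recovered with Rodrigues' formula, as cv2.Rodrigues does:

```python
import numpy as np

def rodrigues(rvec):
    """Rotation vector (Rx, Ry, Rz) -> 3x3 rotation matrix
    (the expansion performed by cv2.Rodrigues)."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    kx, ky, kz = rvec / theta
    K = np.array([[0, -kz, ky],
                  [kz, 0, -kx],
                  [-ky, kx, 0]])            # skew-symmetric axis matrix
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# Small illustrative rotation vector, of the magnitude seen in Table 3.
R = rodrigues([0.00783, -0.04690, 0.00209])
```

For the small angles of a nearly parallel stereo rig, R is close to the identity, as expected.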
5. Distance Measurement to Radioactive Sources
In order to evaluate the accuracy of the proposed calibration method, we performed distance measurement experiments using the calibrated stereo gamma camera. The main
Table 2. Intrinsic parameters of the left and right gamma cameras.

Camera        | Focal length f_x | Focal length f_y | Principal point C_x | Principal point C_y
Left camera   | 1048.01          | 1044.52          | 302.31              | 297.64
Right camera  | 1048.92          | 1044.95          | 310.61              | 302.64
Figure 8. The complete process of stereo gamma camera calibration. Step 1: capture LED lights with the gamma and vision cameras and find the homographies. Step 2: obtain several visual images of a calibration pattern (left and right vision images). Step 3: apply H_L and H_R to the left and right vision images to obtain left and right virtual gamma images. Step 4: apply Zhang's method to the virtual gamma images to obtain the left and right gamma camera parameters.
Table 3. Extrinsic parameters of the right gamma camera with respect to the left camera.

R_x     | R_y      | R_z     | t_x    | t_y
0.00783 | −0.04690 | 0.00209 | −51.03 | −1.14
approach is to calculate distances to the previously used bright LED spots, which are installed at known distances. Two kinds of experimental setups, "P4" and "P8", are used in the distance measurements; they were already shown in Figures 5 and 7. In the first experimental setup, four LEDs are placed at different positions, as shown in Figure 5(a). In the second experimental setup, eight LEDs are placed at different positions, as shown in Figure 7(a).
The positions of the light spots are measured using a Bosch GLM 250 VF Professional laser rangefinder. After gamma camera calibration, the intrinsic and extrinsic camera parameters are obtained and used to calculate the left and right perspective projection matrices (PPM). Figure 9 shows left and right stereo gamma images, where image processing methods such as adaptive thresholding, contour detection, and ellipse fitting are applied to find the center points ((u, v) and (i, j)) of each LED light spot of the "P4" setup. The essential and fundamental matrices can also be calculated from the camera parameters according to (4), and epipolar geometry is applied to identify the corresponding matching points of the left and right gamma images. These correspondences are represented with identical colors in Figure 9.
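The epipolar matching step can be sketched as follows, assuming the spot centers have already been extracted: each left point is paired with the right point nearest to its epipolar line l′ = F·x. The F below is the ideal fundamental matrix of a purely horizontally translated, parallel stereo pair, used only for illustration.

```python
import numpy as np

def match_epipolar(F, left_pts, right_pts):
    """For each left point, return the index of the right point closest
    to its epipolar line l' = F @ (x, y, 1)."""
    matches = []
    for x, y in left_pts:
        a, b, c = F @ np.array([x, y, 1.0])
        dists = [abs(a * u + b * v + c) / np.hypot(a, b)
                 for u, v in right_pts]
        matches.append(int(np.argmin(dists)))
    return matches

# Ideal parallel stereo (pure x-translation): epipolar lines are rows v = y.
F = np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=float)
left_pts = [(100, 50), (200, 120)]
right_pts = [(80, 121), (90, 49)]
pairs = match_epipolar(F, left_pts, right_pts)   # pairs by nearest row
```

With only a handful of well-separated LED spots, this nearest-line rule resolves the correspondences unambiguously.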
E = R [t]_×,
F = K′⁻ᵀ E K⁻¹.   (4)
E and F represent the essential and fundamental matrices of the stereo gamma camera, respectively. R[t]_× is the product of the rotation and the cross-product (skew-symmetric) matrix of the translation from the extrinsic camera parameters, whereas K′ and K represent the intrinsic parameters of the left and right cameras, respectively.

Figure 9. Corresponding matching points in the left and right gamma images of the "P4" setup.

The 3D coordinates W(X, Y, Z) of each matching point pair can be calculated using the linear equation (5):
A W = Y.   (5)
Here, A is a 4 × 3 matrix and Y is a 4 × 1 vector, as below; the pseudo-inverse of A is computed using the singular value decomposition (SVD) method:
PPM_L = [ (a1)^T  a14
          (a2)^T  a24
          (a3)^T  a34 ],

PPM_R = [ (b1)^T  b14
          (b2)^T  b24
          (b3)^T  b34 ],

A = [ (a1 − u·a3)^T
      (a2 − v·a3)^T
      (b1 − i·b3)^T
      (b2 − j·b3)^T ],

Y = [ (a14 − u·a34)
      (a24 − v·a34)
      (b14 − i·b34)
      (b24 − j·b34) ].   (6)
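Equations (5)-(6) can be sketched as a least-squares triangulation. Note that in the code the right-hand side entries take the sign u·a34 − a14 (and so on), which follows from expanding the projection equation u = (a1·W + a14)/(a3·W + a34); the projection matrices below are hypothetical round numbers, not the calibrated ones.

```python
import numpy as np

def triangulate(ppm_l, ppm_r, uv, ij):
    """Recover W = (X, Y, Z) from its left projection (u, v) and right
    projection (i, j) by solving the 4x3 system A W = Y with SVD-based
    least squares (np.linalg.lstsq)."""
    (u, v), (i, j) = uv, ij
    a, b = np.asarray(ppm_l, float), np.asarray(ppm_r, float)
    A = np.array([a[0, :3] - u * a[2, :3],
                  a[1, :3] - v * a[2, :3],
                  b[0, :3] - i * b[2, :3],
                  b[1, :3] - j * b[2, :3]])
    Y = np.array([u * a[2, 3] - a[0, 3],
                  v * a[2, 3] - a[1, 3],
                  i * b[2, 3] - b[0, 3],
                  j * b[2, 3] - b[1, 3]])
    W, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return W

# Hypothetical parallel stereo pair: identical intrinsics, 50 cm baseline.
K = np.array([[1000.0, 0, 200], [0, 1000.0, 200], [0, 0, 1]])
PPM_L = K @ np.hstack([np.eye(3), [[0], [0], [0]]])
PPM_R = K @ np.hstack([np.eye(3), [[-50], [0], [0]]])
W_true = np.array([10.0, 20.0, 300.0])
pl = PPM_L @ np.append(W_true, 1.0)
pr = PPM_R @ np.append(W_true, 1.0)
W = triangulate(PPM_L, PPM_R, (pl[0] / pl[2], pl[1] / pl[2]),
                (pr[0] / pr[2], pr[1] / pr[2]))
```

With noise-free projections the system is consistent and the point is recovered exactly; with noisy spot centers the least-squares solution minimizes the algebraic residual.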
Figure 10 shows the "P4" experimental setup. With this setup, we performed two experiments by changing the positions of the four light points, as shown in the figure. Figure 10(a) is the same configuration used in the homography calibration; in Figure 10(b), the four LED lights are placed at the same distance from the gamma camera. Using the two kinds of setup, two
Table 4. Distance measurement results of the "P4" experimental setup.

Experiment |               | Point 1 | Point 2 | Point 3 | Point 4 | Avg. error (%) | Std. dev.
P4E1       | Distance (cm) | 210     | 230     | 250     | 270     | 1.13           | 1.13
           | Measured (cm) | 207     | 228     | 249     | 274     |                |
P4E2       | Distance (cm) | 300     | 300     | 300     | 300     | 2.41           | 1.59
           | Measured (cm) | 298     | 305     | 287     | 309     |                |
distance measurement experiments, "P4E1" and "P4E2", were performed. As shown in Table 4, the distance measurement is very accurate: the average percentage error is about 1.13% in the "P4E1" experiment and 2.41% in the "P4E2" experiment. For each point light in Figure 10(a), the measurement error is about 1~4 cm at camera distances of 210~270 cm. As the distance to the LED light increases, the measurement error also increases.
We used more LED light points to further analyze the accuracy of the distance measurement. Figure 11 shows the second experimental setup, "P8", using eight LED lights fixed at different distances. As shown in the figure, "point1", "point2", "point6", and "point7" are on the same plane; the four other light points are fixed at different distances. With this setup, we performed two experiments, "P8E1" and "P8E2", by placing the gamma camera at different distances from the setup. The ground truth distance to each LED light is shown in Table 5.
Figure 12 shows the left and right gamma images of the "P8E1" experiment. Stereo correspondences between the left and right images are shown as identically colored ellipse models. The center coordinates of the corresponding ellipse models are used to compute the distance of each LED light point. The calibration parameters of these experiments are also derived using the same procedure described in Section 3.
Table 5 shows the results of experiments "P8E1" and "P8E2". The average percentile distance error is about 2%
Table 5. Distance measurement results of the "P8" experimental setup.

Experiment |               | 1   | 2   | 3   | 4   | 5   | 6   | 7   | 8   | Avg. error (%) | Std. dev.
P8E1       | Distance (cm) | 320 | 320 | 295 | 310 | 330 | 320 | 320 | 345 | 2.09           | 1.63
           | Measured (cm) | 315 | 314 | 291 | 324 | 328 | 320 | 312 | 360 |                |
P8E2       | Distance (cm) | 300 | 300 | 310 | 325 | 300 | 300 | 275 | 290 | 7.71           | 5.83
           | Measured (cm) | 312 | 291 | 261 | 272 | 300 | 324 | 298 | 308 |                |
Figure 10. The first experimental setup of the distance measurement using four LED lights (points 1-4): (a) P4E1 and (b) P4E2.
Figure 11. Experimental setup of the distance measurement using 8 LED lights (point1 through point8). Two experiments, "P8E1" and "P8E2", are done using the same setup.
in "P8E1" and 8% in "P8E2", respectively. The measurement error is a little higher than that obtained with four LEDs. This may be caused by using more light points in the computation of the homography relationship. We assume that the centers of the vision and gamma cameras coincide, so that light points that are not on the plane can still be used for the homography computation. In reality, however, alignment error between the centers of the two cameras can yield higher error, not only in the homography computation but also in the distance measurement.
As the final experiment, we used a real radioactive source to obtain stereo gamma images and measured the distance to the source. Figure 13(a) shows a facility for general-purpose experiments with a single radioactive source. In this facility, a Cs-137 source emits gamma radiation of 100 mSv/h at a 2 m distance. As shown in the figure, the gamma source appears as a point source. We changed the position of the stereo gamma camera module from 150 cm to 400 cm and measured the distance to the source using the stereo gamma images captured at each distance.
Figure 13(b) shows gamma images captured by the stereo gamma camera. Due to the parallel stereo geometry, the position difference (disparity) between the stereo images becomes smaller as the distance to the radioactive source increases. At each distance, the center coordinates of the gamma source are
Table 6. Distance measurement results to a real gamma source.

Distance (cm) | 150   | 200   | 250   | 300   | 350   | 400   | Avg. error (%) | Std. dev.
Measured (cm) | 149.9 | 194.1 | 249.8 | 300.4 | 352.2 | 410.9 | 3.29           | 3.98
Figure 12. Corresponding points in the left and right gamma images of the 8 LED points.
Figure 13. Stereo gamma image acquisition using a real radiation source: (a) experimental facility with the gamma detector and the point gamma source and (b) left and right stereo gamma images at measurement distances of 150, 200, 250, 300, 350, and 400 cm.
extracted, and the real distances to the source are calculated, as in Table 6. The average percentage error is about 3.29%, a reasonably accurate result.
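The shrinking disparity noted above follows the parallel-stereo relation Z = f·B/d. A toy sketch, where the focal length and baseline are assumed round numbers rather than the calibrated values:

```python
def depth_from_disparity(f_px, baseline_cm, disparity_px):
    """Distance to a point for a parallel stereo pair: Z = f * B / d."""
    return f_px * baseline_cm / disparity_px

# With f = 1000 px and B = 50 cm, disparity halves as distance doubles.
d_at_150 = 1000 * 50 / 150      # disparity at 150 cm, in pixels
d_at_300 = 1000 * 50 / 300      # disparity at 300 cm, in pixels
Z = depth_from_disparity(1000, 50, d_at_300)   # close to 300 cm
```

The inverse relation also explains why the measurement error grows with distance: a fixed sub-pixel error in the disparity translates into a depth error that grows roughly quadratically with Z.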
6. Conclusion
In this paper, we proposed a new method to calibrate stereo radiation detection devices using a planar homography relationship. Because it is impossible to capture calibration pattern images with a gamma camera, we use the planar homography relationship between the gamma and vision cameras to generate virtual calibration images. Bright LED point lights are used as calibration objects, and a gamma camera is used as the radiation detection device. A set of calibration pattern images captured by a vision camera is converted into gamma images by applying the homography relationships; Zhang's calibration method is then applied to calibrate the gamma camera. We performed distance measurement experiments using the calibrated gamma camera to verify the accuracy of the proposed method. The 3D coordinates of LED point lights and of a real gamma radiation source are obtained with high accuracy using stereo gamma images.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Civil Military Technology Cooperation Center and the Korea Atomic Energy Research Institute.
References
[1] W. Lee and D. Wehe, "3D position of radiation sources using an automated gamma camera and ML algorithm with energy-dependent response functions," Nuclear Instruments and Methods in Physics Research A, vol. 531, no. 1-2, pp. 270-275, 2004.
[2] T. E. Peterson and L. R. Furenlid, "SPECT detectors: the Anger Camera and beyond," Physics in Medicine and Biology, vol. 56, no. 17, pp. R145-R182, 2011.
[3] H. Kwon, J. Park, and A. C. Kak, "A new approach for active stereo camera calibration," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3180-3185, April 2007.
[4] H. Yu and Y. Wang, "An improved self-calibration method for active stereo camera," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 5186-5190, June 2006.
[5] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000.
[6] S.-Y. Park and G. G. Park, "Active calibration of camera-projector systems based on planar homography," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 320-323, August 2010.
[7] G.-Q. Wei and S. D. Ma, "Complete two-plane camera calibration method and experimental comparisons," in Proceedings of the IEEE 4th International Conference on Computer Vision, pp. 439-445, May 1993.
[8] Y. G. Hwang, N. H. Lee, J. Y. Kim, and S. H. Jeong, "The study for the performance analysis of the stereo radiation detector," in Proceedings of the International Conference on Electronics, Information, and Communication (ICEIC '15), 2015.
[9] P. T. Durrant, M. Dallimore, I. D. Jupp, and D. Ramsden, "The application of pinhole and coded aperture imaging in the nuclear environment," Nuclear Instruments and Methods in Physics Research A, vol. 422, no. 1-3, pp. 667-671, 1999.
[10] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2004.
International Journal of
AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
RoboticsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Active and Passive Electronic Components
Control Scienceand Engineering
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
RotatingMachinery
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporation httpwwwhindawicom
Journal ofEngineeringVolume 2014
Submit your manuscripts athttpwwwhindawicom
VLSI Design
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Shock and Vibration
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Civil EngineeringAdvances in
Acoustics and VibrationAdvances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Electrical and Computer Engineering
Journal of
Advances inOptoElectronics
Hindawi Publishing Corporation httpwwwhindawicom
Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
SensorsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Chemical EngineeringInternational Journal of Antennas and
Propagation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Navigation and Observation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
DistributedSensor Networks
International Journal of
Journal of Sensors 5
Figure 6: Example of generating calibration pattern images from the view of the stereo gamma camera: (a) left and right vision images and (b) left and right gamma images.
parameters of the homography. Equation (2) shows the homographies $H_L$ and $H_R$ computed from the images in Figure 5:

$$
H_L = \begin{bmatrix}
0.53197 & -0.02558 & -164.897 \\
-0.02713 & 0.57747 & -90.443 \\
-1.7225\times 10^{-4} & -1.1717\times 10^{-4} & 1.0
\end{bmatrix},\qquad
H_R = \begin{bmatrix}
0.52141 & -0.023501 & -165.795 \\
-0.04064 & 0.52391 & -82.434 \\
-2.0699\times 10^{-4} & -2.3451\times 10^{-4} & 1.0
\end{bmatrix}. \tag{2}
$$
Figure 6 shows an example of applying $H_L$ and $H_R$ to the vision images. In Figure 6(a), the left and right vision images $I_L$ and $I_R$ are captured at 640×480 resolution. Using the two homography matrices, virtual gamma images $J_L$ and $J_R$ are obtained at 400×400 resolution. Each pair of left and right images shares the same coordinate origin; thus, there are only scale and rotation transformations between the vision and gamma images (ideally, there is no rotation transformation if the coordinate axes of the two sensors are parallel).
In Figure 7, more light sources are used to calibrate the homography between the two stereo cameras. Eight LED light points are captured in the images from both the vision and gamma cameras. The images in Figure 7(a) show the original captured images of the light points from the vision and gamma cameras. The homography relation between the left and right sensors is shown in (3). Figure 7(c) shows the virtual gamma images obtained from the homography relation.
$$
H_L = \begin{bmatrix}
0.951368 & 0.012253 & -387.186 \\
0.043855 & 0.905885 & -230.053 \\
3.1221\times 10^{-4} & 4.2492\times 10^{-5} & 1.0
\end{bmatrix},\qquad
H_R = \begin{bmatrix}
0.826863 & 0.025824 & -307.124 \\
0.017759 & 0.821441 & -163.538 \\
1.9185\times 10^{-4} & 2.0055\times 10^{-5} & 1.0
\end{bmatrix}. \tag{3}
$$
4 Stereo Calibration of Vision and Gamma Cameras
In the field of computer vision, calibration of a stereo vision camera is a general but important task. Many calibration algorithms have been introduced; currently, Zhang's method is the most widely used. To use Zhang's calibration method, it is recommended to capture at least six images of a calibration pattern, which usually consists of a checkerboard.
Figure 7: Example of generating calibration pattern images from the view of the stereo gamma camera: (a) left and right vision images, (b) left and right gamma images, and (c) left and right virtual gamma images.
Zhang's calibration algorithm is available in several computer vision libraries, such as OpenCV and MATLAB. In this paper, we also use the OpenCV computer vision library to calibrate the vision and gamma cameras.
Obtaining images of the calibration pattern from the gamma camera is the main problem addressed in this paper, because the gamma camera cannot capture images of the printed calibration pattern. To solve this problem, we use the homography relationship between the gamma and vision cameras. Once several images $I_L$ and $I_R$ of the calibration board are obtained from the vision camera, these images are transformed by $H_L$ and $H_R$ to generate virtual calibration images $J_L$ and $J_R$ from the view of the left and right gamma cameras. The transformed images are then used
as input to Zhang's algorithm to obtain the stereo calibration parameters of the stereo gamma camera. The whole stereo calibration process is shown in Figure 8. The intrinsic and extrinsic parameters of both the left and right radiation detectors are given in Tables 2 and 3. Here, $R_x$, $R_y$, and $R_z$ represent the rotation, and $t_x$, $t_y$, and $t_z$ represent the translation components.
5 Distance Measurement to Radioactive Sources
In order to evaluate the accuracy of the proposed calibration method, we performed distance measurement experiments using the calibrated stereo gamma camera. The main
Table 2: Intrinsic parameters of the left and right gamma cameras.

Camera         Focal length (f_x)   Focal length (f_y)   Principal point (C_x)   Principal point (C_y)
Left camera    104801               104452               30231                   29764
Right camera   104892               104495               31061                   30264
Figure 8: The complete process of stereo gamma camera calibration. Step 1: capture LED lights with the gamma and vision cameras and find the homography. Step 2: obtain several visual images of a calibration pattern (left and right images). Step 3: apply $H_L$ and $H_R$ to the left and right vision images to obtain the left and right virtual gamma images. Step 4: apply Zhang's method to the virtual gamma images to obtain the left and right gamma camera parameters.
Table 3: Extrinsic parameters of the right gamma camera with respect to the left camera.

R_x       R_y        R_z       t_x     t_y
0.00783   -0.04690   0.00209   -5103   -114
approach is to calculate distances to the previously used bright LED spots, which are installed at already-known distances. Two kinds of experimental setups, "P4" and "P8", are used in the distance measurement; they were already shown in Figures 5 and 7. In the first experimental setup, four LEDs are placed at different positions, as shown in Figure 5(a). In the second experimental setup, eight LEDs are placed at different positions, as shown in Figure 7(a).
The positions of the light spots are measured using a Bosch GLM 250 VF Professional laser rangefinder. After gamma camera calibration, the intrinsic and extrinsic camera parameters are obtained and used to calculate the left and right perspective projection matrices (PPMs). Figure 9 shows left and right stereo gamma images in which image processing methods such as adaptive thresholding, contour detection, and ellipse fitting are applied to find the center points ($(u, v)$ and $(i, j)$) of each LED light spot of the "P4" setup. The essential and fundamental matrices can also be calculated from the camera parameters according to (4), and epipolar geometry is applied to identify the corresponding matching points in the left and right gamma images. These correspondences are represented with identical colors in Figure 9.
$$
E = R\,[t]_{\times},\qquad F = K'^{-T} E K^{-1}. \tag{4}
$$
$E$ and $F$ represent the essential and fundamental matrices of the stereo gamma camera, respectively. $R[t]_{\times}$ is the product of the rotation matrix and the skew-symmetric (cross-product) matrix of the translation from the extrinsic camera parameters, whereas $K'$ and $K$ represent the intrinsic parameters of the left and right cameras, respectively. The 3D coordinates $W(X, Y, Z)$ for
Figure 9: Corresponding matching points in left and right gamma images of the "P4" setup.
each matching point set can be calculated using the linear equation (5):

$$
AW = Y. \tag{5}
$$

Here, $A$ is a 4×3 matrix and $Y$ is a 4×1 vector, as given below. The pseudo-inverse of $A$ is calculated using the singular value decomposition (SVD) method.
$$
\mathrm{PPM}_L = \begin{bmatrix}
(a^1)^T & a_{14} \\
(a^2)^T & a_{24} \\
(a^3)^T & a_{34}
\end{bmatrix},\qquad
\mathrm{PPM}_R = \begin{bmatrix}
(b^1)^T & b_{14} \\
(b^2)^T & b_{24} \\
(b^3)^T & b_{34}
\end{bmatrix},
$$
$$
A = \begin{bmatrix}
(a^1 - u\,a^3)^T \\
(a^2 - v\,a^3)^T \\
(b^1 - i\,b^3)^T \\
(b^2 - j\,b^3)^T
\end{bmatrix},\qquad
Y = \begin{bmatrix}
u\,a_{34} - a_{14} \\
v\,a_{34} - a_{24} \\
i\,b_{34} - b_{14} \\
j\,b_{34} - b_{24}
\end{bmatrix}. \tag{6}
$$
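The linear triangulation in (5) and (6) can be transcribed directly in NumPy; this is a sketch, with `np.linalg.lstsq` standing in for the SVD-based pseudo-inverse:

```python
import numpy as np

def triangulate(ppm_l, ppm_r, uv, ij):
    """Solve AW = Y of (5)-(6) for the 3D point W = (X, Y, Z), given the
    left/right 3x4 perspective projection matrices and the matched
    spot centers (u, v) and (i, j)."""
    (u, v), (i, j) = uv, ij
    # Rows (a^1 - u a^3)^T, (a^2 - v a^3)^T, (b^1 - i b^3)^T, (b^2 - j b^3)^T.
    A = np.vstack([
        ppm_l[0, :3] - u * ppm_l[2, :3],
        ppm_l[1, :3] - v * ppm_l[2, :3],
        ppm_r[0, :3] - i * ppm_r[2, :3],
        ppm_r[1, :3] - j * ppm_r[2, :3],
    ])
    # Right-hand side Y, entry by entry as in (6).
    Y = np.array([
        u * ppm_l[2, 3] - ppm_l[0, 3],
        v * ppm_l[2, 3] - ppm_l[1, 3],
        i * ppm_r[2, 3] - ppm_r[0, 3],
        j * ppm_r[2, 3] - ppm_r[1, 3],
    ])
    # Least-squares solution via SVD (the pseudo-inverse of A).
    W, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return W
```

The PPMs themselves are built in the standard way, $\mathrm{PPM} = K[R\,|\,t]$, from the intrinsic and extrinsic parameters of Tables 2 and 3.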
Figure 10 shows the "P4" experimental setup. With this setup, we conducted two experiments by changing the positions of the four light points, as shown in the figure. Figure 10(a) is the same setup used in the homography calibration; in Figure 10(b), the four LED lights are placed at the same distance from the gamma camera. Using these two setups, two
Table 4: Distance measurement results of the "P4" experimental setup.

Experiment            Point 1  Point 2  Point 3  Point 4   Avg. error (%)  Std. dev.
P4E1  Distance (cm)   210      230      250      270       1.13            1.13
      Measured (cm)   207      228      249      274
P4E2  Distance (cm)   300      300      300      300       2.41            1.59
      Measured (cm)   298      305      287      309
distance measurement experiments, "P4E1" and "P4E2", are done. As shown in Table 4, the distance measurement is very accurate: the average percentage error is about 1.13% in the "P4E1" and 2.41% in the "P4E2" experiments. For each point light in Figure 10(a), the measurement error is about 1∼4 cm at camera distances of 210∼270 cm. As the distance to the LED light increases, the measurement error also increases.
We use more LED light points to analyze the accuracy of distance measurement. Figure 11 shows the second experimental setup, "P8", using eight LED lights fixed at different distances. As shown in the figure, "point1", "point2", "point6", and "point7" are on the same plane. The four other light points are fixed at different distances. With this setup, we conducted two experiments, "P8E1" and "P8E2", by placing the gamma camera at different distances from the setup. The ground-truth distance to each LED light is shown in Table 5.
Figure 12 shows the left and right gamma images of the "P8E1" experiment. Stereo correspondences between the left and right images are shown with the same-colored ellipse models. The center coordinates of the corresponding ellipse models are used to compute the distance of each LED light point. The calibration parameters of these experiments are also derived using the same setup as described in Section 3.
Table 5 shows the results of experiments "P8E1" and "P8E2". The average percentage distance error is about 2%
Table 5: Distance measurement results of the "P8" experimental setup.

Experiment            P1   P2   P3   P4   P5   P6   P7   P8    Avg. error (%)  Std. dev.
P8E1  Distance (cm)   320  320  295  310  330  320  320  345   2.09            1.63
      Measured (cm)   315  314  291  324  328  320  312  360
P8E2  Distance (cm)   300  300  310  325  300  300  275  290   7.71            5.83
      Measured (cm)   312  291  261  272  300  324  298  308
Figure 10: The first experimental setup of the distance measurement using four LED lights (Point1∼Point4): (a) P4E1 and (b) P4E2.
Figure 11: Experimental setup of the distance measurement using eight LED lights (Point1∼Point8). Two experiments, "P8E1" and "P8E2", are done using the same setup.
in the "P8E1" and about 8% in the "P8E2" experiments, respectively. The measurement error is slightly higher than that obtained with four LEDs. This may be caused by using more light points in the computation of the homography relationship. We assume that the centers of the vision and gamma cameras coincide, so that light points that are not on the plane can also be used for homography computation. In reality, however, alignment error between the centers of the two cameras can yield higher error, not only in the homography computation but also in the distance measurement.
As the final experiment, we use a real radioactive source to obtain stereo gamma images and measure the distance to the source. Figure 13(a) shows a facility for general-purpose experiments using a single radioactive source. In this facility, a Cs-137 radioactive source exposes gamma radiation of 100 mSv/h at a 2 m distance. As shown in the figure, the gamma source is exposed as a point source. We change the position of the stereo gamma camera module from 150 cm to 400 cm and measure the distance to the source using the stereo gamma images captured at each distance.
Figure 13(b) shows gamma images captured from the stereo gamma camera. Due to the parallel stereo geometry, the position difference in the stereo images becomes smaller as the distance to the radioactive source increases. At each distance, the center coordinates of the gamma source are
Table 6: Distance measurement results to a real gamma source.

Distance (cm)   150    200    250    300    350    400    Avg. error (%)  Std. dev.
Measured (cm)   149.9  194.1  249.8  300.4  352.2  410.9  3.29            3.98
Figure 12: Corresponding points in left and right gamma images of eight LED points.
Figure 13: Stereo gamma image acquisition using a real radiation source: (a) experimental facility (gamma detector and point gamma source) and (b) left and right stereo gamma images at measurement distances of 150, 200, 250, 300, 350, and 400 cm.
extracted, and the real distances to the source are calculated as in Table 6. The average percentage error is about 3.29%, which is a reasonably accurate result.
6 Conclusion
In this paper, we proposed a new method to calibrate stereo radiation detection devices using a planar homography relationship. Because it is impossible to capture calibration pattern images with a gamma camera, we use the planar homography relationship between the gamma and vision cameras to generate virtual calibration images. Bright LED point lights are used as calibration objects, and a gamma camera is used as the radiation detection device. A set of calibration pattern images captured from a vision camera is converted into gamma images by applying the homography relationships; Zhang's calibration method is then applied to calibrate the gamma camera. We performed distance measurement experiments using the calibrated gamma camera to check the accuracy of the proposed method. The 3D coordinates of LED point lights and a real gamma radiation source are obtained with high accuracy using stereo gamma images.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Civil Military Technology Cooperation Center and the Korea Atomic Energy Research Institute.
References
[1] W. Lee and D. Wehe, "3D position of radiation sources using an automated gamma camera and ML algorithm with energy-dependent response functions," Nuclear Instruments and Methods in Physics Research A, vol. 531, no. 1-2, pp. 270–275, 2004.
[2] T. E. Peterson and L. R. Furenlid, "SPECT detectors: the Anger Camera and beyond," Physics in Medicine and Biology, vol. 56, no. 17, pp. R145–R182, 2011.
[3] H. Kwon, J. Park, and A. C. Kak, "A new approach for active stereo camera calibration," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3180–3185, April 2007.
[4] H. Yu and Y. Wang, "An improved self-calibration method for active stereo camera," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 5186–5190, June 2006.
[5] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[6] S.-Y. Park and G. G. Park, "Active calibration of camera-projector systems based on planar homography," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 320–323, August 2010.
[7] G.-Q. Wei and S. D. Ma, "Complete two-plane camera calibration method and experimental comparisons," in Proceedings of the IEEE 4th International Conference on Computer Vision, pp. 439–445, May 1993.
[8] Y. G. Hwang, N. H. Lee, J. Y. Kim, and S. H. Jeong, "The study for the performance analysis of the stereo radiation detector," in Proceedings of the International Conference on Electronics, Information, and Communication (ICEIC '15), 2015.
[9] P. T. Durrant, M. Dallimore, I. D. Jupp, and D. Ramsden, "The application of pinhole and coded aperture imaging in the nuclear environment," Nuclear Instruments and Methods in Physics Research A, vol. 422, no. 1–3, pp. 667–671, 1999.
[10] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2004.
[10] R Hartley and A Zisserman Multiple View Geometry inComputer Vision Cambridge University Press Cambridge UK2004
International Journal of
AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014
RoboticsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Active and Passive Electronic Components
Control Scienceand Engineering
Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
International Journal of
RotatingMachinery
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporation httpwwwhindawicom
Journal ofEngineeringVolume 2014
Submit your manuscripts athttpwwwhindawicom
VLSI Design
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Shock and Vibration
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Civil EngineeringAdvances in
Acoustics and VibrationAdvances in
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Electrical and Computer Engineering
Journal of
Advances inOptoElectronics
Hindawi Publishing Corporation httpwwwhindawicom
Volume 2014
The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014
SensorsJournal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Chemical EngineeringInternational Journal of Antennas and
Propagation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
Navigation and Observation
International Journal of
Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014
DistributedSensor Networks
International Journal of
Journal of Sensors 7
Table 2: Intrinsic parameters of the left and right gamma cameras.

Camera          Focal length (fx)   Focal length (fy)   Principal point (Cx)   Principal point (Cy)
Left camera     1048.01             1044.52             302.31                 297.64
Right camera    1048.92             1044.95             310.61                 302.64
Figure 8: The complete process of stereo gamma camera calibration. Step 1: capture the LED lights with the gamma camera and the vision camera and find the homographies. Step 2: obtain several visual images of a calibration pattern (left and right images). Step 3: apply HL and HR to the left and right vision images to generate left and right virtual gamma images. Step 4: apply Zhang's method to the virtual gamma images to obtain the left and right gamma camera parameters.
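Step 3 above warps each vision-camera calibration point into the gamma-image coordinate frame through the calibrated homographies HL and HR. A minimal numpy sketch of that planar point mapping (the function name and the example homography are illustrative assumptions, not values from the paper):

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 points through a 3x3 planar homography H and renormalize."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # lift to homogeneous coords
    mapped = (H @ pts_h.T).T
    return mapped[:, :2] / mapped[:, 2:3]             # divide out the scale factor

# A pure-translation homography shifts every pattern corner by (5, -3).
H = np.array([[1.0, 0.0, 5.0],
              [0.0, 1.0, -3.0],
              [0.0, 0.0, 1.0]])
corners = np.array([[10.0, 20.0], [30.0, 40.0]])
print(apply_homography(H, corners))  # two shifted corners: (15, 17) and (35, 37)
```

Applying the same mapping with HL (or HR) to every pattern corner produces the "virtual gamma image" coordinates that Zhang's method is then run on.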
Table 3: Extrinsic parameters of the right gamma camera with respect to the left camera.

Rx         Ry          Rz         tx        ty
0.00783    −0.04690    0.00209    −51.03    −1.14
approach of this is to calculate distances to the previously used bright LED spots, which are installed at already-known distances. Two experimental setups, "P4" and "P8", are used for distance measurement; they were shown earlier in Figures 5 and 7. In the first setup, four LEDs are placed at different positions, as shown in Figure 5(a). In the second setup, eight LEDs are placed at different positions, as shown in Figure 7(a).
The positions of the light spots are measured with a Bosch GLM 250 VF Professional laser rangefinder. After gamma camera calibration, the intrinsic and extrinsic camera parameters are obtained and used to calculate the left and right perspective projection matrices (PPM). Figure 9 shows left and right stereo gamma images, to which image processing methods such as adaptive thresholding, contour detection, and ellipse fitting are applied to find the center points ((u, v) and (i, j)) of each LED light spot of the "P4" setup. The essential and fundamental matrices can also be calculated from the camera parameters according to (4), and epipolar geometry is applied to identify the corresponding matching points in the left and right gamma images. These correspondences are represented with identical colors in Figure 9.
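As a simplified, dependency-free stand-in for the adaptive thresholding, contour detection, and ellipse fitting used above, one can threshold the gamma image, label the connected bright blobs, and take each blob's centroid. This sketch (function name and threshold are ours) illustrates the spot-center idea, not the paper's exact pipeline:

```python
import numpy as np

def spot_centers(img, thresh):
    """Threshold a gamma image, label 4-connected bright blobs by flood fill,
    and return each blob's centroid as (row, col)."""
    mask = img > thresh
    labels = np.zeros(img.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue                      # pixel already belongs to a blob
        current += 1
        labels[seed] = current
        stack = [seed]
        while stack:                      # iterative flood fill of one blob
            r, c = stack.pop()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                        and mask[rr, cc] and not labels[rr, cc]):
                    labels[rr, cc] = current
                    stack.append((rr, cc))
    return [tuple(float(x) for x in np.mean(np.nonzero(labels == k), axis=1))
            for k in range(1, current + 1)]

img = np.zeros((5, 7))
img[1, 1] = img[1, 2] = 1.0   # one two-pixel spot
img[3, 5] = 1.0               # one single-pixel spot
print(spot_centers(img, 0.5))  # -> [(1.0, 1.5), (3.0, 5.0)]
```

Ellipse fitting, as in the paper, additionally recovers the spot's shape and is more robust to irregular blob boundaries than a plain centroid.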
E = R [t]_x,
F = K'^(-T) E K^(-1).    (4)

E and F represent the essential and fundamental matrices of the stereo gamma camera, respectively. R[t]_x is the product of the rotation matrix R and the cross-product (skew-symmetric) matrix [t]_x of the translation vector t from the extrinsic camera parameters, whereas K' and K represent the intrinsic parameter matrices of the left and right cameras, respectively. The 3D coordinates W(X, Y, Z) for
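Given calibrated R, t, K', and K, (4) is a few lines of numpy. The sketch below (function names are ours) follows the convention stated above, E = R[t]_x and F = K'^(-T) E K^(-1):

```python
import numpy as np

def skew(t):
    """Cross-product (skew-symmetric) matrix [t]_x, so skew(t) @ x == np.cross(t, x)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def stereo_matrices(R, t, K_left, K_right):
    """Essential and fundamental matrices per (4): E = R [t]_x, F = K'^{-T} E K^{-1}."""
    E = R @ skew(t)
    F = np.linalg.inv(K_left).T @ E @ np.linalg.inv(K_right)
    return E, F
```

With identity intrinsics (normalized image coordinates) F reduces to E, which is a quick sanity check on an implementation.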
Figure 9: Corresponding matching points in the (a) left and (b) right gamma images of the "P4" setup.
each matching point set can then be calculated from the linear equation

AW = Y.    (5)

Here A is a 4 × 3 matrix and Y is a 4 × 1 vector, as given in (6). The pseudo-inverse of A is computed using the singular value decomposition (SVD) method.
PPM_L = [ (a^1)^T  a_14 ;  (a^2)^T  a_24 ;  (a^3)^T  a_34 ],
PPM_R = [ (b^1)^T  b_14 ;  (b^2)^T  b_24 ;  (b^3)^T  b_34 ],

A = [ (a^1 - u a^3)^T ;  (a^2 - v a^3)^T ;  (b^1 - i b^3)^T ;  (b^2 - j b^3)^T ],

Y = [ u a_34 - a_14 ;  v a_34 - a_24 ;  i b_34 - b_14 ;  j b_34 - b_24 ].    (6)

Here (a^k)^T and (b^k)^T denote the first three elements of the k-th rows of PPM_L and PPM_R, and a_k4 and b_k4 denote their fourth elements.
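The construction of A and Y in (6) and the SVD-based solve can be written directly in numpy (function and variable names are ours; np.linalg.lstsq computes the pseudo-inverse solution via SVD internally):

```python
import numpy as np

def triangulate(PPM_L, PPM_R, uv, ij):
    """Linear triangulation: build A (4x3) and Y (4x1) from (6) and solve AW = Y."""
    u, v = uv
    i, j = ij
    a, b = PPM_L, PPM_R
    A = np.array([a[0, :3] - u * a[2, :3],   # (a^1 - u a^3)^T
                  a[1, :3] - v * a[2, :3],   # (a^2 - v a^3)^T
                  b[0, :3] - i * b[2, :3],   # (b^1 - i b^3)^T
                  b[1, :3] - j * b[2, :3]])  # (b^2 - j b^3)^T
    Y = np.array([u * a[2, 3] - a[0, 3],
                  v * a[2, 3] - a[1, 3],
                  i * b[2, 3] - b[0, 3],
                  j * b[2, 3] - b[1, 3]])
    W, *_ = np.linalg.lstsq(A, Y, rcond=None)  # least-squares (SVD) solution
    return W                                   # 3D point (X, Y, Z)
```

Projecting a known 3D point through two synthetic PPMs and triangulating it back is a convenient way to verify such an implementation.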
Figure 10 shows the "P4" experimental setup. With this setup, we performed two experiments by changing the positions of the four light points, as shown in the figure. Figure 10(a) is the same configuration used in the homography calibration. In Figure 10(b), the four LED lights are placed at the same distance from the gamma camera. Using these two configurations, two
Table 4: Distance measurement results of the "P4" experimental setup.

Experiment            Point 1   Point 2   Point 3   Point 4   Avg. error (%)   Std. dev.
P4E1  Distance (cm)   210       230       250       270       1.13             1.13
      Measured (cm)   207       228       249       274
P4E2  Distance (cm)   300       300       300       300       2.41             1.59
      Measured (cm)   298       305       287       309
distance measurement experiments, "P4E1" and "P4E2", are performed. As shown in Table 4, the distance measurement is very accurate: the average percentage error is about 1.13% in the "P4E1" experiment and 2.41% in the "P4E2" experiment. For each point light in Figure 10(a), the measurement error is about 1–4 cm over camera distances of 210–270 cm. As the distance to an LED light increases, the measurement error also increases.
We use more LED light points to analyze the accuracy of distance measurement. Figure 11 shows the second experimental setup, "P8", in which eight LED lights are fixed at different distances. As shown in the figure, "point1", "point2", "point6", and "point7" lie on the same plane, while the four other light points are fixed at different distances. With this setup, we performed two experiments, "P8E1" and "P8E2", by placing the gamma camera at different distances from the setup. The ground-truth distance to each LED light is given in Table 5.
Figure 12 shows the left and right gamma images of the "P8E1" experiment. Stereo correspondences between the left and right images are indicated by identically colored ellipse models. The center coordinates of the corresponding ellipse models are used to compute the distance of each LED light point. The calibration parameters for these experiments are derived using the same procedure described in Section 3.
Table 5 shows the results of experiments "P8E1" and "P8E2". The average percentile distance error is about 2%
Table 5: Distance measurement results of the "P8" experimental setup.

Experiment            1     2     3     4     5     6     7     8     Avg. error (%)   Std. dev.
P8E1  Distance (cm)   320   320   295   310   330   320   320   345   2.09             1.63
      Measured (cm)   315   314   291   324   328   320   312   360
P8E2  Distance (cm)   300   300   310   325   300   300   275   290   7.71             5.83
      Measured (cm)   312   291   261   272   300   324   298   308
Figure 10: The first experimental setup of the distance measurement using four LED lights: (a) P4E1 and (b) P4E2.
Figure 11: Experimental setup of the distance measurement using eight LED lights. Two experiments, "P8E1" and "P8E2", are performed using the same setup.
in "P8E1" and about 8% in "P8E2", respectively. The measurement error is slightly higher than that obtained with four LEDs. This may be caused by using more light points in the computation of the homography relationship. We assume that the optical centers of the vision and gamma cameras coincide, so that light points that do not lie on the calibration plane can also be used for the homography computation. In reality, however, misalignment of the two camera centers can yield higher error, not only in the homography computation but also in the distance measurement.
As the final experiment, we use a real radioactive source to obtain stereo gamma images and measure the distance to the source. Figure 13(a) shows a facility for general-purpose experiments using a single radioactive source. In this facility, a Cs-137 source produces a gamma dose rate of 100 mSv/h at a 2 m distance. As shown in the figure, the source is exposed as a point source. We change the position of the stereo gamma camera module from 150 cm to 400 cm and measure the distance to the source using the stereo gamma images captured at each distance.
Figure 13(b) shows the gamma images captured by the stereo gamma camera. Owing to the parallel stereo geometry, the position difference (disparity) between the stereo images becomes smaller as the distance to the radioactive source increases. At each distance, the center coordinates of the gamma source are
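The inverse relation between disparity and distance noted above is the standard parallel-stereo depth equation Z = f·B/d for focal length f (pixels), baseline B, and disparity d (pixels). A one-line sketch, with illustrative values rather than the calibrated ones:

```python
def depth_from_disparity(f_px, baseline_cm, disparity_px):
    """Parallel stereo geometry: Z = f * B / d, so disparity shrinks as 1/Z."""
    return f_px * baseline_cm / disparity_px

# With f = 1000 px and B = 50 cm (assumed values), a 125 px disparity
# corresponds to a 400 cm source distance; halving d doubles Z.
print(depth_from_disparity(1000.0, 50.0, 125.0))  # -> 400.0
```

This is why the measurement error grows with distance: at large Z, a fixed subpixel error in the disparity translates into a proportionally larger depth error.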
Table 6: Distance measurement results to the real gamma source.

Distance (cm)   150     200     250     300     350     400     Avg. error (%)   Std. dev.
Measured (cm)   149.9   194.1   249.8   300.4   352.2   410.9   3.29             3.98
Figure 12: Corresponding points in the (a) left and (b) right gamma images of the eight LED points.
Figure 13: Stereo gamma image acquisition using a real radiation source: (a) the experimental facility, with the gamma detector and the point gamma source, and (b) the left and right stereo gamma images at measurement distances of 150 cm, 200 cm, 250 cm, 300 cm, 350 cm, and 400 cm.
extracted, and the real distances to the source are calculated as shown in Table 6. The average percentage error is about 3.29%, which is a reasonably accurate result.
6 Conclusion
In this paper, we proposed a new method to calibrate stereo radiation detection devices using a planar homography relationship. Because it is impossible to capture calibration pattern images with a gamma camera, we use the planar homography relationship between the gamma and vision cameras to generate virtual calibration images. Bright LED point lights are used as calibration objects, and a gamma camera is used as the radiation detection device. A set of calibration pattern images captured by a vision camera is converted into gamma images by applying the homography relationships; Zhang's calibration method is then applied to calibrate the gamma camera. We performed distance measurement experiments using the calibrated gamma camera to verify the accuracy of the proposed method. The 3D coordinates of LED point lights and of a real gamma radiation source are obtained with high accuracy using stereo gamma images.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Civil Military Technology Cooperation Center and the Korea Atomic Energy Research Institute.
References
[1] W. Lee and D. Wehe, "3D position of radiation sources using an automated gamma camera and ML algorithm with energy-dependent response functions," Nuclear Instruments and Methods in Physics Research A, vol. 531, no. 1-2, pp. 270–275, 2004.
[2] T. E. Peterson and L. R. Furenlid, "SPECT detectors: the Anger Camera and beyond," Physics in Medicine and Biology, vol. 56, no. 17, pp. R145–R182, 2011.
[3] H. Kwon, J. Park, and A. C. Kak, "A new approach for active stereo camera calibration," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3180–3185, April 2007.
[4] H. Yu and Y. Wang, "An improved self-calibration method for active stereo camera," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 5186–5190, June 2006.
[5] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000.
[6] S.-Y. Park and G. G. Park, "Active calibration of camera-projector systems based on planar homography," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 320–323, August 2010.
[7] G.-Q. Wei and S. D. Ma, "Complete two-plane camera calibration method and experimental comparisons," in Proceedings of the IEEE 4th International Conference on Computer Vision, pp. 439–445, May 1993.
[8] Y. G. Hwang, N. H. Lee, J. Y. Kim, and S. H. Jeong, "The study for the performance analysis of the stereo radiation detector," in Proceedings of the International Conference on Electronics, Information, and Communication (ICEIC '15), 2015.
[9] P. T. Durrant, M. Dallimore, I. D. Jupp, and D. Ramsden, "The application of pinhole and coded aperture imaging in the nuclear environment," Nuclear Instruments and Methods in Physics Research A, vol. 422, no. 1–3, pp. 667–671, 1999.
[10] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2004.
Journal of Sensors 9
Table 5 Distance measurement results of ldquoP8rdquo experimental setup
Experiments Test point Avg error () Std Dev1 2 3 4 5 6 7 8
P8E1Distance (cm) 320 320 295 310 330 320 320 345 209 163Measured (cm) 315 314 291 324 328 320 312 360P8E2Distance (cm) 300 300 310 325 300 300 275 290 771 583Measured (cm) 312 291 261 272 300 324 298 308
Point1
Point2
Point3Point4
(a)
Point1
Point2
Point3
Point4
(b)
Figure 10 The first experimental setup of the distance measurement using four LED lights (a) P4E1 and (b) P4E2
Point1Point2
Point3Point4
Point5Point6Point7
Point8
Figure 11 Experimental setup of the distance measurement using 8LED lights Two experiments ldquoP8E1rdquo and ldquoP8E2rdquo are done using thesame setup
in ldquoP8E1rdquo and 8 in ldquoP8E2rdquo experiment respectively Themeasurement error is a little higher than that of using fourLEDs This may be caused by using more light points in the
computation of the homography relationshipWe assume thatthe centers of the vision and gamma camera are the same sothat light points which are not in the plane can be used forhomography computation However in reality the alignmenterror of the centers of the two cameras can yield higher errornot only in the homography computation but also in thedistance measurement
As the final experiment we use a real radioactive sourceto obtain stereo gamma images and measure the distance tothe source Figure 13(a) shows a facility for general purposeexperiments using a single radioactive source In this facilitya radioactive source of Cs-137 exposes gamma radiation of100mSvh at 2m distance As shown in the figure the gammasource is exposed as a point source We change the positionof the stereo gamma camera module from 150 cm to 400 cmand measure the distance to the source using stereo gammaimages captured in each distance
Figure 13(b) shows gamma images captured from thestereo gamma camera The position difference in stereoimages becomes smaller when the distance to the radioactivesource becomes farther due to the parallel stereo geometry Ineach distance the center coordinates of the gamma source are
10 Journal of Sensors
Table 6 Distance measurement results to real gamma
Distance (cm) 150 200 250 300 350 400 Avg error () Std DevMeasured (cm) 1499 1941 2498 3004 3522 4109 329 398
(a) (b)
Figure 12 Corresponding points in left and right gamma images of 8 LED points
Gammadetector
Gamma source(point)
(a)
Left image
Right image
150 cm 200 cm 250 cm 300 cm 350 cm 400 cm
(b)
Figure 13 Stereo gamma image acquisition using a real radiation source (a) experimental facility and (b) stereo gamma image at differentmeasurement distances
Journal of Sensors 11
extracted and the real distances to the source are calculated asin Table 6 Average percentage error is about 329 and it isreasonably accurate results
6 Conclusion
In this paper we proposed a new method to calibrate stereoradiation detection devices using planar homography rela-tionship Because it is impossible to capture the calibrationpattern images from a gamma camera we use the planarhomography relationship between gamma vision camerasto generate virtual calibration images Bright LED pointlights are used as calibration objects and a gamma camerais used as a radiation detection device A set of calibrationpattern images captured from a vision camera are convertedinto gamma images by applying homography relationshipsZhangrsquos calibration method is applied to calibrate the gammacamera afterwards We performed distance measurementexperiments using the calibrated gamma camera to check theaccuracy of the proposedmethodThe 3D coordinates of LEDpoint lights and a real gamma radiation source are obtainedwith high accuracy using stereo gamma images
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Civil Military Technology Cooperation Center and the Korea Atomic Energy Research Institute.
References
[1] W. Lee and D. Wehe, "3D position of radiation sources using an automated gamma camera and ML algorithm with energy-dependent response functions," Nuclear Instruments and Methods in Physics Research A, vol. 531, no. 1-2, pp. 270-275, 2004.
[2] T. E. Peterson and L. R. Furenlid, "SPECT detectors: the Anger Camera and beyond," Physics in Medicine and Biology, vol. 56, no. 17, pp. R145-R182, 2011.
[3] H. Kwon, J. Park, and A. C. Kak, "A new approach for active stereo camera calibration," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '07), pp. 3180-3185, April 2007.
[4] H. Yu and Y. Wang, "An improved self-calibration method for active stereo camera," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 1, pp. 5186-5190, June 2006.
[5] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000.
[6] S.-Y. Park and G. G. Park, "Active calibration of camera-projector systems based on planar homography," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR '10), pp. 320-323, August 2010.
[7] G.-Q. Wei and S. D. Ma, "Complete two-plane camera calibration method and experimental comparisons," in Proceedings of the IEEE 4th International Conference on Computer Vision, pp. 439-445, May 1993.
[8] Y. G. Hwang, N. H. Lee, J. Y. Kim, and S. H. Jeong, "The study for the performance analysis of the stereo radiation detector," in Proceedings of the International Conference on Electronics, Information, and Communication (ICEIC '15), 2015.
[9] P. T. Durrant, M. Dallimore, I. D. Jupp, and D. Ramsden, "The application of pinhole and coded aperture imaging in the nuclear environment," Nuclear Instruments and Methods in Physics Research A, vol. 422, no. 1-3, pp. 667-671, 1999.
[10] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, Cambridge, UK, 2004.