
NASA Technical Paper 1819

Marking Parts To Aid Robot Vision

John W. Bales, Tuskegee Institute, Tuskegee, Alabama
L. Keith Barker, Langley Research Center, Hampton, Virginia

National Aeronautics and Space Administration
Scientific and Technical Information Branch
April 1981


SUMMARY

The premarking of parts for subsequent identification by a robot vision system appears to be beneficial as an aid in the automation of certain tasks, such as construction in space. Accordingly, a simple, color-coded marking system is presented which allows a computer vision system to locate an object, calculate its orientation, and determine its identity. Furthermore, such a system has the potential to operate accurately, and, because the computer shape analysis problem has been greatly simplified, it has the potential to operate in real time.

    INTRODUCTION

The use of robots in industry has increased dramatically in the past several years, particularly in Japan but also in this country. A robot is usually a programmable, general purpose arm dedicated to specific tasks. For the most part, these arms are blind. That is, the software controlling most arms makes no use of visual inputs. How to incorporate visual inputs into such software is a problem of much current interest.

One of the most basic tasks of a robot vision system is to locate and identify parts for handling. Four primary problems which must be overcome in order to carry out such a task are noted in the succeeding paragraphs.

The first problem is to distinguish the part from its background. Many sophisticated approaches exist for distinguishing the image of a sought object from its background. However, most approaches require too much computation to allow real-time operation and, in addition, require specially controlled lighting. Any system can be said to operate in real time if it permits the robot arm to operate at an adequate rate. An adequate rate for one situation or operation may be hopelessly slow for others. Since robot arms will, for the immediate future, be operating alongside human workers, any rate which slows down the workers would be inadequate.

The second problem is to determine the orientation of the part. In a precisely controlled situation, the sought part may be prepositioned in the correct orientation. This approach limits the versatility of the system. An adequate vision system should end the necessity of precise placement of parts

for the robot arm. However, software for determining the orientation of complex parts can involve excessive computation, slowing real-time operation.

The third problem is to determine the position of the part. In many manufacturing situations this problem is simplified by the fact that the part is on a conveyer belt, the position of which is known. In this situation, a single camera is adequate to provide the information needed to ascertain the position of the part. If the part is not on a conveyer belt and is, in fact, randomly positioned and oriented, some other technique is required. It is


natural to consider the use of two cameras to provide the position information to a robot vision system in this more complex domain. However, the extensive computation required for the cross correlation of two images severely handicaps any two-camera system because of the real-time requirement. Currently, range-finding systems involving a single camera are more promising. Of particular interest is a system first described by Shirai (ref. 1) which uses a single camera and a plane of light as a range finder. In this system a sheet or plane of light is projected toward an object to be identified. The intersection of this plane of light with the object results in a visible light stripe being drawn across the object's surface. This stripe is essentially moved across the surface by panning the plane-of-light generator. The resulting series of light stripes are recorded by a TV camera. Computer software is used to classify sets of straight line segments in these stripes into planes. The object is then recognized by the relationship between these planes. Other plane-of-light systems appear in references 2 and 3. Investigators at the National Bureau of Standards (G. J. VanderBurg, J. S. Albus, and E. Barkmeyer) performed experiments in 1979 in which a plane-of-light vision system was used to successfully acquire an object randomly placed on a table. At present, this type of range-finder system is too slow for real-time operation in three-dimensional space.

The fourth problem is to track the part if it is in motion. Unfortunately, the time required to distinguish the part from its background, determine the orientation of the part, and calculate its position restricts the rate at which the system can track a moving object.

The purpose of this paper is to report on investigations into a vision system capable of handling the previously mentioned problems while requiring only one camera and a minimum of software. For faster execution time, a simple color-coded marking system eliminates the time-consuming shape analysis required by plane-of-light systems to operate in a three-dimensional environment.

SYMBOLS

A                     point on observed object
a                     constant
B                     point on observed object
C                     exact center of circular image
D                     centroid of illuminated pixels
d                     diameter of cylinder, cm
f                     distance from lens center to front image plane, cm
I                     diameter of disc image, cm
k                     constant
M                     width of image plane, pixels
m                     mean value
n                     image diameter, pixels
P                     lens center
P'                    hypothetical reflection of point P
P1, P2, P3, P4, P5    point locations on cylinder
R*                    range from lens center to object projection on principal ray
r1, r2, r3, r4, r5    range from lens center to point on object, cm
r(A), r(B)            range from lens center to points A and B, respectively, cm
S                     diameter of disc, cm
S1, S2, S3, S4, S5    dot locations on observed object
u                     unit vector
uA, uB                unit vectors from P toward A and B, respectively
W                     width of image plane, cm
w, w1, w2             spacing between marks on observed object, cm
x, y                  rectangular coordinates in image plane
xc, yc                exact center of circular image
xi, yi                coordinates of center of ith illuminated pixel
x̄, ȳ                  centroid coordinates of illuminated pixels
α1, α2                angles used to compute rotation angle ψ for cylinder, deg
γ                     angular field of view measured on diagonal of image plane, deg
δ                     tilt angle away from perpendicular to line of sight, deg
θ                     angular separation between rays emanating from P through A' and B', deg
θ1, θ2, θ3, θ4        angular separations of surface markings, deg
σ                     least-squares fit of σn, pixels or cm


a flat surface. The marks investigated, which give range (R) and orientation (O) plus a given number of extra parameters (N), will be referred to as RO + N marks.

RO + 1 Mark for Cylinders

The RO + 1 mark for a cylinder could consist of two stripes placed radially around the cylinder, with both stripes connected by a third which spirals once around the cylinder (fig. 1). The spiral stripe and one radial stripe are the same color, while the other radial stripe is the complement of that color to resolve any ambiguity in the direction. This marking scheme is used in an approximate way to find the range and orientation of the cylinder, plus the diameter of the cylinder. The diameter is encoded by spacing the two circular stripes so that the width of the space times the diameter of the cylinder is a predetermined constant.

Image plane.- In figure 2, the image plane is depicted as a matrix of light-sensitive elements called pixels. Actually, the image plane shown in figure 2 is an imaginary front image plane introduced for convenience by workers in computer graphics and machine perception to display the image in an upright position. The real image plane is located behind the lens center and gives an inverted image. Image coordinates in the two planes are related simply by a multiplicative factor of minus unity. The lens center is located at a known distance from the image plane on the principal ray, which is perpendicular to the image plane.
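As a small illustration of this convention (not part of the original report), the following sketch projects a point onto the imaginary front image plane; the lens center is placed at the origin with the principal ray along +z, and the function name is simply illustrative.

```python
def project_to_front_plane(point, f):
    """Project a 3-D point (x, y, z), expressed in a frame whose origin is
    the lens center and whose +z axis is the principal ray, onto the
    imaginary front image plane located a distance f in front of the lens.
    The front plane gives upright (non-inverted) image coordinates."""
    x, y, z = point
    return (f * x / z, f * y / z)
```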

Angular separation of image points.- If the coordinates of two points A' and B' in the image plane (shown in fig. 3) are known and the coordinates of P are known, the angular separation θ between the rays emanating from P through A' and B' can be found from the definition of the dot product as

    θ = cos⁻¹ [ (PA' · PB') / (|PA'| |PB'|) ]        (1)

In a camera, P is the lens center, which is known in advance, and A' and B' are directly measurable in the image plane coordinate system (fig. 2).
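As a concrete reading of equation (1) (a sketch, not taken from the report; the coordinate convention follows figure 2, and the names are illustrative), the angular separation of two image points can be computed as follows.

```python
import numpy as np

def angular_separation(a_img, b_img, f):
    """Angle (deg) between the rays from the lens center P through two image
    points A' and B', via the dot-product definition of equation (1).
    a_img, b_img : (x, y) coordinates of A' and B' in the front image plane.
    f            : distance from the lens center to the front image plane.
    P is placed at the origin with the principal ray along +z, so an image
    point (x, y) lies at (x, y, f)."""
    pa = np.array([a_img[0], a_img[1], f], dtype=float)   # vector P -> A'
    pb = np.array([b_img[0], b_img[1], f], dtype=float)   # vector P -> B'
    cos_theta = pa.dot(pb) / (np.linalg.norm(pa) * np.linalg.norm(pb))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```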

Diameter and range.- Shown in figure 4(a) is the portion of a cylinder with the two circular stripes and the separation distance w between these stripes. The image plane in front of the lens center P is not shown. The points P1 and P2 are points where P would be located if it were shifted parallel to the cylinder's axis to a point lying in the plane of the circular stripes. Assuming that P is located at P1 leads to the geometry of figure 4(b). However, the circular stripe and P are not really coplanar; therefore, use of figure 4(b) instead of figure 4(a) results in the approximation

    r1 = (d/2) csc(θ1/2)        (2)


where d is the momentarily unknown diameter of the cylinder. The angle θ1 can be found using equation (1), where the points A' and B' are the end points of the visible circular arc in the image plane. In a similar manner, the approximate equation for r2 is

    r2 = (d/2) csc(θ2/2)        (3)

Referring to figure 4(c), notice that by the law of cosines

    w² = r1² + r2² − 2 r1 r2 cos φ        (4)

where φ is the angular separation, found as in equation (1), between the rays from the lens center toward the two stripes. By assumption, the product of the width w and diameter d is a predetermined constant k, so that

    w = k/d        (5)

Substituting equations (2), (3), and (5) into equation (4) gives

    k²/d² = (d²/4) csc²(θ1/2) + (d²/4) csc²(θ2/2) − (d²/2) csc(θ1/2) csc(θ2/2) cos φ        (6)

in which only d is unknown. Solving for d yields

    d = { k² / [ (1/4) csc²(θ1/2) + (1/4) csc²(θ2/2) − (1/2) csc(θ1/2) csc(θ2/2) cos φ ] }^(1/4)        (7)

This value of d may then be substituted into equations (2) and (3) to solve for r1 and r2. Hence, both range and cylinder diameter are found.
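Collected into one routine, the development of equations (2) through (7) looks like the sketch below (Python; the equations are the reconstructions given above, and the argument names, including the angle phi between the rays toward the two stripes, are my own).

```python
import math

def cylinder_range_and_diameter(theta1_deg, theta2_deg, phi_deg, k):
    """Approximate cylinder diameter d and ranges r1, r2 (eqs. (2)-(7)).
    theta1_deg, theta2_deg : angles subtended by the visible arcs of
                             circular stripes 1 and 2 (from eq. (1)).
    phi_deg                : angular separation of the rays toward the stripes.
    k                      : predetermined constant, k = w * d."""
    c1 = 1.0 / math.sin(math.radians(theta1_deg) / 2.0)   # csc(theta1/2)
    c2 = 1.0 / math.sin(math.radians(theta2_deg) / 2.0)   # csc(theta2/2)
    cos_phi = math.cos(math.radians(phi_deg))
    # Eq. (6) rearranged: k**2 / d**4 = c1**2/4 + c2**2/4 - (c1*c2/2)*cos(phi)
    denom = 0.25 * c1**2 + 0.25 * c2**2 - 0.5 * c1 * c2 * cos_phi
    d = (k**2 / denom) ** 0.25      # eq. (7)
    r1 = 0.5 * d * c1               # eq. (2)
    r2 = 0.5 * d * c2               # eq. (3)
    return d, r1, r2
```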

Direction cosines of cylinder's axis.- In figure 5, A and B are points in the field of view whose ranges are known. Let r(A) = |PA| and r(B) = |PB|. Then, the orientation of AB is specified by the unit vector u = AB/|AB|, whose coordinates are the direction cosines of AB. Let

    PA = r(A) uA        (8)

and

    PB = r(B) uB        (9)

then

    AB = PB − PA = r(B) uB − r(A) uA        (10)

so

    u = [r(B) uB − r(A) uA] / |r(B) uB − r(A) uA|        (11)

where uA and uB are unit vectors from P toward A and B, respectively. This procedure is applied to figure 6 to obtain the direction cosines of the cylinder's axis.
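A minimal sketch of equations (8) to (11), under the same camera conventions as before (not from the report; the names are illustrative):

```python
import numpy as np

def direction_cosines(a_img, b_img, f, r_a, r_b):
    """Direction cosines (unit vector) of the segment AB, per eqs. (8)-(11).
    a_img, b_img : image-plane coordinates of A' and B'.
    f            : lens-center-to-image-plane distance.
    r_a, r_b     : ranges r(A) = |PA| and r(B) = |PB|."""
    u_a = np.array([a_img[0], a_img[1], f], dtype=float)
    u_a /= np.linalg.norm(u_a)                 # unit vector from P toward A
    u_b = np.array([b_img[0], b_img[1], f], dtype=float)
    u_b /= np.linalg.norm(u_b)                 # unit vector from P toward B
    ab = r_b * u_b - r_a * u_a                 # eq. (10): vector from A to B
    return ab / np.linalg.norm(ab)             # eq. (11)
```

Applied to the two circular stripes of figure 6, the returned vector gives the direction cosines of the cylinder's axis.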

Rotation of cylinder about its axis.- The orientation of the cylinder can be specified by the direction cosines of the cylinder axis and by an angle between 0° and 360° to indicate the amount of rotation about that axis. Figure 7 is used to examine rotation. The imaginary line 2 in figure 7 is parallel to the cylinder axis and indicates that points P1 and P3 are lined up with some distinctive feature of the cylinder, such as the notch shown. Traveling from P1 to P3 via the spiral stripe represents a rotation of 360°, so traveling from P1 to P2 represents some fraction of 360°, which is equivalent to the amount of rotation. The angle ψ indicates the amount of rotation about the axis with respect to an imaginary line, which is parallel to the cylinder axis. For example, if the points P2, P4, and P1 are coincident, then ψ = 0°; and if P2, P4, and P3 are coincident, then ψ = 360°. The angle ψ is given exactly by equation (12) as 360° times the ratio of two line segment lengths defined in figure 7.


In this development, equation (12) is approximated by

    ψ ≈ 360° (α1/α2)        (13)

The angles α1 and α2 can be approximated as follows. The arc portion of circular stripe 1 will appear on the image plane, which is not shown in figure 7. The point P5' is approximated by the point in the image plane which corresponds to a point halfway between the two end points of the arc. Point P4' is similarly approximated. Line 1 is then approximated by the straight line connecting these two approximations of the image points P5' and P4'. Now, the spiral stripe will cause a set of pixels to be illuminated in the image plane. Common pixel points of line 1 and the spiral stripe locate P2'. Since the image points P2', P4', and P5' are known, the angular separations α1 and α2 are computed as θ was in equation (1).

RO + N Mark for Flat Surfaces

Shown in figure 8(a) is a widget with three circular spots equally spaced in a straight line. One of the end spots is the complementary color of the other two, to establish direction. In figure 8(b), S1, S2, and S3 denote the three spots on the part. The angles θ1 and θ2 are found using equation (1). The triangle PS1S3 is extended to a parallelogram (fig. 8(b)) to illustrate the equations which follow.

Let r1 = |PS1|, r2 = |PS2|, and r3 = |PS3|. Then, if w1 is the width between spots, the law of cosines applied to the triangle PS1S2 gives

    w1² = r1² + r2² − 2 r1 r2 cos θ1        (14)

and, using PS2S3,

    w1² = r2² + r3² − 2 r2 r3 cos θ2        (15)

By applying the identity sin(180° − θ) = sin θ and the law of sines, it follows that

    r1 / sin θ2 = r3 / sin θ1 = 2 r2 / sin(θ1 + θ2)        (16)


In equations (14), (15), and (16), the quantities θ1, θ2, and w1 are known, and r1, r2, and r3 are unknown. From equation (16),

    r1 = 2 r2 sin θ2 / sin(θ1 + θ2)        (17)

and

    r3 = 2 r2 sin θ1 / sin(θ1 + θ2)        (18)

Substituting equation (17) into equation (14), or substituting equation (18) into equation (15), leaves r2 as a function of w1, θ1, and θ2. The resulting equation in either situation can be expressed as a single relation for r2, equation (19). Then, r1 and r3 are found by substituting equation (19) into equations (17) and (18). The direction cosines can now be found by the same procedure used in equations (8) to (11), remembering that the images of S1, S2, and S3 are directly measurable in the image plane.
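The range computation for the three-spot mark can be sketched as follows (Python; the closed-form expression for r2 is one algebraic working of the substitution described above, not necessarily the form the report used for equation (19)).

```python
import math

def three_spot_ranges(theta1_deg, theta2_deg, w1):
    """Ranges r1, r2, r3 to three collinear, equally spaced spots S1, S2, S3
    (eqs. (14)-(19)): theta1 separates S1 and S2, theta2 separates S2 and S3,
    and w1 is the spacing between adjacent spots."""
    t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
    s12 = math.sin(t1 + t2)
    # Eq. (17) substituted into eq. (14), solved for r2 (one form of eq. (19)).
    r2 = w1 / math.sqrt(1.0 + 4.0 * math.sin(t2)**2 / s12**2
                        - 4.0 * math.sin(t2) * math.cos(t1) / s12)
    r1 = 2.0 * r2 * math.sin(t2) / s12     # eq. (17)
    r3 = 2.0 * r2 * math.sin(t1) / s12     # eq. (18)
    return r1, r2, r3
```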

There is insufficient information to deduce the rotation of the part in figure 8(b) about the S1S3 axis. A set of three spots perpendicular to the original three could resolve this ambiguity and, coincidentally, allow the encoding of an additional parameter. One way to construct such an RO + 1 mark is shown in figure 9(a) with the orthogonal sets of spots sharing a center spot. In figure 9(b), the product of the distances w1 and w2 from the center spot to alternate corner spots is a prespecified constant k; that is, w1 w2 = k. Spots S1, S2, and S3 in figure 9 lead to equations (17), (18), and (19). Similarly, spots S4, S2, and S5 in figure 9 lead to the following equations:

    r4 = 2 r2 sin θ4 / sin(θ3 + θ4)        (20)


and

    r5 = 2 r2 sin θ3 / sin(θ3 + θ4)        (21)

where r4, r5, θ3, and θ4 are shown in figure 9(c). The same substitution that produced equation (19) gives r2 in terms of w2, θ3, and θ4 as equation (22). Multiplying equations (19) and (22) gives an equation for r2² in terms of the known product w1 w2 = k and trigonometric functions of the angles θ1, θ2, θ3, and θ4. Taking the square root gives r2. Then, w1 and w2 can be found by using equations (19) and (22), respectively. The ratio w1/w2 represents an additional parameter which can be used to identify a particular part. Furthermore, the direction cosines can now be found for two orthogonal vectors, S1S3 and S4S5, lying on the flat surface containing the mark. This is enough information to completely specify the orientation of the part.
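A sketch of this step (Python; it reuses the same algebraic form assumed above for equations (19) and (22), so it is illustrative rather than a transcription of the report):

```python
import math

def _range_factor(theta_a_deg, theta_b_deg):
    """Factor F such that r2 = w * F; the working assumed for eqs. (19), (22)."""
    ta, tb = math.radians(theta_a_deg), math.radians(theta_b_deg)
    s = math.sin(ta + tb)
    return 1.0 / math.sqrt(1.0 + 4.0 * math.sin(tb)**2 / s**2
                           - 4.0 * math.sin(tb) * math.cos(ta) / s)

def center_range_and_spacings(theta1, theta2, theta3, theta4, k):
    """Range r2 to the shared center spot of the RO + 1 flat-surface mark,
    plus the spacings w1 and w2, from the four angular separations (deg)
    and the prespecified product k = w1 * w2."""
    f1 = _range_factor(theta1, theta2)
    f2 = _range_factor(theta3, theta4)
    r2 = math.sqrt(k * f1 * f2)    # r2**2 = (r2/f1)*(r2/f2)*f1*f2 = w1*w2*f1*f2
    return r2, r2 / f1, r2 / f2    # w1 = r2/f1 (eq. (19)), w2 = r2/f2 (eq. (22))
```

The ratio of the two returned spacings, w1/w2, is the extra identification parameter mentioned above.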

The configuration of an RO + 2 mark for flat surfaces is shown in figure 10. The corner spot automatically provides the direction of the mark, freeing the complementary colors for use as a binary code for each of the five spots. This provides an additional parameter with 32 discrete values (each of the five spots can be either of two colors), in addition to the w1/w2 parameter. An equivalent "T" shaped mark could also be used.

ERROR ANALYSIS

From a geometrical standpoint the RO marks discussed here are fully adequate for the purpose of finding the direction, range, and orientation of a given part. However, inaccuracies of measurement must be taken into account in an actual system. The determination of range in equations (2), (3), and (17) to (22) involves the accurate determination of separation angles from equation (1). But separation angles cannot be determined exactly, because it is not possible to find the exact center of an image in the image plane. In order to investigate this problem it was assumed that the image plane consisted of a rectangular array of picture elements, or pixels. In figure 11, the image of a circular disc is seen on the image plane. If a given pixel is illuminated sufficiently to exceed some predetermined threshold value, it is said to be "illuminated." In order to simplify this aspect of the problem it is assumed that a given pixel is illuminated if its center falls within the boundary of the image; otherwise, it is assumed not to be illuminated. In figure 11, the shaded squares represent the illuminated pixels. The center of the image is C and the center of the collection of illuminated pixels is D. Let the sequence (x1, y1), (x2, y2), ..., (xk, yk) denote the centers of the illuminated pixels.


If

    x̄ = (1/k) Σ(i=1 to k) xi        (23)

and

    ȳ = (1/k) Σ(i=1 to k) yi        (24)

then D = (x̄, ȳ) is the centroid of the illuminated pixels. If C = (xc, yc), then

    D − C = (x̄ − xc, ȳ − yc)        (25)

is the deviation of the centroid from the true center of the image.

For a circular image with a diameter given in n pixels and a given location of its center C on some pixel, D − C can be exactly determined analytically. In fact, though, the center C is known only in a statistical sense, because it is randomly placed somewhere within a given pixel. Thus, D − C would be a random variable. For the purpose of this study, the random variable x̄ − xc from equation (25) was investigated first and the results subsequently applied to ȳ − yc and D − C. The circular discs used as images were assumed to have diameters which varied from 2 to 50 pixels. For each specific image diameter, 100 choices of C were selected uniformly at random within a given pixel. For each choice of n, 100 samples of x̄ − xc were computed. The mean m and the standard deviation σn of these 100 samples are shown in table I. The units of these statistics could be measured in actual distances, such as centimeters; however, the statistical values would vary, depending upon the particular dimensions of the imaging device. Therefore, m and σn are measured in pixel units. The n in table I denotes image diameter in pixels. Because the values of m were consistently small and no trend was apparent, it was assumed that m = 0. The assumption is made that σ is inversely proportional to the square root of n; that is,

    σ = a/√n        (26)


The best value of a, in the least-squares sense, is

    a = [ Σ(n=2 to 50) σn/√n ] / [ Σ(n=2 to 50) 1/n ]        (27)

Application of equation (27) to the values of σn in table I yields

    σ = 0.222/√n        (28)

For later reference, the equation for 2σ is

    2σ = 0.444/√n        (29)

In figure 12, the values of σn in table I and the approximation σ in equation (28) are plotted in percent as functions of circular image diameter n. Because the fit is apparently a good one, equation (28) is taken as a good estimate of the standard deviation of x̄ − xc as a function of image diameter. In agreement with reference 4, the accuracy of predicting the center of the circular image improves as the image size increases.
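The statistical experiment behind table I and equations (26) to (28) can be re-created roughly as follows (Python; a sketch under the stated pixel-illumination rule, with sample sizes and names of my own choosing).

```python
import numpy as np

rng = np.random.default_rng(0)

def centroid_error_std(n, n_centers=100):
    """Standard deviation of (xbar - xc) for a disc image n pixels in diameter.
    A pixel counts as illuminated if its center falls inside the disc."""
    radius = n / 2.0
    half = int(radius) + 2
    coords = np.arange(-half, half + 1)          # pixel centers on an integer grid
    gx, gy = np.meshgrid(coords, coords)
    errors = []
    for _ in range(n_centers):
        xc, yc = rng.uniform(-0.5, 0.5, size=2)  # true center placed within one pixel
        inside = (gx - xc)**2 + (gy - yc)**2 <= radius**2
        errors.append(gx[inside].mean() - xc)    # xbar - xc, as in eq. (25)
    return float(np.std(errors))

# Least-squares fit of sigma_n = a / sqrt(n) over n = 2..50 (eqs. (26)-(27)).
ns = np.arange(2, 51)
sigmas = np.array([centroid_error_std(n) for n in ns])
a = np.sum(sigmas / np.sqrt(ns)) / np.sum(1.0 / ns)
print(f"fitted a = {a:.3f}   (equation (28) reports 0.222)")
```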

The length of the error vector in equation (25) is

    |D − C| = √[ (x̄ − xc)² + (ȳ − yc)² ]        (30)

Substituting x̄ − xc = 2σ and ȳ − yc = 2σ into equation (30) and using equation (29) gives

    |D − C| = 0.628/√n        (31)

as an approximation to the maximum length of the error vector.


The location error of the circular image center from equations (29) and (31) is shown in figure 13 along with numerical solutions for the approximate maximum error for discrete points computed in reference 4. Equation (29) represents a 2σ error in either the x- or y-direction. Equation (31) represents an approximation to the maximum error for 2σ variations in both the x- and y-directions. In general, equation (31) gives a conservative estimate of the maximum error for the diameters considered in figure 13. As expected, the estimate given by equation (29) is low due to limiting to only one direction.

The n in equation (28) can be related to physical parameters in figures 14 and 15. Figure 14 shows the disc with diameter S in centimeters and its image with diameter I in centimeters. By similar triangles,

    (I/2)/f = (S/2)/R*        (32)

or

    I = fS/R*        (33)

The ratio of the disc image diameter n in pixels to the width of the image plane M in pixels is equal to the ratio of the disc image diameter I in centimeters to the width of the image plane W in centimeters. Hence,

    n/M = I/W        (34)

or

    n = MI/W        (35)

Substituting equation (33) into equation (35) gives

    n = fSM/(R*W)        (36)


The f in equation (36) can be expressed as a function of the field-of-view angle γ and width W of the image plane in figure 15 as

    cot(γ/2) = √2 f/W        (37)

or

    f = (W/√2) cot(γ/2)        (38)

where γ is the field of view measured on the diagonal of the image plane. Substitute equation (38) into equation (36) to obtain

    n = [ SM/(√2 R*) ] cot(γ/2)        (39)

Equation (39) can be used to express equation (28) as

    σ = 0.222 [ √2 R* tan(γ/2)/(SM) ]^(1/2)        (40)

To convert σ from pixel units to centimeters, multiply by the scale factor W/M to get

    σ = 0.264 (W/M) [ R* tan(γ/2)/(SM) ]^(1/2)        (41)

In figure 11, it is assumed that D − C = (x̄ − xc, ȳ − yc) is a random variable satisfying a bivariate normal distribution with each variable having mean m = 0 and standard deviation σ satisfying equation (41). For a given charge-coupled-device (CCD) imaging chip, the values of M and W may be different in the x- and y-directions, giving different values of σ in the x- and y-directions.
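Under the reconstruction of equations (39) and (41) given above (in particular, the √2 factor introduced by measuring the field of view on the image-plane diagonal), the centroid uncertainty can be evaluated numerically as in the sketch below; the function names and the sample numbers in the final line are only illustrative.

```python
import math

def image_diameter_pixels(S, R_star, M, gamma_deg):
    """Disc image diameter n in pixels (reconstructed eq. (39)).
    S: disc diameter, cm; R_star: range, cm; M: image-plane width, pixels;
    gamma_deg: field of view measured on the image-plane diagonal, deg."""
    return (S * M) / (math.sqrt(2.0) * R_star * math.tan(math.radians(gamma_deg) / 2.0))

def centroid_sigma_cm(S, R_star, M, W, gamma_deg):
    """Centroid-location standard deviation in centimeters: eq. (28) converted
    by the scale factor W/M, as in the reconstructed eq. (41)."""
    n = image_diameter_pixels(S, R_star, M, gamma_deg)
    return 0.222 * (W / M) / math.sqrt(n)

# Example with the sample-problem values W = S = 0.625 cm, M = 256 pixels.
print(centroid_sigma_cm(S=0.625, R_star=100.0, M=256, W=0.625, gamma_deg=5.0))
```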


The accuracy with which the range and orientation may be computed is determined for a sample problem by assuming that, in equation (41), W = S = 0.625 and M = 256 in both the x- and y-directions. In other words, the imaging surface is square with sides W equal to 0.625 centimeter, the imaging surface is composed of a matrix of 256 by 256 pixels (M = 256), and the diameter of the disc mark S is 0.625 centimeter. With these values, equation (41) becomes a function of the range R* and the field of view γ alone (equation (42)).

The three-dot marking in figure 8 suffices for testing range accuracy from equations (17), (18), and (19) and tests the aspect of orientation most susceptible to errors; namely, the amount of tilt δ away from a perpendicular to the line of sight (fig. 16). To further simplify the analysis, it was assumed that the image of the mark lies horizontally in the image plane (fig. 17). Errors were calculated in range and tilt for range values from 50 cm to 150 cm in 25-cm increments and for tilt values from 5° to 85° in 20° increments. Wide-angle (45°) and narrow-angle (5°) fields of view were used. For each choice of parameters, 1000 uniformly random choices were made within the entire image plane for the coordinates of the image of the center spot (fig. 17). For each of these 1000 choices, the true positions of the three images were altered in the x- and y-directions by a normal random variable with mean m = 0 and σ determined by equation (42). Of course, the three image spots will not always be simultaneously visible in the window assumed for the center spot, especially for the smaller field of view (γ = 5°); however, this difficulty is separate from the problem of determining the accuracy of the measured tilt when it is measurable.

The range and tilt were then calculated using the altered positions of the images and the differences from the true values were calculated. The mean and standard deviation of these differences were calculated for each combination of range, tilt, and field-of-view parameters. The results are shown in tables II and III.

The results for the narrow (5°) field of view are good, suggesting that the range and orientation of a part can be determined fairly accurately when the tilt is not more than 45° and the range is not more than 125 cm. The results for the wide field of view (45°) are less encouraging and suggest that a wide-angle lens should only be used to locate the mark, after which the camera should switch to a narrow-angle lens.

Inherent in the error analysis are the approximating assumptions that the plane of each disc is parallel to the image plane and that each disc projects as a circle on the image plane. A more thorough error analysis is needed to examine the implications of these simplifying assumptions.


REFERENCES

1. Shirai, Yoshiaki: Recognition of Polyhedrons With a Range Finder. Pattern Recognition, vol. 4, 1972, pp. 243-250.

2. Nevatia, Ramakant; and Binford, Thomas O.: Description and Recognition of Curved Objects. Artif. Intell., vol. 8, 1977, pp. 77-98.

3. Popplestone, R. J.; and Ambler, A. P.: Forming Body Models From Range Data. D.A.I. Res. Rep. No. 46, Dep. Artif. Intell., Univ. Edinburgh, Oct. 1977.

4. McVey, E. S.; and Jarvis, G. L.: Ranking of Patterns for Use in Automation. IEEE Trans. Ind. Electron. & Control Instrum., vol. IECI-24, no. 2, May 1977, pp. 211-213.


TABLE I.- VARIATION OF STATISTICS OF x̄ − xc WITH PIXEL DIAMETER OF DISC

   n,      m,       σn,     |    n,      m,       σn,
 pixels  pixels    pixels   |  pixels  pixels    pixels
    1     ----      ----    |    26    -0.003    0.041
    2    -0.015     0.168   |    27    -.03       .050
    3    -.05        .133   |    28    -.02       .035
    4     .002       .103   |    29     .004      .041
    5     .009       .109   |    30    -.01       .039
    6     .002       .091   |    31     .001      .036
    7     .008       .082   |    32    -.04       .039
    8    -.02        .085   |    33    -.04       .032
    9    -.02        .064   |    34    -.03       .044
   10     .003       .065   |    35    -.01       .036
   11    -.08        .061   |    36     .003      .033
   12     .005       .062   |    37     .10       .036
   13    -.04        .067   |    38     .002      .033
   14    -.01        .055   |    39     .000      .041
   15    -.01        .061   |    40    -.02       .033
   16    -.10        .052   |    41    -.01       .032
   17    -.02        .059   |    42     .004      .033
   18    -.07        .052   |    43    -.02       .032
   19    -.06        .048   |    44     .001      .035
   20    -.06        .053   |    45    -.02       .030
   21    -.02        .043   |    46    -.03       .029
   22     .000       .048   |    47    -.01       .031
   23    -.01        .048   |    48    -.04       .030
   24    -.11        .042   |    49     .005      .032
   25    -.01        .042   |    50     .000      .026


TABLE II.- ERROR IN CALCULATED TILT ANGLE AND RANGE FOR 5° FIELD OF VIEW (γ = 5°)

  Actual tilt    Actual range    Error in calculated tilt angle    Error in calculated range
  angle δ, deg      r, cm           mδ, deg        σδ, deg             mr, cm       σr, cm
       5              50              0.0            0.0                 0.00         0.00
       5              75              0.0            0.0                 0.00         0.00
       5             100              0.0            0.1                 0.00         0.00
       5             125              0.0            0.2                 0.00         0.00
       5             150              0.0            0.3                 0.00         0.00
      25              50              0.0            0.0                 0.00         0.00
      25              75              0.0            0.0                 0.00         0.00
      25             100              0.0            0.1                 0.00         0.00
      25             125              0.0            0.2                 0.00         0.25
      25             150              0.0            0.3                 0.00         0.25
      45              50              0.0            0.0                 0.00         0.00
      45              75              0.0            0.0                 0.00         0.00
      45             100              0.0            0.1                 0.00         0.25
      45             125              0.0            0.2                 0.00         0.25
      45             150              0.0            0.3                 0.00         0.75
      65              50              0.0            0.0                 0.00         0.00
      65              75              0.0            0.0                 0.00         0.25
      65             100              0.0            0.1                 0.00         0.25
      65             125              0.0            0.2                 0.00         0.75
      65             150              0.0            0.3                 0.00         1.50
      85              50              0.0            0.0                 0.00         0.25
      85              75              0.0            0.0                 0.00         0.75
      85             100              0.0            0.1                 0.00         1.75
      85             125              0.0            0.2                 0.00         4.00
      85             150              0.0            0.3                 0.25         7.75


TABLE III.- ERROR IN CALCULATED TILT ANGLE AND RANGE FOR 45° FIELD OF VIEW (γ = 45°)
(Cells left blank were not legible in the source copy.)

  Actual tilt    Actual range    Error in calculated tilt angle    Error in calculated range
  angle δ, deg      r, cm           mδ, deg        σδ, deg             mr, cm       σr, cm
       5              50              0.0            0.4                 0.00         0.00
       5              75              0.0            1.2                 0.00         0.25
       5             100              0.0            2.4                 0.00         0.50
       5             125              0.0            4.4                              1.00
       5             150              0.0            6.9                -1.00         2.25
      25              50              0.0            0.4                 0.00         0.25
      25              75             -0.1            1.2                              0.75
      25             100             -0.1            2.4                 0.00         2.00
      25             125                             4.4                 0.00         4.25
      25             150             -1.0            6.7                 0.25         7.75
      45              50              0.0            0.4                 0.00         0.25
      45              75             -0.0            1.2                 0.00         1.50
      45             100             -0.1            2.4                              4.00
      45             125             -0.4            4.3                 0.50         9.00
      45             150                             6.9                 0.25        17.00
      65              50              0.0            0.4                 0.00         0.75
      65              75              0.0            1.2                 0.00         3.25
      65             100                             2.5                 0.50         9.25
      65             125                             4.7                 1.25        21.25
      65             150             -1.7            7.8                 8.00        39.75
      85              50             -0.1            0.5                 0.75         4.25
      85              75                             1.4                 4.00        21.00
      85             100             -5.2           22.0                49.75       168.50
      85             125            -20.5           49.3               108.00       280.75
      85             150            -36.5           66.4                95.50       305.75


Figure 1.- RO + 1 mark for cylinders.

Figure 2.- Array of pixels (light-sensitive elements).


Figure 3.- Angular separation of image points.


(a) Location of lens center P relative to stripes.
(b) Assuming that P is located at P1.
(c) Separation of stripes.

Figure 4.- Coded stripes on cylinder.


Figure 5.- Orientation of two points with known ranges.

Figure 6.- Orientation of cylinder.


Figure 7.- Rotation of cylinder.


(a) Three-disc marking on a widget.
(b) Geometry of three-disc marking.

Figure 8.- Illustration and geometry of three-disc marking.


(a) RO + 1 mark on widget.
(b) Geometry of RO + 1 mark.

Figure 9.- Illustration and geometry of RO + 1 mark.


Figure 10.- RO + 2 mark for flat surfaces.

C: center of circle; D: centroid of illuminated pixels.
Figure 11.- Pixel image of circular disc.


Figure 12.- Standard deviation of x component of centroid from xc.


Figure 13.- Location error of circular image center versus size.

Figure 14.- Illustration of disc and its image.


Figure 15.- Illustration of angular field of view measured on diagonal of image plane.

Figure 16.- Range and tilt of three-disc marking.


Figure 17.- Image of three-disc marking in the image plane.
