7/23/2019 Shape From Shading of Reflectance 1
http://slidepdf.com/reader/full/shape-from-shading-of-reflectance-1 1/18
COMPUTER VISION AND IMAGE UNDERSTANDING
Vol. 67, No. 2, August, pp. 143–160, 1997
ARTICLE NO. IV970522
Shape from Shading with a Generalized Reflectance Map Model
Kyoung Mu Lee
Department of Radio Engineering, Hong-Ik University, 72-1 Sangsu-Dong, Mapo-Gu, Seoul, Korea 121-791
and
C.-C. Jay Kuo†
Signal and Image Processing Institute and the Department of Electrical Engineering-Systems, University of Southern California,
Los Angeles, California 90089-2564
Received June 22, 1994; accepted March 14, 1996
Most conventional SFS (shape from shading) algorithms have been developed under three basic assumptions on surface properties and imaging geometry to simplify the problem, namely, a Lambertian surface, a distant point light source and orthographic projection. In this research, we derive a physics-based generalized reflectance map model which includes diffuse and specular reflection effects, a nearby point light source and perspective projection, and then we develop a new direct shape recovery algorithm from shaded images. The basic idea of our solution method is to discretize the image irradiance equation with a finite triangular element surface model, to express the resulting nonlinear system of equations in terms of depth variables only, and to recover the object shape by linearizing the nonlinear equations and minimizing a quadratic cost functional. We perform numerical experiments with one or multiple photometric stereo images to demonstrate the performance of the derived physics-based reflectance map model and the proposed SFS algorithm. © 1997 Academic Press

1. INTRODUCTION

There has been a considerable amount of interest and effort on shape extraction from image intensities in computer vision research for the last several decades. The brightness of a pixel in an image is generated through an image formation process governed by the optical, physical, and geometrical factors, including the object shape, the surface reflectance property, and the illumination and sensor characteristics. Thus, to get an accurate reconstructed surface, it is crucial to understand and model the whole complicated imaging process. Most conventional SFS algorithms have been developed under three simple but restrictive assumptions on the image formation model to simplify the problem, namely, the ideal Lambertian surface, a distant point light source and orthographic projection [1, 11, 14, 15, 17, 18, 24, 30, 32, 39, 41], etc. We refer to [...] for a detailed performance analysis and comparison of several important algorithms. By means of orthographic projection and a distant light source, the image intensity can be simply characterized by a reflectance map which gives scene radiance as a function of only two variables (p, q), the gradient of the corresponding surface point, in a viewer-centered coordinate system [7]. With the Lambertian assumption, the reflectance map can be further simplified to be independent of the viewer's direction. However, these assumptions are not valid in practical environments for the following reasons. First, surface reflection in general contains both the diffuse and the specular components. Second, the light source is often located at a finite distance from the object, so that the image brightness depends on the distance between every object point and the light source. Third, images are formed through a pin-hole camera and should be modeled by perspective projection. Since conventional SFS algorithms do not use accurate physical and optical models, apparent distortions of reconstructed surfaces often occur in many real applications.

To make SFS algorithms practically useful, we have to consider a more realistic model of surface reflection and the imaging process. Several attempts to relax these restrictive assumptions have been made so far [1, 5, 13, 19, 20, 28, 33]. We summarize important developments below. First, the use of more sophisticated non-Lambertian reflectance maps in the SFS problem has been studied by quite a few researchers. Ikeuchi [10] used a double specular model to determine orientations of a specular surface with photometric stereo images generated by distributed light sources. Coleman and Jain [1] used four-source photometric stereo images to extract the shape of textured and specular surfaces.

* This work was supported by the National Science Foundation Presidential Faculty Fellow Award ASC-9350309.
† E-mail: [email protected].
Healy and Binford [5] and Tsai and Shah [36] employed the Torrance–Sparrow model in their work to determine the shape of a specular surface with a single image. Recently, Tagare and deFigueiredo [33] presented and analyzed a class of m-lobed non-Lambertian reflectance maps and developed a photometric stereo SFS method to determine the local surface normal and the reflectance map parameters. Second, the nearby point light source for the SFS problem was examined by Kim and Burger [13]. They derived a reflectance map with a nearby point light source under the orthographic projection and Lambertian surface assumptions, and determined local positions and orientations with photometric stereo images. Finally, the perspective projection model has been applied to the SFS problem by Penna [28, 29] and the authors [18]. Penna presented a local SFS analysis of a single perspective image of a Lambertian polyhedron-shaped surface and proposed an algorithm which recovers the local shape of the polyhedron by solving a nonlinear system of algebraic equations. The potential extension of his method to a non-Lambertian object was also mentioned. However, this algorithm is practically unreliable due to its sensitivity to noise and numerical finite precision. Lee and Kuo [18] developed a more robust SFS algorithm based on a triangular element surface model and a linear approximation of a Lambertian reflectance map. It recovers the shape of a Lambertian surface directly with single or multiple photometric stereo images taken under perspective projection. An effort to solve the SFS problem in a more general setting can be found in the original work of Horn [6], in which he used a perspective projection model, a nearby light source, and an arbitrary reflectivity function. A first-order partial differential equation known as the image brightness equation was derived in terms of five dependent variables (x, y, z, p, q), and then a set of five equivalent ordinary differential equations was derived and solved by the characteristic strip method. However, this method suffers from several practical problems, including noise sensitivity and error accumulation in the numerical integration of the differential equations.

In this research, we derive a physics-based generalized reflectance map model which includes perspective projection, a nearby light source, and diffuse and specular reflection effects, and then we develop a new unified algorithm for recovering depth variables from shaded images with such a generalized reflectance map. We employ the Torrance and Sparrow BRDF (bidirectional reflectance distribution function) to model the diffuse and specular reflections of the reflectance map based on geometrical optics. Furthermore, we incorporate the nearby point light source and perspective projection in the model. The complete model is quite complicated: the brightness at a surface point in an image depends on many factors, including the surface property parameters, the surface normal at the point, the light source direction, the specular direction, the viewer direction, and the distance between the surface point and the light source. The SFS problem would be very difficult if we attempted to determine the surface orientation as well as the position of a surface point by solving the nonlinear image irradiance equation with the generalized reflectance map. However, the solution procedure can be greatly simplified by introducing a triangular element surface model for the discretization and parameterization of the image irradiance equation. Through this procedure, we can express the resulting discrete nonlinear system of equations in terms of depth variables only, and we recover the object shape by linearizing the equations and minimizing a quadratic cost functional. Since our method recovers depth variables directly, no integrability constraint is required.

The paper is organized as follows. In Section 2, we derive a generalized reflectance map model which takes the following factors into account: the specular and diffuse components of light reflection, a nearby point light source, and perspective projection. We consider the discretization and parameterization of the nonlinear reflectance map based on a triangular element surface model in Section 3. We are able to simplify the discrete reflectance map for several special cases, and these simplifications are also examined in this section. The formulation and solution of the SFS problem is studied in Section 4. We perform numerical experiments with one or multiple photometric stereo images to demonstrate the performance of the derived physics-based reflectance map model and the proposed SFS algorithm in Section 5. Some concluding remarks are given in Section 6.

2. GENERALIZED REFLECTANCE MAP MODEL

The reflectance map was first defined by Horn [7] as a function that relates reflected radiance to local surface orientation in a viewer-centered coordinate system. It is independent of all other variables such as position in the image. This notion of reflectance map is not general enough for describing the situation where there are perspective projection and a finitely distant light source. In this section, by incorporating more accurate surface reflectance phenomena and image formation effects, we propose a generalized reflectance map which depends on not only gradient variables but also variables of the position and the distance to the light source. The radiance of the reflected light from a surface patch is derived in terms of the BRDF and the incident irradiance due to a nearby point light source in Section 2.1. By combining the imaging geometry with perspective projection and the result in Section 2.1, we obtain a physics-based generalized reflectance map model and the corresponding image irradiance equation in Section 2.2.
FIG. 1. Reflection geometry model.

2.1. Reflection Due to a Nearby Point Light Source

We illustrate the reflection geometry used in this paper in Fig. 1, where r is the distance between a surface point P and a point light source S, n is the unit surface normal at P, i is the unit vector toward the light source, v is the unit vector toward the camera, and h is the unit vector along the specular direction. By definition [34], the bisector of i and v specifies the specular direction, so that we have

    h = (i + v)/|i + v|.

The θ_i, θ_v, and α denote the angles between n and i, n and v, and n and h, respectively. The zenith and azimuth angles (θ, φ) with a proper subscript represent a unit vector in a given polar coordinate system.

The light reflection depends on the radiance of the light source and the bidirectional reflectance distribution function (BRDF). The BRDF proposed by Nicodemus et al. [23] is a useful tool for characterizing light reflection from solid surfaces, and it provides the information of the brightness of a surface patch with given viewing and illumination directions. By definition, it is written as

    f = L/E,    (2.1)

where L and E denote the reflected radiance in the emitting direction and the irradiance in the direction of the incident light, respectively.

To model general light reflection from surfaces, the BRDF f consists of both the diffuse component f_d and the specular component f_s [21, 33]; i.e.,

    f = ρ_d f_d + ρ_s f_s,    (2.2)

where ρ_d and ρ_s are the weighting factors of the diffuse and specular components. The Lambertian model and the Torrance–Sparrow model have been widely used for diffuse and specular reflection in the community of computer vision and computer graphics.

Several comprehensive models for diffuse reflection have been recently examined. Oren and Nayar [25] proposed a generalized Lambertian model for rough diffuse surfaces by taking into account complex geometrical effects. The Oren–Nayar model has been used not only in vision but also in graphics [26] and visual psychophysics [27]. The accuracy and generality of the model was illustrated by the experimental results in their work. Wolff [37, 38] also presented an accurate diffuse reflection model for the smooth dielectric surface. Even though they are highly nonlinear, involving functions of the viewing direction, interreflection, and the Fresnel coefficient, the above models are able to capture the reflectance of real-world surfaces. Furthermore, Wolff [37] showed that the proposed model can be well approximated by the Lambertian model when both the angles of incidence and emittance are simultaneously less than 50° and the angle between the viewing direction and the illumination direction is less than [...]. Under these assumptions, we employ the Lambertian model for the diffuse component and the Torrance–Sparrow model for the specular component, respectively, in this paper. That is,

    f_d(i, n, v) = 1/π

and

    f_s(i, n, v) = (1/(cos θ_i cos θ_v)) exp{−k[cos⁻¹(h^T n)]²}
                 = (1/((i^T n)(v^T n))) exp{−k[cos⁻¹(h^T n)]²},

where k is the surface roughness parameter.

As indicated in (2.1), the reflected radiance L depends on the incident irradiance E as well as the BRDF. Since the amount of light energy falling on a surface patch is proportional to the foreshortened area, the irradiance E at that patch is proportional to a cosine function of the angle between the surface normal direction and the light source direction. Moreover, since the distance between the surface patch and the light source is finite, the irradiance is inversely proportional to the square of the distance. Assume that the point light source has isotropic radiance I_0. Then, it is easy to derive the irradiance E at a surface point from a point light source in the direction i or (θ_i, φ_i).
    E(x, n, r) = (I_0/r²) max[0, i^T n] δ(x − i),    (2.3)

where δ is a solid angle delta function [33] defined by

    δ(x − i) = δ(θ_x − θ_i, φ_x − φ_i) = δ(θ_x − θ_i) δ(φ_x − φ_i)/sin θ_i,

and δ(·) is the Dirac delta function.

Given a BRDF f of a surface and a point light source, the radiance of the reflected light along the viewer direction v can be computed as

    L(i, n, v, r) = ∫_x f(x, n, v) E(x, n, r) dω_x    (2.4)
                  = f(i, n, v) (I_0/r²) max[0, i^T n].

By substituting (2.2) into (2.4), we can represent the reflected hybrid radiance as

    L(i, n, v, r) = [ρ_d f_d(i, n, v) + ρ_s f_s(i, n, v)] (I_0/r²) max[0, i^T n]    (2.5)
                  = ρ_d L_d(i, n, v, r) + ρ_s L_s(i, n, v, r),

where

    L_d = (I_0/πr²) max[0, i^T n],    (2.6)
    L_s = (I_0/r²) (exp{−k[cos⁻¹(h^T n)]²}/((i^T n)(v^T n))) max[0, i^T n]

are the diffuse and specular radiance components, respectively.

2.2. Derived Reflectance Map Model and Image Irradiance Equation

The general perspective projection, which models the ideal pinhole camera, is employed in this work. As depicted in Fig. 2, we consider a camera-centered Cartesian coordinate system with the lens at the origin, called the center of projection, and the optical axis aligned with the Z-axis. The actual film is located at Z = −f, where f is the focal length behind the lens. However, to avoid the sign reversal of the coordinates, we assume without loss of generality that the image plane x–y is located at Z = f in front of the lens. With this model, the mapping between a surface point P = (X, Y, Z)^T and the projected image point p = (x, y, f)^T obeys the relationship

    x = f X/Z,  y = f Y/Z.    (2.7)

FIG. 2. Imaging geometry model.

The irradiance E at an image point of the film is obtained through the lens system via the radiance L of the corresponding surface point. It was shown in [9] that not all but only a portion of the reflected light comes through the lens system, which is known as the lens collection [4], described by

    E = (π/4)(d/f)² cos⁴φ L,    (2.8)

where d is the diameter of the lens and φ is the angle between the ray from the object point to the center of projection and the optical axis.

In general, the converted image intensity or gray value through an electronic sensor device can be written as

    I = gE + b,    (2.9)

where g and b are the sensor gain and bias, respectively, which can be determined by proper camera sensor calibration. As a direct consequence of (2.9), we can define the generalized reflectance map function

    R(i, n, v, r) = g (π/4)(d/f)² cos⁴φ L(i, n, v, r) + b,    (2.10)

so that the image intensity I at a point p in the image plane can be characterized by the image irradiance equation
    I(p) = R(i(S, P), n(P), v(P), r(S, P)),    (2.11)

where the relationship between the image point p and the corresponding surface point P is defined by the perspective law in (2.7). The cosine function of the off-axis angle φ can be written as

    cos φ = v^T n_z,

where n_z = (0, 0, 1)^T is the unit normal of the X–Y plane. By substituting (2.5) and (2.6) into (2.10), the generalized reflectance map can be written as

    R(i, n, v, r) = { g (I_0/r²)(v^T n_z)⁴ (π/4)(d/f)² [ρ_d f_d(i, n, v) + ρ_s f_s(i, n, v)] i^T n + b,  if i^T n ≥ 0,
                    { 0,  otherwise,
                  = { g (I_0/r²)(v^T n_z)⁴ [η_d i^T n + η_s exp{−k[cos⁻¹(h^T n)]²}/(v^T n)] + b,  if i^T n ≥ 0,
                    { 0,  otherwise,

where the diffuse and specular albedos, η_d and η_s, are constants of proper dimension that make R a valid reflectance map. We note that, in order to make our algorithm work properly and reconstruct accurate surfaces, the generalized reflectance map parameters η_d, η_s, and k of an object should be known a priori. Several algorithms for estimating these parameters of hybrid reflection have been proposed in the literature, and we refer to [12, 33] for more detailed discussion. Methods for estimating the albedo and illumination direction for a Lambertian surface can be found in [16, 41].

3. DISCRETE GENERALIZED REFLECTANCE MAP PARAMETERIZED BY DEPTH VARIABLES

The SFS problem can be viewed as a problem of solving the image irradiance equation (2.11) with given (observed) image intensity at p. The generalized reflectance map R in (2.10) is a function of the light source position S, the position of a surface point P, and the surface normal at that point, which can be expressed as

    n = (−p, −q, 1)^T/(p² + q² + 1)^{1/2},  where p = ∂Z(X, Y)/∂X, q = ∂Z(X, Y)/∂Y.

Once the light source position and the projected point p in the image plane are specified, R reduces to a function of three variables, i.e., the depth Z and the gradients p and q, since the components of P can be represented in terms of Z by the perspective law with known p. Thus, we arrive at a problem of solving for three dependent variables Z, p, and q, characterizing the underlying surface, with only one equation. To solve them uniquely, conventional photometric stereo methods use additional image irradiance equations due to other light sources as constraints [13, 33]. This approach has one difficulty, namely, since the variables Z, p, and q are viewed as independent variables, the recovered depth variables and surface normals are often inconsistent. This is known as the integrability problem. Note also that the idea of using discrete approximations of p and q by a straightforward finite difference method, which has been applied to the orthographic SFS problem [15, 35], is no longer applicable in this context, since the position components (X, Y) are in fact functions of the depth Z.

By introducing a triangular element surface model, we can represent the surface normal as a function of the depth variables. Consequently, the reflectance map R can be discretized and parameterized with only the nodal depth variables. Such a procedure greatly simplifies the SFS problem formulation and its solution.

3.1. Discretization and Parameterization with the Triangular Element Surface Model

Our basic idea of discretization and parameterization is to approximate a smooth object surface by the union of triangular surface patches called triangular elements, so that the approximating surface can be written as a linear combination of a set of nodal basis functions of compact support. Let us triangulate a square image domain by dividing it into a set of M_t nonoverlapping triangles T_i, i = 1, . . . , M_t, with M_n nodal points p_i, i = 1, . . . , M_n, so that the intensity within each triangle is almost homogeneous. Then, we approximate a smooth object surface by a piecewise linear surface consisting of triangular surface patches S_i with nodal points P_i in such a way that S_i and P_i are perspectively projected to T_i and p_i, respectively, in the image plane. The approximating surface can be uniquely specified by P_i, or equivalently by the surface nodal depth variables Z_i associated with p_i, i = 1, . . . , M_n.

Let us now focus on a triangular surface patch S_k and the corresponding projected triangle T_k on the image plane, as shown in Fig. 3. We denote the nodal vectors (corner points) of the three vertices of S_k as

    P_i = (X_i, Y_i, Z_i),  P_j = (X_j, Y_j, Z_j),  P_l = (X_l, Y_l, Z_l),

and the corresponding projected nodal points in the image plane as
    p_i = (x_i, y_i, f),  p_j = (x_j, y_j, f),  p_l = (x_l, y_l, f).

FIG. 3. Image formation on a triangular surface patch.

The center of S_k is

    P_kc = (X_kc, Y_kc, Z_kc) = ((X_i + X_j + X_l)/3, (Y_i + Y_j + Y_l)/3, (Z_i + Z_j + Z_l)/3).

By using the perspective relationship in (2.7), we can rewrite the components of P_kc in matrix form as

    [X_kc]          [x_i  x_j  x_l] [Z_i]
    [Y_kc] = (1/3f) [y_i  y_j  y_l] [Z_j]
    [Z_kc]          [ f    f    f ] [Z_l].

We assume that f is known and that the points (x_i, y_i), (x_j, y_j), and (x_l, y_l) are observed in the image plane. Then, P_kc depends only on the depth values of the three nodal points Z_i, Z_j, and Z_l. The unit vector i_k of the light source direction at the center of the surface patch S_k can be written as

    i_k = (1/r_k)(S − P_kc) = (1/r_k)(S_x − X_kc, S_y − Y_kc, S_z − Z_kc)^T,    (3.1)

where

    r_k = |S − P_kc| = [(S_x − X_kc)² + (S_y − Y_kc)² + (S_z − Z_kc)²]^{1/2}

is the distance between the light source S and P_kc. Similarly, the unit vector v_k along the viewing direction from the surface patch S_k can be represented by

    v_k = (O − P_kc)/|O − P_kc| = −(X_kc, Y_kc, Z_kc)^T/(X_kc² + Y_kc² + Z_kc²)^{1/2}.    (3.2)

The unit vector h_k along the specular direction on S_k can also be determined by

    h_k = (i_k + v_k)/|i_k + v_k|.

Besides, the surface normal n_k of the triangular surface patch S_k can be uniquely determined by its three nodal vectors P_i, P_j, and P_l via

    n_k = ((P_j − P_i) × (P_l − P_i))/|(P_j − P_i) × (P_l − P_i)|,

with

    (P_j − P_i) × (P_l − P_i) = (X_j − X_i, Y_j − Y_i, Z_j − Z_i) × (X_l − X_i, Y_l − Y_i, Z_l − Z_i).

By using the perspective relationship in (2.7), we can rewrite n_k in terms of the locations of the image points:

    n_k = ((1/f)(x_jZ_j − x_iZ_i), (1/f)(y_jZ_j − y_iZ_i), Z_j − Z_i) × ((1/f)(x_lZ_l − x_iZ_i), (1/f)(y_lZ_l − y_iZ_i), Z_l − Z_i) / |·|
        = (α_k, β_k, γ_k)^T/(α_k² + β_k² + γ_k²)^{1/2},    (3.3)

where

    [α_k]   [ f(y_i − y_j)        f(y_l − y_i)        f(y_j − y_l)     ] [Z_iZ_j]
    [β_k] = [ f(x_j − x_i)        f(x_i − x_l)        f(x_l − x_j)     ] [Z_iZ_l]
    [γ_k]   [(x_iy_j − x_jy_i)   (x_ly_i − x_iy_l)   (x_jy_l − x_ly_j)] [Z_jZ_l].
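As a concrete check of this parameterization, the per-patch quantities can be computed directly from the observed image points and the three nodal depths. The sketch below is illustrative (our own function name and conventions, not the authors' code); it recovers the 3-D vertices by the perspective law X = xZ/f, Y = yZ/f and then forms the vectors of (3.1)–(3.3):

```python
import numpy as np

def patch_geometry(p_img, Z, S, f):
    """Per-patch vectors of Section 3: given the three projected image
    points p_img (3x2 array of (x, y)), the nodal depths Z = (Zi, Zj, Zl),
    a point source S, and the focal length f, recover the 3-D vertices
    and form i_k, v_k, h_k, n_k, and r_k."""
    p_img, Z = np.asarray(p_img, float), np.asarray(Z, float)
    P = np.column_stack([p_img[:, 0] * Z / f, p_img[:, 1] * Z / f, Z])  # Pi, Pj, Pl
    Pc = P.mean(axis=0)                      # patch center P_kc
    d = np.asarray(S, float) - Pc
    r = np.linalg.norm(d)                    # distance r_k
    i = d / r                                # toward the light, Eq. (3.1)
    v = -Pc / np.linalg.norm(Pc)             # toward the camera at the origin O, Eq. (3.2)
    h = (i + v) / np.linalg.norm(i + v)      # specular direction
    n = np.cross(P[1] - P[0], P[2] - P[0])   # (Pj - Pi) x (Pl - Pi), Eq. (3.3)
    return i, v, h, n / np.linalg.norm(n), r
```

As the text states, every returned quantity depends only on the three nodal depths once the image points, the source position, and f are fixed.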
Note that since the vectors i_k, v_k, n_k, h_k and the distance r_k are all expressed in terms of Z_i, Z_j, and Z_l in the above discussion, the reflectance map R_k is only a function of Z_i, Z_j, and Z_l, i.e., the depth variables of the three vertices of S_k. Finally, by using the image irradiance equation, we can relate the image intensity I_k of a triangle T_k directly to the nodal depth values of the corresponding surface patch S_k:

    I_k = R_k(i_k, n_k, v_k, r_k)
        = { (I_0/r_k²)(v_k^T n_z)⁴ [η_d i_k^T n_k + η_s exp{−k[cos⁻¹(h_k^T n_k)]²}/(v_k^T n_k)],  if i_k^T n_k ≥ 0,
          { 0,  otherwise,
        = R_k(Z_i, Z_j, Z_l).    (3.4)

3.2. Special Cases

We discuss several simplified reflectance map models for some special cases below.

3.2.1. Distant Light Source

If the light source is located far away from the object and the camera, the relative depth difference between surface points is negligible compared with the average distance from the object to the light source. For this case, the radiant flux arriving at a surface patch can be approximated as

    dΦ_i = (I_0/r̄²) dA max[0, cos θ_i] = I_l dA max[0, cos θ_i],    (3.5)

where

    I_l = I_0/r̄²

and r̄ is the average distance between the object surface and the light source. Besides, since the light source is far away, we can assume that the incident rays from the light source are parallel to each other. Then, the unit vector i_k of the light source direction in (3.1) is independent of the position of the surface patch, so that we can drop the subscript and approximate it as

    i = (S − O)/|S − O| = (S_x, S_y, S_z)^T/(S_x² + S_y² + S_z²)^{1/2}.    (3.6)

3.2.2. Distant Object

When an object is far away from the camera while the light source is near the object, the perspective projection model can be approximated by the simpler orthographic projection model. With orthographic projection, we have the following relationship:

    x = X,  y = Y.    (3.7)

For this case, since all rays from the surface points to the camera are parallel to each other and orthogonal to the image plane, the unit vector v_k along the viewer direction from the surface patch S_k in (3.2) becomes independent of the position of the patch, i.e.,

    v_k = v = −n_z = (0, 0, −1)^T.

Moreover, since cos⁴φ = (v^T n_z)⁴ = 1, the lens collection is independent of the surface position and (2.8) reduces to

    E = CL,    (3.8)

where C is the constant collection factor πd²/(4f²). Besides, by using the orthographic projection relationship in (3.7), the surface normal in (3.3) can be approximated by

    n̂_k = (x_j − x_i, y_j − y_i, Z_j − Z_i) × (x_l − x_i, y_l − y_i, Z_l − Z_i) / |·|
        = (α̂_k, β̂_k, γ̂_k)^T/(α̂_k² + β̂_k² + γ̂_k²)^{1/2},    (3.9)

where

    α̂_k = (y_l − y_j)Z_i + (y_j − y_i)Z_l + (y_i − y_l)Z_j,
    β̂_k = (x_j − x_l)Z_i + (x_i − x_j)Z_l + (x_l − x_i)Z_j,    (3.10)
    γ̂_k = (x_iy_j − x_jy_i) + (x_ly_i − x_iy_l) + (x_jy_l − x_ly_j).

3.2.3. Distant Light Source and Object

If the light source, the object, and the camera are all far away from one another, both the distant point light source and the orthographic projection assumptions hold. For this case, the vectors i_k, v_k, and h_k are no longer dependent on the surface position.
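The linear-in-depth coefficients of the orthographic normal above can be checked against a direct cross product. This is an illustrative sketch (our own function name, not the authors' code); the sign convention follows the cross-product order (P_j − P_i) × (P_l − P_i) used earlier:

```python
import numpy as np

def ortho_normal_coeffs(xi, yi, xj, yj, xl, yl, Zi, Zj, Zl):
    """Unnormalized orthographic patch normal (alpha, beta, gamma) of
    Section 3.2.2: alpha and beta are linear in the nodal depths, and
    gamma depends only on the (fixed) image coordinates."""
    alpha = (yl - yj) * Zi + (yj - yi) * Zl + (yi - yl) * Zj
    beta = (xj - xl) * Zi + (xi - xj) * Zl + (xl - xi) * Zj
    gamma = (xi * yj - xj * yi) + (xl * yi - xi * yl) + (xj * yl - xl * yj)
    return np.array([alpha, beta, gamma])
```

Expanding (x_j − x_i, y_j − y_i, Z_j − Z_i) × (x_l − x_i, y_l − y_i, Z_l − Z_i) term by term reproduces exactly these three coefficients, which is why the discrete reflectance map becomes a function of the nodal depths alone.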
FIG. 4. Test Problem 1: (a) ground truth, and reconstructed results with (b) a single image, (c) two photometric stereo images, (d) three photometric stereo images, and (e) three noisy photometric stereo images corrupted by 10% random noise; (f) the 1D sliced plot of several reconstructed surfaces.

Once the light source direction is specified, the brightness of a pixel in the image plane is determined only by the surface normal n̂_k. By using (3.5), (3.6), (3.8), (3.9), (3.10) and dropping the unnecessary subscripts, we obtain the simplified reflectance map

    R_k(i, n̂_k, v) = { I_l [η_d i^T n̂_k + η_s exp{−k[cos⁻¹(h^T n̂_k)]²}/(v^T n̂_k)],  if i^T n̂_k ≥ 0,
                      { 0,  otherwise.

4. FORMULATION OF THE SFS PROBLEM

The shape of an object, which is approximated with a union of linear triangular surface patches, can be characterized by the coordinates of the nodal points P_m = (X_m, Y_m, Z_m), m = 1, . . . , M_n. Since the positions of the points p_m = (x_m, y_m, f), m = 1, . . . , M_n, in the image plane are given, one can determine the positions of P_m by calculating the nodal depths Z_m, m = 1, . . . , M_n, and applying the perspective law (2.7) to find the corresponding X_m and Y_m. The reflectance map R_k given by (3.4) is a nonlinear function of the depth variables Z_i, Z_j, and Z_l, which makes
the SFS problem difficult to solve. However, we can simplify the solution process by linearizing the reflectance map, solving the linearized problem, and then applying a successive linearization scheme to improve the accuracy of the computed solution. By successive linearization, we mean that the nodal values obtained from the previous iteration are used as the reference points for linearization in the current iteration.

FIGURE 4—Continued

TABLE 1
Depth and Orientation Error of the Reconstructed Surfaces in Test Problem 1

    Error                 H-H1     H-H2     H-H3     H-H3 (with noise)
    Mean-absolute Z       1.3578   0.9729   0.0092   0.[...]
    Std-absolute Z        0.5279   0.3275   0.0023   0.[...]
    Mean-relative Z (%)   2.4243   1.7879   0.0163   0.[...]
    Mean-Δpq              0.1760   0.0976   0.0584   0.[...]

In terms of mathematics, we can derive a linear approximation of R_k at the nth iteration by taking the first-order Taylor series expansion about the reference depth values obtained at the (n−1)th iteration as

    R_k(Z_i, Z_j, Z_l) ≈ R_k(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1}) + Σ_{m∈{i,j,l}} (Z_m − Z_m^{n−1}) ∂R_k(Z_i, Z_j, Z_l)/∂Z_m |_{(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1})}
                       = Σ_{m∈{i,j,l}} ∂R_k(Z_i, Z_j, Z_l)/∂Z_m |_{(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1})} Z_m
                         + [R_k(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1}) − Σ_{m∈{i,j,l}} ∂R_k(Z_i, Z_j, Z_l)/∂Z_m |_{(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1})} Z_m^{n−1}].

Since the second term of the above equation is equal to a constant, the reflectance map R_k over S_k is a linear function of the depth values Z_i, Z_j, and Z_l of the three vertices of S_k. We may rewrite R_k in terms of all nodal depth variables Z_m, m = 1, . . . , M_n. That is,

    R_k ≈ Σ_{m=1}^{M_n} α_km Z_m + β_k,    (4.1)

where

    α_km = { ∂R_k(Z_i, Z_j, Z_l)/∂Z_m |_{(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1})},  if m ∈ V_k = {i, j, l},    (4.2)
           { 0,  otherwise,

and

    β_k = R_k(Z_i^{n−1}, Z_j^{n−1}, Z_l^{n−1}) − Σ_{m=1}^{M_n} α_km Z_m^{n−1},    (4.3)

where V_k denotes the index set of the vertices of T_k.
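The linearization step can be sketched numerically. The derivation above differentiates the closed-form R_k; purely for illustration, the sketch below (our own function name and step size, not the authors' code) approximates the partial derivatives by central finite differences:

```python
import numpy as np

def linearize_Rk(Rk, Z_ref, eps=1e-6):
    """First-order Taylor coefficients of a patch reflectance map Rk
    about reference depths Z_ref = (Zi, Zj, Zl) from the previous
    iteration: Rk(Z) ~= sum_m a[m] * Z[m] + beta, as in (4.1)-(4.3).
    Partials are taken by central finite differences (a sketch; the
    text differentiates the closed-form Rk)."""
    Z_ref = np.asarray(Z_ref, float)
    a = np.zeros(3)
    for m in range(3):
        Zp, Zm_ = Z_ref.copy(), Z_ref.copy()
        Zp[m] += eps
        Zm_[m] -= eps
        a[m] = (Rk(Zp) - Rk(Zm_)) / (2 * eps)  # dRk/dZ_m at Z_ref
    beta = Rk(Z_ref) - a @ Z_ref               # constant term, Eq. (4.3)
    return a, beta
```

For an already-linear R_k, the recovered coefficients and constant are exact (up to rounding), which mirrors the statement that the second Taylor term is a constant.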
FIG. 5. Test Problem 2: (a) ground truth of a spherical surface; (b)–(c) photometric stereo images of a Lambertian surface; (d)–(g) photometric stereo images of a hybrid surface.
Az b. k Rk(Z n1
i , Z n1 j , Z n1
l ) M n
m1
kmZ n1m , (4.3)
Since the stiffness matrix A is sparse and symmetrsystem can be efficiently solved by iterative method
and where V k denotes the index set of vertices of T k . as the multigrid method and the preconditioned conOur objective is to determine the nodal depth Z m with gradient method [3, 31].
one or multiple images. To achieve this goal, we employa cost functional minimization approach with J different 5. EXPERIMENTAL RESULTS
photometric stereo images taken by various illuminationWe present some experimental results to demondirections while keeping the camera position fixed. The
the performance of the proposed SFS algorithm ischeme reduces to the single image SFS algorithm forsection. In the experiment, we put light sources oJ 1. The cost functional is chosen to beZ 0 plane centered around the origin. When twosources are used, they are chosen to be orthogonal toother in the azimuth angle, and when three are usedE M t
k1 J j 1
E j k M t
k1 J j 1
( I j k R j k)2, (4.4)
where $E_{jk}$ denotes the cost term corresponding to the $k$th triangular domain of the $j$th image, and $I_{jk}$ and $R_{jk}$ are the observed image irradiance and the reflectance map over the $k$th triangular domain of the $j$th image, respectively. It is worthwhile to emphasize that no regularization term is used in (4.4).

By substituting (4.1) into (4.4) and simplifying the expression, we obtain

$$E = \sum_{k=1}^{M_t} \sum_{j=1}^{J} \Big[ I_{jk} - \sum_{m=1}^{M_n} \alpha_{jkm} Z_m - \beta_{jk} \Big]^2 = \mathbf{z}^T A \mathbf{z} - \mathbf{b}^T \mathbf{z} + c, \qquad \mathbf{z} = [Z_1, Z_2, \ldots, Z_{M_n}]^T, \tag{4.5}$$

where the stiffness matrix $A$ and the load vector $\mathbf{b}$ are the sums of the individual stiffness matrices $A_j$ and load vectors $\mathbf{b}_j$ of the $j$th image, respectively. Mathematically, we have

$$A = \sum_{j=1}^{J} A_j, \qquad \mathbf{b} = \sum_{j=1}^{J} \mathbf{b}_j,$$

where the individual stiffness matrix $A_j$ and load vector $\mathbf{b}_j$ can be determined by

$$[A_j]_{m,n} = 2 \sum_{k=1}^{M_t} \alpha_{jkm}\,\alpha_{jkn}, \qquad [\mathbf{b}_j]_m = 2 \sum_{k=1}^{M_t} (I_{jk} - \beta_{jk})\,\alpha_{jkm}, \qquad 1 \le m, n \le M_n,$$

and $\alpha_j$ and $\beta_j$ are the coefficients in (4.2) and (4.3) for the $j$th image. The minimization of the quadratic functional in (4.5) with respect to the nodal variables $\mathbf{z}$ is equivalent to finding the solution of a linear system of equations.

The light sources are placed 120° apart in the azimuth angle. The angle between the light ray to the object center and the camera optical axis is maintained at less than 45° for each source. Unless specified otherwise, the initial depth estimates $Z_i^0$, $i = 1, \ldots, M_n$, are set at an arbitrary constant, and no a priori knowledge about the true depth is assumed. Since the nodal points whose depth values can be determined by the SFS algorithm are irregular and sparse in the object domain, we perform interpolation to increase the resolution and visibility of the reconstructed surface. To show the performance of the algorithm in a quantitative way, the absolute and relative depth errors, as well as the orientation error, of the reconstructed surfaces are calculated and illustrated for each of the synthetic test problems where the ground truth is known.

TEST PROBLEM 1: Spherical polyhedron. We examine the performance of the proposed algorithm applied to a spherical polyhedron whose surface is composed of piecewise triangular patches, so that it fits the model described in Section 3.1. The 17 × 17 surface depth values associated with image nodal points are shown in Fig. 4a. The ideal image intensity $I_k$ of each triangle $T_k$ with respect to each triangular surface patch $S_k$ can be exactly determined by (3.8), so that $I_k$ contains no noise. We set the focal length $f = 30$, the diffuse and specular weighting factors $\rho_d = 0.6$ and $\rho_s = 0.4$, and the surface roughness parameter $k = 10$, respectively.

The recovered surface depth values from a single image generated by a light source located at $(S_x, S_y, S_z) = (52, 30, 0)$ are shown in Fig. 4b. The result with two photometric stereo images generated by two light sources at $(S_x, S_y, S_z) = (52, 30, 0)$ and $(30, 52, 0)$ is shown in Fig. 4c. We also show in Fig. 4d the reconstructed surface from three photometric stereo images with light sources at $(S_x, S_y, S_z) = (60, 0, 0)$, $(-30, 52, 0)$, and $(-30, -52, 0)$. To examine the robustness with respect to noise, we apply our algorithm to three noisy photometric stereo images generated by adding 10% pseudorandom intensity noise to the original images and show the result in Fig. 4e.
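The assembly of the stiffness matrix and load vector and the solution of the resulting linear system can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the coefficients $\alpha_{jkm}$ and $\beta_{jk}$ are taken as given inputs (in the paper they come from the linearization (4.2)–(4.3)), and the function name and array layout are our own assumptions.

```python
import numpy as np

def assemble_and_solve(alpha, beta, intensity):
    """Minimize E = sum_{j,k} (I_jk - sum_m alpha[j,k,m]*Z_m - beta[j,k])^2.

    alpha:     (J, Mt, Mn) linearization coefficients per image triangle
    beta:      (J, Mt)     constant terms of the linearized reflectance map
    intensity: (J, Mt)     observed mean intensities I_jk
    Returns the nodal depth vector z of length Mn.
    """
    J, Mt, Mn = alpha.shape
    a = alpha.reshape(-1, Mn)               # stack all J*Mt linearized equations
    r = (intensity - beta).reshape(-1)      # residual targets I_jk - beta_jk
    A = 2.0 * a.T @ a                       # [A]_{mn} = 2 * sum alpha_m alpha_n
    b = 2.0 * a.T @ r                       # [b]_m   = 2 * sum (I - beta) alpha_m
    # Stationarity of the quadratic gives the linear system A z = b
    # (the common factor 2 cancels); lstsq guards against a rank-deficient A.
    return np.linalg.lstsq(A, b, rcond=None)[0]
```

With exact (noise-free) intensities generated from a known depth vector, the solver recovers it exactly, mirroring the ideal case of Test Problem 1.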
FIG. 6. Reconstruction results of the spherical surface test problem: (a) applying the Lambertian model to two photometric stereo images of a Lambertian surface; (b) applying the Lambertian model to two photometric stereo images of a hybrid surface; (c) applying the Lambertian model to three photometric stereo images of a hybrid surface; (d) applying the general reflectance model to three photometric stereo images of a hybrid surface; (e) 1D sliced plot of reconstructed Lambertian surfaces; (f) 1D sliced plot of reconstructed hybrid surfaces.
To see the accuracy of the recovery results more clearly, we present the 1D sliced view of the reconstructed surfaces along with the original one in Fig. 4f, where the solid line represents the ground truth and the dotted, dashdot, and marked lines represent the results depicted in Figs. 4b, c, d, and e, respectively. The initial surface was set to be the plane $Z = 100$ in this experiment. We also calculate the mean and standard deviation of the absolute depth error, the mean of the relative percentage depth error, and the average orientation error of each reconstructed surface in Figs. 4b–e and summarize them in Table 1.

We observe from these results that a single image cannot provide accurate depth or orientation information. Moreover, it is interesting to see that, unlike the Lambertian case, where two images are sufficient to recover accurate results [19, 18], the non-Lambertian surface can hardly be recovered correctly with two photometric stereo images. The results in Fig. 4 and Table 1 show that, with three photometric stereo images, we can obtain robust and very accurate reconstructions of the non-Lambertian hybrid test surface. Even with images corrupted by 10% random noise, the algorithm recovers the surface robustly, with a relative depth error of less than 0.5% and an
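The three statistics reported in the tables can be computed as sketched below. The precise definitions (e.g., forward differences for the gradient, and the Euclidean error in $(p, q)$ for the orientation term) are our assumptions; the paper does not spell them out in this excerpt.

```python
import numpy as np

def depth_errors(Z_rec, Z_true):
    """Mean and std of the absolute depth error, and mean relative error (%)."""
    e = np.abs(Z_rec - Z_true)
    return e.mean(), e.std(), 100.0 * (e / np.abs(Z_true)).mean()

def orientation_error(Z_rec, Z_true):
    """Average gradient error in (p, q), with forward-difference gradients."""
    def grad(Z):
        p = Z[1:, :-1] - Z[:-1, :-1]        # p = Z(X+1, Y) - Z(X, Y)
        q = Z[:-1, 1:] - Z[:-1, :-1]        # q = Z(X, Y+1) - Z(X, Y)
        return p, q
    p1, q1 = grad(Z_rec)
    p2, q2 = grad(Z_true)
    return np.hypot(p1 - p2, q1 - q2).mean()
```

Note that a globally shifted reconstruction yields a nonzero depth error but a zero orientation error, which is exactly the pattern discussed below for the unconstrained results in Table 2.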
average orientation error of 0.066. However, we observe that when the specular factor $\rho_s$ or the surface roughness $k$ becomes larger, the accuracy of the reconstruction, even with three images, becomes degraded.
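To make the roles of the parameters $\rho_d$, $\rho_s$, and $k$ used throughout these experiments concrete, a minimal hybrid reflectance sketch is given below. This is not the paper's generalized map (2.11), which additionally models a nearby source and perspective projection; it is a simple stand-in combining a Lambertian term with a Gaussian specular lobe in the spirit of Torrance–Sparrow [34], and the exact dependence of the lobe on $k$ is our assumption.

```python
import numpy as np

def hybrid_reflectance(n, l, v, rho_d=0.6, rho_s=0.4, k=10.0):
    """Minimal hybrid model: Lambertian term plus a Gaussian specular lobe.

    n, l, v: unit surface normal, light direction, and viewing direction.
    rho_d, rho_s: diffuse/specular weights; k sharpens the specular lobe
    (a stand-in for the paper's surface roughness parameter).
    """
    n, l, v = (np.asarray(u, dtype=float) for u in (n, l, v))
    diffuse = max(np.dot(n, l), 0.0)                     # Lambertian cosine term
    h = (l + v) / np.linalg.norm(l + v)                  # halfway vector
    ang = np.arccos(np.clip(np.dot(n, h), -1.0, 1.0))    # off-specular angle
    specular = np.exp(-k * ang**2)                       # Gaussian specular lobe
    return rho_d * diffuse + rho_s * specular
```

At normal incidence with coincident source and viewer both terms peak, so the sketch returns $\rho_d + \rho_s$; away from the specular direction the lobe decays rapidly, faster for larger $k$.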
TEST PROBLEM 2: Sphere. We synthesize test images by illuminating a 129 × 129 spherical surface, as shown in Fig. 5a, via a pointwise mapping of (2.11), with the surface normal approximated by

$$\mathbf{n} = \frac{(-p, -q, 1)^T}{\sqrt{p^2 + q^2 + 1}},$$

where $p = Z(X+1, Y) - Z(X, Y)$ and $q = Z(X, Y+1) - Z(X, Y)$, and we estimate the average intensity $I_k$ of each image triangle $T_k$ on the tessellated image domain as discussed in [18]. By setting $f = 200$, we obtain perspective test images of size 64 × 64 by varying the surface reflection parameters and the source positions, as shown in Figs. 5b–g. Figures 5b and c are two photometric stereo images of a Lambertian surface ($\rho_d = 1$, $\rho_s = 0$) with light sources at $(S_x, S_y, S_z) = (150, 260, 0)$ and $(260, 150, 0)$, while Figs. 5d–g are those of a hybrid surface with $(\rho_d, \rho_s, k) = (0.5, 0.5, 7)$ and light sources at $(S_x, S_y, S_z) = (150, 260, 0)$, $(260, 150, 0)$, $(-260, 150, 0)$, and $(0, 300, 0)$, respectively.
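The forward-difference normal approximation stated above can be sketched directly in NumPy; the convention that rows index $X$ and columns index $Y$, and the cropping of the last row and column, are our implementation choices.

```python
import numpy as np

def surface_normals(Z):
    """Unit normals n = (-p, -q, 1)/sqrt(p^2 + q^2 + 1) from a depth map Z(X, Y),
    with p = Z(X+1, Y) - Z(X, Y) and q = Z(X, Y+1) - Z(X, Y) (forward differences).
    Rows index X, columns index Y; the last row/column is cropped.
    """
    p = Z[1:, :-1] - Z[:-1, :-1]
    q = Z[:-1, 1:] - Z[:-1, :-1]
    norm = np.sqrt(p**2 + q**2 + 1.0)
    return np.stack([-p / norm, -q / norm, 1.0 / norm], axis=-1)
```

For a planar ramp $Z(X, Y) = X$ the gradient is $(p, q) = (1, 0)$ everywhere, so every normal is $(-1, 0, 1)/\sqrt{2}$.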
Figures 6a and b show the reconstructed Lambertian and non-Lambertian surfaces obtained by using a Lambertian reflectance map model with the two sets of photometric stereo images in Figs. 5b and c and Figs. 5d and f, respectively. The results of applying the Lambertian reflectance map and the generalized reflectance map to the three photometric stereo images of the hybrid surface in Figs. 5e, f, and g are shown in Figs. 6c and d, respectively. The mean and standard deviation of the absolute depth error, the mean of the relative depth error, and the average orientation error of the reconstructed surfaces in Figs. 6a–d are shown in Table 2. By comparing these results, we see clearly that applying the Lambertian model to a non-Lambertian hybrid surface, using two or even three photometric stereo images, produces a substantial amount of reconstruction error and is thus not appropriate in practice.
TABLE 2
Depth and Orientation Error of the Reconstructed Surfaces in Test Problem 2

Error                  L-L2     H-L2     H-L3     H-H3   L-L2 (w. depth constr.)  H-H3 (w. depth constr.)
Mean-absolute Z      4.1282  83.2214  70.5369   6.4409        0.5869                   0.245
Std-absolute Z       0.4084  11.4276  11.3204   0.3720        0.5008                   0.267
Mean-relative Z (%)  1.1501  12.0248  17.7108   1.6058        0.2920                   0.199
Mean-pq              0.0618   0.1285   0.2573   0.0599        0.0646                   0.063
In Figs. 6e and f, we show the 1D sliced view of the reconstructed surfaces, where the solid lines denote the ground truth while the dotted lines represent the results obtained by using the correct reflectance maps (i.e., the results in Figs. 6a and d). Although the overall shapes are close to the original one, compared to the absolute depth of the ground truth the reconstructed surfaces are shifted globally by a certain amount, which is also reflected in the error statistics in the first and fourth columns of Table 2: the mean depth errors are relatively large, while the standard deviations and the orientation errors are small. By referring to the result of the ideal case in Fig. 4c, we see that this discrepancy is primarily due to the quantization noise and the noise caused by estimating the image intensity $I_k$ via an averaging process. However, this depth discrepancy can be reduced by incorporating the depth information of a single surface point. We used $Z_{ref} = 380$, the depth of the center point, as a depth constraint. We show the results in Figs. 6e and f with dashed lines, and we also illustrate the statistics of the depth error in the last two columns of Table 2. We note that the mean absolute depth error, as well as the mean relative error, is greatly reduced. These data show that more accurate reconstructed surfaces can be obtained by imposing a depth constraint when the noise level in the image is relatively small.

We also tested the effect of the distance of an object from the camera. The test results of placing the same spherical object 300, 400, and 500 units from the camera are reported in Table 3. These results show that the depth error tends to increase as the object moves farther from the camera. This can be easily expected, since the projected image resolution, and thus the depth information, becomes degraded when the object gets far away. We note, however, that due to the smoothly varying curvature of the spherical object, the orientation error does not change much.

TABLE 3
The Effect of the Distance of the Object from the Camera in Test Problem 2

Error                   300      400      500
Mean-absolute Z       2.7531   6.4409  10.6165
Mean-relative Z (%)   0.9335   1.6058   2.1435
Mean-pq               0.0664   0.0599   0.0614
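One simple way to impose a single known depth $Z_{ref}$, sketched below, is to add a quadratic penalty to the assembled system; the paper does not detail its mechanism in this excerpt, so the penalty formulation, function name, and weight are our assumptions. We take $A$ and $\mathbf{b}$ to be the assembled stiffness matrix and load vector, so that the unconstrained minimizer solves $A\mathbf{z} = \mathbf{b}$.

```python
import numpy as np

def solve_with_depth_constraint(A, b, i_ref, z_ref, weight=1e6):
    """Minimize the quadratic whose unconstrained minimizer solves A z = b,
    plus a soft constraint weight*(z[i_ref] - z_ref)^2 on one nodal depth.

    The penalty's gradient 2*weight*(z_i - z_ref) adds 2*weight to the
    diagonal entry and 2*weight*z_ref to the load vector.
    """
    A2 = np.array(A, dtype=float, copy=True)
    b2 = np.array(b, dtype=float, copy=True)
    A2[i_ref, i_ref] += 2.0 * weight
    b2[i_ref] += 2.0 * weight * z_ref
    return np.linalg.solve(A2, b2)
```

With a large weight, the constrained node is pinned (approximately) to $Z_{ref}$ while the remaining nodes keep their least-squares values, which is how a single reference depth can remove the global shift noted above.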
FIG. 7. Test Problem 3: (a) ground truth of a penny surface; (b)–(d) photometric stereo images.
TEST PROBLEM 3: Penny. The penny surface is shown in Fig. 7a, and the test images, synthesized with parameters $f = 250$, $\rho_d = 0.4$, $\rho_s = 0.6$, $k = 10$ and light sources at $(S_x, S_y, S_z) = (285, 285, 0)$, $(386, 104, 0)$, and $(104, 386, 0)$, are given in Figs. 7b–d, respectively. Figure 8a shows the reconstructed depth obtained by applying the general reflectance map with a depth constraint $Z_{ref} = 491$. In this case, the reconstruction error at the depth discontinuities is visible. For comparison, we show in Fig. 8b the 1D sliced plot of several reconstructed surfaces, where the solid, dashdot, dotted, and dashed lines denote the ground truth and the results obtained by applying a Lambertian model and a generalized reflectance model without and with a depth constraint, respectively. Table 4 compares the depth and orientation error of those reconstructed surfaces.

Compared to the previous test problems, both the depth and the orientation error become substantial, even in the case when a depth constraint is imposed. This is partially due to the error occurring around the four corner regions of the object, where the original surface has relatively large depth discontinuities. Besides, since the test surface is more complicated than the smooth spherical and cylindrical surfaces, with considerable local depth variations, the averaging effect in estimating $I_k$ is more severe. This explains why the reconstructed surface with the general reflectance model, but without the depth constraint, becomes worse for this problem than for that of Test Problem 2.

TABLE 4
Depth Error of the Reconstructed Surfaces in Test Problem 3

Error                   H-L3      H-H3   H-H3 (w. depth constr.)
Mean-absolute Z       184.28   71.5691         13.44
Std-absolute Z       11.2425    4.7793          5.29
Mean-relative Z (%)  38.4671   14.8325          1.97
Mean-pq               0.7712    0.4499          0.46

FIG. 8. Results of the penny surface test problem: (a) reconstructed surface using a general reflectance map model with a single depth constraint; (b) 1D sliced view of several reconstructed surfaces.

TEST PROBLEM 4: Golf ball. In this test, real images taken by a CCD camera are used for the experiment. The three photometric stereo images of a golf ball are shown in Figs. 9a–c. The ball is located at a distance of 45 cm in front of the camera, and the three point light sources are placed on the $Z = 0$ plane, 120° apart from each other in a circular fashion, to make the illumination angle to the center of the object approximately 30°. We performed the test with the estimated reflection parameters $\rho_d = 0.8$, $\rho_s = 0.2$, and $k = 8$, without any depth constraint. The reconstructed surface is shown in Fig. 9d. We also show the 1D depth profile along a diagonal direction in Fig. 9e. Note that the object surface is inhomogeneous and laminated with transparent paint. Even though the reconstructed surface does not produce very accurate depth
FIG. 9. Test Problem 4: (a)–(c) photometric stereo images of a golf ball; (d) reconstructed surface; (e) 1D sliced plot.
information, which may be caused by the inaccuracy of the estimated reflection parameters, one can still see from the result that the proposed algorithm does recover the complicated surface well qualitatively.

TEST PROBLEM 5: Mannequin. The test object is the head of a mannequin, as shown in Fig. 10a. Three real photometric stereo images are obtained by arranging light sources 120° apart from each other in a circular fashion on the source plane $Z = 0$. The focal length of the camera is 8 mm, and the object is placed approximately 80 cm from the camera. In this test, we make a guess on the reflection parameters, $\rho_d = 0.7$, $\rho_s = 0.3$, and $k = 7$, and do not impose any depth constraint. The reconstructed surface and the corresponding level contour plot of the marked region of the face in Fig. 10a are illustrated in Figs. 10b and c, respectively. Although there might be substantial error in the reconstructed depth due to inaccuracy in estimating the light source, camera, and reflectance parameters, as well as the measurement noise, the overall shape of the reconstructed surface looks good.

6. CONCLUSION

A new general physics-based reflectance map model, which includes diffuse and specular reflection effects, a nearby point source, and perspective projection, was derived in this research. We discussed the discretization and parameterization of the derived reflectance map in terms of the depth variables by using a triangular element surface representation, and we proposed a direct shape recovery algorithm from shaded images based on successively linearizing the reflectance map and minimizing a quadratic cost functional. The proposed method is practically attractive, since it recovers a broad range of object surfaces with different reflective properties and under various geometric and lighting environments. We performed experiments with synthetic and real images to demonstrate the excellent performance of the new method. The accurate and robust estimation of the surface reflectivity parameters and the light source geometry information is crucial for accurate surface reconstruction, which is the research topic we are currently investigating.
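The overall strategy summarized above, successive linearization of the reflectance map followed by closed-form minimization of a quadratic functional, can be sketched generically as follows. The `linearize` callback is a hypothetical stand-in for the paper's equations (4.1)–(4.3): given the current depth estimate, it must return the coefficients of the linearized irradiance equations.

```python
import numpy as np

def direct_sfs(intensity, linearize, z0, n_iter=20):
    """Successive linearization: at each pass, linearize the reflectance map
    as R(z) ~ alpha @ z + beta around the current depth estimate, then
    minimize the quadratic error sum (I - alpha @ z - beta)^2 in closed form.

    linearize(z) -> (alpha, beta), alpha of shape (n_eq, n_nodes), beta (n_eq,)
    (hypothetical interface supplied by the surface/reflectance model).
    """
    z = np.array(z0, dtype=float, copy=True)
    for _ in range(n_iter):
        alpha, beta = linearize(z)
        r = intensity - beta
        A = alpha.T @ alpha                  # normal equations of the LSQ step
        b = alpha.T @ r
        z = np.linalg.lstsq(A, b, rcond=None)[0]
    return z
```

For a toy "reflectance" $R(z) = z^2$ per node, linearizing around the current estimate makes each pass a Newton step toward $\sqrt{I}$, illustrating how repeated linearize-and-solve iterations converge on the nonlinear solution.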
FIG. 10. Test Problem 5: (a) ground truth of a mannequin; (b) reconstructed surface; and (c) level contour plot.
REFERENCES

1. E. N. Coleman, Jr. and R. C. Jain, Obtaining 3-dimensional shape of textured and specular surfaces using four-source photometry, Comput. Graphics Image Process. 18, 1982, 309–328.
2. P. Dupuis and J. Oliensis, Direct method for reconstructing shape from shading, in IEEE Conference on Computer Vision and Pattern Recognition, Champaign, IL, June 1992, pp. 453–458.
3. W. Hackbusch, Multi-Grid Methods and Applications, Springer-Verlag, Berlin, 1985.
4. R. M. Haralick and L. G. Shapiro, Computer and Robot Vision, Addison-Wesley, Reading, MA, 1992.
5. G. Healey and T. O. Binford, Local shape from specularity, Comput. Vision Graphics Image Process. 42, 1988, 62–86.
6. B. K. P. Horn, Obtaining shape from shading information, in The Psychology of Computer Vision (P. H. Winston, Ed.), McGraw-Hill, New York, 1975.
7. B. K. P. Horn, Understanding image intensities, Artif. Intell. 8, No. 2, 1977, 201–231.
8. B. K. P. Horn, Height and gradient from shading, International Journal of Computer Vision 5, 1990, 584–595.
9. B. K. P. Horn and R. W. Sjoberg, Calculating the reflectance map, Appl. Opt. 18, 1979, 1770–1779; also in Shape from Shading (B. K. P. Horn and M. J. Brooks, Eds.), MIT Press, Cambridge, MA, 1989.
10. K. Ikeuchi, Determining surface orientations of specular surfaces by using the photometric stereo method, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-3, No. 6, 1981, 661–669.
11. K. Ikeuchi and B. K. P. Horn, Numerical shape from shading and occluding boundaries, Artificial Intelligence 17, 1981, 141–184; also in Shape from Shading (B. K. P. Horn and M. J. Brooks, Eds.), MIT Press, Cambridge, MA, 1989.
12. K. Ikeuchi and K. Sato, Determining reflectance properties of an object using range and brightness images, IEEE Trans. Pattern Anal. Mach. Intell. 13, No. 11, 1991, 1139–1153.
13. B. Kim and P. Burger, Depth and shape from shading using the photometric stereo method, Comput. Vision Graphics Image Process.: Image Understanding 54, No. 3, 1991, 416–427.
14. R. Kozera, Existence and uniqueness in photometric stereo, I, Appl. Math. Comput. 44, 1991, 1–103.
15. Y. G. Leclerc and A. F. Bobick, The direct computation of height from shading, in IEEE Conference on Computer Vision and Pattern Recognition, Hawaii, May 1991, pp. 552–558.
16. C. H. Lee and A. Rosenfeld, Improved methods of estimating shape from shading using the light source coordinate system, Artificial Intelligence 26, 1985, 125–143; also in Shape from Shading (B. K. P. Horn and M. J. Brooks, Eds.), MIT Press, Cambridge, MA, 1989.
17. K. M. Lee and C.-C. J. Kuo, Shape from shading with a linear triangular element surface model, IEEE Trans. Pattern Anal. Mach. Intell. 15, 1993, 815–822.
18. K. M. Lee and C.-C. J. Kuo, Surface reconstruction from photometric stereo images, Journal of the Optical Society of America A 10, 1993, 855–868.
19. K. M. Lee and C.-C. J. Kuo, Shape from shading with perspective projection, Comput. Vision Graphics Image Process.: Image Understanding 59, 1994.
20. S. K. Nayar, K. Ikeuchi, and T. Kanade, Determining shape and reflectance of hybrid surfaces by photometric sampling, IEEE Trans. Robotics and Automation 6, 1990, 418–431.
21. S. K. Nayar, K. Ikeuchi, and T. Kanade, Surface reflection: physical and geometrical perspectives, IEEE Trans. Pattern Anal. Mach. Intell. 13, 1991, 611–634.
22. S. K. Nayar and M. Oren, Visual appearance of matte surfaces, Science 267, No. 5201, 1995, 1153–1156.
23. F. E. Nicodemus, J. C. Richmond, J. J. Hsia, and I. W. Ginsberg, Geometrical Considerations and Nomenclature for Reflectance, National Bureau of Standards, Boulder, CO, 1977.
24. J. Oliensis, Shape from shading as a partially well-constrained problem, Comput. Vision Graphics Image Process.: Image Understanding 54, No. 2, 1991, 163–183.
25. M. Oren and S. K. Nayar, Diffuse reflectance from rough surfaces, in IEEE Conference on Computer Vision and Pattern Recognition, 1993, pp. 290–295.
26. M. Oren and S. K. Nayar, Generalization of Lambert's reflectance model, in Proc. of ACM SIGGRAPH, Orlando, FL, July 1994.
27. M. Oren and S. K. Nayar, Generalization of the Lambertian model and implications for machine vision, International Journal of Computer Vision 14, No. 3, 1995, 227–251.
28. M. A. Penna, Local and semi-local shape from shading for a single perspective image of a smooth object, Comput. Vision Graphics Image Process. 46, 1989, 346–366.
29. M. A. Penna, A shape from shading analysis for a single perspective image of a polyhedron, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-11, 1989, 545–554.
30. A. P. Pentland, Shape information from shading: a theory about human perception, in Proc. of International Conf. on Computer Vision, 1988, pp. 404–413.
31. R. Szeliski, Fast surface interpolation using hierarchical basis functions, IEEE Trans. Pattern Anal. Mach. Intell. 12, No. 6, 1990, 513–528.
32. R. Szeliski, Fast shape from shading, Comput. Vision Graphics Image Process.: Image Understanding 53, No. 2, 1991, 129–153.
33. H. D. Tagare and R. J. P. deFigueiredo, A theory of photometric stereo for a class of diffuse non-Lambertian surfaces, IEEE Trans. Pattern Anal. Mach. Intell. 13, No. 2, 1991, 133–152.
34. K. E. Torrance and E. M. Sparrow, Theory for off-specular reflection from roughened surfaces, Journal of the Optical Society of America 57, No. 9, 1967, 1105–1114.
35. P. S. Tsai and M. Shah, A fast linear shape from shading, in IEEE Conference on Computer Vision and Pattern Recognition, Champaign, IL, June 1992, pp. 734–736.
36. P. S. Tsai and M. Shah, A Simple Shape from Shading Algorithm, Tech. Rep. CS-TR-92-24, CS Dept., Univ. of Central Florida.
37. L. B. Wolff, Diffuse-reflectance model for smooth dielectric surfaces, Journal of the Optical Society of America A 11, 1994, 2956–2968.
38. L. B. Wolff, On the relative brightness of specular and diffuse reflection, in IEEE Conference on Computer Vision and Pattern Recognition, 1994, pp. 369–376.
39. R. J. Woodham, Analyzing images of curved surfaces, Artificial Intelligence 17, No. 1–3, 1981, 117–140.
40. R. Zhang, P. S. Tsai, J. E. Cryer, and M. Shah, Analysis of shape from shading techniques, in IEEE Conference on Computer Vision and Pattern Recognition, 1994, pp. 377–384.
41. Q. Zheng and R. Chellappa, Estimation of illuminant direction, albedo, and shape from shading, IEEE Trans. Pattern Anal. Mach. Intell. 13, No. 7, 1991, 680–702.