
Page 1 of 57

Illumination Simulation and Design Considerations for Mobile Virtual Reality

Systems Evan Richards

Opti 909 – Report for Master of Science in Optical Sciences

November 16, 2015

Introduction

Head mounted displays (HMDs) have been in wide military and commercial use for decades [1].

HMDs present visual information to the user, whether it be symbols, text, or imagery. These

devices come in a variety of design forms and can be transparent or occluded [2] [3]. Within the

past four years, consumer electronics versions of HMDs have been front page news through

developer kits like Google Glass Explorer Edition and the pending launch of the consumer

Oculus Rift in the first quarter of 2016.

In general, consumer HMDs fall into three classes of development – informative/microinteraction,

augmented reality, and virtual reality. Some classify the informative/microinteraction devices as

smart glasses or smart eyewear [4]. This first category includes Google Glass [5], seen in Figure

1, where the HMD may sit above the eye and outside of the straight ahead gaze of the user.

These systems tend to have small diagonal fields of view, less than 20 degrees, in order to

optimize for compactness and minimum visibility for those around the user. In an effort to

enhance usability beyond information display, these systems may include cameras, audio, and

other input interaction. Often these optics are see-through, as is the case with Google Glass, which incorporates a partially transparent beamsplitter and a large fully transparent section of the

light pipe. This allows the user to still directly see the physical environment when looking

through the device. Due to the field of view and the placement out of the straight forward gaze

position, these devices do not overlay information directly on top of the environment and are not

considered augmented reality. As these are intended for everyday use, minimizing weight and

size are critical considerations and highly constrain the optical system. This is one of the reasons

that Google Glass used a single spherical mirror to act as a magnifier and not form a pupil for the

user.

Figure 1. Google Glass sitting above the user’s eye [6].


Augmented reality (AR) systems intend to overlay content on top of the real world in real time

[7]. In this case, the optics are placed directly in front of the user’s forward gaze. These systems

require larger fields of view than the smart glasses category. Combined with the requirement for

see through performance, these systems require more complex optics for enhanced performance.

As an example of complexity, the DK40 [8], a development kit made by Lumus Ltd., uses a

collimator combined with a lightguide containing a series of parallel beamsplitters to present the

image to the user. This model offers 25 degrees FFOV, while other models offer up to 40

degrees FFOV. Illustrations of the technology are shown in Figure 2.

a) b) Figure 2. Lumus DK40 lightguide for AR [8].

Microsoft recently announced developer kits for the HoloLens system, but a minimal amount of

public information is known about the optical architecture of the system [9]. Figure 3 shows the

graphics on Microsoft’s HoloLens website, noting the use of holographic optics to create AR.

Holographic combiners can be thin as shown in the figure, but tend to have narrow FOV. This

narrow FOV is challenging for the AR space.

Figure 3. Hardware information available from Microsoft HoloLens website [9].

Freeform optics are showing great promise in this area by eliminating some of the issues that

conventional systems have scaling to larger fields of view. Additionally, demonstrations of

hardware have been performed that present integral imaging and freeform optics to provide relief

to eye fatigue, such as the optical layout shown in Figure 4 [7]. The optics enable compact


designs along with several depth planes to eliminate conflicting depth perception. With a

freeform optic providing the projection, a secondary freeform corrector lens is added to the

system to prevent distortion of the see through image. In addition to multiple depth planes,

compact eye tracking has been demonstrated for AR freeform systems by placing an IR camera

near the microdisplay [10]. Even with freeforms, AR systems can still be bulky and very

apparent when on the face of the user.

Figure 4. AR freeform system consisting of freeform combiner and freeform see through corrector.

This system also generates several depth planes [7].

In contrast to AR systems, virtual reality (VR) systems intend to fully replace the physical world.

As such, the optical designs incorporate fully occluded optics and allow a variety of design

options for screen size and optical complexity. These systems have full fields of view (FFOV) greater than 90 degrees in order to immerse the user and provide a sense of

VR presence. The size of the system enables the use of much larger (diameter and thickness)

optics to create the wide field of view. Some of these systems, such as the Oculus Rift DK2, use

rotationally symmetric systems with one single lens placed in front of each eye to magnify the

display in a non-pupil forming architecture. Systems like Oculus Rift make use of powerful

desktop PCs to render a large number of pixels at high frame rates. This requires the user to be

relatively stationary or within a bounded environment. Thus, occluded displays can be used and

are desired to also block out the external environment.

Mobile phones and AMOLED displays have now reached the performance level where they are

capable of rendering VR experiences. In 2014, Oculus VR and Samsung Electronics announced

the Gear VR Innovator Edition mobile virtual reality headset [11]. The system consists of two

separate parts – a Galaxy Note 4 cell phone and a headset rig. The phone is installed into the

headset rig and locks into place relative to magnifying optics. Mechanical features locate the

display inside the front focal point of the lens to create an enlarged, virtual image in front of the

user. This is a non-pupil forming architecture. Views of the system with and without the Note 4

installed, along with the user’s view of the optics, are shown in Figure 5. Since that time, a new


innovator edition has been released for the Galaxy S6 smartphone and has been additionally

buoyed by the announcement of a $99 version of Gear VR [12].

a) b)

c)

Figure 5. a) Separated Gear VR and Note 4. b) Note 4 installed into Gear VR harness. c) View of

system while on from the user side [13].
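The magnifier arrangement described above, with the display located just inside the front focal point so that an enlarged virtual image forms in front of the user, can be illustrated with the thin-lens equation. The focal length and display distance below are purely hypothetical, not actual Gear VR values:

```python
def thin_lens_image(f_mm: float, obj_mm: float) -> float:
    """Gaussian thin-lens equation, 1/s_i = 1/f - 1/s_o.
    A negative return value means a virtual image on the
    same side of the lens as the object."""
    return 1.0 / (1.0 / f_mm - 1.0 / obj_mm)

# Hypothetical magnifier: 22 mm focal length, display 20 mm away
# (inside the front focal point, as in a non-pupil-forming HMD).
s_img = thin_lens_image(22.0, 20.0)
mag = -s_img / 20.0

print(f"image distance: {s_img:.0f} mm")  # negative, i.e. virtual
print(f"magnification:  {mag:.1f}x")
```

Moving the display slightly relative to the focal point changes the virtual image distance, which is how the Gear VR focal adjustment works.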

Table 1 lists the requirements for different classifications of HMDs. The authors of the

comparison do not explicitly call out requirements for AR. In this regard, many of the

requirements for smart glasses can be applied to AR headsets, but with larger field of view and

proper registration to overlay content on the real world. VR headsets also require spatial and

inertial tracking that smart glasses do not require.


Table 1. Comparison of requirements for various HMDs [4].

Illumination Considerations for Consumer HMDs

As with many other systems, it is important to evaluate the complete performance before

entering production. Non-sequential ray tracing models allow the analysis to include multiple

interactions between the optical and mechanical elements that are not captured in sequential ray

tracing models like Zemax or Code V. Material properties, such as coatings and absorption, can

be easily included and analyzed. The flexibility to define the illumination source size, location,

spectral content and spatial and angular distributions provides the essentials for defining the

requirements of the display or analyzing candidate technology.

The three classes of devices each have their own unique challenges that require analysis using a

non-sequential model. The first case is to analyze the illumination path exclusively, with the

source, optics and mechanics being the surfaces analyzed. Using Google Glass as an example

for the microinteraction category, the compact size of the beamsplitter creates the risk of

reflections of the primary image. If the top and bottom surfaces of the beamsplitter are specular

or nearly specular, a case may occur as seen in Figure 6. Some of these reflections can be seen

on the Google Glass Explorer Edition. Similar cases may occur in AR systems as well, since compact packaging is desired. Holographic and diffractive systems carry the risk of multiple images or crosstalk between spectral channels. These effects appear as overlapping images once the analysis includes the complete system and its full spectral parameters. For VR

systems, the primary concern is eliminating distractions presented to the user. Since the Gear

VR uses rotationally symmetric optics, the main concern is stray or scattered light from the

optics and housing.


a) b) Figure 6. a) Desired image and b) example parasitic reflections from the surfaces of the

beamsplitter in Google Glass. Images adapted from [14].

The second case is to evaluate the impact of light from the environment to the user. For the

cases of microinteraction and AR devices, the environment plays a very strong role. These

devices have the goal of showing information out in the world, either informatively or overlayed

upon the environment. Optics with planar surfaces, such as Google Glass or the visor from

HoloLens, can reflect sunlight or other bright light sources to the user. These appear like glare

from the road when driving, or reflections from the surface of a lake. Making an illumination

model with external sources is the way to analyze this case. In VR systems, the concern is any

light leakage from the outside environment to the user. This can come from gaps or holes

between the system and the user’s face or gaps that occur between the cellphone and the headset

for the case of Gear VR. This can be analyzed using an external source, such as a lamp or the sun, and a CAD model of the user’s head with the Gear VR pressed against it.

For this report, the illumination path from display to the user’s eye will be evaluated for the Gear

VR system. Since no design files are available for the Gear VR, this model will be created

entirely based on measurements of a purchased Gear VR headset and Note 4 phone. Some of

these measurements may not be exact, such as physical dimensions measured with calipers rather

than a coordinate measuring machine (CMM). Reasonable assumptions and design methods will

be used along the way to create the final model. The case of stray light from the environment

will not be addressed in this report.

The objectives of this report are as follows: 1) perform on- and off-axis measurements to model

the Note 4 display, 2) create a viewing lens design based on measurements and specifications of

the Gear VR headset, 3) create a reasonable solid model of the opto-mechanical mounting of the

Gear VR, 4) import all geometry into LightTools for the illumination model, 5) verify

performance of the initial model, 6) update the eye model used as a receiver for both narrow and

wide FOV assessment, and 7) perform a stray light analysis of the system.


Note 4 Measurements

The source is arguably the most critical part of an illumination model to set up. In the case of

LCD systems, a fully defined model will typically include a reflector, textured backlight, optical

source (like LEDs), diffusers, brightness enhancing films, liquid crystal performance, and

polarizers [15]. An example is shown in Figure 7.

Figure 7. Typical LCD and backlight structure [15].

If the 3D geometry and materials of these items are not available, one can collect or specify

performance for the flux, distribution, viewing angle and spectral/color content. This is the case

used in this analysis for the AMOLED display in the Gear VR. For cell phone displays like the Note 4, manufacturers typically provide only basic information. All display information from the Samsung Note 4 website is listed in Table 2.

The information from Samsung was useful to create the size of the display in the illumination

model.

Table 2. Key display parameters as listed on Samsung Galaxy Note 4 website [16].

Item Specification

Display type Quad HD Super AMOLED

Display resolution (HxV, pixels) 2560x1440

Display diagonal (mm) 143.9

Display pixel pitch (mm, calculated) 0.049

Phone dimensions (WxLxH, mm) 153.5x78.6x8.5
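The calculated pixel pitch in Table 2 follows directly from the listed diagonal and resolution:

```python
import math

# Note 4 display values from Table 2.
h_px, v_px = 2560, 1440
diag_mm = 143.9

diag_px = math.hypot(h_px, v_px)          # pixel count along the diagonal
pitch_mm = diag_mm / diag_px              # assumes square pixels

print(f"pixel pitch: {pitch_mm:.3f} mm")  # -> 0.049 mm
```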

Gathering the spectral content from the display is a basic requirement for the illumination model.

With the AMOLED display, three spectral peaks are expected and those peaks can be used as the

primary wavelengths for the viewing optics design. Since this isn’t readily provided by

Samsung, measurements of a sample Note 4 display can be made to determine this information.

A spectroradiometer, such as those made by Photo Research Inc. or Konica Minolta, will

measure luminance, spectral radiance, and chromaticity over a small cone angle of <2 degrees

[17] [18]. These measurements were taken in the lab with a Photo Research PR-745


spectroradiometer. The instrument had been calibrated within the past year at the Photo

Research headquarters.

For measurement setup, the Note 4 was oriented vertically using adhesive to mount the Note 4 to

a pair of optical posts on a rotation stage. The PR-745 was oriented such that its optical axis was

horizontal and normal to the display surface using the mounting features of the optical table. For

each measurement, the display was rotated to the desired measurement angle. The aim spot was

checked in the visual viewfinder of the PR-745 to ensure that the measurement was spatially

centered on the center of the display and that no clipping occurred on the sides of the display.

Figure 8. Setup for measuring Note 4 display performance over viewing angle. Display emitting

surface centered over rotation stage axis. Viewed from top down.

The Note 4 has several modes of operation that change the performance of the screen [19]. To

be consistent throughout the testing, the display was set to maximum brightness with the

adjustment for ambient light turned off. Basic Mode (sRGB/Rec. 709) was selected for the color

space although the color space used on the Note 4 during VR operation is not known. Images for

testing were created in Adobe Photoshop and saved as BMPs with no color management.

Images were displayed on the device using the default Google Photos app after being transferred

via USB. The parameters and results of the on-axis measurement at the center of the screen are

summarized in Table 3. Chromaticity values are reported using CIE 1931 2 degree observer.

Measured spectral radiance is shown in Figure 9.


Table 3. On-axis measurement parameters and results for the Note 4 display. All values collected

via laboratory measurement.

Item Specification/Result

Instrument Photo Research PR-745 with MS-75 lens

Measurement aperture (degrees) 2

Measurement bandwidth (nm) 4

Instrument to display distance (m) 0.57

Display operation mode Basic Mode

Display system settings Full brightness, ambient light sensor off

White image brightness (nits) 330.1

Red image brightness (nits) 70.82

Green image brightness (nits) 232.6

Blue image brightness (nits) 20.45

White color coordinate (x, y) (0.3100, 0.3271)

Red color coordinate (x, y) (0.6370, 0.3344)

Green color coordinate (x, y) (0.2856, 0.6060)

Blue color coordinate (x, y) (0.1541, 0.0537)

White image CCT (K) 6457

Figure 9. Plot of on-axis spectral radiance for white, red, green and blue primary images.
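As a rough cross-check on the measured white point, McCamy's polynomial approximation estimates CCT from the CIE 1931 (x, y) coordinate. The approximation typically agrees with an instrument's reported CCT only to within a few percent, so a small mismatch against the Table 3 value is expected:

```python
def mccamy_cct(x: float, y: float) -> float:
    """McCamy (1992) approximation of correlated color temperature
    from CIE 1931 (x, y) chromaticity."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Measured white point from Table 3.
cct = mccamy_cct(0.3100, 0.3271)
print(f"approximate CCT: {cct:.0f} K")  # within a few percent of the measured 6457 K
```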

In order to determine if the display was Lambertian as assumed, the measurements were taken at

various angles relative to the display surface normal. Equations for luminance are shown in Eqs.

(1) and (2).

Lv = d²Φv / (dAs,proj dΩ)   (1)

L(r, θ, φ) = L(r) = Ls   (2)

[Figure 9 chart: spectral radiance (W/sr/m²/nm) vs. wavelength, 380 to 780 nm, with curves for the White, Red, Green and Blue images.]


Equation 2 assumes that the source has uniform spatial emission. The spectroradiometer

measures a fixed angle and is focused at a single distance. The measured result will be an

average over the spatial measurement area. This means that dAs,proj and dΩ are constant for the

spectroradiometer measurements. Thus, if the source is Lambertian, all luminance

measurements will be the same regardless of angle. The Note 4 display was measured using the

same conditions listed above from 0 to 60 degrees AOI relative to the surface normal, in steps of

5 degrees, using a white image. The luminance measurements are shown in Figure 10.
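The constancy condition of Eq. (2) is easy to test numerically. The sketch below flags a display as non-Lambertian when its luminance deviates from the on-axis value by more than a chosen tolerance; the sample values are illustrative only, not the measured data:

```python
def lambertian_deviation(lum_by_angle):
    """Maximum fractional deviation of luminance from the on-axis
    (0 degree) value. A Lambertian emitter would return ~0 at
    every angle, since its luminance is constant over angle."""
    l0 = lum_by_angle[0.0]
    return max(abs(l - l0) / l0 for l in lum_by_angle.values())

# Illustrative luminance values (nits) vs. angle, not the measured data.
sample = {0.0: 330.0, 10.0: 328.0, 20.0: 310.0, 30.0: 280.0}
dev = lambertian_deviation(sample)
print(f"max deviation: {dev:.1%}")
if dev > 0.05:
    print("not Lambertian; model the angular apodization explicitly")
```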

Figure 10. Plot of luminance over angle measurements for Note 4.

The display does not exhibit Lambertian behavior beyond 10 degrees to the surface normal. This

may be caused by the performance of the circular polarizer on top of the display. It is

worthwhile to also examine the spectral content variation over angle. Minor color variations were observed visually and confirmed in the data. Because some of the curves are closely spaced, the plots in Figure 11 are shown in intervals of 10 degrees. Given this non-Lambertian behavior and the interest in color over viewing angle, both will need to be addressed in the LightTools model.

[Figure 10 chart: luminance (nits) vs. angle from normal, 0 to 60 degrees.]


Figure 11. Spectral Radiance of Note 4 display as measured over angle. AOI denotes the angle

relative to the display normal. Rotations were performed in the horizontal axis.
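In the illumination model, a measured spectrum like this becomes a sampling distribution for ray wavelengths. A minimal sketch of how a Monte Carlo ray tracer can draw wavelengths from a tabulated spectrum by inverse-CDF sampling (the three-peak table below is illustrative, not the Note 4 data):

```python
import bisect
import random

def make_wavelength_sampler(wl_nm, power):
    """Build an inverse-CDF sampler from a tabulated spectrum.
    wl_nm: sorted wavelengths; power: relative spectral power."""
    total = sum(power)
    cdf, run = [], 0.0
    for p in power:
        run += p / total
        cdf.append(run)
    def sample():
        # Map a uniform random number through the cumulative distribution.
        return wl_nm[bisect.bisect_left(cdf, random.random())]
    return sample

# Illustrative OLED-like spectrum: three peaks (nm) and relative powers.
wl = [458, 530, 612]
pw = [0.2, 0.5, 0.3]
draw = make_wavelength_sampler(wl, pw)
counts = {w: 0 for w in wl}
for _ in range(10_000):
    counts[draw()] += 1
print(counts)  # roughly proportional to pw
```

A finer wavelength grid (e.g. the measured 4 nm bandwidth samples) works the same way.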

It is advantageous to include this information on a spectral basis into LightTools. Rays are

generated within the specified wavelength region for the source and the proper power applied

based on the particular wavelength of the ray. For comparison purposes, it is easier to see the

white point plotted in different forms. First, Figure 12 shows the color difference at various

viewing angles relative to viewing the display at the surface normal. Second, Figure 13 shows

the plot of these white points overlaid on the CIE 1931 and CIE 1976 color diagrams.

Perceptually, a small color shift can be observed when looking at higher angles relative to the

display normal even though the grouping on the color space diagrams appears tight. This will be

an item to watch through later analysis.
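The Δu'v' values in Figure 12 come from converting each (x, y) measurement to the CIE 1976 uniform chromaticity scale and taking the Euclidean distance to the on-axis point. A sketch using the standard transform (the off-axis point below is illustrative):

```python
def xy_to_uv(x: float, y: float):
    """CIE 1931 (x, y) to CIE 1976 (u', v')."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

def delta_uv(xy_a, xy_b) -> float:
    """Euclidean color difference in the CIE 1976 UCS diagram.
    Differences around 0.004 are commonly taken as just noticeable."""
    ua, va = xy_to_uv(*xy_a)
    ub, vb = xy_to_uv(*xy_b)
    return ((ua - ub) ** 2 + (va - vb) ** 2) ** 0.5

# On-axis white point from Table 3; the off-axis point is illustrative.
on_axis = (0.3100, 0.3271)
off_axis = (0.3050, 0.3300)
print(f"du'v' = {delta_uv(on_axis, off_axis):.4f}")
```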

Figure 12. Color difference in CIE 1976 from on-axis measurement for Note 4 display.

[Figure 11 chart: spectral radiance (W/sr/m²/nm) vs. wavelength, 380 to 780 nm, with curves for 0° through 60° AOI in 10° steps.]

[Figure 12 chart: Δu'v' from the on-axis measurement vs. angle from normal, 0 to 60 degrees.]


a) b) Figure 13. Scatter distribution of the white point by viewing angle for Note 4 display in a) CIE

1931 and b) CIE 1976 color spaces. Data labels not included due to tight grouping. CIE images

adapted from [20] and [21].

Viewing Lens Design

As previously noted, the lens prescription for the Gear VR headset rig is not available.

Inspection shows that the system uses a single lens with no coatings on the lens surfaces. The

starting parameters for the design come from the Samsung Gear VR website and are listed in Table 4.

The interpupillary distance for the headset is adjustable using a knob on the headset. For ease of

nomenclature, surface 1 will designate the lens surface closest to the user’s eye and surface 2

will designate the lens surface closest to the display.
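Since inspection shows uncoated lens surfaces, the Fresnel equations set the scale for both transmission loss and ghost reflections in the later stray light analysis. A normal-incidence sketch, using the PMMA index of 1.4918 quoted later in the design discussion:

```python
def fresnel_r_normal(n1: float, n2: float) -> float:
    """Normal-incidence Fresnel reflectance at a dielectric interface."""
    return ((n2 - n1) / (n2 + n1)) ** 2

n_pmma = 1.4918
r = fresnel_r_normal(1.0, n_pmma)
print(f"per-surface reflectance: {r:.1%}")      # ~3.9%
print(f"two-bounce ghost level:  {r * r:.2%}")  # ~0.15% of the signal
```

Reflectance rises at oblique incidence, so these numbers are a lower bound for the wide field angles in this system.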

Table 4. Key system parameters as listed on Samsung Gear VR website [22].

Item Specification

Full field of view (per eye, degrees) 96

Focal adjustment Included

Interpupillary Distance (IPD) setting (mm) 55-71

Headset dimensions (WxLxH, mm) 198x116x90

Since the physical parts are available for inspection, a number of parameters were measured

from the Gear VR hardware. For the lens itself, the clear aperture was directly measured at

36mm. Using a series of measurements on the housing to the lens vertices, the lens thickness

was determined to be 12mm with a distance from the last surface to the screen of 26mm. Using

the thickness of the lens retainer, the sag on surface 1 was constrained to be less than 5.5mm.

This sag requirement is important to not have a highly protruding lens surface towards the user’s

eye.


The lens was entered into Zemax, tracing from the stop to the display, and made into a rough

spherical version of the lens as a starting point. The object was placed at infinity. The eye relief,

which was initially set to 15mm, was reduced to 10mm for vignetting reasons. For material

selection, a low cost polymer option was selected. Two designs were run in parallel – one with

polycarbonate and one with PMMA. At first, the higher index of the polycarbonate was

appealing (1.5855 vs. 1.4918), but the PMMA was selected for its higher Abbe number when

considering polychromatic performance (57.441 vs. 29.909). From a manufacturing perspective,

PMMA also produces less wear on the injection molding tool. Finally, the design wavelengths

were selected based on the measured peak wavelengths for the red, green and blue output of the

AMOLED display.

Since the system is tracing from the stop to the display, the complete system appears like a

conventional landscape lens. Setting an appropriate stop size is important to understand the

magnitude of aberrations that the user will encounter in typical use. This value can be

determined from experimental results that measured eye pupil diameter as a function of ambient

brightness. Figure 14 shows two studies measuring pupil diameter for extended illumination in

young eyes and narrow 10 degree illumination [23]. With the Note 4 display measuring at 330

nits, a reasonable selection of pupil size for this display is 4-6mm. The lens was optimized at

5mm stop diameter, and then stopped down to 4mm to reduce the vignetting at the largest field

angle.

Figure 14. Plot of human pupil diameter as a function of illumination. Extended illumination is

shown with the blue line, and 10 degree illumination is shown with the red line [23].

Field points were added in Zemax to sample every 5 degrees from 0 to 45 degrees and a final

field at 48 degrees was added. A default merit function to optimize for RMS spot size (x + y

instead of radius) relative to the centroid was included with the sag requirement for surface 1.

All field points were given equal weights in the merit function. This was due to the lack of eye

tracking and thus the eye can look at any point in the field and not just the central foveal region.


The system was iteratively optimized. The first optimization allowed the radii of curvature to

vary, with the resulting lens showing very strong field curvature. The second optimization

allowed the radii of curvature to vary along with the conic for surface 2. This result improved

the field curvature substantially. Finally, the radii of curvature and conics for both surface 1 and

surface 2 were allowed to vary. The key parameters and results are listed in Table 5. The optical

prescription is listed in Table 6.

Table 5. Key system parameters, specifications and results for the viewing lens design.

Item Specification Result

Virtual image distance (m) Infinity Same

Full field of view (degrees) 96 Same

Measured clear aperture (mm) 36 Same

Measured lens thickness (mm) 12 Same

Measured lens to screen distance (mm) 26 Same

Eye relief (mm) 15 10

Pupil diameter (mm) 5 4

Lens material PMMA or Polycarbonate PMMA

Eye side lens surface sag (mm) <5.5 4.261

Edge thickness (mm) >0 2.083

Design wavelengths (nm) 458, 530, 612 Same

Table 6. Optical prescription for viewing lens. All units in mm.

Surface Radius Thickness Material Semi Diameter Conic

0 - Object Infinity Infinity Infinity

1 - Stop (Eye Relief) Infinity 10.000 2.000

2 - Lens 35.424 12.000 PMMA 18.000 -2.219

3 -22.475 26.000 18.000 -3.180

4 - Image Infinity - 23.937
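The sag and edge-thickness results in Table 5 can be reproduced from the Table 6 prescription with the standard conic sag equation, z = cr²/(1 + sqrt(1 − (1 + k)c²r²)):

```python
import math

def conic_sag(radius_mm: float, k: float, r_mm: float) -> float:
    """Sag of a conic surface (curvature c = 1/R, conic constant k)
    at semi-aperture r, in the standard form used by Zemax."""
    c = 1.0 / radius_mm
    return c * r_mm**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r_mm**2))

# Table 6 prescription, evaluated at the 18 mm semi-diameter.
sag1 = conic_sag(35.424, -2.219, 18.0)   # eye-side surface
sag2 = conic_sag(-22.475, -3.180, 18.0)  # display side; negative sag (convex)

print(f"eye-side sag:   {sag1:.3f} mm")  # ~4.261, matching Table 5
edge = 12.0 - sag1 + sag2                # 12 mm center thickness, both
print(f"edge thickness: {edge:.3f} mm")  # surfaces bulge outward; ~2.083
```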

The layout of the lens is shown in Figure 15. Key performance metrics, including field

curvature, distortion, spot diagrams, and MTF, are shown in Figure 16. The spot diagrams show

the lateral color caused by using a single refractive element. Given the narrow spectral profiles

for the red, green and blue, each can be corrected for lateral location individually in software.

The MTF for the system is also shown for 530nm and the polychromatic case which includes the

peaks for the red and blue wavelengths. With the tendency of looking straight forward, the low

field angles are more important than the edge fields. The distortion is also notable for a design such as this, and is in line with expectations. It is of less concern because it can be corrected in rendering and is not readily apparent when using the Gear VR system.

Distortion characteristics will be something useful as a diagnostic in the illumination model to

determine that everything has been set up properly.


Figure 15. Viewing Lens Layout.

a) b)

c) d) Figure 16. a) Field curvature and distortion plot. Scale on field curvature is ±3mm and distortion

is ±30%. b) RMS spot sizes for fields of 5, 10, 15, 20, 25, 30, 35, 40, 45, and 48 degrees HFOV.

Scale is 1000 microns. c) Monochromatic MTF plot at 530nm for all fields. d) Polychromatic MTF

plot for 458, 530 and 612nm. Pupil diameter set to 4mm for all analysis.

One of the advantages to setting the system up in this way is that performance is compared at the

display plane. The designer knows the pixel size, pixel spacing, and curvature of the display.

Resolution is evaluated based on the size and spacing of the individual display pixels and


requires no assumptions about the performance or behavior of the human eye. The distortion

curve from Zemax gives the information to pre-distort the image so that it appears undistorted.

Chromatic aberration can be corrected in a similar fashion. Field curvature plots appear as variation from the display plane and provide a comparison most designers are familiar with, such as a camera lens imaging onto a sensor.
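The pre-distortion step can be sketched in a few lines. This is a minimal illustration, assuming the Zemax curve reports distortion as (r_distorted - r_ideal)/r_ideal in percent; the sample curve values below are invented for the example, not taken from the Gear VR lens.

```python
import numpy as np

# Hypothetical distortion samples exported from Zemax: field angle (deg)
# vs. pincushion distortion (%).  Illustrative values only.
field_deg = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 48.0])
distortion_pct = np.array([0.0, 1.2, 4.8, 11.0, 19.5, 27.0])

def predistort_radius(r_target, r_max):
    """Radius at which to render a pixel so that, after the lens applies
    its pincushion distortion, it appears at the intended radius r_target.
    Evaluates the distortion at the target angle (a one-step approximation;
    an exact solution would iterate)."""
    theta = (r_target / r_max) * field_deg[-1]   # map radius to field angle
    d = np.interp(theta, field_deg, distortion_pct) / 100.0
    return r_target / (1.0 + d)                  # barrel pre-distortion
```

Applying the inverse mapping per color channel, using each channel's own distortion curve, is what allows the lateral color correction described earlier.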

CAD Creation

In order to create the housing for the system, SolidWorks was used starting with a rectangular

block design. As seen in Figure 5, the screen side has circular cutouts that allow the light from

the display to reach the optics. A conical cutout sits immediately against the lens. An additional cutout in the middle of the housing likely serves a light-weighting purpose. On the user

side of the lens, a circular retaining mount holds the lens in place. These features are shown in

detail in Figure 17. All features were measured using a pair of Mitutoyo calipers.

The existing lens surfaces from the Zemax design were maintained as clear apertures. A 1.0 mm

wide flange was added to the outside of the lens to assist in mounting. A 0.050 mm gap was left

on any side of the flange for placement of adhesive. There are no plans to include this adhesive

in the model.

A single IPD value of 66mm was chosen for the spacing between the optical axes for the two

eyes. This value was within the specified performance range of the IPD adjustment. By using

mirroring features in SolidWorks, this could be easily updated to reflect a different IPD value.

On the user side, a rounded section creates a facial interface that can be seen in Figure 5. That interface is not included here, as it is assumed that the majority of stray light will come from the housing features between the display and optics rather than the facial interface.


a)

b) c) Figure 17. a) Measured dimensions of internal portion of the rig housing. Units in mm.

Highlighted features are revolved around the line 33mm from the centerline. b) Top cross section

view showing optics held in place. c) Isometric cross section view highlighting the lightweighting

cuts in the housing.

Preliminary LightTools Model Verification

First, the STEP file from SolidWorks was imported into LightTools. All surfaces were set to

Mechanical Absorbers and perfectly absorb incident light. The lenses were made in native

LightTools geometry using the Quick Lens command. Flanges were added and all lens surfaces were set to 100% transmission or TIR mode. Later in the analysis process, the plastic housing will

include some reflectance and scattering and the lenses will act like bare surfaces and exhibit

Fresnel reflections after the initial model performance has been verified. Figure 18 shows two

isometric views of the combined geometry in LightTools.


a) b) Figure 18. LightTools geometry of lenses and housing. No source or receivers have been added.

Lenses are shown in pink for transmitting geometry and the housing is shown in light purple.

Image a shows the side closest to the Note 4 and image b shows the side closest to the user.

The lenses were designed out of PMMA. Before proceeding with the rest of the model, the

parameters of default PMMA in LightTools were analyzed by extracting the material properties.

Figure 19 shows the intrinsic transmittance of PMMA in LightTools along with index of

refraction. As a sanity check, the extrinsic transmittance of a 3mm thick cast PMMA sheet from

polymer manufacturer Evonik is shown as well. The full parameters of this PMMA sample are

not known, so it is being used to make sure nothing is severely wrong with the default

LightTools PMMA parameters. Taking into account the Fresnel reflections from both surfaces

(~8% total), the spectrum over the visible appears correct in LightTools. The index of refraction

also seems quite reasonable.
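The quoted ~8% two-surface loss can be checked with the normal-incidence Fresnel equations, assuming n ≈ 1.49 for PMMA across the visible:

```python
# Sanity check on the "~8% total" figure: normal-incidence Fresnel loss for
# an uncoated PMMA sheet (n ~ 1.49 across the visible).
n = 1.49
R_surface = ((n - 1.0) / (n + 1.0)) ** 2   # single-surface reflectance, ~3.9%
T_external = (1.0 - R_surface) ** 2        # two surfaces, multiple bounces ignored
loss = 1.0 - T_external                    # ~7.6%, consistent with ~8%
```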


a) b)

c) Figure 19. a) Intrinsic transmittance for given thickness in LightTools and b) extrinsic

transmittance of cast Acrylite® PMMA from Evonik [24]. Of interest is the FF (gray) line as that

material does not have the UV blocking additive that the OP3 (purple) line has. c) Index of

refraction for standard PMMA in LightTools.

The source in the LightTools model was created next. A 66mm horizontal by 75mm vertical

rectangular source was added and placed to match the center of the source with the optical axis

of the right eye. Only the surface facing the optics was set to emit rays. No surface properties

were added to the geometry and the material was set to air. Air was selected as the material to avoid rays being internally reflected, as would occur for n>1. The measured spectral data from the

Note 4 panel was added into the source. The spectrum was cut off where the value was less than 1% of maximum to avoid creating rays with low power.
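The cutoff itself is a simple thresholding operation. A minimal sketch, using a Gaussian as a stand-in for the measured Note 4 spectrum (which is not reproduced here):

```python
import numpy as np

# Stand-in spectral data; the real input would be the measured Note 4 spectrum.
wavelength_nm = np.arange(380.0, 781.0, 1.0)
spectrum = np.exp(-((wavelength_nm - 530.0) / 25.0) ** 2)

# Drop bins below 1% of the peak so no low-power rays are generated.
keep = spectrum >= 0.01 * spectrum.max()
wavelength_trim = wavelength_nm[keep]
spectrum_trim = spectrum[keep]
```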

A block of glass was added in front of the display to simulate the display coverglass. One side was set to have a coating that corrects for the display luminance and spectral variations over viewing angle. Figure 20 shows this coating's performance in transmission; rays that are not transmitted are absorbed. For AOIs beyond 60 degrees, the values at 60 degrees AOI are used. The other side

was set to account for Fresnel reflectance and transmittance. This was done as setup for stray

light analysis in the system later on but will not impact the initial model verification. It is

assumed that any Fresnel effects from the side of the coverglass internal to the display are

already included in the measurement results. The glass type was selected as Schott BK7 because

the coverglass material is unknown. The source was set to be Lambertian and emission angle

restricted to be from 0 to 90 degrees from source normal to emit into one full hemisphere.


Figure 20. Coating transmittance used to model the screen performance over viewing angle.

The user sets the total flux from a source in LightTools. The program then automatically

calculates ideal luminance for Lambertian sources. To make the display the correct brightness,

the equivalent luminous flux was set to 5.1318 lumens, which resulted in 330 nits of brightness

at 0 degrees AOI. No modifications were made for the spatial uniformity of the source. Once

complete, the source was duplicated and placed in front of the left eye in the same manner.
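The 5.1318 lm figure is consistent with the Lambertian relation L = Φ/(πA) applied to the 66mm x 75mm emitting area:

```python
import math

# For an ideal Lambertian emitter, luminance L = flux / (pi * A).
flux_lm = 5.1318               # total luminous flux set in LightTools
area_m2 = 0.066 * 0.075        # 66 mm x 75 mm emitting rectangle
luminance_nits = flux_lm / (math.pi * area_m2)   # ~330 cd/m^2 (nits)
```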

Eye Model Implementation

Next, it is necessary to include an eye model to analyze the performance once light has passed through the entire system. Zemax and CodeV offer paraxial lens equivalents that some designers use when tracing from display to retina. LightTools does not offer this option and requires the construction of an eye model from real geometry. Since many eye models exist [25], one could also consider designing a replacement lens for use in the illumination model with a stop at the first element to represent the pupil. This may be a good option because some eye models, such as the Arizona eye model, include off-axis aberrations that may not be necessary for the analysis.

In order to compare the MIL-HDBK-141 Eye Model [26] and the Arizona Eye Model [25], the

prescriptions were first put into Zemax for sequential performance characterization. The

Arizona Eye Model changes with accommodation distance to simulate the performance of the

eye as the lens changes shape. This is advantageous because the model can be easily parameterized in LightTools, with a single variable controlling accommodation. Additionally, neither of these models includes gradient index materials like some newer eye models, which makes the implementation straightforward. The prescriptions used for these two

models are shown in Table 7 and Table 8. Both lenses were set to have the object at infinity,

matching the designed VR image distance.


Table 7. Prescription for MIL-HDBK-141 Eye Model [26]. All units in mm.

Surface          Radius    Thickness  Material  Semi Diameter  Conic
0 - Object       Infinity  Infinity             Infinity
1 - Cornea        7.980     1.150     367000     4.152
2 - Aqueous       6.220     2.390     336000     3.253
3 - Lens (stop)  10.200     4.060     420482     1.775
4 - Vitreous     -6.170    17.150     337000     3.480
5 - Image       -11.100       -                 10.739

Table 8. Prescription for Arizona Eye Model for no accommodation (focused at infinity) [25]. All

units in mm.

Surface          Radius    Thickness  Material  Semi Diameter  Conic
0 - Object       Infinity  Infinity             Infinity
1 - Cornea        7.800     0.550     377571     4.142        -0.250
2 - Aqueous       6.500     2.970     337613     3.662        -0.250
3 - Lens (stop)  12.000     3.767     420519     1.782        -7.519
4 - Vitreous     -5.225    16.713     336611     3.319        -1.354
5 - Image        13.400       -                 11.553

Both of these models were set to include fields up to 50 degrees HFOV. The MIL eye has a

focal length of 17.09mm, while the Arizona eye has a focal length of 16.50mm. Each lens was

set to have a 4mm entrance pupil diameter. The MIL eye only includes dispersion for the

crystalline lens, whereas the Arizona eye includes dispersion for all elements. The Arizona Eye

Model includes the use of aspheric terms while the MIL eye is all spherical. Each of these

includes the retina as a curved image plane. Layouts are shown in Figure 21.

a) b) Figure 21. Layouts for a) MIL-HDBK-141 and b) Arizona Eye Models. Scale bar is 4.0mm for

each.

In order to compare performance of the lenses, spot diagrams and MTF plots were generated for

comparison. These are shown in Figure 22 and Figure 23.


a) b) Figure 22. Spot diagrams for a) MIL-HDBK-141 and b) Arizona Eye Models. Fields are from 0 to

50 degrees in increments of 10 degrees. Scale bar is 400 microns. F, d, and C wavelengths used for

analysis.

a) b) Figure 23. Polychromatic Diffraction MTF plots for a) MIL-HDBK-141 and b) Arizona Eye

Models. Diffraction limit shown as black line.

The Arizona Eye Model shows improved performance compared to the MIL-HDBK-141 eye.

Over the FOV of interest, the performance is not drastically different between these two lenses

and either may be suitable for analysis. For additional comparison, field curvature and distortion

plots were generated. Relative illumination plots were also created. These are shown in Figure

24 and Figure 25, respectively.


a) b) Figure 24. Field curvature and distortion plots for a) MIL-HDBK-141 and b) Arizona Eye Models.

Scale on field curvature is ±2.0mm and scale on distortion is ±50%.

a) b) Figure 25. Relative illumination plots for a) MIL-HDBK-141 and b) Arizona Eye Models.

The Arizona Eye Model shows less field curvature and astigmatism than the MIL Eye Model.

Axial chromatic aberration is improved in the Arizona Eye Model, but is still present. Both

models show significant distortion at the higher field angles, with the Arizona Eye Model

showing improvement over the MIL model. Relative illumination is slightly worse for the

Arizona Eye Model at 80% for the full field.

With the comparison completed, the MIL and Arizona Eye Models were created in LightTools.

Each component was created as a lens element and touching interfaces were made to have

optical contact. All extra surfaces were set as optical absorbers. The retinas were implemented

as curved image planes. The layouts are shown in Figure 26. For the image planes, the size and

number of bins are set by the user. The size of the image plane will restrict the FOV for the

receiver. The receiver area is subdivided into bins and the rays incident on each bin are summed

for power, luminance, and chromaticity. With more bins for the same simulation, the ability to

resolve fine detail is increased, but the statistical error for the raytrace is higher since fewer samples fall within each bin. Reporting the error from the raytrace provides an estimate of

the fidelity of the results. The revised MIL eye and the Arizona eye both had an illuminance


mesh with 401x401 bins at 26.8mm square, thus slightly different sized images based on the

different focal lengths.
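As a rough check on the statistical fidelity of such a mesh, a Poisson estimate can be made under the simplifying assumption that rays fill the bins uniformly (real traces concentrate rays, so sparsely lit bins will be noisier):

```python
import math

# Back-of-envelope per-bin error for the 401x401 illuminance mesh,
# assuming the full ray budget spreads roughly evenly over the bins.
total_rays = 2_147_000_000
bins = 401 * 401
rays_per_bin = total_rays / bins            # ~13,000 rays per bin
rel_error = 1.0 / math.sqrt(rays_per_bin)   # ~0.9% one-sigma error
```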

a) b) Figure 26. LightTools layouts for a) revised MIL-HDBK-141 Eye Model and b) Arizona Eye

Model.

In order to update the eyes to match the model, the receivers were switched from radiometric to

photometric quantities. An initial forward ray trace showed that the system was working

properly and the illuminance mesh boundaries were updated to a larger size to capture the edges

of the field. However, when tracing from the display to the eye, only a small percentage of rays

were successfully propagating to the retina. This is because the display is spatially large

compared to the 4mm aperture of the eye. Additionally, the display is emitting in a 90 degree

cone from the normal, so many rays do not have the opportunity to pass all the way to the retina.

A backwards ray trace was then set up to trace from the eye to display and then update the

performance at the retina based on where it interacted with the display. An aim region was

selected on the eye to aim only at the optic and not send extraneous rays out to be absorbed by

the housing. The receiver was set to 171x171 bins at 34mm square with smoothing off.

A checkerboard was added on the display such that each square was 1.5x1.5mm. An additional circular mask of 24.630mm radius was added to match the design field of view of the lens and provide

more delineation as to the boundary of the system. With all optical geometry set to have 100%

transmission or TIR, the ray power threshold was set to 1% for tracing. The ray trace was run

backwards with 2.147 billion rays, as previously performed, for comparison. LightTools is limited to 2.147 billion rays (approximately 2^31, the signed 32-bit integer maximum) by its random number generating algorithm. These ray traces

took approximately 1.25 hours each on a laptop.


a) b)

c) d) Figure 27. On-axis illuminance meshes for a) revised MIL-HDBK-141 Eye Model and b) Arizona

Eye Model. 30 degree off-axis illuminance meshes for c) revised MIL-HDBK-141 Eye Model and d)

Arizona Eye Model.

In comparing the performance, both models appear very similar to each other. For additional

comparison purposes CIE meshes, or 2D plots, were included in the same analysis. These are

shown in Figure 28.


a) b)

c) d) Figure 28. On-axis CIE meshes for a) revised MIL-HDBK-141 Eye Model and b) Arizona Eye

Model. 30 degree off-axis CIE meshes for c) revised MIL-HDBK-141 Eye Model and d) Arizona

Eye Model.

The plots show 100% contrast since no scattering or Fresnel reflections were present in the

simulation. The expected lateral chromatic aberration from the viewing lens is visible in these

plots. The MIL and Arizona Eye Models exhibit axial chromatic aberration but not a very

pronounced lateral chromatic aberration. The advantage to using the Arizona Eye Model is the

wide field of view performance that can assess the entire system performance at once. The

disadvantages of the Arizona Eye Model are that it does not offer high resolution, contains chromatic aberration, and has significant distortion. Having a lens without these issues will

allow assessment of the headset optical system itself without introducing any error. This is the

motivation for creating an “Ideal” Eye Model and will be discussed in detail in later sections.

Stray Light Analysis

With representative performance from the display and the viewing optics in the model,

representative performance from the plastic housing was the final item for inclusion. All of the

surfaces between the display and the user are black in color. Most of the surface was textured

and, when illuminated with a green laser beam, produced a very diffuse reflection. Only one


surface had a specular appearance and it was around the retaining ring for the lens. This single

surface is shown in Figure 29.

Figure 29. Designation for black specular housing surface, shown in gray and highlighted using

green arrow. Housing is shown in dark red, lens is shown in magenta, Arizona Eye Model shown in

gold.

Visibly rough black plastic surfaces were set to Lambertian scattering with 8% reflectance. The specular surface was left as a specular reflector, also at 8% reflectance. This value was an assumption based on the estimate that the black surfaces would be 5-10% reflective.

For this analysis, a wide field of view was desired. Using the 1.5x1.5mm checkerboard pattern,

the resulting stray light will appear in the portions of the checkerboard that are dark. This makes

a good case for using the Arizona Eye Model for this simulation. Additionally, the detector was

switched from linear to logarithmic to simulate the detection method of the eye. The dynamic

range of the eye for a single scene varies in the literature from 2.4 orders of magnitude [27] to 6

orders of magnitude [28]. A value of 4 orders of magnitude was selected as an estimate since it

was between these values. The ray tracing power threshold was also dropped to 1x10^-6 to ensure

the rays would arrive properly at the detector for this thresholding.
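The linear-to-logarithmic detector setting amounts to the following mapping; a sketch assuming values below max/10^4 are clipped to the floor of the scale:

```python
import numpy as np

# Map a linear illuminance mesh onto a normalized 4-decade log scale,
# mirroring the detector setting described above.
def to_log_display(mesh, decades=4):
    peak = mesh.max()
    floor = peak / 10.0 ** decades           # anything dimmer clips to the floor
    clipped = np.clip(mesh, floor, peak)
    return (np.log10(clipped) - np.log10(floor)) / decades   # 0..1

# Illustrative values in lux; 1e-6 lux falls below the 4-decade floor.
sample = np.array([19.4, 0.194, 1.94e-3, 1.0e-6])
scaled = to_log_display(sample)
```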

When analyzing stray light, one can use selection criteria on the ray paths to see specific surface

interactions. As a preference for this setup, five different ray traces were run in order to compare

the contributions step by step. In the first ray trace, the baseline performance was set. In this

model, the viewing lens geometry was set to transmitting and all housing parts remained as

mechanical absorbers. In the second ray trace, Fresnel reflectance/transmittance was added to all

viewing lens surfaces – including the flange and edge. Ray splitting behavior was set to

probabilistic to maintain the “one ray in, one ray out” approach. In the third ray trace, the

Fresnel performance remained on the viewing lens and the specular reflection was added to the

retaining lens surface while the rest of the housing remained as a mechanical absorber. In the

fourth ray trace, the Fresnel performance remained on the viewing lens and the Lambertian

reflection was added to the whole housing except the retaining ring, which remained as a

mechanical absorber. In the fifth and final ray trace, Fresnel performance was on the viewing

lens, the retaining ring exhibited specular reflectance, and the rest of the housing exhibited


Lambertian reflectance. The illuminance meshes are shown in Figure 30 and Figure 31. Each

simulation ran with 2.147 billion rays in reverse ray trace mode, taking approximately 1 hour

each to complete.

For the first ray trace (Figure 30), the expected behavior is observed with illuminance seen only

in the white portions of the checkerboard. The black portions have illuminance below the

threshold, but in reality are 0 lux. The boundary is clear on the edge of the field. For the second

case (Figure 31a), stray light occurs in the center of the field of view from the Fresnel reflections

of the viewing lens and the coverglass from the screen. For the third case (Figure 31b), the

specular reflectance on the retaining ring adds illumination at the boundary of the field as

expected. Fortunately, this does not creep too far into the FOV, but is still present. For the

fourth case (Figure 31c), the Lambertian scattering adds noise outside of the region where the

Fresnel reflections contribute. There is a small ring section that does not appear to have stray

light from either effect. In the fifth simulation (Figure 31d), the stray light contributions all add

together to produce the final expected result.


Figure 30. Baseline illuminance mesh for stray light analysis. All geometry set as 100%

transmitting for optical components and mechanical absorber for housing surfaces.

a) b)

c) d) Figure 31. Illuminance meshes for stray light analysis using the Arizona Eye Model. All meshes include Fresnel transmittance/reflectance on the viewing lens with probabilistic ray splitting. a) Fresnel on lens only. b) Black specular retaining ring surface added. c) Black Lambertian scatter on mechanical housing. d) Black specular retaining ring and black Lambertian scatter on housing combined. All charts set to a maximum illuminance of 19.4 lux and a minimum of 1.94x10^-3 lux.


For ease of comparison, the modulation contrast was calculated at 0, 30, and 48 degrees FOV for all five cases. The results are shown in Table 9. The largest cumulative effect occurs at the

edge of the field of view. There would be significant benefit from eliminating the contribution of the Fresnel reflections from the lens, for example through the use of an AR coating.

Table 9. Modulation contrast performance for the various stray light cases.

Model Case                                                                    0 Degrees  30 Degrees  48 Degrees
None                                                                          100.0%     100.0%      100.0%
Lens Fresnel Reflection Only                                                  98.5%      100.0%      99.9%
Black Specular Retaining Ring + Lens Fresnel                                  98.5%      100.0%      98.5%
Lambertian Scattering Housing + Lens Fresnel                                  98.5%      99.8%       99.8%
Black Specular Retaining Ring + Lambertian Scattering Housing + Lens Fresnel  98.5%      99.8%       98.4%
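The report does not state the exact contrast definition; the sketch below assumes the standard Michelson modulation computed from the bright- and dark-square illuminance of the checkerboard at a given field:

```python
# Michelson-style modulation contrast, assumed here as the metric behind
# Table 9, from bright- and dark-square illuminance values.
def modulation_contrast(e_bright, e_dark):
    return (e_bright - e_dark) / (e_bright + e_dark)

# With no stray light the dark squares receive nothing and contrast is 100%;
# ~0.75% of the bright level leaking in drops it to roughly 98.5%.
c_clean = modulation_contrast(1.0, 0.0)
c_leak = modulation_contrast(1.0, 0.0075)
```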

For a future simulation, the expected scattering from the lens optical surfaces could be included

based on assumed molding conditions or the measurement of surface roughness with a white

light scanning interferometer. The modeling of the diffuse reflecting surfaces could also be

improved through measurements, although that could be time consuming or costly.

Ideal Eye Motivations

For virtual reality, one can think of the example of recreating a real world scene inside the VR.

The objective is to make the virtual scene appear as realistic as possible. The scene presented to

the user is the result of all interactions between the display, optics, and mechanical features of

the virtual reality headset.

In the previous examples, the MIL and Arizona Eye Models were used to “measure” the

performance of the VR system. As with any measurement, the measurement device must have

less intrinsic error than the required fidelity of the measurement. Likewise, the replacement eye

model used for assessment must have very little error. For the lens system that goes in place of

the eye, this would ideally form infinitesimally small image points [29], have equal relative illumination for all field angles, and be free from distortion and chromatic aberration.

The primary reason for wanting an “Ideal eye replacement” is that it gives the designer a clear

understanding of what is occurring in the VR headset. A perfect VR system analyzed using the

Arizona Eye Model would still include all the clinical levels of aberration and distortion. It

would be difficult for the designer to understand if distortion correction on the image was being

applied properly. It would be difficult for the designer to see if fine features were being

obscured since the Arizona Eye Model is not diffraction limited.

The designer should also not design the system to counteract the aberrations present in the

Arizona Eye Model. When the eye looks at a scene in the real world, all of the aberrations from

the eye are applied to the image. The brain is expecting these aberrations and compensates

appropriately for them. Since most people have eye performance that varies from the Arizona

Eye Model, it doesn’t make sense to attempt to compensate specifically for the model. This also


does not take into account cases of myopia or hyperopia, as those conditions also influence the

appearance of aberration in the eye by acting as defocus of the imaging plane [29]. This is not to

say that the Arizona Eye Model is not useful in illumination and system design, but it is to say

that alternative eye models provide significant analysis benefit. The purpose of the Ideal eye is

to create a tool that allows analysis of the rest of the system without contributing error that is

significant in the analysis.

Implementation of this scheme varies based on sequential or non-sequential raytracing codes. In

Zemax, the evaluation could be performed by reversing the existing system such that the object

is the display. A replacement for the eye would then be added at the appropriate location and the

image would be representative of the image on the retina. This eye replacement would simply be

a paraxial lens with a user specified focal length. In LightTools, this is problematic because the

chief and marginal rays are not calculated from the model. Thus, LightTools has no capability

for a paraxial lens.

Using a variety of designs that are not physically realizable, Ideal Eye Models can be designed in

sequential lens design software. Diffraction is still present in the sequential software, but is not

calculated in LightTools. This means for the illumination simulation that spot size should be the

key criteria even though other optical performance parameters will be analyzed. For an

illumination eye model, the goal will also include minimizing the number of surfaces involved

for fewer interactions during ray tracing.

Narrow Field Ideal Eye

In order to simulate the eye, it is good to have the stop as the first element of the lens system, which clearly defines the eye relief being used.

this are to create a model with low distortion, a flat image plane, no chromatic issues, and

diffraction limited performance since the eye can achieve this with a 2mm diameter pupil [30].

It is useful to have a narrow field of view eye because all rays from the simulation are spread

over the field of view when reverse raytracing. A narrow field eye allows gathering of precise

performance at a specific viewing angle. A wide field eye model allows viewing of stray light

and an overview of the complete image as previously demonstrated. Thus, these two types of

models complement each other nicely for system analysis.

For assessing the system performance, luminance, color, stray light, resolution, distortion,

chromatic aberration, and contrast performance are all of key interest. Creating an eye that is flat

fielded, or acts like a spatial luminance meter, will allow assessment of luminance and luminance

fall off throughout the system. A lens system without chromatic aberration will allow for exact

color analysis. Using the size of the cones of the eye allows a strict criterion to be set for distortion performance. The use of non-absorbing, non-scattering materials for the lens, along with perfect absorbers for mechanical features, enables the assessment of stray light caused by the

VR system. All of these can be accomplished to make excellent analysis tools.

To start this, the CodeV lens database was searched for a starting candidate for the design. Japan

Patent 61_4088 was selected and the design was input into Zemax. The lens was scaled to

16.27mm effective focal length and operated at f/4.06. This five element lens had a 20 degree


diagonal FFOV as designed. The prescription is shown in Table 10 and the initial layout is

shown in Figure 32.

Table 10. Starting prescription based on JP 61_4088. All units in mm.

Surface       Radius    Thickness  Material  Semi Diameter
0 - Object    Infinity  Infinity             Infinity
1 - Stop       4.330     0.693     589610     2.095
2             17.640     0.016                2.068
3              3.494     0.326     806409     2.021
4              2.257     0.815     618634     1.814
5              8.300     0.244                1.790
6             17.600     0.392     720420     1.752
7              2.606     3.470                1.582
8              8.413     0.354     658338     2.235
9             18.877     8.272                2.236
10 - Image    Infinity     -                  2.985

Figure 32. Starting lens layout based on JP 61_4088.

Making the lens achromatic across the visible from 380-780nm can be challenging with real

glasses. However, since this is an analysis tool the lens can be designed at a single wavelength.

The resulting index of refraction at the design wavelength can be applied to all other

wavelengths, giving the lens the same performance across all wavelengths. In this manner, the

d-line at 587.6nm was selected. This wavelength was set in the system and fields of 0, 7, and 10

degrees were set for initial lens performance. Spot diagrams, MTF, field curvature, distortion,

and relative illumination plots are shown in Figure 33.
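The effect of freezing the index at the d-line can be illustrated with a thin lens; the N-BK7 catalog indices stand in for a real glass, and the radii are arbitrary example values:

```python
# Freezing the index at the d-line value removes longitudinal chromatic
# focal shift entirely.  N-BK7 indices at the F, d, and C lines are standard
# catalog values; the radii are arbitrary illustration numbers.
def thin_lens_f(n, r1, r2):
    """Thin-lens focal length from the lensmaker's equation (mm)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

r1, r2 = 50.0, -50.0                        # mm, symmetric biconvex example
n_F, n_d, n_C = 1.52238, 1.51680, 1.51432   # N-BK7 at 486, 588, 656 nm

shift_real = thin_lens_f(n_C, r1, r2) - thin_lens_f(n_F, r1, r2)   # ~0.75 mm
shift_frozen = thin_lens_f(n_d, r1, r2) - thin_lens_f(n_d, r1, r2)  # exactly 0
```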


a) b)

c) d) Figure 33. a) Spot diagram for JP 61_4088 at 0, 7, and 10 degree fields. Scale is 200 microns. b)

Diffraction MTF for JP 61_4088. c) Field curvature and distortion plots for JP 61_4088. Scale for

field curvature is ±0.020mm and scale for distortion is ±1%. d) Relative illumination plot for JP

61_4088.

One of the reasons this lens was selected was the low distortion performance as a starting point.

To begin the optimization process, the model glasses were changed to real Schott glasses with

Zemax making the automatic selection. Typically, using the 0, 0.7 and 1.0 fields is sufficient,

but for this design 12 evenly spaced field points were used to aid in optimizing for spot size

without creating larger spots in between the sample points. The maximum field was set to 15

degrees.

Ophthalmic research on 8 subjects found a mean peak acuity of 67.2 cycles per degree, with a

mean foveal receptor spacing of 2.5 microns [31]. The highest peak acuity in the study was 84.5

cycles per degree. These values set approximate performance requirements for spot size and

distortion. Applied to the 15 degree HFOV, the distortion target should be less than 0.04%

across the entire field. A tighter value of 0.01% was selected

at the 0.25, 0.50, 0.75, and 1.0 fields. A default merit function was loaded for peak-to-valley spot

size, including constraints on glass and air thicknesses. Correction for specific aberrations was

not included. Optimization was first allowed for all radii and the back focal distance. Next, all

other thicknesses were allowed to vary. Finally, the glasses from all five lenses were allowed to

be substituted with other glasses from the Schott catalog. Alternatively, the glass selection could

have been performed with a model glass of arbitrary index. An initial optimization


using Hammer optimization to allow for the glass substitution was performed. This was

followed by approximately 2000 cycles of global optimization. The design parameters and

results are listed in Table 11 and the final prescription is listed in Table 12.
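The distortion target derived above from peak acuity can be reproduced with a quick calculation. This is one plausible reconstruction, treating one half-cycle at the highest measured peak acuity as the smallest displacement the eye can resolve:

```python
# Distortion target from peak visual acuity: the smallest resolvable
# angular displacement is half the period at the peak acuity frequency.
peak_acuity_cpd = 84.5            # cycles/degree, highest subject in [31]
hfov_deg = 15.0                   # half field of view of the eye model

half_period_deg = 1.0 / (2.0 * peak_acuity_cpd)   # smallest resolvable angle
distortion_limit = half_period_deg / hfov_deg     # as a fraction of full field

print(f"{distortion_limit:.2%}")  # ~0.04%, matching the stated target
```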

Table 11. Design specifications and results for the Ideal narrow field eye.

Item Specification Result

Effective focal length (mm) 16 Same

Entrance pupil diameter (mm) 4 Same

Stop location Surface 1 Same

First optical surface Surface 2 Same

Object / virtual image distance (m) Infinity Same

Design wavelength (nm) 587.6 Same

Distortion at any field point (%) <0.01 Same

Spot diameter <Airy disk diameter Same

Minimum glass center thickness (mm) 0.050 Same

Maximum glass center thickness (mm) 5.000 Same

Minimum glass edge thickness (mm) 0.050 Same

Minimum air center thickness (mm) 0.050 Same

Maximum air center thickness (mm) 1000 Same

Minimum air edge thickness (mm) 0.050 Same

Table 12. Prescription for Ideal Narrow Eye Model. All units in mm.

Surface Radius Thickness Material Semi Diameter

0 - Object Infinity Infinity 0.000

1 - Stop Infinity 0.001 2.000

2 14.260 4.931 SF11 2.040

3 8.501 0.159 2.448

4 12.272 3.088 FK3 2.470

5 -6.645 4.955 N-LAK33A 2.914

6 -16.869 3.457 4.187

7 30.201 1.387 SK15 5.274

8 -16.401 13.211 5.278

9 -10.062 0.050 LASF35 4.101

10 -1019.157 0.050 4.264

11 - Image Infinity - - 4.268

One of the objectives was to find an all spherical solution. This was achieved for this lens

design. Some of the glasses selected are a bit unique because of their high refractive index. The

final element has also moved closer to the image plane to act like a field flattener. The layout for

the lens is shown in Figure 34 and the performance of the lens is shown in Figure 35.


Figure 34. Layout for Ideal Narrow FOV Eye Model.

a) b)

c) d) Figure 35. a) Spot diagram for Ideal Narrow FOV Eye Model at 0, 3, 6, 9, 12, and 15 degree fields.

Scale is 10 microns. b) Diffraction MTF for Ideal Narrow FOV Eye Model. c) Field curvature and

distortion plots for Ideal narrow FOV Eye Model. Scale for field curvature is ±0.002mm and scale

for distortion is ±0.01%. d) Relative illumination plot for Ideal Narrow FOV Eye Model.

The key design objectives have been achieved with this lens, including spot sizes smaller than

the Airy disk and diffraction limited performance. The distortion is very good and the field

curvature is nearly non-existent. One item that was not constrained in the design was the relative


illumination. At the edge of the field, the relative illumination is 87%. In order to counteract

this issue, another coating can be used on a flat plate in front of the lens. This profile is shown in

Figure 36. As with the previous coatings, the power for the source will need to be increased to

counteract the lost transmission.
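One way to derive such a corrector profile is to normalize the transmittance so that the dimmest field point passes unattenuated; this is a sketch of that idea, assuming the relative illumination curve is available as sampled data (the values below are illustrative, not read from Figure 36):

```python
# Relative-illumination corrector: transmit RI_min / RI(theta) so that the
# product RI(theta) * T(theta) is flat across the field.
ri = {0: 1.00, 5: 0.985, 10: 0.945, 15: 0.87}   # illustrative RI samples

t_corrector = {theta: min(ri.values()) / value for theta, value in ri.items()}

# The flat-fielded profile is constant across the field...
flat = {theta: ri[theta] * t_corrector[theta] for theta in ri}

# ...at the cost of on-axis transmission, so source power must rise by 1/T(0).
power_scale = 1.0 / t_corrector[0]
print(t_corrector, power_scale)   # T(0) = 0.87 -> ~1.15x source power
```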

Figure 36. Angular dependence for the relative illumination corrector coating for the Ideal narrow

eye. Corrector coating is equal for all wavelengths (no spectral dependence).

The design was then entered into LightTools using the Quick Lens function to build each of the

individual elements. The cemented interface between elements 2 and 3 was made by declaring

the surface as optically contacted. A user defined material was created for each element to have

a constant index matching nd for the glasses in the optimization. Each material was non-

absorbing and the lens surfaces were set to 100% transmitting. A block of air was added at the

front interface with the relative illumination corrector coating applied. An absorbing thin skin

was added via skinned solid in sheet mode to allow only light through the optics to reach the

detector. This blocks light that did not pass through the optics from hitting the detector plane.

To verify the performance of the eye model, a Lambertian source was placed in front of the lens.

Source power was set to 1 W over the entire sphere, placed 0.1mm in front of the lens aperture,

and oversized to 8mm circular diameter to fully illuminate the lens. The wavelength was

selected as 550nm. Figure 37 shows the performance of the lens with and without the relative

illumination corrector coating in place. As expected, the illuminance at the retina is constant

across the field. With this lens now flat fielded, it can be used to compare the performance of a

simulated scene without introducing error that is significant to the simulation.



a) b) Figure 37. Illuminance mesh of narrow Ideal eye a) without and b) with relative illumination

corrector coating. Source power is the same for both measurements. Maximum scale is 2.2×10^5 lux

for both plots.

Once the Ideal lens was complete, it was placed into the system LightTools model. The lens was

positioned such that the aperture stop was 10mm from the rear lens vertex – or coincident with

the iris location for the Arizona Eye Model. This is shown in Figure 38.

Figure 38. Complete system in LightTools with Narrow FOV Ideal Eye Model.

With the eye models in place, the receiver size was changed to be 4.212mm square

corresponding to a horizontal FFOV of 15.0 degrees. The model was then rotated in the

horizontal plane to 0, 15, 30, and 45 degrees. Each ray trace took approximately 60 minutes on a

laptop, yielding 75 million backwards traced rays for about 4.75% error. The receiver was set to

401x401 pixels and did not include smoothing. Figure 39 through Figure 42 show these various

meshes for illuminance and color.
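The quoted ray count and error figure are consistent with simple counting statistics; a sketch, assuming the reported error is a per-bin Poisson estimate scaling as 1/sqrt(rays per bin):

```python
import math

# Poisson counting statistics for the backwards ray trace: error per
# receiver bin scales roughly as 1/sqrt(rays landing in that bin).
total_rays = 75e6
bins = 401 * 401                 # receiver resolution, no smoothing

rays_per_bin = total_rays / bins
error = 1.0 / math.sqrt(rays_per_bin)
print(f"{error:.2%}")            # ~4.6%, close to the reported 4.75%
```

The same scaling explains why dropping to 201x201 bins cuts ray trace time: four times fewer bins means four times fewer rays for the same per-bin error.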


a) b) Figure 39. a) Illuminance mesh and b) CIE mesh for system with Ideal eye looking on-axis.

Illuminance mesh maximum is 1.9 lux.

a) b) Figure 40. a) Illuminance mesh and b) CIE mesh for system with Ideal eye looking 15 degrees off-

axis. Illuminance mesh maximum is 1.9 lux.


a) b) Figure 41. a) Illuminance mesh and b) CIE mesh for system with Ideal eye looking 30 degrees off-

axis. Illuminance mesh maximum is 1.9 lux.

a) b) Figure 42. a) Illuminance mesh and b) CIE mesh for system with Ideal eye looking 45 degrees off-

axis. Illuminance mesh maximum is 1.9 lux.

The Ideal Narrow Eye Model shows its utility through this series of analyses. First, the

illuminance change from the system is visible and there is no contribution from the Ideal eye.

Second, the resolution is very good. The measurement at 45 degrees shows this in

particular because of the lateral color and distortion coming from the optical system and not the

eye model. Third, this model allows the peak error estimate to be reduced significantly. Here,

5% error was found during a 60 minute ray trace. This error can be additionally reduced by

tracing more rays. Alternatively, if ray trace time is a concern, this can be reduced to 15 minutes of

ray tracing time by going to 201x201 receiver bins. Based on the data, the images were stitched

together in Adobe Photoshop for illustrative purposes. One could automate the system to stitch

the data together using a macro or other piece of analysis software like MATLAB. The images

in Figure 43 show the performance of the whole system with excellent resolution.
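The stitching step could indeed be automated; a minimal sketch, assuming each 15-degree-wide mesh has been exported as a 2D array and adjacent tiles abut without overlap (the tile data here is hypothetical):

```python
# Stitch the 0/15/30/45-degree illuminance meshes into one wide mesh.
# Each mesh is a list of rows; adjacent 15-degree tiles abut without overlap.
def stitch_meshes(meshes):
    """Concatenate equally sized 2D meshes (lists of rows) left to right."""
    rows = len(meshes[0])
    return [sum((mesh[r] for mesh in meshes), []) for r in range(rows)]

# Hypothetical 3x3 tiles standing in for exported 401x401 LightTools meshes.
tiles = [[[angle] * 3 for _ in range(3)] for angle in (0, 15, 30, 45)]
panorama = stitch_meshes(tiles)
print(len(panorama), len(panorama[0]))  # 3 rows, 12 columns
```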


Figure 43. Stitched illuminance mesh (top) and CIE color mesh (bottom) from the Ideal eye model.

Scales not adjusted for illuminance mesh during stitching operation.

Color and Relative Illumination Analysis

With the Ideal Narrow FOV Eye Model in LightTools, the color and relative illumination

performance of the display and viewing optics together can be analyzed without any contribution

from the eye model. In previous analysis, illuminance and CIE mesh data were collected. For

this analysis, the same will be performed but with slight modifications. At this stage, Fresnel

reflectance/transmittance was added to the viewing lens. This used probabilistic ray behavior

rather than splitting the rays at the interfaces, preserving “one ray in, one ray out” behavior. This

represented the performance of just the lens and display together. Contributions from the

housing will be assessed in the next section.
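The probabilistic ray behavior can be illustrated with a normal-incidence Fresnel sketch: rather than splitting each ray into reflected and transmitted children, one outcome is chosen at random with probability equal to the Fresnel reflectance. The indices below are illustrative, not the actual lens materials:

```python
import random

def fresnel_reflectance(n1, n2):
    """Unpolarized Fresnel reflectance at normal incidence."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def trace_interface(n1, n2, rng=random):
    """Return 'reflect' or 'transmit' for one ray, probabilistically."""
    return "reflect" if rng.random() < fresnel_reflectance(n1, n2) else "transmit"

# Air-to-glass interface: ~4% of rays reflect, ~96% transmit, on average.
R = fresnel_reflectance(1.0, 1.5)
print(round(R, 3))   # 0.04
```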

Since continuous color and relative illumination performance are desired, the checkerboard

pattern was removed to make the source solid white. Backwards raytracing will still be used for

this simulation. The retinal receiver size is reduced to be 0.4mm in the vertical direction and

8.572mm wide since a 1D plot is desired rather than 2D. This will increase the efficiency of the

ray trace and utilize the full 30 degrees FFOV capability of the lens. The detector array was set

to 201x1 pixels to generate this radial plot. A ray trace of 100M rays took approximately 3

minutes and yielded less than 2% error. With the Ideal Narrow FOV Eye Model now operating

at 30 degrees full horizontal FOV, the ray trace was run at 15 degrees and 45 degrees relative to

the optical axis of the viewing lens.
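The receiver dimensions follow directly from the distortion-free f·tan(θ) mapping of the Ideal eye; a quick check with the 16 mm focal length reproduces both the 4.212 mm narrow-field receiver and the 8.572 mm wide receiver to within rounding:

```python
import math

# For a distortion-free lens, image height = f * tan(field angle), so the
# receiver width for a given full FOV is 2 * f * tan(FFOV / 2).
f_mm = 16.0

def receiver_width(ffov_deg):
    return 2.0 * f_mm * math.tan(math.radians(ffov_deg / 2.0))

print(round(receiver_width(15.0), 3))  # ~4.213 mm (narrow-field receiver)
print(round(receiver_width(30.0), 3))  # ~8.574 mm (full 30-degree FFOV)
```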

Here, the low distortion of the Ideal Eye Model provides another benefit. The virtual screen

appears at infinity, so the spatial locations on the receiver can be directly converted into field

angles. The illuminance has already been corrected by the Ideal eye’s relative illumination

corrector coating, so the data requires only normalization by the 0 degree illuminance. The CIE

mesh in LightTools was reported in CIE 1931 color space. In order to compare in the more

uniform CIE 1976 color space, the results were converted from CIE 1931 x and y values to CIE

1976 u’ and v’ coordinates. This allows a more useful value of Δu’v’ reported for color shift.

Equations 3, 4 and 5 show this process.


u′ = 4x / (−2x + 12y + 3)   (3)

v′ = 9y / (−2x + 12y + 3)   (4)

Δu′v′ = √[(u′_ref − u′)² + (v′_ref − v′)²]   (5)
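Equations 3 through 5 can be implemented directly; a small sketch, with the CIE 1931 D65 white point as an example input:

```python
import math

def xy_to_uv(x, y):
    """Convert CIE 1931 (x, y) to CIE 1976 (u', v') per Equations 3 and 4."""
    denom = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / denom, 9.0 * y / denom

def delta_uv(uv_ref, uv):
    """Euclidean color difference in the u'v' plane (Equation 5)."""
    return math.hypot(uv_ref[0] - uv[0], uv_ref[1] - uv[1])

# Example: D65 white point in CIE 1931 coordinates
u, v = xy_to_uv(0.3127, 0.3290)
print(round(u, 4), round(v, 4))   # 0.1978 0.4683
```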

For this case, the reference white point was determined by averaging the u’ and v’ values from 0

to 1 degrees AOI. The relative illumination and color difference are shown in Figure 44. The

relative illumination shows good performance better than 90% out beyond 44 degrees. Beyond

44 degrees, there is a sharp fall off. This fall off may be caused by the chromatic aberration at

the edge of the field where longer wavelengths will not be present. The plot was restricted to 48

degrees HFOV, corresponding to the specified FOV of the system. The color difference from

on-axis to the edge of the field is less than 0.005, which is also very good performance. The

simulation was performed using only the white image color spectrum. This simulation could

also be performed using the red, green, and blue channels individually as well.

Figure 44. Plot of relative illumination and color difference from the simulated system. Relative

illumination shown in red on the primary axis, color difference shown in blue on the secondary

axis.

Wide Field Ideal Eye

The motivations for making an Ideal Narrow FOV Eye apply equally to making an ideal eye

with a wide field of view. In this case, the FOV of the Ideal Eye should exceed the per eye FOV

of the VR system. The same constraints apply – low distortion, diffraction limited performance,

no chromatic aberration, and a flat image plane.

One of the techniques for mitigating distortion and chromatic aberration is to render the opposite

distortion and chromatic aberration introduced by the lens. The user then sees an image that



does not appear to have either distortion or chromatic aberration. The desktop Oculus Rift does

this as an example, and a sample image rendered to the display is shown in Figure 45. By having

a wide angle lens that has practically no distortion or chromatic aberration, the correction

performance can be evaluated in LightTools. This is one of the key benefits for making this

Ideal wide angle lens.

Figure 45. Sample image rendered to Oculus Rift showing distortion and chromatic aberration

correction [32].

In the case of the Ideal Narrow FOV Eye, the design started with the patent lens and was

optimized using real Schott glasses. After monochromatic optimization, the d-wavelength index

of the glass was applied to all wavelengths (i.e. each glass was made dispersionless).

Maintaining similar performance characteristics over a wide FOV will be very challenging, so

one option is to allow dispersionless glasses of n>1 for all elements. The index of refraction will

have no upper numerical limit.

This enables a much wider range of solutions. Returning to the Narrow Ideal Eye Model, twelve

variations of this eye model were analyzed to provide insight into how the numerical

fictitious glasses help to create the final eye model. The patent lens was adapted into twelve

cases – 4, 5, and 6 elements; 15 and 30 degrees HFOV; and real Schott glasses and numerical

fictitious glasses. For the case of 4 elements, one element of the doublet was removed. The

same merit function structure was used: a default merit function minimizing RMS spot size, a focal

length of 16mm, distortion < 0.01%, and spacing constraints as listed in Table 11.

Using the same merit function used to find the Ideal narrow FOV lens, each lens was Hammer

and global optimized in a similar fashion. Figure 46 shows these results.


Figure 46. Plot of Merit Function value. Results show much quicker convergence to a solution

using fictitious glasses.

When using fictitious glasses, 4 elements reach near the performance limit at 15

degrees HFOV, whereas it takes 6 elements of real glasses to match that performance. For 30

degrees HFOV, performance with 5 elements is near the best possible while the real performance

with 6 elements is still less than ideal. This provides the motivation for moving to fictitious

glasses with n>1 such that there are no issues with the program for indices less than 1. This will

allow the best solution with the minimum number of elements possible to speed up the non-

sequential ray trace.

Having demonstrated the utility of the fictitious glasses, design work on the Ideal Wide FOV

Eye began. Starting with the design for the Ideal narrow FOV eye, the cemented interface was broken

and the FOV was iteratively increased using Hammer optimization until a lens solution existed at

55 degrees HFOV. The field points were updated to still use 12 fields, now evenly spaced in

5 degree increments. Since the distortion was set to a stringent requirement, much of the merit

function was able to be reused.

One challenge from preliminary work was the observation of the ray angles incident on the

image plane. This is an issue because an aim area or aim cone is required to define the

backwards ray trace. One test solution had ray angles exceeding 60 degrees AOI on the image

plane. For that solution, ray tracing required more than 250 attempts per ray (0.4% efficiency)

to satisfy the cone and aim area settings. By restricting the cone angle, it was determined that significant

improvement could be found. This was necessary because a large diameter element near the

focal plane acted like a field flattener for many solutions. Thus, this was a necessity based on the

requirement of using a flat image plane. Setting the maximum ray angle on the detector to less

than 11 degrees balanced ray tracing efficiency against sequential design

performance. An additional diameter constraint was added to the image plane such that it was

<55mm wide. This allows two eye models to be placed side-by-side for evaluation of small

IPDs. As an additional goal of optimization, it is desired to have excellent performance at



narrow FOVs as well. In this manner the same wide angle lens can be used to gain high

resolution information by restricting the size of the detector appropriately. It is then convenient

to not have to switch eye models for the two types of analysis.

Starting with the 5 element system without doublets, the system was Hammer optimized using

automatic settings. It was then run in Global optimization mode for at least 2500 cycles. After

each run, an additional element was added in airspaces as a low power meniscus or flat. The

system was then reoptimized and this was performed iteratively until an 11 element system was

created. The merit function of each final solution was plotted in Figure 47. The goal was to

have a minimum number of elements to improve ray trace time and minimize input error. The

11 element solution looked best and was ultimately selected for input into LightTools.

Figure 47. Plot of wide Ideal lens merit function by number of elements.

In each spot diagram, the chief ray was used as the reference and the Airy disc was centered on

that location. The first system to have all geometrical rays within the Airy disc was the 10

element system. There was still marked improvement in the system going to 11 elements, so that

was selected as the final system. A 12 element system could have been tried to analyze the merit

function performance but was decided against due to computation time for the large number of

variables and the 11 element system already meeting the performance metrics. The performance

of this system matches the requirements and results set forth in Table 11. The prescription for

the 11 element solution is shown in Table 13.



Table 13. Prescription for Ideal Wide FOV Eye Model. All units in mm.

Surface Radius Thickness Index Semi Diameter

0 - Object Infinity Infinity 0.000

1 - Stop Infinity 0.012486 2.000

2 25.484636 3.110460 1.0998070 2.147

3 -3.259906 0.097488 3.069

4 -3.508120 0.345565 1.7340500 3.216

5 -4.114163 0.221424 3.620

6 -11.658705 0.061959 1.1918830 5.020

7 -210.347452 1.126908 6.110

8 -86.801440 0.131001 8.0777250 7.430

9 -69.461866 2.196302 7.439

10 -12.567586 0.505727 3.9153580 7.526

11 -11.153354 0.467530 7.566

12 -24.567679 0.155856 13.6465270 8.485

13 -23.340381 1.719599 8.508

14 -14.593251 0.084513 4.1279930 8.606

15 -16.324492 1.194084 8.793

16 -55.375366 0.319021 1.5236660 9.980

17 111.651331 6.564033 10.551

18 -75.311038 1.356898 3.2364220 13.618

19 -38.301755 5.768550 13.644

20 -15.450047 0.052458 5.2799610 13.653

21 -18.178830 0.051697 14.704

22 -431.615084 0.331581 42.8068350 23.103

23 -304.745794 0.944040 23.108

24 - Image Infinity 0.000000 22.843

The prescription shows several elements that are well beyond the typical index of refraction

values. Many of the lenses have thickness-to-diameter aspect ratios that would be impractical in a

conventional design. Out of concern for precision in LightTools, values are shown and were

input into LightTools with 6 decimal places of accuracy. The Zemax lens layout and

performance are shown in Figure 48 and Figure 49, respectively.


Figure 48. Layout for Ideal Wide FOV Eye Model.

a) b)

c) d) Figure 49. a) Spot diagram for Ideal Wide FOV Eye Model at 0 through 55 degree fields in

increments of 5 degrees. Scale is 40 microns. b) Diffraction MTF for Ideal Wide FOV Eye Model.

c) Field curvature and distortion plots for Ideal Wide FOV Eye Model. Scale for field curvature is

±0.100mm and scale for distortion is ±0.01%. d) Relative illumination plot for Ideal Wide FOV

Eye Model.

A general rule in any optical system is to minimize the number of elements. The use of 11

elements would be challenging if this were a physical system. However, since this lens will


remain analytical, the only cost is the care needed to enter the system into LightTools. Synopsys offers

a direct link between CodeV and LightTools that would minimize this work if the lens was

designed in CodeV. Since the work here was done in Zemax, manual entry will have to be

performed.

In this case, going to 11 elements is necessary to meet the desired level of performance defined

earlier. To meet the non-sequential raytracing requirements, the maximum angle of incidence on

the image plane is 11 degrees. The relative illumination shows significant change and will

require an adjustment of source power if comparing between the Narrow and Wide Ideal Eye

Models. The corrector “coating” is shown in Figure 50 for this lens.

Figure 50. Angular dependence for the relative illumination corrector coating for the Ideal wide

field eye. Corrector coating is equal for all wavelengths (no spectral dependence).

One goal was to be able to switch to a narrow FOV and still maintain the required performance

parameters. The fields were reduced to match the previous analysis on the narrow field Ideal

eye. The geometrical spot sizes are shown in Figure 51. While the spot sizes have increased, the

maximum geometrical diameter is less than 1.72 microns for both lenses over 15 degrees HFOV.

This value is smaller than the mean spacing of the retinal receptors at 2.5 microns.
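As a sanity check on these numbers, the Airy disk of the ideal eye can be compared against the geometric spot and the receptor spacing. This is a sketch; the f/4 figure follows from the 16 mm focal length and 4 mm entrance pupil stated in Table 11:

```python
# Airy disk diameter for the ideal eye: 2.44 * wavelength * f-number.
wavelength_um = 0.5876          # d-line, in microns
f_number = 16.0 / 4.0           # 16 mm focal length, 4 mm entrance pupil

airy_diameter_um = 2.44 * wavelength_um * f_number
geometric_spot_um = 1.72        # max geometric spot diameter over 15 deg HFOV
receptor_spacing_um = 2.5       # mean foveal receptor spacing [31]

print(round(airy_diameter_um, 2))                    # ~5.73 microns
print(geometric_spot_um < receptor_spacing_um < airy_diameter_um)  # True
```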



a) b) Figure 51. Comparison of geometric spot sizes for a) narrow FOV and b) wide FOV Ideal lenses

for 0, 3, 6, 9, 12, and 15 degrees field angle. Scale bar is 10 microns for each.

The Wide Ideal Eye was then entered into LightTools using the QuickLens function. All radii,

thicknesses and indices of refraction were entered with six decimal places of precision. The

semi-diameters were fixed to prevent vignetting over the 55 degree HFOV. Lens materials

were set as non-dispersive and non-absorbing. Additional mechanical structure was added to

shield the detector from stray light coming from outside the eye.

Before adding to the system model, the relative illumination corrector performance was tested

using a Lambertian source at 550nm in front of the lens. Source power was 1 W over the entire

sphere, placed 0.1mm from the lens aperture, and oversized to 8mm circular diameter to fully

illuminate the lens. Figure 52 shows the performance of the lens with and without the relative

illumination corrector coating in place. As expected, the illuminance at the retinal plane is

constant across the field. With this lens now flat fielded, it can be used to compare the

performance of a simulated scene without introducing error that is significant to the simulation.

a) b)

Figure 52. Illuminance mesh of wide Ideal eye a) without and b) with relative illumination

corrector coating. Source power is the same for both measurements. Maximum scale is 315 lux for

both plots.


Using the baseline model with no stray light, the wide Ideal eyes were placed in the model with

the stops at the appropriate eye relief of 10mm from the rear vertex. The system illustration is

shown in Figure 53 and shows the capability to place two eye models side-by-side.

Figure 53. Complete system in LightTools with Wide Ideal Eye Model.

The system model had no scattering or Fresnel reflection/loss on the components and loaded

the same checkerboard and mask settings as previously used in the Arizona Eye Model

simulation. A backwards ray trace was performed with a half cone angle of 11 degrees.

This ray trace took about 8 hours on a laptop, tracing a maximum of 2.147

billion rays. The receiver was set to 401x401 pixels and did not include smoothing, which

resulted in 1.98% error. Figure 54 and Figure 55 show the meshes for illuminance and color.

The power was not adjusted from the source to compensate for the decrease in on-axis

transmittance due to the relative illumination corrector.

a) b) Figure 54. On axis illuminance meshes for a) Arizona Eye Model and b) wide Ideal Eye Model.

Scales are set to the respective maximum for each plot and a minimum of 0 lux.


a) b) Figure 55. On axis CIE meshes for a) Arizona Eye Model and b) Wide Ideal Eye Model.

The Wide Ideal Eye Model shows several advantages compared to the Arizona Eye Model. The

chromatic aberration of the lens is visibly apparent in the ideal eye, whereas it appears to cancel

out for this complete system using the Arizona eye. The distortion is also clearly apparent and

makes a good comparison to the stitched images of the narrow field eye. An advantage of

restricting the ray cone is that a plot similar to Figure 43 can be created by simply changing the

area of the retinal plane and maintaining the same aim configuration. For the wide Ideal eye, the

lack of distortion or chromatic aberration correction on the display is clearly apparent and could

be evaluated with the proper correction. Additionally, plots of proper color and relative

illumination performance, similar to Figure 44, could be generated in one step by performing a

ray trace with a fully white screen.

With the model performance tested, the stray light simulations were run using the same

parameters as the stray light analysis performed with the Arizona Eye Model. The same five

cases were run for comparison using the same ray behavior, including the reference image shown

in Figure 55. Figure 56 shows the other four cases for the wide Ideal eye.


a) b)

c) d) Figure 56. Illuminance meshes for stray light analysis using the Wide FOV Ideal Eye Model. All

meshes include Fresnel transmittance/reflectance on the viewing lens with probabilistic ray

splitting. a) Fresnel on lens only. b) Black specular retaining ring surface added. c) Black

Lambertian scatter on mechanical housing. d) Black specular retaining ring and black Lambertian

scatter on housing combined. All charts are set to a maximum illuminance of 1.62 lux and a minimum of

1.62×10^-4 lux.

The overall results produce the same conclusions previously seen with the Arizona Eye Model.

One of the advantages to the Wide Ideal Eye Model is that the angular extent of the stray light

can be observed better than with the Arizona Eye Model.

With the final set of LightTools files completed, both forward and backwards ray tracing

performance can be analyzed. After performing a forward ray trace, LightTools has an option to

automatically calculate the aim parameters for the backwards ray trace. This method will be

compared with using a specified cone angle and setting the aim area to the clear aperture of the

last element. For the case of the specified cone angle, the ray angle on the image plane was

analyzed in Zemax to determine the maximum angle on the retinal plane. These half cone angles

were determined to be 35.17, 41.03 and 11.01 degrees for the Narrow FOV Ideal Eye, Arizona

Eye Model, and the Wide Ideal Eye Model respectively. As a comparison, forward ray traces from the display to the retina were run as well, setting the Lambertian emitting cone angle to be


15, 55 and 90 degrees. Backwards ray trace efficiency was defined by the number of rays

making it to the source as a percentage of attempted rays. Similarly, forwards ray trace

efficiency was defined as the number of rays reaching the retinal detector plane as a percentage

of attempted source rays. The simulation results are shown in Table 14.

Table 14. Ray trace efficiency for the various system eye models with no checkerboard on the

display plane.

                          Backwards Raytracing Efficiency      Forward Raytracing Efficiency
                          Cone      LightTools  Aim Area    Lambertian  Lambertian  Lambertian
                          Angle     Automatic   at Last     Cone Half   Cone Half   Cone Half
                          Aim       Solve       Element     Angle = 90° Angle = 55° Angle = 15°
Narrow Ideal (15° HFOV)    4.21%     39.37%       1.24%       0.003%      0.006%      0.083%
Arizona Eye (55° HFOV)     1.21%     41.15%      11.19%       0.111%      0.259%      1.666%
Wide Ideal (55° HFOV)     11.09%      0.79%       0.40%       0.078%      0.183%      1.415%

Backwards ray tracing for this situation proved to be substantially more effective than forwards

ray tracing in all cases. The LightTools automatic aim solve performed best for the Narrow Ideal and the Arizona Eye Model, but poorly for the Wide Ideal eye, where specifying the cone angle provided the best performance. This may have been due to better

concentration of rays based on the geometries of the Narrow FOV Ideal Eye or the Arizona Eye

Model. Such a concentration may not be possible when meeting the distortion criteria or other

requirements set up for analysis. Thus, there will be more ray tracing time to obtain the desired

results from the wide Ideal eye.
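The scaling of forward efficiency with the emitting cone can be checked from first principles. For a Lambertian emitter, the fraction of flux within a half angle θ is sin²θ, so restricting the source cone from 90° to 15° should raise efficiency by roughly 1/sin²(15°) ≈ 15x, close to the 0.111% to 1.666% ratio seen for the Arizona Eye Model in Table 14. A minimal Monte Carlo sketch of this angular factor follows; the ~11 degree acceptance half angle is the Wide Ideal retinal ray angle quoted above, and since the real traces also include spatial vignetting, the absolute fractions here sit well above the Table 14 values:

```python
import math
import random

def lambertian_fraction_within(accept_half_deg, cone_half_deg, n=200_000, seed=1):
    """Monte Carlo estimate of the fraction of rays from a Lambertian emitter,
    restricted to cone_half_deg, that fall inside an acceptance cone of
    accept_half_deg. Analytic answer: sin^2(accept) / sin^2(cone)."""
    rng = random.Random(seed)
    s_cone = math.sin(math.radians(cone_half_deg))
    s_acc = math.sin(math.radians(accept_half_deg))
    hits = 0
    for _ in range(n):
        # Inverse-CDF sampling: for a Lambertian source truncated at theta_max,
        # F(theta) = sin^2(theta) / sin^2(theta_max), so sin(theta) = sin(theta_max)*sqrt(u)
        s = s_cone * math.sqrt(rng.random())
        if s <= s_acc:
            hits += 1
    return hits / n

for cone in (90.0, 55.0, 15.0):
    mc = lambertian_fraction_within(11.0, cone)
    analytic = (math.sin(math.radians(11.0)) / math.sin(math.radians(cone))) ** 2
    print(f"cone half angle {cone:4.0f} deg: sampled {mc:.4f}, analytic {analytic:.4f}")
```

The agreement between sampled and analytic fractions suggests the relative efficiency gains from narrowing the source cone in Table 14 are dominated by this angular overlap term.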

Commentary and Additional Recommendations

The objective of this work was to create a complete illumination model of a commercially

available mobile virtual reality HMD exclusively by measurements of physical hardware. In this

process, work in both sequential and non-sequential lens design was performed. By designing a

viewing lens, creating CAD geometry, measuring source performance, creating a proper source

model, and evaluating different eye models, the foundation was set to analyze complete system

performance for color, relative illumination and stray light.

For this particular model, the display type and source were critical. Visual observations of the

display showed minimal color shift over viewing angle, which matched the chromaticity

measurements. Thus, significant variation of the emitted spectrum over viewing angle was not expected. Combining the performance of the display with the transmittance of the PMMA optics improved the chromaticity performance over viewing angle. Use of an LCD instead of the AMOLED display would

have significantly increased the complexity of the model and likely caused more contrast and

viewing angle issues.

Stray light performance behaved as expected. The Fresnel reflection contributions at the center

of the field were limited to the central viewing angles of the system. This is likely due to

multiple interactions caused by the small angles of incidence near the optical axis of each lens.


Scatter from the textured housing filled in the rest of the field. When

inspecting the Gear VR headset during mechanical measurements, the highly specular surface of

the lens retaining ring was of most concern for stray light contribution. This surface only

contributed stray light at the edge of the field, which was better than expected. This edge

contribution should still not be downplayed and can be improved in future designs. Another

recommendation for improvement is the use of AR coatings. A simulation using a quarter wave

of MgF2 designed at 550 nm is shown in Figure 57 with the standard four orders of magnitude scale. The coating significantly improves performance at the center of the field of view and eliminates some stray light near the edge of the field. The contrast modulation improves from 98.5% to 99.8% with the inclusion of this quarter-wave AR coating. Improvements to the scattering of the plastic

mechanical features would also improve stray light performance.

Figure 57. Simulation a) without and b) with a quarter wave MgF2 coating designed at 550 nm. Scales are 1.61 to 1.61×10^-4 lux and 1.76 to 1.76×10^-4 lux for the plots, respectively.
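The benefit of the coating can be estimated with first-order thin-film relations: at the design wavelength and normal incidence, an ideal quarter-wave layer of index n_c on a substrate of index n_s has reflectance R = ((n_s - n_c²)/(n_s + n_c²))². A short sketch using assumed nominal indices of 1.49 for PMMA and 1.38 for MgF2 near 550 nm (illustrative values, not measurements from this work), plus the conversion from contrast modulation to the relative stray light floor:

```python
def fresnel_r_normal(n1, n2):
    """Fresnel power reflectance at normal incidence between indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def quarter_wave_r(n_substrate, n_coating, n_ambient=1.0):
    """Normal-incidence reflectance of an ideal quarter-wave layer at its
    design wavelength, via the admittance relation n_eff = n_c^2 / n_s."""
    n_eff = n_coating ** 2 / n_substrate
    return fresnel_r_normal(n_ambient, n_eff)

def stray_floor_from_modulation(m):
    """Relative stray light floor Imin/Imax implied by a contrast modulation
    m = (Imax - Imin) / (Imax + Imin)."""
    return (1 - m) / (1 + m)

n_pmma, n_mgf2 = 1.49, 1.38  # assumed nominal indices near 550 nm
print(f"uncoated PMMA surface:   R = {fresnel_r_normal(1.0, n_pmma):.1%}")
print(f"quarter-wave MgF2:       R = {quarter_wave_r(n_pmma, n_mgf2):.1%}")
print(f"floor at 98.5% contrast: {stray_floor_from_modulation(0.985):.2%}")
print(f"floor at 99.8% contrast: {stray_floor_from_modulation(0.998):.2%}")
```

Under these assumed indices, each coated surface drops from roughly 3.9% to 1.5% reflectance at the design wavelength, and the reported modulation change corresponds to the stray light floor falling from about 0.76% to 0.10% of peak, consistent with the reduction seen in Figure 57.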

One of the key tenets of VR is to have lifelike reproduction of real-life scenes. These scenes should be free from aberrations when they reach the eye. Since these scenes are recreated using

optics, they will have aberrations, stray light, and other visual artifacts caused by the virtual

reality hardware. Thus, it is necessary to create a new class of tools that have no measurable

impact on the VR scene. For this report, it was the development of two “Ideal” eye models made

out of physical lens geometry – one narrow FOV and one wide FOV. The Narrow FOV Ideal

Eye enables higher resolution by restricting the field of view while maintaining the same number of

receiver bins. The wide field eye enables complete analysis of the field to determine various

contributions to stray light. The Wide FOV Ideal Eye Model has lower efficiency than the

Arizona Eye Model, but clearly shows the distortion and chromatic issues of the system.

One of the key assumptions for a new eye model for analysis was to use a flat image plane. The

use of a curved image plane, or more elements, could restrict the cone angle and improve performance. For these systems, the assumption is a 4 mm stop as the first optical component of the system. Based on first-order properties, increasing the focal length of the lens would restrict the on-axis cone angle through the numerical aperture. For the current lens at a 16 mm focal length,

the minimum cone angle becomes approximately 7.2 degrees. In order to meet the ray trace


efficiency of the Arizona Eye Model, the focal length would require an increase to 25.4 mm to give some margin to create a 5.5 degree cone angle. Unfortunately, at a 55 degree HFOV this would make the eye model elements too large to fit two eye models side by side at nominal interpupillary distances. Thus, adapting the lens model to work at a 21 mm focal length with a 4 mm stop and a 6 degree cone angle seems most reasonable and efficient for

a redesign. An alternative would be to design a simple lens with a curved image plane and very

low distortion. This system could then be analyzed in LightTools to verify proper transformation

from the spherical detector to a flat detector plane. This would allow curvature of the image

plane as another variable for use.
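The first-order trade between focal length and cone angle described above can be reproduced with a thin-lens, paraxial approximation (a sketch, not the exact eye model prescription): with a stop of diameter D at the front of a lens of focal length f, the on-axis half cone angle at the image is approximately arctan(D / 2f).

```python
import math

def on_axis_half_cone_deg(stop_diameter_mm, focal_length_mm):
    """Paraxial on-axis half cone angle at the image plane for a stop of
    the given diameter in front of a thin lens of the given focal length."""
    return math.degrees(math.atan(stop_diameter_mm / 2 / focal_length_mm))

# 4 mm stop with the focal lengths discussed above
for f in (16.0, 21.0, 25.4):
    print(f"f = {f:4.1f} mm -> half cone angle ~ {on_axis_half_cone_deg(4.0, f):.1f} deg")
```

This reproduces the roughly 7.2, 6, and 5.5 degree figures quoted above to within a few tenths of a degree; any residual differences would come from the full lens prescription rather than this thin-lens estimate.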

Some of the unique challenges of illumination design and analysis for virtual reality systems

have been highlighted in this report. As this field is expected to grow in the years to come, the work here provides a good foundation for assessing other mobile virtual reality systems as they arrive on the market. This work is also applicable to PC-based virtual reality systems.


References

[1] J. Melzer and K. Moffitt, Head Mounted Displays: Designing for the User, McGraw Hill, 1997.

[2] J. P. Rolland and O. Cakmakci, "The past, present, and future of head-mounted display designs,"

in Proc. SPIE 5638, Optical Design and Testing II, 368, 2005.

[3] O. Cakmakci and J. Rolland, "Head-Worn Displays: A Review," J. of Display Technology, vol.

2, no. 3, pp. 199-216, 2006.

[4] B. Kress, E. Saeedi and V. Brac-de-la-Perriere, "The segmentation of the HMD market: Optics

for smart glasses, smart eyewear, AR and VR headsets.," Proc. of SPIE, vol. 9202, p. 92020D,

2014.

[5] Wikimedia Foundation, Inc., "Google Glass - Wikipedia, the free encyclopedia," 30 October

2015. [Online]. Available: https://en.wikipedia.org/wiki/Google_Glass. [Accessed 31 October

2015].

[6] A. Zugaldia, "Detail of Google Glass," 27 June 2012. [Online]. Available:

https://www.flickr.com/photos/azugaldia/7457645618. [Accessed 18 October 2015].

[7] H. Hua and B. Javidi, "Augmented Reality: Easy on the Eyes," Optics and Photonics News, no.

February 2015, pp. 26-33, 2015.

[8] Lumus, Ltd., "Lumus - Consumer Market Products," [Online]. Available: http://www.lumus-

optical.com/?option=com_content&task=view&id=9&Itemid=15. [Accessed 31 October 2015].

[9] Microsoft, Inc., "Microsoft Hololens hardware," [Online]. Available:

http://www.microsoft.com/microsoft-hololens/en-us/hardware. [Accessed 2015 October 31].

[10] H. Hua and C. Gao, "A compact eyetracked optical see-through head-mounted display,"

Stereoscopic Displays and Applications XXIII, vol. 8288, p. 82881F, 2012.

[11] Oculus VR, LLC., "Introducing the Samsung Gear VR Innovator Edition | Oculus Rift - Virtual

Reality Headset for 3D Gaming," 3 September 2014. [Online]. Available:

https://www.oculus.com/blog/introducing-the-samsung-gear-vr-innovator-edition/. [Accessed 8

March 2015].

[12] J. Constine, "Oculus’ New $99 Samsung Gear VR Makes Serious Virtual Reality Affordable,"

TechCrunch (AOL Inc.), 24 September 2015. [Online]. Available:

http://techcrunch.com/2015/09/24/gear-vr-for-all/#.fwjfuf:6RLx. [Accessed 18 October 2015].

[13] R. Smith, "AnandTech | Samsung Announces Gear VR: A VR Harness For Your Note 4.," 3

September 2014. [Online]. Available: http://www.anandtech.com/show/8466/samsung-

announces-gear-vr-a-vr-harness-for-your-note-4. [Accessed 8 March 2015].

[14] Google Inc., ""okay glass," - Google Glass Help," [Online]. Available:

https://support.google.com/glass/answer/3079305. [Accessed 31 October 2015].

[15] S. Kobayashi, S. Mikoshiba and S. Lim, LCD Backlights, West Sussex: John Wiley & Sons,

Ltd., 2009.

[16] Samsung Electronics Co. Ltd., "Samsung GALAXY Note4 - Specs," [Online]. Available:

http://www.samsung.com/global/microsite/galaxynote4/note4_specs.html. [Accessed 9 March

2015].

[17] Konica Minolta Sensing Singapore Pte Ltd, "CS-2000 Spectroradiometer - Konica Minolta

Measuring Instruments | Asia Pacific," 2015. [Online]. Available:


http://sensing.konicaminolta.asia/products/cs-2000-spectroradiometer/. [Accessed 9 March

2015].

[18] Photo Research, Inc., "Photo Research, Inc. - PR-740 / PR-745 SpectraScan Spectroradiometer,"

2015. [Online]. Available: http://www.photoresearch.com/current/pr740.asp. [Accessed 9 March

2015].

[19] R. Soniera, "Galaxy Note 4 and Note Edge OLED Display Technology Shoot-Out," [Online].

Available: http://www.displaymate.com/Galaxy_Note4_ShootOut_1.htm. [Accessed 9 March

2015].

[20] Wikimedia Commons, "File:CIExy1931.png - Wikimedia Commons," 21 June 2005. [Online].

Available: https://commons.wikimedia.org/wiki/File:CIExy1931.png. [Accessed 31 October

2015].

[21] Wikimedia Commons, "File:CIE_1976_UCS.png - Wikimedia Commons," 24 March 2008.

[Online]. Available: https://commons.wikimedia.org/wiki/File:CIE_1976_UCS.png. [Accessed

31 October 2015].

[22] Samsung Electronics Co. Ltd., "Samsung Gear VR," [Online]. Available:

http://www.samsung.com/global/microsite/gearvr/. [Accessed 8 March 2015].

[23] M. Bass, S. DeCusatis, J. Enoch, V. Lakshminarayanan, G. Li, C. MacDonald, V. Mahajan and

E. Van Stryland, Handbook of Optics, Third Edition Volume III: Vision and Vision Optics,

McGraw Hill Professional, 2009.

[24] Evonik Cyro, LLC, "Light Transmission & Reflectance Tech Data - Acrylite," [Online].

Available:

http://www.acrylite.net/sites/dc/Downloadcenter/Evonik/Product/ACRYLITE/acrylite%C2%AE-

ff--gp---light-transmission--reflectance-tech-data-(1213g).pdf. [Accessed 10 March 2015].

[25] J. Schwiegerling, Field guide to visual and ophthalmic optics, Bellingham, WA: SPIE Press,

2004.

[26] United States Department of Defense, Military Standardization Handbook - Optical Design (MIL-

HDBK-141), Washington, DC: Defense Supply Agency, 1962.

[27] R. J. Koshel, "Lit Appearance Modeling of Illumination Systems," Novel Optical Systems Design

and Optimization V, vol. 4768, pp. 65-73, 2002.

[28] D. Chenwei, M. Lin, L. Weisi and N. N. King, Visual Signal Quality Assessment: Quality of

Experience (QoE), New York City: Springer, 2014.

[29] J. Schwiegerling, "Theoretical Limits to Visual Performance," Surv Ophthalmol, vol. 45, no. 2,

p. 139–146, 2000.

[30] A. Ivanoff, "Night Binocular Convergence and Night Myopia," JOSA, vol. 45, pp. 769-770,

1955.

[31] C. A. Curcio, K. R. Sloan, R. E. Kalina and A. Hendrickson, "Human Photoreceptor

Topography," J. of Comparative Neurology, vol. 292, pp. 497-523, 1990.

[32] T. Rudderham, "Visit Gondor from LOTR in The Great River," 1 October 2014. [Online].

Available: http://www.theriftarcade.com/wp-content/uploads/2014/09/The-Great-River-Oculus-

Rift-2.jpg. [Accessed 19 October 2015].

[33] Synopsys Optical Solutions Group, Eye_Model MIL_HDBK_141.pdf, Pasadena, CA:

Documentation with program. C:\Program Files\Optical Research Associates\LightTools

8.2.0\ExamplesLibrary\Applications\HumanEye\, 2013.


[34] Synopsys, LightTools Help Documentation, 2015.