Oculus Rift Developer Kit 2 and Latency Mitigation Techniques
DESCRIPTION
This was a presentation I gave at HPG 2014 in Lyon, France.

TRANSCRIPT
DK2 and Latency Mitigation
Cass Everitt
Oculus VR
Being There
• Conventional 3D graphics is cinematic
  – Shows you something
    • On a display, in your environment
• VR graphics is immersive
  – Takes you somewhere
    • Controls everything you see, defines your environment
• Very different constraints and challenges
Realism and Presence
• Being there is largely about sensor fusion
  – Your brain’s sensor fusion
  – Trained by reality
  – Can’t violate too many hard-wired expectations
• Realism may be a non-goal
  – Not required for presence
  – Expensive
  – Uncanny valley
Oculus Rift DK2
• 90°-110° FOV
• 1080p OLED screen
  – 960x1080 per eye
• 75 Hz refresh
• Low persistence
• 1 kHz IMU
• Positional tracking
Low Persistence
• Stable image as you turn - no motion blur
• Rolling shutter
  – Right-to-left
  – 3 ms band of light
  – Eyes offset temporally
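To make the scan timing concrete, here is a minimal C++ sketch (an editor's illustration, not Oculus SDK code) that estimates when a given panel column is lit within one 75 Hz frame; the 1920-column panel width, the assumption that the scan spans the whole frame period, and the function names are illustrative assumptions.

```cpp
// Illustrative sketch only: approximate time at which a given panel column
// is illuminated within one 75 Hz frame, assuming the right-to-left scan
// sweeps the full 1920-column panel over the whole frame period.
#include <cstdio>

constexpr double kFramePeriod  = 1.0 / 75.0;  // ~13.3 ms per refresh
constexpr int    kPanelColumns = 1920;        // full panel, both eyes

// Returns seconds after scanout start at which 'column' lights up
// (column 0 = leftmost; the rightmost column is lit first).
double columnIlluminationTime(int column) {
    return kFramePeriod * double(kPanelColumns - 1 - column) / (kPanelColumns - 1);
}

int main() {
    // The two edges of the panel are lit ~13 ms apart in time, even though
    // each column is only lit for ~3 ms (low persistence).
    std::printf("right edge: %.1f ms, left edge: %.1f ms\n",
                columnIlluminationTime(kPanelColumns - 1) * 1000.0,
                columnIlluminationTime(0) * 1000.0);
}
```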
Positional Tracking
• External camera, pointed at user
• 80° x 64° FOV
• ~2.5 m range
• ~0.05 mm @ 1.5 m
• ~19 ms latency
  – Only 2 ms of that is vision processing
Position Tracking
technology + magic =
The good news: You don’t need to know.
Image Synthesis
• Conventional planar projection
  – GPUs like this because
    • Straight edges remain straight
    • Planes remain planar after projection
• Synthesis takes “a while”
  – So we predict the position / orientation
  – A long range prediction: ~10-30 ms out
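As a rough illustration of that long-range prediction, here is a sketch assuming simple constant-angular-velocity extrapolation of the IMU orientation (the SDK's actual predictor is more elaborate); the Quat/Vec3 types, the function name, and the body-frame multiplication order are assumptions for illustration.

```cpp
// Illustrative sketch: extrapolate the current head orientation forward by
// dt seconds (dt ~ 10-30 ms here) assuming constant angular velocity.
#include <cmath>

struct Quat { float w, x, y, z; };
struct Vec3 { float x, y, z; };

Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// omega: angular velocity in rad/s from the 1 kHz IMU (assumed body frame).
Quat predictOrientation(const Quat& current, const Vec3& omega, float dt) {
    float speed = std::sqrt(omega.x*omega.x + omega.y*omega.y + omega.z*omega.z);
    float angle = speed * dt;                    // total rotation over dt
    if (angle < 1e-6f) return current;
    float s = std::sin(angle * 0.5f) / speed;    // scales omega to the rotation axis
    Quat delta = { std::cos(angle * 0.5f), omega.x * s, omega.y * s, omega.z * s };
    return mul(current, delta);                  // apply delta in the body frame
}
```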
Note on Sample Distribution
• Conventional planar projection, not great for very wide FOV
  – Big angle between samples at center of view
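To put a rough number on that, here is a small worked example (illustrative values only: 100° horizontal FOV across 960 pixels, roughly DK2-like) comparing the angle subtended by one pixel at the center of view and at the edge.

```cpp
// Worked example: per-pixel angular step of a planar projection.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi    = 3.14159265358979323846;
    const double fovRad = 100.0 * kPi / 180.0;      // assumed horizontal FOV
    const int    pixels = 960;                      // assumed pixels per eye
    const double halfW  = std::tan(fovRad * 0.5);   // image-plane half extent
    const double dx     = 2.0 * halfW / pixels;     // plane step per pixel

    // d(atan(x))/dx = 1 / (1 + x^2), so the angular step shrinks toward the edge.
    const double centerStep = dx;                          // at x = 0
    const double edgeStep   = dx / (1.0 + halfW * halfW);  // at x = tan(fov/2)

    std::printf("center: %.3f deg/px, edge: %.3f deg/px (ratio %.1fx)\n",
                centerStep * 180.0 / kPi, edgeStep * 180.0 / kPi,
                centerStep / edgeStep);
    // Prints roughly 0.142 deg/px at the center vs 0.059 deg/px at the edge.
}
```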
Alternative Sample Distributions
• Direct render to cube map may be appealing
• Tiled renderers could do piecewise linear
  – Brute force will do in the interim
  – But not much FOV room left at 100°
Optical Distortion
Distortion Correction
Optical Distortion
• HMD optics cause a different sample distribution
  – and chromatic aberration
• Requires a resampling pass
  – Synthesis distribution -> delivery distribution
  – Barrel distortion to counteract the lens’s distortion
• Could be built into a “smarter” display engine
  – Handled in software today
    • Requires either CPU, separate GPU, or shared GPU
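A rough sketch of what that software resampling pass computes (normally a fragment shader, written here as plain C++): map each output position, relative to the lens center, through a radial barrel function to find where to sample the rendered eye buffer, with a per-channel scale standing in for chromatic aberration correction. The DistortionFit structure, coefficient names, and per-channel scales are illustrative assumptions, not the actual SDK distortion model.

```cpp
// Illustrative sketch of a barrel-distortion resample with simple per-channel
// chromatic correction. Coefficients and structure are assumptions.
struct Vec2 { float x, y; };

struct DistortionFit {
    float k[4];       // radial polynomial coefficients (assumed)
    float chromaR;    // extra radial scale for red vs green (assumed)
    float chromaB;    // extra radial scale for blue vs green (assumed)
};

// r2 is squared distance from the lens center.
static float radialScale(const DistortionFit& d, float r2) {
    return d.k[0] + r2 * (d.k[1] + r2 * (d.k[2] + r2 * d.k[3]));
}

// For an output position relative to the lens center, return the positions to
// sample per channel in the rendered (planar-projection) eye buffer. The
// pre-"barreled" image viewed through the lens's pincushion distortion then
// appears undistorted.
void distortionSample(const DistortionFit& d, Vec2 lensPos,
                      Vec2& outR, Vec2& outG, Vec2& outB) {
    float r2 = lensPos.x * lensPos.x + lensPos.y * lensPos.y;
    float s  = radialScale(d, r2);
    outG = { lensPos.x * s,      lensPos.y * s };
    outR = { outG.x * d.chromaR, outG.y * d.chromaR };
    outB = { outG.x * d.chromaB, outG.y * d.chromaB };
}
```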
Display Engine (detour)
• In modern GPUs, the 3D synthesis engine builds buffers to be displayed
• A separate engine drives the HDMI / DP / DVI output signal using that buffer
• This engine just reads rows of the image
• More on this later…
Time Warp
• Optical resampling provides an opportunity
  – Synthesized samples have known location
    • Global shutter, so constant time
  – Actual eye orientation will differ
    • Long range prediction had error
    • Better prediction just before resampling
    • Both predictions are for the same target time
• So resample for optics and prediction error simultaneously!
• Note: This just corrects the view of an “old” snapshot of the world
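A minimal sketch of that combined correction, restricted to the orientation part (structure and names are the editor's, not the SDK's): form the small rotation between the long-range prediction used at render time and the fresher prediction taken just before resampling, and rotate each resampling ray by it on its way into the distortion lookup.

```cpp
// Illustrative orientation-only time warp. Types are repeated here so the
// sketch stands alone.
struct Quat { float w, x, y, z; };
struct Vec3 { float x, y, z; };

Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}
Quat conjugate(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

Vec3 rotate(const Quat& q, const Vec3& v) {
    Quat p = mul(mul(q, Quat{ 0.0f, v.x, v.y, v.z }), conjugate(q));
    return { p.x, p.y, p.z };
}

// renderPose: long-range prediction used when the frame was synthesized.
// latePose:   fresh prediction for the same display time, taken just before
//             the distortion resample.
// eyeRay:     view direction of the output pixel being resampled.
// Returns the direction to look up in the already-rendered eye buffer.
Vec3 timeWarpRay(const Quat& renderPose, const Quat& latePose, const Vec3& eyeRay) {
    Quat correction = mul(conjugate(renderPose), latePose);  // old^-1 * new
    return rotate(correction, eyeRay);
}
```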
Time Warp + Rolling Shutter
• Rolling shutter adds time variability
  – But we know time derivative of orientation
• Can correct for that as well
  – Tends to compress sampling when turning right
  – And stretch out sampling when turning left
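One simple way to fold that in (an assumed structure, not SDK code) is to predict one pose for the time the first column is lit and one for the last, then blend per output column; a normalized lerp stands in for slerp since the rotation accumulated over a single frame is small.

```cpp
// Illustrative per-column warp pose for a right-to-left rolling scan.
#include <cmath>

struct Quat { float w, x, y, z; };

// Normalized lerp; assumes a and b are in the same hemisphere, which holds
// for the small rotation accumulated over one ~13 ms frame.
Quat nlerp(const Quat& a, const Quat& b, float t) {
    Quat q = { a.w + (b.w - a.w) * t, a.x + (b.x - a.x) * t,
               a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
    float n = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    return { q.w / n, q.x / n, q.y / n, q.z / n };
}

// column 0 = leftmost output column. The DK2 panel scans right-to-left, so
// the rightmost column uses the "first lit" pose, the leftmost the "last lit".
Quat columnWarpPose(const Quat& poseFirstLit, const Quat& poseLastLit,
                    int column, int numColumns) {
    float t = float(numColumns - 1 - column) / float(numColumns - 1);
    return nlerp(poseFirstLit, poseLastLit, t);
}
```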
Asynchronous Time Warp
• So far, we have been talking about 1 synthesized image per eye per display period
  – @75 Hz, that’s 150 Hz for image synthesis
  – Many apps cannot achieve these rates
    • Especially with wide-FOV rendering
• Display needs to be asynchronous to synthesis
  – Just like in conventional pipeline
  – Needs to be isochronous – racing the beam
  – Direct hardware support for this would be straightforward
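A sketch of how such an asynchronous arrangement might be structured (assumed code; real implementations need the scheduling and preemption support the slide says is not yet directly available): a warp thread locked to the display grabs the most recently completed eye buffers just before each vsync, whether or not synthesis finished a new frame this period, and runs the time-warp and distortion resample on them.

```cpp
// Illustrative asynchronous time warp loop; types and helpers are placeholders.
#include <atomic>
#include <chrono>
#include <thread>

struct EyeBuffers { /* rendered eye textures plus the pose they were rendered with */ };

std::atomic<EyeBuffers*> latestFrame{nullptr};  // written by the synthesis thread
std::atomic<bool> running{true};

// Placeholder: a real implementation tracks the display engine's scanout.
void waitUntilJustBeforeVsync() {
    std::this_thread::sleep_for(std::chrono::microseconds(13333));  // ~75 Hz
}

// Placeholder for: late re-predict orientation, warp, distort, present.
void warpAndPresent(EyeBuffers* /*frame*/) {}

void asyncTimeWarpThread() {
    while (running.load()) {
        waitUntilJustBeforeVsync();                   // isochronous: racing the beam
        if (EyeBuffers* frame = latestFrame.load())   // may be the same frame again
            warpAndPresent(frame);
    }
}
```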
Asynchronous Time Warp
• Slower synthesis requires wider FOV
  – Will resample the same image multiple times
• Stuttering can be a concern
  – When display and synthesis frequencies “beat”
  – Ultra-high display frequency may help this
  – Tolerable synthesis rate still TBD
• End effect is, your eyes see the best information we have
  – Regardless of synthesis rate
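Illustrative arithmetic for the “beat” concern, using assumed example rates rather than measured ones:

```cpp
// With a 75 Hz display and synthesis that only manages 60 Hz (assumed
// example), the frame-reuse pattern repeats at the difference frequency.
#include <cstdio>

int main() {
    const double displayHz = 75.0, synthesisHz = 60.0;
    std::printf("beat frequency: %.0f Hz\n", displayHz - synthesisHz);
    // In each 1/15 s interval there are 5 display frames but only 4 new
    // images, so one display frame in five re-warps a one-period-old frame.
}
```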