mlandis

Something maybe worth noting: as an avid developer and consumer of VR, I have spent a LOT of time in virtual reality. At some point, your brain just learns to adapt to the various motion-sickness-causing elements, what people have called getting your "VR legs." For me, I got nauseous and had headaches early on, but around the 200-hour mark all symptoms were completely gone and I haven't been motion sick since. As a result, I have no issues looking at objects that are close to my face in VR. It makes one wonder whether the solution to VR nausea is to enhance our rendering and simulation, or just to find ways to override how the brain responds to these cues.

jc2

It seems that a VR display that can adaptively change the distance between the display screen and the eyes based on the eyes' gaze direction (basically eye-tracking technology) could lessen the effects of the accommodation/vergence conflict. There's a rough sketch of the idea below.

Btw, this is an interesting summary of the kinds of VR technologies people are trying in order to solve this issue: http://www.computationalimaging.org/publications/optimizing-vr-with-gaze-contingent-and-adaptive-focus-displays/
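
To make that concrete, here's a minimal sketch (not any particular headset's API; the IPD, gaze angles, and distance limits are made-up illustrative numbers) of how a varifocal system could estimate the fixation distance from the two tracked gaze directions and pick a matching focal power, so accommodation and vergence agree:

    import numpy as np

    def vergence_distance(ipd_m, left_yaw_rad, right_yaw_rad):
        # Estimate the distance (meters) at which the two gaze rays cross.
        # Simplified top-down model: eyes sit ipd_m apart on the x-axis, yaw
        # is measured from "straight ahead", positive = rotated inward.
        denom = np.tan(left_yaw_rad) + np.tan(right_yaw_rad)
        if abs(denom) < 1e-6:
            return float("inf")  # rays (nearly) parallel: looking far away
        return ipd_m / denom

    def focal_power_for(distance_m, near_m=0.25, far_m=10.0):
        # Clamp the fixation distance to the display's supported range and
        # convert to the lens power (diopters) a varifocal display would set
        # so the accommodation distance matches the vergence distance.
        d = min(max(distance_m, near_m), far_m)
        return 1.0 / d

    # Eyes 63 mm apart, each rotated ~3 degrees inward -> fixation ~0.6 m away.
    d = vergence_distance(0.063, np.radians(3.0), np.radians(3.0))
    print(f"fixation ~ {d:.2f} m, set focal power ~ {focal_power_for(d):.2f} D")

In a real system the gaze estimates are noisy, so you'd presumably filter the fixation distance over time before driving the optics.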

et17

At the ophthalmologist's they have machines that can determine your prescription by having you look at an image while the machine adjusts the focus and zoom of that image. How does this machine work? By tracking the same out-of-focus areas and sensing if convergence is off?

ellahofm

I think you're talking about an autorefractor: here's what I could find on Physics Stack Exchange ~ https://physics.stackexchange.com/questions/70976/how-does-autorefractor-work
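
Roughly, the instrument shines infrared light into the eye and steps its internal optics until the light reflected back from the retina is in focus; the lens power needed to reach that point is (up to sign convention) your prescription. So it measures defocus directly rather than tracking convergence. Here's a toy sketch of that search loop, with a deliberately crude one-number "blur" model and made-up values, just to show the idea:

    import numpy as np

    def residual_defocus(trial_lens_d, eye_error_d):
        # Toy model: the eye has eye_error_d diopters of excess (+) or
        # missing (-) focusing power; a trial lens of equal and opposite
        # power cancels it. Zero residual = sharpest retinal image.
        return abs(eye_error_d + trial_lens_d)

    def autorefract(eye_error_d):
        # Sweep trial lens powers in 0.25 D steps (like the instrument
        # stepping its optics) and report the power giving the sharpest image.
        powers = np.arange(-10.0, 10.25, 0.25)
        blurs = [residual_defocus(p, eye_error_d) for p in powers]
        return float(powers[int(np.argmin(blurs))])

    # An eye with 2.5 D too much power (myopia): the sharpest image occurs
    # with a -2.50 D trial lens, which is the prescription reported.
    print(autorefract(2.5))  # -2.5

Real instruments measure the defocus optically from the reflected light instead of brute-force sweeping, but the output is the same kind of number.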

tbell

Is there any work being done on screens that can simulate different depths?