Combating Virtual Reality Sickness

Jun 02 2017 | By Ann Rae Jonas | Images courtesy of Ajoy Fernandes and Steve Feiner

The dreaded queasiness or nausea known as motion sickness can occur in a car, on a ship, or on a plane. The cause is conflicting messages that the brain receives from the eyes and the inner ears. If you’re reading a book, say, or checking messages on your phone, your eyes perceive that you are stationary. At the same time, the vestibular system of your inner ears detects motion. The simplest remedy is to look out a window so that your eyes, too, perceive motion.

This video presents research performed by Ajoy Fernandes and Steven Feiner at Columbia Engineering's Computer Graphics and User Interfaces Lab. It accompanied a paper presented at IEEE 3DUI 2016, where it received the Best Paper Award, and a demo presented at IEEE VR 2016.

Virtual reality (VR) can induce the same discomfort—but in reverse—called VR sickness. Your eyes, through which you travel within a virtual landscape, perceive motion; your ears perceive only that your head is tilted as you slouch in your chair while wearing a head-worn display, such as an Oculus Rift or HTC Vive.

Ajoy S. Fernandes MS’16 and Steven K. Feiner have devised a way to combat VR sickness. Feiner is professor of computer science at Columbia Engineering, director of the Computer Graphics and User Interfaces Lab, and codirector of the Columbia Vision and Graphics Center. In a March 2016 paper, Fernandes and Feiner described their technique, which uses virtual soft-edged circular cutouts to narrow the VR user’s field of view (FOV): it automatically decreases the FOV when the virtual motion is likely to cause discomfort, then restores the FOV when discomfort is less likely. Participants in their study reported significantly less discomfort, yet most didn’t notice the FOV changes. “And those who did notice them,” says Feiner, “said they’d prefer to have them in future VR experiences.”
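The idea of coupling FOV restriction to motion can be illustrated with a minimal sketch. All function names, thresholds, and gains below are hypothetical stand-ins chosen for illustration, not values from Fernandes and Feiner's paper: one function maps the current virtual linear and angular speed to a target FOV, and another eases the displayed FOV toward that target at a limited rate per frame so the change stays subtle.

```python
def target_fov(linear_speed, angular_speed,
               max_fov=110.0, min_fov=80.0,
               linear_gain=20.0, angular_gain=0.2):
    """Return a target field of view in degrees: the faster the virtual
    motion (m/s and deg/s here), the narrower the FOV, clamped to
    [min_fov, max_fov]. All gains are illustrative placeholders."""
    restriction = linear_gain * linear_speed + angular_gain * angular_speed
    return max(min_fov, min(max_fov, max_fov - restriction))

def smooth_fov(current_fov, target, rate=40.0, dt=1.0 / 90):
    """Move the displayed FOV toward the target at a limited rate
    (degrees per second), one rendering frame (dt) at a time."""
    step = rate * dt
    if abs(target - current_fov) <= step:
        return target
    return current_fov + step if target > current_fov else current_fov - step
```

In a renderer, `target_fov` would be evaluated every frame from the camera's velocities, and `smooth_fov` would drive the radius of the soft-edged cutout; the gradual easing is what makes the restriction hard to notice.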

Feiner has also done pioneering work on augmented reality (AR), which interactively overlays virtual content on the real world (think Pokémon Go). Mechanics, for example, can don lightweight eyewear that guides them through complex tasks, showing them which machine parts and tools to use.

Then there’s what one might call “mix and match” AR. Feiner is pursuing hybrid user interfaces that allow 2D, touch-sensitive, flat-panel displays to be used in tandem with head-worn and handheld displays that enable 3D interaction in the space around the panel.
