
Improving the 3D Virtual Reality Experience

SFU Scholarly Impact of the Week profiled April 7, 2021


By Heather Sanders 

Think of a transformative experience that you would like to share with others. It might be learning to drive a vehicle, scuba diving in tropical waters, or strolling the streets of your home town. Virtual reality (VR) can make these and other experiences available to a broad audience. Effective VR can deliver transformative first-person experiences, entertain us, help us engage with each other and teach us new skills.

Researchers at Simon Fraser University’s School of Interactive Arts and Technology (SIAT) are working on new technologies that deliver more authentic and user-friendly three-dimensional (3D) VR experiences. A new navigation interface developed by SFU researchers – the NaviBoard – is showing promising results, and may lead to more efficient and accessible options for navigating virtual worlds.

SIAT Master’s student Thinh (Ted) Nguyen-Vo designed and led an experiment to evaluate and compare the NaviBoard to other forms of VR navigation. Under the mentorship of professor Bernhard Riecke, head of the iSpace (immersive, Spatial, Perception, Action/Art, Cognition, Embodiment) Lab at SFU, they were joined by professor Wolfgang Stuerzlinger, head of the Virtual and Augmented Reality, Visual Analytics, Interaction, Systems & Experiments Lab (VVISE Lab), SIAT student Duc-Minh Pham, and professor Ernst Kruijff from Bonn-Rhein-Sieg University of Applied Sciences, Germany. Professors Riecke and Stuerzlinger had each been working on various ideas for 3D navigation, which made it easy for them to collaborate on the study, NaviBoard and NaviChair: Limited Translation Combined with Full Rotation for Efficient Virtual Locomotion.

Being able to easily and effectively move through the virtual world, whether walking, driving, swimming or flying, is key to the overall user experience. Several technologies for such locomotion are relatively inexpensive and easy to set up, such as gamepads, joysticks, VR controllers, or keyboards.

However, many interfaces that simulate motion but do not provide any physical self-motion can feel inauthentic or leave users feeling dizzy, disoriented, or even nauseous. Whether we move in the real or virtual world, we need to perceive the right cues and remain aware of elements and events in our environment. This feedback loop is referred to as spatial updating and situational awareness. Providing suitable visual and non-visual cues to ensure proper orientation while preventing motion sickness will enhance the overall experience.

Researchers in this field consider walking to be the gold standard for navigating VR worlds. Wearing a head-mounted display and physically walking through space gives users the best sense of presence and spatial orientation, and enables them to perform well in tasks. However, walking requires large, tracked spaces that are expensive to build and maintain. Safety issues are another concern for users walking freely in VR. Treadmills were once thought to be the ideal answer to VR locomotion; however, their safety concerns, cost, and complexity have yet to be overcome.

Another concern in VR is the usability of the interface, which is related to the user’s task load and cognitive load. Imagine performing a virtual task that took longer and was more frustrating than performing it in real life. Participants would quickly become frustrated and lose interest. “I frequently observe how frustrated people can get with user interfaces that violate basic interface design guidelines or that are not well-adapted to a given task,” says professor Stuerzlinger. “Thus, I want to create better ways for people to interact with computers in 2D and 3D.”

Nguyen-Vo’s NaviBoard research experiment asked participants to enter a virtual 3D world wearing a head-mounted display and complete a task of collecting hidden balls. There were four locomotion interfaces: seated on a rotating chair with a controller; seated on a NaviChair; standing on a NaviBoard; and walking in a tracked space. The NaviBoard is an innovative navigation interface designed by the researchers that allows whole-body leaning and stepping to control speed and direction – the more you lean, the faster you go. The NaviChair is an adaptation of the NaviBoard interface that enables users to control their movement direction and speed by rotating and leaning their upper body while sitting on a swivel chair. With both interfaces, the leaning motion helps deliver the appropriate kinesthetic (body awareness) information and vestibular (inner ear) cues to minimize disorientation and motion sickness. Both enable users to traverse large distances without requiring a large, tracked space.
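The core idea behind the leaning-based control – the more you lean, the faster you go – can be sketched as a simple mapping from lean angle to travel velocity. The sketch below is an illustrative approximation only, not the researchers’ actual implementation; the dead zone, maximum lean angle, and top speed are made-up parameters.

```python
import math

# Hypothetical tuning parameters -- not from the study; chosen for illustration.
DEAD_ZONE_DEG = 2.0   # small lean angles are ignored so an idle user stays still
MAX_LEAN_DEG = 15.0   # lean angle that maps to full speed
MAX_SPEED = 5.0       # top travel speed, in metres per second

def lean_to_velocity(lean_deg: float, heading_rad: float) -> tuple[float, float]:
    """Map a forward/backward lean angle and body heading to a 2D velocity.

    Positive lean angles move the user forward along their heading,
    negative angles move them backward; speed scales with lean magnitude.
    """
    magnitude = abs(lean_deg)
    if magnitude < DEAD_ZONE_DEG:
        return (0.0, 0.0)
    # Normalize lean into [0, 1], clamping at full lean.
    t = min((magnitude - DEAD_ZONE_DEG) / (MAX_LEAN_DEG - DEAD_ZONE_DEG), 1.0)
    speed = t * MAX_SPEED * (1.0 if lean_deg >= 0 else -1.0)
    # Travel along the direction the user's body is facing.
    return (speed * math.cos(heading_rad), speed * math.sin(heading_rad))
```

Coupling speed to physical lean in this way is what supplies the kinesthetic and vestibular cues described above: the body actually tilts in proportion to the simulated motion, rather than standing still while the world moves.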

The results of the experiment performed in the iSpace Lab were promising. Study participants using the NaviChair, NaviBoard and walking saw the best performance in the task. NaviBoard and walking participants experienced significantly reduced motion sickness, as compared to using the hand-held controller. NaviBoard and walking significantly reduced task load compared to the controller. The study demonstrated the leaning-based interfaces of NaviChair and NaviBoard were comparable to the gold standard of walking and may present a cost-effective alternative to walking in VR, without increasing motion sickness or task load. More information on the experiment is available on the iSpace website at: NaviBoard: Efficiently Navigating Virtual Environments.

While more investigation is still needed, the researchers point out that the NaviBoard can be made from common and affordable materials (e.g., plywood, cardboard, pieces of carpet, or Styrofoam), and its model can be easily applied to a basic swivel chair, as was done to create the NaviChair. And, it’s a navigational interface that can be used with a wide range of applications that rely on spatial updating and/or situational awareness. These include everything from training simulations to education, tourism, sports and gaming, and many other forms of entertainment. 

Professor Riecke is especially proud of what students Nguyen-Vo and Duc-Minh Pham were able to accomplish in the experiment. “As a researcher, I want to empower people to create positive virtual experiences,” he says. “Designing intuitive and effective ways to move through virtual worlds without being distracted by how to accomplish that is an essential pre-requisite for most impactful VR experiences.”

Lead researcher Ted Nguyen-Vo was pleased to see his VR simulation deliver promising results. “As a member of the iSpace Lab, I was able to design and develop experiments from end to end – from planning and designing to coding and analysis, and present my ideas to Bernhard and the team,” says Nguyen-Vo, who now works as a software engineer at Microsoft. “Publishing and presenting my work was a significant milestone in my Master’s, and overall, the experience helped me identify opportunities, both in academia and industry.”

The next steps are investigating how to further improve these interfaces and ensure they work for a wide range of applications, allow for embodied flying experiences, and let people use their hands to interact and communicate, much as we do while walking. Professor Stuerzlinger notes that since the publication of their paper, researchers have continued to improve the design of VR simulations and locomotion interfaces. “It’s a field that is advancing in leaps and bounds,” he says.

And it’s one to watch – for both technology users and the researchers and innovators moving our VR worlds forward.