UI in VR - Interface principles in the 3rd dimension.
One of the challenges developers face as we move into the first-person, three-dimensional space of VR is how to translate the language of user interfaces from two dimensions into three.
Our challenge came in the form of Think F.A.S.T., a VR app that teaches players how to spot the signs of an oncoming or in-progress stroke. The app utilized LeapMotion’s 6-degrees-of-freedom technology, which enables the user to actually walk around and explore the virtual space. It also tracks the player’s hands, allowing them to act as the controller so the player doesn’t have to learn a new device.
As we approached this project, we knew we would need “traditional” UI elements to introduce the player to the experience and to walk them through a crash course on spotting strokes. We also knew there would be questions about how to convey information in a 3D UI that would typically be presented in a 2D one.
In a game or app played on a comfortably sized screen, most of the experience is viewed via unobtrusive, unconscious eye movements called saccades. These movements keep the image centered on a small, high-resolution portion of the retina called the “fovea.”
Because the human fovea is very small (covering only a few degrees of the visual field), VR that replicates a realistic experience forces us into real-life behavior: consciously looking around, or “obtrusive movement,” e.g. turning your head or moving your eyes outside a comfortable range to view information.
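To make that scale concrete, here is a quick back-of-the-envelope sketch (the panel dimensions are hypothetical, not taken from the app) of why a nearby UI panel cannot be absorbed with foveal vision alone:

```python
import math

def visual_angle_deg(size_m: float, distance_m: float) -> float:
    """Angular size, in degrees, of an object of the given width
    viewed head-on from the given distance."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A hypothetical 0.5 m wide panel floating 1 m away subtends roughly
# 28 degrees -- far wider than the few degrees the fovea covers, so
# reading it requires saccades or head movement.
angle = visual_angle_deg(0.5, 1.0)
```

This is one reason the placement of a panel matters so much more in VR than the placement of a HUD element on a flat screen.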
Of all the UI elements, the most used was a floating panel we called the “V.P.S.,” or “Virtual Panel System.” It consisted of a flat panel that existed in world space, displaying information alongside the instructions given by a robotic medical assistant.
Initially, we explored two paths with the V.P.S. system. The first was based on interaction with LeapMotion’s hand tracking: the user turns one palm up to show the V.P.S. panel. We liked this idea for several reasons: it resembles the posture of reading, and it makes showing and dismissing the panel as easy as a simple hand gesture. We also liked that it kept the UI “close to the vest,” so to speak, and out of the player’s forward vision, letting them see most of the virtual environment.
Ultimately we went with a system where the panel would be displayed in-world near the patient. We felt that the brevity of the demo, combined with the unfamiliarity of the in-world mechanics, could otherwise lead the user to miss vital information.
The panel also ended up conveying more information than originally anticipated, so placing it in-world reduced redundant, obtrusive head motion as much as possible. Anchoring the element in the world also let the player know where to look for new information.
Some of the initial idea did make it in, however: since the panel can take up a significant amount of the user’s view, we added a virtual button, summoned by turning the left palm up, that lets the player show or hide the panel.
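To illustrate how a palm-up toggle like this can be detected, here is a simplified Python sketch. The `Hand` structure and the threshold value are stand-ins for whatever the hand-tracking SDK provides, not LeapMotion’s actual API:

```python
from dataclasses import dataclass

@dataclass
class Hand:
    palm_normal: tuple  # unit vector; (0, 1, 0) means palm facing straight up
    is_left: bool

PALM_UP_THRESHOLD = 0.8  # cosine of the max angle allowed from straight up

def is_palm_up(hand: Hand) -> bool:
    # Compare the palm normal's vertical component against the threshold;
    # a value near 1.0 means the palm is facing up.
    return hand.palm_normal[1] > PALM_UP_THRESHOLD

class PanelToggle:
    """Debounced show/hide: fires once per gesture, not once per frame."""
    def __init__(self):
        self.visible = True
        self._was_up = False

    def update(self, hand: Hand):
        up = hand.is_left and is_palm_up(hand)
        if up and not self._was_up:  # rising edge: gesture just started
            self.visible = not self.visible
        self._was_up = up
```

The edge detection matters: without it, a panel tied directly to the per-frame palm orientation would flicker on and off while the hand hovers near the threshold.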
We are very excited about the future of VR UI, and advances in VR technology will only increase that feeling.
At some point there may even be a fusion between the principles of AR and VR. For example, a VR device could track and scan the user’s pupils. By knowing where they are looking, you could display UI over that area of the screen rather than having the UI sit in a fixed portion of the screen or require a movement that forces the user to look away from what they’re engaged in.
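A minimal sketch of that idea, assuming a hypothetical eye tracker that reports a unit gaze direction each frame:

```python
def gaze_anchored_position(gaze_dir, distance=1.5, below_offset=0.2):
    """Place a UI element along the gaze ray, nudged slightly downward
    so it stays near the point of focus without occluding it.
    All values here are illustrative, not from a real eye-tracking SDK."""
    x, y, z = gaze_dir
    # Scale the unit gaze direction out to the desired panel distance...
    px, py, pz = x * distance, y * distance, z * distance
    # ...then drop the anchor just below the gaze point.
    return (px, py - below_offset, pz)
```

Called every frame with fresh gaze data, this would keep the UI where the user is already looking instead of forcing them to look away.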
In the meantime, it will be important to understand how humans gather information with their eyes, and in what ways a completely virtual experience can be used to deliver the critical information a user needs.
Who We Are
Pixel and Texel is a development studio based in Dallas, TX, that is developer founded, owned, and operated. We have been creating made-to-order apps, VR experiences, websites, and backend solutions since 2011. Our team of full-stack developers combines decades of code experience, project planning, and coffee drinking to build the bridge that connects brands to consumers. We started this company to put development back into the hands of developers.