- Posted by Larry Brangwyn
- On 28th February 2018
Virtual Reality opens up an entirely new way of interacting with the digital world. Not only can users see more of the environment around them, but they perceive and interact with things in different ways as well, leading to a new VR UX.
Most advances in VR technology are designed to increase this sense of immersion, ultimately making it easier for users to forget where they really are and enjoy VR. However, current VR user interfaces seem to have a slight hangover from the familiar interfaces of today’s computers or websites and tend to look like desktop user interfaces projected on larger areas. This can result in the unique usability considerations of virtual reality being overlooked.
It got us thinking about the different UX aspects of VR, and how we can all design immersive experiences to make the most of current and emerging technology.
Basic visuals and sound
360˚ video is one of the basic forms of VR, and simply being able to look around you in a natural manner – moving your head to orient your vision instead of using a mouse pointer – is enough for some people to lose themselves in what they’re seeing. Shutting out your senses from the outside world is powerful enough (think about trying to walk confidently in a straight line with your eyes closed) but replacing them is something else. The drawback is that if you try to move normally, what your other senses are telling you and what you see might not match up.
Using headset tracking allows six degrees of freedom (6DoF): the ability not only to look around, but to physically move naturally within your environment – providing you don’t go too far in your physical space. For most users, these extra dimensions immediately reduce any doubt that what you’re seeing and hearing is what you’re experiencing.
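To make the distinction concrete, here’s a minimal sketch of what a 6DoF pose contains. The class and field names are our own illustrative assumptions, not taken from any particular headset SDK:

```python
# Illustrative sketch: a 6DoF head pose combines three rotational axes
# with three positional axes. Orientation-only (3DoF) headsets report
# just the rotational half.
from dataclasses import dataclass

@dataclass
class HeadPose:
    # Rotational tracking: where the user is looking
    yaw: float    # degrees, turning left/right
    pitch: float  # degrees, looking up/down
    roll: float   # degrees, tilting the head
    # Positional tracking: where the user is standing
    x: float      # metres, left/right within the play space
    y: float      # metres, up/down (crouching, jumping)
    z: float      # metres, forward/back

def degrees_of_freedom(pose: HeadPose) -> int:
    """Count the tracked axes: all six fields present means 6DoF."""
    return len(vars(pose))

pose = HeadPose(yaw=90.0, pitch=-10.0, roll=0.0, x=0.5, y=1.7, z=-0.2)
print(degrees_of_freedom(pose))  # → 6
```

It’s the second half of that structure – the positional axes – that lets you lean, crouch and walk, rather than just look around from a fixed point.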
The obvious drawback is that your virtual environment can stretch beyond the confines of your physical space, meaning you have to watch out for walls and other obstacles.
The chaperone is a nifty piece of UI that shows you where the boundaries of your physical space are, overlaid on top of your virtual environment. It brokers the sense of space between the two in a way that hopefully doesn’t break the illusion the VR experience is creating. It displays large obstacles like walls, which are easy to bump into, and helps you avoid them. However, some implementations of this are heavier-handed than others.
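A lighter-handed chaperone typically stays invisible until you get close to a boundary, then fades in. Here’s a rough sketch of that idea for a rectangular play area – the function name and the 0.5 m fade distance are illustrative assumptions, not any vendor’s actual implementation:

```python
# Hypothetical chaperone sketch: fade the boundary grid in as the
# user's head position approaches the edge of a rectangular play area
# centred on the origin.

def chaperone_opacity(head_x: float, head_z: float,
                      half_width: float, half_depth: float,
                      fade_distance: float = 0.5) -> float:
    """Return grid opacity in [0, 1]: 0 when well inside the play
    area, rising to 1 at (or beyond) the boundary."""
    # Distance from the head to the nearest wall of the play area
    dist_to_edge = min(half_width - abs(head_x), half_depth - abs(head_z))
    if dist_to_edge <= 0:
        return 1.0  # at or outside the boundary: fully visible
    if dist_to_edge >= fade_distance:
        return 0.0  # well inside: invisible, preserving immersion
    return 1.0 - dist_to_edge / fade_distance

# Standing in the centre of a 3 m x 3 m space: no grid at all
print(chaperone_opacity(0.0, 0.0, 1.5, 1.5))  # → 0.0
# 20 cm from a wall: the grid is mostly faded in (≈ 0.6)
print(chaperone_opacity(1.3, 0.0, 1.5, 1.5))
```

Tuning that fade distance is exactly where the heavy-handed versus light-handed trade-off lives: too short and users hit walls, too long and the grid is constantly breaking immersion.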
In the future, longer-range dynamic tracking of objects and surfaces will help to account for more than just walls, covering other obstacles like tables and chairs too.
Using a high-end VR headset usually means having a cable running the audio and video from a PC to your Head-Mounted Display (HMD). However, the weight and restriction of the cable, not to mention the possibility of getting tangled up, can give you the nagging feeling that you’re tethered, as well as cause accidents.
Wireless headsets (or processors that can render high-end content and fit inside your headset) are definitely the future, allowing unrestricted movement.
Synchronicity between the senses is paramount to a good VR experience. When the signals your brain is getting from the physical world and the virtual world don’t match up, it’s possible to get VR motion sickness, which feels a bit like vertigo or dizziness.
A good example of this effect is found in rollercoaster experiences, a mainstay of VR demonstrations. Some people feel like they should be compensating for the movement they are seeing and lean, exacerbating the effect and causing them to lose balance completely. Or occasionally, they’re pushed…
The difference between observing and interacting was originally overcome with controllers that track hand movements and offer common mechanisms, such as pressing a button or pulling a trigger, to accomplish a user action. For simple navigation this is fine, but across the wide range of possible situations the interaction doesn’t feel tactile or natural.
Object tracking allows the precise positioning of a physical object and its virtual equivalent in real-time, meaning it’s possible to touch and interact naturally with an object. Holosphere’s tests with object tracking showed it was even possible to catch an object thrown to a user who was wearing an HMD.
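The core loop behind this is simple: every frame, copy the tracked physical pose onto its virtual twin, usually with a little smoothing to hide sensor jitter. The class below is a hand-rolled sketch of that idea – the names and the smoothing value are our own assumptions, not a real tracking API:

```python
# Illustrative object-tracking sketch: each frame, blend the latest
# tracker reading into the virtual object's position using a simple
# exponential moving average to smooth sensor noise.

class TrackedObject:
    def __init__(self, smoothing: float = 0.5):
        self.virtual_pos = (0.0, 0.0, 0.0)
        self.smoothing = smoothing  # 0 = raw sensor data; nearer 1 = heavier smoothing

    def update(self, sensor_pos):
        """Move the virtual position toward the tracker reading."""
        a = self.smoothing
        self.virtual_pos = tuple(
            a * v + (1.0 - a) * s
            for v, s in zip(self.virtual_pos, sensor_pos)
        )

cue_ball = TrackedObject(smoothing=0.5)
cue_ball.update((1.0, 0.9, 0.0))  # tracker reports the ball's position
cue_ball.update((1.0, 0.9, 0.0))  # repeated readings converge on it
print(cue_ball.virtual_pos)       # moves toward (1.0, 0.9, 0.0)
```

The smoothing is itself a UX trade-off: too much and the virtual object visibly lags the physical one you’re holding, which breaks the very illusion tracking is meant to create.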
However, only objects that are tracked can have their position recreated in VR. This can cause problems when a user is led to expect that one object has physicality when another does not. A good example of this is Ronnie O’Sullivan trying to lean on the non-existent pool table as he might in a real game. Ouch.
We’ve explored a lot of small details that can bring the user out of immersion in the VR experience, some more sudden and violent than others. However, when these are reduced in number or eliminated altogether, the power of the overall sensory experience can be sufficient to elicit the brain’s help in solidifying the illusion.
In the same way the brain tries to compensate for things it doesn’t think are right, it’s possible for it to trick you into believing what you’re seeing even more – a process known as “filling-in”.
We discovered an example of this when user testing our guided meditation experience, the Forest of Serenity. At one point during the ten-minute program, the user steps from a wooded area into a sunny clearing by a lake. Some users reported that upon walking into the virtual sunlight, they felt warmer. One even asked if we had turned on a heat lamp in 4D cinema style to round out the sensory experience. We hadn’t; reduced noise from other distracting factors simply made this process of filling-in more likely and more effective.
As the above technology becomes more mainstream, the door opens to more natural and intuitive ways of interacting with the environment – the longer-term ideal being to interact exactly as you would in the real world.