To Move Or Not To Move…

Developing a VR framework in Unreal Engine 4

Developing a VR framework that can be repurposed for anything we wish has been a journey of fits and starts, especially where user input devices are concerned.

Initially, when Oculus released their DK1 and DK2 kits, most demos relied on gamepad or keyboard-and-mouse input.
Consequently, what we saw were demos built around standard First Person Shooter control mappings.

The Oculus Rift CV1 itself shipped with a gamepad.

Then came the Vive

HTC Vive Motion Tracked Controller

Tracked motion controllers add a new dimension to immersion as far as we are concerned. To be able to decouple a weapon from hovering in front of your face and attach it to a motion controller is a very liberating experience!

I even found myself blowing the muzzle of my virtual pistol after dusting off an enemy!

We have attempted to contact Oculus to ask for engineering samples of Oculus Touch, but to no avail as yet. They may have run out of their dev allocation.

We do hope to attend Oculus Connect 3, however. This should be a fantastic event and should broaden our knowledge considerably!

Our framework plan was to incorporate Vive controllers, Oculus Touch and gamepad control, thereby making user input transparent and device-agnostic.
Now nearing completion, with an object interaction system, a weapons system and an inventory system, the framework is taking shape.
With these basic game/experience mechanics in place, we then had to deal with the issue of locomotion in VR, which was always rolling around in the back of our minds.
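To give an idea of what "transparent" input might look like, here is a rough C++ sketch of a thin interface that the interaction, weapons and inventory systems could code against, with one concrete implementation per device behind it. The names here (IVRInputSource, EVRInputDevice and the methods) are purely illustrative assumptions for this post, not the actual framework code.

// Hypothetical input-abstraction layer: one interface, one implementation per device.
// All names here are illustrative only.

#pragma once

#include "CoreMinimal.h"

// The kinds of input hardware the framework might detect at startup.
enum class EVRInputDevice : uint8
{
    Gamepad,
    ViveWands,
    OculusTouch
};

// The rest of the framework (weapons, inventory, object interaction)
// talks to this interface and never needs to know which device is in use.
class IVRInputSource
{
public:
    virtual ~IVRInputSource() {}

    // True while the "grab / use" control is held down.
    virtual bool IsGrabHeld() const = 0;

    // True on the frame the "fire" control was pressed.
    virtual bool WasFirePressed() const = 0;

    // World-space transform used for aiming: the HMD for gamepad users,
    // the tracked controller for Vive or Touch users.
    virtual FTransform GetAimTransform() const = 0;
};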

VR Locomotion And Potential Motion Sickness

Since the official launch of both the Rift and the Vive, more and more consumers are jumping on the VR bandwagon, and more of them are voicing their opinions about locomotion.

Back in the DK days, it was accepted that certain ways of moving through the virtual world might lead to slight motion sickness and nausea. For the most part, the early adopters were already seasoned gamers and as such, were used to these kinds of movements.
However, it soon became apparent that these standard movement controls (such as gamepads) were prone to inducing motion sickness through a disconnect between perceived and actual motion.

The vestibular system of the average person seems barely able to tolerate this type of motion in VR.
The biggest problems seem to be artificial acceleration and the dreaded “right thumbstick yaw” movement, where you can turn your body in one direction while moving your head in the other… definitely an issue.

Certain medical white papers have tried to explain the disconnect. But since general consumer adoption of VR technology (April 2016), more and more people who have tried VR have complained about the different forms of locomotion on offer… and nobody seems to agree on any of them.

This makes it difficult for developers such as ourselves to provide an all-encompassing solution that promotes VR adoption to the masses.

Potential Solutions

The favorite technique for locomotion in VR right now is to use what is called “Teleportation” or “Blink” movement.

This mechanism involves projecting a 3D cursor onto the virtual world. The player moves this cursor in the direction they want to go (using either motion tracked controllers or a gamepad) and then presses a button to “teleport” or “blink” to that location.

Anyone who has played Dishonored will recognize this as one of that game’s powers. The method takes artificial acceleration out of the motion equation and is very comfortable.
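As a rough illustration of the mechanic, here is a minimal UE4 C++ sketch of a blink teleport, assuming a pawn with a scene component (a motion controller or the camera) to aim from. ATeleportPawn, AimComponent and the capsule offset are illustrative assumptions, not code from our framework.

// Minimal "blink" teleport sketch. All names are illustrative.
void ATeleportPawn::TryTeleport()
{
    // Project the 3D cursor into the world with a simple line trace
    // from whatever component the player is aiming with.
    const FVector Start = AimComponent->GetComponentLocation();
    const FVector End = Start + AimComponent->GetForwardVector() * 1000.f; // ~10 metres

    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this);

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        // Move instantly: no artificial acceleration for the vestibular system
        // to disagree with. Offset up by roughly the capsule half-height.
        SetActorLocation(Hit.ImpactPoint + FVector(0.f, 0.f, 90.f));
    }
}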

 

Teleporting To Your Destination

Another remedy seems to be the “Snap To Look” or “VR Comfort” mode, where smooth yaw from the gamepad’s right thumbstick is replaced by discretely stepped turns.
Instead of a smooth rotation around the Z-axis, the turn is broken up into discrete steps, typically 45 degrees at a time. This again provides an instantaneous movement and doesn’t upset the vestibular system as much as continuous artificial acceleration.
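A snap turn is simple to sketch in UE4 C++: bind the right thumbstick X axis to a handler and apply a fixed yaw step whenever the stick is pushed past a deadzone. AComfortPawn, SnapAngleDegrees and bSnapReady are assumed, illustrative names rather than framework code.

// Minimal snap-turn sketch, bound to the right thumbstick X axis.
void AComfortPawn::OnSnapTurnAxis(float AxisValue)
{
    const float Deadzone = 0.5f;

    if (FMath::Abs(AxisValue) > Deadzone)
    {
        if (bSnapReady)
        {
            // One discrete step (e.g. SnapAngleDegrees = 45) instead of a smooth yaw.
            AddActorWorldRotation(FRotator(0.f, FMath::Sign(AxisValue) * SnapAngleDegrees, 0.f));
            bSnapReady = false; // wait for the stick to return to centre
        }
    }
    else
    {
        bSnapReady = true; // stick released, allow the next snap
    }
}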

The trouble with these solutions is that they break immersion to a degree. You can’t simply walk up to an object in VR; instead, you hop toward it in discrete steps.
These solutions also limit your freedom in the VR world. I think this defeats the whole purpose of VR, which surely is to allow the user to go anywhere they wish in their virtual world.

Best Of Both Worlds?

It occurred to me that if we could offer regular, standard FPS-style locomotion AND the other locomotion solutions outlined above, then we could cater to the hardcore, seasoned gamer and the new VR consumer alike.
We could also offer a hybrid type of locomotion, combining the best elements of both systems for a better, less nauseating experience.

To do this, we prototyped a player pawn blueprint in Unreal Engine 4 that organizes the component hierarchy of the pawn based on the method of control the user wants to employ.

For instance, if the player is using a gamepad, the weapons and other objects are coupled to the HMD (aiming is done with head movement… look-and-shoot). If the player is using tracked motion controllers (the Vive), the weapons and other objects are decoupled from the HMD and coupled to the controllers, affording a much more natural range of movement.
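In UE4 terms, that re-parenting boils down to attaching the weapon component to a different parent at setup time. The sketch below uses hypothetical names (AVRFrameworkPawn, WeaponMesh, RightMotionController, CameraComponent, bUsingMotionControllers); our real pawn is a Blueprint, so this is only the same idea approximated in code.

// Sketch of choosing the weapon's parent based on the input device.
// All names are illustrative.
void AVRFrameworkPawn::SetupWeaponAttachment()
{
    const FAttachmentTransformRules Rules(EAttachmentRule::SnapToTarget, true);

    if (bUsingMotionControllers)
    {
        // Vive / Touch: decouple the weapon from the view and follow the hand.
        WeaponMesh->AttachToComponent(RightMotionController, Rules);
    }
    else
    {
        // Gamepad: couple the weapon to the HMD camera for look-and-shoot aiming.
        WeaponMesh->AttachToComponent(CameraComponent, Rules);
    }
}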

 

UE4 Pawn Blueprint Hierarchy

We still await proper studies of the numbers and percentages involved with motion sickness in VR, but if we cover all bases, we should be able to mitigate some of these issues and have more people enjoy the wonders of VR!

To be continued!

 

Motion Tracked Shooting!