I have to say this one is pretty cool, and possibly not what you think. The Unreal Engine is getting VR support. That's not support for VR platforms; it already has that. No, the Unreal Editor is soon going to have support for the HTC Vive and Oculus Rift, enabling you to edit and create your game in VR, taking full advantage of both platforms' motion controllers. This is a while out, however, as Epic isn't announcing a release date until March 16th at GDC.
From the Unreal blog:
The Unreal Editor is up and running in VR, so you can build VR content in VR. Using the Oculus Touch and HTC Vive motion controllers, your movement in the real world is mapped one-to-one in VR; you can reach out, grab, and manipulate objects just as you would in real life. You already know how to use this tool, because it works like the world works.
These are the early days of the revolution in immersive VR content creation, but we’re so excited about what’s up and running that we couldn’t keep it a secret anymore! VR movement and editing controls are functional, along with key parts of the Unreal Editor UI, including the Details Panel and the Content Browser. We’ll be showing more and announcing the release date at GDC on Wednesday March 16, 2016. And when it’s released, it will be a built-in feature of the freely-downloadable Unreal Engine, with full source on GitHub.
Best of all, this isn’t a limited mode for VR preview and tweaking. It is the Unreal Editor, now running in VR. The same Unreal Editor that’s used by everyone ranging from indies and mod makers to triple-A development teams with $100,000,000 budgets. And it runs in VR!
A BOX OF TOYS
You start out in the VR editor at a human scale, and can directly manipulate objects by moving around in a room-scale VR setting. But you can also use a smartphone-like pinching motion to zoom in and out. With one pinch, the world is shrunk to the size of a Barbie Doll house on your table. You can manipulate it granularly and ergonomically, and then zoom back to human scale.
Besides directly manipulating objects, you also have a laser pointer. Point at a far-away object and you can move it around, or “reel it in” like a fishing rod. Or teleport to the laser pointer’s target location with a single button click, inspired by Bullet Train’s locomotion.
THE VR USER INTERFACE: IPAD MEETS MINORITY REPORT
As a pro tool, the Unreal Editor features a rich 2D user interface, and it’s being rolled out naturally in VR: One button-press places an iPad-like tablet in your hand, and you use the other hand to interact with the tablet. Scroll, press buttons, tweak Object Details, interact with menus, drag objects out of the Content Browser and drop them directly in the world.
It’s an intuitive way to place a 2D user interface in a VR world that builds on everyone’s Unreal Editor experience, and the underlying Slate user-interface framework provides a great foundation we’ll build on as we work to roll out the entire Unreal Editor UI in VR.
As game developers, we at Epic pride ourselves in creating high-productivity tools optimized for shipping products, and VR editing provides a great path forward.
With a mouse, several operations are often required to transform an object along multiple axes in 3D. In VR, you can frequently accomplish the same result with a single, intuitive motion. This should come as no surprise, as a mouse only tracks two degrees of movement (X and Y), but in VR your head and two hands track six degrees of freedom each: X, Y, Z, and three rotational axes. That’s 9 times the high-fidelity input bandwidth!
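The "9 times" figure quoted above is simple arithmetic, and it's worth seeing where it comes from. A minimal sketch (the variable names are my own, purely illustrative):

```python
# A mouse reports 2 degrees of freedom (DOF): X and Y.
# In VR, the headset and two motion controllers each report 6 DOF:
# position (X, Y, Z) plus three rotation axes (pitch, yaw, roll).
MOUSE_DOF = 2
VR_DEVICES = 3        # head-mounted display + two motion controllers
DOF_PER_DEVICE = 6    # X, Y, Z, pitch, yaw, roll

vr_dof = VR_DEVICES * DOF_PER_DEVICE   # 18 tracked axes in total
ratio = vr_dof // MOUSE_DOF            # 18 / 2 = 9

print(f"{vr_dof} VR axes vs {MOUSE_DOF} mouse axes -> {ratio}x the input bandwidth")
```

Note this counts tracked axes, not precision or comfort; a mouse is still far more accurate per axis, which is part of why the skepticism below isn't unreasonable.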
More details are available here. It will be interesting to see whether this is genuinely useful or just a gimmick. Obviously, working on a VR game in full VR has its advantages. At the end of the day, though, few control schemes actually usurp the mighty mouse and keyboard. Add to that the fact that, at least with my Gear VR, it's tiring both physically and on the eyes after a couple of hours. I can't imagine doing the 9-to-5 routine with one of these devices strapped to my head. Time will tell, I suppose.