Intel RealSense Plugin For Unreal Engine Released


Intel RealSense is a technology and SDK for computer vision, including motion controls, facial recognition and more.  There are several cameras and laptops on the market these days that are compatible with RealSense.  It is very similar in scope and function to Kinect for the Xbox 360 and Xbox One.

Well, earlier this week a plugin enabling RealSense support was released for Unreal Engine.  From the announcement:

Intel is always excited to introduce innovative tools and technologies that empower the world’s most passionate content creators. In case you’re unfamiliar, the Intel RealSense cameras use infrared light to compute depth in addition to normal RGB pictures and video. To assist in the development of applications with this technology, Intel created the RealSense SDK, a library of computer vision algorithms including facial recognition, image segmentation, and 3D scanning. 

Short-Range, User-Facing RealSense Camera Developer Kit

Seeing the potential use cases for this technology in gaming, we would now like to introduce you to the RealSense Plugin, a collaborative effort among games engineers at Intel to expose the features of the RealSense SDK to the Blueprints Visual Scripting System in UE4.

Check out the plugin source code and a sample project here.

PLUGIN OVERVIEW

The plugin is architected as a set of Actor Components, each of which encapsulates a distinct set of features from the RealSense SDK. Using these relatively lightweight components, you can add 3D sensing capabilities to nearly any actor in your game, and you can access this data anywhere by simply instantiating another instance of the same component.
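If you want to experiment beyond Blueprints, wiring one of these components into an actor in C++ should look roughly like the sketch below.  Keep in mind the class and header names used here (UCameraStreamComponent, CameraStreamComponent.h) are my guesses based on the component list further down, not confirmed API, so check the plugin source for the exact names.

```cpp
// Hypothetical example actor -- not part of the plugin or its sample project.
// MyRealSenseActor.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyRealSenseActor.generated.h"

UCLASS()
class AMyRealSenseActor : public AActor
{
    GENERATED_BODY()

public:
    AMyRealSenseActor();

    // Assumed plugin component exposing the raw color and depth streams.
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "RealSense")
    class UCameraStreamComponent* CameraStream;
};

// MyRealSenseActor.cpp
#include "MyRealSenseActor.h"
#include "CameraStreamComponent.h" // assumed header name from the plugin

AMyRealSenseActor::AMyRealSenseActor()
{
    // Any actor gains 3D sensing by instancing a RealSense component; another
    // actor can instance the same component type to access the same data.
    CameraStream = CreateDefaultSubobject<UCameraStreamComponent>(TEXT("CameraStream"));
}
```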

Figure 2: Face scanning and mapping in Unreal Tournament using the Scan 3D Component

PLUGIN COMPONENTS

Currently, the plugin features these three RealSense Components:

  1. Camera Streams Component: Provides access to the raw color and depth video streams from the RealSense camera.
  2. Scan 3D Component: Supports the scanning of real-world objects and human faces (pictured above).
  3. Head Tracking Component (Preview): Supports the detection and tracking of a user’s head position and orientation.

The downside to head tracking controls is that the user ultimately still has to look at the screen!  So while it would be awesome to have the computer track your head movements in, say… a car racing sim, you still need to keep your head pointed straight ahead.  Well, except of course in VR, where head tracking is handled by the headset hardware itself.  Have any of you encountered actual cool usage of RealSense in a game?
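For the curious, driving an in-game camera from the head tracking data would look something like the following C++ sketch.  Again, UHeadTrackingComponent and its GetHeadRotation() accessor are assumed names for illustration only; the plugin actually exposes head tracking through Blueprints, so consult the sample project for the real interface.

```cpp
// Hypothetical sketch: blending tracked head orientation into a racing pawn's camera.
// MyRacingPawn.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MyRacingPawn.generated.h"

UCLASS()
class AMyRacingPawn : public APawn
{
    GENERATED_BODY()

public:
    AMyRacingPawn();
    virtual void Tick(float DeltaSeconds) override;

    UPROPERTY(VisibleAnywhere, Category = "RealSense")
    class UHeadTrackingComponent* HeadTracking; // assumed plugin component class

    UPROPERTY(VisibleAnywhere, Category = "Camera")
    class UCameraComponent* FollowCamera;
};

// MyRacingPawn.cpp
#include "MyRacingPawn.h"
#include "Camera/CameraComponent.h"
#include "Components/SceneComponent.h"
#include "HeadTrackingComponent.h" // assumed header name

AMyRacingPawn::AMyRacingPawn()
{
    PrimaryActorTick.bCanEverTick = true;

    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
    HeadTracking  = CreateDefaultSubobject<UHeadTrackingComponent>(TEXT("HeadTracking"));
    FollowCamera  = CreateDefaultSubobject<UCameraComponent>(TEXT("FollowCamera"));
    FollowCamera->SetupAttachment(RootComponent);
}

void AMyRacingPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Blend half of the real head yaw/pitch into the camera so a small glance
    // shifts the view while the player keeps facing the screen.
    const FRotator Head = HeadTracking->GetHeadRotation(); // assumed accessor
    FollowCamera->SetRelativeRotation(FRotator(Head.Pitch * 0.5f, Head.Yaw * 0.5f, 0.0f));
}
```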

GameDev News

