12. February 2016

 

SDKBOX started life as a tool that made it easy to integrate third-party SDKs, such as Google Analytics or Google/Apple IAP, into Cocos2d-x applications.  It was just recently spun out from under the Cocos2d-x project by parent company Chukong.  This move seems to be in advance of adding support for other engines, namely Unreal and Unity.  Right now, however, the implementation for both engines is minimal, limited to just the In-App Purchases module.  I assume more plugins will be brought over to the other engines in time.

 

From the above linked article:

Asked why the company did this, Zhao said in an email, “By being a standalone company we can cover a much broader (fully half) portion of the market. SDK fatigue is an efficiency drain for all mobile developers, regardless of the engine they are using. By defining the problem we are working on, rather than the engine we are working with, we can serve game developers impartially.”

SDKBOX has also begun supporting rival game engines from Unity Technologies and Epic Games with its 2.0 update. And earlier this month, the SDKBOX in-app purchase controller was approved for sale and listed in the Unity Asset Store.

Haozhi Chen, CEO of Chukong Technologies, said in a statement, “Spinning off SDKBOX was the next logical step for scaling the business within the games industry. This move provides more autonomy for SDKBOX to support additional engines and grow market share in the live operations technology market for games. Additionally, the new company will have dedicated financial and strategic support from Chukong, and we’re looking forward to the future growth of this sector for our overall business.”

GameDev News


11. February 2016

 

I have to say, I just came across this custom branch of Blender and it's astonishingly good.  In addition to a number of improvements to viewport lighting, this branch actually brings PBR (Physically Based Rendering) to the Blender viewport.  To appreciate what has been accomplished here, you really need to check out the video.

Blender PBR viewport Branch v0.2 from Clément FOUCAULT on Vimeo.

You can also see a direct before and after shot below!

This is truly some impressive work.  It is available on the author's website.  By the way, he is looking for work in the GameDev field... someone hire this guy!

GameDev News Art


11. February 2016

 

In a move that will surprise absolutely nobody, today at the Vision VR/AR Summit, Valve announced they will be bringing SteamVR support to Unity, as well as a new VR rendering plugin for extended functionality.

From the announcement:

Today, during the opening keynote of the inaugural Vision VR / AR Summit, Valve and Unity Technologies announced a new collaboration to offer native support for SteamVR in the Unity Platform, giving developers new reach at no extra cost. Additionally, we will be adding a new VR rendering plugin to further enhance functionality.

The collaboration means that all of Unity’s developers will have access to native support for Valve’s upcoming SteamVR platform. Beyond SteamVR support, Valve has developed an advanced rendering plugin for Unity to further enhance fidelity and performance, bringing consumers more realistic experiences.

Valve co-founder Gabe Newell announced the news during a special video address at the Vision Summit, adding: “We made many of our Vive demos using Unity, and continue to use it today in VR development. Through that process, and in working with VR developers, we found some opportunities to make Unity even more robust and powerful for us and really want to share those benefits with all VR content creators.”

Unity CEO John Riccitiello went on to discuss the news during the Vision Summit opening keynote: “Valve and Unity are both dedicated to creating the highest quality VR experiences possible. That means giving developers every possible chance to succeed, and our collaboration with Valve is designed to do just that.”

Valve will also be providing a talk at Vision, and to celebrate the launch, they are surprising every developer at the conference with a free HTC Vive Pre, the latest SteamVR development system. For more information, please visit http://visionsummit2016.com/

GameDev News


10. February 2016

 

This news comes care of /r/gamedev: EA has released EASTL under the BSD license on GitHub.

EASTL has been around for ages and got a limited release a number of years back, but this is the first time there has been a complete release under a BSD license.  EASTL is an implementation of the C++ Standard Template Library (STL), optimized for game performance, especially on consoles.  This new release represents a more up-to-date version, and one that is actively maintained.

If you are familiar with the C++ STL or have worked with other templated container/algorithm libraries, you probably don't need to read this. If you have no familiarity with C++ templates at all, then you probably will need more than this document to get you up to speed. In this case, you need to understand that templates, when used properly, are powerful vehicles for the ease of creation of optimized C++ code. A description of C++ templates is outside the scope of this documentation, but there is plenty of such documentation on the Internet.

EASTL is suitable for any tools and shipping applications where the functionality of EASTL is useful. Modern compilers are capable of producing good code with templates and many people are using them in both current generation and future generation applications on multiple platforms from embedded systems to servers and mainframes.

The GitHub repository is available here.

GameDev News


10. February 2016

 

Intel RealSense is a technology and SDK for computer vision, including motion controls, facial recognition and more.  There are several cameras and laptops on the market these days that are compatible with RealSense.  It is very similar in scope and function to the Kinect for the Xbox 360 and Xbox One.

Well, earlier this week a plugin was released for Unreal Engine enabling RealSense support.  From the announcement:

Intel is always excited to introduce innovative tools and technologies that empower the world's most passionate content creators. In case you're unfamiliar, the Intel RealSense cameras use infrared light to compute depth in addition to normal RGB pictures and video. To assist in the development of applications with this technology, Intel created the RealSense SDK, a library of computer vision algorithms including facial recognition, image segmentation, and 3D scanning. 

Short-Range, User-Facing RealSense Camera Developer Kit

Seeing the potential use cases for this technology in gaming, we would now like to introduce you to the RealSense Plugin, a collaborative effort among games engineers at Intel to expose the features of the RealSense SDK to the Blueprints Visual Scripting System in UE4.

Check out the plugin source code and a sample project here.

PLUGIN OVERVIEW

The plugin is architected as a set of Actor Components, each of which encapsulates a distinct set of features from the RealSense SDK. Using these relatively lightweight components, you can add 3D sensing capabilities to nearly any actor in your game, and you can access this data anywhere by simply instantiating another instance of the same component.

Figure 2: Face scanning and mapping in Unreal Tournament using the Scan 3D Component

PLUGIN COMPONENTS

Currently, the plugin features these three RealSense Components:

  1. Camera Streams Component: Provides access to the raw color and depth video streams from the RealSense camera.
  2. Scan 3D Component: Supports the scanning of real-world objects and human faces (Pictured above).
  3. Head Tracking Component (Preview): Supports the detection and tracking of a user’s head position and orientation.

The downside to head-tracking controls is that the user ultimately still has to look at the screen!  So while it would be awesome to have the computer track your head movements in, say, a car racing sim, you still need to keep your head facing straight ahead.  Except, of course, in VR, where this entire process is handled by the hardware.  Have any of you encountered a genuinely cool use of RealSense in a game?

GameDev News

