16. July 2014



How to take a Blender model you downloaded from the web and make it actually usable in your game in 28 easy steps!


… granted, the second title doesn’t have the same flow to it, does it?


I just had to run through this process and figured I would share it, as it's something that comes up fairly often.  When working with Blender, there are dozens of procedural textures available that can produce very nice results quickly.  The only problem is, when you get your asset out of Blender and into your game engine, things suddenly go horribly wrong: those textures only make sense inside of Blender.  Fortunately, through the magic of baking, you can easily convert them into a texture map usable in any game engine.


Let’s take a look how.


First we need a model.  I am using a beautiful new model that was recently added to Blend-Swap.  It’s a free download but you need to register.  Don’t worry, you can use a real email address, they don’t spam, or at least haven't so far.  The model in question looks like this:




Unfortunately when we load it in Blender we quickly learn this model is in no way game ready.  Let’s take a look:



Ick.  So instead of a single Mesh, we have a dozen individual meshes.  Problem is, we need to unwrap them as a single object, so let’s join them all together.  First let’s get the camera out of the default layer.


If you look at the way this particular Blend is set up, there are currently two layers: the second contains the armature, the first contains everything else.



Let's get the camera out of there.  Select the camera object, then hit the M key.  Then select the layer you want to move the camera to, like so:



Now click the first layer ( bottom left box ); it should contain only geometry.


We want to join everything together.  Press ‘A’ to select everything in the layer, then hit “Ctrl + J” to join everything into a single set of geometry.  Now it should look something like this:



Perfect, now we can unwrap our model.  Switch into Edit mode



Press ‘A’ again, until all faces are selected, like so:



Now we unwrap our model.  Select Mesh->UV Unwrap-> Unwrap ( or Smart UV Project ).


Switch your view to UV/Image Editor



It should look something like this:



Now create a New Image:



This image is where we are going to render our texture to.  Here are the settings I used.  Remember, games like Power of 2 textures.
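As an aside, "power of two" is easy to enforce in code if you are generating texture sizes programmatically.  A tiny TypeScript sketch ( the helper names are mine ):

```typescript
// True if n is a positive power of two (1, 2, 4, ... 512, 1024, ...)
function isPowerOfTwo(n: number): boolean {
    return n > 0 && (n & (n - 1)) === 0;
}

// Round a size up to the next power of two, e.g. 1000 -> 1024
function nextPowerOfTwo(n: number): number {
    let p = 1;
    while (p < n) p *= 2;
    return p;
}
```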



Ok, now let’s look at the actual render to texture part.  Take a quick look at how the model is currently shaded:



Frankly none of those are really game engine friendly.  So let’s render all of those materials out to a single texture.  Go to the render tab



Scroll down and locate Bake.

In the UV Editor window, make sure everything is selected ( using ‘A’.  They should be highlighted in yellow ).  At this point, with your generated image and all the UV’s selected, it should look like:




Now under bake, set the following settings:


The key values being Bake Mode = Full Render and Selected to Active checked.  Now click the Bake button.


Up in your top part of Blender, you should see a progress bar like so:




Now if you go back to the UV/Image viewer, and select your image RenderedTexture, you should see:





Let’s save the result to an external ( game engine friendly ) texture.  Select Image->Save as Image.  Save the image somewhere.  Remember where.




Now let's modify the textures on our model to use only our newly generated texture map.  First, in the 3D View, switch back to Object Mode from Edit Mode.

Then, open the materials tab:



Select each material and hit the – ( or killswitch engage! ) button.  So it should ultimately look like this:



Now hit the + button and create a new Material.  Then click the New button.



The default values for the material should be OK, but depending on your game engine, you may have to enable Face Textures:



Now click over to the Texture tab.  Click New.



Drop down the Type box and select Image or Movie.



Scroll down to the Image section and select Open.  Pick the image you saved earlier.



Now scroll down to Mapping, drop down Coordinates and select UV.



Under Map select UVMap.



Now if you go to the 3D View and set the view mode to Texture:



TADA!  A game ready model.


One word of caution though, if you render this scene in Blender you will get the following result:



Don’t worry.  That’s just a byproduct of going from Blender materials to texture mapping.  If you want the texture to be visible, you need to add some lights to the scene, or change the material so it has an Emit value > 0 so it provides its own light.


With Emit set to .92, here is the result if you render it:




Now, what about in game?


Let’s create a simple LibGDX project that loads and displays our exported model:


package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;

public class ModelTest implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;

    @Override
    public void create() {
        // Create a camera sized to the window, pulled back along the Z axis
        // and aimed at the origin (the exact values here are representative)
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());
        camera.position.set(0f, 0f, 7f);
        camera.lookAt(0f, 0f, 0f);

        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        // Load our baked and exported model from the assets folder
        UBJsonReader jsonReader = new UBJsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        model = modelLoader.loadModel(Gdx.files.getFileHandle("robot.g3db", FileType.Internal));
        modelInstance = new ModelInstance(model);

        // A bright ambient light so our baked texture actually shows up
        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

        camera.update();
        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}


And we run it and:



Wow, a model downloaded randomly from the Internet actually working in the game engine!  How often does that actually happen? ;)


15. July 2014


There has been a flood of Three.js books on Safari lately, including Essential Three.js and Game Development with Three.js.  Three.js is a JavaScript based 3D library using WebGL ( with a canvas fallback when it's not available ).  More importantly, it’s just really fun to play with!  Something about working in full 3D in a scripting language is just really satisfying.  I’ve only just been playing and really don’t have a clue what I’m doing, but I figured I would share my results.  As I have been on a TypeScript kick lately, I’ve been writing in TypeScript instead of plain JavaScript, but frankly the differences are fairly minimal.  You can get the TypeScript definitions on DefinitelyTyped.


I think I should make something perfectly clear… I have NO idea what I am doing, I am simply playing around.  This isn’t a WebGL tutorial by any definition of the word, just me having skim read a couple of books and played around with a new technology, nothing more.  So if you look at some code and think “damn, that looks hacky” or “isn’t that a really stupid thing to do?”, the answer is probably yes! :)


So, disclaimer given, let’s jump right in. 


Since this is a web app, we need a host HTML page.  So, here is ours:

<!DOCTYPE html>

<html lang="en">
<head>
    <meta charset="utf-8" />
    <title>ThreeJS Test</title>
    <script src=""></script>
    <script src="app.js"></script>
</head>
<body>
    <h1>ThreeJS Test</h1>

    <div id="content" style="width:500px;height:500px"></div>
</body>
</html>


Nothing really shocking here.  We include three.js from the cloudflare content delivery network.  If you wanted, you could of course download the library and deploy it from your own servers instead.  Unless you happen to have servers situated around the world though, a CDN will generally outperform your own servers.  Next we include app.js, the generated output from our TypeScript application.  In the actual HTML we create a 500x500 DIV named content for, predictably enough, our content!


Now let's take a look at a super simple example app, app.ts:

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;

    constructor() {
        // Create the WebGL renderer, size it, set the clear color to red,
        // then parent it to our 'content' DIV
        this.renderer = new THREE.WebGLRenderer({ alpha: true });
        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFF0000, 1);
        document.getElementById('content').appendChild(this.renderer.domElement);
    }

    start() {
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
}


Here in the constructor we create a WebGLRenderer, size it, set the background color to red ( using HTML format hex color coding ) then wire the renderer to the content div.


When you run it you should see:



Cool, our first Three.js application.  Now let’s do something 3D!  Let’s start by creating a camera and rendering a built in 3D object in wireframe. It's commented heavily, so I won't be explaining what is going on. If you are curious why I did something, leave a comment.

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    scene: THREE.Scene;
    camera: THREE.Camera;

    constructor() {
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, aspect ratio, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0, 0, -20);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create the geometry for a sphere with a radius of 5
        var sphereGeometry = new THREE.SphereGeometry(5);

        // Create a wireframe material that's blueish
        var sphereMaterial = new THREE.MeshBasicMaterial(
            {color: 0x7777ff, wireframe: true});

        // Now make a THREE.Mesh using the geometry and the material
        var sphere = new THREE.Mesh(sphereGeometry, sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0, 0, 0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        // Well, aren't I a bit pointless?
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
}


And when we run it we get:




Cool!  Now time for some texturing ( and as a result, lighting ).

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    scene: THREE.Scene;
    camera: THREE.Camera;

    constructor() {
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, aspect ratio, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0, 0, -20);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create the geometry for a sphere with a radius of 5
        // This time we cranked up the number of sections, horizontal and vertical,
        // to make a higher resolution globe
        var sphereGeometry = new THREE.SphereGeometry(5, 20, 20);

        // This time we create a Phong shader material and provide a texture.
        var sphereMaterial = new THREE.MeshPhongMaterial(
            { map: THREE.ImageUtils.loadTexture("earth_sphere.jpg") });

        // Now make a THREE.Mesh using the geometry and the material
        var sphere = new THREE.Mesh(sphereGeometry, sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0, 0, 0);

        // Add it to the scene
        this.scene.add(sphere);

        // We need some light so our texture will show, add an ambient light to the scene
        this.scene.add(new THREE.AmbientLight(new THREE.Color(0.9, 0.9, 0.9).getHex()));
    }

    render() {
        // Each frame we want to render the scene again
        // Use TypeScript arrow notation to retain the thisocity passing render to requestAnimationFrame
        // It's possible I invented the word thisocity.
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        // Not so pointless now!
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
}

Bet you can't guess what texture I went with!




So apparently textured 3D objects are nothing difficult.


This is getting pretty long, so I’ll cut it off here.  Next up I’m going to look at getting a Blender object rendering in Three.JS.


9. June 2014


I just received the following email from Autodesk:


SAN FRANCISCO, June 9, 2014 -- Autodesk, Inc. (Nasdaq: ADSK) has acquired Stockholm-based Bitsquid AB, the creator of the Bitsquid game engine. The acquisition brings to Autodesk expertise in 3D game development and proven technology that will enable Autodesk to supercharge its portfolio of tools for game makers through the development of a new 3D game engine. Multiple game developers have used the modern and flexible Bitsquid engine to create 3D games for next-generation consoles and PCs, and Autodesk will continue to work with many of these companies to develop the new 3D game engine. Terms of the acquisition were not disclosed.

"Bitsquid has been a key success factor for Fatshark, as we’ve been able to produce high quality games with short development times,” said Martin Wahlund, CEO, Fatshark. "We are excited to see how Bitsquid evolves now that it is part of Autodesk.”

In addition to acquiring the Bitsquid game engine, the acquisition of the Bitsquid team and technology will enable Autodesk to create new tools that push the limits of real-time 3D visualization for architects and designers, many of whom face challenges placing design data into real world contexts. The new technology will also be incorporated into solutions for customers outside of the games industry, including architecture, manufacturing, construction, and film. Autodesk plans to create new types of design exploration tools that allow visualization and contextualization of designs using the same fluid control and immediate feedback that exist today in modern console and PC games.

"Autodesk's acquisition of Bitsquid will revolutionize real-time exploration of complex data. Imagine being able to walk through and explore any type of design, from buildings to cars, with the same freedom you experience in the open world of a next-generation console game. Game engine technologies will be an increasingly critical part of the workflow, not only for creating games, but also for designing buildings or solving complex urban infrastructure challenges," said Chris Bradshaw, senior vice president, Autodesk Media & Entertainment. "The Bitsquid acquisition brings to Autodesk both the expertise and the technology that will enable us to deliver a groundbreaking new approach to 3D design animation tools, and we welcome the team and community to Autodesk."

Additional information on the new Autodesk 3D game engine, which will complement Autodesk's industry leading games portfolio of middleware tools and 3D animation software including Autodesk Maya LT, Autodesk Maya and Autodesk 3ds Max, will be available later this year.


From that press release it sounds like a relatively minor acquisition; they could simply be rolling the technology into one of their existing products.  However, if you read this site, they obviously have bigger plans:


More than just games – This is going to be BIG

With the acquisition of Bitsquid, Autodesk is bringing expertise in 3D game development and proven game engine technology in house. We are significantly expanding our portfolio of game making tools, complementing our middleware and 3D animation tools: Autodesk® 3ds Max®, Autodesk® Maya®, and Autodesk® Maya LT™ software. Across Autodesk, this technology will fuel new product development in our Media & Entertainment business, and enable a new class of design animation tools.

Tools for Game Makers

Later this year, Autodesk will introduce a modern and flexible 3D game engine based on the Bitsquid engine. By introducing a game engine, Autodesk can offer game makers a more complete game creation workflow from concept to release.

A New Era in Design Animation

Many of our manufacturing, architecture, building, and construction customers are also excited about game engine technology– but not because they are making games. Instead, they are looking for new ways to visualize and interact with design data with the same level of control and feedback of modern console or PC games. With the acquisition of Bitsquid, Autodesk will begin exploring the creation of a new interactive design exploration platform, integrated with our design tools, which will help designers contextualize their ideas.

In Film and Television

Autodesk will also be looking at how Bitsquid technology may be applied to workflows such as pre-visualization and interactive compositing.


Bolded portion mine.  So Autodesk is clearly entering the game engine space and building it around BitSquid.  Ever heard of it?  Yeah, me neither.  It is however the engine powering Magicka:Wizard Wars:



Their site is fairly minimal, but describes the BitSquid engine as:



Bitsquid is a new high-end game engine, built from the ground up to focus on excellent multicore performance, cache friendly data layouts and advanced rendering techniques.


Bitsquid supports immediate reload of all resources, both scripts and content. You can also test run levels instantly, on PCs, consoles, phones and tablets.


The engine is completely data driven, making it easy to create a highly scalable rendering pipe that shines on both the latest DX11 GPU and mobile devices, just by changing configuration files. And any game or simulation can be authored using just Lua and visual scripting. Of course you can use C as well, where you need the speed.


Written with a minimalistic modular design philosophy the entire engine is less than 200 KLOC and easy to modify.


The technical blog however makes for an interesting read.


Autodesk entering the game space isn’t really a huge shock.  They actually dipped their toe in the pond when they released Scaleform as an Indie game engine.  Considering their heavy role in the game development pipeline ( due mostly to Max and Maya ), this move does make sense.  The question is, will this alienate their existing partners?


EDIT: One thing I didn’t mention in the original post.  Autodesk also announced the lowering of the monthly cost of Maya LT from $50 a month to $30 a month.  Additionally they have made Mudbox available for $10 a month.  This seems like a much better price point to me.  You can now get Photoshop ($30), Maya LT ($30) and Unreal ($19), a complete game development package for less than $80 a month.  Compare that to prices a few years ago and it is simply mind blowing!


Additionally, Develop have published an interview with Autodesk’s Frank Delise discussing the acquisition. 


2. June 2014

EDIT:  For a better understanding of Apple’s Metal API and what it means for OpenGL, click here. 

So finally we are getting some developer related announcements out of the Apple Developer Conference.  For game developers, today's announcement is a doozy.  The iOS 8 SDK includes 4,000 new API calls, but most importantly includes Metal, a new lower level graphics API similar to AMD’s Mantle.  The idea is to get closer to the metal ( thus the name ) and remove the overhead of OpenGL:


Gaming on iOS takes a huge leap forward in iOS 8 with Metal, a new graphics technology that maximizes performance on the A7 chip. With its dramatic 10 times improvement in draw call speed, Metal enables leading game providers for the first time to bring console-class 3D games to mobile devices. For casual games, iOS 8 now features SceneKit, making it easy to create fun 3D games, along with major enhancements to SpriteKit, including field forces, per-pixel physics and inverse kinematics.


10 times the draw call performance of OpenGL?  That sounds like marketing BS to me, or at least describes an edge case.  If OpenGL were that bloated it would have died off years ago.  The important takeaway is that Metal is A7 only, so only the newest iPads and iPhones support it.  Unity, Crytek and Unreal are all expected to support it, so it should be pretty transparent to most developers.


The other major announcement was Swift:


Swift is a powerful new programming language for iOS and OS X® that makes it easier than ever for developers to create incredible apps. Designed for Cocoa® and Cocoa Touch®, Swift combines the performance and efficiency of compiled languages with the simplicity and interactivity of popular scripting languages. By design, Swift helps developers write safer and more reliable code by eliminating entire categories of common programming errors, and coexists with Objective-C® code, so developers can easily integrate Swift into their existing apps. Xcode® Playgrounds make writing Swift code incredibly interactive by instantly displaying the output of Swift code.


The iOS beta software is available now for registered Apple developers.  Xcode 6 is required to support the Swift programming language.  You can learn more about Swift here.  I LOVE new programming languages, so I will certainly be taking a closer look.  Some Apple touted features of Swift are:


Swift has many other features to make your code more expressive:

  • Closures unified with function pointers
  • Tuples and multiple return values
  • Generics
  • Fast and concise iteration over a range or collection
  • Structs that support methods, extensions, protocols.


… interesting.  I hate ObjC, so an alternative is certainly appreciated. 

16. May 2014


Unreal just announced the contents of the upcoming 4.2 update.  I’m not sure when exactly it is going to be dropping ( I am downloading a 1.3GB update as we speak, but I don’t think it’s 4.2 ).  Regardless there is a stupid (good) amount of content in this update:



Vehicles are now fully supported in Unreal Engine! To celebrate we’ll be releasing a really cool sample game that demonstrates a fun off-road racing experience!  This will be available for free on the Marketplace with 4.2.

New Sample Vehicle Game

  • Out of the box support for 4WD, FWD, and RWD drive trains.
  • The default drive train simulation assumes 4 wheels; however it will work with any number of wheels.
  • As many gears as you desire.
  • Automatic and semi-manual transmissions.
  • And it is all completely exposed to Blueprints!

Mechanical Setup for Vehicle Game

  • Full documentation on setting up a vehicle will be available on the release of 4.2

The Stylized Rendering sample will be available on the Marketplace for free! This sample showcases the rendering flexibility of Unreal Engine 4.

Stylized Rendering Sample


Support for CameraAnims has now been added into Unreal Engine 4! These are very similar to CameraAnims from Unreal Engine 3, but now expanded with Blueprint support.

  • Conceptually, a CameraAnim is simply an animation that can be layered onto the in-game camera.  You can animate the camera position and rotation, FOV, and post process settings.
  • They can be created in the Content Browser, you can convert a track in Matinee to one, or you can import keyframes from external tools like Maya

CameraAnim System

  • To edit a CameraAnim, simply double click the asset in the Content Browser like you would any other asset. The CameraAnim editor is a slightly customized version of Matinee.
  • Multiple CameraAnimInsts (up to 8 currently) can be active at once, all blending and contributing to the final camera settings.

User Defined Structures are a brand new asset that is now available for use from within the editor!

A User Defined Structure can be created in the Content Browser.

User-Define Structures in Blueprints

They can be edited in the standalone editor by double clicking them in the Content Browser. Once you are done editing them, you can create a variable of the type of your new User Defined Structure in your Blueprints.

Content Browser Blueprints

Like vectors, rotators, and other structures, your User Defined Structure will have Make and Break nodes. User Defined Structures should behave like native USTRUCT(BlueprintType) structures.


Users can now create a blueprint function library asset!  This allows you to create a nice library of Blueprint functions that can be used throughout your project!

Blueprint Library

Unlike the Blueprint Macro Library, Blueprint Function Libraries don’t require a Parent Class, and are able to be called by all Blueprints in the current project.


FABRIK stands for Forward And Backward Reaching Inverse Kinematics.  It’s an IK solver that works on a chain of bones of any arbitrary length!

Fabrik Inverse Kinematics Solver

  • End Effector settings are the same as our TwoBone_IK node. It can be an absolute Transform, or a relative one (based on another bone from the same Skeleton).
  • In the Solver section, you define the chain of bones to use, from the ‘Root’ to the ‘Tip’. The ‘Tip’ will try to reach the end effector location.
  • End Effector Rotation Source allows you to control the rotation (maintain component space, local space, match end effector target rotation).
  • Precision is how close it needs to get. The lower, the more precise it gets to the End Effector Target, but the more expensive. (Although from tests it does a really nice job, and much quicker than the CCD_IK node).
  • MaxIterations is there to control performance, and make sure edge cases will not take frame rate down.

Thanks to GitHub community member Stephen Whittle for this feature!
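The FABRIK idea itself is simple enough to sketch: alternately drag the chain from the tip to the target and from the root back home, restoring bone lengths each pass.  Here is a minimal 2D version in TypeScript ( purely illustrative; Unreal's node operates on 3D skeletal bone chains ):

```typescript
type Vec2 = [number, number];

function dist(a: Vec2, b: Vec2): number {
    return Math.hypot(a[0] - b[0], a[1] - b[1]);
}

function lerp(a: Vec2, b: Vec2, t: number): Vec2 {
    return [a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t];
}

// Solve a 2D joint chain toward `target`; the root (points[0]) stays fixed.
function fabrik(points: Vec2[], target: Vec2, tol = 1e-4, maxIter = 50): Vec2[] {
    const lengths = points.slice(0, -1).map((p, i) => dist(p, points[i + 1]));
    const root = points[0];
    const total = lengths.reduce((a, b) => a + b, 0);

    if (dist(root, target) >= total) {
        // Target out of reach: stretch the chain straight toward it
        for (let i = 0; i < points.length - 1; i++) {
            points[i + 1] = lerp(points[i], target, lengths[i] / dist(points[i], target));
        }
        return points;
    }

    for (let iter = 0; iter < maxIter && dist(points[points.length - 1], target) > tol; iter++) {
        // Backward pass: snap the tip onto the target, restore bone lengths root-ward
        points[points.length - 1] = target;
        for (let i = points.length - 2; i >= 0; i--) {
            points[i] = lerp(points[i + 1], points[i], lengths[i] / dist(points[i + 1], points[i]));
        }
        // Forward pass: snap the root back home, restore bone lengths tip-ward
        points[0] = root;
        for (let i = 0; i < points.length - 1; i++) {
            points[i + 1] = lerp(points[i], points[i + 1], lengths[i] / dist(points[i], points[i + 1]));
        }
    }
    return points;
}
```

The tol and maxIter parameters play the same role as the Precision and MaxIterations settings described above: tighter precision costs more iterations.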


You can now draw Canvas UI straight to a texture, and then map that texture to objects in your scene!

  • This is fully supported by both C++ and Blueprint workflows. 
  • A special thanks to community member James Tan for submitting this.

We have a new project template that gives a simple automobile to start your new vehicle-based project with.  Both Blueprint and C++ versions of this template will be included with 4.2!

Vehicle Template


Persona now has the ability to override the assets used in Play and Evaluate nodes in parent animation blueprints from a child blueprint.

The editor for this can be found in the Window menu in Persona:

Asset Override

The editor collects all of the nodes that can have their animation asset overridden. New assets can be picked from the pickers on the right of the panel. This works with longer inheritance chains too and shows the most appropriate defaults for that blueprint based upon the blueprints further up in the hierarchy. Finally, the “eye” button will link you to the AnimGraph node you are overriding:



Multi-select support for Anim Notifies has been added into Persona! Shift + click adds to selection, and Ctrl + click toggles a notify selection.

You can drag the selection around and it will remain within the notify panel while obeying snaps:

Image Notifier

Copy/Paste works with groups too with a number of options:

You can paste them at the absolute, relative, or original time, in relation from where they were copied.


Cut/Copy/Paste commands have now been added to Components mode!

Blueprints Cut/Paste

  • Select individual components and either right-click or use keyboard Cut/Paste commands to copy or move components around!
  • You can cut/copy components from one Blueprint and paste them right into another Blueprint!

If the selection includes a hierarchy, it will be preserved in the pasted copy!


There is a new experimental plugin available for the Math Expression Node for Blueprints. This node enables you to simply type in an expression and it will make the input pins, output pins, and a collapsed graph that contains your math expression.

Experimental Math Expression Node

  • To use this node, simply activate the Math Expression plugin in the plugin manager. It will then appear in the right click context menu of the Blueprint Graph Editor
  • Once created, type in your expression. If you make a mistake, you can edit the expression by renaming the node.
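The core trick ( turning an expression string into input pins plus an evaluation step ) can be sketched in a few lines of TypeScript.  This is purely illustrative and has nothing to do with Unreal's actual implementation; it only handles simple expressions of variables and numbers:

```typescript
// Turn "a * b + 2" into input pin names plus an evaluation function.
// The free identifiers in the expression become the input pins;
// the compiled function stands in for the collapsed graph.
function makeExpressionNode(expr: string) {
    const names = expr.match(/[A-Za-z_]\w*/g) || [];
    const inputs = Array.from(new Set(names)).sort();
    const fn = new Function(...inputs, `return (${expr});`);
    const evaluate = (args: Record<string, number>): number =>
        fn(...inputs.map((n) => args[n]));
    return { inputs, evaluate };
}
```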

You can now add UV parallax to materials using the SplineThicken Material Function.  This makes it look like your object is round!

SplineThicken Example

  • The normals part of the function has been re-touched and they are now transforming correctly for all object rotations. This gives accurate lighting and specular!

New inputs:

  • UVs for Thickness (V2): This lets you specify a different UV channel for storing the thickness (tip to base) gradient. Useful to have this on UV1 or 2 for trees where there might be an overall tree length included not just a single pipe or branch etc.
  • UVs for Texturing (V2): This is the UVs for any textures you want applied to the pipe. You need to include and scale math here so it knows how much to parallax by. This is only needed if you want the 3D parallax correction results.
  • DeriveNormalZ (Bool): When checked, the shader will use DeriveNormalZ to compute the height of the normal. Gives much nicer ‘round’ shape. When false, 0.62 is assumed which is the average height of half a sphere centered at 0. If you want to use CustomUVs to solve the normal, you either need DeriveNormalZ to be false, or you need a row of vertices in the center of the spline mesh. If you do not have the added verts and use CustomUVs, it will say the normal has 0 height across the entire mesh.
  • AngleCorrectedNormals (Bool): Whether to use angle corrected normals when applying the additional normal texture. More accurate but more expensive.
  • AdditionalNormal (V3): Lets you specify an additional normalmap that will piggyback onto the vertexnormal transformation.

New output:

  • UVs with Parallax: This gives the UVs to use for any textures you want to have the 3d parallax.

Currently the function only handles 1 texture coordinate, so if you want multiple textures to have the correction, they all need to use the same scale.


There are a number of new Animation Debug commands at your disposal. First there is in game rendering of a skeletal mesh’s bones:

Animation Debug Features

This is enabled using the ShowDebug Bones console command. As seen above the bones are represented by individual white lines.

  • An alternative look, matching the bones displayed in Persona, can be enabled via the ShowDebugToggleSubCategory 3DBones console command.
  • Next is the animation debug output, which can be enabled using the ShowDebug Animation console command.
  • This is split up into 5 sections, each of which can be toggled on and off using the ShowDebugToggleSubCategory command followed by the category name listed below, e.g. ShowDebugToggleSubCategory SyncGroups
    • SyncGroups: Displays the animation assets currently contributing to the final pose, organised by their sync group (or Ungrouped if they don’t belong to a group). By default Blendspaces listed in this section show all their contributing animations / weights. To reduce screen space used by the output this can be toggled off with ShowDebugToggleSubCategory FullBlendspaceDisplay.
    • Montages: Lists the montages currently being used by the character. The active montage is highlighted in green.
    • Curves: Lists the curve values (in Name: Value pairs) that have been activated by the playing animation(s).
    • Notifies: Displays any notify states that are currently in effect.
    • Graph: Displays the active pose graph. The display starts with the last node (the root node), which represents the final pose, and goes on to list all the nodes that go into making that final pose. Nodes are represented in such a way as to keep their hierarchy, allowing the user to see which nodes are connected to what without having to refer to the original blueprint asset. Active nodes are coloured green and (if they have been toggled to display using ShowDebugToggleSubCategory FullGraph) inactive nodes are coloured grey.

Many useful engine stats can be visualized over the viewport.  You can now access these using the new “Stat” section under the viewport “Show” menu.

Level Viewport Stats

  • You can also toggle most of these stats by typing “Stat <name>” into a debug console prompt. 
  • By default the stats aren’t remembered between sessions, but you can turn that on by enabling “Save Engine Stats” in the editor’s Viewport preferences section.

You can now import .obj files for static meshes!

The file format is very simple so keep in mind that it does not support the following features:

  • Vertex color importing.
  • Collision importing.
  • Tangent and binormal importing.
  • Transforms.
    • The model will be rotated if not modeled with Z up because with OBJ importing we have no way of getting the source coordinate system.
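The limitations above follow from how little the OBJ text format actually carries. As an illustrative sketch (not the engine's importer, and assuming only the plain `v`/`f` record forms without `v/vt/vn` index triples), the core of an OBJ reader is just this:

```cpp
#include <array>
#include <sstream>
#include <string>
#include <vector>

// Illustrative sketch: OBJ's core records are plain-text positions ("v x y z")
// and faces ("f i j k"), which is why vertex colors, collision primitives, and
// tangents/binormals have nowhere to live in the format.
struct ObjMesh
{
    std::vector<std::array<float, 3>> positions;
    std::vector<std::array<int, 3>>   triangles; // 0-based indices
};

ObjMesh ParseObj(const std::string& text)
{
    ObjMesh mesh;
    std::istringstream lines(text);
    std::string line;
    while (std::getline(lines, line))
    {
        std::istringstream tokens(line);
        std::string tag;
        tokens >> tag;
        if (tag == "v")
        {
            std::array<float, 3> p{};
            tokens >> p[0] >> p[1] >> p[2];
            mesh.positions.push_back(p);
        }
        else if (tag == "f") // handles the plain "f i j k" form only
        {
            std::array<int, 3> tri{};
            for (int i = 0; i < 3; ++i)
            {
                tokens >> tri[i];
                tri[i] -= 1; // OBJ indices are 1-based
            }
            mesh.triangles.push_back(tri);
        }
    }
    return mesh;
}
```

Note there is also no coordinate-system record anywhere in the file, which is why the importer cannot know whether the source tool modeled with Z up.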

The FBX importer has now been upgraded to the 2014 version from Autodesk.

  • This allows tangents and binormals on mirrored meshes to be imported correctly.
  • You can still use the earlier FBX plugins found in any Maya/Max version before 2014, but you may get a warning on import when using a very old file.

Developers working out of GitHub can now deploy their projects to Windows XP.

  • To enable this, set WindowsPlatform.SupportWindowsXP to true in UnrealBuildTool, and edit your project’s settings to enable OpenGL shader support.
  • When running on Windows XP, OpenGL is automatically used instead of DirectX 11 for rendering
  • This feature is early in development and will be improved over time.
  • In Blueprints, the new EndPlay function replaces the Destroyed function.  Existing usages of Destroyed will automatically update to the new function.
  • EndPlay does not just fire when an Actor is explicitly destroyed; it also executes any time an Actor ceases to be in the World. This includes a level transition, a streaming level being unloaded, a PIE session ending, or Destroy being called on an Actor.
  • In C++, the AActor::Destroyed virtual function remains, however it is primarily intended for editor transaction purposes.
  • The C++ AActor::EndPlay virtual function takes an enumeration parameter indicating the reason the Actor has been removed from the World.
  • The AActor::OnRemoveFromWorld virtual function, previously called for each Actor when the streaming level they belong to was unloaded, has been removed and its functionality included in AActor::EndPlay.
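To make the EndPlay semantics concrete, here is a minimal standalone model (not engine code; the enumerator names mirror the reasons listed above but are assumptions, not necessarily the engine's exact identifiers):

```cpp
#include <string>

// Standalone model of the new EndPlay semantics: unlike the old Destroyed
// event, EndPlay fires for every way an actor can leave the world, and it
// receives the reason. Enumerator names are illustrative, not engine-exact.
enum class EEndPlayReason
{
    Destroyed,          // Destroy() was called explicitly
    LevelTransition,    // the map changed
    RemovedFromWorld,   // a streaming level was unloaded
    EndPlayInEditor     // a PIE session ended
};

std::string DescribeEndPlay(EEndPlayReason reason)
{
    switch (reason)
    {
        case EEndPlayReason::Destroyed:        return "actor explicitly destroyed";
        case EEndPlayReason::LevelTransition:  return "level transition";
        case EEndPlayReason::RemovedFromWorld: return "streaming level unloaded";
        case EEndPlayReason::EndPlayInEditor:  return "PIE session ended";
    }
    return "unknown";
}
```

In real project code you would override the AActor::EndPlay virtual function and switch on its reason parameter in the same way.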


  • New: Vertex painting now works with Blueprints.
  • New: When attaching actors, you can now use an actor picker to choose which actor to attach to.
  • New: Added check for "Game View" when drawing geometry features in the editor.
  • New: You can now use Alt + [ or ] to adjust the size of the transform gizmo.
  • New: Collections now store and display a custom colour based on the local user settings.
  • New: Added option for Flat Bottomed collision to character components.
  • New: You now have the option to remove content downloaded from marketplace.
  • New: Creating multiple actors using drag and drop from the content browser now undoes as a single transaction.
  • New: Added the ability to refresh the project browser list.
  • New: You can now choose where to place a class added via the New Class Wizard.
  • New: You can now provide a function to get the objects you want to show in the details view when creating an FSimpleAssetEditor.
  • Moved Source Code Access functionality to a plugin.
    • Source code access is now performed on the main thread.
  • Changed Static Meshes to check Screen Size rather than Distance to calculate which LOD to use.
  • Changed renaming an actor so it now ignores leading or trailing space when validating the name.
  • Fixed static lighting to now be disabled for Instanced Static Mesh Components.
  • Fixed project template names and descriptions to now fall back to English if a valid translation cannot be found for the current language.
  • Fixed the Submit Files Dialog to now follow the same style as the rest of the engine.
  • Fixed the Cascade ToggleRealtime tooltip to match the other Editor Viewports.
  • Fixed Particle System Components to now toggle visibility correctly.
  • Disabled Ctrl-select when using vertex paint.
  • Content Browser
    • New: Enter and Space Bar keys can now be remapped for the Content Browser.
  • Material Editor
    • New: Release Stats and Build In Stats toolbar buttons now have icons.
    • New: Added Find Results tab to Material Editor.
    • New: Added shortcut for component mask on Shift+C.
  • Texture Editor
    • Changed the Use Specified Mip Level property (bUseSpecifiedMipLevel) to now default to false, and the number of cinematic mips is ignored by the UI when it's true.
  • Persona
    • Changed the default and details inspectors so they are now disabled when editing a read only graph.
    • Fixed the erroneously set flag CPF_DisableEditOnTemplate by adding code to clear it.
  • Source Control
    • New: Added ability to sync directories from source control in the Editor.
  • New: Added P4 API 2014.2 with OpenSSL 1.0.1g.
  • BSP
    • New: A BSP actor is now deselected when none of its surfaces are selected.
    • Fixed the Slate Property Editor Combo element (SPropertyEditorCombo) to update the selection before opening itself.
  • UI
    • Fixed restoring from full screen to now set the window position on the native window.
  • Viewports
    • New: In-game Slate UI is now visible in editor viewports when in immersive mode.
  • Scene Outliner
    • New: Added filter that hides actors that aren’t in the current level.
    • New: There is now a menu option for scene outliner folders to select all the actors within that folder.
    • New: Duplicate now functions with hidden actors.
    • Changed the 'eye' visibility icon to act on the current selection, if the row is selected.
    • Removed viewport-position menu items from the scene outliner menu.
  • Landscape
    • New: Implemented navigation geometry export for instanced static meshes.
    • New: Added support for landscape splines to FBX export.
    • Removed sculpting-only options from flatten tool when in paint mode.
  • Audio
    • New: Integrated Omni Radius from Unreal Engine 3.
  • Animation
    • New: Native Anim Notifies and states are now supported.
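One change in the list above, Static Meshes now checking screen size rather than distance for LOD selection, is worth a sketch. The idea is to estimate what fraction of the screen the mesh's bounding sphere covers and compare that against per-LOD thresholds, so a huge mesh far away keeps detail while a tiny mesh nearby drops it (illustrative standalone C++, not engine code; the threshold values are made up):

```cpp
#include <cmath>
#include <vector>

// Estimate what fraction of the screen a bounding sphere covers: the sphere's
// angular radius over the half field of view, clamped to 1.
float ScreenSizeFraction(float sphereRadius, float distance, float fovRadians)
{
    float angular  = std::atan2(sphereRadius, distance);
    float fraction = angular / (fovRadians * 0.5f);
    return fraction > 1.0f ? 1.0f : fraction;
}

// lodThresholds is sorted descending, e.g. {0.5f, 0.25f, 0.1f}: LOD0 is used
// while the mesh covers at least half the screen, LOD1 above a quarter, and
// so on; three thresholds split coverage into four LOD buckets.
int SelectLod(float screenFraction, const std::vector<float>& lodThresholds)
{
    for (int i = 0; i < static_cast<int>(lodThresholds.size()); ++i)
    {
        if (screenFraction >= lodThresholds[i])
            return i;
    }
    return static_cast<int>(lodThresholds.size()); // tiny on screen: last LOD
}
```

Compared with raw distance thresholds, this automatically accounts for mesh size and camera field of view, which is why screen size is the more robust metric.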

iOS, Android, and HTML5

  • Lots of iOS, Android, and HTML5 improvements – full details will be in the final release notes.


  • Fixed duplicate move calls occurring on Blueprint Graph nodes.


  • New: The build scale is now taken into account when calculating a Static Mesh's streaming texture factors.
  • New: Particle parameter distributions can now be used in conjunction with Color Over Life for GPU sprites.
  • New: Mesh Modifies Material Position now uses the result of the material translator.
  • New: Particle Lights now work with camera offset (per-view).
  • New: Added commandlet which lists all Static Mesh assets that were imported from SpeedTree.



An impressive number of new features in this update. The biggest have to be the ability to render UI components onto in-game objects, as well as the new vehicle support.  The new camera animation system is also an impressive addition.  One of the smaller changes, but probably the most important to many indie developers, is the addition of OBJ support for static mesh creation.  This opens up Unreal Engine to a whole realm of 3D creation tools that have no ( or poor ) FBX support, as the $4,000+ price tag on various Autodesk applications can make for a bitter pill.


I have to say, I continue to be amazed at how fast they are updating Unreal Engine.  Hopefully this update drops very soon.

