18. July 2014

I just received the following email from Unreal:

 

Translucent Fog Demo

Unreal Engine 4.3 Released!

 

More than 500 updates ship in this release! Unreal Engine 4.3 includes greatly improved mobile support, awesome new rendering features, improved Blueprint workflows, and strides toward an excellent experience on Mac and laptops.

 

Check out the new World Composition tools, spline features, and the preview of Paper2D, our 2D toolset! You also get SpeedTree 7 support, our work on Metal API for iOS 8 to date, and new Oculus Rift features such as time warping.

 

There’s no limit to what you can do with Unreal Engine 4 for only $19 per month.

 

Paper2D Side Scroller Template

Have fun with the new side-scroller template game as you become acquainted with Paper2D.


 

VR Couch Knights

We love VR, and Unreal Engine 4.3 supports the new Oculus DK2 out of the box! Dive into Epic’s popular “Couch Knights” demo which has been making the rounds at GDC and other shows.


SpeedTree 7

SpeedTree 7 support is here, and UE4 trees are 33% off in the SpeedTree store through July 26!


Rendering Goodies

Rendering goodies include distance field ambient occlusion, skylight global illumination and shadowed translucency.


Behavior Trees

Better AI tools! Switch to the new Blackboard mode inside the Behavior Tree Editor to edit and debug Blackboard entries.


Large World Support!

Large world support! Check out the new World Composition tools. Create sub-levels and position them anywhere.


Customize Your Static Mesh Collision!

Customize your static mesh collision!


Spline Editing

Edit splines directly within your levels!


 

Build games and apps for Windows, Mac, iOS, Android, PlayStation 4, Xbox One, Linux, SteamOS, HTML5 and VR platforms.

Get Unreal for $19/Month

Mobile Developers!

Zen Gardens

Google recently demonstrated the graphics power of L, the upcoming release of Android, using Epic's Rivalry demo running on Tegra K1 at Google I/O. Mobile is a huge focus for UE4, and we hope you’ll enjoy all the latest improvements!


 

 

The UE4 Roadmap continues to evolve, and we encourage you to vote for the features that you want to use.

 

To ask questions and share feedback, please visit the forums or join our live broadcasts at Twitch.tv every Thursday at 2pm ET, which you can always go back and view at YouTube.com.

 

Hats off to the developers who contributed to this great release! Those who helped are forever immortalized in the Credits section under the editor’s Help menu.

 

Thank you for being a part of this adventure. We can’t wait to see what you build next.

 

We are one step closer to full Paper2D support; Paper2D is Unreal’s 2D toolset.  Oculus Rift support is no doubt cool for those developing for the Rift.  Not quite as impressive as the last release, but still a good amount of progress from Unreal.

News


16. July 2014

 

Or…

How to take a Blender model you downloaded from the web and make it actually usable in your game in 28 easy steps!

 

… granted, the second title doesn’t have the same flow to it, does it?

 

I just had to run through this process and figured I would share it, as it is something that comes up fairly often.  When working with Blender, there are dozens of procedural textures available that can produce some very nice results quickly.  The only problem is that when you get your asset out of Blender and into your game engine, things suddenly go horribly wrong: those textures only make sense inside of Blender.  Fortunately, through the magic of baking, you can easily convert them into a texture map usable in any game engine.

 

Let’s take a look how.

 

First we need a model.  I am using a beautiful new model that was recently added to Blend-Swap.  It’s a free download but you need to register.  Don’t worry, you can use a real email address, they don’t spam, or at least haven't so far.  The model in question looks like this:

 

image

 

Unfortunately when we load it in Blender we quickly learn this model is in no way game ready.  Let’s take a look:

image

 

Ick.  So instead of a single Mesh, we have a dozen individual meshes.  Problem is, we need to unwrap them as a single object, so let’s join them all together.  First let’s get the camera out of the default layer.

 

If you look at the way this particular Blend is set up, there are currently two layers: the second contains the armature, the first contains everything else.

image

 

Let’s get the camera out of there.  Select the camera object then hit the M key.  Then select the layer you want to move the camera to, like so:

image

 

Now click the first layer ( bottom left box ); it should now contain only geometry.

 

We want to join everything together.  Press ‘A’ to select everything in the layer, then hit “Ctrl + J” to join everything into a single set of geometry.  Now it should look something like this:

image
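If you prefer scripting to clicking, a rough equivalent of this step from Blender’s Python console might look like the following ( just a sketch, assuming the Blender 2.7x bpy API; the tutorial itself does everything through the UI ):

import bpy

# Select every mesh object on the visible layer, pick one as the active object,
# then join them all into a single mesh ( same as 'A' followed by Ctrl + J )
meshes = [ob for ob in bpy.context.scene.objects if ob.type == 'MESH']
for ob in meshes:
    ob.select = True
bpy.context.scene.objects.active = meshes[0]
bpy.ops.object.join()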

 

Perfect, now we can unwrap our model.  Switch into EDIT mode

image

 

Press ‘A’ again, until all faces are selected, like so:

image

 

Now we unwrap our model.  Select Mesh->UV Unwrap-> Unwrap ( or Smart UV Project ).
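For reference, the same unwrap can be done from Python ( again a sketch assuming the 2.7x bpy API ):

import bpy

# Enter Edit mode, select all faces and run the Smart UV Project unwrap
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()   # or bpy.ops.uv.unwrap() for a plain unwrap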

 

Switch your view to UV/Image Editor

image

 

It should look something like this:

image

 

Now create a New Image:

image

 

This image is where we are going to render our texture to.  Here are the settings I used.  Remember, game engines like power-of-two textures.

image
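Scripted, creating the target image and assigning it to the unwrapped faces looks roughly like this ( 1024x1024 is just an assumed size; run it from Object mode ):

import bpy

# Create a new power-of-two image to bake into
img = bpy.data.images.new("RenderedTexture", width=1024, height=1024)

# Assign it to every face of the active UV layer so the bake has a target
mesh = bpy.context.object.data
for face in mesh.uv_textures.active.data:
    face.image = img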

 

Ok, now let’s look at the actual render to texture part.  Take a quick look at how the model is currently shaded:

image

 

Frankly none of those are really game engine friendly.  So let’s render all of those materials out to a single texture.  Go to the render tab

image

 

Scroll down and locate Bake.

In the UV/Image Editor window, make sure everything is selected ( using ‘A’; everything should be highlighted in yellow ).  At this point, with your generated image and all the UVs selected, it should look like this:

image

 

 

Now under bake, set the following settings:

image

The key values being Bake Mode = Full Render and Selected to Active checked.  Now click the Bake button.
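The scripted equivalent of those bake settings would be something along these lines ( a sketch, assuming the Blender Internal baker in 2.7x ):

import bpy

scene = bpy.context.scene
scene.render.bake_type = 'FULL'                   # Bake Mode = Full Render
scene.render.use_bake_selected_to_active = True   # Selected to Active
bpy.ops.object.bake_image()                       # same as clicking the Bake button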

 

Up in the top part of the Blender window, you should see a progress bar like so:

image

 

 

Now if you go back to the UV/Image viewer, and select your image RenderedTexture, you should see:

image

 

Cool!

 

Let’s save the result to an external ( game engine friendly ) texture.  Select Image->Save as Image.  Save the image somewhere.  Remember where.

image
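Or, scripted ( the path is only an example; img is the image created earlier ):

# '//' means relative to the .blend file
img.filepath_raw = "//RenderedTexture.png"
img.file_format = 'PNG'
img.save()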

 

 

Now let’s modify the textures on our model to use only our newly generated texture map.  First, in the 3D View, switch back to Object Mode from Edit mode.

Then, open the materials tab:

image

 

Select each material and hit the – ( or killswitch engage! ) button.  So it should ultimately look like this:

image

 

Now hit the + button and create a new Material.  Then click the New button.

image

 

The default values for the material should be OK, but depending on your game engine, you may have to enable Face Textures:

image

 

Now click over to the Texture tab.  Click New.

image

 

Drop down the Type box and select Image or Movie.

image

 

Scroll down to the Image section and select Open.  Pick the image you saved earlier.

image

 

Now scroll down to Mapping, drop down Coordinates and select UV.

image

 

Under Map select UVMap.

image
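If you wanted to script the whole material and texture setup instead of clicking through it, a rough Blender Internal equivalent ( 2.7x API assumed, reusing the img object from the earlier snippet ) is:

import bpy

obj = bpy.context.object

# New material with the Face Textures option enabled, as described above
mat = bpy.data.materials.new("BakedMaterial")
mat.use_face_texture = True

# New image texture pointing at the baked image
tex = bpy.data.textures.new("BakedTexture", type='IMAGE')
tex.image = img

# Hook the texture to the material using UV coordinates and the UVMap layer
slot = mat.texture_slots.add()
slot.texture = tex
slot.texture_coords = 'UV'
slot.uv_layer = "UVMap"

# Attach the new material to the mesh ( the old ones were removed above )
obj.data.materials.append(mat)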

 

Now if you go to the 3D View and set the view mode to Texture:

image

 

TADA!  A game ready model.

 

One word of caution though, if you render this scene in Blender you will get the following result:

image

 

Don’t worry.  That’s just a byproduct of going from Blender materials to texture mapping.  If you want the texture to be seen, you need to add some lights to the scene, or change the material so it has an Emit value > 0, so it provides its own light source.
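Setting that from Python is a one-liner on the material ( assuming the mat object from the snippet above ):

mat.emit = 0.92   # any value greater than 0 makes the texture visible without lights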

 

With Emit set to .92, here is the result if you render it:

 

image

 

Now, what about in game?

 

Let’s create a simple LibGDX project that loads and displays our exported model:

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;


public class ModelTest implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;

    @Override
    public void create() {
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());

        camera.position.set(3f,0f,6f);
        camera.lookAt(0f,1f,0f);

        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        UBJsonReader jsonReader = new UBJsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        model = modelLoader.loadModel(Gdx.files.getFileHandle("robot.g3db", FileType.Internal));
        modelInstance = new ModelInstance(model);

        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

        camera.update();

        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}

 

And we run it and:

image

 

Wow, a model downloaded randomly from the Internet actually working in the game engine!  How often does that actually happen? ;)

Programming Art


15. July 2014

 

There has been a flood of Three.js books on Safari lately, including Essential Three.js and Game Development with Three.js.  Three.js is a JavaScript based 3D library using WebGL ( or not, if it isn’t available ).  More importantly, it’s just really fun to play with!  Something about working in full 3D in a scripting language is just really satisfying.  I’ve only just been playing and really don’t have a clue what I’m doing, but I figured I would share my results.  As I have been on a TypeScript kick lately, I’ve been writing in TypeScript instead of plain JavaScript, but frankly the differences are fairly minimal.  You can get the TypeScript definitions on DefinitelyTyped.

 

I think I should make something perfectly clear… I have NO idea what I am doing, I am simply playing around.  This isn’t a WebGL tutorial by any definition of the word, just me having skim-read a couple of books and played around with a new technology, nothing more.  So if you look at some code and think “damn that looks hacky” or “isn’t that a really stupid thing to do?” the answer is probably yes! :)

 

So, disclaimer given, let’s jump right in. 

 

Since this is a web app, we need a host HTML page.  So, here is ours:

<!DOCTYPE html>

<html lang="en">
<head>
    <meta charset="utf-8" />
    <title>ThreeJS Test</title>
    <script src="http://cdnjs.cloudflare.com/ajax/libs/three.js/r67/three.js"></script>
    <script src="app.js"></script>
</head>
<body>
<h1>ThreeJS Test</h1>

<div id="content" style="width:500px;height:500px"></div>
</body>
</html>

 

Nothing really shocking here.  We include three.js using the Cloudflare content delivery network.  You could of course download the library locally and deploy it from your own servers, but I assume you don’t have servers situated around the world, so a CDN will generally beat your own server’s performance.  Next we include app.js, the generated output from our TypeScript application.  In the actual HTML we create a 500x500 DIV named content for, predictably enough, our content!

 

Now let’s take a look at a super simple example app, app.ts:

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    constructor(){
        this.renderer = new THREE.WebGLRenderer({ alpha: true });
        this.renderer.setSize(500,500);
        this.renderer.setClearColor(0xFF0000,1);
        document.getElementById('content').appendChild(this.renderer.domElement);
    }

    start() {
        this.renderer.clear();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

Here in the constructor we create a WebGLRenderer, size it, set the background color to red ( using HTML format hex color coding ) then wire the renderer to the content div.

 

When you run it you should see:

image

 

Cool, our first Three.js application.  Now let’s do something 3D!  Let’s start by creating a camera and rendering a built-in 3D object in wireframe.  It’s commented heavily, so I won’t be explaining what is going on.  If you are curious why I did something, leave a comment.

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    scene: THREE.Scene;
    camera: THREE.Camera;

    constructor(){
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500,500);
        this.renderer.setClearColor(0xFFFFFF,1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0,0,-20);
        this.camera.lookAt(new THREE.Vector3(0,0,0));

        // Create the geometry for a sphere with a radius of 5
        var sphereGeometry = new THREE.SphereGeometry(5);

        // Create a wireframe material that's blueish
        var sphereMaterial = new THREE.MeshBasicMaterial(
            {color: 0x7777ff, wireframe: true});

        // Now make a THREE.Mesh using the geometry and a shader
        var sphere = new THREE.Mesh(sphereGeometry,sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0,0,0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);
        this.renderer.render(this.scene,this.camera);
    }

    start() {
        // Well, aren't I a bit pointless?
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

And when we run it we get:

 

image

 

Cool!  Now time for some texturing ( and as a result, lighting ).

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;

    constructor() {
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0, 0, -20);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create the geometry for a sphere with a radius of 5
        // This time we cranked up the number of horizontal and vertical sections
        // to make a higher resolution globe
        var sphereGeometry = new THREE.SphereGeometry(5, 20, 20);

        // This time we create a Phong shader material and provide a texture.
        var sphereMaterial = new THREE.MeshPhongMaterial(
            {
                map: THREE.ImageUtils.loadTexture("earth_sphere.jpg")
            }
        );

        // Now make a THREE.Mesh using the geometry and a shader
        var sphere = new THREE.Mesh(sphereGeometry, sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0, 0, 0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);

        // We need some light so our texture will show, add an ambient light to the scene
        this.scene.add(new THREE.AmbientLight(new THREE.Color(0.9,0.9,0.9).getHex()));
        this.renderer.render(this.scene, this.camera);
    }

    render() {
        // Each frame we want to render the scene again
        // Use typescript Arrow notation to retain the thisocity passing render to requestAnimationFrame
        // It's possible I invented the word thisocity.
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        // Not so pointless now!
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

Bet you can't guess what texture I went with!

 

image

 

So apparently textured 3D objects are nothing difficult.

 

This is getting pretty long, so I’ll cut it off here.  Next up I’m going to look at getting a Blender object rendering in Three.JS.

Programming


9. June 2014

 

I just received the following email from Autodesk:

 

SAN FRANCISCO, June 9, 2014 -- Autodesk, Inc. (Nasdaq: ADSK) has acquired Stockholm-based Bitsquid AB, the creator of the Bitsquid game engine. The acquisition brings to Autodesk expertise in 3D game development and proven technology that will enable Autodesk to supercharge its portfolio of tools for game makers through the development of a new 3D game engine. Multiple game developers have used the modern and flexible Bitsquid engine to create 3D games for next-generation consoles and PCs, and Autodesk will continue to work with many of these companies to develop the new 3D game engine. Terms of the acquisition were not disclosed.

"Bitsquid has been a key success factor for Fatshark, as we’ve been able to produce high quality games with short development times,” said Martin Wahlund, CEO, Fatshark. "We are excited to see how Bitsquid evolves now that it is part of Autodesk.”

In addition to acquiring the Bitsquid game engine, the acquisition of the Bitsquid team and technology will enable Autodesk to create new tools that push the limits of real-time 3D visualization for architects and designers, many of whom face challenges placing design data into real world contexts. The new technology will also be incorporated into solutions for customers outside of the games industry, including architecture, manufacturing, construction, and film. Autodesk plans to create new types of design exploration tools that allow visualization and contextualization of designs using the same fluid control and immediate feedback that exist today in modern console and PC games.

"Autodesk's acquisition of Bitsquid will revolutionize real-time exploration of complex data. Imagine being able to walk through and explore any type of design, from buildings to cars, with the same freedom you experience in the open world of a next-generation console game. Game engine technologies will be an increasingly critical part of the workflow, not only for creating games, but also for designing buildings or solving complex urban infrastructure challenges," said Chris Bradshaw, senior vice president, Autodesk Media & Entertainment. "The Bitsquid acquisition brings to Autodesk both the expertise and the technology that will enable us to deliver a groundbreaking new approach to 3D design animation tools, and we welcome the team and community to Autodesk."

Additional information on the new Autodesk 3D game engine, which will complement Autodesk's industry leading games portfolio of middleware tools and 3D animation software including Autodesk Maya LT, Autodesk Maya and Autodesk 3ds Max, will be available later this year.

 

From that press release, it sounds like a relatively minor acquisition; they could simply be rolling the technology into one of their existing products.  However, if you read this site, they obviously have bigger plans:

 

More than just games – This is going to be BIG

With the acquisition of Bitsquid, Autodesk is bringing expertise in 3D game development and proven game engine technology in house. We are significantly expanding our portfolio of game making tools, complementing our middleware and 3D animation tools: Autodesk® 3ds Max®, Autodesk® Maya®, and Autodesk® Maya LT™ software. Across Autodesk, this technology will fuel new product development in our Media & Entertainment business, and enable a new class of design animation tools.

Tools for Game Makers

Later this year, Autodesk will introduce a modern and flexible 3D game engine based on the Bitsquid engine. By introducing a game engine, Autodesk can offer game makers a more complete game creation workflow from concept to release.

A New Era in Design Animation

Many of our manufacturing, architecture, building, and construction customers are also excited about game engine technology– but not because they are making games. Instead, they are looking for new ways to visualize and interact with design data with the same level of control and feedback of modern console or PC games. With the acquisition of Bitsquid, Autodesk will begin exploring the creation of a new interactive design exploration platform, integrated with our design tools, which will help designers contextualize their ideas.

In Film and Television

Autodesk will also be looking at how Bitsquid technology may be applied to workflows such as pre-visualization and interactive compositing.

 

Bolded portion mine.  So Autodesk is clearly entering the game engine space and building it around BitSquid.  Ever heard of it?  Yeah, me neither.  It is, however, the engine powering Magicka: Wizard Wars:

image

 

Their site is fairly minimal, but describes the BitSquid engine as:

 

Fast

Bitsquid is a new high-end game engine, built from the ground up to focus on excellent multicore performance, cache friendly data layouts and advanced rendering techniques.

Dynamic

Bitsquid supports immediate reload of all resources, both scripts and content. You can also test run levels instantly, on PCs, consoles, phones and tablets.

Flexible

The engine is completely data driven, making it easy to create a highly scalable rendering pipe that shines on both the latest DX11 GPU and mobile devices, just by changing configuration files. And any game or simulation can be authored using just Lua and visual scripting. Of course you can use C as well, where you need the speed.

Lightweight

Written with a minimalistic modular design philosophy the entire engine is less than 200 KLOC and easy to modify.

 

The technical blog however makes for an interesting read.

 

Autodesk entering the game space isn’t really a huge shock.  They actually dipped their toe in the pond when they released Scaleform as an Indie game engine.  Considering their heavy role in the game development pipeline ( due mostly to Max and Maya ), this move does make sense.  The question is, will this alienate their existing partners?

 

EDIT: One thing I didn’t mention in the original post.  Autodesk also announced it is lowering the cost of Maya LT from $50 a month to $30 a month.  Additionally, they have made Mudbox available for $10 a month.  This seems like a much better price point to me.  You can now get Photoshop ($30), Maya LT ($30) and Unreal ($19), a complete game development package, for less than $80 a month.  Compare that to prices a few years ago and it is simply mind blowing!

 

Additionally, Develop have published an interview with Autodesk’s Frank Delise discussing the acquisition. 

News


2. June 2014

EDIT:  For a better understanding of Apple’s Metal API and what it means for OpenGL, click here. 

So finally we are getting some developer-related announcements out of the Apple Developer Conference.  For game developers, today’s announcement is a doozy.  The iOS 8 SDK includes 4,000 new API calls, but most importantly includes Metal, a new lower-level graphics API similar to AMD’s Mantle.  The idea is to get closer to the metal ( thus the name ) and remove the overhead of OpenGL:

 

Gaming on iOS takes a huge leap forward in iOS 8 with Metal, a new graphics technology that maximizes performance on the A7 chip. With its dramatic 10 times improvement in draw call speed, Metal enables leading game providers for the first time to bring console-class 3D games to mobile devices. For casual games, iOS 8 now features SceneKit, making it easy to create fun 3D games, along with major enhancements to SpriteKit, including field forces, per-pixel physics and inverse kinematics.

 

A 10 times performance improvement over OpenGL?  That sounds like marketing BS to me, or describes an edge case.  If OpenGL was that bloated it would have died off years ago.  The important takeaway is that it’s A7 only, so the newest iPads and iPhones are the only devices that support it.  Unity, Crytek and Unreal are all expected to support it, so it should be pretty transparent to most developers.

 

The other major announcement was Swift:

 

Swift is a powerful new programming language for iOS and OS X® that makes it easier than ever for developers to create incredible apps. Designed for Cocoa® and Cocoa Touch®, Swift combines the performance and efficiency of compiled languages with the simplicity and interactivity of popular scripting languages. By design, Swift helps developers write safer and more reliable code by eliminating entire categories of common programming errors, and coexists with Objective-C® code, so developers can easily integrate Swift into their existing apps. Xcode® Playgrounds make writing Swift code incredibly interactive by instantly displaying the output of Swift code.

 

The iOS beta software is available now for registered Apple developers.  Xcode 6 is required to support the Swift programming language.  You can learn more about Swift here.  I LOVE new programming languages, so I will certainly be taking a closer look.  Some Apple-touted features of Swift are:

 

Swift has many other features to make your code more expressive:

  • Closures unified with function pointers
  • Tuples and multiple return values
  • Generics
  • Fast and concise iteration over a range or collection
  • Structs that support methods, extensions, protocols.

 

… interesting.  I hate ObjC, so an alternative is certainly appreciated. 

