Game Development Tutorial: Swift and SpriteKit Part 6 - Working with Physics Part 1

25. July 2014

 

Now we are going to look at using physics in our SpriteKit based game.  If you’ve worked with a physics engine before, a lot of this is going to feel very familiar.  One of the key differences from many other engines is that SpriteKit handles updating the graphics as well as the physics.  It’s a fairly involved process, so I am going to split this across multiple posts.

 

The first one is going to be short and sweet.  We are going to configure the physics of a ball, an SKShapeNode.  It is simply going to exist and let gravity take its course.  Code time:

 

import SpriteKit

class GameScene: SKScene {

    override func didMoveToView(view: SKView) {
        // Center the scene's origin so (0,0) is the middle of the screen
        view.scene!.anchorPoint = CGPoint(x: 0.5, y: 0.5)

        // Build a circular path and hand it to an SKShapeNode
        var shapeNode = SKShapeNode()
        var path = CGPathCreateMutable()
        CGPathAddArc(path, nil, 0, view.bounds.height/2, 45, 0, CGFloat(M_PI * 2), true)
        shapeNode.path = path
        shapeNode.lineWidth = 2.0

        // Give the node a circular physics body; dynamic means the simulation moves it
        shapeNode.physicsBody = SKPhysicsBody(circleOfRadius: 45.0)
        shapeNode.physicsBody.dynamic = true

        // Make gravity accelerate things by 1 unit per second squared along the y-axis
        self.physicsWorld.gravity.dy = -1

        self.addChild(shapeNode)
    }
}

 

And once run:

 

 

Well, that was simple enough.  Let’s take a quick run through the code.

 

We start off by creating an SKShapeNode.  This shape node is defined by a CGPath.  You can think of a CGPath as a set of drawing instructions.  In this case it consists of a single arc that draws a circle.  Once our path is created we set it as our shapeNode’s path.  We set the lineWidth to 2 to make it a bit more visible.
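
If the “drawing instructions” idea seems abstract, here is a minimal sketch of a different path ( a triangle, with made-up coordinates ) that could be assigned to the shape node instead:

        // A CGPath is just a recorded list of drawing commands
        var trianglePath = CGPathCreateMutable()
        CGPathMoveToPoint(trianglePath, nil, 0, 50)        // start at the top
        CGPathAddLineToPoint(trianglePath, nil, 50, -50)   // line to the bottom right
        CGPathAddLineToPoint(trianglePath, nil, -50, -50)  // line to the bottom left
        CGPathCloseSubpath(trianglePath)                   // close back to the start
        // shapeNode.path = trianglePath would draw a triangle instead of the circle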

 

Next we define the physicsBody.  Every SKNode can have an SKPhysicsBody.  The SKPhysicsBody is the node’s representation in the physics engine.  When defining a physics body you pick the shape that most closely matches your underlying shape.  In this case it’s a no brainer to use a circle.  There are constructors available for rectangles, polygons, edges, etc.  Of all the options, however, the circle is the least processing intensive.  So if you need just a rough physical representation, prefer the simpler types ( circle, then rectangle ) and leave the more complex types until required.  The key thing here is the dynamic property.  This is what tells SpriteKit to include your SKNode in the physics simulation.  If this isn’t set, nothing is going to happen!
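
For reference, here is a quick sketch of some of the other body constructors mentioned above ( the sizes are arbitrary example values ):

        // Circle: the cheapest shape to simulate
        let circleBody = SKPhysicsBody(circleOfRadius: 45.0)
        // Rectangle: still cheap
        let boxBody = SKPhysicsBody(rectangleOfSize: CGSize(width: 90, height: 90))
        // Edge loop: a hollow boundary for other bodies to collide with
        let edgeBody = SKPhysicsBody(edgeLoopFromRect: view.frame)
        // Polygon: built from a convex CGPath, the most expensive of the bunch
        // let polyBody = SKPhysicsBody(polygonFromPath: someConvexPath)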

 

Finally we set the gravity value dy to -1.  This means gravity accelerates things by -1 unit per second squared along the y axis.  Notice the concept of “unit” here, it is very important.  When dealing with SpriteKit ( and many other physics engines ), a unit is completely arbitrary.  So you may ask “1 what?”.  The answer is, 1 whatever… just be consistent about it.  In your code you could choose 1 to be a millimetre, an inch, a foot, a light year.  A lot of it comes down to the type of game you are working on.  One thing to keep in mind here though: massive ranges in value are not a good thing… so don’t mix units.  For example, trying to represent something in millimetres and kilometres in the same simulation is a sure recipe for disaster!
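
As a sketch of that consistency idea: if you decided 1 unit meant 1 metre, you could approximate Earth-like gravity by assigning the whole vector rather than just dy ( -9.8 being the usual metres per second squared value ):

        // Treating 1 unit as 1 metre, this approximates real-world gravity
        self.physicsWorld.gravity = CGVector(dx: 0, dy: -9.8)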

 

Now let’s give our ball something to collide with… that being the edge of the window.  While we are at it, let’s add a bit of bounce to our step:

 

import SpriteKit

class GameScene: SKScene {

    override func didMoveToView(view: SKView) {

        // Create the ball
        var shapeNode = SKShapeNode()
        var path = CGPathCreateMutable()
        CGPathAddArc(path, nil, 0, 0, 45, 0, CGFloat(M_PI * 2), true)
        CGPathCloseSubpath(path)
        shapeNode.path = path
        shapeNode.lineWidth = 2.0
        shapeNode.position = CGPoint(x: self.view.frame.width/2, y: self.view.frame.height)

        // Set the ball's physical properties
        shapeNode.physicsBody = SKPhysicsBody(circleOfRadius: shapeNode.frame.width/2)
        shapeNode.physicsBody.dynamic = true
        shapeNode.physicsBody.mass = 1
        shapeNode.physicsBody.friction = 0.2
        shapeNode.physicsBody.restitution = 1

        // Now make the edges of the screen a physics object as well
        scene.physicsBody = SKPhysicsBody(edgeLoopFromRect: view.frame)

        // Make gravity accelerate things by 1 unit per second squared along the y-axis
        self.physicsWorld.gravity.dy = -1

        self.addChild(shapeNode)
    }
}

 

And run this:

 

 

 

Note, the jerkiness you see isn’t from the physics, but from the animated gif.  Getting that to encode to a reasonable size was oddly a bit of a battle.

 

The key difference here is, first of all, that we set the edges of the screen as a physics object for our ball to collide against.  A key thing to remember with SpriteKit: everything is an SKNode, even the scene itself!

 

Next we set a couple of key physics properties for our ball: mass, friction and restitution.  Mass is essentially the weight of the object… going back to the age old question, what falls faster… a ton of feathers, or a ton of bricks?  Friction is how two surfaces react to each other, generally used when simulating “sliding”, and is fairly irrelevant in this example.  Restitution is the key value.  For lack of a better explanation, restitution is the bounciness of the object.  Lower value, less bounce; higher value, higher bounce.  A value higher than 1 will result in an object actually gaining momentum after a collision ( aka, bouncing higher UP than it fell from ).
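
To experiment, you only need to vary that one property on the ball from the listing above, for example:

        shapeNode.physicsBody.restitution = 0     // lands with a thud and stays put
        shapeNode.physicsBody.restitution = 0.5   // bounces noticeably lower each time
        shapeNode.physicsBody.restitution = 1.1   // gains energy and bounces ever higher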

 

Next up we will work on actual collisions.





Playing around with Three.JS -- Part Two of a not quite a tutorial series

23. July 2014

 

In Part One we looked at the basics of working with the Three.js graphics library.  We got as far as creating a camera and a textured 3D object.  Now is the true test of ease of use… getting a 3D model exported from Blender and displayed in our browser.  HTML libraries face an even bigger burden here, as their access to the local file system isn’t as seamless as it is for most other games.  Simply opening up an FBX or DAE file isn’t an option.  Let’s take a look at how Three.js works around this issue.

 

 

First things first, I needed a Blender Blend file to work with.  This actually led me down a bit of a road, resulting in a post about taking a Blend file from the web and making it game ready.  Anyways, I started with this file, merged the geometry, UV mapped it, and baked the Blender materials to a single texture map.  I am not entirely sure if I can share the resulting file or not, so you may have to provide your own Blender file or follow the linked tutorial to generate one of your own.

 

Anyways, this is what we are starting with…

image 

 

Let’s see how close we can get with Three.js.

 

The first obvious question is… how the hell do we get this model into Three.js from Blender?

Well, the answer is a plugin.  Follow the installation directions; however, in my case the path was wrong.  For Blender 2.71, my actual plugin directory is C:\Program Files\Blender Foundation\Blender\2.71\scripts\addons\io_mesh_threejs.

 

There is one very critical thing to be aware of here… when downloading the files from Github, be certain to download the RAW format:

image

 

This particular mistake caused me a bit of pain, don’t make the same mistake!

 

Once you’ve copied each of these three files, configure the plugin in Blender.  If Blender is running, restart it.

Now select File->User Preferences:

image

 

In the resulting dialog select Addons, then in the search box type “three”.  If it installed correctly it will show on the right.  Click the checkbox to enable the plugin.

image

 

Now if you check the File->Export menu, you should see Three.js as an option.

image

 

When exporting you can clearly see there are a ton of options:

image

 

The options I selected above are for exporting just the mesh and materials.  No animation data, lights, cameras, etc…  Scaling and Flip YZ both depend on the orientation of your game engine.

 

This exporter creates a JSON .js file like this one:

 

{

	"metadata" :
	{
		"formatVersion" : 3.1,
		"generatedBy"   : "Blender 2.7 Exporter",
		"vertices"      : 8,
		"faces"         : 6,
		"normals"       : 2,
		"colors"        : 0,
		"uvs"           : [24],
		"materials"     : 1,
		"morphTargets"  : 0,
		"bones"         : 0
	},

	"scale" : 1.000000,

	"materials" : [	{
		"DbgColor" : 15658734,
		"DbgIndex" : 0,
		"DbgName" : "Material",
		"blending" : "NormalBlending",
		"colorAmbient" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorDiffuse" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorEmissive" : [0.0, 0.0, 0.0],
		"colorSpecular" : [0.5, 0.5, 0.5],
		"depthTest" : true,
		"depthWrite" : true,
		"mapDiffuse" : "crate.jpg",
		"mapDiffuseWrap" : ["repeat", "repeat"],
		"shading" : "Lambert",
		"specularCoef" : 50,
		"transparency" : 1.0,
		"transparent" : false,
		"vertexColors" : false
	}],

	"vertices" : [1,-1,0,1,0,1,-1,0,0,0,-1,0,1,0,0,0,1,1,-1,1,0,0,0,0],

	"morphTargets" : [],

	"normals" : [0.577349,0.577349,0.577349,0.577349,0.577349,-0.577349],

	"colors" : [],

	"uvs" : [[0.988679,0.99767,0.988677,0.016243,0.007251,0.016244,0.007252,0.
	997671,0.989755,0.017099,0.989755,0.998526,0.008328,0.998526,0.008328,0.017099,
	0.990714,0.989755,0.009287,0.989755,0.009286,0.008328,0.990713,0.008328,0.
	000516,0.993662,0.981943,0.993661,0.981942,0.012235,0.000516,0.012235,0.987766,
	0.997568,0.987766,0.016141,0.006339,0.016141,0.006339,0.997568,0.986807,0.
	986807,0.986807,0.005381,0.00538,0.00538,0.00538,0.986807]],

	"faces" : [43,0,3,2,1,0,0,1,2,3,0,0,1,1,43,4,7,6,5,0,4,5,6,7,0,0,1,1,43,0,4,5,1,
	0,8,9,10,11,0,0,1,1,43,1,2,6,5,0,12,13,14,15,1,1,1,1,43,2,3,7,6,0,16,17,18,19,1,
	0,0,1,43,3,0,4,7,0,20,21,22,23,0,0,0,0],

	"bones" : [],

	"skinIndices" : [],

	"skinWeights" : [],

  "animations" : []


}

 

In theory things should just work, but since when did gamedev give a damn about theory?  Suffice it to say I ran into a bit of a problem, fully documented here.  The bug actually had nothing to do with Three.js; it was actually caused by my IDE, WebStorm.

 

Anyways… once I figured out the problem, the code to load a model was extremely straightforward:

 

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;

    constructor() {
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        document.getElementById('content').appendChild(this.renderer.domElement);

        this.scene = new THREE.Scene();

        this.camera = new THREE.PerspectiveCamera(75
            , 1
            , 0.1, 1000);

        this.camera.position = new THREE.Vector3(10, 0, 10);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));


        // New code begins below
        // Create a loader to load in the JSON file
        var modelLoader = new THREE.JSONLoader();

        // Call the load method, passing in the name of our generated JSON file
        // and a callback for when loading is complete.
        // Note the fat arrow TypeScript call for proper thisification.  AKA, we want
        // this to be this, not that or something else completely
        modelLoader.load("robot.jsm", (geometry,materials) => {
            // create a mesh using the passed in geometry and textures
            var mesh = new THREE.SkinnedMesh(geometry, new THREE.MeshFaceMaterial(materials));
            mesh.position.x = 0; mesh.position.y = mesh.position.z = 0;
            // add it to the scene
            this.scene.add(mesh);
        });

        this.scene.add(new THREE.AmbientLight(new THREE.Color(0.9,0.9,0.9).getHex()));
        this.renderer.render(this.scene, this.camera);
    }

    render() {
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

And when you run it:

image

 

Mission accomplished!  The astute reader may notice the file was renamed robot.jsm.  That was to work around the problem I mentioned earlier.

 

 

 

This isn’t actually the only option for loading 3D models into Three.js; there are actually a series of loaders available as a separate download from the Github site.  It is however certainly the easiest one!  The next few steps took me two full days to fight my way through!  Some of the blame is on TypeScript, some is on me and, of course, CORS reared its ugly head as well.  Oh, and to add to the fun, a recent change in Three.js introduced an error in the version of ColladaLoader.js I downloaded.  This was one of those trials that Google was no help with, so hopefully this guide will help others in the future.  Anyways… on with the adventure!

 

We are actually about to run smack into two different problems with using a Three.js plugin.  The first one is TypeScript related.  You see, the ColladaLoader plugin is not a core part of Three.js.  In JavaScript, this is no big deal.  In TypeScript however, it is a big deal.  Until now we have been relying on the generated .d.ts file from DefinitelyTyped for defining all of the types in Three.js in their corresponding TypeScript form.  However, since ColladaLoader is a plugin and not a core part of Three.js, that d.ts file has no idea it exists.

 

Ultimately this means you have to hand roll your own TypeScript definition file.  I had to struggle a bit to find the exact TypeScript syntax to map ColladaLoader so it will run in TypeScript within the THREE namespace with proper callback syntax.  First off, create a file called ColladaLoader.d.ts.  Now enter the following code:

 

///<reference path="./three.d.ts"/>

declare module THREE {
    export class ColladaLoader{
        options:any;
        load(
            name:string,
            readyCallback:(result:any)=> void,
            progressCallback:( total:number,loaded:number)=> void);
    }
}

 

 

I should probably point out, I only implemented the barest minimum of what I required.  In your case you may have to implement more of the interface.  Also note I took the lazy approach to defining options.  By typing it as any, my code will compile in TypeScript, but I do lose some of the type checking.  The alternative would have been to define the Options type, and I am way too lazy for that.  The above definition enables me to call the load() method and set options, which is all I actually needed.  I don’t even want to talk about how long it took me to puzzle out those 10 lines of code so that the generated code actually matched THREE.js!
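
For the curious, a more strongly typed version might look something along these lines.  This is only a sketch: the interface name is my own invention and it only covers the convertUpAxis option actually used below.

///<reference path="./three.d.ts"/>

declare module THREE {
    // Hypothetical options interface -- extend it with whatever options you actually use
    interface ColladaLoaderOptions {
        convertUpAxis?: boolean;
    }

    export class ColladaLoader {
        options: ColladaLoaderOptions;
        load(
            name: string,
            readyCallback: (result: any) => void,
            progressCallback: (total: number, loaded: number) => void): void;
    }
}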

 

OK, now that we have ColladaLoader.d.ts defined, let’s look at the code that uses ColladaLoader to load a DAE (COLLADA) file:

 

///<reference path="./three.d.ts"/>
///<reference path="./ColladaLoader.d.ts"/>


class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;
    loader:THREE.ColladaLoader;
    light:THREE.PointLight;

    constructor() {
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        document.getElementById('content').appendChild(this.renderer.domElement);

        this.scene = new THREE.Scene();

        this.camera = new THREE.PerspectiveCamera(75
            , 1
            , 0.1, 1000);

        this.camera.position = new THREE.Vector3(5, 0, 5);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create a loader
        this.loader = new THREE.ColladaLoader();

        // Add a point light to the scene to light up our model
        this.light = new THREE.PointLight();
        this.light.position.set(100,100,100);
        this.light.intensity = 0.8;
        this.scene.add(this.light);

        this.renderer.render(this.scene, this.camera);

        // Blender COLLADA models have a different up vector than Three.js, set
        // this option to flip them
        this.loader.options.convertUpAxis = true;

        // Now load the model, passing the callback finishedLoading() when done.
        this.loader.load("robot.dae",
            (result) => this.finishedLoading(result),
            (length,curLength)=>{
                // called as file is loading, if you want a progress bar
            }
        );
    }

    finishedLoading(result){
        // Model file is loaded, add it to the scene
        this.scene.add(result.scene);
    }
    render() {
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

 

Finally of course we need to export our COLLADA model.  In Blender, you can export using the File->Export->Collada menu option.  These are the settings I used:

image

 

And when you run it:

image

 

That said, there is a really good chance this isn’t what is going to happen when you run your project.  Instead you are probably going to receive a 404 error saying that your dae file was not found.  This is because, unlike before with the JSON file being added directly to your project, this time you are loading the model using an XML HTTP Request.  This causes a number of problems.  The first and most likely problem you are going to encounter is if you are running your application locally from your file system instead of a server.  By default XHR requests do not work this way ( no idea why, seems kinda stupid to me ).  There is a switch that allows Chrome to run local XHR requests ( link here ): --allow-file-access-from-files.
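
In practice that means launching Chrome from the command line with that flag appended; roughly like this ( the executable name and path vary by platform and install ):

chrome --allow-file-access-from-files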

 

In my case the problem was a little bit different.  I use WebStorm, which includes a built in web server to make this kind of stuff easier.  This however raises a completely different set of problems… CORS, or Cross-Origin Resource Sharing.  In a very simple description, CORS is a security mechanism governing XML HTTP Requests made across different servers.  That said, how the heck do you configure it when you are working with a built in, stripped down development server?  Fortunately the WebStorm developers have thought about that.
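
For a bit of context, satisfying CORS ultimately comes down to the server sending back an Access-Control-Allow-Origin response header that covers the requesting page, something like the line below ( the origin value here is purely an example ):

Access-Control-Allow-Origin: http://localhost:63342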

 

Assuming you are using Chrome and have the WebStorm plugin installed, in the Chrome address bar go to chrome://extensions.

image

 

Click the Options button.

image

Now simply add 127.0.0.1 to the allow list and press Apply. 

 

Now if you run from WebStorm, no more 404 errors.

 

A moment about TypeScript

 

I’ve been using TypeScript a fair bit lately and this is the first time I’ve run into major problems with it.  But the experience is enough that I can safely say…

 

TypeScript is not appropriate for new developers to use!

 

Simply put, unless you have a decent amount of experience with JavaScript, you really shouldn’t use TypeScript.  You will have to read the generated code at some point in time, and if you don’t fully understand the code it generates, you are doomed.  The minute definition files aren’t available to you, the experience becomes a hell of a lot less fun.  The process of mapping TypeScript types to existing JavaScript libraries is not trivial and requires you to have a pretty good understanding of both languages.  Especially when the library you are trying to define uses a number of clever JavaScript tricks, which basically… is all of them.  The fact there isn’t a reliable tool out there for generating at least boilerplate .d.ts files from .js files is a bit of a puzzle to me.

 

Next, I also have to say TypeScript’s handling of this is just as mind bogglingly stupid as JavaScript’s.  That fat arrow ( => ) functions are supposed to capture the local context, yet when I used one as an anonymous callback it appeared to capture the global (window) context, forcing me to fall back to function()… that is downright perplexing.  I simply couldn’t get anonymous callback functions to have the proper this context no matter what syntax combination I tried.  Infuriatingly, the _this value TypeScript automatically generates was set to the right value.

 

One other major downside I noticed about the language during my recent struggles is that its newness is very much an annoyance.  When researching bugs or workarounds you quite often find things that are reported as bugs, reported as changed, or reported and never responded to.  This isn’t a bash on the language, as it’s really only a problem that time can solve.  However, for a new developer, dealing with a language where a great deal of the material out there is potentially wrong because of language changes, that is certainly a challenge.  All languages change over time of course, but young languages change more often and more dramatically.

 

Don’t get me wrong, I am not off TypeScript, even though it spent a good part of the last two days pissing me off.  At least until ECMAScript 6 is the norm, I can see a great deal of value in using TypeScript for large projects.

 

For beginners though, with little JavaScript experience… forget about it.  It’s going to cause more headaches than it’s worth.





WebStorm, Three.JS, Source Maps and ARRRRRGHHHHHHHH

18. July 2014

 

So, as you may be able to tell from the title, I’ve run into a bit of a bug moment.  I am in the process of getting Blender exported models to work with the Three.JS library, as a follow up to this post.  As with all things programming related you are going to run into your share of problems.  This post actually walks through the process of identifying and fixing a bug.

 

The first lesson of bug hunting is always simplify.  Try to replicate the problem in as little code as possible.  What follows below is a blow by blow of  the debugging process.

 

First off, it’s a Three.js project using TypeScript, authored in WebStorm.  All of these points are important to this story.  So as not to ruin an upcoming post with too much code, I’ll just give the applicable snippet.  The problem is in this code… if it’s in code at all, that is.  Yeah, that’s a bit of a hint.

 

        var modelLoader = new THREE.JSONLoader();

        modelLoader.load("dice.jsm", function(geometry,materials){
            var mesh = new THREE.SkinnedMesh(geometry, new THREE.MeshFaceMaterial(materials));
            mesh.position. x = mesh.position.y = mesh.position.z = 0;
            this.scene.add(mesh);
        });

 

Anyways, I started off by trying to load this model:

image

Exported from Blender to Three.JS JSON format.

 

When I run the code however I get the following in WebStorm:

image

 

Unexpected token /?

 

Hmmm, looks like something went wrong in the export process.  This is never a fun thing to debug, as the output of the above model is a JSON file 1.5MB in size.

 

So, it’s time to simplify.  I need a model with a texture, nothing more.  Let’s make something about as basic as possible.  So I hack together a quick textured model in Blender and export it.  This is the new model:

image

 

Ok, that is definitely simpler.  Now when I run it I get the exact same error.  Ok, this file should be a hell of a lot easier to debug.  Let’s take a look at the generated JSON file.  Oh, top tip… right click the js file and tell WebStorm to treat it as plain text, otherwise it will clobber your CPU trying to parse the JavaScript!

 

 

{

	"metadata" :
	{
		"formatVersion" : 3.1,
		"generatedBy"   : "Blender 2.7 Exporter",
		"vertices"      : 8,
		"faces"         : 6,
		"normals"       : 2,
		"colors"        : 0,
		"uvs"           : [24],
		"materials"     : 1,
		"morphTargets"  : 0,
		"bones"         : 0
	},

	"scale" : 1.000000,

	"materials" : [	{
		"DbgColor" : 15658734,
		"DbgIndex" : 0,
		"DbgName" : "Material",
		"blending" : "NormalBlending",
		"colorAmbient" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorDiffuse" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorEmissive" : [0.0, 0.0, 0.0],
		"colorSpecular" : [0.5, 0.5, 0.5],
		"depthTest" : true,
		"depthWrite" : true,
		"mapDiffuse" : "crate.jpg",
		"mapDiffuseWrap" : ["repeat", "repeat"],
		"shading" : "Lambert",
		"specularCoef" : 50,
		"transparency" : 1.0,
		"transparent" : false,
		"vertexColors" : false
	}],

	"vertices" : [1,-1,0,1,0,1,-1,0,0,0,-1,0,1,0,0,0,1,1,-1,1,0,0,0,0],

	"morphTargets" : [],

	"normals" : [0.577349,0.577349,0.577349,0.577349,0.577349,-0.577349],

	"colors" : [],

	"uvs" : [[0.988679,0.99767,0.988677,0.016243,0.007251,0.016244,0.007252,0.
	997671,0.989755,0.017099,0.989755,0.998526,0.008328,0.998526,0.008328,0.017099,
	0.990714,0.989755,0.009287,0.989755,0.009286,0.008328,0.990713,0.008328,0.
	000516,0.993662,0.981943,0.993661,0.981942,0.012235,0.000516,0.012235,0.987766,
	0.997568,0.987766,0.016141,0.006339,0.016141,0.006339,0.997568,0.986807,0.
	986807,0.986807,0.005381,0.00538,0.00538,0.00538,0.986807]],

	"faces" : [43,0,3,2,1,0,0,1,2,3,0,0,1,1,43,4,7,6,5,0,4,5,6,7,0,0,1,1,43,0,4,5,1,
	0,8,9,10,11,0,0,1,1,43,1,2,6,5,0,12,13,14,15,1,1,1,1,43,2,3,7,6,0,16,17,18,19,1,
	0,0,1,43,3,0,4,7,0,20,21,22,23,0,0,0,0],

	"bones" : [],

	"skinIndices" : [],

	"skinWeights" : [],

  "animations" : []


}

 

Well, first obvious thing is to look for an offending / in this code.

Hmmm… there is none.  Well, we wouldn’t make the big bucks if this was easy, now would we?

 

Let’s go back to our error for a second:

image

 

Well, other than the fact we know we have a / where we shouldn’t, we also have the line of code that is going all explodey.  Let’s start there.  This is one of those things that makes WebStorm so freaking cool.  Just click the link “three.js:11960” and it will automatically download that script file and go to that position.  Let’s take a look at the resulting code:

image

 

Ok, that’s some pretty straightforward code.  Basically it’s an XML HTTP Request response handler.  As we can tell from the above code, the callback executed all nice and dandy.  As you can see, on line 11960 I’ve set a breakpoint to pause execution before our exception; let’s see if that gives us any insight.  If you don’t know about breakpoints and debugging, drop everything and learn.  You will literally become a better programmer overnight.

 

So… to the debugger!  Let’s see what the value of responseText is:

 

By hovering over it, everything looks fine, seems to match the file we are expecting:

image

 

That said, let’s take a look at the whole thing.  Now we are going to use a feature called “Evaluate Expression”.  Again, if you don’t know what this is, drop everything and go learn.  I really need to write a debugging tutorial…. 

 

image

 

Copy the value and paste it into your editor of choice.  Then scroll to the very bottom:

image

 

Oh son of a bi….

 

That, my friend, is our bug.  See, WebStorm has the ability to generate something called a source map, which helps the debugger translate your running code back to the code you created; especially useful if you, like me, are working in a JavaScript generating language like TypeScript.  As you can see, sometimes this is not ideal however.  Basically, when run, WebStorm was appending a source map reference to the bottom of our js file, rendering it invalid JSON and causing the parser to puke.
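
To make that concrete, the tail end of the served file presumably looked something like this ( the exact comment text is my guess; the point is that a // comment is perfectly legal JavaScript but not legal JSON, hence “Unexpected token /” ):

  "animations" : []
}
//# sourceMappingURL=dice.js.map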

 

There are two immediate solutions to this problem.  First, we can disable source map generation.  This unfortunately is a project wide setting as far as I can tell, and I rather like the ability to debug.  The easier solution is to change the file extension from .js to something different.  However, once deployed to a server this can have unintended side effects.  For example, IIS will not, by default, serve files without a registered MIME type.
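
If you do go the rename route and host on IIS, the usual workaround is registering a MIME type for the new extension in web.config; a sketch ( the .jsm extension and MIME type here are just examples ):

<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".jsm" mimeType="application/json" />
    </staticContent>
  </system.webServer>
</configuration>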

 

Oh, and for the astute, once I got past the problem by renaming the file extension, I did in fact discover two errors in my code.  The proper TypeScript code is:

 

        var modelLoader = new THREE.JSONLoader();

        modelLoader.load("dice.jsm", (geometry,materials) => {
            var mesh = new THREE.SkinnedMesh(geometry, new THREE.MeshFaceMaterial(materials));
            mesh.position.x = 0; mesh.position.y = mesh.position.z = 0;
            this.scene.add(mesh);
        });

 

Why is an exercise for the reader. :)





Baking Blender materials to texture to make them usable in a game engine

16. July 2014

 

Or…

How to take a Blender model you downloaded from the web and make it actually usable in your game in 28 easy steps!

 

… granted, the second title doesn’t have the same flow to it, does it?

 

I just had to run through this process and I figured I would share it, as it is something that comes up fairly often.  When working with Blender, there are dozens of procedural materials and textures available that can produce some very nice results quickly.  The only problem is, when you get your asset out of Blender and into your game engine, things suddenly go horribly wrong.  The problem is, those materials only make sense inside of Blender.  Fortunately, through the magic of baking, you can easily convert them into a texture map usable in any game engine.

 

Let’s take a look how.

 

First we need a model.  I am using a beautiful new model that was recently added to Blend-Swap.  It’s a free download but you need to register.  Don’t worry, you can use a real email address, they don’t spam, or at least haven't so far.  The model in question looks like this:

 

image

 

Unfortunately when we load it in Blender we quickly learn this model is in no way game ready.  Let’s take a look:

image

 

Ick.  So instead of a single Mesh, we have a dozen individual meshes.  Problem is, we need to unwrap them as a single object, so let’s join them all together.  First let’s get the camera out of the default layer.

 

If you look at the way this particular Blend is setup, there are currently two layers, the second contains the armature, the first contains everything else.

image

 

Let’s get the camera out of there.  Select the camera object, then hit the M key.  Then select the layer you want to move the camera to, like so:

image

 

Now click the first layer ( bottom left box ) and it should now only contain geometry.

 

We want to join everything together.  Press ‘A’ to select everything in the layer, then hit “Ctrl + J” to join everything into a single set of geometry.  Now it should look something like this:

image

 

Perfect, now we can unwrap our model.  Switch into Edit mode

image

 

Press ‘A’ again, until all faces are selected, like so:

image

 

Now we unwrap our model.  Select Mesh->UV Unwrap-> Unwrap ( or Smart UV Project ).

 

Switch your view to UV/Image Editor

image

 

It should look something like this:

image

 

Now create a New Image:

image

 

This image is where we are going to render our texture to.  Here are the settings I used.  Remember, games like Power of 2 textures.

image

 

Ok, now let’s look at the actual render to texture part.  Take a quick look at how the model is currently shaded:

image

 

Frankly none of those are really game engine friendly.  So let’s render all of those materials out to a single texture.  Go to the render tab

image

 

Scroll down and locate Bake.

In the UV Editor window, make sure everything is selected ( using ‘A’.  They should be highlighted in yellow ).  At this point, with your generated image and all the UV’s selected, it should look like:

image

 

 

Now under bake, set the following settings:

image

The key values being Bake Mode = Full Render and Selected to Active checked.  Now click the Bake button.

 

Up in your top part of Blender, you should see a progress bar like so:

image

 

 

Now if you go back to the UV/Image viewer, and select your image RenderedTexture, you should see:

image

 

Cool!

 

Let’s save the result to an external ( game engine friendly ) texture.  Select Image->Save as Image.  Save the image somewhere.  Remember where.

image

 

 

Now let’s modify the textures on our model to use only our newly generated texture map.  First, in the 3D View, switch back to Object Mode from Edit Mode.

Then, open the materials tab:

image

 

Select each material and hit the – ( or killswitch engage! ) button.  So it should ultimately look like this:

image

 

Now hit the + button and create a new Material.  Then click the New button.

image

 

The default values for the material should be OK, but depending on your game engine, you may have to enable Face Textures:

image

 

Now click over to the Texture tab.  Click New.

image

 

Drop down the Type box and select Image or Movie.

image

 

Scroll down to the Image section and select Open.  Pick the image you saved earlier.

image

 

Now scroll down to Mapping, drop down Coordinates and select UV.

image

 

Under Map select UVMap.

image

 

Now if you go to the 3D View and set the view mode to Texture:

image

 

TADA!  A game ready model.

 

One word of caution though, if you render this scene in Blender you will get the following result:

image

 

Don’t worry.  That’s just a byproduct of going from Blender materials to texture mapping.  If you want the texture to be seen, you need to add some lights to the scene, or change the material so it has an Emit value > 0, so it provides its own light source.

 

With Emit set to .92, here is the result if you render it:

 

image

 

Now, what about in game?

 

Let’s create a simple LibGDX project that loads and displays our exported model ( converted to LibGDX’s g3db format; that conversion, typically done with the fbx-conv tool, isn’t covered here ):

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;


public class ModelTest implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;

    @Override
    public void create() {
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());

        camera.position.set(3f,0f,6f);
        camera.lookAt(0f,1f,0f);

        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        UBJsonReader jsonReader = new UBJsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        model = modelLoader.loadModel(Gdx.files.getFileHandle("robot.g3db", FileType.Internal));
        modelInstance = new ModelInstance(model);

        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

        camera.update();

        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}

 

And we run it and:

image

 

Wow, a model downloaded randomly from the Internet actually working in the game engine!  How often does that actually happen? ;)





Playing around with Three.JS -- Part One of a not quite a tutorial series

15. July 2014

 

There has been a flood of Three.js books on Safari lately, including Essential Three.js and Game Development with Three.js.  Three.js is a JavaScript based 3D library using WebGL where available ( with fallback renderers when it isn’t ).  More importantly, it’s just really fun to play with!  Something about working in full 3D in a scripting language is just really satisfying.  I’ve only just been playing and really don’t have a clue what I’m doing, but I figured I would share my results.  As I have been on a TypeScript kick lately, I’ve been writing in TypeScript instead of plain JavaScript, but frankly the differences are fairly minimal.  You can get the TypeScript definitions on DefinitelyTyped.

 

I think I should make something perfectly clear… I have NO idea what I am doing, I am simply playing around.  This isn’t a WebGL tutorial by any definition of the word, just me having skim read a couple of books and played around with a new technology, nothing more.  So if you look at some code and think “damn, that looks hacky” or “isn’t that a really stupid thing to do?”, the answer is probably yes! :)

 

So, disclaimer given, let’s jump right in. 

 

Since this is a web app, we need a host HTML page.  So, here is ours:

<!DOCTYPE html>

<html lang="en">
<head>
    <meta charset="utf-8" />
    <title>ThreeJS Test</title>
    <script src="http://cdnjs.cloudflare.com/ajax/libs/three.js/r67/three.js"></script>
    <script src="app.js"></script>
</head>
<body>
<h1>ThreeJS Test</h1>

<div id="content" style="width:500px;height:500px"></div>
</body>
</html>

 

Nothing really shocking here.  We include three.js using the Cloudflare content delivery network.  If you wanted, of course, you could download the library locally and deploy it from your own servers.  I assume you don’t have servers situated around the world, so a CDN will generally thrash your own server’s performance.  Next we include app.js, the generated output from our TypeScript application.  In the actual HTML we create a 500x500 DIV named content for, predictably enough, our content!
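
If you did want to serve it yourself instead, only the script tag changes, pointing at wherever you placed the downloaded file, for example:

    <script src="three.js"></script>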

 

Now let’s take a look at a super simple example app, app.ts:

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    constructor(){
        this.renderer = new THREE.WebGLRenderer({ alpha: true });
        this.renderer.setSize(500,500);
        this.renderer.setClearColor(0xFF0000,1);
        document.getElementById('content').appendChild(this.renderer.domElement);
    }

    start() {
        this.renderer.clear();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

Here in the constructor we create a WebGLRenderer, size it, set the background color to red ( using HTML format hex color coding ) then wire the renderer to the content div.

 

When you run it you should see:

image

 

Cool, our first Three.js application.  Now let’s do something 3D!  Let’s start by creating a camera and rendering a built in 3D object in wireframe.  It’s commented heavily, so I won’t be explaining what is going on.  If you are curious why I did something, leave a comment.

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    scene: THREE.Scene;
    camera: THREE.Camera;

    constructor(){
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500,500);
        this.renderer.setClearColor(0xFFFFFF,1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0,0,-20);
        this.camera.lookAt(new THREE.Vector3(0,0,0));

        // Create the geometry for a sphere with a radius of 5
        var sphereGeometry = new THREE.SphereGeometry(5);

        // Create a wireframe material that's blueish
        var sphereMaterial = new THREE.MeshBasicMaterial(
            {color: 0x7777ff, wireframe: true});

        // Now make a THREE.Mesh using the geometry and a shader
        var sphere = new THREE.Mesh(sphereGeometry,sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0,0,0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);
        this.renderer.render(this.scene,this.camera);
    }

    start() {
        // Well, arent I a bit pointless?
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

And when run it we get:

 

image

 

Cool!  Now time for some texturing ( and as a result, lighting ).

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;

    constructor() {
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0, 0, -20);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create the geometry for a sphere with a radius of 5
        // This time we cranked up the number of sections, horizontal and vertical,
        // to make a higher resolution globe
        var sphereGeometry = new THREE.SphereGeometry(5, 20, 20);

        // This time we create a Phong shader material and provide a texture.
        var sphereMaterial = new THREE.MeshPhongMaterial(
            {
                map: THREE.ImageUtils.loadTexture("earth_sphere.jpg")
            }
        );

        // Now make a THREE.Mesh using the geometry and a shader
        var sphere = new THREE.Mesh(sphereGeometry, sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0, 0, 0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);

        // We need some light so our texture will show, ad an ambient light to the scene
        this.scene.add(new THREE.AmbientLight(new THREE.Color(0.9,0.9,0.9).getHex()));
        this.renderer.render(this.scene, this.camera);
    }

    render() {
        // Each frame we want to render the scene again
        // Use typescript Arrow notation to retain the thisocity passing render to requestAnimationFrame
        // It's possible I invented the word thisocity.
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        // Not so pointless now!
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

Bet you can't guess what texture I went with!

 

image

 

So apparently textured 3D objects are nothing difficult.

 

This is getting pretty long, so I’ll cut it off here.  Next up I’m going to look at getting a Blender object rendering in Three.JS.
