Playing around with Three.JS -- Part Two of a not quite a tutorial series

23. July 2014

 

In Part One we looked at the basics of working with the Three.js graphics library.  We got as far as creating a camera and a textured 3D object.  Now for the true test of ease of use… getting a 3D model exported from Blender and displayed in our browser.  HTML based libraries face an even bigger burden here, as their access to the local file system isn’t as seamless as it is for most other game technologies.  Simply opening up an FBX or DAE file isn’t an option.  Let’s take a look at how Three.js works around these issues.

 

 

First things first, I needed a Blender Blend file to work with.  This actually led me down a bit of a road, resulting in a post about taking a Blend file from the web and making it game ready.  Anyways, I started with this file, merged the geometry, UV mapped it, and baked the Blender materials to a single texture map.  I am not entirely sure if I can share the resulting file or not, so you may have to provide your own Blender file or follow the linked tutorial to generate one of your own.

 

Anyways, this is what we are starting with…

image 

 

Let’s see how close we can get with Three.js.

 

The first obvious question is… how the hell do we get this model into Three.js from Blender?

Well, the answer is a plugin.  Follow the installation directions; however, in my case the path given was wrong.  For Blender 2.71, my actual plugin directory is C:\Program Files\Blender Foundation\Blender\2.71\scripts\addons\io_mesh_threejs.

 

There is one very critical thing to be aware of here… when downloading the files from Github, be certain to download the RAW format:

image

 

This particular mistake caused me a bit of pain; don’t repeat it!

 

Once you’ve copied each of these three files, configure the plugin in Blender.  If Blender is running, restart it.

Now select File->User Preferences:

image

 

In the resulting dialog select Addons, then in the search box type “three”.  If it installed correctly it will show on the right.  Click the checkbox to enable the plugin.

image

 

Now if you check the File->Export menu, you should see Three.js as an option.

image

 

When exporting you can clearly see there are a ton of options:

image

 

The options I selected above are for exporting just the mesh and materials: no animation data, lights, cameras, etc.  Scaling and Flip YZ depend on the orientation conventions of your game engine.

 

This exporter creates a JSON file like this one:

 

{

	"metadata" :
	{
		"formatVersion" : 3.1,
		"generatedBy"   : "Blender 2.7 Exporter",
		"vertices"      : 8,
		"faces"         : 6,
		"normals"       : 2,
		"colors"        : 0,
		"uvs"           : [24],
		"materials"     : 1,
		"morphTargets"  : 0,
		"bones"         : 0
	},

	"scale" : 1.000000,

	"materials" : [	{
		"DbgColor" : 15658734,
		"DbgIndex" : 0,
		"DbgName" : "Material",
		"blending" : "NormalBlending",
		"colorAmbient" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorDiffuse" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorEmissive" : [0.0, 0.0, 0.0],
		"colorSpecular" : [0.5, 0.5, 0.5],
		"depthTest" : true,
		"depthWrite" : true,
		"mapDiffuse" : "crate.jpg",
		"mapDiffuseWrap" : ["repeat", "repeat"],
		"shading" : "Lambert",
		"specularCoef" : 50,
		"transparency" : 1.0,
		"transparent" : false,
		"vertexColors" : false
	}],

	"vertices" : [1,-1,0,1,0,1,-1,0,0,0,-1,0,1,0,0,0,1,1,-1,1,0,0,0,0],

	"morphTargets" : [],

	"normals" : [0.577349,0.577349,0.577349,0.577349,0.577349,-0.577349],

	"colors" : [],

	"uvs" : [[0.988679,0.99767,0.988677,0.016243,0.007251,0.016244,0.007252,0.
	997671,0.989755,0.017099,0.989755,0.998526,0.008328,0.998526,0.008328,0.017099,
	0.990714,0.989755,0.009287,0.989755,0.009286,0.008328,0.990713,0.008328,0.
	000516,0.993662,0.981943,0.993661,0.981942,0.012235,0.000516,0.012235,0.987766,
	0.997568,0.987766,0.016141,0.006339,0.016141,0.006339,0.997568,0.986807,0.
	986807,0.986807,0.005381,0.00538,0.00538,0.00538,0.986807]],

	"faces" : [43,0,3,2,1,0,0,1,2,3,0,0,1,1,43,4,7,6,5,0,4,5,6,7,0,0,1,1,43,0,4,5,1,
	0,8,9,10,11,0,0,1,1,43,1,2,6,5,0,12,13,14,15,1,1,1,1,43,2,3,7,6,0,16,17,18,19,1,
	0,0,1,43,3,0,4,7,0,20,21,22,23,0,0,0,0],

	"bones" : [],

	"skinIndices" : [],

	"skinWeights" : [],

  "animations" : []


}
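
As an aside, the faces array is denser than it looks.  If I am reading the version 3 JSON format correctly ( treat this sketch as a guide, not gospel ), the first value of each face entry is a bitmask describing what follows it:

// Decoding the face "type" bitmask of the Three.js JSON format 3.x
// ( bit values per the format spec; verify against your Three.js version )
var FACE_QUAD          = 1;  // bit 0: face is a quad, so 4 vertex indices
var FACE_MATERIAL      = 2;  // bit 1: face has a material index
var FACE_VERTEX_UV     = 8;  // bit 3: face has 4 per-vertex UV indices
var FACE_VERTEX_NORMAL = 32; // bit 5: face has 4 per-vertex normal indices

var faceType = 43; // 43 = 32 + 8 + 2 + 1, so all four flags above are set
console.log((faceType & FACE_QUAD) !== 0);          // true
console.log((faceType & FACE_VERTEX_NORMAL) !== 0); // true

// So each face record above reads: type, 4 vertex indices, 1 material index,
// 4 UV indices, 4 normal indices -- 14 numbers per quad, 6 quads in total.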

 

In theory things should just work, but since when did gamedev give a damn about theory?  Suffice to say, I ran into a bit of a problem, fully documented here.  The bug actually had nothing to do with Three.js; it was caused by my IDE, WebStorm.

 

Anyways… once I figured out the problem, the code to load a model was extremely straightforward:

 

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;

    constructor() {
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        document.getElementById('content').appendChild(this.renderer.domElement);

        this.scene = new THREE.Scene();

        this.camera = new THREE.PerspectiveCamera(75
            , 1
            , 0.1, 1000);

        this.camera.position = new THREE.Vector3(10, 0, 10);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));


        // New code begins below
        // Create a loader to load in the JSON file
        var modelLoader = new THREE.JSONLoader();

        // Call the load method, passing in the name of our generated JSON file
        // and a callback for when loading is complete.
        // Note the fat arrow TypeScript call for proper thisification.  AKA,
        // we want this to be this, not that or something else completely.
        modelLoader.load("robot.jsm", (geometry,materials) => {
            // create a mesh using the passed in geometry and textures
            var mesh = new THREE.SkinnedMesh(geometry,
                new THREE.MeshFaceMaterial(materials));
            mesh.position.x = 0; mesh.position.y = mesh.position.z = 0;
            // add it to the scene
            this.scene.add(mesh);
        });

        this.scene.add(new THREE.AmbientLight(
            new THREE.Color(0.9,0.9,0.9).getHex()));
        this.renderer.render(this.scene, this.camera);
    }

    render() {
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

And when you run it:

image

 

Mission accomplished!  The astute reader may notice the file was renamed robot.jsm.  That was to work around the problem I mentioned earlier.

 

 

 

This isn’t actually the only option for loading 3D models into Three.js; there is a whole series of loaders available as a separate download from the Github site.  It is however certainly the easiest one!  The next few steps took me two full days to fight my way through.  Some of the blame is on TypeScript, some is on me and of course, CORS reared its ugly head as well.  Oh, and to add to the fun, a recent change in Three.js introduced an error in the version of ColladaLoader.js I downloaded.  This was one of those trials Google was no help with, so hopefully this guide will help others in the future.  Anyways… on with the adventure!

 

We are about to run smack into two different problems with using a Three.js plugin.  The first one is TypeScript related.  You see, the ColladaLoader plugin is not a core part of Three.js.  In JavaScript, this is no big deal.  In TypeScript however, big deal.  Until now we have been relying on the .d.ts file from DefinitelyTyped to define all of the types in Three.js in their corresponding TypeScript form.  However, since ColladaLoader is a plugin and not a core part of Three.js, that d.ts file has no idea how ColladaLoader works.

 

Ultimately this means you have to hand roll your own TypeScript definition file.  I had to struggle a bit to find the exact TypeScript syntax to map ColladaLoader so it would run in TypeScript within the THREE namespace, with the proper callback syntax.  First off, create a file called ColladaLoader.d.ts and enter the following code:

 

///<reference path="./three.d.ts"/>

declare module THREE {
    export class ColladaLoader{
        options:any;
        load(
            name:string,
            readyCallback:(result:any)=> void,
            progressCallback:( total:number,loaded:number)=> void);
    }
}

 

 

I should probably point out that I only implemented the barest minimum of what I required.  In your case you may have to implement more of the interface.  Also note I took the lazy approach to defining options.  By typing it as any, my code will compile in TypeScript, but I lose some of the type checking.  The alternative would have been to define the Options type, and I am way too lazy for that.  The above definition enables me to call the load() method and set options, which is all I actually needed.  I don’t even want to talk about how long it took me to puzzle out those 10 lines of code so that the generated code actually matched Three.js!
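
If you did want that type checking back, a definition along these lines would probably do it.  Only convertUpAxis is confirmed by the code later in this post; the rest of the interface shape is guesswork on my part:

///<reference path="./three.d.ts"/>

declare module THREE {
    // Hypothetical options interface -- convertUpAxis is the only member
    // this post actually uses, anything else is yours to fill in
    export interface ColladaLoaderOptions {
        convertUpAxis:boolean;
    }

    export class ColladaLoader {
        options:ColladaLoaderOptions;
        load(
            name:string,
            readyCallback:(result:any)=> void,
            progressCallback:(total:number, loaded:number)=> void);
    }
}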

 

OK, now that we have ColladaLoader.d.ts defined, let’s look at the code that uses ColladaLoader to load a DAE (COLLADA) file:

 

///<reference path="./three.d.ts"/>
///<reference path="./ColladaLoader.d.ts"/>


class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;
    loader:THREE.ColladaLoader;
    light:THREE.PointLight;

    constructor() {
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        document.getElementById('content').appendChild(this.renderer.domElement);

        this.scene = new THREE.Scene();

        this.camera = new THREE.PerspectiveCamera(75
            , 1
            , 0.1, 1000);

        this.camera.position = new THREE.Vector3(5, 0, 5);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create a loader
        this.loader = new THREE.ColladaLoader();

        // Add a point light to the scene to light up our model
        this.light = new THREE.PointLight();
        this.light.position.set(100,100,100);
        this.light.intensity = 0.8;
        this.scene.add(this.light);

        this.renderer.render(this.scene, this.camera);

        // Blender COLLADA models have a different up vector than Three.js,
        // so set this option to flip them
        this.loader.options.convertUpAxis = true;

        // Now load the model, passing the callback finishedLoading() when done.
        this.loader.load("robot.dae",
            (result) => this.finishedLoading(result),
            (length,curLength)=>{
                // called as file is loading, if you want a progress bar
            }
        );
    }

    finishedLoading(result){
        // Model file is loaded, add it to the scene
        this.scene.add(result.scene);
    }
    render() {
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

 

Finally, of course, we need to export our COLLADA model.  In Blender, you can export using the File->Export->Collada menu option.  These are the settings I used:

image

 

And when you run it:

image

 

That said, there is a really good chance this isn’t what will happen when you run your project.  Instead you are probably going to receive a 404 error saying your dae file was not found.  This is because, unlike before when the JSON file was added directly to your project, this time you are loading the model using an XML HTTP Request.  That causes a number of problems.  The first and most likely one you will encounter is if you are running your application locally from your file system instead of from a server.  By default XHR requests simply do not work this way ( no idea why, seems kinda stupid to me ).  There is a switch, --allow-file-access-from-files, that allows Chrome to make XHR requests against local files ( link here ).

 

In my case the problem was a little bit different.  I use WebStorm, which includes a built in web server to make this kind of stuff easier.  This however raises a completely different set of problems: CORS, or Cross Origin Resource Sharing.  In a very simple description, CORS is a security mechanism governing XML HTTP Requests made across different servers.  That said, how the heck do you configure it when you are working with a stripped down, built in development server?  Fortunately WebStorm has thought about that.

 

Assuming you are using Chrome and have the WebStorm plugin installed, go to chrome://extensions in the Chrome address bar.

image

 

Click the Options button.

image

Now simply add 127.0.0.1 to the allow list and press Apply. 

 

Now if you run from Webstorm, no more 404 errors.
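
If you are serving the files yourself rather than through WebStorm, the fix boils down to a single response header.  Here is a minimal sketch of a Node based static file server that sets it; it assumes the node.d.ts definitions from DefinitelyTyped, and the port and file handling are illustrative only:

///<reference path="./node.d.ts"/>

import http = require('http');
import fs = require('fs');
import path = require('path');

// Serve files from the current directory with a permissive CORS header
// so XHR based loaders like ColladaLoader can fetch them
http.createServer((req, res) => {
    var file = path.join(process.cwd(), req.url === '/' ? 'index.html' : req.url);
    fs.readFile(file, (err, data) => {
        if (err) {
            res.writeHead(404);
            res.end();
            return;
        }
        // The header CORS actually checks; '*' allows requests from any
        // origin, which is fine for local development only
        res.writeHead(200, { 'Access-Control-Allow-Origin': '*' });
        res.end(data);
    });
}).listen(8080);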

 

A moment about TypeScript

 

I’ve been using TypeScript a fair bit lately and this is the first time I’ve run into major problems with it.  But the experience is enough that I can safely say…

 

TypeScript is not appropriate for new developers to use!

 

Simply put, unless you have a decent amount of experience with JavaScript, you really shouldn’t use TypeScript.  You will have to read the generated code at some point in time, and if you don’t fully understand the code it generates, you are doomed.  The minute definition files aren’t available to you, the experience becomes a hell of a lot less fun.  The process of mapping TypeScript types to existing JavaScript libraries is not trivial and requires a pretty good understanding of both languages, especially when the library you are trying to define uses a number of clever JavaScript tricks… which, basically, is all of them.  The fact there isn’t a reliable tool out there for generating at least boilerplate .d.ts files from .js files is a bit of a puzzle to me.

 

Next, I also have to say TypeScript’s handling of this is just as mind bogglingly stupid as JavaScript’s.  That fat arrow ( => ) functions are used to capture local context, until used as an anonymous method, at which point they capture global (window) context, forcing you to resort to function(), is downright perplexing.  I simply couldn’t get anonymous callback functions to have the proper this context no matter what syntax combination I tried.  Infuriatingly, the _this value TypeScript automatically generates was set to the right value.
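
To illustrate what I mean, here is the difference in a nutshell.  This is a contrived sketch, but it is exactly the trap I kept falling into:

class Example {
    name = "the class instance";

    brokenCallback() {
        // A classic function expression gets its own this; inside this
        // callback, this is NOT the Example instance ( in a browser it
        // generally ends up being window )
        setTimeout(function() {
            console.log(this.name); // undefined, or window.name
        }, 100);
    }

    workingCallback() {
        // The fat arrow captures the enclosing this, which is what you
        // almost always want in a callback
        setTimeout(() => {
            console.log(this.name); // "the class instance"
        }, 100);
    }
}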

 

One other major downside I noticed during my recent struggles is that the newness of the language is very much an annoyance.  When researching bugs or workarounds you quite often find things that are reported as bugs, reported as changed, or reported and never responded to.  This isn’t a bash on the language, as it’s really a problem only time can solve.  However, for a new developer, dealing with a language where a great deal of the material out there is potentially wrong because of language changes is certainly a challenge.  All languages change over time of course, but young languages change more often and more dramatically.

 

Don’t get me wrong, I am not off TypeScript, even though it spent a good part of the last two days pissing me off.  At least until ECMAScript 6 is the norm, I can see a great deal of value in using TypeScript for large projects.

 

For beginners though, with little JavaScript experience… forget about it.  It’s going to cause more headaches than it’s worth.

Programming




WebStorm, Three.JS, Source Maps and ARRRRRGHHHHHHHH

18. July 2014

 

So, as you may be able to tell from the title, I’ve run into a bit of a bug moment.  I am in the process of getting Blender exported models to work with the Three.JS library, as a follow up to this post.  As with all things programming related you are going to run into your share of problems.  This post actually walks through the process of identifying and fixing a bug.

 

The first lesson of bug hunting is always simplify.  Try to replicate the problem in as little code as possible.  What follows below is a blow by blow of  the debugging process.

 

First off, it’s a Three.JS project using TypeScript, authored in WebStorm.  All of these points are important to this story.  So as not to ruin an upcoming post with too much code, I’ll give just the applicable snippet.  The problem is in this code… if it’s in code at all, that is.  Yeah, that’s a bit of a hint.

 

        var modelLoader = new THREE.JSONLoader();

        modelLoader.load("dice.jsm", function(geometry,materials){
            var mesh = new THREE.SkinnedMesh(geometry,
                new THREE.MeshFaceMaterial(materials));
            mesh.position. x = mesh.position.y = mesh.position.z = 0;
            this.scene.add(mesh);
        });

 

Anyways, I started off by trying to load this model:

image

Exported from Blender to Three.JS JSON format.

 

When I run the code however I get the following in WebStorm:

image

 

Unexpected token /?

 

Hmmm, looks like something went wrong in the export process.  This is never a fun thing to debug, as the output of the above model is a JSON file 1.5MB in size.

 

So, it’s time to simplify.  I need a model with a texture, nothing more.  Let’s make something about as basic as possible.  So I hack together a quick textured model in Blender and export it.  This is the new model:

image

 

Ok, that is definitely simpler.  Now when I run it I get the exact same error.  Ok, this file should be a hell of a lot easier to debug.  Let’s take a look at the generated JSON file.  Oh, top tip… right click the js file and tell WebStorm to treat it as plain text, otherwise it will clobber your CPU trying to parse the JavaScript!

 

 

{

	"metadata" :
	{
		"formatVersion" : 3.1,
		"generatedBy"   : "Blender 2.7 Exporter",
		"vertices"      : 8,
		"faces"         : 6,
		"normals"       : 2,
		"colors"        : 0,
		"uvs"           : [24],
		"materials"     : 1,
		"morphTargets"  : 0,
		"bones"         : 0
	},

	"scale" : 1.000000,

	"materials" : [	{
		"DbgColor" : 15658734,
		"DbgIndex" : 0,
		"DbgName" : "Material",
		"blending" : "NormalBlending",
		"colorAmbient" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorDiffuse" : [0.6400000190734865, 0.6400000190734865, 0.6400000190734865],
		"colorEmissive" : [0.0, 0.0, 0.0],
		"colorSpecular" : [0.5, 0.5, 0.5],
		"depthTest" : true,
		"depthWrite" : true,
		"mapDiffuse" : "crate.jpg",
		"mapDiffuseWrap" : ["repeat", "repeat"],
		"shading" : "Lambert",
		"specularCoef" : 50,
		"transparency" : 1.0,
		"transparent" : false,
		"vertexColors" : false
	}],

	"vertices" : [1,-1,0,1,0,1,-1,0,0,0,-1,0,1,0,0,0,1,1,-1,1,0,0,0,0],

	"morphTargets" : [],

	"normals" : [0.577349,0.577349,0.577349,0.577349,0.577349,-0.577349],

	"colors" : [],

	"uvs" : [[0.988679,0.99767,0.988677,0.016243,0.007251,0.016244,0.007252,0.
	997671,0.989755,0.017099,0.989755,0.998526,0.008328,0.998526,0.008328,0.017099,
	0.990714,0.989755,0.009287,0.989755,0.009286,0.008328,0.990713,0.008328,0.
	000516,0.993662,0.981943,0.993661,0.981942,0.012235,0.000516,0.012235,0.987766,
	0.997568,0.987766,0.016141,0.006339,0.016141,0.006339,0.997568,0.986807,0.
	986807,0.986807,0.005381,0.00538,0.00538,0.00538,0.986807]],

	"faces" : [43,0,3,2,1,0,0,1,2,3,0,0,1,1,43,4,7,6,5,0,4,5,6,7,0,0,1,1,43,0,4,5,1,
	0,8,9,10,11,0,0,1,1,43,1,2,6,5,0,12,13,14,15,1,1,1,1,43,2,3,7,6,0,16,17,18,19,1,
	0,0,1,43,3,0,4,7,0,20,21,22,23,0,0,0,0],

	"bones" : [],

	"skinIndices" : [],

	"skinWeights" : [],

  "animations" : []


}

 

Well, the first obvious thing is to look for an offending / in this code.

Hmmm… there is none.  Well, we wouldn’t make the big bucks if this was easy now, would we?

 

Let’s go back to our error for a second:

image

 

Well, other than the fact we know we have a / where we shouldn’t, we also have the line of code that is going all explodey.  Let’s start there.  This is one of those things that makes WebStorm so freaking cool.  Just click the link “three.js:11960” and it will automatically download that script file and go to that position.  Let’s take a look at the resulting code:

image

 

Ok, that’s some pretty straightforward code.  Basically it’s an XML response handler.  As we can tell from the above code, the callback executed all nice and dandy.  As you can see, on line 11960 I’ve set a breakpoint to pause execution before our exception; let’s see if that gives us any insight.  If you don’t know about breakpoints and debugging, drop everything and learn.  You will literally become a better programmer overnight.

 

So… to the debugger!  Let’s see what the value of responseText is:

 

By hovering over it, everything looks fine, seems to match the file we are expecting:

image

 

That said, let’s take a look at the whole thing.  Now we are going to use a feature called “Evaluate Expression”.  Again, if you don’t know what this is, drop everything and go learn.  I really need to write a debugging tutorial…. 

 

image

 

Copy the value and paste it into your editor of choice.  Then scroll to the very bottom:

image

 

Oh son of a bi….

 

That, my friend, is our bug.  See, WebStorm has the ability to generate something called a SourceMap, which helps the debugger translate your running code back to the code you created.  This is especially useful if you, like me, are working in a JavaScript generating language like TypeScript.  As you can see, sometimes this is not ideal however.  Basically, when run, WebStorm was appending a source map reference to the bottom of our js file, rendering it invalid JSON and causing the parser to puke.
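
In other words, the tail of the served “JSON” file looked something like this ( the map file name here is illustrative ):

  "animations" : []
}
//# sourceMappingURL=dice.js.map

A comment is perfectly legal JavaScript, but it is not legal JSON, hence the “Unexpected token /”.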

 

There are two immediate solutions to this problem.  First, we can disable source map generation.  Unfortunately this is a project wide setting as far as I can tell, and I rather like the ability to debug.  The easier solution is to change the file from a .js extension to something different.  However, once deployed to a server this can have unintended side effects.  For example, IIS will not, by default, serve files without a registered MIME type.

 

Oh, and for the astute, once I got past the problem by renaming the file extension, I did in fact discover two errors in my code.  The proper TypeScript code is:

 

        var modelLoader = new THREE.JSONLoader();

        modelLoader.load("dice.jsm", (geometry,materials) => {
            var mesh = new THREE.SkinnedMesh(geometry,
                new THREE.MeshFaceMaterial(materials));
            mesh.position.x = 0; mesh.position.y = mesh.position.z = 0;
            this.scene.add(mesh);
        });

 

Why is an exercise for the reader. :)

Programming




Baking Blender materials to texture to make them usable in a game engine

16. July 2014

 

Or…

How to take a Blender model you downloaded from the web and make it actually usable in your game in 28 easy steps!

 

… granted, the second title doesn’t have the same flow to it, does it?

 

I just had to run through this process and figured I would share it, as it is something that comes up fairly often.  When working with Blender, there are dozens of procedural textures available that can produce some very nice results quickly.  The only problem is, when you get your asset out of Blender and into your game engine, things suddenly go horribly wrong.  That’s because those textures only make sense inside of Blender.  Fortunately, through the magic of baking, you can easily convert them into a texture map usable in any game engine.

 

Let’s take a look how.

 

First we need a model.  I am using a beautiful new model that was recently added to Blend Swap.  It’s a free download but you need to register.  Don’t worry, you can use a real email address; they don’t spam, or at least haven't so far.  The model in question looks like this:

 

image

 

Unfortunately when we load it in Blender we quickly learn this model is in no way game ready.  Let’s take a look:

image

 

Ick.  So instead of a single Mesh, we have a dozen individual meshes.  Problem is, we need to unwrap them as a single object, so let’s join them all together.  First let’s get the camera out of the default layer.

 

If you look at the way this particular Blend is set up, there are currently two layers; the second contains the armature, the first contains everything else.

image

 

Let’s get the camera out of there.  Select the camera object, then hit the M key.  Then select the layer you want to move the camera to, like so:

image

 

Now click the first layer ( bottom left box ); it should only contain geometry.

 

We want to join everything together.  Press ‘A’ to select everything in the layer, then hit “Ctrl + J” to join everything into a single set of geometry.  Now it should look something like this:

image

 

Perfect, now we can unwrap our model.  Switch into Edit mode:

image

 

Press ‘A’ again, until all faces are selected, like so:

image

 

Now we unwrap our model.  Select Mesh->UV Unwrap-> Unwrap ( or Smart UV Project ).

 

Switch your view to UV/Image Editor

image

 

It should look something like this:

image

 

Now create a New Image:

image

 

This image is where we are going to render our texture to.  Here are the settings I used.  Remember, game engines like power of 2 textures.

image

 

Ok, now let’s look at the actual render to texture part.  Take a quick look at how the model is currently shaded:

image

 

Frankly none of those are really game engine friendly.  So let’s render all of those materials out to a single texture.  Go to the render tab

image

 

Scroll down and locate Bake.

In the UV Editor window, make sure everything is selected ( using ‘A’.  They should be highlighted in yellow ).  At this point, with your generated image and all the UV’s selected, it should look like:

image

 

 

Now under bake, set the following settings:

image

The key values being Bake Mode = Full Render and Selected to Active checked.  Now click the Bake button.

 

Up at the top of the Blender window, you should see a progress bar, like so:

image

 

 

Now if you go back to the UV/Image viewer, and select your image RenderedTexture, you should see:

image

 

Cool!

 

Let’s save the result to an external ( game engine friendly ) texture.  Select Image->Save as Image.  Save the image somewhere.  Remember where.

image

 

 

Now let’s modify the textures on our model to use only our newly generated texture map.  First, in the 3D View, switch back to Object Mode from Edit Mode.

Then, open the materials tab:

image

 

Select each material and hit the – ( or killswitch engage! ) button.  So it should ultimately look like this:

image

 

Now hit the + button and create a new Material.  Then click the New button.

image

 

The default values for the material should be OK, but depending on your game engine, you may have to enable Face Textures:

image

 

Now click over to the Texture tab.  Click New.

image

 

Drop down the Type box and select Image or Movie.

image

 

Scroll down to the Image section and select Open.  Pick the image you saved earlier.

image

 

Now scroll down to Mapping, drop down Coordinates and select UV.

image

 

Under Map select UVMap.

image

 

Now if you go to the 3D View and set the view mode to Texture:

image

 

TADA!  A game ready model.

 

One word of caution though, if you render this scene in Blender you will get the following result:

image

 

Don’t worry.  That’s just a byproduct of going from Blender materials to texture mapping.  If you want the texture to be seen, you need to add some lights to the scene, or change the material so it has an Emit value > 0, so it provides its own light source.

 

With Emit set to .92, here is the result if you render it:

 

image

 

Now, what about in game?

 

Let’s create a simple LibGDX project that loads and displays our exported model:

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;


public class ModelTest implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;

    @Override
    public void create() {
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());

        camera.position.set(3f,0f,6f);
        camera.lookAt(0f,1f,0f);

        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        UBJsonReader jsonReader = new UBJsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        model = modelLoader.loadModel(Gdx.files.getFileHandle("robot.g3db", FileType.Internal));
        modelInstance = new ModelInstance(model);

        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

        camera.update();

        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}

 

And we run it and:

image

 

Wow, a model downloaded randomly from the Internet actually working in the game engine!  How often does that actually happen? ;)

Programming, Art




Playing around with Three.JS -- Part One of a not quite a tutorial series

15. July 2014

 

There has been a flood of Three.js books on Safari lately, including Essential Three.js and Game Development with Three.js.  Three.js is a JavaScript based 3D library using WebGL ( when it’s available ).  More importantly, it’s just really fun to play with!  Something about working in full 3D in a scripting language is just really satisfying.  I’ve only just been playing and really don’t have a clue what I’m doing, but I figured I would share my results.  As I have been on a TypeScript kick lately, I’ve been writing in TypeScript instead of plain JavaScript, but frankly the differences are fairly minimal.  You can get the TypeScript definitions on DefinitelyTyped.

 

I think I should make something perfectly clear… I have NO idea what I am doing, I am simply playing around.  This isn’t a WebGL tutorial by any definition of the word, just me having skim read a couple of books and played around with a new technology, nothing more.  So if you look at some code and think “damn, that looks hacky” or “isn’t that a really stupid thing to do?”, the answer is probably yes! :)

 

So, disclaimer given, let’s jump right in. 

 

Since this is a web app, we need a host HTML page.  So, here is ours:

<!DOCTYPE html>

<html lang="en">
<head>
    <meta charset="utf-8" />
    <title>ThreeJS Test</title>
    <script src="http://cdnjs.cloudflare.com/ajax/libs/three.js/r67/three.js"></script>
    <script src="app.js"></script>
</head>
<body>
<h1>ThreeJS Test</h1>

<div id="content" style="width:500px;height:500px"></div>
</body>
</html>

 

Nothing really shocking here.  We include three.js using the Cloudflare content delivery network.  If you wanted, of course, you could download the library and deploy it from your own servers.  I assume you don’t have servers situated around the world though, so a CDN will generally beat your own server’s performance.  Next we include app.js, the generated output of our TypeScript application.  In the actual HTML we create a 500x500 DIV named content for, predictably enough, our content!

 

Now let’s take a look at a super simple example app, app.ts:

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    constructor(){
        this.renderer = new THREE.WebGLRenderer({ alpha: true });
        this.renderer.setSize(500,500);
        this.renderer.setClearColor(0xFF0000,1);
        document.getElementById('content').appendChild(this.renderer.domElement);
    }

    start() {
        this.renderer.clear();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

Here in the constructor we create a WebGLRenderer, size it, set the background color to red ( using HTML format hex color coding ) then wire the renderer to the content div.

 

When you run it you should see:

image

 

Cool, our first Three.js application.  Now let’s do something 3D!  Let’s start by creating a camera and rendering a built in 3D object in wireframe.  It’s commented heavily, so I won’t be explaining what is going on.  If you are curious why I did something, leave a comment.

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer: THREE.WebGLRenderer;
    scene: THREE.Scene;
    camera: THREE.Camera;

    constructor(){
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500,500);
        this.renderer.setClearColor(0xFFFFFF,1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0,0,-20);
        this.camera.lookAt(new THREE.Vector3(0,0,0));

        // Create the geometry for a sphere with a radius of 5
        var sphereGeometry = new THREE.SphereGeometry(5);

        // Create a wireframe material that's blueish
        var sphereMaterial = new THREE.MeshBasicMaterial(
            {color: 0x7777ff, wireframe: true});

        // Now make a THREE.Mesh using the geometry and a shader
        var sphere = new THREE.Mesh(sphereGeometry,sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0,0,0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);
        this.renderer.render(this.scene,this.camera);
    }

    start() {
        // Well, aren't I a bit pointless?
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

 

And when we run it, we get:

 

image

 

Cool!  Now time for some texturing ( and as a result, lighting ).

///<reference path="./three.d.ts"/>

class ThreeJSTest {
    renderer:THREE.WebGLRenderer;
    scene:THREE.Scene;
    camera:THREE.Camera;

    constructor() {
        // Create the renderer, in this case using WebGL, we want an alpha channel
        this.renderer = new THREE.WebGLRenderer({ alpha: true });

        // Set dimensions to 500x500 and background color to white
        this.renderer.setSize(500, 500);
        this.renderer.setClearColor(0xFFFFFF, 1);

        // Bind the renderer to the HTML, parenting it to our 'content' DIV
        document.getElementById('content').appendChild(this.renderer.domElement);

        // Create a Scene
        this.scene = new THREE.Scene();

        // And a camera.  Set Field of View, Near and Far clipping planes
        this.camera = new THREE.PerspectiveCamera(45
            , 1
            , 0.1, 1000);

        // Position is -20 along the Z axis and look at the origin
        this.camera.position = new THREE.Vector3(0, 0, -20);
        this.camera.lookAt(new THREE.Vector3(0, 0, 0));

        // Create the geometry for a sphere with a radius of 5
        // This time we cranked up the number of horizontal and vertical
        // sections to make a higher resolution globe
        var sphereGeometry = new THREE.SphereGeometry(5, 20, 20);

        // This time we create a Phong shader material and provide a texture.
        var sphereMaterial = new THREE.MeshPhongMaterial(
            {
                map: THREE.ImageUtils.loadTexture("earth_sphere.jpg")
            }
        );

        // Now make a THREE.Mesh using the geometry and a shader
        var sphere = new THREE.Mesh(sphereGeometry, sphereMaterial);

        // And put it at the origin
        sphere.position = new THREE.Vector3(0, 0, 0);

        // Add it to the scene and render the scene using the Scene and Camera objects
        this.scene.add(sphere);

        // We need some light so our texture will show, add an ambient light to the scene
        this.scene.add(new THREE.AmbientLight(new THREE.Color(0.9,0.9,0.9).getHex()));
        this.renderer.render(this.scene, this.camera);
    }

    render() {
        // Each frame we want to render the scene again
        // Use TypeScript arrow notation to retain the thisocity when passing render to requestAnimationFrame
        // It's possible I invented the word thisocity.
        requestAnimationFrame(() => this.render());
        this.renderer.render(this.scene, this.camera);
    }

    start() {
        // Not so pointless now!
        this.render();
    }
}

window.onload = () => {
    var three = new ThreeJSTest();
    three.start();
};

Bet you can't guess what texture I went with!

 

image

 

So apparently textured 3D objects are nothing difficult, either.

 

This is getting pretty long, so I’ll cut it off here.  Next up I’m going to look at getting a Blender object rendering in Three.JS.

Programming




LibGDX Tutorial Part 12: Using GLSL Shaders (and creating a Mesh)

8. July 2014

 

In this part of the LibGDX tutorial series we are going to take a look at using GLSL shaders.  GLSL stands for OpenGL Shading Language and, since the move from a fixed to a programmable graphics pipeline, shader programming has become incredibly important.  In fact, every single thing rendered with OpenGL has at least a pair of shaders attached to it.  It’s been pretty transparent to you till this point because LibGDX mostly takes care of everything for you.  When you create a SpriteBatch object in LibGDX, it automatically creates a default vertex and fragment shader for you.  If you want more information on working with GLSL I put together the OpenGL Shader Programming Resource Round-up back in May.  It has all the information you should need to get up to speed with GLSL.  For more information on OpenGL in general, I also created this guide.

 

Render Pipeline Overview

 

To better understand the role of GL shaders, it’s good to have a basic understanding of how the modern graphics pipeline works.  This is the high level description I gave in my PlayStation Mobile book; it’s not plagiarism because I’m the author. :)

 

A top-level view of how rendering occurs might help you understand the shader process. It all starts with the shader program, vertex buffers, texture coordinates, and so on being passed in to the graphics device. Then this information is sent off to a vertex shader, which can then transform that vertex, do lighting calculations and more (we will see this process shortly). The vertex shader is executed once for every vertex and a number of different values can be output from this process (these are the out attributes we saw in the shader earlier). Next the results are transformed, culled, and clipped to the screen, discarding anything that is not visible, then rasterized, which is the process of converting from vector graphics to pixel graphics, something that can be drawn to the screen.

The results of this process are fragments, which you can think of as "prospective pixels," and the fragments are passed in to the fragment shader. This is why they are called fragment shaders instead of pixel shaders, although people commonly refer to them using either expression. Once again, the fragment shader is executed once for each fragment. A fragment shader, unlike a vertex shader, can only return a single attribute, which is the RGBA color of the individual pixel. In the end, this is the value that will be displayed on the screen. It sounds like a horribly complex process, but the GPUs have dedicated hardware for performing exactly such operations, millions upon millions of times per second. That description also glossed over about a million tiny details, but that is the gist of how the process occurs.

 

So basically shaders are little programs that run over and over again on the data in your scene.  A vertex shader works on the vertices in your scene ( predictably enough… ) and is responsible for positioning each vertex in the world.  Generally this is a matter of transforming them using some kind of matrix passed in from your program.  The output of the vertex shader is ultimately passed to a fragment shader.  Fragments are, as I said above, prospective pixels: the actual coloured dots that are going to be drawn on the user’s screen.  In the fragment shader you determine how each of these pixels will appear.  So basically a vertex shader is a little C-like program that is run for each vertex in your scene, while a fragment shader is run for each potential pixel.

 

There is one very important point to pause on here…  Fragment and vertex shaders aren’t the only shaders in the modern graphics pipeline.  There are also geometry shaders.  While vertex shaders can modify geometry ( vertices ), geometry shaders actually create new geometry.  Geometry shaders were added in OpenGL 3.2 and D3D10.  Then in OpenGL 4/D3D11, tessellation shaders were added.  Tessellation is the process of sub-dividing a surface to add more detail; moving this process to silicon makes it viable to create much lower detailed meshes and tessellate them on the fly.  So, why are we only talking about fragment and vertex shaders?  Portability.  Right now OpenGL ES and WebGL do not support any other shader types, so if you want to support mobile or WebGL, you can’t use them.

 

SpriteBatch and default Shaders

 

As I said earlier, when you use SpriteBatch, it provides a default vertex and fragment shader for you.  Let’s take a look at each of them now, in the order they occur, starting with the vertex shader:

 

attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;

uniform mat4 u_projTrans;

varying vec4 v_color;
varying vec2 v_texCoords;

void main()
{
    v_color = a_color;
    v_color.a = v_color.a * (256.0/255.0);
    v_texCoords = a_texCoord0;
    gl_Position =  u_projTrans * a_position;
}

 

As I said, GLSL is a very C-like language, right down to including a main() function as the program entry point.  There are a few things to be aware of here.  First are the attribute and uniform variables.  These are variables that are passed in from your source code.  LibGDX takes care of most of these for you, but if you are going to write your own default shader, LibGDX expects all of them to exist.  So then, what is the difference between a uniform and an attribute variable?  A uniform stays the same for every single vertex.  Attributes on the other hand can vary from vertex to vertex.  Obviously this can have performance implications, so if it makes sense, prefer a uniform.  A varying value on the other hand can be thought of as the return value; these values are passed on down the rendering pipeline ( meaning the fragment shader has access to them ).  As you can see from the use of gl_Position, GLSL also has some built in values.  For vertex shaders there are gl_Position and gl_PointSize.  Think of these as output variables provided by OpenGL itself.  gl_Position is ultimately the final position of your vertex.

 

As to what this shader does, it mostly just prepares a number of variables for the fragment shader: the color, the normalized ( 0 to 1 ) alpha value and the texture coordinates.  The texture itself lives in texture unit 0, which is set by calling Texture.bind() in your code, or by LibGDX on your behalf.  Finally it positions the vertex in 3D space by multiplying the vertex’s position by the transformation matrix you passed in as u_projTrans.

 

Now let’s take a quick look at the default fragment shader:

#ifdef GL_ES
#define LOWP lowp
    precision mediump float;
#else
    #define LOWP
#endif

varying LOWP vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;

void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}

 

As you can see, the format is very similar.  The ugly #ifdef allows this code to work on both mobile and higher end desktop machines.  Essentially, if you are running OpenGL ES then the value of LOWP is defined as lowp and the default float precision is set to mediump.  In real world terms, this means that GL ES will run at a lower level of precision for internal calculations, both speeding things up and slightly degrading the results.

The values v_color and v_texCoords were provided by the vertex shader.  A sampler2D on the other hand is a special glsl datatype for accessing the texture bound to the shader.  gl_FragColor is another special built in variable ( like vertex shaders, fragment shaders have some GL provided variables, many more than Vertex shaders in fact ), this one represents the output color of the pixel the fragment shader is evaluating.  texture2D essentially returns a vec4 value representing the pixel at UV coordinate v_texCoords in texture u_texture.  The vec4 represents the RGBA values of the pixel, so for example (1.0,0.0,0.0,0.5) is a 50% transparent red pixel.  The value assigned to gl_FragColor is ultimately the color value of the pixel displayed on your screen.

 

Of course a full discussion of GLSL shaders is wayyy beyond the scope of this document.  Again, if you need more information I suggest you start here.  I am also no expert on GLSL, so you are much better off learning the details from someone else! :)  This does however give you a peek behind the curtain at what LibGDX is doing each frame, and it is going to be important to us in just a moment.

 

Changing the Default Shader

 

There comes a time when you might want to replace the default shader with one of your own.  This process is actually quite simple; let’s take a look.  Let’s say, for some reason, you wanted to render your game entirely in black and white?  Here is a simple vertex and fragment shader combo that will do exactly that:

 

Vertex shader:

attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;

uniform mat4 u_projTrans;

varying vec4 v_color;
varying vec2 v_texCoords;

void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}

Fragment shader:

#ifdef GL_ES
    precision mediump float;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans;

void main() {
        vec3 color = texture2D(u_texture, v_texCoords).rgb;
        float gray = (color.r + color.g + color.b) / 3.0;
        vec3 grayscale = vec3(gray);

        gl_FragColor = vec4(grayscale, 1.0);
}

I saved the files as vertex.glsl and fragment.glsl respectively, in the project assets directory.  The shaders are extremely straightforward.  The vertex shader is in fact just the default vertex shader from LibGDX.  Once again, remember you need to provide certain values for SpriteBatch to work… don’t worry, things will blow up and tell you if they are missing from your shader! :)  The fragment shader simply samples the RGB value of the current texture pixel, takes the “average” of the RGB values and uses that as the output value.

 

Enough with shader code, let’s take a look at the LibGDX code now:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class ShaderTestApp extends ApplicationAdapter {
    SpriteBatch batch;
    Texture img;
    Sprite sprite;
    String vertexShader;
    String fragmentShader;
    ShaderProgram shaderProgram;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
        sprite = new Sprite(img);
        sprite.setSize(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());

        vertexShader = Gdx.files.internal("vertex.glsl").readString();
        fragmentShader = Gdx.files.internal("fragment.glsl").readString();
        shaderProgram = new ShaderProgram(vertexShader,fragmentShader);
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.setShader(shaderProgram);
        batch.draw(sprite,sprite.getX(),sprite.getY(),sprite.getWidth(),sprite.getHeight());
        batch.end();
    }
}

 

And when you run it:

image

 

Tada, your output is grayscale!

As to what we are doing in that code: we load each shader file as a string.  We then create a new ShaderProgram, passing in a vertex and fragment shader.  The ShaderProgram is the class that populates all the various variables your shaders expect, bridging the divide between the Java world and the GLSL world.  Then in render() we set our ShaderProgram as active by calling setShader().  Truth is, we could have done this just once in the create() method instead of once per frame.

 

Multiple Shaders per Frame

 

In the above example, when we set the shader program, it applied to all of the output.  That’s nice if you want to render the entire world in black and white, but what if you just wanted to render a single sprite using your shader?  Well fortunately that is pretty easy, you simply change the shader again.  Consider:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class ShaderTest2 extends ApplicationAdapter {
    SpriteBatch batch;
    Texture img;
    Sprite leftSprite;
    Sprite rightSprite;
    String vertexShader;
    String fragmentShader;
    ShaderProgram shaderProgram;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
        leftSprite = new Sprite(img);
        rightSprite = new Sprite(img);

        leftSprite.setSize(Gdx.graphics.getWidth()/2, Gdx.graphics.getHeight());
        leftSprite.setPosition(0,0);
        rightSprite.setSize(Gdx.graphics.getWidth()/2, Gdx.graphics.getHeight());
        rightSprite.setPosition(Gdx.graphics.getWidth()/2,0);

        vertexShader = Gdx.files.internal("vertex.glsl").readString();
        fragmentShader = Gdx.files.internal("fragment.glsl").readString();
        shaderProgram = new ShaderProgram(vertexShader,fragmentShader);
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

        batch.setShader(null);
        batch.begin();
        batch.draw(leftSprite, leftSprite.getX(), leftSprite.getY(), leftSprite.getWidth(), leftSprite.getHeight());
        batch.end();

        batch.setShader(shaderProgram);
        batch.begin();
        batch.draw(rightSprite, rightSprite.getX(), rightSprite.getY(), rightSprite.getWidth(), rightSprite.getHeight());
        batch.end();
    }
}

 

And when you run it:

image

 

One sprite using the default shader, one sprite rendered using the black and white shader.  As you can see, it’s simply a matter of calling setShader() multiple times.  Calling setShader() and passing in null restores the default built-in shader.  However, each time you call setShader() there is a fair amount of setup done behind the scenes, so you want to minimize the number of times you call it.  Or…

 

Setting Shader on a Mesh Object

 

Each Mesh object in LibGDX has its own ShaderProgram.  Behind the scenes, SpriteBatch is actually creating one large Mesh out of all the sprites on your screen, which are ultimately just textured quads.  So if you have a game object that needs fine grained shader control, you may consider rolling your own Mesh object.  Let’s take a look at such an example:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.*;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class MeshShaderApp extends ApplicationAdapter {
    SpriteBatch batch;
    Texture texture;
    Sprite sprite;
    Mesh mesh;
    ShaderProgram shaderProgram;

    @Override
    public void create () {
        batch = new SpriteBatch();
        texture = new Texture("badlogic.jpg");
        sprite = new Sprite(texture);
        sprite.setSize(Gdx.graphics.getWidth(),Gdx.graphics.getHeight());

        float[] verts = new float[30];
        int i = 0;
        float x,y; // Mesh location in the world
        float width,height; // Mesh width and height

        x = y = 50f;
        width = height = 300f;

        //Top Left Vertex Triangle 1
        verts[i++] = x;   //X
        verts[i++] = y + height; //Y
        verts[i++] = 0;    //Z
        verts[i++] = 0f;   //U
        verts[i++] = 0f;   //V

        //Top Right Vertex Triangle 1
        verts[i++] = x + width;
        verts[i++] = y + height;
        verts[i++] = 0;
        verts[i++] = 1f;
        verts[i++] = 0f;

        //Bottom Left Vertex Triangle 1
        verts[i++] = x;
        verts[i++] = y;
        verts[i++] = 0;
        verts[i++] = 0f;
        verts[i++] = 1f;

        //Top Right Vertex Triangle 2
        verts[i++] = x + width;
        verts[i++] = y + height;
        verts[i++] = 0;
        verts[i++] = 1f;
        verts[i++] = 0f;

        //Bottom Right Vertex Triangle 2
        verts[i++] = x + width;
        verts[i++] = y;
        verts[i++] = 0;
        verts[i++] = 1f;
        verts[i++] = 1f;

        //Bottom Left Vertex Triangle 2
        verts[i++] = x;
        verts[i++] = y;
        verts[i++] = 0;
        verts[i++] = 0f;
        verts[i] = 1f;

        // Create a mesh out of two triangles rendered clockwise without indices
        mesh = new Mesh( true, 6, 0,
                new VertexAttribute( VertexAttributes.Usage.Position, 3, ShaderProgram.POSITION_ATTRIBUTE ),
                new VertexAttribute( VertexAttributes.Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE+"0" ) );

        mesh.setVertices(verts);

        shaderProgram = new ShaderProgram(
                Gdx.files.internal("vertex.glsl").readString(),
                Gdx.files.internal("fragment.glsl").readString()
                );
    }

    @Override
    public void render () {

        Gdx.gl20.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl20.glClearColor(0.2f, 0.2f, 0.2f, 1);
        Gdx.gl20.glClear(GL20.GL_COLOR_BUFFER_BIT);
        Gdx.gl20.glEnable(GL20.GL_TEXTURE_2D);
        Gdx.gl20.glEnable(GL20.GL_BLEND);
        Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);

        batch.begin();
        sprite.draw(batch);
        batch.end();

        texture.bind();
        shaderProgram.begin();
        shaderProgram.setUniformMatrix("u_projTrans", batch.getProjectionMatrix());
        shaderProgram.setUniformi("u_texture", 0);
        mesh.render(shaderProgram, GL20.GL_TRIANGLES);
        shaderProgram.end();
    }
}

And when you run it:

 

image

This sample is long but fairly simple.  In create() we create the geometry for a quad by defining two triangles.  We then load our ShaderProgram just like we did in the earlier example.  You may notice that in creating the Mesh we define two VertexAttribute values and bind them to values within our ShaderProgram.  These are the input values into the shader.  Unlike with SpriteBatch and the default shader, you need to do a bit more of the behind the scenes work when rolling your own Mesh.

 

Then in render() you can see we work with the SpriteBatch normally, but then draw our Mesh object using Mesh.render(), passing in the ShaderProgram.  Texture.bind() is what binds the texture from LibGDX to texture unit 0 in the GLSL shader.  We then pass in our required uniform values using setUniformMatrix() and setUniformi() ( as in int ).  This is how you set uniform values from the Java side of the fence.  u_texture says which texture unit to use, while u_projTrans is the transformation matrix for positioning items within our world.  In this case we are simply using the projection matrix from the SpriteBatch.

 

Using a Mesh instead of a Sprite has some disadvantages however.  When working with Sprites, all geometry is batched into a single object, which is good for performance.  More importantly, with a Mesh you need to recreate any functionality you need from Sprite yourself.  For example, if you want to support scaling or rotation, you have to provide that functionality.

Programming