Ok now, what the hell is going on here. Blender and bones are going to be the death of me

18. February 2014

So I’ve posted a lot lately about my recent experiences working with bones exported from Blender to LibGDX.  I encountered two problems: first, only the base of the bone is available once exported; second, bones that were external to the mesh weren’t being updated.  The first isn’t really a big deal, except the solution to it is seemingly impacted by the second problem.  There was a thread over on the LibGDX forums where I posted my experiences, and Xoppa, the guy behind the 3D portions of LibGDX, replied that my observations simply weren’t correct.  This post is mostly a recap of that thread.  I am not sure if any of it will be of value to you, but it does illustrate that sometimes… who knows, it might just be gremlins!

 

So I set about creating a minimal sample to illustrate the problem I was having.  I have literally done this a few dozen, perhaps a hundred, times in the past week.

 

I created an ultra simple model in Blender, with a bone external to the mesh.  In my experience so far, every time I try to get the position of this external bone, the result is always 0,0,0.  All internal bones work fine, but the external one doesn’t.

 

Here is the model:

11

 

Simple enough, a mesh with a simple three-bone armature bound to it.  What you don’t see is that I’ve also created a small animation sequence ( as Default Take ).

 

I then load it with the following code.  The idea is to draw a sphere at the location of each bone.  What I expected to see is two spheres, with the third one missing ( as it will actually be at 0,0,0, the same location as the first one ).

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Color;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.VertexAttributes.Usage;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Material;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.graphics.g3d.model.Node;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController;
import com.badlogic.gdx.graphics.g3d.utils.ModelBuilder;
import com.badlogic.gdx.math.Vector3;
import com.badlogic.gdx.utils.JsonReader;

public class TankDemo implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private AnimationController animationController;

    private Model model;
    private ModelInstance modelInstance;

    private Model pivot;
    private ModelInstance p1, p2, p3;
    private Node bone1, bone2, bone3;

    private Environment environment;

    @Override
    public void create() {
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());

        camera.position.set(0f, 0f, -8f);
        camera.lookAt(0f, 0f, 0f);
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        // Load the g3dj ( text format ) model exported from Blender
        JsonReader jsonReader = new JsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        model = modelLoader.loadModel(Gdx.files.getFileHandle("data/demo.g3dj", FileType.Internal));
        modelInstance = new ModelInstance(model);

        animationController = new AnimationController(modelInstance);
        animationController.animate("Default Take", -1, null, 0);

        // Grab the three bone nodes by name
        bone1 = modelInstance.getNode("Bone");
        bone2 = modelInstance.getNode("Bone_001");
        bone3 = modelInstance.getNode("Bone_002");

        // A small red wireframe sphere used to mark each bone's location
        ModelBuilder mb = new ModelBuilder();
        pivot = mb.createSphere(0.5f, 0.5f, 0.5f, 10, 10, GL20.GL_LINES,
                new Material(ColorAttribute.createDiffuse(Color.RED)),
                Usage.Position | Usage.Normal);
        p1 = new ModelInstance(pivot);
        p2 = new ModelInstance(pivot);
        p3 = new ModelInstance(pivot);

        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

        animationController.update(Gdx.graphics.getDeltaTime());

        // Read back bone1's position ( just for inspection ), then move each marker
        // sphere to the corresponding bone's transform
        Vector3 pos = new Vector3();
        bone1.globalTransform.getTranslation(pos);
        p1.transform.set(bone1.globalTransform);
        p2.transform.set(bone2.globalTransform);
        p3.transform.set(bone3.globalTransform);

        camera.update();
        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.render(p1, environment);
        modelBatch.render(p2, environment);
        modelBatch.render(p3, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}

 

Then when I run this code I see:

1

 

Or, with the line rendering the main mesh itself commented out to more clearly show the bone locations:

2

 

For #$@#$@ing #@$@#$ @#$@$’s sake...

 

Why the swearing?  Because this is working EXACTLY how it is supposed to.  EXACTLY how I expected it to a week ago.  Exactly how Xoppa said it should.

 

What it isn’t doing is behaving EXACTLY how it has been for the last 50 or so times I tried the same thing!  Literally every time I did this before, that final bone didn’t update.  No bones in the armature that were external to the mesh itself were updated.  So I thought maybe that’s it…  maybe something about this export caused the third bone to be part of the geometry… maybe that’s it!

Wtf

No such luck.  The exported g3dj file looks just like the dozens of others I have generated in the past.

 

This is unbelievably frustrating at this point, as a problem I have been trying furiously to work around simply seems to no longer exist.  At this point I simply have NO idea what I was doing in the past that caused the entire process to break.  And that doesn’t leave me with a warm fuzzy feeling.

 

Getting content out of Blender has never been the funnest process, but this last week has been an exercise in frustration… and at the end of the day, the source of the frustration seems to no longer exist.

 

There are a few differences between this example and some of my prior ones.  Generally I load two different Models from disk ( instead of building one using ModelBuilder ), so perhaps once I add another model to the mix I will start seeing the old behaviour ( although that really wouldn’t make much sense ).  Also, I am using a different computer than I normally use ( I’m on my Mac today ), but it has the same versions of Blender and LibGDX, so that shouldn’t be a factor either.

 

At this point, I just don’t know what to say… maybe the fates simply hated me last week… that would explain the infernal cold they inflicted upon me!

Programming




No bones about it. Bones in LibGDX and Blender

14. February 2014

This is one of those things I’ve been fighting with for the past few days so I thought I would share a bit.

 

First a bit of a primer for those of you who aren’t all that familiar with bones yet.  Bones are a common way of animating 3D geometry.  Essentially you add an armature (skeleton) to your scene and bind it to the geometry of your mesh.  Moving the bones that compose your armature will then update the bound geometry.

 

Let’s take a look at an example in Blender.  Here is the anatomy of skeletal animation in Blender:

Blender

 

Each bone in turn has a “weight”.  This is the amount of influence the bone’s movements have on the geometry.  First you need to parent the mesh to the armature.  Simply select the mesh, then shift select the bones and press Ctrl+P.

 

Each bone then has a certain weight attached to it.  The colour determines how much influence a bone has over the geometry.  Consider the following; it’s the weight mapping for the topmost bone in that example:

Blender2

 

So now, if I rotate that bone in Pose mode, we see:

Blender3

As you can see, moving the bone changes the surrounding geometry based on the bone’s influence area.  In a nutshell, that is how bone animation works in Blender.  I cover more of the “how to” in the Blender to LibGDX tutorial if you want details.

 

 

So that’s bones in Blender; let’s take a look at the LibGDX side of the equation.  Here is how the skeleton is represented in a g3dj file:

{"id": "Armature", 

"rotation": [-0.707107,  0.000000,  0.000000,  0.707107], 

"scale": [ 1.000000,  1.000000,  1.000000], 

"translation": [ 0.012381, -0.935900, -0.017023], 

"children": [

  {"id": "Bone", 

  "rotation": [ 0.500000, -0.500000,  0.500000,  0.500000], 

  "scale": [ 1.000000,  1.000000,  1.000000], 

  "children": [

    {"id": "Bone_001", 

    "rotation": [ 0.000000,  0.009739, -0.000000,  0.999953], 

    "scale": [ 1.000000,  1.000000,  1.000000], 

    "translation": [ 1.000000,  0.000000,  0.000000], 

    "children": [

      {"id": "Bone_002", 

      "rotation": [-0.000000, -0.013871,  0.000000,  0.999904], 

      "scale": [ 1.000000,  1.000000,  1.000000], 

      "translation": [ 1.575528,  0.000000,  0.000000]}

      ]}

    ]}

  ]}

]

 

You can also see the bones as part of the geometry node as well, like so:

{"id": "Cube",
   "translation": [-0.012381, -0.017023, 0.935900],
   "parts": [
   {"meshpartid": "shape1_part1",
   "materialid": "Material",
   "bones": [
      {"node": "Bone",
      "translation": [ 0.012381, 0.017023, -0.935900, 0.000000],
      "rotation": [ 0.500000, -0.500000, 0.500000, 0.500000],
      "scale": [ 1.000000, 1.000000, 1.000000, 0.000000]},

      {"node": "Bone_002",
      "translation": [ 0.012381, 0.047709, 1.639329, 0.000000],
      "rotation": [ 0.502062, -0.502062, 0.497930, 0.497930],
      "scale": [ 1.000000, 1.000000, 1.000000, 0.000000]},

      {"node": "Bone_001",
      "translation": [ 0.012381, 0.017023, 0.064100, 0.000000],
      "rotation": [ 0.495107, -0.495107, 0.504846, 0.504846],
      "scale": [ 1.000000, 1.000000, 1.000000, 0.000000]}
   ],
   "uvMapping": [[]]}
]},

 

The latter are the bones that are contained in the mesh “Cube”.  This will be relevant in a minute.  For now, let’s look at the Armature composition.

 

Each bone within the hierarchy is basically just a series of transforms relative to its parent.  The armature itself has a rotation, scale and translation, as does each child.  In your ModelInstance, the Armature is a hierarchy of Nodes, like so:

GDX1
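If you want to see that hierarchy from code, here is a minimal sketch ( my own addition, not part of the original sample ) that walks the Nodes of a loaded ModelInstance and prints each one's local translation next to its world position taken from globalTransform.  The dumpNodes helper is hypothetical, and on newer LibGDX versions the children field is exposed as getChildren() instead:

// Walks the node hierarchy and logs local vs. global positions for each node/bone
private void dumpNodes(Iterable<Node> nodes, String indent) {
    Vector3 world = new Vector3();
    for (Node node : nodes) {
        node.globalTransform.getTranslation(world);
        Gdx.app.log("Nodes", indent + node.id + " local=" + node.translation + " world=" + world);
        dumpNodes(node.children, indent + "  ");   // recurse into child bones
    }
}

// Usage, for example at the end of create():
// dumpNodes(modelInstance.nodes, "");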

 

Animations then are simply a series of transforms applied to bones over a period of time, like so:

GDX2

 

These values correspond with the keyframe values you set in Blender.
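If you want to confirm from code which bones an exported animation actually drives, a quick sketch like this works against the loaded Model ( my own check, not from the original post; it uses the Animation and NodeAnimation classes from com.badlogic.gdx.graphics.g3d.model ):

// List every animation in the model, its duration, and the bone nodes it animates
for (Animation anim : model.animations) {
    Gdx.app.log("Anim", anim.id + " (" + anim.duration + "s)");
    for (NodeAnimation nodeAnim : anim.nodeAnimations) {
        Gdx.app.log("Anim", "  drives node: " + nodeAnim.node.id);
    }
}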

 

Now there are a couple gotchas to be aware of!

First off, in LibGDX a bone is probably more accurately called a joint.  Remember what a bone looked like in Blender:

GDX3

Only the “bone head” is used.  The tail effectively doesn’t exist.

 

So, positioning relative to a bone will bring you to the base, not the tail.  Therefore, if you want to, say, use bones for positioning other limbs, you need to create an extra one, and this leads to a problem.
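To make “positioning relative to a bone” concrete, this is roughly how you would stick another ModelInstance onto a bone in LibGDX — just a sketch, and the “Bone_003” mount bone and axeInstance are hypothetical names:

// Place a weapon ModelInstance at a mount bone.  globalTransform is in model space,
// so multiply by the character instance's world transform first.
Node mountBone = characterInstance.getNode("Bone_003");
axeInstance.transform.set(characterInstance.transform).mul(mountBone.globalTransform);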

Say I want to create a bone that I can search for in code to mount a weapon upon.  I would then have to do something like this:

Blender5

 

This allows me to locate the very tip of my geometry.  But there is a catch.  If I export it, I can see the new bone Bone_003 is part of my armature:

Gdx6

 

That said, remember the entry for “Cube” showed the bones it contains… yeah well, that’s a problem.

Gdx7

See… the new bone isn’t actually contained within the geometry.

As a direct result, when working with it in code in LibGDX, it just doesn’t work.  It never returns the proper position, or at least the position I would expect.  I’ve also had some weird behaviour where an exported model with only a single bone couldn’t be programmatically updated either.  I need to investigate this further.

 

As a result, I’ve decided that bones simply aren’t the way to go about it.  Instead what I’ve started doing is putting a null object in where I want weapon mounts to appear.  It doesn’t seem to have the gotchas that bones have, so far.
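For what it’s worth, the null object approach looks identical on the LibGDX side — assuming the null object is a Blender Empty, it comes through as an ordinary Node you can look up by name ( the “WeaponMount” id here is hypothetical ):

// An Empty exports as a plain Node, so it can be queried like any bone
Node mount = modelInstance.getNode("WeaponMount");
Vector3 mountPos = new Vector3();
mount.globalTransform.getTranslation(mountPos);   // position of the mount point in model space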

 

Sorry for the slow rate of updates, I am sick as a dog right now.  So if this post seemed a little incoherent, that’s why! :)

 

Programming




So, this moving bones in LibGDX models is easier than I thought... Sorta

5. February 2014

I literally spent hours on this and it didn’t work.  So I decided to strip it down to absolute basics, create a barebones solution and figure out exactly what is going wrong.

 

The kicker is, the answer is nothing, it works exactly as expected.  Want to manipulate a bone in a Model in LibGDX and see the results propagated?  Well, this is how.

 

First I modelled the following in Blender:

BlobBlender

 

It’s a simple mesh with a single animation attached.  If you read my prior tutorials, the how of it will be no problem.

 

Then I ran it with this code:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input;
import com.badlogic.gdx.InputProcessor;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.graphics.g3d.model.Node;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController;
import com.badlogic.gdx.utils.JsonReader;

public class Boned implements ApplicationListener, InputProcessor {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;

    private Model blobModel;
    private ModelInstance blobModelInstance;
    private Node rootBone;
    private Environment environment;

    private AnimationController animationController;

    @Override
    public void create() {
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());

        camera.position.set(0f, 3f, 5f);
        camera.lookAt(0f, 3f, 0f);
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        JsonReader jsonReader = new JsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        blobModel = modelLoader.loadModel(Gdx.files.getFileHandle("data/blob.g3dj", FileType.Internal));
        blobModelInstance = new ModelInstance(blobModel);

        animationController = new AnimationController(blobModelInstance);
        animationController.animate("Bend", -1, 1f, null, 0f);

        // Grab the root bone node so we can move it from the keyboard
        rootBone = blobModelInstance.getNode("Bone");

        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));

        Gdx.input.setInputProcessor(this);
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        blobModel.dispose();
    }

    @Override
    public void render() {
        // You've seen all this before, just be sure to clear the GL_DEPTH_BUFFER_BIT when working in 3D
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        camera.update();
        animationController.update(Gdx.graphics.getDeltaTime());
        modelBatch.begin(camera);
        modelBatch.render(blobModelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }

    @Override
    public boolean keyDown(int keycode) {
        if (keycode == Input.Keys.LEFT) {
            // Move the root bone one unit to the left
            rootBone.translation.add(-1f, 0, 0);
            return true;
        } else if (keycode == Input.Keys.RIGHT) {
            // Move the root bone one unit to the right
            rootBone.translation.add(1f, 0, 0);
            return true;
        }
        return false;
    }

    @Override
    public boolean keyUp(int keycode) {
        return false;
    }

    @Override
    public boolean keyTyped(char character) {
        return false;
    }

    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        return false;
    }

    @Override
    public boolean touchUp(int screenX, int screenY, int pointer, int button) {
        return false;
    }

    @Override
    public boolean touchDragged(int screenX, int screenY, int pointer) {
        return false;
    }

    @Override
    public boolean mouseMoved(int screenX, int screenY) {
        return false;
    }

    @Override
    public boolean scrolled(int amount) {
        return false;
    }
}

 

End result, you get this:

BonedBlob

 

Press the arrow keys and the root bone is translated exactly as you would expect!

 

Now, I spent HOURS trying to do this, and for the life of me I couldn’t figure out why the heck it didn’t work.  Sometimes going back to the basics gives you a clue.

 

In my test I used two models: one an animated bending arm, somewhat like the above; the other an axe with a single bone for “attaching”.  The exact same code above failed to work.  Something’s up here...

 

So after I get the above working fine, I have an idea… is it the animation?  So I comment out this line:

animationController.animate("Bend",-1,1f,null,0f);

 

BOOM!  No longer works.

 

So it seems changes you make to the bones controlling a Model only propagate if there is an animation playing.  A hacky workaround seems to be to export an empty animation, but there has to be a better way.  At least now I know why I wasted several hours on something that should have just worked.  Now I am going to dig into the code for animate() and see if there is a call I can make manually without requiring an attached animation.

 

EDIT:

Got it!

Gotta admit it took a bit of digging, but I figured out what I was missing.  Each time you make a change to the bones you need to call calculateTransforms() on the ModelInstance that owns the bone!  Change the code like so:

public boolean keyDown(int keycode) {
    if (keycode == Input.Keys.LEFT) {
        rootBone.translation.add(-1f, 0, 0);
        blobModelInstance.calculateTransforms();
        return true;
    } else if (keycode == Input.Keys.RIGHT) {
        rootBone.translation.add(1f, 0, 0);
        blobModelInstance.calculateTransforms();
        return true;
    }
    return false;
}

And presto, it works!

Just a warning: calculateTransforms() doesn’t appear to be lightweight, so use it with caution.
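So if you are driving bones by hand without an animation playing, the pattern that seems sensible ( a sketch, not gospel ) is to apply all of your bone changes first and call calculateTransforms() once per batch, rather than after every individual tweak:

// Batch the manual bone edits, then recalculate the node transforms once
rootBone.translation.add(-1f, 0, 0);
rootBone.rotation.setFromAxis(Vector3.Y, 15f);    // any other bone tweaks go here too
blobModelInstance.calculateTransforms();          // one ( relatively expensive ) call per batch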

If you are curious where in the process calculateTransforms() is called when you call animate(), it’s the end() call in BaseAnimationController.java, called from the method applyAnimations().

Programming




Rendering a 3D model to texture in LibGDX

21. January 2014

 

Back in this post I discussed ways of making dynamically equipped 2D sprites.  One way was to render a 3D object out to 2D textures dynamically.  So far we have looked at working in 3D in LibGDX, then exporting and rendering an animated model from Blender; the next step is rendering a dynamic 3D object to a texture.  At first glance I thought this would be difficult, but in reality it was stupidly simple.  In fact, the very first bit of code I wrote simply worked!  Well, except for the results being upside down I suppose… details, details…

 

Anyway, that is what this post discusses: taking a 3D scene in LibGDX and rendering it to a 2D texture.  The code is just a modification of the code from the Blender to LibGDX post from a couple of days back.

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.InputProcessor;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.Pixmap.Format;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController.AnimationDesc;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController.AnimationListener;
import com.badlogic.gdx.graphics.glutils.FrameBuffer;



public class ModelTest implements ApplicationListener, InputProcessor {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;
    private AnimationController controller;
    private boolean screenShot = false;
    private FrameBuffer frameBuffer;
    private Texture texture = null;
    private TextureRegion textureRegion;
    private SpriteBatch spriteBatch;
    
    @Override
    public void create() {        
        // Create camera sized to screens width/height with Field of View of 75 degrees
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());
        
        // Move the camera 7 units back along the z-axis and look at the origin
        camera.position.set(0f,0f,7f);
        camera.lookAt(0f,0f,0f);
        
        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f; 
        camera.far = 300.0f;

        // A ModelBatch is like a SpriteBatch, just for models.  Use it to batch up geometry for OpenGL
        modelBatch = new ModelBatch();
        
        // Model loader needs a binary json reader to decode
        UBJsonReader jsonReader = new UBJsonReader();
        // Create a model loader passing in our json reader
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        // Now load the model by name
        // Note, the model (g3db file ) and textures need to be added to the assets folder of the Android proj
        model = modelLoader.loadModel(Gdx.files.getFileHandle("data/benddemo.g3db", FileType.Internal));
        // Now create an instance.  Instance holds the positioning data, etc of an instance of your model
        modelInstance = new ModelInstance(model);

        //move the model down a bit on the screen ( in a z-up world, down is -z ).
        modelInstance.transform.translate(0, 0, -2);

        // Finally we want some light, or we wont see our color.  The environment gets passed in during
        // the rendering process.  Create one, then create an Ambient ( non-positioned, non-directional ) light.
        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
        
        // You use an AnimationController to um, control animations.  Each control is tied to the model instance
        controller = new AnimationController(modelInstance);  
        // Pick the current animation by name
        controller.setAnimation("Bend",1, new AnimationListener(){

            @Override
            public void onEnd(AnimationDesc animation) {
                // this will be called when the current animation is done. 
                // queue up another animation called "balloon". 
                // Passing a negative to loop count loops forever.  1f for speed is normal speed.
                controller.queue("Balloon",-1,1f,null,0f);
            }

            @Override
            public void onLoop(AnimationDesc animation) {
                // TODO Auto-generated method stub
                
            }
            
        });
        
        frameBuffer = new FrameBuffer(Format.RGB888,Gdx.graphics.getWidth(),Gdx.graphics.getHeight(),false);
        Gdx.input.setInputProcessor(this);
        
        spriteBatch = new SpriteBatch();
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        // You've seen all this before, just be sure to clear the GL_DEPTH_BUFFER_BIT when working in 3D
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

        // When you change the camera details, you need to call update();
        // Also note, you need to call update() at least once.
        camera.update();
        
        // You need to call update on the animation controller so it will advance the animation.  Pass in frame delta
        controller.update(Gdx.graphics.getDeltaTime());
        
        
        // If the user requested a screenshot, we need to call begin on our framebuffer
        // This redirects output to the framebuffer instead of the screen.
        if(screenShot)
            frameBuffer.begin();

        // Like spriteBatch, just with models!  pass in the box Instance and the environment
        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
        
        // Now tell OpenGL that we are done sending graphics to the framebuffer
        if(screenShot)
        {
            frameBuffer.end();
            // get the graphics rendered to the framebuffer as a texture
            texture = frameBuffer.getColorBufferTexture();
            // welcome to the wonderful world of different coordinate systems!
            // simply put, the framebuffer is upside down to normal textures, so we have to flip it
            // Use a TextureRegion to do so
            textureRegion = new TextureRegion(texture);
            // and.... FLIP!  V (vertical) only
            textureRegion.flip(false, true);
        }
        
        // In the case that we have a texture object to actually draw, we do so
        // using the old familiar SpriteBatch to do so.
        if(texture != null)
        {
            spriteBatch.begin();
            spriteBatch.draw(textureRegion,0,0);
            spriteBatch.end();
            screenShot = false;
        }
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }

    @Override
    public boolean keyDown(int keycode) {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public boolean keyUp(int keycode) {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public boolean keyTyped(char character) {
        // If the user hits a key, take a screen shot.
        this.screenShot = true;
        return false;
    }

    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public boolean touchUp(int screenX, int screenY, int pointer, int button) {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public boolean touchDragged(int screenX, int screenY, int pointer) {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public boolean mouseMoved(int screenX, int screenY) {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public boolean scrolled(int amount) {
        // TODO Auto-generated method stub
        return false;
    }
}

And… that’s it.

Run the code and you get the exact same results as the last example:

blenderAnimation

 

However, if you press any key, the current frame is saved to a texture, and that is instead displayed on screen.  Press a key again and it will update to the current frame:

image

Of course, the background colour is different, because we didn’t explicitly set one.  The above is a LibGDX Texture object, which can now be treated as a 2D sprite, used as a texture map, whatever.
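As an example of the “used as a texture map” part — a sketch of my own, not from the code above — the captured Texture can be dropped straight into a Material on a 3D model via TextureAttribute ( requires the TextureAttribute, ModelBuilder and VertexAttributes.Usage imports ):

// Use the render-to-texture result as the diffuse map of a 3D material
Material screenMaterial = new Material(TextureAttribute.createDiffuse(texture));
Model quad = new ModelBuilder().createBox(2f, 2f, 0.1f, screenMaterial,
        Usage.Position | Usage.Normal | Usage.TextureCoordinates);
// ( remember to dispose() the created Model when you are done with it )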

 

The code is ultra simple.  We have a toggle variable, screenShot, that gets set if the user hits a key.  The actual process of rendering to texture is done with the magic of FrameBuffers.  Think of a framebuffer as a place for OpenGL to render other than the screen.  So instead of drawing the graphics to the screen, it draws them to a memory buffer.  We then get this buffer as a texture using getColorBufferTexture().  The only complication is that the framebuffer is rendered upside down.  This is easily fixed by wrapping the Texture in a TextureRegion and flipping the V coordinate.  Finally we display our newly generated texture using our old friend, the SpriteBatch.

 

 

Gotta love it when something you expect to be difficult ends up being ultra easy.  Next I have to measure the performance, to see if this is something that can be done in realtime, or whether it needs to be done only on load/change.
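For the performance question, the simplest first pass is probably LibGDX’s built-in FPSLogger ( a sketch; where you construct it and call it is up to you ):

// Logs frames per second to the console roughly once a second
private FPSLogger fpsLogger = new FPSLogger();   // com.badlogic.gdx.graphics.FPSLogger

// then at the end of render():
// fpsLogger.log();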

Programming




3D models and animation from Blender to LibGDX

19. January 2014

 

Let me start off by saying that exporting from Blender is always a pain in the ass.  This experience didn’t prove to be an exception.  I will describe the process in some detail.

 

First and foremost, Blender 2.69 DIDN’T work.  At least not for me; no matter what I did, Blender 2.69 would not export texture information in FBX format.  This cost me many hours of my life.  Once I switched to 2.68 everything worked.  Your mileage may vary, but in my experience, 2.69 simply would not work.  Another thing, something I lost another couple of hours to… you need to use GL2 in LibGDX!
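On the “you need to use GL2” point: with the LibGDX version current at the time of writing, that means flipping the flag in your launcher configuration.  A sketch for the desktop project, assuming the pre-1.0 style config with its useGL20 field:

// Desktop starter with OpenGL ES 2.0 enabled ( pre-1.0 LibGDX config )
LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
cfg.title = "ModelTest";
cfg.useGL20 = true;   // required for the 3D pipeline used in this post
new LwjglApplication(new ModelTest(), cfg);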

 

A lot of this process takes place on the Blender end.  I obviously don’t go into depth in how to use Blender.  If you are completely new to Blender, I highly suggest you run through this tutorial, it will teach you pretty much everything you need to know to follow along.

 

STEP 1: Model your … model

 

This step is pretty straight forward, you need to model an object to be exported.  I created this:

image

I took a standard cube, went into Edit mode, extruded the face, selected all and subdivided it.

 

STEP 2: Texture your model

 

Not completely sure why, but every model that is exported to FBX seems to require a texture.  Once again, YOU NEED A TEXTURE.

Next, FBX does NOT include the texture information.  This means you have to save your texture to an image file, and add that image to the assets folder of your project.

You need to UV Unwrap your object.  In Edit mode you can accomplish this by selecting all, then UV Unwrap->Smart UV Project.  In the UV/Image window, it should look like:

image

There are a couple of limitations here.  You need to make sure you have face textures enabled:

image

With your object selected, in materials, make sure Face Textures is selected.

 

Next be sure to select a texture:

image

Type is Image, then under the Image section, select your texture image.  This is the file that needs to be added to your assets folder.

Scroll down to Mapping, change Coordinates to UV, select your UVMap from the drop down, and leave Projection as Flat:

image

 

At this point you can check if your texture is right by switching over to GLSL in the properties panel ( hit N key ) of the 3D View.

 

STEP 3: Set up your rig

 

This part is a bit tricky if you’ve never done it before.  You can create a series of named animations that will be exported.  You animate by setting keyframes.  First you need to set up your skeleton.  Simply select Add->Armature->Single Bone.

image

 

Position the bone within your model.  Switch into Edit mode, select the end “knob” of the bone and select Extrude ( E ) to create more bones.  Repeat until it looks like this:  ( FYI, you can use Z to make a model see-through, handy when placing bones ).

image

Next you want to set the bones to manipulate the underlying mesh.

In Object mode, select your mesh, then shift-click select the underlying bones.  Now parent them by hitting CTRL + P and selecting “With Automatic Weights”.

image

 

STEP 4: Animate your model

 

Now you can set up your animations.  Since we want multiple animations in the same file we are doing this slightly differently.

First set up your duration.  In the timeline, set end to the last frame of your longest animation:

image

 

 

Bring up a dopesheet view, then select the Action Editor:

image

 

Click the + icon, or the New button ( depending on whether you have any existing animations, your options will change ).

image

Name the animation.

 

Now go to your first frame ( Frame 1 using the controls in timeline ).

Select your bone/armature and switch to POSE mode.  Press A to select all bones.

Create a keyframe by hitting I ( as in “eye” ) then in the resulting menu select LocRotScale ( as in, Location, Rotation and Scale ), which will create a keyframe tracking those three things.

image

 

Now advance to the next frame you want to create a keyframe upon, rotate, scale or move your bone, then select all bones again and create another key.

You can create multiple named animations using the same process.  Simply click the + icon, name another animation, go back to frame 1, position it, set a keyframe, then animate, setting keyframes as you go.

 

STEP 5: Exporting to FBX

 

This part is pretty hit or miss.  You have a couple options, you can select just your mesh and armature, or simply export everything.

Then select File->Export->Autodesk FBX:

image

 

The documentation says to stay with the default axis settings and the FBX converter will take care of the rest.  Frankly, I could never get this to work.

These are the settings I am currently using:

image

 

STEP 6: Convert to g3db format and add to Eclipse

If you haven’t already, download fbx-conv and extract it to the directory where you exported your FBX.  Make sure your texture file is there as well.  Open a command prompt and CD to that directory.  Next type:

fbx-conv-win32 -f yourfilename.fbx

image
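As an aside, if your fbx-conv build supports the -o flag, you can also emit the text-based g3dj format, which is handy for inspecting nodes, bones and animation names in a text editor:

fbx-conv-win32 -o g3dj yourfilename.fbx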

 

This should generate a g3db file.  Copy it and your textures to the assets/data directory of your android project:

image

 

STEP 7: The code

 

This code demonstrates how to load a 3D model and play multiple animations.  There is no description beyond the code comments.  If you have any questions the comments don’t cover, ask them below!

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL10;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController.AnimationDesc;
import com.badlogic.gdx.graphics.g3d.utils.AnimationController.AnimationListener;



public class ModelTest implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;
    private AnimationController controller;
    
    @Override
    public void create() {        
        // Create camera sized to screens width/height with Field of View of 75 degrees
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());
        
        // Move the camera 7 units back along the z-axis and look at the origin
        camera.position.set(0f,0f,7f);
        camera.lookAt(0f,0f,0f);
        
        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f; 
        camera.far = 300.0f;

        // A ModelBatch is like a SpriteBatch, just for models.  Use it to batch up geometry for OpenGL
        modelBatch = new ModelBatch();
        
        // Model loader needs a binary json reader to decode
        UBJsonReader jsonReader = new UBJsonReader();
        // Create a model loader passing in our json reader
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        // Now load the model by name
        // Note, the model (g3db file ) and textures need to be added to the assets folder of the Android proj
        model = modelLoader.loadModel(Gdx.files.getFileHandle("data/blob.g3db", FileType.Internal));
        // Now create an instance.  Instance holds the positioning data, etc of an instance of your model
        modelInstance = new ModelInstance(model);
        
        //fbx-conv is supposed to perform this rotation for you... it doesn't seem to
        modelInstance.transform.rotate(1, 0, 0, -90);
        //move the model down a bit on the screen ( in a z-up world, down is -z ).
        modelInstance.transform.translate(0, 0, -2);

        // Finally we want some light, or we wont see our color.  The environment gets passed in during
        // the rendering process.  Create one, then create an Ambient ( non-positioned, non-directional ) light.
        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
        
        // You use an AnimationController to um, control animations.  Each control is tied to the model instance
        controller = new AnimationController(modelInstance);  
        // Pick the current animation by name
        controller.setAnimation("Bend",1, new AnimationListener(){

            @Override
            public void onEnd(AnimationDesc animation) {
                // this will be called when the current animation is done. 
                // queue up another animation called "balloon". 
                // Passing a negative to loop count loops forever.  1f for speed is normal speed.
                controller.queue("balloon",-1,1f,null,0f);
            }

            @Override
            public void onLoop(AnimationDesc animation) {
                // TODO Auto-generated method stub
                
            }
            
        });
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        // You've seen all this before, just be sure to clear the GL_DEPTH_BUFFER_BIT when working in 3D
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
        
        // For some flavor, lets spin our camera around the Y axis by 1 degree each time render is called
        //camera.rotateAround(Vector3.Zero, new Vector3(0,1,0),1f);
        // When you change the camera details, you need to call update();
        // Also note, you need to call update() at least once.
        camera.update();
        
        // You need to call update on the animation controller so it will advance the animation.  Pass in frame delta
        controller.update(Gdx.graphics.getDeltaTime());
        // Like spriteBatch, just with models!  pass in the box Instance and the environment
        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}

 

Now if you run it, you should see:

blenderAnimation

 

And that is how you get a textured, animated 3D model from Blender to LibGDX.

 

VIDEO EDITION!

 

Ok, that might have been a bit confusing, so I decided to screen capture the entire process.  The following video shows creating, texturing, animating then exporting a model from Blender to LibGDX.  There is no instruction beyond what is above, but it might help you seeing the entire process, especially if something I said above didn’t make sense, or if your own export doesn’t work out.

 

The video on YouTube is much higher resolution than the embedded one below.

 

Let me know if you have any questions.

Programming, Art