27. August 2014

 

Today we are going to look at implementing physics in LibGDX.  This technically isn’t part of LibGDX itself, but is instead implemented as an extension.  The physics engine used in LibGDX is the popular Box2D physics system, a library that has been ported to basically every platform and language ever invented, or so it seems.  We are going to cover how to implement Box2D physics in your 2D LibGDX game.  This is a complex subject, so it will require multiple parts.

 

If you’ve never used a Physics Engine before, we should start with a basic overview of what they do and how they work.  Essentially a physics engine takes scene information that you provide, then calculates “realistic” movement using physics calculations.  It goes something like this:

  • you describe all of the physics entities in your world to the physics engine, including bounding volumes, mass, velocity, etc.
  • you tell the engine to update, either per frame or on some other interval
  • the physics engine calculates how the world has changed: what’s collided with what, how much gravity affects each item, each body’s current speed and position, etc.
  • you take the results of the physics simulation and update your world accordingly

 

Don’t worry, we will look at exactly how in a moment.

 

First we need to talk for a moment about creating your project.  Since Box2D is now implemented as an extension ( an optional LibGDX component ), you need to add it either manually or when you create your initial project.  Adding a library to an existing project is IDE dependent, so I am instead going to look at adding it during project creation… and totally not just because it’s really easy that way.
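
That said, if you ever do need to add Box2D to an existing Gradle based project, the extension boils down to a couple of dependency entries.  As a rough sketch ( artifact names assumed from the standard LibGDX extension naming, so double check them against your generated build.gradle ):

// in the core project's dependencies block
compile "com.badlogicgames.gdx:gdx-box2d:$gdxVersion"

// in the desktop project's dependencies block, the native libraries
compile "com.badlogicgames.gdx:gdx-box2d-platform:$gdxVersion:natives-desktop"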

 

When you create your LibGDX project using the Project Generator, you simply specify which extensions you wish to include and Gradle does the rest.  In this case you simply check the box next to Box2d when generating your project like normal:

 

image

 

… and you are done.  You may be asking, hey, what about Box2dlights?  Nope, you don’t currently need that one.  Box2dlights is a project for simulating lighting and shadows based on the Box2d physics engine.  You may notice another entry in that list named Bullet.  Bullet is another physics engine, although geared more toward 3D games; possibly more on that at a later date.  Just be aware that if you are working in 3D, Box2d isn’t of much use to you, but there are alternatives.

 

Ok, now that we have a properly configured project, let’s take a look at a very basic physics simulation.  We are simply going to take the default LibGDX graphic and apply gravity to it, about the simplest simulation you can make that actually does something.  Code time!

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.math.Vector2;
import com.badlogic.gdx.physics.box2d.*;

public class Physics1 extends ApplicationAdapter {
    SpriteBatch batch;
    Sprite sprite;
    Texture img;
    World world;
    Body body;

    @Override
    public void create() {

        batch = new SpriteBatch();
        // We will use the default LibGdx logo for this example, but we need a
        // Sprite since it's going to move
        img = new Texture("badlogic.jpg");
        sprite = new Sprite(img);

        // Center the sprite in the top/middle of the screen
        sprite.setPosition(Gdx.graphics.getWidth() / 2 - sprite.getWidth() / 2,
                Gdx.graphics.getHeight() / 2);

        // Create a physics world, the heart of the simulation.  The Vector
        // passed in is gravity
        world = new World(new Vector2(0, -98f), true);

        // Now create a BodyDefinition.  This defines the physics object's type
        // and position in the simulation
        BodyDef bodyDef = new BodyDef();
        bodyDef.type = BodyDef.BodyType.DynamicBody;
        // We are going to use 1 to 1 dimensions, meaning 1 in the physics engine
        // is 1 pixel
        // Set our body to the same position as our sprite
        bodyDef.position.set(sprite.getX(), sprite.getY());

        // Create a body in the world using our definition
        body = world.createBody(bodyDef);

        // Now define the dimensions of the physics shape
        PolygonShape shape = new PolygonShape();
        // We are a box, so this makes sense, no?
        // Basically set the physics polygon to a box with the same dimensions
        // as our sprite
        shape.setAsBox(sprite.getWidth()/2, sprite.getHeight()/2);

        // FixtureDef is a confusing expression for physical properties
        // Basically this is where, in addition to defining the shape of the body,
        // you also define its properties like density, restitution and others
        // we will see shortly
        // If you are wondering, density and area are used to calculate overall mass
        FixtureDef fixtureDef = new FixtureDef();
        fixtureDef.shape = shape;
        fixtureDef.density = 1f;

        Fixture fixture = body.createFixture(fixtureDef);

        // Shape is the only disposable of the lot, so get rid of it
        shape.dispose();
    }

    @Override
    public void render() {

        // Advance the world by the amount of time that has elapsed since the last frame
        // Generally in a real game, don't do this in the render loop, as you are tying
        // the physics update rate to the frame rate, and vice versa
        world.step(Gdx.graphics.getDeltaTime(), 6, 2);

        // Now update the sprite's position to match its newly updated physics body
        sprite.setPosition(body.getPosition().x, body.getPosition().y);

        // You know the rest...
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(sprite, sprite.getX(), sprite.getY());
        batch.end();
    }

    @Override
    public void dispose() {
        // Hey, I actually did some clean up in a code sample!
        img.dispose();
        world.dispose();
    }
}

 

The program running:

 

image

 

What’s going on here is mostly explained in the comments, but I will give a simpler overview in English.  Basically, when using a physics engine, you create a physical representation for each corresponding object in your game.  In this case we created a physics object ( Body ) that goes along with our sprite.  It’s important to realize there is no actual relationship between these two objects.  There are a couple of components that go into a physics body: a BodyDef, which defines what type of body it is ( more on this later; for now realize DynamicBody means a body that is updated and capable of movement ), and a FixtureDef, which defines the shape and physical properties of the Body.  Of course, there is also the World, which is the actual physics simulation.

 

So, basically we created a Body, which is the physical representation of our Sprite in the physics simulation.  Then in render() we call the incredibly important step() method.  Step is what advances the physics simulation… basically think of it as the play button.  The physics engine then calculates everything that has changed since the last call to step.  The first value we pass in is the amount of time that has elapsed since the last update.  The next two values control the amount of accuracy in contact/joint calculations for velocity and position.  Basically, the higher the values, the more accurate your physics simulation will be, but the more CPU intensive as well.  Why 6 and 2?  Because that’s what the LibGDX site recommends and that works for me.  At the end of the day these are values you can tweak for your individual game.  The one other critical takeaway here is that we update the sprite’s position to match the newly updated body’s position.  Once again, in this example, there is no actual link between a physics body and a sprite, so you have to do it yourself.

 

There you go, the world’s simplest physics simulation.  There are a few quick topics to discuss before we move on.  First, units.

 

This is an important and sometimes tricky concept to get your head around with physics systems.  What does 1 mean?  One what?  The answer is: whatever the hell you want it to be, just be consistent about it!  In this particular case I used pixels, so 1 unit in the physics engine represents 1 pixel on the screen.  So when I said gravity is (0,-98), that means gravity accelerates objects by –98 pixels per second, per second, along the y axis.  Just as commonly, 1 in the physics engine could be meters, feet, kilometers, etc.; you then use a custom ratio for translating to and from screen coordinates.  Most physics systems, Box2d included, really don’t like you mixing your scales, however.  For example, if you have a universe simulation where 1 == 100 miles, then you want to calculate the movement of an ant at 0.0000001 x 100 miles per hour, you will break the simulation, hard.  Find a scale that works well with the majority of your game and stick with it.  Extremely large and extremely small values within that simulation will cause problems.
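
In practice, the usual approach is a single conversion constant between pixels and world units.  Here is a minimal sketch, assuming a made-up PIXELS_PER_METER constant and helper class ( neither is part of LibGDX or Box2D; pick a value that suits your game ):

public class Units {
    // Hypothetical conversion ratio; choose one and use it everywhere
    public static final float PIXELS_PER_METER = 100f;

    // Screen ( pixel ) coordinates to physics ( meter ) coordinates
    public static float toMeters(float pixels) {
        return pixels / PIXELS_PER_METER;
    }

    // Physics ( meter ) coordinates back to screen ( pixel ) coordinates
    public static float toPixels(float meters) {
        return meters * PIXELS_PER_METER;
    }
}

With something like this in place, your bodies live at a sane meter scale while your sprites keep thinking in pixels.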

 

Finally, a bit of a warning about how I implemented this demo, and hopefully something I will cover properly at a later date.  In this case I updated the physics system in the render loop.  This is a possibility, but generally wasteful.  It’s fairly common to run your physics simulation at a fixed rate ( 30Hz and 60Hz being two of the most common, but lower is also a possibility if you are processing constrained ) and your render loop as fast as possible.
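
The standard trick for that is a fixed timestep accumulator.  This is a general pattern rather than anything LibGDX specific, so treat the following as a sketch:

// Fixed timestep sketch: physics always steps at 60Hz, no matter how
// fast or slow render() is called
private static final float TIME_STEP = 1/60f;
private float accumulator = 0;

private void stepPhysics(float deltaTime) {
    // Clamp huge frame spikes so we don't spiral into endless catch-up steps
    accumulator += Math.min(deltaTime, 0.25f);
    while (accumulator >= TIME_STEP) {
        world.step(TIME_STEP, 6, 2);
        accumulator -= TIME_STEP;
    }
}

You would then call stepPhysics(Gdx.graphics.getDeltaTime()) from render() in place of the direct world.step() call above.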

 

In the next part we will give our object something to collide with, stay tuned.

Programming


9. August 2014

 

LibGDX, the cross platform, Java based, open source gaming library for iOS, Android, HTML5 and Desktop, has just reached version 1.3.  The details of the new release:

image

 

  • API Addition: Added Input.isKeyJustPressed ( quick example after the list )
  • API Addition: multiple recipients are now supported by MessageDispatcher, see https://github.com/libgdx/libgdx/wiki/Message-Handling#multiple-recipients
  • API Change: State#onMessage now takes the message receiver as argument.
  • API Addition: added StackStateMachine to the gdx-ai extension.
  • API change: ShapeRenderer: rect methods accept scale, more methods can work under both line and fill types, auto shape type changing.
  • API change: Built-in ShapeRenderer debugging for Stage, see https://github.com/libgdx/libgdx/pull/2011
  • Files#getLocalStoragePath now returns the actual path instead of the empty string synonym on desktop (LWJGL and JGLFW).
  • Fixed and improved xorshift128+ PRNG implementation.
  • Added support for Tiled’s animated tiles, and varying frame duration tile animations.
  • Fixed an issue with time granularity in MessageDispatcher.
  • Updated to Android API level 19 and build tools 19.1.0 which will require the latest Eclipse ADT 23.02, see http://stackoverflow.com/questions/24437564/update-eclipse-with-android-development-tools-23 for how things are broken this time…
  • Updated to RoboVM 0.0.14 and RoboVM Gradle plugin version 0.0.10
  • API Addition: added FreeTypeFontLoader so you can transparently load BitmapFonts generated through gdx-freetype via AssetManager, see FreeTypeFontLoaderTest.java
  • Preferences put methods now return “this” for chaining
  • Fixed issue 2048 where MessageDispatcher was dispatching delayed messages immediately.
  • API Addition: 3d particle system and accompanying editor, contributed by lordjone, see pull request 2005
  • API Addition: extended shape classes like Circle, Ellipse etc. with hashcode/equals and other helper methods, see pull request #2018
  • minor API change: fixed a bug in handling of atlasPrefixes, see pull request 2023
  • Bullet: btManifoldPoint member getters/setters changed from btVector3 to Vector3, also it is no longer pooled, instead static instances are used for callback methods
  • Added Intersector#intersectRayRay to detect if two 2D rays intersect, see pull request 2132
  • Bullet: ClosestRayResultCallback, AllHitsRayResultCallback, LocalConvexResult, ClosestConvexResultCallback and subclasses now use getter/setters taking a Vector3 instead of btVector3, see pull request #2175
  • 2d particle system supports pre-multiplied alpha.
  • Bullet: btIDebugDrawer/DebugDrawer now use pooled Vector3 instances instead of btVector3, see pull request #2174

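To illustrate the first item: isKeyJustPressed is true only for the single frame in which the key went down, unlike isKeyPressed, which stays true for as long as the key is held.  A quick sketch ( fireSingleShot is a made-up game method ):

// Fires once per key press instead of once per frame while held
if (Gdx.input.isKeyJustPressed(Input.Keys.SPACE)) {
    fireSingleShot();  // hypothetical; whatever your game does on a press
}
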
 

 

You can download the LibGDX setup app here.  Of course, GameFromScratch.com has a complete set of LibGDX tutorials to get you started.

Programming News


6. August 2014

 

jrenner/obfuscate ( on reddit ) released his in-development 3D engine, GDX-Proto, earlier today.  In his own words:

 

GDX-Proto is a lightweight 3d engine built with two main objectives:

  • Provide an open source codebase showing how to do many basic and essential things for 3d games with libgdx, a cross-platform Java game framework.
  • Provide a simple, extensible 3d engine that takes care of lower-level things such as physics and networking.

 

While the current version is implemented as a First Person Shooter (FPS) demo, the code is highly adaptable for other uses, without too much work.

Overview of Features

Graphics
  • Basic 3d rendering using a slightly modified version of the default libgdx 3d shader. It takes advantage of the new libgdx 3D API.
  • 3D Particle system based on the new libgdx 3d particle system (version 1.2.1+, not included in 1.2.0)
Physics
  • The Bullet physics library is used for collision detection, but not for collision resolution. This allows for fast and efficient collision detection without the performance penalties of a fully simulated Bullet world. A default collision resolution system is included in the Physics class, but it can be modified to suit your needs.
  • Raycasting for projectile hit detection
Networking
  • Supports local or online play
  • KryoNet based
  • Mix of TCP and UDP where appropriate
  • Entity interpolation
  • Client prediction (for movement only, not yet implemented for projectiles)
  • Simple chat system
  • Supports libgdx headless backend for creating a headless server, such as on a VPS
  • Server transmits level geometry to client upon connection
  • "The server is the man": Most logic is run server-side to prevent cheats or hacking.
Other
  • Basic Entity system with DynamicEntities, represented by either Decals (Billboard sprites) or 3D models
  • Movement component class handles acceleration, velocity, position, rotation, max speeds
  • Subclasses of Movement: GroundMovement and FlyingMovement
  • Optional logging to file, see Log class

 

Right now it’s pretty early on.  It’s not actually a library yet, but instead a single project with a sample FPS; in the future I believe it will be refactored into a more traditional library.  Even so, it already provides a solid foundation for building a 3D game on top of LibGDX, which on its own only provides a relatively low level 3D layer that this project builds upon.

 

Getting started is extremely simple.  First clone the git:

git clone https://github.com/jrenner/gdx-proto.git

Then run the project:

gradlew desktop:run

 

image

 

You can navigate around using standard WASD keys.  For the Gradle-averse who want to try the demo out, you can download a playable jar here.  The entire project is available on Github here.

Programming


16. July 2014

 

Or…

How to take a Blender model you downloaded from the web and make it actually usable in your game in 28 easy steps!

 

… granted, the second title doesn’t have the same flow to it, does it?

 

I just had to run through this process and figured I would share it, as it is something that comes up fairly often.  When working with Blender, there are dozens of procedural textures available that can produce very nice results quickly.  The only problem is, when you get your asset out of Blender and into your game engine, things suddenly go horribly wrong; those textures only make sense inside of Blender.  Fortunately, through the magic of baking, you can easily convert them into a texture map usable in any game engine.

 

Let’s take a look how.

 

First we need a model.  I am using a beautiful new model that was recently added to Blend-Swap.  It’s a free download but you need to register.  Don’t worry, you can use a real email address, they don’t spam, or at least haven't so far.  The model in question looks like this:

 

image

 

Unfortunately when we load it in Blender we quickly learn this model is in no way game ready.  Let’s take a look:

image

 

Ick.  So instead of a single mesh, we have a dozen individual meshes.  The problem is, we need to unwrap them as a single object, so let’s join them all together.  First, let’s get the camera out of the default layer.

 

If you look at the way this particular Blend is set up, there are currently two layers; the second contains the armature, the first contains everything else.

image

 

Let’s get the camera out of there.  Select the camera object, then hit the M key.  Then select the layer you want to move the camera to, like so:

image

 

Now click the first layer ( bottom left box ); it should only contain geometry.

 

We want to join everything together.  Press ‘A’ to select everything in the layer, then hit “Ctrl + J” to join everything into a single set of geometry.  Now it should look something like this:

image

 

Perfect, now we can unwrap our model.  Switch into Edit mode

image

 

Press ‘A’ again, until all faces are selected, like so:

image

 

Now we unwrap our model.  Select Mesh->UV Unwrap-> Unwrap ( or Smart UV Project ).

 

Switch your view to UV/Image Editor

image

 

It should look something like this:

image

 

Now create a New Image:

image

 

This image is where we are going to render our texture to.  Here are the settings I used.  Remember, game engines like power of 2 textures.

image

 

Ok, now let’s look at the actual render to texture part.  Take a quick look at how the model is currently shaded:

image

 

Frankly none of those are really game engine friendly.  So let’s render all of those materials out to a single texture.  Go to the render tab

image

 

Scroll down and locate Bake.

In the UV Editor window, make sure everything is selected ( using ‘A’.  They should be highlighted in yellow ).  At this point, with your generated image and all the UV’s selected, it should look like:

image

 

 

Now under bake, set the following settings:

image

The key values being Bake Mode = Full Render and Selected to Active checked.  Now click the Bake button.

 

Up in your top part of Blender, you should see a progress bar like so:

image

 

 

Now if you go back to the UV/Image viewer, and select your image RenderedTexture, you should see:

image

 

Cool!

 

Let’s save the result to an external ( game engine friendly ) texture.  Select Image->Save as Image.  Save the image somewhere.  Remember where.

image

 

 

Now let’s modify the textures on our model to use only our newly generated texture map.  First, in the 3D View, switch back to Object Mode from Edit Mode.

Then, open the materials tab:

image

 

Select each material and hit the – ( or killswitch engage! ) button.  So it should ultimately look like this:

image

 

Now hit the + button and create a new Material.  Then click the New button.

image

 

The default values for the material should be OK, but depending on your game engine, you may have to enable Face Textures:

image

 

Now click over to the Texture tab.  Click New.

image

 

Drop down the Type box and select Image or Movie.

image

 

Scroll down to the Image section and select Open.  Pick the image you saved earlier.

image

 

Now scroll down to Mapping, drop down Coordinates and select UV.

image

 

Under Map select UVMap.

image

 

Now if you go to the 3D View and set the view mode to Texture:

image

 

TADA!  A game ready model.

 

One word of caution though, if you render this scene in Blender you will get the following result:

image

 

Don’t worry.  That’s just a byproduct of going from Blender materials to texture mapping.  If you want the texture to be seen, you need to add some lights to the scene, or change the material so it has an Emit value > 0, so it provides its own light source.

 

With Emit set to 0.92, here is the result if you render it:

 

image

 

Now, what about in game?
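
One step I am glossing over: LibGDX doesn’t load .blend files directly.  The usual pipeline is to export the model from Blender as FBX, then convert it to LibGDX’s binary g3db format using the fbx-conv tool ( the exact invocation below is from memory, so check the fbx-conv documentation ):

fbx-conv robot.fbx robot.g3db

That conversion is what produces the robot.g3db file the following code loads.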

 

Let’s create a simple LibGDX project that loads and displays our exported model:

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationListener;
import com.badlogic.gdx.Files.FileType;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.PerspectiveCamera;
import com.badlogic.gdx.graphics.g3d.Environment;
import com.badlogic.gdx.graphics.g3d.Model;
import com.badlogic.gdx.graphics.g3d.ModelBatch;
import com.badlogic.gdx.graphics.g3d.ModelInstance;
import com.badlogic.gdx.graphics.g3d.attributes.ColorAttribute;
import com.badlogic.gdx.graphics.g3d.loader.G3dModelLoader;
import com.badlogic.gdx.utils.UBJsonReader;


public class ModelTest implements ApplicationListener {
    private PerspectiveCamera camera;
    private ModelBatch modelBatch;
    private Model model;
    private ModelInstance modelInstance;
    private Environment environment;

    @Override
    public void create() {
        camera = new PerspectiveCamera(
                75,
                Gdx.graphics.getWidth(),
                Gdx.graphics.getHeight());

        camera.position.set(3f,0f,6f);
        camera.lookAt(0f,1f,0f);

        // Near and Far (plane) represent the minimum and maximum ranges of the camera in, um, units
        camera.near = 0.1f;
        camera.far = 300.0f;

        modelBatch = new ModelBatch();

        UBJsonReader jsonReader = new UBJsonReader();
        G3dModelLoader modelLoader = new G3dModelLoader(jsonReader);
        model = modelLoader.loadModel(Gdx.files.getFileHandle("robot.g3db", FileType.Internal));
        modelInstance = new ModelInstance(model);

        environment = new Environment();
        environment.set(new ColorAttribute(ColorAttribute.AmbientLight, 0.8f, 0.8f, 0.8f, 1.0f));
    }

    @Override
    public void dispose() {
        modelBatch.dispose();
        model.dispose();
    }

    @Override
    public void render() {
        Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);

        camera.update();

        modelBatch.begin(camera);
        modelBatch.render(modelInstance, environment);
        modelBatch.end();
    }

    @Override
    public void resize(int width, int height) {
    }

    @Override
    public void pause() {
    }

    @Override
    public void resume() {
    }
}

 

And we run it and:

image

 

Wow, a model downloaded randomly from the Internet actually working in the game engine!  How often does that actually happen? ;)

Programming Art


8. July 2014

 

In this part of the LibGDX tutorial series we are going to take a look at using GLSL shaders.  GLSL stands for OpenGL Shading Language, and since the move from a fixed-function to a programmable graphics pipeline, shader programming has become incredibly important.  In fact, every single thing rendered with OpenGL has at least a pair of shaders attached to it.  It’s been pretty transparent to you up to this point because LibGDX mostly takes care of everything for you.  When you create a SpriteBatch object in LibGDX, it automatically creates a default vertex and fragment shader for you.  If you want more information on working with GLSL, I put together the OpenGL Shader Programming Resource Round-up back in May.  It has all the information you should need to get up to speed with GLSL.  For more information on OpenGL in general, I also created this guide.

 

Render Pipeline Overview

 

To better understand the role of GL shaders, it’s good to have a basic understanding of how the modern graphics pipeline works.  This is the high-level description I gave in my PlayStation Mobile book; it’s not plagiarism because I’m the author. :)

 

A top-level view of how rendering occurs might help you understand the shader process. It all starts with the shader program, vertex buffers, texture coordinates, and so on being passed in to the graphics device. Then this information is sent off to a vertex shader, which can then transform that vertex, do lighting calculations and more (we will see this process shortly). The vertex shader is executed once for every vertex and a number of different values can be output from this process (these are the out attributes we saw in the shader earlier). Next the results are transformed, culled, and clipped to the screen, discarding anything that is not visible, then rasterized, which is the process of converting from vector graphics to pixel graphics, something that can be drawn to the screen.

The results of this process are fragments, which you can think of as "prospective pixels," and the fragments are passed in to the fragment shader. This is why they are called fragment shaders instead of pixel shaders, although people commonly refer to them using either expression. Once again, the fragment shader is executed once for each fragment. A fragment shader, unlike a vertex shader, can only return a single attribute, which is the RGBA color of the individual pixel. In the end, this is the value that will be displayed on the screen. It sounds like a horribly complex process, but GPUs have dedicated hardware for performing exactly such operations, millions upon millions of times per second. That description also glossed over about a million tiny details, but that is the gist of how the process occurs.

 

So basically shaders are little programs that run over and over again on the data in your scene.  A vertex shader works on the vertices in your scene ( predictably enough… ) and is responsible for positioning each vertex in the world.  Generally this is a matter of transforming them using some kind of matrix passed in from your program.  The output of the vertex shader is ultimately passed to a fragment shader.  Fragments are basically, as I said above, prospective pixels: the actual coloured dots that are going to be drawn on the user’s screen.  In the fragment shader you determine how each pixel will appear.  So basically a vertex shader is a little C-like program that is run for each vertex in your scene, while a fragment shader is run for each potential pixel.

 

There is one very important point to pause on here…  Fragment and Vertex shaders aren’t the only shaders in the modern graphics pipeline.  There are also Geometry shaders.  While vertex shaders can modify geometry ( vertices ), Geometry shaders actually create new geometry.  Geometry shaders were added in OpenGL 3.2 and D3D10.  Then in OpenGL4/D3D11 Tessellation shaders were added.  Tessellation is the process of sub-dividing a surface to add more detail, moving this process to silicon makes it viable to create much lower detailed meshes and tessellate them on the fly.  So, why are we only talking about Fragment and Vertex shaders?  Portability.  Right now OpenGL ES and WebGL do not support any other shaders.  So if you want to support mobile or WebGL, you can’t use these other shader types.

 

SpriteBatch and default Shaders

 

As I said earlier, when you use SpriteBatch, it provides a default Vertex and Fragment shader for you.  Let’s take a look at each of them now.  Let’s do it in the order they occur, so let’s take a look at the vertex shader first:

 

attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;

uniform mat4 u_projTrans;

varying vec4 v_color;
varying vec2 v_texCoords;

void main()
{
    v_color = a_color;
    v_color.a = v_color.a * (256.0/255.0);
    v_texCoords = a_texCoord0;
    gl_Position =  u_projTrans * a_position;
}

 

As I said, GLSL is a very C-like language, right down to including a main() function as the program entry point.  There are a few things to be aware of here.  First are attribute and uniform variables.  These are variables that are passed in from your source code.  LibGDX takes care of most of these for you, but if you are going to write your own default shader, LibGDX expects all of them to exist.  So then, what is the difference between a uniform and an attribute variable?  A uniform stays the same for every single vertex, while attributes can vary from vertex to vertex.  Obviously this can have performance implications, so if it makes sense, prefer a uniform.  A varying value, on the other hand, can be thought of as the return value: these values are passed on down the rendering pipeline ( meaning the fragment shader has access to them ).  As you can see from the use of gl_Position, GLSL also has some built-in variables.  For vertex shaders there are gl_Position and gl_PointSize; think of these as output variables provided by OpenGL itself.  gl_Position is ultimately the position of your vertex in the world.
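
On the Java side of the fence, uniforms are values you upload through the ShaderProgram class, which we will meet properly in a moment.  Just to plant the idea, setting a custom float uniform looks something like this ( u_brightness is a made-up name for illustration; setUniformf is the real LibGDX call ):

// Assumes a hypothetical "uniform float u_brightness;" declared in the shader
shaderProgram.begin();
shaderProgram.setUniformf("u_brightness", 0.5f);
shaderProgram.end();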

 

As to what this shader does, it mostly just prepares a number of values for the fragment shader: the color ( with its alpha value rescaled ) and the texture coordinates, the 0 suffix corresponding to texture unit 0.  That texture is bound by calling Texture.bind() in your code, or LibGDX calls it for you.  Finally it positions the vertex in 3D space by multiplying the vertex’s position by the transformation matrix you passed in as u_projTrans.

 

Now let’s take a quick look at the default fragment shader:

#ifdef GL_ES
#define LOWP lowp
    precision mediump float;
#else
    #define LOWP
#endif

varying LOWP vec4 v_color;
varying vec2 v_texCoords;

uniform sampler2D u_texture;

void main()
{
    gl_FragColor = v_color * texture2D(u_texture, v_texCoords);
}

 

As you can see, the format is very similar.  The ugly #ifdef allows this code to work on both mobile and higher end desktop machines.  Essentially, if you are running OpenGL ES, then LOWP is defined as lowp and the default float precision is set to mediump.  In real-world terms, this means GL ES will run at a lower level of precision for internal calculations, both speeding things up and slightly degrading the results.

The values v_color and v_texCoords were provided by the vertex shader.  A sampler2D, on the other hand, is a special GLSL datatype for accessing the texture bound to the shader.  gl_FragColor is another special built-in variable ( like vertex shaders, fragment shaders have some GL-provided variables; many more than vertex shaders, in fact ); this one represents the output color of the pixel the fragment shader is evaluating.  texture2D essentially returns a vec4 value representing the pixel at UV coordinate v_texCoords in texture u_texture.  The vec4 represents the RGBA values of the pixel, so for example (1.0, 0.0, 0.0, 0.5) is a 50% transparent red pixel.  The value assigned to gl_FragColor is ultimately the color of the pixel displayed on your screen.

 

Of course a full discussion on GLSL shaders is wayyy beyond the scope of this document.  Again if you need more information I suggest you start here.  I am also no expert on GLSL, so you are much better off learning the details from someone else! :)  This does however give you a peek behind the curtain at what LibGDX is doing each frame and is going to be important to us in just a moment.

 

Changing the Default Shader

 

There comes a time when you might want to replace the default shader with one of your own.  This process is actually quite simple; let’s take a look.  Say, for some reason, you wanted to render your game entirely in black and white.  Here is a simple vertex and fragment shader combo that will do exactly that:

 

Vertex shader:

attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;

uniform mat4 u_projTrans;

varying vec4 v_color;
varying vec2 v_texCoords;

void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}

Fragment shader:

#ifdef GL_ES
    precision mediump float;
#endif

varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform mat4 u_projTrans;

void main() {
        vec3 color = texture2D(u_texture, v_texCoords).rgb;
        float gray = (color.r + color.g + color.b) / 3.0;
        vec3 grayscale = vec3(gray);

        gl_FragColor = vec4(grayscale, 1.0);
}

I saved the files as vertex.glsl and fragment.glsl respectively, in the project’s assets directory.  The shaders are extremely straightforward.  The vertex shader is in fact just the default vertex shader from LibGDX.  Once again, remember you need to provide certain values for SpriteBatch to work… don’t worry, things will blow up and tell you if they are missing from your shader! :)  The fragment shader simply samples the RGB value of the current texture pixel, takes the average of the RGB values, and uses that as the output value.

 

Enough with shader code, let’s take a look at the LibGDX code now:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class ShaderTestApp extends ApplicationAdapter {
    SpriteBatch batch;
    Texture img;
    Sprite sprite;
    String vertexShader;
    String fragmentShader;
    ShaderProgram shaderProgram;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
        sprite = new Sprite(img);
        sprite.setSize(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());

        vertexShader = Gdx.files.internal("vertex.glsl").readString();
        fragmentShader = Gdx.files.internal("fragment.glsl").readString();
        shaderProgram = new ShaderProgram(vertexShader,fragmentShader);
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.setShader(shaderProgram);
        batch.draw(sprite,sprite.getX(),sprite.getY(),sprite.getWidth(),sprite.getHeight());
        batch.end();
    }
}

 

And when you run it:

image

 

Tada, your output is grayscale!

As to what we are doing in that code: we load each shader file as a string, then create a new ShaderProgram, passing in a vertex and fragment shader.  The ShaderProgram is the class that populates all the various variables your shaders expect, bridging the divide between the Java world and the GLSL world.  Then in render() we set our ShaderProgram as active by calling setShader().  Truth is, we could have done this just once in the create() method instead of once per frame.
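
One practical tip: ShaderProgram compiles your GLSL at runtime, and a typo tends to just get you a black screen.  It is worth checking the compile status right after creating the program; isCompiled() and getLog() are the relevant LibGDX methods:

// Verify the GLSL actually compiled; getLog() holds the compiler errors if not
shaderProgram = new ShaderProgram(vertexShader, fragmentShader);
if (!shaderProgram.isCompiled()) {
    Gdx.app.error("Shader", shaderProgram.getLog());
}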

 

Multiple Shaders per Frame

 

In the above example, when we set the shader program, it applied to all of the output.  That’s nice if you want to render the entire world in black and white, but what if you just wanted to render a single sprite using your shader?  Well fortunately that is pretty easy, you simply change the shader again.  Consider:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class ShaderTest2 extends ApplicationAdapter {
    SpriteBatch batch;
    Texture img;
    Sprite leftSprite;
    Sprite rightSprite;
    String vertexShader;
    String fragmentShader;
    ShaderProgram shaderProgram;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
        leftSprite = new Sprite(img);
        rightSprite = new Sprite(img);

        leftSprite.setSize(Gdx.graphics.getWidth()/2, Gdx.graphics.getHeight());
        leftSprite.setPosition(0,0);
        rightSprite.setSize(Gdx.graphics.getWidth()/2, Gdx.graphics.getHeight());
        rightSprite.setPosition(Gdx.graphics.getWidth()/2,0);

        vertexShader = Gdx.files.internal("vertex.glsl").readString();
        fragmentShader = Gdx.files.internal("fragment.glsl").readString();
        shaderProgram = new ShaderProgram(vertexShader,fragmentShader);
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(1, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

        batch.setShader(null);
        batch.begin();
        batch.draw(leftSprite, leftSprite.getX(), leftSprite.getY(), leftSprite.getWidth(), leftSprite.getHeight());
        batch.end();

        batch.setShader(shaderProgram);
        batch.begin();
        batch.draw(rightSprite, rightSprite.getX(), rightSprite.getY(), rightSprite.getWidth(), rightSprite.getHeight());
        batch.end();
    }
}

 

And when you run it:

image

 

One sprite is rendered using the default shader, the other using the black and white shader.  As you can see, it’s simply a matter of calling setShader() multiple times; passing in null restores the default built-in shader.  However, each time you call setShader() there is a fair amount of setup done behind the scenes, so you want to minimize the number of times you call it.  Or…

 

Setting Shader on a Mesh Object

 

Each Mesh object in LibGDX has its own ShaderProgram.  Behind the scenes, SpriteBatch is actually creating a single large Mesh out of all the sprites on your screen, which are ultimately just textured quads.  So if you have a game object that needs fine-grained shader control, you may consider rolling your own Mesh object.  Let’s take a look at such an example:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.*;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class MeshShaderApp extends ApplicationAdapter {
    SpriteBatch batch;
    Texture texture;
    Sprite sprite;
    Mesh mesh;
    ShaderProgram shaderProgram;

    @Override
    public void create () {
        batch = new SpriteBatch();
        texture = new Texture("badlogic.jpg");
        sprite = new Sprite(texture);
        sprite.setSize(Gdx.graphics.getWidth(),Gdx.graphics.getHeight());

        float[] verts = new float[30];
        int i = 0;
        float x,y; // Mesh location in the world
        float width,height; // Mesh width and height

        x = y = 50f;
        width = height = 300f;

        //Top Left Vertex Triangle 1
        verts[i++] = x;   //X
        verts[i++] = y + height; //Y
        verts[i++] = 0;    //Z
        verts[i++] = 0f;   //U
        verts[i++] = 0f;   //V

        //Top Right Vertex Triangle 1
        verts[i++] = x + width;
        verts[i++] = y + height;
        verts[i++] = 0;
        verts[i++] = 1f;
        verts[i++] = 0f;

        //Bottom Left Vertex Triangle 1
        verts[i++] = x;
        verts[i++] = y;
        verts[i++] = 0;
        verts[i++] = 0f;
        verts[i++] = 1f;

        //Top Right Vertex Triangle 2
        verts[i++] = x + width;
        verts[i++] = y + height;
        verts[i++] = 0;
        verts[i++] = 1f;
        verts[i++] = 0f;

        //Bottom Right Vertex Triangle 2
        verts[i++] = x + width;
        verts[i++] = y;
        verts[i++] = 0;
        verts[i++] = 1f;
        verts[i++] = 1f;

        //Bottom Left Vertex Triangle 2
        verts[i++] = x;
        verts[i++] = y;
        verts[i++] = 0;
        verts[i++] = 0f;
        verts[i] = 1f;

        // Create a mesh out of two triangles rendered clockwise without indices
        mesh = new Mesh( true, 6, 0,
                new VertexAttribute( VertexAttributes.Usage.Position, 3, ShaderProgram.POSITION_ATTRIBUTE ),
                new VertexAttribute( VertexAttributes.Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE+"0" ) );

        mesh.setVertices(verts);

        shaderProgram = new ShaderProgram(
                Gdx.files.internal("vertex.glsl").readString(),
                Gdx.files.internal("fragment.glsl").readString()
                );
    }

    @Override
    public void render () {

        Gdx.gl20.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
        Gdx.gl20.glClearColor(0.2f, 0.2f, 0.2f, 1);
        Gdx.gl20.glClear(GL20.GL_COLOR_BUFFER_BIT);
        Gdx.gl20.glEnable(GL20.GL_TEXTURE_2D);
        Gdx.gl20.glEnable(GL20.GL_BLEND);
        Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);

        batch.begin();
        sprite.draw(batch);
        batch.end();

        texture.bind();
        shaderProgram.begin();
        shaderProgram.setUniformMatrix("u_projTrans", batch.getProjectionMatrix());
        shaderProgram.setUniformi("u_texture", 0);
        mesh.render(shaderProgram, GL20.GL_TRIANGLES);
        shaderProgram.end();
    }
}

And when you run it:

 

image

This sample is long but fairly simple.  In create() we build the geometry for a quad by defining two triangles.  We then load our ShaderProgram just like we did in the earlier example.  You may notice that in creating the Mesh we define two VertexAttribute values and bind them to values within our ShaderProgram; these are the input values into the shader.  Unlike with SpriteBatch and the default shader, you need to do a bit more of the behind-the-scenes work when rolling your own Mesh.

 

Then in render() you see we work with the SpriteBatch normally, but then draw our Mesh object using Mesh.render, passing in the ShaderProgram.  Texture.bind() is what binds the texture from LibGDX to texture unit 0 in the GLSL shader.  We then pass in our required uniform values using setUniformMatrix and setUniformi ( as in int ).  This is how you set uniform values from the Java side of the fence: u_texture says which texture unit to use, while u_projTrans is the transformation matrix for positioning items within our world.  In this case we are simply using the projection matrix from the SpriteBatch.

 

Using a Mesh instead of a Sprite has some disadvantages, however.  When working with Sprites, all geometry is batched into a single object, which is good for performance.  More importantly, with a Mesh you need to re-implement whatever functionality you need from Sprite as you need it.  For example, if you want to support scaling or rotation, you have to provide that functionality yourself.

Programming

