27. November 2014

Now we are going to look at using the P2 physics engine.  Truth of the matter is, the experience is very similar to what we just saw with Arcade physics.  The biggest differences are that P2 is more accurate and offers far more simulation options, but critically, it also takes up more CPU power.

Let’s jump in straight away with a simple example.  We are going to load two identical sprites, anchor one and create a spring between them.  Code time!

```
/// <reference path="phaser.d.ts"/>
class SimpleGame {
    game: Phaser.Game;
    player1: Phaser.Sprite;
    player2: Phaser.Sprite;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            preload: this.preload, create: this.create
        });
    }

    preload() {
        // The asset name and path are placeholders -- load any image you like here
        this.game.load.image("decepticon", "decepticon.png");
    }

    create() {
        this.player1 = this.game.add.sprite(this.game.width / 2, 0 + 50, "decepticon");
        this.player2 = this.game.add.sprite(this.game.width / 2, this.game.height, "decepticon");

        this.game.physics.startSystem(Phaser.Physics.P2JS);

        // Enable physics on our sprites
        this.game.physics.p2.enable([this.player1, this.player2]);

        // Make one body motionless
        this.player1.body.static = true;

        // Now create a spring between our two bodies; parameters are rest length, stiffness and damping.
        // Rest length is the length of the spring at rest ( where it's not under pressure ).
        // Stiffness is the spring's resistance to movement.
        // Damping determines how fast the spring loses its "boing".  Our low damping keeps our spring "boinging".
        // Boing is a word I made up to describe the up and down motion of a spring doing its spring thing.
        this.game.physics.p2.createSpring(this.player1, this.player2, 200, 2, 0.3);

        // Let's loop a timed event every 10 seconds that moves the one end of our spring back to the start,
        // mostly just so people that didn't see it run the first time in the browser have something to see!
        this.game.time.events.loop(Phaser.Timer.SECOND * 10, () => {
            this.player2.body.x = this.game.width / 2;
            this.player2.body.y = this.game.height;
        }, this);
    }
}

window.onload = () => {
    var game = new SimpleGame();
};
```

The code is fairly well commented, so I'll just go through the gist of what's happening here.  Just like before, you have to start the physics subsystem, this time passing in Phaser.Physics.P2JS.  You also have to register your objects with the physics system; in this case we do it by passing an array of Sprites to p2.enable().  We set one of our physics bodies to static, which causes it not to be moved or updated by the physics system.  We then create a spring between the two sprites.  This causes the second sprite to “spring” toward the first until the rest length is met ( plus any momentum ), then the spring will, well, spring.  Finally, just to make the demo run a bit better in the browser, we schedule a looping event every 10 seconds that resets the second sprite's physics body back to its starting position so the whole thing starts over again.
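If the three spring parameters feel abstract, here is a minimal sketch ( plain TypeScript, no Phaser, mass assumed to be 1 ) of the math a spring simulation like this runs every frame:

```typescript
// One-dimensional damped spring, stepped the way a physics engine would step it.
// restLength, stiffness and damping mirror the createSpring() parameters.
function simulateSpring(restLength: number, stiffness: number,
                        damping: number, steps: number): number[] {
    let pos = restLength * 2;   // start stretched to twice the rest length
    let vel = 0;
    const dt = 1 / 60;          // one 60fps frame per step
    const history: number[] = [];
    for (let i = 0; i < steps; i++) {
        const stretch = pos - restLength;
        // Hooke's law plus a damping force opposing the current velocity
        const force = -stiffness * stretch - damping * vel;
        vel += force * dt;      // semi-implicit Euler integration
        pos += vel * dt;
        history.push(pos);
    }
    return history;
}

// With stiffness 2 and low damping 0.3 ( the demo's values ), the body
// overshoots the rest length and oscillates -- the "boing" -- before settling.
const h = simulateSpring(200, 2, 0.3, 600);
```

Crank the damping up toward the stiffness and the oscillation dies almost immediately, which is exactly the behavior you can play with in the createSpring() call.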

Here is our code running in the browser:

One of the big advantages of P2 is that it can use polygon data for more precise collision calculations than bounding boxes.  Consider the following sprite:

When doing bounding box based collisions tests, this would be considered a “hit”:

Ideally instead we should create a tighter bounding volume for our collision tests, like so:
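To see why this matters in code, here is a minimal sketch ( plain TypeScript, independent of Phaser ) comparing a bounding box test against a convex polygon test for a triangular sprite that only fills half of its box:

```typescript
type Point = { x: number; y: number };

// Axis-aligned bounding box test: cheap, but counts the sprite's empty corner as solid
function pointInAABB(p: Point, x: number, y: number, w: number, h: number): boolean {
    return p.x >= x && p.x <= x + w && p.y >= y && p.y <= y + h;
}

// Convex polygon test: the point must lie on the same side of every edge
function pointInConvexPolygon(p: Point, verts: Point[]): boolean {
    let sign = 0;
    for (let i = 0; i < verts.length; i++) {
        const a = verts[i];
        const b = verts[(i + 1) % verts.length];
        const cross = (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
        if (cross !== 0) {
            if (sign === 0) sign = Math.sign(cross);
            else if (Math.sign(cross) !== sign) return false;
        }
    }
    return true;
}

// A right triangle filling half of a 100x100 box
const tri: Point[] = [{ x: 0, y: 0 }, { x: 100, y: 0 }, { x: 0, y: 100 }];
// A point in the empty corner: the box says "hit", the polygon correctly says "miss"
const corner: Point = { x: 90, y: 90 };
```

P2 does essentially this, only with full polygon-vs-polygon tests rather than a single point, which is why polygon collisions fire so much later than box collisions.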

Well, the good news is we can, but we need a third party editor to create the bounding volumes.  One such tool and the one I am going to use in this example is Physics Editor by CodeAndWeb.  It’s commercial software, but there is a free version with the following restrictions:

It can be purchased currently for about $20.

CURRENTLY, I COULD NOT GET THE CURRENT WINDOWS VERSION TO EXPORT!

There is a DLL missing; I've reported it to the developer.  For now, use the older version available here instead.

For this, I am just using the unregistered version.  Load it up, then click Add Sprites:

Navigate to the sprite you want to work with and click OK.  It should appear in the window like so:

I’m full of serious lazy, so I’m going to let the app do the work, simply click the Shape Tracer icon:

Now in the resulting dialog, play with the tolerance and alpha value until you have a bounding volume you like with a vertex count you can deal with:

Once you are happy with it, set the Exporter on the right to Lime + Corona ( JSON ).

Save the resulting file to your project directory.
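I haven't documented the exporter's format authoritatively, but the exported file is JSON along these lines: a map from shape name to an array of convex fixtures, each fixture a flat list of x,y vertex coordinates.  The names and numbers below are made up for illustration:

```typescript
// Illustrative stand-in for the exported physics JSON; the real file comes
// from PhysicsEditor's Lime + Corona ( JSON ) exporter.
const physicsJson = `{
    "megatron": [
        { "shape": [ 0, 0, 64, 0, 64, 48 ] },
        { "shape": [ 0, 0, 64, 48, 0, 48 ] }
    ]
}`;

// Concave outlines get decomposed into several convex pieces, because the
// physics engine can only collide convex shapes efficiently.
const data = JSON.parse(physicsJson);
const fixtures: { shape: number[] }[] = data["megatron"];
const pieceCount = fixtures.length;
const firstPieceVertices = fixtures[0].shape.length / 2;
```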

Now time to actually use the results in Phaser:

```
/// <reference path="phaser.d.ts"/>
class SimpleGame {
    game: Phaser.Game;
    player1: Phaser.Sprite;
    player2: Phaser.Sprite;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            preload: this.preload, create: this.create
        });
    }

    preload() {
        // File names are placeholders -- use your own sprite and exported physics JSON
        this.game.load.image("megatron", "megatron.png");
        this.game.load.physics("physicsData", "megatron.json");
    }

    create() {
        this.player1 = this.game.add.sprite(0, 0, "megatron");
        this.player2 = this.game.add.sprite(0, 0, "megatron");

        // Being lazy, positioning sprites after creation so we have a valid width/height
        this.player1.position.set(this.player1.width / 2, this.player1.height / 2);

        // Now another sprite on the right side of the screen, down slightly
        this.player2.position.set(this.game.width - (this.player2.width / 2), this.player2.height / 2 + 85);

        this.game.physics.startSystem(Phaser.Physics.P2JS);

        // Passing in true while enabling physics on an object causes the debug renderer to draw the physics body
        this.game.physics.p2.enable([this.player1, this.player2], true);

        // You need to call clearShapes() to get rid of the existing bounding box
        this.player1.body.clearShapes();
        this.player2.body.clearShapes();

        // Now load the polygon bounding data we created externally
        this.player1.body.loadPolygon("physicsData", "megatron");
        this.player2.body.loadPolygon("physicsData", "megatron");

        // Now let's get this party started
        this.player2.body.moveLeft(80);

        // Finally, when the collision occurs, move back to the beginning and start over
        this.player2.body.onBeginContact.add((body, shapeA, shapeB, equation) => {
            this.player2.body.x = this.game.width - (this.player2.width / 2);
            this.player2.body.y = this.player2.height / 2 + 85;
        }, this);
    }
}

window.onload = () => {
    var game = new SimpleGame();
};
```

WARNING!  When using P2 physics, if your physics body starts even slightly outside the bounds of the screen, it will start with forces applied against it!

When you run this code, it creates two identical Megatrons, on each side of the screen, slightly offset, then sends one towards the other… ok, let’s just watch it:

As you can see, using polygon volumes allows for much greater precision in your collision tests.  Using simple bounding boxes, the collision would have occurred a great deal earlier.

There is a lot more to P2 physics, but in all honesty, polygons are probably the single biggest reason to use it.  Again, it is important to remember that P2 is the slowest option, but the most precise.  You also have the option of mixing and matching physics engines within the same game.

24. November 2014

In this video tutorial, we look at handling Keyboard, Mouse and Touch input in LibGDX.  We look at both handling input via polling as well as an event driven approach.
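The two styles the video contrasts can be sketched outside of LibGDX entirely; here is a tiny TypeScript model of an input layer supporting both ( the names are mine, not LibGDX's ):

```typescript
// Polling: the game loop asks "is the key down right now?" every frame.
// Event-driven: the input system calls you back the moment state changes.
class InputState {
    private down = new Set<string>();
    private listeners: Array<(key: string, pressed: boolean) => void> = [];

    // event-driven side: register a callback, like an InputProcessor
    onKey(listener: (key: string, pressed: boolean) => void): void {
        this.listeners.push(listener);
    }

    // fed by the platform layer whenever a key goes down or up
    keyEvent(key: string, pressed: boolean): void {
        if (pressed) this.down.add(key); else this.down.delete(key);
        for (const l of this.listeners) l(key, pressed);
    }

    // polling side: query the current state, like Gdx.input.isKeyPressed()
    isPressed(key: string): boolean {
        return this.down.has(key);
    }
}
```

In LibGDX terms, isPressed() plays the role of Gdx.input.isKeyPressed() while onKey() plays the role of an InputProcessor callback.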

You can see the video in full 1080p definition here.  Once again, all the code included in the video is available below:

Polled Input Sample

```
package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;

public class PolledInputDemo extends ApplicationAdapter {
    SpriteBatch batch;
    Texture img;
    Sprite sprite;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
        sprite = new Sprite(img);
        sprite.setPosition(Gdx.graphics.getWidth()/2 - sprite.getWidth()/2,
                Gdx.graphics.getHeight()/2 - sprite.getHeight()/2);
    }

    @Override
    public void render () {

        // Poll the keyboard state each frame
        if(Gdx.input.isKeyPressed(Input.Keys.LEFT))
            sprite.translateX(-1f);
        if(Gdx.input.isKeyPressed(Input.Keys.RIGHT))
            sprite.translateX(1f);
        if(Gdx.input.isKeyPressed(Input.Keys.SPACE))
            sprite.setPosition(Gdx.graphics.getWidth()/2 - sprite.getWidth()/2,
                    Gdx.graphics.getHeight()/2 - sprite.getHeight()/2);

        // Poll the mouse; note input Y is flipped relative to screen coordinates
        if(Gdx.input.isButtonPressed(Input.Buttons.RIGHT))
            sprite.setPosition(Gdx.input.getX(), Gdx.graphics.getHeight() - Gdx.input.getY());

        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(sprite, sprite.getX(), sprite.getY());
        batch.end();
    }

    @Override
    public void dispose(){
        batch.dispose();
        img.dispose();
    }
}
```

Event Driven Input Sample

```
package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input;
import com.badlogic.gdx.InputProcessor;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;

public class EventDrivenInputDemo extends ApplicationAdapter implements InputProcessor {
    SpriteBatch batch;
    Texture img;
    Sprite sprite;
    boolean movingRight = false;

    @Override
    public void create () {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");
        sprite = new Sprite(img);
        sprite.setPosition(Gdx.graphics.getWidth()/2 - sprite.getWidth()/2,
                Gdx.graphics.getHeight()/2 - sprite.getHeight()/2);

        Gdx.input.setInputProcessor(this);
    }

    @Override
    public void render () {

        if(movingRight)
            sprite.translateX(1f);
        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(sprite, sprite.getX(), sprite.getY());
        batch.end();
    }

    @Override
    public void dispose(){
        batch.dispose();
        img.dispose();
    }

    @Override
    public boolean keyDown(int keycode) {
        if(keycode == Input.Keys.RIGHT)
            movingRight = true;
        return true;
    }

    @Override
    public boolean keyUp(int keycode) {
        if(keycode == Input.Keys.LEFT)
            sprite.translateX(-1f);
        if(keycode == Input.Keys.RIGHT)
            movingRight = false;
        return true;
    }

    @Override
    public boolean keyTyped(char character) {
        return false;
    }

    @Override
    public boolean touchDown(int screenX, int screenY, int pointer, int button) {
        return false;
    }

    @Override
    public boolean touchUp(int screenX, int screenY, int pointer, int button) {
        return false;
    }

    @Override
    public boolean touchDragged(int screenX, int screenY, int pointer) {
        // If the user is holding down ( or was holding down, and hasn't released ) three fingers, move the sprite
        if(pointer == 2)
            sprite.setPosition(screenX, Gdx.graphics.getHeight() - screenY);
        return true;
    }

    @Override
    public boolean mouseMoved(int screenX, int screenY) {
        sprite.setPosition(screenX, Gdx.graphics.getHeight() - screenY);
        return true;
    }

    @Override
    public boolean scrolled(int amount) {
        if(amount > 0)
            sprite.translateY(1f);
        if(amount < 0)
            sprite.translateY(-1f);

        return true;
    }
}
```
23. November 2014

Today I was contacted by a developer from the jMonkeyEngine team who was interested in spreading the word about the OpenGEX format among indie developers, and I agree completely with the goal.

Before I even get into OpenGEX, it's helpful to first look at the alternatives available for 3D import/export.  Moving 3D assets between software packages and into game engines has long been a challenge, and there have been a number of popular formats as a result.  Let's take a quick look at the more popular choices.

### COLLADA

COLLADA is probably the monster in the segment now, standing for COLLAborative Design Activity, ranking up there amongst the most contrived acronyms I've ever encountered.  COLLADA started life at Sony and was later submitted to Khronos, the body responsible for OpenGL, for standardization.  Several major players in the industry signed on to support COLLADA, pretty much everybody actually, but most importantly, all the big boys signed on, including Alias ( Maya ), Avid ( Softimage, at the time anyway… ) and Autodesk ( 3D Studio Max ).

COLLADA was designed primarily as an interchange format between tools, allowing you to, for example, use Max and Softimage in your art pipeline rather seamlessly.  For the most part it worked too; the industry came together and interchange was probably better than it's ever been.

Obviously there is a but, or I wouldn’t be writing this, I would just say “hey, we should all use COLLADA!” and be done with it.  So, time for the but… and what a big but it is. ( sorry… )  First off, COLLADA is a huge, some could say bloated format, that is ill suited for games.  Again, it was designed for communication between authoring tools, ease of use and performance certainly weren’t priorities.  Being controlled by the Khronos board certainly couldn’t help either, and it didn’t.  The format became convoluted over time.

The next major problem is a change in the industry… you see, Autodesk bought everything.  So suddenly most of the software this open standard was designed to work with is owned by a single company.  Now with each new version released, things break, often needlessly.  For a large company like Epic or Unity, supporting a complicated and constantly moving target isn't a big deal; they can frankly just throw money at it.  For smaller, indie or open source game engines, this isn't a solution.

### FBX

The FBX format started life in a product called Filmbox, made by a company known as Kaydara.  Filmbox started life as a motion capture suite and obviously needed to support various software packages, so they created the FBX file format.  In the early days, well before the rise of COLLADA, it was supported by pretty much all of the common 3D packages of the day ( 3D Studio Max, Lightwave, Softimage, PowerAnimator ( Maya's precursor ), Cinema 4D, etc ).  Well, Filmbox was eventually renamed MotionBuilder, a product that exists to this day, and Kaydara was acquired by Alias, the makers of Maya and PowerAnimator before it.  Remember how in the COLLADA write up I said Autodesk bought everything?  Well, that wasn't really an exaggeration… in 2006, Autodesk purchased Alias and with it gained control of MotionBuilder and the FBX file format.

So, fast forward to today and Autodesk controls Maya, 3D Studio Max, MotionBuilder, Softimage and more.  FBX is the format they use internally to communicate between their own tools.  So at the end of the day, if you are working entirely in the Autodesk ecosystem, it's the easiest and safest route to go.

For game developers though, it's a bit of a different story.  Once again, the large guys can easily support FBX, just like COLLADA.  Perhaps most importantly, Autodesk makes available an SDK for working with FBX, and some game engines make use of this SDK, such as LibGDX's fbx-conv tool.  There are limitations on this license however; one of the biggest is that it is incompatible with the GPL, meaning Blender and similar tools can't use the SDK ( although I would put this on the awfulness of the GPL rather than Autodesk's license, but that's splitting hairs ).  This means Blender uses a “clean room” implementation of FBX, and as a result the FBX support in Blender is, well, pretty crappy.

FBX, however, is not a game friendly format either; once again, it's designed for communication between content creation tools, not for fast and efficient storage.  So even tools like LibGDX that support it ultimately just use it as a source format and convert it to their own internal format.  This means each time you change your asset in your modeling tool, you have to convert it again and again.

### OBJ, 3DS, MD2, DXF, etc

This is a catch all category, but these formats are worth noting.  Open source game engines commonly support some of the older, simpler formats, one of the most common being OBJ.  These were the export formats of popular 3D applications from years ago ( OBJ = Alias, 3DS = 3D Studio, DXF = AutoCAD, MD2 = Quake engine ).  The reason they are supported is that the formats tend to be fairly simple, with a large body of code already available.
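To show just how simple these old formats are, here is a toy OBJ loader in TypeScript ( a sketch handling only 'v' and 'f' records; real OBJ files also carry normals, texture coordinates and 'v/vt/vn' face syntax ):

```typescript
// 'v x y z' lines define vertices; 'f a b c' lines reference them ( 1-based )
function parseObj(text: string): { vertices: number[][]; faces: number[][] } {
    const vertices: number[][] = [];
    const faces: number[][] = [];
    for (const line of text.split("\n")) {
        const parts = line.trim().split(/\s+/);
        if (parts[0] === "v") {
            vertices.push(parts.slice(1, 4).map(Number));
        } else if (parts[0] === "f") {
            // parseInt stops at '/', so 'v/vt/vn' references still yield the vertex index
            faces.push(parts.slice(1).map(s => parseInt(s, 10) - 1));
        }
    }
    return { vertices, faces };
}

// A single triangle is four lines of plain text
const mesh = parseObj("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n");
```

That simplicity is exactly why OBJ support is everywhere, and the lack of anywhere to put animation data is exactly why it stops being enough.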

On the other hand, they are also ancient and incredibly limited, especially when it comes to animation data.  If you have simple requirements, a simple format should do the trick and frankly you will often see OBJ format supported when animation data isn’t needed ( such as by Mixamo Fuse or Poser, for model data ), but once you start adding complexity, these formats start to show their age.

I am mostly mentioning them for completeness only.

So, a TL;DR summary of the negatives of each format:

COLLADA

• bloated and complicated format
• run by Khronos group ( this is a good and bad thing )
• fragile between versions, breaks easily
• not game engine friendly

FBX

• proprietary file format
• controlled by Autodesk
• not open source friendly license
• not game engine friendly

The Others

• ancient
• poor animation support
• limited functionality

### Enter OpenGEX

This brings us at last to OpenGEX ( Open Game Exchange ), an open 3D file format targeted specifically at game developers for use in game engines.  It was created by Eric Lengyel, originally for the C4 Game Engine, and was funded by a successful IndieGoGo campaign.  Essentially you can think of OpenGEX as a stripped down, game engine focused version of COLLADA.  Instead of being XML based, it uses OpenDDL, the Open Data Description Language.
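For a taste of the format, here is a heavily abridged, hand-written fragment in the general shape of an OpenGEX file ( written from memory of the spec, so treat it as illustrative rather than authoritative ):

```
Metric (key = "distance") {float {1}}

GeometryNode $node1
{
    Name {string {"Cube"}}
    ObjectRef {ref {$geometry1}}
}

GeometryObject $geometry1
{
    Mesh (primitive = "triangles")
    {
        VertexArray (attrib = "position")
        {
            float[3] { {-1, -1, -1}, {1, -1, -1}, {1, 1, -1} }
        }
    }
}
```

Everything is a typed structure with explicit data arrays, which is a big part of what makes it so much easier to parse than COLLADA's sprawling XML.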


The easiest way to understand the value of OpenGEX is to compare the format to COLLADA.  Fortunately the OpenGEX site provides just such an example.  Looking at the OpenGEX format, it’s quite clean, very reminiscent of OBJ, but with support for modern features.  The purpose behind OpenGEX’s creation is nicely captured in this comment:

The OpenGEX format was created because Collada, the open standard that we all hoped would provide a well-supported asset exchange format, has proven to be an unreliable mess. The most common source of problems has been the poor quality of the Collada export plugins available for software such as 3D Studio Max and Maya, and we attribute this to Collada’s over-engineered design and its mutating under-specified format.

Now of course, a format is of little use if no tools support it, and this is where OpenGEX shines.  There are already exporters for Max, Maya and Blender.  Additionally there is an Import template available for implementing OpenGEX in your game or engine.  It’s basically a Visual Studio 2013 project with the code used by the C4 Engine to load OpenGEX files.

If you are interested in learning more, you can read the PDF spec here.

### So…. what?

So what’s the value in all of this to you as an indie game developer?

Well, if you are working with Unity or Unreal Engine, very little actually.  They have the resources to support COLLADA, with all of its quirks, breaks and other warts.  If however you are working with an open source or smaller game engine, moving to a much saner format can make everyone's life easier.

As is the nature of any format, the more it's used, generally the better it becomes ( unless of course committee bloat sets in ).  Basically, the less time smaller developers have to spend on content pipeline tools, the more time they have to work on improving their engine.  Additionally, the fewer headaches a game developer suffers getting assets into their game, the more time they have to work on the game itself.

There has been some recent movement in regards to supporting OpenGEX.

First off, the developer who contacted me from the jMonkeyEngine team has recently made a Java library available on GitHub for handling OpenGEX files.  This could potentially enable other popular Java based game engines *cough*LibGDX*cough* to support OpenGEX.

It was also recently announced that Ogre3D is working to support the OpenGEX format.  Again, the developer's words perfectly capture why this is important:

Partly to address that I'm planning on adding support for the OpenGEX format.
Because of two reasons:

1. The format is actually really good, easy; and contains everything we need. It's basically our XML mesh and skeleton format, but in JSON.
2. Joint effort. There are already plugins. Open Source plugins. Which are being used for the C4 engine. If Ogre adopts the same format, we can share the burden of maintaining 3D asset exporters. Those are hard to maintain given Autodesk always releases a new version every year with practically no difference but breaking many plugins on the way. It should also empower more adoption of the OpenGEX format, and hopefully get others in. Unlike Collada, OpenGEX is good.

That second reason sums it up perfectly.  If a number of indie game engines all adopt the same format, the burden of maintenance is spread across a number of developers instead of each developing their own proprietary format and all the support that entails.  It also makes creating game tools that work across game engines a much easier task.  Finally, it helps break Autodesk’s chokehold on the industry!

So mostly it’s a matter of trying to spread the word and gain support.  One critical component would certainly be getting support into the Assimp ( Open Asset Import Library ), an open source model importing library that is the underpinning for many game engines importers today.  There is already an open feature request, so if you feel an open game friendly COLLADA option would be useful, that would certainly be a good place to start.

22. November 2014

The Urho3D C++ cross platform game engine just released version 1.32.  The following is the change log from this release:

• Finalized Urho2D functionality, including 2D physics using Box2D, sprite animation and tile maps
• Threaded background resource loading. Must be manually triggered via ResourceCache or by loading a scene asynchronously
• Attribute and material shader parameter animation system
• Customizable onscreen joystick for mobile platforms. Used in examples
• Touch camera control in examples on mobile platforms
• Touch emulation by mouse
• Multi-touch UI drag support
• Consistent touch ID’s across platforms
• Absolute, relative and wrap modes for the operating system mouse cursor
• Support for connecting & removing joysticks during runtime
• Negative light & light brightness multiplier support
• Transform spaces for Node’s translate, rotate & lookat functions
• Scrollable console
• Selectable console command interpreter (AngelScript, Lua, FileSystem)
• Touch scroll in ScrollView & ListView
• UI layout flex scale mode
• Custom sound streams from C++
• LogicComponent C++ base class with virtual update functions similar to ScriptObject
• Signed distance field font support
• JSON data support
• Matrix types in Variant & XML data
• Intermediate rendertarget refactoring: use viewport size to allow consistent UV addressing
• ParticleEmitter refactoring: use ParticleEffect resource for consistency with ParticleEmitter2D and more optimal net replication
• Expose LZ4 compression functions
• Support various cube map layouts contained in a single image file
• Configurable Bullet physics stepping behavior. Can use elapsed time limiting, or a variable timestep to use less CPU
• Default construct math objects to zero / identity
• Mandatory registration for remote events. Check allowed event only when receiving
• Teapot & torus builtin objects
• FXAA 3.11 shader
• Triangle rendering in DebugRenderer (more efficient than 3 lines)
• Material/texture quality and anisotropy as command line options and engine startup parameters
• Spline math class, which the SplinePath component uses
• Console auto-show on error
• DrawableProxy2D system for optimizing 2D sprite drawing
• Possibility to decouple BorderImage border UV’s from element size
• Editor & NinjaSnowWar resources split into subdirectories
• UI hover start & end events
• UI drag cancel by pressing ESC
• Allowed screen orientations can be controlled. Effective only on iOS
• Rendering sceneless renderpaths
• Define individual material passes as SM3-only
• Support for copying ListView text to system clipboard
• Async system command execution
• Generic attribute access for Lua script objects
• Use Lua functions directly as event subscribers
• Touch gesture recording and load/save
• AssetImporter option to allow multiple import of identical meshes
• Automatically create a physics world component to scene when necessary
• GetSubimage function in the Image class
• Possibility to clone existing components from another scene node
• Improve terrain rendering on mobile devices
• Refactoring of camera facing modes in BillboardSet & Text3D
• Additive alpha techniques for particle rendering
• Possibility to use CustomGeometry component for physics triangle mesh collision
• Access to 2D node coordinates for convenience when using 2D graphics features
• Save embedded textures in AssetImporter
• Use best matching fullscreen resolution if no exact match
• Use SDL_iPhoneSetAnimationCallback instead of blocking main loop
• Allow fast partial terrain updates by modifying the heightmap image
• API for setting image pixels by integer colors
• Refactor to remove the separate ShortStringHash class
• Deep clone functionality in Model resource
• Zone can define a texture which is available to shaders. Not used by default
• Allow logging from outside the main thread
• Log warnings for improper attempts to use events from outside main thread
• Improved CustomGeometry dynamic updates
• ConvexCast function in PhysicsWorld
• Screen to world space conversion functions in Viewport class
• Allow sending client rotation to server in addition to position
• Allow accessing and modifying the engine’s next timestep
• DeepEnabled mechanism for disabling node or UI element hierarchies and then restoring their own enabled state
• Allow to prevent closing a modal window with ESC
• Per-viewport control of whether debug geometry should render
• Optional interception of resource requests
• Readded optional slow & robust mode to AreaAllocator
• Optionally disable RigidBody mass update to allow fast adding of several CollisionShape components to the same node
• Runtime synchronization of resource packages from server to client
• Disable multisample antialiasing momentarily during rendering. Used by default for UI & quad rendering
• Glyph offset support in Font class
• Font class internal refactoring
• Allow to create AngelScript script objects by specifying the interface it implements
• Window position startup parameters
• Functions to get time since epoch & modify file’s last modified time
• Optionally auto-disable child elements of a scroll view when touch scrolling
• Allocate views permanently per viewport to allow querying for drawables, lights etc. reliably
• Allow to specify material techniques/passes that should not be used on mobile devices
• Reduced default shadow mapping issues on mobile devices
• Minor rendering optimizations
• Build system: possibility to build Urho3D without networking or 2D graphics functionality
• Build system: improved generated scripting documentation
• Build system: improved support for IDE’s in CMake scripts
• Build system: support up to Android NDK r10c and 64-bit ABIs
• Build system: numerous other improvements
• Editor: resource browser
• Editor: spawn window for random-generating objects
• Editor: allow either zoom or move from mouse wheel
• Editor: locate object by doubleclicking node in hierarchy
• Editor: take screenshots with F11, camera panning
• Editor: button in value edit fields that allows editing by mouse drag
• Updated SDL to 2.0.3.
• Updated AngelScript to 2.29.1
• Updated assimp
• Updated Recast/Detour
• Fix MinGW build issues
• Fix techniques referring to wrong shaders
• Fix Node::LookAt() misbehaving in certain situations
• Fix resize event not reporting correct window size if window is maximized at start
• Fix PhysicsWorld::GetRigidBodies() not using collision mask
• Fix zone misassignment issues
• Fix Lua not returning correctly typed object for UIElement::GetChild() & UIElement::GetParent()
• Fix uninitialized variables in 2D physics components
• Fix quad rendering not updating elapsed time uniform
• Fix forward rendering normal mapping issues by switching calculations back to world space
• Fix wrong logging level on Android
• Fix multiple subscribes to same event on Lua
• Fix missing Octree update in headless mode
• Fix crash when using FreeType to access font kerning tables
• Fix ReadString() endless loop if the string does not end
• Fix shadow mapping on OS X systems with Intel GPU
• Fix manually positioned bones being serialized properly
• Fix file checksum calculation on Android
• Fix accelerometer input on Android when device is flipped 180 degrees
• Fix missing or misbehaving Lua bindings
• Fix crashes in physics collision handling when objects are removed during it
• Fix shader live reload if previous compile resulted in error
• Fix named manual textures not recreating their GPU resource after device loss
• Fix skeleton-only model not importing in AssetImporter
• Fix terrain raycast returning incorrect position/normal
• Fix animation keyframe timing in AssetImporter if start time is not 0
• Fix storing Image resources to memory unnecessarily during cube/3D texture loading
• Fix to node transform dirtying mechanism and the TransformChanged() script function
• Fix returned documents directory not being writable on iOS
• Fix click to emptiness not closing a menu
• Fix FileWatcher notifying when file was still being saved. By default delay notification 1 second
• Fix .txml import in the editor
• Fix erroneous raycast to triangles behind the ray
• Fix crash when multiple AnimatedModels exist in a node and the master model is destroyed
• Fix missing Matrix4 * Matrix3x4 operator in script
• Fix various compile warnings that leak to applications using Urho3D
• Fix DebugHud update possibly being late one frame
• Fix various macros not being usable outside Urho3D namespace
• Fix erroneous layout with wordwrap text elements
• Fix debug geometry rendering on flipped OpenGL viewports
• Fix kNet debug mode assert with zero sized messages
• Fix not being able to stop and restart kNet server
• Fix AreaAllocator operation
• Fix possible crash with parented rigidbodies
• Fix missing network delta update if only user variables in a Node have been modified
• Fix to only search for June 2010 DirectX SDK, as earlier SDK’s will fail
• Fix wrong search order of added resource paths
• Fix global anisotropic filtering on OpenGL
• Fix animation triggers not working if trigger is at animation end
• Fix CopyFramebuffer shader name not being used correctly on case-sensitive systems
• Fix UI elements not receiving input when the window containing them is partially outside the screen to the left
• Fix occlusion rendering not working with counterclockwise triangles
• Fix material shader parameter animations going out of sync with other animations when the object using the material is not in view
• Fix CPU count functions on Android

The project homepage is available here.

21. November 2014

In the last video tutorial we used a graphic instead of text to create a Hello World.  This is because drawing text is actually a multi-step process in LibGDX and not really appropriate for a first tutorial.  It is however perfect for a second tutorial, so here we are! ;)

In this video we explore the difference between TTF and Bitmap fonts, show how to run the Hiero font generation tool in both Eclipse and IntelliJ IDEA then create and save a bitmap font.  We then explore the code needed to show a bitmap font on screen, including how to measure the results, apply color and text justification.

The video is available in up to 1080P on YouTube by clicking here.

The source code:

Initial Example – Loading a font and drawing text:

```
package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;

public class gdxtext extends ApplicationAdapter {
    SpriteBatch batch;
    BitmapFont font;

    @Override
    public void create () {
        batch = new SpriteBatch();
        font = new BitmapFont(Gdx.files.internal("Papy.fnt"));
    }

    @Override
    public void render () {
        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();

        //Example One -- Drawing Text
        font.draw(batch, "Hello World", Gdx.graphics.getWidth()/2, Gdx.graphics.getHeight()/2);

        batch.end();
    }

    @Override
    public void dispose () {
        batch.dispose();
        font.dispose();
    }
}
```

Example 2 – Measuring and centering text:

```
BitmapFont.TextBounds bounds = font.getBounds("Hello World");
font.draw(batch, "Hello World",
        Gdx.graphics.getWidth()/2 - bounds.width/2,
        Gdx.graphics.getHeight()/2 + bounds.height/2);
```

Example 3 – Multi-line Text

```
String debugString = "I took one, one cause you left me\n"
        + "Two, two for my family\n"
        + "Three, three for my heartache\n"
        + "Four, four for my headaches\n"
        + "Five, five for my sorrow\n";
BitmapFont.TextBounds bounds = font.getMultiLineBounds(debugString);
```

Example 4 -- Center justified text in the colour purple

```
font.setColor(Color.PURPLE);
font.drawMultiLine(batch,
        debugString,
        0,
        Gdx.graphics.getHeight()/2 + bounds.height/2,
        Gdx.graphics.getWidth(),
        BitmapFont.HAlignment.CENTER
);
```


Unity Releases Open Source AssetBundle Tool Prototype


26. October 2016

Unity Asset Bundles enable you to stream assets over the web and use them at run-time.  The problem is, they are tricky to use, need to be generated for each platform and often require each studio to create its own tool to make proper use of them.  Today Unity released a tool aimed at making asset bundles easier to create and manage.  Keep in mind the tool is just a prototype, so expect some bugs and usability issues.

From the announcement blog:

##### Make AssetBundle workflow visual, more intuitive

An easy to learn and flexible visual tool for creating a build pipeline that can support some of the complexities of real productions, the Asset Bundle graph tool provides a workflow with the following features:

• Create AssetBundles with no coding. Create nodes, connect them and simply press the build button to give you the AssetBundles you desire!
• Easily understand what happens when you build: which files are targeted, where they go and how they are included in which asset bundle files.
• Create pipeline in a rule-based model. So once you configure the pipeline and set up simple rules with name and path conventions, you can then completely forget about AssetBundle configurations. Share that rule with your team and just keep creating and adding assets as long as your team is following the conventions. When you press build, those new files are automatically targeted and you will get new AssetBundles accordingly.

We have come to the point when we’re ready to share this with you. Like many other tools we released recently, we are releasing this tool in open-source, under the MIT license. The tool is still in prototype phase, so we would be delighted if you gave it a try and tell us what you think. You can also freely modify the tool to match your needs, or join in its development.

The source code for this new tool is available on Bitbucket.  Yeah, not Github, BitBucket.