LibGDX Tutorial 13: Physics with Box2D Part 1: A Basic Physics Simulation

27. August 2014

 

Today we are going to look at implementing physics in LibGDX.  This technically isn’t part of LibGDX itself; instead it is implemented as an extension.  The physics engine used in LibGDX is the popular Box2D physics system, a library that has been ported to basically every single platform and language ever invented, or so it seems. We are going to cover how to implement Box2D physics in your 2D LibGDX game.  This is a complex subject, so it will require multiple parts.

 

If you’ve never used a Physics Engine before, we should start with a basic overview of what they do and how they work.  Essentially a physics engine takes scene information that you provide, then calculates “realistic” movement using physics calculations.  It goes something like this:

  • you describe all of the physics entities in your world to the physics engine, including bounding volumes, mass, velocity, etc
  • you tell the engine to update, either per frame or on some other interval
  • the physics engine calculates how the world has changed: what’s collided with what, how much gravity affects each item, current speed and position, etc
  • you take the results of the physics simulation and update your world accordingly.

 

Don’t worry, we will look at exactly how in a moment.

 

First we need to talk for a moment about creating your project.  Since Box2D is now implemented as an extension ( an optional LibGDX component ), you need to add it either manually or when you create your initial project.  Adding a library to an existing project is IDE dependent, so I am instead going to look at adding it during project creation… and totally not just because it’s really easy that way.

 

When you create your LibGDX project using the Project Generator, you simply specify which extensions you wish to include and Gradle does the rest.  In this case you simply check the box next to Box2d when generating your project like normal:

 

[screenshot: the LibGDX Project Generator with the Box2D extension checked]

 

… and you are done.  You may be asking, hey, what about Box2dlights?  Nope, you don’t currently need that one.  Box2dlights is a project for simulating lighting and shadows based on the Box2d physics engine.  You may notice in that list another entry named Bullet.  Bullet is another physics engine, although more commonly geared towards 3D games; possibly more on that at a later date.  Just be aware that if you are working in 3D, Box2d isn’t of much use to you, but there are alternatives.

 

Ok, now that we have a properly configured project, let’s take a look at a very basic physics simulation.  We are simply going to take the default LibGDX graphic and apply gravity to it, about the simplest simulation you can make that actually does something.  Code time!

 

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.math.Vector2;
import com.badlogic.gdx.physics.box2d.*;

public class Physics1 extends ApplicationAdapter {
    SpriteBatch batch;
    Sprite sprite;
    Texture img;
    World world;
    Body body;

    @Override
    public void create() {

        batch = new SpriteBatch();
        // We will use the default LibGdx logo for this example, but we need a
        // Sprite since it's going to move
        img = new Texture("badlogic.jpg");
        sprite = new Sprite(img);

        // Center the sprite in the top/middle of the screen
        sprite.setPosition(Gdx.graphics.getWidth() / 2 - sprite.getWidth() / 2,
                Gdx.graphics.getHeight() / 2);

        // Create a physics world, the heart of the simulation.  The Vector
        // passed in is gravity
        world = new World(new Vector2(0, -98f), true);

        // Now create a BodyDefinition.  This defines the physics object's type
        // and position in the simulation
        BodyDef bodyDef = new BodyDef();
        bodyDef.type = BodyDef.BodyType.DynamicBody;
        // We are going to use 1 to 1 dimensions, meaning 1 in the physics engine
        // is 1 pixel
        // Set our body to the same position as our sprite
        bodyDef.position.set(sprite.getX(), sprite.getY());

        // Create a body in the world using our definition
        body = world.createBody(bodyDef);

        // Now define the dimensions of the physics shape
        PolygonShape shape = new PolygonShape();
        // We are a box, so this makes sense, no?
        // Basically set the physics polygon to a box with the same dimensions
        // as our sprite.  Note that setAsBox() takes half-width and half-height
        shape.setAsBox(sprite.getWidth() / 2, sprite.getHeight() / 2);

        // FixtureDef is a confusing expression for physical properties
        // Basically this is where, in addition to defining the shape of the
        // body, you also define its properties like density, restitution and
        // others we will see shortly
        // If you are wondering, density and area are used to calculate the
        // overall mass
        FixtureDef fixtureDef = new FixtureDef();
        fixtureDef.shape = shape;
        fixtureDef.density = 1f;

        Fixture fixture = body.createFixture(fixtureDef);

        // Shape is the only disposable of the lot, so get rid of it
        shape.dispose();
    }

    @Override
    public void render() {

        // Advance the world by the amount of time that has elapsed since the
        // last frame
        // Generally in a real game, don't do this in the render loop, as you are
        // tying the physics update rate to the frame rate, and vice versa
        world.step(Gdx.graphics.getDeltaTime(), 6, 2);

        // Now update the sprite's position to match its newly updated physics body
        sprite.setPosition(body.getPosition().x, body.getPosition().y);

        // You know the rest...
        Gdx.gl.glClearColor(1, 1, 1, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        batch.draw(sprite, sprite.getX(), sprite.getY());
        batch.end();
    }

    @Override
    public void dispose() {
        // Hey, I actually did some clean up in a code sample!
        img.dispose();
        world.dispose();
    }
}

 

The program running:

 

[animation: the logo sprite falling under gravity]

 

What’s going on here is mostly explained in the comments, but I will give a simpler overview in English.  Basically, when using a physics engine, you create a physical representation for each corresponding object in your game.  In this case we created a physics object ( Body ) that goes along with our sprite.  It’s important to realize there is no actual relationship between these two objects.  There are a couple of components that go into a physics body: BodyDef, which defines what type of body it is ( more on this later; for now realize DynamicBody means a body that is updated and capable of movement ), and FixtureDef, which defines the shape and physical properties of the Body.  Of course, there is also the World, which is the actual physics simulation.
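For reference, here is a quick sketch of the three body types BodyDef supports ( we only use DynamicBody in this example ):

BodyDef bodyDef = new BodyDef();

// DynamicBody: fully simulated; moved by forces, gravity and collisions
bodyDef.type = BodyDef.BodyType.DynamicBody;

// StaticBody: never moves; ideal for floors, walls and other level geometry
bodyDef.type = BodyDef.BodyType.StaticBody;

// KinematicBody: unaffected by forces; moves only via velocities you set
bodyDef.type = BodyDef.BodyType.KinematicBody;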

 

So, basically we created a Body, which is the physical representation of our Sprite in the physics simulation.  Then in render() we call the incredibly important step() method.  Step is what advances the physics simulation… basically think of it as the play button.  The physics engine then calculates all the various mathematics that have changed since the last call to step.  The first value we pass in is the amount of time that has elapsed since the last update.  The next two values control the amount of accuracy in contact/joint calculations for velocity and position.  Basically, the higher the values, the more accurate your physics simulation will be, but the more CPU intensive as well.  Why 6 and 2?  ‘cause that’s what the LibGDX site recommends and that works for me.  At the end of the day these are values you can tweak for your individual game.  The one other critical takeaway here is that we update the sprite’s position to match the newly updated body’s position.  Once again, in this example, there is no actual link between a physics body and a sprite, so you have to do it yourself.

 

There you go, the world’s simplest physics simulation.  There are a few quick topics to discuss before we move on.  First, units.

 

This is an important and sometimes tricky concept to get your head around with physics systems.  What does 1 mean?  One what?  The answer is, whatever the hell you want it to be, just be consistent about it!  In this particular case I used pixels.  Therefore 1 unit in the physics engine represents 1 pixel on the screen.  So when I said gravity is (0,-98), that means gravity is applied at a rate of –98 pixels along the y axis per second.  Just as commonly, 1 in the physics engine could be meters, feet, kilometers, etc.; you then use a custom ratio for translating to and from screen coordinates.  Most physics systems, Box2D included, really don’t like you mixing your scales, however.  For example, if you have a universe simulation where 1 == 100 miles, and then you want to calculate the movement of an ant at 0.0000001 x 100 miles per hour, you will break the simulation, hard.  Find a scale that works well with the majority of your game and stick with it.  Extremely large and extremely small values within that simulation will cause problems.
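If you do end up working in units other than pixels, a small helper keeps the conversion in one place.  This is just a sketch; the class name and the 100 pixels per meter ratio are assumptions you would tune for your own game:

public final class PhysicsScale {
    // Hypothetical ratio: treat 100 on-screen pixels as 1 physics-world meter
    public static final float PIXELS_PER_METER = 100f;

    public static float toMeters(float pixels) { return pixels / PIXELS_PER_METER; }
    public static float toPixels(float meters) { return meters * PIXELS_PER_METER; }
}

You would then convert at the boundary of the simulation: bodyDef.position.set(PhysicsScale.toMeters(sprite.getX()), PhysicsScale.toMeters(sprite.getY())) on the way in, and sprite.setPosition(PhysicsScale.toPixels(body.getPosition().x), PhysicsScale.toPixels(body.getPosition().y)) on the way back out.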

 

Finally, a bit of a warning about how I implemented this demo, and hopefully something I will cover properly at a later date.  In this case I updated the physics system in the render loop.  This is a possibility, but generally wasteful.  It’s fairly common to run your physics simulation at a fixed rate ( 30Hz and 60Hz being two of the most common, but lower is also a possibility if you are processing constrained ) and your render loop as fast as possible.
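To give you an idea of what that looks like, here is a rough sketch of the accumulator pattern commonly used for fixed-rate updates.  This is not something LibGDX provides for you; you would call updatePhysics(Gdx.graphics.getDeltaTime()) from render():

private float accumulator = 0;
private static final float TIME_STEP = 1 / 60f;  // run physics at a fixed 60Hz
private static final int VELOCITY_ITERATIONS = 6;
private static final int POSITION_ITERATIONS = 2;

private void updatePhysics(float deltaTime) {
    // Clamp huge frame spikes so the simulation can't snowball out of control
    accumulator += Math.min(deltaTime, 0.25f);

    // Consume the accumulated time in fixed-size steps; rendering can run as
    // fast as it likes while the physics advances at a constant rate
    while (accumulator >= TIME_STEP) {
        world.step(TIME_STEP, VELOCITY_ITERATIONS, POSITION_ITERATIONS);
        accumulator -= TIME_STEP;
    }
}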

 

In the next part we will give our object something to collide with, stay tuned.





Unity releases 4.6 Beta and announces open sourcing of new UI System

21. August 2014

Today ( well, yesterday technically ) Unity announced the open beta of Unity 4.6.  Along with that announcement was news that they would be open sourcing much of their new UI system, along with other open source projects that I didn’t actually know about.

 

First, let’s look at what’s new in Unity 4.6:

Features

  • New UI System: Design UIs for your game or application using Unity's powerful new component based UI framework and visual tools.
    • Create UIs in screen space with or without perspective and with support for pixel perfect alignment, or in world space for simple creation of in-world interactions.
    • The layout system combines anchoring to sides, corners, or custom points in the parent container with the ability to stretch to a percentage or full extent of the parent width or height. Simple Scene View handles allow intuitive visual control and tweaking of the positional behavior.
    • Build up designs and customized controls composed of image, text, masking, and effects. All graphical components have full support for custom materials and lighting.
    • Built-in controls for buttons, sliders, scroll views, input fields and more have been designed for full cross platform deployment, supporting touch controls and mouse as well as directional navigation such as arrow keys or gamepad controls.
    • The UI system integrates with Unity’s animation smoothly, meaning that you can use state machines and other animation features to control your UI elements and panels.
    • The UI system is designed to be extensible so that it can meet the needs of your projects. Combine existing UI elements together to create new controls, extend existing controls, or write new controls from scratch.
    • Easily set up callbacks for your UI controls in the Inspector using the new persistent delegate system. A lot of functionality can be hooked up to your UI with no programming needed!
  • New Rect Tool: In previous versions, rect handles were used only in 2D mode and only for SpriteRenderers. With the introduction of the new UI system that is a hybrid of 2D and 3D, more explicit control is needed to be able to efficiently position elements. The new Rect tool in the main Unity toolbar can be used for sprites, UI elements, and any other object as well. The rect handles have also received an overhaul to be more consistent with the other tools, and to be more useful for a wide variety of uses.
    • Rect handles now support Pivot/Center mode as set in the toolbar, as well as Local/Global mode as set in the toolbar.
    • Rect handle squashing (holding Control/Command while dragging a side) now supports preserving volume rather than area when holding down Shift.
    • Rect handles for objects that appear small in the Scene View now show a disc handle that can be used to move the object in the plane of the rect. When zooming further in, the disc will fade out as the normal rect handles fade in.
  • Extensible Event Messaging System: Use and extend the new Event System framework. The system is used for the new UI to send and receive events, but it can be extended to support custom input devices and custom event logic.
    • Inbuilt support for touch and standalone.
    • Supports 2d, 3d, and UI components.
    • Extensible; add custom input handling and custom events.
  • Persistent Delegates (Unity Events): A new way to set callbacks via the UI and have them be built into a player. This helps non-programmers quickly add functionality to your application within the editor.

 

Now the new open source announcement:

We’re starting this project by making the recently-updated source to the Unity Test Tools available on BitBucket in a repository ready for you to fork, modify, and even open pull requests with contributions if you wish.


The next component we plan to give you the source to (unless we get something else ready before then!) is the long-awaited new UI system. We’re convinced you’ll do awesome things with the source once you get it in your hands.


Beyond that, we don’t have a concrete plan, but we have a lot of things in the pipeline. These components (like the new UI system) will all be isolated from Unity in such a way that you can modify them and use your own modified version with the official public Unity release.

 

 

The source code is released under the MIT/X11 open source license ( a very liberal license as far as OSS goes, with none of the GPL gotchas ) and is available to everyone.




Adventures in Phaser with TypeScript–Audio

19. August 2014

 

Now that we’ve covered input and graphics, it’s time we move on to audio.  Here is a bit of a warning upfront…

 

HTML5 audio sucks.

 

Yeah, that’s about it in a nutshell.  This is one of those areas where games in HTML5 really suffer, and sadly it doesn’t seem to be going away any time soon.  There are two major things to be aware of right up front.

 

First, audio file formats.  Different browsers support different file formats, and sadly there is no single format they all support.  You can read a pretty good chart on the topic right here.  In a nutshell, some support OGG, some support MP3, some support WAV and some support M4A.  To make things even more fun, MP3 is a license-encumbered format that could result in heavy licensing fees if your game is successful.  In reality, it’s never really enforced, but just the possibility should be enough to make you wary.  Fortunately, Phaser provides a way to deal with all of this that you will see shortly.

 

Second, well, Safari on iOS really kinda stinks at audio for a couple of glaring reasons.  First and foremost, you can only play one sound or video at a time…  Yeah, really.  If you play a second sound, the first immediately stops.  Kinda. Big. Deal.  You can however work around this using an audio sprite, which is basically a bunch of audio files smashed together into a single file.  A second major failing is that you can’t play audio on page load in Safari.  This means if you want to load your game up to a title screen with some music playing… you can’t.  Audio can only be started in response to a user’s touch.  Yeah, this sucks too.  This one unfortunately you cannot work around, short of using a wrapper like CocoonJS to “natively” deploy your HTML5 game.

 

Through all of this, you are probably going to need a tool to either merge your audio files or convert them to a variety of formats.  Fortunately there is a free and excellent audio editor named Audacity available that makes the process fairly painless.  In order to follow this tutorial, you are going to have to get an audio file and save it in MP3 and OGG format and add it to your project.

 

OK, enough about the HTML5 audio warts, let’s get to some code! 

 

/// <reference path="phaser.d.ts"/>
class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', { create: this.create, preload: this.preload });
    }
    preload() {
        // Load both formats under one key; Phaser picks one the browser supports
        this.game.load.audio("GameMusic", ["song.mp3","song.ogg"]);
    }
    create() {
        this.sound = this.game.add.audio('GameMusic');
        this.sound.play();
    }
}

window.onload = () => {
    var game = new SimpleGame();
};

 

It’s important to note that this example won’t actually work in Safari on iOS, due to the limitation of only being able to play audio in response to a user’s touch.  You would have to change it so the play() call is done in a touch handler, like the sketch below.
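A minimal sketch of that workaround: defer play() until the first touch or click, instead of calling it directly in create().  addOnce() removes the listener after it fires, so the music only starts once:

/// <reference path="phaser.d.ts"/>
class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', { create: this.create, preload: this.preload });
    }
    preload() {
        this.game.load.audio("GameMusic", ["song.mp3","song.ogg"]);
    }
    create() {
        this.sound = this.game.add.audio('GameMusic');
        // Starting playback inside an input handler satisfies Safari's
        // "audio only in response to a user touch" restriction
        this.game.input.onDown.addOnce(() => {
            this.sound.play();
        });
    }
}

window.onload = () => {
    var game = new SimpleGame();
};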

 

In either case, what you are doing is preloading audio using game.load.audio().  The parameters are the key used to refer to the audio file and a list of files to load.  This is where the magic happens: you provide all of the file formats you support, then Phaser will serve the one that performs best on the browser the user is using.  M4A files perform the best on iOS, and don’t have the legal encumbrance that MP3 does, so OGG and M4A would probably be all you need, but I may be wrong here.  If in doubt, provide all 3.
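For example, assuming you also exported an M4A version ( the song.m4a filename here is hypothetical ), the preload call would simply become:

    // Phaser picks whichever of these formats the current browser can play
    this.game.load.audio("GameMusic", ["song.m4a", "song.ogg", "song.mp3"]);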

 

Now let’s look at an example that allows you to control playback of the sound file:

 

/// <reference path="phaser.d.ts"/>
class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;
    playButton: Phaser.Button;
    pauseButton: Phaser.Button;
    stopButton: Phaser.Button;
    volUpButton: Phaser.Button;
    volDownButton: Phaser.Button;
    muteButton: Phaser.Button;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            create: this.create, preload: this.preload, render: this.render
        });
    }

    preload() {
        this.game.load.audio("GameMusic", ["song.mp3", "song.ogg"]);
        this.game.load.image("button", "button.png", false);
    }

    render() {
        this.game.debug.soundInfo(this.sound, 0, 100);
    }

    create() {
        this.sound = this.game.add.audio('GameMusic');

        // Set up sound event handlers on the sound object
        this.sound.onPlay.add(() => { alert("Played"); });
        this.sound.onPause.add(() => { alert("Paused"); });

        // Play Button
        this.playButton = this.game.add.button(0, 0, "button", () => {
            if (this.sound.currentTime > 0)
                this.sound.resume();
            else
                this.sound.play();
        }, this);
        this.playButton.addChild(new Phaser.Text(this.game, 17, 18, "Play", { fill: '#ff0000' }));

        // Pause Button
        this.pauseButton = this.game.add.button(95, 0, "button", () => {
            this.sound.pause();
        }, this);
        this.pauseButton.addChild(new Phaser.Text(this.game, 12, 18, "Pause", { fill: '#ff0000' }));

        // Stop Button
        this.stopButton = this.game.add.button(190, 0, "button", () => {
            if (this.sound.isPlaying) {
                this.sound.stop();
                this.sound.currentTime = 0;
            }
        }, this);
        this.stopButton.addChild(new Phaser.Text(this.game, 17, 18, "Stop", { fill: '#ff0000' }));

        // Volume Up Button
        this.volUpButton = this.game.add.button(300, 0, "button", () => {
            this.sound.volume += 0.1;
        }, this);
        this.volUpButton.addChild(new Phaser.Text(this.game, 17, 18, "Vol +", { fill: '#ff0000' }));

        // Volume Down Button
        this.volDownButton = this.game.add.button(400, 0, "button", () => {
            this.sound.volume -= 0.1;
        }, this);
        this.volDownButton.addChild(new Phaser.Text(this.game, 17, 18, "Vol -", { fill: '#ff0000' }));

        // Mute Button
        this.muteButton = this.game.add.button(500, 0, "button", () => {
            // Global mute!  Use this.sound.mute to mute a single sound
            this.game.sound.mute = !this.game.sound.mute;
        }, this);
        this.muteButton.addChild(new Phaser.Text(this.game, 17, 18, "Mute", { fill: '#ff0000' }));
    }
}

window.onload = () => {
    var game = new SimpleGame();
};

 

Here is the resulting app:

 

 

It’s a bit of a monster, but most of the code is actually just wiring up the buttons.  There are a few important takeaway points here.  First, you can work with the local sound object, or with all sounds globally using game.sound.  Second, each action on the sound file ( play, resume, stop, etc. ) has an appropriate event handler you can implement; I only did a couple in this example.  Finally, volume is a value from 0 to 1.  You can go above or below this range, but it won’t actually do anything.  All said, once you get over the file format issues, playing audio is relatively straightforward.

 

Finally, let’s touch on audio sprites again for a second.  As mentioned earlier, an audio sprite is actually a bunch of audio files smashed together into a single file.

 

Consider the following sound effects: a gun cocking ( http://www.freesound.org/people/woodmoose/sounds/177054/ ) and a gun shot ( http://www.freesound.org/people/18hiltc/sounds/228611/ ).  Load your first sound effect into Audacity, like so:

[screenshot: the first sound effect loaded in Audacity]

 

Take note of the end of the file, in this case 0.785 seconds.

 

Now, select File->Import->Audio

[screenshot: the File->Import->Audio menu in Audacity]

 

Import your additional sound effects, in this case I’m opening the gunshot file.  You should now have two tracks in Audacity, like so:

[screenshot: two tracks loaded in Audacity]

 

Now select the bottom track by double clicking.  It should highlight in gray, like so:

[screenshot: the bottom track selected, highlighted in gray]

Now copy the selected track ( CTRL + C, or Edit->Copy ), click at the end of the first track and paste ( CTRL + V, or Edit->Paste ).  You should now have a single track that looks like this:

[screenshot: the two effects combined into a single track]

 

Now save this file ( you might as well create OGG, MP3 and M4A versions while you are at it ).

 

Now let’s take a look at how we use it in code.

 

/// <reference path="phaser.d.ts"/>
class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            create: this.create, preload: this.preload, render: this.render
        });
    }

    preload() {
        this.game.load.audio("sfx", ["GunSounds.ogg", "GunSounds.wav"]);
    }

    render() {
        this.game.debug.soundInfo(this.sound, 0, 100);
    }

    create() {
        this.sound = this.game.add.audio('sfx');

        // Name each section of the file, with its location within the file
        this.sound.addMarker("gunCock", 0, 0.785);
        this.sound.addMarker("gunShoot", 0.786, 1.49);

        // Play the first marker, then the second as soon as the first completes
        this.sound.play("gunCock");
        this.sound.onMarkerComplete.add(() => {
            this.sound.play("gunShoot");
        });
    }
}

window.onload = () => {
    var game = new SimpleGame();
};

 

This plays the first sound, then immediately once it is complete, plays the other.  As you can see, you do so by setting markers within the entire sound file.  0.785 is the length of the first sound effect, then 1.49 is the length of the entire file.  You simply name each marker, then give its start and end locations within the file.  You can get these values using an audio editor such as Audacity.  This allows you to load all of your similar sound effects into a single, quicker-to-load file, and should help you work around the single sound stream limitation of iOS.





Xamarin release CocosSharp game library

12. August 2014

Xamarin, the makers of Xamarin Studio, a cross-platform .NET development kit ( and the underpinnings of Unity and PlayStation Mobile ), announced the release of CocosSharp.  In a nutshell, CocosSharp is:

 

This project is a fork of the Cocos2D-XNA project, which is a port of the C++-based Cocos2D-X API, which in turn is a cross-platform port of the cocos2d-iphone project.

The focus of this fork is to create a library that is idiomatically correct for C# and remove many of the historical warts inherited from the straight ports from C++ and Objective-C.


So basically it's a C# port of Cocos2D-X ( built using MonoGame ), which is a C++ port of the popular iOS library Cocos2D, which was written in Objective-C... which itself was a port of a Python library! Clear as mud?


What is kind of nice about this release is that it fully supports Visual Studio, so even if you don't use Xamarin, you can still work with CocosSharp. The following details come from the Xamarin blog:


Today we are introducing CocosSharp, a cross-platform library for building 2D games.

CocosSharp blends the power of the Cocos2D programming model with C# and the .NET Framework. Developers familiar with Cocos2D will feel right at home with CocosSharp, as the API has been designed to follow C# and .NET idioms.

NuGet

We wanted to make CocosSharp easy to adopt in your application, so we created NuGet packages that are available both on Visual Studio and Xamarin Studio. Our NuGet packages come with Portable Class Libraries, allowing developers to build portable class library components on top of it, or build their game logic in a portable class library and reuse the code across all platforms, including iOS, Android, Mac, Windows Desktop, Windows Store and Windows Phone.

Samples

To get you started, we have provided many sample programs; in particular, check out Angry Ninjas:

[screenshot: the Angry Ninjas sample]

Open Source

CocosSharp is an open source library and is built on top of the MonoGame engine and the fine work from the Cocos2D, Cocos2D-x and Cocos2D-XNA communities. We took those efforts and improved upon them.

Learn More at Evolve

Join us for Xamarin Evolve 2014 in Atlanta, Georgia this October, where we’ll have several sessions covering game development in C#, including Mike Bluestein’s talk, CocosSharp: C# Games that Run Everywhere. Check out the Xamarin Evolve 2014 site to learn more.

CocosSharp and Xamarin Indie

CocosSharp is available in all Xamarin Subscriptions, including our Xamarin Indie plan. Sign up for Xamarin Indie in August to take advantage of our $25 month-to-month pricing experiment, and start building games with Xamarin and CocosSharp today.



CocosSharp is available on Github.

 




Adventures in Phaser with TypeScript–Handling Mouse/Touch Input

11. August 2014

 

In the previous tutorial… which, I will admit, I wrote some time ago, sorry about that… we looked at handling keyboard input using Phaser.  In this tutorial we are going to look at handling mouse and touch events.  Just like the last part, this one is going to be relatively light on explanation and high on code.

 

Ok, let’s jump right in.  The first example shows how to poll for mouse/touch information:

 

/// <reference path="phaser.d.ts"/>

// This code demonstrates polling for touch or mouse clicks
class SimpleGame {
    game: Phaser.Game;
    jetSprite: Phaser.Sprite;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            create: this.create, preload: this.preload, update: this.update
        });
    }

    preload() {
        var loader = this.game.load.image("jet", "jet.png");
    }

    create() {
        var image = <Phaser.Image>this.game.cache.getImage("jet");
        this.jetSprite = this.game.add.sprite(
            this.game.width / 2 - image.width / 2,
            this.game.height / 2 - image.height / 2,
            "jet");

        this.jetSprite.pivot.x = this.jetSprite.width / 2;
        this.jetSprite.pivot.y = this.jetSprite.height / 2;
    }

    update() {

        // You can poll mouse status
        if (this.game.input.activePointer.isDown) {
            // This will be set to true if any mouse button is pressed or a finger is touched
            this.jetSprite.position.set(this.game.input.mousePointer.x,
                this.game.input.mousePointer.y);

            // You can test if the user is using a mouse
            if (this.game.input.activePointer.isMouse) {
                // In the case of a mouse, you can check mouse button status
                if (this.game.input.activePointer.button == Phaser.Mouse.RIGHT_BUTTON) {

                    // We just want to clear it so this doesn't fire over and over; don't do this in production
                    this.game.input.activePointer.reset();
                    alert("Right button pressed");
                }
            }
            else {
                // On the other hand, if you are dealing with touch, you can have multiple touches/pointers
                // By default there are two pointers defined, so you can have up to two touches.
                // You can add more pointers using input.addPointer();
                if (this.game.input.pointer1.isDown && this.game.input.pointer2.isDown)
                    alert("Multitouch!");
            }
        }
    }
}

window.onload = () => {
    var game = new SimpleGame();
};

 

 

Pointer is an abstraction for both touch and mouse.  As you can see above, there are multiple pointers ( input.pointer1, input.pointer2, etc… ) but by default only two will be enabled.  If you want to handle multitouch with more than 2 fingers you need to register more using addPointer().   You can add up to 10 pointers total.  pointer1 is the first finger to touch the screen, pointer2 the second, etc.
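As a quick sketch, registering and polling a third pointer inside your game class would look something like this ( in create() and update() respectively ):

    create() {
        // Pointers 1 and 2 exist by default; this enables input.pointer3
        this.game.input.addPointer();
    }

    update() {
        // True only while a third finger is touching the screen
        if (this.game.input.pointer3.isDown) {
            alert("Three finger multitouch!");
        }
    }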

 

Now let’s take a look at handling mouse input using callbacks instead of polling:

 

/// <reference path="phaser.d.ts"/>

// This code demonstrates handling touch or mouse input using callbacks
class SimpleGame {
    game: Phaser.Game;
    jetSprite: Phaser.Sprite;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            create: this.create, preload: this.preload
        });
    }

    preload() {
        var loader = this.game.load.image("jet", "jet.png");
    }

    create() {
        var image = <Phaser.Image>this.game.cache.getImage("jet");
        this.jetSprite = this.game.add.sprite(
            this.game.width / 2 - image.width / 2,
            this.game.height / 2 - image.height / 2,
            "jet");

        this.jetSprite.pivot.x = this.jetSprite.width / 2;
        this.jetSprite.pivot.y = this.jetSprite.height / 2;

        // You can handle mouse input by registering a callback as well
        // The following registers a callback that will be called each time the mouse is moved
        this.game.input.moveCallback = (pointer:Phaser.Pointer,x:number,y:number) => {
            this.jetSprite.position.set(x, y);
        }

        // This one registers a handler that will be called when a mouse button or finger goes down
        this.game.input.onDown.add(SimpleGame.prototype.mouseDown);

    }

    mouseDown(event:MouseEvent) {
        alert("Mouse is down " + event.button);
    }

}

window.onload = () => {
    var game = new SimpleGame();
};

 

 

In this case we register a function that will be called whenever the mouse moves; here I am using an anonymous delegate.  You can also handle mouse down or touch events.  The format is slightly different, as onDown is a Signal instead of a callback.  That said, a Signal is basically a wrapper around a callback function, so it’s not really all that different… except of course that with Signals, you can register multiple handlers for the same event.
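For example, nothing stops you from stacking several listeners on the same Signal; a quick sketch:

    // Both handlers fire on every mouse down / touch
    this.game.input.onDown.add(() => { alert("first handler"); });
    this.game.input.onDown.add(() => { alert("second handler"); });

    // addOnce() automatically removes its listener after the first call
    this.game.input.onDown.addOnce(() => { alert("fires only once"); });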

 

Finally, in the previous two examples, we handled the input at an application level.  If it makes sense for your game, you can actually handle input at the Sprite level.  Like so:

 

/// <reference path="phaser.d.ts"/>

// In this example we illustrate input events can be applied to Sprites
class SimpleGame {
    game: Phaser.Game;
    jetSprite: Phaser.Sprite;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content', {
            create: this.create, preload: this.preload
        });
    }

    preload() {
        var loader = this.game.load.image("jet", "jet.png");
    }

    create() {
        var image = <Phaser.Image>this.game.cache.getImage("jet");
        this.jetSprite = this.game.add.sprite(
            this.game.width / 2 - image.width / 2,
            this.game.height / 2 - image.height / 2,
            "jet");

        this.jetSprite.pivot.x = this.jetSprite.width / 2;
        this.jetSprite.pivot.y = this.jetSprite.height / 2;

        // First enable the sprite to receive input
        this.jetSprite.inputEnabled = true;
        
        // Then add an event handler for input over
        this.jetSprite.events.onInputOver.add(() => {
            alert("The mouse passed over the sprite!");
        });
    }


}

window.onload = () => {
    var game = new SimpleGame();
};

 

 

The first thing you need to do is tell Phaser that your Sprite will handle input by setting inputEnabled to true.  Next we add an input handler for onInputOver, which once again is a Signal.  There are a number of events that a Sprite can respond to, all predictably enough in the events class.
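A few of the other Signals available on sprite.events, sketched using the same pattern as onInputOver:

    this.jetSprite.events.onInputDown.add(() => { alert("Sprite pressed!"); });
    this.jetSprite.events.onInputUp.add(() => { alert("Sprite released!"); });
    this.jetSprite.events.onInputOut.add(() => { alert("Pointer left the sprite!"); });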

 

As a side note, part of the delay in updating this tutorial series is that I was actually intending to write one about using GamePads.  First I started writing code, and my XBox 360 wireless receiver died horribly.  Apparently this is a very common problem with them; a tiny, fragile fuse broke.  I then ordered a replacement from Amazon and it took three weeks to arrive.  Then when it did arrive, well…

 

Let’s just say Gamepad support in HTML5 is complete and utter garbage.  There are a bunch of projects in the works, but they are all a long way away from being actually usable.  If you are looking to add gamepad support to your HTML5 project, expect it to be very browser specific.  By this I actually mean specific to the exact version of the browser…  The APIs are changing constantly and breaking just as often.  In my opinion, it’s simply not worth it at this point.
