Adventures in Phaser with TypeScript – Audio

 

 

Now that we’ve covered input and graphics, it’s time to move on to audio.  Here is a bit of a warning up front…

 

HTML5 audio sucks.

 

Yeah, that’s about it in a nutshell.  This is one of those areas where games in HTML5 really suffer, and sadly it doesn’t seem to be going away any time soon.  There are two major things to be aware of right up front.

 

First, audio file formats.  Different browsers support different file formats, and sadly not all the same ones.  You can read a pretty good chart on the topic right here.  In a nutshell, some support ogg, some support mp3, some support wav and some support m4a.  To make things even more fun, mp3 is a license-encumbered format that could result in heavy licensing fees if your game is successful.  In reality, it’s rarely enforced, but just the possibility should be enough to make you wary.  Fortunately, Phaser provides a way to deal with all of this, as you will see shortly.
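Phaser handles this format juggling for you, but the idea behind it is simple enough to sketch.  The function below is a hypothetical stand-in, not Phaser’s actual implementation: walk the candidate files in order and return the first one the browser claims it can decode (the `canPlay` callback stands in for the browser’s real `HTMLAudioElement.canPlayType` check):

```typescript
// Hypothetical sketch of audio format selection, NOT Phaser's real code.
// `canPlay` stands in for the browser's canPlayType support check.
function pickPlayableFile(
    files: string[],
    canPlay: (ext: string) => boolean
): string | undefined {
    for (const file of files) {
        const parts = file.split(".");
        const ext = (parts[parts.length - 1] || "").toLowerCase();
        if (canPlay(ext)) {
            return file;
        }
    }
    return undefined; // no supported format found
}

// Simulate a browser that supports ogg but not mp3:
const chosen = pickPlayableFile(
    ["song.mp3", "song.ogg"],
    (ext) => ext === "ogg"
);
console.log(chosen); // "song.ogg"
```

This is why, in the loading code later on, you hand Phaser a list of files rather than a single one.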

 

Second, well, Safari on iOS really kinda stinks at audio, for a couple of glaring reasons.  First and foremost, you can only play one sound or video at a time…  Yeah, really.  If you play a second sound, the first immediately stops.  Kinda. Big. Deal.  You can, however, work around this using an audio sprite, which is basically a bunch of audio files smashed together into a single file.  A second major failing is that you can’t play audio on page load in Safari.  This means if you want to load your game up to a title screen with some music playing… you can’t.  Audio can only be started in response to a user’s touch.  Yeah, this sucks too.  This one you unfortunately cannot work around, short of using a wrapper like CocoonJS to “natively” deploy your HTML5 game.

 

Through all of this, you are probably going to need a tool to either merge your audio or convert it to a variety of formats.  Fortunately there is a free and excellent audio editor named Audacity that makes the process fairly painless.  In order to follow this tutorial, you are going to have to get an audio file, save it in mp3 and ogg format, and add it to your project.

 

OK, enough about the HTML5 audio warts, let’s get to some code! 

 

```typescript
/// <reference path="phaser.d.ts"/>

class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content',
            { create: this.create, preload: this.preload });
    }

    preload() {
        this.game.load.audio("GameMusic", ["song.mp3", "song.ogg"]);
    }

    create() {
        this.sound = this.game.add.audio('GameMusic');
        this.sound.play();
    }
}

window.onload = () => {
    var game = new SimpleGame();
};
```

 

It’s important to note that this example won’t actually work in Safari on iOS, due to the limitation of only being able to play audio in response to a user touch.  You would have to change it so the play() call is done in a touch handler.
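In Phaser terms that generally means wiring the play() call to the first pointer event, along the lines of `this.game.input.onDown.addOnce(() => this.sound.play(), this);`.  The stand-in below is not Phaser code, but it demonstrates the “fire once, on first tap” behaviour that addOnce gives you:

```typescript
// Minimal stand-in for a Phaser-style Signal with addOnce():
// the handler fires on the FIRST dispatch only, which is exactly
// what you want for "start music on first touch" on iOS Safari.
class MiniSignal {
    private onceHandlers: Array<() => void> = [];

    addOnce(handler: () => void): void {
        this.onceHandlers.push(handler);
    }

    dispatch(): void {
        const handlers = this.onceHandlers;
        this.onceHandlers = [];       // detach before firing: run once
        handlers.forEach((h) => h());
    }
}

// Usage sketch: "onDown" plays the sound on the first tap only.
let playCount = 0;
const onDown = new MiniSignal();
onDown.addOnce(() => { playCount++; });  // would be this.sound.play()

onDown.dispatch(); // first tap: handler runs
onDown.dispatch(); // second tap: handler already removed
console.log(playCount); // 1
```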

 

What you are doing here is preloading audio using game.load.audio().  The parameters are the key used to refer to the audio file, and a list of files to load.  This is where the magic happens: you provide all the supported file formats you have, and Phaser will serve the one that performs best in the user’s browser.  M4A files perform the best on iOS, and don’t have the legal encumbrance that MP3 does, so OGG and M4A would probably be all you need, but I may be wrong here.  If in doubt, provide all 3.

 

Now let’s look at an example that allows you to control playback of the sound file:

 

```typescript
/// <reference path="phaser.d.ts"/>

class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;
    playButton: Phaser.Button;
    pauseButton: Phaser.Button;
    stopButton: Phaser.Button;
    volUpButton: Phaser.Button;
    volDownButton: Phaser.Button;
    muteButton: Phaser.Button;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content',
            { create: this.create, preload: this.preload, render: this.render });
    }

    preload() {
        this.game.load.audio("GameMusic", ["song.mp3", "song.ogg"]);
        this.game.load.image("button", "button.png", false);
    }

    render() {
        this.game.debug.soundInfo(this.sound, 0, 100);
    }

    create() {
        this.sound = this.game.add.audio('GameMusic');

        // Set up sound event handlers on the sound object
        this.sound.onPlay.add(() => {
            alert("Played");
        });
        this.sound.onPause.add(() => {
            alert("Paused");
        });

        // Play Button
        this.playButton = this.game.add.button(0, 0, "button", () => {
            if (this.sound.currentTime > 0)
                this.sound.resume();
            else
                this.sound.play();
        }, this);
        this.playButton.addChild(
            new Phaser.Text(this.game, 17, 18, "Play", { fill: '#ff0000' }));

        // Pause Button
        this.pauseButton = this.game.add.button(95, 0, "button", () => {
            this.sound.pause();
        }, this);
        this.pauseButton.addChild(
            new Phaser.Text(this.game, 12, 18, "Pause", { fill: '#ff0000' }));

        // Stop Button
        this.stopButton = this.game.add.button(190, 0, "button", () => {
            if (this.sound.isPlaying) {
                this.sound.stop();
                this.sound.currentTime = 0;
            }
        }, this);
        this.stopButton.addChild(
            new Phaser.Text(this.game, 17, 18, "Stop", { fill: '#ff0000' }));

        // Volume Up Button
        this.volUpButton = this.game.add.button(300, 0, "button", () => {
            this.sound.volume += 0.1;
        }, this);
        this.volUpButton.addChild(
            new Phaser.Text(this.game, 17, 18, "Vol +", { fill: '#ff0000' }));

        // Volume Down Button
        this.volDownButton = this.game.add.button(400, 0, "button", () => {
            this.sound.volume -= 0.1;
        }, this);
        this.volDownButton.addChild(
            new Phaser.Text(this.game, 17, 18, "Vol -", { fill: '#ff0000' }));

        // Mute Button
        this.muteButton = this.game.add.button(500, 0, "button", () => {
            // Global mute!  Use this.sound.mute to mute a single sound
            this.game.sound.mute = !this.game.sound.mute;
        }, this);
        this.muteButton.addChild(
            new Phaser.Text(this.game, 17, 18, "Mute", { fill: '#ff0000' }));
    }
}

window.onload = () => {
    var game = new SimpleGame();
};
```

 

Here is the resulting app:

 

 

It’s a bit of a monster, but most of the code is actually just wiring up the buttons.  There are a few important takeaway points here.  First, you can work with the local sound object, or with all sounds globally using game.sound.  Second, each action on the sound file ( play, resume, stop, etc. ) has a corresponding event handler you can implement; I only handled a couple in this example.  Finally, volume is a value from 0 to 1.  You can go above or below this range, but it won’t actually do anything.  All said, once you get past the file format issues, playing audio is relatively straightforward.
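That volume behaviour is worth a tiny illustration.  The class below is a simplified, hypothetical sketch of a clamped 0-to-1 volume property, mimicking what you observe when clicking Vol +/Vol − past the limits; it is not Phaser’s actual setter:

```typescript
// Hypothetical sketch of a clamped volume property, mimicking the
// observed 0..1 behaviour; NOT Phaser's actual implementation.
class ClampedVolume {
    private _volume = 1;

    get volume(): number {
        return this._volume;
    }

    set volume(value: number) {
        // Clamp into [0, 1]; out-of-range writes have no audible effect.
        this._volume = Math.min(1, Math.max(0, value));
    }
}

const v = new ClampedVolume();
v.volume += 0.1;        // already at max: stays at 1
console.log(v.volume);  // 1
v.volume = -0.5;        // clamped up to 0
console.log(v.volume);  // 0
```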

 

Finally, let’s touch on audio sprites again for a second.  As mentioned earlier, an audio sprite is actually a bunch of audio files smashed together into a single file.

 

Consider the following sound effects: a gun cocking ( http://www.freesound.org/people/woodmoose/sounds/177054/ ) and a gun shot ( http://www.freesound.org/people/18hiltc/sounds/228611/ ).  Load your first sound effect into Audacity, like so:

image

 

Take note of the end of the file, in this case 0.785 seconds.

 

Now select File -> Import -> Audio:

image

 

Import your additional sound effects; in this case I’m opening the gunshot file.  You should now have two tracks in Audacity, like so:

image

 

Now select the bottom track by double clicking.  It should highlight in gray, like so:

image

Copy the selected track ( CTRL + C ), then click at the end of the first track and paste ( CTRL + V, or Edit->Paste ).  You should now have a single track that looks like this:

image

 

Now save this file ( you might as well create OGG, MP3 and M4A versions while you are at it ).

 

Now let’s take a look at how we use it in code.

 

```typescript
/// <reference path="phaser.d.ts"/>

class SimpleGame {
    game: Phaser.Game;
    sound: Phaser.Sound;

    constructor() {
        this.game = new Phaser.Game(640, 480, Phaser.AUTO, 'content',
            { create: this.create, preload: this.preload, render: this.render });
    }

    preload() {
        this.game.load.audio("sfx", ["GunSounds.ogg", "GunSounds.wav"]);
    }

    render() {
        this.game.debug.soundInfo(this.sound, 0, 100);
    }

    create() {
        this.sound = this.game.add.audio('sfx');
        this.sound.addMarker("gunCock", 0, 0.785);
        this.sound.addMarker("gunShoot", 0.786, 1.49);
        this.sound.play("gunCock");
        this.sound.onMarkerComplete.add(() => {
            this.sound.play("gunShoot");
        });
    }
}

window.onload = () => {
    var game = new SimpleGame();
};
```

 

This plays the first sound, then immediately once it is complete, plays the other.  As you can see, you do so by setting markers within the sound file: 0.785 is the length of the first sound effect, and 1.49 is the length of the entire file.  You simply name each marker, then give its start and end locations within the file.  You can get these values using an audio editor such as Audacity.  This allows you to pack all of your similar sound effects into a single, quicker-to-load file, and it should help you work around the single-sound-stream limitation of iOS.
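If you keep a list of your clips and their lengths, you can compute the marker positions rather than reading each one off Audacity’s timeline by hand.  Below is a small hypothetical helper (note that the third argument of addMarker has varied between Phaser releases, sometimes a stop time and sometimes a duration, so check the documentation for your version before feeding it these values):

```typescript
// Hypothetical helper: given named clips and their lengths in seconds,
// compute each clip's start and end position within the merged file.
interface Marker {
    name: string;
    start: number;
    end: number;
}

function buildMarkers(clips: Array<[string, number]>): Marker[] {
    const markers: Marker[] = [];
    let cursor = 0;
    for (const [name, length] of clips) {
        markers.push({ name: name, start: cursor, end: cursor + length });
        cursor += length;  // next clip begins where this one ended
    }
    return markers;
}

// gunCock is 0.785s long; gunShoot fills the rest of the 1.49s file.
const markers = buildMarkers([
    ["gunCock", 0.785],
    ["gunShoot", 0.705],
]);
console.log(markers);
```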

 
