
29. April 2013

So we've already looked at general development, graphics programming and handling input with Haxe and NME; now we are going to take a look at audio support.  This is by far going to be the shortest post of the series as, frankly, it's the simplest.

 

Let's jump right in with playing audio.

 

 

package gfs;

import nme.Assets;
import nme.display.Sprite;
import nme.Lib;
import nme.media.Sound;

class Main extends Sprite
{
	public function new()
	{
		super();
	    var song = Assets.getSound("audio/background.mp3");
		var soundfx1 = Assets.getSound("audio/effect2.wav");
		
		song.play();
		soundfx1.play();
	}
}

 

 

Well, that was pretty straightforward.  There are only a few things to be aware of here.  First, notice we loaded the sounds using the Assets class with paths like "audio/background.mp3".  This path needs to be defined in the NMML file.  Here is what mine looks like:

<assets path="Assets/audio" rename="audio" type="audio" include="*" />

The actual directory on your hard disk should be located within the folder Assets.  The next thing to know about is the file formats.  At the end of the day, the supported formats are defined by the SDL_mixer library.  Notice how I loaded the sound effect from a WAV file, while the background music is an MP3?  There is a very important reason for this: you can only have one MP3 playing at a time.  Other, more "short term" sound effects should be stored in a lighter format.  Audio is going to be one of those tricky things, as different platforms can play different formats.  It's also important to realize that MP3 is a patent-encumbered format, so if you are looking to sell your game, be careful with MP3, or the patent holders could come looking for licensing fees!  The Ogg Vorbis format is a free alternative, but it isn't as widely supported.
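If you end up needing different formats on different targets, one option (a sketch of mine, not something from NME's samples) is to pick the file name with a Haxe conditional compilation block, assuming you have added both an MP3 and an OGG version of the track to your assets:

// Hypothetical sketch: choose the music file per target at compile time.
// Assumes both audio/background.mp3 and audio/background.ogg are listed
// in the NMML assets node.
var musicPath = #if flash "audio/background.mp3" #else "audio/background.ogg" #end;
var song = Assets.getSound(musicPath);
song.play();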

 

Now let's take a look at a slightly more advanced sample:

 

 

 

package gfs;

import haxe.Timer;
import nme.Assets;
import nme.display.Sprite;
import nme.events.Event;
import nme.Lib;
import nme.media.Sound;
import nme.media.SoundTransform;

class Main extends Sprite
{
	var playLeft = true;
	var song : Sound;
	
	public function new()
	{
		super();
	    song = Assets.getSound("audio/background.mp3");
		var soundfx1 = Assets.getSound("audio/effect2.wav");
		
		var soundChannel = song.play();
		
		// Call our sound effect every second... forever.
		new Timer(1000).run = function() {
			soundfx1.play();
		}
		
		soundChannel.addEventListener(Event.SOUND_COMPLETE, onSongEnd );
	}
	public function onSongEnd(event:Event) {
		Lib.trace("Fired");
		var channel = song.play();
		playLeft = !playLeft;

		if (playLeft)
			channel.soundTransform = new SoundTransform(1, -1);
		else
			channel.soundTransform = new SoundTransform(1, 1);

		channel.addEventListener(Event.SOUND_COMPLETE, onSongEnd );
	}
}

 

 

This example is a bit more complex to show off a couple of the features of the audio libraries.  You may notice that play() returns a SoundChannel object.  This object has a number of useful features: you can manipulate the playback of the sound through it by applying SoundTransforms, as we do here to pan the sound left or right, and you can also modify the volume, get the current playback position, etc.  In this particular example, we load our two sounds, start the background music playing, then create a timer that fires every second, playing our second sound effect over and over and over.
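For example, assuming NME mirrors the Flash SoundChannel API here (volume and pan through soundTransform, playback position in milliseconds), tweaking playback might look something like this sketch:

// Sketch (assumes the Flash-style SoundChannel API): play at half volume,
// panned fully to the left, and trace how far into the sound we are.
var channel = song.play();
channel.soundTransform = new SoundTransform(0.5, -1);
Lib.trace("Position (ms): " + channel.position);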

We also wire up an event handler that will be fired when the soundChannel gets to the end ( the SOUND_COMPLETE event ).  When that occurs, we restart the song and toggle it to play only on the left channel or only on the right channel.  We then wire up our onSongEnd handler again on the newly created SoundChannel.  This code worked on every tested platform, although Android had some weird issues… it didn't play properly at first, but once I lost and regained focus, it worked perfectly.
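As an aside, if all you want is for the music to repeat, you may not need the SOUND_COMPLETE dance at all; assuming NME follows the Flash Sound.play(startTime, loops) signature, something like this sketch should loop the track:

// Sketch (assumes play(startTime, loops) as in the Flash API):
// start at the beginning and loop a large number of times.
var loopingChannel = song.play(0, 9999);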

 

So, what about Video?

This is one area where NME is currently a bit lacking.  There are projects out there that provide native video playback on iOS, as well as this project for playing back WebM-encoded video files; otherwise I believe you are currently out of luck.  So if you need to play cut scenes, you are currently left rolling your own solution for each platform.  Fortunately, I do not need video support. :)


27. April 2013

One of the downsides to using the Tilesheet class in NME is that it can't receive events.  Earlier on, we created a Sprite object to hold our Bitmap when we wanted it to receive events, so how exactly do you handle this when working with the Tilesheet class?  In the Haxe/NME input tutorial, I got asked exactly this question:

 

I tired to wrap the animation of previous tutorial before into this one.

but since the animation is an tileSheet object it doesn't want to be wrapped into a Sprite
it said

nme.display.Tilesheet should be flash.display.DisplayObject
for:
sprite.addChild(sprites); // sprite is my tileSheet and sprites is a sprite

any idea to solve the problem or it's going to come in a next tutorial

 

Hmmm… good question…  the short answer is, I didn't know.  The slightly longer answer is that when you call drawTiles(), you can specify where to draw.  In the earlier example we simply drew to the stage.  So here, we create a Sprite wrapping an empty Bitmap, draw into the Sprite's graphics, and add the Sprite to the stage.

Let's look at some code:

package gfs;

import nme.Assets;
import nme.display.Bitmap;
import nme.display.Sprite;
import nme.display.Tilesheet;
import nme.events.Event;
import nme.geom.Rectangle;
import nme.Lib;
import nme.ui.Keyboard;
import nme.events.KeyboardEvent;


class Main extends Sprite
{
	private var currentFrame: Int = 5;
	private var sprites: Tilesheet;
	private var sprite: Sprite;
	
	public function new()
	{
		super();
		sprites = new Tilesheet(Assets.getBitmapData("img/mechwarrior.png"));

		// Register one 90x80 tile rectangle for each of the 11 frames in the sheet
		for (i in 0...11)
		{
			sprites.addTileRect(new Rectangle(0, i * 80, 90, 80));
		}
		
		// Create a Sprite to receive events; drawTiles() will draw into its graphics
		sprite = new Sprite();
		var bmp = new Bitmap();
		bmp.width = 90;
		bmp.height = 80;
		sprite.addChild(bmp);

		sprite.x = 0;
		sprite.y = 0;

		sprites.drawTiles(sprite.graphics, [0, 0, currentFrame, 4], false, Tilesheet.TILE_SCALE);
		
		Lib.current.stage.focus = sprite;
		sprite.addEventListener(KeyboardEvent.KEY_DOWN, function(e) {
			sprite.graphics.clear();
			if (e.keyCode == Keyboard.DOWN)
			{
				if (++currentFrame == 11) currentFrame = 0;
			}
			else if (e.keyCode == Keyboard.UP)
			{
				if (--currentFrame < 0) currentFrame = 10;
			}
			sprites.drawTiles(sprite.graphics, [0,0, currentFrame,4], false, Tilesheet.TILE_SCALE);
		});
		Lib.current.stage.addChild(sprite);
	}
}

 

You can run this application by clicking here. Press up and down to animate.

 

And there you go.  If you are working with a sprite animation populated by a Tilesheet that you want to receive events directly, this is one option.  A couple of caveats though: handling keyboard events this way works really poorly ( it only works in Flash ); generally you will want to handle keyboard events at the stage level, as sketched below.  This code was more a proof of concept than anything else.  Simply put, it's probably a really stupid way to do this.
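For reference, the stage-level version of the key handling (a sketch reusing the sprite, sprites and currentFrame fields from the example above) would look roughly like this, and doesn't depend on the sprite having focus:

// Sketch: listen on the stage instead of the sprite, then redraw the tile.
Lib.current.stage.addEventListener(KeyboardEvent.KEY_DOWN, function(e) {
	sprite.graphics.clear();
	if (e.keyCode == Keyboard.DOWN && ++currentFrame == 11) currentFrame = 0;
	else if (e.keyCode == Keyboard.UP && --currentFrame < 0) currentFrame = 10;
	sprites.drawTiles(sprite.graphics, [0, 0, currentFrame, 4], false, Tilesheet.TILE_SCALE);
});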


25. April 2013

 

As you may know from previous posts I am rather a big fan of Lua based game engines.  When I learned about a new one completely off my radar, I just had to check it out.  The game engine in question is Dreemchest.


 

As I mentioned earlier, Dreemchest is scripted using Lua.  Underneath, Dreemchest is powered by a C++ core.  In terms of other Lua game engines, Dreemchest is probably most similar to Corona in scope.  It comes with a WYSIWYG editor and, somewhat uniquely, enables you to use Flash to create your user interface.  Unlike Corona, you don’t need to authorize against and build on their servers; everything is done locally, which is something I know some people really hate about Corona and Gideros ( and more recently Loom ). Oh yeah, it’s also free (for now?).  Let’s jump in and take a look at Dreemchest.

 

First things first, you need to download Dreemchest, which you can do right here.  It is available for Windows and MacOS, although I have to admit the MacOS version seems a lot less stable right now, so if you have a choice I would consider choosing the Windows version.  There is no installer; simply download the archive, extract it, then run Composer.  Dreemchest seems to be a Qt app, so I’m a bit shocked a Linux build isn’t available.  Then again, Linux is a fairly small subset of the population, so maybe that’s a down-the-road feature.

 

Meet Dreemchest Composer.

 

Dreemchest Composer in action


This is the WYSIWYG environment in action with the Animation sample loaded.  Currently there are over 20 samples available showing you how to perform various actions in Dreemchest.  As you can see, it’s a pretty sparse environment, but most of the information you need is available.  Across the top is the toolbar you use to configure and run your application.  Below that is the WYSIWYG editing area, and below that is the output panel.  On your right-hand side is the Property window, which is populated dynamically by your script objects, allowing you to view and configure values visually instead of in code.  Below that are your assets, which you can import or create, then drag and drop into your scene.

 

The UI itself is incredibly customizable; every dialog can be detached, moved or closed, letting you lay things out exactly as you want.


 

Coding and documentation

 

So, where exactly do you do the coding?  If you have the Animation sample open, take a look at the assets panel and you will see a pair of script objects, App and Hero.


 

Double click one of these script files and it will open in the integrated text editor.


It’s a fairly barebones editor, but it does what you need, including syntax highlighting, auto-completion and automatic indentation.  It’s nice not having to switch apps to edit code.

 

The programming language itself is standard Lua 5.1, with a class inheritance system on top.  If you know Lua, you will be immediately comfortable with Dreemchest.  If you press play or F5 to run your application, it runs directly inside Dreemchest.


As of right now, debugging options are quite light.  You can alter the orientation from portrait to landscape, simulate a home button press and not much else.  Oddly enough, once the application is running, you see the traditional debugging options, such as step in, step out and continue.  That said, I can't figure out how to add an actual breakpoint.  My guess is this is a feature under active development.  I look forward to it too, as for now you are limited to printing to the output window while debugging.  Build and load times are virtually non-existent, which is nice.

 

From a coding perspective, there is a pretty good amount of documentation, especially for such a young project.  As mentioned earlier, there are currently 20+ samples included with the download.  There are a series of tutorials available here, as well as an API reference available here.  The API is quite straightforward; it is somewhat minimal, but still under active development.  Pretty much everything you need to create a 2D game is currently there, including graphics, tweening, audio and physics.  For physics, there is also an integrated shape editor, which is rather convenient.  New functionality is being added with each release.  This is critical, as you don’t get source code, so all functionality needs to be in the box.

 

Flashing

 

Perhaps the most innovative feature of Dreemchest is the ability to embed and communicate with Flash objects for creating your UI layer.  You can include an SWF file just like you do any other graphic file.  The SWF file can contain ActionScript 2 code, and the two languages can communicate back and forth.  Here is a simple example from the SDK showing how you can load a Flash animation and communicate between the languages.  uiButtons is the SWF file that has been added to the scene.

class "Main"( StageObjectContainer )

function Main:main()
    -- Register necessary functions for Flash UI
    UIManager.registerFunction( 'nativeSetPitch', function( value ) trace( 'Pitch set to: '..value ) end )
        
    -- Attach to events dispatched from Flash UI
    UIManager.attachListener( 'uiStop', self )
    UIManager.attachListener( 'uiToggleMusic', self, 'onMusicToggle' )
    
    local ui = uiButtons.new()
    self:attach( ui )
        
    -- Notify Flash UI by dispatching an event
    UIManager.dispatchEvent( 'onRefresh', { version = 104, message = 'hi there!' } )
end
    
function Main:uiStop( e )
    trace( 'Stop the music' )
end
    
function Main:onMusicToggle( e )
    if e.pause then
        trace( 'Pause music' )
    else
        trace( 'Continue music playback' )
    end
end

This allows you to use the rich UI functionality of Flash/ActionScript for your UI layer, while performing game logic and rendering in Lua.

 

Building your application

 

When it comes to creating and deploying your application, it’s a pretty simple process.  Simply select the File->Export menu and select the platform.  You need to have a Mac to build the iOS or OSX targets.  You need to install the Android or iOS SDK and point Dreemchest at the proper directory before you can export to either platform.  The results of the build ( an apk in the case of Android ) end up in the output subdirectory, although I had to do a bit of searching to figure this out.

 


 

Conclusion

 

Dreemchest is certainly a young engine, but it has a great deal of potential.  I did experience crashes and a few UI glitches, although the newest release seems a great deal more stable.  I’m actually quite surprised by just how much it improved in just a couple of weeks; this bodes well for the future.  This is certainly an engine worth keeping an eye on, especially if you like Lua and are targeting iOS or Android.  It may not be for everyone; if you need source code, for example, Moai is a better fit.  But if you are looking for an all-in-one accessible toolkit, Dreemchest is a good pick.  Of course, it’s free too, so you’ve got nothing to lose by checking it out.


24. April 2013

In the past we covered basic program structure as well as graphics programming using Haxe and NME.  In this section we are going to look at handling input.  A number of these samples, such as multi-touch, will only work on a device, although we will briefly look at how to create device-specific code that still compiles on other platforms.  Alright, let's jump right in.

 

First let's look at the basics of handling touch and mouse click events.

 

Mouse click and finger touch event handling

package gfs;

import nme.Assets;
import nme.display.Bitmap;
import nme.display.Sprite;
import nme.events.Event;
import nme.events.TouchEvent;
import nme.events.MouseEvent;
import nme.Lib;


class Main extends Sprite
{
	private var sprite:Sprite;
	
	public function new()
	{
		super();
		
		var img = new Bitmap(Assets.getBitmapData("img/mechwarriorFrameScaled.png"));
		
		//Bitmap unfortunately cannot receive events, so we wrap it in a sprite
		sprite = new Sprite();
		sprite.addChild(img);
		
		sprite.x = Lib.current.stage.stageWidth / 2 - sprite.width / 2;
		sprite.y = Lib.current.stage.stageHeight / 2 - sprite.height / 2;
		
		sprite.addEventListener(MouseEvent.CLICK, mechClicked);
		Lib.current.stage.addEventListener(TouchEvent.TOUCH_TAP, onTouch);
		Lib.current.stage.addEventListener(MouseEvent.CLICK, onClick);

		Lib.current.stage.addChild(sprite);
	}
	
	public function mechClicked(event: MouseEvent)
	{
		sprite.x = Lib.current.stage.stageWidth / 2 - sprite.width / 2;
		sprite.y = Lib.current.stage.stageHeight / 2 - sprite.height / 2;
		event.stopImmediatePropagation();
	}
	
	public function onTouch(event: TouchEvent)
	{
		sprite.x = event.stageX - sprite.width / 2;
		sprite.y = event.stageY - sprite.height / 2;
	}
	
	public function onClick(event: MouseEvent)
	{
		sprite.x = event.stageX - sprite.width / 2;
		sprite.y = event.stageY - sprite.height / 2;
	}
	
} 

When you run this code, the mech will move wherever you click or touch.  If you click on the mech sprite, it will move back to the centre of the screen.

 

Now back to the code. I was informed that I didn't need to create a static main function when working with NME, so in my prior examples… ignore that bit. :)

We now start off in our constructor, new().  First we create an image, a small version of our now familiar mech image.  There is a catch though: even though it has the addEventListener method, a Bitmap object can't actually receive events.  Therefore we create a Sprite object to hold our Bitmap.  Next up we centre the sprite on the screen.  Then we wire up the sprite to receive MouseEvent.CLICK events, which will cause it to call the function mechClicked.  We also wire up TouchEvent.TOUCH_TAP and MouseEvent.CLICK event handlers on the actual stage.

It's important to realize the order things are handled in when a click or touch occurs.  It will ultimately be the object that is touched that gets first crack at the event, then its parent, then its parent, etc…  this is called 'bubbling'.  It's a very important thing to get your head around.  Consider the current situation: we set up a click handler on our mech sprite that handles the click, re-centring the sprite back to the middle of the screen when it is clicked.  Then the stage handles the event and moves the sprite to where the user clicked, effectively overwriting the actions the sprite's event handler took.  In this case we want to stop handling the event once the mech is clicked, so its parent's handlers won't run.  This is accomplished by calling event.stopImmediatePropagation().  The only other thing to notice is the different event types passed in to the various event handlers.

There is however one current gotcha in the above code: on CPP targets it doesn't actually work.  The call to stopImmediatePropagation() doesn't actually do what it says.  I reported this on the forums and, most impressively, they have already fixed the bug!  Colour me impressed on that one.  So, depending on when you are reading this, mechClicked() may not appear to be working if run on one of the CPP targets.  It does however work just fine on Flash and HTML5.

If you aren't working from the NME sources, you can work around it with this simple fix:

class Main extends Sprite
{
	private var sprite:Sprite;
	var clickHandled = false;
	
	public function new()
	{
		super();
		
		var img = new Bitmap(Assets.getBitmapData("img/mechwarriorFrameScaled.png"));
		
		//Bitmap unfortunately cannot receive events, so we wrap it in a sprite
		sprite = new Sprite();
		sprite.addChild(img);
		sprite.x = Lib.current.stage.stageWidth / 2 - sprite.width / 2;
		sprite.y = Lib.current.stage.stageHeight / 2 - sprite.height / 2;
		
		sprite.addEventListener(MouseEvent.CLICK, mechClicked);
		
		Lib.current.stage.addEventListener(TouchEvent.TOUCH_TAP, onTouch);
		Lib.current.stage.addEventListener(MouseEvent.CLICK, onClick);
		
		
		Lib.current.stage.addChild(sprite);
	}
	
	public function mechClicked(event: MouseEvent)
	{
			sprite.x = Lib.current.stage.stageWidth / 2 - sprite.width / 2;
			sprite.y = Lib.current.stage.stageHeight / 2 - sprite.height / 2;
			clickHandled = true;
	}
	public function onTouch(event: TouchEvent)
	{
		sprite.x = event.stageX - sprite.width / 2;
		sprite.y = event.stageY - sprite.height / 2;
	}
	
	public function onClick(event: MouseEvent)
	{
		if (clickHandled) {
			clickHandled = false;
			return;
		}
		sprite.x = event.stageX - sprite.width / 2;
		sprite.y = event.stageY - sprite.height / 2;
	}
}

Basically you just set a flag indicating that the current click has already been handled.  Of course, this system falls on its face once you start looking at multi-touch, as we will see shortly.

 

Keyboard Handling

So that's the basics of clicking and touching; now let's take a look at keyboard control.  This will only work on devices with a physical keyboard.

package gfs;

import nme.Assets;
import nme.display.Bitmap;
import nme.display.Sprite;
import nme.events.Event;
import nme.events.KeyboardEvent;
import nme.Lib;
import nme.ui.Keyboard;


class Main extends Sprite
{
	private var sprite:Sprite;
	
	public function new()
	{
		super();
		
		var img = new Bitmap(Assets.getBitmapData("img/mechwarriorFrameScaled.png"));
		
		//Bitmap unfortunately cannot receive events, so we wrap it in a sprite
		sprite = new Sprite();
		sprite.addChild(img);
		
		sprite.x = Lib.current.stage.stageWidth / 2 - sprite.width / 2;
		sprite.y = Lib.current.stage.stageHeight / 2 - sprite.height / 2;

		Lib.current.stage.addEventListener(KeyboardEvent.KEY_UP, function(event) {
			if (event.keyCode == Keyboard.RIGHT)
			{
				if (event.altKey || event.ctrlKey || event.shiftKey)
					sprite.x += 10;
				else
					sprite.x += 1;
			}
			if (event.keyCode == Keyboard.LEFT)
			{
				if (event.altKey || event.ctrlKey || event.shiftKey)
					sprite.x -= 10;
				else
					sprite.x -= 1;
			}
		});

		Lib.current.stage.addChild(sprite);
	}
	
	public static function main()
	{
		Lib.current.addChild(new Main());
	}
} 

When you run this code, you can use the left and right arrows to move the sprite left or right.  Holding down the alt, shift or control key will cause the sprite to move by 10 pixels; otherwise it moves by 1 pixel in the given direction.

As you can see, handling keyboard events is almost exactly the same as handling mouse and touch events.  You simply wire up the stage to listen for a KeyboardEvent, in this case KEY_UP.  You then compare the keyCode of the event against the constants defined in the Keyboard class to see if your key has been pressed.  There are also a series of flags, altKey, ctrlKey and shiftKey, that tell you the status of each of those modifier keys.

One important thing to note: you do not use KeyboardEvent to handle text field input; instead, use the methods of TextField.
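For completeness, here is a quick sketch of what text entry looks like instead; this is my assumption of the usual Flash-style approach (nme.text.TextField plus the CHANGE event), not something covered in the sample above:

import nme.text.TextField;
import nme.text.TextFieldType;

// Sketch: an input TextField raises Event.CHANGE as the user types,
// so you read input.text rather than tracking KeyboardEvents yourself.
var input = new TextField();
input.type = TextFieldType.INPUT;
input.border = true;
input.width = 200;
input.addEventListener(Event.CHANGE, function(e) {
	Lib.trace("Text is now: " + input.text);
});
Lib.current.stage.addChild(input);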

 

Accelerometer/motion control handling 

Now let's take a look at how to handle the Accelerometer.  This code will compile on any platform, but will only work on mobile platforms for obvious reasons.

 

package gfs;

import nme.Assets;
import nme.display.Bitmap;
import nme.display.Sprite;
import nme.events.Event;
import nme.events.AccelerometerEvent;
import nme.Lib;
import nme.sensors.Accelerometer;
import nme.ui.Accelerometer;


class Main extends Sprite
{
	private var sprite:Sprite;
	
	public function new()
	{
		super();
		
		var img = new Bitmap(Assets.getBitmapData("img/mechwarriorFrameScaled.png"));
		
		//Bitmap unfortunately cannot receive events, so we wrap it in a sprite
		sprite = new Sprite();
		sprite.addChild(img);
		
		sprite.x = Lib.current.stage.stageWidth / 2 - sprite.width / 2;
		sprite.y = Lib.current.stage.stageHeight / 2 - sprite.height / 2;

		Lib.current.stage.addEventListener(Event.ENTER_FRAME, onFrameEnter);
		Lib.current.stage.addChild(sprite);
	}
	
	public function onFrameEnter(event:Event) {
		#if (android || ios)
		if (Accelerometer.isSupported) {
			var y = nme.ui.Accelerometer.get().y;
			Lib.trace(y);
			if (y < 1 && y > 0.6) sprite.y-=5; // straight up and down
			else if (y < 0.4 && y > 0.0) sprite.y+=5;  // turned 90 degrees from user
		}
		else
			Lib.trace("No Accelerometer support");
		#else
			Lib.trace("Not a mobile target");
		#end
	}
	
	public static function main()
	{
		Lib.current.addChild(new Main());
	}
}

When you run this code, our trusty mech sprite is drawn centred on the screen.  If the phone is straight up and down ( parallel to your head ) nothing will happen.  As you tilt the phone away from you, for the first 40 or so degrees the sprite will move up.  Then there is a dead zone, then for the next 40 or so degrees the sprite will move down, until you are holding the phone perpendicular to you, at which point the sprite will stop moving.

Now for a bit of a red herring: there are Accelerometer events… THEY DON'T WORK!  Don't go that route; the events are simply never fired.  Instead you need to poll the Accelerometer when you want data.  In this case we use the ENTER_FRAME event, which is fired at the beginning of each frame, causing the onFrameEnter function to be called.

In onFrameEnter, we have our first compiler conditional.  If you are used to pre-processor directives, this will be familiar to you.  Basically, directives starting with the # symbol tell the compiler which code to include.  In this case, the code within the #if (android || ios) block will only be compiled ( and thus run ) on those targets.  If it is an iOS or Android device, we simply read the G value of the y axis.  A value of 1 indicates a motionless phone held straight up in front of you.  A value of 0 is a motionless phone perpendicular to you.  You can get the motion values of all three axes and use them to determine orientation as well as movement.
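If you want to steer on more than one axis, the same polling approach applies.  Here is a rough sketch of my own ( the scaling values are guesses, and it reuses the sprite field from the sample ) that also reads the x axis for left/right tilt:

// Sketch: poll x and y each frame and nudge the sprite by the tilt amount.
#if (android || ios)
if (Accelerometer.isSupported) {
	var accel = nme.ui.Accelerometer.get();
	sprite.x += accel.x * 5;          // tilt left/right
	sprite.y -= (accel.y - 0.5) * 10; // tilt toward/away from you
}
#end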

 

Handling multitouch

Finally let's take a look at multitouch support.  Once again you obviously need a multi-touch capable device to run this sample.

 

 

package gfs;


import nme.Assets;
import nme.display.Bitmap;
import nme.display.Sprite;
import nme.display.Stage;
import nme.display.StageScaleMode;
import nme.events.Event;
import nme.events.TouchEvent;
import nme.Lib;
import nme.ui.Multitouch;
import nme.ui.MultitouchInputMode;


class Main extends Sprite
{
	private var sprites: Array<Bitmap>;
	
	public function new()
	{
		super();
		
		sprites = new Array<Bitmap>();
		
		sprites.push(new Bitmap(Assets.getBitmapData("img/mechwarriorFrame.png")));
		sprites.push(new Bitmap(Assets.getBitmapData("img/mechwarriorFrame.png")));
		sprites.push(new Bitmap(Assets.getBitmapData("img/mechwarriorFrame.png")));
		sprites.push(new Bitmap(Assets.getBitmapData("img/mechwarriorFrame.png")));
		
		for (i in 0 ... sprites.length) {
			sprites[i].x = Std.random(800);
			sprites[i].y = Std.random(1280);
			Lib.current.stage.addChild(sprites[i]);
		}

		Lib.current.stage.addEventListener(TouchEvent.TOUCH_BEGIN, function(e) {
			sprites[e.touchPointID].x = e.localX;
			sprites[e.touchPointID].y = e.localY;
		});
	}

}

Here is the code running on my Galaxy Note: each mech sprite is positioned where you touched, and up to four sprites/touches can be tracked in this example.  Each sprite represents the location I most recently touched with each finger.

 

We simply create an array of images all holding the same bitmap, one for each of our four fingers.  We then randomize each sprite's starting location ( for no particular reason ) and add it to the stage.  Touch is handled exactly as it was before; the only real difference is the event can be fired a number of times.  You determine which touch it is by checking the touchPointID value of the event.  This ID value can be thought of as the index value of your finger… so if you are using a single finger, it will have a value of 0.  If you are using 3 fingers, and it is the third touch point, the value will be 2.
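One thing I would add ( my own sketch, not part of the sample ): guard the array access against touchPointIDs beyond the sprites you created, and handle TOUCH_MOVE as well if you want the sprites to follow the fingers as they drag:

// Sketch: follow dragging fingers, ignoring touches beyond our four sprites.
Lib.current.stage.addEventListener(TouchEvent.TOUCH_MOVE, function(e) {
	if (e.touchPointID < sprites.length) {
		sprites[e.touchPointID].x = e.stageX;
		sprites[e.touchPointID].y = e.stageY;
	}
});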

 

One word of warning: there is mention in the docs of GESTURE modes, such as ZOOM, etc… these do not exist in NME.  If you want gesture support, you need to roll it yourself.
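To give you an idea of what rolling it yourself might look like, here is a very rough pinch-to-zoom sketch of my own; target stands in for whatever display object you want to scale, and none of this comes from NME itself:

// Rough sketch of a home-made pinch gesture: track the first two touch
// points and scale by the change in the distance between them.
// 'target' is a hypothetical display object you want to zoom.
var touchX = [0.0, 0.0];
var touchY = [0.0, 0.0];
var lastDistance = 0.0;

Lib.current.stage.addEventListener(TouchEvent.TOUCH_MOVE, function(e) {
	if (e.touchPointID > 1) return; // only track the first two fingers
	touchX[e.touchPointID] = e.stageX;
	touchY[e.touchPointID] = e.stageY;

	var dx = touchX[0] - touchX[1];
	var dy = touchY[0] - touchY[1];
	var distance = Math.sqrt(dx * dx + dy * dy);

	if (lastDistance > 0 && distance > 0)
		target.scaleX = target.scaleY = target.scaleX * (distance / lastDistance);
	lastDistance = distance;
});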

 

So that's input in Haxe…  I have to say, other than a bug ( which was fixed almost instantly! ) and a red herring in the documentation about gesture support, that was a painless experience and it has all the functionality I need.  Of course, I just used a portion of what's there… there are also events for mouse over, key up, mouse enter and so on.
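The other events wire up in exactly the same way.  As a tiny sketch ( reusing the sprite field from the first example ), highlighting the mech on mouse over might look like:

// Sketch: fade the sprite while the mouse is over it.
sprite.addEventListener(MouseEvent.MOUSE_OVER, function(e) { sprite.alpha = 0.5; });
sprite.addEventListener(MouseEvent.MOUSE_OUT, function(e) { sprite.alpha = 1.0; });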

 

I did run into a few annoyances with FlashDevelop though.  First, if you run a C++ target project while one is already running on your device, not only will it fail, it will bork the debugger, forcing you to restart FlashDevelop.  It sounds like a mild annoyance, but I did it probably 15 times!  Another annoyance is that FlashDevelop loses IntelliSense when inside anonymous methods, or whatever Haxe/Flash calls them.


23. April 2013

As some of you may already know, I recently published my first book. Now, teaming up with my publisher Packt Publishing, we are organizing a giveaway!  Three lucky winners stand a chance to win an e-copy of PlayStation®Mobile Development Cookbook. Keep reading to find out how you can win a copy.

 EDIT: Winners have been announced

Overview of PlayStation®Mobile Development Cookbook:


• Learn how you can create your own fantastic PlayStation®Mobile (PSM) applications

• Develop 2D games quickly and easily, complete with graphics, audio, and input

• Discover how to construct your own 3D world, import models, and even create texture and vertex shaders

You can read more about this book and download a free sample chapter:

 

How to Enter?

In a nutshell, you just need to leave a comment in this thread.  Of course, I'd love to hear what you think of the contents of the book when you comment! Winners will be contacted via email, so be sure to use a valid email address or I won't be able to contact you.  The contest will run until Monday, April 29th, at which point I will collect all the email addresses and select three at random as winners.



PlayStation Mobile SDK 0.99 has been released



13. July 2012

Sony have just announced the release of a new version of the PlayStation Mobile SDK, with the following new features:

  • In-App Purchase library
  • Digital Signature & Encryption library for content protection
  • GamePad support for UI Toolkit library
  • ModelViewer for previewing .mdx files on PC
  • 50+ bug fixes

     

    There is a guide on moving to the new SDK available here.

     

     

    There is one major ramification of this upgrade: you need to recompile all of your projects, and that includes any of the projects that I have made available in various tutorials.  There is a batch file named project_conv_098to099.bat in the tools folder of your install directory.  Drag your projects onto it to update them.  You will also need to update the assistant on your Vita if required.

     

    Additionally, the namespace Sce.Pss has been changed to Sce.PlayStation, as part of the rebrand.  To me this is simply annoying, as it just broke every single tutorial, all of the existing documentation, etc… for very little gain.

     

     

    App.cfg has also been rendered redundant and has been replaced by app.xml, which will be automatically generated for you by Studio.

     

     

    There are already some bug reports coming in for 0.99 with file reading, so be cautious when upgrading.

     

    Here are the full 0.99 release notes:

     

    ver 0.99

    Overall

    Notice for migration of the project created in Ver 0.98 into Ver 0.99

    • The application setup file "app.xml" has been introduced. Along with the introduction, "app.cfg" is withdrawn.

      • Along with the above change, the batch file “project_conv_098to099.bat” is provided to convert “.cs”, “.csproj” and “app.cfg”. Please execute any of the followings for project conversion.

      How to execute in the console screen

      1. Go to [Start Menu] - [All Programs] - [Accessories] - [Command Prompt].
      2. Input cd "%SCE_PSM_SDK%/tools/” to move from the current directory.
      3. Specify and execute [sample098_project_folder], the top folder of the project to be converted into “project_conv_098to099.bat”.

      > project_conv_098to099.bat [sample098_project_folder]

      How to execute in Explorer

      1. Drag and drop [sample098_project_folder] onto “project_conv_098to099.bat” in Explorer.

    New Additions and Modifications

    • The SDK name has been changed from "PlayStation Suite SDK" to "PlayStation Mobile SDK".

    • Namespace of API has been changed from "Sce.Pss" to "Sce.PlayStation".

    • Signature verification / encryption have been supported for the files included in the application packages.

    • The number of sub-directory levels (incl. files) that can be created under the directories of Application/, Documents/, Temp/ has been changed from 5 levels to 6 levels.

      • "/Application/1/2/3/4/5/6.dat" -> OK (Documents/, Temp/ as well)
      • "/Application/1/2/3/4/5/6/7.dat" -> NG (Documents/, Temp/ as well)

    Audio

    New Additions and Modifications

    • Position property has been added to the BgmPlayer class.

    Environment

    New Additions and Modifications

    • The following class has been withdrawn.

      • PersistentMemory

    Limitations

    • SystemEvents.onRestored event does not work correctly on PlayStation(R)VITA.

    Graphics

    New Additions and Modifications

    • Default screen size has been changed to 960x544 which is same as PlayStation(R)VITA.
    • The value of TextureWrapMode has been changed.

    Limitations

    • PlayStation(R)VITA has the following limitations.
      • GraphicsContext.SetVertexBuffer() applies only the first VertexBuffer.
      • GraphicsContext.SetPolygonOffset() is not supported. Specified parameters are ignored.
      • Texture2D.GenerateMipmap() may not operate correctly.
      • TextureCube is not supported. An exception occurs when created.

    Input

    New Additions and Modifications

    • Dead-Band has been set around the center of the analog stick in GamePad.

    Imaging

    New Additions and Modifications

    • The ReadBuffer method has been added to the Image class.

    Model

    New Additions and Modifications

    • MDX file format has been modified and its version has been changed to 1.00.

    • MDX files of old format cannot be used directly. If you want to use them, please use the source code project of Model API instead of its prebuilt assembly and replace MdxLoader.cs with MdxLoader095.cs which is included in the project.

    • Implemented the following features.
      • Loading block names, bounding spheres
      • Motion transition, motion repeat mode
      • UV offset/scale, UV animation
      • Texture-less material
    • Updated the toolkit used by the converter.
      • Crosswalk 2013.0
      • FBXSDK 2011.3

    Services

    New Additions and Modifications

    • Sce.PlayStation.Core.Services namespace has been newly provided.
    • API InAppPurchaseDialog has been newly provided for In-App Purchase.

    UI Composer

    New Additions and Modifications

    • Custom widget functionality to locate user’s arbitrary widget has been added.

    • The custom panel functionality provided until version 0.98 has been withdrawn.
      • For the UIC files where the custom panels are provided, internal change is made to replace the custom panel with the custom widget.
      • Addition of the custom panel to WidgetList will be invalid.
    • The files can be opened by drag & drop of project file (*.uic) for the running UIComposer.

    • The initial value of Anchor of the Widget has become Anchor.None.
      • However, some may not be Anchor.None if conversion of size is restricted such as CheckBox, DateTimerPicker, Slider, ProgressBar and PopupList.

    Limitations

    • UIComposer is not started unless it performs as an administrator.

    UI Toolkit

    New Additions and Modifications

    • Functionality to perform focus operation in gamepad has been added.
      • For further information, see [UI Toolkit Programming Guide] - [Basic Concept] - [Gamepad Operation].
    • Key event distribution structure has been changed.
      • For further information, see [UI Toolkit Programming Guide] - [Basic Concept] - [Key Event Distribution].
    • The functionality to specify pixel density has been added.
      • For further information, see [UI Toolkit Programming Guide] - [Basic Concept] - [Pixel Density].
    • The property of Font type of each widget has been changed to UIFont type.

    • Event types of Hiding and Hidden in Dialog have been changed from EventHandler to EventHandler<DialogEventArgs>.

    • If widgets parent and child with different scroll directions were located, they cannot scroll correctly. This problem has been fixed.

    • Too many instances of items were created with ListPanel.Move method. This problem has been fixed.

    • When trying to display a dialog just after closing another dialog, display was failed. This problem has been fixed.

    • API which was Obsolete in ver 0.98 has been deleted.

    PSM Studio

    New Additions and Modifications

    • The base has been changed to MonoDevelop 2.8.8.4.
    • If debugging is executed on PlayStation(R)VITA, no string is displayed in the application output window. This problem has been fixed.
    • Edit app.xml working with PSM Publishing Utility.
    • Add a functionality of signature verification / encryption. Content Protection (Plain, Signed, SignedAndEncrypted) can be selected from Property of files.

    Restrictions

    • In rare cases, app.exe.mdb is occupied by process to fail in build. In the case of occurrence, please restart PSM Studio.
    • It may fail in opening a project. Once close and then reopen it.

    PSM Publishing Utility

    New Additions and Modifications

    • New release
    • Metadata and master packages are created.

    PSM Development Assistant (for PlayStation(R)VITA)

    New Additions and Modifications

    • Along with the name change of SDK, the name has been changed to "PlayStation Mobile Development Assistant".

    • PSM Development Assistant (for PlayStation(R)VITA) 0.99 has been released as a patch package. The version can be updated to 0.99 by selecting and apply the update icon displayed on LiveArea in PSM Development Assistant (for PlayStation(R)VITA) 0.98. In order to use PSM Development Assistant (for PlayStation(R)VITA) 0.99, the system software of PlayStation(R)VITA system needs to be updated to 1.69 or later version.

    • Note that it cannot be run correctly with the combination of PSM Studio (former name: PSS Studio) released in SDK 0.98 and PSM Development Assistant (for PlayStation(R)VITA) 0.99. PSM Development Assistant (for PlayStation(R)VITA) must be equivalent to the SDK version.

      • Correctly operating combination:
        • PSM Studio for SDK 0.98 & PSM Development Assistant (for PlayStation(R)VITA) 0.98
        • PSM Studio for SDK 0.99 & PSM Development Assistant (for PlayStation(R)VITA) 0.99
      • Incorrectly operating combination (Becomes error at installation):
        • PSM Studio for SDK 0.98 & PSM Development Assistant (for PlayStation(R)VITA) 0.99
        • PSM Studio for SDK 0.99 & PSM Development Assistant (for PlayStation(R)VITA) 0.98
    • Graphic asset has been changed.

    • The busy dialog of "Installing..." has been displayed during installation from PSM Studio. Moreover, PS button has been locked during installation.

    • The links to Debug Setting Items and Developer Forum website have been provided in the option menu. Ticket information of InAppPurchaseDialog can be reset on PlayStation(R)VITA by selecting [Debug Settings]-[Reset Ticket Information].

    • Wording contents in the Intellectual Property Notices pages and scroll performance have been adjusted.

    Document

    New Additions and Modifications

    • Added the following documents.
      • Basic Usage of 2-Dimensional Physics Simulation Physics2D Library
      • In-App Purchase
      • About Build Action
      • Folder Structure In PSM App
      • Safe Application Implementation
      • Setting Signature and Encryption
      • Uninstall
      • Mastering
      • Usage of Publishing Utility

