
5. May 2016

 

Introduction

Welcome to the first part of a multipart tutorial series on creating games using the Haxe programming language with the Heaps graphics engine.  Heaps is under development by Nicolas Cannasse, the guy that created the Haxe language in the first place.  In his own words:

Heaps is a cross platform graphics engine designed for high performance games. It's designed to leverage modern GPUs that are commonly available on both desktop and mobile devices. The framework currently supports HTML5 WebGL, Flash Stage3D, native Mobile (iOS and Android) and Desktop with OpenGL. 

Heaps is a very cool library, but not an incredibly well documented one.  So that’s where this tutorial series comes in.  We are going to look at creating 2D then 3D graphics using the Heaps engine.  For this series I will be doing both text and video versions of each tutorial.  The video version of this tutorial is available here.

 

Setting Up Your Haxe Development Environment

This tutorial is going to assume you are using Windows, if you are on another operating system, the steps are going to vary slightly.

First head to Haxe.org/download and download the appropriate Haxe installer for the most recent version.

image 

In my case I am going with the Windows Installer.  Run the executable and say yes to any security messages you receive.  You want to install everything like so:

image

 

Install wherever you like:

image

 

Verify your install worked correctly.  Fire up a command prompt and type haxelib version:

image

 

That was easy, eh?  Next you will probably want an IDE or editor.  Personally I am using HaxeDevelop, a special port of FlashDevelop.  This is a Windows only IDE though.  Another option is Visual Studio Code with the Haxe language extensions.

 

Finally we need to install the Heaps library.  It’s not registered with Haxelib yet, so we currently have to install it from Github.  Run the command:

 

haxelib git heaps https://github.com/ncannasse/heaps.git

image

 

And done.

 

Creating a Hello World application

Now let’s create our first application to make sure everything is up and running correctly.  A simple hello world app.

Assuming you are using HaxeDevelop, go ahead and create a new project via Project->New Project

image

 

I created a JavaScript project like:

image

 

Inside our project folder, we need to create a folder for our resources.  I created a directory called res.  Simply right click your project in the Project panel and select Add->New Folder...

image

 

Next we need a TTF file; I personally used this font.  Download that zip and copy the ttf file into the newly created res directory.  You can open an Explorer window to that directory by right clicking it and selecting Explore.  I personally renamed it to not be all caps, though it should work either way.  If you are using HaxeDevelop, your project should look something like this:

image

 

We have two final bits of configuration.  First we need to tell HaxeDevelop that we use the Heaps library, and that our resource folder is named res.  Right click your project and select Properties

image

 

Next select the Compiler Options tab.  First add an entry to Compiler Options with the value -D resourcePath="res".  Then add an entry of heaps to Libraries.  That's it, click Apply then OK.
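If you are building from the command line rather than through HaxeDevelop, the same two settings can live in an .hxml build file.  A hypothetical build.hxml mirroring the project above (the class name, output path and source folder are assumptions based on the JSTest project shown):

```hxml
# Hypothetical build.hxml for the JSTest project above
-cp src
-lib heaps
-D resourcePath="res"
-main Main
-js Bin/JSTest.js
```

Running `haxe build.hxml` would then produce the same JavaScript output that HaxeDevelop builds for you.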

image

 

Finally some code!  First we need a WebGL canvas for our application to run in.  Simply open up index.html located in the Bin folder and add a canvas.  Your code should look something like:

<!DOCTYPE html>
<html lang="en">
<head>
	<meta charset="utf-8"/>
	<title>JSTest</title>
	<meta name="description" content="" />
</head>
<body>
	<canvas id="webgl" style="width:100%;height:100%"></canvas>
	<script src="JSTest.js"></script>
</body>
</html>

 

Now we need to edit our main Haxe code.  By default it will be called Main.hx and it’s the entry point (and entirety) of our program.  Enter the following code:

import h2d.Text;
import hxd.Res;
import hxd.res.Font;
import js.Lib;

class Main extends hxd.App {
        var text : Text;

		// Called on creation
        override function init() {
			// Initialize all loaders for embedded resources
			Res.initEmbed();
			// Create an instance of wireframe.ttf located in our res folder, then create a font2d of 128pt size
			var font = Res.wireframe.build(128);
			// Create a new text object using the newly created font, parented to the 2d scene
            text = new Text(font, s2d);
			// Assign the text
			text.text = "Hello World";
			// Make it red, using a hex code ( RR GG BB, each two hex characters represent an RGB value from 0 - 255 )
			text.textColor = 0xFF0000;
        }

		// Called each frame
        override function update(dt:Float) {
			// simply scale our text object up until it's 3x normal size, repeat forever
			var scaleAmount = 0.01;
			if (text.scaleX < 3.0) text.setScale(text.scaleX + scaleAmount);
			else text.setScale(1);
			
        }

        static function main() {
            new Main();
        }
    }
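The textColor line above packs the red, green and blue channels into a single 0xRRGGBB integer.  The bit arithmetic is the same in Haxe as in most C-family languages; here is the idea sketched in Java (a hypothetical helper for illustration, not part of Heaps):

```java
public class PackColor {
    // Pack 8-bit red, green and blue channels into a single 0xRRGGBB int
    static int rgb(int r, int g, int b) {
        return (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // 255 red, 0 green, 0 blue -> the 0xFF0000 used for our text
        System.out.println(Integer.toHexString(rgb(255, 0, 0)));
    }
}
```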

Now go ahead and run your code by either hitting F5 or clicking this button image

You should see:

GIF

 

Congratulations, your first Haxe program using the Heaps library.  Next up, we will jump into 2D graphics with Heaps.

 

Video


25. April 2016

 

One of the major advantages to working in 3D is that once you have your character modeled and rigged, creating new animations is simply a matter of defining a series of poses on a timeline.  Animations are generally defined by moving a series of bones controlling your mesh, which in turn are powered by a system called inverse kinematics.  IK is basically a fancy way of saying “move an end bone and the computer will calculate how all the other bones in the chain will respond”, enabling you to animate by positioning the foot, for example, and the ankle, knee and hip will rotate appropriately.  It’s a pretty powerful way to perform animation and every single major 3D application implements IK (and FK – forward kinematics).
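The "computer calculates the rest of the chain" part of IK can be made concrete.  For the simplest case, a two-bone chain like a thigh and shin, the knee angle falls straight out of the law of cosines.  A minimal sketch (the names and the simplifications are mine; real IK solvers also handle joint constraints, multiple chains and pole targets):

```java
public class TwoBoneIK {
    // Knee bend (radians) for a two-bone chain of lengths l1 and l2
    // reaching a target at distance d, via the law of cosines.
    // 0 means the chain is fully extended.
    static double kneeBend(double l1, double l2, double d) {
        // Clamp so unreachable targets simply straighten the chain
        double reach = Math.min(d, l1 + l2);
        double cosInterior = (l1 * l1 + l2 * l2 - reach * reach) / (2 * l1 * l2);
        cosInterior = Math.max(-1.0, Math.min(1.0, cosInterior));
        // Interior angle of 180 degrees = straight leg = zero bend
        return Math.PI - Math.acos(cosInterior);
    }

    public static void main(String[] args) {
        System.out.println(kneeBend(1, 1, 2));            // straight leg
        System.out.println(kneeBend(1, 1, Math.sqrt(2))); // right-angle bend
    }
}
```

Position the foot (the target), and the solver gives back how the knee must bend; the animator never rotates the knee directly.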

 

In the land of 2D art, the process is often quite different.  Generally the approach here is to generate a sprite sheet, a sequence of slightly altered versions of the same character which, played in sequence, results in an animation.  If you’ve ever done a flipbook animation in the top corner of one of your textbooks, you already have the process of traditional 2D animation down.  There are other techniques such as onion skinning and rotoscoping to aid in the animation process, but it still remains time intensive.  If only there were some way to take the 3D world’s bone based animations and apply them to generating 2D art?  Well, there is... Spine.

 

Today we are going to look inside Spine: the art generation process, how to make sprite graphics that are animation ready, how to define an animation, then perhaps most importantly, how to play that animation back in our game engine of choice.  Since Spine itself is built on top of the LibGDX library (by one of that framework’s founders, to boot), a LibGDX example makes the most sense.  If you are bored, the story of how Spine came to be is an interesting read.

 

Full disclosure, I requested a review license in order to get hands on time with Spine.  Additionally some of the assets I am using in this demonstration are part of asset packs available for purchase and aren’t my creation.  Spine is commercial software, ranging in price from $70 for the Essentials version, to $300 for Professional and $2200 for Enterprise (which is tied to your company’s revenue).  There is a free trial available that is capable of doing everything we are about to do below except export and run in code.  Without further ado, let’s jump in.  As is often the case on GameFromScratch, if you prefer a video version one is available here as well as embedded below.

 

Meet Spine

Here is the main Spine interface:

image

 

It’s actually an exercise in simplicity, which I appreciate.  It also supports UI scaling, so it works well on high DPI displays, something far too many applications suck at, so I appreciate that too.  The left hand viewport is where the magic happens; this is where you compose your characters and animations, while on the right hand side you’ve got your project hierarchy, a scene graph of sorts.  The primary UI is across the bottom of the screen.  You can easily pan and zoom around the display using a combination of RMB and Ctrl + RMB.  There is some additional complexity hidden away behind this menu:

image

 

But most of the time, what you see is actually all that you need.  It’s a very clean and simple UI.  Notice in the top left corner it says SETUP.  This is because you are currently in Setup mode.  Once our sprite has been assembled and our bones have been arranged ( more on this in a moment ), we can then switch into animation mode by clicking SETUP.

image

In animation mode, it’s all about posing our character.  Notice SETUP changes to ANIMATE and our interface changes slightly.  Now we have a timeline across the bottom of the screen.  We will get back to that in a moment.

 

Creating Spine Ready Sprites

Creating a sprite that is ready to be animated in Spine is pretty close to traditional sprite based animation with two major exceptions.  First, you cut your image up into several different pieces.  You can draw your sprite as a single image if you wish, but once you are done you need to cut it into several different animatable pieces.  Consider the sprite from the above screenshots:

image

This looks like a single drawn sprite, but it’s actually made up of several pieces arranged together.  If you look in the images section of the hierarchy, you can see it’s composed of several different images:

image

 

Again, you can draw your sprite how you normally would, but each animatable piece will need to be cut up to proceed in Spine.  This leads to our second requirement...  you also need to draw parts of the images that are normally obscured.  Again using this example, even if the upper arm isn’t fully shown due to being obscured by the body, you still need to draw the entire arm, as its visibility can change as the sprite moves, for example:

imageimage

 

So when drawing the pieces of your sprite, you have to think about the depth as well.  Here for example are all the pieces that go together to make this character:

image

 

Rigging Your Character

Next up comes perhaps the most time intensive portion of working with Spine, rigging your character.  You can think of this as arranging all the various images together to create your character, while defining the underlying armature (a fancy word for skeleton).  We will do a very simple skeleton, just to demonstrate the process.  You will notice in the tree view that there is a root node under our skeleton:

image

 

This is the very base of the skeleton and all bones are ultimately parented to it.  From here we need to create a root bone; it’s very common to start from the hips, which is what we will do.  Using the create tool, we will quickly create a simple leg skeleton:

image

Click once to set the start of the skeleton, then move the mouse and click again to set the first bone.  Now move down slightly and set another bone, like so:

image

In the hierarchy I renamed the bones to values that make sense.

image

Now that we have bones, let’s attach some images to each.  From the images section you can simply drag the appropriate image onto the bone, like so:

image

You will be prompted if you want to go ahead with it:

image

 

The image is now parented to that bone.  By selecting the image you can now transform, rotate and resize it so it best matches the underlying bone:

image

You can also modify the bone length by hovering over the tip, like so:

GIF

 

Now repeat for the lower bone, like so:

image

 

You end up with a hierarchy like:

image

 

Extremely simple, but the character is rigged, well, the leg is anyways.

 

Creating an Animation

 

Now that we have a very simple animatable character, let’s switch over to ANIMATE mode.  In the tree view, you should see a section called Animations.  There may be a default one there; otherwise create one using the New Animation button that appears when the Animations node is selected:

image

image

 

Keyframed animation is pretty simple in concept.  You will notice at the bottom of the screen there is now a Dopesheet view:

image

 

Your animation is composed of a set of “key” frames.  That is, you pose your character and take a snapshot of the location/rotation/scale of a given bone, then advance the timeline to a different value and repeat the process.  The computer then interpolates between keyframes to create a smooth animation.  You can turn “autokey” on, so that any changes you make in the editing window automatically set a key.  Otherwise you can manually create the key by clicking the green key to the right of each transform:
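That "interpolates between keyframes" step is, in the simplest case, just a linear blend between the two surrounding keys.  A toy sketch of the idea (Spine's runtimes also support curved easing between keys; this is only the linear case, and the helper name is mine):

```java
public class Keyframes {
    // Linearly interpolate a keyed value (e.g. a bone rotation)
    // between two keyframes: t0/t1 are keyframe times, v0/v1 the keyed values.
    static float lerpKey(float t0, float v0, float t1, float v1, float time) {
        float alpha = (time - t0) / (t1 - t0); // 0 at the first key, 1 at the second
        return v0 + (v1 - v0) * alpha;
    }

    public static void main(String[] args) {
        // Keys: rotation 0 at frame 0, rotation 90 at frame 10.
        // Halfway through, the bone sits at 45 degrees.
        System.out.println(lerpKey(0, 0, 10, 90, 5));
    }
}
```

This is why you only key the poses that matter: every in-between frame is computed for you.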

image

 

Set a key for the default rotate, translate and scale values, or use Autokey.  Next advance the timeline to say 5, like so:

image

 

Next using rotations, manipulate each bone, like so:

gif2

 

Advance the timeline slightly more, then repeat the process all over again.  You can control the playback of your animation using these simple VCR style controls:

image

 

Here is a very simple and crude kicking animation:

gif3

 

Another cool thing you can do is add Events as part of your timeline, like so:

image

image

This enables you to create events that can be fired in code, allowing you to incorporate programmatic aspects into your animations, such as playing a footstep audio effect.  We will see this process shortly.

 

Exporting the Animation

Now that we’ve got an animation to use in our game, it’s time to export it.  Here there are a couple of choices. 

image

 

You can export your results as a video, a sequence of images or as data.  If you choose to export as an image you actually have some rather advanced controls, including generating a texture atlas (directly usable in LibGDX) or sprite sheet:

image

 

With results like:

skeleton-kick

 

This approach can be utilized in just about every single kind of game engine available today.  However, where Spine shines is when you choose to export as data instead.  This is where runtimes come in.  These are essentially libraries of code for the various game engines that enable you to use the Spine format natively.  Full source is available on GitHub and runtimes exist for most 2D engines available including Unity, LibGDX, Love, MonoGame, Torque2D, Cocos2d-x and many more.  In this example I will be using LibGDX.

 

In this case I’m going to export to JSON and generate a texture atlas using the following settings:

image

 

Now let’s break out some code.

 

Using Spine In Game

As mentioned earlier, Spine has several runtimes available on GitHub.  In the case of the LibGDX project, you simply have to copy the code into your appropriate source code folder.  Assuming you created a project using the setup utility, this means copying the contents of esotericsoftware to your core\src\com directory.  Then I wrote the following code, adapted from one of their LibGDX examples.

Make sure that you’ve exported your assets and created the atlas in your working directory, most likely \core\assets.  Then use the following code:

package com.gamefromscratch;

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureAtlas;
import com.esotericsoftware.spine.*;

public class Spine2 extends ApplicationAdapter {
    private OrthographicCamera camera;
    private SpriteBatch batch;
    private SkeletonRenderer renderer;
    private TextureAtlas atlas;
    private Skeleton skeleton;
    private AnimationState state;

	public void create () {
		camera = new OrthographicCamera();
        camera.setToOrtho(false);
		batch = new SpriteBatch();
		renderer = new SkeletonRenderer();
		renderer.setPremultipliedAlpha(true); // PMA results in correct blending without outlines.

		atlas = new TextureAtlas(Gdx.files.internal("skeleton.atlas"));
		SkeletonJson json = new SkeletonJson(atlas);
		SkeletonData skeletonData = json.readSkeletonData(Gdx.files.internal("skeleton.json"));
		skeleton = new Skeleton(skeletonData);
		skeleton.setPosition(0, 0);

		AnimationStateData stateData = new AnimationStateData(skeletonData);
		state = new AnimationState(stateData);

        // Set up an animation listener so we can respond to custom events or completion
        final AnimationState.TrackEntry track = state.setAnimation(0, "kick", false);
        track.setListener(new AnimationState.AnimationStateListener() {
            @Override
            public void event(int trackIndex, Event event) {
                // Check for the "half" event we defined in the editor
                if(event.getString().equals("half"))
                    System.out.println("Half way baby");
            }

            @Override
            public void complete(int trackIndex, int loopCount) {
                // or the complete event (not END!) when done, fire the idle animation instead
                state.setAnimation(0,"idle",false);
            }

            @Override
            public void start(int trackIndex) {
            }

            @Override
            public void end(int trackIndex) {
            }
        });
	}

	public void render () {
		state.update(Gdx.graphics.getDeltaTime()); // Update the animation time.
		state.apply(skeleton);
		skeleton.updateWorldTransform();

        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
		camera.update();
		batch.getProjectionMatrix().set(camera.combined);
		batch.begin();
		renderer.draw(batch, skeleton);
		batch.end();
	}

	public void dispose () {
		atlas.dispose();
	}
}

 

When you run this code...

gif4

 

In the above code example you can see how you can handle an event you defined in Spine.  Otherwise it’s pretty simple to load and play animations on a character developed in Spine.  There is a comprehensive API; I’ve only touched on a very small part of it here due to space (this is already pretty long...).  There are also several features I never got to mention such as free form deformation ( useful for shapes such as capes ), swappable skins, placeable props, etc..  If you are doing 2D animation, Spine is certainly a product you should check out.  Spine is by no means the only option when it comes to 2D animation in games; Spriter and Creature are two other popular alternatives.  It is however a very good option.

 

The Video


13. April 2016

 

Welcome to another episode in the ongoing Closer Look At game engine series.  This series combines an overview, review and getting started guide into a single document and hopefully lets you evaluate in minutes instead of days whether a game engine is the right fit for you.  Today we are going to be looking at the Stingray Engine from Autodesk.

 

As always, there is an HD video version of this review available here.

 

About The Stingray Engine

 

The Stingray Engine started life as BitSquid, by a company called FatShark.  That company was acquired by Autodesk in 2014 and the engine was finally released as Stingray in 2015.  Stingray is an attempt to compete with the likes of Unreal Engine and Unity and has already been used to ship a number of commercial games including Magicka Wars, Warhammer: End Times – Vermintide and HellDivers.

As Autodesk makes a number of important content creation applications such as Maya and 3ds Max, Stingray is well supported.  Additionally Autodesk has directly integrated its Scaleform technology into the engine, a topic we will discuss shortly.

Stingray is priced somewhat differently than its competitors.  It is licensed on a monthly basis, with the current pricing shown below:

image

 

There is no additional royalty.  The vast majority of people will not, however, license the Stingray engine in this manner.  Instead it will more often be acquired together with Maya LT, an indie game developer focused version of the Maya 3D application, purchasable only via subscription.  A Maya LT subscription however includes a license for Stingray.  Current Maya LT pricing is shown below:

image

 

Stingray tooling is available only on Windows 7 and higher, while it is capable of targeting Windows, iOS, Android as well as PS4 and Xbox One with the appropriate developer license.  Stingray also offers HTC Vive and Oculus Rift VR support.  Notably there is no Mac OS, HTML or Linux support among the target platforms.

 

Both Maya LT and Stingray have 30 day trials available.  In fact, it’s a 30 day trial I am using for this review and as a result I do not believe I have any access to the underlying C++ source code.  Alright, let’s jump in.

 

The Stingray Editor

This is Stingray:

image

 

First off, the engine is, in a single word, clean.  Compared to Unity or Unreal, the interface is incredibly streamlined and, especially compared to Unity, it is extremely customizable.  Any window can be torn off, resized, docked, run floating, spread across multiple monitors, etc.  Performance is fine; the UI responds quickly enough.  I did however find on occasion that certain windows stopped refreshing and required me to resize or minimize the UI to get them to draw properly again.  The main window works just fine on high DPI displays, although oddly enough some of the standalone tools do not.  Hopefully these will be fixed in time.

 

I am running on Windows 10, and receive the following dialog when starting up Stingray:

image

I have never received this message from any other application since upgrading to Windows 10.

 

Oh, and one very unpleasant note about the Stingray installation process... it installs this application:

image

I view Akamai in the same realm as malware and am certainly not happy about it being on my machine.  As soon as this review is done, it will be uninstalled.  I have no idea if the main application is hindered by its removal, so I have begrudgingly allowed it to remain.  I know many people will not take the same attitude.

 

Now let’s break down the starting UI.

 

Level Viewport:

image

This is where you compose your 3D scene by placing entities (Units).  The toolbar across the left offers the standard object manipulators for scaling, translating and rotating, and uses the standard Maya QWERTY hot key scheme.  Additionally there is a placement mode, plus support for snapping and rotation intervals.  The view itself can be manipulated using the ALT key plus various mouse combinations, such as ALT + MMB to pan, ALT + LMB to orbit and ALT + Scroll to zoom.

You can toggle viewport overlays, settings and more using the various labels in the top left of the view:

image

 

On the top right hand corner of the view there is a toggle enabling you to switch between a single view and a quad view, like so:

image

 

Notice the tabs across the top.  They can be used to “float” a window or to organize multiple windows into a single tab via drag and drop:

image

There are also several editor windows that can be launched via the + icon:

image

 

The asset browser enables you to import and organize the assets used by your game:

image

 

Where it makes sense ( materials, FBX models, etc ), a live preview thumbnail is generated.  Additionally selecting an asset will show it in the asset previewer:

image

 

This viewer can be manipulated in the same way as the main 3D scene, allowing you to zoom, orbit, etc.

The Log Console enables you to see warnings, errors, etc., as well as serving as the standard Lua console.

image

The results can be filtered by type, searched, cleared, etc.  This is an invaluable tool and will spend a lot of time up on screen.  Somewhat unfortunately there is no link between code errors and the code editor, so clicking a syntax error message will not bring you to the appropriate line of code.  Hopefully this is fixed in the future.

 

The Explorer view is your typical scene graph, showing a breakdown of entities and units in your current level:

image

 

Depending on what you have selected in the Explorer, the properties window changes:

image

This window enables you to configure the appropriate properties and in the case of Entities, add or remove components.

So, what’s the difference between Entities and Units... well, this is a bit confusing, more so than it should be in my opinion.

A Unit is a scene graph node, often called an actor or node in other game engines.  It’s a thing that can be placed in the world and basically represents the stuff that exists in your game world.  The player, enemies, obstacles, etc.. 

An Entity on the other hand is a container for components, and components are reusable chunks of “stuff”.  Wasn’t that a helpful description?  A component is simply a way of describing properties and methods in a reusable way.  You bundle together the data and methods required for lighting as a light component, for example.  In most modern game engines the entity/component relationship is pretty much the de facto way of doing things.  In Stingray, entities currently consist of the terrain and environment, while Units represent the rest of the “stuff” in the world.  This is an artificial and confusing divide and is part of a transition to an ECS system, as evidenced by this post:

Currently, our "basic object" is a unit. That .unit file will contain all the info about that unit including physics, animations, mesh data, etc ... The entity system, which currently consists of the terrain and the environment, will have a scene graph make up ( a basic node hierarchy ), that will hold all the components of that entity. Moving forward, we will be iterating on the entity system and how that works with a component library of sorts. That would mean you could have an entity, like a space ship, and have multiple components ( turrents, weapons, wings, engines ) inside of it that you could control via flow or scripting.
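The entity-as-container-of-components idea described in that post is a common pattern across engines, and it is easy to sketch in isolation.  This is a generic illustration of the pattern, not Stingray's actual API:

```java
import java.util.HashMap;
import java.util.Map;

public class EntityDemo {
    // A minimal entity: nothing but a bag of components keyed by type
    static class Entity {
        final Map<Class<?>, Object> components = new HashMap<>();

        <T> void add(T component) { components.put(component.getClass(), component); }
        <T> T get(Class<T> type) { return type.cast(components.get(type)); }
    }

    // A reusable chunk of "stuff": light data bundled as a component
    static class Light {
        float intensity = 1.0f;
    }

    public static void main(String[] args) {
        // Like the space ship example: one entity, components plugged into it
        Entity spaceship = new Entity();
        spaceship.add(new Light());
        System.out.println(spaceship.get(Light.class).intensity);
    }
}
```

The entity itself carries no behavior; everything it is or does comes from the components attached to it, which is what makes them reusable across very different units.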

Anyways... back to our UI tour...

We have the Create dialog, for creating various units in your scene:

image

 

Instances are created by simply clicking in the Create dialog then placing in the 3D view.  Once created, the properties can be configured in the Properties view.  Instantiable objects include AI type helpers such as nav markers, cameras, lights and geometric primitives such as cubes, cylinders and spheres.  You can also create splines, prototypes and terrain maps this way.

Prototypes are simple cube objects for rapidly prototyping level geometry ( or as invisible actors for physics objects ), while terrain creates a new height map as well as a corresponding unit.

image

 

Then when you select the newly created Terrain unit, in the Property Editor, you now have access to additional terrain sculpting tools:

image

 

You can sculpt, flatten and smooth the terrain, as well as paint the RGB and alpha channels of the terrain texture. You’ve also got the options of importing the height map and texture from an existing terrain generation tool.

 

That completes the tour of the primary UI of the Stingray Engine, but leaves an absolute ton of tools uncovered, as you can see from the Window menu:

image

 

Many of these windows we have already covered, although several we have not.

 

Coding in Stingray

Next let’s move on to look at the workflow for coding in Stingray.  There are three ways to develop in Stingray.  The engine itself is written in C++ and can be extended in C++ if desired.  New C++ code can be exposed to both Flow and Lua code, so it slots seamlessly in with the rest of the engine.  I will not be discussing C++ development in Stingray any further as I frankly have no access currently.  This workflow is also intended more for extending the engine itself than as a primary means of development.  For example, there is virtually no C++ related documentation or reference material available.  This leaves the other two options.

 

The first is Lua development and the second is using Flow, a visual graph based programming language.  We will look at both below, starting with Flow.

 

Programming In Flow

If you’ve used Blueprints in Unreal Engine, you immediately have an idea of what Flow is.  You visually create graphs of nodes that control certain events.  Flows can be created at both the scene/level and unit level. (OK, confusing choice of terms there...)

You may have noticed in the 3D view of the default scene, there was another tab there:

image

This can be used to create Flow graphs for the selected level.  These respond to various inputs and events and are called back at various points in time by the game engine.  In this example we will instead look at creating a Flow for a unit.  Select any unit in your world, or create one using an imported FBX object or the Create dialog to make a primitive like a Cube or Sphere; it doesn’t really matter.  It just matters that you have a Unit we can attach a Flow to.

 

Now select the menu Window->Unit Editor.

This will bring up a standalone window.  In the new window’s menu select File->Open and locate your Unit:

image

The Unit editor should now look something like:

image

 

Notice at the bottom it says Unit Flow.  Click that to bring up the Unit Flow editing canvas.

Right clicking brings up a menu that presents the various nodes you can create.

image

 

It is by connecting these nodes together that we program the execution of our application.  We respond to various events, either game based callbacks or input events, then wire these nodes together as inputs or outputs to other nodes.  Here is a very simple example:

image

This flow is called in response to the keyboard button “m” being hit.  This in turn is wired to the In event of the Set Unit Local Position node.  The Position input of our selected Unit is then set through a combination of nodes.  In this case we query its current position, create a Vector with the value (0.1, 0, 0) and add them together with an Add Vector node.  The result of this is piped into the position parameter of the Set Unit Local Position node.  You will notice inputs are on the left side, outputs are on the right side and compatible nodes are color coded.  Essentially this is how Flows are programmed.  You can create a flow on a unit by unit basis, or a level by level basis.  Flows also have a node capable of calling or being called from a Lua script, enabling you to mix and match the two programming concepts if you wish.
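Stripped of the boxes and wires, that graph expresses nothing more than a vector add followed by a set.  Roughly, in plain code (a generic sketch of the logic only, not Stingray's API):

```java
public class FlowEquivalent {
    // What the Flow graph expresses: read a position, add an offset, write it back
    static float[] addVector(float[] a, float[] b) {
        return new float[] { a[0] + b[0], a[1] + b[1], a[2] + b[2] };
    }

    public static void main(String[] args) {
        // Pretend this is the unit's queried local position
        float[] unitPosition = { 1.0f, 2.0f, 3.0f };
        // On the "m" key event: nudge the unit 0.1 along x
        unitPosition = addVector(unitPosition, new float[] { 0.1f, 0f, 0f });
        System.out.println(unitPosition[0]);
    }
}
```

Seeing the one-to-one mapping helps when deciding whether a given behavior belongs in Flow or in a Lua script.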

 

Programming In Lua

Lua is the other major option for programming in Stingray.  When you choose a Basic Project from the starting template, you get a basic implementation of a game in Lua.  In this example however, we will look at a minimal implementation of a game loop using Lua.  There is an integrated Lua code editor in Stingray:

image

Even if you create an empty project, at its root you will see a configuration file named settings.ini.  This file can be edited using the Script Editor.  The following setting tells Stingray what script to use as your application’s “main”, if you will:

// The script that should be started when the application runs.
//boot_script = "core/appkit/lua/main"
boot_script = "script/lua/player"

You can see we commented out the default implementation and pointed boot_script at script/lua/player.  This is the path within our project to the .lua file controlling our game, with the file extension removed.  Note the commented out core/appkit entry.  We will get back to that in a moment.

Now let’s look at the implementation of player.lua:

package = nil
world = nil
viewport = nil
shading_environment = nil
camera = nil
box = nil

function init()
    world = stingray.Application.new_world()
    viewport = stingray.Application.create_viewport(world, "default")
    shading_environment = stingray.World.create_shading_environment(world,
        "core/stingray_renderer/environments/midday/midday")
    stingray.World.set_shading_environment(world, shading_environment,
        "core/stingray_renderer/environments/midday/midday")

    local camera_unit = stingray.World.spawn_unit(world, "core/units/camera")
    camera = stingray.Unit.camera(camera_unit, "camera")
    stingray.Camera.set_local_position(camera, camera_unit, stingray.Vector3(0,-10,2))

    world:load_level("content/levels/scene1")
    box = stingray.World.unit_by_name(world,"myCube")
end

function update(dt)
    stingray.World.update(world, dt)
    local pos = stingray.Unit.local_position(box,1)
    pos.x = pos.x - 0.01
    stingray.Unit.set_local_position(box,1,pos)
end

function render()
    stingray.Application.render_world(world, camera, viewport, shading_environment)
end

function shutdown()
    --stingray.Application.release_world(world)
end

 

The boot_script implements several callbacks that are called by the Stingray engine over the lifecycle of the game: at startup, at shutdown, and each frame for updating and rendering.  In init() we set up the world, shading environment, camera and so on.  Then you can see we load a level with a call to world:load_level().  We then get a reference to a Unit we created in the level with the name “myCube”.  Each frame update() is called, and during that call we simply shift the x value of our box slightly.

 

Remember the commented out core/appkit entry from earlier?  This is a framework for building a game, plus a series of Lua bindings, available in your Stingray directory.  You should take a moment to peruse its contents:

image

In addition to all the Lua bindings and simple application implementations, there are also sample shaders, materials and more.  As with Flow, Lua scripts can also be attached to Units and units can be controlled that way.  Flows can also be attached at the level, um... level.  Lua and Flow are also able to communicate with each other, as well as with underlying C++ code.  Full details of Lua and Flow programming are far beyond the scope of this review; check the Documentation section however for more information.

 

Art and Assets

 

I suppose it should come as no surprise that the tool is heavily coupled with Autodesk’s own tools, specifically Maya.   While there are tools such as the Skeleton Preview tool (shown below) and the Asset Preview we saw earlier, all modeling, animation and texturing are done in your DCC tool.  I have however imported FBX files generated in Blender without issue.

image

 

Assets are imported via the Import button in the Asset Browser, which presents several options:

image

As FBX is Autodesk’s interchange format of choice, the results are pretty rock solid.

 

The artist also has control over the shading and lighting of a scene.  As we saw earlier, lights can be instanced using the Create panel.  Each scene also has a shading environment that can be configured:

image

 

Finally, one major feature of the Stingray Engine, and increasingly a huge aspect of modern game engines, is the inclusion of an asset store.  This can be accessed via the Creative Market:

image

 

Currently holding only 426 items, it is a fraction of the size of the Unity asset store, but nice to see nonetheless. 

 

There is one elephant in the room... 2D.  How good is this engine at 2D games?  Well, that’s a tricky question to answer.  On the one hand, the engine has essentially ZERO native 2D support.  Any 2D done in the engine is faked in 3D, and it doesn’t really contain any tools to make this process simpler.  On the other hand, Stingray fully integrates, and even includes, Scaleform, and this is the recommended route for adding 2D to your game.  Scaleform can essentially be thought of as Flash optimized for games, which is exactly what it is: a special version of the Flash player designed for games, commonly used for implementing UI overlays in AAA titles.  In more recent years Autodesk has also been pushing it as a 2D game engine, with mixed success.  Scaleform is WAY beyond the scope of this review, but if you want more information on Scaleform go here.  The Stingray engine heavily integrates Scaleform, and it can be driven from either Lua or Flow code.

 

Documentation and Community

 

So, how is the documentation for Stingray?  In a word: great.  All of the documentation is online, comprising both a good user manual and comprehensive reference material.

image

They have also created a series of getting started videos available on YouTube.  The Lua API reference is available here, while the Flow reference is available here.  When creating your project there are a couple of templates you can start with:

image

Hopefully more templates come soon.

 

The community is forum based; while not extremely large, questions do get answered.

 

Summary

I did run into some annoyances, ranging from minor to major, in my time with Stingray.  I am not sure if high DPI displays are to blame, but I ran into many off-screen issues.  When running my application it appeared in the bottom right corner of the screen and couldn’t easily be resized; I had to maximize (Alt + Enter) and minimize the window to have it display properly.  I also had windows that simply didn’t fit on the screen at all.  In the case of the Property editor, it took up the entire screen, and the problem persisted between sessions, making it impossible to continue without resetting the default layout.  As mentioned earlier, some stand-alone editors, such as the external Flow editor, looked terrible on my display; this was also the only area where I experienced crashes.  From a usability perspective, being prompted to save every single time I ran my application, even though I already had, and even though the application runs with the last saved copy anyway, got tiresome very quickly.  There should be a setting to disable this behavior.  For the most part though, usability and stability were both very good, and none of these gripes even approaches being a deal breaker.

At the end of the day Stingray is a very competent, feature complete and relatively streamlined engine for making 3D games.  If you are looking for a 2D or 2.5D solution, you are probably better served looking elsewhere.  This is, however, a production proven, easy to learn and capable 3D game engine, able to compete toe to toe with the likes of Unity or Unreal.  In the end it is probably going to come down to a combination of supported platforms, preferred programming languages and, yes, ultimately cost that decides whether Stingray is right for you.  It is certainly worth checking out, especially if you already work with Autodesk’s suite of applications.

 

The Video

Programming

22. March 2016

 

This page is in support of the video tutorial on using Sprites/Images in Love, part of the GameDev For Complete Beginners tutorial series.  It contains the code and images used in the tutorial.  There is also a copy of the tutorial embedded below.

 

Images

The graphics are from this pack of images, made transparent using this process.

ALC-17 | AttackChoppers

 

Simply right click and save to your project directory.

 

Source Code

Drawing an Image

local imageFile

function love.load()
    imageFile = love.graphics.newImage("ALC-17.PNG")
end

function love.draw()
    love.graphics.draw(imageFile)
end
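As a quick aside, love.graphics.draw also accepts position, rotation and scale arguments, so you can place and transform the image in the same call.  The specific numbers below are just illustrative:

```lua
-- Variation on the example above: draw the image at (100, 50),
-- rotated 45 degrees (the rotation argument is in radians),
-- at half size on both axes.
function love.draw()
    love.graphics.draw(imageFile, 100, 50, math.rad(45), 0.5, 0.5)
end
```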

 

Using a Spritesheet

local imageFile
local frames = {}

local activeFrame
local currentFrame = 1
function love.load()
    imageFile = love.graphics.newImage("AttackChoppers.PNG")
    frames[1] = love.graphics.newQuad(0,0,128,64,imageFile:getDimensions())
    frames[2] = love.graphics.newQuad(128,0,128,64,imageFile:getDimensions())
    frames[3] = love.graphics.newQuad(0,64,128,64,imageFile:getDimensions())
    frames[4] = love.graphics.newQuad(128,64,128,64,imageFile:getDimensions())
    activeFrame = frames[currentFrame]
    -- debug: print half the height of the active frame's viewport
    print(select(4,activeFrame:getViewport())/2)
end

function love.draw()
    --love.graphics.draw(imageFile,activeFrame)

    --[[ draw image 2x size, centered
    love.graphics.draw(imageFile,activeFrame,
        love.graphics.getWidth()/2 - (select(3,activeFrame:getViewport())/2) * 2,
        love.graphics.getHeight()/2 - (select(4,activeFrame:getViewport())/2) * 2,
        0,
        2,
        2)
    ]]--

    -- draw image 4x size, centered
    love.graphics.draw(imageFile,activeFrame,
        love.graphics.getWidth()/2 - ({activeFrame:getViewport()})[3]/2 * 4,
        love.graphics.getHeight()/2 - ({activeFrame:getViewport()})[4]/2 * 4,
        0,
        4,
        4)
end

local elapsedTime = 0
function love.update(dt)
    elapsedTime = elapsedTime + dt

    -- advance to the next frame once per second
    if(elapsedTime > 1) then
        if(currentFrame < 4) then
            currentFrame = currentFrame + 1
        else
            currentFrame = 1
        end
        activeFrame = frames[currentFrame]
        elapsedTime = 0
    end
end
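The select() and table-constructor tricks used above work, but they re-query the quad every time.  A small helper makes the intent clearer; quadSize is a hypothetical name, not part of the LÖVE API:

```lua
-- Hypothetical helper: pull a quad's width and height out of
-- Quad:getViewport() once, instead of repeating select() tricks.
local function quadSize(quad)
    local _, _, w, h = quad:getViewport()
    return w, h
end

-- love.draw rewritten with the helper (same result as above)
function love.draw()
    local w, h = quadSize(activeFrame)
    -- draw the active frame at 4x scale, centered on screen
    love.graphics.draw(imageFile, activeFrame,
        love.graphics.getWidth()/2  - (w/2) * 4,
        love.graphics.getHeight()/2 - (h/2) * 4,
        0, 4, 4)
end
```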

 

 

The Video

Programming

9. March 2016

 

So I’ve decided that it’s time to do another series about Haxe development here on GameFromScratch.  I did a short five part tutorial series on using Haxe and NME a few years back, before NME was rebranded OpenFL, but haven’t really touched it much since.  Before jumping into a Haxe game engine, I had to decide which game engine to jump into!  Part of my decision is going to be informed by a poll I’m currently running on Twitter.  Of course, it’s also useful to take a current look at the Haxe game engine landscape.  I did a post entitled Choosing A Haxe/NME Game Engine three years ago, but simply put, a lot changes in three years!  So what follows is a quick roundup of the most popular Haxe based game engines with a quick blurb about each.  If I missed one, please be sure to let me know in the comments below, on Twitter or via email!

Oh... and this is not a review in any sense of the word.  I have very little working experience with the majority of engines I am about to cover and have no informed opinion to share as a result!  They are also in no particular order unless you can consider “completely random” as a type of order!

 

Haxe Game Engines

 

 

Flash Engine Ports

 

HaxeFlixel

http://haxeflixel.com/

HaxeFlixel is an interesting story... it started life as a Haxe port of the popular Flash 2D game engine, Flixel.  Then, well, Flixel basically died and HaxeFlixel lived on.  In fact HaxeFlixel 4.0 was just released, while Flixel hasn’t been updated in years.  It is a complete 2D game engine for Haxe.

 

HaxePunk

http://haxepunk.com/

Wow, I could almost copy and paste the HaxeFlixel entry for HaxePunk.  It also started life as a Flash library port, and lives on while its inspiration, FlashPunk, is no longer updated.  That said, HaxePunk is much less active than HaxeFlixel.

 

Haxeling(Starling Port)

https://github.com/Haxeling/haxe-starling

This one is more of a WIP than the prior two.  It is a port of the popular Starling Flash library to the Haxe language.  There are limits right now, such as the lack of an HTML5 target and a few missing features.  Starling is perhaps best known as the framework Angry Birds was initially created in.

 

Away3D

https://github.com/away3d/away3d-core-openfl

A port of the Away3D Flash framework to the Haxe language.  One of the few 3D libraries that has model loading support out of the box (oddly...).  Obviously from the name, this is a 3D focused library.

 

Popular Haxe Frameworks

 

Kha

http://kha.tech/ || https://github.com/KTXSoftware/Kha

Kha is a low level framework whose design favors speed over other priorities.  It supports 2D and 3D but again is quite low level (so no model API, for example).  Kha operates at a level similar to SFML, SDL or LWJGL, with game engines built on top of it.  Such engines include:

KhaPunk || Cycles || Komponent2D || Kha2D

These engines work at a higher level of abstraction (further from the metal), making use of Kha to provide multimedia functionality.

 

NME

https://github.com/haxenme/nme

NME is the library that led to OpenFL.  It is similar to Kha in that it provides a low level, cross platform implementation of the technical “stuff” that makes up a game, with higher level game engines built on top of it.  NME became OpenFL, but it seems there was a community that wanted to see NME continue as it was, so NME has some life of its own as well.

 

OpenFL

http://www.openfl.org/

OpenFL started life as a cross platform implementation of the Flash API using the Haxe programming language.  Basically it allows Flash developers to seamlessly transition to Haxe development.  A number of engines are layered on top of OpenFL.  As mentioned above, OpenFL was originally NME, although the two have now evolved in different directions.  OpenFL was used to make several commercial games, such as Papers, Please.  HaxePunk, HaxeFlixel and Stencyl all are (or were) layered on top of OpenFL.

 

awe6

https://github.com/hypersurge/awe6

awe6 is awesome, but oddly never really seems to have taken off.  I’m glad to see that it’s still under development even if there isn’t much of a community around it.  awe6 is built around the ideas of inversion of control and dependency injection, and I really can’t do it justice in a single paragraph.  The lack of a community, though, makes this very much not an engine for people who aren’t able to solve problems on their own.  I did a closer look at awe6 years ago, and it should remain largely valid.

 

snowkit

http://snowkit.org/

This is one of the new kids on the block and a very cool looking collection of tools.  Snowkit is composed of flow – a build tool, snow – a low level media library, luxe – a game engine built on snow, mint – a UI library, linc – hxcpp bindings to several popular game libraries (SDL, OpenAL, etc.) and hxsw – a string/shader library.  All together, the collection is snowkit and, with Haxe, provides a complete framework for creating games.  It’s a cool concept, but it’s also more complicated due to all the moving pieces.

edit — To clarify, snowkit is an umbrella term for all of the above mentioned technologies combined, as well as the community around them.  The actual game engine is luxe, not snowkit.

 

lime

https://github.com/openfl/lime

Lime isn’t a game engine; it’s a cross platform media layer.  Basically, OpenFL and other libraries/engines are built on top of lime.  It provides logic for windows, input, audio, rendering, networking, etc. in a cross platform manner.  Obviously it’s pretty low level.  Heck, OpenFL is pretty low level, and it’s built on top of lime after all.

 

BabylonHx

http://babylonhx.gamestudiohx.com/

BabylonHx is a Haxe port of BabylonJS, which I looked at in depth here.  I like BabylonJS; it’s a great JavaScript game engine with a clean, easy to understand design.  BabylonHx is an incomplete port of BabylonJS... how incomplete I do not know.

 

Cycles

http://cyclesgame.org/

This one I know very little about, but I am going to change that, because the concept sounds really cool.  Basically it integrates into Blender and uses Haxe (and under that, Kha) as the programming language: you create your game world in Blender and code the logic in Haxe.  It’s similar, I suppose, to Blend4Web or BDX.  The choice of the name Cycles is probably a poor one though, as Cycles is also the name of Blender’s modern renderer.  I am intrigued though... enough that I am going to download it now, so that’s the end of this list... ;)  EDIT – Doh... “coming soon”.  Boo

 

The Ones I Missed

And the following is my wall of shame: the engines I missed that various members of the community pointed out to me (thanks for that, btw). 

 

Heaps

https://github.com/ncannasse/heaps

Heaps is a game engine by Nicolas Cannasse, who can basically be considered the father of the Haxe language, as well as of CastleDB and other important Haxe projects.  Heaps was used to make Evoland 2.  It is a 2D and 3D game engine capable of targeting WebGL, Flash Stage3D, mobile and desktop using OpenGL.

 

Flambe

http://getflambe.com/

A 2D cross platform Haxe based game engine including tools for importing Flash animations, creating particle systems and glyphs.  It can target iOS, Android, Flash, HTML5 and desktop using Adobe AIR.  Open source and MIT licensed.

Did I miss any (that aren’t unsupported or extremely unstable)?  Which of these engines are you most interested in?

Programming
