15. April 2018

A couple of days ago AppGameKit v2018.4.12 was released, with the major new feature being AR (Augmented Reality) support.  I decided to give the new AR functionality a shot, and it was really impressive how easy it was.  In order to get started with AR and AppGameKit you are going to need an AR-compatible device.  On iOS this means an ARKit-compatible device, which basically means an iPhone 6S or newer, while on Android you need an ARCore-compatible device from this list of phones.

I modified the AR example slightly, removing a bit of functionality and instead loading a simple Tie Fighter model I downloaded off the web and converted to .X format.  AppGameKit can be coded using either C++ or its higher level BASIC-like script, which is what was used in this example.  Here is the slightly modified source code:

// set window properties
SetWindowTitle( "AR Tie Fighter" )
SetWindowSize( 1024, 768, 0 )

// set display properties
SetVirtualResolution( 1024, 768 )
SetOrientationAllowed( 1, 1, 1, 1 )
SetClearColor( 101,120,154 )
SetGenerateMipmaps( 1 )

// camera range from 0.1 meters to 40 meters
SetCameraRange( 1, 0.1, 40 )
SetAmbientColor( 128,128,128 )
SetSunColor( 255,255,255 )

// load tie fighter
LoadObject( 1, "tie.x")
SetObjectPosition( 1, 0,0.1,0 )
LoadImage(1, "diffuse.jpg")
SetObjectImage (1,1,0) 

function ShowModel( show as integer )
  SetObjectVisible( 1, show )
endfunction

ShowModel( 0 )

function ScaleModel( amount as float )
  SetObjectScalePermanent( 1, amount, amount, amount )
endfunction

ScaleModel( 0.025 )

// create some planes to show detected surfaces, initially hidden
for i = 101 to 150
  CreateObjectPlane( i, 1,1 )
  SetObjectRotation( i, 90,0,0 )
  FixObjectPivot( i )
  SetObjectVisible( i, 0 )
  SetObjectColor( i, 255,255,255,128 ) // 50% transparent
  SetObjectTransparency( i, 1 )
next i

// add some buttons to control various features
AddVirtualButton( 1, 100,565,100 )
AddVirtualButton( 2, 100,665,100 )
SetVirtualButtonText( 1, "Scale +" )
SetVirtualButtonText( 2, "Scale -" )

AddVirtualButton( 3, 924,665,100 )
SetVirtualButtonText( 3, "Hide" )

function ShowHUD( show as integer )
  SetVirtualButtonVisible( 1, show )
  SetVirtualButtonVisible( 2, show )
  SetVirtualButtonVisible( 3, show )
  SetVirtualButtonActive( 1, show )
  SetVirtualButtonActive( 2, show )
  SetVirtualButtonActive( 3, show )
endfunction

// initialize AR, if possible
ARSetup()
while( ARGetStatus() = 1 )
  // wait while user is being prompted to install ARCore
  Sync()
endwhile

AnchorID as integer = 0
ShowPlanes as integer = 1
ambientScale# = 1.0

do

  // get light estimation
  ambient = ARGetLightEstimate() * 255 * ambientScale#
  SetAmbientColor( ambient,ambient,ambient )
  // check screen tap for plane hits, but only if buttons are visible
  if ( GetPointerReleased() and ShowPlanes = 1 )
    // check the point that the user tapped on the screen
    numHits = ARHitTest( GetPointerX(), GetPointerY() )
    if ( numHits > 0 )
      ShowModel( 1 )
      // delete any previous anchor, could keep it around instead
      if ( AnchorID > 0 ) then ARDeleteAnchor( AnchorID )
      // hit test results are ordered from closest to furthest
      // place the object at result 1, the closest
      AnchorID = ARCreateAnchorFromHitTest( 1 )
      ARFixObjectToAnchor( 1, AnchorID )
    else
      // if the user didn't tap on any planes then hide the object
      ShowModel( 0 )
    endif
    // clean up some internal resources
    ARHitTestFinish()
  endif
  // place the buttons at the edge of the screen
  // needs to be done regularly in case orientation changes
  SetVirtualButtonPosition( 1, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-210 )
  SetVirtualButtonPosition( 2, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-105 )
  SetVirtualButtonPosition( 3, GetScreenBoundsRight()-105, GetScreenBoundsBottom()-105 )
  // detect button presses if they are visible
  if ( ShowPlanes = 1 )
    if ( GetVirtualButtonPressed(1) ) then ScaleModel( 1.05 )
    if ( GetVirtualButtonPressed(2) ) then ScaleModel( 0.95 )
    if ( GetVirtualButtonPressed(3) )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 0 )
    endif
  else
    // screen tap whilst buttons are hidden shows them again
    if ( GetPointerReleased() )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 1 )
    endif
  endif
  // hide old planes
  for i = 101 to 150
    SetObjectVisible( i, 0 )
  next i
  // show detected planes
  if ( ShowPlanes )
    numPlanes = ARGetPlanes(0)
    // this demo stops at 50 planes, but there is no internal limit
    if numPlanes > 50 then numPlanes = 50
    for i = 1 to numPlanes
      SetObjectPosition( i+100, ARGetPlaneX(i), ARGetPlaneY(i), ARGetPlaneZ(i) )
      SetObjectRotation( i+100, ARGetPlaneAngleX(i), ARGetPlaneAngleY(i), ARGetPlaneAngleZ(i) )
      SetObjectScale( i+100, ARGetPlaneSizeX(i), 1, ARGetPlaneSizeZ(i) )
      SetObjectVisible( i+100, 1 )
    next i
  endif
  // show some stats while the HUD is visible
  if ( ShowPlanes )
    Print( "FPS: " + str(ScreenFPS()) )
    select( ARGetStatus() )
      case 2 :  Print( "AR Active" ) : endcase
      case -1 :  Print( "AR Not Available" ) : endcase
      case -2 :  Print( "AR Install Rejected" ) : endcase
    endselect
    Print( "Number of Planes Detected: " + str(numPlanes) )
    Print( "Light Estimation: " + str(ARGetLightEstimate()) )
    Print( "Light Boost: " + str(ambientScale#,1) )
  endif

  // draw the camera feed, and then the rest of the scene
  ARDrawBackground()
  Sync()
loop
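The light estimation step in the loop above maps the 0 to 1 value returned by ARGetLightEstimate() into a 0 to 255 ambient color, scaled by the user-adjustable boost factor. A quick sketch of that arithmetic in JavaScript (the `ambientLevel` helper is mine, not an AGK function, and I have added an explicit clamp to keep the result in the valid color range):

```javascript
// Sketch of the ambient light calculation from the AGK loop above.
// ambientLevel is a hypothetical helper, not part of any AR SDK.
function ambientLevel(estimate, boost) {
  // estimate is the 0..1 light reading, boost is the user-adjustable scale
  const level = Math.round(estimate * 255 * boost);
  // clamp to the valid 0..255 color range
  return Math.min(255, Math.max(0, level));
}
```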

You can see the results of this code and get a bit more detail by watching the video below:

If you are interested in learning more about AppGameKit, be sure to check out our Closer Look available here.


15. March 2018

Google just open sourced Resonance Audio, their 3D spatial audio rendering SDK.  It supports multiple platforms and game engines, including Unreal, Unity, Wwise, FMOD, iOS, Android and the web.  You can learn more about Resonance Audio here, while the source is hosted on GitHub under the liberal Apache 2.0 license.  Resonance enables you to create and position audio sources in 3D, define the audio properties of your world and position the listener; it then calculates the resulting audio for you.

The following is a simple HTML5 sample that illustrates creating a 3D audioscape using the Web API.  It creates three different sounds, a laughing woman, a waterfall and a chirping bird, all downloaded from the web.  The laughing woman can be moved around using the arrow keys, as well as page up/down to move along the Z axis.  You can move the listener's ear left and right using the A and D keys.  Finally, the chirping bird appears every 4 seconds in a random location relative to the user's ear, plus or minus 1 meter.
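The bird's random placement boils down to sampling a uniform offset in the range -1 to +1 meters on each axis. A tiny sketch of that calculation (the helper name is mine, for illustration only):

```javascript
// Hypothetical helper: uniform random offset in [-1, +1) meters,
// as used to reposition the bird source relative to the listener.
function randomOffset() {
  // Math.random() returns a value in [0, 1); scale by 2 and shift by -1
  return Math.random() * 2 - 1;
}
```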

You can get more details and hear the demo results in this video, which I will also embed below.  The sound works best when heard through a set of headphones.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>GFS Resonance Audio Test</title>

    <script src=""></script>
</head>
<body>
    <script>
        var laughX = 0.0;
        var laughY = 0.0;
        var laughZ = 0.0;

        var x = 0.0;
        var y = 0.0;
        var z = 0.0;

        // Create an AudioContext
        let audioContext = new AudioContext();

        // Create a (first-order Ambisonic) Resonance Audio scene and pass it
        // the AudioContext.
        let resonanceAudioScene = new ResonanceAudio(audioContext);

        // Connect the scene’s binaural output to stereo out.
        resonanceAudioScene.output.connect(audioContext.destination);

        // Define room dimensions.
        // By default, room dimensions are undefined (0m x 0m x 0m).
        let roomDimensions = {
            width: 3.1,
            height: 2.5,
            depth: 3.4,
        };

        // Define materials for each of the room’s six surfaces.
        // Room materials have different acoustic reflectivity.
        let roomMaterials = {
            // Room wall materials
            left: 'metal',
            right: 'curtain-heavy',
            front: 'curtain-heavy',
            back: 'curtain-heavy',
            // Room floor
            down: 'grass',
            // Room ceiling
            up: 'grass',
        };

        // Add the room definition to the scene.
        resonanceAudioScene.setRoomProperties(roomDimensions, roomMaterials);

        /// -----------------  Laugh audio
        // Create an AudioElement.
        let audioElement = document.createElement('audio');

        // Load an audio file into the AudioElement.
        audioElement.src = 'laugh.wav';
        audioElement.loop = true;
        // Generate a MediaElementSource from the AudioElement.
        let audioElementSource = audioContext.createMediaElementSource(audioElement);
        // Add the MediaElementSource to the scene as an audio input source.
        let source = resonanceAudioScene.createSource();
        audioElementSource.connect(source.input);
        // Set the source position relative to the room center (source default position).
        source.setPosition(laughX, laughY, laughZ);

        /// -----------------  Waterfall
        // Create an AudioElement.
        let audioElement2 = document.createElement('audio');
        audioElement2.src = 'waterfall.wav';
        audioElement2.loop = true;
        let audioElementSource2 = audioContext.createMediaElementSource(audioElement2);
        let source2 = resonanceAudioScene.createSource();
        audioElementSource2.connect(source2.input);

        /// -----------------  Bird noises
        let audioElement3 = document.createElement('audio');
        audioElement3.src = 'bird.wav';
        audioElement3.loop = false;
        let audioElementSource3 = audioContext.createMediaElementSource(audioElement3);
        let source3 = resonanceAudioScene.createSource();
        audioElementSource3.connect(source3.input);

        // Play the audio.
        audioElement.play();
        audioElement2.play();

        // Randomly position the bird -1 to +1 x/y relative to the listener's
        // location every 4 seconds, then play its chirp.
        setInterval(function() {
            source3.setPosition(x + Math.random() * 2 - 1, y + Math.random() * 2 - 1, 1);
            audioElement3.play();
        }, 4000);

        window.addEventListener("keyup", function(event) {

            // Move laugh audio source around when arrow keys pressed
            if (event.which == 37) // left arrow key
                source.setPosition(laughX -= 0.10, laughY, laughZ);
            if (event.which == 39) // right arrow key
                source.setPosition(laughX += 0.10, laughY, laughZ);
            if (event.which == 38) // up arrow key
                source.setPosition(laughX, laughY += 0.10, laughZ);
            if (event.which == 40) // down arrow key
                source.setPosition(laughX, laughY -= 0.10, laughZ);
            if (event.which == 33) // page up key
                source.setPosition(laughX, laughY, laughZ += 0.10);
            if (event.which == 34) // page down key
                source.setPosition(laughX, laughY, laughZ -= 0.10);
            if (event.which == 32) { // space key, reset the laugh to center
                laughX = 0;
                laughY = 0;
                laughZ = 0;
                source.setPosition(laughX, laughY, laughZ);
            }

            // Move the listener left or right on A/D keys
            if (event.which == 65) { // A
                resonanceAudioScene.setListenerPosition(x -= 0.10, y, z);
            }
            if (event.which == 68) { // D
                resonanceAudioScene.setListenerPosition(x += 0.10, y, z);
            }
        }, this);
    </script>
</body>
</html>



The video:

GameDev News Programming

12. February 2018

Today Unity have released a very high quality, full featured 2D Game Kit, available here.  The kit is a 2D platformer game with multiple levels, a loading screen and more.  It has been designed in such a way that much of the game content can be customized and configured without ever having to write a single line of code.  Additionally, there is a step by step tutorial series, as well as comprehensive reference material, that shows you how to create your own 2D game using the game kit.  The kit is completely free and can be downloaded from the Asset Store.

If you are interested in seeing the Unity 2D game kit in action be sure to check out this quick video, which is also embedded below.

Programming GameDev News

9. February 2018

With the release of the Unity 2018.1 beta, Unity have developed a completely new scriptable render pipeline.  On top of this new rendering technology, Unity have a new shader tool called Shader Graph.  Shader Graph enables you to create shaders using a drag and drop interface, building graphs of render nodes.

To get started with the new Shader Graph, you need to be running Unity 2018.1 Beta or newer.  Additionally you currently need to download this example scene.  You can see the Shader Graph in action in this video also embedded below.

Programming Art

1. February 2018

As part of the Godot 3 release, Godot got official support for VR headsets using Cardboard, SteamVR and OpenHMD interfaces, implemented using the new GDNative functionality in Godot.  Today I decided to test it using my Samsung Odyssey HMD, a Windows Mixed Reality headset that has beta compatibility with SteamVR.  I personally had very little hope of things going smoothly… boy, was I wrong.  What follows is a step by step guide to using VR in Godot.  This whole process is made possible by the hard work of Bastiaan Olij; his Godot 3 OpenVR project is available here.

First, we assume that you are using Godot 3 or higher.  If you haven't already installed Godot 3 or higher, go do so now.

Next, create a new project; the specifics really don’t matter.  There are a few requirements: every scene must have an ARVRCamera, and the camera must have an ARVROrigin as its parent.  I start with the following setup:


The ARVROrigin only has one property, the world scale.  The ARVRCamera has several more options, such as FoV, an Environment and more.  For now the defaults are fine.  Next we need a small bit of code to start the VR server.  Attach a script to the root node and add the following code to _ready():

func _ready():
	var vr = ARVRServer.find_interface("OpenVR")
	if(vr and vr.initialize()):
		get_viewport().arvr = true
		get_viewport().hdr = false

And… done!  Really, that’s it.  Add a few objects to your scene under the ARVROrigin.  Plug in your headset and press play.  At this point your scene should render on your headset, and you should already have head tracking enabled!

Next up, let’s go ahead and install the OpenVR functionality.  First select the AssetLib tab:


Now search for VR and select OpenVR module:


Click the install button.  Then once downloaded, click install again:


Now click Install once again and the addons folder will be copied into your project, including all of the DLLs and scenes we need.

Next it’s time to implement some controller logic.  You could implement it yourself using ARVRController, or you can let someone else do the hard work!  With ARVROrigin selected, right click and select Instance Child Scene…


Navigate into the module we installed earlier into the folder addons/godot-openvr/scenes and select ovr_controller.tscn.


Next you can add default behavior to the controller you just created.  Right click the newly created controller node, select Instance Child Scene and this time choose Function_Pointer.tscn.  Your scene should now look like this:


At this point you have a 3D game with full head tracking and a single controller with pointer functionality.  Pretty awesome!  For even more functionality you can add another controller and attach teleport controls to it, giving you the ability to move around.  Next, replace your camera with an ovr_first_person scene and presto, you’ve got a VR game!

If you’d prefer the video version check here (or embedded below):

