
17. April 2018


If you are a Blender Game Engine (BGE) fan, I have some bad news for you.  Earlier today BGE was removed from the Blender 2.8 branch of the source code.  This means that from the next version of Blender onward, there will no longer be a built-in game engine.  The game engine was never particularly popular and apparently caused a bit of a code maintenance nightmare, so the decision was made to remove it.  The change is massive, touching 916 files in the code base.

Details of the change from the Blender code commit comments:

Removing Blender Game Engine from Blender 2.8

Folders removed entirely:

  • //extern/recastnavigation
  • //intern/decklink
  • //intern/moto
  • //source/blender/editors/space_logic
  • //source/blenderplayer
  • //source/gameengine

This includes DNA data and any reference to the BGE code in Blender itself.
We are bumping the subversion.

Pending tasks:

  • Tile/clamp code in image editor draw code.
  • Viewport drawing code (so much of this will go away because of BI removal that we can wait until then to remove this).

You can learn more about the change in this video, also embedded below.

Art, Programming, GameDev News

15. April 2018


A couple of days back AppGameKit v2018.4.12 was released, with the major new feature being AR (Augmented Reality) support.  I decided to give the new AR functionality a shot and was really impressed by how easy it was.  In order to get started with AR and AppGameKit you are going to need an AR compatible device.  On iOS this means an ARKit compatible device, which basically means an iPhone 6S or newer, while on Android you need an ARCore compatible device from this list of phones.


I modified the AR example slightly, removing a bit of functionality and instead loading a simple Tie Fighter model I downloaded off the web and converted to .X format.  AppGameKit can be coded using either C++ or its higher level BASIC-like scripting language, which is what was used in this example.  Here is the slightly modified source code:

// set window properties
SetWindowTitle( "AR Tie Fighter" )
SetWindowSize( 1024, 768, 0 )

// set display properties
SetVirtualResolution( 1024, 768 )
SetOrientationAllowed( 1, 1, 1, 1 )
SetScissor(0,0,0,0)
SetClearColor( 101,120,154 )
SetGenerateMipmaps( 1 )
UseNewDefaultFonts(1)
SetPrintSize(20)

// camera range from 0.1 meters to 40 meters
SetCameraRange( 1, 0.1, 40 )
SetAmbientColor( 128,128,128 )
SetSunColor( 255,255,255 )

// load tie fighter
LoadObject( 1, "tie.x")
SetObjectPosition( 1, 0,0.1,0 )
LoadImage(1, "diffuse.jpg")
SetObjectImage (1,1,0) 
SetObjectRotation(1,270,0,0)

function ShowModel( show as integer )
  SetObjectVisible( 1, show )
endfunction

ShowModel( 0 )

function ScaleModel( amount as float )
  SetObjectScalePermanent( 1, amount, amount, amount )
endfunction

ScaleModel( 0.025 )

// create some planes to show detected surfaces, initially hidden
for i = 101 to 150
  CreateObjectPlane( i, 1,1 )
  SetObjectRotation( i, 90,0,0 )
  FixObjectPivot( i )
  SetObjectVisible( i, 0 )
  SetObjectColor( i, 255,255,255,128 ) // 50% transparent
  SetObjectTransparency( i, 1 )
next i

// add some buttons to control various features
AddVirtualButton( 1, 100,565,100 )
AddVirtualButton( 2, 100,665,100 )
SetVirtualButtonText( 1, "Scale +" )
SetVirtualButtonText( 2, "Scale -" )

AddVirtualButton( 3, 924,665,100 )
SetVirtualButtonText( 3, "Hide" )

function ShowHUD( show as integer )
  SetVirtualButtonVisible( 1, show )
  SetVirtualButtonVisible( 2, show )
  SetVirtualButtonVisible( 3, show )
  SetVirtualButtonActive( 1, show )
  SetVirtualButtonActive( 2, show )
  SetVirtualButtonActive( 3, show )
endfunction

// initialize AR, if possible
ARSetup()
while( ARGetStatus() = 1 )
  // wait while user is being prompted to install ARCore
  Sync()
endwhile

AnchorID as integer = 0
ShowPlanes as integer = 1
ambientScale# = 1.0

do
  // get light estimation
  ambient = ARGetLightEstimate() * 255 * ambientScale#
  SetAmbientColor( ambient,ambient,ambient )
  
  // check screen tap for plane hits, but only if buttons are visible
  if ( GetPointerReleased() and ShowPlanes = 1 )
    // check the point that the user tapped on the screen
    numHits = ARHitTest( GetPointerX(), GetPointerY() )
    if ( numHits > 0 )
      ShowModel( 1 )
      // delete any previous anchor, could keep it around instead
      if ( AnchorID > 0 ) then ARDeleteAnchor( AnchorID )
      // hit test results are ordered from closest to furthest
      // place the object at result 1, the closest
      AnchorID = ARCreateAnchorFromHitTest( 1 )
      ARFixObjectToAnchor( 1, AnchorID )
    else
      // if the user didn't tap on any planes then hide the object
      ShowModel( 0 )
    endif
    // clean up some internal resources
    ARHitTestFinish()
  endif
  
  // place the buttons at the edge of the screen
  // needs to be done regularly in case orientation changes
  SetVirtualButtonPosition( 1, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-210 )
  SetVirtualButtonPosition( 2, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-105 )
  SetVirtualButtonPosition( 3, GetScreenBoundsRight()-105, GetScreenBoundsBottom()-105 )
  
  // detect button presses if they are visible
  if ( ShowPlanes = 1 )
    if ( GetVirtualButtonPressed(1) )
      ScaleModel( 1.05 )
    endif
    if ( GetVirtualButtonPressed(2) )
      ScaleModel( 0.95 )
    endif
    if ( GetVirtualButtonPressed(3) )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 0 )
    endif
  else
    // screen tap whilst buttons are hidden shows them again
    if ( GetPointerReleased() )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 1 )
    endif
  endif
  
  // hide old planes
  for i = 101 to 150
    SetObjectVisible( i, 0 )
  next i
  
  // show detected planes
  if ( ShowPlanes )
    numPlanes = ARGetPlanes(0)
    // this demo stops at 50 planes, but there is no internal limit
    if numPlanes > 50 then numPlanes = 50
    for i = 1 to numPlanes
      SetObjectPosition( i+100, ARGetPlaneX(i), ARGetPlaneY(i), ARGetPlaneZ(i) )
      SetObjectRotation( i+100, ARGetPlaneAngleX(i), ARGetPlaneAngleY(i), ARGetPlaneAngleZ(i) )
      SetObjectScale( i+100, ARGetPlaneSizeX(i), 1, ARGetPlaneSizeZ(i) )
      SetObjectVisible( i+100, 1 )
    next i
    ARGetPlanesFinish()
  endif
    
  if ( ShowPlanes )
    Print( "FPS: " + str(ScreenFPS()) )
    select( ARGetStatus() )
      case 2 :  Print( "AR Active" ) : endcase
      case -1 :  Print( "AR Not Available" ) : endcase
      case -2 :  Print( "AR Install Rejected" ) : endcase
    endselect
    Print( "Number of Planes Detected: " + str(numPlanes) )
    Print( "Light Estimation: " + str(ARGetLightEstimate()) )
    Print( "Light Boost: " + str(ambientScale#,1) )
  endif
    
  // draw the camera feed, and then the rest of the scene
  ARDrawBackground()
  Sync()
  RotateObjectLocalZ(1,1)
loop


You can see the results of this code and get a bit more detail by watching the video below:


If you are interested in learning more about AppGameKit, be sure to check out our Closer Look available here.

Programming

15. March 2018


Google just open sourced Resonance Audio, their 3D spatial audio rendering SDK.  It supports multiple platforms, game engines and audio middleware, including Unreal, Unity, Wwise, FMOD, iOS, Android and the Web.  You can learn more about Resonance Audio here, while the source is hosted on GitHub under the liberal Apache 2.0 license.  Resonance enables you to create and position audio sources in 3D, define the audio properties of your world and position the listener; it then calculates the results for you.
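
To give a sense of just how little code that takes, here is a stripped down sketch of the Web version using only the calls that appear in the full example further down: create a scene, feed it an audio element as a source, position the source and the listener.  The file name click.wav is simply a placeholder sound for illustration.

<script src="https://cdn.jsdelivr.net/npm/resonance-audio/build/resonance-audio.min.js"></script>
<script>
    // Create a Web Audio context and a Resonance Audio scene, then route the
    // scene's binaural output to the speakers.
    let audioContext = new AudioContext();
    let scene = new ResonanceAudio(audioContext);
    scene.output.connect(audioContext.destination);

    // Feed an <audio> element into the scene as a positional source.
    let element = document.createElement('audio');
    element.src = 'click.wav'; // placeholder file name
    let elementSource = audioContext.createMediaElementSource(element);
    let source = scene.createSource();
    elementSource.connect(source.input);

    // Place the source one meter to the right of the listener and play it.
    source.setPosition(1, 0, 0);
    scene.setListenerPosition(0, 0, 0);
    element.play();
</script>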


The following is a simple HTML5 sample that illustrates creating a 3D audioscape using the Web API.  It creates 3 different sounds, a laughing woman, a waterfall and a bird chirping, all downloaded from freesound.org.  The laughing woman can be moved around using the arrow keys, as well as Page Up/Down to move along the Z axis.  You can move the listener's ear left and right using the A and D keys.  Finally, the bird chirp plays every 4 seconds at a random location relative to the listener's ear, plus or minus 1 meter.


You can get more details and hear the demo results in this video, which I will also embed below.  The sound works best when heard through a set of headphones.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>GFS Resonance Audio Test</title>


    <script src="https://cdn.jsdelivr.net/npm/resonance-audio/build/resonance-audio.min.js"></script>

    <script>
        var laughX = 0.0;
        var laughY = 0.0;
        var laughZ = 0.0;


        var x = 0.0;
        var y = 0.0;
        var z = 0.0;

        // Create an AudioContext
        let audioContext = new AudioContext();

        // Create a (first-order Ambisonic) Resonance Audio scene and pass it
        // the AudioContext.
        let resonanceAudioScene = new ResonanceAudio(audioContext);

        // Connect the scene’s binaural output to stereo out.
        resonanceAudioScene.output.connect(audioContext.destination);

        // Define room dimensions.
        // By default, room dimensions are undefined (0m x 0m x 0m).
        let roomDimensions = {
            width: 3.1,
            height: 2.5,
            depth: 3.4,
        };

        // Define materials for each of the room’s six surfaces.
        // Room materials have different acoustic reflectivity.
        let roomMaterials = {
            // Room wall materials
            left: 'metal',
            right: 'curtain-heavy',
            front: 'curtain-heavy',
            back: 'curtain-heavy',
            // Room floor
            down: 'grass',
            // Room ceiling
            up: 'grass',
        };

        // Add the room definition to the scene.
        resonanceAudioScene.setRoomProperties(roomDimensions, roomMaterials);

        /// -----------------  Laugh audio
        // Create an AudioElement.
        let audioElement = document.createElement('audio');

        // Load an audio file into the AudioElement.
        audioElement.src = 'laugh.wav';
        audioElement.loop = true;
        // Generate a MediaElementSource from the AudioElement.
        let audioElementSource = audioContext.createMediaElementSource(audioElement);
        // Add the MediaElementSource to the scene as an audio input source.
        let source = resonanceAudioScene.createSource();
        audioElementSource.connect(source.input);
        // Set the source position relative to the room center (source default position).
        source.setPosition(laughX, laughY, laughZ);
        source.setMaxDistance(3);

        /// -----------------  Waterfall
        // Create an AudioElement.
        let audioElement2 = document.createElement('audio');
        audioElement2.src = 'waterfall.wav';
        audioElement2.loop = true;
        let audioElementSource2 = audioContext.createMediaElementSource(audioElement2);
        let source2 = resonanceAudioScene.createSource();
        audioElementSource2.connect(source2.input);
        source2.setPosition(0,0,0);
        source2.setMaxDistance(3);


        /// -----------------  Bird noises
        let audioElement3 = document.createElement('audio');
        audioElement3.src = 'bird.wav';
        audioElement3.loop = false;
        let audioElementSource3 = audioContext.createMediaElementSource(audioElement3);
        let source3 = resonanceAudioScene.createSource();
        audioElementSource3.connect(source3.input);
        source3.setPosition(0.5,0,1);
        source3.setMaxDistance(3);

        // Play the audio.
        audioElement.play();
        audioElement2.play();

        setInterval(()=>{
            //randomly position bird  -1 to +1 x/y relative to the listeners location every 4 seconds
            source3.setPosition(x + Math.random() * 2 - 1 ,y + Math.random() * 2 - 1,1);
            audioElement3.play();
        },4000);

        resonanceAudioScene.setListenerPosition(x,y,z);
        window.addEventListener("keyup", function(event) {

            // Move laugh audio source around when arrow keys pressed
            if (event.which == 37) // left arrow key
                {
                    source.setPosition(laughX -= 0.10, laughY, laughZ);
                }
            if (event.which == 39) // right arrow key
                {
                    source.setPosition(laughX += 0.10, laughY, laughZ);
                }
            if (event.which == 38) // up arrow key
                {
                    source.setPosition(laughX , laughY += 0.10, laughZ);
                }
            if (event.which == 40) // down arrow key
                {
                    source.setPosition(laughX, laughY -= 0.10, laughZ);
                }
            if (event.which == 33) // page up arrow key
            {
                source.setPosition(laughX , laughY, laughZ += 0.10);
            }
            if (event.which == 34) // page down arrow key
            {
                source.setPosition(laughX, laughY, laughZ -= 0.10);
            }
            if (event.which == 32) // space key
            {
                laughX = 0;
                laughY = 0;
                laughZ = 0;
                source.setPosition(laughX, laughY, laughZ);
            }



            // Move the listener left or right on A/D keys
            if (event.which == 65){ //A
                resonanceAudioScene.setListenerPosition(x-=0.1,y,z);
            }
            if (event.which == 68){ //D
                resonanceAudioScene.setListenerPosition(x+=0.1,y,z);
            }
        }, this);

    </script>
</head>
<body>

</body>
</html>



GameDev News, Programming

12. February 2018


Today Unity have released a very high quality and full featured 2D Game Kit, available here.  The kit is a complete 2D platformer game with multiple levels, a loading screen and more.  It has been designed in such a way that much of the game content can be customized and configured without ever having to write a single line of code.  Additionally, there is a step by step tutorial series as well as comprehensive reference material showing you how to create your own 2D game using the game kit.  The kit is completely free and can be downloaded from the Asset Store.


If you are interested in seeing the Unity 2D game kit in action be sure to check out this quick video, which is also embedded below.


Programming, GameDev News

9. February 2018


With the release of the Unity 2018.1 beta, Unity have developed a completely new programmable graphics pipeline.  On top of this new rendering technology, Unity have added a new shader tool called Shader Graph.  Shader Graph enables you to create shaders using a drag and drop interface, by wiring together graphs of render nodes.


To get started with the new Shader Graph, you need to be running the Unity 2018.1 beta or newer.  Additionally, you currently need to download this example scene.  You can see the Shader Graph in action in this video, also embedded below.

Programming, Art
