31. January 2019

VReal is an application for broadcasting and watching supported VR applications.  It enables you to participate in the game you are watching, freely navigating the game world while potentially communicating with other spectators.  Until today, this required both the broadcaster and the audience to use VR headsets, greatly shrinking the possible audience.  Today VReal announced “Desktop Mode”, which enables people without VR headsets to get in on the fun.

From the news release:

Hello friends,

We’re starting off 2019 with a big update: We’re introducing a new way for everyone to watch their favorite VR content creators in Vreal – no VR headset required! (WHAAAAT?)

That’s right; we’re introducing Desktop mode, which gives you an immersive, two-dimensional look into the VR content made in Vreal. Using the WASD keys and your mouse, you can move around anywhere within the game, giving you full control of where you want to watch from; your very own spectator mode for VR game content.

The viewing experience is cross-platform; so you can watch with friends in Desktop mode or in VR from your own headset.

Simply download the application on Steam, then launch in Non-VR mode when prompted.  VReal is currently in early access and is free.  There are developer kits for Unity and Unreal Engine if you wish to integrate VReal into your VR title, but sadly the developer website does not appear to be working.

GameDev News

27. September 2018

Today Google announced the release of ARCore 1.5 as well as Sceneform, a real-time 3D framework with a physically based renderer for Android.  The 1.5 release comes with runtime support for loading glTF models, the ability to ID individual point cloud points, and a newly open source UX library in Sceneform.  In addition to the Android release, there are builds of ARCore 1.5 for Unity and Unreal Engine developers as well.

Details from the Google developer blog:

Today, we're releasing updates to ARCore, Google's platform for building augmented reality experiences, and to Sceneform, the 3D rendering library for building AR applications on Android. These updates include algorithm improvements that will let your apps consume less memory and CPU usage during longer sessions. They also include new functionality that give you more flexibility over content management.

Here's what we added:

    • Supporting runtime glTF loading in Sceneform
    • Publishing the Sceneform UX Library's source code
    • Adding point cloud IDs to ARCore
    • New devices (plus Chrome OS in the form of Chromebook Tab 10)

You can download the source for Sceneform here on GitHub; the code is released under the Apache 2.0 source license.  Unity developers can click here, while Unreal Engine developers should click here.

GameDev News

26. September 2018

Today at Oculus Connect 5, Oculus announced the upcoming Oculus Quest headset.  What makes this headset special is that it offers full six degrees of freedom tracking without base stations, while being completely wireless.  It slots into the lineup between the cheaper but less capable Oculus Go, and the more expensive but wired Oculus Rift, which requires a full desktop PC to function.

Details of the Oculus Quest from the Oculus blog:

We’re excited to usher in the next era of VR gaming with the introduction of Oculus Quest, our first all-in-one VR gaming system. Oculus Quest will launch in Spring 2019 for $399 USD. Offering six degrees of freedom and Touch controllers, Oculus Quest makes it easy to jump right into the action—with no PC, no wires, and no external sensors. We have over 50 titles lined up for launch, with even more in the works including some of your favorite Rift games like Robo Recall, The Climb, and Moss.

Oculus Insight
We also unveiled Oculus Insight, our breakthrough technology that powers inside-out tracking, Guardian, and Touch controller tracking. This innovative system uses four ultra wide-angle sensors and computer vision algorithms to track your exact position in real time without any external sensors. Insight gives you a greater sense of immersion, presence, and mobility, plus the ability to go beyond room-scale. And we’ve brought over Guardian to help keep you safer while in VR. It’s easy to setup and experience whenever you want.

The Best VR Games Deserve the Best VR Controllers
With the same buttons, thumbsticks, and sensors that have defined VR gaming, our intuitive Touch controllers bring your real hands into VR and let you easily and naturally interact with the world around you. By shipping Oculus Quest with Touch, everything developers have learned about game design for Rift applies to Oculus Quest. Now you can enjoy the best that VR gaming has to offer, starting at $399 USD for a 64GB headset—with the convenience and portability of all-in-one VR.

Quality Meets Comfort
Oculus Quest includes the same best-of-class optics as Oculus Go with a display resolution of 1600x1440 per eye, while incorporating a lens spacing adjustment to help maximize visual comfort. And we’ve improved our built-in audio, so you get high-quality, immersive sound with even deeper bass.

GameDev News

15. April 2018

A couple of days back AppGameKit v2018.4.12 was released, with the major new feature being AR (Augmented Reality) support.  I decided to give the new AR functionality a shot and it was really impressive how easy it was.  In order to get started with AR and AppGameKit you are going to need an AR compatible device.  On iOS, this means an ARKit compatible device, which basically means an iPhone 6S or newer, while on Android you need an ARCore compatible device from this list of phones.

I modified the AR example slightly, to remove a bit of functionality and to instead load a simple Tie Fighter model I downloaded off the web and converted to .X format.  AppGameKit can be coded using either C++ or its higher-level BASIC-like script, which is what is used in this example.  Here is the slightly modified source code:

// set window properties
SetWindowTitle( "AR Tie Fighter" )
SetWindowSize( 1024, 768, 0 )

// set display properties
SetVirtualResolution( 1024, 768 )
SetOrientationAllowed( 1, 1, 1, 1 )
SetClearColor( 101,120,154 )
SetGenerateMipmaps( 1 )

// camera range from 0.1 meters to 40 meters
SetCameraRange( 1, 0.1, 40 )
SetAmbientColor( 128,128,128 )
SetSunColor( 255,255,255 )

// load tie fighter
LoadObject( 1, "tie.x")
SetObjectPosition( 1, 0,0.1,0 )
LoadImage( 1, "diffuse.jpg")
SetObjectImage( 1,1,0 )

function ShowModel( show as integer )
  SetObjectVisible( 1, show )
endfunction

ShowModel( 0 )

function ScaleModel( amount as float )
  SetObjectScalePermanent( 1, amount, amount, amount )
endfunction

ScaleModel( 0.025 )

// create some planes to show detected surfaces, initially hidden
for i = 101 to 150
  CreateObjectPlane( i, 1,1 )
  SetObjectRotation( i, 90,0,0 )
  FixObjectPivot( i )
  SetObjectVisible( i, 0 )
  SetObjectColor( i, 255,255,255,128 ) // 50% transparent
  SetObjectTransparency( i, 1 )
next i

// add some buttons to control various features
AddVirtualButton( 1, 100,565,100 )
AddVirtualButton( 2, 100,665,100 )
SetVirtualButtonText( 1, "Scale +" )
SetVirtualButtonText( 2, "Scale -" )

AddVirtualButton( 3, 924,665,100 )
SetVirtualButtonText( 3, "Hide" )

function ShowHUD( show as integer )
  SetVirtualButtonVisible( 1, show )
  SetVirtualButtonVisible( 2, show )
  SetVirtualButtonVisible( 3, show )
  SetVirtualButtonActive( 1, show )
  SetVirtualButtonActive( 2, show )
  SetVirtualButtonActive( 3, show )
endfunction

// initialize AR, if possible
ARSetup()
while( ARGetStatus() = 1 )
  // wait while user is being prompted to install ARCore
  Sync()
endwhile

AnchorID as integer = 0
ShowPlanes as integer = 1
ambientScale# = 1.0

do
  // get light estimation
  ambient = ARGetLightEstimate() * 255 * ambientScale#
  SetAmbientColor( ambient,ambient,ambient )

  // check screen tap for plane hits, but only if buttons are visible
  if ( GetPointerReleased() and ShowPlanes = 1 )
    // check the point that the user tapped on the screen
    numHits = ARHitTest( GetPointerX(), GetPointerY() )
    if ( numHits > 0 )
      ShowModel( 1 )
      // delete any previous anchor, could keep it around instead
      if ( AnchorID > 0 ) then ARDeleteAnchor( AnchorID )
      // hit test results are ordered from closest to furthest
      // place the object at result 1, the closest
      AnchorID = ARCreateAnchorFromHitTest( 1 )
      ARFixObjectToAnchor( 1, AnchorID )
    else
      // if the user didn't tap on any planes then hide the object
      ShowModel( 0 )
    endif
    // clean up some internal resources
    ARHitTestFinish()
  endif

  // place the buttons at the edge of the screen
  // needs to be done regularly in case orientation changes
  SetVirtualButtonPosition( 1, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-210 )
  SetVirtualButtonPosition( 2, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-105 )
  SetVirtualButtonPosition( 3, GetScreenBoundsRight()-105, GetScreenBoundsBottom()-105 )

  // detect button presses if they are visible
  if ( ShowPlanes = 1 )
    if ( GetVirtualButtonPressed(1) ) then ScaleModel( 1.05 )
    if ( GetVirtualButtonPressed(2) ) then ScaleModel( 0.95 )
    if ( GetVirtualButtonPressed(3) )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 0 )
    endif
  else
    // screen tap whilst buttons are hidden shows them again
    if ( GetPointerReleased() )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 1 )
    endif
  endif

  // hide old planes
  for i = 101 to 150
    SetObjectVisible( i, 0 )
  next i

  // show detected planes
  if ( ShowPlanes )
    numPlanes = ARGetPlanes(0)
    // this demo stops at 50 planes, but there is no internal limit
    if numPlanes > 50 then numPlanes = 50
    for i = 1 to numPlanes
      SetObjectPosition( i+100, ARGetPlaneX(i), ARGetPlaneY(i), ARGetPlaneZ(i) )
      SetObjectRotation( i+100, ARGetPlaneAngleX(i), ARGetPlaneAngleY(i), ARGetPlaneAngleZ(i) )
      SetObjectScale( i+100, ARGetPlaneSizeX(i), 1, ARGetPlaneSizeZ(i) )
      SetObjectVisible( i+100, 1 )
    next i
  endif

  // show some status text while the HUD is visible
  if ( ShowPlanes )
    Print( "FPS: " + str(ScreenFPS()) )
    select( ARGetStatus() )
      case 2 :  Print( "AR Active" ) : endcase
      case -1 : Print( "AR Not Available" ) : endcase
      case -2 : Print( "AR Install Rejected" ) : endcase
    endselect
    Print( "Number of Planes Detected: " + str(numPlanes) )
    Print( "Light Estimation: " + str(ARGetLightEstimate()) )
    Print( "Light Boost: " + str(ambientScale#,1) )
  endif

  // draw the camera feed, and then the rest of the scene
  ARDrawBackground()
  Sync()
loop
You can see the results of this code and get a bit more detail by watching the video below:

If you are interested in learning more about AppGameKit, be sure to check out our Closer Look available here.


1. February 2018

As part of the Godot 3 release, Godot got official support for VR headsets via Cardboard, SteamVR and OpenHMD interfaces, implemented using the new GDNative functionality in Godot.  Today I decided to test it using my Samsung Odyssey HMD, a Windows Mixed Reality headset that has beta compatibility with SteamVR.  I personally had very little hope for things to go smoothly… boy was I wrong.  What follows is a step by step guide to using VR in Godot.  This whole process is made possible by the hard work of Bastiaan Olij; his Godot 3 OpenVR project is available here.

First, we assume that you are using Godot 3 or higher.  If you haven’t already installed it, go do so now.

Next, create a new project; the specifics really don’t matter.  There are a few requirements: every scene must have an ARVRCamera, and the camera must have an ARVROrigin as its parent.  I start with the following setup:


The ARVROrigin only has one property, the world scale.  The ARVRCamera has several more options, such as FoV and an Environment.  For now the defaults are fine.  Next we need a small bit of code to initialize the VR interface.  Attach a script to the root node and add the following code to _ready:

func _ready():
	# find the OpenVR interface provided by the GDNative module
	var vr = ARVRServer.find_interface("OpenVR")
	if (vr and vr.initialize()):
		# render the viewport to the headset; HDR output must be off for VR
		get_viewport().arvr = true
		get_viewport().hdr = false

And… done!  Really, that’s it.  Add a few objects to your scene under the ARVROrigin.  Plug in your headset and press play.  At this point your scene should render on your headset and you should already have head tracking enabled!
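If nothing shows up on the headset, it can help to print which interfaces the ARVRServer actually knows about before trying to initialize one.  Here is a minimal sketch under that assumption; in Godot 3, `ARVRServer.get_interfaces()` returns an array of dictionaries with `id` and `name` keys, and the fallback message is my own:

```gdscript
extends Spatial

func _ready():
	# list every ARVR interface registered with the server;
	# "OpenVR" only appears once the GDNative module is installed
	for info in ARVRServer.get_interfaces():
		print(info.name, " (id ", info.id, ")")

	var vr = ARVRServer.find_interface("OpenVR")
	if vr and vr.initialize():
		get_viewport().arvr = true
		get_viewport().hdr = false
	else:
		print("OpenVR not available, rendering to the normal display")
```

The `else` branch simply leaves the viewport alone, so the project still runs flat on your monitor when no headset is present.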

Next up, let’s go ahead and install the OpenVR functionality.  First select the AssetLib tab:


Now search for VR and select OpenVR module:


Click the install button.  Then once downloaded, click install again:


Now click Install once again and the addons will be copied to your project, including all of the DLLs and scenes we need.

Next it’s time to implement some controller logic.  You could implement them yourself using ARVRController, or you can let someone else do the hard work!  With ARVROrigin selected, right click and select Instance Child Scene…


Navigate into the module we installed earlier into the folder addons/godot-openvr/scenes and select ovr_controller.tscn.


Next you can add default behavior to the controller you just created.  Right click the newly created controller node, instance a child scene and this time select Function_Pointer.tscn.  Your scene should now look like this:


At this point you have a 3D game with full head tracking and a single controller with pointer functionality.  Pretty awesome!  For even more functionality you can implement another controller and attach teleport controls to it, giving you the ability to move around.  Next replace your camera with an ovr_first_person scene and presto, you’ve got a VR game!
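If you do want to roll your own controller logic with ARVRController instead of the bundled scenes, a minimal sketch might look like the following.  The script is an assumption of mine, not part of the OpenVR module; in Godot 3 the ARVRController node exposes a `controller_id` property, `button_pressed`/`button_release` signals, and a `get_is_active()` method, and its transform follows the physical controller automatically:

```gdscript
extends ARVRController

func _ready():
	# controller_id 1 is conventionally the first tracked controller
	controller_id = 1
	connect("button_pressed", self, "_on_button_pressed")
	connect("button_release", self, "_on_button_release")

func _on_button_pressed(button):
	print("button down: ", button)

func _on_button_release(button):
	print("button up: ", button)

func _process(_delta):
	# get_is_active() stays false until the runtime reports the controller
	if not get_is_active():
		return
	# react to controller pose here; the node's transform is already updated
```

Attach this to an ARVRController node that is a child of the ARVROrigin, just like the ovr_controller scene we instanced above.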

If you’d prefer the video version check here (or embedded below):

