27. September 2018


Today Google announced the release of ARCore 1.5 along with an update to Sceneform, its real-time 3D framework with a physically based renderer for Android.  The 1.5 release adds runtime support for loading glTF models, the ability to ID individual points in the point cloud, and a newly open-sourced UX library in Sceneform.  In addition to the Android release, there are builds of ARCore 1.5 for Unity and Unreal Engine developers as well.

Details from the Google developer blog:

Today, we're releasing updates to ARCore, Google's platform for building augmented reality experiences, and to Sceneform, the 3D rendering library for building AR applications on Android. These updates include algorithm improvements that will let your apps use less memory and CPU during longer sessions. They also include new functionality that gives you more flexibility over content management.

Here's what we added:

    • Supporting runtime glTF loading in Sceneform
    • Publishing the Sceneform UX Library's source code
    • Adding point cloud IDs to ARCore
    • New devices (plus Chrome OS in the form of Chromebook Tab 10)

You can download the source for Sceneform here on GitHub; the code is released under the Apache 2.0 license.  Unity developers can click here, while Unreal Engine developers should click here.

GameDev News


26. September 2018


Today at Oculus Connect 5, Oculus announced the upcoming Oculus Quest headset.  What makes this headset special is that it offers full six degree of freedom (6DoF) tracking, without base stations, while being completely wireless.  It slots into the lineup between the cheaper but less capable Oculus Go and the more expensive but wired Oculus Rift, which requires a full desktop PC to function.

Details of the Oculus Quest from the Oculus blog:

We’re excited to usher in the next era of VR gaming with the introduction of Oculus Quest, our first all-in-one VR gaming system. Oculus Quest will launch in Spring 2019 for $399 USD. Offering six degrees of freedom and Touch controllers, Oculus Quest makes it easy to jump right into the action—with no PC, no wires, and no external sensors. We have over 50 titles lined up for launch, with even more in the works including some of your favorite Rift games like Robo Recall, The Climb, and Moss.

Oculus Insight
We also unveiled Oculus Insight, our breakthrough technology that powers inside-out tracking, Guardian, and Touch controller tracking. This innovative system uses four ultra wide-angle sensors and computer vision algorithms to track your exact position in real time without any external sensors. Insight gives you a greater sense of immersion, presence, and mobility, plus the ability to go beyond room-scale. And we’ve brought over Guardian to help keep you safer while in VR. It’s easy to set up and experience whenever you want.

The Best VR Games Deserve the Best VR Controllers
With the same buttons, thumbsticks, and sensors that have defined VR gaming, our intuitive Touch controllers bring your real hands into VR and let you easily and naturally interact with the world around you. By shipping Oculus Quest with Touch, everything developers have learned about game design for Rift applies to Oculus Quest. Now you can enjoy the best that VR gaming has to offer, starting at $399 USD for a 64GB headset—with the convenience and portability of all-in-one VR.

Quality Meets Comfort
Oculus Quest includes the same best-of-class optics as Oculus Go with a display resolution of 1600x1440 per eye, while incorporating a lens spacing adjustment to help maximize visual comfort. And we’ve improved our built-in audio, so you get high-quality, immersive sound with even deeper bass.

GameDev News


15. April 2018


A couple of days back AppGameKit v2018.4.12 was released, with the major new feature being AR (Augmented Reality) support.  I decided to give the new AR functionality a shot, and it was really impressive how easy it was.  In order to get started with AR and AppGameKit you are going to need an AR compatible device.  On iOS this means an ARKit compatible device, which basically means an iPhone 6S or newer, while on Android you need an ARCore compatible device from this list of phones.


I modified the AR example slightly to remove a bit of functionality and to instead load a simple Tie Fighter model I downloaded off the web and converted to .X format.  AppGameKit can be coded using either C++ or its higher-level BASIC-like scripting language, which is what is used in this example.  Here is the slightly modified source code:

// set window properties
SetWindowTitle( "AR Tie Fighter" )
SetWindowSize( 1024, 768, 0 )

// set display properties
SetVirtualResolution( 1024, 768 )
SetOrientationAllowed( 1, 1, 1, 1 )
SetScissor(0,0,0,0)
SetClearColor( 101,120,154 )
SetGenerateMipmaps( 1 )
UseNewDefaultFonts(1)
SetPrintSize(20)

// camera range from 0.1 meters to 40 meters
SetCameraRange( 1, 0.1, 40 )
SetAmbientColor( 128,128,128 )
SetSunColor( 255,255,255 )

// load tie fighter
LoadObject( 1, "tie.x")
SetObjectPosition( 1, 0,0.1,0 )
LoadImage(1, "diffuse.jpg")
SetObjectImage (1,1,0) 
SetObjectRotation(1,270,0,0)

function ShowModel( show as integer )
  SetObjectVisible( 1, show )
endfunction

ShowModel( 0 )

function ScaleModel( amount as float )
  SetObjectScalePermanent( 1, amount, amount, amount )
endfunction

ScaleModel( 0.025 )

// create some planes to show detected surfaces, initially hidden
for i = 101 to 150
  CreateObjectPlane( i, 1,1 )
  SetObjectRotation( i, 90,0,0 )
  FixObjectPivot( i )
  SetObjectVisible( i, 0 )
  SetObjectColor( i, 255,255,255,128 ) // 50% transparent
  SetObjectTransparency( i, 1 )
next i

// add some buttons to control various features
AddVirtualButton( 1, 100,565,100 )
AddVirtualButton( 2, 100,665,100 )
SetVirtualButtonText( 1, "Scale +" )
SetVirtualButtonText( 2, "Scale -" )

AddVirtualButton( 3, 924,665,100 )
SetVirtualButtonText( 3, "Hide" )

function ShowHUD( show as integer )
  SetVirtualButtonVisible( 1, show )
  SetVirtualButtonVisible( 2, show )
  SetVirtualButtonVisible( 3, show )
  SetVirtualButtonActive( 1, show )
  SetVirtualButtonActive( 2, show )
  SetVirtualButtonActive( 3, show )
endfunction

// initialize AR, if possible
ARSetup()
while( ARGetStatus() = 1 )
  // wait while user is being prompted to install ARCore
  Sync()
endwhile

AnchorID as integer = 0
ShowPlanes as integer = 1
ambientScale# = 1.0

do
  // get light estimation
  ambient = ARGetLightEstimate() * 255 * ambientScale#
  SetAmbientColor( ambient,ambient,ambient )
  
  // check screen tap for plane hits, but only if buttons are visible
  if ( GetPointerReleased() and ShowPlanes = 1 )
    // check the point that the user tapped on the screen
    numHits = ARHitTest( GetPointerX(), GetPointerY() )
    if ( numHits > 0 )
      ShowModel( 1 )
      // delete any previous anchor, could keep it around instead
      if ( AnchorID > 0 ) then ARDeleteAnchor( AnchorID )
      // hit test results are ordered from closest to furthest
      // place the object at result 1, the closest
      AnchorID = ARCreateAnchorFromHitTest( 1 )
      ARFixObjectToAnchor( 1, AnchorID )
    else
      // if the user didn't tap on any planes then hide the object
      ShowModel( 0 )
    endif
    // clean up some internal resources
    ARHitTestFinish()
  endif
  
  // place the buttons at the edge of the screen
  // needs to be done regularly in case orientation changes
  SetVirtualButtonPosition( 1, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-210 )
  SetVirtualButtonPosition( 2, GetScreenBoundsLeft()+105, GetScreenBoundsBottom()-105 )
  SetVirtualButtonPosition( 3, GetScreenBoundsRight()-105, GetScreenBoundsBottom()-105 )
  
  // detect button presses if they are visible
  if ( ShowPlanes = 1 )
    if ( GetVirtualButtonPressed(1) )
      ScaleModel( 1.05 )
    endif
    if ( GetVirtualButtonPressed(2) )
      ScaleModel( 0.95 )
    endif
    if ( GetVirtualButtonPressed(3) )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 0 )
    endif
  else
    // screen tap whilst buttons are hidden shows them again
    if ( GetPointerReleased() )
      ShowPlanes = 1 - ShowPlanes
      ShowHUD( 1 )
    endif
  endif
  
  // hide old planes
  for i = 101 to 150
    SetObjectVisible( i, 0 )
  next i
  
  // show detected planes
  if ( ShowPlanes )
    numPlanes = ARGetPlanes(0)
    // this demo stops at 50 planes, but there is no internal limit
    if numPlanes > 50 then numPlanes = 50
    for i = 1 to numPlanes
      SetObjectPosition( i+100, ARGetPlaneX(i), ARGetPlaneY(i), ARGetPlaneZ(i) )
      SetObjectRotation( i+100, ARGetPlaneAngleX(i), ARGetPlaneAngleY(i), ARGetPlaneAngleZ(i) )
      SetObjectScale( i+100, ARGetPlaneSizeX(i), 1, ARGetPlaneSizeZ(i) )
      SetObjectVisible( i+100, 1 )
    next i
    ARGetPlanesFinish()
  endif
    
  if ( ShowPlanes )
    Print( "FPS: " + str(ScreenFPS()) )
    select( ARGetStatus() )
      case 2 :  Print( "AR Active" ) : endcase
      case -1 :  Print( "AR Not Available" ) : endcase
      case -2 :  Print( "AR Install Rejected" ) : endcase
    endselect
    Print( "Number of Planes Detected: " + str(numPlanes) )
    Print( "Light Estimation: " + str(ARGetLightEstimate()) )
    Print( "Light Boost: " + str(ambientScale#,1) )
  endif

  // draw the camera feed, and then the rest of the scene
  ARDrawBackground()
  Sync()
  RotateObjectLocalZ(1,1)
loop


You can see the results of this code and get a bit more detail by watching the video below:


If you are interested in learning more about AppGameKit, be sure to check out our Closer Look available here.

Programming


1. February 2018


As part of the Godot 3 release, Godot got official support for VR headsets through Cardboard, SteamVR and OpenHMD interfaces, implemented using the new GDNative functionality in Godot.  Today I decided to test it using my Samsung Odyssey HMD, a Windows Mixed Reality headset that has beta compatibility with SteamVR.  I personally had very little hope for things to go smoothly… boy was I wrong.  What follows is a step by step guide to using VR in Godot.  This whole process is made possible by the hard work of Bastiaan Olij; his Godot 3 OpenVR project is available here.


First, this guide assumes you are using Godot 3 or higher.  If you haven’t already installed it, go do so now.

Next, create a new project; the specifics really don’t matter.  There are a few requirements: every scene must have an ARVRCamera, and the camera must have an ARVROrigin as its parent.  I start with the following setup:

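For reference, a minimal scene tree satisfying those requirements might look like the following; the plain Spatial root node is just an assumption on my part, any node that can take a script will do:

Spatial            (root node, the script below is attached here)
 └─ ARVROrigin
     └─ ARVRCamera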


The ARVROrigin only has one property, the world scale.  The ARVRCamera has several more options, such as FoV, an Environment and more.  For now the defaults are fine.  Next we need to write a small bit of code to initialize the VR interface.  Attach a script to the root node and add the following code to _ready:

func _ready():
	# Ask the ARVRServer for the OpenVR interface provided by the godot-openvr module
	var vr = ARVRServer.find_interface("OpenVR")
	if vr and vr.initialize():
		# Render the main viewport to the HMD; OpenVR requires HDR to be turned off
		get_viewport().arvr = true
		get_viewport().hdr = false


And… done!  Really, that’s it.  Add a few objects to your scene under the ARVROrigin, plug in your headset and press play.  At this point your scene should render on your headset, and you should already have head tracking enabled!
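
If you would rather add a quick test object from code instead of the editor, a small sketch like this (assuming the root script from above and an origin node named ARVROrigin, both of which are my naming choices) would place a cube in front of the origin:

func _add_test_cube():
	# Purely illustrative: drop a cube under the ARVROrigin so there is something to look at
	var cube = MeshInstance.new()
	cube.mesh = CubeMesh.new()
	cube.translation = Vector3( 0, 1, -2 )  # up 1m, 2m in front of the origin
	$ARVROrigin.add_child(cube)

Call it from _ready() after the VR interface has been initialized.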


Next up, let’s go ahead and install the OpenVR functionality.  First, select the AssetLib tab in the Godot editor.



Now search for “VR” and select the OpenVR module.



Click the install button, then once it has downloaded, click install again.



Now click Install once again and the addons will be copied into your project, including all of the DLLs and scenes we need.


Next it’s time to implement some controller logic.  You could implement the controllers yourself using ARVRController (a small sketch follows at the end of this walkthrough), or you can let someone else do the hard work!  With the ARVROrigin selected, right click and select Instance Child Scene…



Navigate into the module we installed earlier, to the folder addons/godot-openvr/scenes, and select ovr_controller.tscn.




Next you can add default behavior to the controller you just created.  Right click the newly created controller node, instance a child scene, and this time select Function_Pointer.tscn.  Your scene should now have the pointer scene nested under the controller.



At this point you now have a 3D game with full head tracking and a single controller with pointer functionality.  Pretty awesome!  For even more functionality you can instance another controller, attach teleport controls to it, and you will have the ability to move around.  Finally, replace your camera with an ovr_first_person scene and presto, you’ve got a VR game!
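
If you do want to wire up a controller yourself with ARVRController, as mentioned above, a minimal sketch of a script attached to an ARVRController node (placed under the ARVROrigin) could look like the one below.  The controller_id value and which JOY_VR_* buttons your headset actually reports are the parts you should verify for your own hardware:

extends ARVRController

func _ready():
	# Track the first controller reported by the VR runtime (adjust as needed)
	controller_id = 1
	# React to button presses reported through the ARVR interface
	connect("button_pressed", self, "_on_button_pressed")

func _on_button_pressed(button):
	if button == JOY_VR_TRIGGER:
		print("Trigger pressed on controller ", controller_id)

func _process(delta):
	# The node's transform follows the physical controller automatically;
	# buttons can also be polled here instead of using the signal
	if get_is_active() and is_button_pressed(JOY_VR_GRIP):
		print("Grip held")

The bundled ovr_controller and Function_Pointer scenes take care of this kind of plumbing for you, along with rendering the controller model and pointer.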


If you’d prefer the video version, check here (or embedded below):

Programming


18. January 2018


Today Oculus announced a new developer program, Oculus Start.  If you are accepted to the program you can get a year of Unity Plus for free or a royalty free Unreal Engine license!  Additionally you get access to support directly from Oculus, as well as additional SDK and beta tools access.

From the Oculus blog:

Today, we're launching a developer program called Oculus Start aimed at providing access, support and savings to qualifying VR developers. We're launching this program to offload some of the development costs of qualified developers so that they can focus on what's really important - creating amazing VR applications. We know there is no shortage of inspired ideas and creative minds breaking ground in VR. Creativity isn't the barrier. Resources shouldn't be either. If your first app is underway, we can help you optimize for more success in this project and your next. Just as we're scaling VR through our devices, we're scaling support to the developer ecosystem.

We're accepting applications starting today!

To see if you qualify, fill in the appropriate info in the application form. Once submitted, we'll review your submission and get back to you shortly.

There are no fees or catches (but be sure to check out the important info and link to the governing terms). We simply want to help support developers on their VR journey and continue to build the VR future together.


The criteria to qualify for Oculus Start is as follows:

Oculus may approve your application in its sole discretion, and approval may be withheld or withdrawn without notice. You are eligible to participate if you: (a) have a valid email address; (b) are at least the age of majority in your jurisdiction of residence; (c) have never yourself or through a VR project received funding from a platform (e.g., without limitation, Oculus, Google, Microsoft, Valve, Steam, HTC), venture capital, or crowdsourcing over USD$10,000.00; (d) as of the date of application, must have published an app on the Oculus Store or another virtual reality platform; (e) have an Oculus developer account; and (f) if you are participating in connection with your VR work within an organization, that organization must be privately held. Limit of one (1) application per person and up to two (2) per organization. Each of up to two (2) developers working on VR projects within the same organization may apply to the program, however, no developer’s application will be accepted if they work with more than one (1) other developer on VR projects within an organization.

So then, what are the benefits of Oculus Start?  They are threefold: access, support and savings, as follows:

Access

A direct path to early tech and networking opportunities gets you going faster, first. Benefits may include:

Get developer kits for new and existing hardware.

Receive access to beta tools and services.

Gain new knowledge and bond with fellow developers at industry events like Oculus Connect.

Support

Oculus experts will help you troubleshoot and elevate your VR creations. Benefits may include:

Receive dedicated technical support.

Meet 1:1 with our veteran VR team at local events.

Connect with the community of VR developers to share your development experiences.

Savings

Oculus partnerships and network benefits will help offset development costs. Benefits may include:

Receive one year free Unity Plus license or a royalty free Unreal license.*

Get to know the Oculus Store better with Oculus wallet credits.

Learn more at the Oculus Start homepage.

GameDev News

