
6. June 2016

 

In a recent blog post, the Amazon developer team discussed the upcoming support for VR devices in Lumberyard 1.3. This support comes in two forms: supporting actual VR devices in your game, and using VR while developing your game. Both are provided in the form of “Gems”, which is basically Amazon’s way of saying plugin. To add support for a new VR device, you create a Gem that implements the IHMDDevice interface, acting as a bridge between Lumberyard and the device’s SDK.

Utilizing Gems, small chunks of code can be created that interact with the engine but don’t require editing the engine code itself. This means that developers can add support for any VR device without having to delve into the engine source. As long as a new VR device conforms to the public interfaces that Lumberyard has defined, the engine will automatically use it. Developers can create their own integrations for additional devices without having to wait for an official Lumberyard update, as they would in other engines. With so many new VR devices coming out soon, we wanted to provide a way for customers to make their own support decisions. Additionally, developers can easily override existing device support to add any experimental features that may be important for their gameplay. Below is a high-level diagram of the way this works inside the engine.

The HMDManager contains an IHMDDevice, which is then implemented by a device-specific Gem. The manager takes care of device initialization and device-abstracted head-mounted display (HMD) interaction with the rest of the system. On the rendering side, Lumberyard’s stereo renderer makes use of the D3DHMDRender object, which takes care of creating graphics-API-specific render targets, social screen rendering, and frame submission to the VR device. To add support for any new VR devices, you simply wrap the vendor-specific SDK in a Gem as defined by IHMDDevice. That’s it! There’s no need to edit Lumberyard’s underlying HMD code, which is represented by the Lumberyard Engine section of the diagram.

On engine startup, the selected HMDs are scanned for connectivity and selected for use. If you want to support both the Rift and the Vive, for example, simply go into the Project Configurator, enable both Gems, and the engine will pick which one to use at runtime based on which device is plugged in.

 

They also go on to describe the new VR developer functionality that will be part of Lumberyard 1.3:

Developing in VR

Game developers need to be able to see what they’re doing in the editor at all times. Without a way to see VR in the editor, developers would have to export a level, load it into the launcher, enable VR, and take a look around. This is obviously inefficient. The Lumberyard Beta 1.3 editor will have full VR Preview support built in. VR Preview utilizes the same Gems system as the engine runtime, and it works in a similar fashion. We’ve added the “VR Preview” button to the editor, which you can click to see in VR right away. This allows developers to make VR-specific adjustments to their level designs right in the editor, which reduces iteration time. Flow Graph nodes are an important part of developing in Lumberyard, but they can only be debugged in the editor. With VR Preview, users can debug their VR Flow Graph nodes and see what they’re doing.

The cool part of their implementation is that there is no performance penalty for enabling VR if VR functionality isn’t used, making this functionality “free” from a processing perspective.

 

So, what devices are supported? Well, until 1.3 ships the answer is unknown. They address it with this comment:

Rift and HTC Vive support were top requests (our demo was presented on the Rift), but many developers were just as interested in other devices, like the Samsung GearVR, PSVR, and OSVR.

But they never actually state which Gems will ship with the 1.3 release, meaning it might be left to developers to implement the various VR headset SDKs themselves.

GameDev News

6. June 2016

 

V8 is a popular embeddable JavaScript engine, perhaps most famously used to power the Node.js development framework and as the JavaScript engine in Chrome. The team just announced the release of version 5.2. From the release notes, V8 gained the following features:

ES6 & ES7 support

V8 5.2 contains support for ECMAScript 6 (aka ES2015) and ECMAScript 7 (aka ES2016).

Exponentiation operator

This release contains support for the ES7 exponentiation operator, an infix notation to replace Math.pow.

let n = 3**3; // n == 27
n **= 2; // n == 729

Evolving spec

For more information on the complexities behind support for evolving specifications and continued standards discussion around web compatibility bugs and tail calls, see the V8 blog post ES6, ES7, and beyond.

Performance

V8 5.2 contains further optimizations to improve the performance of JavaScript built-ins, including improvements for Array operations like the isArray method, the in operator, and Function.prototype.bind. This is part of ongoing work to speed up built-ins based on new analysis of runtime call statistics on popular web pages. For more information, see the V8 Google I/O 2016 talk and look for an upcoming blog post on performance optimizations gleaned from real-world websites.

V8 API

Please check out our summary of API changes. This document gets regularly updated a few weeks after each major release.

GameDev News

6. June 2016

 

As you may recall, last week Unity announced some pretty major price changes. The response has not exactly been great, especially among those who purchased their Pro license outright. Yesterday, Unity co-founder and CTO Joachim Ante released this blog post explaining the move to a subscription model:

Why Subscription?

When we started Unity, we would ship Unity every once in a while on just 2 platforms. Initially just Aras and I, gradually adding a couple engineers every few months. We’d decide on a couple major features and focus working on that for a year and a bit, go through beta and then ship it.

Today Unity lets you target 28 platforms. No one targets all platforms at the same time, but the ability to choose to easily switch your game to any platform gives Unity developers incredible advantages.

Each platform is supported by a team of dedicated engineers. We have teams focused on different areas of the engine, working on improving each major area all the time.

We ship a patch release every week. Supported by the awesome Sustained Engineering team.

We ship point releases with major new features and improvements multiple times per year.

All of this is necessary because the platforms we support rapidly change. In today’s world, we can’t leave customers behind for a year because we are in the process of releasing a major version. We think it would be very bad for Unity developers if we held features for a full number release, rather than launch these features along the way, when they are ready.

With this in mind, we want to be clear. There will be no major Unity 6 release.

In the dev team we wanted to stop doing major releases for a long time. With the major releases model we had done up until Unity 5, it has always forced us to bundle up a bunch of features and release them in one big splash. Usually it results in that good & complete features would be artificially held back for a long time while other features are still maturing, and eventually releasing some of these features before they are ready. All in the name of creating one big splashy release that customers feel is worth upgrading to. It’s what we did because we had to in a model where we worked toward an unnatural new major release every few years. This is not some evil marketing team pushing for it, it is the inherent nature of that business model. It was always a painful process for us and you and it really serves no one.

With our switch to subscription we can make Unity incrementally better, every week. When a feature is complete, we will ship it. If it is not ready we will wait for the next point release.

Our switch to subscription is absolutely necessary in order for us to provide a robust and stable platform.

Pay to own!

Along with the new subscription model, we are introducing “pay to own”. After having paid for 24 months of subscription, you can stop paying and keep on using the version you have at that point. Of course, you would also stop getting new features, services or fixes; choice is yours.

If you are upgrading from a previously bought perpetual license of Unity and you are switching to subscription after March 2017,  then you get “pay to own” right away with your subscription license.

Pay to own applies to everyone; there’s no special “license option” you have to get. Simple!

Thanks for listening, I hope this gives some much needed background on our switch to subscription.

 

In some ways this move makes sense. Both Unreal and Unity have moved to a more rapid release schedule, making 1.0 releases somewhat of a thing of the past. The problem for Unity is that they are still selling software using a version-by-version model; Unreal obviously doesn’t have this issue, as its revenue is royalty based. A quick look through the comments in response to this post shows that the community isn’t exactly mollified at this point! At first glance the Pay To Own license sounds like a good deal, but all it is really saying is that after 2 years of paying licensing fees you get a perpetual license for that version (and no further updates without a subscription). Considering you could previously buy Unity outright for $1,500, “owning” it after 24 payments of $125 ($3,000) is only a deal if you are using all three versions; otherwise it’s a doubling of the price.

 

A point that might be somewhat confusing is “There will be no major Unity 6 release.” This is some truly horrible wording and is incredibly misleading. Yes, there will be a Unity 6; it just has absolutely no impact on licensing. All subscriptions from Unity are now time-limited, not release-oriented.

GameDev News

6. June 2016

 

LWJGL (Lightweight Java Game Library), a set of Java bindings for several low-level libraries (OpenGL, OpenGL ES, Vulkan, OpenCL, GLFW, etc.), just released version 3.0.0 after 3 years in development. This is, for example, the underlying technology used by LibGDX for desktop support. This release brings several changes, including:

BINDINGS

  • Added support for Java array parameters and HotSpot Critical Natives. (#175)
  • Added Vulkan bindings. (#50)
  • Added NanoVG bindings. (#99)
  • Added NativeFileDialog bindings.
  • Added par_shapes.h bindings.
  • Added dyncall bindings.
  • Added jawt bindings for AWT/Swing integration. (#125)
  • Added simple OS-specific window creation bindings, for custom window/context creation. (#105)
  • Added missing OpenCL and OpenAL extensions.
  • Fully documented OpenCL and OpenAL.
  • Moved WGL and GLX capabilities to the new WGLCapabilities and GLXCapabilities classes, respectively. Functionality in WGL, GLX and corresponding extensions that does not require a current context can now be used without creating a dummy context first. (#171)

IMPROVEMENTS

  • Added stack allocation APIs (the MemoryStack class and new allocation methods in struct classes and MemoryUtil).
  • Made the implementations of PointerBuffer and Struct/StructBuffer subclasses as lightweight as possible. This makes it easier for escape analysis to eliminate allocations.
  • Minor struct API improvements.
  • Added nullability information to struct members, to protect against buggy code crashing the JVM.
  • All bindings are updated to the latest versions of the corresponding libraries. Notably, GLFW now has glfwSetWindowIcon and glfwSetWindowMonitor, so it no longer lacks anything compared to LWJGL 2's Display.
  • Refactored callbacks for Java 8. (#182)
  • Added NativeResource interface and made freeable objects usable as resources in try-with-resources statements. (#186)
  • Faster thread-local lookups for the stack and current capabilities. New options in Configuration can be used to completely eliminate thread-local lookup in OpenGL, OpenGL ES and OpenAL, when it is known that only a single context will be used, or that all contexts will be compatible (same capabilities and same function pointers).
  • Added memSlice for all buffer types in MemoryUtil. (#179)
  • Refactored the Configuration class for type safety and added more options.
  • JDK 9 can now be used to build and run LWJGL.
  • Javadoc is now generated with JDK 9. The API is fully indexed and search functionality is available. Also made multiple Javadoc formatting improvements.
  • Improved debug diagnostics on startup and when loading the LWJGL shared library fails.
  • Optimized memSet and memCopy for small buffers.

FIXES

  • Stopped using UPX compression for binaries. This eliminates various integration issues and virus scanning false-positives.
  • The SharedLibraryLoader now works with any shared library, not only libraries LWJGL knows about. (#176)

BREAKING CHANGES

  • LWJGL now requires Java 8 to build and run. Certain custom interfaces have been replaced with java.util.function interfaces. (#177)
  • Dropped support for Linux x86. (#162)
  • Dropped libffi bindings.
  • Dropped ALDevice/ALContext wrappers from OpenAL and CLPlatform/CLDevice wrappers from OpenCL. (#152)
  • Dropped the getInstance() method from bindings loaded from shared libraries. Function pointers are now stored either in capabilities classes or in a nested Functions inner class.
  • Dropped infrequently used method overloads in bindings. Full javadoc is now generated on (almost) all overloads.
  • Dropped utility classes that were not useful.
  • Added AutoSize support to struct members. Instance setters for the corresponding count/size members were removed to avoid bugs and confusion.
  • Replaced MemoryUtil.memFree(StructBuffer) with StructBuffer.free().
  • Renamed __ALIGNMENT to ALIGNOF in struct classes.
  • Removed org.lwjgl.system.Retainable interface. Closure and FunctionProvider subclasses are now destroyed using .free() instead of .release().
  • Moved xxHash and SSE bindings to the org.lwjgl.util package.
  • Integer-boolean native types (0 or 1 are the only legal values) are now mapped to Java booleans. (#181)
  • Macros without parameters are now generated as static final values, not methods.

You can read more about the release here.

GameDev News

6. June 2016

 

The following is a recap of major events in the world of game development for the week ending June 5th, 2016. I do a weekly video recapping the news, available here, with this week’s video embedded below. This post is a collection of the links mentioned in the recap.

 

The Video

GameDev News


Taking a look at HaXe



15. April 2013

So last week I decided to run a poll to see which gaming technology people would be most interested in, and the results actually shocked me:

 

 

Haxe narrowly edged out LibGDX (by two votes), while my original plan of HTML5 came in a distant third. The “Other” category seemed to be mostly composed of people interested in MonoGame.

 

I have long been a fan of C# and XNA, so MonoGame was an obvious option. It was (and is) discounted for a couple of reasons. First is the inability to target the web; it's a shame Microsoft put a bullet in Silverlight, as otherwise this limitation wouldn't exist. Second, paying $300 to deploy on iOS and another $300 to deploy on Android is a bit steep. LibGDX suffers from this to a degree too, as you need Xamarin to target iOS. Hopefully in time this changes.

 

Hello Haxe World

So over the last couple of days I've taken a closer look at Haxe, or more specifically Haxe + NME. Haxe as a programming language is an interesting choice. It has an ActionScript-like syntax and, like Java, compiles to its own virtual machine (the VM is called Neko, so if you are wondering what Neko is… well, now you know!). If that were all Haxe did, we wouldn't be having this conversation. More interestingly, it also compiles to a number of other targets, including JavaScript (for HTML5), ActionScript (Flash) and C++. As a result, you can compile to C++ for Android and iOS and get native performance, which often makes Haxe many times faster than ActionScript.
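To give a sense of what the language looks like, here is a minimal sketch of my own (not taken from the Haxe documentation): a single class with a static main entry point and the ActionScript-style syntax mentioned above.

// Main.hx - a minimal Haxe program; trace() writes to the appropriate output on each target
class Main {
    static public function main():Void {
        var greeting:String = "Hello from Haxe";
        trace(greeting);
    }
}

The interesting part is that this one file can be built for several targets just by changing compiler flags, for example haxe -main Main -js main.js for JavaScript, haxe -main Main -swf main.swf for Flash, or haxe -main Main -cpp bin for native C++. NME wraps all of this for you, as described below.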

 

There is, however, a serious flaw with Haxe. Let's take a quick look at part of the Haxe API and you will see what I mean:

 

[Screenshot: the Haxe API package list, with the cpp package expanded]

 

Notice how there are a number of language-specific APIs, such as cpp, which I've expanded above. So, what happens if you use the cpp Random API and want to target Flash? The short answer is, you can't. Of course you could write a great deal of code full of "if platform X, do this, otherwise do that" branches (see the sketch below), but you will quickly find yourself writing the same application for each platform. So, what do you do if you want to write for multiple platforms with a single code base?
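For illustration, here is a rough sketch (my own, with placeholder return values rather than real cpp.Random calls) of the kind of per-target branching Haxe's conditional compilation gives you. It works, but every branch is one more piece of the application you have to write and maintain per platform.

// Conditional compilation: each #if / #elseif branch is resolved at compile
// time, so only the code for the current target is ever compiled.
class PlatformInfo {
    public static function describe():String {
        #if cpp
        return "C++ target - the cpp.* packages are available here";
        #elseif flash
        return "Flash target - the flash.* packages are available here";
        #elseif js
        return "JavaScript/HTML5 target";
        #else
        return "Some other target";
        #end
    }
}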

 

Meet NME

This is where NME comes in. NME builds on top of the Haxe programming language and provides a Flash-like API on top. Most importantly, it supports Windows, Mac, Linux, iOS, Android, BlackBerry, webOS, Flash and HTML5 using a single code base. Let's take a quick look at part of the NME libraries to show you what I mean:

[Screenshot: the NME API package list]

 

There are still platform- and language-specific libraries, like the neko.vm package above, but there are also the nme.* libraries. These provide a Flash-like API for the majority of programming tasks needed to make a game. Target your code at these libraries, minimizing the use of the native packages, and you can support a number of different platforms with a single code base, as the sketch below shows.
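As a rough sketch of what that Flash-like API looks like (modeled loosely on the DisplayingABitmap sample used later in this post; the asset path is just a placeholder declared in the project file), an NME application is typically a class extending Sprite that builds a display list:

import nme.Assets;
import nme.display.Bitmap;
import nme.display.Sprite;

// A minimal NME entry class: the same display-list code runs on the
// native (C++), Flash and HTML5 targets.
class Main extends Sprite {
    public function new() {
        super();
        // load a bitmap from the project's assets and add it to the display list
        var bitmap = new Bitmap(Assets.getBitmapData("assets/nme.png"));
        bitmap.x = 10;
        bitmap.y = 10;
        addChild(bitmap);
    }
}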

 

There is another aspect to NME that makes life nice. If you read my review of the Loom Game Engine, you may recall I was a big fan of the command-line interface. NME has a very similar interface. Once you've installed it, creating a new project from one of the samples is as simple as typing nme create samplename, and then you can test it by running nme test platform.

 

Here is an example of creating the DisplayingABitmap sample and running it as an HTML5 project:

[Screenshot: creating the DisplayingABitmap sample with nme create]

 

And your browser will open:

[Screenshot: the DisplayingABitmap sample running as HTML5 in the browser]

 

What makes this most impressive is that when you target C++, it fires off the build for you using the included toolchain. Configuring each platform is as simple as nme setup platformname; valid platforms are currently windows, linux, android, blackberry and webos. Unfortunately Xcode can't be installed automatically, so this process won't work for iOS or Mac; you need to set up Xcode yourself. As you can see, setting up and running NME/Haxe is quite simple, and it worked perfectly for me out of the box. If you are curious which sample names are available, you can get them from the GitHub directory. There are a fair number of samples to start with.

 

Now is when things take a bit of a turn for the worse… getting an IDE up and running. This part was a great deal less fun for me. You can get basic syntax highlighting and autocompletion working in a number of different IDEs; here is the list of options. This is where things got rather nasty. I was working on MacOS, so this presented a bit of a catch. I tried getting IntelliJ to work. Adding the plugin is trivial, although for some reason you need to use the zip version; it's not in the plugin directory. Configuring the debugger though… that's another story. I spent a couple of hours googling, and sadly only found information a year or two old.

 

Then I tried MonoDevelop; there is a plugin available, and it's supposed to be a great option on MacOS. And… MonoDevelop 3 is no longer available. It's now Xamarin Studio 4, and the plugin doesn't work anymore. Good luck getting a download for MonoDevelop 3! There is also FDT, which I intend to check out, but it's built on top of Eclipse and I HATE Eclipse. Eventually I got IntelliJ Flash debugging to work, but it was a great deal less fun.

 

After this frustrating experience, I rebooted into Windows and things got a TON better. FlashDevelop is easily the best option for developing with Haxe, but sadly it's only available on Windows. There was, however, a major catch… debugging simply did not work. After some digging, it turns out you have to run the 32-bit JDK or it doesn't work. Seriously, in this day and age, still having Java VM problems like this is just insane. Once I got that licked, I was up and running.

 

At this point I have a working development environment up and running and can get to coding. If you are working on Windows using FlashDevelop, you can get up and running very easily, so long as you are running 32-bit Java. On MacOS though, expect a much rockier experience. It would be great if FlashDevelop could be ported to Mac, but apparently it can't be… there have been a number of attempts. They have, however, provided a configuration for working in virtualized settings (VMware, Parallels, etc.).

 

Stay tuned for some code-related posts soon.

