
9. February 2016

 

It’s not every day that there is a new player in the AAA game space, but that’s exactly what just happened with the release of Lumberyard by Amazon.  Amazon has been getting more and more involved with gaming, launching their own game studio and purchasing Double Helix Games back in 2014.  Their cloud computing services, AWS (and more specifically EC2 and S3), have proven incredibly popular with game developers, providing the networking back end for companies such as Rovio and Ubisoft.  Today, however, they made a much bigger splash with the release of a complete game engine, Lumberyard.

Now Lumberyard isn’t actually a brand new engine; it appears to be a mashup of a number of technologies, including CryEngine, in-house tools created by Double Helix Games, and cloud services from AWS, specifically the new Amazon GameLift service, which is described as:

Amazon GameLift, a managed service for deploying, operating, and scaling session-based multiplayer games, reduces the time required to build a multiplayer backend from thousands of hours to just minutes. Available for developers using Amazon Lumberyard, Amazon GameLift is built on AWS’s highly available cloud infrastructure and allows you to quickly scale high-performance game servers up and down to meet player demand – without any additional engineering effort or upfront costs.

Lumberyard will also feature Twitch integration and, perhaps most interestingly, launch with support, both free in forum and tutorial form and in a paid form, something that is often lacking.  The Lumberyard tools only run on Windows 7, 8, and 10, while the supported targets at launch are Windows, PS4, and Xbox One.  Of course, a developer license is required to target either console.  About the technical bits of Lumberyard:

The Lumberyard development environment runs on your Windows PC or laptop. You’ll need a fast, quad-core processor, at least 8 GB of memory, 200 GB of free disk space, and a high-end video card with 2 GB or more of memory and Direct X 11 compatibility. You will also need Visual Studio 2013 Update 4 (or newer) and the Visual C++ Redistributables package for Visual Studio 2013.

The Lumberyard Zip file contains the binaries, templates, assets, and configuration files for the Lumberyard Editor. It also includes binaries and source code for the Lumberyard game engine. You can use the engine as-is, you can dig in to the source code for reference purposes, or you can customize it in order to further differentiate your game. The Zip file also contains the Lumberyard Launcher. This program makes sure that you have properly installed and configured Lumberyard and the third party runtimes, SDKs, tools, and plugins.

The Lumberyard Editor encapsulates the game under development and a suite of tools that you can use to edit the game’s assets.

The Lumberyard Editor includes a suite of editing tools (each of which could be the subject of an entire blog post) including an Asset Browser, a Layer Editor, a LOD Generator, a Texture Browser, a Material Editor, Geppetto (character and animation tools), a Mannequin Editor, Flow Graph (visual programming), an AI Debugger, a Track View Editor, an Audio Controls Editor, a Terrain Editor, a Terrain Texture Layers Editor, a Particle Editor, a Time of Day Editor, a Sun Trajectory Tool, a Composition Editor, a Database View, and a UI Editor. All of the editors (and much more) are accessible from one of the toolbars at the top.

In order to allow you to add functionality to your game in a selective, modular form, Lumberyard uses a code packaging system that we call Gems. You simply enable the desired Gems and they’ll be built and included in your finished game binary automatically. Lumberyard includes Gems for AWS access, Boids (for flocking behavior), clouds, game effects, access to GameLift, lightning, physics, rain, snow, tornadoes, user interfaces, multiplayer functions, and a collection of woodlands assets (for detailed, realistic forests).
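The Gems system described above boils down to selective inclusion of prebuilt modules in the final binary. Here is a rough, purely illustrative Python sketch of the idea; Lumberyard’s actual Gems are C++ packages configured through the engine’s own tooling, and the `AVAILABLE_GEMS` table and `build_manifest` function below are invented for illustration:

```python
# Hypothetical sketch of a "Gems"-style modular packaging system:
# only the gems a project enables get pulled into the build.
# Gem names and library filenames here are invented examples.

AVAILABLE_GEMS = {
    "AWS": "aws_gem.lib",
    "Boids": "boids_gem.lib",
    "GameLift": "gamelift_gem.lib",
    "Rain": "rain_gem.lib",
    "UI": "ui_gem.lib",
}

def build_manifest(enabled_gems):
    """Return the libraries to link for the gems a project enables."""
    unknown = [g for g in enabled_gems if g not in AVAILABLE_GEMS]
    if unknown:
        raise ValueError(f"unknown gems: {unknown}")
    return [AVAILABLE_GEMS[g] for g in enabled_gems]

# Enable just flocking and rain; nothing else ends up in the binary.
print(build_manifest(["Boids", "Rain"]))
```

The appeal of this approach is that a game only pays the size and complexity cost of the features it actually uses.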

Coding with Flow Graph and Cloud Canvas
Traditionally, logic for games was built by dedicated developers, often in C++ and with the usual turnaround time for an edit/compile/run cycle. While this option is still open to you if you use Lumberyard, you also have two other options: Lua and Flow Graph.

Flow Graph is a modern and approachable visual scripting system that allows you to implement complex game logic without writing or modifying any code. You can use an extensive library of pre-built nodes to set up gameplay, control sounds, and manage effects.

Flow graphs are made from nodes and links; a single level can contain multiple graphs and they can all be active at the same time. Nodes represent game entities or actions. Links connect the output of one node to the input of another one. Inputs have a type (Boolean, Float, Int, String, Vector, and so forth). Output ports can be connected to an input port of any type; an automatic type conversion is performed (if possible).
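The node/link/port model described above is easy to picture with a small sketch. This is purely illustrative Python, not Lumberyard code; the `CONVERTERS` table and `send` function are invented to show the automatic type-conversion idea:

```python
# Minimal sketch (not Lumberyard's implementation) of typed flow-graph
# ports: a link carries a value from an output port to an input port,
# coercing it to the input port's type when a conversion exists.

CONVERTERS = {
    ("Int", "Float"): float,
    ("Float", "Int"): int,
    ("Int", "String"): str,
    ("Float", "String"): str,
    ("Bool", "Int"): int,
}

def send(value, out_type, in_type):
    """Deliver a value across a link, converting types if needed."""
    if out_type == in_type:
        return value
    converter = CONVERTERS.get((out_type, in_type))
    if converter is None:
        raise TypeError(f"no conversion from {out_type} to {in_type}")
    return converter(value)

print(send(3, "Int", "Float"))    # 3.0
print(send(2.7, "Float", "Int"))  # 2
```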

There are over 30 distinct types of nodes, including a set (known as Cloud Canvas) that provide access to various AWS services. These include two nodes that provide access to Amazon Simple Queue Service (SQS), four nodes that provide access to Amazon Simple Notification Service (SNS), seven nodes that provide read/write access to Amazon DynamoDB, one to invoke an AWS Lambda function, and another to manage player credentials using Amazon Cognito. All of the game’s calls to AWS are made via an AWS Identity and Access Management (IAM) user that you configure in Cloud Canvas.

Finally we come to price.  Lumberyard is free*.  I say free* instead of free because of course there is a catch, but an incredibly fair one in my opinion.  If you use Lumberyard, your game’s servers have to run either on Amazon’s servers or on your own; basically, you can’t use Lumberyard and then host on a competitor such as Azure or Rackspace.  Pricing is always a bit tricky when it comes to Amazon services, but unlike Google, they have never once screwed their user base (Google once jacked up prices by an order of magnitude, overnight, forever souring me on their technology), so you are pretty safe in this regard.  More details on pricing:

Amazon GameLift is launching in the US East (Northern Virginia) and US West (Oregon) regions, and will be coming to other AWS regions as well. As part of AWS Free Usage tier, you can run a fleet comprised of one c3.large instance for up to 125 hours per month for a period of one year. After that, you pay the usual On-Demand rates for the EC2 instances that you use, plus a charge for 50 GB / month of EBS storage per instance, and $1.50 per month for every 1000 daily active users.
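To put rough numbers on that pricing model, here is a back-of-the-envelope estimate in Python. Only the 50 GB of EBS per instance and the $1.50 per 1,000 daily active users figures come from the announcement; the EC2 and EBS rates below are placeholder values (real On-Demand rates vary by region and change over time):

```python
# Rough monthly cost estimate for a GameLift fleet past the free tier.
# EC2_HOURLY_RATE and EBS_GB_MONTH_RATE are placeholder assumptions,
# not quoted prices; check current AWS pricing before relying on this.

EC2_HOURLY_RATE = 0.105   # placeholder c3.large On-Demand $/hour
EBS_GB_MONTH_RATE = 0.10  # placeholder $/GB-month
DAU_RATE_PER_1000 = 1.50  # from the announcement

def monthly_cost(instances, hours_per_instance, daily_active_users):
    compute = instances * hours_per_instance * EC2_HOURLY_RATE
    storage = instances * 50 * EBS_GB_MONTH_RATE  # 50 GB EBS / instance
    dau = (daily_active_users / 1000) * DAU_RATE_PER_1000
    return compute + storage + dau

# Two instances running all month (~720 hours) with 10,000 DAU:
print(round(monthly_cost(2, 720, 10_000), 2))  # 176.2
```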

I intend to look closer at the Lumberyard game engine as soon as possible, so expect a preview, review or tutorial shortly.

GameDev News

8. February 2016

 

This story comes care of /r/gamedev: BDX has released version 0.2.3.  BDX is a game engine hosted inside Blender, using LibGDX and Java for game programming.  Essentially it enables you to define and create your game in Blender, including complete physics integration, while generating LibGDX code.  I did a pretty in-depth tutorial on working with BDX a while back.

In this release:

Here's a short change-log:

  • Per-pixel sun, point, and spot lighting. As it was before, you can simply create the lights in Blender to have them show up in-game, or spawn them during play.
  • Ability to turn off per-pixel lighting for lower-spec targeted platforms and devices.
  • Improvements to the profiler.
  • GameObjects can now switch the materials used on their mesh. You can specify the name of a material available in the scene in Blender, or you can directly provide a LibGDX material to use, in case you have one custom-made.
  • Various fixes and QOL improvements.

Check it out! We could always use some more feedback and testing.

It’s a cool project, and if you are working in Blender and LibGDX, it is certainly something you should check out!

GameDev News

8. February 2016

 

The roadmap for the Atomic Game Engine, which we looked at late last year, was just released and highlights upcoming developments for the engine.

2016 Roadmap

DISCLAIMER: As with most roadmaps, this one is subject to change. This is a snapshot of current planning and priorities, things get moved around, opportunities happen, etc. It is also not “complete”

  1. New WebSite - We need a new website, badly. The main page and landing video have not been updated since the initial March 4th Early Access!
  2. New User Experience, documentation and tutorial videos
  3. Improved iOS/Android deployment with support for shipping on App Store/Google Play. We also plan on publishing a mobile iOS/Android example
  4. Continued work on editor asset pipeline, scene editor, etc
  5. WebGL improvements, there is a lot going on currently with WebGL and we need to update the build and provide a means to communicate with the page JavaScript
  6. Script debugging with breakpoints, callstacks, locals, etc, including on device
  7. First class TypeScript support with round trip code editing, compiling, debugging
  8. Basic Oculus Rift support (Q2)
  9. Multiple top level windows for the Atomic Editor
  10. Improvements to the new Chromium WebView API
  11. Examples, examples, examples, including a bigger “full game” example
  12. Animation Editor
  13. Evaluate lightmap generation with Blender cycles
  14. The things that need to happen, or are under NDA, and are not listed on this roadmap :)

In addition to the roadmap, a thorough history of the engine and the people behind it is available here.

GameDev News

5. February 2016

 

While this one is certainly skirting the border of game development news, as almost no games were developed for Firefox OS, the death of an OS generally merits at least a few lines of text. Started in 2011, with the first Firefox OS device shipping in 2013, FF OS was designed as a Linux kernel that booted into the Firefox browser.  Very similar in scope to Chrome OS, it achieved even less success than Chrome OS, which itself is on life support these days. Even the idea of a mobile OS built around a web stack wasn’t new: Palm, then HP, attempted it with webOS, another mobile HTML5-powered operating system that has joined the operating system graveyard.

I don’t know about you, but to me this one seemed to be a non-starter from the beginning.  It’s actually a bit of a shame too, as iOS is getting worse with every release and frankly I trust Apple or Google about as far as I can throw a transport truck, so a viable free and open option in the OS space would certainly have been a good addition.

From the announcement on Mozilla:

Dear Mozillians,

The purpose of this email is to share a follow up to what was announced by Ari Jaaksi, Mozilla’s SVP of Connected Devices, in early December -- an intent to pivot from “Firefox OS” to “Connected Devices” and to a focus on exploring new product innovations in the IoT space. We’re sharing this on behalf of Ari and the Connected Devices leadership group.

In particular, there are a few decisions that we want to share along with what will happen next. We’ll elaborate more below, but let us start by being very clear and direct about 4 decisions that have been made:

  1. We will end development on Firefox OS for smartphones after the version 2.6 release.
  2. As of March 29, 2016, Marketplace will no longer accept submissions for Android, Desktop and Tablet, and we will remove all apps that don’t support Firefox OS. Firefox OS apps will continue to be accepted into 2017 (we have yet to finalize a date for when we won’t continue accepting these apps).
  3. The Connected Devices team has been testing out a new product innovation process with staff, 3 products have passed the first “gate” and many more are in the pipeline. Having multiple different product innovations in development will be the approach moving forward, and we’re hoping to open up the formal process to non-staff participation in the first half of the year.
  4. The foxfooding program will continue and will focus on these new product innovations (rather than improving the smartphone experience). We expect the Sony Z3C foxfooding devices to be useful in this, but we expect it to take until the end of March to figure out the specific design of this program.

Obviously, these decisions are substantial. The main reason they are being made is to ensure we are focusing our energies and resources on bringing the power of the web to IoT. And let’s remember why we’re doing this: we're entering this exciting, fragmented space to ensure users have choice through interoperable, open solutions, and for us to act as their advocates for data privacy and security.

RIP Firefox OS.

GameDev News

5. February 2016

 

I have to say this one is pretty cool and possibly not what you think.  The Unreal Engine is getting VR support.  That’s not as in support for VR platforms; it already has that.  No, the Unreal Editor is soon going to have support for the HTC Vive and Oculus Rift, enabling you to edit and create your game in VR, making full use of both platforms’ motion controllers.  This is a while out, however, as they aren’t even announcing a release date until March 16th at GDC.

From the Unreal blog:

The Unreal Editor is up and running in VR, so you can build VR content in VR. Using the Oculus Touch and HTC Vive motion controllers, your movement in the real world is mapped one-to-one in VR; you can reach out, grab, and manipulate objects just as you would in real life. You already know how to use this tool, because it works like the world works.

These are the early days of the revolution in immersive VR content creation, but we’re so excited about what’s up and running that we couldn’t keep it a secret anymore!  VR movement and editing controls are functional, along with key parts of the Unreal Editor UI, including the Details Panel and the Content Browser.  We’ll be showing more and announcing the release date at GDC on Wednesday March 16, 2016.  And when it’s released, it will be a built-in feature of the freely-downloadable Unreal Engine, with full source on GitHub.

Best of all, this isn’t a limited mode for VR preview and tweaking.  It is the Unreal Editor, now running in VR. The same Unreal Editor that’s used by everyone ranging from indies and mod makers to triple-A development teams with $100,000,000 budgets. And it runs in VR!

A BOX OF TOYS

You start out in the VR editor at a human scale, and can directly manipulate objects by moving around in a room-scale VR setting.  But you can also use a smartphone-like pinching motion to zoom in and out. With one pinch, the world is shrunk to the size of a Barbie Doll house on your table. You can manipulate it granularly and ergonomically, and then zoom back to human scale.

Besides directly manipulating objects, you also have a laser pointer. Point at a far-away object and you can move it around, or “reel it in” like a fishing rod. Or teleport to the laser pointer’s target location with a single button click, inspired by Bullet Train’s locomotion.

THE VR USER INTERFACE: IPAD MEETS MINORITY REPORT

As a pro tool, the Unreal Editor features a rich 2D user interface, and it’s being rolled out naturally in VR: One button-press places an iPad-like tablet in your hand, and you use the other hand to interact with the tablet.  Scroll, press buttons, tweak Object Details, interact with menus, drag objects out of the Content Browser and drop them directly in the world.

It’s an intuitive way to place a 2D user interface in a VR world that builds on everyone’s Unreal Editor experience, and the underlying Slate user-interface framework provides a great foundation we’ll build on as we work to roll out the entire Unreal Editor UI in VR.

Content Browser

PRODUCTIVITY

As game developers, we at Epic pride ourselves in creating high-productivity tools optimized for shipping products, and VR editing provides a great path forward.

With a mouse, several operations are often required to transform an object along multiple axes in 3D.  In VR, you can frequently accomplish the same result with a single, intuitive motion.  This should come as no surprise, as a mouse only tracks two degrees of movement (X and Y), but in VR your head and two hands track six degrees of freedom each: X, Y, Z, and three rotational axes. That’s 9 times the high-fidelity input bandwidth!
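Stepping out of the quote for a second, the input-bandwidth arithmetic in that last paragraph is easy to verify:

```python
# Checking the quote's math: a mouse tracks 2 degrees of freedom,
# while a tracked head plus two motion controllers track 6 each.

MOUSE_DOF = 2
TRACKED_DEVICES = 3    # head + two hands
DOF_PER_DEVICE = 6     # X, Y, Z plus three rotational axes

vr_dof = TRACKED_DEVICES * DOF_PER_DEVICE
print(vr_dof)                 # 18 degrees of freedom in VR
print(vr_dof // MOUSE_DOF)    # 9 times the mouse's input bandwidth
```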

 

More details are available here.  It will be interesting to see if this is useful or just a gimmick.  Obviously, working on a VR game in full VR certainly has its advantages.  At the end of the day though, few control schemes actually usurp the mighty mouse and keyboard.  Add to that the fact that, at least with my Gear VR, it’s tiring both physically and on the eyes after a couple of hours.  I can’t imagine doing the 9-to-5 routine with one of these devices strapped to your head.  Time will tell, I suppose.

GameDev News
