4. January 2016

 

I was recently working with a tool that exported its level data as a directory full of OBJ files, literally hundreds of them.  You can import them into Blender using File->Import->Obj, but unfortunately there is no way to do a select all.  Apparently in Blender 2.4x, holding SHIFT while selecting OBJ import would import an entire directory, but this doesn’t appear to work in modern Blender.  You can SHIFT+Click to select multiple files, but this gets tedious when you have hundreds of them, and unfortunately CTRL+A doesn’t work…

 

Thankfully Blender is extremely scriptable, so let’s turn to Python for a solution.  The following script will import a directory full of OBJ files into the current scene.

import bpy
import os

# Recursively yield the full path of every file under the given directory
def fileList(path):
    for dirpath, dirnames, filenames in os.walk(path):
        for filename in filenames:
            yield os.path.join(dirpath, filename)

# Import each OBJ file into the current scene; note the operator
# requires the keyword argument filepath=
for f in fileList("C:\\file\\path\\here\\"):
    if f.lower().endswith(".obj"):
        bpy.ops.import_scene.obj(filepath=f)


Be sure to change the path to your own directory and, if on Mac OS or Linux, to change the path format /to/this/style. The script will then chug away importing the OBJ files for you. Hopefully at some point Blender will add the ability to select all while importing and the need for this script will go away completely.
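If you would rather not worry about platform-specific path separators at all, the file gathering can also be done with pathlib. This is just a sketch; the Blender import call is shown commented out since it only works inside a Blender session, and the example path is a placeholder:

```python
from pathlib import Path

def obj_files(path):
    """Recursively collect .obj files under path, matching case-insensitively."""
    return sorted(p for p in Path(path).rglob("*") if p.suffix.lower() == ".obj")

# Inside Blender, you would then import each file:
# for f in obj_files("C:/file/path/here"):
#     bpy.ops.import_scene.obj(filepath=str(f))
```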

Art Programming


4. January 2016

 

We recently covered getting started in GearVR development using Samsung’s GearVRf library and the Unity Engine.  Today we are going to finish it off with coverage of getting started using the Unreal Engine.  As with the Unity tutorial, although this is focused on the GearVR, most of the instructions should be equally valid for all Oculus Rift devices.  Of all the game engines, Unreal was actually the easiest to get up and running with the GearVR.

 

There are a few pre-requisites before getting started:

There is a video based version of this tutorial available here and embedded below.

 

Configuring Unreal Engine

In order to be able to run code on your phone, we need to add the Oculus Signature file to the application.  With Unreal Engine this is configured at the engine level, which is handy as you only have to configure it once and it will work with all future projects.  If you haven’t already, sign up for an Oculus developer account, then get your key.  This will generate a file that we need to copy now.

The directory you want is [Unreal Install Dir][Version]\Engine\Build\Android\Java\assets.  In my install it was:

image
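Copying the signature file by hand works fine, but if you reinstall the engine often, a small helper script can do it for you. This is a sketch only; both paths in the example call are assumptions you must adjust to your own engine install and downloaded osig file:

```python
import shutil
from pathlib import Path

def install_osig(osig_path, engine_assets_dir):
    """Copy an Oculus signature file into the engine's Android assets directory."""
    assets = Path(engine_assets_dir)
    assets.mkdir(parents=True, exist_ok=True)  # create the folder if missing
    dest = assets / Path(osig_path).name
    shutil.copy(osig_path, dest)
    return dest

# Example call (hypothetical paths, adjust for your setup):
# install_osig(r"C:\Downloads\oculussig_mydevice",
#              r"C:\Program Files\Epic Games\4.10\Engine\Build\Android\Java\assets")
```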

 

Creating a Project

You can use just about any settings, but these ones work well.

image

This will mostly just set presets that are mobile friendly.  Click Create Project when ready.

 

Configuring Your Project

If not done already, load your project in Unreal Engine, then select Edit->Plugins:

image

 

Now verify (or turn on) the GearVR and Oculus rift libraries and plugins, like so:

image

 

Now select Edit->Project Settings:

image

 

Locate Android SDK on the left hand side then on the right configure the location of your Android SDK install.

image

 

Now go to Android, set the Android minimum level then click the Configure Now button:

image

 

While in the Android settings, be sure to enable GearVR support!

image

 

Build Your App

 

Now build and package for Android, in the File Menu, select Package Project->Android->Android (ETC2):

image

Pick a location for the generated APK then Ok.

image

For the record, Unreal Engine Android builds are really really really slow.

 

When complete, this will generate the APK file to run on your phone, as well as a pair of BAT files to quickly deploy to your device.  The noOBB batch file doesn’t copy data, only the updated executable.  Keep in mind, you need to have your device either tethered or running wirelessly over adb to deploy.  This means removing it from the GearVR, which is an annoying process.  If you don’t need to fully test VR functionality, it might be faster to work in developer mode. [Link soon]
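If you would rather not use the generated BAT files, the same deploy step can be driven from a script. This is a hedged sketch: it assumes adb is on your PATH and a device is connected over USB or wireless adb, and the APK filename in the comment is made up:

```python
import subprocess

def deploy_apk(apk_path):
    """Install (or reinstall, via the -r flag) an APK on the connected device using adb."""
    subprocess.run(["adb", "install", "-r", str(apk_path)], check=True)

# deploy_apk("MyVRGame-armv7-es2.apk")  # hypothetical filename
```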

 

Video

Programming


3. January 2016

 

Earlier we looked at getting started doing Gear VR development using Samsung’s GearVRf library.  In that tutorial I mentioned a few times that it is a great deal easier to get started doing Gear VR development using the Unity game engine.  Today we are going to look at the process of getting a game up and running using Unity.  This tutorial is GearVR specific; however, it should be just as applicable for the Oculus Rift.

 

Truth of the matter is getting started with Unity is exceptionally easy, but…

 

The information out there is almost all out of date, which makes the learning curve much harder than it needs to be.  First off, Samsung makes a set of utilities available, and you need to make sure you get the newest version.  More importantly, Unity now has VR support built in out of the box, so you don’t need to do anything at all.  This renders almost all of the existing getting-started tutorials completely wrong.

There is a video version of this tutorial available here.

Getting Started

 

First off, create a project like normal, no need to do anything special.

image

 

Now select Build Settings in the File menu

image

 

Select Android, then Player Settings…

image

 

Now select Virtual Reality Supported.  Also be sure to set your bundle identifier and optionally the Minimum API Level (the lowest possible level is a Galaxy Note running 4.4):

image

 

If you run it now you will get the following error:

image

 

This is because the GearVR won’t allow your app to run without the Oculus key.  To get this you need a (free) developer account from Oculus; then register your device using this form.  That will generate a file that needs to be copied into your project.

Inside your Unity project, locate the Assets directory, create the directories Plugins\Android\Assets, and copy the signing key in. Here is mine:

image
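That directory creation and copy can also be scripted if you set up projects often. A sketch, assuming a Unity project layout; the project name and osig filename in the example call are hypothetical:

```python
import shutil
from pathlib import Path

def install_unity_osig(project_root, osig_path):
    """Create Assets/Plugins/Android/Assets in a Unity project and copy the osig in."""
    target = Path(project_root) / "Assets" / "Plugins" / "Android" / "Assets"
    target.mkdir(parents=True, exist_ok=True)
    return shutil.copy(osig_path, target / Path(osig_path).name)

# install_unity_osig("MyUnityProject", "oculussig_mydevice")  # hypothetical names
```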

 

Your game should now run just fine on your device.

 

Oculus Utilities for Unity 5

 

The “old” way of supporting GearVR and Oculus Rift in Unity involved importing a set of utilities called the Oculus Utilities for Unity 5.  Since these were released, however, Oculus and Unity have worked to make the integration in Unity more complete.  That said, the transition isn’t complete as of writing.  You still don’t have access to the universal menu or overlays using Unity alone, so you may still have to use the Oculus Utilities.  Fortunately they can still be imported and exist side by side with a default Unity install.

 

In addition to the Oculus Utilities for Unity there is also a set of example projects that you can import.  There is also a guide to Unity 5 integration available from Oculus, although be aware that parts of it are now obsolete.

 

Unity VR Samples

 

There is also a pretty significant set of samples illustrating using VR in Unity available on the Unity Store.  Simply create an empty project and import in the example.  These VR Samples illustrate a number of different concepts required for working with VR in Unity.

 

 

In time, more and more functionality should be built into Unity, making this process even easier.

 

Video Version

Programming


1. January 2016

 

A reader wrote in to share this bundle currently running on BundleStars, and I am glad they did.  It is somewhat similar to a Humble Indie Bundle from a few months back, but contains a great deal more art assets.  I’ve actually used the tilesets from the earlier Humble Bundle several times here on GFS, so I will be picking up the $16 tier.

 

The ultimate Game Makers bundle:

image

 

For between $3 and $15 you can end up with a heck of a large volume of assets to make games with.

GameDev News Art


1. January 2016

 

When I set up the news monitoring on GameFromScratch.com I noticed that Ogre3D hadn’t had an update since June and was somewhat concerned that the project was dying off.  Thankfully today there were signs of life over at ogre3d.org.  If you have never heard of it, Ogre3D is a C++ based renderer and scene graph and has been used to create several shipped titles.  The major challenge to Ogre3D is that it is currently Windows and Linux only.

 

From the update:

So… what’s new?

1. Added TagPoints to the new Skeleton system! This has been a sort of unfinished business for me. I’m glad it’s finally done!

2.1’s TagPoints are superior to their 1.x counterparts. The TagPoints from v1.x had many issues: they didn’t follow RenderQueue, visibility rules, nor LOD rules correctly (they were subordinated to the settings from the main entity/skeleton they were attached to). The v1 TagPoints also belonged to an Entity. If the Entity was destroyed, it took down its TagPoints with it. Meaning if you wanted to still keep the attachments, you had to iterate through them, detach them from their TagPoints, and add them to a new SceneNode. Ugh!!! Personally, I gave up trying to use those in my projects a long time ago.

In Ogre 2.1; TagPoints are much better: they are exactly like regular SceneNodes, except they occupy a little more RAM (a few more bytes per node), and can be attached to Bones. Other than RAM consumption, there is no performance penalty for replacing SceneNodes with TagPoints (*).

You can make a TagPoint child of a SceneNode, a SceneNode child of a TagPoint, and a TagPoint child of another TagPoint. The only thing you can’t do is make a SceneNode child of a Bone. You must use a TagPoint for that.

If you want, you can use TagPoints throughout your entire codebase and forget about having to deal with whether an Item/Entity was attached to a TagPoint or a SceneNode and get downcasts correctly.

(*)When a SceneNode or TagPoint is attached to a TagPoint that is child of a Bone, the performance is slower because these nodes need to support non-uniform scaling. But if the TagPoint is child of the Root SceneNode (instead of a bone) like all regular SceneNodes, then there’s no performance penalty.


2. Added PSOs (Pipeline State Objects). This brings us one step closer to Vulkan, DX12 and Metal support. We’ve also noticed some minor performance improvements since there are fewer hoops now when tying shaders with input layouts in the D3D11 RenderSystem. Overall it simplified our workflow. It also fixed a rare culling winding bug in GL3+ as a nice side-effect.

This work is in the branch 2.1-pso. It’s ready. It hasn’t been merged yet back to main 2.1 branch because I am waiting to test it on a big engine (since it was a big change) in case there are edge cases to fix.


3. Added alpha tested shadows and transparency to PBS! These have been requested by many. Alpha tested shadows are useful for grids, leaves and other alpha tested objects.

We offer two transparency modes: Transparent and Fade. The former is physically based and still emits some specular light even at alpha = 0; the latter is good old alpha blending; and becomes invisible when alpha = 0.

 


4. Added Metallic and Specular workflow options. We’ve been working hard and closely with other teams and artists, tuning the BRDF settings. We now have 3 workflows: specular_ogre, specular_as_fresnel (what most popular engines do when they say “specular” workflow) and metallic.

For the tech-curious, specular_ogre maps the specular texture to the coefficient “kS”, whereas specular_as_fresnel maps the specular texture to the fresnel, colouring it. And metallic uses the slot reserved for the specular texture to control the metalness.

John Hable has an interesting discussion about the topic. Long story short specular_ogre allows more variety in the amount of materials that can be represented; but the specular_as_fresnel is more intuitive and is what most artists are used to.


5. Optimized vertex buffers for shadow mapping! If you’ve got VRAM to spare, you may want to enable this option to generate vertex buffers optimized specifically for shadow mapping.

Normally, GPUs require us to split a vertex when two triangles can’t share it (i.e. it has a different normal, a UV seam, etc). But shadow mapping only requires position, and these cloned vertices (alongside fatter vertices) reduce performance unnecessarily. This new feature creates a vertex buffer with position data exclusively, thus reducing bandwidth requirements, fitting the cache better; and reducing the vertex count by avoiding duplicates.

Performance improvements vary wildly depending on scene and model complexity.

To enable it, set the global variables:

Mesh::msOptimizeForShadowMapping = true;
v1::Mesh::msOptimizeForShadowMapping = true;

Note that this will increase loading times. For large meshes (i.e. 300k vertices), it is better to do it offline and save it to disk using the OgreMeshTool. If you do this, then there is no need to set msOptimizeForShadowMapping to true.

Also bear in mind this optimization will not be used for meshes using materials with alpha testing, as we require UV data for alpha testing; and not just position data.

For more information, visit the forum thread.


6. Updated and merged OgreMeshUpgrader and OgreXMLConverter into one tool: OgreMeshTool. This new tool supports many similar options and most of the same functionality the other two provided, with the addition of new v2.1 features like exporting to v2 formats, optimizing vertex formats to use half floating point and QTangents, and generating optimized vertex buffers for shadow mapping.


7. Compositor improvements. Finished UAV (texture) support; finished the resource transition/barriers project (needed by OpenGL, Vulkan and D3D12; see what barriers are and why they are needed). The Compositor now allows rendering to 3D and 2D array textures, and also declaring cubemaps.

Automatic generation of mipmaps was added for RTTs (useful for cubemap probes). See section 4.1.3.2 generate_mipmaps from the porting manual for more information.

With proper UAV support, compute shaders are around the corner!


8. Added JSON materials. The previous script syntax was not scaling well. JSON rocks! Drop on the forum thread for more details. It was our latest addition so there may still be things to polish. The old hlms material script system will continue to work while we iron out the issues and finish DERGO as a material editor.

GameDev News

