
28. April 2017

 

With the release of Unity 5.6, Unity have switched to a new naming convention, with the next release being Unity 2017.  They have just announced that the Unity 2017.1 beta is now available.  Be aware that this release is a beta and should not be used for production development; expect bugs and warts.  This release brings major new functionality, including improved cut-scene tools and 2D improvements such as sprite masking and a new physics shape editor.  FBX importing, animations and particle systems have also been improved.

More details from the announcement:

Tools for storytelling

2017.1 beta introduces Timeline and Cinemachine.

Timeline is a powerful new visual tool that allows you to create cinematic content (like the Adam short film). You can use it to cut scenes, create gameplay sequences and much more, by orchestrating your game objects, animations, sounds and scenes.

Cinemachine brings an advanced camera system that enables you to compose your shots like a movie director, including using real world camera settings and simple directions like “follow the head of the character.”

Timeline’s track-based sequencing tool and Cinemachine’s smart camera system bring storytelling to artists, who can now create stories focusing on the art direction, not the implementation details.

 

2D improvements

Following up on the major 2D feature improvements in 5.6, we are introducing 2D Sprite masking, which enables you to use masks with Sprites to create new kinds of effects.

Management of Sprites and Atlases has also been made easier with the introduction of the Sprite Atlas asset, which gives the developer more control over how to pack sprites and access them at runtime.

Another example of a workflow improvement is the addition of the Physics Shape Editor to the Sprite Editor, which allows you to create and edit a custom physics shape for a Sprite that will then be used when generating collider shapes with a PolygonCollider2D Component.

Scene and Asset Bundle Loading Improvements

We made several improvements to loading in-game scenes and Asset Bundles. The changes to the underlying architecture make the loading of scenes and Asset Bundles faster, resulting in a smoother player experience.

Model Importer improvements

FBX import in Unity now supports Segment Scale compensation for models exported from Maya. We also added the option of computing weighted normals when importing FBX files and fixed normal generation for hard edges. Lights and cameras are now imported from FBX files, and visibility properties (including animation) can also be imported.

Particle system improvements

Even more features and improvements will be made available for the particle system, including culling mode with tooltip messages, edit modes for particle system collision mode planes, as well as several other improvements to collisions and physics.

Animation Improvements

The Animation windows have been updated to improve the keyframing workflow when working with animations and interacting with Animator state machines. Performance Recording will be provided as an experimental release.

Collaborate

Collaborate is a simple way for teams to save, share, and sync their Unity project, regardless of location or role. In 2017.1 we will continue to improve the workflow with new features like browser integration and the ability to publish selectively.

 

The beta installer is available for download here.

GameDev News

27. April 2017

 

Amazon have released version 1.9 of the Lumberyard game engine.  Lumberyard is a fork of Crytek’s CryEngine, used in such titles as Crysis, Ryse, MechWarrior Online and the massive Star Citizen, which has now switched to using Lumberyard.  It’s available to developers free of charge, the only caveat being that any online services you use must be Amazon’s.

The 1.9 release brings new functionality, including:

 

  • New Player Account Cloud Gem for player authentication and management
  • Web portal for managing player data
  • Improvements to the Particle Editor
  • Express install option (thank goodness!) for easier installs
  • Blend layer updates (specular color and smoothness sliders)
  • New VR features, mostly starter projects
  • UI system improvements
  • Physically based shader reference examples compiled into a gem
  • New Comment component

 

You can read more details in the release notes available here.

GameDev News

27. April 2017

 

Epic have been a bit quiet on the Unreal Engine release front lately; however, they just dropped a preview of their upcoming massive release, 4.16.  Of course, it is just a preview, so I wouldn’t recommend using it in a production environment.  There is a huge number of improvements and new features in this release, including an all-new audio engine, volumetric fog, a new cloth simulation, HTML5 support for WebAssembly and much more.

 

Details of the release from the Unreal Engine forums:

 

  • Animation/Physics Updates:
    • A new clothing solver from Nvidia called NvCloth replaces APEX. This new solver gives more control over simulation and it is pretty close to the core solver of the previous APEX solution so should behave largely the same, although there are a few slight behavior changes and some extra exposed parameters for inertia settings. We now also always simulate clothing in local space - this gives us the benefit of only having one transform path through the solver and allows external inertia to be tweaked in much finer detail.
    • Vertex & Texture Painting on Skeletal Meshes has been added.
    • Vertex & Texture Painting on Static Meshes has been overhauled to improve usability.
    • A Spline IK node has been added to Animation Blueprints; useful for character spines.
    • Animation Modifiers are a new type of class which allow developers to apply a sequence of actions to a given animation sequence or skeleton. They should always implement OnApply and OnRevert to allow modifying data and remove previously applied changes.
      • Accessing Animation Modifiers is done through a new tab in the skeleton and animation editor.
      • A new set of functions to access specific animation data has been added; they’re contained in a new editor-only function library (Animation Blueprint Library).
    • The RigidBody node supports simulating a physics asset inside an Animation Blueprint. This is similar to AnimDynamics, but instead of specifying a node per bone a single node is used for the entire skeletal mesh. Performance has also been improved.
    • PoseDriver has been improved, including more control over selected and modified bones, curve control, and UI improvements.
    • The ability to have Kinematic bodies with simulated parents has been added, allowing effects like a kinematic hand attached to other objects.
    • The 'Look At' node has been improved to be usable relative to a bone or socket.
    • Capsule collision can now be imported from an FBX file. You now use the ‘UCP’ prefix on a capsule poly mesh, and it will be removed on import, and replaced by a corresponding capsule collision shape.
    • The ability to store asset viewer profiles on a 'shared' or 'local' level has been added. This allows teams to have a shared set of profiles which can be used as a unified scene to assess art assets. Shared profiles are stored in DefaultEditor.ini and will require you to check the file out or make it writable.
    • Added support to import Retarget Base Pose. A new option allows you to import the pose from Pose Asset.
    • Animation Export improvements support run-time retargeting results and animation that applies Post Process Graph.
    • Added a ‘GetMaterialFromFaceIndex’ function for components, which retrieves the actual material after performing a complex line trace. This is supported for StaticMeshes, ProceduralMeshes and BSP.
    • The Play Montage Blueprint node is an async node that can be used in any Blueprint logic to play Anim Montages, with easy access to some of the callback events it provides.
  • Rendering Updates:
    • Volumetric Fog is now supported. Please follow this link to provide feedback.
      • Volumetric Fog controls are on the Exponential Height Fog Component.
      • Localized fog density can be controlled via particles. To do so, create a Material, set the material's domain to Volume, and apply it to a particle emitter. The volume material describes Albedo, Emissive and Extinction for a given point in space. Albedo is in the range [0-1] while Emissive and Extinction are world space densities with any value greater than 0. Volume materials currently only work on particles, and only positions inside the particle's radius are valid.
      • Each light has a 'Volumetric Scattering Intensity' and 'Cast Volumetric Shadow' setting. (fast-changing lights like flashlights and muzzle flashes leave lighting trails. Disable volumetric fog contribution on these lights with 'Volumetric Scattering Intensity' set to 0)
      • Supported lights include:
        • A single Directional Light, with shadowing from Cascaded Shadow Maps or static shadowing, with a Light Function
        • Any number of point and spot lights, with dynamic or static shadowing if 'Cast Volumetric Shadow' is enabled.
        • A single skylight, with shadowing from Distance Field Ambient Occlusion if enabled
        • Particle lights, if 'Volumetric Scattering Intensity' is greater than 0
      • Not currently supported:
        • Precomputed global illumination
        • Shadowing of Stationary skylights
        • IES profiles and Light Functions on point and spot lights
        • Shadowing from Ray Traced Distance Field Shadows
        • Shadowing from the volumetric fog itself
      • The GPU cost of Volumetric Fog is primarily controlled by the volume texture resolution, which is set from the Engine Shadow Scalability level. Use 'profilegpu' to inspect this cost.
    • Distance Field Lighting has been optimized for current generation consoles and mid-spec PC.
      • The Global Distance Field is much faster to update when a Movable object changes.
      • New project settings ‘Eight Bit Mesh Distance Fields’ and ‘Compress Mesh Distance Fields’ which significantly reduce memory requirements when enabled
      • Runtime cost has been reduced by 30-50%
      • Mesh Distance Field generation uses Intel’s Embree ray tracing library, for a 2.5x speedup
    • Vertex Interpolator nodes have been added to the material graph. These nodes offer better control for value interpolation between vertex and pixel work. These are intended as a workflow improvement; there are no changes to interpolator limits, nor will shaders change. The feature is compatible with Customized UVs and will pack results together.
    • Post processing now supports Image-based (FFT) convolution for physically realistic bloom effects in addition to the existing bloom method. This new post processing feature is designed for use in cinematics or on high-end hardware.
      • Image based Convolution adds new control parameters to the existing Lens | Bloom section found in Post Process volumes.
      • In using a new texture to serve as the kernel image, it is important to ensure that the full image is present on the gpu and available at full resolution. Two settings are required to ensure this:
        • Mips Gen Setting should be set to NoMipmaps
        • Texture option Never Stream should be selected.
  • Core Updates:
    • Garbage Collection improvements make collection times twice as fast.
    • Removed support for Visual Studio 2013. VS 2015 and VS 2017 are currently supported.
  • Tools Updates:
    • Localized String Tables are now supported. These provide a way to centralize your localized text into one (or several) known locations, and then reference the entries within a string table from other assets or code in a robust way that allows for easy re-use of localized text. String Tables can be defined in C++, loaded via CSV file, or created as an asset.
    • Color Grading improvements include polished UI, a new HSV mode, dynamically changing the min/max value of sliders using Ctrl+drag, and a new Reset button.
    • The Asset Audit Window, built on top of the experimental Asset Management Framework, can be used to audit disk size, memory usage, and general asset properties for many assets at once. It is a specialized version of the Content Browser, and can be accessed from the Window->Developer Tools menu, or from the right click menu in the Content Browser or Reference Viewer. Once you have opened the window, assets can be added using the buttons, and platform data from cooked asset registries can be loaded using the platform drop-down.
    • Updates to VR Mode's UI and interation include a new asymmetrical controller setup where you can change which controller has the interaction laser in Editor Preferences > VR Mode, and an updated Radial Menu with access to all major editor features and UI panels. Additional updates include smoothed interaction lasers and improved teleportation.
    • VR Mode now has access to the Sequencer Editor
    • Physics simulation is now possible in VR mode
    • Smart Snapping in VR Mode uses the bounds of your object to align to other actors in the scene, enabling you to exactly fit them together without needing to build modular assets with a grid in mind.
  • Sequencer Updates:
    • Shot enhancements include pre/post roll and hierarchical bias.
    • UI enhancements include resizable tracks, and audio thumbnails now render the peak samples with an inner RMS curve
    • Material Parameter Collection Tracks have been added
    • Various minor improvements including Binding Override improvements, additional event receivers on Level Sequence Actors, event ordering, and more.
  • Audio Updates:
    • The new Unreal Audio Engine is available for testing on PC, Mac, iOS, and Android. It is not enabled by default in 4.16 as there is continued work on implementing backends for console platforms, Linux, and HTML5, as well as stability and performance improvements, especially on mobile platforms. It is enabled for each platform by changing the AudioDeviceModuleName ini setting in the platform’s Engine.ini file:
      • For PC, in WindowsEngine.ini set AudioDeviceModuleName=AudioMixerXAudio2
      • For MacOS, in MacEngine.ini set AudioDeviceModuleName=AudioMixerCoreAudio
      • For iOS, in IOSEngine.ini set AudioDeviceModuleName=AudioMixerAudioUnit
      • For Android, in AndroidEngine.ini set AudioDeviceModuleName=AudioMixerAndroid
    • The Steam Audio SDK has a fully-integrated implementation using the capabilities of the new Unreal Audio Engine. Click here for more details.
    • The new Synthesis Plugin contains two new real-time synthesizers written using the new Unreal Audio Engine’s “SynthComponent” class to implement a fully-featured subtractive synthesizer as well as a real-time granulator. It also contains a host of new DSP source and submix effects for use with the new Unreal Audio Engine.
  • Mobile Updates:
    • Android 23 (and above) now supports runtime permissions. Permissions are requested at runtime for the required ones used by the engine, and there is an Android Permission plugin to check for permission (Check Permission BP node), request it (AcquirePermissions BP node), and get the response (OnPermissionsGrantedDynamicDelegate). This is only necessary if Android Target SDK is set to 23 or above, otherwise any permissions in the Android Manifest are granted as usual.
      • 4.16 requires Android SDK 23 or higher to be installed. Please use the SDK manager to install it if you have an earlier install. Also, in Android SDK project settings, please change your Android SDK API Level from “matchndk” to “latest”. This will use the newest installed SDK found in your Android SDK platforms directory. There is no need to change the NDK API Level; “android-19” is correct to allow installing your APK on Android versions prior to Lollipop (Android 5.0); setting this higher will require Android 5.0+.
    • Improved Virtual Keyboard support on Android (Experimental). A new option in Project Settings > Platforms > Android > APKPackaging enables the improved virtual keyboard. This replaces the dialog input box. When this option is enabled, users should bind event handlers to FGenericApplication::OnVirtualKeyboardShown and FGenericApplication::OnVirtualKeyboardHidden.
    • Mobile performance improvements for Slate and UI rendering. Set the console variable Slate.CacheRenderData=0 to enable the option for Invalidation Panels to cache only widget elements, which improves texture batching and reduces draw calls.
    • HTML5 support for WebAssembly (WASM). This feature comes from Mozilla's latest Emscripten toolchain (v1.37.9) and reduces app download size, startup times, memory consumption, and improves performance.
    • HTML5 support for WebGL 2. This provides better rendering performance, visual fidelity, and more rendering feature sets.
  • VR Updates:
    • Unified Console Commands across VR platforms. The previous code base did not share common interfaces across HMDs; now there is a shared layer that developers can work from rather than maintaining each platform individually. This provides several benefits such as easier bootstrapping of new platforms, consistent interfaces, and fewer redundancies in HMD implementations.
    • The mobile multiview path now supports GearVR. Mobile multiview is similar to instanced stereo on the desktop, and provides an optimized path for stereo rendering on the CPU. Enable this in your Project Settings under VR, and restart the editor to take effect.

 

The preview release is available via the Epic Games launcher.

GameDev News

26. April 2017

 

Welcome back to our ongoing HaxeFlixel tutorial series; today we are going to cover playing sound and music in HaxeFlixel.  That distinction is a bit misleading, as sounds and music are basically the same thing: both are ultimately FlxSound objects.  The single biggest challenge you are going to have with HaxeFlixel audio is file formats, so let’s start there.

The file format of your audio file depends entirely on the platform your code is going to run on.  On the Flash player platform, all of your audio files need to be in mp3 format.  However, on other platforms the mp3 format has a number of licensing issues, so they instead use Ogg Vorbis (ogg) files for longer audio and WAV for shorter effects.  Why two different formats?  WAV files are much faster to load, but take up more system memory, which makes them ideal for frequently used but shorter sound files.  Ogg files on the other hand are much more compressed, taking more CPU power to process but a great deal less space, making them ideal for longer audio such as your game’s music.  In addition to the file format limitations, your sounds need to be encoded at a sample rate of 11025, 22050 or 44100 Hz.  If you need to convert between sample rates, Audacity is a great free program (you can learn more about it here) that can also convert between the various file formats.

Using a default project, HaxeFlixel will automatically create sound and music folders in your asset directory.  One thing you will notice is that once you start including multiple audio files for different platforms, the build warnings can become extremely irritating: being told over and over that your mp3 file isn't compatible with your platform, or that your WAV file won't work on Flash, gets old quickly.  Thankfully we have the ability to filter our files based on platform.  In your Project.xml file, locate the Path Settings area and edit it like so:

   <!-- _____________________________ Path Settings ____________________________ -->

   <set name="BUILD_DIR" value="export" />
   <classpath name="source" />
   <!-- <assets path="assets" /> -->
   <assets path="assets/music" include="*.mp3" if="flash" />
   <assets path="assets/music" include="*.ogg|*.wav" unless="flash" />
   <assets path="assets/sounds" include="*.mp3" if="flash" />
   <assets path="assets/sounds" include="*.ogg|*.wav" unless="flash" />

This will cause Flash builds to only see mp3 files from the music and sounds directories, while all other platforms will only see the ogg and wav files.  OK, now let’s move on to some actual code.

Playing music and sound effects is trivial in HaxeFlixel, but once again platform becomes a factor.  In this case we are using conditional compilation to solve this problem, like so:

#if flash
   FlxG.sound.playMusic(AssetPaths.techno__mp3);   
   soundEffect = FlxG.sound.load(AssetPaths.gunshot__mp3);
#else
   FlxG.sound.playMusic(AssetPaths.techno__ogg);
   soundEffect = FlxG.sound.load(AssetPaths.gunshot__wav);  
#end

 

In this case the music file will play automatically, while the soundEffect needs to be triggered manually.  Let’s take a look at how to do that in our update() method.

if(FlxG.keys.justPressed.G)
   soundEffect.play(true);

The true parameter (ForceRestart) causes the sound effect to replace any running instance, so the sound starts from the beginning if you press the G key again before the previous play has finished.
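
For contrast, calling play() with the default value of false does nothing if that sound is still playing (and resumes it if it was paused) rather than restarting it.  A quick sketch of the difference; the H key binding here is just an arbitrary example, not part of the project above:

if(FlxG.keys.justPressed.H)
   soundEffect.play();   // default (false): ignored while the effect is already playing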

You also have a fair bit of control over sound playback.  The following code shows how to pause and resume a sound, as well as increase and decrease the volume.  Volume is a value that ranges from 0.0 (complete silence) to 1.0 (maximum volume).

if(FlxG.keys.justPressed.P)
   if(FlxG.sound.music.active)
      FlxG.sound.music.pause();
   else
      FlxG.sound.music.resume();
if(FlxG.keys.justPressed.PLUS)
   FlxG.sound.changeVolume(0.1);
if(FlxG.keys.justPressed.MINUS)
   FlxG.sound.changeVolume(-0.1);
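
changeVolume() adjusts the global volume relative to its current level.  If you would rather jump straight to a specific level, both the global mixer and individual FlxSound objects expose a volume property in the same 0.0 to 1.0 range.  A minimal sketch; the exact values are arbitrary:

FlxG.sound.volume = 0.5;     // set the master volume to 50%
soundEffect.volume = 0.25;   // each FlxSound also has its own volume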

 

HaxeFlixel also supports positional audio, either by changing the X and Y position of a sound directly or by positioning it relative to another object.  The latter is the approach we are going to take here, creating a virtual ear that is centered on the screen.

soundX = FlxG.width/2;  
soundY = FlxG.height/2;
ear = new FlxObject();
ear.setPosition(FlxG.width/2, FlxG.height/2);   

Now we can move the sound around relative to the ear and the playback will change accordingly.  Here we shift the sound’s position with the arrow keys, then update the proximity each frame:

if(FlxG.keys.justPressed.LEFT)
   soundX -= 25;
if(FlxG.keys.justPressed.RIGHT)
   soundX += 25;
if(FlxG.keys.justPressed.UP)
   soundY -= 25;
if(FlxG.keys.justPressed.DOWN)
   soundY += 25;
// Parameters: the sound's x/y position, the listener object, the audible radius
// in pixels, and whether the sound should also be panned left/right
FlxG.sound.music.proximity(soundX, soundY, ear, 300, true);

It is also possible to tween audio, fading it in and out over time, like so:

if(FlxG.keys.justPressed.S)
   FlxG.sound.music.fadeOut(3.0);
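
There is a matching fadeIn() as well, which ramps the volume up over a duration, optionally from a starting volume to an ending volume.  A minimal sketch using the same music object; the F key binding is just an example:

if(FlxG.keys.justPressed.F)
   FlxG.sound.music.fadeIn(3.0, 0, 1);   // fade from silent to full volume over 3 seconds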

You can also set up a callback function that fires when a sound stops playing.  This can be used to easily create a music or sound effect management system.

FlxG.sound.music.onComplete = function() {
   // This will only ever fire if you hit L to turn looping off
   trace("Song ended");
};
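
As a sketch of how that callback could drive a simple playlist, something along these lines should work.  The field names and song files are hypothetical, and the music is started with looped set to false so that onComplete actually fires:

// Hypothetical additions to PlayState; declare the fields alongside soundEffect etc.
var playlist:Array<String>;
var trackIndex:Int = 0;

// In create():
playlist = ["assets/music/songA.ogg", "assets/music/songB.ogg"];   // placeholder files
playNextTrack();

// Plays the current track, then chains itself through onComplete
function playNextTrack():Void
{
   FlxG.sound.playMusic(playlist[trackIndex], 1.0, false);   // looped = false so onComplete fires
   trackIndex = (trackIndex + 1) % playlist.length;
   FlxG.sound.music.onComplete = playNextTrack;              // re-assign after each track loads
}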

 

Now let’s show the complete source listing, which also illustrates a couple of other features such as looping, muting and pausing audio.

package;

import flixel.FlxG;
import flixel.FlxSprite;
import flixel.FlxState;
import flixel.system.FlxSound;
import flixel.FlxObject;

class PlayState extends FlxState
{
   var soundEffect:FlxSound;
   var soundX:Float;
   var soundY:Float;
   var ear:FlxObject;

   override public function create():Void
   {
      super.create();
      #if flash
         FlxG.sound.playMusic(AssetPaths.techno__mp3);   
         soundEffect = FlxG.sound.load(AssetPaths.gunshot__mp3);
      #else
         FlxG.sound.playMusic(AssetPaths.techno__ogg);
         soundEffect = FlxG.sound.load(AssetPaths.gunshot__wav);  
      #end

      soundX = FlxG.width/2;
      soundY = FlxG.height/2;
      ear = new FlxObject();
      ear.setPosition(FlxG.width/2, FlxG.height/2);

      FlxG.sound.music.onComplete = function() {
         // This will only ever fire if you hit L to turn looping off
         trace("Song ended");
      };

      FlxG.sound.muted = true;   // start muted; press M to toggle sound back on
   }

   override public function update(elapsed:Float):Void
   {
      super.update(elapsed);

      if(FlxG.keys.justPressed.G)
         soundEffect.play(true);

      if(FlxG.keys.justPressed.P)
         if(FlxG.sound.music.active)
            FlxG.sound.music.pause();
         else
            FlxG.sound.music.resume();
      if(FlxG.keys.justPressed.PLUS)
         FlxG.sound.changeVolume(0.1);
      if(FlxG.keys.justPressed.MINUS)
         FlxG.sound.changeVolume(-0.1);

      if(FlxG.keys.justPressed.LEFT)
         soundX -= 25;
      if(FlxG.keys.justPressed.RIGHT)
         soundX += 25;        
      if(FlxG.keys.justPressed.UP)
         soundY -= 25;
      if(FlxG.keys.justPressed.DOWN)
         soundY += 25;                 
      FlxG.sound.music.proximity(soundX, soundY, ear, 300, true);

      
      if(FlxG.keys.justPressed.L)
         FlxG.sound.music.looped = false;

      if(FlxG.keys.justPressed.S)
         FlxG.sound.music.fadeOut(3.0);

      if(FlxG.keys.justPressed.M)
         FlxG.sound.muted = !FlxG.sound.muted;
   }
}

 

The Video

[Coming Soon]

Programming

25. April 2017

 

In beta for a couple of years now, Silicon Studio have just released the Xenko game engine.  If you are interested in learning more, we did a complete tutorial series back when it was known as Paradox 3D.  As part of the release, pricing information has finally been announced.

[Xenko pricing chart]

Until July 31st, the Pro version will be available for free.  Xenko is a cross platform 2D/3D game engine with an editor and full Visual Studio integration.  The primary language is C#.  The Personal edition requires a splash screen and has a $200K USD revenue limit.

We did a hands-on video detailing the new release, embedded below.

GameDev News


Marmalade 7.3 C++ based cross platform gaming library released. Now available free

6. June 2014


Marmalade just released version 7.3 of their cross platform gaming library.  Highlights of the update are:

 

  • Enhanced support for Windows Store platform 
  • Multi-touch support for Windows Desktop platform
  • OpenGL ES 3.0 support for iOS, Android and Windows Desktop platforms
  • OpenGL ES 2.0 and OpenAL 1.1 support for Marmalade Juice
  • GCC 4.8 support for building x86 and ARM application binaries
  • Hub support for simultaneous x86 and ARM deployment packaging on Android platform
  • iOS 7.1 framework support
  • ARM architecture variant support

 

Of course, the biggest news with this release is that you can now use Marmalade completely free.  There must be a catch, you say?  You would be right, but it’s a pretty fair one.  According to a Marmalade engineer doing an AMA on Reddit, the Marmalade suite consists of:

 

  • Marmalade C++ - for C++ development using your preferred IDE
  • Marmalade Juice - for porting iOS projects to Android
  • Marmalade Quick - for rapid application development with the Lua scripting language
  • Marmalade Web - HTML5 development for web and hybrid apps

With the free version, you are able to deploy your projects to the following platforms using a single common codebase:

  • iOS
  • Android
  • Windows Phone 8
  • Windows Store
  • Tizen
  • BlackBerry

There are restrictions on the free version (you knew it was coming... :P):

  • Splash screen when your app starts
  • You are limited to the pre-integrated extensions that are included (these include billing and IAP APIs for all major app stores, advertising, social and analytics services)

 

He missed another major limitation in that summary: on-device debugging.  This one is somewhat critical when it comes time to ship.

So: no on-device debugging, a splash screen, and you are limited to the included extensions.  If these limits are too much for you, you can obviously upgrade to a paid option.  For the $15/month (or $150/year) tier, you can replace the splash screen.  For $499/year (no monthly option) you get on-device debugging and full access to native extensions.  The next two tiers ($1,500 and $3,500) are mostly about support and source access.

