
17. January 2017

 

Epic have released a preview version of the upcoming Unreal Engine 4.15.  As is now normal with Unreal Engine point releases, this one is absolutely packed with new fixes and features.  Keep in mind this is a preview release for a reason: you should not be using it for production work, and you should expect to encounter more than your regular share of bugs.  Details from the release notes:

 

  • Rendering Updates:
    • Finalized Texture Streaming Optimizations that were started in 4.13 and 4.14.
      • Textures used by non-visible and hidden components are streamed with one less mip, as a prefetch.
      • Reducing time taken for the visible textures to stream in.
      • Reducing the CPU time taken by the streamer.
      • Mesh UV densities are now computed per material instead of per mesh, and the new data also takes LODs into account. This resolves most issues where textures would appear low resolution. There is also wider texture streaming support across component types, including particle systems and instanced meshes, which resolves other low-resolution issues and occasional high memory consumption.
      • The texture streamer can now automatically fit to different memory budgets, without manual tweaks. The streamer will select which textures need to be reduced using different heuristics to minimize visual impact.
      • New visualization tools for debugging.
    • New Nodes have been added to the Material graph
      • Commonly requested mathematics nodes: Sine, Cosine, Tangent, Arcsine, Arccosine, Arctangent, Arctangent2, ArcsineFast, ArccosineFast, ArctangentFast, Arctangent2Fast, Round, Truncate, Saturate
        • The nodes marked with the “Fast” tag will execute approximations instead of the real instructions. These can give a worthwhile performance improvement to materials but have input restrictions and precision tradeoffs.
      • PreviousFrameSwitch has been added to allow specific overrides for world-position offsets in complex materials used during motion vector generation.
      • Pre-Skinned Local Normal works in a similar way to the Pre-Skinned Local Position node added in the 4.14 release but returns the local surface normal for skeletal and static meshes. This opens the door to more local-space, mesh aligned effects or advanced use-cases such as writing dynamic surface data to a mask read-back in another material.
    • Metal support has been extended to use many of the new API & shader language features added by Apple in macOS 10.12 Sierra & iOS 10.
      • Enabling the new Metal v1.2 standard allows all Metal platforms to use Unordered Access Views in pixel shaders
      • Full support for Unreal Engine's tessellation features has been implemented
      • Experimental support for HDR rendering on Macs with an appropriate display built in
  • Sequencer Updates:
    • Animation Blending by weight is supported.
      • You can add a weight by expanding each track and keying the value at the desired point in the timeline.
      • There is no limit to how many animations can be blended at the same time. For full-body animations the weights are normalized, so that the mesh isn't under- or over-scaled; for additive animations the weights are kept as-is.
    • Audio volume and pitch curves have been added.
  • Blueprint Updates:
    • Cooking Blueprints to C++ is no longer an "experimental" feature.
      • This is enabled in the editor through your project’s Packaging Settings: Blueprints => Blueprint Nativization Method.
      • It can also be invoked by passing -NativizeAssets as a parameter to the UAT BuildCookRun script.
      • Generated source is saved as a plugin in your project’s intermediate folder, under: …\Intermediate\<TargetPlatform>\NativizedAssets
      • More information is available in the documentation.
    • Map & Set containers are now available in Blueprints. A Set is a collection guaranteed to contain only unique items, with no repeating entries, while a Map stores key/value pairs with unique keys.
      • The Variable Type control is now a drop down, allowing you to select ‘Single Variable’, ‘Array’, ‘Set’, or ‘Map’. When ‘Map’ is selected a second drop down for the ‘value’ type appears.
      • For maps the following operations are available to blueprint users: Add, Remove, Find, Contains, Keys, Values, Length, and Clear.
      • Set supports: Add, AddItems, Remove, RemoveItems, ToArray, Clear, Length, Contains, Intersection, Union, and Difference.
      • Set and Map variables declared in C++ can now be exposed to Blueprints (see the sketch following this list).
      • NOTE: Replication of map and set properties is not yet supported in C++ or Blueprints.
  • Framework Updates:
    • A Raw Input plugin has been checked in to provide support in Windows for steering wheels, flight sticks, and other non-XInput supported devices.
      • All of the Vehicle templates and Vehicle Game* have been configured to work with the Logitech G920. (*Vehicle Game not yet updated for 4.15 Previews)
      • Adding new devices is as easy as setting up a configuration for them in the project settings or editing DefaultInput.ini. (Vendor and Product ID can be discovered from the driver properties)
      • We need your help! - After you successfully configure your devices (figuring out what axes represent brake, gas, steering wheel and necessary modifiers to represent brake/gas as 0 to 1 and the wheel as -1 to 1), please share your .ini settings back with us so we can ensure these devices work without additional setup in the future.
    • A Force Feedback Component can now be added to Actors and exist in the world. It can have attenuation properties to determine the intensity of the playback of the force feedback pattern based on the distance between the player and the effect. The attenuation properties can either be specified directly on the component or you can create a Force Feedback Attenuation asset in the content browser and reuse it for multiple components. Force Feedback Components can also be spawned into the World from blueprints in a similar way that audio, decals, and emitters can.
    • PhysX Vehicle Support is now an optional plugin. This makes it easy for games that are not using vehicles to exclude this feature and save disk space and memory. This work also adds several useful physics extension points to Engine (e.g. OnPhysSceneInit/Term, OnPhysSceneStep) to make it easier for other developers to write their own similar systems.
    • The Blendspace Editor has been overhauled with an updated UI and internal rework.
    • Save Pose Snapshot has been added to capture a runtime skeletal mesh pose in blueprints. Once the pose has been saved, you can use it in the anim-blueprint like any other pose, or save it to a variable.
    • You can Link a Curve to a specific Bone in the skeleton. This supports LOD and Layer Blending, and can be accomplished in the Anim Curves window.
    • Physics objects now have Mass Properties debugging visualizers to see center of mass and inertia tensor. (Show -> Advanced -> Mass Properties)
    • Gameplay Tags have been improved and are now fully supported. They are implemented by the GameplayTag structure in the GameplayTags module and are registered in a central dictionary, which can be accessed from the new GameplayTags project settings view. If you enable the “Import Tags from Config” option, tags can be added from the editor through the Gameplay Tag List on this page, or from the UI used to select tags. Once you have added tag properties to your data, you can query them from either Blueprint or C++ and use them to change functionality; the BlueprintGameplayTagLibrary has several useful functions. To actually set up tags, add GameplayTag or GameplayTagContainer variables to your data or functions, then set the tags from the selection UI (see the sketch following this list).
  • Mobile Rendering Updates:
    • The ES 3.1 / Metal / Vulkan editor feature level preview is no longer experimental. This mode emulates the feature set available to iOS Metal, Android GLES3.1 and Android Vulkan devices.
    • Mobile devices can now use Custom Stencil in post-processing materials. This requires MobileHDR option enabled and non-Mosaic device. To enable this feature in your project go to Project Settings -> Rendering -> Postprocessing and set ‘Custom Depth-Stencil Pass’ to ‘Enabled with Stencil’.
    • You can now disable shader permutations for lighting setups that your mobile game does not require. The settings, available under Project Settings -> Rendering, will reduce shader memory usage and app package size.
    • Android applications can now be packaged to support a third-party graphics debugger. You can choose either the Mali Graphics Debugger or the Adreno Profiler, depending on your device’s GPU. Graphics debugger options can be found in Project Settings -> Android. You first need to download and install these debuggers from the GPU vendor’s website, and they each require a small amount of additional setup to configure your device for debugging. Be sure to follow the directions that appear after selecting the debugger type. Also note that if you package your app to support a particular GPU debugger, it may not function correctly when run on a device with a different GPU.
  • Editor Updates:
    • Reroute nodes have been added to the Material Editor.
    • Font Asset Improvements have been made to address issues with memory consumption and stability of Font assets using runtime-cached fonts. Font assets have been split in two: Font and Font Face.
      Font Face is now the asset that stores the font data, and these assets are simply referenced by the Font assets. This means that the same font data can be re-used for multiple font assets, or even multiple typefaces within a Font asset. Existing Font assets will automatically upgrade their internal font data into embedded Font Face assets during load. You can use the Font Editor to split these embedded assets out into real Font Face assets that may be edited and shared.
    • (Experimental) Content Hot-Reloading is available for testing. To enable it you need to go to your "Editor Preferences", and enable "Content Hot-Reloading" under the "Experimental" section. Once enabled all of the in-editor source control operations that affect assets will use content hot-reloading. You also gain a "Reload" option under "Asset Actions". This can be used to forcibly reload a package from disk.
  • Build Updates:
    • The codebase has been converted to an "include what you use" model, where every header includes the other headers it needs, rather than every source file including large monolithic headers like Engine.h and UnrealEd.h. Existing game code can continue to include those files as before, but we measure the engine compiling 25-50% faster! A minimal sketch of the new convention follows this list.
      • Every header now includes everything it needs to compile.
      • Every .cpp file includes its matching .h file first.
      • No engine code includes a monolithic header such as Engine.h or UnrealEd.h any more.
      • No engine code explicitly includes a precompiled header any more.
  • Platform Updates:
    • New Location Services functionality now provides access to GPS data for Android and iOS. A new OnLocationChanged delegate is available, and Blueprint nodes are provided under Services->Mobile->Location.
    • Streaming audio for iOS has been implemented.
    • Remote notifications for iOS are now supported. This includes callbacks in the game for handling the notification, as well as properly setting up all plist information needed by the application for the App Store.
    • We have added support for ARM64 (AArch64) devices running Linux. Right now only boards with desktop GL are supported.
    • 'Launch On' to a remote Linux machine is now supported from the Editor or UFE.
  • VR Updates:
    • The PlayStationⓇVR Aim Controller is now supported through the new AimController plugin. To activate, simply change the “Hand” value to “Gun” on your Motion Controller component.
    • (Experimental) Monoscopic Far Field Rendering for mobile VR is available for testing. With content that has many distant objects, this can benefit performance. To enable, select the checkbox under Project Settings -> Rendering -> VR. We don’t currently support both mobile multi-view and monoscopic far field simultaneously and mobile HDR needs to be disabled.
  • VR Editor Updates:
    • Updated Quick Menu and Radial Menu to quickly access editor functionality.
    • The new number pad menu appears when you click on an editable text field.
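
To make the Map/Set and Gameplay Tags items above a little more concrete, here is a minimal, hypothetical C++ sketch of both features.  The component, property and tag names are mine for illustration, not from the release notes, and it assumes the GameplayTags module has been added to your module's dependency list.

// MyInventoryComponent.h -- hypothetical example, names are illustrative
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "GameplayTagContainer.h"
#include "MyInventoryComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UMyInventoryComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // A Map exposed to Blueprints: item name -> stack count.
    // As the release notes say, map and set properties cannot be replicated yet.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Inventory")
    TMap<FName, int32> ItemCounts;

    // A Set exposed to Blueprints: a guaranteed-unique collection of item ids.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Inventory")
    TSet<FName> DiscoveredItems;

    // Gameplay Tags assigned to this component via the editor's tag selection UI.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Tags")
    FGameplayTagContainer OwnedTags;

    // Querying a tag from C++; Blueprints would use the BlueprintGameplayTagLibrary nodes instead.
    bool IsFlammable() const
    {
        // "Item.Flammable" must already be registered in the central tag dictionary.
        return OwnedTags.HasTag(FGameplayTag::RequestGameplayTag(FName("Item.Flammable")));
    }
};

Once compiled, both containers show up on the component as Blueprint variables with the Add/Remove/Find style nodes listed above.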
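
Similarly, for the Build Updates item, here is a minimal sketch of what the "include what you use" convention looks like applied to your own code (file and class names are hypothetical):

// MyActor.h -- the header includes everything it needs to compile,
// starting from CoreMinimal.h instead of a monolithic header
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyActor.generated.h"

UCLASS()
class AMyActor : public AActor
{
    GENERATED_BODY()
};

// MyActor.cpp -- the .cpp includes its matching .h first, then only the
// additional headers it actually uses; no Engine.h or UnrealEd.h
#include "MyActor.h"
#include "Components/StaticMeshComponent.h"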

GameDev News

12. January 2017

 

Today we are going to take a quick look at the Tilengine 2D game engine.  Tilengine in their own words is:

Tilengine is a free, cross-platform 2D graphics engine for creating classic/retro games with tilemaps, sprites and palettes. Its unique scanline-based rendering algorithm makes raster effects a core feature, a technique used by many games running on real 2D graphics chips.

Tilengine is available on Github (sorry, the core isn’t open source), however I never could locate what license it’s released under.

EDIT: Since this was posted there has been a bit of conversation about the licensing; read here.

It’s a C library, but contains bindings for Python, C# and Java.  I’m actually going to use the C# bindings for the example in this review, as they are the least documented of the available bindings.  There is a single page class reference available here and a small manual available here.  The engine is geared towards creating retro sprite-style games and handles graphics, animations, palettes, input and window management, but has no sound or physics engine built in.  It is also designed to be usable as a backend for an existing front-end renderer.  There are several C-based examples available here, and these represent the primary way you will get up to speed.  The graphics system is designed to emulate classic sprite hardware like Sega’s Super Scaler arcade boards, with Super Nintendo Mode 7 style graphics effects available.  Tilengine is layered over SDL and is cross-platform, capable of running on most desktop operating systems as well as Raspberry Pi devices.

Tilengine is composed like so:

[Tilengine architecture diagram]

 

Tilengine has direct support for tiled map files created using the Tiled map editor.  If you want to learn more about Tiled, I have done a complete tutorial series available here.

 

It’s a pretty straightforward engine, so let’s jump right in with an example created using the C# bindings:

using Tilengine;

namespace ConsoleApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var engine = Tilengine.Engine.Init(320,240,1,16,16);
            var window = Tilengine.Window.Create("",Tilengine.WindowFlags.Vsync);
            
            // This is the clear color drawn each frame.  Think of it as the sky color
            engine.BackgroundColor = new Color(0,128,238);

            // Load tsx and tmx file.  These are created in the Tiled level editor
            // tsx is a collection of tiles, tmx is a map painted using those tiles
            var tileset = Tileset.FromFile("SOTB_bg.tsx");
            var tilemap = Tilemap.FromFile("SOTB_bg.tmx","Layer 1");
            
            // create a new layer using our just loaded tiles.  Games can have multiple layers
            var layer = new Layer();
            layer.Setup(tileset,tilemap);
            layer.SetPosition(0,0);

            
            // Now we load an animated sprite ripped from the 90s classic Shadow of the Beast
            // A Spriteset is simply the image collection composing our sprite
            // A SequencePack is a simple text file describing the available animations, their frames, speed, etc.
            // while a Sequence is a named entry in the SequencePack file
            Spriteset ss = Spriteset.FromFile("SOTB");
            SequencePack sp = SequencePack.FromFile("SOTB.sqx");
            Sequence walk = sp.Find("walk");

            // Now finally create a sprite using our spritesheet
            Sprite sprite = new Sprite();
            sprite.Setup(ss,TileFlags.None);

            int spriteX = 15;
            sprite.SetPosition(spriteX, 185);
            
            // Now play the animation sequence named "walk".  The final 0 tells it how many
            // times the animation should loop; zero means loop forever
            Animation anim = new Animation();
            anim.SetSpriteAnimation(0,walk,0);
            
            
            int frame = 0;

            // This is your game loop
            while(window.Process()){
                // Draw the current frame of graphics (sprites, layers, etc)
                window.DrawFrame(frame++);

                // Now check if left or right arrow/gamepad are pressed, in which case move in that direction
                // IF moving left, flip the sprite over on the X axis
                if(window.GetInput(Input.Right)){
                    spriteX ++;
                    sprite.Flags = TileFlags.None; 
                }
                if(window.GetInput(Input.Left)){
                    spriteX --;
                    sprite.Flags = TileFlags.FlipX; 

                }
                sprite.SetPosition(spriteX, 185);
                if(spriteX > engine.Width) spriteX = 0;
            }
            

            //Cleanup
            tilemap.Delete();
            tileset.Delete();
            window.Delete();
            engine.Deinit();
        }
    }
}

 

The comments pretty much describe everything that is going on there.  For more details, be sure to check the video version of this tutorial available here [coming soon].  This example loads a sprite and animation from the game Shadow of the Beast, an Amiga platformer classic.  The SequencePack format is an extremely simple XML file; here is the example used:

<?xml version="1.0" encoding="UTF-8"?>

<sequences>
  <sequence name="walk" delay="6" loop="0">
    1,2,3,4,5,6
  </sequence>
</sequences>

 

The tsx and tmx files are generated using the Tiled level editor, another free and open source tool.  As you can see, it’s extremely simple to get up and going.  Run this code and you will see:

[Screenshot of the running Shadow of the Beast example]

 

This is of course a primitive example, but it does show many of the parts of a game: a game loop, sprite loading, animations, level loading, etc.  The major features of the engine that I’m not covering here are the various sprite effects it emulates.  You can see these effects demonstrated here or in the samples.

 


Programming

12. January 2017

 

Unity have just released patch 5.5.0p4.  It contains no new functionality, but does include a large number of fixes:

Fixes
  • (858785) - Analytics: Fixed occasional windows editor crash on shutdown.
  • (827110) - Android: Disabled fence sync on poor performing drivers.
  • (855603, 859268) - Android: Fixed a crash when reloading or resuming scene which uses WebCamTexture.
  • (848830) - Android: Fixed an exception when trying to build to a non-existent path.
  • (845080) - Android: Fixed an issue where pausing during the splash screen would cause sprites to be black.
  • (855612) - Android: Gradle build and project export now support icon override.
  • (803872) - Android: Post process now executed before app is pushed to device.
  • (855545) - Animation: Fix for a crash using Resources.UnloadUnusedAssets with Animators caused by orphaned references.
  • (858208) - AR: Fixed a crash when exiting play mode during a Holographic Simulation session.
  • (848920, 858080, 860956) - Asset and Scene Management: Fixed a bug where importing multiple native DCC source files resulted in the contents of imported prefabs randomly switching places.
  • (849875) - Collab: Fixed an issue whereby the collab toolbar continued to have 'sign in' button even after signing in from the collab tool bar.
  • (859350) - Editor: Fixed an editor crash when switching platforms on a command line build.
  • (none) - Editor: Fixed local cache server not working if there were spaces in the path to the Unity Editor executable.
  • (858043) - Editor: Fixed Sprite Editor not always grid slicing fully black sprites correctly.
  • (793891) - Editor: Fixed launching Unity getting stuck on a grey screen for a minute or longer when your internet connection is bad.
  • (834243) - Editor: Fixed the issue that personal user is able to skip a mandatory survey.
  • (857504) - Editor: Fixed the splash screen 'Preview' button showing the NoiseModule preview texture.
  • (828286) - Editor: Reduced heap allocations for each frame when rotating the scene view in the editor.
  • (849376) - Graphics: Fixed a bug when importing Alpha8 textures which didn't import them as a single channel texture.
  • (852116) - Graphics: Fixed a crash during texture importing if the import failed.
  • (853722) - Graphics: Fixed a crash during texture importing if the texture format wasn't supported by the platform.
  • (825464) - Graphics: Fixed console error generated when using WWW.movie to create a movie texture.
  • (767034) - Graphics: Fixed errors spamming the console when performing GPU profiling on a DX11 Standalone build.
  • (none) - Graphics: Fixed the GPU Profiler in standalone mode.
  • (732380) - Graphics: Stopped rendering projectors twice if there is any transparent object visible to the camera.
  • (849356) - Graphics: Stopped the texture importer ignoring pure white Alpha channels by default. It is now a user option to choose to ignore it.
  • (810286) - iOS: Fix 2nd stage splash on iPhones with landscapeRight orientation
  • (856989) - iOS: Fixed crash in application:openURL:sourceApplication:annotation due to missing null check with Facebook SDK.
  • (831195) - iOS: Removed extra offset in constraint in default launchscreens.
  • (851764) - Lighting: Only clear the lighting progress bar when lighting is in progress. Stops lighting system accidentally closing other (non-lighting) progress bars.
  • (867312) - Metal: Fixed a memory leak when loading scene.
  • (807091) - Multiplayer: Fixed hostmigration sync issue.
  • (853316, 826931) - Multiplayer: Made sure isLocalPlayer works as expected on OnDestroy.
  • (none) - Networking: Skip proxy check when using the "file://" protocol on Windows.
  • (828188) - Physics: Display message in Inspector for Rigidbody2D when auto-mass is used on a Prefab or an inactive object.
  • (829769) - Physics 2D: Ensure that Rigidbody2D interpolation is reset if the Transform rotation is changed.
  • (715922) - Physics 2D: Fixed some 2D polygon outlines that were almost collinear causing collision detection problems.
  • (764734) - Shadows: Fixed a memory leak and assert when shadows are cast from lights with specific properties and in a specific scene setup.
  • (857270) - Substance: Fixed a crash when compressing small non-square textures to ETC with 'fast' quality.
  • (none) - Test: Corrected CHECK_EQUAL parameter and added another check for validation call count.
  • (856733) - UI: Fixed a curve preview cache not updating preview if curve data had changed but not the bounds.
  • (845756) - UI: Fixed a NullReferenceException when changing font to none.
  • (861467) - VCS: Don't attempt to connect to a Perforce server if any of the following parameters are unset: Server, User or Client.
  • (none) - VR: Fixed the usage of VRSettings.renderViewportScale in Camera's OnPreCull so that it was not a frame latent on all supporting SDKs. Fixed issues with Valve's Renderer adaptive quality feature.
  • (858634) - VR: Y-Axis Range For VR Controllers now match XBox controllers
  • (759286, 782587) - WebGL: Disabled deferred rendering on webgl1.0.
  • (850383) - WebGL: Fixed build with improperly tagged plugins
  • (none) - WebGL: Fixed Content-Length header field for local web server response that caused some audio files to have .duration == inf or zero. This was not reproducible with the normal workflow but could occur with a specific asset store plugin for webgl audio streaming.

As always, the patch is available for download here.

GameDev News

10. January 2017

 

Carmel is a new WebVR browser for the Oculus Gear VR, currently available in preview form.  WebVR is an attempt to bring virtual reality to the web browser, while Carmel is Oculus’ mobile browser supporting it.  You can learn more about the VR web here.  In a nutshell, WebVR is going to enable you to write VR experiences in much the same way you create dynamic web pages today.  To help with that, Oculus have released the Carmel VR Starter Kit, a set of samples and examples to get you up and running in WebVR for the Carmel browser.

 

You can download the starter kit on Github, and getting started is easy assuming you have Node.js installed.

Running Samples

First, run npm install to get the npm dependencies used for hosting the samples locally.

Run npm start to start a local http server on port 8000.

You can navigate to http://<your machine's IP>:8000/index.html to access the samples on a Gear VR enabled mobile device.

The top category of links will launch each sample into the Carmel Technical Preview.

The bottom category of links can be used to launch each sample directly in your browser if the sample supports rendering monoscopically.

 

Here is a look at the Hello World example’s code:

<!DOCTYPE html>
<!--
  Copyright 2016-present, Oculus VR, LLC.
  All rights reserved.

  This source code is licensed under the license found in the
  LICENSE-examples file in the root directory of this source tree.
-->
<html>
  <head>
    <title>Hello WebVR</title>
    <style>
      body {
        margin: 0;
      }
      canvas {
        position: absolute;
        width: 100%;
        height: 100%;
      }
      #messages {
        position: absolute;
        color: white;
        width: 100%;
        height: 100%;
      }
    </style>
    <script>
      var vrDisplay;      // The VRDisplay we will present to, discovered from getVRDisplays
      var frameData;      // HMD information, populated each frame by getFrameData
      var layerSource;    // The source of the VRLayer passed to requestPresent, our canvas element

      var gl;             // The webgl context of the canvas element, used to render the scene
      var quadProgram;    // The WebGLProgram we will create, a simple quad rendering program
      var attribs;        // A map of shader attributes to their location in the program
      var uniforms;       // A map of shader uniforms to their location in the program
      var vertBuffer;     // Vertex buffer used for rendering the scene
      var texture;        // The texture that will be bound to the diffuse sampler
      var quadModelMat;   // The quad's model matrix which we will animate

      // This is the entrypoint to this sample and where we attempt to begin VR presentation
      function requestPresent() {
        // First, initialize our WebGL program for rendering a simple quad
        initWebGLProgram();

        // Next, we will get the first VRDisplay that is available and try to requestPresent.
        // If VR is unavailable or we aren't able to present, we will simply display an HTML message in the page.
        if (navigator.getVRDisplays) {
          navigator.getVRDisplays().then(function (displays) {
            if (displays.length > 0) {
              // We reuse this every frame to avoid generating garbage
              frameData = new VRFrameData();

              vrDisplay = displays[0];

              // We must adjust the canvas (our VRLayer source) to match the VRDisplay
              var leftEye = vrDisplay.getEyeParameters("left");
              var rightEye = vrDisplay.getEyeParameters("right");

              // This layer source is a canvas so we will update its width and height based on the eye parameters.
              // For simplicity we will render each eye at the same resolution
              layerSource.width = Math.max(leftEye.renderWidth, rightEye.renderWidth) * 2;
              layerSource.height = Math.max(leftEye.renderHeight, rightEye.renderHeight);

              // This can normally only be called in response to a user gesture.
              // In Carmel, we can begin presenting the VR scene right away.
              vrDisplay.requestPresent([{ source: layerSource }]).then(function () {
                // Start our render loop, which is synchronized with the VRDisplay refresh rate
                vrDisplay.requestAnimationFrame(onAnimationFrame);
              }).catch(function (err) {
                // The Carmel Developer preview allows entry into VR at any time because it is a VR first experience.
                // Other browsers will only allow this to succeed if called in response to user interaction, such as a click or tap though.
                // We expect this to fail outside of Carmel and would present the user with an "Enter VR" button of some sort instead.
                addHTMLMessage("Failed to requestPresent.");
              });
            } else {
              // Usually you would want to hook the vrdisplayconnect event and only try to request present then.
              addHTMLMessage("There are no VR displays connected.");
            }
          }).catch(function (err) {
            addHTMLMessage("VR Displays are not accessible in this context.  Perhaps you are in an iframe without the allowvr attribute specified.");
          });
        } else {
          addHTMLMessage("WebVR is not supported on this browser.");
          addHTMLMessage("To support progressive enhancement your fallback code should render a normal Canvas based WebGL experience for the user.");
        }
      }

      // Once we are presenting this will get called any time a new frame should be rendered on the VRDisplay
      // The timestamp passed to our callback is the current DOMHighResTimeStamp at the start of the frame.
      // We can use the timestamp to update our scene and perform animations in a framerate independent way.
      function onAnimationFrame(timestamp) {
        // Continue to request frames to keep the render loop going
        vrDisplay.requestAnimationFrame(onAnimationFrame);

        // Clear the layer source - we do this outside of render to avoid clearing twice
        gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

        // Update the scene once per frame
        update(timestamp);

        // Get the current pose data
        vrDisplay.getFrameData(frameData);

        // Render the left eye
        gl.viewport(0, 0, layerSource.width * 0.5, layerSource.height);
        render(frameData.leftProjectionMatrix, frameData.leftViewMatrix);

        // Render the right eye
        gl.viewport(layerSource.width * 0.5, 0, layerSource.width * 0.5, layerSource.height);
        render(frameData.rightProjectionMatrix, frameData.rightViewMatrix);

        // Submit the newly rendered layer to be presented by the VRDisplay
        vrDisplay.submitFrame();
      }

      function update(timestamp) {
        // Animate the z location of the quad based on the current frame timestamp
        var oscillationSpeed = Math.PI / 2;
        var z = -1 + Math.cos(oscillationSpeed * timestamp / 1000);
        quadModelMat[14] = z;
      }

      // For VR, it's important that your render method is parameterized by the camera
      // (projection and view matrices) so that it can be used to render from each eye's perspective
      function render(projectionMat, viewMat) {
        gl.useProgram(quadProgram);

        // The view and projection uniforms are passed in and are different for the left eye and right eye
        gl.uniformMatrix4fv(uniforms.projectionMat, false, projectionMat);
        gl.uniformMatrix4fv(uniforms.viewMat, false, viewMat);

        // The remainder of our rendering is the same for both eyes now that view and projection have been set up.
        gl.uniformMatrix4fv(uniforms.modelMat, false, quadModelMat);

        gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);

        gl.enableVertexAttribArray(attribs.position);
        gl.enableVertexAttribArray(attribs.texCoord);

        gl.vertexAttribPointer(attribs.position, 3, gl.FLOAT, false, 20, 0);
        gl.vertexAttribPointer(attribs.texCoord, 2, gl.FLOAT, false, 20, 12);

        gl.activeTexture(gl.TEXTURE0);
        gl.uniform1i(uniforms.diffuse, 0);
        gl.bindTexture(gl.TEXTURE_2D, texture);

        gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
      }

      function initWebGLProgram() {
        layerSource = document.getElementById("webgl-canvas");

        var glAttribs = {
          alpha: false,                   // The canvas will not contain an alpha channel
          antialias: true,                // We want the canvas to perform anti-aliasing
          preserveDrawingBuffer: false    // We don't want our drawing to be retained between frames, we will fully rerender each frame.
        };

        // You should also check for "experimental-webgl" when implementing support for canvas based WebGL fallback when VR is not available.
        gl = layerSource.getContext("webgl", glAttribs);

        var quadVS = [
          "uniform mat4 projectionMat;",
          "uniform mat4 viewMat;",
          "uniform mat4 modelMat;",
          "attribute vec3 position;",
          "attribute vec2 texCoord;",
          "varying vec2 vTexCoord;",

          "void main() {",
          "  vTexCoord = texCoord;",
          "  gl_Position = projectionMat * viewMat * modelMat * vec4(position, 1.0);",
          "}",
        ].join("\n");

        var quadFS = [
          "precision mediump float;",
          "uniform sampler2D diffuse;",
          "varying vec2 vTexCoord;",

          "void main() {",
          "  gl_FragColor = texture2D(diffuse, vTexCoord);",
          "}",
        ].join("\n");

        quadProgram = gl.createProgram();

        var vertexShader = gl.createShader(gl.VERTEX_SHADER);
        gl.attachShader(quadProgram, vertexShader);
        gl.shaderSource(vertexShader, quadVS);
        gl.compileShader(vertexShader);

        var fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
        gl.attachShader(quadProgram, fragmentShader);
        gl.shaderSource(fragmentShader, quadFS);
        gl.compileShader(fragmentShader);

        attribs = {
          position: 0,
          texCoord: 1
        };

        gl.bindAttribLocation(quadProgram, attribs.position, "position");
        gl.bindAttribLocation(quadProgram, attribs.texCoord, "texCoord");

        gl.linkProgram(quadProgram);

        uniforms = {
          projectionMat: gl.getUniformLocation(quadProgram, "projectionMat"),
          modelMat: gl.getUniformLocation(quadProgram, "modelMat"),
          viewMat: gl.getUniformLocation(quadProgram, "viewMat"),
          diffuse: gl.getUniformLocation(quadProgram, "diffuse")
        };

        var size = 0.2;
        var quadVerts = [];

        var x = 0;
        var y = 0;
        var z = -1;
        quadVerts.push(x - size, y - size, z + size, 0.0, 1.0);
        quadVerts.push(x + size, y - size, z + size, 1.0, 1.0);
        quadVerts.push(x + size, y + size, z + size, 1.0, 0.0);
        quadVerts.push(x - size, y + size, z + size, 0.0, 0.0);

        vertBuffer = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
        gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(quadVerts), gl.STATIC_DRAW);

        quadModelMat = new Float32Array([
          1, 0, 0, 0,
          0, 1, 0, 0,
          0, 0, 1, 0,
          0, 0, 0, 1
        ]);

        texture = gl.createTexture();

        var image = new Image();

        // When the image is loaded, we will copy it to the GL texture
        image.addEventListener("load", function() {
          gl.bindTexture(gl.TEXTURE_2D, texture);
          gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);

          gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
          gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);

          // To avoid bad aliasing artifacts we will generate mip maps to use when rendering this texture at various distances
          gl.generateMipmap(gl.TEXTURE_2D);
        }, false);

        // Start loading the image
        image.src = "../assets/cube-sea.png";
      }

      function addHTMLMessage(msgText) {
        var message = document.createElement("div");
        message.innerHTML = msgText;
        document.getElementById("messages").appendChild(message);
      }
    </script>
  </head>
  <body onload="requestPresent()">
    <canvas id="webgl-canvas"></canvas>
    <div id="messages"></div>
  </body>
</html>

News

6. January 2017

 

Back in September I reported on the impending end of development for the cross-platform C++ game framework Marmalade.  It was slated to see its final release in March of 2017.  According to an email I received earlier today, it seems a last minute savior has stepped in: GMO Cloud K.K.  Not a traditional gaming company, GMO Cloud is a Japanese group of companies known primarily for cloud hosting and security solutions, perhaps most famous in the west as the parent company of GlobalSign, one of the oldest SSL certificate companies in existence.  While not a gaming company, it does appear that they intend to continue development of Marmalade and to keep supporting Marmalade developers.  An excerpt from their press release:

GMO Cloud K.K. (President and CEO: Mitsuru Aoyama, “GMO Cloud”), a member of the GMO Internet Group, was granted the exclusive rights to use the Marmalade software development kit (SDK) by Marmalade Technologies Ltd (Headquarters: London, UK, “Marmalade Technologies”) on November 24, 2016. The Marmalade SDK features cross platform support.


 This license will allow GMO Cloud to accelerate the development of and the ability to offer new features for the Marmalade SDK. GMO Cloud wants to increase its ability to support the development of games and apps in both Japan and abroad in order to expand and enhance sales and support throughout the world.

Followed by

To help meet these needs, GMO Cloud obtained a license to resell the cross platform SDK “Marmalade,” which enables the development of games and apps compiled from a single source code. They released the SDK on July 1, 2015 and currently offer it to game developers in Japan.


 “We are very excited about GMO managing the development and distribution of the Marmalade SDK to enable our customers to continue building compelling content with the high performance and flexible C++ framework that the Marmalade SDK delivers” said Bruce Beckloff chairman and CEO Marmalade Technologies Ltd.


 GMO Cloud believes that by performing everything from development to provision of the SDK that it had previously only sold, it will be able to further stimulate the development of games and apps in Japan and around the world. The company has thus been transferred exclusive rights to use Marmalade from Marmalade Technologies. In addition to selling it in the Japanese and overseas markets, GMO Cloud will develop new features for this tool that will make developers’ lives easier. The company will also enhance the sales and support that it offers in Japan and elsewhere to make Marmalade a more popular SDK, continuing to support the evolution of the game and app development market.

 

You can read the full press release here.  This is certainly good news for existing Marmalade developers.  It will be interesting to see what the future holds.

GameDev News
