Unreal Engine 4.13 Preview Release

 

Epic has just released a preview of the upcoming Unreal Engine 4.13 release.  Of course this is not a production release, so expect some warts and troubles.  As always, the release is available from the Epic Games Launcher or via GitHub.  There are a number of new features in this release, including:

 

  • Sequencer Updates:
    • Sequencer has a variety of new import/export capabilities:
      • Can import and export .cmx and .edl files for interchange with non-linear editing packages. Each shot in a sequence will be written to a separate movie file, which is referenced by the EDL file
      • You can apply a burn-in to the exported image
      • The ability to export HDR data in OpenEXR files has been expanded to give a choice of the color gamut used to encode the HDR data
      • Supports importing FBX animation directly to an object or track. You can also export animated tracks to FBX
    • Sequence Recorder improvements as shown at SIGGRAPH Real-Time Live! 2016
  • Framework Updates:
    • A new Physical Animation Component (Experimental) allows you to drive skeletal mesh animation through physical motors. The component allows you to set motor strengths directly, as well as using pre-configured physical animation profiles which can be created and edited inside PhAT.
    • A new Pose Driver Animation node (Experimental). This uses an RBF (Radial Basis Function) to interpolate driven values based on the pose of a target bone. You use a PoseAsset to define the target poses for the bone, and the desired curve values at each pose. This node can be used as a Pose Space Deformer, to drive corrective blend shapes based on the orientation of a bone.
    • Physics Constraint Profiles can be used to switch between pre-configured joint setups authored in PhAT. This allows for fine tuning of physics assets based on different game contexts (e.g. full ragdoll vs. dangly bits)
    • Sub Animation Blueprints allow you to share animation logic by using a ‘Sub Anim Instance’ node within your Animation Blueprint to reference another Sub Animation Blueprint. This also allows you to break up large Animation Blueprints, for example into ‘locomotion’ and ‘physics’ parts. Member variables of the Sub Blueprint can be exposed as input pins on the node. The Sub Animation Blueprint must use the same Skeleton as the outer Animation Blueprint.
    • A new Pose Asset type interprets each key as a pose, and you can blend between poses. This is to support facial animation, where either FACS (Facial Action Coding System) or viseme curves can drive poses. It supports bone transforms as well as morph targets.
      • Currently you can create a Pose Asset from an AnimSequence using either the context menu or the Create Asset menu in Persona. Once you create the Pose Asset, the pose names are auto-generated by default. You can rename each pose or paste names from the clipboard.
      • Poses are driven by normal animation curves; as long as the curves exist in the animation you can see them. In Persona, to preview a pose from a curve, set a Preview Pose Asset. In the AnimGraph, use the Pose Blender node (or Pose By Name) to output the pose according to its curves.
    • Animation Curve Viewer: We removed the Skeleton Curve tab from Persona and moved that functionality into the improved Animation Curves tab. Here you can rename and delete curves, as well as preview curve data.
      • You can see all curves that belong to the current Skeleton, or only the curves currently active on the preview asset, and you can also filter by curve type. Please note that the default curve type is named “Attribute”, so animation curves are Attribute curves by default.
      • If you want to modify a curve value, you can either turn off the ‘Auto’ checkbox or just type the value.
    • An Alembic Importer for Vertex Animation (Experimental) has been added in the form of a plugin. It allows importing caches in different ways:
      • Single frame – Static Mesh (the specific frame will be imported as a static mesh)
      • Frame Sequence
        • Geometry Cache (a new asset type that allows playing back vertex-varying sequences). The imported sequence is played back as a flipbook of frames, which means performance will not be optimal and will scale with mesh complexity
        • Skeletal Mesh (only supported for playing back non-vertex-varying sequences). During the import process the animation sequence is compressed using a PCA scheme, in which common poses (bases) are extracted and weighted to recompose the original animation at playback time. The percentage or fixed number of bases used can be set during import to tweak the level of compression
    • Preview Scene Improvements for Static and Skeletal Meshes. To access this functionality, a new panel named Preview Scene Settings has been added to both editors. Here you can set up multiple profiles (scenes) to preview your meshes; each profile allows changing the Directional Light, Sky Light and Post Processing settings.
    • A project setting to support Texture Coordinate (UV) info from Line Traces. The option is under Project Settings -> Physics -> Optimizations. When it is enabled, you can use the ‘FindCollisionUV’ function to take a HitResult and find the UV info for any UV channel at the point of impact (see the C++ sketch after the feature list). Enabling this feature does use extra memory, as a copy of the UV information must be stored in CPU memory.
    • Slicing Utility for ProceduralMeshComponent: There is a new utility in the ProceduralMeshComponent plugin which will ‘slice’ a ProceduralMesh at runtime using a plane, adding ‘capping’ geometry and creating a second ProceduralMeshComponent for the ‘other half’ if desired (a sketch follows the feature list). ProceduralMeshes also support simple collision, so they can use physics simulation. We also added a utility to copy data from a StaticMesh to a ProceduralMesh (though the new ‘Allow CPU Access’ flag must be set on the StaticMesh for this to work in cooked builds).
    • Animation Node Pose Watching: Anim graph nodes can be “watched” in Persona. This allows you to see a representation of the pose being generated at any point in the anim graph dynamically. Multiple watches can be active at once, allowing you to compare poses at different points and find the exact point at which any errors in your current pose are introduced.
    • Threaded Audio (Experimental): Outside of the editor, sound cue evaluation and active wave instance determination can run independently of the game thread.
    • TSet Property: TSet can be used as a UPROPERTY, subject to the same limitations as TMap (a declaration sketch follows the feature list).
      • It does not work with replication, has no Blueprint support, and details panel editing is limited to entering values as a string (i.e. “(1,2,3)”). Also, if you have a TSet of UObjects and one of those UObjects is marked pending kill, the reference will be nulled on GC, but it will still be hashed as if it were the original UObject, and it is possible to have multiple of these nulls
  • VR Editor Updates:
    • Foliage Editing in VR allows you to use motion controllers to spray down foliage instances. Pressure sensitivity on the trigger is supported, and you can hold the ‘Modifier’ button to erase foliage.
      • Some features (lasso tool and select tool) are still unavailable.
    • Mesh Painting in VR allows you to use motion controllers to paint on textures and mesh vertices. Pressure sensitivity on the trigger is supported, and you can hold the ‘Modifier’ button to erase.
      • To use this feature, open the “Modes” window in VR, then click the “Mesh Paint” tab.
    • Automatic Entry to VR Editing Mode so you can enter and leave VR editing mode when the VR editor is enabled without having to use the VR button or escape manually! As long as the editor is in the foreground, when you wear the headset, you will automatically enter VR editing mode; when you remove the headset, you will leave it.
      • There is a setting under VR in the Experimental section of Editor Settings that will allow you to turn off auto-entry if you prefer.
    • Color Picker window in VR so you can change color properties on lights and other Actors in your level. You can also use the Color Picker to select colors for Vertex Painting and Texture Painting in VR
    • New Quick Menu Options let you add a flashlight to your controller, to light up dark parts of your scene or see how light interacts with different Materials. You can also take screenshots right from VR
    • Play from VR Editor: To easily prototype your project it is possible to start your project in VR from within the VR Editor. Press the “Play” button on the quick menu to start PIE in VR. To instantly go back to the VR Editor from PIE press the 2 triggers and a grip button on each motion controller.
  • VR Updates:
    • A number of improvements to Instanced Stereo Rendering including moving the velocity pass to use instanced stereo rendering. Multi-view support has also been enabled on the PS4, which leads to significant performance improvements when using the ISR path
    • VR Template for Desktop and Console
  • Rendering Updates:
    • Drawing to render targets from Blueprint. Blueprint functions can be used to draw materials into render targets. This enables a huge variety of game-specific rendering effects to be implemented without having to modify source code.
      • The new Blueprint function DrawMaterialToRenderTarget draws a quad filling the destination render target with the EmissiveColor input in the material.
      • For more advanced drawing to a render target, use Begin/EndDrawCanvasToRenderTarget. These allow multiple efficient draws to a subset of the render target, as well as access to the font drawing methods of the Canvas object (a C++ sketch follows the feature list).
    • Scene Captures have been improved to work better with Blueprint Render to Texture
    • Cached shadowmaps for movable point and spot lights: When a point or spot light is not moving, we can store off the shadowmap for that light and reuse it next frame.
    • Several improvements to the Noise Material Node. Performance has been improved for several of the functions available in this node, with more detailed descriptions of the performance tradeoffs in the function selection tooltips. Most of these are still slow for runtime use, so baking the results into a texture is encouraged.
      • There is a new Voronoi noise option available for the Noise material node, which can be useful for procedural material creation.
      • The noise functions “Gradient – Texture Based”, “Gradient – Computational”, “Value – Computational”, and “Voronoi” can all be set to tile with an integer tiling period. This can be especially useful for baking into a seamless texture.
    • Morph Targets on GPU: Projects can enable calculating morph targets on the GPU on Shader Model 5 level hardware. This frees the CPU from performing those calculations
  • Platform Updates:
    • Mac Metal Shader Model 5 has initial support. This exposes all the available features of Metal on Mac OS X 10.11.6 that are applicable to Unreal Engine 4.
      • Implements the RHI thread & parallel translation features to parallelise render command dispatch
      • Exposes support for Metal compute shaders
      • Exposes asynchronous compute support on AMD GPUs
      • Enables high-end rendering features previously unavailable on Mac, including:
        • High quality dynamic exposure (a.k.a. Eye Adaptation)
        • Compute-shader reflection environments – only available on discrete GPUs for 4.13
        • DistanceField Ambient Occlusion – only available on discrete GPUs for 4.13
        • DistanceField Shadowing – only available on discrete GPUs for 4.13
  • Mobile Updates:
    • Custom Post-Process materials can be used on Mobile devices
      • Requires MobileHDR and non-Mosaic devices
      • Supports only fetching from PostProcessInput0 (SceneColor) with blendable location ‘Before Tonemapping’ or ‘After Tonemapping’
    • Lighting Channels are supported
      • Multiple directional lights are supported in different channels
      • Each primitive can only be affected by one directional light, and it will use the directional light from the first lighting channel it has set
      • CSM shadows from stationary or movable directional lights cast only on primitives with matching lighting channels
      • Dynamic point lights fully support lighting channels
    • Android Automation Testing: The Project Launcher is able to package and launch your project onto multiple Android devices simultaneously. The app running on each device will communicate back to your host PC over the USB cable and will appear in the Session Frontend window. You can then launch Automated Tests on all the devices and see the results in the Session Frontend.
    • Mobile Renderer supports OpenGL ES 3.1 on Android.
      • You can choose to package both ES 2.0 and ES 3.1 shaders for the same project, and the device will choose the best shader platform based on its capabilities.
    • A Mobile Packaging Wizard helps with packaging for mobile, where a minimal app without any content is uploaded to an app store and the rest of the content is downloaded from the cloud.
      • The wizard can be accessed from the Project Launcher window and will help you create launcher profiles
    • Materials have an option to use Full Precision in pixel shaders when used on Mobile devices.
  • Network Updates:
    • Replay Backwards Compatibility is supported. This means you can make modifications to a build, even adding or removing replicated properties, and then load replays recorded with an older build on the new build.
      • Most of the work is handled by the low level reflection information we have for each replicated property. For custom serialized network data (UObject::NetSerialize), you can use two new functions added to FArchive (FArchive::EngineNetVer() and FArchive::GameNetVer()), which allow you to obtain the current network version of the stream and handle old data manually (a sketch follows the feature list).
      • To test this feature out, you can simply record a replay with a certain build, change some replicated properties, and then load that same replay on a newer build!
  • Editor/Tools Updates:
    • Localized Text Formatting Improvements allow your translations to be more accurate
      • Plural Forms allow you to use different text based upon a numeric variable given to your text format
      • Gender Forms allow you to use different text based upon an ETextGender value given to your text format
      • Hangul Post-Positions help you deal with the grammar rules present in Korean, and will insert the correct glyph(s) based upon whether the value being inserted ends in a consonant or a vowel
      • To allow you to pass in the numeric/gender values needed for plural/gender form support, all of the FText::Format(…) family of functions take their values as FFormatArgumentValue rather than FText. This can be implicitly constructed from any numeric type, ETextGender, or FText. The ability to set these value types in Blueprints has been exposed using wildcard pins on the “Format Text” node.
      • You can also pre-compile your format pattern if you’re going to be re-using it for multiple calls to FText::Format(…). Simply create and store an FTextFormat instance and pass it as the pattern to FText::Format(…) (a short example follows the feature list).
    • Using the Widget Interaction Component, you can simulate hardware input events with widget components in the world.
    • Widget Sprites: You can use Paper2D Sprites as Brush inputs for UMG and Slate Widgets.
  • Media Framework Updates: (not tested for all platforms)
    • Media Framework Overhaul
      • Media Framework API has been completely overhauled and simplified
      • Media sources are assets in the Content Browser
      • Media library tab in Media Player editor
      • Playlist assets for playing multiple media sources in a row
      • Audio playback support has been added
      • Improved media file import workflow
      • Improved Blueprint integration
      • Performance improvements on several platforms
      • Pixel format conversion on the GPU
      • Support for dynamically changing video dimensions on some platforms
    • Android Update
      • Support for multiple audio tracks
      • HTTP Live Streaming (HLS) on devices supporting it (m3u8)
    • PS4 Update
      • HTTP Live Streaming (HLS)
      • Improved playback controls (Pause, SetRate, etc.)
      • Media files can be pre-cached to memory
      • Opening media from FArchive
    • Windows Update
      • H.264 is supported
      • Better support for HTTP(S) and RTSP streams
      • Better error handling and logging
      • Stability and usability improvements
      • Graceful handling of non-standard & unsupported codecs
  • Landscape Updates:
    • Optimization of Landscape Material Shader permutations that improves editor iteration and reduces memory usage and package size for landscape materials
    • Landscape Tessellation Performance Improvements by rendering with tessellation only on the highest LOD
  • Build Updates:
    • New script for making Installed Builds: The ‘Rocket’ build process has been re-written using our new BuildGraph script (similar in style to MSBuild scripts), which should make the process easier to follow and modify for other users who want to make their own Installed builds. The main script can be found at Engine/Build/InstalledEngineBuild.xml and can be run using one of the following command lines from AutomationTool:
      • BuildGraph -target="Make Installed Build Win64" -script=Engine/Build/InstalledEngineBuild.xml
      • BuildGraph -target="Make Installed Build Mac" -script=Engine/Build/InstalledEngineBuild.xml
    • If you run one of these with -listonly added to the command, you will be able to see what will be built and a list of additional options you can specify. By default it will attempt to build every target platform your host machine is capable of making builds for, except for XboxOne and PS4, which are disabled by default. You can disable target platforms by adding -set:WithWin64=false to the command line, and also skip over the creation of DDC for Engine and Template content by passing -WithDDC=false.
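
Here is a rough C++ sketch of the new line trace UV lookup. The release notes only name the ‘FindCollisionUV’ function; exposing it through UGameplayStatics, and the AMyCharacter class used here, are my assumptions, so verify against the 4.13 headers before copying.

    // Sketch: line trace that resolves the UV at the hit point.
    // Requires the new UV project setting (Project Settings -> Physics -> Optimizations),
    // otherwise no UV data is kept in CPU memory.
    #include "Kismet/GameplayStatics.h"

    void AMyCharacter::TraceForUV()
    {
        const FVector Start = GetActorLocation();
        const FVector End = Start + GetActorForwardVector() * 1000.0f;

        FCollisionQueryParams Params(FName(TEXT("UVTrace")), /*bTraceComplex=*/ true, this);
        Params.bReturnFaceIndex = true; // the face index is needed to look up UVs

        FHitResult Hit;
        if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
        {
            FVector2D UV;
            if (UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/ 0, UV))
            {
                UE_LOG(LogTemp, Log, TEXT("Hit UV: %s"), *UV.ToString());
            }
        }
    }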
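
A rough sketch of the ProceduralMeshComponent slicing utility, assuming the helper lives in the plugin’s UKismetProceduralMeshLibrary; treat the exact function name and parameter list as assumptions to check against the 4.13 plugin source.

    // Sketch: slice a procedural mesh with a plane, cap the cut, and let the other
    // half simulate physics (ProceduralMeshes support simple collision).
    #include "ProceduralMeshComponent.h"
    #include "KismetProceduralMeshLibrary.h"

    void SliceInHalf(UProceduralMeshComponent* ProcMesh, const FVector& PlanePos, const FVector& PlaneNormal)
    {
        UProceduralMeshComponent* OtherHalf = nullptr;

        UKismetProceduralMeshLibrary::SliceProceduralMesh(
            ProcMesh,
            PlanePos,
            PlaneNormal,
            /*bCreateOtherHalf=*/ true,
            OtherHalf,
            EProcMeshSliceCapOption::CreateNewSectionForCap,
            /*CapMaterial=*/ nullptr);

        if (OtherHalf)
        {
            OtherHalf->SetSimulatePhysics(true);
        }
    }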
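
The TSet property support needs nothing more than a declaration. A minimal sketch, with made-up actor, file and property names:

    // Sketch: TSet as a UPROPERTY, with the same limitations as TMap
    // (no replication, no Blueprint access, string-style details panel editing).
    #include "GameFramework/Actor.h"
    #include "MyActor.generated.h"

    UCLASS()
    class AMyActor : public AActor
    {
        GENERATED_BODY()

    public:
        // Edited in the details panel by typing a string such as (1,2,3)
        UPROPERTY(EditAnywhere, Category = "Example")
        TSet<int32> SpawnIds;
    };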
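
For drawing to render targets, the notes name the DrawMaterialToRenderTarget and Begin/EndDrawCanvasToRenderTarget Blueprint functions. The C++ sketch below assumes their native equivalents sit in UKismetRenderingLibrary and that the Canvas text helper looks as it does in other engine versions, so double-check the 4.13 source.

    // Sketch: fill a render target with a material, then draw some text over it.
    #include "Kismet/KismetRenderingLibrary.h"
    #include "Engine/TextureRenderTarget2D.h"
    #include "Engine/Canvas.h"

    void DrawIntoTarget(UObject* WorldContext, UTextureRenderTarget2D* Target,
                        UMaterialInterface* Material, UFont* Font)
    {
        // Draws a quad filling the target with the material's Emissive Color output.
        UKismetRenderingLibrary::DrawMaterialToRenderTarget(WorldContext, Target, Material);

        // For more control, draw through a Canvas instead.
        UCanvas* Canvas = nullptr;
        FVector2D Size;
        FDrawToRenderTargetContext Context;
        UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(WorldContext, Target, Canvas, Size, Context);
        if (Canvas)
        {
            Canvas->DrawText(Font, TEXT("Hello 4.13"), Size.X * 0.25f, Size.Y * 0.25f);
        }
        UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(WorldContext, Context);
    }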
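
The replay backwards compatibility note mentions the new FArchive::EngineNetVer() and FArchive::GameNetVer() accessors. Below is a minimal sketch of how a custom NetSerialize might use the game version; the struct, its fields and the version constants are all hypothetical, and the usual USTRUCT/NetSerializer trait boilerplate and includes are omitted.

    // Sketch: gate newly added data on the game's network version so replays
    // recorded by older builds still load correctly.
    namespace MyGameNetVersion
    {
        enum Type
        {
            InitialVersion   = 0,
            AddedShieldValue = 1,   // hypothetical: Shield was added in a later build
        };
    }

    struct FMyReplicatedState
    {
        float Health = 100.0f;
        float Shield = 0.0f;

        bool NetSerialize(FArchive& Ar, class UPackageMap* Map, bool& bOutSuccess)
        {
            Ar << Health;

            // Older streams do not contain Shield, so only serialize it when the
            // stream's game net version is new enough.
            if (Ar.GameNetVer() >= MyGameNetVersion::AddedShieldValue)
            {
                Ar << Shield;
            }

            bOutSuccess = true;
            return true;
        }
    };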
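
And a short sketch of plural forms plus a pre-compiled FTextFormat pattern from the localization notes. The plural syntax follows Epic’s documented format-pattern style; the function and namespace names here are made up.

    // Sketch: plural forms plus a reusable, pre-compiled format pattern.
    #include "Internationalization/Text.h"
    #include "Internationalization/Internationalization.h"

    FText MakeCatCountText(int32 NumCats)
    {
        // Compile the pattern once and reuse it for every call.
        static const FTextFormat Pattern(NSLOCTEXT("ExampleText", "CatCount",
            "You have {NumCats} {NumCats}|plural(one=cat,other=cats)"));

        FFormatNamedArguments Args;
        Args.Add(TEXT("NumCats"), NumCats); // numeric values convert to FFormatArgumentValue directly

        return FText::Format(Pattern, Args);
    }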

GameDev News

