
10. January 2017

 

Carmel is a new WebVR browser available for the Oculus Gear VR in preview form.  WebVR is an attempt to bring virtual reality to the web browser, and Carmel is Oculus's mobile browser supporting it.  You can learn more about the VR web here. In a nutshell, WebVR will enable you to write VR experiences in much the same way you create dynamic web pages today.  To help with that, Oculus have released the Carmel VR Starter Kit, a set of samples and examples to get you up and running in WebVR for the Carmel browser.

 

You can download the starter kit on GitHub, and getting started is easy, assuming you have Node.js installed.

Running Samples

First, run npm install to get the npm dependencies used for hosting the samples locally.

Run npm start to start a local http server on port 8000.

You can navigate to http://&lt;your-machine-ip&gt;:8000/index.html to access the samples on a Gear VR enabled mobile device.

The top category of links will launch each sample into the Carmel Technical Preview.

The bottom category of links can be used to launch each sample directly in your browser if the sample supports rendering monoscopically.
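If you are curious what npm start is doing behind the scenes, it is essentially just static file hosting.  Below is a minimal sketch of a comparable server using Node's built-in http module.  To be clear, this is illustrative only and is not the starter kit's actual server script:

var http = require("http");
var fs = require("fs");
var path = require("path");

// Serve files from the current directory on port 8000, similar to "npm start"
http.createServer(function (req, res) {
  var urlPath = req.url === "/" ? "/index.html" : req.url;
  var filePath = path.join(__dirname, urlPath);

  fs.readFile(filePath, function (err, data) {
    if (err) {
      res.writeHead(404);
      res.end("Not found");
      return;
    }
    res.writeHead(200);
    res.end(data);
  });
}).listen(8000);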

 

Here is a look at the Hello World example’s code:

<!DOCTYPE html>
<!--
  Copyright 2016-present, Oculus VR, LLC.
  All rights reserved.

  This source code is licensed under the license found in the
  LICENSE-examples file in the root directory of this source tree.
-->
<html>
  <head>
    <title>Hello WebVR</title>
    <style>
      body {
        margin: 0;
      }
      canvas {
        position: absolute;
        width: 100%;
        height: 100%;
      }
      #messages {
        position: absolute;
        color: white;
        width: 100%;
        height: 100%;
      }
    </style>
    <script>
      var vrDisplay;      // The VRDisplay we will present to, discovered from getVRDisplays
      var frameData;      // HMD information, populated each frame by getFrameData
      var layerSource;    // The source of the VRLayer passed to requestPresent, our canvas element.

      var gl;             // The webgl context of the canvas element, used to render the scene
      var quadProgram;    // The WebGLProgram we will create, a simple quad rendering program
      var attribs;        // A map of shader attributes to their location in the program
      var uniforms;       // A map of shader uniforms to their location in the program
      var vertBuffer;     // Vertex buffer used for rendering the scene
      var texture;        // The texture that will be bound to the diffuse sampler
      var quadModelMat;   // The quad's model matrix which we will animate

      // This is the entrypoint to this sample and where we attempt to begin VR presentation
      function requestPresent() {
        // First, initialize our WebGL program for rendering a simple quad
        initWebGLProgram();

        // Next, we will get the first VRDisplay that is available and try to requestPresent.
        // If VR is unavailable or we aren't able to present, we will simply display an HTML message in the page.
        if (navigator.getVRDisplays) {
          navigator.getVRDisplays().then(function (displays) {
            if (displays.length > 0) {
              // We reuse this every frame to avoid generating garbage
              frameData = new VRFrameData();

              vrDisplay = displays[0];

              // We must adjust the canvas (our VRLayer source) to match the VRDisplay
              var leftEye = vrDisplay.getEyeParameters("left");
              var rightEye = vrDisplay.getEyeParameters("right");

              // This layer source is a canvas so we will update its width and height based on the eye parameters.
              // For simplicity we will render each eye at the same resolution
              layerSource.width = Math.max(leftEye.renderWidth, rightEye.renderWidth) * 2;
              layerSource.height = Math.max(leftEye.renderHeight, rightEye.renderHeight);

              // This can normally only be called in response to a user gesture.
              // In Carmel, we can begin presenting the VR scene right away.
              vrDisplay.requestPresent([{ source: layerSource }]).then(function () {
                // Start our render loop, which is synchronized with the VRDisplay refresh rate
                vrDisplay.requestAnimationFrame(onAnimationFrame);
              }).catch(function (err) {
                // The Carmel Developer preview allows entry into VR at any time because it is a VR first experience.
                // Other browsers will only allow this to succeed if called in response to user interaction, such as a click or tap though.
                // We expect this to fail outside of Carmel and would present the user with an "Enter VR" button of some sort instead.
                addHTMLMessage("Failed to requestPresent.");
              });
            } else {
              // Usually you would want to hook the vrdisplayconnect event and only try to request present then.
              addHTMLMessage("There are no VR displays connected.");
            }
          }).catch(function (err) {
            addHTMLMessage("VR Displays are not accessible in this context.  Perhaps you are in an iframe without the allowvr attribute specified.");
          });
        } else {
          addHTMLMessage("WebVR is not supported on this browser.");
          addHTMLMessage("To support progressive enhancement your fallback code should render a normal Canvas based WebGL experience for the user.");
        }
      }

      // Once we are presenting this will get called any time a new frame should be rendered on the VRDisplay.
      // The timestamp passed to our callback is the current DOMHighResTimeStamp at the start of the frame.
      // We can use the timestamp to update our scene and perform animations in a framerate independent way.
      function onAnimationFrame(timestamp) {
        // Continue to request frames to keep the render loop going
        vrDisplay.requestAnimationFrame(onAnimationFrame);

        // Clear the layer source - we do this outside of render to avoid clearing twice
        gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

        // Update the scene once per frame
        update(timestamp);

        // Get the current pose data
        vrDisplay.getFrameData(frameData);

        // Render the left eye
        gl.viewport(0, 0, layerSource.width * 0.5, layerSource.height);
        render(frameData.leftProjectionMatrix, frameData.leftViewMatrix);

        // Render the right eye
        gl.viewport(layerSource.width * 0.5, 0, layerSource.width * 0.5, layerSource.height);
        render(frameData.rightProjectionMatrix, frameData.rightViewMatrix);

        // Submit the newly rendered layer to be presented by the VRDisplay
        vrDisplay.submitFrame();
      }

      function update(timestamp) {
        // Animate the z location of the quad based on the current frame timestamp
        var oscillationSpeed = Math.PI / 2;
        var z = -1 + Math.cos(oscillationSpeed * timestamp / 1000);
        quadModelMat[14] = z;
      }

      // For VR, it's important that your render method is parameterized by the camera
      // (projection and view matrices) so that it can be used to render from each
      // eye's perspective
      function render(projectionMat, viewMat) {
        gl.useProgram(quadProgram);

        // The view and projection uniforms are passed in and are different for the left eye and right eye
        gl.uniformMatrix4fv(uniforms.projectionMat, false, projectionMat);
        gl.uniformMatrix4fv(uniforms.viewMat, false, viewMat);

        // The remainder of our rendering is the same for both eyes now that view and projection have been set up.
        gl.uniformMatrix4fv(uniforms.modelMat, false, quadModelMat);

        gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);

        gl.enableVertexAttribArray(attribs.position);
        gl.enableVertexAttribArray(attribs.texCoord);

        gl.vertexAttribPointer(attribs.position, 3, gl.FLOAT, false, 20, 0);
        gl.vertexAttribPointer(attribs.texCoord, 2, gl.FLOAT, false, 20, 12);

        gl.activeTexture(gl.TEXTURE0);
        gl.uniform1i(uniforms.diffuse, 0);
        gl.bindTexture(gl.TEXTURE_2D, texture);

        gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
      }

      function initWebGLProgram() {
        layerSource = document.getElementById("webgl-canvas");

        var glAttribs = {
          alpha: false,                   // The canvas will not contain an alpha channel
          antialias: true,                // We want the canvas to perform anti-aliasing
          preserveDrawingBuffer: false    // We don't want our drawing to be retained between frames, we will fully rerender each frame.
        };

        // You should also check for "experimental-webgl" when implementing support for canvas based WebGL fallback when VR is not available.
        gl = layerSource.getContext("webgl", glAttribs);

        var quadVS = [
          "uniform mat4 projectionMat;",
          "uniform mat4 viewMat;",
          "uniform mat4 modelMat;",
          "attribute vec3 position;",
          "attribute vec2 texCoord;",
          "varying vec2 vTexCoord;",

          "void main() {",
          "  vTexCoord = texCoord;",
          "  gl_Position = projectionMat * viewMat * modelMat * vec4(position, 1.0);",
          "}",
        ].join("\n");

        var quadFS = [
          "precision mediump float;",
          "uniform sampler2D diffuse;",
          "varying vec2 vTexCoord;",

          "void main() {",
          "  gl_FragColor = texture2D(diffuse, vTexCoord);",
          "}",
        ].join("\n");

        quadProgram = gl.createProgram();

        var vertexShader = gl.createShader(gl.VERTEX_SHADER);
        gl.attachShader(quadProgram, vertexShader);
        gl.shaderSource(vertexShader, quadVS);
        gl.compileShader(vertexShader);

        var fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
        gl.attachShader(quadProgram, fragmentShader);
        gl.shaderSource(fragmentShader, quadFS);
        gl.compileShader(fragmentShader);

        attribs = {
          position: 0,
          texCoord: 1
        };

        gl.bindAttribLocation(quadProgram, attribs.position, "position");
        gl.bindAttribLocation(quadProgram, attribs.texCoord, "texCoord");

        gl.linkProgram(quadProgram);

        uniforms = {
          projectionMat: gl.getUniformLocation(quadProgram, "projectionMat"),
          modelMat: gl.getUniformLocation(quadProgram, "modelMat"),
          viewMat: gl.getUniformLocation(quadProgram, "viewMat"),
          diffuse: gl.getUniformLocation(quadProgram, "diffuse")
        };

        var size = 0.2;
        var quadVerts = [];

        var x = 0;
        var y = 0;
        var z = -1;
        quadVerts.push(x - size, y - size, z + size, 0.0, 1.0);
        quadVerts.push(x + size, y - size, z + size, 1.0, 1.0);
        quadVerts.push(x + size, y + size, z + size, 1.0, 0.0);
        quadVerts.push(x - size, y + size, z + size, 0.0, 0.0);

        vertBuffer = gl.createBuffer();
        gl.bindBuffer(gl.ARRAY_BUFFER, vertBuffer);
        gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(quadVerts), gl.STATIC_DRAW);

        quadModelMat = new Float32Array([
          1, 0, 0, 0,
          0, 1, 0, 0,
          0, 0, 1, 0,
          0, 0, 0, 1
        ]);

        texture = gl.createTexture();

        var image = new Image();

        // When the image is loaded, we will copy it to the GL texture
        image.addEventListener("load", function() {
          gl.bindTexture(gl.TEXTURE_2D, texture);
          gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);

          gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
          gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);

          // To avoid bad aliasing artifacts we will generate mip maps to use when rendering this texture at various distances
          gl.generateMipmap(gl.TEXTURE_2D);
        }, false);

        // Start loading the image
        image.src = "../assets/cube-sea.png";
      }

      function addHTMLMessage(msgText) {
        var message = document.createElement("div");
        message.innerHTML = msgText;
        document.getElementById("messages").appendChild(message);
      }
    </script>
  </head>
  <body onload="requestPresent()">
    <canvas id="webgl-canvas"></canvas>
    <div id="messages"></div>
  </body>
</html>
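As the comments in the sample point out, outside of Carmel a browser will generally only let requestPresent() succeed in response to a user gesture, such as a click or tap.  Purely as a hypothetical sketch (reusing the sample's vrDisplay, layerSource and onAnimationFrame; this is not part of the original sample), a fallback "Enter VR" button might look something like this:

// Hypothetical fallback for non-Carmel browsers, where requestPresent()
// must be called from a user gesture such as a click
function addEnterVRButton() {
  var button = document.createElement("button");
  button.textContent = "Enter VR";
  button.addEventListener("click", function () {
    vrDisplay.requestPresent([{ source: layerSource }]).then(function () {
      vrDisplay.requestAnimationFrame(onAnimationFrame);
    });
  });
  document.body.appendChild(button);
}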

News

10. October 2016

 

Bullet is a popular open source C++ 3D physics engine available on GitHub under the zlib license, a very permissive open source license.  They just released version 2.84, which brings a few new features:

  • pybullet -- Python bindings
  • VR support for the HTC Vive
  • VR support for the Oculus Rift
  • support for Inverse Kinematics

 

The following video demonstrates the new Python bindings in action, using inverse kinematics and running on an HTC Vive.

GameDev News

5. October 2016

 

Back in May Google announced that VR support was going to be built into future versions of Android.  It was only a matter of time until Google launched a compatible device, and today that device is here.  The Daydream View is a new VR headset very similar to Samsung’s Gear VR headset.

 

The new headset has some interesting features.  Perhaps the most surprising aspect of the Daydream View is that it is made from fabric, making it the lightest of the available headsets.  It also ships with a touchpad motion controller with two buttons.  The headset is compatible with Google’s newly released Pixel phones as well as future VR enabled Android devices.  Perhaps most impressively, the headset and controller will sell for $79 when they ship in November.

 

Below is Google’s Daydream View launch video:

GameDev News

26. September 2016

 

Daydream is Google’s project to bring VR to the Android platform.  Two of the biggest game engines, Unity and Unreal, just announced Daydream support in preview form.

 

First, Unity’s announcement:

We’re excited to announce that native Daydream support is available as of today! It brings a more streamlined workflow, significant optimizations and reduced latency beyond the Google VR SDK for Unity released at Google I/O. No prefabs, scripts or manual manifest modifications are required to get started – simply enable VR and add Daydream as a targeted platform and begin making your own virtual worlds.

Unity’s native support for Daydream aims to solve the hard problems for you. To get optimal performance and latency from the platform we have done a deep integration with the Daydream SDK to leverage the platform’s asynchronous reprojection and VR performance mode. We have also made it easy to switch in and out of VR mode so that your applications can easily expand to the Google VR audience.

Not targeting just Daydream hardware? You can also have your application target Google Cardboard with native support. Applications which target Cardboard will work on older devices so that your application can reach as many users as possible. At this time, Cardboard support is exclusive to Android with iOS Cardboard support coming soon.

You can find more information and download the Technical Preview here. For questions or feedback head over to the new Daydream forum.

Google has also created a Unity SDK which expands Unity further by providing spatialized audio, Daydream controller support, utilities and samples. Please see the script reference and download pages for more details.

 

And now Unreal Engine:

Back in May during Google I/O, Epic announced its day one support of Daydream, Google’s exciting mobile VR platform for high quality, mobile virtual reality which is coming in Fall 2016 and will provide rich, responsive, and immersive experiences with hardware and software built for VR.

Well, after gathering developer feedback and evolving its resources into a suite of powerful tools, Google has announced that Google VR SDK 1.0 has graduated out of beta and is now available on the Daydream developer site.

As pointed out in the official announcement post, the updated SDK simplifies common VR development tasks so developers can focus on building immersive, interactive mobile VR applications for Daydream-ready phones and headsets while supporting integrated scanline racing and interactions using the innovative Daydream controller.

With this release, significant improvements to UE4’s native integration have been implemented that will help developers build better production-quality Daydream apps. The latest version introduces Daydream controller support in the editor, a neck model, new rendering optimizations, and much more - many features of which will be rolled directly into Unreal Engine 4.13.1 with the rest being available now through GitHub and rolling into 4.14.

Interested in accessing Google Daydream SDK 1.0? UE4 developers can do so right now by downloading the source here. We can’t wait to see what types of content the Unreal Engine community dreams up!

 

As mentioned in the Unreal Engine announcement, Google also released the 1.0 version of their new GoogleVR SDK.

GameDev News

23. June 2016

 

Razer recently launched the HDK2, the open source head mounted display for OSVR, the open source VR standard spearheaded by Razer and Sensics.  As part of that VR effort, they just announced a $5 million fund for developing OSVR titles.  So does this require exclusivity?  No.  In fact, their funding approach is pretty novel and one I think others should adopt.

This fund is open to all developers, indie or major, to apply. For every successful applicant, OSVR funding partners will purchase copies of their content in exchange for OSVR integration.

So basically, if you add OSVR support to your title, Razer (and, I’m assuming, other OSVR partners) will then purchase a number of copies of that title, which they in turn can bundle with their headsets.  It’s fairly win/win actually: developers are guaranteed a minimum number of sales, promotion and exposure, while headset manufacturers get more titles and promotional options.  Further information from the press release:

SAN FRANCISCO – Organizers of Open Source Virtual Reality (OSVR), the largest open source virtual reality consortium in the world, today announced the OSVR Developer Fund—a content accelerator program led by Razer that avails $5 million to the developer community. The fund encourages developers to support the OSVR ecosystem – an open source ecosystem that allows VR content to work across the board with all VR hardware, giving VR fans and developers more choice without worrying about DRM policies or other restrictive measures.


“VR is working toward being a mainstream success thanks to all the developers who have stepped up to the plate to deliver the next-generation in interactive experiences,” says Christopher Mitchell, OSVR lead, Razer. “The OSVR Developer Fund allows us to directly support the efforts of VR pioneers across the breadth of this developing industry, while at the same time ensuring that content is available to everyone in the industry. It is our contention that if everyone who is constructively contributing to the VR ecosystem succeeds, then VR will succeed. Closed doors in the world of development are a death sentence.”


The OSVR Developer Fund will be available to qualified, participating VR content developers – independent or major. If successful, applicants will have their game codes purchased in bulk by Razer or any future contributors to the fund in exchange for support of the platform. This will help compensate developers for the time spent integrating as well as provide OSVR with assets to promote their game’s availability in the unrestricted OSVR powered eco-system for use with all headsets.


“We understand content developers have various development challenges and we’re committed to helping them get ahead of those barriers,” says Justin Cooney, OSVR director of developer relations, Razer. “The OSVR Developer Fund helps to support initial sales while enabling developers to contribute to the VR industry as a whole. Together, OSVR and its content partners enjoy the realization of a shared vision for the future of VR.”
In the egalitarian spirit of OSVR, developers will not be beholden to only one particular sales channel, hardware device or development engine. Publishers will likewise retain full creative control over their content.
They will also receive marketing and promotional support, including opportunities to be a part of OSVR hardware bundles or showcases at major consumer events.

 

If you are interested in signing up, be sure to head over to the OSVR page.

GameDev News
