15. December 2015

 

Back in October I had the following Twitter conversation with Nat Friedman of Xamarin.

[Image: Twitter conversation with Nat Friedman of Xamarin]

 

It appears that this process has begun, at least for a limited time and for published developers.  From this announcement at Xamarin:

Christmas comes early for indie game developers

Because we love seeing indie games succeed, Xamarin wants to support indie game developers all over the world in bringing their games to billions of mobile gamers. We want every indie game developer to enjoy the power of C# and Visual Studio, so we have an amazing special offer this December:

Free, community-supported subscriptions of Xamarin.iOS and Xamarin.Android, including our Visual Studio extensions

Indie game developers only need to have published a game in any framework on any platform to qualify. We’ll use your published details to verify your indie status:

This offer is limited to independent game developers who have published a game on or before Tuesday, December 15, 2015 in any reputable public store for indie games, such as Steam, Apple App Store, Google Play Store, Windows Store, Xbox Store, PlayStation Store, or Nintendo eShop. No more than one subscription will be granted to any given publisher. This offer expires on December 31, 2015 at 9 pm ET.

 

The published title restriction is a bit of a mind twister for me…  isn’t this a matter of preaching to the choir?  Wouldn’t it make a lot more sense to try to attract aspiring developers instead of a group that has already committed to the .NET ecosystem?

GameDev News


18. November 2015

 

Xamarin, the popular cross-platform .NET toolchain, just released version 4.  From the announcement:

Xamarin Platform – native, cross-platform apps

Xamarin.Forms 2.0

Since releasing Xamarin.Forms last year, we’ve been amazed at how quickly developers have adopted it, accelerating their time-to-market for cross-platform native experiences across iOS, Android, and Windows, all while sharing over 90% of their code.

Since then, we’ve more than doubled the engineering team behind Xamarin.Forms, and made major improvements along the way. Today, Xamarin.Forms 2.0 is faster, more reliable, and more functional than ever before. Highlights include support for pre-compiled screens defined in XAML for faster app loads, preview support for Universal Windows Platform apps, support for iOS 9, Android Material Design, and new gestures like pinch and pull-to-refresh.

Native Xamarin.Forms app displayed on iOS, Android, and Windows Phone devices

Visual Studio and iOS

We’ve rebuilt our support for developing iOS apps in Visual Studio from the ground up, and it’s smoother, easier to set up, and more reliable than ever before. Now you can develop, build, deploy and debug iOS apps entirely from within Visual Studio and communication with the Mac build host is now handled via a secure SSH connection. We now also support multiple concurrent Visual Studio instances, which is especially important if you have multiple iOS projects open at the same time. We think you’re going to love it.

Mono/.NET upgrade

In Xamarin 4, we have incorporated large portions from Microsoft’s open sourced .NET codebase into this release, increasing compatibility, performance, and reliability for all use cases.

Android and iOS Designers

We’ve made big improvements to our iOS and Android designers. The iOS designer can now load and save XIB files in addition to storyboard files, and our Android designer now supports Android Material Design. We have also improved the UI for both designers, and switched to using high-performance native design surfaces, for a smoother, faster editing experience.

Xamarin Test Cloud – automated app testing

We built Xamarin Test Cloud to allow you to easily test your app on more than 2,000 real iOS and Android devices in the cloud. With Xamarin 4, we’re making mobile testing more accessible than ever.

Introducing Xamarin Test Recorder

We’re introducing a new preview tool that makes mobile UI testing dead simple: Xamarin Test Recorder. Initially available for Mac, Xamarin Test Recorder records your interactions on iOS or Android apps, plays them back, and automatically creates test scripts that can immediately be run in Xamarin Test Cloud or imported into mobile test projects in Xamarin Studio and Visual Studio. Xamarin Test Recorder records your actions in our C#-based UITest framework so you can automatically execute them as part of your continuous integration process either locally or in the cloud. Download it now to get started.

Xamarin.UITest 1.0

Xamarin 4 includes the 1.0 release of the Xamarin.UITest C# testing framework, with new capabilities for advanced test scenarios. We’re also very happy to announce that Xamarin.UITest is now free for everyone to use, with no limits on test duration, or the use of local devices and simulators. The powerful combination and ease of use of Xamarin Test Recorder and Xamarin Test Cloud will help you immediately improve your apps.

Xamarin Insights – real-time app monitoring

General Availability

We’re proud to announce that starting today, Xamarin Insights is generally available, with free crash reporting for all Xamarin Platform customers and advanced app monitoring features for power users.

Know the Health of Your App, Know Your Users

App monitoring begins with knowing what problems your users are encountering. Is the app crashing or encountering exceptions or errors? Xamarin Insights provides automatic crash reporting and handles both managed and unmanaged mobile crashes seamlessly. You can also explicitly report errors or warnings to Xamarin Insights and track them through its dashboard. Xamarin Insights makes it easy to rank your issues by impact, spot patterns in app and device usage, and diagnose the corresponding issues.

Xamarin Insights also helps you understand how your app is being used by tracking and timing step-by-step event data. Analyze which screens are the most popular or which actions take your users the longest. You can even see which events led up to a crash, making it easy to reproduce your issues.

You can add Xamarin Insights to your app with just a few lines of code. In Xamarin Studio, new apps immediately get the benefit of Xamarin Insights with templates that utilize the SDK from the very beginning of a mobile project. And your IDE will automatically upload dSYM files for you so that you get symbolicated stack traces with line numbers.

Visit our docs to get started.

End-to-End for Everyone

We think it’s important that every developer be able to benefit from the full range of what Xamarin 4 can do, which is why we’re excited to announce that as a part of their existing subscription, every active Xamarin subscriber will receive:

  • Crash and error reporting from Xamarin Insights with 30 day data retention and detailed issue reports that include step-by-step pre-crash events and crashed-user identification.
  • 60 Xamarin Test Cloud device minutes per month, with access to every single one of the devices in our growing test lab.
  • Complete access to Xamarin.UITest, including tests of unlimited duration running on simulator or device.
  • A 30-day trial pass to Xamarin University, including access to guest lectures and our introductory courses.
Paid Plans

As your business grows and your apps progress, you can buy paid plans of Xamarin Insights and Xamarin Test Cloud that suit your needs.

For companies who want to ramp up their mobile testing, we’re happy to introduce affordable Xamarin Test Cloud pricing plans starting at $99/month (billed annually). If you want to go further with app monitoring, we also provide scalable Xamarin Insights paid plans.

Xamarin Ultimate

Finally, for companies who want a complete end-to-end solution, we’re introducing a new offering called Xamarin Ultimate, which includes full access to all the features of Xamarin Platform, Test Cloud, Insights, and University for your entire team in a complete package at a great price. If you’re interested in learning more about this, please get in touch with our sales team at [email protected].

 

No news yet on the more indie-friendly licensing, which will hopefully come soon.

GameDev News


8. September 2015

 

Over the past couple of months I have been working on a series of posts covering MonoGame with the intention of compiling them into an e-book when finished.  There have been a few preview builds of the book available to Patreon supporters (thanks by the way!).  Now, however, I consider the series to be complete, so I am making the book available to all.  I will eventually be creating a more complete and formal homepage for the title, but this one should work in the meantime.

 

Truth of the matter is, I had intended to cover a great deal more on the subject, but the level of traffic simply doesn’t justify the further expenditure of time.  That said, I leave the book in a state where it should prove useful for anyone getting started in XNA or MonoGame game development; it is as comprehensive as any beginner XNA book currently available.  The book is composed of seven chapters:

 

Chapter One: An Introduction and Brief History

Chapter Two: Getting Started with MonoGame on Windows

Chapter Three: Getting Started with MonoGame on MacOS

Chapter Four: Creating an Application

Chapter Five: 2D Graphics

Chapter Six: Audio Programming

Chapter Seven: 3D Graphics

 

 

 

Of course, the tutorials based here on GameFromScratch are still going to be available in addition to this PDF.  There is also a complete video tutorial series to go along with each chapter in the book available here.

 

With today’s release of the book, I have also published a GitHub repository containing all of the source code used in the book.  This is a single Visual Studio solution containing each example as a separate project.  For some reason I don’t quite understand, all of the chapters are mismatched by one.  So for example the code in Chapter 8 on GitHub actually corresponds with Chapter 7 in the book.  Oops.

 

Alright, enough blathering, here is the book in PDF format.  I can make it available in other e-reader formats if requested.

EDIT: Here is an untested epub version of the book.

EDIT2: Now it has been posted on Smashwords as well, which should ultimately make it available from a number of sources.

 

If you enjoyed this free e-book and would like to see more similar free books, or would like access to books in development, please consider supporting GameFromScratch on Patreon.

 

 

Cheers!

Mike


20. August 2015

 

In this chapter we start looking at 3D game development using MonoGame.  Previously I called XNA a low-level, code-focused engine, and you are about to understand why.  If you come from a higher-level game engine like Unity or even LibGDX, you are in for a shock.  Things you may take for granted in other engines/libraries, like cameras, are your responsibility in MonoGame.  Don’t worry though, it’s not all that difficult.

 

This information is also available in HD Video.

 

This chapter is going to require some prior math experience, such as an understanding of matrix mathematics.  Unfortunately teaching such concepts is far beyond the scope of what we can cover here without adding a few hundred more pages!  If you need to brush up on the underlying math, the Khan Academy is a very good place to start.  There are also a few books dedicated to teaching gamedev related math, including 3D Math Primer for Graphics and Game Development and Mathematics for 3D Game Programming and Computer Graphics.  Don’t worry, MonoGame/XNA provide the Matrix and Vector classes for you, but it’s good to understand when to use them and why.
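
To give a quick taste of those classes, here is a minimal sketch (not part of the chapter’s example) that rotates a point 90 degrees around the Y axis using the built-in Matrix and Vector3 helpers; the class name is just illustrative:

using Microsoft.Xna.Framework;

public static class MatrixVectorSketch
{
    public static void Main()
    {
        Vector3 point = new Vector3(10f, 0f, 0f);

        // Build a rotation matrix; MathHelper handles the degrees-to-radians conversion
        Matrix rotation = Matrix.CreateRotationY(MathHelper.ToRadians(90f));

        // Transform the point by the matrix
        Vector3 rotated = Vector3.Transform(point, rotation);

        System.Console.WriteLine(rotated);  // approximately (0, 0, -10)
    }
}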

 

Our First 3D Application

 

This might be one of those topics that’s easier explained by seeing.  So let’s jump right in with an example and follow it up with an explanation.  This example creates and then displays a simple triangle about the origin, then creates a user-controlled camera that can orbit and zoom in/out on said triangle.

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;

namespace Test3D
{

    public class Test3DDemo : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;

        //Camera
        Vector3 camTarget;
        Vector3 camPosition;
        Matrix projectionMatrix;
        Matrix viewMatrix;
        Matrix worldMatrix;

        //BasicEffect for rendering
        BasicEffect basicEffect;

        //Geometric info
        VertexPositionColor[] triangleVertices;
        VertexBuffer vertexBuffer;

        //Orbit
        bool orbit = false;

        public Test3DDemo()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            base.Initialize();

            //Setup Camera
            camTarget = new Vector3(0f, 0f, 0f);
            camPosition = new Vector3(0f, 0f, -100f);
            projectionMatrix = Matrix.CreatePerspectiveFieldOfView(
                MathHelper.ToRadians(45f),
                GraphicsDevice.DisplayMode.AspectRatio,
                1f, 1000f);
            viewMatrix = Matrix.CreateLookAt(camPosition, camTarget,
                new Vector3(0f, 1f, 0f)); // Y up
            worldMatrix = Matrix.CreateWorld(camTarget, Vector3.Forward,
                Vector3.Up);

            //BasicEffect
            basicEffect = new BasicEffect(GraphicsDevice);
            basicEffect.Alpha = 1f;

            // Want to see the colors of the vertices, this needs to be on
            basicEffect.VertexColorEnabled = true;

            //Lighting requires normal information which VertexPositionColor does not have
            //If you want to use lighting and VPC you need to create a custom def
            basicEffect.LightingEnabled = false;

            //Geometry  - a simple triangle about the origin
            triangleVertices = new VertexPositionColor[3];
            triangleVertices[0] = new VertexPositionColor(new Vector3(0, 20, 0), Color.Red);
            triangleVertices[1] = new VertexPositionColor(new Vector3(-20, -20, 0), Color.Green);
            triangleVertices[2] = new VertexPositionColor(new Vector3(20, -20, 0), Color.Blue);

            //Vert buffer
            vertexBuffer = new VertexBuffer(GraphicsDevice, typeof(VertexPositionColor),
                3, BufferUsage.WriteOnly);
            vertexBuffer.SetData<VertexPositionColor>(triangleVertices);
        }

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);
        }

        protected override void UnloadContent()
        {
        }

        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == 
                ButtonState.Pressed || Keyboard.GetState().IsKeyDown(
                Keys.Escape))
                Exit();

            if (Keyboard.GetState().IsKeyDown(Keys.Left))
            {
                camPosition.X -= 1f;
                camTarget.X -= 1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Right))
            {
                camPosition.X += 1f;
                camTarget.X += 1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Up))
            {
                camPosition.Y -= 1f;
                camTarget.Y -= 1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Down))
            {
                camPosition.Y += 1f;
                camTarget.Y += 1f;
            }
            if(Keyboard.GetState().IsKeyDown(Keys.OemPlus))
            {
                camPosition.Z += 1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.OemMinus))
            {
                camPosition.Z -= 1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Space))
            {
                orbit = !orbit;
            }

            if (orbit)
            {
                Matrix rotationMatrix = Matrix.CreateRotationY(
                                        MathHelper.ToRadians(1f));
                camPosition = Vector3.Transform(camPosition, 
                              rotationMatrix);
            }
            viewMatrix = Matrix.CreateLookAt(camPosition, camTarget, 
                         Vector3.Up);
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            basicEffect.Projection = projectionMatrix;
            basicEffect.View = viewMatrix;
            basicEffect.World = worldMatrix;

            GraphicsDevice.Clear(Color.CornflowerBlue);
            GraphicsDevice.SetVertexBuffer(vertexBuffer);

            //Turn off culling so we see both sides of our rendered triangle
            RasterizerState rasterizerState = new RasterizerState();
            rasterizerState.CullMode = CullMode.None;
            GraphicsDevice.RasterizerState = rasterizerState;

            foreach(EffectPass pass in basicEffect.CurrentTechnique.
                    Passes)
            {
                pass.Apply();
                // Draw a single triangle (1 primitive) starting at vertex 0
                GraphicsDevice.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);
            }
            
            base.Draw(gameTime);
        }
    }
}

Alright… that’s a large code sample, but don’t worry, it’s not all that complicated.  At a top level, what we do here is create a triangle oriented about the origin.  We then create a camera, offset –100 units along the z-axis but looking at the origin.  We then respond to keyboard input, panning the camera with the arrow keys, zooming in and out with the plus and minus keys, and toggling orbit with the space bar.  Now let’s take a look at how we accomplish all of this.

 

First, when I said we create a camera, that was a bit of a misnomer; in fact we are creating three different matrices (singular: matrix), the View, Projection and World matrices.  These three matrices are combined to help position elements in your game world.  Let’s take a quick look at the function of each.

 

View Matrix: The View Matrix is used to transform coordinates from World to View space.  A much easier way to envision the View matrix is that it represents the position and orientation of the camera.  It is created by passing in the camera location, where the camera is pointing, and which axis represents “Up” in the world.  XNA uses a Y-up orientation, which is important to be aware of when creating 3D models.  Blender by default treats Z as the up/down axis, while 3D Studio MAX uses the Y-axis as “Up”.

Projection Matrix: The Projection Matrix is used to convert 3D view space to 2D.  In a nutshell, this is your actual camera lens, and it is created by calling CreatePerspectiveFieldOfView() or CreateOrthographic().  With Orthographic projection, the size of things remains the same regardless of their “depth” within the scene.  Perspective projection instead simulates the way an eye works, rendering things smaller as they get further away.  As a general rule, for a 2D game you use Orthographic projection, while in 3D you use Perspective projection.  When creating a Perspective view we specify the field of view (think of this as the degrees of visibility from the center of your eye view), the aspect ratio (the proportion between the width and height of the display), and the near and far planes (the minimum and maximum depths to render with the camera, basically the camera’s range).  These values all go together to calculate something called the view frustum, which can be thought of as a pyramid in 3D space representing what is currently visible.

World Matrix: The World matrix is used to position your entity within the scene.  Essentially this is the object’s position in the 3D world.  In addition to positional information, the World matrix can also represent an object’s orientation.

 

So, the nutshell way to think of it:

View Matrix –> Camera Location

Projection Matrix –> Camera Lens

World Matrix –> Object Position/Orientation in 3D Scene

 

By multiplying these three matrices together we get the WorldViewProjection matrix, a single “magic” transform that can turn a 3D object into pixels.
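
In code, that combination is a single matrix multiplication.  The following one-liner is just a sketch for illustration; in this chapter’s example we never build it ourselves, because BasicEffect does it internally from the World, View and Projection properties we assign in Draw():

// XNA/MonoGame math uses row vectors, so the combined transform is World * View * Projection
Matrix worldViewProjection = worldMatrix * viewMatrix * projectionMatrix;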

What value should I use for Field of View?

You may notice that in this example I used a relatively small value of 45 degrees.  What, you may ask, is the ideal setting for field of view?  Well, there isn’t one, although there are some commonly accepted values.  Human beings generally have a field of view of about 180 degrees, but this includes peripheral vision.  This means if you hold your hands straight out you should just be able to see them at the edge of your vision.  Basically, if it’s in front of you, you can see it.

However, video games, at least those not designed for VR headsets, don’t really use the periphery of your visual field.  Console games generally use a field of view of about 60 degrees, while PC games often set the field of view higher, in the 80-100 degree range.  The difference is generally due to the size of the screen and the distance you sit from it.  The higher the field of view, the more of the scene will be rendered on screen.
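
As a sketch of what changing that value looks like (assuming it runs somewhere the example’s projectionMatrix field and the GraphicsDevice are in scope), only the first argument to CreatePerspectiveFieldOfView() changes:

// A wider, more "PC style" field of view; aspect ratio and near/far planes stay the same
projectionMatrix = Matrix.CreatePerspectiveFieldOfView(
    MathHelper.ToRadians(90f),
    GraphicsDevice.DisplayMode.AspectRatio,
    1f, 1000f);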

 

Next up we have the BasicEffect.  Remember how earlier we used a SpriteBatch to draw sprites on screen?  Well, the BasicEffect is the 3D equivalent.  In reality it’s a wrapper over an HLSL shader responsible for rendering things to the screen.  HLSL coverage is way beyond the scope of what we can cover here, but basically it’s the set of instructions to the shader units on your graphics card telling them how to render things.  Although I can’t go into a lot of detail about how HLSL works, you are in luck, as Microsoft actually released the shader code used to create BasicEffect in the Stock Effects sample available at http://xbox.create.msdn.com/en-US/education/catalog/sample/stock_effects.  In order for BasicEffect to work it needs the View, Projection and World matrices specified; thankfully we just calculated all three of these.

 

Finally, at the end of Initialize() we create an array of VertexPositionColor, which as you can guess is a vertex with positional and color data.  We then copy the triangle data to a VertexBuffer using a call to SetData().  You may be thinking to yourself… WOW, doesn’t XNA have simple primitives like this built in?  No, it doesn’t, although there are easy community examples you can download, such as this one: http://xbox.create.msdn.com/en-US/education/catalog/sample/primitives_3d.

 

The logic in Update() is quite simple.  We check for input from the user and respond accordingly.  In the event of the arrow or +/- keys being pressed, we change camPosition.  At the end of the update we then recalculate the View matrix using our new camera position.  Also, in response to the space bar, we toggle orbiting the camera; while orbiting, we rotate the camera by another 1 degree around the origin each frame.  Basically this shows how easy it is to update the camera by changing the viewMatrix.  Note that the Projection matrix generally isn’t updated after creation, unless the resolution changes.

 

Finally we come to our Draw() call.  Here we set the view, projection and world matrices of the BasicEffect, clear the screen, and load our VertexBuffer into the GraphicsDevice by calling SetVertexBuffer().  Next we create a RasterizerState object and turn culling off.  We do this so we don’t cull back faces, which would result in our triangle being invisible when we rotate behind it.  Often you actually want to cull back faces; there is no sense drawing vertices that aren’t visible!  Then we loop through each pass of the BasicEffect’s current technique (look at the BasicEffect.fx HLSL file and this will make a great deal more sense, otherwise stay tuned for when we cover custom shaders later on), and finally we draw our triangle data to the screen by calling DrawPrimitives, in this case with a TriangleList.  There are other options such as lines and triangle strips; you are basically telling it what kind of data is in the VertexBuffer.
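
For comparison, when you do want back-face culling (the usual case for closed meshes), you can simply assign one of the stock rasterizer states instead of building your own; a one-line sketch:

// Restore the default state: faces wound counter-clockwise are culled
GraphicsDevice.RasterizerState = RasterizerState.CullCounterClockwise;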

I’ll admit, compared to many other engines, that’s a heck of a lot of code to just draw a triangle on screen!  Reality is though, you generally write this code once and that’s it.  Or you work at a higher level, such as with 3D models imported using the content pipeline.

 

Loading and Displaying 3D Models

 

Next we take a look at the process of bringing a 3D model in from a 3D application, in this case Blender.  The process of creating such a model is well beyond the scope of this tutorial, although I have created a video showing the entire process available right here.  Or you can simply download the created COLLADA file and texture.

Which File Format works Best?


The MonoGame pipeline tool relies on an underlying library named Assimp for loading 3D models.  You may wonder which of the many supported model formats you should use when exporting from Blender.  FBX and COLLADA (dae) are the two most commonly used formats, while X and OBJ can often be used reliably for very simple non-animated meshes.  That said, exporting from Blender is always a tricky prospect, and it’s a very good idea to use a viewer like the one included in the FBX Converter package to verify that your exported model looks correct.

The above video also illustrates adding the model and texture using the content pipeline.  I won’t cover the process here as it works identically to when we used the content pipeline earlier.  Let’s jump right into the code instead:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;

namespace Test3D
{

    public class Test3DDemo2 : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;

        //Camera
        Vector3 camTarget;
        Vector3 camPosition;
        Matrix projectionMatrix;
        Matrix viewMatrix;
        Matrix worldMatrix;

        //Geometric info
        Model model;

        //Orbit
        bool orbit = false;

        public Test3DDemo2()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            base.Initialize();

            //Setup Camera
            camTarget = new Vector3(0f, 0f, 0f);
            camPosition = new Vector3(0f, 0f, -5);
            projectionMatrix = Matrix.CreatePerspectiveFieldOfView(
                               MathHelper.ToRadians(45f), graphics.
                               GraphicsDevice.Viewport.AspectRatio,
                1f, 1000f);
            viewMatrix = Matrix.CreateLookAt(camPosition, camTarget, 
                         new Vector3(0f, 1f, 0f));// Y up
            worldMatrix = Matrix.CreateWorld(camTarget, Vector3.
                          Forward, Vector3.Up);

            model = Content.Load<Model>("MonoCube");
        }

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);
        }

        protected override void UnloadContent()
        {
        }

        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == 
                ButtonState.Pressed || Keyboard.GetState().IsKeyDown(
                Keys.Escape))
                Exit();

            if (Keyboard.GetState().IsKeyDown(Keys.Left))
            {
                camPosition.X -= 0.1f;
                camTarget.X -= 0.1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Right))
            {
                camPosition.X += 0.1f;
                camTarget.X += 0.1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Up))
            {
                camPosition.Y -= 0.1f;
                camTarget.Y -= 0.1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Down))
            {
                camPosition.Y += 0.1f;
                camTarget.Y += 0.1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.OemPlus))
            {
                camPosition.Z += 0.1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.OemMinus))
            {
                camPosition.Z -= 0.1f;
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Space))
            {
                orbit = !orbit;
            }

            if (orbit)
            {
                Matrix rotationMatrix = Matrix.CreateRotationY(
                                        MathHelper.ToRadians(1f));
                camPosition = Vector3.Transform(camPosition, 
                              rotationMatrix);
            }
            viewMatrix = Matrix.CreateLookAt(camPosition, camTarget, 
                         Vector3.Up);
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);

            foreach(ModelMesh mesh in model.Meshes)
            {
                foreach(BasicEffect effect in mesh.Effects)
                {
                    //effect.EnableDefaultLighting();
                    effect.AmbientLightColor = new Vector3(1f, 0, 0);
                    effect.View = viewMatrix;
                    effect.World = worldMatrix;
                    effect.Projection = projectionMatrix;
                }
                mesh.Draw();
            }
            base.Draw(gameTime);
        }
    }
}

It operates almost identically to when we created the triangle by hand, except that the model is loaded using a call to Content.Load&lt;Model&gt;().  The other major difference is that you no longer have to create a BasicEffect; one is automatically created for you as part of the import process and is stored in each mesh’s Effects property.  Simply loop through each effect, setting the View, Projection and World matrix values, then call Draw() on the mesh.  If you have a custom effect you wish to use instead of the generated effects, you can follow the process documented here: https://msdn.microsoft.com/en-us/library/bb975391(v=xnagamestudio.31).aspx.
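
One thing the example glosses over: models made up of several meshes usually carry a transform for each mesh.  A common pattern, sketched below on the assumption that your model has such per-mesh transforms, is to fold each mesh’s absolute bone transform into the World matrix before drawing:

Matrix[] boneTransforms = new Matrix[model.Bones.Count];
model.CopyAbsoluteBoneTransformsTo(boneTransforms);

foreach (ModelMesh mesh in model.Meshes)
{
    foreach (BasicEffect effect in mesh.Effects)
    {
        // Combine the mesh's own transform with our world matrix
        effect.World = boneTransforms[mesh.ParentBone.Index] * worldMatrix;
        effect.View = viewMatrix;
        effect.Projection = projectionMatrix;
    }
    mesh.Draw();
}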

 

The Video

Programming


25. July 2015

 

In this chapter we are going to look at using audio in XNA.  Originally XNA supported one way of playing audio: XACT (the Cross-platform Audio Creation Tool).  Since the initial release a much simpler API has been added.  We will be taking a look at both approaches.

 

There is an HD video of this chapter available here.

 

When playing audio there is always the challenge of what formats are supported, especially when you are dealing with multiple different platforms, all of which have different requirements.  Fortunately the content pipeline takes care of a great deal of the complications for us.  Simply add your audio files ( mp3, mp4, wma, wav, ogg ) to the content pipeline and it will do the rest of the work for you.   As you will see shortly though, it is also possible to load audio files outside of the content pipeline.  In this situation, be aware that certain platforms do not support certain formats ( for example, no wma support on Android or iOS, while iOS doesn’t support ogg but does support mp3 ).  Unless you have a good reason, I would recommend you stick to the content pipeline for audio whenever possible.

 

The Perils of MP3

Although MP3 is supported by MonoGame, you probably want to stay away from using it. Why?
Patents. If your game has over 5,000 users you could be legally required to purchase a license. From a legal perspective, Ogg Vorbis is superior in every single way. Unfortunately Ogg support is not as ubiquitous as we'd like it to be.

 

Adding Audio Content using the Content Pipeline

This process is virtually identical to adding a graphics file to your content project.


 

Simply add the content like you did using right click->Add Existing Items or the Edit menu:


 

If it is a supported format you will see the Processor field is filled ( otherwise it will display Unknown ).  The only option here is to configure the mp3 audio quality, a trade off between size and fidelity.

 

Playing a Song

Now let’s look at the code involved in playing the song we just added to our game.

// This example shows playing a song using the simplified audio api

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Media;

namespace Example1
{
    public class Game1 : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;
        Song song;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            base.Initialize();
        }

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);

            this.song = Content.Load<Song>("prepare");
            MediaPlayer.Play(song);
            //  Uncommenting the following line will also loop the song
            //  MediaPlayer.IsRepeating = true;
            MediaPlayer.MediaStateChanged += MediaPlayer_MediaStateChanged;
        }

        void MediaPlayer_MediaStateChanged(object sender, System.
                                           EventArgs e)
        {
            // 0.0f is silent, 1.0f is full volume
            MediaPlayer.Volume -= 0.1f;
            MediaPlayer.Play(song);
        }

        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == 
                ButtonState.Pressed || Keyboard.GetState().IsKeyDown(
                Keys.Escape))
                Exit();

            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);
            base.Draw(gameTime);
        }
    }
}

 

Notice that we added the using statement Microsoft.Xna.Framework.Media.  We depend on this for the MediaPlayer and Song classes.  Our Song is loaded using the ContentManager just like we did earlier with Texture, this time with the type Song.  Once again the content loader does not use the file’s extension.  Our Song can then be played with a call to MediaPlayer.Play().  In this example we wire up a MediaStateChanged event handler that will be called when the song completes, decreasing the volume and playing the song again.
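
MediaPlayer exposes a handful of other controls worth knowing about.  A quick sketch, separate from the example above:

MediaPlayer.Volume = 0.5f;       // 0.0f is silent, 1.0f is full volume
MediaPlayer.IsRepeating = true;  // loop the current song
MediaPlayer.Pause();             // pause playback
MediaPlayer.Resume();            // resume from where we paused
MediaPlayer.Stop();              // stop playback entirely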

 

Playing Sound Effects

 

This example shows playing sound effects.  Unlike a Song, SoundEffects are designed to support multiple instances being played at once.  Let’s take a look at playing SoundEffect in MonoGame:

// Example showing playing sound effects using the simplified audio api
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Audio;
using System.Collections.Generic;

namespace Example2
{
    public class Game1 : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;
        List<SoundEffect> soundEffects;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
            soundEffects = new List<SoundEffect>();
        }

        protected override void Initialize()
        {
            base.Initialize();
        }

        protected override void LoadContent()
        {
            // Create a new SpriteBatch, which can be used to draw textures.
            spriteBatch = new SpriteBatch(GraphicsDevice);

            soundEffects.Add(Content.Load<SoundEffect>("airlockclose"))
                             ;
            soundEffects.Add(Content.Load<SoundEffect>("ak47"));
            soundEffects.Add(Content.Load<SoundEffect>("icecream"));
            soundEffects.Add(Content.Load<SoundEffect>("sneeze"));

            // Fire and forget play
            soundEffects[0].Play();
            
            // Play that can be manipulated after the fact
            var instance = soundEffects[0].CreateInstance();
            instance.IsLooped = true;
            instance.Play();
        }


        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == 
                ButtonState.Pressed || Keyboard.GetState().IsKeyDown(
                Keys.Escape))
                Exit();

            if (Keyboard.GetState().IsKeyDown(Keys.D1))
                soundEffects[0].CreateInstance().Play();
            if (Keyboard.GetState().IsKeyDown(Keys.D2))
                soundEffects[1].CreateInstance().Play();
            if (Keyboard.GetState().IsKeyDown(Keys.D3))
                soundEffects[2].CreateInstance().Play();
            if (Keyboard.GetState().IsKeyDown(Keys.D4))
                soundEffects[3].CreateInstance().Play();


            if (Keyboard.GetState().IsKeyDown(Keys.Space))
            {
                if (SoundEffect.MasterVolume == 0.0f)
                    SoundEffect.MasterVolume = 1.0f;
                else
                    SoundEffect.MasterVolume = 0.0f;
            }
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);

            base.Draw(gameTime);
        }
    }
}

 

Note the using Microsoft.Xna.Framework.Audio statement at the beginning.  Once again we added our audio files using the Content Pipeline; in this case I added several WAV files.  They are loaded using Content.Load(), this time with the type SoundEffect.  Next it is important to note the two different ways the SoundEffects are played.  You can call Play() directly on the SoundEffect class, which creates a fire-and-forget instance with minimal options for controlling it.  If you need greater control (such as changing the volume, looping or applying effects) you should instead create a SoundEffectInstance using the SoundEffect.CreateInstance() call.  You should also create a separate instance if you want to have multiple concurrent instances of the same sound effect playing.  It is important to realize that all instances of the same SoundEffect share resources, so memory will not increase massively for each instance created.  The number of simultaneously supported sounds varies from platform to platform, with 64 being the limit on Windows Phone 8, while the Xbox 360 limits it to 300 instances.  There is no hard limit on the PC, although you will obviously hit device limitations quickly enough.
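
To make the difference concrete, here is a small sketch (reusing the soundEffects list from the example above) of the per-instance controls a SoundEffectInstance exposes that the fire-and-forget Play() call does not:

SoundEffectInstance instance = soundEffects[1].CreateInstance();
instance.Volume = 0.75f;  // 0.0f silent to 1.0f full volume
instance.Pitch = 0.5f;    // -1.0f (an octave down) to 1.0f (an octave up)
instance.Pan = -1.0f;     // -1.0f hard left, 1.0f hard right
instance.Play();          // Pause(), Resume() and Stop() are also available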

 

In the above example, we create a single looping sound effect right away.  Then each frame we check to see if the user presses 1, 2, 3 or 4 and play an instance of the corresponding sound effect.  If the user hits the spacebar we toggle the global MasterVolume of the SoundEffect class between muted and full volume.  This will affect all playing sound effects.

 

Positional Audio Playback

Sound effects can also be positioned in 3D space easily in XNA. 

// Display positional audio

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Audio;

namespace Example3
{
    public class Game1 : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;
        SoundEffect soundEffect;
        SoundEffectInstance instance;
        AudioListener listener;
        AudioEmitter emitter;


        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            base.Initialize();
        }

        protected override void LoadContent()
        {
            spriteBatch = new SpriteBatch(GraphicsDevice);
            
            soundEffect = this.Content.Load<SoundEffect>("circus");
            instance = soundEffect.CreateInstance();
            instance.IsLooped = true;

            listener = new AudioListener();
            emitter = new AudioEmitter();

            // WARNING!!!!  Apply3D requires sound effect be Mono!  Stereo will throw exception
            instance.Apply3D(listener, emitter);
            instance.Play();
        }

        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == 
                ButtonState.Pressed || Keyboard.GetState().IsKeyDown(
                Keys.Escape))
                Exit();

            if (Keyboard.GetState().IsKeyDown(Keys.Left))
            {
                listener.Position = new Vector3(listener.Position.X-0.
                                    1f, listener.Position.Y, listener.
                                    Position.Z);
                instance.Apply3D(listener, emitter);
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Right))
            {
                listener.Position = new Vector3(listener.Position.X + 
                                    0.1f, listener.Position.Y, 
                                    listener.Position.Z);
                instance.Apply3D(listener, emitter);
            }

            if (Keyboard.GetState().IsKeyDown(Keys.Up))
            {
                listener.Position = new Vector3(listener.Position.X, 
                                    listener.Position.Y +0.1f, 
                                    listener.Position.Z);
                instance.Apply3D(listener, emitter);
            }
            if (Keyboard.GetState().IsKeyDown(Keys.Down))
            {
                listener.Position = new Vector3(listener.Position.X, 
                                    listener.Position.Y -0.1f, 
                                    listener.Position.Z);
                instance.Apply3D(listener, emitter);
            }            
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);
            base.Draw(gameTime);
        }
    }
}

 

In this example, we load a single SoundEffect and start it looping infinitely.  We then create an AudioListener and an AudioEmitter instance.  The AudioListener represents the location of your ear within the virtual world, while the AudioEmitter represents the position of the sound effect.  The default location of both is a Vector3 at (0,0,0).  You set the position of a SoundEffect by calling Apply3D().  In our Update() call, if the user hits an arrow key we update the Position of the AudioListener accordingly.  After changing the position of a sound you have to call Apply3D() again.  As you hit the arrow keys you will notice the audio pans and changes volume to correspond with the updated position.  It is very important that your source audio file is in Mono (as opposed to Stereo) format if you use Apply3D(), or an exception will be thrown.
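
The emitter can be moved in exactly the same way as the listener.  As a small sketch reusing the instance, listener and emitter fields from the example above, this drifts the sound source to the right a little each frame:

emitter.Position = new Vector3(emitter.Position.X + 0.1f,
                               emitter.Position.Y,
                               emitter.Position.Z);
instance.Apply3D(listener, emitter);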

 

Using XACT

As mentioned earlier, XACT used to be the only option when it came to audio programming in XNA.  XACT is still available, and it enables your audio designer to have advanced control over the music and sound effects that appear in your game, while the programmer uses a simple programmatic interface.  One big caveat: XACT ships with the XNA installer (or the DirectX SDK) and is not available on MacOS or Linux.  If you wish to install it but do not have an old version of Visual Studio installed, instructions can be found here ( https://www.gamefromscratch.com/post/2015/07/23/Installing-XNA-Tools-Like-XACT-without-Visual-Studio-2010.aspx ).  If you are on MacOS or Linux, you want to stick to the simplified audio API that we demonstrated earlier.

XACT is installed as part of the XNA Game Studio install; on 64-bit Windows the XACT executables are located by default in C:\Program Files (x86)\Microsoft XNA\XNA Game Studio\v4.0\Tools.  Start by running AudConsole3.exe:


 

The XACT Auditioning Tool needs to be running when you run the Xact tool.

Then launch Xact3.exe


First create a new project:


 

Next right click Wave Banks and select New Wave Bank


 

Drag and drop your source audio files into the Wave Bank window:


 

Now create a new Sound Bank by right clicking Sound Banks and selecting New Sound Bank


 

Now drag the Wave you wish to use from the Wave Bank to the Sound Bank


 

Now create a Cue by dragging and dropping the Sound from the Sound Bank to the Cue window.  Multiple sounds can be added to a cue if desired.


 

You can rename the Cue, set the play probability if you have several Sounds in the Cue, and change the instance properties of the Cue in the properties window on the left:


Now Build the results:


 

This will then create two directories in the folder you created your project in:


 

These files need to be added directly to your project; you do not use the content pipeline tool!  Simply copy all three files to the Content folder and set their build action to Copy.


 

Now let’s look at the code required to use these generated files:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Audio;

namespace Example4
{
    public class Game1 : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;
        AudioEngine audioEngine;
        SoundBank soundBank;
        WaveBank waveBank;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void Initialize()
        {
            base.Initialize();
        }
        protected override void LoadContent()
        {
            // Create a new SpriteBatch, which can be used to draw textures.
            spriteBatch = new SpriteBatch(GraphicsDevice);

            audioEngine = new AudioEngine("Content/test.xgs");
            soundBank = new SoundBank(audioEngine,"Content/Sound Bank.
                        xsb");
            waveBank = new WaveBank(audioEngine,"Content/Wave Bank.
                       xwb");

            soundBank.GetCue("ak47").Play();
        }

        protected override void Update(GameTime gameTime)
        {
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == 
                ButtonState.Pressed || Keyboard.GetState().IsKeyDown(
                Keys.Escape))
                Exit();

            // TODO: Add your update logic here

            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);

            // TODO: Add your drawing code here

            base.Draw(gameTime);
        }
    }
}

 

First you create an AudioEngine using the xgs file, then a SoundBank using the xsb file and a WaveBank using the xwb file.  We then play the Cue we created earlier with a call to SoundBank.GetCue().Play().  This process allows the audio details to be configured outside of the game while the programmer simply uses the created Cue.
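
One detail the example above omits: the XACT AudioEngine expects its Update() method to be called regularly, typically once per frame, so that cues and runtime parameters stay in sync.  A minimal sketch of what that looks like in the game’s Update():

        protected override void Update(GameTime gameTime)
        {
            // Keep the XACT engine ticking so cues play back correctly
            audioEngine.Update();

            base.Update(gameTime);
        }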

 

Finally, it is possible to play audio files that weren’t added via the content pipeline or XACT by loading them from a Uri.

        protected override void LoadContent()
        {
            // Create a new SpriteBatch, which can be used to draw textures.
            spriteBatch = new SpriteBatch(GraphicsDevice);

            // URL MUST be relative in MonoGame
            System.Uri uri = new System.Uri("content/background.mp3",
                             System.UriKind.Relative);
            Song song = Song.FromUri("mySong", uri);
            MediaPlayer.Play(song);
            MediaPlayer.ActiveSongChanged += (s, e) => {
                song.Dispose();
                System.Diagnostics.Debug.WriteLine("Song ended and disposed");
            };
        }

 

First we create a Uri that locates the audio file we want to load.  We then load it using the Song.FromUri() method, passing in a name as well as the Uri.  One very important thing to be aware of here: in XNA you could use any URI, but in MonoGame it needs to be a relative path.

 

The Video

 

Programming

