
30. July 2013

 

It’s a pilot program for now, but Facebook has tentatively entered the world of game publishing, now that their exclusive agreement with Zynga has expired.  They are currently accepting applications from small and medium-sized developers.

 

 

So, what does a game developer get out of this?

 

Three things. 

 

The first is the most obvious: exposure.  The ability to get your game in front of 800M monthly mobile users is nothing to sneeze at.  Considering that lack of exposure seems to be one of the biggest problems on the various app stores, this is hard to ignore.

 

Next is targeting.  In their words:

With our unique targeting ability and mobile users who like playing a diversity of games, we'll help the right people discover your game.

I’ll admit, the tangible benefits of that one are a bit harder to define.  I assume what they are saying, in not so many words, is that they have such a huge volume of data on their users that they are able to target with exceptional precision.  Beyond being a bit creepy, I suppose this is completely true… Facebook does know a hell of a lot about Facebookers.

 

Finally, there is access to their analytics tools.

 

What does this cost?  Well, currently the Facebook game publishing page doesn’t specify.  The most we have to go on is this Facebook blog post:

We are invested in the success of these games, and in exchange for a revenue share, we will be collaborating deeply with developers in our program by helping them attract high-quality, long-term players for their games. We'll also be sharing analytics tools and the expertise we've gained from helping games grow on our platform for more than six years.

 

The same blog post went on to mention 10 games that are participating in the program, including a game from publisher Gameloft.  Games on that list include Age of Booty: Tactics, Dawn of Dragons and Train City.

 

So, in a nutshell, Facebook is getting into the app promotion space and taking a cut of the profit.  I’m not sure exactly how revenue sharing would work when they don’t control the app store (where the sale is tracked).

 

If you are a small to medium-sized developer and want to apply, click here.  Facebook will be disclosing more information about the program later today at Casual Connect.

News

28. July 2013

After waiting about 19 years for SFML 2 to be released, 2.1 has followed only a month or two later.

 

This is mostly a bug fix release:

SFML 2.1

sfml-window

- Fixed MouseMove event sometimes not generated when holding left button on Windows (#225)

- Fixed ContextSettings ignored when creating a 3.x/4.x OpenGL context on Linux (#258)

- Fixed ContextSettings ignored on Linux when creating a window (#35)

- Fixed windows bigger than the desktop not appearing on Windows (#215)

- Fixed KeyRelease events sometimes not reported on Linux (#404)

- Now using inotify on Linux to avoid constantly polling joystick connections (#96)

- Add keypad return, equal and period keys support for OS X

- Improved mouse events on OS X regarding fullscreen mode

- Improved mouse events on OS X (#213, #277)

- Improved reactivity of setMousePosition on OS X (#290)

- Fixed mouse moved event on OS X when dragging the cursor (#277)

- Added support for right control key on OS X

- Fixed KeyRelease event with CMD key pressed (#381)

- Improved TextEntered for OS X (#377)

- Fixed taskbar bugs on Windows (#328, #69)

- Improved the performance of Window::getSize() (the size is now cached)

- Added the WM_CLASS property to SFML windows on Linux

- Fixed Window::getPosition() on Linux (#346)

- Fake resize events are no longer sent when the window is moved, on Linux

- Unicode characters outside the BMP (> 0xFFFF) are now correctly handled on Windows (#366)

- Pressing ALT or F10 on Windows no longer steals the focus

sfml-graphics

- Fixed bounding rect of sf::Text ignoring whitespaces (#216)

- Checking errors in RenderTarget::pushGLStates() to avoid generating false error messages when user leaves unchecked OpenGL errors (#340)

- Solved graphics resources not updated or corrupted when loaded in a thread (#411)

- Fixed white pixel showing on first character of sf::Text (#414)

- Optimized Shader::setParameter functions, by using a cache internally (#316, #358)

- sf::Rect::contains and sf::Rect::intersects now handle rectangles with negative dimensions correctly (#219)

- Fixed Shape::setTextureRect not working when called before setTexture

sfml-audio

- Added a workaround for a bug in the OS X implementation of OpenAL (unsupported channel count not properly detected) (#201)

- loadFromStream functions now explicitly reset the stream (seek(0)) before starting to read (#349)

- Fixed SoundBuffer::loadFromStream reading past the end of the stream (#214)

sfml-network

- Replaced the deprecated gethostbyname with getaddrinfo (#47)

- Fixed non-blocking connection with a sf::TcpSocket on Windows

- Minor improvements to sf::Packet operators (now using strlen and wcslen instead of explicit loops) (#118)

- Fixed TCP packet data corruption in non-blocking mode (#402, #119)

- On Unix systems, a socket disconnection no longer stops the program with signal SIGPIPE (#72)

Examples

- Updated the Window and OpenGL examples (got rid of GLU and immediate mode)

Nice to see a quicker turnaround with the releases.  You can download it here.  It's also nice to see they are offering many more binaries; that was a point of great confusion for people starting out with SFML in the past.

 

24. July 2013

 

 

My somewhat recently published book, PlayStation Mobile Development Cookbook, is currently the top-ranked game programming book on Amazon…

 

…in Japan …on Amazon …in Kindle format.

 

But it’s still pretty cool.

 

I check Amazon sales rank every once in a while to see how my book is doing (it’s the only insight into book sales I have, amazingly enough), and today when I checked Japan, I saw:

 

[image: Amazon Japan sales rank screenshot]

 

Number one, baby!

 

Granted, Amazon tracks this value pretty close to real time, so I am the number one selling game programming book in Japan RIGHT NOW… in an hour I might be 50th… but I’ll take it!

 

In another bittersweet milestone, I also just found the book on pirate sites for the first time.  That actually took longer than I expected, to be honest.  I suppose an author or game developer should take it as a badge of honour that people will pirate what you created.

 

 

Oh, and Japan, you rock!

Totally Off Topic

24. July 2013

 

I have finally put together a single page for Project Anarchy tutorials.

 

[image: Project Anarchy tutorials page]

 

Basically, it’s just a static page where you can access all current and future Project Anarchy tutorials on GameFromScratch.  There are a couple of advantages to this, though.  First, it makes it easier to find things than using tags.  Second, since the tutorial series is in order, I suppose it’s nice to actually present them in order, eh?

 

Finally and perhaps most importantly, it gives a great place to leave suggestions, feedback and comments on the series as a whole.  So, if you have a question that isn’t specific to an individual tutorial, or you want to request a specific thing be covered, this page is the perfect place to post them.

Programming

23. July 2013

 

In this second part of the input tutorials, we are going to look at handling input on a mobile device.  We are going to cover:

  • Configuring the RemoteInput plugin in vForge
  • Connecting with an Android device
  • Handling touch
  • Handling acceleration

 

This tutorial is fairly short; once you’ve got RemoteInput up and running, most of the rest is pretty similar to things we have already covered, as you will soon see.

 

One major gotcha when working with a mobile device is emulating touch and motion controls on your PC.  Fortunately, Project Anarchy offers a good solution: RemoteInput.

 

RemoteInput

 

RemoteInput allows you to use an Android device to control your game in vForge.  This can result in massive time savings, since you don’t have to deploy your application to a device to make use of its controls.

 

First though, you need to enable the plugin.  In vForge, select Engine->Manifest Settings.

[image: Engine menu with Manifest Settings selected]

 

Then select the Engine Plugins tab.

[image: Engine Plugins tab]

 

Then click Add Plugin.

[image: Add Plugin button]

 

Then locate the vRemoteInput plugin; mine was at C:\Havok\AnarchySDK\Bin\win32_vs2010_anarchy\dev_dll\DX9.

 

[image: plugin file selection dialog]

 

You will now be prompted to reload your project; do so.

 

Next you need to add a small bit of code to your project.

 

-- Enable RemoteInput emulation globally
G.useRemoteInput = true

function OnAfterSceneLoaded(self)
  if G.useRemoteInput then
    -- Start the RemoteInput web server; the string is an arbitrary identifier
    RemoteInput:StartServer('RemoteGui')
    -- Create the emulated touch and motion devices
    RemoteInput:InitEmulatedDevices()
  end
end

 

 

This code needs to run before any input is handled; I chose OnAfterSceneLoaded(), but there are plenty of earlier callbacks that would also work.  Of critical importance is the global value G.useRemoteInput, which you need to set to true if you are going to use the RemoteInput plugin.  Next we start the server with a call to StartServer(), passing in an (arbitrary) string identifier.  We then initialize things with InitEmulatedDevices(); to be honest, I don’t know why you would ever defer this, so why not just do it in StartServer()?  Anyways… you need to call the init function.  You are now ready to use RemoteInput.

 

The next time you run your app with the newly added code, you will see:

[image: RemoteInput URL displayed at the top of the running app]

 

Now, on your Android device, open the web browser and enter the URL listed at the top of your app.  Once loaded, you can touch the screen and the input will be sent to vForge as local input.  Motion data is also sent, allowing you to use the accelerometer and multi-touch without having to deploy to an actual device.  If you change the orientation of the device, it updates in vForge.  One thing to keep in mind: the coordinates returned by RemoteInput are relative to the window size on screen, not those of your actual device.

 

There are a few things to be aware of.  First, your computer and mobile device both need to be on the same subnet… so both connected to the same wireless network, for example.  Next, it doesn’t require a ton of speed, but as I type this connected to a café hotspot seemingly made of bits of string and a couple of tin cans, I can tell you that on a slow network it’s basically useless.  Also, you may need to reload your browser on occasion to re-establish the connection.  Finally, the browser may not support motion data; for example, the stock browser on my HTC One works fully, but the Chrome browser does not.

 

Handling mobile input using Lua

 

Now we are going to look at how to handle touch and motion controls.  The code is virtually identical to the input code from the previous tutorial, so we won’t be going into any detail on how it works.

 

Handling Touch:

 

-- in the function you create your input map:
local w, h = Screen:GetViewportSize()

self.map:MapTrigger("X", {0, 0, w, h}, "CT_TOUCH_ABS_X")
self.map:MapTrigger("Y", {0, 0, w, h}, "CT_TOUCH_ABS_Y")

-- in the function where you handle input:
local x = self.map:GetTrigger("X")
local y = self.map:GetTrigger("Y")
-- Display touch location on screen
Debug:PrintLine(x .. "," .. y)

 

This code is taken from two sections: the initialization area (probably where you start the RemoteInput server), where you define your input map, and the function where you handle input, possibly OnThink().  It simply displays the touch coordinates on screen.  You can handle multiple touches using CT_TOUCH_POINT_[[N]]_X/Y/Z, where [[N]] is the 0-based index of the touch.  So, for example, CT_TOUCH_POINT_3_X would give the X coordinate of the fourth touching finger, if any.
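Following that pattern, a sketch of mapping the first two touch points might look like the following.  This is untested; the trigger names ("Touch0X" and so on) are arbitrary labels of my own, and I am assuming the touch-point triggers take the same screen-area parameter as CT_TOUCH_ABS_X/Y above:

```lua
-- In the function where you create your input map.
-- Hypothetical multi-touch mapping; "Touch0X" etc. are arbitrary names.
local w, h = Screen:GetViewportSize()

for i = 0, 1 do
  self.map:MapTrigger("Touch" .. i .. "X", {0, 0, w, h}, "CT_TOUCH_POINT_" .. i .. "_X")
  self.map:MapTrigger("Touch" .. i .. "Y", {0, 0, w, h}, "CT_TOUCH_POINT_" .. i .. "_Y")
end

-- In the function where you handle input: print both touch locations.
local x0 = self.map:GetTrigger("Touch0X")
local y0 = self.map:GetTrigger("Touch0Y")
local x1 = self.map:GetTrigger("Touch1X")
local y1 = self.map:GetTrigger("Touch1Y")
Debug:PrintLine(x0 .. "," .. y0 .. " / " .. x1 .. "," .. y1)
```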

 

Handling Acceleration:

-- in the function you create your input map:
self.map:MapTrigger("MotionX", "MOTION", "CT_MOTION_ACCELERATION_X")
self.map:MapTrigger("MotionY", "MOTION", "CT_MOTION_ACCELERATION_Y")
self.map:MapTrigger("MotionZ", "MOTION", "CT_MOTION_ACCELERATION_Z")

-- in the function where you handle input:
local motionX = self.map:GetTrigger("MotionX")
local motionY = self.map:GetTrigger("MotionY")
local motionZ = self.map:GetTrigger("MotionZ")

Debug:PrintLine(motionX .. "," .. motionY .. "," .. motionZ)

 

As you can see, motion and touch are handled virtually identically to other forms of input.  The value returned for MOTION triggers is the amount of acceleration along the given axis, with the sign representing the direction of the motion.
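As a quick, speculative illustration of using those signed values, the X axis reading could drive a simple tilt check.  The 0.5 threshold is an arbitrary value chosen for illustration, and which direction counts as "left" or "right" will depend on your device orientation:

```lua
-- In the function where you handle input:
local motionX = self.map:GetTrigger("MotionX")

-- Arbitrary dead-zone threshold; tune for your device
if motionX > 0.5 then
  Debug:PrintLine("Tilting one way")
elseif motionX < -0.5 then
  Debug:PrintLine("Tilting the other way")
end
```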

Programming
