Porting my game engine to the PlayBook

For the last few days I have been the happy owner of a BlackBerry PlayBook. The device was offered to me by RIM (thanks to Luca Filigheddu) in order to port Pop Corny to it. To tell you the truth, I had never owned a BlackBerry device before, let alone developed for one. It was a totally new experience, and I had no idea what to expect.

It turns out RIM has done an awesome job with the PlayBook, and probably with its upcoming phones (just speculating; I don't know for sure). The system is based on the QNX operating system and has strong support for standards and open libraries. I found myself right at home with it! I will come back with more details about the process (probably with an #AltDevBlogADay article), but to cut a long story short, I was able to port the engine with native code only (no Java glue code like on Android), with OpenGL, OpenAL (even ALUT), FreeType, etc. all coming bundled with the system. Continue reading

Porting your game from iOS to Android

So you created a C/C++ game for iOS that gives joy to iPhone and iPad gamers around the world. How can you deny this joy to all the loyal Android users? I can't, so I had to port Pop Corny to the Android platform. It was a very interesting experience, full of gain as I say, and I think it would be nice to share some information and knowledge on the subject. Continue reading

Writing portable code: A process full of gain

Lately I have been spending some of my time porting my game engine to the Android platform. It is a rather refreshing, interesting, rewarding, and also frustrating experience. All at the same time. The process helped me learn new lessons and remember some old ones I had forgotten.

Getting comfortable

First of all, I realized once again that we people get comfortable. And oh boy, do we get comfortable! I remember myself a few weeks back being frustrated with Xcode 4: how slow and sluggish it was compared to Xcode 3, how I didn't like the new environment, etc. Well, no more! All it took was a few days in Eclipse. Dialog windows popping up behind the main window, more than 500ms on most clicks on files, a kitchen-sink user interface; I could go on forever. And all you really get at the end of the day is a smart editor and a debugger that only works for the Java part. Compare that to Xcode with its memory profilers, time profilers, filesystem profilers, network profilers, battery optimizers, and the very helpful OpenGL state inspector and logger; there is really no comparison. I had forgotten what it was like to develop on other platforms, and how amazed I was initially with the special treatment Apple gives developers with its tools. What amazes me more is that I don't come from such a "comfy" background. The initial version of Sylphis3D was developed in parallel on Linux and Windows, mostly using no IDE at all, and I never found the tools a problem. As it turns out, hardship builds character, while comfort breaks it.

Portable software is good for you

Building portable software is highly valued in my mind, because it helps you become a better software engineer while producing better quality software at the same time. You get to learn many different development environments, understand their design decisions, work around platform differences, think further ahead, etc. All these require you to gain a deeper understanding of your code and your dependencies. It always pushes you towards a more organized code structure and reveals bugs that would otherwise go unnoticed until Murphy's law decides it is the worst possible time for them to trigger.

So if you are a software engineer, don't get too comfortable with your development and target environment, no matter how attractive that environment makes it! Make your code portable, to keep both yourself and the code in better shape. After all, wouldn't it be cool to run your code on a future toaster?!

Beta testers on Android wanted – Ask inside

Pop Corny on Android Sony Xperia X10 Mini

If you follow me online you probably know that for the last few weeks I have been porting Pop Corny to Android. I can say that it is a great experience, and I myself can't wait to put it out there. I am also going to blog about my experience and will try to provide any valuable information about the process. However, the plague of Android that goes by the name "fragmentation" is creeping in, and I need your help to fight it!

Yes, that is right: if you have an Android device you can be my soldier. Do you have what it takes? Are you prepared to suffer finger damage from extreme screen swiping? Do you know the history of pop corn? Are you prepared to play a game that will probably crash every 5 minutes and not throw the device out of the window? You do?! Just register on the form and I will contact you with more details:


To tell you the truth, this is going to be more of an ALPHA-BETA testing phase, as I don't have an Android device myself. I did try it out on 2-3 real devices, but it will quite likely take some time until I have a stable beta running on most devices. This will require patience on your side. If you are not interested yourself, tell a friend. I will need all the help I can get! If you also happen to have an old device (new ones will do too! 😀 ) that you don't mind sacrificing in the name of game development, I would gladly accept it as a testing device, and you would gain a special place in the game's credits and, more importantly, in my heart. For this, contact me directly at my email: harkal at gmail dot com.

Let's get it started!!

C++ Concurrency and Parallelism

One of the blessings and curses of C++ is its Standardization Committee. The bureaucrats who steer the language can surely be both. I admit that I have found myself frustrated with how slowly C++ catches up with progress, but when I think about it, I can't see another way to move forward such a huge language, with so many different forces pushing for its evolution. Today I read this article about the implementation of threading, parallelism and concurrency in C++, and it explains quite well why questions like "y u no standard threading, C++?" don't have an easy answer.

Link: The Future of C++ Concurrency and Parallelism

My game design hat

One of the main traits that an indie developer must cultivate is that of wearing many hats. All indie game developers will tell you about it, and it is the first thing you will realize when going indie. Wearing many hats is usually the result of small budgets. Small budgets mean fewer heads, but an equal number of hats. What I learned through the course of developing my indie game is that your success depends on how well your head fits those hats. Your game will simply be as good as the worst fit.

One of the hats I had to wear for the development of "Pop Corny" was that of the game designer. The closest I had ever come to game design before this was playing games with a little more inquiring spirit than most players do. This can definitely be interpreted as a bad hat fit. It was clear that in order to have a successful game, I had to find clever ways to improve the fit. This could be done either by adjusting the head (becoming a better game designer) or by adjusting the hat (reshaping the problem itself into something I could handle). It was obvious that I had to do the former as much as possible, but without the latter I was not going to get far. Continue reading

Predictable garbage collection with Lua

In one of my previous posts I talked about how you can make the Lua garbage collector (GC) more predictable in its running time. This is a virtue that is highly valued in a GC used in games, where you don't have the luxury of going over your frame time. In that post I described a solution to the problem which works fine most of the time, leaving little room for garbage collection times that will hurt the framerate. However, I ended that post with a promise to provide a better solution, and in this post I deliver.

The ideal situation would be to have the GC run for a specific amount of time. This way the game engine would be able to assign exact CPU time to the GC based on the situation. For example, one strategy would be to give a constant amount of time to the GC per frame. Let's say 2ms every frame. Or it could be cleverer and take other parameters into consideration, like the amount of time it took to do the actual frame. Is there time left in this frame? If there is, spend some of it on GC; if not, hold it for the next frame, when things might not be so tight. Other parameters can be memory thresholds, memory warnings, etc.

All of the above depend on a GC that can be instructed to run for an exact amount of time. This kind of GC is what we call a realtime GC. And Lua does not have one. However it turns out that we can get very close to realtime with minor changes to the Lua GC.

The patch below modifies the behavior of the GC in the way we need it:

--- a/src/lgc.c
+++ b/src/lgc.c
@@ -609,15 +617,14 @@ static l_mem singlestep (lua_State *L) {
 void luaC_step (lua_State *L) {
   global_State *g = G(L);
-  l_mem lim = (GCSTEPSIZE/100) * g->gcstepmul;
-  if (lim == 0)
-    lim = (MAX_LUMEM-1)/2;  /* no limit */
   g->gcdept += g->totalbytes - g->GCthreshold;
+  double start = getTime();
+  double end = start + (double)g->gcstepmul / 1000.0;
   do {
-    lim -= singlestep(L);
+    singlestep(L);
     if (g->gcstate == GCSpause)
       break;
-  } while (lim > 0);
+  } while (getTime() < end);
   if (g->gcstate != GCSpause) {
     if (g->gcdept < GCSTEPSIZE)
       g->GCthreshold = g->totalbytes + GCSTEPSIZE;  /* - lim/g->gcstepmul;*/

The only missing part from the patch above is the getTime() function, which can be something like this:

#include <sys/time.h>

double getTime() {
    struct timeval tp;
    gettimeofday(&tp, NULL);
    return tp.tv_sec + tp.tv_usec / 1000000.0;
}
I guess however that everyone will want to use their own time function.

The patch modifies the code so that it stops based on a time limit, and not based on a calculated target amount of memory to be freed. The simplicity of the patch also comes from the fact that we "reuse" the STEPMUL parameter, which is no longer needed to control the aggressiveness of the GC. We now use it to hold the exact duration we want the GC to run for, in milliseconds. So the usage will be this:

lua_gc(L, LUA_GCSETSTEPMUL, gcMilliSeconds);
lua_gc(L, LUA_GCSTEP, 0);

The above code will run the GC for gcMilliSeconds ms. This way you will never blow your frame time budget because garbage collection took a little longer than expected to execute. Problem solved!
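To make the adaptive strategy from earlier concrete, here is a rough sketch of how an engine could compute the per-frame GC budget: measure how long the frame's real work took, and hand any slack (up to a cap) to the GC. The function name, the 16ms budget and the 3ms cap are illustrative choices of mine, not values from the engine:

```c
/* Decide how many milliseconds to give the Lua GC this frame.
 * frame_ms:  time already spent on update + render this frame
 * budget_ms: the total frame budget (e.g. ~16ms for 60 fps)
 * max_gc_ms: a cap so the GC never eats all of the slack */
static int gcBudgetForFrame(double frame_ms, double budget_ms, double max_gc_ms) {
    double slack = budget_ms - frame_ms;
    if (slack <= 0.0)
        return 0;            /* already over budget: skip GC this frame */
    if (slack > max_gc_ms)
        slack = max_gc_ms;   /* keep some slack as headroom */
    return (int)slack;
}

/* With the patched Lua, the per-frame call would then look like:
 *   lua_gc(L, LUA_GCSETSTEPMUL, gcBudgetForFrame(frame_ms, 16.0, 3.0));
 *   lua_gc(L, LUA_GCSTEP, 0);
 */
```

A busy frame gets little or no GC time, while a light frame donates its spare milliseconds, up to the cap.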

From Python to Lua

(This blog post was originally posted at #AltDevBlogADay)

All game developers, sooner or later, learn to appreciate scripting languages. That magical thing that lets others do your job for you, scales the team better, strengthens the game code/engine separation, and brings sandboxing, faster prototyping of ideas, fault isolation, easy parametrization, etc. Every game has to be somehow data driven to be manageable, and stopping at simple configuration files with many different custom parsers, without going the extra mile of adding a full scripting language, is 90% of the time a bad design choice.

Today the developer can choose from a large variety of scripting languages, or even go crazy and implement one of their own. It happens that the most favored language among game developers is Lua. It's easy to understand why Lua is the favorite, but other options are used as well: for example Python, and the recently rising force of JavaScript.

Here I would like to share some of my experience of moving a game engine from Python to Lua. Continue reading

Sylphis3D lighting, shadows, physics demonstration

This is a "memory lane" kind of post. As you probably already know, I have been working on an iOS port of Sylphis3D lately, and I have been going through some old videos from Sylphis3D. I must admit the feeling is overwhelming. All those nights struggling with algorithms, data structures, broken drivers, experimental scripting… The vibrant community of people surrounding the project, starving for more info on the progress. I really miss those days. I would like to share one of the oldest videos with you. The video below was "shot" in 2003 and is now of historical value! It features per-pixel normal-mapped lighting with realtime shadows from every light in the scene, coupled with realistic physics. Note that this was more than a year before DOOM 3 came out… Enjoy!