
Porting your game from iOS to Android

So you created a C/C++ game for iOS that gives joy to iPhone and iPad gamers around the world. How could you deny that joy to all the loyal Android users? I couldn't, so I had to port Pop Corny to the Android platform. It was a very interesting experience, and I think it would be nice to share some of the information and knowledge I gained on the subject.

The basics

First of all, if you are feeling comfortable in your Xcode environment, enjoying the feathery wings of mother Apple, get prepared for a rough landing in Android land. Be prepared to face lots of not-so-streamlined tools and basically no documentation. The NDK (the toolchain and libraries you need to build native apps on Android) has no relation to the SDK. Google is obviously working hard on bringing native code support to the platform, but we are not yet at the point where developing native code is as nice as it is with Java.

The first thing that you must get comfortable with is the build tools and process. Google exposes to NDK developers the same tools that are used to build the Android platform itself, and by tools I mean a set of shell scripts and makefiles. What you are actually asked to do in order to build your project is write a makefile fragment that gets included in the main makefiles provided by the NDK. This makes for a steep learning curve at the beginning that might demoralize some. However, once you get the hang of it, it works fine and is probably better than rolling your own makefiles.
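To give an idea of what such a fragment looks like, here is a minimal Android.mk sketch. The module name, source files and linked libraries are made-up examples, not the actual Pop Corny ones.

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

# Hypothetical module name and sources; adjust to your own project.
LOCAL_MODULE    := game
LOCAL_SRC_FILES := main.cpp engine.cpp
LOCAL_LDLIBS    := -llog -lGLESv2

include $(BUILD_SHARED_LIBRARY)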

In the end, what you are ultimately building is a dynamically linked library that Dalvik can load through JNI. That is right: your game will still be a Dalvik Java VM process that just calls out to your library.

Mixing Java and native code

So you will not be able to fully escape Java. Your code must get along with it, and that is actually something you want, as almost all of the Android APIs are still Java only. Most libraries for Android that you might want to use are also written in Java. For example, if you want OpenFeint for leaderboards and achievements, or Flurry for analytics, you need to talk to Java.

This is accomplished with the Java Native Interface (JNI). This interface allows Java code running in the VM to call out to, and be called back by, native code written in C/C++. Here is an example of the code needed to call Dashboard.open() from native code:

jclass cls = javaEnv->FindClass("com/openfeint/api/ui/Dashboard");
jmethodID open = javaEnv->GetStaticMethodID(cls, "open", "()V");
javaEnv->CallStaticVoidMethod(cls, open);
The only dark point in the above code is the "()V", which is the method's internal type signature: the way the Java VM describes the parameters and the return value of a method. The syntax is error prone, so I suggest you always use the "javap -s MyClass" command, which prints out all the methods along with their signatures, and copy and paste from there. Keep in mind that if you misspell a signature you will only find out at runtime.
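To make the signature syntax a bit more concrete, here is a hedged sketch of calling a static method that takes parameters. The class and method are hypothetical, not part of any real SDK: a String parameter becomes Ljava/lang/String;, a long becomes J, and V is still the void return type.

// Hypothetical example: calling static void submitScore(String board, long score).
jclass cls = javaEnv->FindClass("com/example/Leaderboards");
jmethodID submit = javaEnv->GetStaticMethodID(cls, "submitScore", "(Ljava/lang/String;J)V");
jstring board = javaEnv->NewStringUTF("highscores");
javaEnv->CallStaticVoidMethod(cls, submit, board, (jlong)12345);
javaEnv->DeleteLocalRef(board);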

Even though the latest versions of the NDK allow you to write a fully native activity, I went the traditional way: create the activity in Java and call out to native code from there.
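As a rough illustration (the library and method names are hypothetical, not the actual Pop Corny ones), the Java side boils down to an activity that loads the native library and forwards the interesting calls to it:

import android.app.Activity;
import android.os.Bundle;

public class GameActivity extends Activity {
    static {
        System.loadLibrary("game");   // loads libgame.so built by the NDK
    }

    private native void nativeInit();    // implemented on the C/C++ side

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        nativeInit();                     // hand control over to the engine
    }
}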

Input

Handling touch input is slightly more complex on Android than on iOS, as the designers of Android thought it would be cool to have the system pass in an array of "historical" touch samples along with each event instead of calling you for each one. Apart from that the story is the same. Just make sure you handle both ACTION_UP and ACTION_POINTER_UP events.
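A simplified, single-pointer sketch of unpacking such an event inside the activity or GL view; nativeTouch() is a hypothetical JNI entry point into the engine, and real multi-touch handling would also look at the pointer index:

@Override
public boolean onTouchEvent(MotionEvent e) {
    // Batched "historical" samples that arrived since the last event.
    for (int h = 0; h < e.getHistorySize(); h++)
        nativeTouch(MotionEvent.ACTION_MOVE, e.getHistoricalX(h), e.getHistoricalY(h));

    int action = e.getActionMasked();
    if (action == MotionEvent.ACTION_POINTER_UP)
        action = MotionEvent.ACTION_UP;     // a non-primary finger lifted: still an "up"
    nativeTouch(action, e.getX(), e.getY());
    return true;
}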

The major issue, however, which also applies to many other aspects of the port, is that these events come in on a different thread. This might surprise some iOS developers who are accustomed to almost everything happening on the main thread's run loop. It did surprise me at least, but it turns out that Android is very generous with threads. So, depending on how your engine is coded, you might have to queue up the events and then pass them to your native code from the thread it expects them on.
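If your engine consumes input on the rendering thread, GLSurfaceView.queueEvent() is one way to get the data there. A hedged sketch, with nativeTouch() again being a hypothetical JNI call:

// Inside the GLSurfaceView subclass: copy the values out on the UI thread
// (MotionEvent objects get recycled), then run the native call on the GL thread.
private void forwardTouch(MotionEvent e) {
    final int action = e.getActionMasked();
    final float x = e.getX(), y = e.getY();
    queueEvent(new Runnable() {
        public void run() {
            nativeTouch(action, x, y);   // executes on the renderer thread
        }
    });
}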

Finally, there is button’s -real hardware button’s- handling. You would want to handle at least the back and home button, in the way Android users expect them to work.

Sound

This is where the Android platform took me by surprise… Brace yourself… there is no OpenAL! It was one of those things you can't believe, and you keep looking, desperately denying the simple truth. So it is true: if you are hoping to easily port your OpenAL-based sound engine to Android, you are in for a big disappointment. I believe it has to do with some licensing rights or something. The choices you are left with are MediaPlayer, SoundPool and OpenSL ES. The first two are Java APIs, while the third is native.

MediaPlayer is basically for playing music and sounds that don't require low latency. I could have used it for playing music, but I decided to try OpenSL ES instead. I implemented the music-playing part of the engine on OpenSL ES and decided that I don't like the API. If I had known from the beginning, I would have gone straight for MediaPlayer, which is very straightforward.
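For reference, a minimal MediaPlayer sketch for looping background music out of the assets folder; the file name is a made-up example and error handling is omitted:

private MediaPlayer startMusic() throws IOException {
    AssetFileDescriptor afd = getAssets().openFd("music/theme.ogg");
    MediaPlayer player = new MediaPlayer();
    player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
    afd.close();
    player.setLooping(true);
    player.prepare();      // or prepareAsync() to avoid blocking here
    player.start();
    return player;
}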

The SoundPool class is very well suited for playing sound effects. It also does the sound decompression for you and stores the uncompressed, ready-to-play samples in memory. It has its drawbacks: in most of my test cases it couldn't handle effects bigger than 1MB.

The SoundPool class also has a bad history behind it. Due to a race condition in the code, SoundPool caused every app that used it to crash on the first dual-core phones, mainly the Samsung Galaxy S2 with the vanilla Android version. Can you imagine? You have your nice game running on the store, and one day a company releases a phone that makes your game crash… and sells millions of units! The fix from Samsung came a year later. Until then, game developers had to drop SoundPool and probably implement the same functionality in OpenSL ES, which, I can tell you, is not fun. The worst part is that even now that Samsung has released newer versions of Android without the problem, most users don't upgrade. So even last month, when I released Pop Corny, most S2s still had a buggy SoundPool. I decided not to drop SoundPool and instead detect when the game is running on a buggy version and simply not play sound effects at all.
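Basic SoundPool usage looks roughly like this; the effect file name is a made-up example, loading is asynchronous, and error handling is omitted:

private SoundPool pool;
private int popId;

private void loadEffects() throws IOException {
    pool = new SoundPool(8, AudioManager.STREAM_MUSIC, 0);     // up to 8 simultaneous streams
    popId = pool.load(getAssets().openFd("sfx/pop.ogg"), 1);   // decoded into memory in the background
}

private void playPop() {
    pool.play(popId, 1.0f, 1.0f, 1, 0, 1.0f);   // left/right volume, priority, loop count, rate
}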

Graphics

Thank god, Android does support OpenGL ES! You will have no problem here. Just be a little careful with the multi-threaded nature of Android and you will be fine (all GL commands must come from the GL thread). But you must be prepared for the variety of resolutions that Android phones and tablets have. You are no longer in the iOS ecosystem, where you can get away with just two aspect ratios (iPhone and iPad).

Pop Corny already supported the aspect ratios of the iPhone and the iPad, so I just generalized the code to accept a certain range of aspect ratios and to add black bars beyond that. For example, the exotic resolution of 480×854 pixels on some phones is so extreme that it can't be handled without redesigning the whole game, so it gets black bars.
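The idea, in a hedged sketch: clamp the aspect ratio the game has to cope with and letterbox or pillarbox the rest with a smaller viewport. The limits below are made-up examples and the code assumes a landscape orientation.

#include <GLES/gl.h>

void setupViewport(int screenW, int screenH)
{
    const float minAspect = 4.0f / 3.0f;    // e.g. iPad-like
    const float maxAspect = 16.0f / 9.0f;   // e.g. the widest we want to support
    const float aspect = (float)screenW / (float)screenH;

    int viewW = screenW, viewH = screenH;
    if (aspect > maxAspect)
        viewW = (int)(screenH * maxAspect);     // too wide: black bars left and right
    else if (aspect < minAspect)
        viewH = (int)(screenW / minAspect);     // too narrow: black bars top and bottom
    glViewport((screenW - viewW) / 2, (screenH - viewH) / 2, viewW, viewH);
}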

It is also useful to load only the appropriate texture mipmap level and below, depending on the screen resolution. This saves precious memory, especially on the low-end devices that usually come with the low-resolution displays.
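The trick is simply to skip the largest mip levels when uploading and treat the first level you keep as level 0. A hedged sketch, with Texture and MipLevel being hypothetical engine-side structures:

#include <GLES/gl.h>

struct MipLevel { int width, height; const void *pixels; };
struct Texture  { int numLevels; const MipLevel *levels; };

// On low-resolution screens pass a firstLevel > 0 to drop the biggest mips
// and save memory; the remaining chain is uploaded shifted down to level 0.
void uploadTexture(const Texture &tex, int firstLevel)
{
    for (int level = firstLevel; level < tex.numLevels; level++) {
        const MipLevel &mip = tex.levels[level];
        glTexImage2D(GL_TEXTURE_2D, level - firstLevel, GL_RGBA,
                     mip.width, mip.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, mip.pixels);
    }
}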

The major problem that you are going to face with OpenGL when porting to Android is dealing with the activity life cycle. As you probably already know, everything on Android is an activity. Even a small dialog box that you bring up is an activity.

The problem is that when the dialog comes up, it pauses your current activity, and if that activity was your OpenGL view, Android will trash your OpenGL context! That means that when the dialog goes away, you will have to reload every OpenGL resource before you can get back to rendering. The same applies when the user puts your application in the background or takes a call mid-game.

Reloading all your textures whenever something like that happens is unacceptable. This one took me a while to sort out, possibly because I had no actual Android device to test on and was relying on the slow beta-tester round trip. Anyhow, it turns out this was addressed in Android 3.0: the GLSurfaceView of that version adds a method named setPreserveEGLContextOnPause(boolean) that, when set to true, tries to preserve the GL context. But as you know, very few people upgrade in the Android ecosystem. So what I did was take the GLSurfaceView class from the latest Android sources, make some changes, and use that instead of the one on the user's phone. Simple as that.

However, even with that, many phones were losing the GL context. It turns out that GLSurfaceView did not preserve the context when the GPU was an Adreno, regardless of whether the GPU actually supported multiple contexts. Well, all the Adreno-based devices I tried can preserve the context, and simply removing that check in GLSurfaceView's source allows the game to continue instantly after an activity pause. Case closed.
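For completeness, a minimal sketch of requesting context preservation, assuming an API level 11+ GLSurfaceView (or a back-ported copy of it) and a hypothetical GameRenderer class that implements GLSurfaceView.Renderer:

GLSurfaceView view = new GLSurfaceView(this);
view.setEGLContextClientVersion(2);          // assuming a GLES 2 renderer
view.setPreserveEGLContextOnPause(true);     // best effort; the context can still be lost
view.setRenderer(new GameRenderer());        // hypothetical renderer implementation
setContentView(view);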

Assets

The final thorn in the porting endeavor was asset management and loading. Those coming from iOS will be surprised to find out that Android does not decompress the application bundle when installing an application, like iOS does. The files remain inside the .apk file, which is essentially a zip file. This causes a number of problems. You can't just use your trusted system calls to open a file and read it. You have to open the apk, poke around in it, find your file, decompress it, and then use it.

For some files you can skip the decompression part. There are certain kinds of files that the build process stores uncompressed in the apk, mostly media files that are already compressed. If you use ant for building, you can actually add more file extensions to the no-compression list; unfortunately, I didn't manage to find a way to do it with Eclipse. These files can be loaded easily (with the usual file manipulation functions) using the file descriptor of the apk, plus an offset and a length that you can get from the Java AssetManager. In the case of a compressed file, however, you have to load it completely in Java using the asset manager and copy all the data over to C with JNI, which is inefficient.
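On the Java side, obtaining that descriptor/offset/length triple for an uncompressed asset looks roughly like this. The file name and the nativeLoadAsset() entry point are hypothetical, error handling is omitted, and openFd() will throw for compressed entries:

AssetFileDescriptor afd = getAssets().openFd("levels/world1.dat");
int fd      = afd.getParcelFileDescriptor().getFd();   // descriptor of the apk itself
long offset = afd.getStartOffset();                    // where the entry starts inside it
long length = afd.getLength();
nativeLoadAsset(fd, offset, length);                   // the native side can read from there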

Thankfully, Google added native asset-loading capabilities in version 2.3. So if you are only supporting 2.3 and up, you can forget all of the above and use the native API directly; it does all the work for you.
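A hedged sketch of the native route (available from API level 9); the asset name is a made-up example and the AssetManager object is the one you pass down from Java:

#include <jni.h>
#include <android/asset_manager.h>
#include <android/asset_manager_jni.h>

// Read an asset entirely in native code; the NDK decompresses it if necessary.
static void loadAsset(JNIEnv *env, jobject javaAssetManager)
{
    AAssetManager *mgr = AAssetManager_fromJava(env, javaAssetManager);
    AAsset *asset = AAssetManager_open(mgr, "levels/world1.dat", AASSET_MODE_BUFFER);
    if (!asset)
        return;
    const void *data = AAsset_getBuffer(asset);          // whole file, ready to use
    size_t size = (size_t)AAsset_getLength(asset);
    // ... hand data/size over to the engine ...
    AAsset_close(asset);
}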

Closing words

As you can see, the Android platform has its quirks, most of the time due to the NDK still being too young. It is getting better with every new version, though. If only Android users were quick to upgrade to the latest version…

To all the above, add that you probably want to compile for three different CPU architectures: ARM, ARMv7 and x86. That's right, x86. There are quite a few tablets out there based on x86 right now, and we will probably see even more with time.
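With the NDK build system, that amounts to one line in Application.mk (ABI names as they were at the time of writing):

# Application.mk: build the native library for all three CPU families.
APP_ABI := armeabi armeabi-v7a x86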

It might be a little cumbersome at times, but the effort really pays off. In the end you have a whole new world for your game to explore. And the Android users are very welcoming and warm, a lot more than iOS users, I think. So let's give them our games!

Good luck!
