Since we are sharing the development process of Penumbear as we build, I thought it would be good to start with the tools and technology we are building with.
Penumbear will be coming to iOS, so we chose a framework that we already have a lot of experience with on that platform: cocos2d. Cocos2d is an excellent framework for 2d games and a great community has built up around it. Many common problems that you will encounter while creating a game have already been solved by the community. There are also a number of great tools that have cocos2d support built in that will save lots of time. If you are just getting started with cocos2d, then you can check out Part 1, Part 2, and Part 3 of our letter to a cocos2d noob. Also be sure to read the tutorials on Ray Wenderlich's site.
We've decided to use cocos2d 2.0 for this project. This buys us OpenGL ES 2.0 support (and therefore shaders), particle batch nodes, and a number of other improvements. We lose support for the iPhone 3G, 2nd gen iPods, and iOS 3.x.
In addition to cocos2d, we are also using UIKit as part of an in-game level editor. We aren't worried about tight integration with cocos2d or skinning the controls because the editor is for development purposes and not part of the final game.
Quick rundown of the tools we are using:
TexturePacker - Packs assets into a sprite sheet.
PhysicsEditor - Used to trace shapes to define the polygons for sprites. Also allows you to edit other properties related to your physics engine of choice. We are using it in conjunction with chipmunk. There are a number of approaches to use when building a platformer. In a later post we will talk about how Physics Editor fits into the approach we are using.
ParticleDesigner - Allows you to see the results of modifying particle system configs without having to restart the simulator over and over.
GlyphDesigner - Converts TTF and other fonts into bitmap fonts. We convert all our fonts to bitmaps because updating BMFont labels is much faster than updating TTF labels in cocos2d. Updating a TTF label costs as much as constructing a new label on every update.
Spriter - An animation tool we helped fund through Kickstarter. Steve is currently leaning towards hand animating most of the koala's frames, but Spriter is still extremely useful for menu animations and other parts of the game.
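To illustrate the bitmap font point above, here is a minimal sketch (the "score.fnt" file name is illustrative; it stands in for whatever GlyphDesigner exports):

```objectivec
// Bitmap font label: each glyph is just a quad mapped into the
// pre-rendered atlas, so setString: only remaps texture coordinates.
CCLabelBMFont *scoreLabel = [CCLabelBMFont labelWithString:@"0"
                                                   fntFile:@"score.fnt"];
[self addChild:scoreLabel];

// Cheap to call every frame:
[scoreLabel setString:@"1337"];

// By contrast, every setString: on a CCLabelTTF re-renders the whole
// string into a brand new texture, which is why it is so much slower.
```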
If you have any questions, let us know and we will do our best to answer them. We hope to be as open about the development process of Penumbear as possible.
We would also love your feedback on the project. If you aren't already, follow us on Twitter for regular updates.
In the past we have developed in secret and then revealed the game to the world when everything is buttoned up and ready for release. We've decided that for this title, we are going to open up the development process and share the experience of building a game. We are happy to present… Penumbear.
Yep, we are really calling it Penumbear. At least for now as the project's code name. Originally we were using Penumbra, but that name was already taken by another game. The name of the game will likely change prior to release, but we need something to call it in the interim.
This project started off as a tech demo. Sal was building a dynamic lighting engine for another project we are working on.
Various game mechanics that utilize light kept popping into Sal's head as he developed the engine. One of these ideas grew into Penumbear.
In Penumbear, you are a koala that walks on the line between light and shadow. You turn lights on and off, hopping from shadow to shadow, as you find your way from room to room. In addition to an array of lighting sources, a variety of dangers stand in your way creating a puzzle out of every room.
Sal usually draws some placeholder art as he develops a prototype and then Steve starts to refine the style until we settle on an aesthetic that works for the game. Steve sketched a number of possible main characters:
We settled on the bear. Who doesn't love a bear who dwells between light and dark? And yes, we know koalas are not actually bears. Our koala is not actually a koala.
We plan on posting regular updates on the development process, screencasts of art being drawn, gameplay videos, and anything else that comes up while developing Penumbear. We would also love your feedback on the project. If you aren't already, follow us on Twitter for regular updates.
The ship in Omegapixel is controlled via a virtual joystick. The touch screen is not well suited to joystick-like input, and it is hard to get right. Omegapixel was built using cocos2d so we had the option to use SneakyInput, but we opted to roll our own code.
We've received a number of questions about the input code that we used, so I've decided to publish pseudocode of the implementation:
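A minimal sketch of a floating-joystick handler in cocos2d (the ivar names `anchor_`, `stick_`, `active_` and the 48-point radius are illustrative assumptions, not the actual Omegapixel source):

```objectivec
// The stick anchors wherever the player first touches, so no fixed
// on-screen pad is required.
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    anchor_ = [self convertTouchToNodeSpace:touch];
    active_ = YES;
}

- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if( ! active_ ) return;
    UITouch *touch = [touches anyObject];
    CGPoint now   = [self convertTouchToNodeSpace:touch];
    CGPoint delta = ccpSub(now, anchor_);

    // Clamp to a maximum stick radius, then normalize to [-1, 1].
    float len    = ccpLength(delta);
    float radius = 48.0f;   // assumed stick radius in points
    if( len > radius )
        delta = ccpMult(delta, radius / len);
    stick_ = ccpMult(delta, 1.0f / radius);  // the ship reads this each frame
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    active_ = NO;
    stick_  = CGPointZero;
}
```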
Sketchshare had an amazing run even though it was rather short. Sketchshare was a great app; he was the kind of app that would allow you to work collaboratively when you most needed to. Unlike most sketching apps, he had usefulness above and beyond drawing a doodle.
I remember when I first met Sketchshare. @abitofcode had announced his birth on Twitter and the cocos2d forums. As I got to know Sketchshare, I was jealous that I had not built him myself, which is a compliment to his parents, @abitofcode and @creativewax.
Sketchshare's removal was sudden. When I heard the news I could not believe it. Sketchshare was so young, but that short life was lived wonderfully. Featured by Apple, adored by critics, loved by fans. The amazing art that his existence facilitated will be missed.
Sketchshare has been taken from us by Apple. This is not a time for us to grieve but to remember that we do not know how long our apps have on this platform. We must live and develop accordingly. As developers it will always hurt when our apps are taken away. As users the pain is just as real.
Sketchshare will be forever missed and I hope that someday I will meet him again. I am thankful that I was given the chance to know the app named Sketchshare.
In a previous post I discussed the approach that I used to make a Universal app for iPhone and iPad. That approach circumvented some problems others were having on the iPad 3 with cocos2d. Even though apps continued to run properly on the iPad 3 with that approach, we were not taking full advantage of the extra pixels.
The first thing we need to do is generate new high res assets. I am going to skip that step for now and assume you have higher resolution assets. Add the suffix "-hdpad" to the end of your high res assets or pick your own suffix and adjust the code below accordingly.
Next we need to define our Retina iPad suffix in ccConfig.h:
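Something along these lines (the macro name `CC_RETINA_IPAD_FILENAME_SUFFIX` is our own choice; the "-hd" define already exists in ccConfig.h):

```objectivec
// ccConfig.h, next to the existing Retina iPhone suffix:
#define CC_RETINA_DISPLAY_FILENAME_SUFFIX @"-hd"     // already present
#define CC_RETINA_IPAD_FILENAME_SUFFIX    @"-hdpad"  // new
```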
Then we need to edit CCFileUtils to make sure we use the "-hdpad" suffix when a Retina iPad is running our Universal app:
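A sketch of the modification, based on how cocos2d 0.99.5's suffix lookup works (the surrounding code differs between cocos2d versions, so treat this as a guide rather than a drop-in patch):

```objectivec
// CCFileUtils: pick a suffix based on the content scale factor.
+(NSString*) getDoubleResolutionImage:(NSString*)path
{
#if CC_IS_RETINA_DISPLAY_SUPPORTED
    CGFloat scale = CC_CONTENT_SCALE_FACTOR();
    NSString *suffix = nil;
    if( scale == 4 )        // Retina iPad: 2 (pixels) * 2 (iPad point scale)
        suffix = CC_RETINA_IPAD_FILENAME_SUFFIX;
    else if( scale == 2 )   // Retina iPhone, or "-hd" assets on the iPad
        suffix = CC_RETINA_DISPLAY_FILENAME_SUFFIX;

    if( suffix ) {
        NSString *pathWithoutExtension = [path stringByDeletingPathExtension];
        NSString *name = [pathWithoutExtension lastPathComponent];
        if( ! [name hasSuffix:suffix] ) {
            NSString *extension  = [path pathExtension];
            NSString *retinaName = [pathWithoutExtension stringByAppendingString:suffix];
            retinaName = [retinaName stringByAppendingPathExtension:extension];
            if( [[NSFileManager defaultManager] fileExistsAtPath:retinaName] )
                return retinaName;
            CCLOG(@"cocos2d: CCFileUtils: Warning HD file not found: %@",
                  [retinaName lastPathComponent]);
        }
    }
#endif
    return path;
}
```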
You will notice in the above code we are checking for the Retina iPad by checking to see if the content scale factor is equal to 4. If we load "-hd" assets and scale the points by 2 on the iPad, then we will need to double that for the Retina iPad.
We also need to update the utility function that cleans up the hd suffix:
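A sketch of that cleanup function, extended to strip the new suffix as well as the old one (modeled on cocos2d 0.99.5's `ccRemoveHDSuffixFromFile`; details vary by version):

```objectivec
// Strip resolution suffixes so texture cache keys stay consistent.
NSString *ccRemoveHDSuffixFromFile( NSString *path )
{
#if CC_IS_RETINA_DISPLAY_SUPPORTED
    if( CC_CONTENT_SCALE_FACTOR() > 1 ) {
        NSString *name = [[path stringByDeletingPathExtension] lastPathComponent];
        // Check the new "-hdpad" suffix as well as the old "-hd" one.
        for( NSString *suffix in [NSArray arrayWithObjects:
                CC_RETINA_IPAD_FILENAME_SUFFIX,
                CC_RETINA_DISPLAY_FILENAME_SUFFIX, nil] ) {
            if( [name hasSuffix:suffix] ) {
                NSString *newName = [name stringByReplacingOccurrencesOfString:suffix
                                                                    withString:@""];
                NSString *dir    = [path stringByDeletingLastPathComponent];
                NSString *result = [dir stringByAppendingPathComponent:newName];
                return [result stringByAppendingPathExtension:[path pathExtension]];
            }
        }
    }
#endif
    return path;
}
```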
Next we need to modify the enableRetinaDisplay flow outlined in the previous post to enable Retina on the iPad 3.
The first thing we do is create a new enableRetinaDisplay:onPad method so we can choose not to enable the Retina Display on the iPad 3 if we don't have the assets to support it:
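A sketch of the new entry point (the method name follows the post; the body of the original enableRetinaDisplay: continues where the comment indicates):

```objectivec
// CCDirectorIOS (sketch)
-(BOOL) enableRetinaDisplay:(BOOL)on onPad:(BOOL)enableOnPad
{
    // Bail out on the iPad unless the caller explicitly opted in,
    // i.e. unless the "-hdpad" assets are actually available.
    BOOL isPad = ( UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad );
    if( isPad && ! enableOnPad )
        return NO;

    // ... the existing enableRetinaDisplay: body continues here ...
}

// The old entry point keeps existing apps working unchanged:
-(BOOL) enableRetinaDisplay:(BOOL)on
{
    return [self enableRetinaDisplay:on onPad:FALSE];
}
```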
You will notice that the old enableRetinaDisplay method calls the new method but passes a default value of FALSE. This will keep our app working as is until we have our "-hdpad" assets ready.
The next step is to modify the content scale we will pass to setContentScaleFactor. Remember that we want to end up with a content scale factor of 4.
Notice that if we are not supporting the Retina iPad, then we set the scale exactly as we did before. If we are supporting the Retina iPad, then we set the new scale equal to the scale times the point scale factor which is set to 2 in makeUniversal. Hey Maw, look at my handy chart:
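In code, the scale selection looks roughly like this (a sketch; `retinaOnPad` is an illustrative name for the flag passed into the new method):

```objectivec
// Inside enableRetinaDisplay:onPad: (sketch)
float newScale;
if( retinaOnPad )
    // __ccPointScaleFactor is 2 after makeUniversal, and the iPad 3's
    // screen scale is 2, so this yields a content scale factor of 4.
    newScale = [[UIScreen mainScreen] scale] * __ccPointScaleFactor;
else
    newScale = [[UIScreen mainScreen] scale];

[self setContentScaleFactor:newScale];
```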
Next we have to take care of the line that saved us from having Retina iPad headaches in the first place:
Referencing our handy chart, you will see that __ccPointScaleFactor is not one. This means that __ccContentScaleFactor is ignored and we use one instead. We need to generalize this line so that it still works under the old case, but will also handle the new scenario:
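The generalized line divides the content scale by the point scale (a sketch; `openGLView_` is the director's GL view ivar):

```objectivec
// updateContentScaleFactor (sketch). The old hard-coded line forced the
// view's scale factor to 1 whenever points were being scaled; dividing
// generalizes it:
//
//   non-Retina iPad:             2 / 2 = 1  (unchanged behavior)
//   Retina iPad, mode disabled:  2 / 2 = 1  (unchanged behavior)
//   Retina iPad, mode enabled:   4 / 2 = 2  (Retina Display turns on)
//
[openGLView_ setContentScaleFactor: __ccContentScaleFactor / __ccPointScaleFactor];
```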
Now we will still end up with a scaleFactor of 1 on the non-Retina iPad and on the Retina iPad if we did not enable Retina iPad mode. If Retina iPad mode was enabled, then we end up with a scale factor of 2 which enables retina mode on the device.
Here are all the changes to CCDirectorIOS.m in one handy gist:
Now you have a universal app that supports the full resolution of the iPad 3.
Preparing your asset pipeline to fit into the hd, hdpad, sd world will take a bit of work. If you have vector or pixel art, then your job is easier but no one wants to export every asset three times. Some of the common tools like Texture Packer will likely build in support for this in the future. In the interim you can probably get by with a few command line scripts.
Caveat: The modifications in this post were done to cocos2d 0.99.5. There are minor differences between the supplied code and the changes you should make to cocos2d 1.1.
Caveat Emptor: You aren't actually buying anything but use at your own risk anyway. There will likely be tweaks and adjustments to this code in the future.
Note: If you are building a stand alone iPad version, then I recommend you use the latest updates in the develop branch for cocos2d 1.1. They have already added Retina iPad support. This code provides a simple way to create a Universal app.
Recommendation: If you haven't had them before, I recommend you go buy some tomatillos and try them on your next taco in place of tomato. Prepare yourself for a surprising and unique flavor.
The March 2012 Apple Keynote brought no real surprises. The new iPad (hereinafter iPad 3 despite protestations) has a Retina Display with a whopping 3.145728 million pixels (2048x1536). I have some worries about whether the device will have enough oomph to push those pixels, but we will burn that bridge when we get to it.
Like a good little Apple dev, I downloaded the newest Xcode and iOS Simulator to test out my apps. The rest of the cocos2d community was doing the same, and that is when the excrement came in contact with the bladed cooling device. Devs started to report that their apps were showing a black screen on the iPad 3 simulator. More reports came in of other strange behavior.
I had only tested out my current project when I first heard the buzz. It hadn't shown any problems but I was worried about my apps that were already for sale. I fired up every project and tested them one by one. They all behaved exactly as expected, but why? What was the difference between my apps and all of the apps that were having problems?
The root of the problem was that the iPad 3 was behaving exactly as it should. Universal cocos2d applications that support the Retina Display on the iPhone often have this snippet of code in the app delegate:
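The snippet in question is the standard one from the cocos2d 1.x app delegate template:

```objectivec
// Typical Retina-enabling code in a cocos2d app delegate:
if( ! [director enableRetinaDisplay:YES] )
    CCLOG(@"Retina Display Not supported");
```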
The enableRetinaDisplay method attempts to turn on the Retina Display if the hardware supports it. It turns out that the iPad 3, with its Retina Display, activates the Retina Display when asked to do so. The problem is that since the rest of cocos2d and our apps aren't expecting this to happen on an iPad, all sorts of craziness happens.
The good news is that Apple didn't go crazy and that things are behaving as one might expect. The bad news is that we have code built around an assumption that is now wrong. So why did my code work? Clearly I must not have had that handy snippet of Retina enabling code from above. WHAT?!?! I did? Oh, something else must be going on then.
The solution to the problem was a bit of an accident. I wanted to create universal apps that took advantage of the iPad's resolution without needing new assets. With careful asset management and a few hacks it is easy to use your Retina iPhone assets on the iPad. This helps you reduce your app's footprint and take advantage of additional pixels. DOUBLE RAINBOW!
The method that I use is partially outlined in this forum post. It adds a makeUniversal method to CCDirector which you call immediately after instantiation of the director in your app delegate. Here is the code:
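A sketch of that method, per my reading of the forum-post approach (`__ccPointScaleFactor` is the global added by that patch):

```objectivec
// CCDirector category or addition (sketch)
-(void) makeUniversal
{
    if( UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad ) {
        // Scale points by 2 so Retina iPhone ("-hd") assets fill
        // the iPad's larger screen without new artwork.
        __ccPointScaleFactor = 2;
    }
}
```

Call it in the app delegate immediately after `[CCDirector sharedDirector]` is first created, before any scenes or assets are loaded.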
This method sets the __ccPointScaleFactor to 2, which sets the scaling between points and pixels. This also causes an escape to be skipped in enableRetinaDisplay, which causes the content scale factor to be updated to 2. The content scale factor is used by cocos2d to determine if the "-hd" assets should be used.
This has another side effect due to a check in updateContentScaleFactor which is indirectly called by the enableRetinaDisplay method:
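The check looks roughly like this (a sketch; `openGLView_` is the director's GL view ivar):

```objectivec
// Inside updateContentScaleFactor (sketch)
if( __ccContentScaleFactor != 1 && __ccPointScaleFactor != 1 ) {
    // Points are already being scaled up, so don't also scale the
    // backing store: pin the view's scale factor to 1.
    [openGLView_ setContentScaleFactor:1];
}
```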
If __ccContentScaleFactor is 2, then the method forces the display's scaleFactor to 1. This means that we don't actually activate the Retina Display even though we call enableRetinaDisplay. We are now safely using iPhone Retina assets on all (existing) models of the iPad. WOOOOOOOT!
But All Those Pixels!
You are right, this does not take advantage of all the extra pixels that the iPad 3 has available. You will need higher resolution assets to fully take advantage of the new iPad, so we need another approach. We can go over that in another post. (Edit: In fact, I just wrote up a post on how to make a truly universal iOS application here.)
Wow, it's been a long time. I shouldn't have left you. Without a new post to review. I was sucked into a new project that started off as a Ludum Dare Jam and then turned into a full game (check it out after you read this). I learned a lot going through the process and I recommend a game jam for any aspiring game dev. Unfortunately, adding a new project into the mix meant the next installment of this series was delayed. Time to dig back into my archive of horrible mistakes...
When the retina display came out Apple gave us a simple way to automagically load retina assets instead of "SD" assets. Adding "@2x" to the retina version of the asset solved all our woes.
When I first came to cocos2d I was already in the habit of adding "@2x" to all of my retina assets, so I continued to do so. This is not the recommended naming convention for cocos2d because Apple's special handling of the "@2x" images can cause subtle bugs when interacting with cocos2d. Instead we should add "-hd" to all retina display images. Do this from the start and avoid having to unleash your command-line-fu to rename all your images. It is amazing how many issues can be avoided by just reading the documentation.
Another bonus of the "-hd" suffix is that it works for other asset types such as plists, fonts, and TMX files. If you need to load different assets for retina than for sd, then just slap an "-hd" on the end of the retina version.
Ticking Away The Moments
Within the main game loop a monster lies in wait. It will devour noob game developers indiscriminately. It doesn't matter if you are using cocos2d, another framework, or no framework at all. This monster's name: Time.
A very important thing to recognize is that not every step through the game loop will take the same amount of time. This is especially true as you move from one device to another.
A Story: You have carefully tweaked your monster's speed to charge at the player at a rate of 10.698 units per update. The times through your update loop are variable, but you haven't caught on because they are always similar enough for the effect to go unnoticed. Now you load the game on your friend's shiny new iPhone 5 with its awesomely powerful processor. Holy badunks! You can no longer react fast enough to escape the monsters! This game is impossible!
Since you used a fixed delta per game loop iteration and since the faster device can iterate more times per second, you have ended up with a game that no longer works as intended on a faster device. Don't feel too bad about it, many older games built for a specific platform suffer the same fate when run on newer hardware.
Luckily, this is an easy to solve problem if you address it from the start. Take the delta into account with each update and you will avoid this problem entirely. If you have a step method that looks like this:
-(void) step: (ccTime) delta;
Multiply the time delta by a velocity to determine the positional delta.
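For example (a sketch; `monster_` and the velocity value are illustrative):

```objectivec
// Scale movement by elapsed time, not by frame count. velocity is in
// units per SECOND, so the monster covers the same ground per second
// regardless of how often step: fires.
-(void) step:(ccTime)delta
{
    CGPoint velocity = ccp(10.0f, 0.0f);          // units per second
    CGPoint change   = ccpMult(velocity, delta);  // units this frame
    monster_.position = ccpAdd(monster_.position, change);
}
```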
If you are using a physics system like chipmunk, then the delta should be passed into the step method (cpSpaceStep). The physics engine will then take care of all of the grunt work for you, but you still need to be careful with anything you update from outside the physics system.
Aside: There are two ways that physics engines can be updated, fixed step or variable step. There are pros and cons to each approach. Engines will facilitate one or both approaches. A good starting place to learn more is this thread on gamedev.stackexchange.
The last tip of the day comes in the form of some FREE code. The internet is just full of this stuff! (Always check the attached licenses.) Odds are you will eventually need to shuffle the elements of an array. An easy way to do this is to create a category on NSMutableArray. This will give you a highly reusable piece of code, and you won't have to write new shuffling code everywhere you need to randomize array elements.
A discussion on shuffling an NSMutableArray using a category can be found over on StackOverflow. I have included my own slightly modified version below for ease of copy-paste.
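A sketch of the Fisher-Yates style category (the category name is ours):

```objectivec
@interface NSMutableArray (Shuffle)
- (void)shuffle;
@end

@implementation NSMutableArray (Shuffle)
- (void)shuffle
{
    NSUInteger count = [self count];
    for (NSUInteger i = 0; i < count; ++i) {
        // Pick a random element in [i, count) and swap it into slot i.
        NSUInteger nElements = count - i;
        NSUInteger n = (arc4random() % nElements) + i;
        [self exchangeObjectAtIndex:i withObjectAtIndex:n];
    }
}
@end
```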
If you want to randomize an array the first time you generate it, and then be able to regenerate that same sequence in the future, you can make a few modifications to the code above (or add a new method).
Replace this line:
int n = (arc4random() % nElements) + i;
with the seedable rand()-based equivalent:
int n = (rand() % nElements) + i;
Then before calling shuffle, call srand() and pass in your saved seed value.
// Our first play of Taco Madness
// resulting order => beef, chicken, fish, pork, brain
// Replaying Taco Madness later using the same seed
// resulting order => beef, chicken, fish, pork, brain
// same results as first run
// Starting a new game of Taco Madness and therefore using a new seed
// resulting order => pork, chicken, brain, fish, beef
// new results due to new seed
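Putting the seed handling together (a sketch; `tacoArray` is an illustrative NSMutableArray, and `shuffle` here is assumed to be the rand()-based variant):

```objectivec
// First play of Taco Madness: pick a fresh seed and save it.
unsigned int seed = arc4random();
srand(seed);
[tacoArray shuffle];   // some shuffled order

// Replaying later: re-seed with the SAME saved value before shuffling
// and the exact same order comes back.
srand(seed);
[tacoArray shuffle];   // identical order to the first run

// New game: a new seed gives a new order.
srand(arc4random());
[tacoArray shuffle];
```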
This will guarantee that shuffling will return the same result as the last time you shuffled the same array with the same seed. They call them pseudo-random for a reason.
In the next installment I am going to go through an entirely new class of errors related to process of releasing a game on the App Store. It will be chock full of mistakes you will absolutely want to avoid.
Glad to see that tacos were used in an example. I was beginning to grow worried.
You've finally wrapped up that first game and want to show it to the world. Time to make a trailer! Most advice about contacting reviewers suggests providing a link to a gameplay video. I understand why. I am not a professional reviewer, but when someone sends me a link to a new game I want to see it in action.
There is a wide array of tools available for recording trailers, and what works best for you will depend on your needs and your budget. Two of the tools I have used are Sound Stage and iSimulate.
Sound Stage is an OS X app that allows you to screen capture your entire desktop, a specified region, or the iPhone Simulator output. It is available on the Mac App Store and only costs $5.
Pros:
Easy to use
Designed for capturing from the iPhone simulator

Cons:
No simulator audio capture
Optimized for utility apps
Using the App
There are a number of options that allow you to set the quality, output file, viewing area, and touch indicators. One option that is lacking is simulator audio capture. This isn't a big deal for utility apps, but as a game developer it is a feature I miss. They do provide audio capture from the built-in mic or the input jack so I could jerry-rig something, but it would be nice to have it baked in.
The best thing about the app is that it is very easy to get up and running. Just click the big red record button:
The app also allows you to select a soundtrack and drop in images so you could build an entire trailer using it. I prefer to use it for screen capture only and edit video in a more powerful solution.
This approach began to fall down for me when I needed to record footage from an app that required me to touch the screen in multiple places at once. I'm just not that fast with the mouse. This isn't a Sound Stage problem; it extends right through to testing on the simulator. There are shortcuts for the basic multi-touch gestures, but games often move beyond that.
The approach I used for Four Hats was to port it to OS X and then map keyboard controls so I could simulate touches in all the right places. Cocos2d makes this pretty simple, but it still takes some work and isn't going to work for every game. Then I discovered iSimulate.
iSimulate allows you to send the multitouch, GPS, accelerometer, and compass data from your iPhone to the iPhone simulator. The SDK is free but to use it you need to purchase an app from the iOS App Store that costs $16. I have to admit there was a bit of sticker shock looking at an app that costs 16X most of my App Store purchases. I then felt pretty ridiculous for feeling shocked about spending $16 on something that will save me hours. I believe in using good tools to magnify the effect of your time and this is one that is well worth it. This is a great tool not only for recording footage but also for debugging.
It is simple to set up an app to work with the iSimulate SDK. Download it. Link to it. You are done. One thing to note is that if you are not using CoreLocation in your app, you will still need to link to it while linking to the iSimulate SDK or you will get a few linker errors that look like this:
Undefined symbols for architecture i386:
"_OBJC_CLASS_$_CLHeading", referenced from:
_OBJC_CLASS_$_iSimulateCLHeading in libisimulate-4.x-opengl.a(libisimulate-opengl.a-i386-master.o)
Once you have your app up and running in the simulator, you can launch iSimulate on an iDevice on the same Wi-Fi network. You should see your computer in the list. Tap it to connect and the data starts streaming.
Don't be scared off by the UI. It is one of the ugliest apps that I have purchased, but it works. I definitely suggest you read the documentation here. They have a few screenshots with legends that will help you get around.
If you are using iSimulate to record gameplay footage, then one of the first things you will want to do is remove or change the touch indicators. The default ones are giant gray circles.
You can remove them entirely by adding the following code to your AppDelegate's applicationDidFinishLaunching method:
If you want to replace the touch indicator simply add an image to your project named "isimulate-touch.png". There is a lot more useful information in the provided documentation so I suggest you check it out.
There is also an option to stream simulator output back to the iPhone which I can definitely see being useful.
I found that if I used iSimulate in conjunction with the simulator for more than 5 minutes, the target app would slow to a crawl. This happened both while connected to the debugger and while running the app independent of the debugger. I tested the same app without iSimulate connected and there was no lag.
I recommend adding both Sound Stage and iSimulate to your indie game dev toolkit. Both have flaws but save more than enough time to outweigh their cost.
If there are tools out there that fill these gaps at a lower cost or more completely, please share.
When building a twitch-based game like The Four Hats, it is important that the game never lags. As a player, losing because of a performance hiccup is infuriating. In order to avoid mid-play lag, we employed a few simple strategies that we would like to share. These can be used in a sidescroller or an endless runner like Temple Run.
Reduce Draw Calls
Try to reduce the number of draw calls made. If you are using cocos2d, the easiest way to do this is to take advantage of CCSpriteBatchNode. CCSpriteBatchNode will batch draw all of its children. It must be initialized with a texture, and all of its CCSprite children must use that texture. Check out the documentation here.
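A minimal sketch (the "sprites.plist" / "sprites.png" atlas and "coin.png" frame names are illustrative TexturePacker output):

```objectivec
// Load the atlas frames, then parent all sprites to a batch node that
// shares the atlas texture; the whole group renders in one draw call.
[[CCSpriteFrameCache sharedSpriteFrameCache]
    addSpriteFramesWithFile:@"sprites.plist"];

CCSpriteBatchNode *batch =
    [CCSpriteBatchNode batchNodeWithFile:@"sprites.png"];
[self addChild:batch];

for( int i = 0; i < 50; i++ ) {
    // Frames must come from the same sprites.png atlas.
    CCSprite *coin = [CCSprite spriteWithSpriteFrameName:@"coin.png"];
    coin.position = ccp( arc4random() % 480, arc4random() % 320 );
    [batch addChild:coin];   // 50 sprites, still 1 draw call
}
```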
Reduce Memory via Pixel Format
By using RGBA4444 rather than RGBA8888 you greatly reduce the amount of memory used for textures.
RGBA4444 may cause issues with gradient-heavy artwork, but if you are using TexturePacker you can change your dithering settings to FloydSteinberg+Alpha or whichever mode works best for your artwork.
Depending on what style of art you have, you can try out other pixel formats and see what works best for balancing size, memory, and quality. An old but good post on understanding pixel formats can be found here.
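Switching formats in cocos2d is a one-liner, set before the relevant textures are loaded (a sketch; "background.png" is illustrative):

```objectivec
// RGBA4444 uses 2 bytes per pixel instead of RGBA8888's 4, halving
// texture memory at the cost of color depth.
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA4444];
[[CCTextureCache sharedTextureCache] addImage:@"background.png"];

// Restore the full-quality default for assets that need it:
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA8888];
```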
Pruning / Delayed-Placement
If we loaded the entire level into memory at once, we would spend a lot of time simulating objects far offscreen that might never be encountered. We would also waste clock cycles and memory on objects that have been passed by and will never be seen again. In order to reduce the number of objects that are part of the simulation, we wait to add them until they are close to being on screen, and we remove them from the simulation once they cannot be returned to. In Four Hats we can use the position of the approaching horde of fans to determine a safe point beyond which the character will never return.
Pre-allocation / Object Pool
Constantly adding and removing objects from the simulation in a naive way could cause a large number of allocations and deallocations while playing. This is something you should avoid to prevent random hiccups in game play. To solve this problem we used an object pool. We attempt to pre-allocate all of the objects we will need before starting the level. As objects are added and removed from the simulation, they are swapped in and out of the object pool. This required game objects to have a reset method to clear any accumulated state. We built Four Hats using Obj-C and cocos2d, but if we were working in pure C++ then we would likely have followed the methodology laid out by Noel Llopis on pre-allocation here. Our approach was unfortunately a bit messier.
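A minimal pool along those lines (a sketch with illustrative names, not the actual Four Hats code; manual retain/release style, and the `reset` method is assumed on pooled game objects):

```objectivec
// Objects are pre-allocated up front; borrowing and returning just
// moves them between the free list and the simulation, so no
// allocations happen mid-play.
@interface ObjectPool : NSObject {
    NSMutableArray *free_;
}
- (id)initWithClass:(Class)klass capacity:(NSUInteger)capacity;
- (id)borrowObject;
- (void)returnObject:(id)object;
@end

@implementation ObjectPool
- (id)initWithClass:(Class)klass capacity:(NSUInteger)capacity
{
    if( (self = [super init]) ) {
        free_ = [[NSMutableArray alloc] initWithCapacity:capacity];
        for( NSUInteger i = 0; i < capacity; ++i )
            [free_ addObject:[[[klass alloc] init] autorelease]];
    }
    return self;
}

- (id)borrowObject
{
    id object = [[[free_ lastObject] retain] autorelease];
    if( object )
        [free_ removeLastObject];
    return object;   // nil means the pool ran dry
}

- (void)returnObject:(id)object
{
    // Clear accumulated state before recycling.
    if( [object respondsToSelector:@selector(reset)] )
        [object performSelector:@selector(reset)];
    [free_ addObject:object];
}
@end
```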
In an endless runner it is difficult to predict exactly how many of any one type of object will be needed on screen at a time. In this case pre-allocation is difficult, but an object pool can still be used.
Hopefully the work we put into avoiding slowdowns will save a few iPhones from being thrown at us! If you have any other techniques that you have used, please share them in the comments.
Ludum Dare is a 48/72 hour game making competition. The 48 hour comp is for teams of one, and the 72 hour jam is for larger teams or those who don't fully follow the main comp rules. We entered the jam because we wanted to enter as a team.
I've known about LD for a few years, but have never participated before. On Friday I asked Steve if he would be interested in being sleep deprived for the entire weekend, and he was! Once we decided to join, we anxiously awaited the theme announcement and couldn't focus on much else.
We added an extra layer of challenge beyond building a game in 72 hours. We wanted to try and build a lite version of a game that we could submit to the App Store. Going into it we knew it was a crazy and foolish idea, but we wanted to push our abilities and focus to their limits.
Even though 72 hours is technically 3 days, the schedule below is broken into the 4 "days" or awake periods between naps.
Normally Steve and I work remotely, but we live close enough that face to face meetings are possible. We decided to set up a joint office for the weekend so we could brainstorm efficiently and motivate each other. I set up a folding table next to my desk and he lugged all of his stuff over.
Steve worked on OS X using Flash, Photoshop, and a Wacom tablet. He sketched out a lot of art on paper prior to working in Flash to make sure we nailed the look. I worked on OS X as well, using Xcode 4, TexturePacker, ParticleDesigner, Audacity, and GIMP. We were targeting iOS, so I worked with cocos2d-iphone and chipmunk. We were both very familiar with the tools and frameworks we were using, so we didn't waste a lot of time learning our tools.
Every LD has a theme and this year it was "Alone". We were ready to shout out ideas but when we saw the theme all we heard was crickets. "Alone" was one of the themes we were interested in but we knew it would be tricky coming up with a good game. Luckily the silence soon ended and we started brainstorming.
Everyone started throwing out ideas and Steve scribbled them down. Whenever we would get stuck we would read through the list and try to go further with an existing idea or try to take it in another direction. We initially approached the process by trying to come up with a story or situation that fit the theme. This generated a lot of ideas but it was hard to come up with a game mechanic out of many of them. Here are some of the ideas we had:
Ghost who wants to be alone and tries to evict the living
A mute and deaf person (possibly blind?) who is alone in the world and needs to find a way to communicate
Stuck in space
Stuck at sea
Famous and trying to buy friends
We ended up expanding on the idea of someone who wants to be alone and can't seem to get away. We decided that our protagonist would be a rockstar being chased by a horde of adoring fans, and the player would guide them along a platformer-esque level to the tour bus.
The rockstars would be members of the four person pop group The Four Hats. Their debut album Hat Tricks is also introduced in the game.
The Four Hats in "Room to Roam": Rockstars have it hard. Millions of adoring fans, mountains of cash, and no alone time. How is a musician supposed to write new music when chased by the paparazzi or the hordes of adoring fans? In order to find the solitude that you crave and unlock your inner muse, you must outrun fans and make your way to the studio. Don't get caught by the paparazzi on your way there or your creative energy will be sapped dry.
The initial idea included having all of the members of the band as playable characters with different abilities that would allow you to navigate the levels in unique ways. The levels would take the band through multiple eras of music to provide variety. Clearly that was biting off a bit more than we could chew in the time that we had.
Why rockstars and music? We had at our disposal a very talented musician who has written music for our other projects. We wanted to work on a project that showcased his strengths as well as ours. He could easily do the creepy music that would accompany a ghost story or the stifling silence of being buried alive, but who doesn't want to be a rockstar for a weekend?
Flesh on the Bones
Next we started to flesh out the idea. We worked on the sound, did sketches to get the look, settled on the mood, and worked out the details of the mechanics. Once we were on the same page and understood what we were making we each started working on our own parts.
Getting the style and animation of the first character to match what we were aiming for brought Steve to our first nap break. During that time I got a basic prototype working: some platforms and obstacles, the controls, and a horde chasing you.
On Day 2 we started adding actual art assets into the game. I put together a basic animation system on top of the existing system in cocos2d while Steve finished up the animations. We also settled on a naming convention and system of organization to simplify the code and avoid rework.
Once we had the character in game, Steve moved on to background elements, obstacles, and other art assets. At this point our pace matched up really well. Art and code were never too far ahead of each other.
As the day drew to a close we had all of the basic gameplay in place with art assets and an awesome instrumental version of the music. We did some playtesting and identified the pain points to work on the next day. One of the issues we had was that the obstacles weren't interesting enough to keep the game varied and engaging. We had a few ideas for "traps" or "enemies", but hadn't added them into the actual game yet.
Sunday was a busy day and by the end of it we had a game. Even though we were not in the 48 hours contest, it still felt good to have a complete game by the 48 hour mark.
A few things that we worked on:
Everything else: menus, icons, etc.
Level loading, gameplay tweaks, enemies
Writing and recording vocals
Writing and recording menu music
Up until this point we had been randomly generating levels, which worked for testing purposes, but it led to uninteresting levels. We decided to switch over to hand-built levels.
The best part of the day was that we had the final cut of our main gameplay music. "Room to Roam" is a track written, produced, and performed by The Four Hats. You can check it out on SoundCloud.
If you need music for your game, we can put you in touch with negapixel (or The Four Hats) as he is available for contract work.
Building the level was a lot of fun. It also gave me a lot of time to playtest the game and burn off a lot of warts. Building the level by hand gave the game good pacing and slowly introduced elements in a balanced way. Since we didn't have time to build level selection screens or other elements that would create a polished multi-level game, we made sure the level was decently long and ended with more challenging play.
Building and playtesting the level took a lot of time. I had been storing the level config in a plist, which is not at all visual, and definitely added some friction to the process. After reading other LD post-mortems I realized we should have been using a pixel map. It would have been a quick way to implement a simple visual level editor. It is definitely a technique that I will be using in the future.
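The pixel-map idea can be sketched like this (a hypothetical example, not code from the jam): each pixel of a small image is one grid cell, and its color selects a tile or obstacle type. To keep the sketch self-contained, the "image" here is an inline grid of RGB tuples; in practice you would read the pixels from a tiny PNG drawn in any image editor.

```python
# Hypothetical color -> tile mapping; a real pixel-map editor would read
# these pixels from a small PNG authored in any paint program.
BLACK = (0, 0, 0)        # solid platform
RED   = (255, 0, 0)      # obstacle / trap
WHITE = (255, 255, 255)  # empty space

TILE_FOR_COLOR = {BLACK: "#", RED: "^", WHITE: "."}

# A 3-row "image": each inner list is one row of pixels.
PIXELS = [
    [WHITE, WHITE, RED,   WHITE],
    [WHITE, BLACK, BLACK, WHITE],
    [BLACK, BLACK, BLACK, BLACK],
]

def load_level(pixels):
    """Convert a grid of pixel colors into a grid of tile codes."""
    return [[TILE_FOR_COLOR.get(px, ".") for px in row] for row in pixels]

level = load_level(PIXELS)
for row in level:
    print("".join(row))
# ..^.
# .##.
# ####
```

Because the level is just an image, iterating on it is as fast as repainting pixels, which is exactly the friction the plist approach was adding.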
Polish / Cleanup
We finished the basic game and added a lot of polish ahead of schedule. We knew we didn't have time to add additional characters or levels so we focused on wart removal and polish. We spent time putting together sound effects, button states, an intro cutscene, and particle effects.
In the final hours of the competition we were as ready to submit as we were going to be. We continued to playtest the game because we were addicted. It was a great feeling to realize that we were no longer playing to test, but because it was fun.
What Went Well
Came up with a fun concept that excited us
Balanced work, sleep, and breaks to stay productive
Familiarity with cocos2d and tools
Music, art, and gameplay meshed together into a consistent experience
Had time to add a lot of the polish we hoped for
Reduced scope to make schedule realistic and focus on quality
Time constraints focused the project
Lots of testing
What Didn't Go Well
First time building a platformer. Learned that realistic physics didn't equal fun.
Took a bit longer than expected to get gameplay feeling right
Took too long to build level
Didn't choose a good format for level editing
Wasn't familiar with any existing level editors
Should have used a pixel map
Took longer to animate initial character than expected
Had to redraw some frames a few times to make the action smooth
Had to reduce the scope of the game
The kittens we rented ended up mostly causing a distraction when we didn't need them for the theme
A few minor bugs slipped through (expected given the time constraints)
Building an iOS game for LD means fewer people can play it than if we had built a Flash game
We are going to submit Four Hats to the App Store after Apple's holiday break and will continue development on additional levels and characters if people enjoy the game. We originally pictured a much bigger game, but the 72 hour time limit restricted what we could build. In the full version we pictured:
Four playable characters each with a special ability
Switch between characters to work through levels
Multiple eras of music by the Four Hats with levels to match
An EPIC sound track
Prior to submitting the free version we will fix a few minor bugs that we missed in our initial testing. This should only be another hour or so of work.
Ludum Dare is awesome! We would definitely do it again. We are happy with what we managed to build in a weekend.
It would be great if there was an iOS focused version of the contest. I am not sure how to solve the distribution issue, but if all the participants had iDevices it would expand the audience.
Building a polished app in a weekend is extremely difficult. We already knew this going into the process, but we wanted to challenge ourselves. What we found is that it is possible to build a fairly well polished prototype or lite version of a game, but adding in the extra content required for a full featured game takes a lot of time.
See art, video, and photos from the project on our live blog post from the event.