Saturday, April 25, 2009

iPhone Piracy: Hard Numbers For A Soft Problem

Update: Apple has fixed the piracy problem by implementing In-App Purchases, which use signed receipts that can be validated by servers. In-App Purchases became available for free applications on or around October 14, 2009, so this post is now of historical interest only.

This post gives hard numbers on the current state of piracy on the iPhone platform. Its main purpose is to help independent developers who are considering the iPhone decide whether they should invest their effort in the platform.

Overview
This post analyzes the piracy rate of my iPhone application, StockPlay. The article begins by describing the application used for measurements, then argues that the real piracy rate for the application is over 90%, and explains why this state of affairs is unlikely to change. The post closes with advice for individual developers considering entering the iPhone market.

Background
StockPlay is a simulated stock trading game, where the virtual market is strongly correlated with the real market. The game is backed by a Ruby on Rails server that the iPhone client must connect to in order to play, which made it possible to get hard numbers on piracy.

The game retails for $9.99 (price tier 10) in the App Store, and has been available worldwide since April 6, 2009 (19 days before this writing). This post on the game's official blog explains the motivation behind the pricing. The game does not contain any copy protection like Ripdev's Kali, and relies solely on Apple's DRM obfuscation.

StockPlay was cracked and became available on the most popular site for cracked applications 1 day after its launch.

StockPlay's Piracy Rate
To date (April 25, 2009), we have 40 sales and 2,902 users. However, as most pirates would say in their defense, some of these people only tried StockPlay because it was available for free. To account for this, I will restrict my calculation to the 456 users who were still trading (and thus actively using the application) 24 hours after they registered with the server. Assuming all 40 buyers are among those active users, that leaves 416 pirated copies out of 456, for a piracy rate of (456 - 40) / 456, or about 91%.

Pirates also say that some people could not have afforded the application, but I claim that price is not an issue, given the cost of an iPhone and its data plan.

Apple Doesn't Care
After reading the above numbers, you're probably thinking that Apple will come in and fix the situation. This section argues that Apple has no financial incentive to eliminate piracy, and their behavior indicates that they're well aware of that.

First, the iTunes App Store is expected to break even: according to its own statements, Apple doesn't expect to make a profit from operating the store. This means Apple doesn't care whether an application is purchased through the store or downloaded from somewhere else that doesn't use its bandwidth. On the other hand, more free (from the consumer's point of view) applications translate into better demand for Apple's hardware.

Second, Apple already knows about the issue. I filed a bug in Radar, explaining how easy it is to crack applications with Crackulous (yes, it's really that easy), and providing a solution to prevent piracy for server-based apps. The bug received the ID 6755444, and was marked as a duplicate of 6707901, which was probably filed in mid-February. I'm making this claim based on the IDs of my other bugs, and on the assumption that Radar IDs are serial. Bottom line: Apple has other priorities.

Last, but not least, Apple makes it ridiculously difficult for developers to implement their own solution. The iPhone SDK developer agreement bans developers from getting involved with jailbreaking, which is a prerequisite to understanding how our applications are being cracked. To make matters worse, Apple does not make it easy for developers to obtain the final application binary, as it will be distributed to iPhones. This means we cannot implement server-side binary checksums without jumping through a lot of hoops. Furthermore, implementing a decent anti-cracking system requires messing with the binary bits and the application loader at a low level. This runs the risk of getting your application rejected, which pushes your launch date back by a couple of weeks.
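For concreteness, here is a minimal sketch of the client-side half of that checksum idea, assuming you could somehow obtain the digest of the binary Apple actually ships to compare against on the server. The helper name and the choice of SHA-1 are mine, not from the Radar bug.

    #import <CommonCrypto/CommonDigest.h>
    #import <Foundation/Foundation.h>

    // Hypothetical sketch, not StockPlay's actual code: hash the app's own
    // executable so the server can compare it against the digest of the
    // binary that Apple distributes.
    static NSString *AppExecutableDigest(void) {
      NSString *path = [[NSBundle mainBundle] executablePath];
      NSData *binary = [NSData dataWithContentsOfFile:path];
      if (!binary) return nil;

      unsigned char digest[CC_SHA1_DIGEST_LENGTH];
      CC_SHA1([binary bytes], (CC_LONG)[binary length], digest);

      NSMutableString *hex =
          [NSMutableString stringWithCapacity:2 * CC_SHA1_DIGEST_LENGTH];
      for (NSUInteger i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
      }
      // Send this along with the login request; the server rejects digests
      // that don't match the known-good binary.
      return hex;
    }

A cracked binary will hash differently, but a determined cracker can also patch out the check itself, which is exactly why this is a hoop rather than a fix.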

Conclusion
If you're hoping to make easy money on the iPhone, look elsewhere. Don't believe the hype about Apple users having better morals, and being much more likely to pay for software. iPhone users are educated enough to Google search for pirated applications, and dishonest enough to use them. Just like PC users.

The piracy rate of over 90% suggests that you're better off developing desktop applications. Sure, they'll be pirated as well, but at least you don't have to put up with Apple's approval process and you won't have to design and code around the excessive technical limitations of the iPhone SDK.

Want to avoid piracy and stay ahead of the pack? It's a great time to be a Web programmer.

Now What?
If you're determined to write an iPhone application (we programmers like to play with cool toys, after all), and want to monetize your effort, you should stick to one of the following:
  • in-app advertising - AdMob seems to offer the best toolkit at the moment. Google has been experimenting with iPhone ads, but doesn't offer a public SDK quite yet. Downsides: ads take up a sizable chunk of screen real estate, so you'll have to work harder at designing your app, and if the application isn't wildly popular, the ad revenue will not be worth the effort.
  • traditional payment methods - if your application has a server behind it (like StockPlay does), you can distribute the application for free, then charge for accounts on the server. Disadvantages: your users will need PayPal or another payment method, and will have to log in on their phones (I hate typing on the iPhone). People may also get frustrated if they blindly download the app because it's free, then realize they have to pay. Frustrated people leave bad reviews.
  • third-party copy protection - the best solution that I know of is Ripdev's Kali. Ripdev plays an active role in the jailbreak community, so they're likely to stay ahead of the crackers. Disadvantages: they charge a per-application setup fee, plus royalties. You'll also have that nasty feeling of being ripped off, since you're already paying Apple 30% of your revenue for the same service.
  • develop your own copy protection - not worth it, unless you want the learning experience, or you're a big company. Copy protection is boring as hell, and it's unrewarding - no matter what you do, you eventually lose.

Motivation
I wrote this post to help my fellow developers decide if they should pursue the iPhone as a development platform. When my friends and I decided to write an iPhone application, the development blogs seemed to agree on a piracy rate of 60%, so I wanted to share my completely different findings with the developer community.

I believe the findings are novel and worth sharing, because they are based on hard numbers, as opposed to proxy measurements such as declines in sales, or in-app analytics. Most applications can function without a server, so the majority of developers cannot obtain 100%-accurate user statistics.

My friends and I particularly cared about piracy because our application uses a server, which means that pirates are not just lost business, but also unauthorized consumers of server resources such as bandwidth and CPU time.

Sunday, April 19, 2009

Toolkit for Web Service-Backed iPhone Apps

This post describes the chunk of iPhone code that I recently open-sourced (edit: I wrote outsourced before; Epic FAIL). I wrote the code while developing the StockPlay trading simulation game, because the current iPhone SDK does not ship with good infrastructure for building applications that talk to Web services.

Overview
I named the toolkit ZergSupport, and you can get it from my GitHub repository. The README file contains a thorough description of the library, so instead of rehashing it, this post highlights the main reasons you should care about the toolkit, and discusses some of the thinking that went into writing the code.

The code is organized as a toolkit, not a framework: ZergSupport is a collection of supporting classes, and it does not impose a rigid architecture on your application the way a framework would. As you read this post, please keep in mind that you can use the parts you want and ignore everything else. ZergSupport is liberally licensed under the MIT license, so feel free to go to GitHub and jump right into it, as soon as this post convinces you that it's useful.

Web Service Communication
Without further ado, this is how data exchange is done.
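The sketch below shows roughly what the call looks like. The exact ZNXmlHttpRequest selector and argument names are my reconstruction rather than a verbatim copy of ZergSupport's API, and Portfolio is a made-up model class, so check the repository README for the real signatures.

    // Illustrative sketch only: the ZNXmlHttpRequest selector below is an
    // assumption, not ZergSupport's documented API, and Portfolio is a
    // hypothetical model class. See the GitHub README for the real call.
    NSDictionary *requestData = [NSDictionary dictionaryWithObjectsAndKeys:
        user, @"user", device, @"device", nil];
    NSDictionary *responseModels = [NSDictionary dictionaryWithObjectsAndKeys:
        [Portfolio class], @"portfolio", nil];

    [ZNXmlHttpRequest callService:@"http://stockplay.example.com/sync.xml"
                             data:requestData
                   responseModels:responseModels
                           target:self
                           action:@selector(processResponse:)];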
The code above makes a Web service call, passing in the data from the user and device models, and parsing the data that comes back into models. The data passed to the Web server is formatted so that Rails and PHP parse the models into hashes, which fits right into how Rails structures its forms. The code expects the Web service to respond in XML, as implied by the ZNXmlHttpRequest class name. The models in the Web service response are delivered to processResponse: as an array of models.

You have to agree that the code above is much more convenient than dealing with low-level HTTP yourself. That is, unless setting up the models is a real hassle. Read on to see how easy (I hope) it is to declare models.

Models
On Mac OS X, you have Core Data to help you with your models. Sadly, this feature didn't make it into iPhone OS 2.x, so you have to write your own model code. Since StockPlay works with a lot of models, I couldn't write quick hacks and ignore the underlying problem. Actually, I could have, but I didn't want to.
The following listing shows an example ZergSupport model declaration.
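The listing below is a representative sketch rather than StockPlay's actual code: the Portfolio attribute names are made up, and the ZNModel base class name is my assumption about ZergSupport, so double-check the README for the exact class to subclass.

    // Hypothetical model declaration; attribute names are made up, and the
    // ZNModel base class name is an assumption about ZergSupport.
    @interface Portfolio : ZNModel {
      // Backing fields, currently required by the iPhone Objective-C runtime.
      NSString *name;
      double cashBalance;
      NSDate *createdAt;
    }

    @property (nonatomic, retain) NSString *name;
    @property (nonatomic, assign) double cashBalance;
    @property (nonatomic, retain) NSDate *createdAt;

    @end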

The model's attributes are defined as Objective-C 2.0 properties. I did this to keep the code as DRY as possible: models need accessor methods anyway, and explicit property declarations keep Xcode happy instead of cluttering the code window with compiler warnings. Right now, the model declaration is still a big FAIL in terms of DRY, because the iPhone Objective-C runtime requires declaring fields to back the properties. However, the 64-bit Objective-C runtime supports omitting the field declarations, so I have reason to hope that the iPhone runtime will eventually do the same.

Another advantage of using properties to declare model attributes is that the model declaration is plain code, which is easy to keep under version control and easy to code-review. I think this is as close as it gets to the convenience of Rails models.

Models And The Web
Models change over the lifetime of an application, usually by gaining attributes. If you're writing an iPhone application on top of an MVC (e.g. Rails) Web service, your iPhone models will probably mirror the Web models. I assert that this strategy only works well if the iPhone code can ignore model attributes that it does not understand. Otherwise, you have to synchronize every Web server change with an iPhone application release date, which is a pain.

So, ZergSupport models accept unknown attributes. In fact, they go one step further and store unknown attributes as-is, so they survive serialization and deserialization. This is particularly handy when iPhone-side models are used to cache server-side models: as soon as the server emits new attributes, they are stored in the iPhone cache, ready to be used by a future version of the application.
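The sketch below illustrates the idea, not ZergSupport's actual implementation: known keys are copied into properties via KVC, and everything else is stashed so it can be merged back when the model is serialized. The class name and the attributeNames helper are mine.

    // Illustration of the idea only, not ZergSupport's real code: known keys
    // become property values, unknown keys are kept verbatim so they survive
    // a serialize / deserialize round-trip.
    @interface LenientModel : NSObject {
      NSDictionary *unknownAttributes;
    }
    - (id)initWithDictionary:(NSDictionary *)dictionary;
    + (NSArray *)attributeNames;  // assumed helper listing declared properties
    @end

    @implementation LenientModel
    + (NSArray *)attributeNames {
      return [NSArray array];  // subclasses list their property names here
    }
    - (id)initWithDictionary:(NSDictionary *)dictionary {
      if ((self = [super init])) {
        NSMutableDictionary *unknown = [dictionary mutableCopy];
        for (NSString *key in [[self class] attributeNames]) {
          id value = [dictionary objectForKey:key];
          if (value) {
            [self setValue:value forKey:key];  // known key -> property, via KVC
            [unknown removeObjectForKey:key];
          }
        }
        unknownAttributes = unknown;  // stashed, merged back on serialization
      }
      return self;
    }
    - (void)dealloc {
      [unknownAttributes release];
      [super dealloc];
    }
    @end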

Just One More Thing (x5)
ZergSupport model serializers and deserializers convert between iPhoneVariableCasing and script_variable_casing on the fly, so your iPhone models can follow Objective-C naming conventions while your server-side models follow the conventions of your Web application language; for example, an iPhone-side numberOfShares attribute would map to number_of_shares on the server.

The toolkit includes reusable bits of logic that can come in handy for Web service-based iPhone applications, such as a communication controller that regularly synchronizes models between the iPhone and the Web server.

The ZergSupport code base packages the subset of Google Toolbox for Mac (GTM) that provides unit testing. You create unit tests simply by adding a new executable target to your project and including the testing libraries in it. Because the testing code lives in a separate target from the main code, you don't ship unit testing code in your final application.
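For illustration, a test class in that target looks roughly like the sketch below, assuming GTM's usual SenTesting-style macros; Portfolio is the hypothetical model from the earlier listing.

    #import "GTMSenTestCase.h"

    // Sketch of a unit test in the testing target; Portfolio is the
    // hypothetical model declared earlier in this post.
    @interface PortfolioTest : GTMTestCase {
      Portfolio *portfolio;
    }
    @end

    @implementation PortfolioTest
    - (void)setUp {
      portfolio = [[Portfolio alloc] init];
    }
    - (void)tearDown {
      [portfolio release];
      portfolio = nil;
    }
    - (void)testNewPortfolioStartsEmpty {
      STAssertEquals(0.0, [portfolio cashBalance],
                     @"A new portfolio should have no cash");
    }
    @end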

Speaking of testing, ZergSupport has automated test cases covering all its functionality. The Web service code is tested against an open-source mock Web service, which is hosted for free courtesy of Heroku Garden. I used a hosting service to make sure the OS integration works correctly, and to let any user or contributor run the unit tests without the hassle of setting up a server.

Last, but definitely not least, the ZergSupport code can be automatically imported into your project, using the zerg-xcode tool that I open-sourced earlier this year.

Conclusion
The final conclusion is yours. I hope you will find the ZergSupport toolkit useful, and incorporate it in your code. I promise to share all future improvements that I make to ZergSupport, and I hope you will do the same, should you find it useful enough to look through the code and change it.

Wednesday, April 15, 2009

App Engine supports Ruby! Sort-of.

This post is a follow-up to my Great Time To Be a Web Programmer post, where I assert that HTML / CSS / JavaScript are the technologies to learn in 2009, if you don't know them already. In that post, I said that Google's App Engine only supports Python; that has now changed. I am writing this quick update so my blog's readers are aware of the shift in the Web application hosting landscape.

Java leads the way to Ruby
As of early April, Google's App Engine supports Java. The really good news here, if you don't care for low-productivity languages, is that the Java 6 VM provided by the App Engine has near-native performance, and most high-level languages have interpreters written in Java.

This opens the route for my favorite language, Ruby, to become available on Google's App Engine. appengine-jruby is an experimental open-source project aimed at making JRuby available on App Engine, and at implementing Ruby-esque abstractions over Google's APIs. At the same time, Ola Bini from ThoughtWorks took the time to get Rails running on App Engine, and wrote a blog post documenting his method.

The devil is in the details, however. According to Ola Bini, performance is nothing to write home about, and developers still have to zip up their source code to work around App Engine's 1,000-file limit.

Why this matters 
I think Google's App Engine is an important cloud-hosting platform because of its generous free tier. It is the best solution I know of for hosting hobby projects, or for a project's incubation phase.

Conclusion
Sooner or later, Rails applications will run seamlessly on Google's App Engine. I believe it will happen sooner rather than later. Once Rails 3 appears on the horizon and delivers on its promise of modularity, developers will be in a good position to rewrite the right parts for App Engine.

In the bigger picture, reality is shifting toward my guess that cloud hosting platforms will soon support all the high-level programming languages. So the last language you will have to learn for Web development is JavaScript, because the browser is still tied to it.

I hope that you have found this post useful, and look forward to your comments.

Saturday, April 4, 2009

Ubuntu 9.04 on Dell Mini 910

This post outlines a cheap and reasonably fast procedure for upgrading the stock Ubuntu installation on a Dell mini 910. The method does not require an external CD drive. Instead, it uses things that you're likely to have around, if you like messing with computers.

As always, I go through the actual procedure first, and leave the motivation for last. If you're here, it's likely that you have already decided you want to upgrade, and want to get the job done. If you're uncertain, skip to the Motivation section to find out why you might care.

Requirements
  • Dell mini 910 (the Ubuntu model); you can probably get away with the Windows XP model, but it is unclear whether all your hardware will be supported by drivers
  • 1GB+ USB stick (cheap; you're likely to have one around already)
  • another computer running Ubuntu 8.10 or newer


Method
First, we have to load Ubuntu on a USB stick, and make it bootable. We'll use the Ubuntu 8.10+ computer for that.
  1. Download the latest reasonably stable 9.04 CD image (I recommend avoiding the daily CD images; you can use Update Manager to get the latest updates after installing). This Google query should point you in the right direction.
  2. While waiting for the download, back up anything you need off your USB drive. It will get erased in the next step.
  3. Go to System > Administration > USB Startup Disk Creator and go through the instructions to end up with a bootable USB stick.
  4. If the USB stick is automatically mounted, eject it and take it out.
Second, we need to get Ubuntu onto the Dell mini.
  1. Power off the computer, insert the USB stick.
  2. Power the computer back on, then press and hold the 0 (zero) key until you see a menu. The mini will be annoying and beep at you; ignore that.
  3. Select your language (I recommend English even if it's not your primary language, especially for beta software) and choose to install Ubuntu (as opposed to running the live image).
  4. Breeze through the easy choices in setup. Stop at the disk partitioning phase, as you might want to give that a thought.
  5. For my configuration (1GB RAM, 8GB disk) I recommend choosing manual partitioning and creating a 1GB swap partition. I did run out of RAM while running my development scripts on the machine, so I decided I needed the swap. I also recommend ext3 over ext4: you won't store much data on your mini's disk, so ext4's benefits are not worth the risk in this case. For the default configuration (512MB RAM, 4GB disk), I'd spend 512MB, or at least 256MB, on swap.
  6. Defaults are fine for everything else until the installation reboots.
  7. Enjoy the improvements in 9.04, and the lack of Dell branding.

Motivation
I'm using the Dell mini as a demo machine that I can easily carry around. Its low cost also means that, if necessary, I can leave it with the people I'm demoing to, and I won't feel too bad about it. For that reason, I want the Dell branding removed, I want the latest and greatest from my Linux distribution, and I want the regular x86 architecture, not LPIA (low-power Intel architecture).
My wishes aside, I think the UI improvements in 9.04 and getting rid of Dell's customizations are reason enough to upgrade.

Alternatives
If you want to use the Dell mini as your portable computer, you might prefer the LPIA architecture to plain vanilla x86. The 9.04 download pages offer both netbook-optimized builds and LPIA builds. Disclaimer: YMMV (your mileage may vary); I haven't tried this, because I don't want extra hassles during my development cycle.

If you don't have an Ubuntu 8.10 computer and/or a USB drive, you can try UNetbootin. Google searches indicate that it gives mixed results; I haven't tried it, because I had another Ubuntu machine handy. The procedure might work, and it only requires your mini and Internet access.

If you have a lot of time on your hands and want to play, you can explore setting up a PXE server. This requires a lot of software setup and access to the network hardware (easy if that's your home router, more difficult if you're in a school or company).

I hope you found this post useful. Please comment if you know a better method, or if you found some tweaks that everyone should know about.

Even monitors need power-cycling

This post publicizes my latest finding: LCD monitor firmware has reached the level of unreliability of consumer-grade computer software, so now we have to reboot even our screens every once in a while.

Background
Just to make things crystal clear, power-cycling means turning a piece of equipment off, and then back on. It's also known as cold booting, or a hard reset. It is different from resetting (warm-booting) a computer, because the equipment has to lose power completely, rather than just undergo a software reload.

Today's desktop computers have done away with the Reset button, so we have to resort to power-cycling the computer (usually by holding the power button for 4 seconds) any time it freezes completely. Warm-booting (slightly more gentle) is usually associated with software updates, and it has become a routine but infrequent occurrence. Poor Windows users are forced into it once a month by Microsoft's mandatory "you can say later, but turn your back for 5 minutes and I'll reboot your system" security updates.

So, cold reboots are associated with software failures. I have learned to accept that as an inevitable consequence of operating systems being complex software (hundreds of millions of lines of code) that is released based on time, not quality, to meet revenue goals.

Lesson Learned
Imagine my bedazzlement when I had to do the same thing to... my LCD monitor. I have a (reasonably old, granted) Dell E228WPFc (entry-level 22" widescreen, not HD). I tried to switch the DVI cable from my laptop to my Mac mini, and the monitor just wouldn't wake up from sleep. After wasting five minutes wondering whether any of the cables were broken, I yanked the power cable out of the monitor, waited a second, then plugged it back in. The screen lit up, and everything worked.

Next time, I'll try power-cycling the screen earlier in the debugging process. And, as power-saving modes make their way into more and more devices, I hope I never step into an elevator that hangs coming out of sleep. Or into a car, for that matter.