Thinking about that problem led me to muse on how an app can get to this point in the first place. When should a developer drop support for an older operating system? This applies to desktop software as well, but desktop operating systems change so much more slowly than mobile ones that I don't think it's as big an issue there.
The choices
There are two main approaches I can think of:
1) You could set a hard-and-fast rule that you drop support for an operating system version when the percentage of users running that version drops below 5% (or some other arbitrary number).
2) Pick a cutoff date and drop support for the operating system version on that date, regardless of how many users remain on that platform.
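To make the difference concrete, the two policies boil down to trivial checks. Here's a minimal sketch; the function names, threshold, and example numbers are my own placeholders, not anything from a real release process:

```python
from datetime import date

def should_drop_by_share(share_on_old_os: float, threshold: float = 0.05) -> bool:
    """Policy 1: drop support once the old OS falls below a usage threshold."""
    return share_on_old_os < threshold

def should_drop_by_date(today: date, cutoff: date) -> bool:
    """Policy 2: drop support on a fixed cutoff date, regardless of share."""
    return today >= cutoff

# With an old OS still at 11% share, policy 1 says keep supporting it,
# while policy 2 would drop it as soon as the (arbitrary) cutoff passes.
print(should_drop_by_share(0.11))
print(should_drop_by_date(date(2014, 1, 1), date(2013, 6, 1)))
```

Note how only the second policy gives you a date you can plan around; the first leaves the decision hostage to numbers you don't control.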
Issues with the first approach
If you insist on waiting until the user base drops below some percentage, the main problem is that nobody knows when that will happen. At the time of this writing, Android 2.3 still has over 11% of the total Android market share. It could be quite a while longer before it drops below 5%.
Additionally, if that's the only criterion, it's easy to see how performance could start to degrade on older devices, because if there's one thing you should know about developers, it's this: developers like having the newest stuff. They run the beta version. They upgrade their devices as soon as they can convince their spouse they "need" a new phone "for work". Developers are testing their code on that top-shelf, speed-demon device they love, not the $50 bargain-bin smartphone your parents picked up to go with their Walmart family plan. Now don't get me wrong, this is the fault of the developers. They should be testing on those devices FIRST, instead of not at all. But that's not fun, and unfortunately, many devs won't do it. So by the time an OS version drops below the magic percentage, the app might be so fragile on that platform that it's unusable anyway.
Issues with the second approach
From a developer's perspective, it feels a bit weird to release a new version of an app that drops support for iOS 5 or Android 2.3 (etc.) without any real reason other than "we want to leave those users with a stable version". Granted, that's an admirable goal, but when the release notes say "this new version adds five new features!" and the app isn't necessarily unstable or unusable on the old platform yet, it feels like you're cheating those users out of the new features and fixes. The instability creeps in over time and is hard to notice until it's already too late. Objectively speaking, this option is probably preferable to the first one, but when it comes time to actually make the call, it leaves both users and developers unhappy.
Other options?
What else is there? Maybe Apple has the right idea. They say that developers should support the "current version and the current version - 1" (I tried really hard to find a source for this, but had no luck. Pretty sure it was a slide in a WWDC presentation). That probably works on iOS and OS X, where most users update to the new version right away (if their hardware supports it), but it's not really feasible on Android or Windows.
I'd be interested in hearing other people's thoughts. I don't know what the right answer is, but it's definitely a tough problem and one that is important to get right to avoid leaving a bad taste in the mouths of users.