Apple has been in the news a lot lately with the release of the iPad, details of the next iteration of the iPhone OS, a patent lawsuit against HTC, and some notable changes to their developer agreements. Right now, they dominate the tablet and computer-like smartphone market, and it looks like they're trying to get away with the same sort of behavior Microsoft pulled against them. Unfortunately, I think they've learned the wrong lessons. The OS industry in the 1980s didn't have the depth of competition the smartphone market has today. The line between phones, tablets, and computers is blurring, which will make it harder to monopolize any one domain. And Microsoft succeeded in its bullying because it had buy-in from two crucial groups: consumers and developers. Apple is doing well with the former, but antagonizing the latter.
Windows has over 90% market share and has done even better in the past. Why? A large part of it is applications. Almost any piece of software you can find will run on Windows. If you like video games or need a particular piece of business software, Windows will run it. Cross-platform development has improved recently, but even today you'll find many more Windows exclusives than on any other system.
So the consumers go where the software is, and that in turn drives the programmers to support Windows. With 90% of the market, your software doesn't stand much chance if it won't run on Windows. You end up with a feedback loop: consumers go where the apps are, the apps are written for the OS the consumer uses. If you cut either side, you're in danger. For all its failings, Microsoft did well enough keeping the consumer content, and did an excellent job of giving developers what they wanted.
Microsoft packaged QBasic with Windows until recently, and it was my first exposure to programming. They've developed one of the best development environments out there, and they give out a fully-functional free version. You can write software for Windows without spending a penny, and Microsoft demands no licensing fees when you sell it.
Contrast this with Apple. To release an iPhone application you need to pay $99 up front, then give Apple a cut of every sale. After you've put a great deal of effort into producing the application, Apple still needs to approve it, and the internet is full of tales of benign apps getting rejected. If Apple doesn't like your app for any reason, your development effort is sunk. And while you can program for Windows in literally hundreds of languages (or even write your own), Apple now restricts you to three.
When Apple rejected Flash on the iPhone/iPad, I was surprised, but the action was understandable: Flash would be a hole in the App Store model, another way of distributing content without Apple's approval or (more cynically) without Apple getting its cut. Adobe responded as the makers of many other languages have: by writing a compiler that turns Flash code into iPhone code. You program in a human-readable, high-level language, and a compiler turns it into the 1's and 0's the computer can read. Each platform has its own dialect of 1's and 0's, but there's no reason a compiler can't target any of them. This seemed like a reasonable solution: the apps would now be indistinguishable from any other app. They would go through Apple's store and its approval process, and Apple would get its cut. Because the 1's and 0's look essentially the same no matter what language they were originally written in, there would be no obvious difference between a Flash app and a C app.
Apple has said no. The latest iteration of the agreement developers must sign to write for the iPhone or iPad has been updated: you must write your programs in C, C++, or Objective-C. Objective-C, born in 1986, is used extensively on Macs and iPhones, but almost nowhere else. C, born in 1972, was once all the rage but is rather out of date now; it's still used, but not frequently. C++, born in 1979, is a very popular (but complicated) language. The youngest of the three is as old as I am, and together they represent only a small slice of the language paradigms that exist. Most programmers have some language they like best, and it's usually not any of these anymore. These are all slow languages to develop in: newer ones let you produce working code much faster. And given all the software already written in other languages, there are lots of programs that could easily have been ported to the iPad, but now won't be.
The idea is to force developers to commit exclusively to the Apple universe. With a modern language you could easily develop for every major smartphone and every major tablet at once. Apple seems to hope that by taking away these cross-platform choices, developers will give up on Android or Windows and build iPhone exclusives. But I highly doubt that'll happen, at least not with the sort of developers you want to attract. By taking away the languages programmers want to write in, by taking away the ability to easily port something written for Windows to the iPad, and through all the other anti-developer actions Apple has taken, I suspect most will just write for something else. Apple has consumer buy-in, but if the developers leave, the consumers will too. Will you still want an iPhone if nobody's writing apps for it? Apple's throwing its weight around because it has an early lead, which worked well for Microsoft. But Microsoft never used its development community as fodder for its corporate battles.
Perhaps times have changed. Perhaps there are enough developers out there that you can push away most of the community and still have all the software you need. As programs continue migrating into web-apps, maybe the battle over natively running apps will stop mattering so much. But I've got a feeling pundits will be pointing at this action in the years to come as the moment the iPhone jumped the shark.