Sunday, April 18, 2010

Thou Shalt Not Infringe Intellectual Properties

I was at the Lupa Zoo last weekend, which was a lot of fun. It's always a pleasure to see and feed exotic animals. Stationed throughout the zoo were boxes with bags of peanuts and crackers. There was a little sign asking for $2 or $5 (depending on size), and informing us "Don't Steal." The honor system was at work.

What's interesting to me is that the food was ultimately destined for the animals' bellies, whether via my hand or some employee's. In a sense, we weren't buying the physical food, we were buying the experience of feeding the animals. The gift shop wasn't based on the honor system: there would be an actual, real expense to stealing a stuffed animal. The profit from the feed bags certainly helps keep the zoo going, but if an individual stole a bag who wouldn't have purchased one otherwise, the zoo doesn't really lose anything.

These economics are the same as those of noncommercial copyright violation (AKA piracy), and seeing the parallels has really convinced me that the honor system is best in both situations. The new Intellectual Property Czar had a request for comments recently, and the media industries weighed in with their hopes for our future: censorship of the Internet, spyware on our computers to detect any unethical behavior, and federal cops enforcing these edicts. Free Wi-Fi spots would be a thing of the past. YouTube might be as well. For one thing, this opens us up to abuse (will the ability to freely spy on everything any American does with his computer be limited to downloading music? Australian censorship is already being used to block access to information about euthanasia). But even if you trust our government, it legitimizes the actions of nations like Iran and China, which use this sort of information to capture and torture human rights activists.

Groups like the RIAA and MPAA like to present our options as either accepting censorship and surveillance, or just letting our entertainment industries die. But the idea that laws are the only way to influence behavior is a scary one. If I hadn't paid for the bag of feed, I wouldn't have been fined $2 million; I've simply been taught stealing is wrong, laws or no laws. Cigarettes are a bad choice, but we let people make that choice. Plenty of people feel premarital sex, or at the very least adultery, is wrong, but we don't punish either with fines or jail time. Adultery in particular seems far more hurtful to another human being than downloading music, but we don't legislate against it. Why?

Because we previously understood that it isn't the courts' role to dish out vengeance for every little wrong. Especially when an action occurs between two consenting adults (as piracy does), the violations of our freedoms necessary to enforce the law are far too burdensome to be worthwhile. Instead, we have another tactic: we teach children the difference between right and wrong. How about instead of spying, censorship, and lawsuits, we just teach our children how buying things lets the producers keep producing? And if the occasional freeloader declines, or a family struggling to feed themselves takes a movie they couldn't afford, or we download a movie because our original DVD has broken, who cares? Of the ten commandments, I count only three we legislate. Do business models deserve a place above the ten commandments?

I recommend we all acknowledge piracy is bad, and then give up on the hunt to eradicate it from the earth.

Sunday, April 11, 2010

The iPhone: A Programmer's View

Apple has been in the news a lot lately with the release of the iPad, details of the next iteration of the iPhone OS, a patent lawsuit against HTC, and some notable changes to their developer agreements. Right now, they dominate the tablet and computer-like smart-phone market, and it looks like they're trying to get away with the same sort of behavior Microsoft pulled against them. Unfortunately I think they've learned the wrong lessons. The OS industry in the 1980s didn't have the depth of competition the smart-phone market has today. The line between phones, tablets and computers is blurring, which will make it harder to monopolize any one domain. And Microsoft succeeded in its bullying because it had buy-in from two crucial groups: the consumer, and the developer. Apple is doing well with the former, but antagonizing the latter.

Windows has over 90% market share, and has done even better in the past. Why? A large part of it is applications. Almost any piece of software you can find will run on Windows. If you like video games, or need a particular piece of business software, Windows will run it. Cross-platform development has improved recently, but even today you'll find many more Windows exclusives than on any other system.

So the consumers go where the software is, and that in turn drives the programmers to support Windows. With 90% of the market, your software doesn't stand much chance if it won't run on Windows. You end up with a feedback loop: consumers go where the apps are, the apps are written for the OS the consumer uses. If you cut either side, you're in danger. For all its failings, Microsoft did well enough keeping the consumer content, and did an excellent job of giving developers what they wanted.

Microsoft packaged QBasic with Windows until recently, which was my first exposure to programming. They've developed one of the best development environments out there, and give out a fully functional free version. You can write software for Windows without spending a penny, and Microsoft demands no licensing fees when you sell it.

Contrast this with Apple. To release an iPhone application you need to pay $99 up front, then give Apple a cut of your profits. After a great deal of effort producing the application, Apple needs to approve it, and there are many tales on the internet of benign apps getting rejected. If Apple doesn't like it for any reason, your development effort is sunk. While you can program in literally hundreds of languages for Windows (or even write your own), Apple now restricts you to three.

When Apple rejected Flash on the iPhone/iPad, I was surprised, but the action was understandable: Flash would be a hole in the App Store model, another way of distributing content without Apple's approval or (more cynically) without Apple getting its cut. Adobe responded as many other programming language vendors have: by writing a compiler that turns Flash code into iPhone code. You program in a human-readable, high-level language, and a compiler turns this into the 1s and 0s the computer can read. Each operating system has its own dialect of 1s and 0s, but there's no reason you can't compile into any of them. This seemed like a reasonable solution: the apps would now be indistinguishable from any other app. They would go through Apple's store, through its approval process, and Apple would get its cut. Because the 1s and 0s are essentially the same no matter what language they were originally written in, there would no longer be any obvious difference between a Flash app and a C app.

Apple has said no. The latest iteration of the agreement developers must sign to write for the iPhone or iPad has been updated: you must write your programs in C, C++, or Objective-C. Objective-C, born in 1986, is used extensively on Macs and iPhones, but nowhere else. C, born in 1972, was once all the rage but is rather out of date now; it's still used, but not frequently. C++, born in 1979, is a very popular (but complicated) language. The youngest is as old as me, and these represent just a small slice of the language paradigms that exist. Most programmers have some language they like best, and it's usually not any of these anymore. These are all slow languages to develop in: newer ones let you produce working code much faster. And given all the software already written in other languages, there are lots of programs that could have easily been ported to the iPad, but now won't be.

The idea is to force developers to commit exclusively to the Apple universe. With a modern language you could easily develop for every major smartphone and every major tablet at once. Apple seems to hope that by taking away these cross-platform choices, developers will give up on Android, or Windows, and build iPhone exclusives. But I highly doubt that'll happen, at least not with the sort of developers you want to attract. By taking away languages programmers want to write in, by taking away the ability to easily port something you wrote for Windows to the iPad, and by all the other anti-developer actions Apple has taken, I suspect most will just write for something else. Apple has gotten consumer buy-in, but if the developers leave, the consumers will too. Will you still want an iPhone if nobody's writing apps for it? Apple's throwing its weight around because it has an early lead, which worked well for Microsoft. But Microsoft never used its development community as fodder for its corporate battles.

Perhaps times have changed. Perhaps there are enough developers out there that you can push away most of the community and still have all the software you need. As programs continue migrating into web-apps, maybe the battle over natively running apps will stop mattering so much. But I've got a feeling pundits will be pointing at this action in the years to come as the moment the iPhone jumped the shark.

Saturday, April 3, 2010

That's no space station...it's a moon!

I posted previously about a future energy source: solar panels in space. Without an atmosphere to get in the way, and without that whole "day and night" thing, solar panels can easily absorb 300% of the energy they would on Earth. Because the energy would be constant, we could avoid building wasteful methods of storing energy for night or cloudy days. Overall, it's a very promising technology.

But there are downsides: specifically, cost. Shooting things into space is not cheap. The best figure I could find puts the cost of bringing a US ton of matter into space at just under $10m. That would decline if we sent more things into space: it's far more expensive to build individual shuttles than to mass-produce launching mechanisms. But even at a quarter the cost, the economics of these space panels are questionable. You might not get as much sunlight on Earth, but space travel is a pricey proposition. So while these space panels may someday form a viable energy source, we're probably not ready yet.
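To see why the launch cost dominates, here's a back-of-envelope sketch. The only figure taken from this post is the ~$10 million per US ton launch cost; the panel's watts-per-kilogram and the 300% boost applied as a flat 3x are illustrative assumptions, not real engineering numbers.

```python
# Back-of-envelope: launch cost per watt for space-based solar.
# Only LAUNCH_COST_PER_TON comes from the post; the rest are
# illustrative assumptions for the sake of the arithmetic.

LAUNCH_COST_PER_TON = 10_000_000  # $ per US ton to orbit (post's figure)
US_TON_KG = 907.18                # kilograms in a US (short) ton

PANEL_WATTS_PER_KG = 100          # assumed: panel + support structure
SPACE_BOOST = 3.0                 # post's ~300% output vs. the same panel on Earth

launch_cost_per_kg = LAUNCH_COST_PER_TON / US_TON_KG
launch_cost_per_watt = launch_cost_per_kg / PANEL_WATTS_PER_KG

# Credit the 3x boost: cost per "Earth-equivalent" watt delivered.
effective_cost_per_watt = launch_cost_per_watt / SPACE_BOOST

print(f"Launch cost: ${launch_cost_per_kg:,.0f}/kg")
print(f"Launch cost per watt in orbit: ${launch_cost_per_watt:,.2f}")
print(f"Per Earth-equivalent watt: ${effective_cost_per_watt:,.2f}")
```

Even before buying a single panel, the launch alone runs tens of dollars per Earth-equivalent watt under these assumptions, while ground-based solar installs for a few dollars per watt. Even the post's "quarter the cost" scenario leaves launch as the dominant expense, which is the whole argument for building the panels somewhere you don't have to lift them.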

But there's a better option, I've realized. Space solar panels work so well because of the lack of an atmosphere, and the moon lacks one too. Solar panels are usually constructed of silicon, which turns out to be the second most prevalent element in the moon's crust. Instead of building solar panels here on Earth and tossing them out of our gravity well, we could construct them on the moon. This turns it from a question of cheap space travel into a question of extraterrestrial construction. Any complicated machinery would be constructed here on Earth, then rocketed to the moon. There, cousins of the Mars rovers would shovel moon dust into little self-contained factories. Solar panels would come out, be laid in grids across the lunar surface, and hooked up to microwave transmitters that would beam plentiful energy back to Earth. We'd have to keep sending new robots and factories as they break (at least in the short term), but beyond that the solar panel fields could grow and grow and grow. Plentiful energy for all!

And it would, I suspect, have to be for all. Space is one thing, but moon-based construction is going to be a thorny political issue. Who owns the land on the moon? The first person to start using it? And would it be rational for America (if we're the ones building the Lunar Solar Fields) to switch to a pure solar energy society while China continues burning coal? No, I suspect it makes far more sense to get everybody over to this climate-friendly energy source as soon as possible. That would require an unprecedented degree of global cooperation, which worries me. But if we could find an agreeable way to distribute the energy, we could move to a vastly more environmentally friendly energy source in the very near future. It's a tricky engineering problem: constructing factories in an inhospitable environment with minimal direct human interaction. But it's not something that strikes me as beyond our current means; it shouldn't require terribly advanced robotics, or major advances in solar panel construction. Someday you may look up at the moon and see a little splotch of black, and in the following decades that black would grow, until our great-grandchildren look up at the sky and can dimly make out a great spherical solar panel, orbiting the planet, providing energy more plentiful than anything we've ever known.