The mass market doesn't buy, and doesn't want to buy, products based on what they might become months from now if these companies somehow dramatically improve the software. They buy products for what they are today, out of the box.
I would wager that, sometime between now and 30 June 2012, iCloud will offer a web interface just as good as, if not better than, MobileMe's (and quite possibly, under the hood, based on MobileMe's). They just haven't announced it yet, and if Apple hasn't announced it, they won't talk about it.
In short, there is no reason to assume that iCloud as it will exist 12 months from now will be limited to what was announced one week ago.
When DVD Jon was arrested after breaking the CSS encryption algorithm, he was charged with “unauthorized computer trespassing.” That led his lawyers to ask the obvious question, “On whose computer did he trespass?” The prosecutor’s answer: “his own.”
If that doesn’t make your heart skip a beat, you can stop reading now.
When I was growing up, “trespassing” was something you could only do to other people’s computers. But let’s set that aside and come back to it.
My father was a college professor for much of his adult life. One year, he took a sabbatical to write a book. He had saved up enough money to buy a computer and a newfangled thing called a word processing program. And he wrote, and he edited, and he wrote some more. It was so obviously better than working on a typewriter that he never questioned that it was money well spent.
As it happens, this computer came with the BASIC programming language pre-installed. You didn’t even need to boot a disk operating system. You could turn on the computer and press Ctrl-Reset and you’d get a prompt. And at this prompt, you could type in an entire program, and then type RUN, and it would motherfucking run.
I was 10. That was 27 years ago, but I still remember what it felt like when I realized that you — that I — could get this computer to do anything by typing the right words in the right order and telling it to RUN and it would motherfucking run.
That computer was an Apple ][e.
By age 12, I was writing BASIC programs so complex that the computer was running out of memory to hold them. By age 13, I was writing programs in Pascal. By age 14, I was writing programs in assembly language. By age 17, I was competing in the Programming event in the National Science Olympiad (and winning). By age 22, I was employed as a computer programmer.
Today I am a programmer, a technical writer, and a hacker in the Hackers and Painters sense of the word. But you don’t become a hacker by programming; you become a hacker by tinkering. It’s the tinkering that provides that sense of wonder. You have to jump out of the system, tear down the safety gates, peel away the layers of abstraction that the computer provides for the vast majority of people who don’t want to know how it all works.

It’s about using the Copy ][+ sector editor to learn how the disk operating system boots, then modifying it so the computer makes a sound every time it reads a sector from the disk. Or displaying a graphical splash screen on startup before it lists the disk catalog and takes you to that BASIC prompt. Or copying a myriad of wondrous commands from the Beagle Bros. Peeks & Pokes Chart and trying to figure out what the fuck I had just done. Just for the hell of it. Because it was fun. Because it scared my parents. Because I absolutely had to know how it all worked.
Later, there was an Apple IIgs. And later still, a Mac IIci. MacsBug. ResEdit. Norton Disk Editor. Stop me if any of this sounds familiar.
Apple made the machines that made me who I am. I became who I am by tinkering.
This post’s title is stolen from Alex Payne’s “On the iPad,” which I shall now quote at great length.
The iPad is an attractive, thoughtfully designed, deeply cynical thing. It is a digital consumption machine. As Tim Bray and Peter Kirn have pointed out, it’s a device that does little to enable creativity...
The tragedy of the iPad is that it truly seems to offer a better model of computing for many people — perhaps the majority of people. Gone are the confusing concepts and metaphors of the last thirty years of computing. Gone is the ability to endlessly tweak and twiddle towards no particular gain. The iPad is simple, straightforward, maintenance-free...
The thing that bothers me most about the iPad is this: if I had an iPad rather than a real computer as a kid, I’d never be a programmer today. I’d never have had the ability to run whatever stupid, potentially harmful, hugely educational programs I could download or write. I wouldn’t have been able to fire up ResEdit and edit out the Mac startup sound so I could tinker on the computer at all hours without waking my parents.
Now, I am aware that you will be able to develop your own programs for the iPad, the same way you can develop for the iPhone today. Anyone can develop! All you need is a Mac, Xcode, an iPhone “simulator,” and $99 for an auto-expiring developer certificate. The “developer certificate” is really a cryptographic key that (temporarily) allows you (slightly) elevated access to... your own computer. And that’s fine — or at least workable — for the developers of today, because they already know that they’re developers. But the developers of tomorrow don’t know it yet. And without the freedom to tinker, some of them never will.
(As a side note, I was wrong and Fredrik was right, and Chrome OS devices will have a switch for developers to run their own local code. I don’t know the specifics of what it will look like, whether it will be a hardware button or switch or whatever. But it will be there, an officially supported mode for the developers of today and, more importantly, the developers of tomorrow.)
And I know, I know, I know you can “jailbreak” your iPhone, (re)gain root access, and run anything that can motherfucking run. And I have no doubt that someone will figure out how to “jailbreak” the iPad, too. But I don’t want to live in a world where you have to break into your own computer before you can start tinkering. And I certainly don’t want to live in a world where tinkering with your own computer is illegal. (DVD Jon was acquitted, by the way. The prosecutor appealed, and he was acquitted again. But who needs the law when you have public key cryptography on your side?)
Once upon a time, Apple made the machines that made me who I am. I became who I am by tinkering. Now it seems they’re doing everything in their power to stop my kids from finding that sense of wonder. Apple has declared war on the tinkerers of the world. With every software update, the previous generation of “jailbreaks” stop working, and people have to find new ways to break into their own computers. There won’t ever be a MacsBug for the iPad. There won’t be a ResEdit, or a Copy ][+ sector editor, or an iPad Peeks & Pokes Chart. And that’s a real loss. Maybe not to you, but to somebody who doesn’t even know it yet.
My parents gave up on Linux and bought a Mac Mini. We bought an AppleTV for the kids and filled it with their favorite DVDs. I stood in line for three hours to buy my wife an iPhone 3G for her birthday. And nobody gives a shit about freedom 0.
So, hypothetically speaking, let's say you want to design a system where you have absolute control over which applications your customers are allowed to install on your device. Certainly you would want to ensure that you are the only source for applications. But for extraordinary cases, you might also need to create a blacklist of applications.
Each entry in the blacklist would also need a human-readable title -- presumably the name of the app -- and perhaps even a human-readable description to explain why the app was blacklisted. But each entry would also need a unique identifier, of course, so you don't confuse six different apps named "TODO." Finally, you would probably want to include the date each entry was added to the list.
Furthermore, since you anticipate continually adding new applications to this blacklist to protect your and your partners' business model, you would need your proprietary non-browser-based client to periodically poll the list for changes.
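As a rough sketch -- every name here is hypothetical, and this is emphatically not Apple's actual schema or polling protocol -- an entry with the fields described above, plus the client-side "is it time to poll again?" check, might look like this:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)
class BlacklistEntry:
    # Hypothetical entry: the field names are invented for illustration.
    app_id: str       # unique identifier, so six apps named "TODO" stay distinct
    title: str        # human-readable name of the app
    description: str  # human-readable reason the app was blacklisted
    added: date       # date the entry was added to the list

def needs_refresh(last_poll: date, today: date, interval_days: int = 1) -> bool:
    """Return True when the client should poll the blacklist again."""
    return today - last_poll >= timedelta(days=interval_days)

# A one-entry example list, and the lookup the client would actually perform.
blacklist = [
    BlacklistEntry("com.example.todo", "TODO",
                   "Hypothetical example entry", date(2008, 8, 1)),
]
banned_ids = {entry.app_id for entry in blacklist}

print("com.example.todo" in banned_ids)                   # True
print(needs_refresh(date(2008, 8, 1), date(2008, 8, 3)))  # True
```

The point of the unique-identifier field is visible in the last lines: membership is checked against `app_id`, never against the display title.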
All of which raises a very serious question: what data format should you use for the list?
If you answered "JSON" then congratulations, you ~~win the Trendy Tech of the Month Award~~ lose! To collect your prize, please proceed through the door marked "This way to the egress." Some restrictions apply.
Update: OK, OK, it's a "Core Location" blacklist. Big deal. I'll see your tree and raise you a forest:
... an independent engineer discovered code inside the iPhone that suggested iPhones routinely check an Apple Web site that could, in theory, trigger the removal of the undesirable software from the devices.
Mr. Jobs confirmed such a capability exists, but argued that Apple needs it in case it inadvertently allows a malicious program -- one that stole users' personal data, for example -- to be distributed to iPhones through the App Store.
As I've said before, "protecting users from malicious programs" is code for "cryptographically enforcing restrictions on applications to protect our and our partners' business model." The bullshit about "stealing personal data" is just a rhetorical sleight of hand, like the RIAA claiming that piracy hurts "artists and other rights holders" when 99% of artists don't own the rights to their own songs. How many apps has Apple de-listed over privacy concerns? Only one that I know of, and it was quickly reinstated after an update. How many apps has Apple de-listed (or prevented being written in the first place) to protect their business? Lots and lots.
Not for nothing, but I've had my share of bad reviews in my professional career. Some I've taken well, and some I've taken... poorly. Some were my fault and others honestly weren't. There isn't a manager on Earth who hasn't had to give a bad review to somebody, sometime. It's always awkward and it's never fun and in the end you're left with a low score on a piece of paper and a sinking feeling in your chest.
And yet, if you rounded up all the managers in the world and shot them... no wait, that's not where I was going with this. If you rounded up all the managers in the world and got them drunk -- yes, I think that would work -- you got them drunk and you asked them one question, they'd all tell you the same thing: the score that they give and you get doesn't mean a damn thing. Oh, you'll fixate on the score, since it means no salary bump or no bonus or no promotion or -- jackpot! -- all three at the same time, but it truly, truly, truly doesn't mean a damn thing. The only thing that truly matters is the conversation that follows.
And it is in this context that I am somewhat embarrassed on behalf of the Mozilla Corporation. They certainly didn't ask for my opinion or my guilt-by-proxy, but they apparently haven't noticed that they ought to be embarrassed, so by God somebody needs to step up. I refer, of course, to the Acid 3 test cooked up by the inimitable Ian Hickson and his motley crew of meddling minions. The test gives a numerical score that purports to rank a browser's compatibility with a potpourri of well-established web standards. Of course any such test is guaranteed to be unfair to somebody, but this one was especially unfair to everybody since the makers intentionally sought out bugs in major browsers to highlight their incompatibilities.
That, by itself, is not the story. First there was the Acid test, then there was the Acid 2 test, and there will no doubt be an Acid 4 test and so on. The fact that the testmakers had to work so damn hard to find compatibility bugs to highlight speaks volumes by itself, but that is not the story either. The story is that two browser vendors -- Opera and Apple -- somehow got into a bit of a race over who could reach a perfect score first. This, on top of their already insane release schedules (Safari 3.1, Opera 9.5), shocked and awed the web standards community, who for the first time in recent memory were put in the enviable position of arguing about which browser had increased its standards compliance the most and the fastest.
The funny thing is, I don't even know who won. There were some inconsistencies about which builds passed what, and then they found some last-minute bugs in the tests themselves, and despite minute-by-minute updates on programming.reddit.com, I don't really know or care who "won" the race. But I'll tell you one thing: it sure as hell wasn't Mozilla, because they were too busy complaining that the tests were just designed to highlight bugs (duh)... and they didn't see any real worth in the feature tests (like downloadable web fonts, which is a five-digit Bugzilla bug that has been open since 2001)... and they felt they should get partial credit for still being ahead of Internet Explorer (new working slogan: "Firefox: We're Not Dead Last")... and anyway, they're really busy right now -- unlike the fine young minds at Apple and Opera, who, unbeknownst to their managers, have outsourced all their browser development to summer interns and are spending their newfound free time reenacting Roman toga parties. And oh, by the way, didn't you hear that the other guys cheated? Also, their toga parties are, like, totally inaccurate when viewed from a psycho-historical perspective.
C'mon, guys. It's not the score that matters, it's the followup. It's the conversation you have, the promises you make, the progress you show the next day and the day after that and the day after that. And bitching about an openly developed test suite whose ultimate goal was just to get people excited about web standards for a few minutes -- man, you should all be embarrassed with yourselves. But you're not, so here I am stepping up, publicly being embarrassed on your behalf. No need to thank me.
Update: once again, I explain myself better the next morning.