Is the Mailman Watching Me?

My take on Walt Disney World’s “magic bands,” which will probably be misinterpreted as a defense of the NSA.


My friend Michael sent me a link to “You don’t want your privacy: Disney and the meat space data race,” an article by John Foreman on GigaOm, and made the mistake of asking my opinion on it. I think it’s a somewhat shallow essay, frankly, but it raises some interesting topics, so in the interest of spreading my private data everywhere on the internet, I’m copy-and-pasting my response from Facebook. Overall, it reads like one of those mass-market-newspaper-talks-about-technology pieces, the kind that breathlessly describes video games as “virtual worlds” in which your “avatar” has the freedom to do “anything he or she chooses.”

For starters, I’m immediately suspicious of anyone who says something like “Never will we take our children to Disney World.” (Assuming they can afford it, of course; considering that the author had just talked about vacationing in Europe and enjoying the stunningly blue waters off crumbling-economy Greece, that’s a safe assumption). Granted, I’m both childless and obsessed with Disney theme parks, so my opinion will be instantly and summarily dismissed. But all the paranoia about Disney in general and princesses in particular strikes me less as conscientious parenting and more as fear-based pop-cultural Skinner-boxing. It seems a lot healthier to encourage kids to be smarter than marketing than to assume that they’re inescapably helpless victims of it. Peaceful co-existence with the multi-billion-dollar entertainment conglomerate.

Which is both none of my business and a digression, except for one thing: I really do think that that mindset is behind a lot of shallow takes on the Disney phenomenon, which are based on the assumption that people can’t see past the artificiality and enforced whimsy, so an edgier, “counter-culture” take on Disney is showing them something they haven’t seen before. It also feeds the kind of paranoia that describes Disney as if it were an oppressive government, and not a corporation whose survival depends on mutually beneficial business transactions.

There’s no doubt that Disney wants to get more data on park guests, but that essay’s extrapolations of what they’ll actually DO with the data are implausibly silly. They’re all based on the idea that Disney would spend a ton of money to more efficiently collect a ton of data aggregated for weeks across tens of thousands of customers, and then devote all that money and effort to developing creepily specific experiences for individuals.

It’s telling that Foreman compares Disney’s magic bands to the NSA, since I think the complaints miss the point in the same way. People freak out that the government has all kinds of data on them, when the reality is that the government has all kinds of data on millions of people. The value of your anonymity isn’t that your information is private; it’s that your information is boring. All your private stuff is out there, but it’s still a burden to collate all of it into something meaningful to anyone.

This is not an attempt to excuse the NSA, by any stretch. The NSA’s breaches are a gross violation, but the violation isn’t that they’re collecting the data, so much as that they’re collecting the data against our will and without our knowledge.

Anything Disney does with the Magic Band data, at least in the next ten years or so, is going to be 1) trend-based instead of individual-based, and 2) opt-in. For instance, they’ve already announced that characters can know your name and about special events like birthdays, but they’re only going to use something like that at a character meet-and-greet: you’ve specifically gone to see Mickey Mouse, and he’ll be able to greet you by name and wish you a Happy Anniversary or whatever. Characters seeking you out specifically is just impractical; the park has already had enough trouble figuring out how to manage the fact that tens of thousands of people all want to get some individual time with the characters. The same goes for the bit about “modifying” a billion-dollar roller coaster based on the data they get from magic bands; it’s as silly as assuming that you could remove floors from a skyscraper that weren’t getting frequented enough by the elevators.

It’s absolutely going to be marketing-driven; anybody who says otherwise doesn’t get how Disney works. But I think it’s going to be more benign. Walt Disney World as a whole just doesn’t care about a single guest or a single family when they’ve got millions of people to worry about every day. So they can make more detailed correlations like “people who stay at the All Star resorts don’t spend time at the water parks” and adjust their advertising campaigns accordingly, or “adults 25–40 with no children spend x amount of time in Epcot.” But the most custom-tailored experience — at least, without your opting in by spending extra — is going to be something like, at most, coming back to your hotel room to find a birthday card waiting for you.
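
To put “trend-based instead of individual-based” in concrete terms, here’s a toy sketch of the kind of aggregation I’m imagining. Every class and field name in it is made up; this is my guess at the shape of the analysis, not anything Disney has announced:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy illustration of trend-based aggregation: collapse thousands of
// guest records into per-resort averages and throw the individuals away.
// All names here are hypothetical, invented for the sake of the sketch.
public class ResortTrends {

    /** A single (anonymized) stay: which resort, minutes spent at water parks. */
    public static class GuestStay {
        final String resort;
        final int waterParkMinutes;

        public GuestStay(String resort, int waterParkMinutes) {
            this.resort = resort;
            this.waterParkMinutes = waterParkMinutes;
        }
    }

    /** Average water-park minutes per resort; no individual survives the math. */
    public static Map<String, Double> avgWaterParkMinutes(List<GuestStay> stays) {
        Map<String, double[]> totals = new HashMap<>(); // resort -> {sum, count}
        for (GuestStay stay : stays) {
            double[] t = totals.computeIfAbsent(stay.resort, k -> new double[2]);
            t[0] += stay.waterParkMinutes;
            t[1] += 1;
        }
        Map<String, Double> averages = new HashMap<>();
        totals.forEach((resort, t) -> averages.put(resort, t[0] / t[1]));
        return averages;
    }
}
```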

The creepier and more intrusive ideas aren’t going to happen. Not because the company’s averse to profiting from them, but because they’re too impractical to make a profit.

Sometimes When We Stylus

A review, more or less, of the Samsung Galaxy Note 8.0, plus a bit of marveling on the current state of tablet computers.

[Image: the Samsung Galaxy Note 8.0]
Back in March of 2009, Kindles still had keyboards, and we were still a year away from enjoying all the feminine hygiene jokes that came with the release of the iPad. I took advantage of the release of the Kindle 2 to describe what would be my ideal tablet computer.

Reading that blog post now, what stands out the most is what a fundamental shift in thinking the iPad was. Looking back, it’d be easy to say that the iPad was inevitable — of course they’d just make a bigger iPhone! But that’s definitely not where the speculation was before the iPad’s announcement. People were still thinking that there was a clear distinction between computer and media device. It’s why there’ve been so many “hybrid” laptops with removable screens that become “tablets,” which invariably have tech journalists swooning and declaring them the perfect solution right up until the thing is released and fails to make a dent. It’s why people still insist on making a distinction between devices for “consumption” vs. ones for “creation.” If you’d asked most people in 2009 to describe what the iPad was going to be like, they’d have described something basically like the Microsoft Surface Pro.

That includes me; what I had in mind was essentially a thinner, lighter Tablet PC (in other words, the Surface Pro). The iPad undercut that, not just in size and in price, but in function. It made good on the promise of a “personal computer:” portable enough for media consumption, but multi-purpose enough not to be dismissed as just an evolution of the e-book reader or PDA. It’s clear now that that was absolutely transformative, and anyone who suggests otherwise is not to be trusted with your technology prognostication.

I’m not claiming to be prescient; at the end of that blog post, I gave a spec list for my perfect tablet computer, and it’s not an iPad. However, it is eerily close to a tablet computer that exists today, with one major difference: it isn’t made by Apple, and it doesn’t run OS X. It’s the Samsung Galaxy Note 8.0.

Why Would You Even…?!

I’ve already been completely converted to the form factor of the iPad mini, and this one reportedly had all of that plus removable storage and a Wacom digitizer. The existence of refurbished models, some left-over gift certificates, and “reward” points, meant that I could get one for about $250. (They retail for about $400, and not to spoil the review, but I really can’t recommend it at that price). If you don’t think I chomped on that like the star attraction at Gatorland, then you just don’t know me at all.

Most of the reviews for the Note 8 that I’d read acknowledged that it’s a fairly good — but soon to be outdated — tablet whose main draw is stylus input, and that unless you need the stylus, either the iPad mini or the Nexus 7 is a better value. They still treated the digitizer as an extra feature, though, as opposed to the whole reason for the tablet’s existence. (Which is fair, since Samsung treats it basically the same way by selling a Galaxy and Galaxy Note line in parallel). I hadn’t seen any that reviewed it mainly on the strength of its digitizer and its appeal to digital artists; the closest I could find was Ray Frenden’s review of the Galaxy Note II smartphone.

Over the years, I’ve tried out various graphics tablets, tablet PCs, styluses, and art software with the hope that I’d find the magic bullet that suddenly turned me into a better artist. I’ve finally given up on that idea and resigned myself to the fact that only practice is going to turn me into a better artist. By that measure, anything that reduces “friction” and encourages me to practice more often is a worthwhile investment. I’ve got several Moleskines that were going to do exactly the same thing, but instead just frustrated me with their analogness. Without an “erase everything” button, they’re like tiny Islands of Dr. Moreau, the misshapen forms of my previous failures staring back at me and discouraging me before I’ve even begun a new drawing.

A tablet that I use as often as the iPad mini, on the other hand, but that has a pressure-sensitive stylus and palm rejection and layers and simulates different media and colors and downloads reference material directly into my art program? And for less than 300 bucks? How could that not be ideal?

As a Digital Notebook

[Sketch: “Gaulbladder,” drawn on the Note 8]
The sketch above is the most effort I was willing to put into a drawing for this blog post. Obviously better artists could show more of the capabilities of the device; even I have generated better drawings on the Note 8 when I’ve put more time into it.

But sample images for these things are always deceptive. I know I’ve gotten in the habit of looking at Frenden’s reviews and thinking, if I buy this thing, I’ll be able to draw like that!, which is of course a lie. And you can find amazing pieces of work done on just about any device, from a cell phone to a Cintiq, by artists who already know what they’re doing. What I wanted to see was what kind of results you’d get if an average person interested in being a better artist sat down and tried to use it.

That drawing was done in Autodesk Sketchbook Pro for Android, and it’s intended to show off the basic advantages of using a digital notebook. Reference art from the web, multiple layers for sketching & inking, brushes with variable line weight, and a tool that makes it easy to add simple color.

Pressure Sensitivity: You can tell that it’s a pressure-sensitive pen, but you’re not going to see dramatic differences in line weight unless you’re willing to do a lot of fiddling with brush settings. There is a way to increase the sensitivity of the S-Pen, apparently (instructions are in that Frenden review of the Note 2), but I had no luck getting it to work.
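
To be clear about what that fiddling involves: brush settings mostly reshape how raw pressure maps to line weight. Here’s a minimal sketch of that mapping, assuming Android’s standard MotionEvent.getPressure() as the input; the gamma curve and all the names are my own illustration, not anything from Sketchbook or Samsung:

```java
// Minimal sketch of mapping stylus pressure to stroke width. The only
// real API referenced is MotionEvent.getPressure(), which reports a
// nominal 0.0-1.0 value; the gamma curve is a hypothetical tuning knob
// of the sort a brush-settings screen exposes.
public class PressureBrush {
    private final float minWidth;
    private final float maxWidth;
    private final float gamma; // >1 flattens light strokes, <1 exaggerates them

    public PressureBrush(float minWidth, float maxWidth, float gamma) {
        this.minWidth = minWidth;
        this.maxWidth = maxWidth;
        this.gamma = gamma;
    }

    /** pressure comes from MotionEvent.getPressure() on each touch event. */
    public float strokeWidth(float pressure) {
        float p = Math.max(0f, Math.min(1f, pressure)); // clamp to 0..1
        return minWidth + (maxWidth - minWidth) * (float) Math.pow(p, gamma);
    }
}
```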

Palm Rejection: Even though there are now pressure-sensitive styluses for the iPad, one of the biggest remaining annoyances is that none of the software (at least that I’m aware of) supports any sort of palm rejection. As a result, you have to hold the stylus out as if you were using charcoal or pastels, which to me kind of defeats the purpose of having a stylus. On the Galaxy Note 8, of all the apps I’ve tried — Sketchbook Pro, Sketchbook Ink, Photoshop Touch, ArtFlow, and Infinite Painter — it only worked reliably in Sketchbook Pro. The others would either leave a smudge at the bottom of the screen, resize the view, or interrupt the current drawing stroke. Even in Sketchbook Pro in “Pen Only” mode, it seemed eager to interpret my palm as an attempt to resize the canvas. I get the impression that both pressure sensitivity and palm rejection have to be implemented by each app for itself, although it seems like it’d make far more sense to have it implemented at the OS level.
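
My guess at what a per-app “Pen Only” mode boils down to: filtering touch events by tool type and swallowing anything that isn’t the stylus. A sketch, assuming Android’s standard touch APIs (the MotionEvent tool-type calls are real; the listener itself is hypothetical):

```java
import android.view.MotionEvent;
import android.view.View;

// Hypothetical "Pen Only" filter: ignore any pointer that doesn't
// identify itself as a stylus. MotionEvent.getToolType() and
// TOOL_TYPE_STYLUS are real Android APIs (API level 14+).
public class PenOnlyTouchListener implements View.OnTouchListener {
    @Override
    public boolean onTouch(View view, MotionEvent event) {
        for (int i = 0; i < event.getPointerCount(); i++) {
            if (event.getToolType(i) != MotionEvent.TOOL_TYPE_STYLUS) {
                // Swallow the whole event so a resting palm can't smudge
                // the canvas or register as pinch-to-zoom. Note that if the
                // palm and the pen land in the same event, this eats the
                // pen too; crude filtering like that would explain the
                // interrupted strokes described above.
                return true;
            }
        }
        return false; // stylus-only input: let the canvas draw it
    }
}
```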

Accuracy: The other big problem with drawing on the iPad is that you need a blunt tip to register on a capacitive display. The S-Pen is much, much better at this, as you’d expect. The other thing that helps is that the tablet detects the pen’s proximity to the surface, not just an actual touch, so you get a cursor showing where you’re about to draw. (It also means you get tooltips throughout the entire system when using the pen. Which is nice, I suppose, but I’d prefer a UI that’s simply clear on its own, and I’d been hoping that touch screens meant tooltips were dying off for good).
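
That cursor presumably comes from the digitizer reporting the pen’s position while it hovers above the glass. Here’s a sketch of how an app might consume those events, using Android’s real hover APIs; the DrawingCanvas interface is a hypothetical stand-in for whatever the app’s canvas actually does:

```java
import android.view.MotionEvent;
import android.view.View;

// Sketch of the "cursor before you touch" behavior: the S-Pen is
// reported as hover events while it's near the glass. OnHoverListener,
// ACTION_HOVER_MOVE, and ACTION_HOVER_EXIT are real Android APIs;
// DrawingCanvas is a hypothetical stand-in for the app's canvas view.
public class HoverCursorListener implements View.OnHoverListener {

    /** Hypothetical canvas abstraction, just enough for the sketch. */
    public interface DrawingCanvas {
        void showGhostCursor(float x, float y);
        void hideGhostCursor();
    }

    private final DrawingCanvas canvas;

    public HoverCursorListener(DrawingCanvas canvas) {
        this.canvas = canvas;
    }

    @Override
    public boolean onHover(View view, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_HOVER_MOVE:
                // Pen is near the screen but not touching: show where
                // the stroke would land.
                canvas.showGhostCursor(event.getX(), event.getY());
                return true;
            case MotionEvent.ACTION_HOVER_EXIT:
                canvas.hideGhostCursor();
                return true;
            default:
                return false;
        }
    }
}
```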

Drawing on Glass: Earlier I said I wanted something that would reduce the “friction” of drawing so I’d practice more often; drawing on the Note 8 takes that a little bit too literally. I’ve gotten used to drawing on graphics tablets, and with rubber-tipped styluses on the iPad. That’s entirely different from drawing with a plastic nib on a glass screen, enough to make me wish they’d sacrificed the display brightness a bit in favor of a more matte surface on the screen. That would never have happened, since Samsung’s trying to position the tablet as a superset competitor to the iPad mini and is going to make a big deal out of the slightly better pixel density. But I think it would’ve been a good way to further differentiate this as a stylus-based tablet, instead of a tablet that happens to also have a stylus.

Responsiveness: It varied from app to app, with Sketchbook Ink being the worst. When I turned off the “Smooth Brush” option in Sketchbook Pro, the lag was all but imperceptible to me, unless I was drawing with a particularly large or complex brush.

Bezel: Unlike the iPad mini, the Note 8 has a bezel that’s as wide on the sides as it is on the top and bottom. While I think it does actually contribute a bit to the overall “cheap and plastic” look of the device, it’s absolutely essential for a tablet with a stylus. If you were to simply slap a Wacom digitizer onto an iPad mini, there’d be no good place to hold it.

Software: If it’s not obvious by now, Sketchbook Pro is the clear winner of all the apps I’ve tried. That’s no big surprise, since it’s been around for years and was designed specifically for tablet computers. I’ve bought a version of it for every operating system and every computer I own, and they’re all excellent; it’s nice to finally be able to use it as it was designed to be used. I do wish that it were possible to import brushes on the tablet version as you can on the desktop versions; if there is a way to do that, I have yet to find it.

Overall, I’d say that even though our skill levels are vastly different, my take on the Note 8 isn’t all that different from Frenden’s take on the Note II. (Much of that’s intentional on Samsung’s part, as they want consistency between the phone, 8-inch, and 10-inch tablet devices in their line). Don’t expect to use it for finished art, and don’t expect this $300 tablet to function like a Cintiq. But as a sketch book with a complete set of art tools that you always have with you, it’s fine. Whether you go with the Note II or the Note 8 — every review I’ve read of the Note 10 says that it’s underpowered, so I’d avoid it — just depends on which one you’re more likely to have with you everywhere you go.

For my part, I can definitely see myself practicing more often on this thing.

As a Tablet Computer

Practicing art was only part of the thing; even I can’t justify spending a couple hundred bucks to replace a $10 Moleskine. The idea was that I’d have something that would do everything the iPad mini can, and function as a digital notebook. In that regard, I’d say that it’s not quite there, but it’s pretty close.

When I had my semi-religious experience in an Apple Store, I said that the iPad mini seems absolutely silly until you actually hold one. I still think that’s the case, and I think that the build quality of the Note 8 really drives that home. It’s got a white plastic back and a silver border that make it seem 1) like a prop from 2001 or Space: 1999, 2) thicker than it actually is, and 3) kind of cheap. The iPad mini feels like a solid block of metal and glass; the Galaxy Note 8 just feels like a plastic consumer product.

According to the specs, the Note 8 has a slightly higher pixel density than the mini. It shouldn’t be enough to be perceptible, but whether it’s more clarity, better use of fonts, or just placebo effect, the picture does look better than the mini’s, especially with text and line drawings (by which I mean comics, of course). The colors also seem brighter than on the mini.

Battery life is middling. I haven’t stress-tested it (and I’m unlikely to), but it has completely drained just sitting idle for three days, which has never been the case with any iPad I’ve used. I suspect that if I took it on the road, I’d have to charge it every night.

It does support micro SD cards up to 64 GB for external storage — one of the items on my “ideal tablet computer” list from 2009 — but for documents only, not apps. (Since it’s Android, there are instructions online on how to root the tablet so you can use the SD card for apps, but I’ve always considered rooting or jailbreaking these things to be more trouble than it’s worth). Since the tablet is limited to 16 GB of internal storage, and you’re left with around 9.7 GB after all the pre-installed software, the extra space is definitely nice to have. It could store my entire library of Kindle books and comic books, and still have enough space for a significant chunk of my music library, which is something I’ve never been able to do with the iPad. Consistent with the build quality of the rest of the tablet, the door for the SD card is one of those tiny plastic covers that always seems in danger of breaking off.

The stylus is definitely closer to a Palm Pilot stylus than a Wacom pen, but it’s perfectly adequate for drawing. It’s more white plastic, it fits snugly in the underside of the tablet, and pulling it out automatically brings up a page of Samsung’s pen-enabled “S Note” app. Unlike the bloatware I would’ve expected, that’s actually a pretty solid app. It’s got a set of templates of questionable usefulness, but the technology underneath is solid. Handwriting recognition is flawless enough to be eerie, and it’s got additional modes that recognize mathematical formulae and shapes for diagrams. The latter was the biggest surprise for me, since I haven’t seen any tablet computers pull off the potential of OmniGraffle very well, when it seems like it’d be a natural. (It’s possible that OmniGraffle for iPad is an excellent program, but at $50 I’m never going to find out).

Handwriting recognition is available throughout the system as a “keyboard” mode; the others are a traditional keyboard and voice input. (Somewhat surprisingly, handwriting is faster and more accurate for me than voice input. Could Star Trek have gotten the future wrong?)

Samsung has re-skinned the entire OS and included its own apps, but I didn’t think either one was particularly obtrusive. All the apps other than S Note were quickly relegated to a different page. Having a “Samsung Cares Video” icon that can never be deleted from the system is kind of an annoyance, but at least I never have to look at it. And I tried using a different launcher for a bit, but soon went back to the default one.

Considering how often I read comments online from people demanding that this app or that service be released on Android, I’d expected the Google Play store to be filled with nothing but fart apps and tumbleweeds. But I quickly found and downloaded every one of the apps I use most often on iOS. I’d be more disappointed if I had any intention of giving up iOS completely, but there’s a respectable amount of software out there.

It’s also got two cameras, and they’re both terrible. Which is as it should be, because if tablets had good cameras, you’d have even more people taking pictures with them in public.

Android vs. iOS

Believe it or not, I did go into Android with a completely open mind. As long as it’s functionally equivalent to iOS, then there’s no point in getting butthurt over all the differences.

And at least with the version of Android that’s installed on this thing — I don’t know, it’s Peanut Buster Parfait or some shit — it is pretty much functionally equivalent. On a task-by-task basis, there’s little that’s inherently better about one way of doing things than the other. Widgets and Google Now seem better in theory than in practice, and the only thing that’s outright worse about Android is the lack of a gesture to immediately scroll to the top of a screen.

What’s surprised me is just how much the cliches about each OS are true. Overall, Android seems like an OS that was made by programmers, while iOS seems like an OS that was made by designers. iOS tends to have a consistent aesthetic, while Android has that weird combination of sparseness and excess that you see on Linux desktops: there’s only an icon for a Terminal window and an open source MS Office clone, but they glow and rotate in 3D space with The Matrix constantly scrolling on top of an anime girl in the background.

I’ve certainly got my own preferences. The lack of options and settings common to iOS apps is often, bafflingly, described as a failing, when it’s really an acknowledgement that a consistent experience that just works is preferable to fiddling with a billion different settings. I often read people complaining about Apple’s “walled garden” and its arrogant insistence on one way of doing things as opposed to giving the user choice; what I see in Android is a ton of meaningless, inconsequential choices that I’m simply not interested in making.

One of the “features” of the Note 8 that I didn’t mention above is that it supports multiple windows. You can open a little task bar and drag a separate app onto the screen to have two apps running at the same time. A lot of reviews that I’ve seen for the tablet list this as a major advantage of the system. I say that it’s a clear sign the developers have learned nothing from the failure of the Tablet PC. They’re still trying to cram a desktop OS onto a tablet with a touchscreen, when even Microsoft has learned to stop emphasizing windows in Windows. The iOS limitation of showing only one app on screen at a time isn’t just some technical shortcoming; it’s one of those constraints that makes the design of the entire system stronger. It means the designer can’t just lazily port a desktop interface to a tablet, but has to put real thought into how to optimize the app for the new device and how it will be used.

(There are definitely, absolutely, major inconveniences to having only one app running at a time on iOS, as anyone who uses 1Password will tell you. But I’m convinced that the best way to solve it won’t look anything like what works on a desktop OS).

I think the best example of the whole divide between Android and iOS is in the file system. iOS is notoriously closed; each app has its own sandbox of files that only it can touch, and transferring documents between apps is cumbersome. Android is notoriously free and open; you have access to the entire file system of the device, with a file-and-folder-based GUI that should be familiar to you because it’s the exact same one you’ve been using for 30 years.

Some people will say this is a perfect example of each person being able to choose the operating system philosophy that works best for him. I say it’s an example of how stubbornly sticking to one way of doing things results in something that’s best for nobody. I’m perpetually frustrated by the file handling in iOS, where I just want to use this app to open that document but can’t find any flow of import or export that’ll make it work. But I’ve been just as frustrated with Android, where I keep creating files and then am completely unable to find them in any of the dozens of folders and subfolders on the system. (Sketchbook, for instance, doesn’t save pictures you’ve exported in Pictures. Nor in Documents. It saves them in Autodesk/SketchbookPro/Export).
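
For the record, digging those exports out programmatically looks something like this. The path is the one Sketchbook actually uses; the helper class is just my illustration, built on the external-storage API that was current on this tablet:

```java
import android.os.Environment;
import java.io.File;

// Where Sketchbook's exports actually live on the Note 8's storage.
// Environment.getExternalStorageDirectory() is the real (pre-KitKat-era)
// Android API; this helper class is hypothetical.
public class SketchbookExports {
    public static File[] listExports() {
        File exportDir = new File(Environment.getExternalStorageDirectory(),
                "Autodesk/SketchbookPro/Export");
        File[] files = exportDir.listFiles();
        return files != null ? files : new File[0]; // dir may not exist yet
    }
}
```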

I’m hoping that Android will eventually get over its problems with market fragmentation, let go of the desktop, and finally embrace a post-PC world. And I’m hoping that iOS will eventually let go of Steve Jobs’s pathological fear of multiple buttons and develop a scheme for cross-app communication that doesn’t depend on clipboards or exposing the file system. Concentrating on how to use touch as a completely new way of interacting with a computer could lead to a dramatically improved method of working with computers; we’ve already seen that kind of cross-pollination happening between iOS and OS X. I don’t see that kind of innovation coming from Android, though, since it seems to be still doing little more than iterating on stuff that’s as old as the X Window System.

And one of the cliches that’s hilariously not true is the one about Android being all about functionality and practicality with Apple being all about flash and gimmickry. Because I’m now the owner of a tablet that has no fewer than eight different ways to unlock it (most using a rippling water effect), and which keeps warning me that it can’t see my eyes, because it has a “feature” (optional, of course!) that won’t let it go to sleep if it detects my face looking at it. Unlike the iPad, which, you know, turns off when I close the case.

And Finally, the Verdict

I’m way too invested in iOS at this point to ever switch over completely, so that was never an issue. And I think I’ve gotten most of the making-fun-of-Android out of my system, so I’m not going to be starting any campaigns against it. (I’d even like to try writing an app for it, at some point).

The questions for me were whether the Galaxy Note 8 could replace my iPad mini as the “everyday workhorse” tablet, and whether it’d help me practice drawing more often by having a ubiquitous digital notebook. The answers, so far: almost definitely not, and maybe.

If I were actually writing for one of the tech blogs, I’d be laughed out of my job if I based my entire verdict on “how the computer feels.” But for me, that’s what it comes down to with the iPad mini. It’s like Kirk Cameron’s banana: it just fits the hand perfectly (and doesn’t squirt in your face, either). It just feels more fun to use, for some indefinable value of “fun.” When Apple inevitably releases one with a higher resolution display, it’s going to be all but impossible for me to avoid getting one. I bought the first one thinking it was a ridiculously excessive extravagance, and it almost immediately became indispensable; I use it every day.

Still, I’m happy to have the Galaxy Note 8, although I’m glad I didn’t pay full price for it. It’s a solid (if not exceptional) drawing tablet that didn’t require me to shell out for a Cintiq or even a Surface Pro. If it helps me get to the level where I could actually make art for a game, then it was a good investment.

As for normal people, without my weird affliction when it comes to gadgets?

  • If you don’t care that much about drawing and just want the best tablet: get an iPad mini.
  • If you want a good tablet for an unbeatable price: get the Nexus 7.
  • If you’ve got the money, and you’re looking for a laptop replacement or the best drawing experience you can currently get on a tablet: get the Surface Pro. (I haven’t used it myself, but I’ve never seen a review of one that could find fault with the digitizer on it).
  • If you want a mid-sized tablet and think you’ll ever want to use a stylus with it: get the Galaxy Note 8. Preferably on sale.

Protestant Gadget Ethic

Making sense of the iPad mini in a world that doesn’t need it.

After my previous unfortunate episode in an Apple store, it should come as little surprise that I didn't last very long before I broke down and bought an iPad mini. No, it doesn't make sense for me to be throwing my credit card around as if I were the CEO of Papa John's or something. I've already got a perfectly fancy tablet computer that's not just functional, but really quite terrific. It's not like I'm getting paid to write reviews of these things, and even my typical “I need it for application development testing” is sounding increasingly hollow.

What helps is a new metric I've devised, which measures how long it takes me after a purchase before the appeal of the thing overwhelms my feeling of conspicuous consumption guilt over buying it. It's measured in a new unit: the Hal (named after Green Lantern Hal Jordan, the Jordan who does have willpower).

By that standard, the iPad mini clocks in with a new record of 0.03 Hals, or about 30 minutes after I opened the box. Because this thing is sweet, and I pretty much never want to stop holding it. I'm writing this post on it, as a matter of fact, even though a much more functional laptop with keyboard is sitting about three feet away from me at this very moment. But to use it would mean putting the iPad down.
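
In the spirit of over-engineering the joke, the unit conversion, calibrated entirely from that one data point, works out like this:

```java
// Over-engineering the joke: the Hal, a unit of post-purchase guilt
// decay. Calibrated from the single data point in this post (the iPad
// mini's 30 minutes = 0.03 Hals), so 1 Hal is about 16.7 hours of
// willpower. The constant is my own assumption, not science.
public class GuiltMeter {
    private static final double MINUTES_PER_HAL = 30.0 / 0.03; // = 1000

    /** Minutes from unboxing until the appeal overwhelms the guilt. */
    public static double toHals(double minutesUntilGuiltGone) {
        return minutesUntilGuiltGone / MINUTES_PER_HAL;
    }

    public static void main(String[] args) {
        System.out.printf("iPad mini: %.2f Hals%n", toHals(30)); // 0.03
    }
}
```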

The “finish” of the iPad mini, with its beveled edge and rounded matte aluminum back, is more like the iPhone 5 than the existing iPads. It makes such a difference in the feel of the thing that I can talk about beveled edges and matte aluminum backs without feeling self-conscious, as if I were a tech blogger desperately seeking a new way to describe another piece of consumer electronics.

It’s about as thin as the iPhone 5, and almost as light. With the new Apple cover wrapped around the back, it's perfect for holding in one hand. There have been several times that I've made fun of Apple, or Apple fanatics, for making a big deal about a few millimeters difference in thickness, or a few ounces in weight. And I joked about the appeal of the iPad mini, as if the existing iPad was unreasonably bulky and heavy.

But then something awful happened: I had to fly cross country four times within two weeks. And reading a book on the iPad required me to prop the thing up on the tray table and catch it as the person in front of me kept readjusting his seat. All my mocking comments were flying back in my face (along with the iPad, my drink, and the in-flight magazine), in the form of the firstest of first-world problems.

“Version 1 of the iPad mini is for chumps,” I said. “Check back with me once you’ve put in a higher-resolution display, Apple.” In practice, though, the display is perfectly sharp, and “Retina” isn’t the make-or-break feature I thought it would be. You can certainly tell the difference when comparing the two (I’d assumed that squabbling over pixel density was best left to the comments sections of tech blogs, but the difference in sharpness is definitely visible). It’s really only an issue for very small text, though. Books, Flipboard, and web pages are all clear and legible.

And speaking of Flipboard, it and Tweetbot are the two apps that get me giddy enough to own up to making another unnecessary tech purchase. Browsing through articles and status updates on a tablet that thin is probably the closest I'll ever come to being on board the Enterprise.

The phrase I've seen reoccurring the most in reviews of the iPad mini is some variation on “this is the size the iPad is supposed to be.” And really, there's something to that. I'm not going to give up my other one; the larger size really is better for some stuff, like drawing, Garage Band, and reading comics or magazines. But overall, I haven't been this impressed with the “feel” of a piece of consumer electronics since I saw the original iPhone. Realizing that this is just version 1.0 is actually a little creepy — apart from the higher resolution display, I honestly can't conceive of how they'll improve on the design of the iPad mini.

Tabletized

Impressions of the iPad mini and Microsoft Surface, and the realization that I just don’t know what the hell is going on anymore.

[Image: dancing schoolgirls from the Microsoft Surface commercial]
It started out innocently enough; I thought it’d be good to get out of the apartment for once and take an aimless drive in Marin County. And hey, there’s an Apple Store in Corte Madera, so what could it hurt to stop by and take a quick look around?

It ended with a descent into craven tablet-computing madness. In a shockingly short time, I’ve gone from thinking that tablet computers were a vulgar, unnecessary display of conspicuous consumption, to thinking that tablet computers were a vulgar, unnecessary display of conspicuous consumption and I want all of them.

I’ve already conceded that the iPad for me was a case of blowing a wad of cash on a gadget, in the hope that it would create a use for itself at some point down the road. And it did; even if I weren’t still calling myself an iOS developer, I’d still have gotten enough use out of the iPad to justify the expense. But what happened today is much worse.

iPhone Gigante Más Pequeño

I should explain something first: my reaction to the Apple “event” where they announced the iPad mini. Now, I’m about as shameless a whore for Apple products as anybody you’ll find outside of Daring Fireball, but I still saw nothing in that batch of announcements that interested me in the slightest. New MacBook Pro with “retina” display? Please; they hit exactly the sweet spot of size vs. power with the Air. New iPad with a thinner “lightning” connector and faster processor? I don’t think even Apple seriously expected anyone to consider an “upgrade” after just a few months.

And the iPad mini was just baffling. Who could it possibly appeal to? It’s smaller than the iPad, but not enough to be a recognizably different product, and not quite small enough to be really called “handheld.” It’s not as if the iPad is uncomfortably big and bulky anyway; in fact, it’s the perfect size for comics and magazines, and is actually a little bit too small for playing board games. And the iPad mini’s price is way outside the “impulse purchase” range of the Kindle or Nexus 7, and too expensive for an e-reader. It’s been getting stellar reviews pretty much across the board, but those are just tech reviewers buried under such a mountain of gadgetry that anything with Apple’s fit and finish is going to make a great first impression. It couldn’t possibly pass the real-world, practical test. It seemed like Apple had once again done the impossible: created an Apple product that I had absolutely no desire to buy.

Then I went into the Apple store and tried out the iPad mini, just for yuks. I don’t know what kind of gas they were pumping into the store, but whatever it is has the same effect on my brain chemistry as a nightclub singer has on a Tex Avery wolf. This is the perfect size. How was I ever able to imagine reading in bed with such a freakishly enormous iPad? I would totally read more with a tablet this size. And the display is so sharp and the text so clear that it doesn’t even need a higher-resolution display. We have always been at war with the 10-inch diagonal. Go away, old woman, I don’t care that you’ve been waiting in line; I just want to keep holding this and never stop.

I’m not a complete idiot; I left the store without buying anything. And even though I see an Apple Store the same way Angelina Jolie sees African orphanages, I don’t plan on buying another iOS device anytime soon. When they release version 2 of the mini, though, it’s going to be a tough time for me. I’ll admit to spending a few minutes in the car trying to envision scenarios where my bulky, cumbersome iPad 3 met with an unforeseen “accident.”

A Small-Ass Table

In what I’m sure was a completely coincidental bit of retail zoning, a new Microsoft Store happens to have opened up just a few doors down from the Apple Store in Corte Madera. I checked in to take a look at the new Surface tablet, and came out almost as surprised as I was by the iPad mini.

Another thing I should clear up: even though I’ve gone all-in on Apple, I’m actually kind of excited by Windows 8. The last time I considered Windows to be anything more than a necessary evil was back in 1995. (Yeah, I genuinely got excited over Windows 95. Come on, you could put spaces in filenames. Just try and tell me that wasn’t cool). But Windows 8 seems to be a case of Microsoft getting it right to such a degree that I’m surprised it’s from Microsoft.

I installed the upgrade on my Windows machine, and I’ve spent the time since trying to come up with excuses to use it more often. (Until now, it’s just been The Machine That Runs Steam). As a desktop OS, it’s reassuringly Vista-like in its clunkiness: stuff takes more clicks than it should, basic system settings are confusingly split into several different places, it’s hard to tell what’s clickable vs what’s decorative, and the UI on the whole requires a tutorial or cheat sheet to be usable at all since there’s absolutely no discoverability. Basically, there’s enough wrong with it to remind me that I haven’t fallen into some alternate dimension where Microsoft is no longer Microsoft.

But as a consistent, cross-device UI, it’s outstanding. Text is big, legible, and just plain looks cool. All the hassle of writing WPF applications that required mark-up and resolution independence finally seems to have paid off. The “Live Tiles” are such an ingenious way of combining widgets, icons, and full-blown apps that it seems obvious in retrospect, and it makes other OS developers look like they’ve been dragging their feet for not coming up with it sooner. It’s functional and genuinely fun to use, from desktop to touch screen to tablet to phone to video game console.

And it’s instantly recognizable — even if, without “Metro,” there’s no good name for it — and unique, which is something that Android still hasn’t been able to accomplish. I was in a Best Buy the other day, and I legitimately couldn’t tell the difference between a bunch of Android phones and the iPhone 5 at the end. And speaking of phones: I like the iPhone 5, but I got one more as an obligation than anything else; it was as much to get rid of AT&T as it was to get any of the new stuff in iOS. But I’ve looked at the new HTC Windows Phone and for the first time have legitimately considered jumping ship.

So I think that Microsoft has been doing a remarkably good job with the new Windows push, overall. And the Surface fits in with that. The commercial still offends me deeply on a philosophical level, considering what a fan I am of basic human dignity. But still, it does what it’s supposed to do. Above everything else, it sets the Surface apart from the iPad and Android tablets. It shows you how you’re supposed to use it — not tossing it to your pals or dancing with it in a fountain, but then again, why not? — propped up and attached to the keyboard cover, like a super-portable laptop. It’s still Windows, and the implication is that you can do the stuff you use Windows for, instead of just putting your feet up on a table and reading the New York Times.

Finally seeing it in person, I was surprised by how well it works. It’s much lighter and less bulky than it looks in pictures. The display is bright and clear. The OS functions pretty much the same with a touch screen as it does with a mouse and keyboard. Having already learned the gestures on the desktop, I didn’t have any problem figuring out how to use the tablet, and there was absolutely no sense of translation between desktop/laptop OS and tablet OS as there is between OS X and iOS.

I had very little success getting efficient with the touch cover keyboard, but I think the touch cover is as much branding as it is technology. It’s undeniably neat that it’s functional at all, but I think its bigger purpose is to remind people that they have a keyboard and trackpad with them wherever they go — This is a Windows machine. Look, you can run Office, it whispers in your choice of colors. The “type cover” adds an imperceptibly small amount of height and is infinitely more functional; it’s a super-slim laptop keyboard.

It feels ridiculous to hold the Surface vertically, but again, I get the impression that’s an essential part of the branding. They want you thinking, This is not a magazine or newspaper. This is like a laptop computer. I know this. I can be productive with this, as I laugh at the iPad users lounging on their couches swiping through vacation photos.

I’m not actually tempted to get one, but I will say that it’s the first product that’s gotten me to look up from my iPad and pay any attention at all. Still: there’s absolutely no way I’d actually buy one.

And yet… this is the Windows RT version. The “Surface Pro” is coming out sometime next year, and it’s even more shameless about being a full-fledged notebook computer in tablet form. It’ll be thicker and heavier. I’ve heard it’ll actually come with a fan (other than Steve Ballmer, obviously). And it’ll undoubtedly be considerably more expensive than the current Surface tablet, which is already on the higher end of the iPad price range. But: it’ll come with a pressure-sensitive stylus. And I’ve wanted a tablet PC for as long as I can remember, as soon as I learned that they were even a thing. In fact, three years ago I described my ideal tablet computer, and it’s pretty much point-by-point the feature list of the Surface Pro.

Except for the “Apple makes it” part. And that’s the part that’s got me wondering what’s happened to the world. I just spent a considerable amount of time praising Microsoft. I defended an ad that has an old couple kissing and a tablet-covered dude doing the robot. I actually considered getting a third iOS device, for a not-insignificant number of seconds. The Apple Store seemed austere, black-and-white, business-like; while the Microsoft Store was colorful and fun. I’m actually envisioning a world where I might have a Windows device as my main computer. A time will come — but I must not and cannot think! Let me pray that, if I do not survive this blog post, my executors may put caution before audacity and see that it meets no other eye.

They Don’t Love You Like Google Loves You

Maps in iOS 6 bring about the downfall of western civilization, and my disillusionment with tech commentary continues.

[Image: Apple Maps’ melted, Dalí-esque 3D render of San Francisco’s Japan Center]
Just days after the entire developed world sank into a depressive ennui due to Apple’s boring new smart phone, society was rocked to its foundations by the unmitigated disaster that is iOS 6’s new Google-free Maps app. Drivers unwittingly plunged their cars into the sea. Planes over Ireland crashed into nearby farms due to mislabeled icons. College students, long dependent on their iPhones to find their way around campus from day to day, were faced with a featureless blob of unlabeled buildings and had no option but to lie down on the grass and sob. Huge swaths of Scotland were covered with an impenetrable fog, and the Brooklyn Bridge collapsed.

Throughout the entire ordeal, Tim Cook only stopped giving the world the middle finger long enough to go on Saturday Night Live and rip up a picture of Steve Jobs. Jobs’s only recourse was to haunt the homes and workplaces of thousands of bloggers, commenters, and Twitter users, moaning “You’re the only one who truly understands what I wanted. Avenge me!”

At least, that’s the way I heard it. You want proof? It’s right there in the video, the one where they say “The Statue of Liberty? Gone!” while showing a picture of the Statue of Liberty. (Psst… hey, The Verge people — it’s that green thing in the middle of that star-shaped island). You think just because it’s “journalism” they have to have sources to show that it’s a serious, widespread problem? Check it, Jack: a tumblr full of wacky Apple maps mishaps.

And no, it doesn’t matter that the vast majority of those are complaints about the 3D Flyover feature, which was universally acknowledged as being a “neat but practically useless” feature of the maps app as soon as it was released, because shut up that’s why.

Of course, since I’m a relentless Apple apologist, I’m focused, Zapruder-like, on one tiny six-second segment of that three-minute-long video: the part that says “For walking and driving, the app is pretty problem free.” And I’m completely ignoring the bulk of the video, which shows incontrovertible evidence that not everything is 3D modeled and lots of things end up looking kind of wavy.

Sarcasm (mostly) aside, my problem with this isn’t “oh no, people are picking on Apple!” My problem is that the people who are supposed to be authorities on tech — and to be clear, it’s not just The Verge, by a long shot — keep spinning the most shallow observations into sweeping, over-arching narratives. (And no, I haven’t seen a single Verge post about Apple in the past week that’s neglected to find a way to link to that 73-degrees-Apple-is-timid post).

The tech journalists are the ones shaping public opinion, so I don’t think it’s unreasonable to expect an attention span of longer than a week and a short-term memory of longer than a year. Until I see it, I’m going to hold them responsible every time I read something dumb on the internet.

To be fair, even though the video just says “Apple’s supposedly been working on its version of Maps for 5 years, and it’s resulted in an app that’s inferior to what was there before,” and leaves it at that, the article does mention that Google’s data has been public for 7 years. And points out that the data gets refined with the help of location data from millions of iPhones and Android devices.

But why make it sound as if the decision to break free from dependence on Google was an arbitrary one on Apple’s part? By all accounts, Jobs had a preternatural grudge against Google for releasing Android. But it’s not as if the Maps app on iOS 5 and earlier was feature-equivalent to Google Maps on Android, and rolling their own was a completely petty and spiteful decision on Apple’s part. Android’s had turn-by-turn directions for a while now, and there were no signs that it was ever coming to the Google-driven app on iOS.

Was that a case of Google holding out, or Apple not bothering with it because they knew they had their own version in the works? I certainly don’t know — it’s the kind of thing it’d be neat for actual tech journalists to explain — but it ultimately doesn’t matter. The licensing deal with Google ran out, so Apple’s options were to reassert their dependency on their largest competitor, or to launch with what they had.

And incidentally, whenever someone says “Steve Jobs would never have allowed something this half-assed to be released!” it’s the tech journalists’ responsibility to remind them that the iPhone launched without an SDK and nothing but Apple’s assurance that Web apps were The Future. Or that Jobs had no problem releasing the original iPhone without support for Flash video, even though there was an outcry that Flash was crucial to the user experience.

I installed iOS 6 on the iPad and tried out a few practical searches. It found everything I needed, and it actually gave me more relevant information than I remember the Google version giving me, since I was looking for businesses, and Yelp automatically came up with business hours. Of course, my experience means very little, since I happen to live in the one part of the world that’s going to be most aggressively tested by Silicon Valley companies. I have little doubt that Europe and Asia are going to have a harder time of it, and obviously they’re not markets to be ignored. But it’s not a one-size-fits-all problem, so it’s silly to treat it like one.

Apple has no problem calling Siri a beta, so they probably should’ve called Maps a beta as well. Maps is a huge part of why people use smart phones, so it’d be foolish to imply that serious inaccuracies are no big deal. Regardless, it’ll work well enough in a lot of cases for most Americans, and in the cases where it doesn’t, the web version of Google Maps is still available (and you can set up a link on the home screen with its own icon, even). Maybe Google and Apple will reach enough of a détente for a third-party Google Maps app to get released. Maybe it’ll finally bring turn-by-turn directions, or Apple will even allow third-party apps to be default handlers for links!

Until then, maybe we can stop with the melodrama and the hyperbole, and just appreciate Apple Maps as version 1.0 mapping software with a neat extra feature of peeking into an alternate reality where Japantown in San Francisco has been overtaken by giant spiders.

Everything is amazing and nobody’s insightful

Tech writers are disillusioned with the iPhone 5, and I’m getting disillusioned with tech writers.

[Image: “Ouroboros press”]
I understand that if you’re writing about technology and/or gadgetry, a significant part of your job is taking a bunch of product announcements and reviews, and then fitting them into an easily-digestible, all-encompassing narrative. Usually, though, there’s at least an attempt to base that narrative on reality, “ripped from today’s headlines” as it were. Lately, it seems like writers are content with a story that’s loosely based on actual events.

For the week or so building up to the iPhone 5 launch and the days after, the narrative has been simple: “The iPhone 5 is boring.” Writing for Wired, Mat Honan says, essentially, that it’s boring by design. And really, fair enough. Take Apple’s own marketing, remove the talking heads on white backgrounds, remove the hyperbole, and give it a critical spin, and you’ll end up with basically that: they’ve taken the same cell phone that people have already been going apeshit over for the past five years, and they’ve made it incrementally better.

Take that to its trollish extreme, and you have Dan Lyons, who wrote for Newsweek and created the satirical “Fake Steve” blog in which he pretended to be Steve Jobs. He wrote a “provocative” take on Apple’s past year, including the iPhone 5 announcement, for BBC News, in which he spends a couple of paragraphs reminding us why we should care about his opinion (it’s because he wrote a blog in which he was “Fake Steve”), then mentions that he dropped the blog out of respect for Jobs’s failing health, and then invokes Jobs’s memory several times. It’s an “analysis” of Apple that’s as shallow and tired as you can possibly get without actually typing Micro$oft — he actually uses the word “fanboys.”

We can all acknowledge that, give him the attention he needs, and then move on; there’s absolutely nothing there that wouldn’t get him laughed off of an internet message board. Lyons doesn’t even have the “I speak for Steve Jobs” thing going for him, since everybody has an opinion on how things would be different if Jobs were still in charge.

What’s more troubling to me is seeing writers who are usually worth reading instead take a similar approach: building a story that’s driven mostly by what other people are writing. Any idiot can regurgitate “news” items and rearrange them into a cursory non-analysis. (And that’s worth exactly as much as I got paid for writing this, which is zero, in case you couldn’t tell already). Is it too much to ask for insight? Or at least, to wait and see what the actual popular consensus is before making a declaration of what popular opinion should be?

If It Ain’t Broke, Fix It Anyway Because It Has Grown Tiresome to Me

On The Verge, Dieter Bohn found a perfect analogy for the iPhone in its own Weather app icon. He turned Honan’s piece from a blog post into the “prevailing opinion” about the announcement, then took Honan’s reasonable but back-handed compliment and turned it into an accusation: the iPhone isn’t boring but timid. Sure, the hardware’s fine, but whatever: where Apple has failed is in showing no willingness to experiment with the core operating system or UI.

The mind-numbingly tedious drudgery of having to close multiple notifications under iOS (which, incidentally, I’ve never once noticed as a problem) proves that Apple’s entire philosophy is a lie. You’ve got a reputation for “sweating the details,” Apple? Really? Then how can you possibly explain this?! — as he thrusts a phone full of email notifications into the face of a sheepish Tim Cook, while Jobs’s ghost shakes his head sadly.

I honestly don’t want to be too harsh with any of the writers on The Verge, since I visit the site multiple times daily, and I really like their approach a lot. But it doesn’t take much insight to recognize that interface consistency isn’t just the dominant driving factor of UI design; it’s been central to Apple’s philosophy since the introduction of the Mac. We’ve been using “version 10” of the Mac OS for 10 years now, and while the appearance and the underlying hardware have changed dramatically, the core functionality is largely the same. Intentionally. It’s only with the still-new Mountain Lion that Apple’s made changes to how things work on a fundamental level — and, in my opinion, it hasn’t been all that successful. (I don’t understand all the people whining about faux leather while saying relatively little about changing the entire structure of document management for the worse).

On top of that, though, there’s the fact that The Verge has spent at least a month covering the Apple v. Samsung trial, in which Apple was spending a ton of time, effort, and presumably money to defend the UI that Bohn claims needs a dramatic overhaul. Yes, Microsoft has done considerable work to dramatically rethink the smart phone UI. That’s how Apple wants it. They spent a month saying, “this is ours, we made it, we like it, stop copying it.” Could it use some refinement? Of course. It always can, and some of those good ideas will come from other attempts at inventing a new cell phone OS. Does it need a dramatic re-imagining? No, unless you’re irrationally obsessed with novelty for its own sake, as a side effect of being steeped in cell phone coverage 24/7 to the point of exhaustion.

The Audacity of Swatch

Speaking of that, there’s Nilay Patel’s write-up of the new iPod Nano. Again, I think Patel’s stuff is great in general. But here, he carries on the “boring but actually it’s timid” idea by linking to Bohn’s article, and then goes on to build a story about what it says about Apple in general. Essentially, they’ve become The Establishment, too afraid of change to take any risks. With a product that’s changed dramatically in design in just about every iteration since the original.

“But that’s the old market, and the old way.” In this telling, Apple isn’t supposed to profit from the planned-obsolescence-and-impulse-purchase cycle — which is news to all of us who have now become conditioned to buy a new cell phone every 2 years and a new laptop every 3 or 4 — but to pioneer new markets. The previous iteration of the nano, the story goes, heralded a future of wearable computing: it could’ve been the harbinger of a bold new market for a more forward-thinking Apple. Wristwatches.

Let’s ignore the fact that Patel himself acknowledges that the nano wasn’t a very good watch in the first place. What about the fact that smart phones pretty much killed the entire wristwatch business? I’m about as far from being fashionably hip as you can get, but I still get considerable exposure to what’s actually popular just by virtue of living in San Francisco. And I don’t know anyone who still wears a watch. It’s been at least five years since anyone’s been able to ask for the time and not have to wait for everyone to pull their phones out of their pockets or handbags. Why would Apple go all-in on a market that they themselves helped destroy?

(Incidentally, Patel quotes watch band designer Scott Wilson as saying “The nano in that form factor gave me a reason to have three iOS devices on my body.” I can think of the iPhone and the iPod-nano-with-watchband; neither Wilson nor Patel makes it explicit what the third device is. And now I’m afraid to ask, because I’m not sure I want to know exactly what this third device is, or where a person would enjoy sticking it).

Saying that the MP3 market is dead fails to acknowledge what killed it: that functionality, along with that of the point-and-shoot camera, has moved away from a dedicated device and towards the smartphone. Smartphones are expensive, even with a contract, and the more info we put onto them, the more they become irreplaceable. There’s still a market for a smaller, simpler, and relatively inexpensive MP3 player. There’s a clue to that market on Apple’s own marketing page, and the most prominent icon on its home screen: “Fitness.” Joggers and people who work out — at least from what I’ve heard, since I have even less familiarity with the world of exercise than I do with people who still wear wristwatches — want a music player they can take for a run or take to the gym without worrying too much if it gets lost or broken. They’ll get more use out of that than from a too-large wristwatch that has to be constantly woken from sleep and needs a headphone wire running to your wrist if you want to listen to music.

That’s where the new market is, ripe for Apple to come in and dominate: stuff like the Fitbit. I don’t have the actual numbers, of course, and I don’t even have any way of getting them, but I can all but guarantee that Apple sold more of the iPod nano armbands than it ever did watchbands. And I imagine it’s the same philosophy that made them put a place for carrying straps on the new iPod touch: it’s not even that Apple doesn’t want to take risks with its flagship product, it’s that customers don’t want to take risks with the one device that handles all their communication with the world. For them, the iPod is an accessory.

Speaking of Consistency

But if you’re going to be making up stories, you should at least try to be consistent with them. On Slate, Farhad Manjoo has some serious issues with the new dock connector. He repeats the idea that the new iPhone is boring, but he uses the magic of sarcasm to make his point extra super clear. The problem is that so many details about the new phone leaked out weeks before release. By the time of the actual announcement, the world had already seen everything and stopped caring.

Got that? Apple’s problem is that it’s got to keep a tighter lid on its plans. Classic Apple, going blabbing about everything all over the press, flooding the market with product announcements. It’s boring everyone!

He’s bored by all the leaked information and the lack of any big bombshells in Apple’s announcement. Except for the big bombshell of the new dock connector. It’s pretty boring but also very impressive. It doesn’t take any risks but completely and unnecessarily changes the dock connector, destroying an entire industry of accessories. Apple has a long history of invalidating what it deems outdated technology, but this is different. Manjoo has to get a new alarm clock.

Also, the new phone is remarkably thin and light, “the thinnest and lightest iPhone ever made, and the difference is palpable.” But how could Apple possibly justify changing everything just for the sake of this new, thinner dock connector?

Back on The Verge, Vlad Savov describes all the leaks that bored Manjoo, and he mentions how embarrassing they must be for Tim Cook. He says that the problem is all the points of potential leaks in the supply chain, which have “drilled holes in Apple’s mystique.” In the same article, Savov links to The Verge’s own post about every detail that was leaked ahead of time.

Isn’t it a little disingenuous to be on a site that publishes front-page posts of pictures of new iPhone cases, a detailed report of the new dock connector, and a compilation of all the rumors and leaks to date, and then comment on the unprecedented demand for leaked information? It seems a little like prom committee chairman Edward Scissorhands lecturing everyone on their failure at putting up all the balloons.

And of course, on the first night of pre-orders for the boring new iPhone that nobody’s interested in, Apple’s online store was overloaded, and the phone sold out of its initial order within two hours.

On the topic of pots, kettles, and their relative blackness: I wasn’t that interested in the new iPhone, and I still chomped on the pre-order like a starving bass. I was much, much, much more excited about finally ditching AT&T than I was about the device itself. (So eager to get rid of AT&T that I’m willing to run into the arms of a company that’s no doubt almost as bad). Now that Apple’s not talking about magic, I can take their advertising at face value: I’m pretty confident that it is the best iPhone they’ve ever made. I’ve got an iPhone, and I like it, so I’ll get a better one.

What does it say about the state of gadget obsession when “I’m going to buy a new expensive device every two years even though I don’t technically need it” comes across as the most pragmatic view?

I can still remember when I first saw Engadget, and I thought the concept was absurd. A whole ongoing blog devoted solely to gadgets and tech? Is that really necessary? Then, of course, I got hooked on it, and started following it and now The Verge and quite a few Apple-centric sites. If I’ve reached a point of gadget saturation just reading the stuff, what does it do to the folks having to write about it? It seems like it’s creating a self-perpetuating cycle of novelty for its own sake, which then drives commentary about this bizarre fixation on novelty for its own sake. You can’t even say “maybe just step away from it all for a bit”; it has to be a stunt, completely giving up the internet for an entire year while still writing for a technology-oriented site.

Whatever the case, it’d probably be a good idea to step back a bit. I’ll start doing that just as soon as I get my sweet new extra-long phone next week.

Spontaneous Obsolescence

Dozens of over-privileged gadget hounds have suddenly found themselves with outdated electronic equipment. Won’t you please help?

As an electronic gadget obsessive with more disposable income than common sense, I’m well acquainted with the trials of being an “early adopter.” (That’s a euphemism for the older term, “impatient doof.”) We buy overpriced things, we watch them go down in price and up in specs and features, we sell them or donate them once they’re four or five years old, we buy a new one. It’s all part of the Great Circle of Life.

Rarely, though, is this delicate ecosystem hit with a cataclysm as widespread as the one we’ve seen this week. In just a few short days, I went from being blessed with pristine examples of consumerism at its finest, to being burdened with obsolete relics. It disgusts me even to look at them.

What’s worst is that every one of the new models fixes the single most annoying thing about its predecessor. For instance:

We all knew that the iPhone 4 was coming out, so it wasn’t a surprise. (Well, it wasn’t a surprise to most of us — apparently Apple and AT&T didn’t get the memo). This version has a faster processor and a much-improved camera, which addresses my two biggest complaints about the old version: now I don’t feel compelled to take a point-and-shoot everywhere. So I set aside some money — that I would’ve wasted on charity or something frivolous like that — and was prepared to make an informed purchase.

But then E3 happened! A new Xbox 360! Styled after the PS3, with special dust-collecting coating and barely-sensitive touch-activated not-buttons! And it, theoretically, fixes the two biggest problems with the old Xbox 360: catastrophic system failure from poor ventilation, and the fact that turning on the console is like having a leaf blower pressed against your head while standing on a runway at LAX.

And then: A Nintendo 3DS! Which fixes the biggest problem with the older Nintendo DS: that, err, it didn’t have 3D. Okay, that one is kind of weak, but I still want one after hearing everybody on the Twitter going nuts over it.

But then out of nowhere: A new Mac Mini! I’ve spent the past couple of years trying to piece together a decent home theater PC using the enormous, Brezhnev-era Mac Mini; an external drive; a USB TV tuner; an assortment of remote control apps; a Microsoft IR receiver; various DVI-to-HDMI adapters; and snot. Now Apple has said, “Oh right! HDMI has existed for several years now!” and upped the hard drive size and built an HDMI port right into the back, making it an HTPC right out of the box. (And by the sound of it, fixing the problem the Mac has with overscan/underscan on my TV). It’s still overpriced to use as just a home theater PC, but it’s the best version of the mini that Apple has made yet.

And of course, the Microsoft Kinect business, which solves the problem of “I don’t look stupid enough while playing videogames.”

Now looking on eBay at all the listings of used Mac minis and Xbox 360s is positively heartbreaking; you can almost hear Sarah McLachlan wailing in the background as you scroll past one “0 bids” after another. And now I actually feel kind of gross for writing all this, so I’ll start browsing elsewhere.

Helpful Tips for Adobe

I’ve been getting increasingly annoyed with Adobe’s attempts to spin the whole Flash-on-iPhone OS story.

In a blog post on Tuesday, Mike Chambers of Adobe announced that the company was abandoning its plan to support creation of iPhone OS apps using Flash CS5. The post is less emotional than Lee Brimelow’s “Go screw yourself, Apple” but every bit as whiny.

All the angles on the issue have been covered extensively on tech blogs, in particular Daring Fireball, so you won’t see any particularly novel insight here. But I haven’t yet seen them all gathered in one place. Apple has gotten a lot of criticism across the internet — much of it entirely deserved — for its App Store approval policies and the closed system approach it’s taking with the iPhone OS. And it bugs me to see Adobe employees — whether representing the company as with Chambers’s post, or not as with Brimelow’s — getting so much traction by taking advantage of that ill will, when Adobe doesn’t have a leg to stand on.

After citing various stories about Apple’s rejection policies and an even-handed piece on Slate called Apple Wants to Own You, Chambers goes on to say that Adobe will be shifting its focus with Flash CS5 onto the Android platform. Here are a few helpful tips for Adobe that might make this next choice of platform go more smoothly:

1. Don’t promise something to your customers unless you’re sure you can deliver.

Adobe’s claiming that Apple suddenly introduced a new clause to the developer program license that blew all their hard work out of the water. How could they possibly have predicted that Apple would so cruelly impose a last-minute ban of Flash on the iPhone suddenly out of nowhere? I mean, sure, the iPhone has been out for three years now and it’s never allowed a Flash player, but with Apple’s draconian secrecy, who knows why that is? Okay, fine, the CEO of the company has repeatedly said that it’s for performance reasons and battery life, but that’s just spin. Adobe had no way of predicting that the company that’s refused to allow Flash on their devices would suddenly decide not to allow a program that runs Flash. It’s all Apple’s fault!

2. Have a chat with Google and Motorola first.

There’s no way that a small start-up like Adobe could’ve communicated with industry giant Apple, either. Who even knows those guys’ email addresses? Plus, they’re scary: Gizmodo made a whole post about the guy whose book-reading app was rejected for containing adult material. Who’s to say that the exact same thing wouldn’t happen to a multinational corporation proposing to create a new development environment for the platform? Unlike Apple, Motorola and Google have pledged complete openness, and they won’t have any qualms about performance or security on their branded Android OS devices. You probably don’t even need to ask first.

3. Try running your software on the device in question.

Apple’s reasons for refusing Flash are so arcane and mysterious that nobody can figure them out. Even though it’s been said repeatedly, by multiple sources both inside and outside Apple, that Flash is a hit on performance and battery life, that’s just idle speculation. Better to try to sneak something in instead of actually trying to find the problems with interpreted code and non-standard video playback and getting it to run acceptably.

4. Don’t use a Mac for development.

Because if you want to get anything done, you’ll have to use Adobe software, since Adobe has near-total market dominance in every area of production. And Adobe software runs like shit on a Mac. Mr. Brimelow, I suggest that your talk about the long relationship Apple and Adobe have had with each other would be more convincing if you had a dramatic backdrop, or a YouTube video playing in the background. For the backdrop, you’ll want to use Photoshop CS4, the first version that supports a 64-bit OS, which came out a year and a half after OS X converted to 64-bit. And for the YouTube video, be sure to speak up, because playing anything with the Flash video player on a Mac will cause your computer’s fan to kick into overdrive from the increased processor load.

5. Consider what “cross-platform” means for a platform built entirely around its unique identity.

If the blog posts from employees weren’t enough to convince you that Adobe’s committed to cross-platform development, then running any piece of Adobe software — especially any AIR app — should do the trick. Using Photoshop or Flash on a Mac means that you get to give up everything that made you choose the Mac OS in the first place. The closest they’ll come to “Mac look and feel” is begrudgingly including a “Use OS Dialog” button on their file dialog boxes. But the iPhone, even more than the Mac, is specifically branded as a device that wins not on features, but on the OS.

Chambers makes a point of saying “While it appears that Apple may selectively enforce the terms, it is our belief that Apple will enforce those terms as they apply to content created with Flash CS5.” Or in other words, Apple will allow Unity, .NET, et al., but is singling out Flash/Adobe to screw them over specifically. Adobe’s complaining about Apple not giving them fair treatment is a lot like a polygamist accusing one of his wives of cheating.

6. Have someone define “closed system” for you.

Apple already covered this one beautifully with its terse and awesome response to Chambers’s post:

“Someone has it backwards–it is HTML5, CSS, JavaScript, and H.264 (all supported by the iPhone and iPad) that are open and standard, while Adobe’s Flash is closed and proprietary,” said spokeswoman Trudy Muller in a statement.

This is the most galling part of the whole thing, to me: Adobe’s desperately grabbing on to Cory Doctorow’s coattails and waving the flag of intellectual freedom, while simultaneously suggesting that the iPhone OS is gimped because Flash has something like 98% market saturation with internet video.

The best explanation I’ve seen is from Louis Gerbarg on his blog: allowing Flash, or even iPhone-targeted Flash, onto the iPhone would mean Apple effectively turning its OS development cycles over to Adobe’s engineers. It’s the same reason they’re so uptight about developers using private frameworks: if they change something with an OS update, the app breaks, and customers complain to Apple. Not the developers.

Adobe’s essentially going into a store, handing the owner a big black box, refusing to let the owner see what’s inside, and then complaining about freedom and openness when the owner refuses to sell it.

7. Learn to appreciate the monopoly you’ve already got.

It’s not particularly insightful to point out that the environment Apple’s created with the iPhone OS is very similar to the environment that game developers have had to deal with for years. Game console manufacturers have very tight restrictions on what they will and won’t allow to run on their devices — if you think that Apple’s approval process is complicated and draconian, you should go through Nintendo’s technical certification process sometime. (Note that this isn’t a complaint: the certification process means it’s much, much harder to find a buggy game that crashes your system or runs poorly on your particular console configuration).

And the lesson in game development is that content is more of an investment than code. (At least, code written in a particular language. And that’s partly my programmer bias showing through, where it’s a point of pride that once you’ve learned how to do something in one development environment, it’s much easier to do the same thing in a different one). Art assets will port from platform to platform, even if the code base doesn’t. [More on this in point number 10.] I have yet to see a game company that didn’t use Photoshop to generate art assets, and most also use Illustrator or After Effects or any number of other Adobe products.

8. Come out and acknowledge who multi-platform development benefits, exactly.

There is an ease-of-use and familiarity benefit to using Flash. But Adobe reps hardly ever mention that. (As someone who’s developed games using Flash and using Cocoa, I can kind of understand why Adobe wouldn’t push the “ease-of-use” or “predictability” claims where Flash is concerned). Instead they talk about cross-platform capability. An independent developer might be drawn to Flash for, say, making a game because it’s an environment he already knows. A publisher would be drawn to Flash for being able to make the same game for the iPhone, Android, Web, and anything else.

And this makes it a little bit like trying to explain to poor people why they shouldn’t vote Republican: they don’t care about you. Adobe isn’t going to make such a big stink, or for that matter build a campaign around a new feature in one of their flagship products, for the indie developer who’s going to blow a thousand bucks on the new Creative Suite. Adobe wants to get publishers to buy site licenses. And publishers want to make something once and then get it onto as many platforms as possible, because for a publisher, development time is more expensive than hardware purchases, testing, and customer support. Smaller developers will quickly reach the point where having their product on multiple platforms is costing more than the revenue it’s generating.

So when Chambers says:

The cool web game that you build can easily be targeted and deployed to multiple platforms and devices. However, this is the exact opposite of what Apple wants. They want to tie developers down to their platform, and restrict their options to make it difficult for developers to target other platforms.

what he means is: Apple includes a free development environment on their machines, to encourage people to buy their hardware. It comes complete with documentation, visual design tools, and built-in animation and layering libraries that make it relatively easy to achieve Flash-like results using native code. However, this is the exact opposite of what Adobe wants. They want to tie developers to Flash, to ensure that they have a proprietary development environment that’s most appealing to larger publishers, and restrict their options to optimize the runtime for any particular platform, guaranteeing that it runs equally badly everywhere.

The “cool web game” bit is there to make it sound like the guy sitting in his bedroom who’s just finished his cool Bejeweled-clone-with-RPG-elements can just hit a button in Flash CS5 and suddenly be rolling in heaps of money from App Store sales. And to the smaller, independent developers who would like to try releasing their games for multiple platforms: learn Objective-C. It’s not that difficult, and you’ll have developed another skill. That seems more valuable to me than getting upset that a game designed for a mouse and keyboard on the internet won’t port to a touch-based cell phone without your intervention and a little bit of effort.

9. Make a genuine attempt at an open system.

If Adobe really is all about content creation, and if they’re going to insist on jumping on the anti-Apple “closed system” bandwagon, why do it for an inherently closed system? They’ve got one development kit that requires a plug-in and forces all its content into a window on a webpage, and they’ve got another development kit that works with HTML and PHP that nobody uses. Why not put their content creation software expertise to work creating stuff that’s genuinely based on open standards?

There are tons of great HTML 5 demos getting passed around the internet, and they’re all done with a text editor. (And, most likely, Photoshop). Why not take the Flash front-end, since people like it for whatever reason, and let it spit out HTML 5, CSS, and JavaScript? ActionScript is already a bastardized sub/superset of JavaScript. HTML 5 has a canvas element and layering. There’s a browser war going on, where everyone’s trying to come up with the fastest JavaScript interpreter; only Adobe can make Flash Player plugins run faster, and they don’t have a great track record at that. Flash that doesn’t require Flash Player would be huge. No doubt Flash has some power-user features I’m not familiar with, and of course Flash Video is a whole different topic, but I’ve never done anything with Flash that couldn’t be done according to the HTML 5 spec and some clever JavaScript.

10. Stop the damn whining already.

Brimelow closed comments on his post to avoid “the Cupertino Spam bots,” and Chambers warned that non-constructive comments such as “Flash SUXXORS!” would be deleted. Because, as everyone on the internet knows, anyone who defends Apple for any reason, ever, is automatically a drooling Apple fanboy who believes Steve Jobs can do no wrong.

Which means, I guess, that everyone in the tech industry is 12 years old.

What these guys need to understand is that complaining about Adobe’s closed, proprietary system doesn’t automatically make Apple’s good, and vice versa. (Although it’s a big point in Apple’s favor that they don’t try to claim that their system isn’t closed). There are definitely problems with iPhone development.

The restriction on interpreted code does indeed suck, and is the biggest problem that Apple needs to find a solution for. When I mentioned that game developers have spent years learning how to port games to different consoles, I didn’t mention that the key to that is often a scripting language, like Lua. That allows a big chunk of code to be included in the portable game “content:” tailor the engine specifically to each console, but let the game logic stay fixed. (In theory, anyway). If Apple would just have a set of approved scripting languages — instead of just JavaScript via WebKit — and include them with the OS, it would open up a huge number of possibilities. The appeal of Lua on a mobile phone is even more evident than on a PC: it’s tiny, relatively efficient, and too simple and general-purpose to cause many problems when the underlying OS gets updated.

(I’d be remiss if I didn’t mention again that an iPad-native equivalent of HyperCard would be sweet. It could even use the Keynote front-end and run everything with WebKit. If you need a consultant on the project, Apple, let me know).

The other problem is the lack of transparency in the approval process. I mentioned that the certification requirements for consoles are a lot more complicated than those for the App Store; the advantage, though, is that they’re very explicit. You can and will still get surprised by a rejection, but a lot of the more obscure problems are solved when there is a huge list of requirements and developers are forced to test everything.

As for the other objections that are so often brought up, they seem reasonable enough to me. Yes, the state of file management on the iPad is really terrible right now, but I’m confident it’ll improve. Sure, Apple can reject an app for “duplicating functionality” of one of its built-in apps, but that situation is fluctuating (witness their support for VOIP apps like Skype, and browsers like Opera Mini) and the core apps are functional enough anyway. (Rejecting an app for the “pinch to preview a stack of pictures” functionality is pure bullshit, though).

And Apple can and does reject apps based on content alone. But as John Gruber pointed out, Apple’s still selling a brand as much as a platform. That’s the fundamental philosophical difference between the Android model (and Adobe’s whining) and the iPhone model: Android is selling you on the idea that you can run anything, Apple is selling you on the idea that you can run anything good. That’s why it’s a good thing that both platforms are available to both developers and customers. If you want a general-purpose phone that can run anything you throw at it, including ports of web games, then get an Android. If you want only the stuff that’s been specifically tailored to run well under the iPhone OS, then get the iPhone.

iPerCard

The time is right for my favorite program to make a comeback.


Image of the HyperCard home stack is from the C.V. page of Robin Siberling

One of the things that gets my nostalgia fired up like no other is HyperCard, which I consider to be one of the greatest pieces of software ever made for a personal computer. No exaggeration. Neck-and-neck with Photoshop in terms of significance, as far as I’m concerned, and definitely more mind-altering in terms of thinking about how computers work and what they can do.

I can and will go on at great length about how great HyperCard was, at any opportunity, but the design firm smackerel.net has put together a terrific retrospective called “When Multimedia was Black and White” that can do a better job than I could.

Ever since I made the switch back to Macs, I’ve been looking for a HyperCard successor. There’ve been several pretenders to the throne: SuperCard was the highest-profile, but it tried so hard to be a superset of HyperCard that it lost most of what made the original so cool. I keep checking in on Bento every time they announce a new release, but it’s becoming clear that a simple development environment just isn’t the market they’re after. Runtime Revolution has been trying for years to be a direct successor, but I’ve never been happy with it — it seems like an attempt to recapture the multimedia authoring platform that HyperCard turned into, instead of consumer-level software.

There are still pockets of loyal HyperCard fanatics out there, still holding on to floppies full of HC stacks like mattresses stuffed with Confederate money. But over the past few weeks, I’ve been hearing HyperCard mentioned more and more often, even by sane people: for instance, by John Gruber on his Daring Fireball site, in posts about entry-level development environments and in speculation about what the future of authoring multimedia content is going to be like.

What’s re-sparked interest in HyperCard is, of course, the iPad.

Why The Time To Act Is Now

Any fit of HyperCard nostalgia gets shut down pretty quickly when you look at it in a modern context and are forced to realize that it died for a reason. Bill Atkinson himself has acknowledged that its biggest flaw was focusing on local storage instead of taking advantage of networking. Most of what it could do was superseded by the World Wide Web. Later, Director and then Flash came in to take over the rest and become dominant.

But back then, there was little reason to believe (unless you were particularly prescient) that networking would become so huge outside of business settings. Personal computers were still very application-focused. Like the iPad is.

On OS X, a replacement HyperCard seems less necessary once you realize how many HyperCard remnants are scattered throughout the system. Much of HyperTalk remains in the AppleScript language. The rest of HyperCard’s coolest features live on in Xcode. Using Interface Builder, you can drag-and-drop interface elements, draw connections, and create a fully functional (if basic) UI without having to write any code. Databases are handled by the Core Data library, again without any code. But Xcode doesn’t (and definitely shouldn’t) exist on the iPad.

On the iPad, there’s no way to get apps except through the App Store, no way to develop them except through Xcode, and there are restrictions on what Apple will allow developers to do, for fear of disrupting a cellular network, or of ruining a user’s experience (and by extension, as Gruber points out, their brand). A controlled development environment could “fix” much of that, giving Apple the control they want while allowing developers to make things more powerful than web apps but with less investment than the iPhone OS SDK.

Flash turned into the dominant platform for multimedia authoring, and even it got corrupted into a glorified interface for delivering streaming video. But — and I don’t know if anyone’s mentioned this yet — the iPad doesn’t support Flash.

The excuse that Apple’s defenders are using for the lack of Flash support is that open standards like HTML 5 are preferable to a proprietary format owned by Adobe. Streaming video support is a whole other issue, but what about “basic” Flash: games, presentations, or even banner ads? We keep seeing demos that show what can be done using only HTML 5, CSS, WebGL and the like, but there’s still no consumer-level authoring platform that’s as straightforward to use as Flash. (Which might sound odd to anyone who’s used Flash, but dragging and dropping keyframes on a timeline is still more accessible than writing code in a text editor). If Apple wants that kind of content to take off, then there needs to be a better tool for people to create it.

People need something like HyperCard; it’ll sell more iPads. There’s a glut of special-purpose apps on the App Store for stuff like keeping wine lists or golf scores or, of course, making fart noises. That may help with the 150,000 available apps claim, but it doesn’t make the device itself seem more useful. Especially if you have to pay one or two dollars a pop instead of just making a simple app yourself. And Bento just isn’t cutting it; it tells you up-front exactly everything you can do with it, and if your application doesn’t fit right into that narrow selection of templates, then no sale.

And this last one is a subtler point, but it’s possibly the most important and least “selfish” reason to do it: it would help keep the iPad from being perceived as purely a device for consuming media, and let Apple reassert itself as the company that makes things for creative people. HyperCard was uniquely Apple, and it fit so perfectly into the Apple philosophy: giving you only the complexity you needed and only when you needed it, and making the act of creating things feel like fun instead of like work.

What’s more, people will get it now. Back in 1988, much of HyperCard was devoted to trying to explain what it was, exactly, and what kind of stuff you could do with it. In 2010, people just naturally understand hyperlinks and pages and multimedia.

How Apple Should Do It

I do believe that if an iPhone OS-centric successor to HyperCard were to happen, it would have to come from Apple. And that’s not because of my devotion to Apple or some naive or high-minded philosophy, but for very practical reasons:

  • It has to be ubiquitous. A huge part of HyperCard’s influence was due to the fact that it was included on every Mac. If you have to go looking for it, then you’re probably not going to bother. Especially if it’s not clear to you what it does, exactly.
  • It has to run interpreted code. It’d be a lot better if Apple just relaxed its rule against letting apps run interpreted code, so that there could be all kinds of rapid application development available. But until they do, any attempt at a HyperCard replacement would be hopelessly gimped and over-simplified.
  • It would compete with the App Store. Being able to create your own stacks (or whatever the new metaphor is) would be of little use unless you could distribute them. Even back before the internet took off, people traded stacks on floppies at User Group meetings and over BBSs. If Apple allowed that kind of distribution for one third-party developer, they’d have to do it for all of them, and I see that as being highly unlikely.
  • It’d probably have to use private frameworks. Any app that ran these stacks would have to be doing runtime configuration that I just don’t believe is possible with the public frameworks. (Or I just haven’t dug deep enough into them).
  • Apple is already so close to having a finished version.

That last point is what got me excited about the potential for HyperCard on the iPad. It began with the “Getting Started” document that’s the first thing you see when you start up Pages for the iPad.

The Pages tutorial is, in a word, totally sweet. You can move images around and see the text flow around them, insert graphs and charts and edit them in place, and assign borders and other effects using simplified pop-up windows. Use Pages on a desktop or laptop, and you get the sense you’re using a simplified, entry-level word processor. Use it with your fingers on an iPad, and you think this is how all page layout software is supposed to work.

Now, one of the iLife applications that seemed to have a ton of potential, but just kind of fizzled out, was iWeb. (I don’t know how successful it actually was, of course; I just know I’ve never seen a website that was created with it). It makes one hell of a first impression, but after using it for a while, you quickly run into its limitations. And you realize that it’s not the best way to make a dynamic website.

It would, however, be a fantastic way to make a HyperCard stack, or an iPad app. You can drag all of the page elements around and edit their properties in place. You can set up connections between buttons and other elements on the site. There’s already a notion of static content (e.g. a blog page) vs. dynamic content (e.g. individual posts). It’s got a media browser, as well as several built-in widgets.

iWeb has to do all kinds of tricks to get the pages you make with its editor to look as nice when they’re rendered in a browser. (The most obvious is how it has to generate big, web-unfriendly images when you rotate them or add borders and drop shadows). But it wouldn’t have to pull those tricks if the pages weren’t targeted at a browser at all, and were instead intended to be viewed in the app itself. Or even if they were targeted at Safari and WebKit only, instead of at any browser.

And again, while using iWeb-type controls to resize, rotate, and add effects to pictures is pretty cool with a mouse, it’s really cool when you’re dragging and pinching stuff directly.

For the kind of fairly simple databases that a HyperCard stack would require, the Core Data system should be more than sufficient. Core Animation already has all kinds of fancy transitions that can be dropped into multiple contexts.

For assigning functionality to the visual elements, Apple’s already got a library of candidates. Dragging links between elements in Interface Builder is a natural. There’s also Quartz Composer, which lets you sequence effects by drawing lines between boxes. And there’s the Automator app, which puts a visual front end on AppleScript. On a desktop, visual programming environments, including Automator, invariably seem clunky and limited. It almost always seems faster just to type it out in a text editor. But on the iPad, dragging and dropping is much more natural than typing. Eliminating typing altogether would just make the whole thing useless, another Bento that relies too much on templates without allowing enough configuration and customization. But minimizing the typing makes more sense on the iPad than it would on a desktop or laptop.

I dunno, maybe the idea is completely counter to what Apple’s trying to do. But it seems like it makes so much sense, and it would address so many concerns, and it just fits in with everything they’ve built up to now. All the iWork and iLife apps feel to me like HyperCard is lurking there in the background, waiting to come out. On the iPad they’ve finally got a good reason to let it loose.

Remembrance of Computers Past

Looking back at the rest of Apple’s product line helps explain why people think the iPad is such a big deal. Also, kind of a review.

Out of all the billions of articles that have been written about the iPad over the past few weeks — previews, reviews, essays, tirades, counter-tirades, hands-ons, first impressions, updates, and general grousing — the best is still Stephen Fry’s article for Time magazine. Fry’s an unabashed Apple fanboy, but the article does exactly what it needs to: explain why this is such a big deal to some people. And it gets rid of the white background and just asks the Apple guys directly, “What’s so great about this thing, anyway?”

Not that they gave a compelling answer, but it was still nice of him to ask. And he didn’t really need to, anyway, because Fry covered that himself. The best part of the article is when he describes his and Douglas Adams’s excitement over the original Mac:

Goodbye, glowing green command line; hello, mouse, icons and graphical desktop with white screen, closable windows and menus that dropped down like roller blinds.
[…] I would pant excitedly. Douglas’ wife Jane would point with resigned amusement to the stairs, and I would hurl myself up them to swap files and play. We were like children with toy train sets. And that was part of the problem. It was such fun. Computing was not supposed to be fun.

Douglas Adams’s enthusiasm for the Mac was pernicious and infectious. It’s been about 20 years (!) since I read the Dirk Gently books, but I can still remember the frontispiece of each one explaining how it was written on a Mac, listing the software used. And I can vaguely remember a long passage in one of the books describing a character using a Mac, written to make it sound as wondrous as any of the more fantastic elements of the book.

So Long, and Thanks For All the PICTs

I don’t know for sure whether reading those books is what set me on the course to Mac fanaticism, but whatever started it, I was hooked. I would buy issues of MacUser just for the pictures. Everything seemed so much cooler on a Mac; the control panel had a picture of a rabbit and a turtle to set your keyboard speed, and even the error messages had pictures!

When I finally got a Mac Plus as a graduation present (one my parents couldn’t quite afford, but they knew how much I wanted it, presumably because I wouldn’t shut up about it), I loved it. Doing even the simplest things was more fun, and I saw nothing but limitless potential in the computer because it was so enjoyable to use.

It didn’t quite “exceed my capacity to understand it,” and it definitely didn’t “just work.” The Mac OS had already outpaced my system’s memory, so it was constantly spitting out disks and asking me to insert a new one. (The sound of the Mac ejecting a disk probably haunted my college roommates for years). Even my Commodore 128 had color, but the Mac was still low-resolutely black and white. Back then, the Outsiders would make complaints that sound hauntingly familiar today: “You can’t open it.” “It’s a toy computer.” “There’s not enough software for it.” “You could get a much more powerful machine at that price.” I eventually fell for that, and “upgraded” to a machine that I liked just fine. But I never loved a computer like that one.

UIDejaView

And nostalgia couldn’t possibly be driving all of the hype around the iPad, but I do believe that the idea of the first Macintosh is a huge part of it, even for people who never owned one. And I believe the iPad is the closest Apple has come to realizing that philosophy since the first Mac.

After all, Windows may have copied the “look and feel” of the Mac, but it never quite got its soul. It wasn’t even until Windows 95 that they managed to get a consistent, unified personality at all. But you can’t blame Microsoft too much, since Apple lost it as well. As the personal computer got to be more ubiquitous and more general-purpose, it somehow got less personal. It got more functional, but less fun.

Using an iPad, I don’t just feel like I’m in the future, as I expected to. The part in that Time article that resonated the most with me was when Fry laments that Adams never got to see his Hitchhiker’s Guide to the Galaxy made real. Every new “mobile device” I’ve tried out, back to the original PalmPilot, I’ve subjected to the Hitchhiker’s Guide test. The iPad comes closer than any I’ve seen; it’d probably be even more uncanny if I’d gotten the 3G model. But more than that, I’m reminded of using my first Mac.

The iPad is obviously a direct descendant of the iPhone and the iPod touch, and it’s being described by tech writers and by Apple both as a reaction to netbooks. But I believe you can trace the idea behind it all the way back to the Mac Plus.

The form factor is that of a magazine, sure, but it has a hint of the original Mac in there as well: just the screen when held horizontally, and the whole thing when in portrait mode. You can’t open it, but it doesn’t even seem like something you’d want to open: it feels like any time you’d spend configuring it is time that’d be better spent using it.

It’s got a few of the standby apps already installed and ready to go. MacWrite is no longer free, and it’s called Pages now, but it’s there if you need it. MacPaint has been made obsolete by digital photography, apparently, and the spreadsheet in AppleWorks now goes by the name Numbers. The desktop is still the realm of powerhouse applications with tons of features, but the iPad can comfortably support powerful apps that are simpler, more accessible, and more fun to use.

Walled Garden Party

Passing the interminable waiting time by reading the hilariously over-the-top pre-reactions to the iPad. Warning: very long and somewhat angry.

Tomorrow morning, as you know, is The Dawn of a New Era in Personal Computing. The Coming of the iPad will bring about a magical age where people are directly connected to content, and they will become mindless consumers tied to an unchecked corporate overlord, and also it will flop and no one will buy one. All at the same time. It’s just that special.

I was pretty skeptical of Apple’s marketing at first; I thought the claim that it was “magical and revolutionary” was a bunch of flowery nonsense. But now I’m convinced. Somehow, even before its release, the iPad has taken what was once a disparate group of strangers with internet access and magically turned them into thousands of experts, better able to tell me how I should spend my money than I’d be able to by myself. And it’s going to bring about a revolution (which won’t be televised in widescreen format, apparently) in which everyone suddenly finds himself unable to think for himself or create anything of value.

All across the web are the brave souls documenting the downfall of society. It’s been a little bit disheartening watching Nilay Patel of Engadget make the transition from his initial guarded optimism to having to mention the lack of printing and the App Store’s rating system in only tangentially-related posts. I actually can’t tell if he’s being serious, or if he’s just been worn down by the thousands of commenters just plain losing their shit over the idea that a gadget blog would cover a new piece of consumer technology. Stay strong!

It’s a little easier with Marc Bernardin’s post on io9, a sarcastic but pleasant enough little piece about the ability to read comic books on the iPad that gives an overview of what apps are going to be available and what it means for distribution and oh my god we’re all gonna die where the hell did that come from all of a sudden?

With Ownership of Media Comes Great Responsibility

But the best of all is Cory Doctorow’s manifesto on Boing Boing, helpfully entitled “Why I won’t buy an iPad (and think you shouldn’t, either).” It’s certainly no surprise that the guy who’s appointed himself lead internet spokesperson against the evils of digital rights management would choose to write a tirade against Apple; the only surprising thing is that he waited this long. John Gruber wrote a response that was more even-tempered than I could be. And, frankly, more even-tempered than Doctorow’s post deserves.

I should make it clear that I don’t have anything personal against Doctorow; for all I know he’s a fine person, albeit one I’d probably hate getting stuck talking to at a party. But it seems that the iPad (and its media coverage) has magically turned him from an amusingly passionate and occasionally irritating anti-DRM evangelist into a full-on sputtering douchenozzle. On the plus side, his post makes Annalee Newitz’s rant on io9 (which tried to say exactly the same thing, a month earlier) seem reasoned and thoughtful by comparison. On the negative side: everything else.

First he rails against the assault on comic books:

I mean, look at that Marvel app (just look at it). I was a comic-book kid, and I’m a comic-book grownup, and the thing that made comics for me was sharing them. If there was ever a medium that relied on kids swapping their purchases around to build an audience, it was comics. And the used market for comics! It was — and is — huge, and vital. I can’t even count how many times I’ve gone spelunking in the used comic-bins at a great and musty store to find back issues that I’d missed, or sample new titles on the cheap.
[…]
So what does Marvel do to “enhance” its comics? They take away the right to give, sell or loan your comics. What an improvement. Way to take the joyous, marvellous sharing and bonding experience of comic reading and turn it into a passive, lonely undertaking that isolates, rather than unites. Nice one, Misney.

Haha, way to stick it to The Man, C-Doc! Because as we all know, Disney is a pure representation of Evil Multinational Corporation that stifles creativity, since it’s still 1994 and all of us had our emotional and intellectual maturation stopped when we were sophomores in college. Also, MEAT IS MURDER! Ever since tiny upstart mom-and-pop operation Marvel Comics got bought by their new corporate overlords, they’ve stopped publishing single issues. Even worse, they’re stifling kids’ enjoyment of comics by making them available on every single digital platform in existence.

I, too, am a comic-book grownup. And as a grownup I would prefer to have hundreds of comic books on a one-pound, half-inch-thick device instead of in the boxes and stacks that are overflowing my closet, bookshelves, romantic-encounter-inhibiting stack beside my bed, and my parents’ basement. If I want to share them, then holy shit, they’re now on a device that’s the same size as a comic book! I can hand somebody else the iPad, and it’ll even flip over to let them read it! Also, the last time I shared a single issue of a comic book with anyone was when I was 18!

The comic book thing is just the first sign that Doctorow has become the worst kind of Old Guard: the Old Guard who believes he’s still cutting-edge counter-culture. The kind who believes that putting a picture of Steve Jobs upside down or using epithets like “Misney” is anything more than a lazy substitute for bona fide insight. What he’s done here is conflate two things: his pet cause of “ownership” of media, and the joyous magic of sharing. It’s selfishness disguised as generosity. If I start buying comics on an iPad, then I’m every bit as free to go “spelunking” through the online catalog for back issues — I could buy, right now, the first 10 issues of X-Men and read them immediately and individually; they’re not, to the best of my knowledge, in print as single issues. I could share my collection with anyone by sharing my iPad.

What I can’t do is take someone else’s work and sell it. That is not, however, “sharing.”

Our Browsers, Ourselves

Using the healing power of blogging to rationalize an expensive and unnecessary purchase.

As one of the idiots (sorry: loyal technology enthusiasts) who bought an iPhone on day one, I was a little disappointed by the anti-climactic iPad pre-order event. In June a couple years ago, I was standing in line outside an AT&T store for an hour, chatting with the other saps (I mean, fine people), only to be told at the last minute that they were sold out of the version I wanted. That led to my driving all over Marin County, eventually finding myself at an Apple Store, where I was welcomed by a double line of smiling Apple employees escorting me to the demo phones on display at their all-white tables, then putting a gentle hand on my back and leading me to the back of the store where they could take my credit card info. It was exciting and not at all creepy.

With the iPad, though, I just hit a button on a web form. Where’s the excitement? Or the exclusivity? It’s been over a week, and you can still order one online! You can even have it sent to your house, and miss out on all the energizing and totally not cult-like atmosphere of the Apple Store. I used the online form to reserve a pick-up at one of the stores in San Francisco. Conveniently, the very same form let me schedule a time and place outside the store to get mugged and have my iPad stolen from me.

I chose the WiFi 32 GB model, and I chose Darrel as my Forced Redistribution Representative. I figured that even the 64 GB model wouldn’t hold all of the music and video I’ve amassed over the decades, and the iPhone is a better music player anyway, so 32 GB could easily store a couple weeks’ worth of video and books until the next sync. And I liked that Darrel is a methadone addict who plans to re-sell the thing on Craigslist, so it felt like I was giving back to the community.

Now, I put a good bit of effort into talking myself out of wanting one of these things, and then calmly and rationally going through the pros and the cons, so that I could make an informed purchasing decision by the morning of the 12th. Apple ruined all that, by apparently having enough supply to meet the demand, but I hate to see all that thought process go to waste:

Cost-Effectiveness: When I moved into this apartment, I bought a couch for $600. It’s green and very comfortable. I also bought a chair from Office Depot for around $80. It’s oddly tilted and is bad for my back. When I get home after a grueling day of watching other people make videogames, I spend anywhere from two to four hours at my desk, reading news feeds and forum messages, starting and abandoning web posts such as this one, and obsessively checking Google for mentions of the game I’m working on. If it’s “Lost” or “Castle” night, or the day after “Community” and “30 Rock,” I might spend an hour on the couch in front of the TV. This means that every second I spend at my desk, I’m losing money that I spent on my couch. Being able to browse the web while reclined isn’t only more comfortable, it’s the right thing to do.

Productivity: Whenever I’m sitting in this uncomfortable chair reading the internets, I invariably run across a recommendation of some Flash game that I end up playing for longer than it’s worth. The iPad doesn’t support Flash. Big win.

Literature: I’ve still got all these books piled up from back when I used to intend to read things. But what a hassle! Those pages! Finding a light source! All the opening and closing! On the floor of my apartment, I’ve got a big stack of unread books just sitting there, mocking me every time I sit down to play a videogame or watch a movie. Just think of all the space I could save if I could have all those books on a single device that’s a half inch thick, and not read them there!

Health Concerns: The books that I do still read are comics, and reading comic books means leaving the apartment to take a bus down to the comic book store. And that means exposing my body to unhealthy UV radiation. In the perfect world of 2010, I should be able to buy comic books without going outside. And without waiting for the trade paperbacks to come out.

My Concern for You, The Readers: The one thing the best writers all have in common is that they have a singular voice, a defining characteristic. The one thing that all my writing has in common is that there’s a lot of it. If I can make blog posts from a touchscreen keyboard with the iPhone OS’s auto-correction, then I’ll be encouraged to keep it short and simple.

The Lamentations of Bloggers: There have been several bloggers calling out the iPad for representing the Evils of Closed Systems, writing post after post decrying the “walled garden” of the App Store and Apple’s unfair business practices. They suggest that consumers are complicit in the death of open software, lured by the status of an Apple logo and a bright shiny piece of electronics instead of getting a more powerful and more user-empowering computer. So I’m buying an iPad to make a statement. And that statement is: “Fuck you.” With the additional statement: “I know what I’m doing, and how I spend my money is my own damn business. If Windows or Android or Linux or HP or LG or whoever had beaten Apple to the system with a superior product, then I would’ve bought that instead. So suck it.”

Research: There are plenty of other e-book readers and personal media players and netbooks out there already; I believe that the new thing that the iPad will bring to the market is genuinely social computing. As in: a direct, tactile connection to the content displayed on screen; and real, face-to-face communication with another person while sharing the contents of the screen. Apple mentioned both aspects during the iPad keynote, but the “sharing” part was kind of an afterthought. I believe that’s where it’s going to make a real difference, though. (It’s also what the Microsoft Surface project has been all about, but they got locked into the mindset of a big-ass table instead of a portable device, which they always tried to turn into just another Windows machine). Apple mentioned showing off pictures with an iPad, but I think that’s because Steve Jobs feels about games the same way George Bush feels about black people. Board games and card games are just a different experience than playing single-player or even multiplayer games online, and it’s an experience that I don’t think computers have been able to replicate yet.

I do honestly believe that there’s going to be a subtle shift in the way people think about computers once more of us can show someone else a web page or a photo or a video simply by handing them the screen. But I think the most exciting stuff on the iPad is going to come from two areas: online magazines, and social games. (And hopefully, we’ll be able to take the term “social games” back from all the people making Facebook games).
