iPad Cons

Reluctantly coming to the conclusion that the computer I’ve always wanted isn’t the computer I’ve always wanted

It’s a reliable source of tragicomedy to see people working themselves into an indignant rage over gadget reviews. When I was looking for reviews of the iPad Pro this Wednesday (to do my due diligence), Google had helpfully highlighted some fun guy on Twitter calling the tech journalists’ coverage of the device “shameful.” The reviews themselves had hundreds of comments from people outraged that even the notion of a larger, more expensive iPad was an assault on everything we hold dear as Americans.

The complaints about the rampant “Apple bias” are especially ludicrous in regard to the iPad Pro, since the consensus has been overwhelmingly cautious: out of all the reviews I read, there’s only one that could be considered an unqualified recommendation. Even John Gruber wasn’t interested in getting one. (But he did still believe that it’s the dawning of a new age in personal computing; it’s still Daring Fireball, after all). Every single one of the others offered some variation on “It’s nice, I don’t have any interest in it, but I’m sure it’s perfect for some people.”

Yes, I thought, I am exactly those some people.

Designed By Apple In Cupertino Specifically For Me

I’ve spent the better part of this year trying to justify getting a smaller and lighter laptop computer. I’ve spent the better part of the last decade wanting a good tablet computer for drawing. And I’ve tried — and been happy with — most of the variations on tablets and laptops that Apple’s been cranking out since the PowerBook G4. (One thing people don’t mention when they complain about how expensive Apple products are is that they also retain their resale value exceptionally well. I’ve managed to find a buyer for every Apple computer or tablet I’ve wanted to sell).

I’ve tried just about every stylus I could find for the iPad. I tried a Galaxy Note. I tried a Microsoft Surface. I got dangerously excited about that Microsoft Courier prototype video. Years ago, I tried a huge tablet PC from HP. None of them have been right, for one reason or another.

But when they announced the iPad Pro this fall, it sounded like Apple had finally made exactly what I wanted: a thin and relatively light iPad with a high-resolution display, better support for keyboards, a faster processor, and a pressure-sensitive stylus designed specifically for the device. Essentially, a “retina” MacBook Air with a removable screen that could turn into a drawing tablet. The only way it could be more exactly what I want would be if it came with a lifetime supply of Coke.

Still, I decided to show some restraint and caution for once, which meant having the calm and patience to get one a few hours into opening day instead of ordering one online the night before.

I read all the reviews, watched all the videos, paid closest attention to what artists were saying about using it. The artists at Pixar who tried it seemed to be super-happy with it. All the reviews were positive about the weight and the display and the sound and the keyboards.

I went to the Apple Store and tried one out, on its own and with the Logitech keyboard case. It makes a hell of a first impression. The screen is fantastic. The sound is surprisingly good. It is huge, but it doesn’t feel heavy or all that unwieldy when compared to the other iPads; it’s more like the difference between carrying around a clipboard vs carrying a notepad. (And it doesn’t have the problem I had with the Surface, where its aspect ratio made using it as a tablet feel awkward).

And inside the case, it gets a real, full-size keyboard that feels to me just like a MacBook Air’s. It really does do everything shown in the demo videos. I imagined it becoming the perfectly versatile personal computer: laptop for writing, sketchpad for drawing, huge display for reading comics or websites, watching video, or playing games. (I’m not going to lie: the thought of playing touchscreen XCOM on a screen this big is what finally sold me).

But Not For Me

But I don’t plan to keep it.

It’s not a case of bait-and-switch, or anything: it’s exactly what it advertises, which is a big-ass iPad. The question is whether you really need a big-ass iPad.

The iPad Pro isn’t a “hybrid” computer, and Apple’s made sure to market it as 100% an iPad first. But it’s obvious that they’re responding to the prevalence of hybrids in Windows and Android, even if not to the Surface and Galaxy Note specifically. And I think Apple’s approach is the right one: differentiating it as a tablet with optional (but strongly encouraged) accessories that add laptop-like functionality, instead of as some kind of all-in-one device that can seamlessly function as both.

But a few days of using the iPad Pro has convinced me that the hybrid approach isn’t the obviously perfect solution that common sense would tell you it is. It’s not really the best of both worlds, but the worst of each:

  • Big keyboards: The Apple-designed keyboard is almost as bad for typing as the new MacBook’s is, which is almost as bad as typing on a Timex Sinclair. Maybe some people are fine with it, and to be fair, even the on-screen keyboard on the iPad Pro is huge and full-featured and easy to use. But for me, the Logitech keyboard case is the only option. And it’s pretty great (I’m using it to type this, as a cruel final gesture before I return it), but it turns the iPad Pro from being surprisingly light and thin into something that’s almost as big and almost as heavy as a MacBook Air.
  • Big-ass tablet: Removed from the case, the iPad Pro quickly becomes just a more unwieldy iPad. The “surprisingly” part of “surprisingly light and thin” means that it’s genuinely remarkable considering its processor speed and its fantastic screen, but it still feels clumsy to do all the stuff that felt natural on the regular iPad. It really wants to be set down on a table or desktop.
  • It’s not cheap: I wouldn’t even consider it overpriced, considering how well it’s made and how much technology went into it. But it does cost about as much as a MacBook Air. That implies that it’s a laptop replacement, instead of the “supplemental computer” role of other iPads.
  • Touching laptop computer screens is weird: Nobody’s yet perfected the UI that seamlessly combines keyboards and touch input. Even just scrolling through an article makes me wish I had a laptop with a touchpad, where it’s so much more convenient. When it feels like the touchpad is conspicuously absent while you’re using a device that’s essentially a gigantic touchpad, that means that something has broken down in the user experience.
  • Aggressive Auto-correct: Because iOS was designed for touch input on much smaller screens, it was designed for clumsy typing with fat fingers. Which means it aggressively autocorrects. Which means I’ve had to re-enter every single HTML tag in this post. And it still refuses to let me type “big-ass” on the first try.
  • It’s missing much of OS X’s gesture support: Despite all the clever subtle and not-so-subtle things they’ve done to make iOS seamless, it’s still got all the rough edges that come from never being designed for a screen this large. In fact, having your hands anchored to a keyboard goes directly against the “philosophy” of iOS, which was designed to have an unobtrusive UI that gets out of the way while you directly interact with your content. Ironically, it’s all the gesture recognition and full-screen stuff that made its way from iOS to OS X that I find myself missing the most — I wish I could just quickly swipe between full-screen apps, or get an instant overview of everything I have open.
  • No file system: This has been a long-running complaint about iOS, but I’ve frankly never had much of a problem with it. Now that the iPad is being positioned as a product that will help you do bigger and more sophisticated projects, though, it becomes more of a problem. I just have a hard time visualizing a project without being able to see the files.
  • The old “walled-garden” complaints: Apple’s restrictions aren’t nearly as draconian as they’re often made out to be, but they still exist. Occasionally I need to look at a site that still insists on using Flash. And the bigger screen size and keyboard support of the iPad Pro suggest that programming would be a lot of fun on this device, but Apple’s restrictions on distributing executable code make the idea of an IDE completely impractical.
  • Third-party support: App developers and web developers haven’t fully embraced variable-sized screens on iOS yet. (As an iOS programmer, I can definitely understand why that is, and I sympathize). So apps don’t resize themselves appropriately, or don’t support split screen (see the sketch after this list). Some apps (like Instagram, for instance) still don’t have iPad versions at all. Some websites insist I use the “mobile” version of the site, even though I’m reading it on a screen that’s as large as my laptop’s.
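
To be clear about what “embracing variable-sized screens” actually involves, here’s a minimal sketch of the adaptive-layout pattern that Split View and the iPad Pro assume. The view controller and font choices are hypothetical, not taken from any particular app; the point is just that everything is pinned to the view’s margins and trait collection instead of a hard-coded device size.

```swift
import UIKit

// Hypothetical view controller illustrating size-class-aware layout.
// Nothing here is specific to a real app; it's just the adaptive-layout
// pattern that Split View and variable-sized iPads expect.
final class ArticleViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(textView)

        // Pin to the layout margins instead of fixed point values, so the
        // same code works full screen, in Split View, or in Slide Over.
        NSLayoutConstraint.activate([
            textView.leadingAnchor.constraint(equalTo: view.layoutMarginsGuide.leadingAnchor),
            textView.trailingAnchor.constraint(equalTo: view.layoutMarginsGuide.trailingAnchor),
            textView.topAnchor.constraint(equalTo: view.layoutMarginsGuide.topAnchor),
            textView.bottomAnchor.constraint(equalTo: view.layoutMarginsGuide.bottomAnchor),
        ])
    }

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // React to the horizontal size class, not the device model: "compact"
        // can now mean an iPhone or an iPad Pro sharing its screen.
        let isCompact = traitCollection.horizontalSizeClass == .compact
        textView.font = UIFont.preferredFont(forTextStyle: isCompact ? .body : .title3)
    }
}
```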

If You Don’t See a Stylus, They Blew It

For me, the ultimate deciding factor is simply that the Apple “Pencil” isn’t available at launch. They’re currently back-ordered for at least four weeks, and that’s past the company’s 14-day return window. Maybe they really have been convinced that the stylus is a niche product, and they weren’t able to meet the demand. Whatever the case, it seems impossible for me to really get a feel for how valuable this device is with such a significant piece missing.

The one unanimous conclusion — from both artists and laypeople — is that the Pencil is excellent. And I don’t doubt it at all. Part of what gets the tech-blog-commenters so angrily flummoxed about “Apple bias” is that Apple tends to get the details right. Their stuff just feels better, even if it’s difficult or impossible to describe exactly how or why, and even if it’s the kind of detail that doesn’t make for practical, non-“magical” marketing or points on a spec sheet.

Even though I haven’t been able to use it, I have been impressed with how Apple’s pitched the stylus. They emphasize both creativity and precision. There’s something aspirational about that: you can use this device to create great things. Microsoft has probably done more over the years to popularize “pen computing” than any company other than Wacom, but they’ve always emphasized the practical: showing it being used to write notes or sign documents. It’s as if they still need to convince people that it’s okay for “normal” people to want a stylus.

Part of the reason I like Apple’s marketing of the Pencil is that it reminds me of the good old days before the iPhone. Back when Apple was pitching computers to a niche market of “creative types.” It was all spreadsheets vs. painting and music programs, as clearly differentiated as the rich jocks vs the sloppy underdogs in an 80s movie.

I only saw a brief snippet of Microsoft’s presentation about the Surface and Surface Book. In it, the Microsoft rep was talking about the Surface’s pen as if he’d discovered the market-differentiating mic-drop finishing-move against Apple’s failed effort: unlike “the other guys,” Microsoft’s pen has an eraser. I’ve been using a Wacom stylus with an eraser for some time, and it’s always too big and clumsy to be useful, and it always ends up with me using the wrong end for a few minutes and wondering why it’s not drawing anything.

Meanwhile, Apple’s ads talk about how they’ve painstakingly redesigned the iPad screen to have per-pixel accuracy with double the sampling rate and no lag, combining their gift for plausible-sounding techno-marketing jargon with GIFs that show the pen drawing precise lines on an infinite grid. That difference seems symbolic of something, although I’m not exactly sure what.

The Impersonal Computer

I’ve been pretty critical of Microsoft in a post that’s ostensibly about how I don’t like an Apple product. To be fair, the Surface Book looks good enough to be the best option for a laptop/tablet hybrid, and it’s clear some ingenious work went into the design of it — in particular, putting the “guts” of the machine into the keyboard.

I’m just convinced now that a laptop/tablet hybrid isn’t actually what I want. And I think the reason I keep going back to marketing and symbolism and presentation and the “good old days” of Apple is that computers have developed to the point where the best computer experience has very little to do with what’s practical.

I get an emotional attachment to computers, in the same way that Arnie Cunningham loved Christine. There have been several that I liked using, but a few that I’ve straight-up loved. My first Mac was a Mac Plus that had no hard drive and was constantly having to swap floppy disks and had screen burn-in from being used as a display model and would frequently shut down in the middle of doing something important. But it had HyperCard and Dark Castle and MacPaint and the floppy drive made it look like it was perpetually smirking and it was an extravagant graduation gift from my parents, so I loved it. I liked the design of OS X and the PowerBook so much that I even enjoyed using the Finder. I tried setting up my Mac mini as a home theater PC mostly as an attempt to save money on cable, but really I just enjoyed seeing it there under the TV. Even a year into using my first MacBook Air, I’d frequently clean it, ostensibly to maintain its resale value but really because I just liked to marvel at how thin and well-designed it was.

I used to think that was pretty common (albeit to healthier and less obsessive degrees). But I get the impression that most people see computers, even underneath all their stickers and cases to “personalize” them, as ultimately utilitarian. A while ago I had a coworker ask why I bring my laptop to work every day when the company provided me with an identical-if-not-better one. The question seemed absolutely alien to me: that laptop is for work; this laptop has all my stuff.

Another friend occasionally chastises me for parading my conspicuous consumption all over the internet. I can see his point, especially since the Apple logo has gone from a symbol of “I am a creative free-thinker” to “I have enough money to buy expensive things, as I will now demonstrate in this coffee shop.” But I’ve really never understood the idea of Apple as status symbol; I’ve never thought of it as “look at this fancy thing I bought!” but “look at this amazing thing people designed!”

The iPad was the perfect manifestation of that, and the iPad mini even more so. Like a lot of people, I got one mainly out of devotion to a brand: “If Apple made it, it’s probably pretty good.” I had no idea what I’d use it for, but I was confident enough that a use would present itself.

What’s interesting is that a use did present itself. I don’t think it’s hyperbolic to say that it created an entirely new category of device, because it became something I never would’ve predicted before I used it. And it’s not a matter of technology: what’s remarkable about it isn’t that it was a portable touch screen, since I’ve known I wanted one of those ever since I first went to Epcot Center. I think what’s ultimately so remarkable about the iPad is that it was completely and unapologetically a supplemental computer.

Since its release, people (including me) have been eager to justify the iPad by showing how productive it could be. Releasing a version called the “Pro” would seem like the ultimate manifestation of that. But I’m only now realizing that what appealed to me most about the iPad had nothing to do with productivity. I don’t need it to replace my laptop, since I’m fortunate enough to be able to have a laptop. And the iPhone has wedged itself so firmly into the culture that it’s become all but essential; at this point it just feels too useful to be a “personal” device. (Plus Apple’s business model depends on replacing it every couple of years, so it’s difficult to get too attached to one).

Apple’s been pitching the watch as “their most personal device ever,” but I wouldn’t be devastated if I somehow lost or broke the watch. My iPad mini, on the other hand, is the thing that has all my stuff. Not even the “important” stuff, which is scattered around and backed up in various places. The frivolous, inconsequential stuff that makes it as personal as a well-worn notebook.

Once I had the iPad Pro set up with all my stuff, I was demoing it to a few people who wanted to see it. Obviously with coworkers, but even, surprisingly, when showing it to my boyfriend, there was a brief moment of hesitation where I wondered if I was showing something too personal. I don’t mind anybody using my laptop or desktop, or sharing my phone with someone who needs it, but I’ve got a weird, very personal attachment to the iPad. (And not just because I treat my Tumblr app like the forbidden room in a gothic novel which no one must ever enter).

It’s entirely possible that I’m in the minority, and whatever attachment most people have to “their stuff” is to the stuff itself in some nebulous cloud, and not the device that’s currently showing it to them. It’s even more likely that there’s simply no money to be made in selling people devices that they become so attached to that they never want to give them up. It may be that Convergence is The Future of Personal Computing, and one day we’ll all have the one device that does everything.

After using the iPad Pro, I’m no longer convinced that a big iPad that also functions as a laptop is what I want. I really want a “normal”-sized iPad that’s just really good at being an iPad. Which means adding support for the Apple Pencil to the iPad Air.

So I’m back to hoping Apple’s already got one of those in the pipeline, and waiting until it’s announced at some point next year, and then ordering one the second they’re available and then trying to justify it as a rational and well-considered purchase. Next time for sure it’s going to be exactly the computer I want.

To Apple, Love Tailored Experiences

The Apple TV sure seemed like a good idea… at first!

[Image: “Universal Apps” slide from the Apple TV announcement]

On the surface (sorry), it seemed like Apple had made all the right decisions with its new product announcements yesterday. [For future anthropologists: new Apple Watches, a bigger iPad with a stylus, and Apple TV with an app store, and iPhones with better cameras and pressure-sensitive input. Also, the title of this blog post is a reference to something that happened a few months ago that nobody cares about now. — Ed.]

I’ve wanted an iPad with a stylus since before the iPad was even announced, so long ago that my image links don’t even work anymore! And I’ve been wanting a lighter laptop to use as purely a “personal computer” in the strictest sense — email, social media, writing, whatever stuff I need to get done on the web — and keep finding myself thinking “something like a MacBook Air that doubles as a drawing tablet would be perfect!” In fact, the iPad Pro is pretty close to what I’d described years ago as my dream machine but cheaper than what I’d estimated it to cost.

There’s been a lot of grousing online about how Apple’s acting like it invented all of this stuff, when other companies have had it for years. On the topic of pen computing, though, I can unequivocally say no they haven’t. Because over the years, I’ve tried all of them, from Tablet PCs to the Galaxy Note to the Microsoft Surface to the various Bluetooth-enabled styluses for iOS. (I’ve never been able to rationalize spending the money for a Cintiq, because I’m just not that great an artist). I haven’t tried the iPad Pro — and I’ll be particularly interested in reading Ray Frenden’s review of it — but I know it’s got to be at least worth investigating, because Apple simply wouldn’t release it if it weren’t.

Even if you roll your eyes at the videos with Ive talking about Apple’s commitment to design, and even if you like talking about Kool-Aid and cults whenever the topic of Apple comes up, the fact is that Apple’s not playing catch-up to anyone right now. They’ve got no incentive to release something that they don’t believe is exceptional; there’d be no profit in it. The company innovates when it needs to, but (and I’m not the first to say it): they don’t have to be the first to do something; they just have to be the first to do it right. And they’ve done exactly that, over and over again. The only reason I may break precedent and actually wait a while to get a new Apple device is that I’m not convinced I need a tablet that big — it’d be interesting to see if they’ll release a pen-compatible “regular-sized” iPad.

And if I’ve been wanting a pen-compatible iPad for almost a decade, I’ve been wanting a “real” Apple-driven TV set-top box for even longer. The first time I tried to ditch satellite and cable in favor of TV over internet, I used a bizarre combination of the first Intel Mac mini with Bootcamp to run Windows Media Center, a Microsoft IR remote adapter, a third-party OTA adapter, and various third-party drivers for remotes and such, all held together with palm fronds and snot. I’ve also tried two versions of the “hobby” Apple TV, relics of a time when Apple was known for glossy overlays, Cover Flow, and an irrational fear of physical buttons. Basically, any update would’ve been welcome.

But the announcement yesterday was a big deal, obviously, because they announced an App Store and an SDK. Which turned it from “just a set-top box” into a platform. That’s as big a deal for customers as it is for developers, since it means you don’t have to wait for Apple to make a new software release to get new stuff, content providers can make their own apps instead of having to secure some byzantine backroom deal with Apple to become a content channel, and some developers will come up with ways to innovate with the device. (Look to Loren Brichter’s first Twitter client as a great example of UI innovation that became standard. Or for that matter, Cover Flow).

And for games: I don’t think it’s an exaggeration to say that the iOS App Store has done more to democratize game development than anything, including Steam as a distribution platform and Unity as a development tool. Whether it was by design or a lucky accident, all the pieces of device, software, market, and audience came together: it was feasible to make casual games, ideally played in short bursts, that could be built by small teams or solo developers, and have them reach so many millions of people at once that it was practical and (theoretically) sustainable.

I hope nobody expects that the Apple TV will become anywhere near as ubiquitous as the iPhone (or even the iPad, for that matter), but still: opening up development creates the potential for independents to finally have an audience in the console game space. It’d be like Xbox Live Indie Games and XNA, if all the games weren’t relegated to a difficult-to-find ghetto separate from the “real” games. Or like the Ouya, if they’d made a device that anyone actually wanted to buy.

Game developers love saying that Apple doesn’t care about games and doesn’t get how games work — as if they’d just inadvertently stumbled into making a handheld gaming device that was more popular than Nintendo’s and Sony’s. You could look at the new Apple TV the same way, and guess that while trying to secure deals with big content providers and compete with Amazon or “Smart” TV manufacturers, they’d accidentally made a Wii without even trying.

There’ve been enough game-focused developments in the SDK, and in the company’s marketing as a whole, to suggest that Apple really does get it. (Aside from calling Disney Infinity “my favorite new Star Wars game”). But there are a couple of troubling things about the setup that suggest they expect everything on the TV to play out exactly the same way that it has on smartphones and tablets.

First is that the Apple TV has a heavy reliance on cloud storage and streaming of data, with a pretty severe limitation on the maximum size of your executable. They’ve demoed smartphone games on stage (Infinity Blade) that were 1 GB downloads, so it’s not inspiring to see a much smaller limit on downloadable size for games that are intended to run on home theater-sized screens. Maybe it’s actually not that big a problem; only developers who’ve made complete games for the Apple TV would be able to say for sure. But for now, it seems to suggest either very casual games, or else forcing players to sit through very long loading times. The latter’s been enough of a factor to kill some games and give a bad reputation to entire platforms.
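
For what it’s worth, the mechanism Apple provides for working around that limit is On-Demand Resources: you tag bundles of assets, Apple hosts them, and the app requests them when they’re actually needed. Here’s a rough sketch of what that looks like; the “level2” tag is a made-up placeholder, and I haven’t shipped anything with this myself, so treat it as an illustration rather than a recipe.

```swift
import Foundation

// Rough sketch of tvOS/iOS On-Demand Resources. The tag name is a
// placeholder; in a real project it would be assigned to assets in Xcode.
final class LevelAssetLoader {
    private var currentRequest: NSBundleResourceRequest?

    func loadAssets(taggedWith tag: String, completion: @escaping (Error?) -> Void) {
        let request = NSBundleResourceRequest(tags: [tag])
        currentRequest = request

        // If the assets are already on the device, skip the download entirely.
        request.conditionallyBeginAccessingResources { alreadyAvailable in
            if alreadyAvailable {
                completion(nil)
            } else {
                // Otherwise stream them down from Apple's servers; this is
                // where the player potentially stares at a loading screen.
                request.beginAccessingResources { error in
                    completion(error)
                }
            }
        }
    }

    func finishedWithAssets() {
        // Tell the system these assets can be purged if it needs the space back.
        currentRequest?.endAccessingResources()
        currentRequest = nil
    }
}

// Hypothetical usage: kick off the fetch before the player reaches level 2.
// let loader = LevelAssetLoader()
// loader.loadAssets(taggedWith: "level2") { error in /* present the level or an error */ }
```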

Second is the emphasis on universal apps. They mentioned it at the event and just kind of moved on. I didn’t really think much of it until I saw this from Neven Mrgan:


You could take the most mercenary possible interpretation of that, which is what people always do once the economics of software development comes up: “Big deal! Having one app is what’s best for consumers! What’s best for consumers always wins, and it’s the developers’ responsibility to adjust their business model to enable that!” Also “Information wants to be Free!!!”

Except what’s best for consumers is that the people making great stuff can stay in business to keep making great stuff. And we’ve already seen on iOS exactly what happens when developers “adjust their business models” to account for a market that balks at paying anything more than 99 cents for months to years of development. Some big publishers (and a few savvy independents, like Nimblebit) came in and made everything free-to-play with in-app purchases. Maybe there is a way to make a free-to-play game that doesn’t suck (and again, Nimblebit’s are some of the least egregious). But I can’t see anybody making a believable case that the glut of opportunistic games hasn’t been a blight on the industry. I was out of work for a long time at the beginning of this year, and it was overwhelmingly depressing to see so many formerly creative jobs in game development in the Bay Area that now put “monetization” in the job title.

Believe me, I’d love it if one of these publishers went all-in on the Apple TV, and then lost everything because they didn’t take into account that they were pandering to a different audience. But that’s not what would happen, of course. What would happen is that a couple of the big names would see that they can’t just fart out a “plays on your TV screen!!!” version of the same casual game and still make a fortune off of it, so they’d declare the entire platform not worth the effort. And then smaller studios who are trying to make stuff that takes specific advantage of the Apple TV “space” would be out of luck, because there are no big publisher-style marketing blitzes driving people to the platform. You need a combination of big names and smaller voices for a platform to work: again, see XBLIG.

It just seems as if there’s no recognition of the fact that there’s a lot more differentiating a game you play on your phone from one you play on your television than just the screen size. It seems especially tone-deaf coming from a company like Apple, who’s made a fortune out of understanding how hardware and software work together and what makes the experience unique. (Part of the reason that iOS has had so much success is that they didn’t try to cram the same operating system into a laptop and a smartphone).

At least the games on display showed evidence that they “get it.” The game demoed by Harmonix took advantage of the stuff that was unique to the Apple TV — a motion-sensitive controller and (presumably) a home theater-quality audio system. And even Crossy Road, which would seem like the worst possible example of shoveling a quick-casual game onto a TV screen and expecting the same level of success, showed some awareness of what makes the TV unique: someone sitting next to you playing the game, or at least having other people in the room all able to see something goofy happening on your screen.

I haven’t seen enough about tvOS to know if Universal apps are actually a requirement, or just a marketing bullet point and a “strong recommendation” from Apple. (Frankly, since I’m trying to make an iPad-only game, I’m ignorant of the existing requirements for iOS, and whether they restrict developers from releasing separate iPad-only or iPhone-only versions of the same software). So maybe there’ll be a market for separate versions? And somehow, magically, a developer will be able to release a longer, more complex game suitable for a home entertainment system, and he won’t be downvoted into oblivion for being “greedy” by asking more than ten bucks for the effort.

And there’s been some differentiation on the iPad, too. Playing XCOM on the iPad, for example, is glorious. That’s not a “casual” game — I’ve had sessions that lasted longer than my patience for most recent Xbox games — but it’s still better on the iPad because you can reach in and interact with the game directly. I could see something like that working — I’d pay for a game with lower visual fidelity than I’d get on Xbox/PS4/PC, if it had the added advantage that I could take it with me and play on a touchscreen.

So I could just be reactionary or overly pessimistic. But it’s enough to take what first seemed like a slam-dunk on Apple’s part, and turn it into an Ill Portent for The Future Viability Of Independent Game Development. As somebody who’s seen how difficult it was to even make a game in The Before Times, much less sell one, the democratization of game development over the past ten years has been phenomenal. And as somebody who’s finally realized how much some game studios like to exploit their employees, it’s incredible to be in an environment where you can be free of that, and still be able to realize your passion for making games.

The reason I first wanted to learn programming was being at a friend’s house, watching them type something into their VIC-20, and seeing it show up on screen. It was like a little spark that set me down a path for the next 40 years: “Wait, you mean I can make the stuff that shows up there, instead of just sitting back and watching it?” It’d be heartbreaking to see all the potential we’re enjoying right now get undermined and undone by a series of business decisions that make it impractical to keep making things.

Worst case, it’ll be another box that lets me watch Hulu. I was down to only eight.

One Month With the Apple Watch (Order Status Page Edition)

My experiences with the latest advance from Apple that’s disrupting the ecosystems of wearable technology and order and personal fulfillment.

[Image: Apple Watch order status page]
As a well-known “early adopter,” I feel I’ve got an obligation to share my experiences with bleeding-edge advancements in SoC-powered wealth redistribution with users who are more on the fence, baffled by the increasing number of options in wearable technology.

A lot of you have lots of money but no time to wade through all the industry jargon; you just have simple questions that you need answered: “What is the Apple Watch?” “Why haven’t I read or heard anything about it?” And most importantly: “Does Chuck have one yet?”

I can go ahead and conclusively answer the last question: No.

If you were hoping that the Apple Watch would finally be the game-changer that makes me satisfied with the number of gadgets I own, you’re probably better off waiting a month or two. Version 1.0 of an Apple product is known for being a hint of the advancements and refinements yet to come, more than a complete, functional device. It’s as if with the Apple Watch, Jony Ive and his team of designers at Apple are giving us a roadmap for the future, announcing to the world: This is what the smart watch will be like, some time in early July when Chuck is actually able to have one.

So the question remains: is it really that insufferable to be waiting for the delivery of an expensive, inessential device, while surrounded by other people who already have theirs? Let’s find out.

How The Other Half Lives

Marketing Apple’s Most Personal Device Ever

Apple had to take a different approach with their first foray into the world of wearable technology. That meant making sure that before the product even hit stores, watch models were made available to the leading tastemakers: the technology and gadget bloggers who’d complain that Pharrell and will.i.am were posting Instagram pictures of their watches before any of the reviewers could get one.

By now, you’ve no doubt seen the “Big Guys” offer up their opinions about the Apple Watch (42mm Steel with the Milanese Loop band, universally), and their experiences with glances, taptic feedback, the Activity tracker, re-charging it every day, and the importance of selectively disabling notifications. By virtue of the mathematical study of combinatorics and the number of words in the English language, each reviewer’s take is, strictly speaking, unique.

You’ve seen a quirky first person attempt to free the device from Jony Ive’s perfectly-controlled environment and present it in a more realistic day-to-day setting: a tech blogger in New York City with a head-mounted camera. You’ve doubtless savored the definitive review from a suave globetrotting secret agent tech blog editor figuring out how this new innovation fits into a busy day packed with meetings and treadmill-running, including an up-close look at how hard it is to execute cross-site web content scheduling in a New York City bar with the double distractions of a watch constantly tapping your wrist, and a full camera and lighting crew having to run multiple takes of video while in a New York City bar. You’ve seen a stop-motion animated version with paper cutouts, for some reason. By now, you’ve even seen the Tech Reviewer Old Guard offer another look back at the watch after using it for a month.

What none of those so-called “professional” reviews will tell you is what life is like for real people who don’t have the product being reviewed. Sure, you occasionally get somebody like Apple insider and sarcasm enthusiast Neven Mrgan making a feeble attempt to relate to The Rest of Us outside Apple’s walled garden clique, but how much can you really say about an experience after only a week or two? How does that experience change after an entire month? [Full disclosure: Mr. Mrgan graciously offered a royalty-free license for me to completely rip off the premise of this blog post, presumably by effortlessly dictating said license into the always-on AI assistant of his futurewatch].

It’s Finally Here

Just Not For You

One thing that none of the reviews mention is how much of the Apple Watch experience is dependent on having not just an iPhone, but an actual physical Apple Watch. The site iMore.com, for example, offers a list of what the Apple Watch can do without an iPhone, but makes no mention of what can be done without the watch itself.

Granted, one site can’t possibly cover every single aspect of the watch (although not for lack of trying), but this seems like an oversight. How do I keep time without an Apple Watch? How is not having the watch changing my health for the better? When will I get one? They’re all questions strangely left unanswered by the “All [sic] your questions answered!” Apple Watch FAQ.

That’s a perfect example of how blog developers are adjusting to the new paradigms introduced by the Apple Watch: their reviews aren’t as content-focused as the ones for more traditional devices like the iPhone. Instead, they’re best consumed as “glances,” not meant to be “read” so much as absorbed in quick seconds-long bursts throughout the day, every day, for months.

The truth is that there’s no amount of parallax scrolling and full-screen looping background video that will provide a truly definitive review of life without Apple’s latest must-have. For that, you need to go to Apple itself.

[Image: Apple Watch marketing photo of a ballerina]
That trademark Apple design is evident from first glance: the photographs of other people with their watches bleed right up to the bezel of the laptop screen, putting a subtle but unmistakable emphasis on the object that you don’t have. It’s a perfect example of how Apple makes cold hardware more personal, by telling a personal story: This woman has a watch and you don’t. She is a ballerina. What does she need a smartwatch for? She can’t possibly have her iPhone in range; her pockets are too small. Also the screen is likely to come on frequently as she moves her arms, causing a distraction to the other dancers. Did she not think this through? I wonder if she ordered her watch at midnight instead of waiting. A good night’s rest is very important for dancers, so it seems foolish to forsake that just to get a new watch that can’t even give incoming message notifications. Not to mention that dancers aren’t usually paid well enough to be spending hundreds of dollars on a watch. I bet she didn’t even wait in line for a new iPhone every other year since the first model, like I did. Who does she think she is, anyway?

This is also likely to be your first bit of frustration when dealing with the lack of an Apple Watch: because the title photograph has to do a full round-trip circuit from designer to marketing team to photographer and model to graphic designer to web publisher, it can get hopelessly out of sync with reality. I still find myself reading the notification “The Watch is Here,” and then glancing down at my wrist only to confirm that it’s most assuredly not here. I hope this is fixed in a future update.

The Best Part of Waking Up

Getting Into the Groove of a Daily Routine Without Your Apple Watch

Apple’s attention to detail and design carries through the rest of the experience. There’s no garish “Order Status” menu, for example; instead there’s a simple “Store” menu that reveals more beautifully photographed images of the product you don’t have.

It’s only there that you find the friendly drop-down menu that takes you to “Order Status.” That page will ask you for your password every time you open or refresh it throughout the day — you’ll be doing this a lot, so I recommend using a password manager like 1Password.

In the month since I ordered an Apple Watch, I’ve really started to notice how I use technology differently throughout the day and in different locations. On the laptop, for instance, I hardly ever use the Delivery Status widget to track the status of my shipment, both because of the decreasing relevance of the OS X Dashboard, and because after 5 weeks the order is still in “Processing” status without a tracking number. Instead, I prefer to go to the Apple Store page, bring up the order status, enter my password, refresh the page, wait a few seconds, and refresh the page again, sigh, then refresh it one more time. I would’ve thought that this would feel like an intrusion, but it’s become such an integral part of my morning routine that I hardly even notice it anymore.

While out around town, not going to bars or important meetings, it’d be a lot more convenient to bring up the Apple Store app on my phone. In practice, though, the app requires me to type my password again every time I want to check the order status, so I end up not bothering. Maybe they’ll fix this sometime within the next 5-6 weeks. In a perfect world, I could have some type of device on my wrist that could give me order updates with just a “glance.”

On the Order Status page, you’ll see the time period in an elegant but still-readable font. Apple still knows how to make the most of the user experience, giving a moment of delight as you see the estimate change from “June” to “5-6 weeks.” These displays are made possible by “complications,” a term Apple is borrowing from the watchmaking industry to describe things like doing a huge marketing push for a product release that depends on faulty haptic feedback engines from overseas manufacturers.

Apple makes it really easy to go back to the main store page from the Order Status page, so you can get a beautiful, detailed look at all the various models and colors of watches you don’t have. It’s fun for running “what if?” type experiments, such as “Could I cancel my order and instead get one of the dainty models with a pink band? Would that ship any faster?”

There’s also support for Apple’s new “Force Touch” technology, in which you give a long, exasperated sigh followed by a sharp slamming gesture on all of the keyboard’s keys simultaneously, or pressing a closed fist firmly and repeatedly on the laptop’s trackpad. This gives helpful feedback in the form of Safari crashing. It definitely takes some practice, but in my experience, it became second nature the more often I saw my colleagues unwrapping their just-delivered Apple Watches near my desk.

I Regret Reading a Gadget Blog Post (and I knew I would)

The Cold, Hard Sting That Can Only Happen When You Physically Open Your Wallet

Even though the watch is only available online and who the hell writes for a technology blog but still has to physically open his wallet when he buys stuff online?

He Should Try Apple Pay
Unless Maybe He Also Bought a Really Expensive Wallet, And He Just Likes the Way It Feels

As a mobile software developer in San Francisco, I’ve already seen how the release of the Apple Watch has changed my routine. During my morning workout (two reps climbing up BART station stairs, followed by an intensive 1.5 block walk), I enjoy listening to podcasts that keep me on the bleeding edge of the most disruptive of apps and innovators. (ICYMI: My essential travel gear). (I recommend Overcast for podcast-listening, even if you’re going truly old-school and changing podcast tracks on your Bluetooth headphones by manipulating actual buttons on your touchscreen-enabled wireless mobile computer).

The gang at SixColors.com has been active on various podcasts, letting me know about their experiences after initial unboxing, two days, four days, a week, and several weeks later, while traveling, writing, and recording podcasts. In addition to the roundtable discussions where groups of people discuss how the watch I don’t have yet has changed their lives, I’ve gotten answers to the questions you don’t usually think about with some cursory product review. For instance: what if you have two watches, and you can’t decide which of them you want to keep? And: now that we’ve all had the opportunity to get used to our new watches, what would we most like to see in the new version?

Another highlight: an account from the podcaster whose significant other isn’t much of a technology devotee and wasn’t that interested in the watch, but who became interested after seeing the podcaster use his for a few days, ordered one, received it, and is now giving her first impressions. It’s a magical time, as if entire generations of wearable technology are happening all around me as I watch the Order Status page. Whole waves of Gawker Media-led backlashes are whooshing by with the lasting permanence of burrito farts, the only constants being me, a web site, and a refresh button.

After five weeks, I find I have a lot in common with Mat Smith, who wrote for Engadget a confessional thinkpiece entitled “I regret buying an Apple Watch (and I knew I would).”

Like Smith, I was initially unmoved by the announcement of a new device from Apple. I, too, had bought a Pebble watch but quickly got out of the habit of wearing it. I’ve gotten the first versions of other Apple products and often been surprised by how dramatically and how quickly they’re made obsolete by the next release. I, too, write rambling stuff on the internet that frequently makes me come across as an insufferable asshole. And I also find myself reluctantly falling back into the role of “early adopter” for the sake of completely irrational impulses — in my case, an animated Mickey Mouse watch face that taps his foot every second; in his case, enjoying buying unnecessarily expensive stuff that makes him look cool.

It was important to him to have the sapphire face and stainless steel body, whereas I have large wrists, so it really stands out when I roll my eyes and make a wanking gesture while reading the rest of his post.

We ordered different models of the watch, because we have different needs. He tried on the gold version and was invited to look at himself in a mirror, while I managed to get 10 minutes bending over a bench in an Apple store by scheduling an appointment a couple of days in advance. He fell in love with the Milanese band, while I could only justify getting the cheapest model by telling myself it was a birthday present for myself. He doodles tiny pictures of cocks to colleagues and concludes it’s not a life-changing device; I see colleagues with watches and go back to reading blog posts written by, apparently, sentient, literate cocks.

One More Thing

Adding a Semi-Pithy Coda About Consumerism to What Should Have Been a Short and Silly Blog Post to Make it Unclear How Much of Any of This Is Intended to be Sarcastic

This Is Why People Don’t Read Your Blog

For decades there’s been a tendency to be dismissive of Apple devotees as being cultish and image-obsessed, with more money than common sense. As Macs and iPhones got more ubiquitous (and cheaper), enough people caught on to the fact that good design actually has real value. There are, no doubt, plenty of people who put “shiny” and “has visible glowing Apple logo” high on their list of priorities, but I think they’re finally outnumbered by those of us who just want something that’s really well made. (And who’ve bought enough cheap computers for the sake of saving a few bucks to realize that it ends up costing more in the long run when it needs to be replaced). Now it’s only the cranks in forums and blog comments that insist on complaining about the “Apple Tax.”

When Apple announced a gold edition of its new watch, rumored to cost over ten thousand bucks, there were fears that it’d bring all the old class warfare back to consumer technology: the company was now explicitly targeting status-obsessed rich people.

As I look at photos of models tying up their toe shoes, or draping their watch-bedecked arms over other models to make out with them, or stopping mid-jog-through-the-Hollywood Hills, and I see the three clearly-delineated castes of watch available, and I commit a few hundred bucks to the “lowest” caste of thing that I didn’t even want a few months ago, and I get increasingly resentful of the people who already have their inessential thing, and even more annoyed when they have the more expensive version of the thing I don’t yet have (even though I wouldn’t even want the more expensive version), I’m just glad those fears turned out to be completely unfounded.

Protestant Gadget Ethic

Making sense of the iPad mini in a world that doesn’t need it.

After my previous unfortunate episode in an Apple store, it should come as little surprise that I didn't last very long before I broke down and bought an iPad mini. No, it doesn't make sense for me to be throwing my credit card around as if I were the CEO of Papa John's or something. I've already got a perfectly fancy tablet computer that's not just functional, but really quite terrific. It's not like I'm getting paid to write reviews of these things, and even my typical “I need it for application development testing” is sounding increasingly hollow.

What helps is a new metric I've devised, which measures how long it takes me after a purchase before the appeal of the thing overwhelms my feeling of conspicuous consumption guilt over buying it. It's measured in a new unit: the Hal (named after Green Lantern Hal Jordan, the Jordan who does have willpower).

By that standard, the iPad mini clocks in with a new record of 0.03 Hals, or about 30 minutes after I opened the box. Because this thing is sweet, and I pretty much never want to stop holding it. I'm writing this post on it, as a matter of fact, even though a much more functional laptop with keyboard is sitting about three feet away from me at this very moment. But to use it would mean putting the iPad down.

The “finish” of the iPad mini, with its beveled edge and rounded matte aluminum back, is more like the iPhone 5 than the existing iPads. It makes such a difference in the feel of the thing that I can talk about beveled edges and matte aluminum backs without feeling self-conscious, as if I were a tech blogger desperately seeking a new way to describe another piece of consumer electronics.

It’s about as thin as the iPhone 5, and almost as light. With the new Apple cover wrapped around the back, it's perfect for holding in one hand. There have been several times that I've made fun of Apple, or Apple fanatics, for making a big deal about a few millimeters difference in thickness, or a few ounces in weight. And I joked about the appeal of the iPad mini, as if the existing iPad was unreasonably bulky and heavy.

But then something awful happened: I had to fly cross country four times within two weeks. And reading a book on the iPad required me to prop the thing up on the tray table and catch it as the person in front of me kept readjusting his seat. All my mocking comments were flying back in my face (along with the iPad, my drink, and the in-flight magazine), in the form of the firstest of first-world problems.

“Version 1 of the iPad mini is for chumps,” I said. “Check back with me once you've put in a higher resolution display, Apple.” In practice, though, the display is perfectly sharp. “Retina” isn't the make-or-break feature I thought it would be. You can certainly tell the difference when comparing the two; I'd assumed that squabbling over pixel density was something best left to the comments sections of tech blogs, but the difference in sharpness is definitely visible. It's really only an issue for very small text, though. Books, Flipboard, and web pages are all clear and legible.

And speaking of Flipboard, it and Tweetbot are the two apps that get me giddy enough to own up to making another unnecessary tech purchase. Browsing through articles and status updates on a tablet that thin is probably the closest I'll ever come to being on board the Enterprise.

The phrase I've seen recurring the most in reviews of the iPad mini is some variation on “this is the size the iPad is supposed to be.” And really, there's something to that. I'm not going to give up my other one; the larger size really is better for some stuff, like drawing, Garage Band, and reading comics or magazines. But overall, I haven't been this impressed with the “feel” of a piece of consumer electronics since I saw the original iPhone. Realizing that this is just version 1.0 is actually a little creepy — apart from the higher resolution display, I honestly can't conceive of how they'll improve on the design of the iPad mini.

They Don’t Love You Like Google Loves You

Maps in iOS 6 bring about the downfall of western civilization, and my disillusionment with tech commentary continues.

[Image: Dalí-esque Apple Maps rendering of San Francisco’s Japan Center]
Just days after the entire developed world sank into a depressive ennui due to Apple’s boring new smart phone, society was rocked to its foundations by the unmitigated disaster that is iOS 6’s new Google-free Maps app. Drivers unwittingly plunged their cars into the sea. Planes over Ireland crashed into nearby farms due to mislabeled icons. College students, long dependent on their iPhones to find their way around campus from day to day, were faced with a featureless blob of unlabeled buildings and had no option but to lie down on the grass and sob. Huge swaths of Scotland were covered with an impenetrable fog, and the Brooklyn Bridge collapsed.

Throughout the entire ordeal, Tim Cook only stopped giving the world the middle finger long enough to go on Saturday Night Live and rip up a picture of Steve Jobs. Jobs’s only recourse was to haunt the homes and workplaces of thousands of bloggers, commenters, and Twitter users, moaning “You’re the only one who truly understands what I wanted. Avenge me!”

At least, that’s the way I heard it. You want proof? It’s right there in the video, the one where they say “The Statue of Liberty? Gone!” while showing a picture of the Statue of Liberty. (Psst… hey, The Verge people — it’s that green thing in the middle of that star-shaped island). You think just because it’s “journalism” they have to have sources to show that it’s a serious, widespread problem? Check it, Jack: a tumblr full of wacky Apple maps mishaps.

And no, it doesn’t matter that the vast majority of those are complaints about the 3D Flyover feature, which was universally acknowledged as being a “neat but practically useless” feature of the maps app as soon as it was released, because shut up that’s why.

Of course, since I’m a relentless Apple apologist, I’m focused, Zapruder-like, on one tiny six-second segment of that three-minute long video: the part that says “For walking and driving, the app is pretty problem free.” And I’m completely ignoring the bulk of the video, which shows incontrovertible evidence that not everything is 3D modeled and lots of things end up looking kind of wavy.

Sarcasm (mostly) aside, my problem with this isn’t “oh no, people are picking on Apple!” My problem is that the people who are supposed to be authorities on tech — and to be clear, it’s not just The Verge, by a long shot — keep spinning the most shallow observations into sweeping, over-arching narratives. (And no, I haven’t seen a single Verge post about Apple in the past week that’s neglected to find a way to link to that 73-degrees-Apple-is-timid post).

The tech journalists are the ones who are shaping public opinion, so I don’t think it’s unreasonable to expect an attention span of longer than a week and a short term memory of longer than a year. And as a result, I’m going to hold them responsible every time I read something dumb on the internet.

To be fair, even though the video just says “Apple’s supposedly been working on its version of Maps for 5 years, and it’s resulted in an app that’s inferior to what was there before,” and leaves it at that, the article does mention that Google’s data has been public for 7 years. And points out that the data gets refined with the help of location data from millions of iPhones and Android devices.

But why make it sound as if the decision to break free from dependence on Google was an arbitrary decision on Apple’s part? By all accounts, Jobs had a preternatural grudge against Google for releasing Android. But it’s not as if the Maps app on iOS 5 and earlier was feature-equivalent to Google Maps on Android, and Apple’s deciding to roll their own was a completely petty and spiteful decision. Android’s had turn-by-turn directions for a while now, and there were no signs that it was ever coming to the Google-driven app on iOS.

Was that a case of Google holding out, or Apple not bothering with it because they knew they had their own version in the works? I certainly don’t know — it’s the kind of thing it’d be neat for actual tech journalists to explain — but it ultimately doesn’t matter. The licensing deal with Google ran out, so Apple’s options were to reassert their dependency on their largest competitor, or to launch with what they had.

And incidentally, whenever someone says “Steve Jobs would never have allowed something this half-assed to be released!” it’s the tech journalists’ responsibility to remind them that the iPhone was released without an SDK and nothing but Apple’s assurance that Web apps were The Future. Or that Jobs had no problem releasing the original iPhone without support for Flash video, even though there was an outcry that Flash was crucial to the user experience.

I installed iOS 6 on the iPad and tried out a few practical searches. It found everything I needed, and it actually gave me more relevant information than I remember the Google version giving me, since I was looking for businesses, and Yelp automatically came up with business hours. Of course, my experience means very little, since I happen to live in the one part of the world that’s going to be most aggressively tested by Silicon Valley companies. I have little doubt that Europe and Asia are going to have a harder time of it, and obviously they’re not markets to be ignored. But it’s not a one-size-fits-all problem, so it’s silly to treat it like one.

Apple has no problem calling Siri a beta, so they probably should’ve called Maps a beta as well. It’s a huge part of why people use smartphones, so it’d be foolish to imply that serious inaccuracies are no big deal. Regardless, it’ll work well enough in a lot of cases for most Americans, and in the cases where it doesn’t work, the web version of Google Maps is still available (and you can set up a link on the home page with its own icon, even). Maybe Google and Apple will reach enough of a détente for a third-party Google Maps app to get released. Maybe it’ll even bring turn-by-turn directions, or Apple will finally allow third-party apps to be default handlers for links!

Until then, maybe we can stop with the melodrama and the hyperbole, and just appreciate Apple Maps as version 1.0 mapping software with a neat extra feature of peeking into an alternate reality where Japantown in San Francisco has been overtaken by giant spiders.

Everything is amazing and nobody’s insightful

Tech writers are disillusioned with the iPhone 5, and I’m getting disillusioned with tech writers.

I understand that if you’re writing about technology and/or gadgetry, a significant part of your job is taking a bunch of product announcements and reviews, and then fitting them into an easily-digestible, all-encompassing narrative. Usually, though, there’s at least an attempt to base that narrative on reality, “ripped from today’s headlines” as it were. Lately, it seems like writers are content with a story that’s loosely based on actual events.

For the week or so building up to the iPhone 5 launch and the days after, the narrative has been simple: “The iPhone 5 is boring.” Writing for Wired, Mat Honan says, essentially, that it’s boring by design. And really, fair enough. Take Apple’s own marketing, remove the talking heads on white backgrounds, remove the hyperbole, and give it a critical spin, and you’ll end up with basically that: they’ve taken the same cell phone that people have already been going apeshit over for the past five years, and they’ve made it incrementally better.

Take that to its trollish extreme, and you have Dan Lyons, who wrote for Newsweek and created the satirical “Fake Steve” blog in which he pretended to be Steve Jobs. In a “provocative” take on Apple’s past year, including the iPhone 5 announcement, written for BBC News, Lyons spends a couple of paragraphs reminding us why we should care about his opinion (it’s because he was “Fake Steve”), mentions that he dropped the blog out of respect for Jobs’s failing health, and then invokes Jobs’s memory several times, all in an “analysis” of Apple that’s as shallow and tired as you can possibly get without actually typing Micro$oft. He actually uses the word “fanboys.”

We can all acknowledge that, give him the attention he needs, and then move on; there’s absolutely nothing there that wouldn’t get him laughed off of an internet message board. Lyons doesn’t even have the “I speak for Steve Jobs” thing going for him, since everybody has an opinion on how things would be different if Jobs were still in charge.

What’s more troubling to me is seeing writers who are usually worth reading instead take a similar approach: building a story that’s driven mostly by what other people are writing. Any idiot can regurgitate “news” items and rearrange them into a cursory non-analysis. (And that’s worth exactly as much as I got paid for writing mine, which is zero, in case you couldn’t tell already.) Is it too much to ask for insight? Or at least to wait and see what the actual popular consensus is before making a declaration of what popular opinion should be?

If It Ain’t Broke, Fix It Anyway Because It Has Grown Tiresome to Me

On The Verge, Dieter Bohn found a perfect analogy for the iPhone in its own Weather app icon. He turned Honan’s piece from a blog post into the “prevailing opinion” about the announcement. But then he takes Honan’s reasonable but back-handed compliment and turns it into an accusation: the iPhone isn’t boring but timid. Sure, the hardware’s fine, but whatever: where Apple has failed is by showing no willingness to experiment with the core operating system or UI.

The mind-numbingly tedious drudgery of having to close multiple notifications under iOS (which, incidentally, I’ve never once noticed as a problem) proves that Apple’s entire philosophy is a lie. You’ve got a reputation for “sweating the details,” Apple? Really? Then how can you possibly explain this?! — as he thrusts a phone full of email notifications into the face of a sheepish Tim Cook, while Jobs’s ghost shakes his head sadly.

I honestly don’t want to be too harsh with any of the writers on The Verge, since I visit the site multiple times daily, and I really like their approach a lot. But it doesn’t take much insight to notice that interface consistency isn’t just a dominant driving factor of UI design in general; it’s been core to Apple’s philosophy since the introduction of the Mac. We’ve been using “version 10” of the Mac OS for 10 years now, and while the appearance and the underlying hardware have changed dramatically, the core functionality is largely the same. Intentionally. It’s only with the still-new Mountain Lion that Apple’s made changes to how things work on a fundamental level — and, in my opinion, it hasn’t been all that successful. (I don’t understand all the people whining about faux leather while saying relatively little about changing the entire structure of document management for the worse).

On top of that, though, there’s the fact that The Verge has spent at least a month covering the Apple v. Samsung trial, in which Apple was spending a ton of time, effort, and presumably money to defend the UI that Bohn claims needs a dramatic overhaul. Yes, Microsoft has done considerable work to dramatically rethink the smart phone UI. That’s how Apple wants it. They spent a month saying, “this is ours, we made it, we like it, stop copying it.” Could it use some refinement? Of course. It always can, and some of those good ideas will come from other attempts at inventing a new cell phone OS. Does it need a dramatic re-imagining? No, unless you’re irrationally obsessed with novelty for its own sake, as a side effect of being steeped in cell phone coverage 24/7 to the point of exhaustion.

The Audacity of Swatch

Speaking of that, there’s Nilay Patel’s write-up of the new iPod Nano. Again, I think Patel’s stuff is great in general. But here, he carries on the “boring but actually it’s timid” idea by linking to Bohn’s article, and then goes on to build a story about what it says about Apple in general. Essentially, they’ve become The Establishment, too afraid of change to take any risks. With a product that’s changed dramatically in design in just about every iteration since the original.

“But that’s the old market, and the old way.” Apple isn’t about profiting from the planned-obsolescence and impulse-purchase cycle — which is news to all of us who have now become conditioned to buy a new cell phone every 2 years and a new laptop every 3 or 4 — but about pioneering new markets. The last iteration of the nano supposedly heralded a future of wearable computing; it could’ve been the harbinger of a bold new market for a more forward-thinking Apple: wristwatches.

Let’s ignore the fact that Patel himself acknowledges that the nano wasn’t a very good watch in the first place. What about the fact that smart phones pretty much killed the entire wristwatch business? I’m about as far from being fashionably hip as you can get, but I still get considerable exposure to what’s actually popular just by virtue of living in San Francisco. And I don’t know anyone who still wears a watch. It’s been at least five years since anyone’s been able to ask for the time and not have to wait for everyone to pull their phones out of their pockets or handbags. Why would Apple go all-in on a market that they themselves helped destroy?

(Incidentally, Patel quotes watch band designer Scott Wilson as saying “The nano in that form factor gave me a reason to have three iOS devices on my body.” I can think of the iPhone and the iPod Nano-with-watchband; neither Wilson nor Patel makes it explicit what the third device is. And now I’m afraid to ask, because I’m not sure I want to know what this third device is exactly, or where a person would enjoy sticking it).

Saying that the MP3 market is dead fails to acknowledge what killed it: that functionality, along with that of the point-and-shoot camera, has moved away from a dedicated device and towards the smartphone. Smartphones are expensive, even with a contract, and the more info we put onto them, the more they become irreplaceable. There’s still a market for a smaller, simpler, and relatively inexpensive MP3 player. There’s a clue to that market on Apple’s own marketing page, and the most prominent icon on its home screen: “Fitness.” Joggers and people who work out — at least from what I’ve heard, since I have even less familiarity with the world of exercise than I do with people who still wear wristwatches — want a music player they can take for a run or take to the gym without worrying too much if it gets lost or broken. They’ll get more use out of that than from a too-large wristwatch that has to be constantly woken from sleep and needs a headphone wire running to your wrist if you want to listen to music.

That’s where the new market is, ripe for Apple to come in and dominate: stuff like the fitbit. I don’t have the actual numbers, of course, and I don’t even have any way of getting them, but I can all but guarantee that Apple sold more of the iPod nano armbands than it ever did with watchbands. And I imagine it’s the same philosophy that made them put a place for carrying straps on the new iPod touch: it’s not even that Apple doesn’t want to take risks with its flagship product, it’s that customers don’t want to take risks with the one device that handles all their communication with the world. For them, the iPod is an accessory.

Speaking of Consistency

But if you’re going to be making up stories, you should at least try to be consistent with them. On Slate, Farhad Manjoo has some serious issues with the new dock connector. He repeats the idea that the new iPhone is boring, but he uses the magic of sarcasm to make his point extra super clear. The problem is that so many details about the new phone leaked out weeks before release. By the time of the actual announcement, the world had already seen everything and stopped caring.

Got that? Apple’s problem is that it’s got to keep a tighter lid on its plans. Classic Apple, going blabbing about everything all over the press, flooding the market with product announcements. It’s boring everyone!

He’s bored by all the leaked information and the lack of any big bombshells in Apple’s announcement. Except for the big bombshell of the new dock connector. It’s pretty boring but also very impressive. It doesn’t take any risks but completely and unnecessarily changes the dock connector, destroying an entire industry of accessories. Apple has a long history of invalidating what it deems outdated technology, but this is different. Manjoo has to get a new alarm clock.

Also, the new phone is remarkably thin and light, “the thinnest and lightest iPhone ever made, and the difference is palpable.” But how could Apple possibly justify changing everything just for the sake of this new, thinner dock connector?

Back on The Verge, Vlad Savov describes all the leaks that bored Manjoo, and he mentions how embarrassing they must be for Tim Cook. He says that the problem is all the points of potential leaks in the supply chain, which have “drilled holes in Apple’s mystique.” In the same article, Savov links to The Verge’s own post about every detail that was leaked ahead of time.

Isn’t it a little disingenuous to be on a site that publishes front-page posts of pictures of new iPhone cases, a detailed report of the new dock connector, and a compilation of all the rumors and leaks to date, and then comment on the unprecedented demand for leaked information? It seems a little like prom committee chairman Edward Scissorhands lecturing everyone on their failure at putting up all the balloons.

And of course, on the first night of pre-orders for the boring new iPhone that nobody’s interested in, Apple’s online store was overloaded, and the phone sold out of its initial order within two hours.

On the topic of pots, kettles, and their relative blackness: I wasn’t that interested in the new iPhone, and I still chomped on the pre-order like a starving bass. I was much, much, much more excited about finally ditching AT&T than I was about the device itself. (So eager to get rid of AT&T that I’m willing to run into the arms of a company that’s no doubt almost as bad). Now that Apple’s not talking about magic, I can take their advertising at face value: I’m pretty confident that it is the best iPhone they’ve ever made. I’ve got an iPhone, and I like it, so I’ll get a better one.

What does that say about the state of gadget obsession that “I’m going to buy a new expensive device every two years even though I don’t technically need it” comes across as the most pragmatic view?

I can still remember when I first saw Engadget, and I thought the concept was absurd. A whole ongoing blog devoted solely to gadgets and tech? Is that really necessary? Then, of course, I got hooked on it, and started following it and now The Verge and quite a few Apple-centric sites. If I’ve reached a point of gadget saturation just reading the stuff, what does it do to the folks having to write about it? It seems like it’s creating a self-perpetuating cycle of novelty for its own sake, which then drives commentary about this bizarre fixation on novelty for its own sake. You can’t even say “maybe just step away from it all for a bit;” it has to be a stunt, completely giving up the internet for an entire year while still writing for a technology-oriented site.

Whatever the case, it’d probably be a good idea to step back a bit. I’ll start doing that just as soon as I get my sweet new extra-long phone next week.

Like I’m going to read in direct sunlight with a $600 tablet. COME ON!

Yet another attempt to figure out the mindset of people blinded by glowing Apple logos.


Not even 24 hours after Apple’s new iPad announcement, John Gruber at Daring Fireball resumed his vicious assault on female tech bloggers by quoting “Apple’s Press Conference Showed a Brand Unraveling” by Jolie O’Dell at VentureBeat.

It’s an op/ed that says there were no major problems with the presentation, just “a few minor but glaring inconsistencies” that were worth spending several paragraphs describing in context and explaining how they foretell the imminent downfall of Apple Inc. as we know it. For want of a tucked-in shirt, the $540 share price was lost.

The article’s actually not much worse than the bulk of the tech punditry circling the product announcement. Sure, it does try to make the case that Apple is falling apart after Jobs’s death, and it does so by making spurious comparisons between products released now and products released when Jobs was already no longer CEO of the company. I suppose it’s less compelling to acknowledge that it’s been a couple years now since Jobs was in charge of day-to-day operations, or to point out that Apple hasn’t actually released an industry-changing product every year since Jobs took over.

And it’s easier to write:

Last time Apple was without Jobs, it came out with a lineup of duds.

as long as you conveniently forget about the iPod Hi-Fi.

But I guess it’s inevitable for a charismatic leader of a company to get praised for all his successes while the not-quite-successes get conveniently ignored. I just hope that it doesn’t reach Disney fan intensity, where 55 years from now we’re still having to hear “What would Steve think?” And I hope that people, even people surrounded by tech “news” all day, still have enough of a handle on reality to recognize how silly the complaints are.

O’Dell complains about the word “resolutionary” as something Jobs’s perfectionism would never have allowed. I think it’s goofy, but no goofier than anything else Apple marketing has done in the past 15 years. Maybe it’s just a case of their thinking differently.

One thing O’Dell doesn’t complain about, although it seems just about everyone else has, is that the announcement was just a “modest” or “unremarkable” update. As if it’s no big deal that they were able to quadruple the pixel count of the iPad screen. Except the entire device is a screen. People have apparently forgotten that just a few weeks ago, the speculation was that a “retina” display on the iPad would be kept to a much more expensive “HD” model. I’ve got to wonder whether releasing the new model without a significant price bump somehow undermined what an achievement it is to get that kind of pixel density on a mobile screen.

I’ll admit I was getting a little excited about the rumors of haptic feedback, even though they were based on pretty implausible speculation (all that just from the words “and touch” on an invitation for a touch screen device?) But that’ll probably come in the 7th or 8th revision of the iPad.

Which will apparently still be called the “iPad.” And we’re all supposed to be upset about that, for some reason. I’m not just singling out O’Dell here, either; this is something several people are actually complaining about.

O’Dell says that calling the new version “new iPad” is an inconsistency in branding that wouldn’t have been allowed under Jobs’s reign, even though it’s the weird iPhone naming pattern that’s inconsistent throughout the line of Apple products. Did you remember that the iPhone 3G is actually the second version of the phone? Followed by the iPhone 3GS? And the iPhone 4, which was actually the fourth iteration of the phone and not to be confused with “4G” cellular networks? And the 4S, which was the fifth iteration but dropped the “G”. Not since SimCity has a franchise shown such a reckless disregard for numbering.

O’Dell gets it right by saying (the obvious) “Likewise, the Apple brand stood for beauty in simplicity.” What could be simpler than “iMac,” “iPod,” “iPhone,” and “iPad”?


What struck me the most about the article, though, was this bit:

But Apple’s ethos is about so much more than hardware and technology: It’s supposed to be, as this outsider sees it, about aspiration, dreams, desires, the future, even Utopia. In a word, it’s only 30 percent about the tech and 70 percent about the branding.

(psst… “it’s only 30 percent about the tech and 70 percent about the branding” is 13 times more than “a word.”)

I’ve seen this claim made hundreds of times over the years, but this is the first I’ve seen it made by someone speaking favorably about Apple (Steve Jobs-era Apple, anyway), instead of being followed by complaints about the “Apple tax” or intellectually bankrupt words like “sheeple” and “fanboys.”

I’m assuming (and I’m being charitable in the assumption) that it’s rooted in a misinterpretation of a talk Jobs gave about branding around the time of the “Think Different” campaign launch. But the point of that wasn’t that branding is more important than technology. The point was that the company’s core values are more important than specifications and speed bumps.

At the time, even the idea of a tech company having “core values” was unusual. The environment then was a lot like what the various Android phones and tablets do today: trying to differentiate themselves by having 4G LTE and Ice Cream Sandwich with an AMOLED screen and a 1.5GHz single-core processor, instead of focusing on what you can actually do with them.

Pointing out that the new iPad has a higher resolution screen is talking about specs. Launching the new higher resolution screen along with a mobile version of iPhoto, showing how the better screen, faster wireless networking, and cloud storage can help you organize and share your photos as journals — that’s Apple branding. And “iPad HD” or “iPad Retina” or even “iPad 3” is diametrically opposed to that branding. Saying “The iPad is the best tablet you can buy, and this is the best version of the iPad, and hey look at this happy family and their adorable children” fits the brand perfectly.

It’s been going on for well over a decade, but it still surprises me whenever I see someone making the claim that Apple’s appeal is mostly marketing. So much tech writing describes MacBooks, iPods, iPhones, and iPads as “status symbols,” taking it as a given that people buy them for the huge, shiny (or glowing) Apple logo on the back as opposed to what’s inside. That kind of knee-jerk reaction is baffling to me, and I’m someone who often has a hard time getting past the preconceived notion that anybody who drives a BMW is a douchebag.

Every Apple computer I’ve ever bought has turned out to be the best computer I’ve ever owned. (Except for the mice; the mice all universally suck). Every time I’ve tried to go with a Windows PC to save money, or to get some feature that’s not available on the Apple equivalent, I’ve gotten burned — burned enough that I’ve actually lost money in the transaction. I couldn’t care less whether it says Apple on the outside, as long as it works as well as I’ve grown accustomed to expect. Saying that it’s “only 30 percent tech” is pretty ludicrous, when no other company handles the technology as well.

Are there really people who buy these things for the logo, or because Steve Jobs told them to?

iLife

Piecing together the obituaries and eulogies of Steve Jobs makes it clear that his impact wasn’t just reality distortion

I try to stay wary of Apple’s marketing lingo: as much as I like using the iPad, it’s not “magical;” and for all the Apple-branded products I have scattered around the house, in various states of obsolescence but each one the best device I’ve ever owned, I’d never describe any of them as “insanely great.”

But Apple’s brief tribute is absolutely right in calling Steve Jobs “visionary.”

There were plenty of obituaries and eulogies popping up across the internet within minutes of the official announcement of Jobs’s death, most of them ghoulishly composed right after his resignation (if not sooner) and polished off with an edited date and time. There were a few insightful ones as well, the best I’ve seen being Slate’s analysis of the wide reach of Jobs’s vision and a more personal thanks from Stack Overflow on behalf of all computer programmers.

But the best obituary was provided by Jobs himself, in his commencement speech at Stanford in 2005. You have to wonder whether, even at the time, he was aware he was delivering what would become the best summation of his life, not content to let other people handle it.

That wasn’t the first time Jobs provided his own retrospective; the Think Different campaign was every bit as much about Jobs’s own philosophy as it was about a computer brand. Jobs says as much in that video. And that ad campaign is a better testament to his legacy than any number of rote obituaries checking off his career achievements.

It may seem crass to associate a life’s work with a product marketing campaign, but I think it’s an outstanding symbol of Jobs’s vision, that his public life and his ideals are so inextricably linked with the Macintosh. It’s because of Steve Jobs that we can even think of computers and mobile phones as having “ideals” at all.

Even the tired criticisms of Apple echo the criticisms of Jobs himself. People decry Apple devices as overpriced status symbols, while most of us who depend on Macs and iPhones use them simply because they do everything we want and do it well. People criticized Jobs for being arrogant, stubborn, and sometimes ruthless, while he consistently described his perfectionism as a desire to reject the less-than-perfect in favor of making something that would genuinely change the world.

People are quick to point out that technologies existed before Apple used them, or that other devices have better technical specs — more slots, faster processors, more “open” technologies. But Steve Jobs’s greatest achievement was staying true to a holistic view of computing: individual specs aren’t as important as how they all work together. Technology isn’t the focus, what you do with technology is the focus. Xerox PARC first developed the GUI. But would Xerox have produced MacPaint and HyperCard?

It was the work of hundreds of hardware and software engineers, industrial designers, and graphic artists, not just Steve Jobs, that “invented” the Mac, iMac, PowerBook, iPod, iPhone, and iPad. But without Jobs’s dogged fixation on Apple’s core philosophy, they never would’ve come together as an integrated product line — not a phone, or an MP3 player, or a computer, but a line of technological products that could inspire you and enable you to make something great.

Getting that right once or twice could be dismissed as a fluke. Getting it right over and over again can only be genius. And it’s only by “connecting the dots” over Jobs’s career that you can see the remarkable consistency and devotion to that philosophy. How much did he influence the direction of Pixar, for example? It’s always a mistake to give too much credit to one person, but then you have to realize: Pixar was the studio that developed the most advanced computer animation and put it to use not as pure spectacle, but for storytelling. Again, it’s not the technology that’s important.

I never regarded Steve Jobs as a hero, and I barely knew anything about him before I read the retrospectives after his resignation from Apple. By most accounts of his management style, I would’ve hated working for him. I tend to be annoyed at the level at which people worship him. And I absolutely reject the ideal of the auteur, and I’ve seen far too many cases of people being treated poorly for the sake of staying true to one man’s arrogant “vision.”

And still, I’m more profoundly affected by the news of Steve Jobs’s death than I’d expected to be. His arrogance doesn’t seem just dogmatic, but inspirational: not just for the people making the computers, but for all of us using them. And “think different” no longer seems like just an opportunistic marketing plan to inspire people to buy computers and cell phones; but a genuinely-felt philosophy intended to inspire us to do great things with them. Maybe Jobs’s greatest achievement was understanding that business and art don’t have to be mutually exclusive.

I don’t think it’s an exaggeration to say that Jobs invented the personal computer. And I’ve only just recently started to have fleeting moments of awareness of how profound that is: getting directions from my cell phone while I’m listening to music after just playing a game or reading an article, and having the sudden realization that I’m living in the future.

From now on, when I watch Apple ads, I’ll try not to see ethnically-diverse models on skiing trips or vacations to Paris, or hear the carefully-selected focus-tested music in the background as actors pretend to be a father talking to his wife and daughter. Instead, I’ll try to appreciate the bigger picture, and understand the vision Jobs wanted us all to see: friends and families using innovation to make their lives better.

L-i-t-t-l-e M-o-n-e-y

Final Fantasy Tactics and the bizarro psychology of Apple App Store pricing

As we all know, Final Fantasy Tactics is the best video game ever made. In the thirteen (!) years since it was released, I’ve been looking for other games that hit all the right notes as well as FFT did, with no luck. Plus I’ve been looking for rereleases as an excuse to buy it again, in the hopes that I could play through once more as if it were the first time.

Which is why Square’s announcement that it was going to be released on iOS was exciting: sure, I’ve still got a version — two versions, actually, since I got the PS1 Greatest Hits release way back when — that runs on the PS3, and I bought the PSP rerelease a while ago. But here was a chance to play it on a machine I actually use!

We were all warned well in advance that there’d be separate versions for iPhone and iPad, and not only did I not complain, I thought: even better! I get to buy it two more times, twice the chance to reaffirm how much I like the game. Once you reach a certain age and a certain level of Western entitlement and media saturation, buying a copy of a game or a movie becomes less about getting access and more about saying “I liked this enough to spend money on it.”

What I hadn’t been warned about, though, was that the iPhone version would be sixteen dollars.

Even the “prestige” titles for iOS max out around five dollars, with the super-fancy or particularly lengthy ones going as high as ten. Sixteen bucks for an iPhone game is outrageous.

That was my reaction to the price, even though I’d already paid $20-$40 for the game without a second thought, three times over. Even though it’s my favorite game, and I know that I can get at least 30-40 hours of play from it. And even though I’ve done enough iOS development to realize that developing for the platform can be every bit as time- and asset-intensive as developing for PCs and consoles. I’d become part of the race-to-the-bottom problem without even realizing it.

The two aspects of the App Store that have usually justified the lower pricing are: apps and games are smaller and simpler, so there’s a much lower barrier to entry; and the market penetration got so huge so quickly that you could sell an app to less than 1% of iOS users and still make a sizable profit.
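
To put a rough number on that second point, here’s a quick back-of-the-envelope sketch. Every figure in it is an assumption picked purely for illustration, not something from this post; the only real number is Apple’s standard 30 percent App Store commission.

```swift
// Rough illustration only: how a tiny slice of a huge install base can still
// add up. The install base, sell-through rate, and price are assumptions.
let installBase = 200_000_000.0   // assumed number of iOS devices in use
let sellThrough = 0.01            // "less than 1%" of those users buying
let price = 4.99                  // a typical "prestige" app price point
let appleCut = 0.30               // the App Store's standard commission
let netRevenue = installBase * sellThrough * price * (1 - appleCut)
print("~ $\(Int(netRevenue)) before taxes and development costs")  // ~ $6,986,000
```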

Neither of those are true of Final Fantasy Tactics. Even though it’s a port of a 13 year old game, it’s still a pretty huge game with a ton of assets, not to mention a redesigned input system. And even though it’s spoken of in hushed tones as one of the greatest games ever made, it’s way too niche a game to reach even Plants vs Zombies-level sales. And it’s worth pointing out that the iPhone version is still cheaper than the PSP remake from a few years ago.

It’s a bizarre market to get into. The traditional rules of “charge what it’s worth” don’t seem to apply to the App Store; it’s become more a gamble, hoping that you can appeal to a large enough tiny fraction of the iOS market to recoup your lower production costs. On the one hand, that’s horrifying, as it creates a gold rush mentality of making unambitious and derivative games that are just “mainstream” enough to be another Angry Birds. On the other hand, it’s part of what makes the platform appealing: even with more and more huge corporate monstrosities (like, well, Square-Enix, I suppose) barging in and trying to dominate, it’s still egalitarian enough that a one- or two-man operation can make something novel and see it not only compete with the bigger guys, but surpass them.

In the fifteen years since I got into game development, it’s the closest I’ve seen to a creator-driven, “great American novel” environment in games. I know I’d never have even considered “going indie” if my only options had been PC or console releases. (I’m not even sure a one-man operation can release something on XBLA or PSN anymore). Now it feels like I’ve actually got a chance to recoup my minimal investment.

Assuming of course, I don’t waste all my time playing Final Fantasy Tactics. It’s a shame that it’s the War of the Lions release, since the more earnest translation lost a lot of the charm of the weirdly-translated original. Ah well, Life is short: Bury! Steady Sword!

140 Characters Plus an Internet

Twitter for iPad is another one of those apps that make you think all the talk about the iPad being the future of computing wasn’t just hype.

The official Twitter app finally went universal with its iPad version yesterday, pushing the iPad one step closer to being my most useful computer. (Next milestone: the OS 4.2 update, and a good blogging client).

I’ve been using Twitterrific, and its iPad version really is great, but it was understood between the both of us that I’d be jumping ship as soon as Atebits released its app. Tweetie basically defined what features a desktop Twitter client should have, then did it again on the iPhone version, and now once again on the iPad. There’s a reason Twitter bought it as its official client — not necessarily because it’s the best one, but because it’s the best one for what Twitter wants the service to be.

I’m not interested in writing a review, because there are already dozens of reviews out there (a lot of us were waiting, apparently), and because the app is free. If you’ve got an iPad and use Twitter, there’s no reason not to download it. My review is just that “hey, it’s great.” What’s interesting to me is how much thought went into the design of the app, and even more importantly, how significantly the design of one app can change how I perceive the entire device.

A lot of people seem to be dismissing the new approach as “nice UI touches” (or alternatively, “annoying gimmicks”). And the gesture stuff — pinching and two-finger dragging — is pretty gratuitous. But the big change isn’t just a new, slick, presentation. The change is the notion that absolutely everything in the app has context.

Everything you tap on causes a new panel to slide out with more information. There’s no new information here that you couldn’t get via the older clients, but the app is constantly making predictions about what you’ll want to see based on the content of the tweet — single tweets open the user’s profile, replies display the entire conversation, tweets with a photo link open the picture, tweets with a hashtag do a search on all the other tweets with that hashtag. Since none of the information is all that new, it may not seem like that big a deal. What formerly took two or three clicks now just takes one tap. But in practice, it feels like a leap from mid 80s text-chat technology to the bridge of the Enterprise.
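
Just to make that idea concrete, here’s a minimal sketch in Swift of what that kind of content-driven dispatch might look like. The types, the names, and the fallback order are all my own inventions for illustration; none of this is from Twitter’s actual code.

```swift
import Foundation

// Hypothetical model of a tweet, reduced to the attributes that matter
// for picking a context panel.
struct Tweet {
    let author: String
    let text: String
    let inReplyToID: String?   // non-nil if the tweet is part of a conversation
    let photoURL: URL?         // non-nil if the tweet links to a photo
    let hashtags: [String]
}

// The kinds of panels described above, sliding out when you tap a tweet.
enum DetailPanel {
    case profile(username: String)
    case conversation(tweetID: String)
    case photo(URL)
    case search(query: String)
}

// Pick the most specific context for a tapped tweet; the fallback order
// here is arbitrary, chosen just to show the dispatch.
func panel(for tweet: Tweet) -> DetailPanel {
    if let url = tweet.photoURL {
        return .photo(url)
    }
    if let replyID = tweet.inReplyToID {
        return .conversation(tweetID: replyID)
    }
    if let tag = tweet.hashtags.first {
        return .search(query: "#\(tag)")
    }
    return .profile(username: tweet.author)
}
```

The real app obviously does far more than one function’s worth of branching, but the principle is the point: the content of the tweet, not a separate menu, decides what you see next.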

I’m still not sure anyone really gets what Twitter is, exactly — even Twitter doesn’t, or they wouldn’t be asking “How do you use it?” A lot of people, myself included, have always seen it as just instant messaging for lonely narcissists. I’ve got lots of interesting things to say about the state of my beard and bowel movements that are far too boring to tell a single real-life friend, but are just perfect for sharing with hundreds of strangers. As a result, Twitter clients have always tended to look like IM clients. That’s why I believe that, if you think of Twitter as global public IM, Twitterrific is still the best client for it.

But lots of other people, who are every bit as boring as I am but a billion times more famous, are using it for advertising or self-promotion. That’s where any hope of monetizing the service comes in, and that’s (I’m assuming) why the official Twitter app emphasizes the external content in tweets instead of just the text itself. When you first start the iPad version, the main timeline (what used to be the focus in older clients) looks awkwardly small on the screen. As soon as you start scrolling and tapping, though, you can see what the designers want the Twitter service to be: a stream — or, I suppose, firehose — of information.

The only feature from Twitterrific that I miss is a quick and easy way to look at a person’s profile and find out if they’re following you. I can imagine that’s intentional, too — they’re not pushing individual conversations as much as individually tailored public streams of news and links. Part of the appeal of Twitter is that contacts are asymmetric, not fake “friends”: if someone’s saying stuff you want to hear, it shouldn’t really matter whether or not they want to hear what you have to say. But that’s about the only place so far where Twitter’s enforced idea of how I should be using their service has been an annoyance.

The rest of the time, I’m just impressed by how dense the average Twitter feed is, all the stuff streaming by that I never bothered to click on before. And impressed by how the iPad app just seems to know what I want to look at. Presenting relevant information automatically instead of making you look for it seems like just a convenience (or annoyance, depending on how slow your internet connection is). But the more I use the iPad Twitter app, the more I get the sense that this is exactly the kind of presentation that will make tablet computers come into their own.