Some of the most exciting applications of AR in gaming have nothing to do with having Kratos in your kitchen
I read a column by Brendan Sinclair where he suggests that AR for gaming doesn’t have much of a future beyond the novelty factor. On Mastodon, he was even more blunt, suggesting that AR is good at attracting venture capital but will inevitably run the same course into underwhelming reality that current-day VR has.
My overall take on that column is that Sinclair makes short-term observations that I entirely agree with, and then draws conclusions that I think are myopic and unimaginative. To be fair: the column is about games as business, which is all about analysis of existing products more than speculation about the future, and Sinclair acknowledges as much in the column.
For instance: it’s tempting to point to Pokemon Go’s success as a sign that AR is a potential gold mine, but as Sinclair points out, that game was successful because of its IP and its geolocation more than its AR functionality. We’re in agreement there, but I disagree that you can extrapolate much about the viability of AR games from that.
Pokemon Go’s AR element was doomed to be uninspiring (in my opinion at least) for two reasons: first, it had to be compatible with a broad range of devices, which limited it to the lowest-common-denominator in terms of AR functionality. Second, it had to work with a game that was literally designed to be played anywhere on earth, with zero predictability in terms of environment.
The main takeaway I got from Sinclair’s column is that most people’s thinking about the realistic potential of AR and VR — including my own! — has been both defined by the limitations of existing implementations, and also set to an impossibly high standard.
The devices — and by extension, experiences — that we’re familiar with have all been limited by necessary compromises: some of them because the tech just isn’t there yet; some of them because companies rushed products to market before they were fully baked; and some of them because the devices were intended to be prototypes, to generate ideas about what the future of VR or AR could be instead of presenting any finished and polished technology.
And to be clear, when I talk about products being rushed to market before they’re “ready,” I don’t think it’s entirely sinister. I fundamentally disagree with Meta’s overall take on VR, for instance, but I do think it was a reasonable decision to emphasize lower cost and wider adoption of headsets over the absolute best and most expensive technology. (In retrospect, I don’t agree with Google’s versions, though, even though I thought they made sense at the time. Sure, Google Cardboard and Daydream technically brought the potential of VR to more people who otherwise wouldn’t be able to afford it or be interested in it, but they also set expectations impossibly low for what VR could be.)
But it’s also put us in a weird position in which current implementations of VR haven’t lowered the bar, but raised it. In other words, for some reason, it’s not enough just to fix the problems with the existing technology. Supposedly, we have to make it perfect, or it’s not worth pursuing at all.
Other people’s speculation about the Apple Vision headset and platform, and what we can learn from M3GAN.
I like Mastodon quite a bit, but for whatever reason, the bad takes on there do triple psychic damage on me compared to other places. Monday during the WWDC presentation felt like a cavalcade of performative anti-capitalism, and it was exhausting being reminded so forcefully and so frequently that I was a corporate shill for not thinking that everything on display at Apple was complete bullshit. (I have no doubt that there were a tiresome number of ride-or-die Apple fans gasping at the wonders on display, but I feel like it’s a lot easier to just shrug and let them have their thing.)
Also, I don’t know WTF AR/VR headsets have to do with heat pumps, but one YouTuber took to Mastodon to scold all of us for being more excited about dinosaurs coming through magical portals into our living rooms than about efficient HVAC systems. I hope you’re all proud of yourselves for murdering our only planet Earth!
Anyway, there were a couple of takes that I thought were pretty interesting, from people who’ve actually tried out the Apple Vision Pro headset.
One was a first impressions video from Marques Brownlee, which I liked because he’s able to talk about this stuff with the healthy skepticism of somebody who’s seen a lot of devices pushed by companies as being the next big thing, but isn’t too jaded to say when a new piece of technology feels “magical.” If we can’t get excited about this stuff, and instead are doomed to just wallow in minimum-viable-product-induced cynicism, then what’s the point in following it at all? There are plenty of parks outdoors that one might enjoy.
Another was this post from Ben Thompson on his Stratechery blog, where he quotes himself from Twitter but is otherwise pretty insightful. He has some interesting observations about mimicking AR with a headset that is technically VR (instead of using actual glasses like Google Glass, for instance); and he also takes some time to guess where Apple might be positioning the Vision product line, based on what they’ve done in the past with the Mac, iPad, and iPhone.
To me, the common thing that stood out in both was the emphasis on the personal and societal implications of this device, beyond its technology.
MKBHD said he didn’t like the demo of 3D photos at all, and he thought it was too creepy to imagine a dad wearing this headset to take video of his kid’s birthday party. Thompson says that he felt that Apple’s demos seemed sad and lonely; a man looking at photos or watching videos of family events while alone on a couch suggested “a divorced dad, alone at home with his Vision Pro, perhaps because his wife was irritated at the extent to which he got lost in his own virtual experience.”
I think it’s fascinating that so much of the conversation about the device is already about questions like these, because it suggests that a dramatic shift is imminent. And that the difference between this and previous dramatic shifts in technology and society is that we can see this one coming.
A good bit of Apple’s presentation was reminiscent of Black Mirror. I think it’s telling that the reports from people who’ve used the Apple Vision headset (in a controlled demo environment, of course) haven’t objected to the WWDC video as looking faked or staged. It really does look like it does in the videos and advertisements. By every account that I’ve seen, the team at Apple has actually achieved the tech shown in their demo videos, and it’s at the level of near-future sci-fi concept art.
“Congratulations on achieving the most cynical and dystopian vision of a bunch of clinically depressed British television writers!” is, admittedly, an odd take. But I see it as optimistic, because we’re talking about technology not just in terms of what it can do, but in terms of how it feels and how it can affect us on a societal level.
I vividly remember being in an AT&T Store and seeing the first iPhone: I was so impressed by the screen that I thought it had to be a pre-printed sticker, since no color handheld display could possibly be that sharp (I ask that everyone take that as an indication of how quickly display technology has improved, not as an indication of how gullible and easily-impressed I am), and it would be amazing to have a portable touchscreen that was always connected to the internet. I had no idea how fundamentally it would change things, and how much it would creep into my everyday life.
Here, we have a better idea of the implications, and ideally, we can steer it in the right direction.
And even better, the conversation is already about the right things. Back when the iPad was released, for instance, the conversation was dominated by either “lol it sounds like maxipad!!!!” or “what’s the big deal, it’s just a big iPhone.” I didn’t see anyone with the foresight to predict what it’d mean for increased screen time for kids, or the implications of the App Store model on a more general-purpose computing device than a smartphone. (In other words: there have always been people complaining about the “walled garden,” but they always focused on principle instead of practicality. I want my primary and essential communication device to be safe from malware, even if it means I lose some control over what I can put on it.)
One of the (many) frustrating things about Facebook/Meta buying themselves into the role of Stewards of VR is that they’ve controlled the messaging about what VR can be capable of, and what is and isn’t important, and they’ve often been clumsy about it. A prime example is the whole non-issue about avatars not having legs. The reality is that without a bunch of extra cameras or sensors — which would be a complete non-starter for mass adoption of an HMD — it is near-impossible to track the user’s legs in 3D space. A more patient and thoughtful company would’ve side-stepped the issue entirely, because it doesn’t matter in practice, but instead it was allowed to become a running joke. (For the record, I’m not at all convinced that Apple’s uncanny-valley virtual avatars are the right answer, either. But I’m a lot more optimistic that smart people will find a way to make 3D avatars more convincing than that they’ll find a way to let cameras strapped to your head see down and around your body and somehow map your legs in 3D space with no depth information. Or that we’ll ever live in a world in which anyone anywhere genuinely gives a damn about what you’re doing with your legs unless you’re either dancing, or you’re kicking them.)
Something that is much more important than avatars with legs: whether technology facilitates human interaction, or tries to replace it, or worse, obviate it. One of the things that elevated the movie M3GAN above camp into genuine social satire is that it depicted a world where people were so impressed by the potential of tech that they ignored all of the obvious problems with it.
Even as an Apple fan, I thought some of the marketing images crossed that line. “They know how weird this looks, right? Tell me that they still get how this is unnerving.” But I also feel like so much of it is from a determination to stay on-brand as a lifestyle-facilitated-by-technology company instead of just a computer company; and the need to undo years of counter-programming from other companies who’ve released VR headsets and set people’s expectations for what VR is like.
I can’t see anyone wearing this thing while packing a suitcase, for instance, but the message is that it can be a personal communication device. I don’t see anyone wearing the $3500 version while cooking — although an AR-enabled recipe and cooking instruction app for a lower-cost version is a no-brainer — but the message is that you can do other stuff while wearing it, and you can communicate with other people in the room. Consumer-grade HMDs have so far always been like diving into a sensory deprivation tank, and that’s the main idea they need to counteract for this to ever get traction.
For what it’s worth: I don’t believe that making this a “real” AR device — i.e. applying a display on top of transparent glasses instead of using opaque video screens and cameras — would’ve helped with that. After all, headphones can be isolating. I’ve never had a conversation with someone wearing a glasses-mounted camera, but I know I would’ve noped out of such a conversation as quickly as possible, because that’s creepy as hell.
I think that one of the issues with Apple’s demo videos is that it’s difficult to convey the idea of being alone but not lonely. There’s nothing inherently weird about looking through old photos or video of family gatherings when you’re by yourself, for instance, but it’s become a familiar symbol of “sad about the past” almost as much as panning over to the fireplace means “they’re doin’ it.”
The more troubling aspect of that part of the presentation is the whole question of how the 3D video of the family gathering was made in the first place. We already know about the problem of people using their phones to record things instead of being in the moment, and it’s much worse when you imagine the photographer wearing an opaque headset projecting unsettling video of his eyes on the front. I think that’s the image that Apple’s eager to counteract — and to be honest, it’s kind of their own fault for making the iPhone ubiquitous in the first place — by conveying that wearing this thing does not mean cutting yourself off from other people, or not being in the moment.
But personally, I think that a device like the Apple Vision is something you’d most often be using while you’re alone. I don’t necessarily see that as a problem, since that’s also true of my computer, my iPad, and my phone. They often facilitate social interaction, but they’re not inherently social experiences. Writing a blog post, for instance, is completely solitary but still helps fill my “social meter” in Sims terminology.
By that standard, maybe the uncanny-valleyness of the virtual avatar and the front-facing eyes are not bugs, but features? Maybe it’s good to have a reminder that the goal isn’t to perfectly recreate or replace real human interaction, because it will never be as good as the real thing.
I predict that the biggest use case for the Apple Vision, at least initially, will be the most solitary parts of their demo presentation — the guy working at his desk with multiple large virtual screens, and the people sitting on their couch watching TV shows or movies. It’s a huge screen replacement with a built-in computer, essentially a 120-inch-or-more iMac, with any number of additional monitors, all of which are also 3D-capable displays.
That’s a lot less ambitious than what I imagined back when I saw my first demo of a VR headset. I’m as surprised as anyone to realize that I’d be more interested in 2D gaming in a 3D headset (even if that’s mostly because I could remain seated comfortably). But it’s also a far more practical goal, and, I think, a far more optimistic one. If you concentrate solely on the escapist nature of VR and AR, you’re emphasizing the idea that the everyday world is something you need to escape.
Opinions about the newly-announced Apple Vision Pro AR/VR headset
Earlier today, Apple announced an AR/VR headset that is being positioned as a new computing platform, and they introduced it with a promo from Disney showing The Mandalorian and Mickey Mouse and the Main Street Electrical Parade being shoved into your house in 3D.
So yes, as a matter of fact, I do have opinions.
Back in 2016, a kind video game journalist who followed me on Twitter invited me to come by their offices in San Francisco to try out the HTC Vive and the Oculus Rift. I was so impressed by Valve’s demos that I went all-in on the potential of VR, and before too long found myself working at a company doing physical therapy in VR. (I no longer work for Penumbra or REAL, but that team is doing some great work and I wish them all the best!)
In the years since then, my enthusiasm for VR and mixed-reality headsets has been diminished, if not aggressively stomped on, by all the impractical realities of these devices. They’re kind of uncomfortable, setup can be a pain, and it’s just fatiguing to wear them for extended periods, meaning anything over 45 minutes or so. Even for the games that don’t require a lot of physical exertion, I would get hot and sweaty just from having screens and a battery strapped to my face.
I’ve been wondering if the skeptics and cynics were right, and all the seeming potential of AR and VR was just illusory. Does it just make a great first impression but have no staying power? Is it doomed to be not just a niche, but a quickly-forgotten fad? Or will there be some sort of breakthrough that finally makes it “stick”?
Thoughts about old computers, emulators, and the difference between idealized memory and practical reality
Every couple of years, with relentless regularity since the late 1990s, I become overwhelmed with the need to bring my beloved college computers back to life.
Throughout my freshman year, I had a Mac Plus that was given to me as a graduation present. I loved this computer about as much as it’s possible to love an inanimate object. I still vividly remember the story of when my parents gave it to me, and it’s easily one of my top 10 memories. It was everything I wanted after years of reading MacUser magazine, getting excited at screenshots of simple utilities that looked like pure magic, running GEOS on my Commodore 128 and dreaming of the day I’d finally get “the real thing.” It was the focal point of my friendship with my best friend that year, as we’d spend a lot of time on the Mac running Dark Castle, Beyond Dark Castle, and Uninvited, and it was likely the thing that really made me want to work in video games.
After that year, I “upgraded” to an Amiga 500, which was clearly better in every possible way. So many colors! Such better sound! So many more options for expansion! So much room for activities! I ended up using it for all my school work, and spent a lot of time running Deluxe Paint, but somehow it never captured the same magic as that compact Mac. (It was, however, my introduction to The Secret of Monkey Island, so I’m grateful for that.) Every time I get overcome with the desire to bring these computers back to life (or, more likely, find functional ones on eBay), I’m reminded of that feeling of barely-definable, irrational disappointment.
Above is a video from Marques Brownlee comparing the relationship between Apple and third-party developers to that of sharks and remoras, and then comparing that to the Tile app and, for some reason, lingering bitterness over Watson.
I mean, I’ve acknowledged several times over that I’m an Apple apologist, so I’ll go ahead and spoil this post and say that I think this video is mistaken and oddly conspiratorial. (I originally wrote “bullshit,” but that was way too harsh, now that I’m re-reading it; I don’t even care about the topic that much.) He goes out of his way to make business sound sinister and non-competitive, and right out of the gate he’s got a spurious argument.
As Brownlee points out, Apple set up for the launch of Airtags by making the Find My network available to third party developers. (For the record: I was completely unaware that they’d done this, so I’m even more surprised to see people calling foul). He then claims that this is just an illusion of choice, because whether or not Tile chooses to make Find My-compatible devices, Apple still “wins” because each Tile device now improves the Find My network, instead of Tile’s own network.
The first, most obvious problem with that: saying that a “win” for Apple is a “loss” for everyone else. By that metric, there’s no middle ground between “corporate altruism” and “unfair monopoly.” Brownlee makes it sound like the most valuable asset Tile has is its proprietary network, and not the devices themselves. But he’d already described how there are many, many more Apple devices out there on the Find My network than there are Tiles, by orders of magnitude. If Tile were competing on the network alone, then they’d already lost that fight before the Airtag was even released. And Apple’s only non-villainous option would’ve been to stay out of the business completely: make its own network open to third parties and not introduce its own separate tracking device.
The other problem I have with it is illustrated in Brownlee’s thumbnail, and in the video as he holds Tile devices up to the camera. The Tile devices all have holes drilled in the tile itself, making them useful without buying an extra case or strap. And there’s a variety of sizes, including credit card-sized ones that will fit in a wallet, unlike the Airtags. So Tile has already differentiated itself in a marketable way. Taking advantage of the Find My network seems like a no-brainer.
What strikes me as especially weird is that Brownlee has been an advocate of Tesla for a long time, and a few months ago he made a video essentially describing how and why Tesla was so far ahead of the game in the EV market. To be fair, he did mention some criticisms of Tesla, and he said the whole reason for his video was a desire for there to be more competition in the EV market. But he listed Tesla’s battery range and extensive supercharger network as the two main reasons the company was at least a few years ahead of every other EV manufacturer.
Whenever people are praising the supercharger network, they never seem to have an issue with the fact that it’s proprietary, for Teslas only. On the rare occasion they do mention it, it’s always described as being the fault of other manufacturers, for not following Tesla’s lead. Because as we know, establishing a tech standard means proposing your own and telling every other manufacturer to do it your way, or suck it. Somehow, this is described as groundbreaking innovation, and never as colossal arrogance or anti-competitive business.
Obviously, a company as huge as Apple doesn’t need some jerk with a blog defending it. But its size doesn’t automatically make it the bad guy, either. Starting a business dependent on another company’s product — whether it’s software development or hardware accessories — is always going to be risky. It’s usually in both companies’ best interest to make sure the other succeeds.
The amount of money Apple’s going to make off Airtags is likely going to be on the level of a rounding error compared to their other businesses. It’s probably best to think of the Airtags as similar to reference graphics cards made by the chipset manufacturer: an Apple-designed example of how devices can use the Find My network. That’s the kind of symbiosis that Brownlee describes, so I’m not sure why he’s so eager to make it sound sinister.
The 13″ M1 MacBook Pro is the first perfect laptop I’ve used in years, and it’s actually changing how I think about personal computers.
On his site Six Colors, Jason Snell is doing a great series called 20 Macs for 2020. I’ve been loving it, as someone who’s been a fan of Macs since I was a teenager, long before I had one. I used to buy issues of MacUser (instead of Macworld; sorry, Mr. Snell) and look at the one-bit screenshots of dialog boxes and menus, hoping for the day I’d be able to actually use a Mac.
I’ve got my own list of notable computers over the years. There’s the Mac Plus my parents got me as a high school graduation present, which was constantly swapping floppy disks and frequently crashing at random and which was my favorite computer. The Amiga 500 I used through most of college, which was the first machine I played LucasArts games on. The Commodore 64 I had in high school, where I learned BASIC and 6502 Assembly. The cheese-grater Mac Pro that was impeccably well-designed but absurdly impractical for what I needed, but still made me feel like I’d somehow leveled up as a computer programmer.
Least remarkable but probably most impactful: the Aluminum PowerBook G4 I got when I was working at Maxis. The year before, I’d been using this plasticky Dell Inspiron behemoth as a “desktop replacement,” which I regretted within weeks after buying it. Too heavy to be portable, but too much of a laptop to be expandable, it was the worst of all possible worlds. I decided that since I was a grown-up now, I could afford to pay the “Apple tax” and get a grown-up computer. After a decade hiatus from Macs, I decided I’d try to get a good computer instead of a cheap computer.
That photo isn’t one of me Totally Sidetalkin’, but it’s almost as amazing. I’m holding a running MacBook Pro up to my face. Without fear of scorching my delicate skin or torching all my white hair to blackened cinders.
Today I got a 13″ MacBook Pro with the new M1 chip, and I’ve spent the last 3 hours or so using it to transfer and update files, keep Safari open with several tabs (including YouTube), play Music, edit photos with the Photos app, run Photoshop 2021 (via Rosetta), keep Xcode running and updating, work on a project in Nova with a Terminal window and emulator, read RSS feeds with Reeder, and I’ve even got Steam up and running Pendragon. (I don’t have any graphically-intensive games installed on this Mac, because I hardly ever play games on the Mac.) And the machine just barely feels warmer than when I took it out of the package.
Apple’s now selling its first Macs with Apple Silicon, and being an early adopter is slightly harder than it used to be
Yesterday, Apple announced its first lineup of Macs switching to its internally-designed Apple Silicon as it transitions away from Intel. For what it’s worth, I thought the presentation itself was excellent, staying fairly conservative but still showing exactly what developers and Mac devotees needed to see. Some people wanted to see more dramatic redesigns, but I think they needed this first round of machines out so that people can make direct comparisons. (I want everyone to appreciate my restraint in not using the phrase “Apples to Apples.” You’re welcome.)
The purpose of this one was to reassure everyone that they’re well prepared for the transition and still committed to the Mac line. My favorite parts were the multiple below-ground six-colored hallways, and the part where a MacBook and Craig Federighi both got turned on instantly. I know that they like treating product announcements like social events for the press, but I wish they’d keep the pandemic format for all their future announcements, because they’ve all been really slick and charming.
Reluctantly coming to the conclusion that the computer I’ve always wanted isn’t the computer I’ve always wanted
It’s a reliable source of tragicomedy to see people working themselves into an indignant rage over gadget reviews. When I was looking for reviews of the iPad Pro this Wednesday (to do my due diligence), Google had helpfully highlighted some fun guy on Twitter calling the tech journalists’ coverage of the device “shameful.” The reviews themselves had hundreds of comments from people outraged that even the notion of a larger, more expensive iPad was an assault on everything we hold dear as Americans.
The complaints about the rampant “Apple bias” are especially ludicrous in regards to the iPad Pro, since the consensus has been overwhelmingly cautious: out of all the reviews I read, there’s only one that could be considered an unqualified recommendation. Even John Gruber wasn’t interested in getting one. (But he did still believe that it’s the dawning of a new age in personal computing; it’s still Daring Fireball, after all). Every single one of the others offered some variation on “It’s nice, I don’t have any interest in it, but I’m sure it’s perfect for some people.”
Yes, I thought, I am exactly those some people.
Designed By Apple In Cupertino Specifically For Me
I’ve spent the better part of this year trying to justify getting a smaller and lighter laptop computer. I’ve spent the better part of the last decade wanting a good tablet computer for drawing. And I’ve tried — and been happy with — most of the variations on tablets and laptops that Apple’s been cranking out since the PowerBook G4. (One thing people don’t mention when they complain about how expensive Apple products are is that they also retain their resale value exceptionally well. I’ve managed to find a buyer for every Apple computer or tablet I’ve wanted to sell).
I’ve tried just about every stylus I could find for the iPad. I tried a Galaxy Note. I tried a Microsoft Surface. I got dangerously excited about that Microsoft Courier prototype video. Years ago, I tried a huge tablet PC from HP. None of them have been right, for one reason or another.
But when they announced the iPad Pro this Fall, it sounded like Apple had finally made exactly what I wanted: a thin and relatively light iPad with a high-resolution display, better support for keyboards, faster processor, and a pressure-sensitive stylus designed specifically for the device. Essentially, a “retina” MacBook Air with a removable screen that could turn into a drawing tablet. The only way it could be more exactly what I want would be if it came with a lifetime supply of Coke.
Still, I decided to show some restraint and caution for once, which meant having the calm and patience to get one a few hours into opening day instead of ordering one online the night before.
I read all the reviews, watched all the videos, paid closest attention to what artists were saying about using it. The artists at Pixar who tried it seemed to be super-happy with it. All the reviews were positive about the weight and the display and the sound and the keyboards.
I went to the Apple Store and tried one out, on its own and with the Logitech keyboard case. It makes a hell of a first impression. The screen is fantastic. The sound is surprisingly good. It is huge, but it doesn’t feel heavy or all that unwieldy when compared to the other iPads; it’s more like the difference between carrying around a clipboard vs carrying a notepad. (And it doesn’t have the problem I had with the Surface, where its aspect ratio made using it as a tablet felt awkward).
And inside the case, it gets a real, full-size keyboard that feels to me just like a MacBook Air’s. It really does do everything shown in the demo videos. I imagined it becoming the perfectly versatile personal computer: laptop for writing, sketchpad for drawing, huge display for reading comics or websites, watching video, or playing games. (I’m not going to lie: the thought of playing touchscreen XCOM on a screen this big is what finally sold me).
But Not For Me
But I don’t plan to keep it.
It’s not a case of bait-and-switch, or anything: it’s exactly what it advertises, which is a big-ass iPad. The question is whether you really need a big-ass iPad.
The iPad Pro isn’t a “hybrid” computer, and Apple’s made sure to market it as 100% an iPad first. But it’s obvious that they’re responding to the prevalence of hybrids in Windows and Android, even if not to the Surface and Galaxy Note specifically. And I think Apple’s approach is the right one: differentiating it as a tablet with optional (but strongly encouraged) accessories that add laptop-like functionality, instead of as some kind of all-in-one device that can seamlessly function as both.
But a few days of using the iPad Pro has convinced me that the hybrid approach isn’t the obviously perfect solution that common sense would tell you it is. It’s not really the best of both worlds, but the worst of each:
Big keyboards: The Apple-designed keyboard is almost as bad for typing as the new MacBook’s is, which is almost as bad as typing on a Timex Sinclair. Maybe some people are fine with it, and to be fair, even the on-screen keyboard on the iPad Pro is huge and full-featured and easy to use. But for me, the Logitech keyboard case is the only option. And it’s pretty great (I’m using it to type this, as a cruel final gesture before I return it) but it turns the iPad Pro from being surprisingly light and thin into something that’s almost as big and almost as heavy as a MacBook Air.
Big-ass tablet: Removed from the case, the iPad Pro quickly becomes just a more unwieldy iPad. The “surprisingly” part of “surprisingly light and thin” means that it’s genuinely remarkable considering its processor speed and its fantastic screen, but it still feels clumsy to do all the stuff that felt natural on the regular iPad. It really wants to be set down on a table or desktop.
It’s not cheap: I wouldn’t even consider it overpriced, considering how well it’s made and how much technology went into it. But it does cost about as much as a MacBook Air. That implies that it’s a laptop replacement, instead of the “supplemental computer” role of other iPads.
Touching laptop computer screens is weird: Nobody’s yet perfected the UI that seamlessly combines keyboards and touch input. Even just scrolling through an article makes me wish I had a laptop with a touchpad, where it’s so much more convenient. When it feels like the touchpad is conspicuously absent while you’re using a device that’s essentially a gigantic touchpad, that means that something has broken down in the user experience.
Aggressive Auto-correct: Because iOS was designed for touch input on much smaller screens, it was designed for clumsy typing with fat fingers. Which means it aggressively autocorrects. Which means I’ve had to re-enter every single HTML tag in this post. And it still refuses to let me type “big-ass” on the first try.
It’s missing much of OS X’s gesture support: Despite all the clever subtle and not-so-subtle things they’ve done to make iOS seamless, it’s still got all the rough edges that come from never being designed for a screen this large. In fact, having your hands anchored to a keyboard goes directly against the “philosophy” of iOS, which was designed to have an unobtrusive UI that gets out of the way while you directly interact with your content. Ironically, it’s all the gesture recognition and full-screen stuff that made its way from iOS to OS X that I find myself missing the most — I wish I could just quickly swipe between full-screen apps, or get an instant overview of everything I have open.
No file system: This has been a long-running complaint about iOS, but I’ve frankly never had much problem with it. But now that the iPad is being positioned as a product that will help you do bigger and more sophisticated projects, it becomes more of a problem. I just have a hard time visualizing a project without being able to see the files.
The old “walled-garden” complaints: Apple’s restrictions aren’t nearly as draconian as they’re often made out to be, but they still exist. Occasionally I need to look at a site that still insists on using Flash. And the bigger screen size and keyboard support of the iPad Pro suggest that programming would be a lot of fun on this device, but Apple’s restrictions on distributing executable code make the idea of an IDE completely impractical.
Third-party support: App developers and web developers haven’t fully embraced variable-sized screens on iOS yet. (As an iOS programmer, I can definitely understand why that is, and I sympathize). So apps don’t resize themselves appropriately, or don’t support split screen. Some apps (like Instagram, for instance) still don’t have iPad versions at all. Some web sites insist I use the “mobile” version of the site, even though I’m reading it on a screen that’s as large as my laptop’s.
If You Don’t See a Stylus, They Blew It
For me, the ultimate deciding factor is simply that the Apple “Pencil” isn’t available at launch. They’re currently back-ordered for at least four weeks, and that’s past the company’s 14-day return window. Maybe they really have been convinced that the stylus is a niche product, and they weren’t able to meet the demand. Whatever the case, it seems impossible for me to really get a feel for how valuable this device is with such a significant piece missing.
The one unanimous conclusion — from both artists and laypeople — is that the Pencil is excellent. And I don’t doubt it at all. Part of what gets the tech-blog-commenters so angrily flummoxed about “Apple bias” is that Apple tends to get the details right. Their stuff just feels better, even if it’s difficult or impossible to describe exactly how or why, and even if it’s the kind of detail that doesn’t make for practical, non-“magical” marketing or points on a spec sheet.
Even though I haven’t been able to use it, I have been impressed with how Apple’s pitched the stylus. They emphasize both creativity and precision. There’s something aspirational about that: you can use this device to create great things. Microsoft has probably done more over the years to popularize “pen computing” than any company other than Wacom, but they’ve always emphasized the practical: showing it being used to write notes or sign documents. It’s as if they still need to convince people that it’s okay for “normal” people to want a stylus.
Part of the reason I like Apple’s marketing of the Pencil is that it reminds me of the good old days before the iPhone. Back when Apple was pitching computers to a niche market of “creative types.” It was all spreadsheets vs. painting and music programs, as clearly differentiated as the rich jocks vs the sloppy underdogs in an 80s movie.
I only saw a brief snippet of Microsoft’s presentation about the Surface and Surface Book. In it, the Microsoft rep was talking about the Surface’s pen as if he’d discovered the market-differentiating mic-drop finishing-move against Apple’s failed effort: unlike “the other guys,” Microsoft’s pen has an eraser. I’ve been using a Wacom stylus with an eraser for some time, and it’s always too big and clumsy to be useful, and it always ends up with me using the wrong end for a few minutes and wondering why it’s not drawing anything.
Meanwhile, Apple’s ads talk about how they’ve painstakingly redesigned the iPad screen to have per-pixel accuracy with double the sampling rate and no lag, combining their gift for plausible-sounding techno-marketing jargon with GIFs that show the pen drawing precise lines on an infinite grid. That difference seems symbolic of something, although I’m not exactly sure what.
The Impersonal Computer
I’ve been pretty critical of Microsoft in a post that’s ostensibly about how I don’t like an Apple product. To be fair, the Surface Book looks good enough to be the best option for a laptop/tablet hybrid, and it’s clear some ingenious work went into the design of it — in particular, putting the “guts” of the machine into the keyboard.
I’m just convinced now that a laptop/tablet hybrid isn’t actually what I want. And I think the reason I keep going back to marketing and symbolism and presentation and the “good old days” of Apple is that computers have developed to the point where the best computer experience has very little to do with what’s practical.
I get an emotional attachment to computers, in the same way that Arnie Cunningham loved Christine. There have been several that I liked using, but a few that I’ve straight-up loved. My first Mac was a Mac Plus that had no hard drive and was constantly having to swap floppy disks and had screen burn-in from being used as a display model and would frequently shut down in the middle of doing something important. But it had HyperCard and Dark Castle and MacPaint and the floppy drive made it look like it was perpetually smirking and it was an extravagant graduation gift from my parents, so I loved it. I liked the design of OS X and the PowerBook so much that I even enjoyed using the Finder. I tried setting up my Mac mini as a home theater PC mostly as an attempt to save money on cable, but really I just enjoyed seeing it there under the TV. Even a year into using my first MacBook Air, I’d frequently clean it, ostensibly to maintain its resale value but really because I just liked to marvel at how thin and well-designed it was.
I used to think that was pretty common (albeit to healthier and less obsessive degrees). But I get the impression that most people see computers, even underneath all their stickers and cases to “personalize” them, as ultimately utilitarian. A while ago I had a coworker ask why I bring my laptop to work every day when the company provided me with an identical-if-not-better one. The question seemed absolutely alien to me: that laptop is for work; this laptop has all my stuff.
Another friend occasionally chastises me for parading my conspicuous consumption all over the internet. I can see his point, especially since the Apple logo has gone from a symbol of “I am a creative free-thinker” to “I have enough money to buy expensive things, as I will now demonstrate in this coffee shop.” But I’ve really never understood the idea of Apple as status symbol; I’ve never thought of it as “look at this fancy thing I bought!” but “look at this amazing thing people designed!”
The iPad was the perfect manifestation of that, and the iPad mini even more so. Like a lot of people, I just got one mainly out of devotion to a brand: “If Apple made it, it’s probably pretty good.” I had no idea what I’d use it for, but I was confident enough that a use would present itself.
What’s interesting is that a use did present itself. I don’t think it’s hyperbolic to say that it created an entirely new category of device, because it became something I never would’ve predicted before I used it. And it’s not a matter of technology: what’s remarkable about it isn’t that it was a portable touch screen, since I’ve known I wanted one of those ever since I first went to Epcot Center. I think what’s ultimately so remarkable about the iPad is that it was completely and unapologetically a supplemental computer.
Since its release, people (including me) have been eager to justify the iPad by showing how productive it could be. Releasing a version called the “Pro” would seem like the ultimate manifestation of that. But I’m only now realizing that what appealed to me most about the iPad had nothing to do with productivity. I don’t need it to replace my laptop, since I’m fortunate enough to be able to have a laptop. And the iPhone has wedged itself so firmly into the culture that it’s become all but essential; at this point it just feels too useful to be a “personal” device. (Plus Apple’s business model depends on replacing it every couple of years, so it’s difficult to get too attached to one).
Apple’s been pitching the watch as “their most personal device ever,” but I wouldn’t be devastated if I somehow lost or broke the watch. My iPad mini, on the other hand, is the thing that has all my stuff. Not even the “important” stuff, which is scattered around and backed up in various places. The frivolous, inconsequential stuff that makes it as personal as a well-worn notebook.
Once I had the iPad Pro set up with all my stuff, I was demoing it to a few people who wanted to see it. Obviously with coworkers, but even, surprisingly, when showing it to my boyfriend, there was a brief moment of hesitation where I wondered if I was showing something too personal. I don’t mind anybody using my laptop or desktop, or sharing my phone with someone who needs it, but I’ve got a weird, very personal attachment to the iPad. (And not just because I treat my Tumblr app like the forbidden room in a gothic novel which no one must ever enter.)
It’s entirely possible that I’m in the minority, and whatever attachment most people have to “their stuff” is to the stuff itself in some nebulous cloud, and not the device that’s currently showing it to them. It’s even more likely that there’s simply no money to be made in selling people devices that they become so attached to that they never want to give them up. It may be that Convergence is The Future of Personal Computing, and one day we’ll all have the one device that does everything.
After using the iPad Pro, I’m no longer convinced that a big iPad that also functions as a laptop is what I want. I really want a “normal”-sized iPad that’s just really good at being an iPad. Which means adding support for the Apple Pencil to the iPad Air.
So I’m back to hoping Apple’s already got one of those in the pipeline, and waiting until it’s announced at some point next year, and then ordering one the second they’re available and then trying to justify it as a rational and well-considered purchase. Next time for sure it’s going to be exactly the computer I want.
The Apple TV sure seemed like a good idea… at first!
On the surface (sorry), it seemed like Apple had made all the right decisions with its new product announcements yesterday. [For future anthropologists: new Apple Watches, a bigger iPad with a stylus, and Apple TV with an app store, and iPhones with better cameras and pressure-sensitive input. Also, the title of this blog post is a reference to something that happened a few months ago that nobody cares about now. — Ed.]
I’ve wanted an iPad with a stylus since before the iPad was even announced, so long ago that my image links don’t even work anymore! And I’ve been wanting a lighter laptop to use as purely a “personal computer” in the strictest sense — email, social media, writing, whatever stuff I need to get done on the web — and keep finding myself thinking “something like a MacBook Air that doubles as a drawing tablet would be perfect!” In fact, the iPad Pro is pretty close to what I’d described years ago as my dream machine but cheaper than what I’d estimated it to cost.
There’s been a lot of grousing online about how Apple’s acting like it invented all of this stuff, when other companies have had it for years. On the topic of pen computing, though, I can unequivocally say no they haven’t. Because over the years, I’ve tried all of them, from Tablet PCs to the Galaxy Note to the Microsoft Surface to the various Bluetooth-enabled styluses for iOS. (I’ve never been able to rationalize spending the money for a Cintiq, because I’m just not that great an artist). I haven’t tried the iPad Pro — and I’ll be particularly interested in reading Ray Frenden’s review of it — but I know it’s got to be at least worth investigation, because Apple simply wouldn’t release it if it weren’t.
Even if you roll your eyes at the videos with Ive talking about Apple’s commitment to design, and even if you like talking about Kool-Aid and cults whenever the topic of Apple comes up, the fact is that Apple’s not playing catch-up to anyone right now. They’ve got no incentive to release something that they don’t believe is exceptional; there’d be no profit in it. The company innovates when it needs to, but (and I’m not the first to say it): they don’t have to be the first to do something; they just have to be the first to do it right. And they’ve done exactly that, over and over again. The only reason I may break precedent and actually wait a while to get a new Apple device is because I’m not convinced I need a tablet that big — it’d be interesting to see if they’ll release a pen-compatible “regular-sized” iPad.
And if I’ve been wanting a pen-compatible iPad for almost a decade, I’ve been wanting a “real” Apple-driven TV set-top box for even longer. The first time I tried to ditch satellite and cable in favor of TV over internet, I used a bizarre combination of the first Intel Mac mini with Bootcamp to run Windows Media Center, a Microsoft IR remote adapter, a third party OTA adapter, and various third party drivers for remotes and such, all held together with palm fronds and snot. I’ve also tried two versions of the “hobby” Apple TV, relics of a time when Apple was known for glossy overlays, Cover Flow, and an irrational fear of physical buttons. Basically, any update would’ve been welcome.
But the announcement yesterday was a big deal, obviously, because they announced an App Store and an SDK. Which turned it from “just a set-top box” into a platform. That’s as big a deal for customers as it is for developers, since it means you don’t have to wait for Apple to make a new software release to get new stuff, content providers can make their own apps instead of having to secure some byzantine backroom deal with Apple to become a content channel, and some developers will come up with ways to innovate with the device. (Look to Loren Brichter’s first Twitter client as a great example of UI innovation that became standard. Or for that matter, Cover Flow).
And for games: I don’t think it’s an exaggeration to say that the iOS App Store has done more to democratize game development than anything, including Steam as a distribution platform and Unity as a development tool. Whether it was by design or a lucky accident, all the pieces of device, software, market, and audience came together: it was feasible to have casual games ideally played in short bursts, that could be made by small teams or solo developers, and have them reach so many millions of people at once that it was practical and (theoretically) sustainable.
I hope nobody expects that the Apple TV will become anywhere near as ubiquitous as the iPhone (or even the iPad, for that matter), but still: opening up development creates the potential for independents to finally have an audience in the console game space. It’d be like the Xbox Live Indie Games and XNA, if all the games weren’t relegated to a difficult-to-find ghetto separate from the “real” games. Or like the Ouya, if they’d made a device that anyone actually wanted to buy.
Game developers love saying that Apple doesn’t care about games and doesn’t get how games work — as if they’d just inadvertently stumbled into making a handheld gaming device that was more popular than Nintendo’s and Sony’s. You could look at the new Apple TV the same way, and guess that while trying to secure deals with big content providers and compete with Amazon or “Smart” TV manufacturers, they’d accidentally made a Wii without even trying.
There’ve been enough game-focused developments in the SDK, and in the company’s marketing as a whole, to suggest that Apple really does get it. (Aside from calling Disney Infinity “my favorite new Star Wars game.”) But there are a couple of troubling things about the setup that suggest they expect everything on the TV to play out exactly the same way that it has on smartphones and tablets.
First is that the Apple TV has a heavy reliance on cloud storage and streaming of data, with a pretty severe limitation on the maximum size of your executable. They’ve demoed smart phone games on stage (Infinity Blade) that were 1 GB downloads, so it’s not inspiring to see a much smaller limit on downloadable size for games that are intended to run on home theater-sized screens. Maybe it’s actually not that big a problem; only developers who’ve made complete games for the Apple TV would be able to say for sure. But for now, it seems to suggest either very casual games, or else forcing players to sit through very long loading times. The latter’s been enough of a factor to kill some games and give a bad reputation to entire platforms.
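For what it’s worth, the mechanism Apple is pushing for staying under that limit is On-Demand Resources: you tag asset packs in Xcode, ship a small initial download, and stream the rest in when the game asks for it. Here’s a minimal sketch of what that looks like in Swift, with the caveat that the “level2” tag is a hypothetical placeholder and a real game would obviously have a lot more going on:

```swift
import Foundation

// Minimal sketch of tvOS On-Demand Resources: the app ships small, and
// asset packs tagged in Xcode get fetched from Apple's servers on demand.
// The "level2" tag is a hypothetical placeholder, not from any real game.
let request = NSBundleResourceRequest(tags: ["level2"])

// The player is presumably staring at a loading screen, so make it urgent.
request.loadingPriority = NSBundleResourceRequestLoadingPriorityUrgent

request.beginAccessingResources { error in
    if let error = error {
        // There's no local fallback: on a slow connection, the player waits.
        print("Couldn't fetch level assets: \(error.localizedDescription)")
        return
    }
    // Everything tagged "level2" is now available through Bundle.main,
    // until the app calls request.endAccessingResources().
}
```

Which is to say: the size limit doesn’t really go away, it just shifts the wait onto the player’s network connection.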
Second is the emphasis on universal apps. They mentioned it at the event and just kind of moved on. I didn’t really think much of it until I saw this from Neven Mrgan:
Universal apps = haha no seriously good luck making money, folks.
You could take the most mercenary possible interpretation of that, which is what people always do once the economics of software development comes up: “Big deal! Having one app is what’s best for consumers! What’s best for consumers always wins, and it’s the developers’ responsibility to adjust their business model to enable that!” Also “Information wants to be Free!!!”
Except what’s best for consumers is that the people making great stuff can stay in business to keep making great stuff. And we’ve already seen on iOS exactly what happens when developers “adjust their business models” to account for a market that balks at paying anything more than 99 cents for months to years of development. Some big publishers (and a few savvy independents, like Nimblebit) came in and made everything free-to-play with in-app purchases. Maybe there is a way to make a free-to-play game that doesn’t suck (and again, Nimblebit’s are some of the least egregious). But I can’t see anybody making a believable case that the glut of opportunistic games hasn’t been a blight on the industry. I was out of work for a long time at the beginning of this year, and it was overwhelmingly depressing to see so many formerly creative jobs in game development in the Bay Area that now put “monetization” in the job title.
Believe me, I’d love it if one of these publishers went all-in on the Apple TV, and then lost everything because they didn’t take into account they were pandering to a different audience. But that’s not what would happen, of course. What would happen is that a couple of the big names would see that they can’t just fart out a “plays on your TV screen!!!” version of the same casual game and still make a fortune off of it, so they’d declare the entire platform as being not worth the effort. And then smaller studios who are trying to make stuff that takes specific advantage of the Apple TV “space” will be out of luck, because there are no big publisher-style marketing blitzes driving people to the platform. You need a combination of big names and smaller voices for a platform to work: again, see XBLIG.
It just seems as if there’s no recognition of the fact that there’s a lot more differentiating a game you play on your phone and one you play on your television than just the screen size. It seems especially tone-deaf coming from a company like Apple, who’s made a fortune out of understanding how hardware and software work together and what makes the experience unique. (Part of the reason that iOS has had so much success is that they didn’t try to cram the same operating system into a laptop and a smartphone).
At least the games on display showed evidence that they “get it.” The game demoed by Harmonix took advantage of the stuff that was unique to the Apple TV — a motion-sensitive controller and (presumably) a home theater-quality audio system. And even Crossy Road, which would seem like the worst possible example of shoveling a quick-casual game onto a TV screen and expecting the same level of success, showed some awareness of what makes the TV unique: someone sitting next to you playing the game, or at least having other people in the room all able to see something goofy happening on your screen.
I haven’t seen enough about tvOS to know if Universal apps are actually a requirement, or just a marketing bullet point and a “strong recommendation” from Apple. (Frankly, since I’m trying to make an iPad-only game, I’m ignorant of the existing requirements for iOS, and whether they restrict developers from releasing separate iPad-only or iPhone-only versions of the same software). So maybe there’ll be a market for separate versions? And somehow, magically, a developer will be able to release a longer, more complex game suitable for a home entertainment system, and he won’t be downvoted into oblivion for being “greedy” by asking more than ten bucks for the effort.
And there’s been some differentiation on the iPad, too. Playing XCOM on the iPad, for example, is glorious. That’s not a “casual” game — I’ve had sessions that lasted longer than my patience for most recent Xbox games — but is still better on the iPad because you can reach in and interact with the game directly. I could see something like that working — I’d pay for a game with lower visual fidelity than I’d get on Xbox/PS4/PC, if it had the added advantage that I could take it with me and play on a touchscreen.
So I could just be reactionary or overly pessimistic. But it’s enough to take what first seemed like a slam-dunk on Apple’s part, and turn it into an Ill Portent for The Future Viability Of Independent Game Development. As somebody who’s seen how difficult it was to even make a game in The Before Times, much less sell one, the democratization of game development over the past ten years has been phenomenal. And as somebody who’s finally realized how much some game studios like to exploit their employees, it’s incredible to be in an environment where you can be free of that, and still be able to realize your passion for making games.
The reason I first wanted to learn programming was being at a friend’s house, watching them type something into their VIC-20, and seeing it show up on screen. It was like a little spark that set me down a path for the next 40 years: “Wait, you mean I can make the stuff that shows up there, instead of just sitting back and watching it?” It’d be heartbreaking to see all the potential we’re enjoying right now get undermined and undone by a series of business decisions that make it impractical to keep making things.
Worst case, it’ll be another box that lets me watch Hulu. I was down to only eight.
My experiences with the latest advance from Apple that’s disrupting the ecosystems of wearable technology and order and personal fulfillment.
As a well-known “early adopter,” I feel I’ve got an obligation to share my experiences with bleeding-edge advancements in SoC-powered wealth redistribution with users who are more on the fence, baffled by the increasing number of options in wearable technology.
A lot of you have lots of money but no time to wade through all the industry jargon; you just have simple questions that you need answered: “What is the Apple Watch?” “Why haven’t I read or heard anything about it?” And most importantly: “Does Chuck have one yet?”
I can go ahead and conclusively answer the last question: No.
If you were hoping that the Apple Watch would finally be the game-changer that makes me satisfied with the number of gadgets I own, you’re probably better off waiting a month or two. Version 1.0 Apple products are known for being a hint of the advancements and refinements yet to come, more than complete, functional devices. It’s as if with the Apple Watch, Jony Ive and his team of designers at Apple are giving us a roadmap for the future, announcing to the world: This is what the smart watch will be like, some time in early July when Chuck is actually able to have one.
So the question remains: is it really that insufferable to be waiting for the delivery of an expensive, inessential device, while surrounded by other people who already have theirs? Let’s find out.
How The Other Half Lives
Marketing Apple’s Most Personal Device Ever
Apple had to take a different approach with their first foray into the world of wearable technology. That meant making sure that before the product even hit stores, watch models were made available to the leading tastemakers: the technology and gadget bloggers who’d complain that Pharrell and will.i.am were posting Instagram pictures of their watches before any of the reviewers could get one.
By now, you’ve no doubt seen the “Big Guys” offer up their opinions about the Apple Watch (42mm Steel with the Milanese Loop band, universally), and their experiences with glances, taptic feedback, the Activity tracker, re-charging it every day, and the importance of selectively disabling notifications. By virtue of the mathematical study of combinatorics and the number of words in the English language, each reviewer’s take is, strictly speaking, unique.
You’ve seen a quirky first-person attempt to free the device from Jony Ive’s perfectly-controlled environment and present it in a more realistic day-to-day setting: a tech blogger in New York City with a head-mounted camera. You’ve doubtless savored the definitive review from a suave globetrotting secret agent tech blog editor figuring out how this new innovation fits into a busy day packed with meetings and treadmill-running, including an up-close look at how hard it is to execute cross-site web content scheduling in a New York City bar with the double distractions of a watch constantly tapping your wrist, and a full camera and lighting crew having to run multiple takes of video while in a New York City bar. You’ve seen a stop-motion animated version with paper cutouts, for some reason. By now, you’ve even seen the Tech Reviewer Old Guard offer another look back at the watch after using it for a month.
What none of those so-called “professional” reviews will tell you is what life is like for real people who don’t have the product being reviewed. Sure, you occasionally get somebody like Apple insider and sarcasm enthusiast Neven Mrgan making a feeble attempt to relate to The Rest of Us outside Apple’s walled garden clique, but how much can you really say about an experience after only a week or two? How does that experience change after an entire month? [Full disclosure: Mr. Mrgan graciously offered a royalty-free license for me to completely rip off the premise of this blog post, presumably by effortlessly dictating said license into the always-on AI assistant of his futurewatch].
It’s Finally Here
Just Not For You
One thing that none of the reviews mention is how much of the Apple Watch experience is dependent on having not just an iPhone, but an actual physical Apple Watch. The site iMore.com, for example, offers a list of what the Apple Watch can do without an iPhone, but makes no mention of what can be done without the watch itself.
That’s a perfect example of how blog developers are adjusting to the new paradigms introduced by the Apple Watch: their reviews aren’t as content-focused as those for more traditional devices like the iPhone. Instead, they’re best consumed as “glances,” not meant to be “read” so much as absorbed in quick, seconds-long bursts throughout the day, every day, for months.
The truth is that there’s no amount of parallax scrolling and full-screen looping background video that will provide a truly definitive review of life without Apple’s latest must-have. For that, you need to go to Apple itself.
That trademark Apple design is evident from first glance: the photographs of other people with their watches bleed right up to the bezel of the laptop screen, putting a subtle but unmistakable emphasis on the object that you don’t have. It’s a perfect example of how Apple makes cold hardware more personal, by telling a personal story: This woman has a watch and you don’t. She is a ballerina. What does she need a smartwatch for? She can’t possibly have her iPhone in range; her pockets are too small. Also the screen is likely to come on frequently as she moves her arms, causing a distraction to the other dancers. Did she not think this through? I wonder if she ordered her watch at midnight instead of waiting. A good night’s rest is very important for dancers, so it seems foolish to forsake that just to get a new watch that can’t even give incoming message notifications. Not to mention that dancers aren’t usually paid well enough to be spending hundreds of dollars on a watch. I bet she didn’t even wait in line for a new iPhone every other year since the first model, like I did. Who does she think she is, anyway?
This is also likely to be your first bit of frustration when dealing with the lack of an Apple Watch: because the title photograph has to do a full round-trip circuit from designer to marketing team to photographer and model to graphic designer to web publisher, it can get hopelessly out of sync with reality. I still find myself reading the notification “The Watch is Here,” and then glancing down at my wrist only to confirm that it’s most assuredly not here. I hope this is fixed in a future update.
The Best Part of Waking Up
Getting Into the Groove of a Daily Routine Without Your Apple Watch
Apple’s attention to detail and design carry through the rest of the experience. There’s no garish “Order Status” menu, for example; instead, a simple “Store” menu reveals more beautifully photographed images of the product you don’t have.
It’s only there that you find a friendly drop-down menu that takes you to “Order Status.” That page will ask you for your password every time you open or refresh it throughout the day — you’ll be doing this a lot, so I recommend using a password manager like 1Password.
In the month since I ordered an Apple Watch, I’ve really started to notice how I use technology differently throughout the day and in different locations. On the laptop, for instance, I hardly ever use the Delivery Status widget to track the status of my shipment, both because of the decreasing relevance of the OS X Dashboard, and because after 5 weeks the order is still in “Processing” status without a tracking number. Instead, I prefer to go to the Apple Store page, bring up the order status, enter my password, refresh the page, wait a few seconds, and refresh the page again, sigh, then refresh it one more time. I would’ve thought that this would feel like an intrusion, but it’s become such an integral part of my morning routine that I hardly even notice it anymore.
While out around town, not going to bars or important meetings, it’d be a lot more convenient to bring up the Apple Store app on my phone. In practice, though, the app requires me to type my password again every time I want to check the order status, so I end up not bothering. Maybe they’ll fix this sometime within the next 5-6 weeks. In a perfect world, I could have some type of device on my wrist that could give me order updates with just a “glance.”
On the Order Status page, you’ll see the estimated delivery window in an elegant but still-readable font. Apple still knows how to make the most of the user experience, giving a moment of delight as you see the estimate change from “June” to “5-6 weeks.” These displays are made possible by “complications,” a term Apple is borrowing from the watchmaking industry to describe things like doing a huge marketing push for a product release that depends on faulty haptic feedback engines from overseas manufacturers.
Apple makes it really easy to go back to the main store page from the Order Status page, so you can get a beautiful, detailed look at all the various models and colors of watches you don’t have. It’s fun for running “what if?” type experiments, such as “Could I cancel my order and instead get one of the dainty models with a pink band? Would that ship any faster?”
There’s also support for Apple’s new “Force Touch” technology, in which you give a long, exasperated sigh followed by a sharp slamming gesture on all of the keyboard’s keys simultaneously, or pressing a closed fist firmly and repeatedly on the laptop’s trackpad. This gives helpful feedback in the form of Safari crashing. It definitely takes some practice, but in my experience, it became second nature the more often I saw my colleagues unwrapping their just-delivered Apple Watches near my desk.
I Regret Reading a Gadget Blog Post (and I knew I would)
The Cold, Hard Sting That Can Only Happen When You Physically Open Your Wallet
Even though the watch is only available online and who the hell writes for a technology blog but still has to physically open his wallet when he buys stuff online?
He Should Try Apple Pay
Unless Maybe He Also Bought a Really Expensive Wallet, And He Just Likes the Way It Feels
As a mobile software developer in San Francisco, I’ve already seen how the release of the Apple Watch has changed my routine. During my morning workout (two reps climbing up BART station stairs, followed by an intensive 1.5 block walk), I enjoy listening to podcasts that keep me on the bleeding edge of the most disruptive of apps and innovators. (ICYMI: My essential travel gear). (I recommend Overcast for podcast-listening, even if you’re going truly old-school and changing podcast tracks on your Bluetooth headphones by manipulating actual buttons on your touchscreen-enabled wireless mobile computer).
The gang at SixColors.com has been active on various podcasts, letting me know about their experiences after initial unboxing, two days, four days, a week, and several weeks later, while traveling, writing, and recording podcasts. In addition to the roundtable discussions where groups of people discuss how the watch I don’t have yet has changed their lives, I’ve gotten answers to the questions you don’t usually think about with some cursory product review. For instance: what if you have two watches, and you can’t decide which of them you want to keep? And: now that we’ve all had the opportunity to get used to our new watches, what would we most like to see in the new version?
Another highlight: an account of how the podcaster’s significant other, who isn’t much of a technology devotee and wasn’t that interested in the watch, became interested after seeing the podcaster use his for a few days, ordered one, received it, and is now giving her first impressions. It’s a magical time, as if entire generations of wearable technology are happening all around me as I watch the Order Status page. Whole waves of Gawker Media-led backlashes are whooshing by with the lasting permanence of burrito farts, the only constants being me, a web site, and a refresh button.
Like Smith, I was initially unmoved by the announcement of a new device from Apple. I, too, had bought a Pebble watch but quickly got out of the habit of wearing it. I’ve gotten the first versions of other Apple products and often been surprised by how dramatically and how quickly they’re made obsolete by the next release. I, too, write rambling stuff on the internet that frequently makes me come across as an insufferable asshole. And I also find myself reluctantly falling back into the role of “early adopter” for the sake of completely irrational impulses — in my case, an animated Mickey Mouse watch face that taps his foot every second; in his case, enjoying buying unnecessarily expensive stuff that makes him look cool.
It was important to him to have the sapphire face and stainless steel body, whereas I have large wrists, so it really stands out when I roll my eyes and make a wanking gesture while reading the rest of his post.
We ordered different models of the watch, because we have different needs. He tried on the gold version and was invited to look at himself in a mirror, while I managed to get 10 minutes bending over a bench in an Apple store by scheduling an appointment a couple of days in advance. He fell in love with the Milanese band, while I could only justify getting the cheapest model by telling myself it was a birthday present for myself. He doodles tiny pictures of cocks to colleagues and concludes it’s not a life-changing device; I see colleagues with watches and go back to reading blog posts written by, apparently, sentient, literate cocks.
One More Thing
Adding a Semi-Pithy Coda About Consumerism to What Should Have Been a Short and Silly Blog Post to Make it Unclear How Much of Any of This Is Intended to be Sarcastic
This Is Why People Don’t Read Your Blog
For decades there’s been a tendency to be dismissive of Apple devotees as being cultish and image-obsessed, with more money than common sense. As Macs and iPhones got more ubiquitous (and cheaper), enough people caught on to the fact that good design actually has real value. There are, no doubt, plenty of people who put “shiny” and “has visible glowing Apple logo” high on their list of priorities, but I think they’re finally outnumbered by those of us who just want something that’s really well made. (And who’ve bought enough cheap computers for the sake of saving a few bucks to realize that it ends up costing more in the long run when they need to be replaced.) Now it’s only the cranks in forums and blog comments who insist on complaining about the “Apple Tax.”
When Apple announced a gold edition of its new watch that was rumored to cost over ten thousand bucks, there were fears that it’d bring all the old class warfare back to consumer technology: the company was now explicitly targeting status-obsessed rich people.
As I look at photos of models tying up their toe shoes, or draping their watch-bedecked arms over other models to make out with them, or stopping mid-jog-through-the-Hollywood Hills, and I see the three clearly-delineated castes of watch available, and I commit a few hundred bucks to the “lowest” caste of thing that I didn’t even want a few months ago, and I get increasingly resentful of the people who already have their inessential thing, and even more annoyed when they have the more expensive version of the thing I don’t yet have (even though I wouldn’t even want the more expensive version), I’m just glad those fears turned out to be completely unfounded.
Making sense of the iPad mini in a world that doesn’t need it.
After my previous unfortunate episode in an Apple store, it should come as little surprise that I didn't last very long before I broke down and bought an iPad mini. No, it doesn't make sense for me to be throwing my credit card around as if I were the CEO of Papa John's or something. I've already got a perfectly fancy tablet computer that's not just functional, but really quite terrific. It's not like I'm getting paid to write reviews of these things, and even my typical “I need it for application development testing” is sounding increasingly hollow.
What helps is a new metric I've devised, which measures how long it takes me after a purchase before the appeal of the thing overwhelms my feeling of conspicuous consumption guilt over buying it. It's measured in a new unit: the Hal (named after Green Lantern Hal Jordan, the Jordan who does have willpower).
By that standard, the iPad mini clocks in with a new record of 0.03 Hals, or about 30 minutes after I opened the box. Because this thing is sweet, and I pretty much never want to stop holding it. I'm writing this post on it, as a matter of fact, even though a much more functional laptop with keyboard is sitting about three feet away from me at this very moment. But to use it would mean putting the iPad down.
The “finish” of the iPad mini, with its beveled edge and rounded matte aluminum back, is more like the iPhone 5 than the existing iPads. It makes such a difference in the feel of the thing that I can talk about beveled edges and matte aluminum backs without feeling self-conscious, as if I were a tech blogger desperately seeking a new way to describe another piece of consumer electronics.
It’s about as thin as the iPhone 5, and almost as light. With the new Apple cover wrapped around the back, it's perfect for holding in one hand. There have been several times that I've made fun of Apple, or Apple fanatics, for making a big deal about a few millimeters difference in thickness, or a few ounces in weight. And I joked about the appeal of the iPad mini, as if the existing iPad was unreasonably bulky and heavy.
But then something awful happened: I had to fly cross country four times within two weeks. And reading a book on the iPad required me to prop the thing up on the tray table and catch it as the person in front of me kept readjusting his seat. All my mocking comments were flying back in my face (along with the iPad, my drink, and the in-flight magazine), in the form of the firstest of first-world problems.
“Version 1 of the iPad mini is for chumps,” I said. “Check back with me once you've put in a higher resolution display, Apple.” In practice, though, the display is perfectly sharp, and “Retina” isn't the make-or-break feature I thought it would be. I'd assumed that squabbling over pixel density was something best left to the comments sections of tech blogs, but when you compare the two side by side, the difference in sharpness is definitely visible. It's really only an issue for very small text, though. Books, Flipboard, and web pages are all clear and legible.
And speaking of Flipboard, it and Tweetbot are the two apps that get me giddy enough to own up to making another unnecessary tech purchase. Browsing through articles and status updates on a tablet that thin is probably the closest I'll ever come to being on board the Enterprise.
The phrase I've seen recurring the most in reviews of the iPad mini is some variation on “this is the size the iPad is supposed to be.” And really, there's something to that. I'm not going to give up my other one; the larger size really is better for some stuff, like drawing, Garage Band, and reading comics or magazines. But overall, I haven't been this impressed with the “feel” of a piece of consumer electronics since I saw the original iPhone. Realizing that this is just version 1.0 is actually a little creepy — apart from the higher resolution display, I honestly can't conceive of how they'll improve on the design of the iPad mini.