I mean, I’ve acknowledged several times over that I’m an Apple apologist, so I’ll go ahead and spoil this post and say that I think this video is mistaken and oddly conspiratorial. (My first draft said “bullshit,” but that was way too harsh, now that I’m re-reading. I don’t even care about the topic that much.) He goes out of his way to make business sound sinister and non-competitive, and right out of the gate he’s got a spurious argument.
As Brownlee points out, Apple set up for the launch of Airtags by making the Find My network available to third party developers. (For the record: I was completely unaware that they’d done this, so I’m even more surprised to see people calling foul). He then claims that this is just an illusion of choice, because whether or not Tile chooses to make Find My-compatible devices, Apple still “wins” because each Tile device now improves the Find My network, instead of Tile’s own network.
The first, most obvious problem with that: it treats a “win” for Apple as a “loss” for everyone else. By that metric, there’s no middle ground between “corporate altruism” and “unfair monopoly.” Brownlee makes it sound like the most valuable asset Tile has is its proprietary network, and not the devices themselves. But he’d already described how there are many, many more Apple devices out there on the Find My network than there are Tiles, by orders of magnitude. If Tile were competing on the network alone, then they’d already lost that before the Airtag was even released. And Apple’s only non-villainous option would’ve been to stay out of the business completely: make its own network open to third parties and not introduce its own separate tracking device.
The other problem I have with it is illustrated in Brownlee’s thumbnail, and in the video as he holds Tile devices up to the camera. The Tile devices all have holes drilled in the tile itself, making them useful without buying an extra case or strap. And there’s a variety of sizes, including credit card-sized ones that will fit in a wallet, unlike the Airtags. So Tile has already differentiated itself in a marketable way. Taking advantage of the Find My network seems like a no-brainer.
What strikes me as especially weird is that Brownlee has been an advocate of Tesla for a long time, and a few months ago he made a video essentially describing how and why Tesla was so far ahead of the game in the EV market. To be fair, he did mention some criticisms of Tesla, and he said the whole reason for his video was a desire for there to be more competition in the EV market. But he listed Tesla’s battery range and extensive supercharger network as the two main reasons the company was at least a few years ahead of every other EV manufacturer.
Whenever people are praising the supercharger network, they never seem to have an issue with the fact that it’s proprietary, for Teslas only. On the rare occasion they do mention it, it’s always described as being the fault of other manufacturers, for not following Tesla’s lead. Because as we know, establishing a tech standard means proposing your own and telling every other manufacturer to do it your way, or suck it. Somehow, this is described as groundbreaking innovation, and never as colossal arrogance or anti-competitive business.
Obviously, a company as huge as Apple doesn’t need some jerk with a blog defending it. But its size doesn’t automatically make it the bad guy, either. Starting a business dependent on another company’s product — whether it’s software development or hardware accessories — is always going to be risky. It’s usually in both companies’ best interest to make sure the other succeeds.
The amount of money Apple’s going to make off Airtags is likely going to be on the level of a rounding error compared to their other businesses. It’s probably best to think of the Airtags as similar to reference graphics cards made by the chipset manufacturer: an Apple-designed example of how devices can use the Find My Network. That’s the kind of symbiosis that Brownlee describes, so I’m not sure why he’s so eager to make it sound sinister.
The 13″ M1 MacBook Pro is the first perfect laptop I’ve used in years, and it’s actually changing how I think about personal computers.
On his site Six Colors, Jason Snell is doing a great series called 20 Macs for 2020. I’ve been loving it, as someone who’s been a fan of Macs since I was a teenager, long before I had one. I used to buy issues of MacUser (instead of Macworld; sorry, Mr. Snell) and look at the one-bit screenshots of dialog boxes and menus, hoping for the day I’d be able to actually use a Mac.
I’ve got my own list of notable computers over the years. There’s the Mac Plus my parents got me as a high school graduation present, which was constantly swapping floppy disks and frequently crashing at random and which was my favorite computer. The Amiga 500 I used through most of college, which was the first machine I played LucasArts games on. The Commodore 64 I had in high school, where I learned BASIC and 6502 Assembly. The cheese-grater Mac Pro that was impeccably well-designed but absurdly impractical for what I needed, but still made me feel like I’d somehow leveled up as a computer programmer.
Least remarkable but probably most impactful: the Aluminum PowerBook G4 I got when I was working at Maxis. The year before, I’d been using this plasticky Dell Inspiron behemoth as a “desktop replacement,” which I regretted within weeks of buying it. Too heavy to be portable, but too much of a laptop to be expandable, it was the worst of all possible worlds. I decided that since I was a grown-up now, I could afford to pay the “Apple tax” and get a grown-up computer. After a decade-long hiatus from Macs, I decided I’d try to get a good computer instead of a cheap computer.
That photo isn’t one of me Totally Sidetalkin’, but it’s almost as amazing. I’m holding a running MacBook Pro up to my face. Without fear of scorching my delicate skin or torching all my white hair to blackened cinders.
Today I got a 13″ MacBook Pro with the new M1 chip, and I’ve spent the last 3 hours or so using it to transfer and update files, browse in Safari with several tabs (including YouTube) open, play Music, edit photos with the Photos app, run Photoshop 2021 (via Rosetta), keep Xcode running and updating, work on a project in Nova with a Terminal window and emulator, and read RSS feeds with Reeder, and I’ve even got Steam up and running Pendragon. (I don’t have any graphically-intensive games installed on this Mac because I hardly ever play games on the Mac.) And the machine just barely feels warmer than when I took it out of the package.
Apple’s now selling its first Macs with Apple Silicon, and being an early adopter is slightly harder than it used to be
Yesterday, Apple announced its first lineup of Macs switching to its internally-designed Apple Silicon as it transitions away from Intel. For what it’s worth, I thought the presentation itself was excellent, staying fairly conservative but still showing exactly what developers and Mac devotees needed to see. Some people wanted to see more dramatic redesigns, but I think they needed this first round of machines out so that people can make direct comparisons. (I want everyone to appreciate my restraint in not using the phrase “Apples to Apples.” You’re welcome.)
The purpose of this one was to reassure everyone that they were well prepared for the transition and that they’re still committed to the Mac line. My favorite parts were the multiple below-ground six-colored hallways, and the part where a MacBook and Craig Federighi both got turned on instantly. I know that they like treating product announcements like social events for the press, but I wish they’d keep the pandemic format for all their future announcements, because they’ve all been really slick and charming.
Reluctantly coming to the conclusion that the computer I’ve always wanted isn’t the computer I’ve always wanted
It’s a reliable source of tragicomedy to see people working themselves into an indignant rage over gadget reviews. When I was looking for reviews of the iPad Pro this Wednesday (to do my due diligence), Google had helpfully highlighted some fun guy on Twitter calling the tech journalists’ coverage of the device “shameful.” The reviews themselves had hundreds of comments from people outraged that even the notion of a larger, more expensive iPad was an assault to everything we hold dear as Americans.
The complaints about the rampant “Apple bias” are especially ludicrous in regards to the iPad Pro, since the consensus has been overwhelmingly cautious: out of all the reviews I read, there’s only one that could be considered an unqualified recommendation. Even John Gruber wasn’t interested in getting one. (But he did still believe that it’s the dawning of a new age in personal computing; it’s still Daring Fireball, after all). Every single one of the others offered some variation on “It’s nice, I don’t have any interest in it, but I’m sure it’s perfect for some people.”
Yes, I thought, I am exactly those some people.
Designed By Apple In Cupertino Specifically For Me
I’ve spent the better part of this year trying to justify getting a smaller and lighter laptop computer. I’ve spent the better part of the last decade wanting a good tablet computer for drawing. And I’ve tried — and been happy with — most of the variations on tablets and laptops that Apple’s been cranking out since the PowerBook G4. (One thing people don’t mention when they complain about how expensive Apple products are is that they also retain their resale value exceptionally well. I’ve managed to find a buyer for every Apple computer or tablet I’ve wanted to sell).
I’ve tried just about every stylus I could find for the iPad. I tried a Galaxy Note. I tried a Microsoft Surface. I got dangerously excited about that Microsoft Courier prototype video. Years ago, I tried a huge tablet PC from HP. None of them have been right, for one reason or another.
But when they announced the iPad Pro this Fall, it sounded like Apple had finally made exactly what I wanted: a thin and relatively light iPad with a high-resolution display, better support for keyboards, faster processor, and a pressure-sensitive stylus designed specifically for the device. Essentially, a “retina” MacBook Air with a removable screen that could turn into a drawing tablet. The only way it could be more exactly what I want would be if it came with a lifetime supply of Coke.
Still, I decided to show some restraint and caution for once, which meant having the calm and patience to get one a few hours into opening day instead of ordering one online the night before.
I read all the reviews, watched all the videos, paid closest attention to what artists were saying about using it. The artists at Pixar who tried it seemed to be super-happy with it. All the reviews were positive about the weight and the display and the sound and the keyboards.
I went to the Apple Store and tried one out, on its own and with the Logitech keyboard case. It makes a hell of a first impression. The screen is fantastic. The sound is surprisingly good. It is huge, but it doesn’t feel heavy or all that unwieldy when compared to the other iPads; it’s more like the difference between carrying around a clipboard vs carrying a notepad. (And it doesn’t have the problem I had with the Surface, where its aspect ratio made using it as a tablet felt awkward).
And inside the case, it gets a real, full-size keyboard that feels to me just like a MacBook Air’s. It really does do everything shown in the demo videos. I imagined it becoming the perfectly versatile personal computer: laptop for writing, sketchpad for drawing, huge display for reading comics or websites, watching video, or playing games. (I’m not going to lie: the thought of playing touchscreen XCOM on a screen this big is what finally sold me).
But Not For Me
But I don’t plan to keep it.
It’s not a case of bait-and-switch, or anything: it’s exactly what it advertises, which is a big-ass iPad. The question is whether you really need a big-ass iPad.
The iPad Pro isn’t a “hybrid” computer, and Apple’s made sure to market it as 100% an iPad first. But it’s obvious that they’re responding to the prevalence of hybrids in Windows and Android, even if not to the Surface and Galaxy Note specifically. And I think Apple’s approach is the right one: differentiating it as a tablet with optional (but strongly encouraged) accessories that add laptop-like functionality, instead of as some kind of all-in-one device that can seamlessly function as both.
But a few days of using the iPad Pro has convinced me that the hybrid approach isn’t the obviously perfect solution that common sense would tell you it is. It’s not really the best of both worlds, but the worst of each:
Big keyboards: The Apple-designed keyboard is almost as bad for typing as the new MacBook’s is, which is almost as bad as typing on a Timex Sinclair. Maybe some people are fine with it, and to be fair, even the on-screen keyboard on the iPad Pro is huge and full-featured and easy to use. But for me, the Logitech keyboard case is the only option. And it’s pretty great (I’m using it to type this, as a cruel final gesture before I return it) but it turns the iPad Pro from being surprisingly light and thin into something that’s almost as big and almost as heavy as a MacBook Air.
Big-ass tablet: Removed from the case, the iPad Pro quickly becomes just a more unwieldy iPad. The “surprisingly” part of “surprisingly light and thin” means that it’s genuinely remarkable considering its processor speed and its fantastic screen, but it still feels clumsy to do all the stuff that felt natural on the regular iPad. It really wants to be set down on a table or desktop.
It’s not cheap: I wouldn’t even consider it overpriced, considering how well it’s made and how much technology went into it. But it does cost about as much as a MacBook Air. That implies that it’s a laptop replacement, instead of filling the “supplemental computer” role of other iPads.
Touching laptop computer screens is weird: Nobody’s yet perfected the UI that seamlessly combines keyboards and touch input. Even just scrolling through an article makes me wish I had a laptop with a touchpad, where it’s so much more convenient. When it feels like the touchpad is conspicuously absent while you’re using a device that’s essentially a gigantic touchpad, that means that something has broken down in the user experience.
Aggressive Auto-correct: Because iOS was designed for touch input on much smaller screens, it was designed for clumsy typing with fat fingers. Which means it aggressively autocorrects. Which means I’ve had to re-enter every single HTML tag in this post. And it still refuses to let me type “big-ass” on the first try.
It’s missing much of OS X’s gesture support: Despite all the clever subtle and not-so-subtle things they’ve done to make iOS seamless, it’s still got all the rough edges that come from never being designed for a screen this large. In fact, having your hands anchored to a keyboard goes directly against the “philosophy” of iOS, which was designed to have an unobtrusive UI that gets out of the way while you directly interact with your content. Ironically, it’s all the gesture recognition and full-screen stuff that made its way from iOS to OS X that I find myself missing the most — I wish I could just quickly swipe between full-screen apps, or get an instant overview of everything I have open.
No file system: This has been a long-running complaint about iOS, but I’ve frankly never had much problem with it. But now that the iPad is being positioned as a product that will help you do bigger and more sophisticated projects, it becomes more of a problem. I just have a hard time visualizing a project without being able to see the files.
The old “walled-garden” complaints: Apple’s restrictions aren’t nearly as draconian as they’re often made out to be, but they still exist. Occasionally I need to look at a site that still insists on using Flash. And the bigger screen size and keyboard support of the iPad Pro suggest that programming would be a lot of fun on this device, but Apple’s restrictions on distributing executable code make the idea of an IDE completely impractical.
Third-party support: App developers and web developers haven’t fully embraced variable-sized screens on iOS yet. (As an iOS programmer, I can definitely understand why that is, and I sympathize). So apps don’t resize themselves appropriately, or don’t support split screen. Some apps (like Instagram, for instance) still don’t have iPad versions at all. Some web sites insist I use the “mobile” version of the site, even though I’m reading it on a screen that’s as large as my laptop’s.
If You Don’t See a Stylus, They Blew It
For me, the ultimate deciding factor is simply that the Apple “Pencil” isn’t available at launch. They’re currently back-ordered for at least four weeks, and that’s past the company’s 14-day return window. Maybe they really have been convinced that the stylus is a niche product, and they weren’t able to meet the demand. Whatever the case, it seems impossible for me to really get a feel for how valuable this device is with such a significant piece missing.
The one unanimous conclusion — from both artists and laypeople — is that the Pencil is excellent. And I don’t doubt it at all. Part of what gets the tech-blog-commenters so angrily flummoxed about “Apple bias” is that Apple tends to get the details right. Their stuff just feels better, even if it’s difficult or impossible to describe exactly how or why, and even if it’s the kind of detail that doesn’t make for practical, non-“magical” marketing or points on a spec sheet.
Even though I haven’t been able to use it, I have been impressed with how Apple’s pitched the stylus. They emphasize both creativity and precision. There’s something aspirational about that: you can use this device to create great things. Microsoft has probably done more over the years to popularize “pen computing” than any company other than Wacom, but they’ve always emphasized the practical: showing it being used to write notes or sign documents. It’s as if they still need to convince people that it’s okay for “normal” people to want a stylus.
Part of the reason I like Apple’s marketing of the Pencil is that it reminds me of the good old days before the iPhone. Back when Apple was pitching computers to a niche market of “creative types.” It was all spreadsheets vs. painting and music programs, as clearly differentiated as the rich jocks vs the sloppy underdogs in an 80s movie.
I only saw a brief snippet of Microsoft’s presentation about the Surface and Surface Book. In it, the Microsoft rep was talking about the Surface’s pen as if he’d discovered the market-differentiating mic-drop finishing-move against Apple’s failed effort: unlike “the other guys,” Microsoft’s pen has an eraser. I’ve been using a Wacom stylus with an eraser for some time, and the eraser is always too big and clumsy to be useful; I always end up using the wrong end for a few minutes and wondering why it’s not drawing anything.
Meanwhile, Apple’s ads talk about how they’ve painstakingly redesigned the iPad screen to have per-pixel accuracy with double the sampling rate and no lag, combining their gift for plausible-sounding techno-marketing jargon with GIFs that show the pen drawing precise lines on an infinite grid. That difference seems symbolic of something, although I’m not exactly sure what.
The Impersonal Computer
I’ve been pretty critical of Microsoft in a post that’s ostensibly about how I don’t like an Apple product. To be fair, the Surface Book looks good enough to be the best option for a laptop/tablet hybrid, and it’s clear some ingenious work went into the design of it — in particular, putting the “guts” of the machine into the keyboard.
I’m just convinced now that a laptop/tablet hybrid isn’t actually what I want. And I think the reason I keep going back to marketing and symbolism and presentation and the “good old days” of Apple is that computers have developed to the point where the best computer experience has very little to do with what’s practical.
I get an emotional attachment to computers, in the same way that Arnie Cunningham loved Christine. There have been several that I liked using, but a few that I’ve straight-up loved. My first Mac was a Mac Plus that had no hard drive and was constantly having to swap floppy disks and had screen burn-in from being used as a display model and would frequently shut down in the middle of doing something important. But it had HyperCard and Dark Castle and MacPaint and the floppy drive made it look like it was perpetually smirking and it was an extravagant graduation gift from my parents, so I loved it. I liked the design of OS X and the PowerBook so much that I even enjoyed using the Finder. I tried setting up my Mac mini as a home theater PC mostly as an attempt to save money on cable, but really I just enjoyed seeing it there under the TV. Even a year into using my first MacBook Air, I’d frequently clean it, ostensibly to maintain its resale value but really because I just liked to marvel at how thin and well-designed it was.
I used to think that was pretty common (albeit to healthier and less obsessive degrees). But I get the impression that most people see computers, even underneath all their stickers and cases to “personalize” them, as ultimately utilitarian. A while ago I had a coworker ask why I bring my laptop to work every day when the company provided me with an identical-if-not-better one. The question seemed absolutely alien to me: that laptop is for work; this laptop has all my stuff.
Another friend occasionally chastises me for parading my conspicuous consumption all over the internet. I can see his point, especially since the Apple logo has gone from a symbol of “I am a creative free-thinker” to “I have enough money to buy expensive things, as I will now demonstrate in this coffee shop.” But I’ve really never understood the idea of Apple as status symbol; I’ve never thought of it as “look at this fancy thing I bought!” but “look at this amazing thing people designed!”
The iPad was the perfect manifestation of that, and the iPad mini even more so. Like a lot of people, I got one mainly out of devotion to a brand: “If Apple made it, it’s probably pretty good.” I had no idea what I’d use it for, but I was confident enough that a use would present itself.
What’s interesting is that a use did present itself. I don’t think it’s hyperbolic to say that it created an entirely new category of device, because it became something I never would’ve predicted before I used it. And it’s not a matter of technology: what’s remarkable about it isn’t that it was a portable touch screen, since I’ve known I wanted one of those ever since I first went to Epcot Center. I think what’s ultimately so remarkable about the iPad is that it was completely and unapologetically a supplemental computer.
Since its release, people (including me) have been eager to justify the iPad by showing how productive it could be. Releasing a version called the “Pro” would seem like the ultimate manifestation of that. But I’m only now realizing that what appealed to me most about the iPad had nothing to do with productivity. I don’t need it to replace my laptop, since I’m fortunate enough to be able to have a laptop. And the iPhone has wedged itself so firmly into the culture that it’s become all but essential; at this point it just feels too useful to be a “personal” device. (Plus Apple’s business model depends on replacing it every couple of years, so it’s difficult to get too attached to one).
Apple’s been pitching the watch as “their most personal device ever,” but I wouldn’t be devastated if I somehow lost or broke the watch. My iPad mini, on the other hand, is the thing that has all my stuff. Not even the “important” stuff, which is scattered around and backed up in various places. The frivolous, inconsequential stuff that makes it as personal as a well-worn notebook.
Once I had the iPad Pro set up with all my stuff, I was demoing it to a few people who wanted to see it. Obviously with coworkers, but even, surprisingly, when showing it to my boyfriend, there was a brief moment of hesitation where I wondered if I was showing something too personal. I don’t mind anybody using my laptop or desktop, or sharing my phone with someone who needs it, but I’ve got a weird, very personal attachment to the iPad. (And not just because I treat my Tumblr app like the forbidden room in a gothic novel which no one must ever enter).
It’s entirely possible that I’m in the minority, and whatever attachment most people have to “their stuff” is to the stuff itself in some nebulous cloud, and not the device that’s currently showing it to them. It’s even more likely that there’s simply no money to be made in selling people devices that they become so attached to that they never want to give them up. It may be that Convergence is The Future of Personal Computing, and one day we’ll all have the one device that does everything.
After using the iPad Pro, I’m no longer convinced that a big iPad that also functions as a laptop is what I want. I really want a “normal”-sized iPad that’s just really good at being an iPad. Which means adding support for the Apple Pencil to the iPad Air.
So I’m back to hoping Apple’s already got one of those in the pipeline, and waiting until it’s announced at some point next year, and then ordering one the second they’re available and then trying to justify it as a rational and well-considered purchase. Next time for sure it’s going to be exactly the computer I want.
The Apple TV sure seemed like a good idea… at first!
On the surface (sorry), it seemed like Apple had made all the right decisions with its new product announcements yesterday. [For future anthropologists: new Apple Watches, a bigger iPad with a stylus, and Apple TV with an app store, and iPhones with better cameras and pressure-sensitive input. Also, the title of this blog post is a reference to something that happened a few months ago that nobody cares about now. — Ed.]
I’ve wanted an iPad with a stylus since before the iPad was even announced, so long ago that my image links don’t even work anymore! And I’ve been wanting a lighter laptop to use as purely a “personal computer” in the strictest sense — email, social media, writing, whatever stuff I need to get done on the web — and keep finding myself thinking “something like a MacBook Air that doubles as a drawing tablet would be perfect!” In fact, the iPad Pro is pretty close to what I’d described years ago as my dream machine but cheaper than what I’d estimated it to cost.
There’s been a lot of grousing online about how Apple’s acting like it invented all of this stuff, when other companies have had it for years. On the topic of pen computing, though, I can unequivocally say no they haven’t. Because over the years, I’ve tried all of them, from Tablet PCs to the Galaxy Note to the Microsoft Surface to the various Bluetooth-enabled styluses for iOS. (I’ve never been able to rationalize spending the money for a Cintiq, because I’m just not that great an artist). I haven’t tried the iPad Pro — and I’ll be particularly interested in reading Ray Frenden’s review of it — but I know it’s got to be at least worth investigating, because Apple simply wouldn’t release it if it weren’t.
Even if you roll your eyes at the videos with Ive talking about Apple’s commitment to design, and even if you like talking about Kool-Aid and cults whenever the topic of Apple comes up, the fact is that Apple’s not playing catch-up to anyone right now. They’ve got no incentive to release something that they don’t believe is exceptional; there’d be no profit in it. The company innovates when it needs to, but (and I’m not the first to say it): they don’t have to be the first to do something; they just have to be the first to do it right. And they’ve done exactly that, over and over again. The only reason I may break precedent and actually wait a while to get a new Apple device is because I’m not convinced I need a tablet that big — it’d be interesting to see if they’ll release a pen-compatible “regular-sized” iPad.
And if I’ve been wanting a pen-compatible iPad for almost a decade, I’ve been wanting a “real” Apple-driven TV set-top box for even longer. The first time I tried to ditch satellite and cable in favor of TV over internet, I used a bizarre combination of the first Intel Mac mini with Bootcamp to run Windows Media Center, a Microsoft IR remote adapter, a third party OTA adapter, and various third party drivers for remotes and such, all held together with palm fronds and snot. I’ve also tried two versions of the “hobby” Apple TV, relics of a time when Apple was known for glossy overlays, Cover Flow, and an irrational fear of physical buttons. Basically, any update would’ve been welcome.
But the announcement yesterday was a big deal, obviously, because they announced an App Store and an SDK. Which turned it from “just a set-top box” into a platform. That’s as big a deal for customers as it is for developers, since it means you don’t have to wait for Apple to make a new software release to get new stuff, content providers can make their own apps instead of having to secure some byzantine backroom deal with Apple to become a content channel, and some developers will come up with ways to innovate with the device. (Look to Loren Brichter’s first Twitter client as a great example of UI innovation that became standard. Or for that matter, Cover Flow).
And for games: I don’t think it’s an exaggeration to say that the iOS App Store has done more to democratize game development than anything, including Steam as a distribution platform and Unity as a development tool. Whether it was by design or a lucky accident, all the pieces of device, software, market, and audience came together: it was feasible to have casual games ideally played in short bursts, that could be made by small teams or solo developers, and have them reach so many millions of people at once that it was practical and (theoretically) sustainable.
I hope nobody expects that the Apple TV will become anywhere near as ubiquitous as the iPhone (or even the iPad, for that matter), but still: opening up development creates the potential for independents to finally have an audience in the console game space. It’d be like the Xbox Live Indie Games and XNA, if all the games weren’t relegated to a difficult-to-find ghetto separate from the “real” games. Or like the Ouya, if they’d made a device that anyone actually wanted to buy.
Game developers love saying that Apple doesn’t care about games and doesn’t get how games work — as if they’d just inadvertently stumbled into making a handheld gaming device that was more popular than Nintendo’s and Sony’s. You could look at the new Apple TV the same way, and guess that while trying to secure deals with big content providers and compete with Amazon or “Smart” TV manufacturers, they’d accidentally made a Wii without even trying.
There’ve been enough game-focused developments in the SDK, and in the company’s marketing as a whole, to suggest Apple really does get it. (Aside from calling Disney Infinity “my favorite new Star Wars game”). But there are a couple of troubling things about the setup that suggest they expect everything on the TV to play out exactly the same way that it has on smartphones and tablets.
First is that the Apple TV has a heavy reliance on cloud storage and streaming of data, with a pretty severe limitation on the maximum size of your executable. They’ve demoed smartphone games on stage (Infinity Blade) that were 1 GB downloads, so it’s not inspiring to see a much smaller limit on downloadable size for games that are intended to run on home theater-sized screens. Maybe it’s actually not that big a problem; only developers who’ve made complete games for the Apple TV would be able to say for sure. But for now, it seems to suggest either very casual games, or else forcing players to sit through very long loading times. The latter’s been enough of a factor to kill some games and give a bad reputation to entire platforms.
Second is the emphasis on universal apps. They mentioned it at the event and just kind of moved on. I didn’t really think much of it until I saw this from Neven Mrgan:
Universal apps = haha no seriously good luck making money, folks.
You could take the most mercenary possible interpretation of that, which is what people always do once the economics of software development comes up: “Big deal! Having one app is what’s best for consumers! What’s best for consumers always wins, and it’s the developers’ responsibility to adjust their business model to enable that!” Also “Information wants to be Free!!!”
Except what’s best for consumers is that the people making great stuff can stay in business to keep making great stuff. And we’ve already seen on iOS exactly what happens when developers “adjust their business models” to account for a market that balks at paying anything more than 99 cents for months to years of development. Some big publishers (and a few savvy independents, like Nimblebit) came in and made everything free-to-play with in-app purchases. Maybe there is a way to make a free-to-play game that doesn’t suck (and again, Nimblebit’s are some of the least egregious). But I can’t see anybody making a believable case that the glut of opportunistic games hasn’t been a blight on the industry. I was out of work for a long time at the beginning of this year, and it was overwhelmingly depressing to see so many formerly creative jobs in game development in the Bay Area that now put “monetization” in the job title.
Believe me, I’d love it if one of these publishers went all-in on the Apple TV, and then lost everything because they didn’t take into account that they were pandering to a different audience. But that’s not what would happen, of course. What would happen is that a couple of the big names would see that they can’t just fart out a “plays on your TV screen!!!” version of the same casual game and still make a fortune off of it, so they’d declare the entire platform not worth the effort. And then smaller studios who are trying to make stuff that takes specific advantage of the Apple TV “space” will be out of luck, because there are no big publisher-style marketing blitzes driving people to the platform. You need a combination of big names and smaller voices for a platform to work: again, see XBLIG.
It just seems as if there’s no recognition of the fact that there’s a lot more differentiating a game you play on your phone from one you play on your television than just the screen size. It seems especially tone-deaf coming from a company like Apple, which has made a fortune out of understanding how hardware and software work together and what makes the experience unique. (Part of the reason that iOS has had so much success is that they didn’t try to cram the same operating system into a laptop and a smartphone).
At least the games on display showed evidence that they “get it.” The game demoed by Harmonix took advantage of the stuff that was unique to the Apple TV — a motion-sensitive controller and (presumably) a home theater-quality audio system. And even Crossy Road, which would seem like the worst possible example of shoveling a quick-casual game onto a TV screen and expecting the same level of success, showed some awareness of what makes the TV unique: someone sitting next to you playing the game, or at least having other people in the room all able to see something goofy happening on your screen.
I haven’t seen enough about tvOS to know if Universal apps are actually a requirement, or just a marketing bullet point and a “strong recommendation” from Apple. (Frankly, since I’m trying to make an iPad-only game, I’m ignorant of the existing requirements for iOS, and whether they restrict developers from releasing separate iPad-only or iPhone-only versions of the same software). So maybe there’ll be a market for separate versions? And somehow, magically, a developer will be able to release a longer, more complex game suitable for a home entertainment system, and he won’t be downvoted into oblivion for being “greedy” by asking more than ten bucks for the effort.
And there’s been some differentiation on the iPad, too. Playing XCOM on the iPad, for example, is glorious. That’s not a “casual” game — I’ve had sessions that lasted longer than my patience for most recent Xbox games — but is still better on the iPad because you can reach in and interact with the game directly. I could see something like that working — I’d pay for a game with lower visual fidelity than I’d get on Xbox/PS4/PC, if it had the added advantage that I could take it with me and play on a touchscreen.
So I could just be reactionary or overly pessimistic. But it’s enough to take what first seemed like a slam-dunk on Apple’s part, and turn it into an Ill Portent for The Future Viability Of Independent Game Development. As somebody who’s seen how difficult it was to even make a game in The Before Times, much less sell one, the democratization of game development over the past ten years has been phenomenal. And as somebody who’s finally realized how much some game studios like to exploit their employees, it’s incredible to be in an environment where you can be free of that, and still be able to realize your passion for making games.
The reason I first wanted to learn programming was being at a friend’s house, watching them type something into their VIC-20, and seeing it show up on screen. It was like a little spark that set me down a path for the next 40 years: “Wait, you mean I can make the stuff that shows up there, instead of just sitting back and watching it?” It’d be heartbreaking to see all the potential we’re enjoying right now get undermined and undone by a series of business decisions that make it impractical to keep making things.
Worst case, it’ll be another box that lets me watch Hulu. I was down to only eight.
My experiences with the latest advance from Apple that’s disrupting the ecosystems of wearable technology, order fulfillment, and personal fulfillment.
As a well-known “early adopter,” I feel I’ve got an obligation to share my experiences with bleeding-edge advancements in SoC-powered wealth redistribution with users who are more on the fence, baffled by the increasing number of options in wearable technology.
A lot of you have lots of money but no time to wade through all the industry jargon; you just have simple questions that you need answered: “What is the Apple Watch?” “Why haven’t I read or heard anything about it?” And most importantly: “Does Chuck have one yet?”
I can go ahead and conclusively answer the last question: No.
If you were hoping that the Apple Watch would finally be the game-changer that makes me satisfied with the number of gadgets I own, you’re probably better off waiting a month or two. The first versions of Apple products are known for being a hint of the advancements and refinements yet to come, more than complete, functional devices. It’s as if with the Apple Watch, Jony Ive and his team of designers at Apple are giving us a roadmap for the future, announcing to the world: This is what the smart watch will be like, some time in early July when Chuck is actually able to have one.
So the question remains: is it really that insufferable to be waiting for the delivery of an expensive, inessential device, while surrounded by other people who already have theirs? Let’s find out.
How The Other Half Lives
Marketing Apple’s Most Personal Device Ever
Apple had to take a different approach with their first foray into the world of wearable technology. That meant making sure that before the product even hit stores, watch models were made available to the leading tastemakers: the technology and gadget bloggers who’d complain that Pharrell and will.i.am were posting Instagram pictures of their watches before any of the reviewers could get one.
By now, you’ve no doubt seen the “Big Guys” offer up their opinions about the Apple Watch (42mm Steel with the Milanese Loop band, universally), and their experiences with glances, taptic feedback, the Activity tracker, re-charging it every day, and the importance of selectively disabling notifications. By virtue of the mathematical study of combinatorics and the number of words in the English language, each reviewer’s take is, strictly speaking, unique.
You’ve seen a quirky first person attempt to free the device from Jony Ive’s perfectly-controlled environment and present it in a more realistic day-to-day setting: a tech blogger in New York City with a head-mounted camera. You’ve doubtless savored the definitive review from a suave globetrotting secret agent tech blog editor figuring out how this new innovation fits into a busy day packed with meetings and treadmill-running, including an up-close look at how hard it is to execute cross-site web content scheduling in a New York City bar with the double distractions of a watch constantly tapping your wrist, and a full camera and lighting crew having to run multiple takes of video while in a New York City bar. You’ve seen a stop-motion animated version with paper cutouts, for some reason. By now, you’ve even seen the Tech Reviewer Old Guard offer another look back at the watch after using it for a month.
What none of those so-called “professional” reviews will tell you is what life is like for real people who don’t have the product being reviewed. Sure, you occasionally get somebody like Apple insider and sarcasm enthusiast Neven Mrgan making a feeble attempt to relate to The Rest of Us outside Apple’s walled garden clique, but how much can you really say about an experience after only a week or two? How does that experience change after an entire month? [Full disclosure: Mr. Mrgan graciously offered a royalty-free license for me to completely rip off the premise of this blog post, presumably by effortlessly dictating said license into the always-on AI assistant of his futurewatch].
It’s Finally Here
Just Not For You
One thing that none of the reviews mention is how much of the Apple Watch experience is dependent on having not just an iPhone, but an actual physical Apple Watch. The site iMore.com, for example, offers a list of what the Apple Watch can do without an iPhone, but makes no mention of what can be done without the watch itself.
That’s a perfect example of how blog developers are adjusting to the new paradigms introduced by the Apple Watch: their reviews aren’t as content-focused as those for more traditional devices like the iPhone. Instead, they’re best consumed as “glances,” not meant to be “read” so much as absorbed in quick seconds-long bursts throughout the day, every day, for months.
The truth is that there’s no amount of parallax scrolling and full-screen looping background video that will provide a truly definitive review of life without Apple’s latest must-have. For that, you need to go to Apple itself.
That trademark Apple design is evident from first glance: the photographs of other people with their watches bleed right up to the bezel of the laptop screen, putting a subtle but unmistakable emphasis on the object that you don’t have. It’s a perfect example of how Apple makes cold hardware more personal, by telling a personal story: This woman has a watch and you don’t. She is a ballerina. What does she need a smartwatch for? She can’t possibly have her iPhone in range; her pockets are too small. Also the screen is likely to come on frequently as she moves her arms, causing a distraction to the other dancers. Did she not think this through? I wonder if she ordered her watch at midnight instead of waiting. A good night’s rest is very important for dancers, so it seems foolish to forsake that just to get a new watch that can’t even give incoming message notifications. Not to mention that dancers aren’t usually paid well enough to be spending hundreds of dollars on a watch. I bet she didn’t even wait in line for a new iPhone every other year since the first model, like I did. Who does she think she is, anyway?
This is also likely to be your first bit of frustration when dealing with the lack of an Apple Watch: because the title photograph has to do a full round-trip circuit from designer to marketing team to photographer and model to graphic designer to web publisher, it can get hopelessly out of sync with reality. I still find myself reading the notification “The Watch is Here,” and then glancing down at my wrist only to confirm that it’s most assuredly not here. I hope this is fixed in a future update.
The Best Part of Waking Up
Getting Into the Groove of a Daily Routine Without Your Apple Watch
Apple’s attention to detail and design carries through the rest of the experience. There’s no garish “Order Status” menu, for example; instead, there’s a simple “Store” menu that reveals more beautifully photographed images of the product you don’t have.
It’s only there that you find a friendly drop-down menu that takes you to “Order Status.” That page will ask you for your password every time you open or refresh it throughout the day — you’ll be doing this a lot, so I recommend using a password manager like 1Password.
In the month since I ordered an Apple Watch, I’ve really started to notice how I use technology differently throughout the day and in different locations. On the laptop, for instance, I hardly ever use the Delivery Status widget to track the status of my shipment, both because of the decreasing relevance of the OS X Dashboard, and because after 5 weeks the order is still in “Processing” status without a tracking number. Instead, I prefer to go to the Apple Store page, bring up the order status, enter my password, refresh the page, wait a few seconds, and refresh the page again, sigh, then refresh it one more time. I would’ve thought that this would feel like an intrusion, but it’s become such an integral part of my morning routine that I hardly even notice it anymore.
While out around town, not going to bars or important meetings, it’d be a lot more convenient to bring up the Apple Store app on my phone. In practice, though, the app requires me to type my password again every time I want to check the order status, so I end up not bothering. Maybe they’ll fix this sometime within the next 5-6 weeks. In a perfect world, I could have some type of device on my wrist that could give me order updates with just a “glance.”
On the Order Status page, you’ll see the estimated delivery window in an elegant but still-readable font. Apple still knows how to make the most of the user experience, giving a moment of delight as you see the estimate change from “June” to “5-6 weeks.” These displays are made possible by “complications,” a term Apple is borrowing from the watchmaking industry to describe things like doing a huge marketing push for a product release that depends on faulty haptic feedback engines from overseas manufacturers.
Apple makes it really easy to go back to the main store page from the Order Status page, so you can get a beautiful, detailed look at all the various models and colors of watches you don’t have. It’s fun for running “what if?” type experiments, such as “Could I cancel my order and instead get one of the dainty models with a pink band? Would that ship any faster?”
There’s also support for Apple’s new “Force Touch” technology, in which you give a long, exasperated sigh followed by a sharp slamming gesture on all of the keyboard’s keys simultaneously, or pressing a closed fist firmly and repeatedly on the laptop’s trackpad. This gives helpful feedback in the form of Safari crashing. It definitely takes some practice, but in my experience, it became second nature the more often I saw my colleagues unwrapping their just-delivered Apple Watches near my desk.
I Regret Reading a Gadget Blog Post (and I knew I would)
The Cold, Hard Sting That Can Only Happen When You Physically Open Your Wallet
Even though the watch is only available online and who the hell writes for a technology blog but still has to physically open his wallet when he buys stuff online?
He Should Try Apple Pay
Unless Maybe He Also Bought a Really Expensive Wallet, And He Just Likes the Way It Feels
As a mobile software developer in San Francisco, I’ve already seen how the release of the Apple Watch has changed my routine. During my morning workout (two reps climbing up BART station stairs, followed by an intensive 1.5 block walk), I enjoy listening to podcasts that keep me on the bleeding edge of the most disruptive of apps and innovators. (ICYMI: My essential travel gear). (I recommend Overcast for podcast-listening, even if you’re going truly old-school and changing podcast tracks on your Bluetooth headphones by manipulating actual buttons on your touchscreen-enabled wireless mobile computer).
The gang at SixColors.com has been active on various podcasts, letting me know about their experiences after initial unboxing, two days, four days, a week, and several weeks later, while traveling, writing, and recording podcasts. In addition to the roundtable discussions where groups of people discuss how the watch I don’t have yet has changed their lives, I’ve gotten answers to the questions you don’t usually think about with some cursory product review. For instance: what if you have two watches, and you can’t decide which of them you want to keep? And: now that we’ve all had the opportunity to get used to our new watches, what would we most like to see in the new version?
Another highlight: an account of how the podcaster’s significant other, who isn’t much of a technology devotee and wasn’t that interested in the watch, became interested after seeing the podcaster use his for a few days, ordered one, received it, and is giving her first impressions. It’s a magical time, as if entire generations of wearable technology are happening all around me as I watch the Order Status page. Whole waves of Gawker Media-led backlashes are whooshing by with the lasting permanence of burrito farts, the only constants being me, a web site, and a refresh button.
Like Smith, I was initially unmoved by the announcement of a new device from Apple. I, too, had bought a Pebble watch but quickly got out of the habit of wearing it. I’ve gotten the first versions of other Apple products and often been surprised by how dramatically and how quickly they’re made obsolete by the next release. I, too, write rambling stuff on the internet that frequently makes me come across as an insufferable asshole. And I also find myself reluctantly falling back into the role of “early adopter” for the sake of completely irrational impulses — in my case, an animated Mickey Mouse watch face that taps his foot every second; in his case, enjoying buying unnecessarily expensive stuff that makes him look cool.
It was important to him to have the sapphire face and stainless steel body, whereas I have large wrists, so it really stands out when I roll my eyes and make a wanking gesture while reading the rest of his post.
We ordered different models of the watch, because we have different needs. He tried on the gold version and was invited to look at himself in a mirror, while I managed to get 10 minutes bending over a bench in an Apple store by scheduling an appointment a couple of days in advance. He fell in love with the Milanese band, while I could only justify getting the cheapest model by telling myself it was a birthday present for myself. He doodles tiny pictures of cocks to colleagues and concludes it’s not a life-changing device; I see colleagues with watches and go back to reading blog posts written by, apparently, sentient, literate cocks.
One More Thing
Adding a Semi-Pithy Coda About Consumerism to What Should Have Been a Short and Silly Blog Post to Make it Unclear How Much of Any of This Is Intended to be Sarcastic
This Is Why People Don’t Read Your Blog
For decades there’s been a tendency to be dismissive of Apple devotees as being cultish and image-obsessed, with more money than common sense. As Macs and iPhones got more ubiquitous (and cheaper), enough people caught on to the fact that good design actually has real value. There are, no doubt, plenty of people who put “shiny” and “has visible glowing Apple logo” high on their list of priorities, but I think they’re finally outnumbered by those of us who just want something that’s really well made. (And who’ve bought enough cheap computers for the sake of saving a few bucks to realize that it ends up costing more in the long run when it needs to be replaced). Now it’s only the cranks in forums and blog comments that insist on complaining about the “Apple Tax.”
When Apple announced a gold edition of its new watch, rumored to cost over ten thousand bucks, there were fears that it’d bring all the old class warfare back to consumer technology: the company was now explicitly targeting status-obsessed rich people.
As I look at photos of models tying up their toe shoes, or draping their watch-bedecked arms over other models to make out with them, or stopping mid-jog-through-the-Hollywood Hills, and I see the three clearly-delineated castes of watch available, and I commit a few hundred bucks to the “lowest” caste of thing that I didn’t even want a few months ago, and I get increasingly resentful of the people who already have their inessential thing, and even more annoyed when they have the more expensive version of the thing I don’t yet have (even though I wouldn’t even want the more expensive version), I’m just glad those fears turned out to be completely unfounded.
Making sense of the iPad mini in a world that doesn’t need it.
After my previous unfortunate episode in an Apple store, it should come as little surprise that I didn't last very long before I broke down and bought an iPad mini. No, it doesn't make sense for me to be throwing my credit card around as if I were the CEO of Papa John's or something. I've already got a perfectly fancy tablet computer that's not just functional, but really quite terrific. It's not like I'm getting paid to write reviews of these things, and even my typical “I need it for application development testing” is sounding increasingly hollow.
What helps is a new metric I've devised, which measures how long it takes me after a purchase before the appeal of the thing overwhelms my feeling of conspicuous consumption guilt over buying it. It's measured in a new unit: the Hal (named after Green Lantern Hal Jordan, the Jordan who does have willpower).
By that standard, the iPad mini clocks in with a new record of 0.03 Hals, or about 30 minutes after I opened the box. Because this thing is sweet, and I pretty much never want to stop holding it. I'm writing this post on it, as a matter of fact, even though a much more functional laptop with keyboard is sitting about three feet away from me at this very moment. But to use it would mean putting the iPad down.
The “finish” of the iPad mini, with its beveled edge and rounded matte aluminum back, is more like the iPhone 5 than the existing iPads. It makes such a difference in the feel of the thing that I can talk about beveled edges and matte aluminum backs without feeling self-conscious, as if I were a tech blogger desperately seeking a new way to describe another piece of consumer electronics.
It’s about as thin as the iPhone 5, and almost as light. With the new Apple cover wrapped around the back, it's perfect for holding in one hand. There have been several times that I've made fun of Apple, or Apple fanatics, for making a big deal about a few millimeters difference in thickness, or a few ounces in weight. And I joked about the appeal of the iPad mini, as if the existing iPad was unreasonably bulky and heavy.
But then something awful happened: I had to fly cross country four times within two weeks. And reading a book on the iPad required me to prop the thing up on the tray table and catch it as the person in front of me kept readjusting his seat. All my mocking comments were flying back in my face (along with the iPad, my drink, and the in-flight magazine), in the form of the firstest of first-world problems.
“Version 1 of the iPad mini is for chumps,” I said. “Check back with me once you've put in a higher resolution display, Apple.” In practice, though, the display is perfectly sharp. “Retina” isn't the make-or-break feature I thought it would be. You can certainly tell the difference when comparing the two; I'd assumed that squabbling over pixel density was something best left to the comments sections of tech blogs, but the difference in sharpness is definitely visible. It's really only an issue for very small text, though. Books, Flipboard, and web pages are all clear and legible.
And speaking of Flipboard, it and Tweetbot are the two apps that get me giddy enough to own up to making another unnecessary tech purchase. Browsing through articles and status updates on a tablet that thin is probably the closest I'll ever come to being on board the Enterprise.
The phrase I've seen recurring the most in reviews of the iPad mini is some variation on “this is the size the iPad is supposed to be.” And really, there's something to that. I'm not going to give up my other one; the larger size really is better for some stuff, like drawing, Garage Band, and reading comics or magazines. But overall, I haven't been this impressed with the “feel” of a piece of consumer electronics since I saw the original iPhone. Realizing that this is just version 1.0 is actually a little creepy — apart from the higher resolution display, I honestly can't conceive of how they'll improve on the design of the iPad mini.
Maps in iOS 6 bring about the downfall of western civilization, and my disillusionment with tech commentary continues.
Just days after the entire developed world sank into a depressive ennui due to Apple’s boring new smart phone, society was rocked to its foundations by the unmitigated disaster that is iOS 6’s new Google-free Maps app. Drivers unwittingly plunged their cars into the sea. Planes over Ireland crashed into nearby farms due to mislabeled icons. College students, long dependent on their iPhones to find their way around campus from day to day, were faced with a featureless blob of unlabeled buildings and had no option but to lie down on the grass and sob. Huge swaths of Scotland were covered with an impenetrable fog, and the Brooklyn Bridge collapsed.
Throughout the entire ordeal, Tim Cook only stopped giving the world the middle finger long enough to go on Saturday Night Live and rip up a picture of Steve Jobs. Jobs’s only recourse was to haunt the homes and workplaces of thousands of bloggers, commenters, and Twitter users, moaning “You’re the only one who truly understands what I wanted. Avenge me!”
At least, that’s the way I heard it. You want proof? It’s right there in the video, the one where they say “The Statue of Liberty? Gone!” while showing a picture of the Statue of Liberty. (Psst… hey, The Verge people — it’s that green thing in the middle of that star-shaped island). You think just because it’s “journalism” they have to have sources to show that it’s a serious, widespread problem? Check it, Jack: a tumblr full of wacky Apple maps mishaps.
And no, it doesn’t matter that the vast majority of those are complaints about the 3D Flyover feature, which was universally acknowledged as being a “neat but practically useless” feature of the maps app as soon as it was released, because shut up that’s why.
Of course, since I’m a relentless Apple apologist, I’m focused, Zapruder-like, on one tiny six-second segment of that three-minute long video: the part that says “For walking and driving, the app is pretty problem free.” And I’m completely ignoring the bulk of the video, which shows incontrovertible evidence that not everything is 3D modeled and lots of things end up looking kind of wavy.
Sarcasm (mostly) aside, my problem with this isn’t “oh no, people are picking on Apple!” My problem is that the people who are supposed to be authorities on tech — and to be clear, it’s not just The Verge, by a long shot — keep spinning the most shallow observations into sweeping, over-arching narratives. (And no, I haven’t seen a single Verge post about Apple in the past week that’s neglected to find a way to link to that 73-degrees-Apple-is-timid post).
The tech journalists are the ones who are shaping public opinion, so I don’t think it’s unreasonable to expect an attention span of longer than a week and a short term memory of longer than a year. And as a result, I’m going to hold them responsible every time I read something dumb on the internet.
To be fair, even though the video just says “Apple’s supposedly been working on its version of Maps for 5 years, and it’s resulted in an app that’s inferior to what was there before,” and leaves it at that, the article does mention that Google’s data has been public for 7 years. And points out that the data gets refined with the help of location data from millions of iPhones and Android devices.
But why make it sound as if the decision to break free from dependence on Google was arbitrary on Apple’s part? By all accounts, Jobs had a preternatural grudge against Google for releasing Android. But it’s not as if the Maps app on iOS 5 and earlier was feature-equivalent to Google Maps on Android, and Apple’s deciding to roll their own was a completely petty and spiteful decision. Android’s had turn-by-turn directions for a while now, and there were no signs that it was ever coming to the Google-driven app on iOS.
Was that a case of Google holding out, or Apple not bothering with it because they knew they had their own version in the works? I certainly don’t know — it’s the kind of thing it’d be neat for actual tech journalists to explain — but it ultimately doesn’t matter. The licensing deal with Google ran out, so Apple’s options were to reassert their dependency on their largest competitor, or to launch with what they had.
And incidentally, whenever someone says “Steve Jobs would never have allowed something this half-assed to be released!” it’s the tech journalists’ responsibility to remind them that the iPhone shipped without an SDK and nothing but Apple’s assurance that Web apps were The Future. Or that Jobs had no problem releasing the original iPhone without support for Flash video, even though there was an outcry that Flash was crucial to the user experience.
I installed iOS 6 on the iPad and tried out a few practical searches. It found everything I needed, and it actually gave me more relevant information than I remember the Google version giving me, since I was looking for businesses, and Yelp automatically came up with business hours. Of course, my experience means very little, since I happen to live in the one part of the world that’s going to be most aggressively tested by Silicon Valley companies. I have little doubt that Europe and Asia are going to have a harder time of it, and obviously they’re not markets to be ignored. But it’s not a one-size-fits-all problem, so it’s silly to treat it like one.
Apple has no problem calling Siri a beta, so they probably should’ve called Maps a beta as well. It’s a huge part of why people use smart phones, so it’d be foolish to imply that serious inaccuracies are no big deal. Regardless, it’ll work well enough in a lot of cases for most Americans, and in the cases where it doesn’t work, the web version of Google Maps is still available (and you can set up a link on the home screen with its own icon, even). Maybe Google and Apple will reach enough of a détente for a third party Google Maps app to get released. Maybe it’ll even finally bring turn-by-turn directions, or Apple will even allow third party apps to be default handlers for links!
Until then, maybe we can stop with the melodrama and the hyperbole, and just appreciate Apple Maps as version 1.0 mapping software with a neat extra feature of peeking into an alternate reality where Japantown in San Francisco has been overtaken by giant spiders.
Tech writers are disillusioned with the iPhone 5, and I’m getting disillusioned with tech writers.
I understand that if you’re writing about technology and/or gadgetry, a significant part of your job is taking a bunch of product announcements and reviews, and then fitting them into an easily-digestible, all-encompassing narrative. Usually, though, there’s at least an attempt to base that narrative off of reality, “ripped from today’s headlines” as it were. Lately, it seems like writers are content with a story that’s loosely based on actual events.
For the week or so building up to the iPhone 5 launch and the days after, the narrative has been simple: “The iPhone 5 is boring.” Writing for Wired, Mat Honan says, essentially, that it’s boring by design. And really, fair enough. Take Apple’s own marketing, remove the talking heads on white backgrounds, remove the hyperbole, and give it a critical spin, and you’ll end up with basically that: they’ve taken the same cell phone that people have already been going apeshit over for the past five years, and they’ve made it incrementally better.
Take that to its trollish extreme, and you have Dan Lyons, who wrote for Newsweek and created the satirical “Fake Steve” blog, in which he pretended to be Steve Jobs. He wrote a “provocative” take on Apple’s past year, including the iPhone 5 announcement, for BBC News, in which he spends a couple of paragraphs reminding us why we should care about his opinion (it’s because he wrote the blog in which he was “Fake Steve”), then mentions that he dropped the blog out of respect for Jobs’s failing health, and then invokes Jobs’s memory several times. It’s an “analysis” of Apple that’s as shallow and tired as you can possibly get without actually saying Micro$oft: he actually uses the word “fanboys.”
We can all acknowledge it, give him the attention he needs, and then move on; there’s absolutely nothing there that wouldn’t get him laughed off of an internet message board. Lyons doesn’t even have the “I speak for Steve Jobs” thing going for him, since everybody has an opinion on how things would be different if Jobs were still in charge.
What’s more troubling to me is seeing writers who are usually worth reading take a similar approach: building a story that’s driven mostly by what other people are writing. Any idiot can regurgitate “news” items and rearrange them into a cursory non-analysis. (And that’s worth exactly as much as I got paid for writing this, which is zero, in case you couldn’t tell already). Is it too much to ask for insight? Or, at the very least, to wait and see what the actual popular consensus is before making a declaration of what popular opinion should be?
If It Ain’t Broke, Fix It Anyway Because It Has Grown Tiresome to Me
On The Verge, Dieter Bohn found a perfect analogy for the iPhone in its own Weather app icon. He turned Honan’s piece from one blog post into the “prevailing opinion” about the announcement, and then took Honan’s reasonable but back-handed compliment and turned it into an accusation: the iPhone isn’t boring but timid. Sure, the hardware’s fine, but whatever: where Apple has failed is in showing no willingness to experiment with the core operating system or UI.
The mind-numbingly tedious drudgery of having to close multiple notifications under iOS (which, incidentally, I’ve never once noticed as a problem) proves that Apple’s entire philosophy is a lie. You’ve got a reputation for “sweating the details,” Apple? Really? Then how can you possibly explain this?! — as he thrusts a phone full of email notifications into the face of a sheepish Tim Cook, while Jobs’s ghost shakes his head sadly.
I honestly don’t want to be too harsh with any of the writers on The Verge, since I visit the site multiple times daily, and I really like their approach a lot. But it doesn’t take much insight to notice that interface consistency isn’t just a dominant driving factor of UI design in general; it’s been central to Apple’s philosophy since the introduction of the Mac. We’ve been using “version 10” of the Mac OS for 10 years now, and while the appearance and the underlying hardware have changed dramatically, the core functionality is largely the same. Intentionally. It’s only with the still-new Mountain Lion that Apple’s made changes to how things work on a fundamental level, and, in my opinion, it hasn’t been all that successful. (I don’t understand all the people whining about faux leather while saying relatively little about changing the entire structure of document management for the worse).
On top of that, though, there’s the fact that The Verge has spent at least a month covering Apple v. Samsung, a trial in which Apple spent a ton of time, effort, and presumably money defending the very UI that Bohn claims needs a dramatic overhaul. Yes, Microsoft has done considerable work to dramatically rethink the smart phone UI. But consistency is how Apple wants it: they spent a month saying, “this is ours, we made it, we like it, stop copying it.” Could it use some refinement? Of course. It always can, and some of those good ideas will come from other attempts at inventing a new cell phone OS. Does it need a dramatic re-imagining? No, unless you’re irrationally obsessed with novelty for its own sake, as a side effect of being steeped in cell phone coverage 24/7 to the point of exhaustion.
The Audacity of Swatch
Speaking of that, there’s Nilay Patel’s write-up of the new iPod Nano. Again, I think Patel’s stuff is great in general. But here, he carries on the “boring but actually it’s timid” idea by linking to Bohn’s article, and then goes on to build a story about what it says about Apple in general. Essentially, they’ve become The Establishment, too afraid of change to take any risks. With a product that’s changed dramatically in design in just about every iteration since the original.
“But that’s the old market, and the old way.” Apple, the story goes, isn’t about profiting from the planned obsolescence and impulse purchase cycle (which is news to all of us who have been conditioned to buy a new cell phone every 2 years and a new laptop every 3 or 4) but about pioneering new markets. The last iteration of the nano heralded a future of wearable computing; it could’ve been the harbinger of a bold new market for a more forward-thinking Apple: wristwatches.
Let’s ignore the fact that Patel himself acknowledges that the nano wasn’t a very good watch in the first place. What about the fact that smart phones pretty much killed the entire wristwatch business? I’m about as far from being fashionably hip as you can get, but I still get considerable exposure to what’s actually popular just by virtue of living in San Francisco. And I don’t know anyone who still wears a watch. It’s been at least five years since anyone’s been able to ask for the time and not have to wait for everyone to pull their phones out of their pockets or handbags. Why would Apple go all-in on a market that they themselves helped destroy?
(Incidentally, Patel quotes watch band designer Scott Wilson as saying “The nano in that form factor gave me a reason to have three iOS devices on my body.” I can think of the iPhone and the iPod Nano-with-watchband; neither Wilson nor Patel makes it explicit what the third device is. And now I’m afraid to ask, because I’m not sure I want to know what exactly this third device is, or where a person would enjoy sticking it).
Saying that the MP3 market is dead fails to acknowledge what killed it: that functionality, along with that of the point-and-shoot camera, has moved away from a dedicated device and towards the smartphone. Smartphones are expensive, even with a contract, and the more info we put onto them, the more they become irreplaceable. There’s still a market for a smaller, simpler, and relatively inexpensive MP3 player. There’s a clue to that market on Apple’s own marketing page, and the most prominent icon on its home screen: “Fitness.” Joggers and people who work out — at least from what I’ve heard, since I have even less familiarity with the world of exercise than I do with people who still wear wristwatches — want a music player they can take for a run or take to the gym without worrying too much if it gets lost or broken. They’ll get more use out of that than from a too-large wristwatch that has to be constantly woken from sleep and needs a headphone wire running to your wrist if you want to listen to music.
That’s where the new market is, ripe for Apple to come in and dominate: stuff like the Fitbit. I don’t have the actual numbers, of course, and I don’t even have any way of getting them, but I can all but guarantee that Apple sold more of the iPod nano armbands than it ever did watchbands. And I imagine it’s the same philosophy that made them put a place for carrying straps on the new iPod touch: it’s not that Apple doesn’t want to take risks with its flagship product, it’s that customers don’t want to take risks with the one device that handles all their communication with the world. For them, the iPod is an accessory.
Speaking of Consistency
But if you’re going to be making up stories, you should at least try to be consistent with them. On Slate, Farhad Manjoo has some serious issues with the new dock connector. He repeats the idea that the new iPhone is boring, but he uses the magic of sarcasm to make his point extra super clear. The problem is that so many details about the new phone leaked out weeks before release. By the time of the actual announcement, the world had already seen everything and stopped caring.
Got that? Apple’s problem is that it’s got to keep a tighter lid on its plans. Classic Apple, going blabbing about everything all over the press, flooding the market with product announcements. It’s boring everyone!
He’s bored by all the leaked information and the lack of any big bombshells in Apple’s announcement. Except for the big bombshell of the new dock connector. It’s pretty boring but also very impressive. It doesn’t take any risks but completely and unnecessarily changes the dock connector, destroying an entire industry of accessories. Apple has a long history of invalidating what it deems outdated technology, but this is different. Manjoo has to get a new alarm clock.
Also, the new phone is remarkably thin and light, “the thinnest and lightest iPhone ever made, and the difference is palpable.” But how could Apple possibly justify changing everything just for the sake of this new, thinner dock connector?
Back on The Verge, Vlad Savov describes all the leaks that bored Manjoo, and he mentions how embarrassing they must be for Tim Cook. He says that the problem is all the points of potential leaks in the supply chain, which have “drilled holes in Apple’s mystique.” In the same article, Savov links to The Verge’s own post about every detail that was leaked ahead of time.
Isn’t it a little disingenuous to be on a site that publishes front-page posts of pictures of new iPhone cases, a detailed report of the new dock connector, and a compilation of all the rumors and leaks to date, and then comment on the unprecedented demand for leaked information? It seems a little like prom committee chairman Edward Scissorhands lecturing everyone on their failure at putting up all the balloons.
And of course, on the first night of pre-orders for the boring new iPhone that nobody’s interested in, Apple’s online store was overloaded, and the phone sold out of its initial order within two hours.
On the topic of pots, kettles, and their relative blackness: I wasn’t that interested in the new iPhone, and I still chomped on the pre-order like a starving bass. I was much, much, much more excited about finally ditching AT&T than I was about the device itself. (So eager to get rid of AT&T that I’m willing to run into the arms of a company that’s no doubt almost as bad). Now that Apple’s not talking about magic, I can take their advertising at face value: I’m pretty confident that it is the best iPhone they’ve ever made. I’ve got an iPhone, and I like it, so I’ll get a better one.
What does that say about the state of gadget obsession that “I’m going to buy a new expensive device every two years even though I don’t technically need it” comes across as the most pragmatic view?
I can still remember when I first saw Engadget and thought the concept was absurd. A whole ongoing blog devoted solely to gadgets and tech? Is that really necessary? Then, of course, I got hooked, and started following it, and now The Verge and quite a few Apple-centric sites. If I’ve reached a point of gadget saturation just reading the stuff, what does it do to the folks having to write about it? It seems like it’s creating a self-perpetuating cycle of novelty for its own sake, which then drives commentary about this bizarre fixation on novelty for its own sake. You can’t even say “maybe just step away from it all for a bit;” it has to be a stunt, completely giving up the internet for an entire year while still writing for a technology-oriented site.
Whatever the case, it’d probably be a good idea to step back a bit. I’ll start doing that just as soon as I get my sweet new extra-long phone next week.