It Seemed Like the World Was Changing.

Matt Licata:

Apple—the philosophy, the hope—is dead. Maybe it never existed in the first place. And to be clear: the company didn’t build it and isn’t directly responsible for it. But their wild success made it feel like pleasant, user-centered design was on the verge of taking over the world and leaving thoughtful, carefully considered objects everywhere in its wake. If Apple could soar past RIM and Microsoft, why not?

What really happened was that ugly, hostile, “business”-oriented ways of doing things infected the new paradigm as much as the reverse. [ . . . ]

The start of the iPhone and iPad era offered a reprieve, but it was only a matter of time before the forces of garbage, disorder, relentless capitalism, and “best practices” caught up and squeezed through any hole they could find. These forces move more slowly than small, enthusiastic, independent creators, so it took them a little while, and it seemed like the world was changing. But it wasn’t.

The Moore’s Law Express Hits the Great Ceiling: A Possible Hitch to Alien Contact

Amid the discussions of the potential for contact with extra-terrestrial civilizations, there’s one big buzzkill I don’t recall ever hearing posited as a possibility for why we haven’t made contact yet: Because it can’t be done.

We are used to the idea that technology advances exponentially, that we are all riding the Moore’s Law Express to the Singularity, and that as long as we don’t destroy ourselves via world war, climate catastrophe, or extermination by the artificial intelligences we’ve created, we will be capable of wonders that we can’t even imagine today, just as our nomadic ancestors of 100,000 years ago could never have imagined a steam engine, library, vaccine, or iPhone.
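To put a rough number on that intuition, here’s a toy Python sketch of the classic Moore’s Law arithmetic (the two-year doubling period and the 1971 Intel 4004’s transistor count are the textbook figures; this is an illustration of compounding, not a prediction):

```python
# Moore's Law as compound interest: one doubling every ~2 years.
# Starting from the ~2,300 transistors of the 1971 Intel 4004,
# 25 doublings (50 years) gets you into the tens of billions,
# the same order of magnitude as today's largest chips.
transistors = 2_300
for _ in range(25):        # 25 two-year doublings = 50 years
    transistors *= 2
print(f"{transistors:,}")  # 77,175,193,600
```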

It follows that any other species on another world that has developed intelligence will get to hitch a ride on the same train. The details will differ: what they figure out first, what they emphasize, and what they’re physically capable of manufacturing will all vary. But given a clear path, they too will achieve unimaginably advanced technologies that will, among many other things, allow them to voyage the galaxy and make themselves known to its other inhabitants.

There are lots of reasons to think this won’t happen, or if it does, that we won’t ever be aware of it. In an excellent piece by Tim Urban that I found via John Gruber, several reasons for our ongoing celestial loneliness are offered, all pretty sensible (except the one about the government cover-up, which he also thinks is silly). Some examples:

Super-intelligent life could very well have already visited Earth, but before we were here. In the scheme of things, sentient humans have only been around for about 50,000 years, a little blip of time—if contact happened before then, it might have made some ducks flip out and run into the water and that’s it.

Getting the sole experience of First Contact is so like the ducks, you know?

Another follows the metaphor of ants trying to comprehend a nearby highway (one presumes they cannot):

[I]t’s not that we can’t pick up the signals from Planet X using our technology, it’s that we can’t even comprehend what the beings from Planet X are or what they’re trying to do. It’s so beyond us that even if they really wanted to enlighten us, it would be like trying to teach ants about the internet.

That’s very much in line with the Moore’s Law Express, where it just so happens that the Planet X-ians are so much further down the track that we can’t even see them.

Urban also puts forth the idea of a “Great Filter,” a kind of universal civilizational buffer zone that extraordinarily few species ever cross. Maybe it’s because of planetary or astronomical cataclysms killing off entire biospheres before they can evolve, or maybe it’s a near-inevitability of intelligent species destroying themselves, but either way, there may be some Rubicon that finishes off nearly all civilizations before they can become space-faring, let alone Type II or III.

(A side note about Type III civs, the kind that harvest an entire galaxy’s energy: Urban talks about how there might be a relatively small number of them that can inhabit any one galaxy, and I’m thinking, if they’re defined by their ability to eat up the energy of a whole galaxy, I have to imagine it’s a “there ain’t room for both of us in this one-horse town” kind of thing, where it’s not 1,000 Type IIIs in a given galaxy, but one, ever. But I digress.)

And he posits many other possibilities, and you should read the whole piece, because it’s really good.

But my thinking, which again is a real bummer, is that we need to consider the possibility that we haven’t made contact with alien civilizations because it simply can’t be done. The Moore’s Law Express actually does have a final stop, at which technological advancement more or less halts because of the limits of physics, or even just the limits of any intelligence (natural or artificial) to manipulate physics.

It might just be that traversing light years in a span of time that allows for survival, proliferation, or communication is simply impossible. It may be that there is no way to send communications signals of any known kind across the vast stretches of nothing that would allow another intelligence to receive them, let alone understand them.
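Just to make the scale concrete, here’s a back-of-the-envelope sketch (Proxima Centauri’s distance is the standard ~4.24 light-year figure; the speeds are arbitrary illustrative fractions of c, all far beyond anything we can currently build):

```python
# One-way travel time to the nearest star at various fractions of
# light speed. At a fraction f of c, covering d light-years takes
# d / f years as measured from Earth (time dilation shortens the
# trip for the crew, but not for anyone waiting at home).
distance_ly = 4.24                  # Proxima Centauri
for fraction_of_c in (0.0001, 0.01, 0.1, 0.5):
    years = distance_ly / fraction_of_c
    print(f"{fraction_of_c:7.2%} of c -> {years:10,.1f} years")
```

Even at half the speed of light, a speed with no plausible engineering path behind it, the round trip to our closest stellar neighbor eats most of two decades.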

Maybe there can and will be no warp speed, no folding of space, no teleportation, no subspace communications, no navigation of wormholes, no uploading of consciousness to interstellar servers, no Dyson Spheres, and no Singularity. As opposed to a Great Filter that finishes off civilizations on the way up, there may instead be a Great Ceiling, a lid on reality that says we (meaning we on Earth and any other species in the Universe) can go this far, but no further.

Now look, I know that thinking this way sucks, and it’s no way to get kids excited about science and exploration, or to rally the public to support more investment in scientific research. It is in our interest as a species and a civilization to cheerfully ride the Moore’s Law Express as though it has no terminus. But if the conversation about why we haven’t made contact with aliens is going to be an honest one, I think it has to at least acknowledge this sad possibility: not that “they” might not be out there, but that they are, and we simply can never know for sure, nor can they.

Okay, now pretend you never read this.

By the way, one potential way to travel the stars is by way of a Bussard Collector, and I just happen to have written a song about one. See? I have hope.

Your Unique Amalgam: On the Fluidity of Geekhood

Geek in training.

I had assigned myself* the task of writing a post about what it is to be a “geek.” It’s obviously not the same as it was when I was in school; geekhood no longer implies utter alienation from the mainstream, but rather membership in a kind of cultural elite, a priesthood that knows the Most Holy Secrets of computers (once nerdy, now cool), comic books (same), and science fiction. So am I a geek?

The easy answer is yes. I’m into a whole slew of traditionally geeky things like Star Trek and Macs and Monty Python and The Hitchhiker’s Guide to the Galaxy. But while I check many of the geek boxes, I can’t help but feel like the label still doesn’t suit me. And not because I’m too “preppy” or conformist, but because I don’t feel like I conform sufficiently to the geek clique.

I like Star Trek and some superhero movies, but I never got into comic books. I like science fiction generally, but I’ve never gotten deeply into that genre of books. I love my Mac and my iThings almost more than my children, but I don’t know anything about coding or software development, or even really how the damn things work.

And beyond some of the cultural (or pop-cultural) differences, there’s a class barrier as well, at least that I perceive. Being a geek, or so it seems from the tech blogosphere, is getting expensive. You not only need to have expensive devices (which I barely manage), but clothes and glasses and bags and notebooks and pens and coffee-makers and spirits and cameras and sometimes even cars that meet a certain aesthetic, and cost more than they probably ought to. Do I not qualify as a geek if I can’t see my way to purchasing $300 headphones or a $500 computer bag?

No one’s told me I can’t call myself a geek if I don’t meet all these criteria, of course. But I do feel outside the circle when I can’t match all these references, when I can’t afford the right paraphernalia, when I can’t speak the whole language but just get by with a phrasebook (a Moleskine phrasebook, of course). It all makes me recall the old King Missile song that stirred me as a sophomore in high school, “It’s Saturday,” where John S. Hall says, with wide-eyed eagerness:

I want to be different, like everybody else I want to be like
I want to be just like all the different people
I have no further interest in being the same
Because I have seen difference all around
And now I know that that’s what I want
I don’t want to blend in and be indistinguishable
I want to be a part of the different crowd
And assert my individuality along with the others
Who are different like me

What I’m perceiving, I suppose, is really just a popular modern conception and generalization of geekhood. But if you broaden the definition to mean “someone who is passionate about niche subjects,” then I still qualify. Not just on Star Trek and the more consumer-centric aspects of technology, but on things like media criticism, secularism, politics, acting, songwriting, and prose writing. I even retain some geekiness about small corners of policy like electoral reform! Allow me to go into detail about why you should be into instant runoff voting.
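(And since I brought it up, here’s the whole idea of instant runoff voting in a few lines of toy Python. The candidate names and ballots are made up for illustration, and real IRV rules add tie-breaking details this sketch ignores.)

```python
# Instant runoff voting: each ballot ranks candidates. Count every
# ballot toward its highest-ranked surviving candidate; if no one
# has a majority, eliminate the last-place candidate and recount.
from collections import Counter

def instant_runoff(ballots):
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        tally = Counter(
            next(c for c in ballot if c in remaining)
            for ballot in ballots
            if any(c in remaining for c in ballot)  # skip exhausted ballots
        )
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()) or len(remaining) == 1:
            return leader  # majority reached (or last one standing)
        remaining.remove(min(remaining, key=lambda c: tally.get(c, 0)))

# Cy is eliminated first; his voters' second choice (Ann) puts Ann
# over the top, even though Bob led on first preferences.
ballots = [["Ann"]] * 3 + [["Bob"]] * 4 + [["Cy", "Ann"]] * 2
print(instant_runoff(ballots))  # -> Ann
```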

And I reserve the right to get geeky about things down the road. I’m rediscovering a love of drawing (dumb cartoony things), and I may yet delve into Doctor Who one day, which would make my dad happy. Other things, like guitar playing and theatre, I’ve gotten less geeky about as other things in my life have left little room for them.

But geekhood needs to be fluid, I’d say. Especially if we want it to refer to something other than the relatively small circle of technological elites on the west coast. I’d like to think of geekhood, then, not as a description of a group that’s into a certain, prescribed set of things, but as the state of being deeply into one’s own unique amalgam of interests. That hodgepodge of perhaps-unrelated passions is your geekiness. That’s different.

_ _ _

* It’s not so much that I assigned myself, but that I was asked to write about geekdom by the folks at SingleHop, who do private cloud hosting and are doing a whole thing around what it is to be a geek. If you’d like more info about SingleHop, check out their new private cloud hosting page. Thanks, guys!

A Ravenous Insistence on Having an Opinion

Andy Greenwald, in a post that’s really about the show Louie, diagnoses the tweetosphere:

“We live in an era of opinions. In the Internet economy — in which I am a loyal and grateful participant! — loud voices are more than just currency, they’re coal. The Outrage Industrial Complex burns all day and all night with Twitter as its blistering engine room. A constant stream of fuel is necessary to keep the entire enterprise afloat, and so any event, be it the collapse of a government or the cancellation of a sitcom, is greeted with a near instantaneous torrent of reaction. Though the appeal of the virtual yawp can be undeniably intoxicating, I’m gradually finding it less and less tolerable. It’s no secret that nuance and doubt are rarely retweeted, but as Twitter has metastasized, its vaunted panoply of voices has grown more strident and, oddly, more unified — not in their positions but in their ravenous insistence on having one. It’s become less a conversation and more a crusade. Being silent is far worse than being wrong.”

This is my experience as well, though I’m not so sure about that last bit. I stay silent on a lot of things, partly because of my job, and partly because I don’t necessarily want to burn along with the rest of the coals, or whatever Strong Feeling I have about something has already been expressed by someone else, and better than I would have. And I don’t feel judged for this.

The one kernel of truth I find is that, if anything, having been silent, it makes it all the more apparent when I’m not. And then it’s not so much that I’d be coal, but something far worse: fresh meat. To some group of folks or other, I’d have the Wrong Opinion which would instantly render me a Bad Person, and they would let me know. A lot.

Freddie deBoer, in a post that’s really about reading, seems to get at this:

“You’re doing it wrong” is the internet’s truest, most genuine expression of itself. For whatever reason, the endless exposure to other people’s minds has made the vague feeling that someone, somewhere, is judging you into the most powerful force in the world.

Silly, isn’t it? I know it’s real people behind those tiny avatars, but they’re so removed that there’s no real sense of who they are. And even if you also know them personally, the Twitter experience is so ephemeral, yet perceived sins seem to last forever.

There Will Be No Hybrid Device (And That’s Good)

Microsoft insists that the Surface is “the tablet that can replace your laptop.” But even judging only from the press event’s enthusiastic demonstrations, without ever having held the thing, it’s clear that this simply isn’t true. Yes, it seems passable as a laptop, I suppose, but to qualify as something that “replaces” a laptop, it has to be as good as or better than a standard laptop (and really, better than a MacBook Air, a much higher bar). It’s clearly not. Its laptop functionality, for one, is a separate add-on, its keyboard not included. Its trackpad is reported to be inferior. Even with the keyboard attached and with the improvements Microsoft has made, it’s still unstable and awkward on a lap, according to reviews. So it fails here.

And as for the tablet part? No one is going to want to use the Surface as a tablet. It’s enormous, it’s heavy, and it has a poor tablet-app ecosystem. Is it passable as a tablet? Maybe? But again, passable isn’t good enough. 

The other remaining selling point is that it reduces the load of gadgets you carry. But for that to be the kicker, the benefit of having fewer things to lug around has to be so great as to overshadow the device’s other drawbacks. The Surface plus a keyboard weighs 2.42 pounds. A MacBook Air and an iPad Air together weigh 3.96 pounds. We’re not talking about back-breaking differences here. And the tradeoff is that by having one device instead of two, you have one heavily compromised and inferior device instead of two excellently refined devices (and that presumes you even want to carry both around). I’ll take the extra pound and a half, please.

So while the Surface may be a very well made and interesting device, it’s not the Grand Unified Device it’s being trumpeted as. And it goes a step further in proving that such a device may not exist, or at least oughtn’t.

Bringing this back to Apple, when the top brass at the company made a lot of claims decrying the idea of a unified tablet-Mac hybrid thing, one thing I presumed was that their denials could be standard Apple evasion. Remember, no one wanted to watch videos on an iPod, until Apple made a video iPod. No one wanted to read books anymore, until Apple made an entire bookstore platform. They often look askance at features or ideas, only to adopt them later. I don’t fault them for this. They want the attention on the products they have now, not what they might make someday. 

But after Monday’s WWDC keynote, it’s clear to me that they weren’t bluffing about not melding OS X and iOS. Nor were they just being obstinate. For I certainly thought I wanted this unicorn device, the One Thing I’d Ever Use for both relaxing on the couch and for serious work. And when Apple said they’d never make that device, I also thought that perhaps they were being stubborn, the we-know-better company that they get panned for being so often, by simply folding their arms, sticking their noses up and saying “no!”

I revisit this interview that Craig Federighi, the operating systems guy, and Phil Schiller, the marketing guy, did with Jason Snell at Macworld last year, and you can see just how prescient it is, or rather, how the guys at Apple were telling us exactly what they were doing.

“The reason OS X has a different interface than iOS isn’t because one came after the other or because this one’s old and this one’s new,” Federighi said. Instead, it’s because using a mouse and keyboard just isn’t the same as tapping with your finger. “This device,” Federighi said, pointing at a MacBook Air screen, “has been honed over 30 years to be optimal” for keyboards and mice. Schiller and Federighi both made clear that Apple believes that competitors who try to attach a touchscreen to a PC or a clamshell keyboard onto a tablet are barking up the wrong tree.

“It’s obvious and easy enough to slap a touchscreen on a piece of hardware, but is that a good experience?” Federighi said. “We believe, no.”

“We don’t waste time thinking, ‘But it should be one [interface]!’ How do you make these [operating systems] merge together?’ What a waste of energy that would be,” Schiller said. But he added that the company definitely tries to smooth out bumps in the road that make it difficult for its customers to switch between a Mac and an iOS device—for example, making sure its messaging and calendaring apps have the same name on both OS X and iOS.

“To say [OS X and iOS] should be the same, independent of their purpose? Let’s just converge, for the sake of convergence? [It’s] absolutely a nongoal,” Federighi said. “You don’t want to say the Mac became less good at being a Mac because someone tried to turn it into iOS. At the same time, you don’t want to feel like iOS was designed by [one] company and Mac was designed by [a different] company, and they’re different for reasons of lack of common vision. We have a common sense of aesthetics, a common set of principles that drive us, and we’re building the best products we can for their unique purposes. So you’ll see them be the same where that makes sense, and you’ll see them be different in those things that are critical to their essence.”

Unlike Microsoft and a handful of other manufacturers, Apple sees a unique place for each device in the gadget triad of phone, tablet, and PC. Rather than meld them, and worry about merging for the sake of merging — for the sake of reducing the number of devices one has — they work on perfecting each device within the contexts of their individual places. And instead of hybridizing them, they build bridges, highways, tunnels, and even wormholes between them, drastically reducing the friction for making them cooperate, without making them the same. If they make good on their promises from WWDC, they will have proven that strategy to be very right.   

(How this will play out with a larger-screened iPhone, or dare I say it, an iPhablet, remains to be seen, though I feel some dread about it.)

And for Microsoft and its would-be customers, the question remains: why would I want my tablet to replace my laptop? Yes, it’s great when new functionality comes to existing device categories. More data-sharing and third-party support on iOS will be great for letting me do more on my iPad, for example, but I don’t want new features at the cost of the iPad being a crummier tablet. And I really do want to replace the laptop I have (an aging 2011 11″ MacBook Air), but I want to replace it with a better laptop, not a worse laptop that also happens to be tablet-like. Who would?

Twitter Tsunamis of Desperate Signaling

Alan Jacobs on the swarm of me-too righteousness online, in the form of “Twitter tsunamis.”

This kind of thing always makes me want to flee Twitter, even when I am deeply sympathetic to the positions people are taking. It’s a test of my charity, and a test I usually fail. To me these tsunamis feel like desperate signaling, people trying to make sure that everyone knows where they stand on the issue du jour. I can almost see the beads of sweat forming on their foreheads as they try to craft retweetable tweets, the kind to which others will append that most wholehearted of endorsements: “THIS.” I find myself thinking, People, you never tweeted about [topic x] before and after 48 hours or so you’ll never tweet about it again, so please stop signaling to all of us how near and dear to your heart [topic X] is.

Here’s one of the things I love about Jacobs. Even when he hates what you’re doing, he gives you so much benefit of the doubt.

Likewise, Freddie deBoer:

Indeed: sincerity, in these instances, is in abundant supply. What’s lacking is the understanding that good people being publicly sincere makes nothing happen. But what else are you going to do? What am I doing? What can I do? I don’t know. I don’t know.

I don’t know either, but I am more cynical. While I agree there’s sincerity behind these tsunamis, I also suspect that the impulse to act on them en masse, to no other end than to add a “+1” to what has been said innumerable times, is a true expression of vanity in the digital age; not much different from dressing in fashion, only here it’s done with text rather than fabric, joining the in-group via a conviction instead of clothes. It’s the yellow ribbon gaudily displayed by those who have never done anything to support a troop.

And lord do I hate it when people just type “THIS” and then a link. As though by doing so they have presented the final word on a subject, and can thereby bask in the glow of having delivered it to the rest of us.

Worse is “THIS. SO MUCH THIS.” It’s the triple-dog-dare of conviction-bearing tweets.

DeBoer again:

The trouble with talking about right and wrong in the age of the internet is that our communicative systems are oriented towards communicating only with those whom we wish to.

There’s no risk in taking part in one of these storms (unless you’re a woman, in which case you’re going to get a lot of shit from assholes, because you always do no matter the topic), because you know in advance that everyone shares your opinion. I of course can’t read anyone’s mind and can’t prove anything here, but my deep suspicion is that hand in hand with the vain in-grouping of these tsunamis is the pose of courage, that by expressing such and such an opinion, which I know is shared by everyone who will read it, I have somehow really put myself out there in a vulnerable position, but dammit, I can’t remain silent about this any longer. Never mind that no one else in this 48-hour period is being silent about it either, and being not-silent in the exact same way.

Oh, except for this writer from a favored partisan journalistic outlet, who really nailed it, beyond all dispute. This. So much this.

The iPad’s Deep Niche

Jared Sinclair (whom I found by way of Alan Jacobs), in a well-reasoned post, comes to an “uncomfortable conclusion” about the iPad:

In order for the iPad to fulfill its supposed Post-PC destiny, it has to either become more like an iPhone or more like a Mac. But it can’t do either without losing its raison d’être.

I’m not at all convinced that this is true. First, though, I agree with him on some key points, such as:

Although both the iPhone and the iPad are multi-purpose devices, it seems only the iPhone fills a multi-purpose need in customers’ lives. A typical customer’s iPhone is put to work in all its capacity, while her iPad is relegated to only one or two niche uses. An iPhone is a phone, a flashlight, a GPS navigator, a camera, etc. An iPad can be most of those things, but in practice it gets stuck being just one or two of them.

I’d word this somewhat differently, but largely this is correct: the smartphone is a Swiss Army Knife of tools that most folks who are even tangentially involved in the information economy need to have on them today. And even if they’re not, it’s come to replace many of the other devices most consumers would otherwise think of as standard. You need a smartphone (here, specifically an iPhone) because you need a smartphone.

Sinclair argues that the iPad over-serves its users relative to the iPhone in many areas, and then goes on to contrast where the iPad falls short versus phones and Macs. Here’s what he says about the Mac:

A Mac is Better Than an iPad for…

Workplace Productivity – The Mac has an exposed file system, physical keyboard, a pixel-accurate pointing device, and multitasking applications, all of which contribute to more efficient workflows.

Power Computing – There are some professional tasks that require powerful processors, expansion ports, large storage devices, multiple displays, etc. These features are only available on a PC.

No arguments here. The Mac/PC is what you have to go to if you want to do most serious work in an efficient way. Yes, there are some blurry lines and overlap (lately I’ve found iMovie on the iPad more convenient than on the Mac for quick video editing of my kids’ antics, for example), but in 95% of cases, excepting those special circumstances that happen mostly in iPad commercials, you need a Mac to do “work-work.”

Here’s where Sinclair comes down:

I think the future of the iPad is for it to disappear, absorbed at the low end by iPhones with large displays and at the high end by Macs running a more iOS-like flavor of OS X. Perhaps it won’t disappear completely. After all, for certain niche uses [Sinclair is referring to things like reading, movies, and the things that happen in iPad commercials] … the iPad is great because it’s neither a phone nor a PC. But these are still niche uses and can’t possibly sustain the long, bountiful future that many hope the iPad has.

And here’s where I disagree. I can grant all the facts he presents. Yes, the iPad is not as good at utility-belt type stuff as the iPhone, and it’s not as good at get-work-done-at-my-desk stuff as the Mac. But I think that’s because, as I noted in the opening paragraphs of my iPad Air review, the iPad shines as the device you reach for when you don’t need the other two. Speaking in broad terms, you use a smartphone because you need, at that moment, what it provides. You use a Mac/PC because you need what it can do in order to do your job, or what have you.

You use an iPad, however, because you’re now on your own time, and can do the things you want to do, rather than what you need to do. You can kick back and read, or watch movies, or draw, paint, fiddle with music, chat on social media, futz around on the Web, work on your novella, play Tiny Wings, and so on.

Again, there’s enormous overlap for these functions among all three of these device categories, but I still submit that smartphones and PCs broadly exist as “necessities” for modern work, and the iPad, broadly, is your off-time device, the device you use when “need” or “must” becomes “want” or “choose.”

I don’t see that going away by any means. For some, of course, a phone is enough, especially if they opt for phablets. For others, a laptop is sufficiently “casual” to use as one’s kick-back machine. But I think that tablets (iPads, really) excel in this space, being the device-of-choosing. That doesn’t mean that they will remain explosive in terms of sales. Once you have a good iPad, you don’t need to upgrade often at all. But if the space that the iPad occupies can be described as a “niche,” then it’s a deep and wide niche, one that will not be covered over any time soon.

(And a small note on phone sizes: Having now owned, briefly, a couple of medium-sized Android phones that were overall excellent, there’s just no getting around the fact that I highly, highly prefer the smaller size of the iPhone. One-handed use is something you don’t realize how much you’ll miss until it’s gone; it’s so incredibly useful and convenient that its utility dwarfs the benefits a larger and more dazzling screen might provide (and I include the 4.7″ Moto X as too big). So I actually suspect/hope that phones won’t subsume the iPad in that particular direction by increasing in size. And I hope Apple doesn’t abandon the current iPhone size even as it almost certainly introduces larger iPhones this year.)