Neurotypicals Keep Feeling Things At Me

Here’s how Stephen Colbert helps explain how I, as someone with Asperger’s syndrome, am in a constant state of anxious bewilderment at this current moment.

The introduction of truthiness to the American lexicon by Stephen Colbert in 2005 was something of a cultural watershed, the moment when we all finally had a way to describe the semi-facts and quasi-reality we experienced consuming political punditry. Overwhelmed as we are today with outright lies and misinformation, the George W. Bush era of truthiness seems almost idyllic.

But remember that the driving force behind the phenomenon of truthiness was not the relative veracity of a claim, or even the convenient massaging of facts. Truth was not really the point at all. Emotion was. In coining this neologism, I think Colbert may have inadvertently prophesied our current dystopia.

“Face it, folks,” said Colbert as his “Stephen Colbert” character on The Colbert Report. “We are a divided nation. Not between Democrats and Republicans, or Conservatives and liberals, or tops and bottoms. No. We are divided between those who think with their head, and those who know with their heart.”

And here’s the kicker.

“Anyone can read the news to you,” he said. “I promise to feel the news at you.”

Remember that.

I recently happened upon a piece from Psych Central by Ivy Blonwyn about her experience counseling a married couple wherein the wife was neurotypical and the husband was very likely an Aspie. Blonwyn writes:

We neurotypicals cannot begin to fathom how hard it is for Aspies to exist in a culture we dominate. We set the rules. We design society. We define social norms. Even something as fundamental as the rules for manners and polite conversation are foreign to an Aspie. They may behave ‘normally’ (as NTs [neurotypicals] define it) but that’s because they’ve memorised how to follow our seemingly nonsensical rules by rote. It’s a script for them and a senseless one at that.

For example, when Dan was breaking eye contact, waving his hands and gasping, I had been talking about a movie that quite interested me. A neurotypical who had not seen that movie as Dan had not would automatically realise the important part of the conversation is not the movie. It is how the speaker felt about it.

An Aspie on the other hand, cogitates on the movie (they haven’t seen) and having nothing to contribute to the subject of the movie, wants to advance the conversation to something they enjoy talking about. Hence the appearance of impatience and disinterest.

It never occurred to Dan that I was telling him about my feelings. He thought we were discussing the movie. ‘No, I was telling you about me’, I told Dan.

‘Then why didn’t you say that?’ he retorted.

As a neurotypical, I thought I had. It was implied. So obvious, that it never occurred to me to verbally express it.

But Aspies don’t make assumptions so hard-wired in NT minds that what we really mean is usually left unspoken.

I experience this kind of interaction all the time. Someone is telling me something about their day, something they’re going through, or something they experienced, full of details and observations, and I can barely maintain my attention. If what I’m being told has no direct relevance to me, is about something of which I have no experience myself, or is out of my control to do anything about, my brain desperately seeks to abandon it.

Particularly if the speaker is someone I care about, I make my best effort to be attentive and engaged, and I think I usually succeed. By now I know that to appear to lose interest is hurtful and offensive. I want to be supportive and useful to the people I love, so I do my best.

But I also don’t quite get it. Why would I want to know about the plot of a TV show you watched? Why would I want to know about a casual conversation you had with your coworker? How can I possibly be a part of a conversation in which I have no frame of reference? What’s the point?

It’s because the speaker is really telling me about themselves. They are not reading the news to me, they’re feeling the news at me.

And that’s just what neurotypical people do, and it’s perfectly normal. For them, it’s necessary.

The propagandists of our current informational hellscape, such as Fox News, the president, and the great heaving mass of conspiracy theorists, all of them are feeling at us, and people are responding.

But even the “good guys” in the reality-based community, such as progressives and the otherwise-sane folks I follow on Twitter, are doing the same thing. They may be working with actual facts that are actually true, but the outrage-tweeting they engage in operates under the same principle. They, too, are feeling the news at us.

And that’s why I can’t deal. When opponents of the president shame-tweet his latest outrage, I keep appending the question, “So what do we do?” No one ever answers. Not necessarily because they don’t know what to do (though I suspect they usually don’t), but because that’s not the point. They came to emote, not to cogitate.

My neurology is ill-suited for this moment. I do not find satisfaction or connection from this mode of communication.

If someone I love tells me how bad their day at work was, I will likely try to brainstorm solutions to each problem they faced, when that’s not at all what they wanted from me. They were feeling their news at me, not looking for answers.

I’m looking for answers.

For those I love, I will try to be better at accepting what they share with me, what they feel at me. I will try to better understand that they are trying to share themselves, their souls, not their raw data.

For everyone else, I will try to ignore the firehose of feelings, and seek answers elsewhere.

I maybe oughta blog more.

There was a time when I tried to make a point of writing at least one blog post every day. Today that sounds like some trite advice from a self-help article on Medium, but I wasn’t doing it in order to “gain 50,000 followers” or what have you. It was a good habit to keep as a writer, to practice in public like that, and it genuinely felt good to have made something each day. But mostly, I actually felt like I had something to say, all the time.

These days, it’s remarkable if I write something more frequently than once a month (this is outside of work, of course, where I write all damn day, every day). There’s a long list of contributing factors. Personal reasons include mental exhaustion from work, attention demanded by kids and other family matters, the attraction of less intellectually demanding pastimes like video games (I really don’t watch much TV at all), and a bedtime that seems to seep ever-earlier into the evening as I age.

There are also, I think, broader cultural reasons I don’t blog like I used to. The novelty of the form itself has worn off since its early-aughts heyday. While blogs were once the primary venue for processing and debating the events and issues of the day, they have been largely replaced; for journalists and activists, by Twitter; for everyone else, by Facebook. In those now-hazy before-times, one might be outraged over something some political figure did, compose a four- or five-paragraph screed expressing said outrage, and liberally blockquote from some other source for the purpose of bolstering or rebutting one’s argument. Today, the same person will retweet something someone else said about said outrage, and maybe add an original line to keep the tweet within one’s personal brand. Or they’ll share an article (probably unread) on Facebook, perhaps adding their own exclamation-marked sentence about the outrageousness of the outrage.

The author in 2006, with a laptop, possibly blogging. Possibly not.

The point being, blogs just aren’t where the action is. Blogs were once little islands of thought, from which individuals or small bands of like-minded island-dwellers would cast their prose into the wide ocean of the internet (or, as it was more often characterized back then, the capital-I Internet, like it was a place). Often, that prose might be fashioned into a kind of dinghy and aimed directly at another Internet Island, sometimes carrying supplies, sometimes a warhead.

It was fun!

Some of those Internet Islands still exist and thrive, and some have developed into full-blown Outlets, honest-to-goodness nation-states in the online media realm. Some blogs were subsumed into larger entities, or their feudal lords were lured away to more luxurious courts. But I think for most of us who were on the tiniest of those Internet Islands, we saw that no one was reading what we wrote anyway, so we might as well put in as little effort as possible, and be ignored on Twitter instead.

And good lord, did I love Twitter for a while. I felt like I really got it, and my own brand of everything-is-terrible humor-as-despair shtick felt very well suited to the platform. Today, though, Twitter is like punishment. I check in, I scroll, and I am quickly saturated by anxiety, anger, and despondency. And it doesn’t seem to matter what measures I take to curate my feed. In a time as ugly as this, ugliness is all there is to tweet.

As for the material I put out on Twitter, no one is seeing it. Even after thirteen years on the platform (Jesus Christ, has it really been thirteen years???) I have managed to attract a measly 4000-some followers, only a tiny fraction of whom ever actually see (or care to notice) what I write. If something I tweet does happen to break out a little — usually because a certain friendly atheist has retweeted it to his own massive following — I become deluged with inane replies that are often inexplicably hostile. None of it seems to make things any better, and there’s no feeling of accomplishment.

And besides, I’m not a “tweeter.” I’m a writer. And while a thought expressed in 280 characters or fewer is an absolutely valid and valuable form of writing, it’s not sufficient for me.

This gets me back to the question about why I don’t write more, or more specifically, why I don’t blog.

The despondence engendered by Twitter is part of the answer. The ocean of the internet (it’s lowercase-I these days) is already so polluted with opinions, punditry, takes, essays, outrages, and news, it hardly seems useful to throw in more of one’s own trash. Things are bad! Bad people are doing bad things! You don’t need me to tell you that. And while I could write about something else instead, something that has nothing to do with how terrible everything is, my despair has sapped my drive to share my thoughts about anything.

Another reason for my blog-hesitancy is ego. There seems little point in putting in the effort of writing when I know that no one’s going to read it. And my standards for what constitutes “some folks read it” versus “no one read it” have already been lowered to sub-basement levels. The idea is supposed to be that the good stuff will rise to the top, but I don’t think anyone believes that anymore, and who knows if my stuff would even qualify as “the good stuff” anyway? Sometimes I think it has, but what do I know? I only have 4000 Twitter followers.

I end this post without an answer, other than the obvious, which is: Do it anyway. What I write — and yes, specifically, blog — should exist for its own sake. For my sake. Because each time I do it, I will have made something. I will have improved my own thinking and come to better know myself. It will, as Vonnegut put it, make my soul grow.

And maybe, on the off chance that someone else encounters it, maybe it will do something good for them, too. Maybe that person will stand up from where they’re sitting on their Internet Island, look across the sea in my direction, and wave.

Nothing to be done

The part of all of this that most fills me with despair is the fact that those with the power to do something simply won’t.

My experience of Twitter right now is one of being told over and over to be outraged about every offense committed by the president, Republicans, right-wing media, or their followers. And I am! Good lord, I am. Trump constantly lies, promotes self-serving misinformation, and foments civil war. His allies and defenders fall in line. The parade of fanatical ignoramuses react, predictably, with garish displays of jingoist hate. Their cells become food for viruses.

And so the Important People on social media do their duty and Point it Out.

Fine. What I’m not seeing, and what I desperately need, is for someone to do more than Point it Out, but to offer a solution. The dead horse I continue to beat comes in the form of quote-retweets in which I ask, “So what do we do?”

Trump encourages insurrection: “So what do we do?”

Trump refuses to give aid to states who don’t kiss his ass: “So what do we do?”

Trump ignored warnings about the pandemic, and now pretends he was always on top of it: “So what do we do?”

Maybe, in a previous era, reporting on the wrongdoings of a president or other public official would at least get the ball rolling on getting that leader to change course or be held accountable. But, surely, now it must be obvious that this is no longer the case! Everything we all got used to, the idea of “scandals,” exposés of corruption, and various career-ending “-gates,” none of it matters anymore. We can Point Out and Be Outraged over every appalling example of nogoodniks nogoodnicking until we run out of tears and our fingers can no longer tap out our replies and retweets, and none of it will change a thing.

Those who believe what the president says will believe him until their dying breath, even if it’s a breath gasped without the help of the ventilator they needed but couldn’t get because of the president they loved. If reporting, explaining, and shaming had any impact whatsoever, Trump would already be out of office, Pence would be under investigation, and far, far fewer people would be sick or dead.

So, I’m asking, what do we do?

The Senate could have done something. We know how that worked out.

Pence could do something. He and other members of the cabinet could agree among themselves that the president is a danger to the country, invoke the 25th Amendment, and remove him from power, even if only temporarily. But of course, they won’t.

Is there something more the news media could do? I honestly don’t know. Again, merely reporting the many crimes of the moment isn’t enough. Jake Tapper and Anderson Cooper can fume into the camera over the president’s lies and the exponentially rising body count, but everyone who is watching already agrees that this is all an outrage.

Can voters do something? If they can, they have to wait until November, and even then we have to assume they will be able, or allowed, to vote. And because of how the Electoral College rigs the system in favor of the Candidate of the Fanatical Ignoramuses, it may not matter anyway.

Could well-intentioned billionaires and business titans do something? I don’t know! Governors? Celebrities? Anyone?

It’s hard for me to psychologically accept the idea that there’s nothing to be done, that we’re just hostage to the madness of an idiot cult leader, and that’s that.

I suppose what it comes down to, short of something even more destabilizing or dangerous, is that enough people will have to demand change in any way they can. But by “enough,” I don’t mean a motivated plurality or even 50 percent-plus-one. Overwhelming numbers of Americans will have to tell those in power to fix this shit, but do it through some means that doesn’t require them to “take to the streets” like the Fanatical Ignoramuses protesting stay-at-home rules.

But there aren’t enough of us. This won’t happen.

So what do we do?

An Actor, an Introvert, and a Universe of Possibilities

The author in 2006.

People tend not to believe me when I tell them I’m severely introverted. It’s understandable, as the persona I put forward is usually that of a quirky, agreeable smart-aleck. I am animated and expressive in conversation, I engage in overtly silly play with my kids, and of course, I’m an actor and musician.

To many people, my personality simply seems too big to be that of someone who is shy, anxious, or reserved, let alone autistic. Some have even told me they find me intimidating. To me, that’s beyond ridiculous, but there it is.

When folks have trouble grasping how I could have found any joy in being an actor while finding social interaction to be utterly draining and even painful, I explain that when I’m performing, I’m protected by several layers of metaphorical masks. On stage in a play, I am explicitly not myself. It says so right in the program! Next to my name will be the name of whatever character or characters I’m playing. I’m definitely not playing “Paul Fidalgo.”

I don’t have to be clever or come up with interesting things to say, because the words have been written for me, hopefully by someone who is well established as being really, really good at writing interesting things for people to say, like, for example, William Shakespeare.

I don’t even have to decode any social signals or read between the lines of what others are saying in order to know when to speak, because it’s all been planned out in advance. I am forbidden from speaking until my own lines are cued. That limitation is indescribably liberating.

I don’t have to know what to wear. I don’t have to know where to stand or how to behave, because all of that will have been worked out in rehearsal. If the play doesn’t call for my presence in a scene, I don’t even have to exist.

But there’s another way to explain the apparent incongruity of my personality that flips all of this on its head, and I didn’t even realize it myself until I had it explained to me in an essay, written several years ago, by a true master of the theatre.

I recently came across an essay published in The Nation in 2011 by the great actor and playwright Wallace Shawn, who most folks will know as Vizzini in The Princess Bride, Grand Nagus Zek on Star Trek: Deep Space Nine, or the voice of the Tyrannosaurus Rex in the Toy Story movies. Maybe you know him from the 1981 film My Dinner with Andre. Oh, and he was just in Marriage Story, so that might help.

In his essay for The Nation, which is a truly beautiful piece of prose in which he explains how his art leads him to consider himself a socialist, Shawn writes:

We are not what we seem. We are more than what we seem. The actor knows that. And because the actor knows that hidden inside himself there’s a wizard and a king, he also knows that when he’s playing himself in his daily life, he’s playing a part, he’s performing, just as he’s performing when he plays a part on stage. He knows that when he’s on stage performing, he’s in a sense deceiving his friends in the audience less than he does in daily life, not more, because on stage he’s disclosing the parts of himself that in daily life he struggles to hide. He knows, in fact, that the role of himself is actually a rather small part, and that when he plays that part he must make an enormous effort to conceal the whole universe of possibilities that exists inside him.

One version of my explanation for why such a loud, animated performer could be such a severe introvert is that I alone am too small and too vulnerable to be comfortable in my own skin in the midst of other humans. But what Shawn helped me to see is that this disconnect also stems from the fact that my singular, real-life self is near to bursting with thoughts, ideas, fears, ambitions, impulses, and possibilities.

The potential energy bottled up and pressed down into this small, delicate body is overwhelming. Letting any of its pressure out brings with it the risk of humiliation, regret, misunderstanding, or bewilderment. So a single, inoffensive persona must be adopted, a safe and broadly acceptable packaging must be applied.

The stage does not solve or sort all of these parts, but it does allow them to manifest in meaningful, productive, and satisfying ways. In this way, an actor’s role is sort of like Mjölnir to Thor.

In Thor: Ragnarok, the Asgardian Avenger has lost his legendary hammer, Mjölnir, and at the edge of utter defeat, he hears the voice of his late father Odin, who asks him, “Are you the god of hammers?” Odin explains that Mjölnir was not the source of Thor’s power, but merely a means of focusing and controlling it. The real power, the “thunder,” is already inside him, coursing through him.

That’s what a role in a play is for an actor. It harnesses the lightning and thunder inside us and allows us to wield it. Shakespeare himself even wrote of “youths that thunder at a playhouse.”

It is true that for me, and I suspect for many actors, taking on a role is a way of protecting ourselves, providing armor for our fragility. But it is also a means to show our strength, to unleash a power within us that in most other circumstances would be too dangerous or destructive.

As Wallace Shawn says, we have within us a universe of possibilities. The stage allows us to live some of them out.

Oh Crap We’re Living in “Final Crisis”

Here’s a panel from the big DC Comics event, Final Crisis, in which a fictional President of the United States laments his state of affairs. You see, a god-like alien, Darkseid, has begun reprogramming the minds of the Earth’s population, causing them to submit to utter subjugation.

In this scene, a man with the president (for some reason wearing a fedora in the 2000s) warns that Darkseid’s forces, brainwashed humans and superheroes called “justifiers,” are about to wipe them out.

The haggard president, dejectedly clutching a gun, says, “This can’t be happening. The scale of it. The speed of it. Not in my lifetime…not like this…”

Well, of course it happened quickly! It’s a superhero comic book crossover event with an antagonist whose home planet is literally called Apokolips! Darkseid doesn’t do gradual.

But there was something about this particular comic book armageddon scenario that struck a chord with me. Cosmic-level supervillains usually achieve their aims through overwhelming destruction and death. Palpatine will rule the galaxy with the might of his fleet and the power of the Dark Side of the Force. We will all become children of Thanos once he murders half of all life forms. Etcetera.

With Darkseid, however, while there’s plenty of death and destruction, his plan for intergalactic domination was to turn humanity into a hyper-materialist cult.


Okay, here’s a quick summary of this particular branch of the rather dizzying plot of Final Crisis: An evil prophet-type character, Libra, recruits supervillains to help him infect people’s minds with the “Anti-Life Equation,” a sort of “proof” that leads the person exposed to the equation to reject all the values they once held dear, and choose to serve Darkseid. But not just “serve” in the sense of bowing down before his greatness or what have you, but becoming willing cogs in a sort of empty-headed, ultra-fascist state.

(Here’s where I must point out that I provided the voice for Libra and a couple other characters in the audiobook version of Final Crisis. Cool, right?)

We get a taste of what’s coming when, in a very strange part of the story, Superman deals with various alternate-universe Supermans for reasons that are frankly too esoteric to explain here. (I find these Crisis-themed series very confusing.) One such is Ultraman, and he’s not a truth-and-justice kind of guy.

“We value material wealth above everything,” he says through gritted teeth to the nicer Superfellows. His declaration is a kind of foreshadowing of what Darkseid is bringing to Earth. Here’s a taste of what life under Darkseid looks like:

“Increase production!” shouts a justifier to the brainwashed drones that had once been everyday folks. And then shit gets real.

“Work! Consume! Die!” he shouts. Whoa, I’m thinking. Darkseid is creating a consumerist dystopia! Which sounds pretty close to the world as it is anyway!

And then the kicker. The justifier shouts, “Judge others! Condemn the different! Exploit the weak!”

It’s here I’m thinking, okay, Darkseid just built a Republican dream world. It’s Trumpism from space.

Don’t think so? Look how a justifier reacts to finding a copy of Darwin’s On the Origin of Species:

“What disagrees with Darkseid is heresy.” The book is burned, and echoing what other justifiers have said at the commission of horrifying acts, “Anti-Life justifies my ignorance!”

Take away the space-gods, and the attitude is exactly what is demanded by the cult of Trumpism. The facts are only what the cult leader says they are. They can change moment to moment, at his whim. Nothing he does can be bad, because it’s done by him. Any crimes committed by others are justified if they are done in his name.

This is the position of the United States’ ruling faction right now. And like the fictional comic book president observed, it happened so quickly.

Attempting to rescue some folks from the devastation, and from becoming Anti-Life zombies, the hero Black Lightning says, “Darkseid is remaking the world in his image, using our technology, our people as building blocks.”

For about a generation, the advent of the internet and social media were seen as means to enlightenment. And then the bad guys figured out how to use that technology to bring out our worst selves, minute after minute. Now countless “dimensions” and “alternate realities” are mainlined to us by Facebook through our individually-optimized Anti-Life Equation Feed, and the resulting state of chaos and confusion is the perfect breeding ground for the lies, the ignorance, the disenfranchisement, the demonization, and the many other forms of supervillainy we are witnessing right now.

Trump and his cult are remaking the world in their image. Like Black Lightning says, “This won’t be over until each and every one of us chooses to resist.” That’s true for us, too. But Superman’s not coming.

The Unexpected Plausibility of Mike Bloomberg

“I am getting really sick of all these Bloomberg ads!”

This was spoken by my 10-year-old son who watches shows on Hulu with his mom and has therefore been exposed, repeatedly, to ads for the presidential campaign of Mike Bloomberg.

When Bloomberg formally entered the race for the Democratic nomination last year, I railed to the heavens (you couldn’t hear me, but trust me, I railed), “WHY?” I don’t have any major objections to Bloomberg as a candidate or potential president, though he’s certainly not one of my top choices. But I simply couldn’t understand what he thought his path to the nomination could possibly be.

I know what the pundits have said, and what the line of the campaign is: Bloomberg can afford (both in terms of money and political capital) to skip the early states, use his wealth to blanket the later states with ads, and eventually squeeze past a muddled field of candidates currently lacking an overwhelming frontrunner, with the promise that his business acumen and aura of competence would seal the deal.

But, you know. Come on.

While technically possible, there is nothing plausible about Bloomberg’s prospects. Polls showed for months that the Democratic electorate was plenty satisfied with its existing options, and that the top four or five candidates regularly bested Trump in head-to-head general election polls, particularly Joe Biden — who Bloomberg would have to totally neutralize to even have a shot at the nomination.

How many times have we seen a late entrant into a presidential primary contest go on to win a party nomination? As we learned from would-be party saviors Wesley Clark, Rick Perry, and Fred Thompson (and eventually Deval Patrick), pretty much never. The fashionably late just don’t get to be president.

But let’s say none of this is the case. Let’s say a latecomer could in fact ride in and shake everything up and that the Democrats are utterly despondent over their choices. Even then, in what universe does this imply that what progressives really want is the stop-and-frisk, former-Republican, Bush-endorsing, women-belittling, 80-pushing one-tenth-of-one-percenter? Perhaps there was some alternate dimension in which this made sense before the Crisis on Infinite Earths, but not on this Prime Material Plane.

Mike Bloomberg knows all of this. So my only explanations for his decision to run anyway are, one, that he is surrounded by advisors and consultants on his payroll who have a vested interest in convincing him that he will be president, and two, perhaps most importantly, he just really, really wants to be president, and at age 77, this is his last chance.

In recent weeks I’ve finally started to see some of those Bloomberg ads myself, either on social media or, yes, on Hulu. They’re really quite good. They’re not blockbuster, knock-your-socks-off, windsurfing-swiftboat ads that blow up the race, but they’re good. Perhaps the most effective thing about them is how reassuring they are. In general, his ads lightly contrast Bloomberg with a reckless Trump by highlighting Bloomberg’s competence and, well, normalness. They send a message that’s similar to Biden’s, in that they tell you that the country would be back in sane hands under this candidate, only Bloomberg’s ads layer on an actual record of governance. Twelve years as mayor of the city at the center of the universe can provide that kind of record.

Biden, for all his decades in public office, has never really been an executive in the way a mayor or a governor would be, and no one would mistake his role as vice president for that of a buck-stopping decision maker. So his ads rely on character; he’s got it, Trump doesn’t. He’s not wrong.

But without saying it, Bloomberg’s ads communicate that same message, that same feeling. Maybe it’s because I have been so skeptical about Bloomberg’s campaign that my reaction is disproportionate to their actual effect, but I have been very surprised to see how invested he appears in the people he’s shown listening to, how much conviction he brings to the candidate-with-voters B-roll that is the standard filling for every political ad.

“That’s a good ad,” I find myself saying out loud. Hmm, I find myself thinking, the field of candidates is still pretty muddled. Hmm, I think, Bloomberg is often polling third and fourth nationally.

Nah. I mean, come on.

But then Iowa happened.

Put aside the procedural shitshow of the caucus tabulation debacle. What the Iowa caucuses showed us was that the race is a mess. Nationally, Sanders and Biden are wrestling for a small plurality to claim the top spot, with an undulating rotation of Elizabeth Warren, Bloomberg, Pete Buttigieg, Amy Klobuchar, and Andrew Yang in the next few slots. Sanders people will always be Sanders people no matter what, but the rest of the field has not been sufficiently winnowed to clarify who Sanders’ prime challengers are. Maybe it’s Joe Biden, maybe it’s eventually Pete Buttigieg, but it’s no clearer today than it was a week ago.

By the looks of the final alignment results, as I type this on February 4, Joe Biden wound up with a pretty bad fourth place showing in Iowa. He was never banking on an Iowa win, but a distant fourth-place finish is pretty damned embarrassing for the erstwhile national frontrunner and former vice president.

Pete Buttigieg’s showing was the breakout of the night, for whether or not he “won,” he certainly wrestled Sanders to a functional tie. But he’s polling in single digits nationally. Unless Iowa has an impact on voters that is even more outsized than usual, I don’t see how he turns a tied-for-first-but-technically-second-place showing into a meteoric rise. And I suspect there’s not going to be a lot of bounce to be had out of this particular Iowa caucus.

I don’t really get what’s happening with Elizabeth Warren. It seems like the voters all really like her a lot, but too few of them are willing to cast their lot with her. I think that’s a huge shame and a big loss for all of us.

This is all to say that where I once did not see an opening for Mike Bloomberg, now I think I might. Skipping Iowa certainly seems to have been a net plus for his campaign, though also missing New Hampshire seems like an unforced error. That’s a whole other week of coverage in which he won’t be part of the conversation about who will be the next president.

But maybe it doesn’t matter, and if so, that’s largely because of his money. (It’s also because of his name recognition, as everyone knows who Mike Bloomberg is, and hardly anyone recognizes Tom Steyer, the other billionaire.) For all the media that Bloomberg won’t earn, he’ll buy, tenfold. He’s already run a Super Bowl ad and is reportedly planning to double his already gargantuan ad spending in the coming weeks. If his polls go up soon, he’ll qualify for the debates after New Hampshire, and I suspect that even after New Hampshire votes, we won’t be much closer to knowing who Bernie Sanders’ real competition is. That’s a good spot for Bloomberg to find himself in.

And here’s what might be the biggest thing. Bloomberg obviously wants to be president badly. Those other late-arrival candidates I mentioned earlier were largely ushered into the race by draft campaigns and twitchy party insiders. They didn’t jump in because of an insatiable desire to become President of the United States. Bloomberg’s got that desire, and one should never underestimate the guy who just wants it more.

It remains the case that Bloomberg’s chances at being the Democratic candidate to take on President Trump are incredibly slim, requiring a near-perfect falling-domino execution to create the circumstances for his ultimate nomination. But for the first time, I can see it.

I personally support Elizabeth Warren. My 7-year-old daughter agrees, and even more strongly, often screaming “WOOOOO ELIZABETH!!!!” when the election comes up in conversation — which it does a lot in my house. My 10-year-old son has found a lot to like about several candidates, and I think he misses Beto O’Rourke and Kamala Harris. But the other day, as I was driving him home from a lesson, he started asking me to tell him about Mike Bloomberg.

Those are some good ads.

Suicides Are Not Car Accidents

When people try to make a point about the troubling rise of suicide rates in America, they often compare the statistics to other causes of death, and understandably so. To grasp the scale of the problem, it helps to compare it quantitatively to undesirable things we feel more familiar with. But in the case of suicide, I think these comparisons diminish the scope and seriousness of the issue.

I’ll pick on Arthur C. Brooks’s column at the Washington Post, because that’s what got me thinking about this. Brooks rightfully urges a serious confrontation of the undeniably sharp and steady rise in suicide rates, and to convey just how bad things have gotten, he draws comparisons to homicides and motor vehicle accidents. According to his cited statistics, there are now two suicides for every homicide in America, and there are 17 percent more deaths per year by suicide than by car accidents.

While these numbers do help to communicate the scale of the problem, these kinds of comparisons, to my mind, oversimplify the issue to an unacceptable degree and make some morally false equivalencies between only somewhat related actions and phenomena.

Let’s start with the homicide-suicide comparison, and take as granted that twice as many people die by suicide as by homicide. Put aside for the moment the rise in suicide rates. Is the fact that more people take their own lives than have their lives taken from them in itself an obviously bad thing? In every instance? Could it not be a sign that perhaps we are now less eager to murder each other, and that if we must die at someone’s hands, at least it’s more likely to happen by our own choice and volition rather than someone else’s?

Of course it’s not “a good thing” that people want to kill themselves, but what does it actually mean to point out that there are more suicides than homicides? One is a death without choice; the other, at least in the broadest sense, is chosen. What this numerical comparison ignores is what drives a person to make that choice. Would we feel better if those numbers were reversed? I don’t think I’d feel very safe if I was told that I was twice as likely to die by murder than by suicide.

Now let’s look at the comparison to death by motor vehicle accidents. It’s similarly problematic, in that it is not at all clear to me that we’d be pleased to see the statistics in reverse, in which car accident fatalities overtake suicide deaths. In that case, we’d all be wondering what was up with all the careless driving and poorly made vehicles.

More to the point, Brooks points to the campaigns of a generation ago to improve car safety and how they eventually led to laws and regulations that made cars and driving much safer than they had ever been, and says, essentially, let’s do that, but for suicide.

This makes little sense to me. Absolutely increase public awareness of the issue. Countless people might be helped if more Americans were better educated and empathetic about suicide. But what else are we really talking about? There’s no suicide equivalent to seat belt laws, speed limits, or air bags.

Auto accidents, and to a lesser extent homicides, are more or less binary. They are bad things that result in unwanted deaths, but they can be prevented and mitigated by a fairly straightforward collection of measures. Make cars safer, make guns scarcer, and so on. Homicide is somewhat different in that, like suicide, its tendrils reach much deeper into other societal ills, and I’ll address that in a moment. But both auto accidents and homicides boil down to one thing: They kill people who don’t want to be dead.

Suicide is something else entirely. Let us grant for the sake of simplicity that no one’s starting position is “I want to be dead, and I want it badly enough that I will end my own life myself.” Something has to push a person into that place. Several somethings, both apparent and invisible. Whatever those factors are, they are different for everyone. I’m not even slightly qualified to list them or assign them relative values. But you know some of them: depression, fear, hopelessness, self-loathing, brought on by abuse, joblessness, loss, addiction, and so on.

The point is that while murders and car crashes steal a person’s life from them, in the case of suicide, something else has stolen a person’s desire to live, and, importantly, driven them to the point that they have concluded that ending their life would be preferable to living it. Even if we could install some sort of air-bag-for-suicides and prevent the vast majority of them from succeeding, we’d certainly bring the suicide numbers down, but we’d have done nothing to address what brought on the choice to try in the first place. There’d be fewer deaths, but just as many miserable people.

So the question can’t be — must not be — “how do we bring down the number of suicides?” The real question is about how we change society so that tens of thousands of people every year do not become so overwhelmed by existential despair that suicide even becomes an option worth considering. Yes, that means laws, but not just preventative laws focused on the act of suicide itself. If that’s all we’re thinking about, then we’re too late anyway. But we will need laws that address poverty, economic anxieties, addiction, basic health and nutrition, mental illness, education, and much, much more.

But we’ll also need to change our culture, heart by heart. We’ll need to call ourselves to become kinder to each other, more sensitive to the pain of our fellow humans, and more willing to be a friend. We’ll need to stamp out those ideologies that thrive on the marginalization of other groups, and teach each other not to fear or resent equality. We have to decide, on an individual basis, to celebrate and embrace each single person’s differences and idiosyncrasies, and rather than seek uniformity, consider our species blessed by its infinite variety of neurologies, talents, flaws, ideas, histories, gifts, and limitations.

When we start to assign value to each and every one of us, then, maybe, we’ll see those suicide numbers go down. Not because we managed to stop more folks from pulling triggers or popping pills, but because far more of us have decided that this world is worth living in for as long as we can.


Listen: There are things we are supposed to want out of life, and there are the means by which we are supposed to attain them. There are cultural events, life milestones, rites of passage, and personal interactions which we are supposed to eagerly anticipate, and those we are supposed to bemoan. There are standards of comportment we are supposed to uphold, degrees of amiability we are supposed to project, and durations of eye contact we are supposed to maintain.

We are supposed to know when to laugh and when not to laugh. We are supposed to stand a certain way and sit in a certain way, and ways in which we are not supposed to stand or sit. We are supposed to have street smarts, as opposed to book smarts, but we should have enough street smarts to know how to attain those book smarts. We are supposed to be honest, except when we aren’t supposed to be. We are supposed to be careful and we are supposed to take big risks.

We are supposed to have jobs and we are supposed to seek to advance in both position and compensation in these jobs. We are supposed to look forward to lunch. We are supposed to care a great deal about food and eating and sporting events and acts of geographical transit and the weather. We are supposed to care a great deal about many, many things, and we are supposed to know what those are without being informed in advance.

I have lived my life riven by “suppostas.” From the very first moment I realized that the world had expectations of me beyond the routine of elementary school classrooms, I was in a state of near constant bewilderment as to how I should go about my existence. All rules were unspoken and yet somehow universally understood and viciously enforced. I never received instructions, only penalties for failing to follow them.

So I flailed about in search of guidance; examples to follow and role models to emulate. But I never found any templates that I could build upon, no maps I could interpret, no ciphers I could even attempt to break. In day-to-day moments, there were ways to sit, to stand, to walk, to move, or not move, and I couldn’t make sense of them, which often meant defaulting to keeping very still. In my education, there were crucial tasks to accomplish that had nothing to do with what was in the syllabus, and yet were somehow clear to my peers. In the professional world, the litany of unwritten rules, agendas, rites, and rituals was so perplexing and numerous that it caused a kind of paralysis.

From the quotidian to the global, I simply could not figure out how to be. How I was supposed to be.

The problem may already be obvious to you. My error has been to seek meaning and validation through my perceptions of what other people value. I don’t know how I could have known any better. To observe other humans is to see them behave based on some combination of motivations that are invisible to me, guided by some set of rules that is equally inaccessible. Existing, as I do, as a single individual with no way to read the contents of anyone else’s thoughts, all I can do is look and guess, like trying to describe the constellations in the night sky by looking through a paper towel tube.

I would be 38 by the time I realized why the behavior of my fellow humans baffled me so, and why I was so lost navigating this ocean of “suppostas,” for that’s when I was diagnosed with Asperger’s syndrome, when I found out I am autistic. One aspect of this neurological difference is difficulty with ascribing agency to other people. Another is difficulty deciphering, or even perceiving, nonverbal communication. There are many more parts to being among the neurodivergent, but it all boils down to this: I don’t think or behave the way everyone else does, and I have a lot of trouble understanding what is happening around me if it’s not being explicitly stated. I have a neurological supposta deficit.

But forget that. Placing too much value on external validation is not solely the burden of autistics. Being baffled by the motivations and behavior of other people isn’t exclusive to the neurodivergent. We autistics are more likely to face these kinds of difficulties, often to greater degrees of severity, but feeling misunderstood or alienated is part of being human. We all fear being lost or left behind.

We can’t just dismiss this phenomenon as an error in thinking and move along, though. Whether or not we’ve identified the error, we still have the underlying problem, which is, really, how are we supposed to have meaning in our lives? How are we supposed to matter?

I think I have looked to others to tell me what is meaningful because I couldn’t work it out for myself analytically. When I think about the whole life-cycle trip we’re all taking down this mortal coil, it looks like this: We’re born ignorant and helpless, and starting from absolute zero we then spend twenty-odd years learning how to be a human in the world, and then spend what time we have left deteriorating until we eventually expire and utterly cease to exist at all. Good lord, what, I ask you, is the point?

Given that it looks so plainly like there is no point, and that existence has no meaning, the anxious young mind eagerly looks to the example of others. What’s meaningful to them? To what ends do they direct their finite time and energy? What do they celebrate and what do they revile? In short, what do they value? I hoped that by aping the moves of the humans around me, I would know what I was supposed to value.

This was never going to work.

Of course, society at large will always, at the least, set up the guardrails for what we find meaningful about life. We all learn by example from our family, our peers, and our broader culture. None of us find meaning in a vacuum.

What I didn’t realize, and what I think many people don’t realize, is that there is no meaning there to be found. Those expectations that society places on us, or that we place on ourselves, those “suppostas,” can nudge us in one direction or another or, hopefully, keep us from finding meaning in the grotesque or destructive.

Meaning itself, however, is not something that anyone or anything can provide or even define. In The Hitchhiker’s Guide to the Galaxy, the supercomputer Deep Thought is built in order to answer the great question of Life, the Universe, and Everything. Famously, after millions of years of calculations, Deep Thought declares the answer to be 42, which of course makes no sense to its egregiously disappointed audience. Obviously, the Answer is supposed to have a Question, so Deep Thought designs another, even more unfathomably advanced computer to figure out what question would yield the cryptic answer of 42. More hilarity ensues.

The first (several) times I read that book, I took it as I suspect most people did, as a wonderful example of humor, literary cleverness, misdirection, and a great poke in the eye to know-it-alls everywhere. And it is all of those things. But only now, at this point in my life, do I actually get it.

As a miserable, alienated, and frightened pre-adolescent, terrified of each new school day, I would have given anything to have a Deep Thought of my own to tell me What The Hell It Was All About. Throughout my adulthood, even with whatever wisdom I had accumulated, I would have no less welcomed an algorithmic answer to Life, the Universe, and Everything. But it doesn’t exist. It can’t exist.

In recent years, coming closer to terms with who I am has allowed me the psychic space to better appreciate those things that do not fit into a formula for optimal meaning generation. An atheist and skeptic through and through, I have nevertheless developed warmer feelings toward practices and frames of mind that are usually the domain of religion. I don’t need to believe in the literal existence of supernatural beings in order to find peace, connection, or motivation through ritual and story. I am developing a love for the numinous and the immeasurable.

Ritual, for me, can come in the form of artistic creation. Writing essays like this is a kind of prayer or meditation, allowing me to have a conversation with myself, to interrogate and beseech my own mind for insight. Making theatre is not unlike having a church, where great stories are told by a congregation of participants in order to be shared and experienced by the community. Indeed, theatre and religion are siblings, both born from the stories told by our earliest ancestors around the fire.

These things are not, in and of themselves, meaning. Theatre may have an effect on me that is akin to “spiritual,” and I might just really like doing it, but that’s not what makes it meaningful. It’s not even enough to say that something in my life has meaning because I have chosen to bestow meaning upon it. Really, it’s the fact that I’m not at all sure that it has any meaning at all. Or that anything does.

For our lives to matter, and for what we do to have meaning, we have to endow it with meaning ourselves. But, remember, in the sense of some core, cosmic truth, meaning does not exist. So for me to bestow meaning on something is the same as my granting myself a knighthood. I can do all the bestowing I want, but it doesn’t make it so.

The point, then, is not to discover meaning. The point is to look for it. It’s the search, the investigation, and yes, the agonizing doubt. There are precious few living creatures, as far as we know, that are blessed, or cursed, with the ability even to engage in this kind of inquiry. It is a quest to satisfy a yearning for that which can never truly be had.

And there is no single way one is supposed to go about it. “Suppostas” are meaning-repellents.

I was about 10 years old when I realized I was something of an alien among other humans. After college, dear friends of mine told me I’d figure myself out by age 27, and that didn’t happen. At age 41, still fresh off of an autism diagnosis and a divorce, I looked at an actuarial table that showed that I was at the precise midpoint of my lifespan. Barring unfortunate events, I had lived the first half of my life. A few weeks ago, I stepped into the second half.

Maybe it’s age. Maybe I’m just too old at this point in my life to care anymore what people think of me. Maybe it’s not the result of personal enlightenment, but that I simply no longer have the energy to care. Or maybe being fatigued by all the pretending, all the masking, all the passing as “normal,” maybe that very exhaustion is what has allowed for clarity. I just know that my life is likely more than half over, and I’m tired of the suppostas.

I no longer want to fit in. I want everyone else to make room.

So now I actively combat my old instinct to worry about what it is I’m supposed to be like. It is a daily battle, but at least now I’m waging it. Now I am more interested in that yearning, the endless inquiry into meaning and mystery. And I often feel frustrated by the lack of solid answers and despairing of my failures, but I can also accept that the frustration and despair are themselves part of the story, as much as anything else.

It turns out the question was always the answer. And it just so happened that I would arrive at this understanding at this particular time in my life. So, in a way, the answer was, in fact, 42.

A New World Without Loss

Arthur C. Brooks writes about how Ludwig van Beethoven dealt with his gradual hearing loss, which, while crushing to a genius composer, ultimately led him to new heights of greatness.

It seems a mystery that Beethoven became more original and brilliant as a composer in inverse proportion to his ability to hear his own — and others’ — music. But maybe it isn’t so surprising. As his hearing deteriorated, he was less influenced by the prevailing compositional fashions, and more by the musical structures forming inside his own head. His early work is pleasantly reminiscent of his early instructor, the hugely popular Josef Haydn. Beethoven’s later work became so original that he was, and is, regarded as the father of music’s romantic period. “He opened up a new world in music,” said French romantic master Hector Berlioz. “Beethoven is not human.”

Brooks takes this as a lesson in loss. He says that here Beethoven shows us how losing something precious can open up new possibilities and ideas, and all of that is true. But that’s not the lesson I take.

When Beethoven lost his hearing, he could no longer be aware of what others in his field were doing. Whatever music was being lauded or pilloried at the time, Beethoven had no way to know what it sounded like. He had no way to compare his work to anyone else’s. All he had were his memories of what had come before.

To me, the lesson isn’t how Beethoven turned the tables on fortune and made something beautiful out of loss. (And I do have my own, albeit far less severe, experience of hearing loss to draw from here.) The lesson is that his loss meant he was no longer burdened with his own perception of what great music is supposed to be. Beyond what he could still hear in his own mind from his musical memory banks, there was nothing for Beethoven to compare himself to. Relieved of the energy spent and wasted on the anxiety and self-doubt brought on by the desire to suit the tastes of the time, his genius was liberated, freeing him to make the best music he was capable of at that moment.

Before I sat down to write this, I caught myself wondering whether the traditional early-2000s-era blog format was still viable, whether anyone would want to read a post by a relative nobody responding to an article by a relative somebody about an indisputably significant somebody. I worried whether the format would make me seem unhip. I worried that whatever I wrote might better suit a magazine essay, which would never be written (nor published if it were), or if it might be best to simply tweet a condensed version of my thoughts, and leave it at that. In other words, I wasted time and energy on anxiety about what my writing is “supposed” to look like.

Imagine that I came to this piece with no preconceived notions of the form my thoughts should take. Imagine I had no respectable essays, eye-catching blog posts, or pithy tweets to compare myself to. Imagine that all I had were my thoughts and my skills as a writer, whatever they happened to be at this moment.

Beethoven’s loss forced him into a position of ignorance. His deafness gave him no choice, but that ignorance freed him. His earlier work sounded like somebody else’s, the work of people he thought were “doing it right.” When he could hear no one else, “doing it right” meant only what was right to him in his own mind.

I, and we, do not have to wait for loss. We do not have to be forced into a kind of ignorance. We can choose to learn from what others have done, build on what we have already accomplished ourselves, and then let everything go. Then we can be free, and we can know it, too. We can open up a new world without loss.

Beethoven is not human, and neither am I. Thank god.