Self-Loathing in the Shadow of the Unfinished Work

A couple years ago, I had the chance to be a real writer, and I blew it.

Way back in 2017, I was asked to spend two weeks in October at a writers’ retreat in Northern California. This had nothing to do with any books I had written (for I had written none) or high-profile publications in which I had been published (for I had not). But because this particular retreat offered a very particular fellowship for writers in a very niche subject area, the previous fellowship recipient kindly recommended me to be his successor. I’m guessing there also weren’t many other folks to choose from, or perhaps they were busy.

The point is that I got to spend one whole fortnight in a gorgeous, rustic home, surrounded by natural beauty, doing nothing but working on my craft.

The problem I immediately faced upon accepting this fellowship was that I had nothing to craft. One was expected to come to this retreat to work on a specific project, usually a book or lengthy article in progress. I had no such project, in-progress or otherwise. I had to come up with one.

So I did. The formulation I made was simple. I took the two areas of thought that were of the most interest to me at the time and decided to mush them together, comparing and contrasting, wrestling with their implications, and working out what epiphanies, lessons, or truths I could extract from the whole enterprise.

It would be a big magazine article, intended for publication in the journal published by my employer. In this way, it would help justify my two-week absence from work, which, I must add, my employer happily and generously granted. It would be a big piece. A “longread.” Perhaps it could turn into a book.

At the retreat, I worked diligently. Not one for sightseeing or communing with nature anyway, I made the most of this precious allotment of uninterrupted time. I dug deeply into the subject matter. I collected research materials, I interviewed experts over email, I took meticulously sourced and cited notes, I jotted stray thoughts, I sketched outlines, I worked in feature-laden applications for Serious Writers working on Major Projects, and I drafted sections and subsections and introductions and transitions and reflections.

I neither expected nor intended to finish the entire project during my residency, but by the time those two weeks were up, I had a piece that had grown to something like 13,000 good words.

But I still blew it. I never finished it. Two and a half years later, it’s still unfinished.

There were some contributing factors.

For one, during my time at the retreat, something went haywire in my ear. My existing tinnitus worsened exponentially, I began to go through spells of vertigo, and I lost some hearing. This was something of a distraction. It never stopped me from applying myself to my work, but obviously there was a good deal of mental energy that was inevitably spent on this emergent crisis on the right side of my head.

For another, a few months after my return, my marriage ended. You can imagine how that might drain one’s will to work on projects that are largely extracurricular.

These are fine excuses for why it became much more difficult for me to finish the project, but really, I never finished it because I never decided to finish it.

There was never going to be a mystical space carved out of my normal life to make room for plowing ahead with this work. My job resumed, my kids needed their dad, and I needed to manage a monumental and traumatic life transition. But even with all that, I failed to make the decision to sit back down at the computer and write.

Months passed. Then more months passed. In my mind, the Major Project became a queasy source of regret and shame. And the further time progressed from that autumn of 2017, the more I perceived that project as an unmanageable and outdated mess. I think I almost felt like it was angry with me.

But of course, it wasn’t. Nor was it unmanageable; I needed simply to decide to manage it. Nor was it outdated; I needed merely to decide to refresh it.

Nor was it a mess. I was.

A few months ago, I decided to return to it. I even announced it so that I could give myself at least the illusion of public accountability. And over the last several weeks, I have indeed been working on it.

It’s not finished. It begs for merciless refinement, and I don’t mean some tweaks for concision. It needs some real horror-movie chainsaw violence done to it. I need to detach myself from feeling precious about certain passages or turns of phrase that simply do not contribute to the larger goal of the piece. I need to rethink the way it’s framed in the opening section so that the reader is better ushered into the subject matter. And I need to find a path out of it, a way to merge its various tributary streams into a single current.

I need to figure out what it really is.

And I will. I haven’t yet, but I will.

I don’t know what this product will be when it’s done. It might yet be that magazine piece I promised my employers back in those innocent days of 2017. But perhaps it’ll be better suited to a series of blog posts. Or maybe it’ll cry out for expansion into a book. I can’t yet say.

Part of what makes this project loom so large in my psyche, and why it still provides a steady drip of regret into my heart, is the weight of validation I placed upon it. By being given this fellowship at this beautiful retreat, even if it had been a strange fluke of circumstance, I had the chance to be a real writer.

Let’s not get technical, now. I know that I am, indeed, already a writer. I constantly churn out written work for my job, I have written for several websites, I have been published in a couple of journals, and I write for my own blog.

But you know what I mean. I sought the imprimatur of a real writer, someone whose byline is recognized and sought. Someone who is asked to be on panels at conferences. Someone whose name graces the spine of a book. Someone whose writing actually matters.

I’m not that guy. I might never be.

I definitely won’t be if I don’t decide to write.

And even in the best possible circumstance, in which this piece catches lightning and earns me some amount of approval, it still does not have the power to make me what I already am.

In fact, I may never publish it at all. It may turn out that its entire premise was ill-advised, and that it simply can’t be worked into something that is worth putting out into the wider world.

I don’t know yet. But even if another soul never reads a word of it, I promise myself this.

I will finish it.

Animal Crossing and the Joy of Bucolic Drudgery

Me, in jester’s hat, superhero mask, and business suit, with the quetzalcoatlus skeleton that looms over my property.

Why did I play Animal Crossing for four hours today?

About a month ago I became one of the bajillions of people of all ages enthralled with Nintendo’s bucolic-drudgery simulator, Animal Crossing: New Horizons. I never expected to be. When the game was announced, having no frame of reference for the previous iterations, I was utterly uninterested. Then I saw the deluge of fawning coverage and player testimonials about how this game, this experience, was keeping people sane during the COVID-19 lockdown, and I decided to give it a shot.

Now it’s the center of most family activity and interest at my house. My kids can’t stop talking and thinking about the game, and even my partner, who never plays any video games whatsoever, is utterly devoted to it. (She plays more than any of us!) The four of us are constantly dishing about the other island residents and trading gossip about their quirky behaviors (we all just love Zucker), and we cheer each other on for our successes. (“I finally caught an oarfish!”)

My partner Renée with her big catch. I have a heart attack every time I pull one of these monsters out of the water.

But, you know, why?

I do understand the general appeal of the game’s overall shtick. After all, I spent a great deal of time, circa 2000, enriching the lives of my Sims (or making them suffer unthinkably), and more recently I have easily logged around 1500 hours fashioning empires in Civilization VI. And while I’ve never really gotten the hang of Minecraft, I can at least appreciate how its limitless palette for creativity is so engrossing. I’ve even dabbled, rather tepidly, with Second Life. Animal Crossing boasts many of the elements that made Minecraft, Second Life, and the Sims and Civilization franchises appealing. And it’s way cuter.

But viewed from another angle, playing Animal Crossing can seem a lot like the equivalent of doing manual farm labor for a cult leader. Tom Nook is Joe Exotic and we are all his expendable underlings being paid in fake currency and expired meats.

For example, I can spend an hourlong game session just pulling weeds.

Let me slightly rephrase that. I choose to spend an hourlong game session pulling weeds.

And the crazy part is that I love it. With every clump of vegetation I yank from the ground and stuff into my “pockets,” I have made my little island home (which is called Duckbutt Island) just that much more beautiful, and made a larger canvas for me to do with as I like. The methodical, somewhat rhythmic pulling of the weeds is rather meditative, much like real gardening can be (but without the real dirt or real bugs). Even the sound that comes from each weed-pull, a sort of squirty “yoink,” is weirdly satisfying.

I’m not kidding here. When I go on a jaunt to a “mystery island” or visit my kids’ domain and I see a lot of weeds, I think, and perhaps shout out loud, “Oh boy! Weeds!”

Later, I can store all those weed clumps away and wait for Leif to come back to Duckbutt town square and purchase them at a modest markup.

Planting flowers, shaking trees, whacking away at rocks, collecting seashells — all of it is tedious, and yet it’s the tediousness that’s often the most appealing part for me. I do also enjoy the creative customization, designing one’s avatar and dwelling, and I have fun checking the boxes that qualify Animal Crossing as a “game” by hitting certain milestones, fulfilling necessary tasks, and upgrading life on Duckbutt. Those things all help Animal Crossing feel like it has a “point.”

But even without those things, it’s remarkably soothing to simply wander one’s island and gently tend to it.

Me in my red outback hat, dress made of cherries, and recycled boots, livin’ life like it’s golden with the boys — my two giant snapping turtles.

In this way, Animal Crossing is less a game, and more of a place to go — which is especially valuable at this moment in history. Countless other games offer this kind of escape, of course, from Fortnite to World of Warcraft to, well, name your MMORPG of choice. None of them, however, have appealed to me the way Animal Crossing has… with perhaps the exception of The Legend of Zelda: Breath of the Wild, which, while not a “sim” by any means, provides so rich and wide an experience that one can simply wander and putter about delightedly for hours. And believe me, I have.

Zelda aside, perhaps it’s those other games’ sprawling complexity that suggests to me that the effort to master them wouldn’t be worth the time and energy.

Animal Crossing has many layers of complexity, but they all feel very optional. One can advance and upgrade at one’s own pace, and in the meantime there is always something to do, harvest, beautify, design, or craft. And, importantly, as you meander and dawdle, no one will be out to get you.

However, one aspect of Animal Crossing that has really solidified it as a breakout phenomenon at this moment is its social component. Players can visit the islands of friends or anyone on the internet who opens their island to visitors. I’ve played online with my kids while they’re at their mom’s house, but otherwise I have interacted very little with anyone else. What am I missing here?

I suspect it has more to do with me than the game. My reticence and anxieties over social encounters in meatspace seem to carry over to Animal Crossing in strikingly similar ways. Just like in the real world, I worry over what to say or how to behave around another player, and feel exhausted in advance by whatever expectations they might have of me. I feel pretty confident of my ability to cultivate lasting friendships with Zucker the octopus and Truffles the pig. And Blathers, well, he is my true soulmate. But actual humans are another story.

At least on a computer-generated island, no one expects our avatars to make eye contact.

I maybe oughta blog more.

There was a time when I tried to make a point of writing at least one blog post every day. Today that sounds like some trite advice from a self-help article on Medium, but I wasn’t doing it in order to “gain 50,000 followers” or what have you. It was a good habit to keep as a writer, to practice in public like that, and it genuinely felt good to have made something each day. But mostly, I actually felt like I had something to say, all the time.

These days, it’s remarkable if I write something more frequently than once a month (this is outside of work, of course, where I write all damn day, every day). There’s a long list of contributing factors. Personal reasons include mental exhaustion from work, attention demanded by kids and other family matters, the attraction of less intellectually demanding pastimes like video games (I really don’t watch much TV at all), and a bedtime that seems to seep ever-earlier into the evening as I age.

There are also, I think, broader cultural reasons I don’t blog like I used to. The novelty of the form itself has worn off since its early-aughts heyday. While blogs were once the primary venue for processing and debating the events and issues of the day, they have been largely replaced; for journalists and activists, by Twitter; for everyone else, by Facebook. In those now-hazy before-times, one might be outraged over something some political figure did, compose a four- or five-paragraph screed expressing said outrage, and liberally blockquote from some other source for the purpose of bolstering or rebutting one’s argument. Today, the same person will retweet something someone else said about said outrage, and maybe add an original line to a tweet in order to keep it within one’s personal brand. Or they’ll share an article (probably unread) on Facebook, perhaps adding their own exclamation-marked sentence about the outrageousness of the outrage.

The author in 2006, with a laptop, possibly blogging. Possibly not.

The point being, blogs just aren’t where the action is. Blogs were once little islands of thought, from which individuals or small bands of like-minded island-dwellers would cast their prose into the wide ocean of the internet (or, as it was more often characterized back then, the capital-I Internet, like it was a place). Often, that prose might be fashioned into a kind of dinghy and aimed directly at another Internet Island, sometimes carrying supplies, sometimes a warhead.

It was fun!

Some of those Internet Islands still exist and thrive, and some have developed into full-blown Outlets, honest-to-goodness nation-states in the online media realm. Some blogs were subsumed into larger entities, or their feudal lords were lured away to more luxurious courts. But I think for most of us who were on the tiniest of those Internet Islands, we saw that no one was reading what we wrote anyway, so we might as well put in as little effort as possible, and be ignored on Twitter instead.

And good lord, did I love Twitter for a while. I felt like I really got it, and my own brand of everything-is-terrible humor-as-despair shtick felt very well suited to the platform. Today, though, Twitter is like punishment. I check in, I scroll, and I am quickly saturated by anxiety, anger, and despondency. And it doesn’t seem to matter what measures I take to curate my feed. In a time as ugly as this, ugliness is all there is to tweet.

As for the material I put out on Twitter, no one is seeing it. Even after thirteen years on the platform (Jesus Christ, has it really been thirteen years???) I have managed to attract a measly 4000-some followers, only a tiny fraction of whom ever actually see (or care to notice) what I write. If something I tweet does happen to break out a little — usually because a certain friendly atheist has retweeted it to his own massive following — I become deluged with inane replies that are often inexplicably hostile. None of it seems to make things any better, and there’s no feeling of accomplishment.

And besides, I’m not a “tweeter.” I’m a writer. And while a thought expressed in 280 characters or fewer is an absolutely valid and valuable form of writing, it’s not sufficient for me.

This gets me back to the question about why I don’t write more, or more specifically, why I don’t blog.

The despondence engendered by Twitter is part of the answer. The ocean of the internet (it’s lowercase-I these days) is already so polluted with opinions, punditry, takes, essays, outrages, and news, it hardly seems useful to throw in more of one’s own trash. Things are bad! Bad people are doing bad things! You don’t need me to tell you that. And while I could write about something else instead, something that has nothing to do with how terrible everything is, my despair has sapped my drive to share my thoughts about anything.

Another reason for my blog-hesitancy is ego. There seems little point in putting in the effort of writing when I know that no one’s going to read it. And my standards for what constitutes “some folks read it” versus “no one read it” have already been lowered to sub-basement levels. The idea is supposed to be that the good stuff will rise to the top, but I don’t think anyone believes that anymore, and who knows if my stuff would even qualify as “the good stuff” anyway? Sometimes I think it has, but what do I know? I only have 4000 Twitter followers.

I end this post without an answer, other than the obvious, which is: Do it anyway. What I write — and yes, specifically, blog — should exist for its own sake. For my sake. Because each time I do it, I will have made something. I will have improved my own thinking and come to better know myself. It will, as Vonnegut put it, make my soul grow.

And maybe, on the off chance that someone else encounters it, maybe it will do something good for them, too. Maybe that person will stand up from where they’re sitting on their Internet Island, look across the sea in my direction, and wave.

An Actor, an Introvert, and a Universe of Possibilities

The author in 2006.

People tend not to believe me when I tell them I’m severely introverted. It’s understandable, as the persona I put forward is usually that of a quirky, agreeable smart-aleck. I am animated and expressive in conversation, I engage in overtly silly play with my kids, and of course, I’m an actor and musician.

To many people, my personality simply seems too big to be that of someone who is shy, anxious, or reserved, let alone autistic. Some have even told me they find me intimidating. To me, that’s beyond ridiculous, but there it is.

When folks have trouble grasping how it is I could have found any joy in being an actor while finding social interaction to be utterly draining and even painful, I explain that when I’m performing, I’m protected by several layers of metaphorical masks. On stage in a play, I am explicitly not myself. It says so right in the program! Next to my name will be the name of whatever character or characters I’m playing. I’m definitely not playing “Paul Fidalgo.”

I don’t have to be clever or come up with interesting things to say, because the words have been written for me, hopefully by someone who is well established as being really, really good at writing interesting things for people to say, like, for example, William Shakespeare.

I don’t even have to decode any social signals or read between the lines of what others are saying in order to know when to speak, because it’s all been planned out in advance. I am forbidden from speaking until my own lines are cued. That limitation is indescribably liberating.

I don’t have to know what to wear. I don’t have to know where to stand or how to behave, because all of that will have been worked out in rehearsal. If the play doesn’t call for my presence in a scene, I don’t even have to exist.

But there’s another way to explain the apparent incongruity of my personality that flips all of this on its head, and I didn’t even realize it myself until I had it explained to me in an essay written several years ago by a true master of the theatre.

I recently came across an essay published in The Nation in 2011 by the great actor and playwright Wallace Shawn, who most folks will know as Vizzini in The Princess Bride, Grand Nagus Zek on Star Trek: Deep Space Nine, or the voice of the Tyrannosaurus Rex in the Toy Story movies. Maybe you know him from the 1981 film My Dinner with Andre. Oh, and he was just in Marriage Story, so that might help.

In his essay for The Nation, which is a truly beautiful piece of prose in which he explains how his art leads him to consider himself a socialist, Shawn writes:

We are not what we seem. We are more than what we seem. The actor knows that. And because the actor knows that hidden inside himself there’s a wizard and a king, he also knows that when he’s playing himself in his daily life, he’s playing a part, he’s performing, just as he’s performing when he plays a part on stage. He knows that when he’s on stage performing, he’s in a sense deceiving his friends in the audience less than he does in daily life, not more, because on stage he’s disclosing the parts of himself that in daily life he struggles to hide. He knows, in fact, that the role of himself is actually a rather small part, and that when he plays that part he must make an enormous effort to conceal the whole universe of possibilities that exists inside him.

One version of my explanation for why such a loud, animated performer like me could be such a severe introvert is that I alone am too small and too vulnerable to be comfortable in my own skin in the midst of other humans. But what Shawn helped me to see is that this disconnect also stems from the fact that my singular, real-life self is also near to bursting with thoughts, ideas, fears, ambitions, impulses, and possibilities.

The potential energy bottled up and pressed down into this small, delicate body is overwhelming. Letting any of its pressure out brings with it the risk of humiliation, regret, misunderstanding, or bewilderment. So a single, inoffensive persona must be adopted, a safe and broadly acceptable packaging must be applied.

The stage does not solve or sort all of these parts, but it does allow them to manifest in meaningful, productive, and satisfying ways. In this way, an actor’s role is sort of like Mjölnir to Thor.

In Thor: Ragnarok, the Asgardian Avenger has lost his legendary hammer, Mjölnir, and at the edge of utter defeat, he hears the voice of his late father Odin, who asks him, “Are you the god of hammers?” Odin explains that Mjölnir was not the source of Thor’s power, but merely a means of focusing and controlling it. The real power, the “thunder,” is already inside him, coursing through him.

That’s what a role in a play is for an actor. It harnesses the lightning and thunder inside us and allows us to wield it. Shakespeare himself even wrote of “youths that thunder at a playhouse.”

It is true that for me, and I suspect for many actors, taking on a role is a way of protecting ourselves, providing armor for our fragility. But it is also a means to show our strength, to unleash a power within us that in most other circumstances would be too dangerous or destructive.

As Wallace Shawn says, we have within us a universe of possibilities. The stage allows us to live some of them out.

A New World Without Loss

Arthur C. Brooks writes about how Ludwig van Beethoven dealt with his gradual hearing loss, which, while crushing to a genius composer, ultimately led him to new heights of greatness.

It seems a mystery that Beethoven became more original and brilliant as a composer in inverse proportion to his ability to hear his own — and others’ — music. But maybe it isn’t so surprising. As his hearing deteriorated, he was less influenced by the prevailing compositional fashions, and more by the musical structures forming inside his own head. His early work is pleasantly reminiscent of his early instructor, the hugely popular Josef Haydn. Beethoven’s later work became so original that he was, and is, regarded as the father of music’s romantic period. “He opened up a new world in music,” said French romantic master Hector Berlioz. “Beethoven is not human.”

Brooks takes this as a lesson in loss. He says that here Beethoven shows us how losing something precious can open up new possibilities and ideas, and all of that is true. But that’s not the lesson I take.

When Beethoven lost his hearing, he could no longer be aware of what others in his field were doing. Whatever music was being lauded or pilloried at the time, Beethoven had no way to know what it sounded like. He had no way to compare his work to anyone else’s. All he had were his memories of what had come before.

To me, the lesson isn’t how Beethoven turned the tables on fortune and made something beautiful out of loss. (And I do have my own, albeit far less severe, experience of hearing loss to draw from here.) The lesson is that his loss meant that he was no longer burdened with his own perception of what great music is supposed to be. Beyond what he could still hear in his own mind from his musical memory banks, there was nothing for Beethoven to compare himself to. No longer spending energy on the anxiety and self-doubt brought on by the desire to suit the tastes of the time, his genius was liberated, and he was free to make the best music he was capable of at that moment.

Before I sat down to write this, I caught myself wondering whether the traditional early-2000s-era blog format was still viable, whether anyone would want to read a post by a relative nobody responding to an article by a relative somebody about an indisputably significant somebody. I worried whether the format would make me seem unhip. I worried that whatever I wrote might better suit a magazine essay, which would never be written (nor published if it were), or if it might be best to simply tweet a condensed version of my thoughts, and leave it at that. In other words, I wasted time and energy on anxiety about what my writing is “supposed” to look like.

Imagine that I came to this piece with no preconceived notions of the form my thoughts should take. Imagine I had no respectable essays, eye-catching blog posts, or pithy tweets to compare myself to. Imagine that all I had were my thoughts and my skills as a writer, whatever they happened to be at this moment.

Beethoven’s loss forced him into a position of ignorance. His deafness gave him no choice, but that ignorance freed him. His earlier work sounded like somebody else’s, the work of people he thought were “doing it right.” When he could hear no one else, “doing it right” meant only what was right to him in his own mind.

I, and we, do not have to wait for loss. We do not have to be forced into a kind of ignorance. We can choose to learn from what others have done, build on what we have already accomplished ourselves, and then let everything go. Then we can be free, and we can know it, too. We can open up a new world without loss.

Beethoven is not human, and neither am I. Thank god.

Purposeless on Purpose

The Night School, Gerard Dou, 1660.

I seek to be at peace with my own irrelevance.

In earlier, less distracted, and less accountable years, I was a fount of creative energy. Free time was often spent on writing songs and recording music or writing essays and blogs. I have always been driven to create. That drive formed my earliest sense of identity.

Today, in my forties, raising two kids, and working at an intellectually demanding job, my sparse remaining energy usually feels insufficient for extracurricular creativity. Fumes make for a poor muse.

So I don’t write nearly as much anymore. I rarely pick up an instrument. Songwriting is now filed away in the dusty archives of my persona as “something I used to do.”

But while the fatigue of existence is real, and my drive to create fires on fewer cylinders, these aren’t the real obstacles to creating. Nor can I lay the blame on the easy abundance of distractions provided by the internet, a phenomenon that had yet to saturate the culture when I was in my prolific twenties.

It used to be that as I worked, with every paragraph or stanza, I believed myself to be building toward something. I was laying the foundations of my career, one in which I would not just be a creator, but one that mattered. “Fame” isn’t quite the right word for what I was after (though I would not have shunned it by any means), but perhaps “prominence.” I would be known.

That didn’t happen. It’s not going to happen, either. For years now, this has been an inexhaustible source of regret and self-loathing. I’ve been dedicating a great deal of thought and work toward being at peace with the fact that whatever meager level of renown I’ve scraped together at this point is about as good as it’s ever going to get.

What does this mean for the creative drive I claim to still possess? Nothing good, I’ll tell you that!

I might become more accepting of my irrelevance to the wider world, but that very acceptance starves me of much of what once served as creative fuel. Why write an essay that only a handful of people will ever read, for which I will not be compensated, and which doesn’t lead to my work being discovered so that I can be placed into the demi-pantheon of People Whose Writing Matters?

In other words, why bother?

The wall of “why bother” is a big one. From any distance, its summit visibly looms over the top of my laptop’s screen. Large white letters adorn the wall like the Hollywood sign, promising “NO ONE CARES.” The letters are much brighter than the display on which I type.

One is not supposed to see things this way. Creation is supposed to be for its own sake. I have always had a great deal of trouble with “supposed to’s.”

So I seek out wisdom. In Zen and the Art of Archery, Eugen Herrigel questions his master about the purpose of his archery training. The master insists that Herrigel take no note of the target. The master insists that he not consider releasing the arrow. For what feels like ages, the master keeps him focused only on drawing back the bow, and nothing else. And Herrigel is utterly flustered. He says to his master that he is unable to lose sight of the fact that he draws the bow and lets loose the arrow in order to hit a target. There is a reason for all of this effort:

“The right art,” cried the Master, “is purposeless, aimless! The more obstinately you try to learn how to shoot the arrow for the sake of hitting the goal, the less you will succeed in the one and the further the other will recede.”

They debate this point for a bit, and Herrigel asks:

“What must I do, then?”

“You must learn to wait properly.”

“And how does one learn that?”

“By letting go of yourself, leaving yourself and everything yours behind you so decisively that nothing more is left of you but a purposeless tension.”

“So I must become purposeless — on purpose?” I heard myself say.

“No pupil has ever asked me that, so I don’t know the right answer.”

The idea that art, creation, is purposeless, is very difficult for me to internalize. I can intellectually understand and even appreciate it, but I can’t seem to accept it in my heart. The words “why bother” still ring in my head, and the “NO ONE CARES” sign still leaves a visual trace on my retinas when I close my eyes.

“You will be somebody, the second you make peace with being nobody,” Heather Havrilesky has written. “You can create great things, the second you recognize that making misshapen, stupid, pointless things isn’t just part of the process of achieving greatness, it is greatness itself.”

Being purposeless on purpose is, itself, greatness? I want to believe. The idea that a creative work is supposed to be purposeless is a claim without evidence. It is less a truth than it is a statement of faith. One has to decide for oneself that the work itself is enough.

“Let go of the shiny, successful, famous human inside your head,” writes Havrilesky. “Be who you are right now. That is how it feels to arrive. That is how it feels to matter.”

I do believe that. I can work with that.

But she also says, “Being a true artist merely lies in recognizing that you already matter.” That, I don’t understand. How does she know what qualifies one as a true artist? How does the archery master know that one’s aim must be aimless?

Like many statements of faith, I suspect the value of the claim that art is purposeless lies less in its veracity, and more in the behavior it induces. Its value is in the discipline required to live that ideal. It may or may not be true that creative work is “supposed to” be purposeless. It may or may not be true that writing this essay right now, or any other, is a meaningful end in itself, regardless of whether it is ever read or appreciated by anyone.

I don’t know if these things are true. I doubt that they are. But I might need to take the leap of faith and live as though they are. Doing so will take a good deal of practice. Discipline. But unlike art, it would not be purposeless. For if I can manage it, I may begin to believe that I do matter right now, and that mattering right now, and at no other time nor to any other people, is enough.

In Between the Pictures is the Dance

I’m not a dancer by any stretch of the imagination, but I’ve taken my share of dance and movement classes in my previous life as an acting student. I don’t mind being able to tell people that “I studied dance at Alvin Ailey,” which is technically true, as that’s where the acting students in the Actors Studio graduate program had dance classes. I was a hard-working if mostly hopeless student, and a frequent cause of eye-rolling and pity-sighs from our teacher Rodni, who moved with incredible control, strength, specificity, and power. Those gifts were not necessarily transferable.

The man who taught me more about dance and movement than anyone else in my life was Henry, the impossibly graceful, endlessly wise, and astoundingly patient head of the dance department at my undergraduate state college. Truly, there was something superhuman about the man (I assume there still is; he’s alive and well, and I imagine will be for many centuries to come). Taking a dance class with Henry was what I imagine taking a physics class with a gifted professor is like; it seemed as though every lesson had several “ah-ha” moments in which something marvelous about the body in space suddenly broke its way into my bewildered brain.

An example: What is walking? Henry would ask us as we ambled around the studio. The answer, which I was distinctly proud to call out in class when I had my eureka moment: Walking is falling. Think about it, you’ll get it.

Getting through to me was a doubly remarkable feat on Henry’s part, given that I’m autistic (Asperger’s, to be precise), which was unknown to me at the time, and surely made the job of teaching me how to move in a coordinated, graceful way exceedingly difficult. Rodni, gifted as he was, could have learned a few things from Henry, I have no doubt.

One particular “ah-ha” moment with Henry came outside of regular class time, when for some reason I can’t recall, he was looking over some of the choreography he had written for the school’s next big dance concert. I had never seen choreography written down before, only taught to me in person (an experience I do not envy any choreographer). Musical notation I could understand conceptually, of course, but how could one codify movement in unmoving glyphs?

I don’t know what most choreographers do, but Henry’s approach was pretty damned simple: stick figures. Much like a comic strip, the figures would be drawn in particular poses, indicating the moves the dancers would execute at various points in the music. There were probably arrows indicating direction and other marginalia scribbled throughout, but this is all I can remember.

I think I expressed my surprise that this was how choreography was written, that it could be done with stationary pictures even though the art form itself is based entirely on motion. Henry explained that rather than think of them as representations of movement, each picture should be thought of as points for the dancer to reach, marks to hit with their bodies. The stick figure poses were guideposts, “You Are Here” indicators.

“The pictures are the choreography,” explained Henry. “In between the pictures is the dance.”


There’s that cliché about the journey being of greater value than the destination, “it’s not where you go, it’s how you get there,” and so on. Maxims on that theme are so overused that they usually come off as trite to me, if not meaningless, or at least what Daniel Dennett might call a “deepity,” an idea that is true on its face in the most basic and obvious way, but without any of the profundity it’s presumed to convey.

I may be coming around.

Another fellow who, though I’ve never met him, I nonetheless consider one of my most important teachers, is the writer Alan Jacobs. One of his recent books is a short volume called, simply, How to Think, and truly, I feel like no one should be allowed to discuss politics or religion, write opinion columns, or use Twitter until they’ve read it.

The book warrants a substantive review of its own, but I want to call attention to one passage that had my neurons firing off like the 1812 Overture. Thinking, according to Jacobs, is a skill that has been wrongly equated with coming up with answers, decisions, and responses. Thinking becomes about being right, about winning. Jacobs explains what it’s really about:

This is what thinking is: not the decision itself but what goes into the decision, the consideration, the assessment. It’s testing your own responses and weighing the available evidence; it’s grasping, as best you can and with all available and relevant senses, what is, and it’s also speculating, as carefully and responsibly as you can, about what might be. And it’s knowing when not to go it alone, and whom you should ask for help.

Decisions, answers, conclusions; these are the final pose at the end of the music before the curtain falls. Each new piece of data acquired, each bit of information learned, are marks to hit, the guideposts that lead us on. They are static snapshots, pictures. But the thinking itself is what happens while we’re seeking those data points, hunting for information, and piecing it all together in our minds.

In between the pictures is the dance.


A few months ago, a friend of mine fervently insisted I read Mark Manson’s The Subtle Art of Not Giving a Fuck, one of those anti-self-help books that seem to be hip right now. I was somewhat reluctant. (Oooh, it has “fuck” in the title! How edgy!) But I am so glad I did, because, much to my surprise, it taught me what happiness is.

That’s overstating it somewhat. But Manson offers a way of thinking about happiness that, for whatever reason, had never consciously occurred to me. Simply put, Manson says that happiness is not a state one achieves but a process: it is what we experience when we are solving the problems we want to be solving.

That’s it. Happiness is not something to be attained, it’s just what happens while we’re solving problems. If we hate or don’t care about a set of problems, we’re miserable. If we do care about them, the process of solving them is what makes for a rewarding, meaningful existence. If I am spending my time and energies on tasks that hold no meaning for me, I’ll hate every moment of it.

But when I’m directing a play for my university students, for example, I can actually experience bliss, because I’m solving the problems presented by the production so that it can become its own living, breathing work of art. When it’s over, and the run of the play has finished, I almost always crash hard, and have serious trouble clambering my way out of a deep depression. This is largely because completing the play is not what brings me happiness (though averting a disaster for the production also averts severe psychological breakdowns).

It’s putting the play together that brings me meaning; helping the actors understand what they’re saying and why their characters do what they do; arranging the movement and positions of bodies on stage; coming up with ideas for costumes, sets, props, and sound; helping individual students overcome their hangups and anxieties so that they can grow into their roles and blossom. While it’s gratifying when each problem gets solved, checking off boxes on the great beast of a to-do list that a theatre production can be, each solved problem is one mark, one picture.

The struggle is the point. The joy is in the journey. Happiness is in the process. And in between the pictures is the dance.


Am I too old to have just figured some of this out? Having spent 40 years obsessing over goals and products, I never noticed that everything that mattered was in the reaching, in the creating. The doing, not the having-done. The -ing’s, not the -ed’s. Looking back, it all seems obvious.

I have time left, I think. I hope. I can’t have those previous 40 years back, but maybe I can reframe my memory, retelling my story to myself in a way that focuses on the journeys rather than the successes and failures. And maybe I can start the next story from this perspective: not as a goal to be achieved (I must think differently about my life) but as a process, a discipline, an asymptotic odyssey.

Look, some goals must be achieved, whether they provide meaning or not. Marks do have to be hit and some boxes absolutely have to be checked. You know what kind of box-checking I mean, the Maslovian, bottom-of-the-triangle kind, the kind that provides for one’s life necessities, and those of anyone in one’s care. There is not always joy in hitting the most rudimentary marks of mere survival. Though maybe there sometimes is.


This is how love works too, isn’t it? Whether familial, platonic, or romantic, it’s the active cultivation of a relationship, the choice to give of oneself to another person, be it a child, friend, or lover. Like happiness, love can’t just be a feeling, a state that we achieve, or a spell cast upon us. It’s the choice to love — a choice we keep making, moment to moment, picture to picture — that gives it meaning, that makes it matter, that makes it real.

I think that has to be it.

Walking is falling, and in between the pictures is the dance. In between the answers is the thinking. In between the giving is the love.

In between the moments, in between the events, in between the accomplishments, in between the failures, in between the losses, in between the lessons, the steps, the miles.

In between the seconds is life. That’s where it is.

Cynical Boy: Thoughts on Marshall Crenshaw’s 1982 Self-Titled Album

Inspired by The Incomparable podcast’s series of “album draft” episodes, I thought it might be an interesting exercise to write about some of the albums that have been the most meaningful to me. So whether or not I decide to do several of these kinds of posts, here’s my first stab at it.


I was very close to never having heard of Marshall Crenshaw. It just so happened that my dad had used a cassette copy of Crenshaw’s eponymous first album to mix down one of his own original songs (Billy Joel’s The Nylon Curtain was on the other side, which I’ll probably get into in another post). One day while in my teens, I went searching through my dad’s tape collection to find his song, and gave it a listen. The tape kept playing after Dad’s song, and suddenly this simple and engrossing little guitar riff grabbed my attention, and I was pretty much hooked from then on.

That riff was, of course, the opening notes of Crenshaw’s “There She Goes Again,” which remains one of my absolute favorite songs. It pretends to evince optimism and liberation in the face of separation and loss, but it’s all obviously a mask for the sickening weight of regret and the sting of rejection.

His album, Marshall Crenshaw (1982), largely remains in this vein, with nostalgically styled pop-rock tunes that sound like they could have been recorded in a basement, and I mean that as a compliment. It’s certainly polished, but it also has an immediacy and organic feeling, as though Crenshaw and his band are friends of yours who are working on their record right in front of you.

Once I discovered Crenshaw, I immediately related to him. He’s a smaller guy with glasses who likes hats, and he writes extraordinarily satisfying, hook-infused melodies and arrangements, almost all of which serve as wrappers for some sort of pain, self-doubt, or regret. This element is rarely overt; instead, it comes out in comic self-deprecation, little jabs at his blunders, and a kind of hapless, “well, what can you do?” persona. I really get that.

Anyway, the album. “Someday, Someway” is the album’s hit, which you’ll still hear once in a while on the radio or pop up in TV shows. It’s a very good song, but it’s not even one of the better ones on the record. Apart from the opening track, highlights include “Rockin’ Around in NYC,” which is both bouncy and tense at the same time, in which he sings, “I get the feeling that it really was worth coming after we tasted disaster”; and “Mary Anne,” with its gorgeous counterpoint backing vocals and its resignation to someone else’s despair.

“The Usual Thing” and “Cynical Girl” are rather different in tone, but both are defiant love songs that embrace uniqueness and alienation. On “The Usual Thing,” he worries that giving himself over to someone else will cause him to “lose his energy,” which sounds to me like the lamentation of an introvert. “But,” he tells her, “if I didn’t think you were a little bit out-there too, I just wouldn’t bother with you.”

And on “Cynical Girl,” he longs for a partner who, like him, has “got no use for the real world.” He sings, “I hate TV. There’s gotta be somebody other than me who’s ready to write it off immediately.” Damn right.

I really like a lot of Crenshaw’s other albums, most particularly #447 and Miracle of Science, but Marshall Crenshaw is something truly special, a rare distillation of the delights of classic pop-rock and the pain of being “a little bit out-there.”