Resident Theologian

Brad East

Two ways of reading

One way of reading something is to ask what’s wrong with it: what’s missing, what’s out of place. Another way of reading something is to ask what one might learn or benefit from it.

It’s a mistake to see the first as “critical” reading. This is the perennial academic error. Finding fault with a piece of text—whether an op-ed, an essay, a journal article, a monograph, a novel, or a poem—for being imperfect (i.e., human) or finite (i.e., limited) is absurd. We know in advance that every text we ever read will be both finite and imperfect. This is not news. Nor is “critical” reading the smirking discovery of whatever a given text’s limits or imperfections are. Who cares?

Roger Ebert liked to admonish Gene Siskel for being parsimonious in his joy. The principle applies to reading and indeed to all intellectual activity. Why should what is flawed take priority over what is good? Why not approach any text—any cultural artifact whatsoever—and ask, What do I stand to receive from this? What beauty or goodness or truth does it convey? How does it challenge, provoke, silence, instruct, or otherwise reach out to me? How might I stand under it, as an apprentice, rather than over it, as a master? What does it evoke in me, and how might I respond in kind?

Such a posture is not uncritical. It is a necessary component of any humane criticism. It is the first step in the direction of genuine (rather than superficial) criticism, for it is an admission of need: of the limits and imperfections of the reader, prior to mention of those of the text.

In a word, humility is the condition for joy, in reading as in all art. And without joy, the whole business is a sad and rotten affair.

Prudence policing

There is principle and there is prudence. Principle is what’s right, what you believe to be true and good, no matter what. Prudence is what to say and do about it, when, and how.

In online and social commentary, the prudence policing is as ubiquitous as it is nauseating. Writer X claims that, if writer Y really believed in principle Z, then Y, like X, would go about addressing Z in precisely the same way X believes best. But that’s just a category mistake. There may be any number of legitimate reasons to disagree about what prudence calls for, whether in deed or in word—that is, with respect to public (or private) action or with respect to public (or private) speech.

It is silly and unserious to constantly police others’ prudential judgments, not least when the persons in question are strangers whom one knows only from the internet, their writing, or their profession. It’s tacky, more than anything. It treats the discipline of seeking to understand and elaborate our common life in all its detail and complexity as, if not a game, then a species of yellow journalism: Did you hear what happened ten seconds ago? Care to comment?

It’s perfectly reasonable to say no in reply. To assume otherwise is to reduce writing in all its forms to propaganda, sound bites, and the perpetual reinforcement of tribal identities. Which, come to think of it, is not a bad description of Twitter.

Getting at the truth

There is a nasty temptation to which any thinker, writer, scholar, or speaker is vulnerable. It’s something I’ve observed in prominent theologians as they age but also peers who, though on the younger side, have amassed enough of a readership and output that they could plausibly be said to have “a body of work.”

The temptation I have in mind is this: A shift, subtle but clear, from seeking above all to get the matter right to clarifying how to get me right. That is, instead of aiming at the truth as such (regardless of what you or I now think or once thought or have written or whatever), aiming at “the truth of my position.” The latter project inevitably entails constant granular adjudication and exegetical niceties infused with, or motivated by, persistent and often grouchy defensiveness. “Ugh, are these critics even literate? Thank God this one other person can read; he got me right, and they should read him if they would understand me.”

This boundless self-referentiality not only creates an echo chamber. It not only moves the focus away from the subject of inquiry to the inquirer himself. It’s boring. Recursive hermeneutical obsessiveness about one’s own project, invariably framed as a necessary if toilsome defensive measure, is simply not interesting. Not least when the person is a theologian, philosopher, or ethicist—in other words, someone whose objects of interest are in themselves supremely fascinating and existentially urgent.

Don’t fall for it. Don’t be that person. Don’t assume your oeuvre is more attractive than what drew you to your discipline in the first place. Don’t substitute your own ego or career for the pursuit of the truth. If the cost is that people misunderstand you, or that what you once believed turns out to be erroneous, so be it. For intellectual work, that’s the price of doing business anyway. Best to accept it now and, so far as you’re able, affix your eyes to what matters most, refusing to look away—not to your CV, not to your reviews, not even to your mentions. Those are songs of the sirens. If they succeed in seizing your attention, they’ll keep you forever from arriving home.

Touchy self-regard

There are many scholarly vices, but the two that stand out most prominently to me are defensiveness and self-pity. We all know academics who fall prey to these. What’s unfortunate is that, far too often, they seem to be positively rather than inversely correlated to one’s status, fame, renown, and success. Attend a conference, observe a well-known master of a sub-guild on a panel, and you’ll be shocked (or not) by his sheer touchiness. The mere mention of a minor dissent from one of his many ideas will call forth a thunderstorm of wrath and emotion worthy of a toddler tantrum.

But it’s not just the intemperate. Follow a scholar or writer on Twitter. It should be clear to all of us by now that social media in general exacerbates these vices. For the voices we’ve heard in our heads our whole lives—you’re a fake, everyone knows it, you don’t know anything, your writing isn’t worth a damn, why do you even waste your time?—are given quite literal and insistent and incessant expression in one’s replies, DMs, and emails. This is why every writer and scholar should get off every form of social media, Twitter above all. It trains the psyche (and the ego) to categorize any criticism, however legitimate or gently phrased, as falling under the genre of “reply-guy mentions.”

The other vice I have in mind is an exaggerated self-regard. This manifests in a reflex that wonders why it is that one writes in the first place—after all, no one reads my work anyway, so what’s the point? But then, such a cri de coeur is invariably in print, meant for others to see. Even when it’s honest and not just fishing for compliments or protestations, it’s an emotional and scholarly trap. How many people today are writing in English for a public audience, whether in books, journals, magazines, blogs, newsletters, or on the internet? Surely the number is in the tens of thousands. It boggles the mind, to be honest. Unless you’re selling millions of books, or you’re one of a handful of super-scholars like Charles Taylor, the truth is the only impact you can or will have is on a very, very, very small audience of readers, one that is necessarily vanishingly minuscule in absolute terms. Which means, in turn, that the overall effect of one’s work is in all likelihood going to be almost nil.

It seems to me that we have a choice: accept this as a fact on the front end or doom ourselves to inevitable melancholy, self-loathing, and despair on the back end. That I am not going to be a Saint Augustine or a Bucer or a Barth, an Austen or a Trollope or an Eliot, a Taylor or a MacIntyre or a Jenson—this is a certainty. Does it mean I ought to stop writing? I don’t think so. I write, among other reasons, because I must. I can’t not write. But I also write because I might, within a very circumscribed range, affect or inform or educate or edify a few souls in my orbit. I call them souls because that is what they are: souls. Having even a tiny impact on a single soul isn’t nothing. It’s not much by comparison to the big leagues, but it’s something.

If you can accept that, you can be a writer without driving yourself crazy in the process. If you can’t, well, at least have the decency not to draw us into your circle of self-regard.

“You are your actions”: close, but not quite

Over on Freddie deBoer's blog, he has a sharp piece up criticizing the vacuous identities induced by mass entertainment in late modern capitalism. Instead of having a nice time watching a Marvel movie, for instance, one's sense of self gets wrapped up in "being a Marvel fan." But Marvel doesn't care about you. Nor can it offer that depth of identity-constitutive meaning. It's just a movie that's a pinch of fun in a dark world, for which you fork over some money. Forgetting that, and allowing Disney to define who you are, is both childish and a trap. It doesn't end well, and it's a recipe for arrested development.

Here are the closing two paragraphs (my emphasis):

I wish I had a pat answer for what to do instead. Grasping for meaning – usually while drenching yourself in irony so that no one knows that that’s what you’re doing, these days – is universal. I will risk offending another very touchy subset of the internet by saying that I think many people turn to social justice politics and their prescriptivist politics, the notion that your demographic identifiers define you, out of motives very similar to the people I’ve been describing. There are readymade vehicles for acquiring meaning, from Catholicism to New Age philosophy to anarchism, that may very well create the solid ground people are looking for, I don’t know. I suspect that the best answer for more people would be to return to an idea that is very out of fashion: that you are what you do. You are your actions, not what you consume, what you say, or what you like.

It’s cool to name the bands you like to friends. It’s cool to be proud of your record collection. I’m sure it’s fun to create lists for Letterboxd. But those things don’t really say anything about you. Not really. Millions of people like all the things you like, after all. And trying to build a personality out of the accumulation of these things makes authentic appreciation impossible. I think it’s time to look elsewhere, as much as I admit that it would be nice if it worked.

The critique is on point, but the solution is not quite there. Part of the reason why rests on what deBoer, in turn, presupposes to be off the table (though he acknowledges it as a possibility for others): Christian anthropology. But the following points, though they trade on theological judgments about the nature of the human person, can be defended from other perspectives as well.

So why is defining one's identity by one's deeds an inadequate prescription?

First, because most people's actions are indistinguishable from others' actions: you wake up, you eat, you punch a clock, you watch a show, you pay the bills, you mow the lawn, you grab drinks with a friend, you text and email and post and scroll, maybe you put the kids to bed, you go back to sleep, you repeat it all over again. Such things "don't really say anything about you" either, since "millions of people [do] all the things you [do], after all."

Second, because meaning comes from without, not from within: even if your actions were robust enough to constitute a worthwhile identity, you'd still be seeking, desiring, yearning for that which is other than you, that which transcends you—whether in a lover, a friend, a child, a marriage, a job, a group, a church, a nation, a deity. Not only is it humanly basic for the source of our identity and meaning to come from beyond us (David Kelsey defines human life as "eccentric existence": the ground of our being stands outside ourselves); not only is it literally true that we depend on what is outside us for sustenance, care, and flourishing (gestation, birth, food, relationships, art); even more, turning in to oneself for one's own meaning is a dead end: simply put, none of us is up to the task. True navel gazing is monastic: it turns the self inside out—to find God within.

Third, because (positively speaking) your actions will never be enough: not impressive enough, not heroic enough, not virtuous enough, not even interesting enough. If I am what I do, I am a poor, indeed a boring and forgettable, specimen of the human species. Even if it were true that all that I am is found in my actions, that would be a cause for despair, not hope or meaning.

Fourth, because (negatively speaking) you are an inveterate sinner: you will fail, you will falter and stumble, you will invariably harm and hurt others, not always without intending to do so, but by commission and omission, you will err, you will induce pain, you will fall short—for the rest of your life, world without end. Christians don't think this is the end of the story (God not only forgives you but provides means of reparation for others, healing in oneself, and moral improvement over time), but there is a reason that Chesterton called original sin the one empirical doctrine. Look around at the world. Look at your own life. Does either inspire confidence? Does it suggest a source of reliable meaning and stable identity going forward? I didn't think so.

None of this means it's either deBoer's way or the church's. Even if my description were right, perhaps that merely means that life is meaningless and there is nowhere to look, even in one's own actions or character, for personal significance and rich identity. (Though in that case, who can blame the geeks for their projections?) Or perhaps I'm right at the formal level, but there are sources of transcendence beyond the church that folks like deBoer are remiss in overlooking. In any case, though, the upshot is clear. In terms of deep personal meaning and life-giving identity, the last place to look for who I am is in what I do. Look instead at what I'm looking for—looking at, looking to—and that'll tell you who I am. Or at least who I hope to be, who I'm trying to become.
