Resident Theologian

Brad East

Civil War

One interpretation of Alex Garland’s new film.

I don’t yet know what I think about Civil War, Alex Garland’s latest. I’ve not read a word from others, though I have a vague sense that there are already battle lines drawn, strong readings offered, etc. I have nothing to say about that.

I do know that Garland is smart and makes smart films. I’m hesitant to trust either my or others’ knee-jerk reaction to a film that’s clearly got things on its mind, a film that is surely not what many of us supposed it would be based on trailers and ads.

I also care not one whit what Garland himself thinks about the film. He may have thought he was making a movie about X, intending to say Y, when in fact he made a movie about A, which happens to say B and C.

Like I said, I don’t have a strong take yet. I do have one possible interpretation, which may turn out to be a strong misreading. Here goes.

Civil War is not about American politics, American polarization, impending American secession, or even Trump. It’s not a post–January 6 fever dream/allegory/parable. It’s not a liberal fable or a conservative one.

Instead, Civil War is a film about the press—about the soul of the press, or rather, about what happens when the press loses its soul. In that sense it is about Trump, but not Trump per se. It’s about what happens to the press (what happened to the press) under someone like Trump; what the reaction to Trump does (did) to journalism; how the heart of a free polity turns to rot when it begins to mirror the heartless nihilism it purports to “cover.” Words become images; images become form without content; violence becomes a “story”; an assassination becomes a “scoop.”

It doesn’t matter what Nick Offerman’s president says seconds before he’s executed. It matters that he say something and that someone was there—first—to get “the quote.” The newsroom lifers and war-time photographers documenting propaganda, unable to listen to one more canned speech spouting lies on the radio, themselves become agents of propaganda. They become what they oppose, a photo negative of what they’re so desperate to capture for their audience. (What audience? Who’s watching? There’s no evidence anybody is reading, listening, or watching anymore. Outside of the soldiers and the press, everybody else appears to be pretending the war isn’t happening at all.)

The urban warfare Garland so expertly displays in the film—better than almost anything I’ve ever seen attempting to embed the viewer on the streets and in the cramped rooms of military units breaching fortified gates and buildings, made all the more surreal by its being set in downtown Washington, D.C.—is therefore not about itself, not about the images it seems to be showing, but is instead a Trojan horse for us to observe the “PRESS” who are along for the ride. And what happens between the three leads in the closing moments tells us all we need to know. One gets his quote. One gets her shot. And one loses her shot, as she does her life, having slowly awakened across the arc of the film to the intolerable inhumanity required of (or generated by) her profession. Another propagandist, though, rises to take her place. There’s always someone else waiting in the wings, ready to snap the picture that will make her name.

There, in the Oval Office, staring through a camera lens, a star is born.


The smartest people I’ve ever known

A little rant about well-educated secular folks who look down on religious people.

The smartest people I've ever known have all been religious. Most of them have been devout Christians. Whether within the academy or without, my whole life has been full to the brim with brainy, well-educated, introspective, self-critical, "enlightened" folks who also believe in an invisible, incorporeal, omnipotent Creator of all things who became human in the person of Jesus and who calls all peoples to worship and follow him.

The fact that a bunch of believers are also intelligent and well-informed doesn’t, in and of itself, entail anything about the truth of Christian faith. Perhaps they’re all wrong, just as Christians suppose atheists are all wrong. It doesn’t make a lick of difference that atheists include educated, thoughtful people. If Christianity is true, then all the smart atheists are dead wrong (at least about God)—and vice versa. On the topic of religion, as with any other topic, a lot of bright people in the world are wrong; their brightness doesn’t ensure their rightness.

I say this to make a point I’ve made before: It is a strangely persistent myth, but a myth nonetheless, that sincere faith or religious belief or devout piety is a kind of maturational stage that persons above a certain level of intelligence inevitably leave behind given enough time, education, and social-emotional health. It isn’t true. Anyone not living in a bubble knows it’s not true. Yet it endures. Not only among tiny scattered remnants of New Atheists but also among graduates of elite universities, the types who congregate in big cities and fill jobs in journalism, academia, and politics. The types who love to celebrate what they call “diversity” but look down on anyone who, unlike them, believes in God and attends church, synagogue, or mosque.

The joke isn’t on the dummies who keep on believing. The joke is on people whose social and intellectual world is so parochial that they’ve honestly never read, met, or spoken to a serious religious person—one who’s read what they’ve read, knows what they know, and “still” believes. Better put, someone who’s read and knows all those things and continues to believe in God because of the evidence, not in spite of it. Someone whose reason points her to God, not someone who has sacrificed his intellect on the altar of faith.

Christians and other religious folks in America are fully aware that there are people unlike them in our society. They know they’re not alone. They know that atheists and agnostics and Nones include geniuses, scientists, scholars, journalists, professors, politicians, celebrities, artists, and more. They’re no fools. They know the score. They don’t pretend that “intelligence + education = believing whatever I do.”

Yet somehow that equation is ubiquitous among the secular smart set. I’m happy to leave them be. They’re free to continue in their ignorance. But I admit to being embarrassed on their behalf and, yes, more than occasionally annoyed.


Publication promiscuity

Should writers be selective in where they publish, judging by ideology or other factors? I say no. Publish wherever you like, especially if they pay.

Every writer has to make decisions about where to publish. Not just where she would like to be published, but also where she would be willing to be published. All publications have a certain flavor or character, a back catalogue of writers, an ideological bent, an editorial slant, a reputation with readers and other writers.

I often get asked why I publish with X or Y. I'm not nearly as well or as widely published as others, but here's what I say in reply.

I am utterly promiscuous in my publishing habits. I would allow something I’d written—taking for granted that I wrote it, I was happy with it, and nothing more than the usual editorial oversight and revising was involved—to be published just about anywhere.

“Just about” rules out a tiny sliver of obviously objectionable venues: nothing, say, that calls or stands for armed rebellion, or racial supremacy, or some philosophy or politics so worthy of contempt that one could not possibly be associated with it or its adherents, much less aid and abet its cause. So if there’s a local chapter of some terrorist militia group with in-house literature, then no, I won’t answer any of their calls.

But beyond that, I’m very likely to say yes, and not with a tortured conscience but with a shrug. Moreover, even my definition begs for clarification. Because whether or not a viewpoint is contemptible is itself baked into the question of where to publish.

For precisely that reason, though, I think writers (and academics and journalists more broadly) shouldn’t worry so much about publication purity. Here’s why.

First, because there is no publication free of error. There’s always going to be some argument or position on offer in the venue with which you would take issue—perhaps serious issue. That’s just called writing in public.

Second, because every publication has skeletons in the closet. No magazine or journal has a blameless, infallible history. They’ve all issued corrections, errata, and apologies. Every one. (Those that haven’t, if they exist, should have; and the episode of their not having done so probably remains a stain on their reputation.)

Third, because no publication knows the future of any writer it publishes at the time of publishing him. Suppose he goes on to be a wicked public personage, infamous for this opinion or that behavior. Editors don't see the future. All they have to go on, in making judgments for the present, is the past.

Fourth, because writers are all wicked anyway. My tongue is in cheek, yes, but the claim also happens to be true, in more ways than one. Great writers are often awful people. More to the point, we Christians know that anyone you’ve ever read was a sinner. Every beautiful poem or essay or novel you’ve loved came from a wretched and damnable mind and heart. This doesn’t make the artifact any less lovely, nor does it excuse anyone’s actual sins (great or small). But it puts things into perspective.

Fifth, because publications don’t have cooties. Writing for X won’t infect you; reading Y isn’t contagious. You don’t contract uncleanness by reading or writing for a certain magazine. That’s not the way reading and writing work. Such a mindset is all too common among writers and intellectuals, but it’s childish and unwarranted. It’s the high school cafeteria all over again. Don’t fall for it.

Sixth, because writing for a given venue is not an endorsement, either of the venue’s usual views or of the particular views of any one of its writers or editors. I have no idea why anyone would suppose this, but many do, although typically only selectively—that is, with the “bad” outlets, whichever those may be. Such unthinking (often unreading) guilt by association isn’t worth a second’s thought.

Seventh, because great writers are themselves promiscuous. Some of my favorite writers today, including the most successful and respected, are fantastically diverse in where and with whom they will publish. Offhand, I'm thinking of folks like Eve Tushnet and Phil Christman, Christopher Caldwell and David Bentley Hart, George Scialabba and Samuel Moyn, B. D. McClay and Paul Griffiths, or Wendell Berry and Christopher Hitchens in their younger days. Part of the discipline of publishing promiscuity is that it requires you, the writer, to be interesting enough and independent enough that editors at a bewildering variety of venues would kill to feature you, even though you "might not quite fit" with the norm there. Well done then. That means you're doing something right.

Eighth and last, because writing wants to be free. It wants to get out. If someone wants to publish you, take them up on it. It’s unlikely you’ll regret it. Nor, if you do regret it, do you have to do it again; look elsewhere. But you’ll never be a writer if you don’t get your writing out in the world. Take some chances and don’t be picky, certainly not with active suitors.

In fact, go ahead and consider this post as good as giving you my card: if you want to publish me, you already know my answer.


“Arbitrary”

A lexical nitpick with some contemporary public-facing writing by journalists and academics.

A lexical nitpick.

In contemporary journalistic as well as postmodern academic writing, and various intersections thereof, there is a habit of using the word "arbitrary" to describe what is anything but. As used, the word is meant to denote or connote something both random and irrational. But invariably the referent is neither. There are always "reasons why," and almost never are the reasons self-evidently wrong or bad. The author simply doesn't find them persuasive. That doesn't render the phenomenon arbitrary, though. It just makes it conventional, or arguable. But what isn't?

I’ve seen this trend applied to weeding treatments in one’s lawn; to the distinction between men’s and women’s sports; to clothing style; to rules in a game; to questions of propriety or decorum in public spaces and in online images; to methods in biblical interpretation; and to much more besides. It’s a weird trick and usually a cheap one. It’s often unclear whether the author knows the usage is inapt and unfair, and both options are bad: dishonesty if he knows, superficiality if he doesn’t.

There’s something else going on, too. Typically the author makes clear how magnificently aware he is of the social constructedness of everything in our common life. This, that, and the other thing is a social construction; ergo, it’s turtles all the way down; ergo, it’s all arbitrary anyway—nothing but a choice of arbitraries: ecce homo, behold the human condition.

But that is either the wrong conclusion to draw, or the author hasn’t followed it to its logical conclusion. If the former, then what he means is that social and cultural and political life is unavoidably and essentially contingent—and that is true. But contingent does not mean arbitrary. Whereas if it is the latter, then absolute and irreducible arbitrariness as a feature of every aspect of reality entails that the author’s preference for (non-arbitrary) X over (arbitrary) Y is a nonstarter. He’s sawed off the branch he was sitting on. There’s no longer any argument to be had. In which case, he should drop the rhetorical gamesmanship and accept that his opponent’s position is no more or less arbitrary than his own.

Unless he already knows that, and is using words as a mere means to the end of getting what he, arbitrarily, wants. No inconsistency there, albeit at the price of reducing language to power and exalting the ego’s desires as final. The price, in other words, is nihilism as a social, political, and rhetorical philosophy. Which, if we’re honest, is sometimes what lies behind the public writing of academics and journalists today.


Rules for reviewing and being reviewed

Twelve rules for writing book reviews, followed by twelve rules for being reviewed as the author of a book.

Twelve rules for writing a book review:

  1. The subject of the review should be the book and its ideas, not you or your ideas. If you are inclined to write a piece you could have written had you never read the book in question, beg off immediately.

  2. Any reader of your review should know, after reading it, who the author is and in particular what the book is about.

  3. A review should give the reader some taste of the prose, some sense of the voice of the author and not only the author as mediated by your voice.

  4. The object of your review is the book as written, not the book as you would have written it.

  5. If the review is under 1,000 words, then you do not have space to formulate either a wholesale critique of the book or an alternative argument. You have space, instead, for a few small criticisms or one main criticism.

  6. Most books are bad, but few books are all bad. Find something positive to say about the book under almost any circumstances. (As Roger Ebert liked to say, don’t be parsimonious in your joy.)

  7. It is all too easy to write a “take down.” Don’t do that. A book must be unremittingly wicked or dangerously foolish to merit the critical shotgun blast. And if you’re eager to pull the trigger, that’s a sign that you shouldn’t.

  8. Be charitable: imagine why someone might think this book worth writing, exactly in the way it was written.

  9. Be disinterested: if you have personal animus against the author or some reason to wish him or her ill, do not write the review. A reviewer is an arbiter, of merit and of preference, and the reader should be able to trust that the reviewer is a fair judge in the matter. Don’t be a hack.

  10. Be critical, but not cheap. Hold the book to a high standard, but don’t go looking for flaws, and don’t view your review as an occasion to parade each of those flaws before the mocking eyes of a mob. If you expect the book to be substantive, hold yourself to the same standard; don’t suckerpunch an author under any conditions, but certainly not by means of a double standard.

  11. If the form is a review essay, then your voice and views and arguments rightly enter the field of play. But it remains a review essay, not an essay simpliciter. Your many tangents, comments, and reflections ought to circle or spiral around the subject and substance of the book, intersecting at crucial moments. All of the above still applies; in fact, it applies more stringently. The review essay is a longer leash, but a tighter one.

  12. In sum: Review unto others as you would have them review unto you.

Twelve rules for being reviewed:

  1. To be read, by anyone, for any reason at all, is an honor and a privilege. Most authors go their whole lives without an audience to speak of. Be grateful.

  2. To be reviewed is therefore a double honor. Not just an individual reader but multiple people—with busy lives, deadlines, finite attention, not to mention editorial demands, publication schedules, and a readership of some sort—decided that your book was worthy of public attention. Get down on your knees and thank God!

  3. A review is not a personal comment on the quality of your character. It is not an expression of like or dislike. It is not an act of friendship or unfriendship. It is an intellectual (possibly scholarly) assessment of the quality of your writing: its style, its substance, its contribution to the world of letters and ideas. Receive it as such.

  4. Do not write (give up the writing life altogether, in fact) if you fear or resent or otherwise cannot handle being reviewed. It is a vulnerable and often nasty experience. Being an author is not for you if you are, shall we say, a touchy person. Defensiveness is never a good look, but for authors—whose entire job description is assessing others (and their ideas) and being assessed (for the same)—it is a sorry state indeed.

  5. A bad review is not the end of the world. It is to be expected. It is the ordinary run of things. It may hurt. But for a writer, getting a bad review is just another Tuesday. Likely as not, the review won’t matter. Sometimes, bad press is good press. Sometimes, even, the reviewer might be onto something.

  6. A largely positive review that includes modest criticism is not a bad review. Every author wants adulation and affirmation. But even the best reviews place some question mark here or there next to the book’s claims. Charles Taylor and Wendell Berry, Marilynne Robinson and Cormac McCarthy, Barbara Ehrenreich and Mary Karr are allowed to expect “Good Reviews.” (Though, to be clear, all of them have gotten bad reviews!) You and I are not.

  7. Because you are not a perfect writer and no one has ever written a perfect book, you should not only be unsurprised by criticisms of your work, you should expect and even welcome them. Go into a review presupposing the reviewer to be a good-faith interlocutor. What might they have to teach you, including about your own work? Probably not nothing. Learn!

  8. Credentials will not save you. Do not use them as a crutch or as a lifeline. It doesn’t matter what letters run after your name or how many degrees hang on your wall. “Experts” write bad books all the time. No one is disregarding your training by suggesting your work needs improvement. (Don’t you agree that it does?)

  9. Do not go reading in between the lines. Do not impute to the reviewer something that he or she has not put down in black and white. Do not suggest ulterior motives; do not conjure unstated beliefs; do not make accusations of malice. Do not go on the hunt for reasons to justify yourself in the court of public opinion. Most important, do not take the review as a personal slight, as though the reviewer has done you an injustice. That is a category mistake. Reviewers may be—they are allowed to be and sometimes encouraged to be—mean, caustic, brutal, uncouth, biting, sarcastic, disparaging, dismissive. Are you surprised? Welcome to the world of writers!—just about the most insecure, miserable, miserly, skeptical, and suspicious crew around. They are not easily pleased. You are unlikely to prove an exception.

  10. If you receive a genuinely, objectively disingenuous review, a vicious piece of spite animated by everything but an unbiased evaluation of your work—then kindly ignore it. If you have the willpower, don’t read it; if you do read it, don’t give it a second’s thought, don’t share it with others, don’t write about it, don’t reply to it, don’t respond in kind. Pretend that it doesn’t exist, that it was never written. Any such review wants your blood up: that’s why it was written in the first place. Don’t give them the satisfaction. Like the sound of a tree falling in an empty forest, does an unfair, ugly review exist if the author doesn’t promote it and the world doesn’t read it? Answer: No. So don’t feed the trolls.

  11. Where do trolls live and move and have their being? On Twitter, and social media generally. The most important thing you can do, then, is to delete every one of your social media accounts, Twitter above all. I repeat: Get. Off. Twitter. Twitter is poison, but it’s an addictive poison for writers. In truth, it’s nothing but an endless diet of empty calories for attention-starved, affirmation-seeking scribblers. But it never leaves you full. It just makes you hungrier and leaves you looking the worse for wear. For not only is it a giant waste of time. Not only does it steal your focus and rob you of the capacity for sustained, thoughtful attention. Not only does it warp your sense of the world. It’s bad for your writing. Without exception, every writer who spends time on Twitter is worse for it. So the very best thing you can do, hands down, is log off entirely, and for good.

  12. But since there will, alas, continue to be writers either who suppose Twitter is good for them (I’ve never met one of these in the wild, but I’m told they exist) or (more likely) who know it’s bad but see certain benefits as personally or professionally indispensable, here’s how to navigate being reviewed on Twitter:

    1. Follow rules 1-11 above; they still apply in full.

    2. Always and only express gratitude for being reviewed at all.

    3. Share links indiscriminately, and don’t prejudice readers with passive-aggressive framing.

    4. No matter what, do not make Twitter a therapist’s couch for your wounded ego. It is impossible to overstate how sad and pitiable this is. Come feel sorry for me—a review of my work was mildly critical! It even used a mean tone and a loud voice! Unfair, am I right? Get over yourself. The very fact that your instinct is to run to Twitter or Instagram to fish for compliments and bask in your followers’ pity party is prima facie evidence that the review in question was on to something. You are earning no one’s respect, and only confirming priors you’d rather not confirm. Avoid this at all costs.

    5. Do not use any review as an opportunity to hold an online referendum on the character, integrity, or credentials of the reviewer or of the venue in which the review appeared. Remember, apart from the merits of such a question or of the quality of a particular review or of your feelings in response to it: your followers constitute an echo chamber. There is no reason to listen to anything they have to say—even more than you, they are likely to perceive written criticism as a personal affront rather than what it is: business as usual. The temptation is great, certainly if you have a bona fide following or sizable readership. But don’t give in. Resistance is not futile.

    6. The sad fact is that (a) popular authors have modeled this habit for up-and-coming writers as an unquestioned norm rather than as a cautionary tale; (b) this trend is itself the leading symptom of the poor health that besets the current writing–reviewing(–academic–journalist–publication) ecosystem; (c) following the trend, rather than avoiding it, perpetuates the very dysfunction everyone is suffering from and seeking relief from. It’s certainly true that an individual writer opting out makes only a minor difference, maybe no difference at all in the grand scheme of things. But there’s no reason to be part of the problem, once you know it’s a problem. And opting out will absolutely make a difference to you: your writing, your mental and emotional health, and much else besides. So get out while you can, if you can. You won’t regret it.


The take temptation

There is an ongoing series of essays being slowly published in successive issues of The New Atlantis that I want to commend to you. They’re by Jon Askonas, a friend who teaches politics at the Catholic University of America. The title for the series as a whole is “Reality: A Post-Mortem.” The essays are a bit hard to describe, but they make for essential reading. They are an attempt to diagnose the root causes of, and the essential character of, the new state of unreality we find ourselves inhabiting today. The first, brief essay lays out the vision for the series. The second treats the gamified nature of our common life, in particular its analogues in novels, role-playing games, and alternate reality games (ARGs). The latest essay, which just arrived in my mailbox, is called “How Stewart Made Tucker.” Go read them all! (And subscribe to TNA, naturally. I’ve got an essay in the latest issue too.)

For now, I want to make one observation, drawing on something found in essay #2.

Jon writes (in one of a sequence of interludes that interrupt the main flow of the argument):

Several weeks have gone by since you picked your rabbit hole [that is, a specific topic about which there is much chatter but also much nonsense in public discourse and social media]. You have done the research, found a newsletter dedicated to unraveling the story, subscribed to a terrific outlet or podcast, and have learned to recognize widespread falsehoods on the subject. If your uncle happens to mention the subject next Thanksgiving, there is so much you could tell him that he wasn’t aware of.

 You check your feed and see that a prominent influencer has posted something that seems revealingly dishonest about your subject of choice. You have, at the tip of your fingers, the hottest and funniest take you have ever taken.

1. What do you do?

a. Post with such fervor that your followers shower you with shares before calling Internet 911 to report an online murder.

b. Draft your post, decide to “check” the “facts,” realize the controversy is more complex than you thought, and lose track of real work while trying to shoehorn your original take into the realm of objectivity.

c. Private-message your take, without checking its veracity, to close friends for the laughs or catharsis.

d. Consign your glorious take to the post trash can.

2. How many seconds did it take you to decide?

3. In however small a way, did your action nudge the world toward or away from a shared reality?

Let’s call this gamified reinforcement mechanism “the take temptation.” It amounts to the meme-ification of our common life and, therefore, of the common good itself. Jon writes earlier in the essay, redescribing the problem behind the problem:

We hear that online life has fragmented our “information ecosystem,” that this breakup has been accelerated by social division, and vice versa. We hear that alienation drives young men to become radicalized on Gab and 4chan. We hear that people who feel that society has left them behind find consolation in QAnon or in anti-vax Facebook groups. We hear about the alone-togetherness of this all.

What we haven’t figured out how to make sense of yet is the fun that many Americans act like they’re having with the national fracture.

Take a moment to reflect on the feeling you get when you see a headline, factoid, or meme that is so perfect, that so neatly addresses some burning controversy or narrative, that you feel compelled to share it. If it seems too good to be true, maybe you’ll pull up Snopes and check it first. But you probably won’t. And even if you do, how much will it really help? Everyone else will spread it anyway. Whether you retweet it or just email it to a friend, the end effect on your network of like-minded contacts — on who believes what — will be the same.

“Confirmation bias” names the idea that people are more likely to believe things that confirm what they already believe. But it does not explain the emotional relish we feel, the sheer delight when something in line with our deepest feelings about the state of the world, something so perfect, comes before us. Those feelings have a lot in common with how we feel when our sports team scores a point or when a dice roll goes our way in a board game.

It’s the relish of the meme, the fun of the hot take—all while the world burns—that Jon wants us to see so that he, in turn, can explain it. I leave the explanation to him. For my part, I’m going to do a bit of moralizing, aimed at myself first but offered here as a bit of stern encouragement to anyone who’s apt to listen.

The moral is simple: The take temptation is to be resisted at all costs, full stop. The take-industrial complex is not a bit of fun at the expense of others. It’s not a victimless joke. It is nothing less than your or my small but willing participation in unraveling the social fabric. It is the false catharsis that comes from treating the goods in common we hope to share as a game, to be won or lost by cheap jokes and glib asides. Nor does it matter if you reserve the take or meme for like-minded friends. In a sense that’s worse. The tribe is thereby reinforced and the Other thereby rendered further, stranger, more alien than before. You’re still perpetuating the habit to which we’re all addicted and from which we all need deliverance. You’re still feeding the beast. You’re still heeding the sly voice of the tempter, whose every word is a lie.

The only alternative to the take temptation is the absolutely uncool, unrewarding, and unremunerative practice of charity for enemies, generosity of spirit, plainness of prose, and perfect earnestness in argument. The lack of irony is painful, I know; the lack of sarcasm, boring; the lack of grievance, pitiful. So be it. Begin to heal the earth by refusing to litter; don’t wish the world rid of litter while tossing a Coke can out the window.

This means not reveling in the losses of your enemies, which is to say, those friends and neighbors for whom Christ died, with whom you disagree. It means not joking about that denomination’s woes. It means not exaggerating or misrepresenting the views of another person, no matter what they believe, no matter their character, no matter who they are. It means not pretending that anyone is beyond the pale. It means not ridiculing anyone, ever, for any reason. It means, practically speaking, not posting a single word to Twitter, Instagram, Facebook, or any other instrument of our digital commons’ escalating fracture. It means practicing what you already know to be true, which is that ninety-nine times out of one hundred, the world doesn’t need to know what you think, when you think it, by online means.

The task feels nigh impossible. But resistance isn’t futile in this case. Every minor success counts. Start today. You won’t be sorry. Nor will the world.


“X is not in the Bible”

In an annual course I teach on moral philosophy I assign a textbook that contains a chapter on X. The author of the textbook is an ethicist, and the ethics he seeks to present to his readers (imagined as college students) is general or universal ethics; though he doesn’t out himself as a Kantian, those with ears to hear spy it from the opening pages. In the chapter on X the author has a sidebar dedicated to religious, by which he means Christian, arguments about X. He observes blithely that the Bible doesn’t mention X, though he allows that one or two passages have sometimes been trotted out as containing implicit commentary on X. Accordingly, he deploys a few perfunctory historical-critical tropes (without citation, naturally) to show how and why the original canonical authors in their original cultural context could never have meant what contemporary readers of the text sometimes take them to mean with respect to X.

I always dedicate time in class to discuss this sidebar with students. It is a perfect encapsulation of the naive inanity of non-theological scholars commenting on Christian thought. So far as I can tell the author is utterly sincere. He really seems to think that Christian thought, whether moral or doctrinal, is reducible to explicit assertions in the Bible, double-checked and confirmed by historical critics to have been what the putative author(s) could have or likely would have meant by the words found in a given pericope.

I used to think this sort of stupidity was willful and malicious; I’ve come to see, however, that it is honest ignorance, albeit culpable in the extreme.

A few days ago I was reminded of this annual classroom discussion because I read an essay by a scholar I otherwise enjoy and regularly profit from, who used the exact same argument, almost identically formulated. And he really seems to have meant what he wrote. That is, he really seems to believe that if he—neither a Christian nor a theologian nor a scholar of religion nor a religious person at all—cannot find mention of X in the Bible, then it follows as a matter of course that:

  1. Christians have no convictions about X;

  2. Christians are permitted no convictions about X, that is, convictions with a plausible claim to be Christian;

  3. no Christian teaching about X exists, past or present; and

  4. Christianity as such neither has, nor has ever had, nor could in principle ever have (or have had), authoritative doctrinal teaching on X.

All this, because he, the erudite rando, finds zero results when he does a word search for “X” on Biblegateway.com.

So far as I can tell, this ignorance-cum-stupidity—wedded to an eager willingness to write in public on such matters with casual authority—is widespread among folks of his ilk. They are true believers, and what they truly believe in is their own uninformed ineptitude.

The answer to the riddle of what’s going on here is not complicated. Anti- or post-Christian scholars, writers, and intellectuals in this country who spurn theological (not to mention historical) learning—after all, we don’t offer college courses in alchemy or astrology either—are sincerely unaware that American evangelicalism in its populist form is not representative of historic Christianity. They don’t realize that the modernist–fundamentalist debate is itself a uniquely modern phenomenon, and thus bears little relationship either to what Christianity is or to what one would find in Christian writings from any period from the second century to the seventeenth. They don’t know what they don’t know, and they’re too incurious to find out.

Were they to look, they would discover that Christianity has a living body of teaching on any range of topics. They would discover that over the centuries Christianity has had a teaching office, whose ordained leaders speak with varying degrees of authority on matters of pressing interest, including moral questions. They would discover that, in its acute American form, radical biblicism—the notion that Christians have beliefs only about things the Bible addresses directly and clearly—is one or two centuries old at most. They would discover that, even then, said biblicism describes a vanishingly small minority of global Christianity today. They would discover that the modernism on offer in Protestant liberalism is but the mirror image of fundamentalism, and therefore that to ape claims like “X isn’t even in the Bible—QED,” even intended as secular critique of conservative Christians, is merely an own goal: all it reveals is one’s own historical and cultural parochialism and basic theological incomprehension. They would discover that the church has never read the Bible the way either fundamentalists or historical critics do, in which case the word-search proof-text slam-dunk operation is not only irrelevant; in light of exegetical and theological tradition, it is liable to induce little more than a suppressed snort laugh.

They would discover, in a word, that the Bible does contain teaching about X, because the Bible contains teaching about all things (you just have to know where to look, that is, how to read); that the church’s tradition likewise contains considerable and consistent teaching about X, as any afternoon in a library or quick Google search would reveal; that Christianity is a living, not a dead thing; that Christian moral doctrine did not fossilize with the final breath of the last apostle; that postwar American evangelicalism is not the center of any universe, much less the Christian church’s.

They would discover—rather than learning the hard way—that asking someone in a position to know before writing about something of which one is wholly ignorant is a wise and generally admirable habit. But then, owning the fundies is a lot harder to do if you treat them as adults worthy of respect. This way is much more fun.

It’s all just a game anyway, right?


On blissful ignorance of Twitter trends, controversies, beefs, and general goings-on

Being off Twitter continues to be good for my soul as well as my mind, and one of the benefits I'm realizing is the ignorance that comes as a byproduct. By which I mean, ignorance not in general or of good things but of that which it is not beneficial to know.

When you're on Twitter, you notice what is "trending." This micro-targeted algorithmic function shapes your experience of the website, the news, the culture, and the world. Even if it were simply a reflection of what people were tweeting about the most, it would still be random, passing, and mass-generated. Who cares what is trending at any one moment?

More important, based on the accounts one follows, there is always some tempest in a teacup brewing somewhere or other. A controversy, an argument, a flame war, a personal beef: whatever its nature, the brouhaha exerts a kind of gravitational pull, sucking us poor online plebs into its orbit. And because Twitter is the id unvarnished, the kerfuffle in question is usually nasty, brutish, and unedifying. Worst of all, this tiny little momentary conflict warps one's mind, as if anyone cares except the smallest of online sub-sub-sub-sub-sub-sub-sub-"communities." For writers, journalists and academics above all, these Twitter battles start to take up residence in the skull, as if they were not only real but vital and important. Articles and essays are written about them; sometimes they are deployed (with earnest soberness) as a synecdoche for cultural skirmishes to which they bear only the most tangential, and certainly no causal, relationship.

As it turns out, when you are ignorant of such things, they cease to weigh on your mind at all, because they might as well not have happened. (If a tweet is dunked on but no one sees it, did the dunking really occur?) And this is all to the good, because 99.9% of the time, what happens on Twitter (a) stays on Twitter and (b) has no consequences—at least for us ordinary folks—in the real world. Naturally, I'm excluding, say, tweets by the President or tweets that will get one fired. (Though those examples are just more reasons not to be on Twitter: I suppose if all such reasons were written down, even the whole world would not have room for the books that would be written.) What I mean is: the seemingly intellectually interesting tweet-threads and Twitter arguments are almost never (possibly never-never) worth attending to in the moment.

Why? First, because they're usually stillborn: best not to have read them in the first place; there is always a better use of one's time. Second, because, although they feel like they are setting the terms of this or that debate, they are typically divorced from said debate, or merely symptoms of it, or just reflections of it: but in most cases, not where the real action is happening. Third, because if they're interesting enough—possibly even debate-setting enough—their author will publish them in an article or suchlike that will render redundant the original source of the haphazard thoughts that are now well organized and digestible in an orderly sequence of thought. Fourth and finally, because if a tweet or thread is significant enough (this is the 0.1% leftover from above), someone will write about it and make known to the rest of us why it is (or, as the case may be, is actually not) important. In this last case, there is a minor justification for journalists not to delete their Twitter accounts; though the reasons for deletion are still strong, they can justify their use of the evil website (or at least spending time on it: one can read tweets without an account). The rest of us can find out what happened on the hellscape that is Twitter in the same way we get the rest of our news: from reputable, established outlets. And not by what's trending at any one moment.

For writers and academics, the resulting rewards are incomparable. The time-honored and irrefutable wisdom not to read one's mentions—corrupting the mind, as it does, and sabotaging good writing—turns out to have broader application. Don't just avoid reading your mentions. Don't have mentions to read in the first place.