Resident Theologian


Brad East

Carr, Sacasas, and eloquent reality

A long reflection on an essay by Nicholas Carr engaging L. M. Sacasas about enchantment, reality, and contemplation.

Any list of the ten best living writers on technology in the English language would include Nicholas Carr and L. M. Sacasas. Yesterday the two came together in an essay I can’t get out of my head.

The essay in question is the third in a series called “Seeing Things” on Carr’s Substack “New Cartographies.” Titled “Contemplation as Rebellion,” it continues Carr’s reflections on the nature of perception in a digital age. Perception is both neurological and social; it is a mediated phenomenon; it can be done well or poorly, deeply or cheaply. And works of art, especially visual art like paintings and engravings, have the power to call forth the kind of attention that repays time, energy, focus, and affection.

Interwoven with these reflections is Carr’s intervention in the “enchantment” discourse, one I have myself dipped into more than a few times (especially in conversation with Alan Jacobs). In yesterday’s essay, following meditations on Heaney and Hawthorne, Carr turns to something Sacasas wrote last August titled “If Your World Is Not Enchanted, You’re Not Paying Attention.” He begins with an excerpt from Sacasas:

This form of attention and the knowledge it yields not only elicits more of the world, it elicits more of us. In waiting on the world in this way, applying time and strategic patience in the spirit of invitation, we draw out and are drawn out in turn. As the Latin root of attention suggests, as we extend ourselves into the world by attending to it, we may also find that we ourselves are also extended, that is to say that our consciousness is stretched and deepened.

Here is Carr’s response, which ends his own essay and which I quote at length:

Even as I find Sacasas’s essay inspiring, I find it troubling. The way he frames the contemplative gaze as a means of re-enchantment makes me uncomfortable. An enchanted world is, by definition, a world that presents a false front to us — a front composed of what Sacasas terms, at the end of his essay, “mere things.” To see what’s really there in an enchanted world, you need to see beyond or through the surface. You need to discover what’s hidden, what’s concealed, by the merely material form, and that requires something more than sensory perception. It requires extrasensory perception. In this framing, the contemplative gaze is not just unlocking what lies untapped within us — the powers of perception, imagination, interpretation — but also exposing some spiritual essence that lies hidden within the object of the gaze.

The issue I take with Sacasas’s essay is not a matter of sense — I’m pretty sure we’re talking about the same perceptual phenomenon — but of wording. When he suggests that “enchantment is just the measure of the quality of our attention,” he’s muddying the waters. When we look at the quality of attention demonstrated by Heaney, Muñoz, and Hawthorne, we’re not seeing enchantment. We’re seeing an exquisite openness to the real. A sense of wonder does not require a world infused with spirit. The world as it is is sufficient. The reason the wording matters here is simple. What bedevils our perceptions today isn’t a lack of enchantment. It’s a lack of reality.

“Things change according to the stance we adopt towards them, the type of attention we pay to them,” Iain McGilchrist wrote in The Master and His Emissary. He’s right, but it’s important to recognize that the changes take place in the mind of the observer not in the things themselves. The things, whether works of art or of nature, have a material integrity that’s independent of our own thoughts and desires, and the stance we adopt toward them should entail a respect for that integrity.

The desire to re-enchant the world may seem like an act of humility, a way of paying tribute to the world’s unseen powers, but really it’s the opposite, an act of hubris. In demanding that the world hold greater meaning for us, that it be a reservoir for the fulfillment of our own spiritual yearnings, we are attempting yet again to impose our will on the world, to turn its myriad material forms to our own purposes, to make it our mirror. Whatever enchantment may once have been, re-enchantment is a power play.

It’s interesting that, in the English language, we have enchantment, disenchantment, and re-enchantment. What we don’t have is unenchantment. A state of disenchantment is by definition a state of loss, one that begs to be remedied by a process of re-enchantment. A state of unenchantment presumes no loss and requires no remedy. It is a state that is entirely happy with the thinginess of things. So let me, by fiat, introduce unenchantment into the language. And let me suggest that the contemplative gaze is best when it is an unenchanted gaze.

There is much to unpack here. Before I respond, let me be clear that nothing whatsoever hangs on the use, retention, or recovery of the term “enchantment” and its many variations. This entire conversation could be held, and all that technologists, philosophers, critics, and theologians want to say about it could be said, without Weber’s Entzauberung or any of its translations. Weber, for his part, was seeking to offer a sociological description of an epochal cultural change. Whatever the merit of his description, neither the concept nor the term nor its denial bears on the substance of the arguments that Sacasas and Carr make above.

I take Carr to be objecting to a spiritually charged material reality for at least five reasons. First, it is reductive; things become “mere” things. Second, it is narcissistic; things must be what I want or need them to be to have value, in themselves or for me. Third, it is coercive; it imposes upon things what they evidently lack. Fourth, it is ungrateful; it fails to receive things as they are and thus to attend to them with the care they deserve. Finally, it is unreal; it substitutes my subjectivity for the stubborn objectivity of the thing before me. No longer am I interacting with some material item of the phenomenal world; instead, I am playing with projections upon the screen of my mind.

These are all valid and useful worries; no doubt they have a legitimate target. I don’t think Carr’s comments are an adequate response to Sacasas, however, or a successful critique of the broader view of enchanted perception that Sacasas is seeking to represent. In part there seem to be some misunderstandings between them. But perhaps more than any serious misunderstanding there is simple, unbridgeable disagreement. That disagreement, in turn, reverses the terms of the reproach: it is Carr, not Sacasas, who makes the world into a mirror.

More on that later; for now, consider definitions.

Carr opens by saying that an “enchanted world is, by definition, a world that presents a false front to us.” This is an unfortunate way to begin. Let me offer an alternative. At a minimum, an enchanted world is one that is full of life, intelligence, events, experiences, agents, and phenomena that exceed the capacity of secular, instrumental reason—especially the “hard” sciences—to measure, name, calculate, contain, control, or grasp. For Christians, the word for such a world is simply “creation.” But creation is not a false front. There may be more than what you or I can measure or glimpse, but there is not less. Creation is artifice in the sense that there is an artificer; it is not artifice in the sense that it is a façade.

Carr writes: “When we look at the quality of attention demonstrated by Heaney, Muñoz, and Hawthorne, we’re not seeing enchantment. We’re seeing an exquisite openness to the real. A sense of wonder does not require a world infused with spirit. The world as it is is sufficient.” These claims are all question-begging. What if openness to the real discloses to one’s awareness a deeper reality than one previously supposed to be true or possible—a reality not limited to one’s consciousness but objectively existent in the very thing one is contemplating, antecedent to one’s act of contemplation? Whether wonder requires a world infused with spirit is beside the point; it’s a hypothetical we aren’t in a position to answer. The question instead is whether this world is in fact suffused with spirit. To call a spirit-less world “the world as it is” begs the question, therefore, because we cannot and do not know a priori that the world lacks spirit, or that the spirit it manifests to so many in such a variety of ways is contained without remainder in the mind.

Carr is right to insist on respecting the integrity of the things of the world and of the world itself. Things aren’t playthings, and when we reduce the former to the latter both we and they are diminished as a result. So let me avoid the generic and embrace the particular. What follows is a specifically Christian account of why, in Gerard Manley Hopkins’ words, seeing the world as charged with the grandeur of God is not a failure to attend to the thisness of things.

Hopkins is a good person to start with, as it happens, given his emphasis on “inscape” or the proper “thisness” of created things, drawing on John Duns Scotus’s haecceitas. Each thing is just what it is; it isn’t anything else. It is the particular thing God made it to be, and it is this precisely in virtue of its relation to God the Creator, to his creative power and good pleasure. To, in a word, his delight.

The doctrine of creation extends this notion to anything and everything in existence. Material objects, then, are not windows we will one day raise (much less smash) in order to see “true” reality more clearly. Nor are they akin to Wittgenstein’s ladder, necessary to climb but kicked over once used. Nor still are they masks donned to deceive us or allegories that, in pointing to what they are not, exhaust themselves in their reference (somewhat like the self-destructing tapes of Mission: Impossible fame).

No, the Christian doctrine of creation teaches that the surfaces of the world contain depths and that seemingly silent things have a voice. They speak. They sing, in fact. Reality, in the words of Albert Borgmann, is eloquent. Significance in the broadest sense is therefore not only a product or property of the conscious human mind; it belongs to the things of the world prior to my contemplating them and emerges, intelligibly and fittingly, in the encounter between us.

Two concepts govern this theological perspective, each centered on the incarnation. The reason why is straightforward: the man Jesus is fully and utterly human without being merely human. He is more than human, but he is not less. Nothing in one’s phenomenal experience of Jesus’s humanity—nothing measurable by observation, analysis, or a thousand scientific tests—would tell you anything about who he is, only what he is: namely, a human being and, in that respect, like any other. Yet this man is God. Who he is is thus hidden from view.

Are we back, then, to the “false front” of Carr’s worries? By no means.

On one hand, Jesus’s humanity is not a fiction; it is not like the façades of Petra, which appear to be exteriors of magnificent temples yet contain nothing on the inside. Jesus’s humanity is, apart from sin, like yours and mine in every way. He really is a human being, and his humanity is not a temporary meat-suit he sloughs off at the Ascension. Jesus is human forever.

On the other hand, Jesus’s divinity is not opposed to his humanity. He is neither a hybrid nor a shell in which two competing principles vie for space. In all his actions, in all he says and suffers, he does so as God and man, divinely and humanly. Indeed, part of the revelation of the God-man is that God can be man without contradiction. Contra John Hick, the incarnation is not a square circle.

The most common patristic image for this reality comes from Scripture: the burning bush. The divinity of Jesus suffuses and saturates the humanity of Jesus without consuming it. This in turn came to govern the fathers’ view of the sacraments, the Eucharist above all. Anthony Domestico draws this out in a review of Paul Mariani’s biography of Hopkins:

Mariani is most affecting when describing what he calls Hopkins’s idea of “thisness—the dappled distinctiveness of everything kept in Creation.” He links Hopkins’s concept of inscape and instress to the poet’s abiding devotion to the Eucharist. Hopkins was drawn to Catholicism, Mariani suggests, through the doctrine of the Real Presence, “God dwelling in things as simple as bread and wine … the logical extension of God’s indwelling among us.” His poetry and his religion are necessary to one another: Hopkins was the poet he was because of his Catholic understanding of the Eucharist, and he was the Catholic he was because of his poetic apprehension of reality.

To be sure, the world is not a sacrament per se; yet a sacramental logic applies to creation in virtue of its status as created. In this way the sacraments help to explain how creation can be just what it is and, in the language of Alexander Schmemann, an epiphany of its Creator. It seems to me that Carr and other critics of (at least a certain Christian style of) enchantment substitute an “or” for the “and,” seeing the former as necessary and the latter as impossible. For Christians, it is the incarnation that demonstrates the truth and thus the possibility of the “and.”

The second concept that enters here is typology, or the use of “figure” in reading Scripture. The most famous study is Erich Auerbach’s Mimesis. He rightly argues there that the “types” or “figures” of the biblical narrative are not extinguished by their trans-local, trans-personal, trans-temporal signification. The fact that David figures Christ, or somehow mysteriously points forward to him, confirms and upholds his unique historicity; it does not obliterate it. Here is how Paul Griffiths puts it:

One event or utterance figures another when, while remaining unalterably what it is, it announces or communicates something other than itself. Eve’s assent to the tempter and her consequent taking of the forbidden fruit from the tree figures, in this sense, Mary’s fiat mihi in response to the annunciation and the consequent incarnation of the Lord in her womb. The second event—the figured—encompasses and includes the first, without removing its reality. The first—the figuring—has its reality, however, by way of participation in the second. This is in the order of being. Ontological figuration may, however, be replicated at the level of the text, and in scripture it inevitably is.

Put bluntly, figuralism falls apart if the human figures of history recorded by Scripture are neither truly human nor truly historical. It is exactly in their three-dimensional, irreducible humanity and historicity—their personal haecceity—that they “figure” Christ in advance of his advent. Saint Augustine writes in De Doctrina Christiana that humans signify with signs but God signifies with both signs and things. Salvation history, inscribed in Scripture, is thus the grand narrative of all creation, at once told by humans through written signs and told by God through created things—including the lives of human beings themselves, both their words and their deeds.

In sum, both typology and sacramentology manifest the logic embodied in the incarnation: a simultaneous affirmation of the goodness and thisness of creation in all its parts and of creation’s capacity to communicate, signify, or otherwise mediate depths of reality not immediately evident on the surface of things. “Re-enchantment,” as I see it, is one way to describe a Christian reassertion or recovery of this way of understanding and inhabiting the world. Carr acknowledges that such re-enchantment “may seem like an act of humility, a way of paying tribute to the world’s unseen powers, but really it’s the opposite, an act of hubris.” Why? “In demanding that the world hold greater meaning for us, that it be a reservoir for the fulfillment of our own spiritual yearnings, we are attempting yet again to impose our will on the world, to turn its myriad material forms to our own purposes, to make it our mirror. Whatever enchantment may once have been, re-enchantment is a power play.”

Whatever the truth of this critique applied to other types of (re-)enchantment, I hope I’ve made clear by now why it doesn’t apply to the Christian variety. Christian attention to the world and to things as the creation of God makes no demands, imposes no extrinsic meaning, bends nothing to our will to power or pleasure. It is a response (bottom up) to what we discover the world and its things to be, in themselves apart from and prior to us, just as it is a quest (top down) to see the world and its things as we have been told by God they in fact are. In the words of Psalm 19:

The heavens are telling the glory of God;
and the firmament proclaims his handiwork.
Day to day pours forth speech,
and night to night declares knowledge.
There is no speech, nor are there words;
their voice is not heard;
yet their voice goes out through all the earth,
and their words to the end of the world. (vv. 1-4)

The claim of the psalmist is that, in reality, the voice-that-is-no-voice and the words-that-are-no-words speak—are speaking, at all times, even now—whether or not we have ears to hear them. We do not imagine or construct what they say; we hearken to what they have to say to us. This is why Wendell Berry is so obstinate in his unfashionable insistence that the meaning humans find, whether in art or in the natural world, is just that: discovered, not created. Franz Wright captures the point well in his poem, “The Maker”:

The listening voice, the speaking ear

And the way, always, being
a maker
reminds:

you were made.

Berry himself puts it this way in a 1987 essay:

[Consider the concept] of artistic primacy or autonomy, in which it is assumed that no value is inherent in subjects, but that value is conferred upon subjects by the art and the attention of the artist. The subjects of the world are only “raw material.” As William Matthews writes in a recent article: “A poet beginning to make something needs raw material, something to transform.” For Marianne Moore, he says,

subject matter is not in itself important, except that it gives her the opportunity to speak about something that engages her passions. What is important instead is what she can discover to say.

And he concludes:

It is not, of course, the subject that is or isn't dull, but the quality of attention we do or do not pay to it, and the strength of our will to transform. Dull subjects are those we have failed.

This apparently assumes that for the animals and humans who are not fine artists, who have discovered nothing to say, the world is dull, which of course is not true. It assumes also that attention is of interest in itself, which is not true either. In fact, attention is of value only insofar as it is paid in the proper discharge of an obligation. To pay attention is to come into the presence of a subject. In one of its root senses, it is to “stretch toward” a subject, in a kind of aspiration. We speak of “paying attention” because of a correct perception that attention is owed—that, without our attention and our attending, our subjects, including ourselves, are endangered.

Mr. Matthews’ trivializing of subjects in the interest of poetry industrializes the art. He is talking about an art oriented exclusively to production, like coal mining. Like an industrial entrepreneur, he regards the places and creatures and experiences of the world as “raw material,” valueless until exploited.

Such an approach to “things” is, I recognize, just what Carr opposes. But the irony, and therefore the danger, is that Carr’s approach threatens to join hands with Matthews against Berry—as well as against Borgmann, Schmemann, Augustine, Wright, Hopkins, and Sacasas. (A formidable crew!)

Recall Carr’s modification of McGilchrist’s claim, “Things change according to the stance we adopt towards them, the type of attention we pay to them.” Carr writes, “He’s right, but it’s important to recognize that the changes take place in the mind of the observer not in the things themselves. The things, whether works of art or of nature, have a material integrity that’s independent of our own thoughts and desires, and the stance we adopt toward them should entail a respect for that integrity” (emphasis mine). It is crucial to see that the last sentence is a non sequitur. Enchanted, disenchanted, and unenchanted alike agree that all things possess a certain integrity (material and otherwise) independent of our thoughts and desires and that our relation to things ought to show respect for that integrity.

As a result, however, does Carr’s proposal not end up throwing us back into the cage of consciousness? Are not things thereby reduced to a mirror, in which we see not things but our thoughts about things? Are not things now become playthings in the inner theater of the imagination? So that I am no longer contemplating the thisness of what lies before me, but projecting it from a variety of angles—with countless filters and settings tried and tested—on the screen of my mind?

Consider Carr’s own words:

To see what’s really there in an enchanted world, you need to see beyond or through the surface. You need to discover what’s hidden, what’s concealed, by the merely material form, and that requires something more than sensory perception. It requires extrasensory perception. In this framing, the contemplative gaze is not just unlocking what lies untapped within us — the powers of perception, imagination, interpretation — but also exposing some spiritual essence that lies hidden within the object of the gaze. (emphasis mine)

So far as I can tell, the last sentence puts the shoe on the other foot. With respect to the contemplative gaze, what Carr seems to want is not for the conscious human mind to encounter an object as it is, much less to penetrate to its inexhaustible depths, but to double back on itself, thereby “unlocking what lies untapped within us—the powers of perception, imagination, interpretation” (emphasis, again, mine). It follows that, for Carr, “unenchanted” contemplation is not finally about the object in its independent objectivity but about the subject exercising his unfathomably creative subjective powers. Perception is turned inside out. Attention transforms into solipsism, even narcissism. What I see is ultimately about me, the one seeing, and what I choose or want to see. What is important is no longer the object interpreted but the change induced in the interpreter by his powers of interpretation.

This epistemic loop is just what Sacasas was worried about in his original essay. Following the work of Jane Bennett, Sacasas writes that we find ourselves “trapped in a vicious circle. Habituated against attending to the world with patience and care, we are more likely to experience the world as a mute accumulation of inert things to be merely used or consumed as our needs dictate.” He goes on:

And this experience in turn reinforces the disinclination to attend to the world with appropriate patience and care. Looking and failing to see, we mistakenly conclude there was nothing to see.

What is there to do, then, except to look again, and with care, almost as a matter of faith, although a faith encouraged by each fleeting encounter with beauty we have been graced to experience. To stare awkwardly at things in the world until they cease to be mere things. To risk the appearance of foolishness by being prepared to believe that the world might yet be enchanted. Or, better yet, to play with the notion that we might cast our attention into the world in the spirit of casting a spell. We may very well conjure up surprising depths of experience, awaken long dormant desires, and rekindle our wonder in the process. What that will avail, only time would tell.

Carr is understandably worried that the “mere” in “mere things” suggests that things as they are are inadequate unless and until we impose on them a higher meaning suited to our needs, a weightier significance than they themselves can bear. Such an imposition both weighs them down and occludes their actual significance. What Sacasas has in mind, though, is the “raw material” of “industrial art,” the instrumental reason that sees things as nothing more than what they appear to be, nothing more, therefore, than their constituent elements. On such a view, what a thing is is what it is made of, which is only one step away from the constructivist view that what a thing is is whatever I make of it. In the words of David Graeber, “The ultimate hidden truth of the world is that it is something we make and could just as easily make differently.”

Sacasas is right to delineate an alternative. I don’t know whether he’d call it the Christian alternative, but I will. I’ve spent many words outlining it in detail, so let me close here by summarizing it by contrast to both Graeber and Carr.

Regarding Graeber, his radical constructivism fails to approach and attend to the world in its thisness, in its independence and integrity apart from and prior to us, and for this reason fails to receive it as the gift that it is. With this critique I think Carr is in agreement.

As for Carr, however, his own view falls between two stools. Theoretically, it lacks sufficient metaphysical grounding to anchor reality—both its thisness and its givenness—while practically, it terminates in a contemplation that is curved in on itself. Whether the result is modern in a Kantian mode or postmodern in a Graeberian mode matters little.

To be clear, my claim is not that Christians alone can or do attend to the world as it is or that Christian enchantment (what I’m calling the church’s doctrine of creation) is the only viable, coherent, or dominant version on offer. It is instead that Carr’s critique falls short in relation to a properly Christian account of creation, contemplation, and haecceity. And it is this account that I understand Sacasas to be explicating and defending in his recommendation of seeing the world as always already enchanted, if only we take the time to pay it the attention it deserves.

Brad East

On the phone with Google

A lightly edited transcript of a very special conversation.

Me: I can't sign in to my account.
Them: Go [here].
Me: Yes, I'm there.
Them: Sign in.
Me: Yes, I did. It’s asking to verify my identity.
Them: Click to verify.
Me: Yes, that's why I'm calling. I can't verify with these options.
Them: Which options?
Me: "Open the Gmail app in your phone."
Them: Yes, verify on your Gmail app.
Me: I don't have the Gmail app on my phone.
Them:
Me: And I don't want to download it either.
Them:
Me:
Them: Click for another option.
Me: Yes, that's why I called. The only other option is "Click 'Yes' on your iPhone or tablet."
Them: Yes, click that.
Me: But there is nothing to click. I don't have a tablet and there is no "Yes" on my phone; I don't have a Google app downloaded on my phone.
Them:
Me: And I don't want to download it either.
Them:
Me:
Them: Click for another option.
Me: Yes, that's why I'm calling. When I click for another option, the only other option is "Open the Gmail app on your phone."
Them: Yes, verify on your Gmail app.
Me: Remember, I don't have the Gmail app on my phone.
Them:
Me: And I don't want to.
Them:
Me:
Them: Click for another option.
Me: Remember, the only options both involve an app on my phone.
Them: What about your Google email address?
Me: Yes, I have one of those. I'm currently signed in on this browser. But even though I'm signed in, it's asking me to verify my identity because I need to update my payment preferences.
Them: Good. Now use your email to log in.
Me: I can't, because my browser is asking me to verify on my phone, and I don't have email on my phone.
Them: You don't have email on your phone?
Me: I don't have email on my phone.
Them:
Me:
Them:
Me:
Them: Please hold the line while I transfer your call.

Brad East

My latest: how to raise readers, in Front Porch Republic

A link to my latest essay.

This morning Front Porch Republic published an essay of mine called “How to Raise Readers, in Thirty-Five Steps.” It’s a list of, as the title suggests, thirty-five things for parents to do to raise their children to be readers, with running commentary. It’s fun and light-hearted in tone, certainly the most “advice-y” piece I’ve ever written. Here are the first four paragraphs, before the list proper gets going:

It is not too much to say that everything in our culture pushes against habits of deep reading. Our ears are filled with noise, our eyes are stuck on screens, and our attention is scattered and distracted by a thousand entertainments.

Parents and teachers are worried; I’m both a dad and a professor, and I’m very worried. My worry increases when I think about handing on the faith. Not every believer needs to be literate, much less a casual reader of Dante or Milton. But Christian faith is irreducibly wordy, its details and contours forever fixed in the complex texts of Holy Scripture and sacred tradition. Readers are interpreters, if not by their eyes then by their ears, and bad interpreters can do a lot of damage.

Indeed, the very habits that sustain deep reading are crucial for sustaining prayer. If I lack the attention to keep my eyes on a page I can see, how can I have the attention to keep my heart on a God I cannot see? Reading is not necessary for prayer, but it is one helpful training ground for it.

Is it possible, then, to raise readers in a digital age? I think so. I’ve got four kids, two boys then two girls, who range from sixth grade to first. I can’t say I’ve done much well, but I have raised readers. Every child is different, and aptitude and opportunity both matter greatly. Nevertheless, within varying limits, there are certain things parents can do to make it more likely that their children will learn good reading habits—even become lifelong readers themselves. Here are the ones that have worked well for our family.

Click here to read the rest. I welcome other suggestions!

Brad East

The greatest threat facing the church today

Thinking out loud about answers in response to this question.

In my latest piece for Christianity Today, I propose the following thesis:

The greatest threat facing the church today is not atheism or secularism, scientism or legalism, racism or nationalism. The greatest threat facing the church today is digital technology.

That’s a controversial claim for many reasons, and I’m not dogmatic about it. It could be wrong. Moreover, it’s not self-evident that there is a meaningful hierarchy of threats facing the church. Perhaps there are a handful, all on the same level; or a variety that are incommensurable. Finally, a year or two back Alan Jacobs and Andy Crouch took me (ever so kindly) to task for a claim like this one, proposing instead that Mammon, not Digital, is the principal threat; and, further, that Digital is a wholly owned subsidiary of Mammon.

With those caveats in place, what are the candidates for this particular category? What are the most significant threats facing the church today? By what measures should we judge them? And which church, or churches, or regions and cultures of the world, should we have in mind?

The range of answers would at least need to be large enough and systemic enough to threaten millions of believers at once, and in insidious and powerful ways difficult to suss out and extinguish. In the excerpt above I mention some “isms” that people are worried about. Let’s expand that list:

  • Capitalism

  • Progressivism

  • Liberalism

  • Secularism

  • Atheism

  • Scientism

  • Legalism

  • Racism

  • Nationalism

  • Imperialism

  • War-mongering

  • Industrialism

  • Environmentalism

  • Utilitarianism

  • Individualism

  • Nihilism

  • Anti-natalism

  • Technophilia

  • Thanatophilia (i.e., the culture of death)

The important thing to see is that the nature of the threat doesn’t consist in discrete events or even types of events—famine, plague, poverty, war. These are evils and cause mass suffering, but they aren’t threats to the church, at least not in the way I’m using the term. These and other trials the church will always have with her. They’re part of the way of the world, the world we long for God to redeem. They aren’t systems or structures or ideologies perpetrated by human beings (except when they are—but they are rarely reducible to ideology or policies, for the simple reason that they are insoluble, perennial problems of finite, mortal existence in a fallen world). More to the point, in the midst of great suffering the church sometimes rises to the occasion in service, courage, and sacrifice. In the face of danger, damage, and pain the church can fail, falter, or flourish. But she can’t be what God calls her to be if she isn’t prepared—if, that is, her foundations are so eroded that she forgets her own reason for being.

It is the question of what enacts such erosion that I am naming with the language of “threat.” A major threat to the church would snuff out its life whether it was the best of times or the worst of times; it would silence the gospel before anyone could hear it or live it out at all.

Another way to put it would be to ask, as I did recently, what idol or idols a given generation or place or people worships, and why, and what counterfeit blessings it receives in return, and how its worship and what it receives in turn shape and form it in the image of said idol(s).

I’m far from dogmatic on this question, as I said at the outset. If I had to pick five, I suppose I would choose technophilia, individualism, utilitarianism, capitalism, and progressivism. But then, how many of these are birthed from or contained within liberalism, understood as the ideology developed and advanced in the eighteenth and nineteenth centuries? Not to mention scientism, which arguably is concomitant with both liberalism and utilitarianism and, later, with the love of the future that finds concrete expression in progressivism and technophilia.

And is Mammon then the devilish father of them all? I leave the question open for others to chime in.

Update (seconds after pressing publish): I realize that I did not specify that I am here thinking exclusively about exogenous threats—if I were put on the spot about internal threats, I might say that church division is the single greatest threat to the church’s integrity and to the credibility of the gospel she proclaims to the world. Not in view here!

Read More
Brad East Brad East

My latest: a plea for screen-free church, in CT

A link to my new piece on screen-free worship for Christianity Today.

I’m in Christianity Today arguing for screen-free church; here are the opening paragraphs:

Some years ago, author Hal Runkel trademarked a phrase that made his name: screamfree parenting. It’s a memorable term because it captures viscerally what so many moms and dads want: parenting without the volume turned up to 11—whether of our kids’ voices or our own.

I’d like to propose a similar phrase: screen-free church. It’s a vision for an approach to Christian community and especially public worship that critically assesses and largely eliminates the role of digital devices and surfaces in church life. But the prescription depends on a diagnosis, so let me start there.

Consider the following thesis: The greatest threat facing the church today is not atheism or secularism, scientism or legalism, racism or nationalism. The greatest threat facing the church today is digital technology.

Read the rest here.

Read More
Brad East Brad East

Boys and video games in different stages of life

Thinking about the place of video games in boys' lives: preteen, teens, twenties, and thirties.

Update: I’m told this entire post is the subject of Mere Fidelity’s August 27 episode with Andy Crouch (called “Put Social Media in Its Place”). Hand over heart, I had not listened to it when I wrote this piece and still have not listened to it. The relevant question now is whether my friend had listened to it or whether, more intriguingly, he is the next Andy Crouch. My bet is on the latter.

*

A friend made a remark the other day that I want to expand on here.

He commented that there’s an important difference between teenage girls’ relationship to social media, on one hand, and teenage boys’ relationship to video games, on the other. In the former case, social media both creates and exacerbates all kinds of antisocial problems: friend drama, FOMO, anxiety, depression, loneliness, eating disorders, body image issues, lack of self-esteem, and the rest. In the latter case, there appears to be very little of this sort of thing; the effects are, on the whole, neutral or benign, especially if the boys in question have a relatively healthy home life and a diverse “activities” portfolio: sports, reading, board games, outdoor exploration, camping, rough-housing, sleepovers, church, school, youth group, and more.

At the same time, much of our public discourse about technology, gender, and social ills focuses—rightly—on video games. Why?

Two reasons. First, video games can absolutely become an addiction, a mono-activity that swallows up all the other options in the healthy array listed above (together, that is, with YouTube and pornography). Second, video games’ antisocial effects play out in disordered male lives not primarily in preteen and teenaged lives, but when boys grow up: in their twenties and thirties.

As a matter of fact, my friend pointed out, so far as he could tell, his sons’ gaming habits were embedded in and reinforced a broadly healthy network of social relationships. Gaming didn’t pull them out of friendship and face-to-face activities but drew them further into them.

I think he’s right. It’s not something I’d considered in depth before, though, so a few thoughts.

First, this resonates with my own experience. I played Nintendo, Sega, and PlayStation from early elementary through the end of high school, and they were for the most part heavily social experiences. Even when the game was one-player, I either played while buddies watched (and vice versa—always providing running commentary) or consulted constantly with friends who were also playing the same game at the same time (The Ocarina of Time, say, or Metal Gear Solid). I even subscribed to multiple gaming magazines, which means that my gaming habits encouraged the regular reading of print media!

Second, this view resonates with my observations of my own boys. What they want to do above all is play with their friends, whether their friends are in the room (Smash Bros or Gang Beasts) or online (Fortnite or … Fortnite). When they see their friends, they talk about when they played together the day before and immediately plan times to play with one another later that day or weekend. When they have birthday parties, they all congregate in the same room and find ways to play (Deo volente) for hours on end. I recall a middle school birthday party when I did the same thing, with a house set up with multiple TVs and a round robin NFL Blitz tournament. Again: social, not antisocial.

Third, the key component here is that gaming time isn’t unlimited and doesn’t descend into the dark abyss of late nights and endless, lonely play. You don’t have to tell me that there are households with no limits on screen time. But assuming there are limits, and the limits are real, and the boys in question really do spend much or most of their waking hours not gaming but swimming and jumping on the trampoline and playing Risk and reading epic fantasy and playing foosball and climbing trees and riding bikes around the neighborhood and walking the dog and shooting hoops and, and, and … then I’m just not that worried about the presence of video games in the lives of boys in middle and high school.

Fourth, however, life doesn’t end at eighteen or twenty-two. What my friend’s remark also brought to mind was that the challenge of video games and young men in our culture is not pre- but post-graduation (whether graduation here refers to high school or college). That doesn’t mean that no adult man in his twenties or thirties should play video games—although, cards on the table, I will admit that I’ve not seriously played a video game since my freshman year of college. (I recall it fondly: beating Half-Life 2 over the Christmas break. Probably the only thing that could ever pull me out of retirement would be a third entry finally getting made.) That was a full twenty years ago. I have buddies who’ve continued gaming to various degrees since college, but I can’t relate. It lost its luster a long time ago.

So with that caveat in place, it seems clear to me that the pressing social question for (present and future) adult men in Gen Z and Gen Alpha is what role, if any, video games should play in their lives. In my perfect world it would be nil, minus the occasional nostalgic afternoon or competition with one’s nephews, nieces, and children. Since that’s not this world, the practical question becomes: What is healthy gaming for adult men in the 2020s and 2030s? What types of game? Within what limits? And do the answers change based on the man’s employment, marital, or paternal status?

I’m not in a position to give universal, much less concrete, answers, except that my suggested limits would be predictably strict. More to the point, if it is true that the more one games the less likely one is to eat well, exercise, have good friends, go to church, find a spouse, and/or have and raise children in the home, then it would seem obvious that as a society we should desire the least gaming possible for men in their twenties and thirties. Gaming as a child and teenager and even young adult would, by the time boys leave the home, go the way of bunk beds and cooties, curfews and driver’s permits. The axiom would be Pauline: When I was a child I gamed like a child; when I became a man, I put away childish things.

That rhetoric is strong, I admit; I freely allow that, as a non-gamer, I’m biased against gaming in a way that may not let me see how it could find a small but meaningful role in a balanced adult life. If it can, the onus is on those who think so to make the case and display it in their lives. At the moment, video games and adult men don’t mix well, for themselves or for the rest of society.

Read More
Brad East Brad East

More screens, more distractions; fewer screens, fewer distractions

A vision for the design of our shared spaces, especially public worship.

It’s a simple rule, but I repeat it here because it is difficult to internalize and even more difficult to put into practice, whatever one’s context:

In any given physical space, the more screens that are present, the more distractions there will be for people inhabiting that space; whereas the fewer screens, the fewer distractions.

So far as I can tell, this principle is always and everywhere true, including in places where screens are the point, like a sports bar. No one would study for the LSAT in a sports bar: it’s too distracting, too noisy, too busy. It’s built to over-stimulate. Indeed, a football fan who cared about only one game featuring one team would not spend his Sunday afternoon in a sports bar with a dozen games on simultaneously, because it would prove too difficult to focus on the one thing of interest to him.

Now consider other social spaces: a coffee shop, a classroom, a living room, a sanctuary, a monastery. How are these spaces usually filled? Given their ends, how should they be filled?

The latter question answers itself. This is why, for example, I do not permit use of screens when I teach in a college classroom. Phones, tablets, and laptops are in bags or pockets. In the past I have used a single projector screen for slides, especially for larger survey/lecture courses, but for the most part, even with class sizes of 40 or 50 or 60, I don’t use a screen at all, just markers and a whiteboard. Unquestionably the presence of personal screens open on desks is a massive distraction not only to their owners but to anyone around them. And because distractions are obstacles to learning, I eliminate the distractions.

The same goes for our homes and our churches.

At the outer limit, our homes would lack screens altogether. I know there are folks who do this, but it’s a rare exception to the rule. (Actually, I’m not sure if I have ever personally known someone whose home is 100% devoid of any screen of any kind.) So assuming there will be screens of some kind, how should they be arranged in a home?

  1. There should be numerous spaces that lack a permanent screen.

  2. There should be numerous spaces in which, by rule or norm, portable screens are unwelcome.

  3. There should be focal spaces organized around some object (fireplace, kitchen island, couch and coffee table) or activity (cooking, reading, playing piano) that are ordinarily or always screen-free.

  4. What screens there are should require some friction to use, i.e., a conscious and active rather than passive decision to turn them on or engage with them.

  5. Fewer screens overall and fewer screens in any given space will conduce to fewer distractions, on one hand, and greater likelihood of shared or common screen usage, on the other. (I.e., watching a movie together as a family rather than adults and children on separate devices doing their own thing.)

There is more to say, but for those interested I’m mostly just repackaging the advice of Andy Crouch and Albert Borgmann. Now to church.

There are a few ways that screens can invade the space of public worship:

  1. Large screens “up front” that display words, images, videos, or live recording of whatever is happening “on stage” (=pastor, sermon, communion, music).

  2. Small screens, whether tablets or smartphones, out and visible and in active usage by ministers and others leading the congregation in worship.

  3. Small screens, typically smartphones, in the pockets and laps of folks in the pews.

Let me put it bluntly: It’s often said that Sunday morning is the most segregated hour in America. In a different vein, it’s equally true that Sunday morning may now be the most distracted hour in America.

Why? Because screens are everywhere! Not, to be sure, in every church. The higher liturgical traditions have preserved a liturgical celebration often, though not always, free of screen colonization. Yet even there parishioners still by and large bring their screens in with them.

Certainly for low-church forms of worship, screens are everywhere. And the more screens, the more distractions. Which means that, for many churches, distraction appears to be part of the point. Those attending are meant, in a twist on T. S. Eliot’s phrase, to be distracted from distraction by distraction—that is, to be distracted from bad distraction (fantasy football, Instagram, online shopping) by good distraction (cranked-up CCM, high production videos, Bible apps). It is unthinkable, on this view, to imagine worshiping on a Sunday morning in a screen-free environment. Yet a screen-free space would be a distraction-free space, one designed precisely to free the attention—the literal eyeballs—of those gathered to focus on the one thing they came for: God.

I hope to write a full essay on this soon for Christianity Today, laying out a practical vision for screen-free worship. For now I just want to propose it as an ideal we should all agree on. Ministers should not use phones while leading worship nor should they invite parishioners to open the Bible “on their apps.” Do you know what said parishioners will do when so invited? They may or may not open their Bible app. They will absolutely find their eyes diverted to a text message, an email, or a social media update. And at once you will have lost them—either for a few minutes or for good.

The best possible thing for public Christian worship in twenty-first century America would be the banishment of all screens from the sanctuary. Practically speaking, it would look like leaders modeling and then inviting those who attend to leave their phones at home, in their cars, or in cell phone lockers (the way K–12 schools are increasingly doing).

I’m well aware that this couldn’t happen overnight, and that there are reasonable exceptions for certain people to have a phone on them (doctors on call, police officers, parents of children with special needs). But hard cases make bad law. The normative vision should be clear and universally shared. The liturgy is a place for ordering our attention, the eyes of the heart, on what we cannot see but nevertheless gain a glimpse of when we hear the word of the Lord and see and smell and taste the signs of bread and wine on the Lord’s table. We therefore should not intentionally encourage the proliferation of distractions in this setting nor stand by and watch it happen, as if the design of public space were out of our hands.

More screens, more distractions; fewer screens, fewer distractions: the saying is sure. Let’s put it into practice.

Read More
Brad East Brad East

My latest: on the late Albert Borgmann, in HHR

A link to my essay on the life and writings of the philosopher Albert Borgmann.

This morning The Hedgehog Review published an essay of mine called “The Gift of Reality.” It’s an extended introduction to and exposition of the life and writings of the late Albert Borgmann, including a review of his last book, published posthumously last January. Here’s a sample paragraph from the middle of the piece:

At the same time, while Borgmann may have been a critic of liberalism, he argued that “it should be corrected and completed rather than abandoned.” In this he reads as a less polemical Christopher Lasch or Wendell Berry, fellow democrats whose political vision—consisting among other things of family, fidelity, fortitude, piety, honor, honest work, local community, neighborliness, and thrift—is likewise invested in preserving and respecting reality. Such a vision is simultaneously homeless on the national stage and the richest fruit of the American political tradition.

Click here to read the whole thing.

Read More
Brad East Brad East

Writing without a platform

Reflections on the possibilities of writing today without creating and maintaining an online "platform" via social media.

Is it possible? That’s what I’m wondering.

I can be a moralistic scold about social media—I’m aware. I’m also aware that, for many writers, social media feels like the one and only way to reach, much less build, an audience; to make a name for oneself in a time when anyone on earth can publish millions of words and just about no one pays for the privilege to read them.

I myself, for a time, benefited from social media. I was on Twitter from 2013 to 2022, with maximal usage coming in the span of years from 2015 to 2020. (Those dates are … interesting.) As it happens, I was ABD and dissertating from fall 2014 through spring 2017, then a newly hired professor starting later that fall. In other words, my Twitter usage peaked when (a) I was spending many hours daily staring at a laptop screen and (b) I was trying to get my life as a junior scholar and public writer off the ground. I got a handful of early writing gigs through Twitter and I made many more personal contacts through it, some of whom I still count as friends, colleagues, or nodes in my professional network.

That’s a long way of saying: I don’t have the luxury of strutting around on the moral high ground, looking down at folks building their platforms through X, IG, Substack, and YouTube. I did the very same thing, albeit to a lesser degree, and it undeniably helped my career, above all my career as a writer.

Hence the question. Is it possible, today, to write, to be a writer, without a platform?

A few thoughts.

First, credentials play a role. I was just telling an editor the other day that the academy is a backdoor into publishing books. My PhD opens doors. That’s a fact. Weirdly enough, since academic books aren’t bestsellers, it’s easier for me to creep my way into popular publishing than it is for someone who only wants to write popular books, since he or she has to make good from the jump. Or before the jump, in fact, through amassing followers and fans via “socials.”

Second, gender plays a role. I’ve written about this before, but the politics of Christian women writers was already complex before the rise of the internet and social media. Now it’s positively Byzantine. If you have a PhD, that’s one thing. If you’re employed in the industry—at a magazine, say, or at a publishing house—that’s another. If you just want to be a writer, though, your options for finding an audience and outlets are limited. If, further, you do not have a clear denominational or political tribe; and if, still further, you are not a culture warrior; and if, still further, you are not willing to post pictures of and share private information about your husband and children (assuming that you have them and that they are photogenic)—the circle just keeps getting narrower and narrower. I know exactly one contemporary female Christian writer who “broke through” without credentials, institutional home, tribal affiliation, or online platform, including Twitter. Otherwise one or more of these factors invariably determine the likelihood not only of getting written work into the world but of a sufficiently large audience finding it.

Third, expectations play a role. Almost no one makes an actual full-time living as a writer. Outside of those rare authors whose names we all know and who sell millions of books, writers either have a day job, or depend on a spouse’s income, or hustle like a maniac, or fundraise/crowdfund, or hit the speaker circuit, or live hand to mouth as a starving artist. Or they did one or more of these things for many years, probably decades, before reaching a threshold to just be able to pay their bills. This is not unjust. It’s just the way it is, and ever was it thus. Anger or resentment at lack of remuneration for the writing life is both a professional nonstarter and the product of a fantasy. A writer’s first rule is to live in the real world, and the real world doesn’t care about writers or what they write. The sooner one learns that, the sooner one can get started with what matters: the writing! Isn’t that what we’d be doing anyway, even if we knew we’d never get paid a dime?

Finally, the industry plays a role. This is the part where we get to complain. It’s common knowledge that trade presses use social media metrics as a gatekeeping mechanism. In plain speech, they ask first-time authors how many followers they have. If the answer is “a few thousand,” then they say “thanks for playing” and politely shut the door. If the answer is “zilch, because I’m not on social media,” then they laugh hysterically before slamming the door. (You can still hear them on the other side, doubled over in tears.) This is, it goes without saying, a new phenomenon, since social media is a new phenomenon. And writers eager to break through have followed these incentives to their logical conclusion: drumming up an online following by every means possible: Facebook, Twitter, Instagram, TikTok, LinkedIn, YouTube, Spotify, Substack, Threads—you name it, they’re there. Posting, re-posting, replying, commenting, replying again, sharing, re-sharing, streaming, recording in the car, recording on the run, recording with the kids, walk and talks, live tweets, and more. Always on, extremely online, creating memes, mocking memes, revising memes: keeping the content coming, letting the spice flow. And eventually, with a pinch of success come subscription deals, and after these come sponsorships, and after these come ads. And before you know it, you’re celebrating the free swag you got in the mail or reading an on-air advertisement for skin cream.

How’d you end up here?

That’s the question you should be asking. That’s the question I’m trying to pose in my repeated missives against social media. It’s why, although I’m anti-anti-Substack, and I’m no longer stridently anti-podcast, I’m still hesitant about the knock-on effects of podcasts’ ubiquity, and on certain days, if I’m honest, I’m anti-anti-anti-Substack.

What I mean is: Substack is an ecosystem, and one of the ways it forms both writers and readers is to make every writer a digital entrepreneur hawking a product. Further, it encourages a relationship between writer and readership on the model of celebrity fandom. (After all, you gotta give the people what they want.)

Put these together and the model becomes that of the influencer. The podcasting live-streaming YouTuber with a newsletter and a Patreon is a single genus—the hustling entrepreneurial influencer with fans in the hundreds, thousands, or more—of which Christians, including writers, are only one more species. They are different from Kim Kardashian and MrBeast only in degree, not in kind.

I’ve written elsewhere that there are wise, thoughtful people doing this in ways I admire, in service to the church. They’re digital lectors taking the gospel to an entire generation of (to be frank; I love them) uncatechized functional illiterates addicted to digital technology, and God be praised they’re finding a hearing. I don’t retract what I wrote. But we are fooling ourselves if we don’t step back and see clearly what is happening, what the nature of the dynamic is. Writers are being co-opted by the affordances of newsletters, social media, and audio/visual recording and streaming in ways that corrode the essence of good writing as well as the vocation of the writer itself.

A writer is not an influencer. To the extent that participating in any of these dynamics is necessary for a writer to get started or to get published, then by definition it can’t be avoided. But if it is necessary, we should see it as a necessary evil. Evil in the sense that it is a threat to the very thing one is seeking to serve, to indwell, celebrate, and dilate: the life of the mind, the reading life, the life of putting words on the page that are apt to reality and true to human nature and beautiful in their form and honoring to God. Exhaustively maintaining an online platform inhibits and enervates the attention, the focus, the literacy, the patience, the quietness, and the prayers that make the Christian writing life not only possible, but good.

In a word: If writing without a platform is impossible, then treat it like Wittgenstein’s ladder. Use it to get where you’re going, then kick it over once it’s done the job.

Read More
Brad East Brad East

It costs you nothing not to be on social media

One of my biannual public service announcements regarding social media.

Consider this your friendly reminder that signing up for social media is not mandatory. It costs nothing not to be on it. Life without the whole ensemble—TikTok, Twitter, Instagram, Facebook, and the rest—is utterly free.

In fact, it is simpler not to be on social media, inasmuch as it requires no action on your part, only inaction. If you don’t create an account, no account will be made for you. You aren’t auto-registered, the way you’re assigned a social security number or drafted into the military. You have to apply and be accepted, like a driver’s license or church membership. Fail to apply and nothing happens. And I’m here to tell you, it is a blessed nothingness.

That’s the trick with social media: nothing comes from nothing. Give it nothing and it can take nothing from you.

Supposedly, being on social media is free. But you know that’s not true. It costs you time—hours of it, in fact, each and every day. It costs you attention. It costs you the anxiety it induces. It costs you the ability to do or think about anything else when nothing exactly is demanding your focus at the moment. It costs you the ability to read for more than a few minutes at a time. It costs you the ability to write without strangers’ replies bouncing like pinballs around your head. It costs you the freedom to be ignorant and therefore free of the latest scandal, controversy, fad, meme, or figure of speech that everyone knew last week but no one will remember next week.

Thankfully, social media has no particular relationship to what is called “privilege.” It does not take money to be off social media any more than it takes money to be on it. It is not the privileged who have the freedom not to be on social media: it is everyone. Because, as I will not scruple to repeat, even at the risk of annoyance or redundancy, it costs nothing not to be on social media. And since it costs nothing for anyone, it therefore costs nothing for everyone. Unfortunately, the costs of being on social media do apply to everyone, privileged or not, which is why everyone would be better off deleting their accounts.

Imagine a world without social media. It isn’t ancient. It isn’t biblical. It’s twenty years ago. Are you old enough to remember life then? It wasn’t a hellscape, not in this respect at least. The hellscape is social media. And social media hasn’t, not yet, become a badge of “digital citizenship” required by law of every man, woman, and child, under penalty of fine or loss of employment. Until then, so long as it’s free, do the right thing and stay off—or, if you’re already on, get off first and then stay off.

Here’s the good news, but tell me if you’ve heard it before: It won’t cost you a thing.

Read More
Brad East Brad East

In defense of podcasts

A response to some idiot’s rant from a couple years back.

A few years ago some idiot on the internet wrote that he was quitting podcasts, and you should too. I can’t imagine what he was thinking. Podcasts are a pleasure.

They’re a pleasure to listen to, because they run the gamut. They’re about anything, everything, and nothing. They can be bite-size; they can appetize; they can tease. Or they can last for hours, leaving no nook or cranny unexplored. They can remain at the surface for beginners or they can dive into the deeps for experts.

Podcasts can cover philosophy, theology, history, politics, and ethics; they can also cover basketball, film, TV, music, and novels. They can pay six-figure salaries, or a bro in his basement can start one up tomorrow. They embody a democratic media and a free press and free speech all at the same time. What’s not to love?

They’re also a pleasure to go on. Once a month or so I get invited onto a podcast, and every single time it’s a blast. I’ve never joined a bad one! Apparently they’re all fun. We laugh, we talk theology or technology or academia; we learn something in the exchange; the recording goes up a few days later; and it’s there, more or less forever, for others to listen in on at their leisure. Just this week a shook-his-hand-once acquaintance at my (not small) congregation came up to me to tell me he enjoyed a podcast I was on. (Kudos to you, Kaitlyn; he’s a big fan.) Like so many others, this thirtysomething Christian listens to strangers talk theology for the layman while washing the dishes, or driving to work, or taking a walk. And why not?

I’m just glad these things are already so popular, or else that idiot’s rant might have made an unwelcome dent, or even popped the bubble. Life is short. Let’s enjoy its little pleasures while we can. And there can be no doubt that podcasts are among them.

Read More
Brad East Brad East

The best books about technology

What are they? What unites them? Read on to find out.

The best books about technology are not about technology. They’re not about the latest innovation or invention. They’re not an intervention in the news cycle, much less punditry about A.I. or the internet or digital or television or motion pictures or radio or the automobile or the printing press. They’re not dated the moment the car rolls off the lot.

The best books about technology are about humanity—about what it means to be human and about life well lived and urgent threats to the good life. Because technology is essentially a human thing, good writing about technology is good writing about human things. A doctrine of technology is only as good as its doctrine of man; indeed, it not only depends upon but is a doctrine of man. The technologist is an anthropologist, from first to last.

Which is why, incidentally, the best technologists are philosophers and theologians. In Calvin’s words:

Our wisdom, in so far as it ought to be deemed true and solid wisdom, consists almost entirely of two parts: the knowledge of God and of ourselves. But as these are connected together by many ties, it is not easy to determine which of the two precedes and gives birth to the other. For, in the first place, no man can survey himself without forthwith turning his thoughts towards the God in whom he lives and moves; because it is perfectly obvious, that the endowments which we possess cannot possibly be from ourselves; nay, that our very being is nothing else than subsistence in God alone. In the second place, those blessings which unceasingly distill to us from heaven, are like streams conducting us to the fountain.

What, then, are the best books (not) about technology that I have read? A short list would include Abraham Joshua Heschel’s The Sabbath; Walker Percy’s Lost in the Cosmos; François Mauriac’s The Eucharist; Wendell Berry’s A Timbered Choir; Josef Pieper’s Leisure, the Basis of Culture; Jonathan Lear’s Radical Hope; Stephen King’s On Writing; Albert Murray’s The Omni-Americans; Pascal’s Pensées; and many more.

These are my models for good technology writing: not because they talk about technology but because they comment—uniquely, stylishly, with voice and perspective and courage—on the human condition. Today’s apps are yesterday’s fads, but the human condition isn’t going anywhere. Write, therefore, when you set out to write about technology, about what it means to be human today; seek the latter and the former will be added unto you.

Brad East

I joined Micro.blog!

Why I joined + thoughts on Micro.blog > Twitter et al.

After years of hearing Alan Jacobs sing the praises of Micro.blog, I created an account this week. Not only that, I’m able to host my micro blog on this website’s domain; so instead of eastbrad.micro.blog, the URL is micro.bradeast.org. In fact, I added “Micro” as an option on the header menu above, sandwiched in between “Media” and “Blog.” In a sense you’re technically “leaving” this site, but it doesn’t feel like it. In this I was also following Alan’s lead. Thank you, ayjay “own your own turf” dot org!

Now: Why did I join micro.blog? Don’t I already have enough to do? Don’t I already write enough? Isn’t my goal to be offline as much as possible? Above all, wasn’t I put on earth to do one thing, namely, to warn people away from the evils of Twitter? Aren’t I the one who gave it up in June 2020, deactivated it for Lent in spring 2022, then (absent-mindedly) deleted it a year later by not renewing the account? And didn’t regret it one bit? Don’t I think Twitter and all its imitators (Threads, Notes, et al.) unavoidably addict their users in the infinite scroll while optimizing for all the worst that original sin has to offer?

What, in a word, makes micro blogging (and Micro.blog in particular) different?

Here’s my answer, in three parts: why I wanted to do this; how I’m going to use it; and what Micro.blog lacks that makes it distinct from the alternatives.

First, I miss what Twitter offered me: an accessible public repository of links, images, brief commentary, and minor thoughts—thoughts I had nowhere else to put except Twitter, and thoughts that invariably get lost in the daily shuffle. I tend to call this main blog (the one you’re reading right now) a space for “mezzo blogging”: something between Twitter/Tumblr (i.e., micro writing and sharing) and essays, articles, and books (i.e., proper macro writing). I suffer from graphomania, and between my physical notebook and texting with friends, I still have words to get out of my system; minus all the nonsense on Twitter, that was the reason I stayed as long as I did. (Also the connections, friends, and networking, but the downsides of gaining those things were and are just too great, on any platform.)

Second, I am going to use my micro blog in a certain way. I’m not going to follow anyone. I’m not going to look at my timeline. I’m not going to let it even show me follows, mentions, or replies. It’s not going to be a place for interaction with others. I’m not going to dwell or hang out on it. In a sense I won’t even be “on” it. I have and will have no way of knowing if even a single soul on earth reads, clicks, or finds my writing there. It exists more or less for one person: me. Its peripheral audience is anyone who cares to click from here to there or check in on me there from time to time.

What am I going to be doing, then? Scribbling thoughts that run between one and four sentences long; sharing links to what I’m reading online; sharing books and images of what I’m reading IRL; in short, putting in a single place the grab bag of “minor” writing that pulls me daily in a hundred directions: email, messages, WhatsApp, even Slack (once upon a time). E.g., right now I’m enthralled by the NBA playoffs, but not only does no one who reads this blog care about that; my thoughts are brief and fleeting. But I have them, and I want to remember what they were! So now I put them there, on the micro blog.

I don’t, for what it’s worth, have any kind of organizational system for note-taking, journaling, or any such thing. I do keep a physical journal, but it’s mostly a place for first-draft brainstorming; it’s not much of an archive. I don’t use Drafts or Tot or Notes or Scrivener or even an iPad or tablet of any kind. Nothing is housed on the cloud; nothing is interconnected, much less interoperable. I’ve always toyed with trying Evernote—I know people who love it—but it’s just never appealed to me, and I don’t think I’m the type who would benefit from it or use it well. My mental habits and ideas and writing instincts are too diffuse. At the same time, I love the idea of a one-stop shop for little thoughts, for minor scribbles, in brief, for micro blogging. That’s how I used Twitter. I ultimately just got fed up with that broken platform’s pathologies.

So, third, what makes Micro.blog different? In a sense I’ve already answered that question. It’s not built to do what Twitter, Threads, and Substack Notes are meant to do. There’s no provocation or stimulation. There’s no hellish algorithm. It doesn’t scale. It’s not about followers or viral hits. It’s self-selecting, primarily because you have to pay for it and secondarily because it’s not a way to build an audience of thousands (much less millions). It’s for people like me who want a digital room of their own, so to speak, without the assault on my attention, or the virus of virality, or the infinite scroll, or the stats (follows, likes, RTs) to stroke or shrink my ego, or the empty promise that the more I post the more books I’ll eventually sell. No publisher or agent is going to tout my Micro.blog to justify an advance. It’s just … there. For me and, at most, a few dozen other folks.

And anyway, I’m giving it a 30-day free trial. No commitments made just yet. I already like it enough that I expect to fork over $5/month for the privilege. But we’ll see.

Either way, this is all one long way of saying: See, I’m no Luddite. I use Squarespace and Instapaper and Firefox and Spotify and Libby and Letterboxd and now Micro.blog. I might even get to ten whole quality platforms one day.

Clearly, I don’t hate the internet. I’m just picky.

Brad East

My latest: the rise of digital lectors, in CT

A link to my latest column for Christianity Today, a sequel to my piece on biblical literacy and the postliterate church.

My April 18 Christianity Today column was called “Biblical Literacy in a Postliterate Age.” Last week, on May 8, CT published my follow-up, titled “Digital Lectors for a Postliterate Age.”

I’d always intended a sequel, and later this summer I may write a final column to complete a loose trilogy of reflections on Scripture, literacy, and technology in the church. This latest one covers a range of creative responses to postliterate believers, seekers, and drifters, from the Bible Project to Father Mike’s The Bible in a Year podcast to Jonathan Pageau and the Symbolic World to Alastair Roberts and many others. I call them “digital lectors,” readers and expositors of Scripture for a digital—which is to say, a postliterate—age.

In between the two columns, there were a couple noteworthy interactions with my claims about the state of biblical literacy (and literacy in general) in the church. The first was a conversation on the Holy Post podcast between Skye Jethani and Kaitlyn Schiess; you can find it on video here, starting around minute 33. The second was a response from Jessica Hooten Wilson (whom I quote in the piece), in a post on her Substack called “The Post-literate Church.” Both engagements are friendly, thoughtful, critical, and worth your time. I’m grateful to all of them for their reflections.

Brad East

A.I., TikTok, and saying “I would prefer not to”

Finding wisdom in Bartleby for a tech-addled age.

Two technology pieces from last week have stuck with me.

Both were at The New York Times. The first was titled “How TikTok Changed America,” a sort of image/video essay about the platform’s popularity and influence in the U.S. The second was a podcast with Ezra Klein called “How Should I Be Using A.I. Right Now?,” an interview with Ethan Mollick.

To be clear, I skimmed the first and did not listen to the second; I only read Klein’s framing description for the pod (my emphases):

There’s something of a paradox that has defined my experience with artificial intelligence in this particular moment. It’s clear we’re witnessing the advent of a wildly powerful technology, one that could transform the economy and the way we think about art and creativity and the value of human work itself. At the same time, I can’t for the life of me figure out how to use it in my own day-to-day job.

So I wanted to understand what I’m missing and get some tips for how I could incorporate A.I. better into my life right now. And Ethan Mollick is the perfect guide…

This conversation covers the basics, including which chatbot to choose and techniques for how to get the most useful results. But the conversation goes far beyond that, too — to some of the strange, delightful and slightly unnerving ways that A.I. responds to us, and how you’ll get more out of any chatbot if you think of it as a relationship rather than a tool.

These two pieces brought to mind two things I’ve written recently about social media and digital technology more broadly. The first comes from my New Atlantis essay, published two years ago, reviewing Andy Crouch’s book The Life We’re Looking For (my emphases again):

What we need is a recommitment to public argument about purpose, both ours and that of our tools. What we need, further, is a recoupling of our beliefs about the one to our beliefs about the other. What we need, finally, is the resolve to make hard decisions about our technologies. If an invention does not serve the human good, then we should neither sell it nor use it, and we should make a public case against it. If we can’t do that — if we lack the will or fortitude to say, with Bartleby, We would prefer not to — then it is clear that we are no longer makers or users. We are being used and remade.

The other comes late in my Commonweal review, published last summer, of Tara Isabella Burton’s book Self Made:

It may feel to some of us that “everyone,” for example, is on Instagram. Only about 15 percent of the world is on the platform, however. That’s a lot of people. Yet the truth is that most of the world is not on it. The same goes for other social media. Influencer culture may be ubiquitous in the sense that most people between the ages of fifteen and thirty-five are affected by it in some way. But that’s a far cry from digitally mediated self-creation being a universal mandate.

Even for those of us on these apps, moreover, it’s possible to opt out. You don’t have to sell yourself on the internet. You really don’t. I would have liked Burton to show us why the dismal story she tells isn’t deterministic—why, for example, not every young woman is fated to sell her image on OnlyFans sooner or later.

The two relevant phrases from these essay reviews: You really don’t and Bartleby’s I would prefer not to. They are quite simply all you need in your toolkit for responding to new technologies like TikTok and generative A.I.

For example, the TikTok piece states that half of Americans are on the app. That’s a lot! Plenty to justify the NYT treatment. I don’t deny it. But do you know what that claim also means? That half of us aren’t on it. Fifty percent. One out of every two souls. Which is the more relevant statistic, then? Can I get a follow-up NYT essay about the half of us who not only aren’t tempted to download TikTok but actively reject it, can’t stand it, renounce it and all its pomp?

The piece goes further: “Even if you’ve never opened the app, you’ve lived in a culture that exists downstream of what happens there.” Again, I don’t deny it or doubt it. It’s true, to my chagrin. And yet, the power of such a claim is not quite what it seems at first glance.

The downstream influence of TikTok works primarily if and as one is also or instead an active user of other social media platforms (as well as, perhaps, cable news programs focused on politics and entertainment). I’m told you can’t get on YouTube or Instagram or Twitter or Facebook without encountering “imported” content from TikTok, or “local” content that’s just Meta or Google cribbing on TikTok. But what if, like me, you don’t have an account on any of these platforms? What if you abstain completely from all social media? And what if you don’t watch Fox News or MSNBC or CNN or entertainment shows or reality TV?

I was prepared, reading the NYT piece, to discover all the ways TikTok had invaded my life without my even realizing it. It turns out, though, that I don’t get my news from TikTok, or my movie recommendations, or my cooking recipes, or my fashion advice(!), or my politics, or my Swiftie hits, or my mental health self-diagnoses, or my water bottle, or my nightly entertainment before bed—or anything else. Nothing. Nada. Apparently I have been immune to the fifteen “hottest trends” on TikTok, the way it invaded “all of our lives.”

How? Not because I made it a daily goal to avoid TikTok. Not because I’m a digital ascetic living on a compound free of wireless internet, smart phones, streaming TV, and (most important) Gen Z kiddos. No, it’s because, and more or less only because, I’m not on social media. Turns out it isn’t hard to get away from this stuff. You just don’t download it. You just don’t create an account. If you don’t, you can live as if it doesn’t exist, because for all intents and purposes, for your actual life, it doesn’t.

As I said: You really don’t have to, because you can just say I would prefer not to. All told, that’s enough. It’s adequate all on its own. No one is forcing you to do anything.

Which brings us to Ezra Klein.

Sometimes Klein seems like he genuinely “gets” the scale of the threat, the nature of the digital monstrosity, the power of these devices to shape and rewire our brains and habits and hearts. Yet other times he sounds like just another tech bro who wants to maximize his digital efficiencies, to get ahead of the masses, to get a silicon leg up on the competition, to be as early an adopter as possible. I honestly don’t get it. Does he really believe the hype? Or does he not? At least someone like Tyler Cowen picks a lane. Come join the alarmist train, Ezra! There’s plenty of room! All aboard!

Seriously though, I’m trying to understand the mindset of a person who asks aloud with complete sincerity, “How should I incorporate A.I. into my life ‘better’?” It’s the “should” that gets me. Somehow this is simultaneously a social obligation and a moral duty. Whence the ought? Can someone draw a line for me from this particular “is” to Klein’s technological ought?

In any case, the question presumes at least two things. First, that prior to A.I. my life was somehow lacking. Second, that just because A.I. exists, I need to “find a place for it” in my daily habits.

But why? Why would we ever grant either of these premises?

My life wasn’t lacking anything before ChatGPT made its big splash. I wasn’t feeling an absence that Sam Altman could step in to fill. There is no Google-shaped hole in my heart. As a matter of fact, my life is already full enough: both in the happy sense that I have a fulfilling life and in the stressful sense that I have too much going on in my life. As John Mark Comer has rightly pointed out, the only way to have more of the former is through having less of the latter. Have more by having less; increase happiness by jettisoning junk, filler, hurry, hoarding, much-ness.

Am I really supposed to believe that A.I.—not to mention an A.I. duplicate of myself in order (hold gag reflex) to know myself more deeply (I said hold it!) in ways I couldn’t before—is not just one more damn thing to add to my already too-full life? That it holds the secrets of self-knowledge, maximal efficiency, work flow, work–life balance, relational intimacy, personal creativity, and labor productivity? Like, I’m supposed to type these words one after another and not snort laugh with derision but instead take them seriously, very seriously, pondering how my life was falling short until literally moments ago, when A.I. entered my life?

It goes without saying that, just because the technology exists, I don’t “need” to adopt or incorporate it into my life. There is no technological imperative, and if there were it wouldn’t be categorical. The mere existence of technology is neither self-justifying nor self-recommending. And must I add that the endless hours of time, energy, and attention devoted to learning this latest invention—besides being stolen from other, infinitely more meaningful pursuits—will almost immediately be made redundant, since the invention is nowhere near completion? Even if A.I. were going to improve daily individual human flourishing by a hundredfold, the best thing to do, right now, would be absolutely nothing. Give it another year or ten or fifty and they’ll iron out the kinks, I’m sure of it.

What this way of approaching A.I. has brought home to me is the unalterably religious dimension of technological innovation, and this in two respects. On one side, tech adepts and true believers approach innovation not only as one more glorious step in the march of progress but also as a kind of transcendent or spiritual moment in human growth. Hence the imperative. How should I incorporate this newfangled thing into my already tech-addled life? becomes not just a meaningful question but an urgent, obvious, and existential one.

On the other side, those of us who are members of actual religious traditions approach new technology with, at a minimum, an essentially skeptical eye. More to the point, we do not approach it expecting it to do anything for our actual well-being, in the sense of deep happiness or lasting satisfaction or final fulfillment or ultimate salvation. Technology can and does contribute to human flourishing but only in its earthly, temporal, or penultimate aspects. It has nothing to do with, cannot touch, never can and never will intersect with eternity, with the soul, with the Source and End of all things. Technology is not, in short, a means of communion with God. And for those of us (not all religious people, but many) who believe that God has himself already reached out to us, extending the promise and perhaps a partial taste of final beatitude, then it would never occur to us—it would present as laughably naive, foolish, silly, self-deceived, idolatrous—to suppose that some brand new man-made tool might fix what ails us; might right our wrongs; might make us happy, once and for all.

It’s this that’s at issue in the technological “ought”: the “religion of technology.” It’s why I can’t make heads or tails of stories or interviews like the ones I cited above. We belong to different religions. It may be that there are critical questions one can ask about mine. But at least I admit to belonging to one. And, if I’m being honest, mine has a defensible morality and metaphysics. If I weren’t a Christian, I’d rather be just about anything than a true believing techno-optimist. Of all religions on offer today, it is surely the most self-evidently false.

Brad East

All together now: social media is bad for reading

A brief screed about what we all know to be true: social media is bad for reading.

We don’t have to mince words. We don’t have to pretend. We don’t have to qualify our claims. We don’t have to worry about insulting the youths. We don’t have to keep mum until the latest data comes in.

Social media, in all its forms, is bad for reading.

It’s bad for reading habits, meaning when you’re on social media you’re not reading a book. It’s bad for reading attention, meaning it shrinks your ability to focus for sustained periods of time while reading. It’s bad for reading desires, meaning it makes the idea of sitting down with a book, away from screens and images and videos and sounds, seem dreadfully boring. It’s bad for reading style, meaning what literacy you retain while living on social media is trained to like all the wrong things and to seek more of the same. It’s bad for reading ends, meaning you’re less likely to read for pleasure and more likely to read for strictly utilitarian reasons (including, for example, promotional deals and influencer prizes and so on). It’s bad for reading reinforcement, meaning like begets like, and inserting social media into the feedback loop of reading means ever more of the former and ever less of the latter. It’s bad for reading learning, meaning your inability to focus on dense, lengthy reading is an educational handicap: you quite literally will know less as a result. It’s bad for reading horizons, meaning the scope of what you do read, if you read at all, will not stretch across continents, cultures, and centuries but will be limited to the here and now, (at most) the latest faux highbrow novel or self-help bilge promoted by the newest hip influencers; social media–inflected “reading” is definitionally myopic: anti-“diverse” on principle. Finally, social media is bad for reading imitation, meaning it is bad for writing, because reading good writing is the only sure path to learning to write well oneself. Every single writing tic learned from social media is bad, and you can spot all of them a mile away.

None of this is new. None of it is groundbreaking. None of it is rocket science. We all know it. Educators do. Academics do. Parents do. As do members of Gen Z. My students don’t defend themselves to me; they don’t stick up for digital nativity and the wisdom and character produced by TikTok or Instagram over reading books. I’ve had students who tell me, approaching graduation, that they have never read a single book for pleasure in their lives. Others have confessed that they found a way to avoid ever reading a book cover to cover, even as they got B’s in high school and college. They’re not proud of this. Neither are they embarrassed. It just is what it is.

Those of us who see this and are concerned by it do not have to apologize for it. We don’t have to worry about being, or being accused of being, Luddites. We’re not making this up. We’re not shaking our canes at the kids on the lawn. We’re not ageist or classist or generation-ist or any other nonsensical application of actual prejudices.

The problem is real. It’s not the only one, but it’s pressing. Social media is bad in general, it’s certainly bad for young people, and it’s unquestionably, demonstrably, and devastatingly bad for reading.

The question is not whether it’s a problem. The question is what to do about it.

Brad East

A tech-attitude taxonomy

A taxonomy of eleven different dispositions to technological development, especially in a digital age.

I’ve been reading Albert Borgmann lately, and in one essay he describes a set of thinkers he calls “optimistic pessimists” about technology. It got me thinking about how to delineate different positions and postures on technology, particularly digital technology, over the last century. I came up with eleven terms, the sixth one serving as a middle, “neutral” point with five on each side—growing in intensity as they get further from the center. Here they are:

  1. Hucksters: i.e., people who stand to profit from new technologies, or who work to spin and market them regardless of their detrimental effects on human flourishing.

  2. Apostles: i.e., true believers who announce the gospel of new technology to the unconvinced; they win converts by their true faith and honest enthusiasm; they sincerely believe that any and all developments in technology are good and to be welcomed as benefiting the human race in the short-, medium-, and long-term.

  3. Boosters: i.e., writers and journalists in media and academia who toe the line of the hucksters and apostles; they accuse critics and dissenters from the true faith of heresy or, worse, of being on the wrong side of history; they exist as cogs in the tech-evangelistic machine, though it’s never clear why they are so uncritical, since they are rarely either apostles or hucksters themselves.

  4. Optimists: i.e., ordinary people who understand and are sympathetic with thoughtful criticisms of new technologies but who, at the end of the day, passively trust in progress, in history’s forward march, and in the power of human can-do spirit to make things turn out right, including the challenges of technology; they adopt new technology as soon as it’s popular or affordable.

  5. Optimistic pessimists: i.e., trenchant and insightful critics of technopoly, or the culture wrought by technology, who nonetheless argue for and have confidence in the possibility of righting the ship (even of the ship righting itself); another term for this group is tech reformers.

  6. Naive neutrals: i.e., people who have never given a second thought to the challenges or perils of technology, are fundamentally incurious about them, and have no “position” to speak of regarding the topic; in practice they function like optimists or boosters, but without considered beliefs on the subject.

  7. Pessimistic optimists: i.e., inevitabilists—this or that new technology may on net be worse for humanity, but there’s simply nothing to do about it; pushing back or writing criticism is for this group akin to a single individual blowing on a forest fire; technological change on this view is materialist and/or deterministic; at most, you try to see it for what it is and manage your own individual life as best you can; at the same time, there’s no reason to be Chicken Little, since this has always been humanity’s lot, and we always find a way to adapt and adjust.

  8. Pessimists: i.e., deep skeptics who see technological development in broadly negative terms, granting that not all of it is always bad in all its effects (e.g., medicine’s improvement of health, extension of life spans, and protection from disease); these folks are the last to adopt a new technology, usually with resentment or exasperation; they hate hucksters and boosters; they are not determinists—they think human society really can make social and political choices about technology ordered toward the common good—but know that determinism almost always wins in practice; their pessimism pushes them to see the downsides or tradeoffs even in the “best” technological developments.

  9. Doomsdayers: i.e., it’s all bad, all the time, and it’s clear as day to anyone with eyes to see and ears to hear; the internet is a bona fide harbinger of the apocalypse and A.I. is no-joke leading us to Skynet and the Matrix; the answer to new technology is always, therefore, a leonine Barthian Nein!; and any and all dissents and evidence to the contrary are only so much captivity to the Zeitgeist, heads stuck in the sand, paid-for shilling, or delusional “back to the land” Heideggerian nostalgia that is impossible to live out with integrity in a digital age.

  10. Opt-outers: i.e., agrarians and urban monastics in the spirit of Wendell Berry, Ivan Illich, and others who pursue life “off the grid” or at least off the internet; they may or may not be politically active, but more than anything they put their money where their mouth is: no TV or wireless internet in the home, no smart phone, no social media, and a life centered on hearth, earth, family, children, the local neighborhood, a farm or community garden, so on and so forth; they may be as critical as pessimists and doomsdayers, but they want to walk the walk, not just talk the talk, and most of all they don’t want the technopoly to dictate whether or not, in this, their one life, it can be a good one.

  11. Resisters: i.e., leaders and foot soldiers in the Butlerian Jihad, whether this be only in spirit or in actual social, material, and political terms (IRL, as they say).

Cards on the table: I’m dispositionally somewhere between #7 and #8, with occasional emotional outbursts of #9, but aspirationally and even once in a while actually a #10.

Brad East

Email

Ten rules (for myself) that mitigate the timesuck that is the inbox.

Email is the scourge of just about everyone’s time and attention, at least those of us in the “laptop class” and the broader white-collar workforce. Some people’s jobs just are their inbox. But for academics, the inbox is the enemy. It’s a timesuck. It exerts a kind of gravitational pull on one’s mind and attention. It threatens to conquer every last second you might spend doing something else.

Here are some rules and practices I’ve instituted to manage my inbox.

  1. No email on my phone. By this I mean not only that I lack the app but also that I can’t log in on a browser. I literally do not, because I cannot, access my inbox (personal or professional) on my phone.

  2. No email on the weekends. This rule’s looser, but I don’t reply or feel accountable to my inbox on the weekends. I mostly leave it be.

  3. No email until lunchtime. Mornings in my house are harried swirls of chaos getting kids to school; I don’t check my inbox before leaving. When I get to my office, I pour some coffee, pray, then sit down in my recliner and read for 2-4 hours. My laptop remains closed and in my bag, barring an urgent matter or occasional need. I usually open it around 11:30am.

  4. Inbox zero twice per day. While I munch on a salad, I take 15-30 minutes to whittle my emails down to zero. Two-thirds of this is deletion. For what remains, it’s a smattering of replies, archives, calendar notes, and snoozes. I do the same thing late afternoon, before I leave the office. Occasionally I’ll do it in the evening, at home, but I don’t plan on it.

  5. Little to no email outside of normal working hours. I’m flexible on this one, but the rule is that I don’t make myself accountable to my inbox outside of the 8:00am–5:00pm range. If I receive an email then, it can wait till the next day. And if the inbox piles up outside of office hours, so be it.

  6. As few newsletters as possible. I’ve slowly been unsubscribing from newsletters I read and transferring their feeds to my RSS reader (I use Feedly). Some still come by email—I’m not sure how to pay for one without getting it via email!—but I either click on them immediately or skim and archive. I don’t want them just sitting there, cluttering up the place.

  7. A quick, brief reply is better than a delayed reply or no reply at all. I still remember an email I sent to a distinguished academic in 2010. I was going to visit his campus to see if it might be a good fit for my doctoral studies. My email must have been a thousand words at least. His reply was a single incomplete sentence. Yes, he would meet with me. But it was so curt I thought he was mad. He wasn’t! He just didn’t have time to match my logorrhea. He had better things to do. And he was right. So when colleagues, students, or readers email me, I’ve trained myself to give them a speedy 1-2 sentence answer, even if it’s not as detailed as they’d like. Sometimes it’s just, “Let’s talk more in person,” or “I’ll have to think about that.” But if the question is concrete, I can typically answer in a single sentence. Brevity is preferable to silence!

  8. Every personal email gets a reply. Except by mistake, I never ghost genuine emails from living human beings (unless, I suppose, there are living human beings behind A.I.-generated Ed Tech mass-mailers). Everyone who writes me by email receives a reply, full stop. That includes random readers of my work. Obviously I don’t have a sizable enough readership to make this infeasible; I assume Ken Follett can’t personally respond to every bit of fan mail he gets. But as long as it’s not unduly burdensome, I’m going to keep up this habit. And doubly so for emails from people I know, whether colleagues or friends or family. No email goes unanswered!

  9. Every (initial) personal email gets a reply within 24 hours. This one might sound tough, but it’s really not. I’m a fast typist and I’m committed to being brief whenever possible. So once the spam and nonsense are out of the way, I reply until the inbox is clear. Then I do it again later that day (on workdays, that is). That way I stay up on it, and it doesn’t pile up to unmanageable levels. Sometimes, granted, I lack the time to do so, or I’m traveling, or an email is going to take up too much time for the length and thought required. So I email within the 24-hour period to say that I’ll be emailing tomorrow or later that week or once my midterm grading is done or after the holiday or after the semester wraps up. That way I’m not leaving a correspondent hanging, even if I can’t get to them in a timely manner.

  10. Strategic snoozing. The “snooze” button is your friend. It’s a great invention. Right now I have eight emails snoozed (and nothing in my inbox). Only two are emails I need to reply to. One is an ongoing correspondence about a paper I’ve given feedback on; the other will reappear in a couple weeks when I’m supposed to remind a colleague about something. So no one is waiting on me, exactly. Even when I’ve snoozed threads trying to find time to meet up for a meal or drinks, the person isn’t waiting on pins and needles; we both know it’s a busy time and we’ll figure something out in a month or two. Anything pressing has been dealt with; it’s the emails with longer deadline horizons, such as a recommendation letter request, or emails that function as self-reminders that call for strategic snoozing.

All this, I should say, is operative primarily during the academic calendar. I’m looser in the summer, for obvious reasons.

I should add as well that, unlike other techniques for tech management, I don’t feel constrained or stressed by these rules. They aren’t hard to keep. They’re shockingly easy, as a matter of fact. Not everyone is in similar circumstances, but something like these rules would work, I think, for most academics. Of all “laptop workers,” academics may be the most notorious for simply never returning emails, even important ones. It’s not that hard, folks! And rules like these actually keep you from spending more time on email. My average daily email time is 30-60 minutes. That’s doable. But I could stretch that out to just about any time I’m in the office, if I kept my laptop open (or, when I’m writing, my email open). Were I to do that, I’d see a new email demanding a reply every 10-20 minutes, in which case I’d never get around to anything else. It’s in service of getting around to other things—usually more important and always more interesting—that the rules are worth implementing and maintaining.

Brad East

Quit social porn

Samuel James is right: the social internet is a form of pornography. That means Christians, at least, should get off—now.

In the introduction to his new book, Digital Liturgies: Rediscovering Christian Wisdom in an Online Age, Samuel James makes a startling claim: “The internet is a lot like pornography.” He makes sure the reader has read him right: “No, that’s not a typo. I did not mean to say that the internet contains a lot of pornography. I mean to say that the internet itself—i.e., its very nature—is like pornography. There’s something about it that is pornographic in its essence.”

Bingo. This is exactly right. But let’s take it one step further.

A few pages earlier, James distinguishes the internet in general from “the social internet.” That’s a broader term for what we usually refer to as “social media.” Think not only Facebook, Twitter, Instagram, TikTok, et al, but also YouTube, Slack, Pinterest, Snapchat, Tumblr, perhaps even LinkedIn or Reddit and similar sites. In effect, any online platform that (a) “connects” strangers through (b) public or semi-public personal profiles via (c) proprietary algorithms using (d) slot-machine reward mechanisms that reliably alter one’s (e) habits of attention and (f) fame, status, wealth, influence, or “brand.” Almost always such a platform also entails (g) the curation, upkeep, reiteration, and perpetual transformation of one’s visual image.

This is the social internet. James is right to compare it to pornography. But he doesn’t go far enough. It isn’t like pornography. It’s a mode of pornography.

The social internet is social porn.

By the end of the introduction, James pulls his punch. He doesn’t want his readers off the internet. Okay, fine. I’m on the internet too, obviously—though every second I’m not on it is a second of victory I’ve snatched from defeat. But yes, it’s hard to avoid the internet in 2023. We’ll let that stand for now.

There is no good reason, however, to be on the social internet. It’s porn, after all, as we just established. Christians, at least, have no excuse for using porn. So if James and I are right that the social internet isn’t just akin to pornography but is a species of it, then he and I and every other Christian we know who cares about these things should get off the social internet right now.

That means, as we saw above, any app, program, or platform that meets the definition I laid out. It means, at a minimum, deactivating and then deleting one’s accounts with Facebook, Twitter, Instagram, and TikTok—immediately. It then means thinking long and hard about whether one should be on any para-social platforms like YouTube or Pinterest or Slack. Some people use YouTube rarely and passively, to watch the occasional movie trailer or live band performance, say, or how-to videos to help fix things around the house. Granted, we shouldn’t be too worried about that. But what about people who use it the way my students use it—as an app on their phone with an auto-populated feed they scroll just like IG or TT? Or what about active users and influencers with their own channels?

Get off! That’s the answer. It’s porn, remember? And porn is bad.

I confess I have grown tired of all the excuses for staying on the social internet. Let me put that differently: I know plenty of people who do not share my judgment that the social internet is bad, much less a type of porn. In that case, we lack a shared premise. But many people accept the premise; they might even go so far as to affirm with me that the social internet is indeed a kind of porn: just as addictive, just as powerful, just as malformative, just as spiritually depleting, just as attentionally sapping. (Such claims are empirical, by the way; I don’t consider them arguable. But that’s for another day.) And yet most of the people I have in mind, who are some of the most well-read and up-to-date on the dangers and damages of digital media, continue not only to maintain their social internet accounts but use them actively and daily. Why?

I’m at a point where I think there simply are no more good excuses. Alan Jacobs remarked to me a few years back, when I was wavering on my Twitter usage, that the hellsite in question was the new Playboy. “I subscribe for the articles,” you say. I’m sure you do. That might play with folks unconcerned by the surrounding pictures. For Christians, though, the jig is up. You’re wading through waist-high toxic sludge for the occasional possible potential good. Quit it. Quit the social internet. Be done with it. For good.

Unlike Lot’s wife, you won’t look back. The flight from the Sodom of the social internet isn’t littered with pillars of salt. The path is free and clear, because everyone who leaves is so happy, so grateful, the only question they ask themselves is what took them so long to get out.

Brad East

A decision tree for dealing with digital tech

Is the digital status quo good? If not, our actions (both personal and institutional) should show it.

Start with this question:

Do you believe that our, and especially young people’s, relationship to digital technology (=smartphones, screens, the internet, streaming, social media) is healthy, functional, and therefore good as is? Or unhealthy, dysfunctional, and therefore in need of immediate and drastic help?

If your answer is “healthy, functional, and good as is,” then worry yourself no more; the status quo is A-OK. If you answered otherwise, read on.

Now ask yourself this question:

Do the practices, policies, norms, and official statements of my institution—whether a family, a business, a university, or a church—(a) contribute to the technological problem, (b) maintain the digital status quo, or (c) interrupt, subvert, and cut against the dysfunctional relationship of the members of my institution to their devices and screens?

If your answer is (a) or (b) and yet you answered earlier that you believe our relationship to digital technology is in serious need of help, then you’ve got a problem on your hands. If your answer is (c), then well done.

Finally, ask yourself this:

How does my own life—the whole suite of my daily habits when no one’s looking, or rather, when everyone is looking (my spouse, my roommate, my children, my coworkers, my neighbors, my pastors, and so on)—reflect, model, and/or communicate my most basic beliefs about the digital status quo? Does the way I live show others that (a) I am aware of the problem, (b) chiefly within myself, and (c) am tirelessly laboring to respond to it, to amend my ways and solve the problem? Or does it evince the very opposite, so that my life and my words are unaligned, even contradictory?

At both the institutional and the personal level, it seems to me that answering these questions honestly and following them to their logical conclusions—not just in our minds or with our words but in concrete actions—would clarify much about the nature of our duties, demands, and decisions in this area of life.
