Resident Theologian
About the Blog
A.I., TikTok, and saying “I would prefer not to”
Finding wisdom in Bartleby for a tech-addled age.
Two technology pieces from last week have stuck with me.
Both were at The New York Times. The first was titled “How TikTok Changed America,” a sort of image/video essay about the platform’s popularity and influence in the U.S. The second was a podcast with Ezra Klein called “How Should I Be Using A.I. Right Now?,” an interview with Ethan Mollick.
To be clear, I skimmed the first and did not listen to the second; I only read Klein’s framing description for the pod (my emphases):
There’s something of a paradox that has defined my experience with artificial intelligence in this particular moment. It’s clear we’re witnessing the advent of a wildly powerful technology, one that could transform the economy and the way we think about art and creativity and the value of human work itself. At the same time, I can’t for the life of me figure out how to use it in my own day-to-day job.
So I wanted to understand what I’m missing and get some tips for how I could incorporate A.I. better into my life right now. And Ethan Mollick is the perfect guide…
This conversation covers the basics, including which chatbot to choose and techniques for how to get the most useful results. But the conversation goes far beyond that, too — to some of the strange, delightful and slightly unnerving ways that A.I. responds to us, and how you’ll get more out of any chatbot if you think of it as a relationship rather than a tool.
These two pieces brought to mind two things I’ve written recently about social media and digital technology more broadly. The first comes from my New Atlantis essay, published two years ago, reviewing Andy Crouch’s book The Life We’re Looking For (my emphases again):
What we need is a recommitment to public argument about purpose, both ours and that of our tools. What we need, further, is a recoupling of our beliefs about the one to our beliefs about the other. What we need, finally, is the resolve to make hard decisions about our technologies. If an invention does not serve the human good, then we should neither sell it nor use it, and we should make a public case against it. If we can’t do that — if we lack the will or fortitude to say, with Bartleby, We would prefer not to — then it is clear that we are no longer makers or users. We are being used and remade.
The other comes late in my Commonweal review, published last summer, of Tara Isabella Burton’s book Self Made:
It may feel to some of us that “everyone,” for example, is on Instagram. Only about 15 percent of the world is on the platform, however. That’s a lot of people. Yet the truth is that most of the world is not on it. The same goes for other social media. Influencer culture may be ubiquitous in the sense that most people between the ages of fifteen and thirty-five are affected by it in some way. But that’s a far cry from digitally mediated self-creation being a universal mandate.
Even for those of us on these apps, moreover, it’s possible to opt out. You don’t have to sell yourself on the internet. You really don’t. I would have liked Burton to show us why the dismal story she tells isn’t deterministic—why, for example, not every young woman is fated to sell her image on OnlyFans sooner or later.
The two relevant phrases from these essay reviews: You really don’t and Bartleby’s I would prefer not to. They are quite simply all you need in your toolkit for responding to new technologies like TikTok and generative A.I.
For example, the TikTok piece states that half of Americans are on the app. That’s a lot! Plenty to justify the NYT treatment. I don’t deny it. But do you know what that claim also means? That half of us aren’t on it. Fifty percent. One out of every two souls. Which is the more relevant statistic, then? Can I get a follow-up NYT essay about the half of us who not only aren’t tempted to download TikTok but actively reject it, can’t stand it, renounce it and all its pomp?
The piece goes further: “Even if you’ve never opened the app, you’ve lived in a culture that exists downstream of what happens there.” Again, I don’t deny it or doubt it. It’s true, to my chagrin. And yet, the power of such a claim is not quite what it seems on first glance.
The downstream influence of TikTok works primarily if and as one is also or instead an active user of other social media platforms (as well as, perhaps, cable news programs focused on politics and entertainment). I’m told you can’t get on YouTube or Instagram or Twitter or Facebook without encountering “imported” content from TikTok, or “local” content that’s just Meta or Google cribbing from TikTok. But what if, like me, you don’t have an account on any of these platforms? What if you abstain completely from all social media? And what if you don’t watch Fox News or MSNBC or CNN or entertainment shows or reality TV?
I was prepared, reading the NYT piece, to discover all the ways TikTok had invaded my life without my even realizing it. It turns out, though, that I don’t get my news from TikTok, or my movie recommendations, or my cooking recipes, or my fashion advice(!), or my politics, or my Swiftie hits, or my mental health self-diagnoses, or my water bottle, or my nightly entertainment before bed—or anything else. Nothing. Nada. Apparently I have been immune to the fifteen “hottest trends” on TikTok, the way it invaded “all of our lives.”
How? Not because I made it a daily goal to avoid TikTok. Not because I’m a digital ascetic living on a compound free of wireless internet, smart phones, streaming TV, and (most important) Gen Z kiddos. No, it’s because, and more or less only because, I’m not on social media. Turns out it isn’t hard to get away from this stuff. You just don’t download it. You just don’t create an account. If you don’t, you can live as if it doesn’t exist, because for all intents and purposes, for your actual life, it doesn’t.
As I said: You really don’t have to, because you can just say I would prefer not to. All told, that’s enough. It’s adequate all on its own. No one is forcing you to do anything.
Which brings us to Ezra Klein.
Sometimes Klein seems like he genuinely “gets” the scale of the threat, the nature of the digital monstrosity, the power of these devices to shape and rewire our brains and habits and hearts. Yet other times he sounds like just another tech bro who wants to maximize his digital efficiencies, to get ahead of the masses, to get a silicon leg up on the competition, to be as early an adopter as possible. I honestly don’t get it. Does he really believe the hype? Or does he not? At least someone like Tyler Cowen picks a lane. Come join the alarmist train, Ezra! There’s plenty of room! All aboard!
Seriously though, I’m trying to understand the mindset of a person who asks aloud with complete sincerity, “How should I incorporate A.I. into my life ‘better’?” It’s the “should” that gets me. Somehow this is simultaneously a social obligation and a moral duty. Whence the ought? Can someone draw a line for me from this particular “is” to Klein’s technological ought?
In any case, the question presumes at least two things. First, that prior to A.I. my life was somehow lacking. Second, that just because A.I. exists, I need to “find a place for it” in my daily habits.
But why? Why would we ever grant either of these premises?
My life wasn’t lacking anything before ChatGPT made its big splash. I wasn’t feeling an absence that Sam Altman could step in to fill. There is no Google-shaped hole in my heart. As a matter of fact, my life is already full enough: both in the happy sense that I have a fulfilling life and in the stressful sense that I have too much going on in my life. As John Mark Comer has rightly pointed out, the only way to have more of the former is through having less of the latter. Have more by having less; increase happiness by jettisoning junk, filler, hurry, hoarding, much-ness.
Am I really supposed to believe that A.I.—not to mention an A.I. duplicate of myself in order (hold gag reflex) to know myself more deeply (I said hold it!) in ways I couldn’t before—is not just one more damn thing to add to my already too-full life? That it holds the secrets of self-knowledge, maximal efficiency, workflow, work–life balance, relational intimacy, personal creativity, and labor productivity? Like, I’m supposed to type these words one after another and not snort-laugh with derision but instead take them seriously, very seriously, pondering how my life was falling short until literally moments ago, when A.I. entered my life?
It goes without saying that, just because the technology exists, I don’t “need” to adopt or incorporate it into my life. There is no technological imperative, and if there were it wouldn’t be categorical. The mere existence of technology is neither self-justifying nor self-recommending. And must I add that the endless hours of time, energy, and attention devoted to learning this latest invention—hours stolen from other, infinitely more meaningful pursuits—will almost immediately be rendered redundant, since the invention itself is nowhere near completion? Even if A.I. were going to improve daily individual human flourishing by a hundredfold, the best thing to do, right now, would be absolutely nothing. Give it another year or ten or fifty and they’ll iron out the kinks, I’m sure of it.
What this way of approaching A.I. has brought home to me is the unalterably religious dimension of technological innovation, and this in two respects. On one side, tech adepts and true believers approach innovation not only as one more glorious step in the march of progress but also as a kind of transcendent or spiritual moment in human growth. Hence the imperative. How should I incorporate this newfangled thing into my already tech-addled life? becomes not just a meaningful question but an urgent, obvious, and existential one.
On the other side, those of us who are members of actual religious traditions approach new technology with, at a minimum, an essentially skeptical eye. More to the point, we do not approach it expecting it to do anything for our actual well-being, in the sense of deep happiness or lasting satisfaction or final fulfillment or ultimate salvation. Technology can and does contribute to human flourishing but only in its earthly, temporal, or penultimate aspects. It has nothing to do with, cannot touch, never can and never will intersect with eternity, with the soul, with the Source and End of all things. Technology is not, in short, a means of communion with God. And for those of us (not all religious people, but many) who believe that God has himself already reached out to us, extending the promise and perhaps a partial taste of final beatitude, then it would never occur to us—it would present as laughably naive, foolish, silly, self-deceived, idolatrous—to suppose that some brand new man-made tool might fix what ails us; might right our wrongs; might make us happy, once and for all.
It’s this that’s at issue in the technological “ought”: the “religion of technology.” It’s why I can’t make heads or tails of stories or interviews like the ones I cited above. We belong to different religions. It may be that there are critical questions one can ask about mine. But at least I admit to belonging to one. And, if I’m being honest, mine has a defensible morality and metaphysics. If I weren’t a Christian, I’d rather be just about anything than a true-believing techno-optimist. Of all religions on offer today, it is surely the most self-evidently false.
Church on Christmas Day
A response to the responses to Ruth Graham’s piece in the New York Times on American evangelicals staying home on Christmas Day. A sympathetic defense of the normies, in other words.
Allow me to stick up for the normies.
All my people—church folk, academics, readerly believers—are worked up about (the always excellent) Ruth Graham’s New York Times piece from two days ago about Christmas Day falling on a Sunday. And rightly so: it should be a no-brainer for Christians that Christmas Day is a day to go to church. A day to worship God. A day to gather with sisters and brothers in Christ to worship the child Christ. A day about him, not a day about us. The reason for the season, you might say.
No one is wrong about this. But there’s a touch of mercilessness in the proceedings, underwritten by a lack of context, which both ups the ante and elides understanding. So let me give a cheer and a half for staying home on Christmas, or at least for grasping, in something other than shock and disbelief, why it is so many devout believers do so.
First, we aren’t talking about catholic Christians. We’re talking about Protestants.
Second, we aren’t talking about Protestants in general. We’re talking about low-church, evangelical, biblicist, frontier-revival, and/or non-denominational American Protestants.
Third, for two centuries or more this specific subgroup of American Christians—I like to call them “baptists,” the lower-case “b” coming from James McClendon—has studiously avoided any and all connection to sacred tradition, particularly the liturgical calendar. In the Stone-Campbell movement, for example, most churches would avoid mentioning even the existence of Christmas or Easter, especially when either fell on a Sunday (as Easter, of course, always does). In other words, what you’ve got with American baptists is a wholesale lack of a Christian calendar governing, guiding, or forming their theological, liturgical, and festal imaginations—much less their family practices.
Fourth, and simultaneously, the practice of Christmas as a cultural event has been wholly subsumed by the wider society. Advent simply doesn’t exist; Christmas—all six to eight weeks of it—does. Asking baptist Christians to go to church on Christmas Day strikes many of them the way asking them to go to church on Thanksgiving would. Is this historically parochial? Yes. Is it liturgically lamentable? Yes. Is it a sad reflection of the total secularization of Christmas as a national holiday? Yes. Should this occasion anger and bewilderment at the millions of laypeople who have been successfully formed by both their churches and their culture to understand and celebrate Christmas in just this way? I don’t see why. The problem is the catechesis, not the catechumens. To overstate the matter, it’s an odd instance of blaming the victim, seasoned with overripe overreaction.
Fifth and finally, as a mitigating factor, most baptist churches of which I am aware have, over the last few decades, added or expanded a major liturgical celebration of Christ’s birth, in imitation of their more liturgically catholic neighbors: a Christmas Eve service. This has come to function, albeit accidentally I’m sure, as something akin to an Easter Vigil. It isn’t the prelude to the feast on the following day. It is the feast, or rather its beginning. Just as the sun sets, God’s people gather in darkness and candlelight to mark the moment when God came to earth in a manger. They sing and pray and celebrate and remember—prior to opening gifts or doing Santa. Only once this is complete do they disperse to their homes to begin the festivities, which continue into the following morning. And since it’s only every so often that Christmas Day is on a Sunday, it’s an odd and somewhat confusing eventuality when it does. Like Catholics who opt for Saturday 5:00pm mass, these baptists intuitively sense that they have already paid homage to the child Christ the night before. Christmas Day, even on a Sunday, becomes a kind of family Sabbath. Not necessarily (though granted, surely often in fact) to Mammon and his pomp, but to the gift of multiple generations of family, grandparents and grandchildren in the same home, gratitude and feasting and toys and surprise gifts and laughter and exhaustion all giving glory to God in the domestic church of one’s household, free from work and duty and consumption and travel and the rest. You don’t do anything on such a Sabbath. You don’t go anywhere. Even, as it happens, to church.
What I’m saying is: I get it. It isn’t hard at all to understand why this default setting makes all the sense in the world to normie, mainstream, low-church American Christians. I’m not angry about it. I’m not sure you should be either. I even see the Christmas Eve service as a step in the right liturgical direction. Would I prefer for baptists, like catholics, to grasp in their bones that Christmas Day is a day for church, for worshiping Christ as Christ’s body, gathered in a single place? Yes, I would. Is it reasonable to expect that to be true at this moment, given our history and where the church is today? No, I don’t think so. To get from here to there, it seems to me, is the work of generations, a decades- or even centuries-long process of cultural, familial, liturgical, and ecclesial change. I’d like to see that change happen. I think it’s happening even now. But no shade on my neighbors.
Like I say, I get it.
NYT, guilt by association, and libraries
It's a relief to see so many thoughtful—albeit blistering—responses to the long-awaited NYT hit piece on Scott Alexander and his erstwhile blog. It means that I'm not crazy for having the reaction I did when I read it, and that I don't need to write much to draw attention to the article's numerous flaws: I need only point you to all the existing responses that already do the job. And that's only a few of them, not counting the Twitter threads and dunks. I must say, the immediately striking thing about the piece is how boring and boringly written it is, in such a bone-deep passive-aggressive voice. Why all the fuss (internally, that is, at the NYT) for that?
But: one additional thought. It has to be strange, at the experiential (nay, existential) level, for a writer truly to think that he can damn another writer through nothing but guilt by association. And not just association in general, but association understood, first, as proximate contact (i.e., having read and engaged a "dangerous" or non-mainstream or legitimately Bad author); and, second, as having potentially contributed to the potential acceptance of said unsavory character's unsavory ideas (i.e., not having controlled in advance the judgments one's own readers might make in reading one's engagement of another's writings).
The entire hit piece is structured this way. "Alexander might be an okay dude, but it's possible that his blog might have led some readers to Wrong Thoughts." What I cannot for the life of me understand is what it means for a fellow writer to think this, to have written this way. Does he really suppose the way authors and their works ought to be judged is by the sheer possibility that some readers might draw undesirable conclusions? or misunderstand the authors' views? or go beyond the authors, or follow their evidence or arguments to different ends than the authors?
This view is patently preposterous. It's self-defeating for any writer to hold. And it's even worse as a picture of reading and learning. Imagine it applied to libraries:
- Libraries invariably contain books of all kinds.
- Some books invariably contain Bad Ideas.
- Moreover, All Kinds Of People read books.
- Therefore, some people will have Bad Ideas as a result of reading certain books contained in libraries.
- Therefore, libraries are Of Questionable Quality.
- Therefore, ban all libraries.
There is a real hatred afoot today: a hatred for learning, for thinking, for reading and giving space to ideas and authors outside of a narrow mainstream—whether or not that mainstream is a mushy moderate middle or something else entirely—and this hatred not only animates the hit piece on Alexander, it is a sort of electrical current pulsing through our culture today. Resist it, y'all. Resist it in any way you can.