Resident Theologian

Brad East

More screens, more distractions; fewer screens, fewer distractions

A vision for the design of our shared spaces, especially public worship.

It’s a simple rule, but I repeat it here because it is difficult to internalize and even more difficult to put into practice, whatever one’s context:

In any given physical space, the more screens that are present, the more distractions there will be for people inhabiting that space; whereas the fewer screens, the fewer distractions.

So far as I can tell, this principle is always and everywhere true, including in places where screens are the point, like a sports bar. No one would study for the LSAT in a sports bar: it’s too distracting, too noisy, too busy. It’s built to over-stimulate. Indeed, a football fan who cared about only one game featuring one team would not spend his Sunday afternoon in a sports bar with a dozen games on simultaneously, because it would prove too difficult to focus on the one thing of interest to him.

Now consider other social spaces: a coffee shop, a classroom, a living room, a sanctuary, a monastery. How are these spaces usually filled? Given their ends, how should they be filled?

The latter question answers itself. This is why, for example, I do not permit use of screens when I teach in a college classroom. Phones, tablets, and laptops are in bags or pockets. In the past I have used a single projector screen for slides, especially for larger survey/lecture courses, but for the most part, even with class sizes of 40 or 50 or 60, I don’t use a screen at all, just markers and a whiteboard. Unquestionably the presence of personal screens open on desks is a massive distraction not only to their owners but to anyone around them. And because distractions are obstacles to learning, I eliminate the distractions.

The same goes for our homes and our churches.

At the outer limit, our homes would lack screens altogether. I know there are folks who do this, but it’s a rare exception to the rule. (Actually, I’m not sure if I have ever personally known someone whose home is 100% devoid of any screen of any kind.) So assuming there will be screens of some kind, how should they be arranged in a home?

  1. There should be numerous spaces that lack a permanent screen.

  2. There should be numerous spaces in which, by rule or norm, portable screens are unwelcome.

  3. There should be focal spaces organized around some object (fireplace, kitchen island, couch and coffee table) or activity (cooking, reading, playing piano) that are ordinarily or always screen-free.

  4. What screens there are should require some friction to use, i.e., a conscious and active rather than passive decision to turn them on or engage with them.

  5. Fewer screens overall and fewer screens in any given space will conduce to fewer distractions, on one hand, and greater likelihood of shared or common screen usage, on the other. (I.e., watching a movie together as a family rather than adults and children on separate devices doing their own thing.)

There is more to say, but for those interested I’m mostly just repackaging the advice of Andy Crouch and Albert Borgmann. Now to church.

There are a few ways that screens can invade the space of public worship:

  1. Large screens “up front” that display words, images, videos, or live recording of whatever is happening “on stage” (=pastor, sermon, communion, music).

  2. Small screens, whether tablets or smartphones, out and visible and in active usage by ministers and others leading the congregation in worship.

  3. Small screens, typically smartphones, in the pockets and laps of folks in the pews.

Let me put it bluntly: It’s often said that Sunday morning is the most segregated hour in America. In a different vein, it’s equally true that Sunday morning may now be the most distracted hour in America.

Why? Because screens are everywhere! Not, to be sure, in every church. The higher liturgical traditions have preserved a liturgical celebration often, though not always, free of screen colonization. Yet even there parishioners still by and large bring their screens in with them.

Certainly for low-church forms of worship, screens are everywhere. And the more screens, the more distractions. Which means that, for many churches, distraction appears to be part of the point. Those attending are meant, in a twist on T. S. Eliot’s phrase, to be distracted from distraction by distraction—that is, to be distracted from bad distraction (fantasy football, Instagram, online shopping) by good distraction (cranked-up CCM, high production videos, Bible apps). It is unthinkable, on this view, to imagine worshiping on a Sunday morning in a screen-free environment. Yet a screen-free space would be a distraction-free space, one designed precisely to free the attention—the literal eyeballs—of those gathered to focus on the one thing they came for: God.

I hope to write a full essay on this soon for Christianity Today, laying out a practical vision for screen-free worship. For now I just want to propose it as an ideal we should all agree on. Ministers should not use phones while leading worship nor should they invite parishioners to open the Bible “on their apps.” Do you know what said parishioners will do when so invited? They may or may not open their Bible app. They will absolutely find their eyes diverted to a text message, an email, or a social media update. And at once you will have lost them—either for a few minutes or for good.

The best possible thing for public Christian worship in twenty-first century America would be the banishment of all screens from the sanctuary. Practically speaking, it would look like leaders modeling and then inviting those who attend to leave their phones at home, in their cars, or in cell phone lockers (the way K–12 schools are increasingly doing).

I’m well aware that this couldn’t happen overnight, and that there are reasonable exceptions for certain people to have a phone on them (doctors on call, police officers, parents of children with special needs). But hard cases make bad law. The normative vision should be clear and universally shared. The liturgy is a place for ordering our attention, the eyes of the heart, on what we cannot see but nevertheless gain a glimpse of when we hear the word of the Lord and see and smell and taste the signs of bread and wine on the Lord’s table. We therefore should not intentionally encourage the proliferation of distractions in this setting nor stand by and watch it happen, as if the design of public space were out of our hands.

More screens, more distractions; fewer screens, fewer distractions: the saying is sure. Let’s put it into practice.

Brad East

A.I., TikTok, and saying “I would prefer not to”

Finding wisdom in Bartleby for a tech-addled age.

Two technology pieces from last week have stuck with me.

Both were at The New York Times. The first was titled “How TikTok Changed America,” a sort of image/video essay about the platform’s popularity and influence in the U.S. The second was a podcast with Ezra Klein called “How Should I Be Using A.I. Right Now?,” an interview with Ethan Mollick.

To be clear, I skimmed the first and did not listen to the second; I only read Klein’s framing description for the pod (my emphases):

There’s something of a paradox that has defined my experience with artificial intelligence in this particular moment. It’s clear we’re witnessing the advent of a wildly powerful technology, one that could transform the economy and the way we think about art and creativity and the value of human work itself. At the same time, I can’t for the life of me figure out how to use it in my own day-to-day job.

So I wanted to understand what I’m missing and get some tips for how I could incorporate A.I. better into my life right now. And Ethan Mollick is the perfect guide…

This conversation covers the basics, including which chatbot to choose and techniques for how to get the most useful results. But the conversation goes far beyond that, too — to some of the strange, delightful and slightly unnerving ways that A.I. responds to us, and how you’ll get more out of any chatbot if you think of it as a relationship rather than a tool.

These two pieces brought to mind two things I’ve written recently about social media and digital technology more broadly. The first comes from my New Atlantis essay, published two years ago, reviewing Andy Crouch’s book The Life We’re Looking For (my emphases again):

What we need is a recommitment to public argument about purpose, both ours and that of our tools. What we need, further, is a recoupling of our beliefs about the one to our beliefs about the other. What we need, finally, is the resolve to make hard decisions about our technologies. If an invention does not serve the human good, then we should neither sell it nor use it, and we should make a public case against it. If we can’t do that — if we lack the will or fortitude to say, with Bartleby, We would prefer not to — then it is clear that we are no longer makers or users. We are being used and remade.

The other comes late in my Commonweal review, published last summer, of Tara Isabella Burton’s book Self Made:

It may feel to some of us that “everyone,” for example, is on Instagram. Only about 15 percent of the world is on the platform, however. That’s a lot of people. Yet the truth is that most of the world is not on it. The same goes for other social media. Influencer culture may be ubiquitous in the sense that most people between the ages of fifteen and thirty-five are affected by it in some way. But that’s a far cry from digitally mediated self-creation being a universal mandate.

Even for those of us on these apps, moreover, it’s possible to opt out. You don’t have to sell yourself on the internet. You really don’t. I would have liked Burton to show us why the dismal story she tells isn’t deterministic—why, for example, not every young woman is fated to sell her image on OnlyFans sooner or later.

The two relevant phrases from these essay reviews: You really don’t and Bartleby’s I would prefer not to. They are quite simply all you need in your toolkit for responding to new technologies like TikTok and generative A.I.

For example, the TikTok piece states that half of Americans are on the app. That’s a lot! Plenty to justify the NYT treatment. I don’t deny it. But do you know what that claim also means? That half of us aren’t on it. Fifty percent. One out of every two souls. Which is the more relevant statistic, then? Can I get a follow-up NYT essay about the half of us who not only aren’t tempted to download TikTok but actively reject it, can’t stand it, renounce it and all its pomp?

The piece goes further: “Even if you’ve never opened the app, you’ve lived in a culture that exists downstream of what happens there.” Again, I don’t deny it or doubt it. It’s true, to my chagrin. And yet, the power of such a claim is not quite what it seems at first glance.

The downstream influence of TikTok works primarily if and as one is also or instead an active user of other social media platforms (as well as, perhaps, cable news programs focused on politics and entertainment). I’m told you can’t get on YouTube or Instagram or Twitter or Facebook without encountering “imported” content from TikTok, or “local” content that’s just Meta or Google cribbing on TikTok. But what if, like me, you don’t have an account on any of these platforms? What if you abstain completely from all social media? And what if you don’t watch Fox News or MSNBC or CNN or entertainment shows or reality TV?

I was prepared, reading the NYT piece, to discover all the ways TikTok had invaded my life without my even realizing it. It turns out, though, that I don’t get my news from TikTok, or my movie recommendations, or my cooking recipes, or my fashion advice(!), or my politics, or my Swiftie hits, or my mental health self-diagnoses, or my water bottle, or my nightly entertainment before bed—or anything else. Nothing. Nada. Apparently I have been immune to the fifteen “hottest trends” on TikTok, the way it invaded “all of our lives.”

How? Not because I made it a daily goal to avoid TikTok. Not because I’m a digital ascetic living on a compound free of wireless internet, smart phones, streaming TV, and (most important) Gen Z kiddos. No, it’s because, and more or less only because, I’m not on social media. Turns out it isn’t hard to get away from this stuff. You just don’t download it. You just don’t create an account. If you don’t, you can live as if it doesn’t exist, because for all intents and purposes, for your actual life, it doesn’t.

As I said: You really don’t have to, because you can just say I would prefer not to. All told, that’s enough. It’s adequate all on its own. No one is forcing you to do anything.

Which brings us to Ezra Klein.

Sometimes Klein seems like he genuinely “gets” the scale of the threat, the nature of the digital monstrosity, the power of these devices to shape and rewire our brains and habits and hearts. Yet other times he sounds like just another tech bro who wants to maximize his digital efficiencies, to get ahead of the masses, to get a silicon leg up on the competition, to be as early an adopter as possible. I honestly don’t get it. Does he really believe the hype? Or does he not? At least someone like Tyler Cowen picks a lane. Come join the alarmist train, Ezra! There’s plenty of room! All aboard!

Seriously though, I’m trying to understand the mindset of a person who asks aloud with complete sincerity, “How should I incorporate A.I. into my life ‘better’?” It’s the “should” that gets me. Somehow this is simultaneously a social obligation and a moral duty. Whence the ought? Can someone draw a line for me from this particular “is” to Klein’s technological ought?

In any case, the question presumes at least two things. First, that prior to A.I. my life was somehow lacking. Second, that just because A.I. exists, I need to “find a place for it” in my daily habits.

But why? Why would we ever grant either of these premises?

My life wasn’t lacking anything before ChatGPT made its big splash. I wasn’t feeling an absence that Sam Altman could step in to fill. There is no Google-shaped hole in my heart. As a matter of fact, my life is already full enough: both in the happy sense that I have a fulfilling life and in the stressful sense that I have too much going on in my life. As John Mark Comer has rightly pointed out, the only way to have more of the former is through having less of the latter. Have more by having less; increase happiness by jettisoning junk, filler, hurry, hoarding, much-ness.

Am I really supposed to believe that A.I.—not to mention an A.I. duplicate of myself in order (hold gag reflex) to know myself more deeply (I said hold it!) in ways I couldn’t before—is not just one more damn thing to add to my already too-full life? That it holds the secrets of self-knowledge, maximal efficiency, work flow, work–life balance, relational intimacy, personal creativity, and labor productivity? Like, I’m supposed to type these words one after another and not snort laugh with derision but instead take them seriously, very seriously, pondering how my life was falling short until literally moments ago, when A.I. entered my life?

It goes without saying that, just because the technology exists, I don’t “need” to adopt or incorporate it into my life. There is no technological imperative, and if there were it wouldn’t be categorical. The mere existence of technology is neither self-justifying nor self-recommending. And must I add that whatever skill is gained by devoting endless hours of time, energy, and attention to learning this latest invention, besides stealing those hours from other, infinitely more meaningful pursuits, will undeniably be superseded and almost immediately made redundant, given that this invention is nowhere near completion? Even if A.I. were going to improve daily individual human flourishing by a hundredfold, the best thing to do, right now, would be absolutely nothing. Give it another year or ten or fifty and they’ll iron out the kinks, I’m sure of it.

What this way of approaching A.I. has brought home to me is the unalterably religious dimension of technological innovation, and this in two respects. On one side, tech adepts and true believers approach innovation not only as one more glorious step in the march of progress but also as a kind of transcendent or spiritual moment in human growth. Hence the imperative. How should I incorporate this newfangled thing into my already tech-addled life? becomes not just a meaningful question but an urgent, obvious, and existential one.

On the other side, those of us who are members of actual religious traditions approach new technology with, at a minimum, an essentially skeptical eye. More to the point, we do not approach it expecting it to do anything for our actual well-being, in the sense of deep happiness or lasting satisfaction or final fulfillment or ultimate salvation. Technology can and does contribute to human flourishing but only in its earthly, temporal, or penultimate aspects. It has nothing to do with, cannot touch, never can and never will intersect with eternity, with the soul, with the Source and End of all things. Technology is not, in short, a means of communion with God. And for those of us (not all religious people, but many) who believe that God has himself already reached out to us, extending the promise and perhaps a partial taste of final beatitude, then it would never occur to us—it would present as laughably naive, foolish, silly, self-deceived, idolatrous—to suppose that some brand new man-made tool might fix what ails us; might right our wrongs; might make us happy, once and for all.

It’s this that’s at issue in the technological “ought”: the “religion of technology.” It’s why I can’t make heads or tails of stories or interviews like the ones I cited above. We belong to different religions. It may be that there are critical questions one can ask about mine. But at least I admit to belonging to one. And, if I’m being honest, mine has a defensible morality and metaphysics. If I weren’t a Christian, I’d rather be just about anything than a true believing techno-optimist. Of all religions on offer today, it is surely the most self-evidently false.

Brad East

Screentopia

A rant about the concern trolls who think the rest of us are too alarmist about children, screens, social media, and smartphones.

I’m grateful to Alan for writing this post so I didn’t have to. A few additional thoughts, though. (And by “a few thoughts” I mean rant imminent.)

Let me begin by giving a term to describe, not just smartphones or social media, but the entire ecosystem of the internet, ubiquitous screens, smartphones, and social media. We could call it Technopoly or the Matrix or just Digital. I’ll call it Screentopia. A place-that-is-no-place in which just about everything in our lives—friendship, education, finance, sex, news, entertainment, work, communication, worship—is mediated by omnipresent interlinked personal and public devices as well as screens of every size and type, through which we access the “all” of the aforementioned aspects of our common life.

Screentopia is an ecosystem, a habitat, an environment; it’s not one thing, and it didn’t arrive fully formed at a single point in time. It achieved a kind of comprehensive reach and maturity sometime in the last dozen years.

Like Alan, I’m utterly mystified by people who aren’t worried about this new social reality. Or who need the rest of us to calm down. Or who think the kids are all right. Or who think the kids aren’t all right, but nevertheless insist that the kids’ dis-ease has little to nothing to do with being born and raised in Screentopia. Or who must needs concern-troll those of us who are alarmed for being too alarmed; for ascribing monocausal agency to screens and smartphones when what we’re dealing with is complex, multicausal, inscrutable, and therefore impossible to fix. (The speed with which the writer adverts to “can’t roll back the clock” or “the toothpaste ain’t going back in the tube” is inversely proportional to how seriously you have to take him.)

After all, our concern troll asks insouciantly, aren’t we—shouldn’t we be—worried about other things, too? About low birth rates? And low marriage rates? And kids not playing outside? And kids presided over by low-flying helicopter parents? And kids not reading? And kids not dating or driving or experimenting with risky behaviors? And kids so sunk in lethargy that they can’t be bothered to do anything for themselves?

Well—yes! We should be worried about all that; we are worried about it. These aren’t independent phenomena about which we must parcel out percentages of our worry. It’s all interrelated! Nor is anyone—not one person—claiming a totality of causal explanatory power for the invention of the iPhone followed immediately by mass immiseration. Nor still is anyone denying that parents and teachers and schools and churches are the problem here. It’s not a “gotcha” to counter that kids don’t have an issue with phones, parents do. Yes! Duh! Exactly! We all do! Bonnie Kristian is absolutely right: parents want their elementary and middle school–aged kids to have smartphones; it’s them you have to convince, not the kids. We are the problem. We have to change. That’s literally what Haidt et al are saying. No one’s “blaming the kids.” We’re blaming what should have been the adults in the room—whether the board room, the PTA meeting, the faculty lounge, or the household. Having made a mistake in imposing this dystopia of screens on an unsuspecting generation, we would like, kindly and thank you please, to fix the problem we ourselves made (or, at least, woke up to, some of us, having not been given a vote at the time).

Here’s what I want to ask the tech concern trolls.

How many hours per day of private scrolling on a small glowing rectangle would concern you? How many hours per day indoors? How many hours per day on social media? How many hours per day on video games? How many pills to get to sleep? How many hours per night not sleeping? How many books per year not read? How many friends not made, how many driver’s licenses not acquired, how many dates and hangouts not held in person would finally raise a red flag?

Christopher Hitchens once wrote, “The North Korean state was born at about the same time that Nineteen Eighty-Four was published, and one could almost believe that the holy father of the state, Kim Il Sung, was given a copy of the novel and asked if he could make it work in practice.” A friend of mine says the same about our society and Brave New World. I expect people have read their Orwell. Have they read their Huxley, too? (And their Bradbury? And Walter M. Miller Jr.? And…?) Drugs and mindless entertainment to numb the emotions, babies engineered and produced in factories, sex and procreation absolutely severed, male and female locked in perpetual sedated combat, books either censored or an anachronistic bore, screens on every wall of one’s home featuring a kind of continuous interactive reality TV (as if Real Housewives, TikTok, and Zoom were combined into a single VR platform)—it’s all there. Is that the society we want? On purpose? It seems we’re bound for it like our lives depended on it. Indeed, we’re partway there already. “Alarmists” and “Luddites” are merely the ones who see the cliff’s edge ahead and are frantically pointing at it, trying to catch everyone’s attention.

But apparently everyone else is having too much fun. Who invited these killjoys along anyway?

Brad East

All together now: social media is bad for reading

A brief screed about what we all know to be true: social media is bad for reading.

We don’t have to mince words. We don’t have to pretend. We don’t have to qualify our claims. We don’t have to worry about insulting the youths. We don’t have to keep mum until the latest data comes in.

Social media, in all its forms, is bad for reading.

It’s bad for reading habits, meaning when you’re on social media you’re not reading a book. It’s bad for reading attention, meaning it shrinks your ability to focus for sustained periods of time while reading. It’s bad for reading desires, meaning it makes the idea of sitting down with a book, away from screens and images and videos and sounds, seem dreadfully boring. It’s bad for reading style, meaning what literacy you retain while living on social media is trained to like all the wrong things and to seek more of the same. It’s bad for reading ends, meaning you’re less likely to read for pleasure and more likely to read for strictly utilitarian reasons (including, for example, promotional deals and influencer prizes and so on). It’s bad for reading reinforcement, meaning like begets like, and inserting social media into the feedback loop of reading means ever more of the former and ever less of the latter. It’s bad for reading learning, meaning your inability to focus on dense, lengthy reading is an educational handicap: you quite literally will know less as a result. It’s bad for reading horizons, meaning the scope of what you do read, if you read at all, will not stretch across continents, cultures, and centuries but will be limited to the here and now, (at most) the latest faux highbrow novel or self-help bilge promoted by the newest hip influencers; social media–inflected “reading” is definitionally myopic: anti-“diverse” on principle. Finally, social media is bad for reading imitation, meaning it is bad for writing, because reading good writing is the only sure path to learning to write well oneself. Every single writing tic learned from social media is bad, and you can spot all of them a mile away.

None of this is new. None of it is groundbreaking. None of it is rocket science. We all know it. Educators do. Academics do. Parents do. As do members of Gen Z. My students don’t defend themselves to me; they don’t stick up for digital nativity and the wisdom and character produced by TikTok or Instagram over reading books. I’ve had students who tell me, approaching graduation, that they have never read a single book for pleasure in their lives. Others have confessed that they found a way to avoid reading a book cover to cover entirely, even as they got B’s in high school and college. They’re not proud of this. Neither are they embarrassed. It just is what it is.

Those of us who see this and are concerned by it do not have to apologize for it. We don’t have to worry about being, or being accused of being, Luddites. We’re not making this up. We’re not shaking our canes at the kids on the lawn. We’re not ageist or classist or generation-ist or any other nonsensical application of actual prejudices.

The problem is real. It’s not the only one, but it’s pressing. Social media is bad in general, it’s certainly bad for young people, and it’s unquestionably, demonstrably, and devastatingly bad for reading.

The question is not whether it’s a problem. The question is what to do about it.

Brad East

A decision tree for dealing with digital tech

Is the digital status quo good? If not, our actions (both personal and institutional) should show it.

Start with this question:

Do you believe that our, and especially young people’s, relationship to digital technology (=smartphones, screens, the internet, streaming, social media) is healthy, functional, and therefore good as is? Or unhealthy, dysfunctional, and therefore in need of immediate and drastic help?

If your answer is “healthy, functional, and good as is,” then worry yourself no more; the status quo is A-OK. If you answered otherwise, read on.

Now ask yourself this question:

Do the practices, policies, norms, and official statements of my institution—whether a family, a business, a university, or a church—(a) contribute to the technological problem, (b) maintain the digital status quo, or (c) interrupt, subvert, and cut against the dysfunctional relationship of the members of my institution to their devices and screens?

If your answer is (a) or (b) and yet you answered earlier that you believe our relationship to digital technology is in serious need of help, then you’ve got a problem on your hands. If your answer is (c), then well done.

Finally, ask yourself this:

How does my own life—the whole suite of my daily habits when no one’s looking, or rather, when everyone is looking (my spouse, my roommate, my children, my coworkers, my neighbors, my pastors, and so on)—reflect, model, and/or communicate my most basic beliefs about the digital status quo? Does the way I live show others that (a) I am aware of the problem (b) chiefly within myself and (c) am tirelessly laboring to respond to it, to amend my ways and solve the problem? Or does it evince the very opposite? So that my life and my words are unaligned and even contradictory?

At both the institutional and the personal level, it seems to me that answering these questions honestly and following them to their logical conclusions—not just in our minds or with our words but in concrete actions—would clarify much about the nature of our duties, demands, and decisions in this area of life.
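Since the conceit here is a decision tree, it may help to see the three questions written out literally as one. Below is a minimal sketch in Python; the function name, answer labels, and verdict strings are my own illustrative glosses on the questions above, not anything the questions themselves prescribe:

```python
# A minimal, purely illustrative sketch of the three questions as a literal
# decision tree. The names and labels are glosses, not prescribed wording.

def digital_tech_decision_tree(
    status_quo_is_healthy: bool,  # Q1: is our relationship to digital tech good as is?
    institution_stance: str,      # Q2: "contributes", "maintains", or "interrupts"
    life_models_beliefs: bool,    # Q3: does my daily life reflect my stated beliefs?
) -> str:
    # Question 1: if the status quo is healthy, worry no more.
    if status_quo_is_healthy:
        return "Worry yourself no more; the status quo is A-OK."

    # Question 2: an institution that contributes to or maintains the
    # status quo, despite the answer to Q1, has a problem on its hands.
    if institution_stance in ("contributes", "maintains"):
        verdict = "You've got a problem on your hands."
    else:
        verdict = "Well done."

    # Question 3: a life unaligned with one's words compounds the verdict.
    if not life_models_beliefs:
        verdict += " And your life and your words are unaligned, even contradictory."
    return verdict

# Example: an alarmed believer in a status-quo institution, not yet living the change.
print(digital_tech_decision_tree(False, "maintains", False))
```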

Brad East

The tech-church show

A reflection on two issues raised by the recent viral clip of a prominent pastor lecturing his listeners not to treat public worship as a “show.”

A week or two ago a clip went viral of a prominent pastor lecturing his listeners, during his sermon, about treating Sunday morning worship like a show. I didn’t watch it, and I’m not going to comment about the pastor in question, whom I know nothing about. Here’s one write-up about it. The clip launched a thousand online Christian thinkpieces. A lot of hand-wringing about churches that put on worship as a show simultaneously wanting congregants not to see worship as a show.

Any reader of my work knows I couldn’t agree more. But I don’t want to pile on. I want to use the occasion to think more deeply about two issues it raises for the larger landscape of churches, public worship, and digital technology.

First: Should churches understand themselves to be sites of resistance against the digital status quo? That is, given their context, are churches in America called by God to be a “force for good” in relation to digital technology? And thus are they called to be a “force opposed” to the dominance of our lives—which means the lives of congregants as well as their nonbelieving neighbors—by digital devices, screens, and social media?

It seems to me that churches and church leaders are not clear about their answer to this question. In practice, their answer appears to be No. The digital status quo obtains outside the walls of the church and inside them. There is no “digital difference” when you walk inside a church—at least a standard, run-of-the-mill low-church, evangelical, or Protestant congregation. (The Orthodox have not yet been colonized by Digital, so far as I can tell. For Catholics it depends on the parish.)

In and of itself, this isn’t a problem, certainly not of consistency. If a church doesn’t think Digital’s dominion is a problem, then it’s only natural for Digital to reign within the church and not only without. You’d never expect such a church to be different on this score.

The problem arises when churches say they want to oppose believers’ digital habits, dysfunctions, and addictions while reproducing those very habits within the life of the church, above all in the liturgy. That’s a case of extreme cognitive dissonance. How could church leaders ever expect ordinary believers to learn from the church how to amend their digital lives when church leaders themselves, and the church’s public worship itself, merely model for believers their own bad habits? When, in other words, church members’ digital lives, disordered as they are, are simply mirrored back to them by the church and her pastors?

To be clear, I know more than a few Christians, including ministers, who don’t share my alarm at the reign of Digital in our common life. They wouldn’t exactly endorse spending four to eight hours (or more) per day staring at screens; they don’t deny the ills and errors of pornography and loss of attention span via social media and other platforms. But they see bigger fish to fry. And besides (as they are wont to say), “It’s here to stay. It’s a part of life. We can live in denial or incorporate its conveniences into church life. It’s inevitable either way.”

Personally, I think that’s a steaming pile of you-know-what. But at least it’s consistent. For anyone, however, who shares my alarm at the role of Digital in our common life—our own, our neighbors’, our children’s, our students’—then the inconsistency of the church on this topic is not only ludicrous but dangerous. It’s actively aiding and abetting the most significant problem facing us today while pretending otherwise. And you can’t have it both ways. Either it’s a problem and you face it head on; or it’s not, and you don’t.

Second: Here’s an exercise that’s useful in the classroom. It helps to get students thinking about the role of technology in the liturgy.

Ask yourself this question: Which forms and types of technology, and how much of them, could I remove from Sunday morning worship before it would become unworkable?

Another way to think about it would be to ask: What makes my church’s liturgy different, technologically speaking, from an instance of the church’s liturgy five hundred years ago?

Certain kinds of technology become evident immediately: electricity and HVAC, for starters. In my area, many church buildings would be impossible to worship in during a west Texas summer: no air and no light. They’d be little more than pitch-black ovens on the inside.

Start on the other end, though. Compare Sunday morning worship in your church today to just a few decades ago. Here are some concrete questions.

  • Could you go (could it “work”) without the use of smartphones?

  • What about video cameras?

  • What about spotlights and/or dimmers?

  • What about the internet?

  • What about screens?

  • What about computers?

  • What about a sound board?

  • What about electric amplification for musical instruments?

  • What about wireless mics?

  • What about microphones as such?

This list isn’t meant to prejudge whether any or all of these are “bad” or to be avoided in the liturgy. I’m happy to worship inside a building (technology) with A/C (technology) and electricity (technology)—not to mention with indoor plumbing available (also technology). Microphones make preaching audible to everyone, including those hard of hearing. And I’ve not even mentioned the most consequential technological invention for the church’s practice of worship: the automobile! Over the last century cars revolutionized the who and where and how and why of church membership and attendance. (In this Luddite’s opinion, clearly for the worse. Come at me.)

In any case, whatever one makes of these and similar developments, the foregoing exercise is meant to force us to reckon with technology’s presence in worship as both contingent and chosen. It is contingent because worship is possible without any/all of them. I’ve worshiped on a Sunday morning beneath a tree in rural east Africa. The people walked to get there. No A/C. No mics. No screens. No internet. Certainly no plumbing. Not that long ago in this very country, most of the technology taken for granted today in many churches did not even exist. So contingency is crucial to recognize here.

And because it is contingent, it is also chosen. No one imposed digital technology, or any other kind, on American churches. Their leaders implemented it. It does not matter whether they understood themselves to be making a decision or exercising authority. They were, whether they knew it or not and whether they liked it or not. It does not matter whether they even had a conversation about it. The choice was theirs, and they made it. The choice remains theirs. What has been done can be undone. No church has to stream, for example. Some never started. Others have stopped. It’s a choice, as I’ve written elsewhere. Church leaders should own it and take responsibility for it rather than assume it’s “out of their hands.”

Because the use and presence of digital technology in the church’s liturgy is neither necessary nor imposed—it is contingent and chosen—then the logical upshot is this: Church leaders who believe that digital technology is a clear and present danger to the well-being and faithfulness of disciples of Christ should act like it. They should identify, recognize, and articulate the threats and temptations of digital dysfunction in their lives and ours; they should formulate a vision for how the church can oppose this dysfunction, forcefully and explicitly; and they should find ways to enact this opposition, both negatively (by removing said dysfunction from within the church) and positively (by proposing and modeling alternative forms of life available to believers who want relief from their digital addictions).

What they should not do is say it’s a problem while avoiding dealing with it. What they should not do is leave the status quo as it is. What they should not do is accept Digital’s domination as inevitable—as somehow lying outside the sphere of the reign and power of Christ.

What they should not do is look the other way.

Brad East

Tech bubble

From what I read online, I appear to live in a tech bubble: everyone’s addicted to it while knowing it’s bad. Are there really people who aren’t addicted? Are there really others who are addicted, but think it’s good?

Lately it’s occurred to me that I must live in an odd sort of tech bubble. It has two components.

On one hand, no one in my context (a medium-sized city in west Texas) lives in any way “free” from digital technology. Everyone has smartphones, laptops, tablets, and televisions with streaming apps. Most little kids have Kindles or iPads; most 10-to-12-year-olds have phones; nearly every middle schooler has a smartphone. Women are on Instagram and TikTok, men are on Twitter and YouTube. Boys of every age play video games (Switch, Xbox, PS5), including plenty of dads. Adults and kids alike are on their phones during church, during sporting events, during choir performances. Kids watch Disney+ and PBS Kids; parents watch Max and Netflix. Screens and apps, Amazon and Spotify, phones and tablets galore: this is just daily ordinary life. There are no radicals among us. No slices of life carved out. I don’t know anyone without a TV, much less without wireless internet. I don’t know anyone without a smartphone! Life in west Texas—and everywhere else I’m aware of, at least in the Bible Belt—is just like this. No dissenting communes. No screen-free spaces. I’m the campus weirdo for not permitting devices in my classroom, and doubly so for not using a Learning Management System. Nor am I some hard-edged radical. I’m currently typing on a MacBook, and when I leave my office, I’ll listen to an audiobook via my iPhone.

In other words, whenever anyone tells me that the world I’ve just described isn’t normal, isn’t typical, isn’t entrenched and established and nigh unavoidable—I think, “Okay, we simply live in different worlds. I’d like to come tour yours. I’ve not seen it with my own eyes before.” I’m open to being wrong. But I admit to some measure of skepticism. In a nation of 330 million souls, is it meaningful to point to this or that solitary digital experimenter as a site of resistance? And won’t they capitulate eventually anyway?

But maybe not. What do I know?

Here’s the other hand, though. Everyone I know, tech-addled and tech-saturated though they be, everyone agrees that digital technology and social media are a major problem, perhaps the most significant social challenge, facing all of us and especially young people today. No one thinks it’s “no big deal.” No one argues that their kids vegging out on video games all day does nothing to their brains. No one pretends that Instagram and TikTok and Twitter are good for developing adolescents. No one supposes that more screen time is better for anyone. They—we—all know it’s a problem. They—we—just aren’t sure what to do about it. And since it seems such an enormously complex and massive overarching matrix, by definition a systemic problem calling for systemic solutions, mostly everyone just keeps on with life as it is. A few of us try to do a little better: quantifying our kids’ screen time; deleting certain apps; resisting the siren song of smartphones for 12-year-olds. But those are drops in the bucket. No one disputes the nature or extent of the problem. It’s just that no one knows how to fix it; or at least no one has the resolve to be the one person, the one household, in a city of 120,000 to say No! to the whole shebang. And even if there were such a person or household, they’d be a one of one. An extraordinary exception to the normative and unthreatened rule.

And yet. When I read online, I discover that there are people—apparently not insignificant in number?—who do not take for granted that the ubiquity and widespread use of social media, screens, and personal devices (by everyone, but certainly by young people) is a bad thing. In fact, these people rise in defense of Silicon Valley’s holy products, so much so that they accuse those of us worried about them of fostering a moral panic. Any and all evidence of the detrimental effects of teenagers being online four, six, eight hours per day is discounted in advance. It’s either bad data or bad politics. Until very recently I didn’t even realize, naive simpleton that I am, that worrying about these things was politicized. That apparently you out yourself as a reactionary if … checks notes … you aren’t perfectly aligned with the interests of trillion-dollar multinational corporations. That it’s somehow right-wing, rather than common-sense, to want children and young people to move their bodies, to be outdoors, to talk to one another face to face, to go on dates, to get driver’s licenses, to take road trips, to see concerts, to star gaze, to sneak out at night(!), to go to restaurants, to go to parks, to go on walks, to read novels they hold in their hands, to look people in the eye, to play the guitar, to go camping, to visit national parks, to play pick-up basketball, to mow the yard, to join a protest march, to tend a garden, to cook a meal, to paint, to leave the confines of their bedrooms and game rooms, to go to church, to go on a picnic, to have a first kiss—must I go on? No, because everyone knows these are reasonable things to want young people to do, and to learn to do, and even (because there is no other way) to make mistakes and take real risks in trying to learn to do. I know plenty of conservatives and plenty of progressives and all of them, not an exception among them, want their kids off social media, off streaming, off smartphones—on them, at a minimum, much less—and want them instead to do something, anything, out there in the bright blue real world we all share and live in together.

I must allow the possibility, however, that I inhabit a tech bubble. There appear to be other worlds out there. The internet says so. In some of them, I’m told, there are tech-free persons, households, and whole communities enjoying life without the tyrannous glare of the Big Five Big Brother staring back at them through their devices. And in other worlds, running parallel to these perhaps, tech is as omnipresent as it is in my neck of the woods, yet it is utterly benign, liberating, life-giving, and above all enhancing of young people’s mental health. The more screens the better, in that world. To know this is to be right-thinking, which is to say, left-thinking: enlightened and progressive and educated. To deny it is right-thinking in the wrong sense: conservative and benighted and backwards.

Oh, well. Perhaps I’ll visit one of these other worlds someday. For the time being, I’m stuck in mine.

Brad East

Personal tech update

It’s been an unplanned, unofficial Tech Week here at the blog. I’ve been meaning to write a mini-update on my tech use—continuing previous reflections like these—so now seems as good a time as any.

–I deactivated my Twitter account on Ash Wednesday, and I couldn’t be happier about the decision. It was a long time coming, but every time I came close to pulling the trigger I froze. There was always a reason to stay. Even Lent provided an escape hatch: my second book was being published right after Easter! How could I possibly hawk my wares—sorry, “promote my work in the public sphere”—if I wasn’t on Twitter? More to the point, does a writer even exist if he doesn’t have a Twitter profile? Well, it turns out he does, and is much the healthier for it. I got out pre–Elon Musk, too, which means I’ve been spared so much nonsense on the proverbial feed. For now, in any case, I’m keeping the account by reactivating then immediately deactivating it every 30 days; that may just be a sort of digital security blanket, though. Life without Twitter is good for the soul. Kempis and Bonhoeffer are right. Drop it like the bad habit that it is. Know freedom.

–I deleted my Facebook account two or three years ago, and I’ve never looked back. Good riddance.

–I’ve never had Instagram, TikTok, Snapchat, or any of the other nasty social media timesucks folks devote themselves to.

–For the last 3-4 years I’ve been part of a Slack for some like-minded/like-hearted Christian writers, and while the experience has been uniformly positive, I realized that it was colonizing my mind and thus my attention during the day, whether at work or at home. So, first, I set up two-factor authentication with my wife’s phone, which means I need her to give me access if I’m signed out; and, second, I began limiting my sign-ins to two or three Saturdays per month. After a few months the itch to be on and participate constantly in conversations has mostly dribbled away. Now I might jump on to answer someone’s question, but only for a few minutes, and not to “stay on” or keep up with all the conversations. I know folks for whom this isn’t an issue, but I’ve learned about myself, especially online, that it’s all or nothing. As with Twitter, I had to turn off the spout, or I would just keep on drinking until it made me sick.

–I don’t play video games, unless it’s a Mario Kart grand prix with my kiddos.

–I only occasionally use YouTube; nine times out of ten it’s to watch a movie trailer. I cannot relate to people, whether friends or students, who spend hours and hours on YouTube. I can barely watch a Zoom conversation for five minutes before needing to do something else with my time.

–I subscribe to Spotify, because it’s quality bang for your buck. I’d love to divest from it—as my friend Chris Krycho constantly adjures me to do—but I’m not sure how, should I want to have affordable, legal access to music (for myself as well as my family).

–I subscribe to Audible (along with Libby), because I gave up podcasts for audiobooks last September, a decision about which I remain ecstatic, and Audible is reasonably priced and well-stocked and convenient. If only it didn’t feed the Beast!

–I happily use Instapaper, which is the greatest app ever created. Hat tip to Alan Jacobs, from whom I learned about it in, I believe, his book The Pleasures of Reading in an Age of Distraction. I’ve even paid to use the advanced version, and will do so again in the future if the company needs money to survive.

–I’ve dumbed down my iPhone as much as is in my power to do. I’ve turned off location services, the screen is in grayscale, and I’m unable to access my email (nor do I have my password memorized, so I can’t get to my inbox even if I’m tempted). I can call or text via Messages or WhatsApp. I have Audible, Spotify, and Instapaper downloaded. I use Marco Polo for friends and family who live far away. And that’s it. I aim to keep my daily phone usage to 45 minutes or so, but this year it’s been closer to 55-75 minutes on average.

–I use a MacBook Pro for work, writing, and other purposes; I don’t have an iPad or tablet of any kind. My laptop needs are minimal. I use the frumpy, clunky Office standbys: Word, Excel, PowerPoint. I’ve occasionally sampled or listened to pitches regarding the glories of alternatives to Word for writing, but honestly, for my needs, my habits, and my convenience, Word is adequate. As for internet browsing, I use Firefox and have only a few plug-ins: Feedly for an RSS reader, Instapaper, and Freedom (the second greatest app ever)—though I’ve found that I use Freedom less and less: only when (a) I’m writing for 2-4 or more hours straight and (b) I find myself distracted by the internet (but don’t need access to it). I pay to use it but may end up quitting if I eventually find that I’ve developed the ability to write without distraction for sustained periods of time.

–I’ve had a Gmail account since 2007; I daydream about deleting my Google account and signing up for some super-encrypted unsurveiled actually-private email service (again, Krycho has the recs), but so far I can’t find it within me to start from scratch and leave Gmail. We’ll see.

–I have the same dream about Amazon, which I use almost every day, order all my books from, have a Prime account with, and generally resent with secret pleasure (or enjoy with secret resentment). Divesting from Amazon seems more realistic than doing so from either Apple or Google, but then, how does anyone with a modest budget who needs oodles of books (or whatever) for their daily work purchase said books (or whatever) from any source but Amazon? That’s not a nut I’ve managed to crack just yet.

–I don’t have an Alexa or an Echo or an Apple Watch or, so far as I know, any species of the horrid genus “the internet of things.”

–In terms of TV and streaming services, currently my wife and I pay for subscriptions with … no platforms, unless I’m mistaken. At least, we are the sole proprietors of none. On our Roku we have available Netflix, Prime, Hulu, Disney+, Apple+, HBO Max, and YouTubeTV. But one of these is free with our cellular service (Hulu), two of them are someone else’s account (Apple+ and YouTubeTV), and another is a byproduct of free shipping (Prime). We pay a nominal fee as part of extended family/friend groups for Netflix and HBO, and honestly we could stop tomorrow and we’d barely notice. We paid a tiny fee up front for three years of Disney+, and if we could have only one streaming service going forward, that’s what we’d keep: it has the best combination of kids, family, classic, and grown-up selections, and you can always borrow a friend’s password or pay one month’s cost to watch a favorite/new series/season before canceling once it’s over. As for time spent, across a semester I probably average 3-7 hours of TV per week. I’ve stopped watching sports altogether, and I limit shows to either (a) hands-down excellence (Better Call Saul, Atlanta, Mare of Easttown), (b) family entertainment (basically, Marvel and Star Wars), or (c) undemanding spouse-friendly fare (Superstore, Brooklyn 99, Top Chef). With less time during the school year, I actually end up watching more TV, because I’m usually wiped by the daily grind; whereas during the summer, with much more leisure time, I end up reading or doing other more meaningful things. I will watch the NBA playoffs once grades are submitted, but then, that’s nice to put on in the background, and the kids enjoy having it on, too.

–Per Andy C.’s tech-wise advice, we turn screens off on Sundays as a general rule. We keep an eye on screen time for the kids Monday through Thursday, and don’t worry about it as much on Friday and Saturday, especially since outdoor and family and friend activities should be happening on those days anyway.

–Oddly enough, I made it a goal in January of this year to watch more movies in 2022. Not only am I persuaded that, by comparison to television, film is the superior art form, and that the so-called golden age of peak TV is mostly a misnomer; I also regret having lost the time—what with bustling kids and being gainfully employed—to keep up with quality movies. What time I do have to watch stuff I usually give to TV, being the less demanding medium: it’s bite size, it always resolves (or ends on a cliffhanger), and it doesn’t require committing to 2-3 hours up front. I’ve mostly not been successful this year, but I’m hoping the summer can kickstart my hopes in that area.

–If I’m honest, I find that I’ve mostly found a tolerable equilibrium with big-picture technology decisions, at least on an individual level. If you told me that, in two years, I no longer used Amazon, watched even less TV, and traded in my iPhone for a flip phone, I’d be elated. Otherwise, my goals are modest. Mainly it has to do with time allocation and distraction at work. If I begin my day with a devotional and 2-4 hours of sustained reading all prior to opening my laptop to check email, then it’s a good day. If the laptop is opened and unread mail awaits in the inbox, it’s usually a waste of a day. The screen sucks me in and the “deep work” I’d hoped to accomplish goes down the drain. That may not be how it goes for others, but that’s how it is with me.

–The only other tech-related facet of my life I’m pondering is purchasing a Kobo Elipsa (again, on the recommendation of Krycho and some other tech-wise readerly types). I’m not an especially good reader of PDFs; usually I print them out and physically annotate them. But it would be nice to have a reliable workflow with digital files, digital annotations, and searchable digital organization thereof. It would also help with e-reading—I own a 10-year-old Kindle but basically never use it—not only PDFs for work but writings in the public domain, ePub versions of new books I don’t need a physical copy of (or perhaps can only get a digital version of, for example, via the library), and Instapaper-saved articles from online sources. I’ve never wanted a normal tablet for this purpose because I know I’d just be duped into browsing the web or checking Twitter or my inbox. But if Kobo is an ideal balance between a Kindle and an iPad, designed for the sole purpose for which I need it, then I may end up investing in it here in the next year or two.

Brad East

From meaningless content to doomscrolling

One of the truly essential Substack writers is Justin E. H. Smith, who is neither a journalist nor a start-up freelancer but a major academic philosopher and polymath scholar of (what always strikes me as) ten thousand interesting things. His newsletter from two Sundays ago was a typically undefinable reflection on (inter alia) memory, streaming, tense, eternity, and the internet. Here are some sample grafs that bring home one of the essay’s central points:

If this assessment sounds bleak or cynical, consider Amazon’s recent acquisition of MGM for $8.45 billion. Jeff Bezos now holds the rights to numerous treasures of twentieth-century American entertainment, not least Albert R. Broccoli’s almost boutique-style James Bond films with their iconic, mythos-incanting musical opening numbers. Bezos has explicitly stated his intention to “reimagine and redevelop that I.P. [sic] for the 21st century.” On the surface, his idea of what a “good plot” looks like would seem to make twenty-first century content scarcely different from the most archaic and deep-rooted elements of myth and lore. Thus he thinks there needs to be a heroic protagonist, a compelling antagonist, moral choices, civilizational high stakes, humor, betrayal, violence…

“I know what it takes to make a great show,” Bezos has confidently said, “this should not be that hard. All of these iconic shows have these basic things in common.” The problem is that Bezos’s purpose in returning to a quasi-Proppian schema of all possible storytelling is not at all to revive the incantatory power of cliché to move us into the ritual time of storytelling. It is rather to streamline and dynamicize the finished product, exactly as if it were shipping times Bezos were seeking to perfect, rather than the timing of a hero’s escape from a pit of conventional quicksand.

And so the college freshman imagining her life as a show seems doubly sad: she turns to the closest thing we have to new narrative art in order to frame her own life and make it meaningful, but the primary instances our culture yields up to her to help with this framing are in fact only meaningless content being passed off as narrative art. It is no wonder, then, that what she will likely end up doing, after the passing and briefly stimulating thought of life itself as a TV show, is to go back to doomscrolling and vain name-checking until sleep takes over.

Do go read the whole thing; the closing section is eloquent, incisive, and damning in equal parts. Then do your duty and subscribe.


My technology habits

Developing good technology habits is one of the driving motivations of my daily life. Particularly since I surrendered and got a smartphone (only three years ago), combined with having children (the oldest is six) and getting a job (now in my second year), the potential for the internet and screens to overtake my every waking moment has never been greater. A little less than two years ago I read Andy Crouch's The Tech-Wise Family, which galvanized and organized my approach to disciplining technology's role in my life. Here's where things stand at the moment.

Phone

I still have an iPhone, though an older and increasingly outdated model. When I read Crouch I realized I was spending more than 2 hours a day on my phone (adolescents average 3-6 hours—some of my students more than that!), and I followed his lead in downloading the Moment app to monitor my usage. Since then I've cut down my daily screen time on my phone to ~45 minutes: 10 or so minutes checking email, 10-20 or so minutes texting/WhatsApp, another 20-30 minutes reading articles I've saved to Instapaper.

I changed my screen settings to black and white, which diminishes the appeal of the phone's image (the eyes like color). My home screen consists of Gmail, Safari, Messages, WhatsApp, Calendar, Photos, Camera, Settings, Weather, Google Maps, and FaceTime. That's it. I have no social media apps. On the next screen I have, e.g., the OED, BibleGateway, Instapaper, Podcasts, Amazon, Fandango, and Freedom (which helps to manage and block access to certain websites or apps).

When we moved to Abilene in June 2016, we instituted a digital sabbath on Sundays: no TV (for the kids or for us) and minimal phone usage. Elaborating on the latter: I leave my phone in the car during church, and during the day I try to leave it plugged into the charger in the bedroom, away from the living areas. That's not to say we've been perfectly consistent with either of these practices, but for the most part they've been life-giving and refreshing.

Oh, and our children do not have their own phones or tablets, and they do not use or play on ours, at home or in public. (Our oldest is just now experimenting with an educational app on our iPad instead of TV time. We'll see how that goes.)

Social Media

Currently the only social media that I am on and regularly use is Twitter. I was on Facebook for years, but last month I deactivated my account. I'm giving it a waiting period, but after Easter, or thereabouts, unless something has changed my mind, I am going to delete my account permanently. (Reading Jaron Lanier's most recent book had something to do with this decision.) I don't use, and I cannot imagine ever creating an account for, any other social media.

Why Twitter? Well, on the one hand, it has proved an extraordinarily helpful means of networking, both personally and professionally. I've done my best to cultivate a level-headed, sane, honest, and friendly presence on it, and the results have so far wildly exceeded my expectations. On the other hand, I have yet to experience Twitter as the nightmare I know it is and can be for so many. Part of that is my approach to using it, but I know that the clock is ticking on my first truly negative experience—getting rolled or trolled or otherwise abused. What will I do then? My hope is that I will simply not read my mentions and avoid getting sucked into the Darkest Twitter Timeline whose vortex has claimed so many others. But if it starts affecting my actual psyche—if I start anxiously thinking about it throughout the day, or if my writing or teaching starts reactively anticipating the negative responses Twitter is algorithmically designed to generate—then I will seriously consider deactivating or deleting my account.

How do I manage Twitter usage? First, since it's not on my phone, I can't access it unless I'm in front of my own laptop, or at least not in a user-friendly way. (Besides, I mostly use Twitter the way I once checked blogs: I go to individual accounts' home pages daily or every other day rather than spend time scrolling or refreshing my timeline.) Second, I use Freedom to block Twitter on both my laptop and my phone for extended periods during the day (e.g., when writing or grading or returning emails), so I simply can't access it. Third, my aim is for two or three 5-10 minute "check-ins": once or twice at work, once in the evening. If I spend more than 20 or 30 minutes a day on Twitter, that day is a failure.

Laptop

I have four children, six and under, at home, so being on my laptop at home isn't a realistic, persistent temptation. They've got to be in bed first, and unless I need to work, I'm not going to sit there scrolling around online indefinitely. I've got better things to do.

At work, my goal is to avoid being on my laptop as much as possible. Unless I need to be on it—in order to write, email, or prepare for class—I keep it closed. In fact, I have a few tricks for resisting the temptation to open it and get sucked in. I'll use Freedom to start a session blocking the internet for a few hours. Or I begin the day with reading (say, 8:00-11:00), then open the laptop to check stuff while eating an early lunch. Or I will physically put the laptop in an annoying place in my office: high on a shelf, or in a drawer. Human psychology's a fickle thing, but this sort of practice actually decreases the psychic desire to take a break from reading or other work by opening the laptop; and I know if I open it, Twitter or Feedly or Instapaper or the NYT or whatever will draw me in and take more time from me than I had planned or wanted.

[Insert: I neglected to mention that one way I try to read at least some of the innumerable excellent articles and essays published online is, first, to save them to Instapaper and, second, to print out the longest or most interesting ones (usually all together, once or twice a month). I print them front and back, two pages to a side, and put them in a folder to read in the evening or throughout the week. This can't work for everyone, but since I work in an office with a mega-printer that doesn't cost me anything, it's a nice way to "read online" without actually being online.]

One of my goals for the new year has been to get back into blogging—or as I've termed it, mezzo-blogging—which is really just an excuse to force myself to write for 15-30 minutes each day. That's proved to require even more hacks to keep me from going down rabbit holes online, since the laptop obviously has to be open to write a blog post. So I'll use Freedom to block "Distractions," i.e., websites I've designated as ones that distract me from productive work, like Twitter or Google.

I've yet to figure out a good approach to email, since I don't like replying to emails throughout the day, though sometimes my students do need a swifter answer than I'd prefer to give. Friday afternoons usually end up being my catch-up day.

I should add that I am a binge writer (and editor): if I have the time and something to write, I'll go for three or six or even nine or more hours hammering away. But when I'm in the groove like that, the distractions are easy to avoid.

Oh, and as for work on the weekends: I typically limit myself to (at most) Saturday afternoon, while the younger kids nap and the older kids rest, and Sunday evening, after the kids go to bed. That way I take most or all of the weekend off, and even if I have work to do, I take 24+ hours off from work (Sat 5pm–Sun 7pm).

TV

In many ways my worst technology habits have to do with TV. Over many years my mind and body have been trained to think of work (teaching and reading and writing) as what I do during the day, and of rest after dinner (or after the kids go to bed) as watching television. That can be nice, either as a respite from mentally challenging labor or as a way to spend time with my wife. But it also implies a profoundly attenuated imagination: relaxation = vegging out. Most of the last three years have been a sustained, ongoing attempt to retrain my brain to resist its vegging-out desires once the last child falls asleep—to read a novel instead, to catch up with my wife, to clean up, to grab drinks with friends, to get to bed early, whatever.

My goal for the phone is less than an hour per day; for the laptop, only as much time as necessary (which could be as little as 30 minutes or as much as 4+ hours); for TV, six or fewer hours per week. That includes sports, which as a result has gone way down, and movies (whether with the kids or my wife). Reasonable exceptions allowed: our 5-month-old often has trouble getting down early or easily, and my wife and I will put on some mindless episode of comedy—current favorite: Brooklyn 99—while taking turns holding and bouncing her to sleep. But otherwise, my current #1 goal is as little TV as possible; and if it's on, something well worth watching.

Video games

I don't own any video games, and haven't played them since high school. We'll see whether they re-enter our lives when the kids get older.

Pedagogy

I've written elsewhere about the principles that inform my so-called Luddite pedagogy. But truly, my goal in my classes is to banish technology from the classroom, and from in front of my students' faces, as much as is within my power. The only real uses I have for it are PowerPoint presentations (for larger lecture courses for freshmen) and YouTube clips (for a certain section of my January intensive course on Christianity and Culture). Otherwise, it's faces looking at faces, ears listening to spoken words, me at the table with the students or up scribbling on the whiteboard. For 80 minutes at a time, I want my students to know what it's like not to be constantly scratching that itch.

Spiritual disciplines

All of this is useless without spiritual disciplines encompassing, governing, and replacing the time I'd otherwise be devoting to technology. I note that here as a placeholder, since that's not what this post is about; perhaps in another post I'll discuss my devotional regimen (which makes it sound far more rigorous than my floundering attempts in fact amount to).

I have been helped so much by learning what others do in order to curb and control their relationship to technology. I hope this might be helpful to others in a similar way.