NYT, guilt by association, and libraries
It's a relief to see so many thoughtful—albeit blistering—responses to the long-awaited NYT hit piece on Scott Alexander and his erstwhile blog. It means that I'm not crazy for having the reaction I did when I read it, and that I don't need to write much to draw attention to the article's numerous flaws: only to point you to the existing responses that already do the job. And that's only a few of them, to say nothing of the Twitter threads and dunks. I must say, the immediately striking thing about the piece is how boring and boringly written it is, in such a bone-deep passive-aggressive voice. Why all the fuss (internally, that is, at the NYT) for that?
But: one additional thought. It has to be strange, at the experiential (nay, existential) level, for a writer truly to think that he can damn another writer through nothing but guilt by association. And not just association in general, but association understood, first, as proximate contact (i.e., having read and engaged a "dangerous" or non-mainstream or legitimately Bad author); and, second, as having potentially contributed to the potential acceptance of said unsavory character's unsavory ideas (i.e., not having controlled in advance the judgments one's own readers might make in reading one's engagement of another's writings).
The entire hit piece is structured this way. "Alexander might be an okay dude, but it's possible that his blog might have led some readers to Wrong Thoughts." What I cannot for the life of me understand is what it means for a fellow writer to think this, to have written this way. Does he really suppose the way authors and their works ought to be judged is by the sheer possibility that some readers might draw undesirable conclusions? or misunderstand the authors' views? or go beyond the authors, or follow their evidence or arguments to different ends than the authors?
This view is patently preposterous. It's self-defeating for any writer to hold. And it's even worse as a picture of reading and learning. Imagine it applied to libraries:
- Libraries invariably contain books of all kinds.
- Some books invariably contain Bad Ideas.
- Moreover, All Kinds Of People read books.
- Therefore, some people will have Bad Ideas as a result of reading certain books contained in libraries.
- Therefore, libraries are Of Questionable Quality.
- Therefore, ban all libraries.
There is a real hatred afoot today: a hatred for learning, for thinking, for reading and giving space to ideas and authors outside of a narrow mainstream—whether that mainstream is a mushy moderate middle or something else entirely—and this hatred not only animates the hit piece on Alexander, it is a sort of electrical current pulsing through our culture today. Resist it, y'all. Resist it in any way you can.
Anthropomorphism and analogy
Andrew Wilson has a lovely little post up using Herman Bavinck's work to show the "unlimited" scope of the Bible's use of anthropomorphism to talk about God. It's a helpful catalogue of the sheer volume and range of scriptural language to describe God and God's action. It's a useful resource, too, for helping students to grasp the notion that most of our speech about God is metaphorical, all of it is analogical, and none of it is less true for that.
In my experience not only students but philosophers and theologians as well often imagine, argue, or take for granted that doctrine is a kind of improvement on the language of Scripture. The canon then functions as a kind of loose rough draft, however authoritative, upon which metaphysically precise discourse improves, or at least by comparison offers a better approximation of the truth. Sometimes those parts of the canon that are literal or less anthropomorphic are permitted some lexical or semantic control. But in any case the idea is that arriving at non-metaphorical and certainly non-anthropomorphic language is the ideal.
But this is a mistake. Anthropomorphism is not an error or an accommodation to avoid. It's the vehicle of truth, the sanctified means of truthful talk about God. It may in principle speak more truly about God than its contrary. And Scripture's saturation in it would suggest that in fact it is God's chosen manner of communicating with us, and thus a privileged discursive mode for talk about God.
The upshot: theological accounts of analogy and language about God are meant not to sit in judgment on Scripture but rather to show how Scripture's language about God works. It is meant to serve the canon and to ground trust in canonical idiom, not to qualify it. "Given divine transcendence and the character of human language, how is what the Bible says about God true?" is the question to which the doctrine of analogy is an answer. Analogy does not mitigate the truth of Scripture's witness. It is a way of establishing it philosophically.
So that when the Bible says God has a face or arms or nostrils, or has wrath or grief or regret or love, or knows or forgets or begets or weds, the Christian is right to hear it as what it is: the word of God, trustworthy and true.
Digital ash
I'm on the record regarding "streaming" the sacraments, or otherwise digitally mediating the celebration of the Eucharist. With Lent approaching, the question occurred to me: Might churches "stream" Ash Wednesday? That is to say, would they endorse or facilitate the self-imposition of ashes?
Nonsense, was my initial thought. No way. Of course not. Who would suggest such a thing?
But that was naive. Surely, after almost a full year of administering the body and blood of Christ to oneself at home, the imposition of ashes upon one's own forehead at home is but a small leap; indeed, it is not so much a leap as a logical next step. If the blessed sacrament admits of auto-administration via digital consecration, how much more so the rites of Ash Wednesday?
I am prepared, therefore, for digital ash. Which is another way of saying that I am praying: Lord, deliver us from Covidtide.
“You are your actions”: close, but not quite
Over on Freddie deBoer's blog, he has a sharp piece up criticizing the vacuous identities induced by mass entertainment in late modern capitalism. Instead of having a nice time watching a Marvel movie, for instance, one's sense of self gets wrapped up in "being a Marvel fan." But Marvel doesn't care about you. Nor can it offer that depth of identity-constitutive meaning. It's just a movie that's a pinch of fun in a dark world, for which you fork over some money. Forgetting that, and allowing Disney to define who you are, is both childish and a trap. It doesn't end well, and it's a recipe for arrested development.
Here are the closing two paragraphs (my emphasis):
I wish I had a pat answer for what to do instead. Grasping for meaning – usually while drenching yourself in irony so that no one knows that that’s what you’re doing, these days – is universal. I will risk offending another very touchy subset of the internet by saying that I think many people turn to social justice politics and their prescriptivist politics, the notion that your demographic identifiers define you, out of motives very similar to the people I’ve been describing. There are readymade vehicles for acquiring meaning, from Catholicism to New Age philosophy to anarchism, that may very well create the solid ground people are looking for, I don’t know. I suspect that the best answer for more people would be to return to an idea that is very out of fashion: that you are what you do. You are your actions, not what you consume, what you say, or what you like.
It’s cool to name the bands you like to friends. It’s cool to be proud of your record collection. I’m sure it’s fun to create lists for Letterboxd. But those things don’t really say anything about you. Not really. Millions of people like all the things you like, after all. And trying to build a personality out of the accumulation of these things makes authentic appreciation impossible. I think it’s time to look elsewhere, as much as I admit that it would be nice if it worked.
The critique is on point, but the solution is not quite there. Part of my reason for saying so presupposes what deBoer in turn presupposes is off the table (though he acknowledges it as a possibility for others): Christian anthropology. But the following points, though they trade on theological judgments about the nature of the human person, can be defended from other perspectives as well.
So why is defining one's identity by one's deeds an inadequate prescription?
First, because most people's actions are indistinguishable from others' actions: you wake up, you eat, you punch a clock, you watch a show, you pay the bills, you mow the lawn, you grab drinks with a friend, you text and email and post and scroll, maybe you put the kids to bed, you go back to sleep, you repeat it all over again. Such things "don't really say anything about you" either, since "millions of people [do] all the things you [do], after all."
Second, because meaning comes from without, not from within: even if your actions were robust enough to constitute a worthwhile identity, you'd still be seeking, desiring, yearning for that which is other than you, that which transcends you—whether in a lover, a friend, a child, a marriage, a job, a group, a church, a nation, a deity. Not only is it humanly basic for the source of our identity and meaning to come from beyond us (David Kelsey defines human life as "eccentric existence": the ground of our being stands outside ourselves); not only is it literally true that we depend on what is outside us for sustenance, care, and flourishing (gestation, birth, food, relationships, art); even more, turning inward for one's own meaning is a dead end: simply put, none of us is up to the task. True navel gazing is monastic: it turns the self inside out—to find God within.
Third, because (positively speaking) your actions will never be enough: not impressive enough, not heroic enough, not virtuous enough, not even interesting enough. If I am what I do, I am a poor, indeed a boring and forgettable, specimen of the human species. Even if it were true that all that I am is found in my actions, that would be a cause for despair, not hope or meaning.
Fourth, because (negatively speaking) you are an inveterate sinner: you will fail, you will falter and stumble, you will invariably harm and hurt others, not always unintentionally, by commission and by omission; you will err, you will induce pain, you will fall short—for the rest of your life, world without end. Christians don't think this is the end of the story (God not only forgives you but provides means of reparation for others, healing in oneself, and moral improvement over time), but there is a reason that Chesterton called original sin the one empirical doctrine. Look around at the world. Look at your own life. Does either inspire confidence? Does it suggest a source of reliable meaning and stable identity going forward? I didn't think so.
None of this means it's either deBoer's way or the church's. Even if my description were right, perhaps that merely means that life is meaningless and there is nowhere to look, even in one's own actions or character, for personal significance and rich identity. (Though in that case, who can blame the geeks for their projections?) Or perhaps I'm right at the formal level, but there are sources of transcendence beyond the church that folks like deBoer are remiss in overlooking. In any case, though, the upshot is clear. In terms of deep personal meaning and life-giving identity, the last place to look for who I am is in what I do. Look instead at what I'm looking for—looking at, looking to—and that'll tell you who I am. Or at least who I hope to be, who I'm trying to become.
Alan Jacobs on avoiding unpaid labor for surveillance capitalism
...it’s important to understand that a lot of what we call leisure now is actually not leisure. It is unpaid labor on behalf of what Shoshana Zuboff calls surveillance capitalism. That is, we are all working for Google, we are working for Facebook. I would like to spread a model of reading that is genuinely a leisure activity and that escapes the loop of being unpaid labor for surveillance capitalism. That will start small, and maybe it will stay small, but my hope is that it would be bigger. Even people who have very hard, demanding lives can spend an enormous amount of time in this activity that we have been taught to feel is leisure, but is, as I have said, unpaid labor.
It’s interesting to see how things come into fashion. Think about how in the last few months we have decided that nothing is more important than the Post Office—that the Post Office is the greatest thing in the world. One of those bandwagons that I’ve been on for my entire life is libraries. I think libraries are just amazing. I grew up in a lower-middle-class, working-class family. My father was in prison almost all of my childhood, and my mother worked long hours to try to keep us afloat. My grandmother was the one who took care of my sister and me, and we didn’t have much money for books. There were a lot of books in the house, but that was because members of my family would go to used bookshops and get these ten-cent used paperbacks that had been read ten times and had coffee spilled on them and so forth. So we spent massive amounts of time at the library. Once a week my grandmother and I would go to the library and come out with a great big bag full of books and we would just read relentlessly.
I’m the only person in my family who went beyond high school—in fact I’m the only person in my family to graduate from high school. Yet we read all the time. We always had a TV on in our house, but nobody ever watched it. A characteristic thing in my family would be me, my mother, my grandmother, my sister when she got old enough, sitting in a room with the TV on, the sound turned down, and we were all just sitting there reading books. That was everything to me. It opened the door for everything that I’ve done in the rest of my life. I owe so much to the city of Birmingham, Alabama’s, public libraries. I think when I was doing that I was simultaneously engaging in genuine leisure while also being formed as a thinker and as someone who could kind of step out of the flow of the moment and acquire perspective and tranquility. So I believe that I’m recommending something that is widely available, even to people who have very little money and very few resources, and I know that from my own childhood.
—Alan Jacobs, "The Fare Forward Interview with Alan Jacobs," Fare Forward (30 Dec 2020)
“TV” by John Updike
TV
By John Updike
As if it were a tap I turn it on,
not hot or cold but tepid infotainment,
and out it gushes, sparkling evidence
of conflict, misery, concupiscence
let loose on little leashes, in remissions
of eager advertising that envisions
on our behalf the better life contingent
upon some buy, some needful acquisition.
A sleek car takes a curve in purring rain,
a bone-white beach plays host to lotioned skin,
a diaper soothes a graying beauty’s frown,
an unguent eases sedentary pain,
false teeth are brightened, beer enhances fun,
and rinsed hair hurls its tint across the screen:
these spurts of light are drunk in by my brain,
which sickens quickly, till it thirst again.
A surefire way to increase the number of books you read this coming year
is to read less online. Not just to be online less, but to read online less. Read less news (or no news), fewer blogs and newsletters and Substacks, altogether fewer websites and online essays and articles, and replace that time with reading books. I promise you that your book-reading will increase dramatically, even exponentially, in 2021.
Nor would you lose much. Whatever you are reading online, of whatever quality, it will always prove inferior, in substance, style, or relevance to your life, to what has already been published in a book. Have you read Auden, Eliot, Thomas, Rilke, Levertov, Hopkins, Herbert, Shakespeare, Milton, Dante? All of them, and all of what they wrote? If not, then what you're reading online is subpar by comparison. What about the novels of Dostoevsky, Tolstoy, Eliot, Austen, Melville, Twain, Trollope, Ellison, Baldwin, Morrison, McMurtry, Robinson, et al.? No? Get on it. It's better than whatever you're reading on the internet. Take that to the bank.
Very nearly everything you have ever read and will ever read online is instantly forgettable and will make not a bit of difference to your life. Even in those rare cases where that is not true, what you are reading remains forgettable by comparison to what you are able to find already published in a book. Go read that instead.
Note well: Though your life would surely be improved by dropping online reading entirely, I'm not making that (salutary) suggestion. I'm saying you ought to read less online. Do that, and not only will your intellect and wisdom and soul be improved; you will also read more books this year than last.
Such sage advice I offer free of charge, first to myself, then to you, dear reader. Close this tab—turn off your phone—and go read a book. Thank me in a year.
Publication round-up: recent pieces in First Things, Journal of Theological Interpretation, Mere Orthodoxy, and The Liberating Arts
I've been busy the last month, but I wanted to make sure I posted links here to some recent pieces of mine published during the Advent and Christmas seasons.
First, I wrote a meditation on the first Sunday of Advent for Mere Orthodoxy called "The Face of God."
Second, I interviewed Jon Baskin for The Liberating Arts in a video/podcast called "Can the Humanities Find a Home in the Academy?" Earlier in the fall I interviewed Alan Noble for TLA on why the church needs Christian colleges.
Third, in the latest issue of Journal of Theological Interpretation, I have a long article that seeks to answer a question simply stated: "What Are the Standards of Excellence for Theological Interpretation of Scripture?"
Fourth and last, yesterday, New Year's Day, First Things published a short essay I wrote called "The Circumcision of Israel's God." It's a theological meditation on the liturgical significance of January 1 being simultaneously the feast of the circumcision of Christ (for the East), the solemnity of Mary the Mother of God (for Rome), the feast of the name of Jesus (for many Protestants), and a global day for peace (per Pope Paul VI). I use a wonderful passage from St. Theodore the Studite's polemic against the iconoclasts to draw connections between each of these features of the one mystery of the incarnation of the God of Israel.
More to come in 2021. Lord willing, it will prove a relief from the last 12 months.
Peter van Inwagen on disciplinary hubris, relevant expertise, expectations of deference, and ordinary prudence
In the early 1990s the philosopher Peter van Inwagen wrote an essay called "Critical Studies of the New Testament and the User of the New Testament." It is a long, detailed philosophical investigation of the epistemic nature, or status, of academic biblical scholarship; specifically, it asks whether ordinary Christians or readers of the Bible ought to consult such scholarship, or defer to its judgments, prior to or in the course of their readings of the Bible or accompanying theological judgments. After many pages, his answer is a firm No. Here are the final paragraphs of the essay (bolded emphases are mine):
I conclude that there is no reason for me to think that Critical Studies have established that the New Testament narratives are historically unreliable. In fact, there is no reason for me to think that they have established any important thesis about the New Testament. I might, of course, change my mind if I knew more. But how much time shall I devote to coming to know more? My own theological writings, insofar as they draw on contemporary knowledge, draw on formal logic, cosmology, and evolutionary biology. I need to know a great deal more about these subjects than I do. How much time shall I take away from my study of them to devote to New Testament studies (as opposed to the study of the New Testament)? The answer seems to me to be: very little. I would suggest that various seminaries and divinity schools might consider devoting a portion of their curricula to these subjects (not to mention the systematic study of the Fathers!), even if this had to be done at the expense of some of the time currently devoted to Critical Studies.
Let me close by considering a tu quoque. Is not philosophy open to many of the charges I have brought against Critical Studies? Is not philosophy argument without end? Is not what philosophers agree about just precisely nothing? Are not the methods and arguments of many philosophers (especially those who reach extreme conclusions) so bad that an outsider encountering them for the first time might well charitably conclude that he must be missing something? Must one not devote years of systematic study to philosophy before one is competent to think philosophically about whether we have free will or whether there is an objective morality or whether knowledge is possible?—and yet, is one not entitled to believe in free will and knowledge and morality even if one has never read a single page of philosophy?
Ego quoque. If you are not a philosopher, you would be crazy to go to the philosophers to find anything out—other than what it is that the philosophers say. If a philosopher tells you that you must, on methodological grounds, since he is the expert, take his word for something—that there is free will, say, or that morality is only convention—you should tell him that philosophy has not earned the right to make such demands. Philosophy is, I think, valuable. It is a good thing for the study of philosophy to be pursued, both by experts and by amateurs. But from the premise that it is a good thing for a certain field of study to be pursued by experts, the conclusion does not follow that that field of study comprises experts who can tell you things you need to attend to before you can practice a religion or join a political party or become a conscientious objector. And from the premise that it is a good thing for a certain field of study to be pursued by amateurs, the conclusion does not follow that anyone is under an obligation to become an amateur in that field.
This is very close to some of the depreciatory statements I have made about the authority of Critical Studies. Since I regard philosophy as a Good Thing, it should be clear that I do not suppose that my arguments lend any support to the conclusion that the critical study of the New Testament is not a Good Thing. Whether it is, I have no idea. I don't know enough about it to know whether it is. I have argued only that the very little I do know about Critical Studies is sufficient to establish that users of the New Testament need not—but I have said nothing against their doing so—attend very carefully to it. (God, Knowledge, and Mystery [Ithaca: Cornell University Press, 1995], 189–190)
The choice quote here, reduced to a general maxim:
If an [expert in X] tells you that you must, on methodological grounds, since he is the expert, take his word for something, you should tell him that [X] has not earned the right to make such demands.
One cannot substitute just anything for "X," but one can substitute most things, and certainly anything outside the hardest of hard disciplines. Any and all discursive practices and realms of knowledge in which prudence is required or normative questions are involved, or in which ongoing contestation, adjudication, and dissent are prominent or at least typical, are by definition substitutable for "X." Moreover, if a legitimate expert in X attempts to mandate deference to her authority, but in this case regarding not X but Y, the attempt is patently fallacious, mendacious, confused, and absurd. One owes such an attempt and such an expert little more than an eye-roll, though laughter and mockery would also be warranted.
Let the reader understand.
New piece published in Mere Orthodoxy: "Befriending Books: On Reading and Thinking with Alan Jacobs and Zena Hitz"
I'm in Mere Orthodoxy with a long review-essay of two new books: Breaking Bread with the Dead: A Reader's Guide to a More Tranquil Mind by Alan Jacobs and Lost in Thought: The Hidden Pleasures of an Intellectual Life by Zena Hitz. Here's a section from the opening:
If the quality of one’s thinking depends upon the quality of those one thinks with, the truth is that few of us have the ability to secure membership in a community of brilliant and wise, like-hearted but independent thinkers. Search for one as much as we like, we are bound to be frustrated. Moreover, recourse to the internet—one commonly proffered solution—is far more likely to exacerbate than to alleviate the problem: we may find like-minded souls, yes, but down that rabbit hole lies danger on every side. Far from nurturing studiositas, algorithms redirect the energies of the intellect into every manner of curiositas; far from preparing a multicourse feast, our digital masters function rather like Elliott in E.T., drawing us on with an endless trail of colorful candies. Underfed and unsatisfied, our minds continue to follow the path, munching on nothing, world without end.
Is there an alternative?
Jacobs believes there is. For the community of potential collaborators in thinking is not limited to the living, much less those relatively few living folks who surround each of us. It includes the dead. And how do we commune with the dead? Through books. A library is a kind of mausoleum: it houses the dead in the tombs of their words. We break bread with them, in Auden’s phrase, when we read them. Reading them, we find ourselves inducted into the great conversation that spans every civilization and culture from time immemorial on to the present and into the future. We encounter others who are really and truly other than us.
New piece published in LARB: an essay review of N. T. Wright's Gifford lectures
This morning I'm in the Los Angeles Review of Books with a long essay review of History and Eschatology: Jesus and the Promise of Natural Theology, the book form of N. T. Wright's Gifford lectures. Here's the opening paragraph:
DOES GOD EXIST? An affirmative answer is presupposed by the world’s major religious traditions, particularly those that claim Abraham as forebear. Contemporary atheists, however, are far from the first to wonder about the question. Ancient philosophers and Abrahamic believers of every stripe have grappled with it in one form or another. For Christians who reflect on the matter, the catchall term is “natural theology.” But there is no one habit of thought or mode of analysis captured by that title. Rather, it gathers together a complex heritage marked as much by internal disagreement as by shared inquiry. That heritage is in part a genealogy. In order to come to terms with natural theology today, therefore, one must have some sense, as it were, of the family history in view.
Between pandemic and protest: introducing The Liberating Arts
Earlier this year I had the opportunity to join a group of gifted Christian scholars with an idea for a grant proposal. The idea was to respond to the crisis facing institutions of higher education, particularly liberal arts colleges, proactively rather than reactively. That is, to see the moment—pandemic, protest, political upheaval, demographic collapse, threats to the future of the liberal arts on every side—as an apocalyptic one, in which deep truths about ourselves and our culture are unveiled, as it were, from without. What to do in light of those revelations? How to shore up the ruins, and more than that, to articulate a positive and hopeful case for the institutions and areas of expertise to which we all belong, and by which we have been so profoundly formed, in the midst of so many competing challenges and voices?
Led by Jeff Bilbro, Jessica Hooten Wilson, Noah Toly, and Davey Henreckson, the proposal was approved and we received the grant from CCCU. Earlier this month the project launched, and The Liberating Arts was born. Go check it out!
Here's the description from the About page:
COVID-19 has been apocalyptic for higher education, and indeed for our nation as a whole. It has intensified pressures already threatening liberal arts education: concerns over the cost of college, particularly for majors without clear career outcomes; the popularity of professional degrees with large numbers of required credits; the push for badges or micro-credentials as alternatives to a four-year degree; declining birth rates; the growth of online programs and other hybrid forms of “content delivery.” Concerns over the practicality of the liberal arts intensify ongoing questions about the very idea of moral formation central to this tradition. And within our nation, the pandemic has exacerbated preexisting inequalities and racial injustice. Pandemic conditions have fueled a surprisingly robust protest movement that is powerfully, and inspiringly, raising questions too often ignored by Christian educators. These are particularly pressing issues for Christian colleges and universities, which situate career preparation, moral formation, and critical inquiry within a broader vision for spiritual vocation.
This project gathers faculty from a variety of institutions to lead conversations regarding the enduring relevance of the liberal arts. We welcome you to watch or listen to these conversations and participate in these vital discussions. The 2020-2021 academic year will likely prove an inflection point for higher education as the coronavirus pandemic and #BlackLivesMatter protests accentuate financial difficulties and surface mission ambiguities. Might it be a tipping point in a positive direction, as institutions seek to better equip students for the complexities facing them? Our conversations will enable colleges and universities across the country to learn from one another in addressing these challenges and opportunities, and they will encourage these institutions to draw on the rich heritage of the liberal arts tradition, while acknowledging its historical limitations, in shaping their responses. Our goal is to think and talk in public about the enduring value of the liberal arts for the particular concerns and challenges of our time.
Other members of the project include Jonathan Tran, Angel Adams Parham, Francis Su, Stephanie Wong, Greg Lee, Rachel Griffis, Kristin Du Mez, Joseph Clair, and Joe Creech. Each week we will be posting 2-3 video interviews with different leading scholars, thinkers, and writers from a variety of backgrounds and institutions. The interviews will track with one of four main thematic "channels" on the website: questions about the liberal arts of a definitional, formational, institutional, or liberational sort.
Already we have videos up featuring Willie James Jennings, Zena Hitz, Alan Jacobs, Karen Lee, and Francis Su. We have many more in the can or scheduled, including my own interview of Alan Noble, which should be posted next week.
I encourage you to peruse the site, watch/listen to the interviews, and share what will hopefully develop into a useful resource with as many others as you can!
On “anti” films that succeed, and why
More than one friend has pointed out an exception or addendum to my last post on "anti" films, which makes the claim that no "anti" films are successful on their own terms, for they ineluctably glorify the very thing they are wanting to hold up for critique: war, violence, misogyny, wealth, whatever.
The exception is this: There are successful "anti" films—meaning dramatic-narrative films, not documentaries—whose subject matter is intrinsically negative, and not ambiguous or plausibly attractive. Consider severe poverty, drug addiction, or profound depression. Though it is possible to make any of these a fetish, or to implicate the audience as a voyeur in relation to them, there is nothing appealing about being depressed, addicted, or impoverished, and so the effect of the cinematic form does nothing to make them appealing: for the form magnifies, and here there is nothing positive to magnify, only suffering or lack.
So, for example, The Florida Project and Requiem for a Dream and Melancholia are successful on their own terms; my critique of "anti" films does not apply to them.
But note well a few relevant features that distinguish these kinds of movies.
First, no one would mistake such films for celebrations of poverty, drug abuse, or depression. But that isn't because they're overly didactic; nor is it because other "anti" films aren't clear about their perspective. It's because no one could plausibly celebrate such things. But people do mistake films about cowboys, soldiers, assassins, vigilantes, gangsters, womanizers, adulterers, and hedge fund managers(!) for celebrations of them and their actions.
Second, this clear distinction helps us to see that films "against" poverty et al. are not really "anti" films at all. Requiem is not "anti-hard drugs": it is about people caught up in drug abuse. It's not a D.A.R.E. ad for middle schoolers—though, as many have said, it certainly can have that effect. In that film Aronofsky glamorizes nothing about hard drugs or the consequences of being addicted to them. But that is less a subversion than a critique of the way most films ordinarily bypass such consequences and focus on the superficial appurtenances of the rich and famous, including the high of drugs but little more.
Third, this clarification helps to specify what I mean by "anti" films. I don't mean any film that features a negative subject matter. I mean a film whose narrative and thematic modus operandi is meant to be subversive. An "anti" film takes a topic or figure that the surrounding culture celebrates, enjoys, or prefers left unexamined and subjects it to just that undesired examination. It deconstructs the cowboy and the general and the captain of industry. Or it does the same to the purported underbelly of society, giving sustained and sympathetic attention to the Italian mafia or drug-runners or pimps or what have you. In the first case, the lingering, affectionate gaze of the camera cannot but draw viewers into the life of the heretofore iconic figure, deepening instead of complicating their prior love. In the second, the camera's gaze does the same for previously misunderstood or despised figures. Michael Corleone and Tony Montana and Tommy DeVito become memorialized and adored through repeated dialogue, scenes, posters, and GIFs. Who could resist the charms of such men?
Fourth, the foregoing raises the question: Why are bad things like crime and violence and illicit sex plausibly "attractive" to filmmakers and audiences in a way that other bad things are not? I think the answer lies, on the one hand, with the visual nature of the medium: sex and violence, not to mention the excitement and/or luxury bound up with the life of organized crime, are visual and visually thrilling actions; in the hands of gifted directors, their rendering in film is often gorgeous and alluring to behold. Bodies in motion, kinetic choreography, beautiful people doing physically demanding or intriguing or seductive deeds: the camera was made for such things. Depression and deprivation? Not so much. (A reminder that film is not a medium of interiority; psychology is for print.)
On the other hand, the perennial topics of "anti" films are, as I said in my first post, not wholly bad things. War, needless to say, is a deeply complex phenomenon: just causes and wicked intentions, wise leaders and foolish generals, acts of heroism and indiscriminate killing, remarkable discipline and wanton destruction. War is a force that gives us meaning for a reason. But sex and westerns and extravagant wealth and even organized crime are similarly ambivalent, which is to say, they contain good and bad; or put differently, what is bad in them is a distortion of what is good. The Godfather is a classic for many reasons, but a principal one is its recognizable depiction of an institution in which we all share: family.
One friend observed that, perhaps, films cannot finally succeed in subverting vices of excess, but they can succeed in negative portrayals of vices of privation. I'll have to continue to ruminate on that, though it may be true. Note again, however, the comment above: vices of privation are not generally celebrated, admired, or envied; there is no temptation to be seduced by homelessness, nor is the medium of film prone to glorify it. Which means there is nothing subversive, formally speaking, about depicting homelessness as a bad thing that no one should desire and everyone should seek to alleviate. Whereas an "anti" film, at least in my understanding of it, is subversive by definition.
Fifth, another friend remarked that the best anti-war films are not about war at all: the most persuasive case against a vice is a faithful yet artful portrait of virtue. Broadly speaking, I think that is true. Of Gods and Men and A Hidden Life are "anti-war" films whose cameras do not linger on the battlefield or set the audience inside the tents and offices of field generals and masters of war. Arrival is a "pro-life" film that has nothing to do with abortion. So on and so forth. I take this to be a complementary point, inasmuch as it confirms the difficulty (impossibility?) of cinematic "anti" films, according to my definition, and calls to mind other mediums that can succeed as subversive art: literature, poetry, music, photography, etc. I think the phenomenon I am discussing, in other words, while not limited to film, is unique in the range and style of its expression—or restriction—in film.
A simple way to put the matter: no other art form is so disposed to the pornographic as film is. The medium by its nature wants you to like, to love, to be awoken and shaken and shocked and moved by what you see. It longs to titillate. That is its special power, and therefore its special danger. That doesn't make it all bad. Film is a great art form, and individual films ought to be considered the way we do any discrete cultural artifact. But it helps to explain why self-consciously "subversive" films continually fail to achieve their aims, inexorably magnifying, glamorizing, and glorifying that which they seek to hold up to a critical eye. And that is why truly subversive literary art so rarely translates to the screen; why, for example, Cormac McCarthy's "anti-western" Blood Meridian is so regularly called "unfilmable." What that novel induces in its readers, not in spite of but precisely in virtue of its brilliance, is nothing so much as revulsion. One does not "like" or identify with the Kid or the Judge or their fellows. One does not wish one were there. One is sickened, overwhelmed with the sheer godforsaken evil and suffering on display. No "cowboys and Indians" cosplay here. Just violence, madness, and death.
Can cinema produce an anti-western along the same lines? One that features cowboys and gorgeous vistas and heart-pounding action and violence? Filmmakers have tried, including worthy efforts by Clint Eastwood, Tommy Lee Jones, and Kelly Reichardt. I'd say the jury is still out. But even if there are exceptions, the rule stands.
No such thing as an anti-war film, or anti-anything at all
There is no such thing as an anti-war film, François Truffaut is reported to have said. In a manner of speaking, there is no such thing as an anti-anything film, at least so long as the subject in question is depicted visually.
The reason is simple. The medium of film makes whatever is on screen appealing to look at—more than that, to sink into, to be seduced by, to be drawn into. Moving images lull the mind and woo the heart.
Moreover, anything that is worth opposing in a film contains some element of goodness or truth or beauty. The wager or argument of the filmmaker is not that the subject matter is wholly evil; rather, it is that it is something worthwhile that has been corrupted, distorted, or disordered: by excess, by wicked motives, by tragic consequences. Which means that whatever is depicted in the film is not Evil Writ Large, Only Now On Screen. It is something lovely or valuable—something ordinary people "fall for" in the real world—except portrayed in such a way as to try to show the harm or problems attending it.
Unfortunately, the form of film itself works against the purposes of an "anti" film, since the nature of the form habituates an audience to identify with and even love what is on screen. Why? First, because motion pictures are in motion, that is, they take time. Minute 30 is different than minute 90. Even if minute 90 "makes the point" (whether subtly or didactically), minutes 1 through 89 might embody the opposite point, and perhaps far more powerfully.
Second, cinematic form is usually narrative in character. That means protagonists and plot. That means point of view, perspective. That means viewers inevitably side with, line up with, a particular perspective or protagonist. And when that happens, sympathies soften whatever critique the filmmaker wants to communicate; the "bad fan" effect generates a shared instinct to cheer on the lead, however Illustratively Bad or even an Unqualified Antihero he may be. If only Skyler wouldn't get in Walter's way, you know?
Third, the temporal and narrative shape of film means that endings mean everything and nothing. Everything, because like all stories they bring to a head and give retroactive meaning to all that came before. Nothing, because what many people remember most is the experience of the journey getting there. And if the journey is 99% revelry in Said Bad Thing, even if the final 1% is denunciation thereof, what people will remember and continue returning for is the 99%. And that's just not something you can control, no matter how blunt you're willing to be in the film's flashing-neon messaging.
That, in short, is why there are no "anti" films, only failed attempts at them.
No anti-war films: only films that glorify the spectacle and heroes of warfare.
No anti-gangster films: only films that glorify the thrills of organized crime.
No anti-luxury films: only films that glorify the egregious excesses of the 1%.
No anti-western films: only films that glorify the cowboy, the vigilante, and lawless violence.
No anti-ultraviolence films: only films that glorify the wild anarchy of the uncontrolled and truly free.
No anti-misogynist films: only films that glorify the untamed libido and undomesticated talk of the charming but immature adolescent or bachelor.
And finally, no films subversive of the exploitation and sexualization of young girls: only films that glorify the same. To depict that evil, in this medium, in a way that captures faithfully the essence of the evil, is little more than reproducing the evil by other means, in another form. However artful, however sophisticated, however well-intended, it is bound for failure—a failure, I hasten to add, for which the filmmakers in question are entirely culpable.
2021: the year of Martin, Rothfuss, and Williams?
Could 2021 see the publication of long-awaited sequels to three major fantasy series?
It would mark a full decade since George R. R. Martin published the fifth entry in his series A Song of Ice and Fire. He's been writing pages and pages since then, or so he says. He turns 72 this month, and between the conclusion of the TV adaptation of Game of Thrones and living through a global pandemic, he's had nothing but time to write. In any case, even after #6, he's got at least one more book in the series to write, assuming the story doesn't keep multiplying and fracturing indefinitely. One can hope, no?
It's also been a decade since the second book in Patrick Rothfuss's Kingkiller Chronicle trilogy was published. Four years separated the first two books. Perhaps ten will separate the second and the third? Rothfuss insists that he is hard at work on The Doors of Stone, yet reacts cantankerously to continual "Are you finished yet?" queries. It's unclear whether it's the perfectionist in him or whether, like Martin, the sprawl of the story plus the lure of other projects is keeping completion just over the horizon.
Speaking of completion: A full three decades ago Tad Williams published his extraordinary trilogy Memory, Sorrow, and Thorn. He sat on it, sat on it, and sat on it some more—then finally made good on some hints and gestures in the closing pages of the final volume, To Green Angel Tower, with the publication in 2017 of a "bridge novel," The Heart of What Was Lost (a more Williams-esque title there never was, plangent and grand in epic lament as so much of his work is). Thus began a new trilogy, The Last King of Osten Ard, with volumes published in orderly sequence: 2017, 2019, and, in prospect, October 2021. (A second short prequel novel will be published in the summer.) Once he breaks the story, the man knows how to tell it, and how to finish it.
In sum, these are three great fantasy novelists working to complete three much-beloved fantasy series. And we could have all three authors publishing eagerly anticipated books in the next 15 months. Let it be so!
Louis Dupré on symbolism and ontology in religious language
Religious language must, by its very nature, be symbolic: its referent surpasses the objective universe. Objectivist language is fit only to signify things in a one-dimensional universe. It is incapable of referring to another level of reality, as art, poetry, and religion do. Rather than properly symbolizing, it establishes external analogies between objectively conceived realities. Their relation is allegorical rather than symbolic. A truly symbolic relation must be grounded in Being itself. Nothing exposes our religious impoverishment more directly than the loss of the ontological dimension of language. To overcome this, poets and mystics have removed their language as far as possible from everyday speech.
In premodern traditions, language remained closer to the ontological core which all things share and which intrinsically links them to one another. Symbols thereby participated in the very Being of what they symbolized, as they still do in great poetry. Religious symbols re-presented the divine reality: they actually made the divine present in images and metaphors. The ontological richness of the participatory presence of a truly symbolic system of signification appeared in the original conception of sacraments, rituals, icons, and ecclesiastical hierarchies.
The nominalism of the late Middle Ages resulted in a very different representation of the creature's relation with God. The world no longer appears as a divine expression except in the restricted sense of expressing the divine will. Finite reality becomes separated from its Creator. As a result, creatures have lost not only their intrinsic participation in God's Being but also their ontological communion with one another. Their relation becomes defined by divine decree. Nominalism not only has survived the secularization of modern thought, but has become radicalized in our own cybernetic culture, where symbols are reduced to arbitrary signs in an intramundane web of references, of which each point can be linked to any other point. The advantages of such a system need no proof: the entire scientific and technical functioning of contemporary society depends on it. At the same time, the modern mind's capacity for creating and understanding religious symbols has been severely weakened. Symbols have become man-made, objective signs, serviceable for making any reality part of a system without having to be part of that reality.
Recent theologians have attempted to stem the secular tide. Two of them did so by basically rethinking the relation between nature and grace, the main causes of today's secularism. Henri de Lubac undertook a historical critique of the modern separation of nature and the supernatural. Not coincidentally, he also wrote a masterly literary study on religious symbolism before the nominalist revolution. In a number of works Hans Urs von Balthasar developed a theology in which grace, rather than being added to nature as a supernatural accident, constitutes the very depth of the mystery of Being. Being is both immanent and transcendent. Grace consists in its transcendent dimension. Whenever a poet, artist, or philosopher penetrates into the mystery of existence, he or she reveals an aspect of divine grace. Not only theology but also art and poetry, even philosophy, thereby regain a mystical quality, and religion resumes its place at the heart of human reality.
No program of theological renewal can by itself achieve a religious restoration. To be effective a theological vision requires a recognition of the sacred. Is the modern mind still capable of such a recognition? Its fundamental attitude directly conflicts with the conditions necessary for it. First, some kind of moral conversion has become indispensable. The immediate question is not whether we confess a religious faith, or whether we live in conformity with certain religious norms, but whether we are of a disposition to accept any kind of theoretical or practical direction coming from a source other than the mind itself. Such a disposition demands that we be prepared to abandon the conquering, self-sufficient state of mind characteristic of late modernity. I still believe in the necessity of what I wrote at an earlier occasion: "What is needed is a conversion to an attitude in which existing is more than taking, acting more than making, meaning more than function—an attitude in which there is enough leisure for wonder and enough detachment for transcendence. What is needed most of all is an attitude in which transcendence can be recognized again."
—Louis Dupré, Religion and the Rise of Modern Culture (2008), 115-117
François Furet on revolutionary consciousness
[T]he revolutionary situation was not only characterised by the power vacuum that was filled by a rush of new forces and by the 'free' activity of society. . . . It was also bound up with a kind of hypertrophy of historical consciousness and with a system of symbolic representations shared by the social actors. The revolutionary consciousness, from 1789 on, was informed by the illusion of defeating a State that had already ceased to exist, in the name of a coalition of good intentions and of forces that foreshadowed the future. From the very beginning it was ever ready to place ideas above actual history, as if it were called upon to restructure a fragmented society by means of its own concepts. Repression became intolerable only when it became ineffectual. The Revolution was the historical space that separated two powers, the embodiment of the idea that history is shaped by human action rather than by the combination of existing institutions and forces.
In that unforeseeable and accelerated drift, the idea of human action patterned its goals on the exact opposite of the traditional principles underlying the social order. The Ancien Régime had been in the hands of the king; the Revolution was the people's achievement. France had been a kingdom of subjects; it was now a nation of citizens. The old society had been based on privilege; the Revolution established equality. Thus was created the ideology of a radical break with the past, a tremendous cultural drive for equality. Henceforth everything - the economy, society and politics - yielded to the force of ideology and to the militants who embodied it; no coalition nor any institution could last under the onslaught of that torrential advance.
Here I am using the term ideology to designate the two sets of beliefs that, to my mind, constitute the very bedrock of revolutionary consciousness. The first is that all personal problems and all moral or intellectual matters have become political; that there is no human misfortune not amenable to a political solution. The second is that, since everything can be known and changed, there is a perfect fit between action, knowledge and morality. That is why the revolutionary militants identified their private lives with their public ones and with the defence of their ideas. It was a formidable logic, which, in a laicised form, reproduced the psychological commitment that springs from religious beliefs. When politics becomes the realm of truth and falsehood, of good and evil, and when it is politics that separates the good from the wicked, we find ourselves in a historical universe whose dynamic is entirely new. As Marx realised in his early writings, the Revolution was the very incarnation of the illusion of politics: it transformed mere experience into conscious acts. It inaugurated a world that attributes every social change to known, classified and living forces; like mythical thought, it peoples the objective universe with subjective volitions, that is, as the case may be, with responsible leaders or scapegoats. In such a world, human action no longer encounters obstacles or limits, only adversaries, preferably traitors. The recurrence of that notion is a telling feature of the moral universe in which the revolutionary explosion took place.
No longer held together by the State, nor by the constraints that had been imposed by power and had masked its disintegration, society thus recomposed itself through ideology. Peopled by active volitions and recognising only faithful followers or adversaries, that new world had an incomparable capacity to integrate. It was the beginning of what has ever since been called 'politics', that is, a common yet contradictory language of debate and action around the central issue of power. The French Revolution, of course, did not 'invent' politics as an autonomous area of knowledge; to speak only of Christian Europe, the theory of political action as such dates back to Machiavelli, and the scholarly debate about the origin of society as an institution was well under way by the seventeenth century. But the example of the English Revolution shows that when it came to collective involvement and action, the fundamental frame of intellectual reference was still of a religious nature. What the French brought into being at the end of the eighteenth century was not politics as a laicised and distinct area of critical reflection but democratic politics as a national ideology. The secret of the success of 1789, its message and its lasting influence lie in that invention, which was unprecedented and whose legacy was to be so widespread. The English and French revolutions, though separated by more than a century, have many traits in common, none of which, however, was sufficient to bestow on the first the rôle of universal model that the second has played ever since it appeared on the stage of history. The reason is that Cromwell's Republic was too preoccupied with religious concerns and too intent upon its return to origins to develop the one notion that made Robespierre's language the prophecy of a new era: that democratic politics had come to decide the fate of individuals and peoples.
The term 'democratic politics' does not refer here to a set of rules or procedures designed to organise, on the basis of election results, the functioning of authority. Rather, it designates a system of beliefs that constitutes the new legitimacy born of the Revolution, and according to which 'the people', in order to establish the liberty and equality that are the objectives of collective action, must break its enemies' resistance. Having become the supreme means of putting values into action and the inevitable test of 'right' or 'wrong' will, politics could have only a public spokesman, in total harmony with those values, and enemies who remained concealed, since their designs could not be publicly admitted. The people were defined by their aspirations, and as an indistinct aggregate of individual 'right' wills. By that expedient, which precluded representation, the revolutionary consciousness was able to reconstruct an imaginary social cohesion in the name and on the basis of individual wills. That was its way of resolving the eighteenth century's great dilemma, that of conceptualising society in terms of the individual. If indeed the individual was defined in his every aspect by the aims of his political action, a set of goals as simple as a moral code would permit the Revolution to found a new language as well as a new society. Or, rather, to found a new society through a new language: today we would call that a nation; at the time it was celebrated in the fête de la Fédération.
—François Furet, Interpreting the French Revolution (trans. Elborg Forster; 1978), pp. 25-27
10 thoughts on colleges reopening
1. Not every college is the same. There are community colleges, private colleges, public colleges. Some have 1,500 students, some have 50,000 students. Some are in rural areas and small towns, some are in densely populated urban centers. Some have wild and uncontrollable Greek life, some (very much) do not.
2. Not every place is the same. There are regions, locales, and states that are still hot spots under lockdown, and there are others that are in rather better shape.
3. Not every institution is the same. Some are cash-strapped or risk-averse or profit-driven or top-down—or what have you—and some have resilient and time-honored habits of shared governance.
4. Not every professor is the same. This applies not only to characteristics like age or discipline but also to personal judgment or preference: universities are not split, in a perfect dichotomy, between administrators who are pushing forward with reopening and faculty who are pushing back.
5. Economic pressures are real. It is a sad and lamentable fact, but American higher ed is a teetering tower ready to topple over at any moment. Without relitigating the relevant history, that is where things stand. Many, perhaps most, colleges were forced to make an impossible decision: reopen residential instruction, or initiate systematic lay-offs, firings, departmental cuts, and program eliminations. Had the latter been truly unavoidable, how many staff and faculty would still have opted to go fully online?
6. Students and families have agency. No one is forcing students to return to campus; if families prefer a gap year, a job, or online learning, all three routes are readily available. Students and their families are making their own assessments of the risk. Given the options, it is far from irrational to choose residential college instruction.
7. Consider the alternatives for students. If a 20-year-old who would be in college isn't in college, what is she doing? She is either quarantined at home, working (likely not from home), or living it up with friends. The latter two options entail levels of risk commensurate with or far greater than residential college life. And the first follows on the heels of six full months of functional social lockdown, in the best of circumstances alone with a couple of family members, in the worst trapped in a dysfunctional household with negligent or unhealthy persons. We are only beginning to reckon with the effects of such extended isolation on young persons' mental health. In any case, there is no prefabricated universal calculus for what would be best for each person: continued isolation versus greater risk of exposure, and if the latter, at work or at college. Given that so many families have chosen to send their children to college, shouldn't we presume a reasonable assessment of the risk and of the various options, and respect the possibility that such a decision was wise (or, again, the least bad choice, given the options)?
8. It follows from all of these reflections that there is no one-size-fits-all judgment on "American colleges reopening." I have no doubt that some colleges should not have reopened; that some could have done so wisely but have not implemented adequate precautions; that some made the decision largely out of fear; that most did so as a financial necessity; that plenty did so cynically, hoping to make it long enough to capture non-refundable tuition. But those facts, in the aggregate, have little to say about particular cases; nor do bad actors, or even commonly held bad motivations per se, mean that most colleges should not have reopened. The need for tuition and the precarity of jobs, on the one hand, and the rational desire of families for a residential college to send their children to, on the other, can all coexist.
9. What has been ascendant these last few months is a species of scientism that supposes questions of politics and prudence can be answered by experts in epidemiology and pandemics. But they cannot. What is needed instead is wisdom, the virtue of phronesis that puts practical reason to work in making global judgments, in light of all the evidence and the totality of factors, about what is best to be done in this case for this person for these reasons. Those reasons cannot be put under a microscope or studied in a lab. They are not verifiable. But resort to them is what is sorely needed in this moment.
10. In addition to wisdom, what we need is grace. Prudence can be applied differently; different people come to opposed judgments, either about distinct cases or about the same one. That is okay. We can live with that. I understand and respect those friends and writers and fellow academics who think reopening is foolhardy or unwise; I know they have their reasons. But such a judgment, even if correct, is not self-evident or uncontested. We have to have grace with one another; we have to learn how to be patient. I myself am impatient in the extreme. We have to try nonetheless (I have to try). That's the only way we're going to get through this together.
David Walker on slavery and the justice of God
And as the inhuman system of slavery, is the source from which most of our miseries proceed, I shall begin with that curse to nations, which has spread terror and devastation through so many nations of antiquity, and which is raging to such a pitch at the present day in Spain and in Portugal. It had one tug in England, in France, and in the United States of America; yet the inhabitants thereof, do not learn wisdom, and erase it entirely from their dwellings and from all with whom they have to do. The fact is, the labour of slaves comes so cheap to the avaricious usurpers, and is (as they think) of such great utility to the country where it exists, that those who are actuated by sordid avarice only, overlook the evils, which will as sure as the Lord lives, follow after the good. In fact, they are so happy to keep in ignorance and degradation, and to receive the homage and the labour of the slaves, they forget that God rules in the armies of heaven and among the inhabitants of the earth, having his ears continually open to the cries, tears and groans of his oppressed people; and being a just and holy Being will at one day appear fully in behalf of the oppressed, and arrest the progress of the avaricious oppressors; for although the destruction of the oppressors God may not effect by the oppressed, yet the Lord our God will bring other destructions upon them—for not unfrequently will he cause them to rise up one against another, to be split and divided, and to oppress each other, and sometimes to open hostilities with sword in hand. Some may ask, what is the matter with this united and happy people?—Some say it is the cause of political usurpers, tyrants, oppressors, &c. But has not the Lord an oppressed and suffering people among them? Does the Lord condescend to hear their cries and see their tears in consequence of oppression? Will he let the oppressors rest comfortably and happy always? Will he not cause the very children of the oppressors to rise up against them, and oftimes put them to death? "God works in many ways his wonders to perform."
—David Walker's Appeal, in Four Articles; Together with a Preamble, to the Coloured Citizens of the World, But in Particular, and Very Expressly, to Those of the United States of America (1829), from the Preamble
Tony Judt on the New Left in the '60s
It was a curiosity of the age that the generational split transcended class as well as national experience. The rhetorical expression of youthful revolt was, of course, confined to a tiny minority: even in the US in those days, most young people did not attend university and college protests did not necessarily represent youth at large. But the broader symptoms of generational dissidence—music, clothing, language—were unusually widespread thanks to television, transistor radios and the internationalization of popular culture. By the late ’60s, the culture gap separating young people from their parents was perhaps greater than at any point since the early 19th century.
This breach in continuity echoed another tectonic shift. For an older generation of left-leaning politicians and voters, the relationship between ‘workers’ and socialism—between ‘the poor’ and the welfare state—had been self-evident. The ‘Left’ had long been associated with—and largely dependent upon—the urban industrial proletariat. Whatever their pragmatic attraction to the middle classes, the reforms of the New Deal, the Scandinavian social democracies and Britain’s welfare state had rested upon the presumptive support of a mass of blue collar workers and their rural allies.
But in the course of the 1950s, this blue collar proletariat was fragmenting and shrinking. Hard graft in traditional factories, mines and transport industries was giving way to automation, the rise of service industries and an increasingly feminized labor force. Even in Sweden, the social democrats could no longer hope to win elections simply by securing a majority of the traditional labor vote. The old Left, with its roots in working class communities and union organizations, could count on the instinctive collectivism and communal discipline (and subservience) of a corralled industrial work force. But that was a shrinking percentage of the population.
The new Left, as it began to call itself in those years, was something very different. To a younger generation, ‘change’ was not to be brought about by disciplined mass action defined and led by authorized spokesmen. Change itself appeared to have moved on from the industrial West into the developing or ‘third’ world. Communism and capitalism alike were charged with stagnation and ‘repression’. The initiative for radical innovation and action now lay either with distant peasants or else with a new set of revolutionary constituents. In place of the male proletariat there were now posited the candidacies of ‘blacks’, ‘students’, ‘women’ and, a little later, homosexuals.
Since none of these constituents, at home or abroad, was separately represented in the institutions of welfare societies, the new Left presented itself quite consciously as opposing not merely the injustices of the capitalist order but above all the ‘repressive tolerance’ of its most advanced forms: precisely those benevolent overseers responsible for liberalizing old constraints or providing for the betterment of all.
Above all, the new Left—and its overwhelmingly youthful constituency—rejected the inherited collectivism of its predecessor. To an earlier generation of reformers from Washington to Stockholm, it had been self-evident that ‘justice’, ‘equal opportunity’ or ‘economic security’ were shared objectives that could only be attained by common action. Whatever the shortcomings of over-intrusive top-down regulation and control, these were the price of social justice—and a price well worth paying.
A younger cohort saw things very differently. Social justice no longer preoccupied radicals. What united the ’60s generation was not the interest of all, but the needs and rights of each. ‘Individualism’—the assertion of every person’s claim to maximized private freedom and the unrestrained liberty to express autonomous desires and have them respected and institutionalized by society at large—became the left-wing watchword of the hour. Doing ‘your own thing’, ‘letting it all hang out’, ‘making love, not war’: these are not inherently unappealing goals, but they are of their essence private objectives, not public goods. Unsurprisingly, they led to the widespread assertion that ‘the personal is political’.
The politics of the ’60s thus devolved into an aggregation of individual claims upon society and the state. ‘Identity’ began to colonize public discourse: private identity, sexual identity, cultural identity. From here it was but a short step to the fragmentation of radical politics, its metamorphosis into multiculturalism. Curiously, the new Left remained exquisitely sensitive to the collective attributes of humans in distant lands, where they could be gathered up into anonymous social categories like ‘peasant’, ‘post-colonial’, ‘subaltern’ and the like. But back home, the individual reigned supreme.
However legitimate the claims of individuals and the importance of their rights, emphasizing these carries an unavoidable cost: the decline of a shared sense of purpose. Once upon a time one looked to society—or class, or community—for one’s normative vocabulary: what was good for everyone was by definition good for anyone. But the converse does not hold. What is good for one person may or may not be of value or interest to another. Conservative philosophers of an earlier age understood this well, which was why they resorted to religious language and imagery to justify traditional authority and its claims upon each individual.
But the individualism of the new Left respected neither collective purpose nor traditional authority: it was, after all, both new and left. What remained to it was the subjectivism of private—and privately-measured—interest and desire. This, in turn, invited a resort to aesthetic and moral relativism: if something is good for me it is not incumbent upon me to ascertain whether it is good for someone else—much less to impose it upon them (“do your own thing”).
True, many radicals of the ’60s were quite enthusiastic supporters of imposed choices, but only when these affected distant peoples of whom they knew little. Looking back, it is striking to note how many in western Europe and the United States expressed enthusiasm for Mao Tse-tung’s dictatorially uniform ‘cultural revolution’ while defining cultural reform at home as the maximizing of private initiative and autonomy.
In distant retrospect it may appear odd that so many young people in the ’60s identified with ‘Marxism’ and radical projects of all sorts, while simultaneously disassociating themselves from conformist norms and authoritarian purposes. But Marxism was the rhetorical awning under which very different dissenting styles could be gathered together—not least because it offered an illusory continuity with an earlier radical generation. But under that awning, and served by that illusion, the Left fragmented and lost all sense of shared purpose.
On the contrary, ‘Left’ took on a rather selfish air. To be on the Left, to be a radical in those years, was to be self-regarding, self-promoting and curiously parochial in one’s concerns. Left-wing student movements were more preoccupied with college gate hours than with factory working practices; the university-attending sons of the Italian upper-middle-class beat up underpaid policemen in the name of revolutionary justice; light-hearted ironic slogans demanding sexual freedom displaced angry proletarian objections to capitalist exploiters. This is not to say that a new generation of radicals was insensitive to injustice or political malfeasance: the Vietnam protests and the race riots of the ’60s were not insignificant. But they were divorced from any sense of collective purpose, being rather understood as extensions of individual self-expression and anger.
These paradoxes of meritocracy—the ’60s generation was above all the successful byproduct of the very welfare states on which it poured such youthful scorn—reflected a failure of nerve. The old patrician classes had given way to a generation of well-intentioned social engineers, but neither was prepared for the radical disaffection of their children. The implicit consensus of the postwar decades was now broken, and a new, decidedly unnatural consensus was beginning to emerge around the primacy of private interest. The young radicals would never have described their purposes in such a way, but it was the distinction between praiseworthy private freedoms and irritating public constraints which most exercised their emotions. And this very distinction, ironically, described the newly emerging Right as well.
—Tony Judt, Ill Fares the Land (2010), pp. 85-91