Resident Theologian
About the Blog
Ethics primer
There are two sets of fundamental distinctions in ethics. The first concerns the kind of ethics in view. The second concerns the difference between morality and other terms or concepts we are prone to confuse with morality.
To begin with the first set, the kind of ethics in view: by my count, there are four such:
First is descriptive ethics. This is, as the name suggests, ethics in a descriptive mode: it does not propose what is good or evil, what actions to pursue or avoid, but rather offers an account, meant to be accurate but not evaluative, of what individuals, groups, religions, or philosophies believe to be good or evil, etc.
Second is metaethics. This is a philosophical approach to ethics that takes a bird’s-eye view of the very task and concept of ethics, asking what is going on when we “do” ethics. If first-order ethics is the exercise of practical reason in real time on a daily basis by ordinary people, and if second-order ethics is critical rational reflection on the reasoning processes and resulting behaviors embodied in those daily habits of moral living, then metaethics is third-order ethics: critical rational reflection on what we’re up to when we engage in second-order reasoning about first-order living. Metaethics asks questions like, “What does the word ‘good’ have in common as between its use in, e.g., Thomist and Kantian discourses?” Or: “Is all second-order ethics ineluctably teleological?” So on and so forth.
Third is normative ethics. This is the second-order ethics mentioned above: critical rational reflection on what the good life consists in and what behaviors conduce to it. Put differently, normative ethics is prescriptive; it wants, at the end of its labors, to arrive at how you and I should live if we would be good persons. The mood or mode of normative ethics is the imperative (though not only the imperative): Thou shalt not murder, steal, lie, covet, and what not. Only rarely does anyone but academics do metaethics or descriptive ethics. More or less everyone does normative ethics, at least in terms of making appeals to concrete traditions of normative ethics on appropriate occasions: faced with a hard decision; helping a friend work through a problem; teaching a child how to behave; etc.
Fourth is professional ethics. This is the code of conduct or statute of behaviors proper to a particular profession, institution, job, business, or guild. It is a contingent set of recommendations for what makes a fitting or excellent member of said sphere: If you would practice law/medicine/whatever, then you may (not) do X, Y, Z … It is important to see that professional ethics is a derivative, secondary, and belated species of ethics. It is derivative because its principles stem from but are not synonymous with normative ethics. It is secondary because, when and where it requires actions that are (normatively) wrong or forbids actions that are (normatively) right, a person “bound” by professional ethics not only may but must transgress the lines drawn by his or her professional ethics, in service to the higher good required by normative ethics. By the same token, much of professional ethics consists of “best practices” that are neither moral nor immoral, but amoral. They aren’t, that is, about right or wrong in themselves, only about what it means to belong to this or that career or organization. Finally, professional ethics is belated in the sense that late modern capitalism generates byzantine bureaucracies beholden to professional ethics not as a useful, if loosely held, revolving definition of membership in a guild, but instead as hidebound labyrinths by which to protect said members from legal liability. In this way professional ethics partakes of a certain mystification, insofar as it suggests, by its language, that persons formed by its rules and principles will be good or virtuous in character, whereas in truth such persons are submitting to a form of ideological discipline that bears little, if any, relationship to the good in itself or what makes for virtuous character.
*
Having made these distinctions, we are in a position to move to a second set. The following distinctions concern the difference between morality (which is what ethics proper, or normative ethics, is about) and other terms or concepts we are prone to confuse with morality. By my count there are five such:
1. Morality and legality. This is the difference between what one ought to do and what one is permitted by law to do. So, e.g., it is morally wrong to cheat on your spouse, but in this country, at this time, adultery is not illegal. Or consider Jim Crow: “separate but equal” was legal for a time, but it was never moral. If a black person jumped into a public swimming pool full of white people, he did nothing wrong, even if the police had a legal pretext by which to apprehend or punish him.
2. Morality and freedom. This is the difference between what one ought to do and what one is capable of doing. E.g., when I ask student X whether it is morally permissible (or “ethical”) for student Y to cheat on an exam, eight times out of ten the answer is: “He can do what he wants.” But that’s not the question. No one disputes that he, student Y, “can do what he wants.” I’m asking whether, if what he wants is to cheat on an exam, that action is a moral one, i.e., whether it is right or wrong.
3. Morality and convention. This is the difference between what one ought to do and what one’s community (family, culture, religion) presupposes one ought to do. If I ask, “Is it right for person A to perform action B?” and someone answers, “Well, that’s the sort of thing that’s done in the community to which person A belongs,” the question has not yet been answered. Cultural assumptions are just that: assumptions. They may or may not be right. Ancient Rome permitted the paterfamilias of a household to expose a newborn infant who was unwanted or somehow deemed to be defective. But infanticide is morally wrong, regardless of whether or not a particular culture has permitted, encouraged, and/or legalized it. That is why we are justified in judging the ancient Roman practice of exposure to be morally wrong, even though they could well have responded, “But that’s the sort of thing that’s done here by and among us.”
4. Morality and beliefs-about-morality. This is the difference between what one ought to do and what people think one ought to do. In other words, no one is morally infallible; each of us, at any one time, has and has had erroneous ethical beliefs. This is why, from childhood through adulthood and onward, to be human is to undergo a lifelong moral education. It is likewise why it is intelligible for someone, even in midlife or older, to say, “You know, I used to believe that [moral claim] too; but recently my mind was changed.” This distinction also makes clear that relativism is false. It is not morally right for a serial killer to murder, even if he genuinely believes it is good for him, the serial killer, to do so. It is wrong whatever he believes, because murder is objectively wrong. The truth of murder’s wrongness is independent of his, your, or my beliefs about murder. If it is wrong, it is wrong prior to and apart from your and my agreement with its wrongness—though it is certainly desirable for you and me to come to see that murder is objectively wrong, and not merely wrong if/because we believe it to be wrong.
5. Morality and behavior. This is the difference between what one ought to do and what people actually do. No one believes human beings to be morally perfect; further, no one believes human beings to be perfectly consistent in the application of their moral convictions. E.g., whether or not you would lie in such-and-such a situation does not (yet) answer whether or not it would be right to do so. My students regularly trip up on this distinction. I ask: “Would it be morally justified for you knowingly to kill an innocent person in order to save five innocent persons?” They say: “I guess I would, if I were in that situation.” But as we have seen, that isn’t an answer to my question. The question is not whether you or I would do anything at all, only whether the behavior in question is morally right/wrong. Jews, Christians, and Muslims are doubly committed to the importance of this distinction, because we believe that all human beings are sinners. Our moral compass is broken, and although we may do good deeds, our proclivity runs the other direction: to vanity, pride, selfishness, sloth, self-loathing, lust, envy, deceit, self-justification. If that belief about human sinfulness is true (and it is), then on principle we should never suppose that what anyone would do in a given situation, real or hypothetical, reveals the truth of what one ought to do. The latter question must be answered on other grounds entirely.
*
In my experience, these two sets of distinctions, if imbibed thoroughly or taught consistently, make a world of difference for students, Christians, and other persons of good will who are interested in understanding, pursuing, and deliberating on what makes for good, ethical, or moral human living. If we agreed on them in advance, we might even be able to have a meaningful conversation about contested ethical matters! Imagine that.
What do I want for my students?
I teach theology to undergraduates. That means, on one hand, that I’m not teaching them a skill. Many professors pass on a set of concrete skills meant for a job or some other professional activity: how to interview a client; how to detect a speech impediment; how to parse a verb; how to mix solutions. Not me. That’s not what theology is about.
On the other hand, I’m not teaching my students a discrete collection of facts, such that they might memorize them and, having done so, be assessed for their (or my) success. To be sure, theology contains facts—the date of the seventh ecumenical council, the name of the angelic doctor, the location of the crucifixion—but these are not the point of theology; they are necessary but relatively unimportant elements along the way.
Moreover, nine out of ten students register for a class with me because it is part of a menu of courses they are required to take. In other words, they’re with me because they have to be, not (necessarily) because they want to be. I cannot assume either prior knowledge or present interest.
Finally, professors should be honest with themselves. Whatever a student learns from me, she will almost certainly forget within five to fifteen years. No student is going to see me at a restaurant in 2035 and say, “Dr. East! Chalcedon! Theotokos! St. Cyril and the Tome of St. Leo!” Even if they did, they wouldn’t remember what those words meant. It would be an impressive student who did.
I imagine it’s hard for some teachers to accept this. Why teach if they’re going to forget it all?
Well, to contradict myself for a moment: I remember verbatim a line from a professor in a course on teaching my senior year of college. He said: Learning is what you remember when you’ve forgotten everything you were taught. (Or something like that.) That is, you do take something with you, even if you forget all the facts and figures. So what is that something?
The answer will vary based on the teacher and the topic. Here’s mine.
My principal task as a teacher of theology is the act of exposure. I want to expose my students, usually for the first time, to the Christian theological tradition. I want to show them that it exists, that it makes a claim on their lives, that it is of crucial importance to understanding God, and that it is supremely intellectually interesting. If I do nothing else whatsoever, and students walk out of my classroom having imbibed those lessons, I will have succeeded beyond my wildest dreams.
Put from a different angle, and more simply, my goal is for students to understand—or at least to see a visceral instance of someone who believes—that God matters. There is nothing more important than God, nothing more interesting, nothing more vital, nothing more imperiously imposing, nothing more existentially significant.
Further, I want my students to see that the person in front of them not only believes this to be true but has staked his life on it. More, that this person is morally and intellectually serious and—for this very reason, not in spite of it—believes it to his core. In other words, having taken me, no student will be able to say, for the rest of his or her life, that he or she never met an educated, intelligent, committed Christian adult. I’ve got all the credentials. I’ve got the knowledge. In worldly terms, I’ve got the goods. Not the goods that matter, mind you—like the fruit of the Holy Spirit or the cardinal virtues or any meaningful sign of holiness—but the goods that the world cares about. The Ivy PhD, the books and articles, and whatever other superficial symptoms of success are meant to impress on social media and dust jackets.
If the students listen to my teaching, then they will know that the point about the gospel is that these things don’t matter. They are means to other ends, often little more than filthy lucre and in any case full of temptations—not least to seek after prestige or to be impressed with one’s own resume. Nevertheless, one thing they communicate is that the person bearing them cannot be dismissed as a country bumpkin or a dime-a-dozen fundie. Even if I’m wrong, it’s not because (as they say) I haven’t done the reading. No student finishes a semester with me and thinks I haven’t done my homework. That’s the one thing I make sure to rule out.
In that sense, then, I use what’s to hand as a tool for amplifying what I’ve judged to be most important for them to hear. For the most part, they won’t remember the grammar of orthodoxy as I’ve tried to spell it out for them. What they’ll remember is that there is such a thing as orthodoxy. And whether or not they were raised on it in their home church, now they can’t claim ignorance: it exists, it’s grand, it’s rich and wide and deep—the sort of thing one might give one’s life to, as their (somewhat excitable and quite strange) professor seems to have done and (even stranger) seems to think they should, too.
My courses, in a word, remove plausible deniability. They can’t say they weren’t told. Through sheer relentless heartfelt passion, energy, and love, I give all that I have and use all that I know to show forth the truths of the gospel of God. The assignments aren’t onerous, but the reading is. I want to saturate them in the wisdom and beauty of the doctors and saints and martyrs of the church. (I want them, secondarily, to imagine that reading might be a habit worth acquiring.) I want them to see themselves in the writings of the tradition, by which I mean, I want them to see the Christ they already know in the words of ancient and unknown forebears. They knew Christ, too! Perhaps, as a result, they might have something to teach us of Christ in the here and now.
More than anything, I want my students to see in the sacred tradition of the church what Rilke saw in the torso of Apollo: A peremptory and inescapable word from beyond, addressing them by name: You must change your life.
That’s what I want for my students. I want them to know Christ, and to keep on knowing him for the rest of their lives. They can do that while eventually, or even quickly, forgetting all I ever said. And that would be just fine with me.
(Re)construction
There’s a lot of talk these days about deconstruction. I’m often asked how I approach deconstructing my students’ beliefs in the classroom; it’s typically a given not only that I do it but that I ought to do it, that it’s part of my job description.
I do not deconstruct what my students come into my class believing. I don’t as a point of fact and I don’t on principle. Why?
Not because my students lack beliefs worth giving up (which, by the way, we all do, all the time). I’ve written elsewhere about what I call theological demons that demand exorcising in this generation of Bible-belt students. So it’s true in one sense that I identify and criticize particular beliefs that (I am explicit) I want my students to reject.
But that isn’t what people mean by deconstruction, either in form or in content. The form is the thing: deconstruction is a style. Deconstruction is a mode of being, a moral, social, and spiritual habitation in which to dwell, for a time or indefinitely. Deconstruction says: I’m unlearning all that I ever thought I knew—usually about the Bible, Christian teaching, Jesus, faith, or some charged element therein. Deconstruction in the imperative says: You must unlearn what you have learned. And what you have learned, you learned from an authority in your life, namely a parent, a pastor, a church, a school, a mentor, a sibling, an aunt, a grandmother, a coach, a friend. Which means, at least as the message is received, that you must unbind yourself from the wisdom of such authorities; you must accept me, your teacher, as an authority above your inherited authorities, and defer to my learning over theirs.
This, as I hope is evident, is foolish, self-serving, and manipulative pedagogy. But it is the regnant pedagogical mode not only for professors but for every would-be influencer, life coach, self-help writer, and podcaster on the market, doubly so if they purport to be an expert on matters spiritual. And the content (i.e., the catechesis) matches the form (i.e., the pedagogy): nothing concrete whatsoever. Generic therapeutic self-affirmation clothed in whatever the latest HR-approved, capital-appropriated progressive cause happens to be. Goop gone wild; woke goop. De-toxined crystals against toxic positivity, VR social justice in the metaverse, and oh by the way click here for your subscription to the weekly newsletter from Deconstruction, Inc.; it’s only $39.99/month.
So no. As far as I can help it I don’t add my voice to the deconstruction chorus. What do I do instead then?
I build. Which is to say, I construct, or reconstruct. It’s all foundations, floor plans, building permits, and fashioning of pillars in my classroom. We don’t tear down an inch, not if I can help it.
The reason is simple. My students don’t have anything to deconstruct. Deconstruction implies the razing of a building, the demolition of a house. But for the most part, my students don’t walk into my classes with mental palaces furnished in gold, granite, and crystal. All too often, their faith is a house of cards. One gust of wind, one gentle puff of air will knock it down. I’m not interested in that. I’m not teaching in a religion department at a state school; I’m a Christian theologian, a teacher in and for the church. It’s my business to fortify, to strengthen, to secure, and to ground their faith—not to tear it down. Deconstruction is a razing, as I said, but I’m in the business of raising homes to live in. I want sturdy foundations and load-bearing walls. I want to build houses on the rock.
Because the storm is coming. It’s already here. I’m given students who for the most part believe already, or want to believe. What I do is say: Guess what? It’s true. All of it. You can trust what you’ve been taught, though you may not have been given the resources to explore the how or the why or the what-for. But Jesus really is God’s Son; he really did rise from the dead; he really is the Lord and savior of the cosmos. And from there it’s off to the races: church history, sacred tradition, ecumenical councils, creedal formulas, saints and doctors, mystics and martyrs, doctrines and dogmas and the rest.
Not one word is meant to undermine the faith they brought with them to the course. It’s meant to bolster and stabilize it. The unmaskers and destabilizers, the Deconstructors™ with all their pomp will be knocking on the doors of their hearts soon enough. I’m doing what I can in the time that I have to reinforce and buttress their defenses, so that when the time comes they are ready. Not because I want them to live free from risk; not because I want them to avoid hard questions. On the contrary. I’m usually the first to raise some of those hard questions on their behalf. But I don’t pretend that it’s better to leave questions untouched than to seek truth by answering them; I don’t model for them the faux profundity of the hip philosopher who hides his actual convictions while interrogating everyone else’s unfashionable ones.
On that day, fast approaching, when my students find themselves facing an unexpected question or challenge to their faith, instead of thinking, “My deconstructing professor was right: this Christian thing is a sham,” they might think instead, “I’m not sure what the answer is here, but the way my theology professor acted, I bet the church has thought about this before; I should look into it.” I want my students to learn the reflex, at the gut level, that there’s a there there, i.e., there’s something to be looked into—not merely something to be walked away from.
That’s why I don’t deconstruct. My classroom is a construction site. Day and night, we’re building, building, building, world without end, amen.
Exorcising theological demons
By the end of last year I realized there were two primary "isms"—but let's call them theological demons—I was implicitly seeking to exorcise in class: biblicism and Marcionism (or supersessionism). Upon reflection, as I plan to teach some upper-level majors this semester in their one and only Theology course before graduation (it all comes down to me!), I realized I have a lot more theological demons in view. Ten, in fact. Here's a brief rundown of what this pedagogical exorcist has in his sights this spring.
(I should add, before starting, that these are specifically intellectual-theological: they aren't moral or political. So, e.g., nationalism is ripe for mention, and that comes up in a different class I teach; but it's not in view here.)
1. Biblicism
By this term I mean the view that the one and only factor for any and all matters of faith and Christian life is the Bible. Think of this as sola scriptura, only with "sola" in all caps. It isn't that the Bible is sufficient for faith and morals, or the final arbiter of church teaching and practice. It's that, in a real sense, there is nothing but the Bible. This can lean in the direction of fundamentalism, but it can also lean toward hollowed-out, seeker-sensitive non-denominationalism: if teaching X or practice Y isn't explicitly commanded/forbidden in Scripture, then not only is it automatically permissible; there is no other relevant theological factor for consideration. The market wants what the market wants.
2. Primitivism
Here I mean the idea that the ultimate goal for Christians is to approximate whatever the church looked like during the time of the apostles. Just to the extent that our worship, doctrine, or practices look different from that of the "early church" (however plausibly or implausibly reconstructed), we are departing from what God wants of us.
3. Individualism
This is in the DNA of each and every one of us, so I don't fault my students for this. Nevertheless, I do my very best, across the 15 weeks I have them, to interrogate the received notion that the individual is the locus of ultimate significance, and propose alternatively that there is a way of being in the world that gives priority, or at least equal significance, to the community. They rarely bite, but the attempt is worth it. This particular demon manifests as religious autonomy: faith is a private business between me and my God, and the church is an optional add-on that I am free to accept or reject as I see fit.
4. Subjectivism
Each of these is cumulative, and subjectivism builds on the foregoing through the implicit belief that the primary, or even sole, criterion for an action is how it affects me, or how I experience its effect on me. So, e.g., certain styles of worship are self-validating because I, or the worshipers in question, self-report a positive experience. Combined with biblicism, this becomes the working principle that everything is licit that (a) produces reportage of positivity and (b) is not expressly forbidden by the New Testament.
5. Presentism
What I mean is twofold: on the one hand, the view that what is new is prima facie superior to what is old; and, on the other hand, a widespread historical amnesia about the church's past, bordering on an active, principled ignorance about and opposition to "tradition," understood as whatever the church has believed, taught, or practiced between the death of the last apostle and the day before yesterday. The former is often explicit: innovation and creativity are chief virtues in all areas of life, including religion. The latter is almost always implicit, merely inherited from church leaders and teachers who inculcated it in them, wittingly or not. I find a great deal of success in using this latter assumption as the point of entry for introducing students to a different way of thinking about the church, faith, theology, and tradition. It's hard to overstate how receptive students are to that conversation.
6. Constructivism
Here I mean what I describe for my students as "DIY Christianity." No one fancies himself or herself a proponent of the view that "Christianity is whatever I make it to be," but an astonishing number belong to churches that come very close to suggesting it. As you can tell, all six of these theological assumptions are varying forms of anti-catholicity: the church is not a living community with a rich storehouse of wisdom, knowledge, and teaching built up across the centuries; it is the sort of thing a pastor with entrepreneurial ambition can found, alone, in a local abandoned warehouse, with not a single concrete connection to either actual existing churches or the manifold saints and doctors long departed. Doctrine, statements of faith, liturgical rituals: they're built from the ground up, each and every year, each and every generation starting from scratch.
7. Anti-intellectualism
Christian faith, for most of my students, is a matter of the heart, a feeling expressed in an intimate relationship with the Lord. So far, so good. But as such, it is adamantly not a matter of the mind. Theology might be relevant to pastors—though, on the evidence, their pastors disagree—but, at best, it is optional for the laity and, at worst, is a dangerous and irrelevant abstraction. "Irrelevance" captures the heart of it: if I don't have a clear answer to the question of what I can do with a doctrine, what its practical implications for daily life are, then what could it be good for? Practicality trumps the theoretical every day of the week and twice on Sunday.
8. Marcionism
Switching gears, it is perhaps my principal goal, in every one of my classes, to exorcise my students of this ancient, wicked demon. Again, rarely consciously held, the idea is nevertheless pervasive that there is some sort of disconnect or disjunction between "the God of the Old Testament" and "the God of the New Testament." Or, the church replaces the Jews as God's people. Or, Jesus came to save us from the Law (which was, hands down, the worst). Or, God is finally loving and forgiving rather than violent and wrathful. Etc., etc. The sheer volume of times I refer to Abraham's election, or "the God of Israel," or "Jesus, the Jewish Messiah," is meant as a rhetorical corrective to what I'm sure are years of marinating in supersessionist and even at times full-on Marcionite language in their churches.
9. Gnosticism
Just as all Americans, Christian or not, are individualists, so they are Gnostics of one variety or another. In this case it manifests in one of two ways. Either none of "this" (i.e., creation, materiality, the body) "matters," since we're all going to heaven anyway (and, as I say, putting words in their mouths, nuking the earth as we depart). Or what "really" matters in Christian faith and spirituality is "the heart" or "the soul" or "the inside," not the body or what we do with the body. Fortunately, this doesn't usually lead to flat-out libertinism, though I do think there's an element of that informing behavior outside of sex. But it does inform a kind of anti-ascesis, that is, the view that spiritual disciplines are dead routines, and the notion of self-imposed (not to mention externally imposed!) periods of self-restraint in food, labor, entertainment, or sex is a conversation-stopper. It's not even intelligible as an idea.
10. Anti-ritualism
Last but not least, building on individualism, subjectivism, and Gnosticism, hostility to ritual as such rules the day. Ritual means "going through the motions," which is always and everywhere a bad thing. Hence why innovation is so important, not least in worship: what we do needs to be new lest we slip into dead routines, which we would then do "just because" rather than because "our hearts are in the right place." One's relationship with God is modeled on the early courtship or honeymoon period of young lovers: it's always summer, always sunshine, and you only spend time together—doe-eyed, deeply in love—spontaneously, because spontaneity signifies the depth of true love. (Think about contemporary Christian worship songs.) Rituals, on this picture, are what middle-aged spouses do when they schedule dates and have "talks" and even "fights." That's not what faith is like—which means we know what's happening when it starts to look routinized and ritualistic. Something's the matter.