Craig Mattson
Wayne Booth reconsidered.
Every now and then, I catch my more senior colleagues casting longing glances back to the public life of the Sixties, which, for all its asperities, exhibited more vibrancy than contemporary rhetorical culture. Several years ago, I began teaching at a small Midwestern liberal arts college, and I recall vividly when one of my new colleagues showed me, with no little chagrin, a program of student papers for an annual academic fair on our campus. Despite our school’s legacy of neo-Calvinist transformationalism, which in the late Sixties was almost indistinguishable from a neo-Marxist social critique, most of these essays in the program represented politically conservative commitments. On another occasion, I heard a peer confess feeling disoriented upon seeing student residences dotted with Bush/Cheney signs on a campus that witnessed, thirty years back, Nixon burned in effigy.
When I ask students why they do protest so little, they reassure me that they write a lot of e-mail. Oh, and they cultivate blog presence, too. But it’s hard to be impressed with point-and-click activism. Thirty years ago, in protest of an administrative decision to scuttle the college’s adherence to a particular brand of neo-Calvinist thought, students from our school joined professors for a sit-in. When I told my students this story, one asked, “What’s a sit-in?”
I sound nostalgic and more than a little censorious. But I’m not trying to resuscitate protest rhetoric. A picket line in our cafeteria today would be as odd as those red-faced street preachers who used to point their Bibles at our windshields. I am curious, though, about what this change means.
One place to start looking for an answer is a series of Notre Dame lectures by the late Wayne Booth, published in 1974 as Modern Dogma and the Rhetoric of Assent.1 Booth’s death this past October prompts a reexamination of his depiction of the rhetorical culture of the Sixties and his intuition that student rhetoric anticipates the discourse of the broader culture.
As a University of Chicago dean at the height of that turbulent decade, Booth stood between administrative rationalists and student ranters. These opposing sides, he argued, shared an essentially religious commitment to the segregation of fact from value. He could find no more articulate advocate for this divide than the public intellectual Bertrand Russell, “perhaps the last and greatest modernist to embody both extremes of the creed.” Russell, in other words, managed to speak for both sides of the divide—the champions of “fact” and the champions of “value”—because both were willing to defer to his faith in critical doubt—i.e., his insistence that mind, world, and knowledge can be reduced to what can be known by science. Buy into Russell’s dogma that we only know for sure what we can’t doubt, and here’s the insight you’re left with, Booth concluded: “I can only trick you, or force you, or blackmail you, or shoot you—and thus change your mind permanently.”
Booth’s counter was to doubt the doubters—in hopes, perhaps, that two negatives would undo a positivist. Watch, for instance, how he (Q) jabbed questions into a passage from Russell’s What I Believe (R):
R: “Man is a part of Nature, not something contrasted with Nature.”
Q: I agree, but this seems to me to be precisely what you deny when you choose to rule out all of man’s values as irrelevant to Nature.
R: “His thoughts and his bodily movements follow the same laws that describe the motion of stars and atoms.”
Q: Why? What kind of laws? The laws—no doubt extremely general—of Supreme Being? You have rejected those. The law of gravity? Of chemical combination? You have made a huge leap here. …
R: “Of this physical world, uninteresting in itself, Man is a part.”
Q: The original proposition reasserted and still unproved.
R: “His body, like other matter, is composed of electrons and protons, which, so far as we know, obey the same laws as those not forming part of animals or plants.”
Q: So far as we know, in your sense, we can also say that they don’t.
R: “There are some who maintain that physiology can never be reduced to physics, but their arguments are not very convincing.”
Q: Why? Let’s see one. And what about psychology and politics and ethics?
R: “And it seems prudent to suppose that they are mistaken.”
Q: Why? What a curious inversion of Pascal we have here!
But Booth did more than scrape the skeptics. He built a kind of transcendental argument based on what he took to be indisputable about the nature of the person: “Man is essentially, we are now saying, a self-making-and-remaking, symbol-manipulating creature, an exchanger of information, a communicator, a persuader and manipulator, an inquirer.” If this is true to human experience, then even “in a time when ‘everyone believes’ that ‘there are no shared values any more,’ ” our nature requires adherence to one “firm public value”: that we ought to engage with each other argumentatively. John Lennon’s lyrics, Vietnam protests, and Auden’s poetry are thus just as important as academic papers, logical arguments, and scientific formulae. All can be instances of the rhetoric of assent.2
Some variation on this theme continued to be integral to Booth’s work even up to his final book, The Rhetoric of Rhetoric.3 In what follows, I’d like to critique an omission in this book (as representative of an omission in the whole of his work), but I find that criticism daunting to mount, not only because I admire him so much, but also because he has, in a way, anticipated my criticism: “I can only answer, ‘Sorry, but did your last short book cover everything?’” At times, his book does read like a short course on the Rhetoric of Everything. From the history of rhetorical theory, to the contemporary state of rhetoric in education, politics, and media, to a “rhetorology” of science and religion, the book frames “Listening Rhetoric” in a way deftly helpful for pastors, teachers, media practitioners, and anybody else interested in the “art of discovering warrantable beliefs and improving those beliefs in shared discourse.” One chief obstacle to such discovery and improvement is what he called “rhetrickery,” a sophistic vice whose origin Booth traced to the dualism he had explored some thirty years back in Russell’s rhetoric.
But although Booth’s long argument with modernism has helped to disrupt the dualism between romanticism and rationalism, anyone who pays much attention to our public discourse today has to wonder about the persistence of the fact/value divide. There are some continuities: as in Booth’s day, we are still long on aspirate assertion, short on clearly consonanted reasons. But more and more, our dominant dualism emerges between preference and procedure. When I ask my students why they don’t protest administrative missteps, they tug down their designer ball-caps and say they’re not sure what channels to use. This delicate attention to procedure would, to put the matter gently, strike students of the Sixties as odd. But even more significantly, it has a privatistic momentum. The old fact/value divide compelled people to pit “What is rational?” against “What do I feel?”—a dualism that is at least half public. Today, the tension is between the questions, “What do I want?” and “How do I get it?” Or, better, “What should I wear?” and “Do you take Discover?”
Somebody ought to write Postmodern Dogma and the Rhetoric of Assent, this time critiquing a representative intellectual for our time. Any takers for Richard Rorty? Dubbed by Harold Bloom “the most interesting philosopher in the world today,” Rorty, like Russell, is an adroit rhetorician capable of speaking to non-philosophical audiences. But unlike Russell, whose rhetorical notions encouraged the noisy but sometimes necessary activism of the Sixties, Rorty’s discourse tends toward the privatistic. He would strenuously object to this characterization, noting that he has spent a great deal of ink on cultivating public-mindedness in such works as Achieving Our Country. But he brackets to private life such concerns as religion, sexuality, and other projects of self-perfection. This bracketing gives the American experiment the feel of a satellite dish network: politics becomes the technical procedure to keep the satellite online, so we can all go home and watch what we want.
At times, nonetheless, Rorty sounds very much like a rhetor of assent. Indeed, Booth included him among “those who have taught me a lot about rhetoric, even when I sometimes disagreed with them.”4 Furthermore, Rorty’s construction of a dialogue with Steven Weinberg sounds like Booth’s dialogue with Russell. Weinberg, one of today’s great preachers of the gospel of objectivity, insists that there is as much correspondence between scientific laws and nature as there is between your insistence that a stone in your backyard is objectively heavy and the stone’s “actual” heaviness. Give an ear to Rorty’s response:
But ask yourself, common reader, in your capacity as everyday speaker about rocks, whether you recognize anything of the sort. If you do, we philosophers would be grateful for some details. Do both the subject and the predicate of your sentences about rocks (“This rock is hard to move,” say) stand in such a relation of correspondence? Are you sure that hard-to-moveness is really an aspect of objective reality? It’s not hard for some of your neighbors to move, after all. Doesn’t that make it an aspect of only subjective reality?
Or is it that the whole sentence stands in one-to-one correspondence to a single aspect of objective reality? Which aspect is that? The rock? Or the rock in its context, as obstacle to your gardening endeavours? What is an “aspect” anyway? The way something looks in a certain context? Aren’t some contexts more objective than others? …
I can come up with conundrums like this for a long time, but I suspect that Weinberg would not see the point of my raising any of them. The difference between us is that I am in the philosophy business and he is not. I concoct and hash over conundrums like that for a living.5
You may be beginning to understand why Booth refers to Rorty as “excessively relativistic,” when passages like this one, with its seemingly irresistible sequence of questions, offer more velocity than validity. Here is no inquiry to be considered, only a momentum to be admired—apparently on the grounds that clever performance achieves solidarity quicker than truth claims.6 But what else is there besides performance when, as Rorty argues in Contingency, Irony, and Solidarity, language has no access to truth, the self has no identifiable nature, and society has no warrantable goal but to avoid causing pain?7
Thirty years ago, on the very day my fellow professors and their students desecrated Nixon’s image, the Ladies Guild decided to plant petunias on our campus. It must have looked like a Neil Simon troupe stumbling onstage during the second act of The Crucible. But the story also suggests that there was a time when the line between politics and petunias (or, as Rorty’s essay has it, between “Trotsky and the Wild Orchids”) was more sharply drawn than it is now. We’re not likely to see such back-to-back performances of beautification and protest on our campus today, because politics and petunias have, if not exactly kissed, at least become indistinguishable from each other. Not that there are fewer performances—only that they now fold together so well that they appear seamless. As sociologists Nicholas Abercrombie and Brian Longhurst argue, our mass-mediated lives, lined with speakers and screens, encourage us to construe public life as a sequence of performances in which “we are audience and performer at the same time; everybody is an audience all the time.” In other words, the diffusion of our mediascape wreaks havoc with once tidy modernist dualisms, such as the public/private divide: “Performances for diffused audiences are public and private. Indeed, they erode the difference between the two.”8
Rhetorical criticism of Rorty’s performances might suggest a different set of questions than the ones Booth raised about Russell. In comparison with students of the Sixties, students today may look indolent, apathetic, and egocentric. But what if their classroom quiescence is actually a species of what speech therapists call topophobia, the fear of speaking in public? What if Rorty’s garrulity is only the strange obverse of a pervasively felt stage fright?9
I listen to students for a living, as they talk in the classroom, on the sidewalk, at the coffee shop. Their vocal quality is sibilant, often nasal, with plenty of back-of-the-throat fry. Few students use their chest cavity for resonation. They often qualify their own remarks, deprecate themselves, leave sentences unfinished. Their favorite tag is some variation on “You know what I mean?” Now, you could say that all these apparently modest habits of discourse suggest a mastery of the rhetoric of assent. But it sounds to me like a loss of rhetorical nerve, as if students have picked up Booth’s inflections but not his convictions. They sound like actors who have mastered a dialect but can’t remember their lines. Call it the rhetoric of accent: slow to speak, slow to anger, and quick to shrug.
Could we tinker with Booth’s rhetoric of assent to help out the topophobics? Stephen Webb has argued that stage fright is a deeply theological issue, so for starters let’s think about Booth’s theology. He identified himself as “a lifetime pursuer of religious truth” and described his journey “beginning as a devout orthodox Mormon, through increasing doubt to professed atheism, to a recovery of religious belief that some might call mere pantheism, or perhaps Deism.”10 At first blush, these two theological descriptors, one transcendent and one immanent, appear opposed. But he was right to insist upon the importance of both. “Scores of books have reported the quest for a final theory that will explain everything. Why? Because ‘everything’ is really there, waiting to be explained—and it is also here, supporting our pursuit of it.” But what is neither here nor there, Booth thought, is a personal God. Instead he preached a god term, helpful for instilling modesty in people who talk too much, but not finally adequate for people who can’t manage to talk at all.11 What John Updike somewhere describes as our frail and faltering being may well require not just a god term but a Word who, in all eloquent grace and outspoken truth, is God coming to terms with us.
Craig Mattson is associate professor of Communication Arts at Trinity Christian College in Palos Heights, Illinois.
1. Wayne C. Booth, Modern Dogma and the Rhetoric of Assent (Univ. of Chicago Press, 1974).
2. These discourses—Lennon’s lyrics, for example—are referred to by John Hammerback, one of Booth’s careful readers, as “reconstitutive rhetoric”—conversionist discourse that works as much by identification and irony as by logic and empirical demonstration. Hammerback draws heavily on Booth (in conjunction with Kenneth Burke and Edwin Black) in developing a model of reconstitutive rhetoric that seeks to explain how certain rhetorics which, by every modernist light, should have failed, nonetheless succeeded. His model posits that some people, when addressed by convincing discourse, don’t just change their minds—they change their identities. See John C. Hammerback, “Barry Goldwater’s Rhetorical Legacy,” The Southern Communication Journal, Vol. 64, No. 4 (Summer, 1999), pp. 323-332.
3. Wayne C. Booth, The Rhetoric of Rhetoric (Blackwell, 2004).
4. Booth, Rhetoric of Rhetoric, pp. 82–83.
5. Richard Rorty, “Thomas Kuhn, Rocks, and the Laws of Physics,” in Philosophy and Social Hope (Penguin, 1999), pp. 184-85.
6. Booth would also want to point out that such a momentum is hardly as overwhelming as it might appear at first. Rorty assumes that if the truth cannot be found to exactly correspond to our statements, then the truth must not be findable at all—a dogma Booth deconstructed for more than thirty years.
7. Rorty’s problem may not be merely epistemological but also “acoustemological,” to use Stephen Webb’s neologism from The Divine Voice (Brazos, 2004), which I reviewed in the May/June 2005 B&C. Booth, whose practice of “Listening Rhetoric” makes him attentive to so many aspects of public discourse, is peculiarly unobservant when it comes to the sound of talk. But aurality may be especially important in the Rortyan passage cited above. Rorty cleverly ticks questions off in a list of “conundrums,” but he sounds bored, almost as if he were reading the passage over an intercom. C. S. Lewis accused moderns of being hollow-chested. This discourse sounds hollow-voiced.
8. Nicholas Abercrombie and Brian Longhurst, Audiences: A Sociological Theory of Performance and Imagination (London: Sage, 1998), pp. 73, 76.
9. Stephen Webb is helpful again: “Displaced, uprooted, and detached from traditional communities, we often do not know what to say when we open our mouth.” The Divine Voice, p. 77.
10. Rhetoric of Rhetoric, p. 160.
11. Kenneth Burke, A Rhetoric of Motives (Univ. of California Press, 1969), pp. 275-276. A “god term” is our peculiarly human quest for “pure persuasion, absolute communication, beseechment for itself alone, praise and blame so universalized as to have no assignable physical object”—whatever, in short, we persistently name as our highest good.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Paul Charles Merkley
Israel’s most relentless critic.
A dedicated, manic assassin of the reputation of the Jews and of Israel, Norman Finkelstein is much admired by college student audiences for his lively platform presence and his snarling, late-night comic style. The Finkelstein method (which brings the audiences to his lectures) is to hold up to ridicule individual pro-Israel polemicists by endless nitpicking about references that go wrong or about anomalies and contradictions between and among their many published statements in many different times and places. The entire lifetime record of the published author is picked over for anomalies, contradictions, and food for tu quoque. Finkelstein brings a virtual wheelbarrow of documented errors onto the platform and pours it out, to the delight of the audience, as proof that the general truths from which his adversary draws his scholarly or political commitments have, before your very eyes, been proved to be “myths,” “frauds,” and “hoaxes.”
Beyond Chutzpah: On the Misuse of Anti-Semitism and the Abuse of History
Norman G. Finkelstein
343 pages
$65.00
No serious person can deny the doggedness of Finkelstein’s pursuit down the path from footnote to footnote. The effect can be quite chilling, especially when it comes home to the vulnerable celebrity-polemicists such as Alan Dershowitz and various spokesmen for the Anti-Defamation League or the Simon Wiesenthal Center. Dershowitz gets the full Finkelstein treatment in the present book; in fact, the bulk of it is a rehash of the record of Finkelstein’s several assaults on Dershowitz’s uneven polemics. The real scholars, producers of the unanswerable accounts which draw upon documents in all the relevant languages, don’t get noticed—except (for example) in a footnote about the reigning master of Middle East History, Bernard Lewis, where in the tu quoque mode, a reference appears to a remotely relevant matter (Lewis’ judgment on the historicity of the Armenian massacre). There are no references to any of my three published scholarly books on matters very germane to Finkelstein’s apologetics. But then, I am being petty.
Anyone whose familiarity with the historical record is second- or third-hand is almost bound to carry away from these lively performances the impression that he has just seen reduced to ruins the truth upon which the other side (the Jews, Israel, and the friends of both) depends—that all that massive detail about discrepancies in the references, all that gotcha, adds up to demolition of historical truth.
Let me note a few departures from reality (in order of occurrence): The opening line of Finkelstein’s book is about Joan Peters’ 1984 book, From Time Immemorial: The Origins of the Arab-Jewish Conflict Over Palestine, which we are immediately told is now universally dismissed as “a gigantic hoax.” In fact, the first reviewers of that book noted the whiff of scissors-and-paste about it but welcomed it rightly as providing for general readers proof of the fallacy of Palestinian nationhood. Meanwhile, although other scholars have fleshed out the same theme with more accurately reported documentation, Peters’ book remains a valuable summary vindication of the observation (sustained by all the travel literature and all the governmental surveys, all the royal commissions and all the scientific demographic and topographic studies) that it was the success of the Zionist experiment in that part of the Ottoman Empire that created the basis for the development of economic life. And it was this success that drew an adequate population base (including tens of thousands of Arabs from nearby regions) to lay the foundations for partition of the region and the eventual erection on the site of two mutually respectful political entities: a Jewish State and another Arab State (Jordan, four times the size of Israel, having already been carved from the mandate).
Finkelstein shows no familiarity with the monumental scientific studies of the region conducted in the 1930s by Walter Lowdermilk, Assistant Chief of the Soil Conservation Division of the Department of Agriculture in the Roosevelt Administration, housed today in the Franklin Delano Roosevelt Library. These proved irrefutably that centuries of absence of adequate population had led to the degradation of the soil, and that the beginning of its restoration traces to Jewish colonization. Nor does Finkelstein make reference to any of the other scientific studies, including those commissioned by the Mandate Authority in the inter-war years, which document the same conspicuous truth. If he has heard of these studies, Finkelstein is suppressing knowledge of them; if he has not, he stands in contempt of historical record and scientific fact.
Then comes Finkelstein’s blanket denial that any substantial part of the population of the region in pre-Mandate days was other than “indigenous”—a term which he uses exactly as it is used of North American first nations. This assertion stands together with another: that what was indigenous was Arab. The reality (again borne out by all the Ottoman documents, all the Royal Commissions and other scientific studies of the time as well as the memoirs and reports of all the missionaries) is that many races besides the Arabs (Turks, Kurds, Circassians, and many descended from blocs of inhabitants transferred from other parts of the Ottoman Empire, including the Balkan peoples) lived in Palestine when the Jewish settlers arrived in the mid-19th century. In the face of all this, Finkelstein simply posits that Palestine and the Palestinians have always been exclusively Arab.
Throughout the book we find evidence that Finkelstein has bought, kit and caboodle, the PLO’s fantastic anti-history: the Jews did not originate there four thousand years ago, they wandered in somewhere along the line unnoticed by History, but were thrown out by the Romans upon the destruction of the Second Temple two thousand years ago; none were to be found thereafter until after the modern Zionists, abetted by cynical British imperialists, began their usurpation of the land. The record of history and the methods of history count for nothing in this company. Everything about the past that Palestinians believe they cannot live with Finkelstein dismisses as “hoary Zionist myths,” “propaganda,” or “fairy tales.”
Then we have the assertion that “it is today conceded by all serious scholars” that the Arab radio broadcasts of 1947–1948, urging local Arabs to flee, are “a Zionist fabrication.” No, not all serious historians have conceded this point; indeed, even memoirs of Arab statesmen of the time make reference to the broadcasts. Ditto Finkelstein’s assertion of undisputed consensus for the claim “that the Palestinians had been ethnically cleansed in 1948.” All the major details of the story leading to the success of the Jewish struggle to achieve their homeland with the approval of the United Nations are tossed aside as “myths … Zionist fabrications,” which persist, we are told, only because most people have read the book Exodus by Leon Uris, or have seen the sentimental movie based upon it.
The same “unqualified consensus,” Finkelstein assures us, maintains that “Palestinian detainees have been systematically ill treated and tortured, the total number now reaching probably tens of thousands.” Among many other difficulties with this assertion, there is no reference at all in these pages to the thousands of Palestinian prisoners imprisoned as terrorists, many of them multiple-murderers of Israeli citizens, who have been released and restored to the Palestinian Authority in the misguided hope of winning points with American and world opinion. Most of these have evidently resumed their careers as assassins.
But the epitome of chutzpah is Finkelstein’s breathtaking assertion that there is “on historical questions” an “unqualified consensus” against any part of the story about current events as told by Israel and the Jews—”or at least among those sharing normal human values.”
All of this gets us to the top of page 3. To continue at this pace would take a lifetime, for which I have better uses. What sticks to me as I put aside this noisome book is the odor of loathing for Jews everywhere—for their history, for their habits of thinking, and for their pervasive influence in the world. To achieve this unrelieved contempt for Jews, for Israel, and for the friends of Israel, Finkelstein raises the Palestinians to the unrelieved dignity of victims. There is no hint in all these pages of why Israel has had to resort to violence. There is no hint of moral distinction between violent force exercised by a state in defense of the lives of its citizens and violence exercised by suicide bombers. There is no reference to the history of Arab terrorism and no reference to the present reality of Islamic terrorism—although we are told that “the overarching purpose of the ‘war on terrorism’ [led by the U.S.] has been to deflect criticism of an unprecedented assault on international law.”
No thought at all is given to the circumstance that has forced Israel to arm itself as it has done, to rally its entire population to make the sacrifice of compulsory military service. There is no recognition that from its birth the only option given to Israel has been self-defense or liquidation—something always candidly declared by its Arab foes. Symptomatic of this technique of avoiding matters of behavior on the part of Israel’s enemies is the fact that there is not a single substantial reference to Islam in the entire book. (Present company will want to know what is said about Christians and the Church—but there are no references in the index to either. There is, however, a routine sprinkling of undeveloped asides concerning American fundamentalist tools of the Israeli right.)
No honest friend of Israel denies that Israeli soldiers and Israeli citizens have often—too often—resorted to foul means. Israel has in place legal mechanisms for detecting and punishing these aberrant acts; many Israelis languish in prison today for their unlicensed assaults against Arab civilians. There exists in the Israeli press, among Israeli scholars, and in the fray of Israeli politics a lively debate on such matters (including voices as critical of Israeli government policy as Finkelstein is).
But in the world according to Finkelstein, Israelis are sadistic oppressors—partly by conditioning, but ultimately by nature. “Of course,” he says, “with marginal exceptions, no one contests Israel’s right to defend itself against terrorism; the criticism springs from its gross violation of human rights in the name of fighting terrorism.” But the “exceptions” are not “marginal”: they include the entire Muslim world, a working majority of the member-states in the General Assembly of the United Nations, and at least the left end of the political spectrum throughout the Western world.
Those of us who imagine that our loyalty to Israel follows from sincere calculation of historical rights and wrongs are mindless dupes of the Jews’ mighty propaganda machine. The Jews exploit the world’s fear of appearing anti-Semitic to command silence while they perpetrate unremitting sadistic violence upon an entirely innocent population. No one will be surprised to learn that their partner in this cosmic crime is the United States: “The brutal U.S. aggression against Vietnam and the Bush administration’s aggression against Iraq engendered a generalized anti-Americanism, just as the genocidal Nazi aggression during World War II engendered a generalized anti-Teutonism. Should it really surprise us if the cruel occupation by a self-declared Jewish state engenders a generalized antipathy to Jews? … The real wonder is that the spillover hasn’t been greater.”
Leave aside, for our purposes here, the equation of the United States with Nazi Germany. Finkelstein’s words encourage his readers and listeners to treat Israel and all Jews as outlaws. What does this lack to distinguish it from the Jew-hatred which spills out daily from the imams of Palestine as of the other Arab polities? Theological referents aside, how do these words differ from the recent message of the official imam of the Palestinian authority, Sheikh Mudeiris: “Why is there this malice [as Muhammad taught, on the part of the rocks and the trees towards the Jews]? Because there are none who love the Jews on the face of the earth: not man, not rock, and not tree, everything hates them. They destroy everything, they destroy the trees and destroy the houses. Everything wants vengeance on the Jews, on these pigs on the face of the earth, and the day of our victory, Allah willing, will come.”
This is a book that has to be read very slowly and carefully, neglecting none of the footnotes, none of the charts and graphs and none of the appendices—or not read at all. Having done the former, I recommend the latter.
Paul Charles Merkley, a retired professor of history at Carleton University in Ottawa, Canada, is a consultant on foreign policy. He is the author most recently of American Presidents, Religion and Israel (Praeger).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
John Wilson
Wendy, my wife, is trying—again—to persuade me to stop subscribing to the New York Times. (“We could pick up the Sunday paper every week at Starbucks,” she says.) I know that she is mostly thinking of me. She is sure I am suffering from information overload. But also to preserve her own sanity and the harmony of our long union she’d like to reduce, even just a little, the flow of printed matter into our home. Quite apart from what’s going into my head, there’s too much wordstuff, books and magazines and journals and newspapers, always threatening to colonize another flat surface.
You may be thinking that there’s a very satisfactory compromise ready at hand: the Times on the web. I do go to the website a number of times in the course of a week, for one reason or another, but, much as I value that resource, it’s no substitute for holding the paper in my hands. The Times and the Chicago Tribune arrive each day in their blue plastic wrappers as surely as the sun rises. When a good friend and fellow editor told me recently that he’d stopped reading the Times, fed up with the smugness and moral vacuity of the paper’s party line, I was stunned. It was a little like hearing that a friend has sold or given away his possessions and gone to live among the poor.
Of course I understand his exasperation. Perhaps he was afraid that reading the Times was tempting him on a daily basis to feel morally superior. That’s certainly a hazard one must reckon with. Consider this headline from the Tuesday Science section (Nov. 29): “A Pair of Wings Took Evolving Insects on Nonstop Flight to Domination.” Can’t you hear that intoned in the slightly menacing voice of a PBS narrator? The article, by the well-known science writer Carl Zimmer (and illustrated with superb photos), lives up to the headline. Here’s my favorite paragraph:
And insects are also ecologically essential. If all humans decided to leave for Mars, taking all vertebrates with them, the disruption of life on Earth would be incomparably less than the catastrophe that would ensue if insects disappeared. Forests would probably collapse, rivers and oceans would be poisoned, and many other animals would starve.
It’s hard not to sense in this passage the implication that insects are somehow virtuous even as they revel in world domination; as Zimmer puts it, no matter how you slice it, “insects still win.”
We’ll return to this theme in a future issue (right now I am reading a fascinating book by Thomas Eisner, For Love of Insects, published by Harvard University Press in 2003). But it would be an impoverished reader who gleaned from the Times only those pieces that come heavy with ideological freight. Last night while Wendy took a bath I read aloud to her from an article by Sarah Lyall, datelined Barry, Wales (Nov. 29). Lyall was reporting on the Mosquito, an ingenious invention of security consultant Howard Stapleton, who drew on an obscure feature of human hearing—that children can hear sounds at higher frequencies than adults can—to fashion a novel device that he hopes will provide “a solution to the eternal problem of obstreperous teenagers who hang around outside stores and cause trouble.”
In its first trial, outside a convenience store in Wales, the device has performed superbly, emitting a “high-frequency pulsing sound” that is extremely irritating to young people but that Lyall herself, she reports, could not hear.
And then there was Howard W. French’s “Kung Pao? No, Gung Bao, And Nix the Nuts,” (Nov. 23), which my daughter Anna read aloud to the whole family on Thanksgiving. French visited the city of Guiyang in China’s Guizhou province, the “ancestral home” of the dish Americans know as kung pao chicken but which in Guizhou is called gong bao jiding, “a dish whose perfume wafts through the air, distinctive even over the smell of tobacco smoke.” A celebrated chef in Guiyang, Wang Xingyun, is quoted at length deploring the manner in which the dish is prepared in Sichuan province—especially the use of peanuts. Accompanying the article is a recipe based on Wang’s own, which we haven’t yet had a chance to try.
Just before Thanksgiving, Wendy and I were in Philadelphia for AAR/SBL (the annual meetings of the American Academy of Religion and the Society of Biblical Literature). In a report on the conference for Books & Culture’s website, I described AAR/SBL as a “chaotic marketplace of ideas” and said I found it exhilarating. A friend wondered about that description. Wouldn’t “depressing” be a better word for it, given the high proportion of confusion and sheer untruth?
That was a good question. The convention, like a sprawling city, is both exhilarating and depressing, a site of great energy and variousness—a lively place—and also a place of darkness. Of all the reasons I read the Times, I think the foremost is to taste that variousness, the unpredictable harvest of the day—unpredictable, yes, despite the paper’s ideological grid. “When a man is tired of London,” Dr. Johnson said, “he is tired of life; for there is in London all that life can afford.”
In conjunction with the 50th anniversary of Christianity Today magazine, founded in 1956 by Billy Graham, Christianity Today International, with support from the Pew Charitable Trusts, is embarking in 2006 on what we’re calling the Christian Vision Project. On p. 7 of this issue, Andy Crouch, the project director, outlines this three-year venture and introduces Books & Culture’s first piece under the CVP rubric, an essay by Lauren Winner urging Christians to do something truly countercultural: get more sleep.
Neil Gussman
An American weapon that has never killed an enemy but still claims innocent victims.
Editor's Note: This article about the strange and admonitory history of the chemical weapon lewisite was first published in the January/February 2006 issue of Books & Culture. Less than a week ago, I read a news article about cleanup efforts at Redstone Arsenal in Alabama, where one of the plants produced mustard gas and lewisite. Other sites at the facility contain residue from chemical weapons produced during World War II. The cleanup is projected to last for several decades.
Dew of Death: The Story of Lewisite, America's World War I Weapon of Mass Destruction
Joel A. Vilensky
Indiana University Press
240 pages
$8.89
Does the word "chemical" make you uncomfortable? Do you think of "natural" and "chemical" as opposites? When the TV pitchman promises "Cleans faster with no harsh chemicals" do you give him just a little more attention? If so, you are not alone. Most modern people are afraid of chemicals. Which is too bad. Because elephants, bacteria, humans, mice, trees, and tigers are all chemical factories so efficient that chemical makers wish they could approach even a small fraction of the efficiency of any living organism.
Chemicals and chemistry were not always the subject of fear and dread. In the 19th century, chemicals became the building blocks for effective drugs and for rapid advances in public health. Clean water, anesthesia, and painkillers—things we take for granted today—were and are the result of advances in chemistry.
It's not hard to trace the beginning of the downhill slide in the image of chemistry. On April 22, 1915, Captain Fritz Haber ordered German troops to open the valves on 6,000 pre-positioned cylinders of chlorine. Within minutes, Algerian and French troops in trenches near the Belgian village of Ypres saw a yellowish-green cloud rolling toward them. As the heavier-than-air gas filled their revetments, the troops who could run did; the rest writhed in agony as the gas burned their throats and eyes and finally drowned them in the fluid of their own lungs.
It is doubly sad that Haber selected chlorine for the debut of gas warfare, because chlorine has made billions of lives better across the globe in the last hundred years. The vast majority of drugs use chlorine in some step of their synthesis, and chlorine is still the most widely used and effective disinfectant for public water systems. The beginning of the 20th century was the beginning of the end of cholera and other water-borne plagues because chlorine kills germs so well. Then, Captain Haber showed that an element that could kill germs could also kill and maim people.
Had the German army pressed its attack on that horrible day, the war might have ended before the United States joined the Allied armies in 1917. But the Germans checked their advance. The French and British counterattacked, and the war dragged on for three more years: years with millions of combat casualties, including hundreds of thousands injured and dead from chemical attack.
If chemistry's reputation was damaged by World War I, the field nevertheless retained much of its luster through the great advances of the mid-20th century. Then Love Canal, Silent Spring, Agent Orange, and Bhopal tarnished the public image of chemistry to the point that, beginning in the 1990s, some chemical companies and even some chemical trade organizations have changed their name to remove the word "chemical." Sometimes the switch is subtle, from chemical to chemistry (focus groups show the public is less afraid of the latter); sometimes the new name is a vague, Latin-derived neologism that hints at science.
And then in 2005, the 90th anniversary of the first use of chemicals as a Weapon of Mass Destruction (WMD) and the 60th year since the first use of the most fearful WMD, the atomic bomb, Joel A. Vilensky published Dew of Death: The Story of Lewisite, America's World War I Weapon of Mass Destruction—a weapon hailed as the deadliest in history, yet one that has never been used by America in combat and likely would not have been effective if it had been employed.
Vilensky's account begins with two American chemists who were born in the same year and were instrumental in the development of lewisite. They never worked together, they did research in very different areas of chemistry, and while they may have corresponded, they probably never spoke to each other.
The development of lewisite began in 1903, when one of these chemists, Father Julius Nieuwland, mixed acetylene and arsenic trichloride and nearly killed himself. Fifteen years later, our second man, Captain W. Lee Lewis, a chemist with no background in poison gas development, volunteered for war service. He was given lab equipment and told to develop a chemical weapon deadly enough to end the war.
Where did Captain Lewis begin his search for the ultimate weapon? In facilities donated to the war effort by Catholic University of America (CUA) and American University. His search ended when a librarian remembered that the first Ph.D. thesis ever approved at CUA included an experiment that put the candidate in the hospital for a week.
Lewis expanded Nieuwland's work and developed the organic arsenic compound that would later bear his name. In testing, this evil concoction killed dogs, donkeys, and goats by the score and had much to recommend it as a chemical weapon, but it also had drawbacks; chief among them was a tendency to break down in water. (As a reader, I wanted to know why this weapon was selected for development and mass production when it had no combat trials. The author wanted to know this also, but the 90-year-old records were sealed shortly after the attack on America on September 11, 2001. Without a change in policy concerning these records, we may never know.)
Could the leaders of the American Chemical Warfare Service really have thought that the Germans had not developed and tested the same compound? Their commitment to secrecy suggests they did believe that the United States was alone in developing lewisite. In fact, however, the German chemical weapons program had synthesized and tested lewisite (along with other organic arsenic compounds) before rejecting it.
In this instance the assumption of U.S. weapons-makers was extremely parochial. During the period in which lewisite was developed, Germany dominated chemistry, especially organic chemistry. (An organic chemical is a compound that contains at least one atom of carbon. Lewisite includes two carbon atoms in its most deadly variant, more carbon in its weaker forms.)
How extensively did Germany dominate organic chemistry? On November 2, 1916, while America remained neutral and World War I raged in Europe, the submarine Deutschland, having crept through the British blockade of American ports, landed at New London, Connecticut. Onboard this unlikely cargo vessel was a shipment of indigo dye and of Salvarsan, the first drug to successfully treat syphilis. America's chemical industry was weak enough and the German need for currency was strong enough that a U-boat carried less ammunition in order to deliver dye to American mills. The following year, America was at war with its dye and drug supplier, and U-boats no longer docked in Connecticut.
Unaware of the German testing and rejection of this purported superweapon, the United States began to produce lewisite on a large scale. The result: hundreds of casualties among soldiers who never left America, poisoned ground in Ohio and Washington, D.C., tons of arsenic-based poison dumped at sea and buried on American soil, and not one enemy casualty.
The story of lewisite shows that even an unused weapon can be lethal. At the end of World War I, the U.S. Army was making lewisite at the rate of at least several tons per day in Willoughby, Ohio. Since the government records are sealed, Vilensky could not determine the exact amount stockpiled by the end of the war, but it is likely that many tons of lewisite and lewisite-contaminated equipment were buried in and around Willoughby as well as near Catholic University and American University, where Lewis and his team did much of their testing and development work.
Although informed opinion concluded that lewisite was impractical for battlefield use, leading American newspapers and magazines reported on lewisite after the war with claims that have their genesis in a Department of the Interior exposition held in Washington. One of the displays included a vial of the "deadliest poison ever known, 'Lewisite.' " On May 25, 1919, the New York Times said ten airplanes carrying lewisite "would have wiped out . . . every vestige of life—animal and vegetable—in Berlin." The article went on to claim "a single day's output [of the Willoughby plant] would snuff out millions of lives on Manhattan Island." Also reporting on the exposition on the same day, the Washington Post said, "one day's output of the lewisite plant was sufficient to kill all four million inhabitants of Manhattan." Less than three weeks later an article in the Cleveland Plain Dealer said lewisite was "72 times as toxic as mustard gas." Mustard gas was in fact used during the war with devastating effect by armies on both sides of the conflict. Vilensky quotes these and many other contemporary sources—in which the sense of horror is trumped by a boasting tone—to paint a picture of an age almost beyond the imaginative reach of most people living today, an age in which the label "scientific" was good and the godlike men of science would lead us to a better future.
But the story doesn't end there. Vilensky goes on to tell how this untried weapon spread across the globe. In the years between the world wars, most military leaders were averse to using chemical weapons but needed to have these weapons in case the other side used them. So most of the major combatants in World War II built up stockpiles of chemical weapons that were never used. As in World War I, the only lewisite casualties in World War II were plant workers who made the poison and soldiers who "volunteered" to test the weapons. The greatest production—and the highest death toll—by far was in the Soviet Union, with stockpiles the precise extent of which, Vilensky says, cannot be accurately determined but were surely in the tens of thousands of tons.
Some of these Soviet stockpiles have been buried or dumped at sea; the rest are still waiting to be neutralized and disposed of. And the tale of horror continues. Japan, China, Canada, and other countries as well as the United States and Russia are still dealing with lewisite-blighted ground and poisoned citizens nearly 90 years after the compound was hailed as the most deadly weapon in the American arsenal.
This brief and thoroughly chilling book shows how men of good intentions under the pressure of war can make errors in judgment that haunt the world long after they are gone. Vilensky connects America's two largest WMD programs to one man: James Bryant Conant. In the summer of 1918, Captain Conant moved lewisite from development to mass production facilities, hoping that lewisite could at last bring an end to the terrible conflict. When the war ended sooner than expected, Conant returned to civilian life and went on to become president of Harvard University. Then during World War II, he became administrative head of the Manhattan Project. In that role, Conant used many of the procedures that he developed as production chief for lewisite to aid in development and production of the first atomic bomb. And this time, Conant oversaw the production of a weapon that was infamously effective.
Vilensky opens a window into the world of science at war, how discoveries become weapons, and how weapons can harm those who wield them. His modest original intent was to explain the origins of a compound called British Anti-Lewisite (BAL) that was developed as an antidote to lewisite. BAL has been used to treat nervous disorders for half a century and has been much more useful in this role than as an antidote to a poison gas never used against British troops. His persistence and curiosity led to a book that will have an important place in the literature on this ghastly form of warfare.
Before launching into the text of the book, the reader will see that politics and WMDs are never far apart. Richard Butler, former head of the United Nations Special Commission to Disarm Iraq, writes in the foreword that chemical weapons were not used in the latter half of the 20th century with "two notable exceptions." They are Iraq in its war with Iran and the United States in Vietnam. Butler says that the United States used defoliants with the intent to harm enemy soldiers and has been lying about that use ever since. From its foreword by a controversial weapons inspector to its very thorough bibliography, Vilensky's book is interesting, provocative, and frightening.
Neil Gussman writes about the history of chemistry for The Chemical Heritage Foundation in Philadelphia.
William Edgar
Rediscovering the witness of Hans Rookmaaker.
In retrospect, romanticism about the 1960s is overstated. Alongside George Harrison’s sermons on Sergeant Pepper about being “all one and life flows on” and Timothy Leary’s League of Spiritual Discovery (lsd) we must set the addictions, the deaths, and the wasted lives from Haight Ashbury to suburban New York. Alongside the anti-establishment flower power of the hippy movement, the confused lives in the communes. Alongside the Pax Americana, the brutal Realpolitik of American engagement in Vietnam. Alongside the social programs and the war on poverty, the political assassinations in America and student barricades in Paris.
Although things would eventually return to some kind of normalcy, the 1960s represented a sea change, from the relative social conformity of the years after World War II to a multi-layered, conflicted culture, an unprecedented polarization between Left and Right, new and old, rebellion and conformity. Earlier voices in the 1950s had pushed the envelope, from the Juvenile Delinquents saluted in The Blackboard Jungle to Elvis’ risqué gyrations and Chuck Berry’s celebration of teenage identity, but the full flood of defiance came in the next decade. Elvis joined the army, and rock became profligate. Hopeful Abstract Expressionism gave way to cynical Pop, Op, Neo-Dada, and Happenings. Cary Grant and Doris Day were replaced by Meryl Streep and Dustin Hoffman. Ozzie and Harriet were no longer everyone’s pop and mom. Things were at best confusing. At worst they were dangerous. The arts were both descriptive weathervanes and prescriptive prophecies.
At the center of those times, a rather lost young man, a jazz pianist by night, a sophomore music student at Harvard by day, made his way up the mountain toward Villars, Switzerland, stopping in a tiny village called Huémoz, where his life would be forever changed. After a long journey I became a follower of Christ. The people I met there, and their message, became the network undergirding my new-found countercultural faith in evangelical Christianity. The year was 1964, not long after John F. Kennedy’s assassination. The first student to don a Beatles haircut had just walked across Harvard Yard to everyone’s amusement. Less amusing was the spread of hallucinogenic drugs around the community. We lived under the threat of the bomb, and of the draft, a conscription which would send us to Asian jungles to fight a war we did not endorse. The Cold War was seething.
Here at l’Abri I had found a place, completely off the beaten path, where enlightened instructors could make some sense out of our disturbed times, based on biblical Christian faith. The major voice in the community was Francis A. Schaeffer. I had not known such exuberance in my college classes as I did under his teaching. It was wide-ranging, imprecise, passionately delivered, and always related to a unifying worldview. But another voice, at first more muted, but which became for me the more significant influence, was that of an idiosyncratic Dutch art historian. I first knew about him from a chart hanging on the wall of Farel House, the name given to a section of Chalet Beausite, where we studied tapes every day. It was a history of African American music, beginning with spirituals and blues, and moving to the jazz era. It was signed Hans Rookmaaker. I had come to expect connections of all kinds at l’Abri, a place dedicated to exploring the relation of Christian faith to just about everything. But jazz music? Could I have arrived at paradise before my time? And who was this man?
I eagerly found my way through the large tape collection to a series on jazz, full of musical illustrations from rare recordings, delivered in beautiful English with a Dutch accent. More careful, less overtly emotional than Francis Schaeffer’s, the voice was clear, compelling, and utterly fascinating. Hans Rookmaaker spoke of the great artistry and authenticity of Victoria Spivey, Texas Alexander, Bumble Bee Slim, Blind Willie Johnson, and a host of other founders of classic black music. Not only was Rookmaaker the European editor of Fontana Records’ series, Treasures of North American Negro Music, but he had been to America and met Thomas A. Dorsey, Mahalia Jackson, and Langston Hughes. What was the attraction of jazz to this Dutch art historian? For that is what he was during his professional career.
He said it often in his lectures and throughout his writings. It put iron into the blood! Discussing his hero, Joseph “King” Oliver, he compares the New Orleans cornetist’s orchestral sounds to the music of J. S. Bach. He finds very similar musical qualities in the baroque polyphony of the Brandenburg Concertos and Oliver’s Creole Jazz Band from the 1920s. Not only the technical structure, but the mood and atmosphere are similar. Especially, he finds in both of them joy, true joy, not romantic escape. In stark contrast to Theodor Adorno’s attacks on jazz, which found it “unruly,” “rebellious,” and “emasculating,” Rookmaaker describes it as orderly, harmonious, and full of vigor. The opposite of joy for him is happiness, or the escapism of those who look for depth in the tragic and ruinous. And the ultimate source of true joy, whether in jazz or any other human expression, is biblical Christian faith, which Bach and Oliver shared.
During his lifetime, Hans Rookmaaker guided a great host of students into a strategy for understanding their times and working within their society with courage and creativity. His best-selling Modern Art and the Death of a Culture (IVP, 1970) was nothing short of a ground-breaking study of the surrounding culture, both in its threats and its promises. He dared to make sense of the steps to modern art by noting the general trend from a theocentric world to an absurd universe that lay behind the pictures. Malcolm Muggeridge, himself a returned prodigal, gave it a ringing endorsement on the pages of Esquire. Following in the tradition of the historian Groen van Prinsterer, the theologian-statesman Abraham Kuyper, and the philosopher Herman Dooyeweerd, Rookmaaker believed there was a spiritual background to Western painting which was the key to unlocking its meaning. However, unlike amateur attempts to reduce art to philosophy, Rookmaaker led the reader on a visit to hundreds of paintings, writings, and musical numbers, pausing to scrutinize their composition and motifs.
While the clarity of his pages has fooled some into thinking he was merely a popularizer, or, more gravely, that he ran slipshod over the inner dynamics of particular works of art in order to discern their message, the truth is that behind every one of his judgments there was considerable research. It’s just that he did not want to miss the forest for the trees. What his critics feared at the time was that he made facile connections between an artistic statement and its philosophical orientation. They worried that he was from a bygone era which had not yet escaped the carelessness and even the paternalism of such judgments.
Perhaps there is some truth to this. In his praise of Groen van Prinsterer, Rookmaaker compares the statesman’s history of Holland to the books of Kings in the Bible, because both are able to discern the hand of God in history. So, there is a hint of providentialism here. Still, we have gone way over to the other extreme. Besides often being unfair, there is something sad about our timid refusal to look for meaning in a text. Have we not become jaded in our over-sensitivity to hermeneutics? Have not our critical requirements turned us into snobs of a different kind? When we read the works of Rookmaaker and others in the previous generation of scholars, we are in a different world. The air is full of oxygen. They are capable of enviable lucidity. Sure, they made their judgments, but these were often well considered, delivered without today’s required guilt feelings for treading on the wrong toes. They are careful and nuanced in their own way, but full of passion and courage. Besides, the final reason for Rookmaaker’s calling as a critic is that he believed in objective truth, while many of his contemporaries were seducing their audiences away from the possibility of truth.
Close to three thousand pages of limpid prose are gathered between the covers of the six volumes of Rookmaaker’s Complete Works, the appearance of which is truly a publishing event. Marlene Hengelaar-Rookmaaker’s editorial loving care, and respect for her father, shine on these pages.
I thought I knew the man and his subject well. He was a mentor, a friend, a correspondent, and a frequent visitor to our home. Reading these pages, though, I realize that I only knew a part of his work. The sheer quantity is a first revelation. It is marvelous to see all of his major books reproduced, including an English translation of Jazz, Blues and Spirituals. But there is so much more, much of it previously unavailable in English.
The second revelation for me is the variety of subjects discussed. Here are technical articles on philosophical aesthetics. Here, too, are gathered personal letters, transcribed tapes of lectures and interviews, revealing the pastoral and emotional side of the scholar. We find studies on various portions of the Bible, some of them daring in their understanding of symbolism and ancient historiography. There are sermons, and writings about patience and suffering in the Christian life. Numerous book reviews are reproduced. Rookmaaker writes about God’s sovereignty over human history, about his favorite Albrecht Dürer, about the nature of culture, Escher’s graphic art, freedom in the Christian life, and myriad other subjects. These pages are simply a feast.
Significantly absent is almost any attention to photography or film. He does comment on them here and there, but usually negatively, worried that they are bound up with a two-dimensional world. In a memorable review of Luis Buñuel’s surrealist sermon Un Chien Andalou he describes the film as hateful, chaotic, meaningless, and then compares it to art and music which came out of suffering, but with Christian hope, such as Schütz’s Psalms, the golden age of Dutch painting, or African American blues. Surely this reluctance to engage with movies and photographs emanates from his concern not to reduce reality to brute facts.
One of the richest portions, not previously familiar to most of us, is the collection of articles in volume 4 entitled “Western Art.” Moving from medieval times to the present, it contains a dazzling array of references and examples. The Danube School, Bruegel, the Anabaptists on art, Raphael’s Sistine Madonna, Vertumnus and Pomona, Jan van Goyen, Daumier, kitsch . . . this journey is simply magical. In the process we are reintroduced to Rookmaaker’s basic commitments. A poem, a piece of music, a painting must have primary aesthetic qualities. But they also teach us something. Not in the moralistic manner of didactic art, but by opening our eyes to something in the world we had not seen before. He insisted that art be given no less, but no more a place in the scheme of things than it is due. In the 19th century, art (small a) became Art (upper case) because it left its proper place and pretended to be revelatory. No, says Rookmaaker: “Art has a function of its own in culture and human life. Just being art. Not autonomous, but bound by a thousand threads to full reality and human life. A thing of beauty is a joy forever, just because it is related to humanness and reality.”
Art history is a task and calling for today which traces the engagement of artists whose work contributes to the good or the downfall of humanity. Indeed, at their best, artists are called to “elevate the humanity of those who consider their work.” Certainly Rookmaaker’s life was dedicated to the elevation of the humanity of everyone he encountered, in his profession and in his ministry.
The final volume contains a beautiful biography by Laurel Gasque, entitled “Hans Rookmaaker: An Open Life.” No one could be better qualified for the task. Not only were she and her husband, Ward, close friends of the entire Rookmaaker family, but they shared his vision. She speaks for many of us when she writes: “Hans Rookmaaker never failed to encourage me intellectually and spiritually through friendship or to inspire me to independence of vocation by his creative example and serious conversation. Through his generous gift of time in viewing art and architecture, listening to music, and in discussing vigorously, extensively, and openly issues of culture and meaning with me, he gave a dimension to my education that I could never have obtained by formal means. Hans’s complete confidence in the indissoluble relation between art and reality and his wise understanding of their inter-relatedness have enriched my thinking, and, indeed, my life.”
At the same time, this 130-page life story is not a hagiography. Hans and his wife Anky went through periods of spiritual dryness. Hans was something of a workaholic. He was often restless. Yet in the end, he did what few persons in any generation can do, because he was truly a universal thinker: navigate easily from the study to the living room, from the Bible to the art museum, from learned books to real people with spiritual gifts and needs.
Several aspects of Rookmaaker’s life and thought are particularly worth underscoring. What were his major influences? During World War II, he served in the Dutch navy. He was interned in a prison camp near Nuremberg, then another in Stanislau, doing hard labor. Though not from a believing background, he began to read the Bible upon the recommendation of a friend back home. He became convinced of its truth. He read other books, and wrote papers on prophecy and aesthetics. In prison, he met Captain Johan Pieter Albertus Mekkes, a Christian, who introduced him to the Amsterdam philosophy espoused by Stoker, Vollenhoven, and, especially, Herman Dooyeweerd (1894–1977), whose New Critique of Theoretical Thought revolutionized Rookmaaker’s outlook on epistemology and apologetics. (How many POWs were reading Dutch neo-Calvinist philosophy in their deprived circumstances?)
After the war, Rookmaaker devoted much of his early writing to aesthetic theory based on the Cosmonomic Idea, which posited that nothing was neutral, and that meaning was lodged in spheres and laws governing every part of the created world. Accordingly, beauty and harmony were at the center of the aesthetic sphere, while the aesthetic also overlapped with other spheres, so that psychology or theology could be beautiful. Students of Rookmaaker’s in the ’60s and ’70s may not have realized how deeply his thinking was permeated by the Amsterdam philosophy. Much of this school of thought is of technical interest only; the originality of Rookmaaker’s contribution lies in applying it to the arts. As he moved into circles where artists and students were asking hard questions, the theoretical language moved into the background, and he became eminently practical. Still, his commitment to the basic contours of the philosophy was always there. It often came out in his reactions to issues. For example, if a student asked him whether God exists, his answer would first be to dismantle a presumed Cartesian presupposition behind the question, and only then attempt a reply, which would assert that everything in the Bible and in the world is a proof of God. Or if an art student expressed preference for Rubens’ robust infants over the grown-up medieval baby in a Madonna and Child, he would say that neither of them really connects to reality. The Rubens baby, with its Herculean musculature, is just as idealized as the medieval adult icon.
Rookmaaker’s lectures at l’Abri, also reproduced here, stress the unity of life. In them he defends the Kuyperian approach to a world-and-life view. He reminisces on his discussions with his closest friend, Francis Schaeffer, about Dooyeweerd, recalling that they both profited from his critique enormously but made a conscious effort not to use his difficult terminology. Rookmaaker was deeply critical of pietism. He believed that the great tragedy of modernity was to have split the world into a sacred and a secular realm. He cautioned against Christian attempts at living in a subculture, because that unwittingly supported the same split world.
Arguably, the central question which characterized all of Rookmaaker’s investigations was the problem of meaning. There were meaning-structures in the world, which he simply called “reality.” He believed that history has been unfolding since the creation of humanity, with its purpose given in the cultural mandate of Genesis 1:26-31. When artists try to rebel against the laws of creation, they violate its inner structure, and therefore end up in absurdity. But this dilemma cannot last long, as the unfolding process will continue to develop under God’s providence regardless of whether a particular people conforms or not. Even though much in the West has ultimately headed toward “death” (a word found throughout his writings, and heard frequently at l’Abri), the ultimate direction of history is positive. The Reformation was a high point where the Neo-Platonic chain of being was destroyed, to be replaced by a healthier understanding of creation and human dignity. The Dutch landscapists of the 17th century, along with Rembrandt’s œuvre, mark the high points in this unfolding thus far. Since then, the forces of secularization have taken over. But nothing rules out further progress and a new Reformation.
He reflected over and over again on the doctrine of calling. He worried that the modern spirit of revolution, coupled with pietism, would flatten everything out and squeeze any hope for meaning out of the discussion. In his view, we can only combat this with a fully informed worldview, one that recognizes both the dignity of human beings within the creation and the decimation wrought by the Fall. In the booklet The Creative Gift, republished in these volumes, Rookmaaker suggests that much of the effort to solve the problem of Christianity and culture got off on a wrong footing, because it falls into abstraction. “Christianity” does not really have meaning. There are Christians, some good, some weak, but no “Christianity.” And “culture” is not something to be isolated from the universe. Rather, it is an environment where God has placed us, one which he rules despite its pretended revolt. “Creativity” is no special dimension, but is what we should be practicing all the time wherever we find ourselves.
Reading these rich pages will put iron in our blood. And we will remember why we were so grateful for such a unique guide, a prophet, and a friend. His voice still carries today. We need it more than ever.
William Edgar is professor of apologetics, coordinator of the Apologetics Department, and chairman of the faculty at Westminster Theological Seminary in Philadelphia. He is the author most recently of Truth in All Its Glory: Commending the Reformed Faith (P&R).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Allen Guelzo
A matter of conviction.
Michael Lind has made so many, and such glowing, references to me in What Lincoln Believed: The Values and Convictions of America’s Greatest President that I am not sure whether I should have appeared as a co-author of the book rather than its reviewer. So let me say at the outset that there are two things about this book which I think are worth admiring—and one very large questionable thing which may render the admirable parts moot. Those who are satisfied with this as an example of disinterested benevolence are invited to read on in safety.
What Lincoln Believed: The Values and Convictions of America's Greatest President
Michael Lind (Author)
368 pages
$27.28
As a pundit, a columnist, and a senior fellow at the New America Foundation, Lind is looking for the sort of thing in Lincoln which most people outside the analytical realms of academe look for, and that is some form of guidance about the nature of democracy. You might think that this looking would be better directed to the Founders—to Madison, Hamilton, Washington, and the Revolutionary generation. But we have become accustomed to the notion that an élitist republic rather than a democracy was the real goal of the Founders, and that democracy was something which was happening outside their circle, and not with their approbation. And so people turn, like Mr. Smith at the Lincoln Memorial, to what bearings on democracy Lincoln can give them.
Therein lies one of the great points Lind scores in What Lincoln Believed, because Lind understands how very, very perilous the status of democracy was in Lincoln’s day. In the middle of the 19th century, the United States was the only large-scale, functioning nation-state in the world living under anything that approached the idea of democracy. “In Europe,” Lind begins, “the dominant region of the world, monarchs and aristocrats were securely in command.” And with, apparently, good reason: the most recent attempts at popular self-government—the French revolutionaries of 1789 and 1830, and the German and Austrian revolutionaries of 1848—had collapsed the moment one faction’s notion of self-government differed from another faction’s notion. Democracy seemed to possess a lethal, and unavoidable, centrifugal force, based on the sheer perversity of human nature.
That democracy survived the Civil War has permitted us to forget that it was ever in serious jeopardy, and forced us to explain Lincoln’s goals in more fuddled and contradictory terms—as the Great Commoner who wanted to raise up the little guy, as a willing dupe who paved the way for the emergence of the Robber Barons of the Gilded Age, as a mystical Unionist, as a prophet of the New Deal, as the Great Emancipator. What Lind sees, and sees with hairline accuracy, was that for Lincoln all of these were subordinate to proving to the theater of the world that democracy was fully capable of resisting the pressures democracies generated from within, without losing its democratic soul. “This is essentially a People’s contest,” Lincoln explained to Congress. “On the side of the Union, it is a struggle for maintaining in the world, that form, and substance of government, whose leading object is, to elevate the condition of men.”1 The war was thus more than a war, or even a civil war—it was an ideological test, to see whether the American experiment in self-government, “or any nation so conceived and so dedicated can long endure.”
This much forms Lind’s first cheer for Lincoln; the second cheer emerges at the end of the book, when he extends Lincoln’s defense of democracy as a defense, not of an airy theoretical principle, but of the democratic nation-state. Formed in the mold of Alexander Hamilton, Henry Clay, and Clay’s Whig Party, Lincoln believed profoundly in the right of Americans to self-government. But it was Americans, as Americans, who possessed that right. Lind’s Lincoln is not an internationalist—he is perfectly happy to have other nations follow the American example into democracy, but he does not think that Americans have any special ownership of the idea of democracy, and he has little interest in forcibly exporting it, on the pattern of a Wilsonian or a Rooseveltian internationalism. It was democracy in America, not American democracy, which Lincoln sought to defend, and sought to hold up as the “last best hope of earth.”
Which means that Lind sees in Lincoln no automatic assumption that the huddled masses, everywhere and always, were hungering for the American model, even if they took encouragement from the example of American democracy’s survival. Lincoln’s stand against the expansion of slavery, and then the secession of the Confederacy, was also a stand against the export of American democracy, if that export was tainted with slavery. This is not, we are invited to presume, a Lincoln who would have much interest in neo-conservative unilateralism.
But two cheers do not make a hurrah, and it is in the broad expanse of the book’s middle that Lind’s Lincoln turns peculiarly sour. Because Lincoln was a committed, if domestic, democrat (Lind argues), he could not have been any of the other good things people attribute to him—not the Great Commoner, not the Great Emancipator, and certainly not the almost-Christian mystic. Part of this argument is a fairly reasonable exercise of logical inference on Lind’s part; a larger part of it, I suspect, is visceral, from a man who understands Lincoln’s ideas remarkably well and simply doesn’t much like them, or much like Lincoln’s ideological descendants.
Take, for starters, Lincoln’s nationalism—this gave intellectual stiffening to his Whiggish preferences for high tariffs, government-funded infrastructure investment, and a national banking system. The downside of such nationalism is that it also laid the foundation for the emergence of a swaggering and arrogant corporate capitalism and a kind of human tariff in labor, in the form of exclusionary immigration policies that tried to shut out foreign workers from competition with white Americans. Lincoln was thus responsible for a revolution in American affairs, a “Second Republic” as Lind calls it, a closed economic shop whose one focus was on the cultivation of industrial productivity of, by, and for white Americans. Only with the advent of the New Deal and World War II—what Lind calls “the Third Republic”—did Americans finally throw off the mantle of protectionism and become the arch-proponents of economic globalization, free trade, open immigration, and broadly based civil rights.
The same exclusionary logic that operated in favor of white Americans and against foreigners also operated against non-whites at home. Lind does not doubt the sincerity of Lincoln’s aversion to slavery; what he doubts is whether it amounted to much beyond that, and whether the elimination of slavery operated principally in Lincoln’s mind as a way to eliminate yet another form of competition with free white labor. And true enough, Lincoln was slow to oppose more than simply the expansion of slavery; even when he finally did realize that he had no alternative to abolishing slavery and emancipating Southern blacks, he did so with the clearly enunciated intention of deporting the freed blacks somewhere else and reserving the United States for whites. “I am … in favor of our new Territories being in such a condition that white men may find a home—may find some spot where they can better their condition—where they can settle upon new soil and better their condition in life,” Lincoln said in 1858.2 To be sure, the deportation never happened. But the freed slaves were dumped under the wheels of something nearly as repugnant, in the form of Jim Crow segregation. To this, Lind doubts whether Lincoln would have had much objection, and so the 14th and 15th Amendments would likely never have followed the 13th if Lincoln had served out his second term as president.
This may not seem very consistent with Lind’s previous depiction of Lincoln as the Great Democrat. But Lind’s Lincoln, remember, is a nationalist—American democracy is a virtue for those who can be defined as Americans, and Lincoln does not define as Americans anyone with black skin. This does not mean—and it is this which decisively separates Lincoln from Stephen A. Douglas and the pro-slavery militants—that Lincoln was simply another Romantic racist who denied that blacks were even human, or who claimed that blacks possessed none of the natural rights that whites do. What he doubted was whether the physical markers that defined races would ever allow full civil equality and civil integration of multiple races within a single nation-state. The majority race, by virtue of their majority, had a legitimate power to exclude minority races from civil equality in a democracy; but otherwise, those minority races were perfectly capable of practicing democracy within their own nation. “No sane man will attempt to deny that the African upon his own soil has all the natural rights” everyone else possesses, Lincoln argued, and he fully expected that the blacks who were colonized abroad after emancipation would create model democracies of their own.3 (Lincoln, in fact, took the dramatic step of extending diplomatic recognition to one such black republic, Haiti.) But at the bottom line, Lincoln’s interest in blacks was strictly subordinate to his interests in whites. And that, in turn, explains for Lind the great geo-political shift Lincoln’s Republicans experienced in the 20th century, from being a coalition of Northern capitalists and Western farmers to being a party of Southern whites and Christian fundamentalists. The interests of white people were always at the heart of Republican affairs, and by the election of 2000, without much difficulty, “the party of Abraham Lincoln had become the party of Jefferson Davis.”
After damning Lincoln for racial indifference and running-dog capitalism, there may not be much enthusiasm left for giving the two cheers Lind wants to give Lincoln as the Great Democrat and the Great Nationalist. But more troubling is whether Lind really has the evidence he wants for Lincoln as the Great Racist and the Capitalist Tool. There is no question but that Lincoln resonated fully with Henry Clay’s “American System” (Clay was, after all, his “beau ideal of a statesman”), or that, as his partner William Herndon remarked, Lincoln managed to make quite a good living as a lawyer, representing the interests of big railroad corporations. “Much as we deprecated the avarice of great corporations,” Herndon chuckled, “we both thanked the Lord for letting the Illinois Central Railroad fall into our hands.”4
But in Lincoln’s imagination, the great virtue of capitalism was its power to liberate people from the trammels of status and class, to promote social mobility. “I don’t believe in a law to prevent a man from getting rich,” Lincoln insisted, because laws that prevented a man from getting rich were precisely what aristocrats used to keep power in their own hands. “Free society is such … that there is no fixed condition of labor”; anyone who “starts poor, as most do in the race of life … knows he can better his condition.” And a man who knows he can better his condition is the first and deadliest enemy of every aristocrat, whose future depends on everyone keeping to their own place and not jeopardizing theirs. Lincoln wanted “every man to have the chance—and I believe a black man is entitled to it—in which he can better his condition. … That is the true system … and so it may go on and on in one ceaseless round so long as man exists on the face of the earth!”5 Lincoln was not contradicting his allegiance to democracy by his devotion to capitalist development, whether in the form of tariffs or “internal improvements”; it was precisely the wedding of ambition to “the fuel of interest,” rather than to social rank, which gave democracy its vitality.
Lind’s most egregious failure, however, is his mischaracterization of Lincoln on race. No one needs to mistake Lincoln for a racial equalitarian; they are, in fact, pretty thin on the ground around the world even today. At the same time, no one needs to mistake him for a lily-white bigot, either. It was Lincoln, to the horror of Stephen Douglas, who kicked off the great senatorial campaign of 1858 by saying, “Let us discard all this quibbling about this man and the other man—this race and that race and the other race being inferior … and unite as one people throughout this land, until we shall once more stand up declaring that all men are created equal.”6 Lincoln, likewise, was never the ardent colonizationist Lind makes him out to be (a case so weak that Lind must resort to citing instances of colonizationist talk from Lincoln which he knows to be bogus).7 The one experiment in colonization which Lincoln did sponsor, in 1863, was framed as a purely voluntary, Congressionally funded expedition to the Caribbean, and when it flopped after six months, Lincoln had a warship retrieve the colonists and never raised the subject again. From that point onward, Lincoln progressively talked more and more about integration and voting rights, not colonization. “How to better the condition of the colored race has long been a study which has attracted my serious and careful attention,” Lincoln told New York abolitionist and Union general James Wadsworth in January, 1864. “In assisting to save the life of the Republic, they have demonstrated in blood their right to the ballot, which is but the humane protection of the flag they have so fearlessly defended.”8
Nor is it fair for Lind to suggest that a persistent strain of Lincolnian racism is what turned the South into the stronghold of the Republican party in the 20th century. Lind assumes that the Confederate South has remained, demographically as well as ideologically, the same Confederate South it always was. But this ignores the massive migration of American capital and population from the Northeast to the South and the Sun-Belt beginning in the 1970s, a migration which brought middle-class Northerners to the South in numbers unseen since Reconstruction, and brought with them the Republican party in similar numbers, similarly unseen since 1877.
I suspect that, lurking deep within Lind’s own authorial and political subconscious, is the realization that Lincolnian principles have not only shaped, but continue to shape, a good deal of the political life of the nation—and that these are principles with which Michael Lind has little personal sympathy. He can endorse Lincoln the Great Democrat, but only to the extent of seeing him as a knight of democratic faith; he would prefer not to see this Great Democrat striding through the world like Sir Artegal’s iron man Talus (or George W. Bush) with his righteous flail, and so Lincoln is tailored down to being a domestic democrat rather than an internationalist one. But even the domesticated Lincoln can be something of a threat, which is why I am inclined to think that streaking him with racism and cupidity, as Lind does, is a device to keep people from taking Lincoln too far or too seriously. What we end up with is a singularly lopsided Lincoln—and a flawed but interesting book. A book worth two cheers, yes; but not a hurrah.
Allen C. Guelzo is the Henry R. Luce Professor of the Civil War Era and director of the Civil War Era Studies program at Gettysburg College. He is a two-time winner of the Lincoln Prize for Abraham Lincoln: Redeemer President (2000) and Lincoln’s Emancipation Proclamation: The End of Slavery in America (2004).
1. Abraham Lincoln, “Message to Congress in Special Session” (July 4, 1861), in Collected Works of Abraham Lincoln, ed. Roy P. Basler (Rutgers Univ. Press, 1953), vol. 4, p. 438.
2. AL, “Seventh and Last Debate with Stephen A. Douglas at Alton, Illinois” (October 15, 1858), in C.W., vol. 3, p. 312.
3. AL, “Speech at Carlinville, Illinois” (August 31, 1858) in C.W., vol. 3, p. 79.
4. Herndon’s Life of Lincoln: The History and Personal Recollections of Abraham Lincoln as Originally Written by William H. Herndon and Jesse W. Weik, ed. Paul M. Angle (World, 1942), p. 284.
5. AL, “Speech at New Haven, Connecticut” (March 6, 1860), in C.W., vol. 4, pp. 24-5.
6. AL, “Speech at Chicago, Illinois” (July 10, 1858), in C.W., vol. 2, p. 501.
7. Lind cites as his clinching example of a Lincoln persistent in his determination to deport freed blacks the claim of Benjamin F. Butler, made in 1885, that Lincoln told Butler as late as January, 1865, that he was still looking for ways to effect colonization; the Butler story, however, has been demonstrated to be a fabrication by Mark Neely in “Abraham Lincoln and Black Colonization: Benjamin Butler’s Spurious Testimony,” Civil War History, Vol. 25 (March 1979), pp. 76-83. But Lind, even after acknowledging that “historians have questioned Butler’s veracity,” still steams serenely past them and claims that “there is no reason to doubt his [Butler’s] account of Lincoln’s obsession with the colonization scheme” (p. 225).
8. AL, “To James S. Wadsworth,” in C.W., vol. 7, p. 101.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Mary Noll Venables
The rest of the story.
Ireland is one of the few remaining countries where it’s a major news item that Catholics make up less than 90 percent of the population. According to reports last spring, the number of Protestants is edging higher while the number of Catholics is holding steady. The Church of Ireland, Ireland’s largest Protestant denomination and the former established church, gained congregants for the first time in over a century. Presbyterian and Methodist memberships also increased. Meanwhile, many new non-Catholics have recently arrived in Ireland, and groups that still represent only a tiny fraction of the Irish population, such as Muslims and Orthodox Christians, are nevertheless growing rapidly relative to their numbers a decade ago. As a result, only 88.4 percent of residents in the Republic of Ireland are Catholic.1
Making the Grand Figure: Lives and Possessions in Ireland, 1641–1770
Toby Barnard (Author)
520 pages
$12.97
Changing religious affiliation reflects a changing Ireland. Thanks to the “Celtic tiger” economy, Ireland has become a country that attracts, rather than sends, migrants. Its diversifying population has encouraged many, from political commentators to radio presenters, to ponder what it means to be Irish. Do you have to be born in Ireland to be Irish? Do you need to speak Irish to be Irish? And do you have to be Catholic to be Irish?
Toby Barnard’s work on the often-neglected history of Irish Protestants has something to add to this contemporary discussion. A New Anatomy of Ireland: The Irish Protestants, 1649–1770 outlines who Irish Protestants were; Making the Grand Figure: Lives and Possessions in Ireland, 1641–1770 describes what Irish Protestants owned. Filled with detailed and careful research, Barnard’s books remind us that Protestants have a long history in Ireland and that their history includes more than Oliver Cromwell’s rampage in the 1650s.
In Cork my husband and I often encounter remnants of that forgotten history: a Methodist church (now a clothing store), a Quaker assembly room (now closed), and three Church of Ireland churches that have been turned into a Catholic church, a concert hall, and an office development. Barnard’s books help the reader envision who might have filled such Protestant churches from the 1650s to the 1770s, a period known as the Protestant ascendancy. At this time the Protestant population in Ireland was around 400,000, or a quarter of the island’s population. Catholics outnumbered Protestants, but Dublin and parts of Ulster, the northernmost province, had more Protestant than Catholic residents after 1732. Protestants continued to dominate Ulster demographically, while the Protestant presence in Dublin declined over the eighteenth century. In Cork, 33 to 40 percent of the population was Protestant. Other Irish towns—Limerick, Drogheda, Kilkenny, and Galway—were less than a third Protestant. In any case, Protestants enjoyed disproportionate wealth and influence. The law of the land reserved the upper reaches of Irish society—as well as positions in the church, law courts, and army and navy—for Protestants.
A New Anatomy surveys the Irish Protestant population, from peers to the poor. Barnard organizes the book by social class, but he acknowledges that defining someone’s social standing depended more on perception than on substance. Participating in hunts, which marked “quality,” required an annual income of forty pounds. Beyond appearing on horseback, dress and living arrangements greatly influenced the perception of “quality.”
Barnard’s decision to separate his subjects by class, despite the elusive nature of social definitions, gives the book a sterile feel. The reader learns tidbits about social classes but gains few extended introductions to specific peers, clergy, or barristers. Barnard has compiled so much information in these two volumes that he sometimes loses sight of the people in the study. For example, he notes that rank within the hierarchy of professions depended on the price of training. Therefore practicing law at Dublin’s Four Courts was highly prestigious since it required studying at the London Inns of Court. However, no one barrister stands out much more than any other in Barnard’s account.
This lack of individuality is a pity, because when Barnard turns to biography, he brings Irish Protestantism to life. He describes four land agents who ran the Boyle estates in southern Ireland to illustrate the varieties of Protestant landowners. Digby Foulke, William Congreve, Roger Power and Richard Bagge came from distinct regions, made different fortunes, and had varying success. Foulke’s parents were tenants on the Boyle estate, and he and many relations continued in Boyle employment. Congreve was from Yorkshire but became fully integrated into Irish Protestant society. Roger Power came from an Old English family near the Boyle estate. He was elected to parliament in 1703. When he died, his estate was estimated at six thousand pounds, a great sum. Bagge, the lowliest in status of the four, left the earl’s employment after he was accused of corruption. The obvious differences in the backgrounds and success of the four men indicate that simple categories such as “land agent” do not tell the whole story.
To be fair, Barnard’s excursions into biography are limited by the records and correspondence that his subjects left. Lady Arbella Denny (1707-1792), a regular letter writer (and a fascinating character in her own right), features prominently in both books. Her wide-ranging accomplishments typify the influence that Protestants had in Ireland during the ascendancy. Lady Denny was the daughter of an earl, the wife of a member of parliament, and the first woman elected to the Royal Dublin Society. She was widely known for her charitable works; she reformed the Foundling Hospital and in 1767 opened the Magdalen Asylum as a refuge for women from good homes who had become prostitutes.
Barnard’s second volume, Making the Grand Figure, describes the possessions that Arbella Denny and other Irish Protestants would have owned. Although the book sometimes feels like a catalogue of country houses, Barnard argues that the materialism of Protestant culture characterized the entire Protestant experience in Ireland. Protestant wealth, particularly elaborate displays of wealth, distinguished Protestant from Catholic. Since maintaining distinctions between privileged Protestants and poor Catholics was at the heart of the Protestant ascendancy, the material goods that Protestants used to reinforce their separation from Catholics are central to the history of the ascendancy.
Barnard begins with Protestant houses, which were built with stone and mortar, in contrast to Catholic dwellings of straw and mud. Protestant houses had high ceilings and wooden floors, while Catholic cottages had low ceilings and mud floors. And Protestants filled their homes and calendars with goods and pursuits that most Irish Catholics could not afford. Barnard devotes a good portion of the book to the production and accumulation of silver. He notes that banking was more difficult in Ireland than in England and that owning silver may have been a convenient way to hold assets. Irish Protestant householders also collected paintings when they had the funds and etchings and engravings when resources were limited. Outdoors Protestants rode, hunted, raised dogs, and planted ornate gardens.
To contextualize the lives of Irish Protestants, Barnard provides occasional comparisons with English society. The greatest similarity between English and Irish society in this period was the monopoly that the established church held. To participate in state functions or to practice most professions required holding membership in the Church of England or Ireland and receiving the Eucharist at least once a year. Among those who fulfilled the confessional qualifications, Irish Protestants were distinct from their English counterparts. Overall, residents of Ireland were less wealthy, and even Irish “quality” were generally poorer than “quality” in lowland England. Clerical stipends were also lower in Ireland than in England.
Barnard’s comparisons between Irish Protestants and upper-class English highlight a fundamental question that neither of his books addresses. Is Protestant the distinguishing characteristic for the people whose lives he describes? Barnard writes about Irish Protestant and Catholic housing, but better descriptors might be rich and poor housing.
Barnard’s neat picture becomes still more complex when we take into account the shifting relationship between Protestantism and national identity. During the ascendancy, Protestants in Ireland largely regarded themselves as English. But the Protestant ascendancy itself started a historical process that led many Irish Protestants to decide they were Irish. Hence the uneasy relationship between English and Protestant and Irish that persists to this day.
As Barnard exhaustively documents, English settlers in Ireland were privileged in their training, careers, houses, furnishings, and leisure. They built grand houses and elaborate gardens, purchased fine silver and family portraits, and tried to impress their neighbors with their dress and comportment. And then, Barnard occasionally hints, at some point they were no longer completely English. Just as new arrivals to Ireland are changing the definition of what it means to be Irish in the 21st century, so English Protestant settlers in Ireland have changed past definitions of Irishness in ways that are still potent. Barnard’s books begin to open up their lives.
Mary Noll Venables recently received her Ph.D. in Early Modern European History from Yale University and is now living in Ireland.
1. Conor Pope, “Major rise in Muslims, Orthodox Christians—Census,” Irish Times (Dublin, Ireland), April 8, 2004; Georgina O’Halloran, “Success story for Church,” Evening Echo (Cork, Ireland), June 3, 2005.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
N. D. Wilson
Books & Culture, January 1, 2006
When I was two, I was inclined to certain misbehaviors in my bath. If memory serves, I believe standing up and fiddling with the knobs was involved. And splashing. During one particular bathing experience my mother had to leave the room briefly. So, she relied on my older sister, who was not yet five, to occupy me.
"Tell him a story," my mother said. And my sister did.
"Once," my sister said, "there were four children whose names were Peter, Susan, Edmund and Lucy. This story is about something that happened to them when they were sent away from London during the war because of the air-raids."
She was reciting, and she recited from the beginning of The Lion, the Witch and the Wardrobe to somewhere around Lucy's second passage through the fur coats. The rendition was abridged, but she hadn't done the abridging. Our cassette tape had. Ian Richardson, narrator, had read an abridged version to us so many times that my sister had a sizeable chunk of it word for word.
The film … well, the film isn't just abridged, and it isn't read by Ian Richardson.
Sitting in a Hollywood screening room, waiting for my advance glimpse of the Disney/Walden rendition of talking beavers and a forest-infested wardrobe, I have a lot of time to think about my relationship with Narnia. I wonder if I am capable of liking any film adaptation. Will I simply spend the entire time noticing small changes, unable to see the film apart from its inspiration? Probably.
Two was a good year for me. I sat through my first readings of Narnia, both abridged and unabridged. I sat in my highchair after dinner and listened to my father read to us as his father had read to him. That year I was introduced to both Lewis and Tolkien. My mother questioned my comprehension, but my father, ever optimistic, pointed out my red and sweaty cheeks, which made their appearance during scenes of battle.
I was marinated in Narnia, and I've been on a slow-roast ever since. I have no way of estimating how many times I have passed through those books, only how recently the last reading came—just last month. I love my mother, and I love Narnia. And if anyone chooses to show me an artistic rendition of either, they can expect criticisms. They can expect me to limber up and become a thorough and enthusiastic picker of nits.
Andrew Adamson, who brought us Shrek, Shrek II, and Shrek in the Swamp Karaoke Dance Party, is the director. Tell me that's promising.
But the story is far from ruined. The primary conflict remains virtually intact. The Stone Table scene is phenomenal. Aslan is effective and easily believable, and Lewis' Christianity has a loud presence. While book-readers like myself might be prone to stress and quibble, I expect this film to have nothing other than a deservedly positive reception in the broader evangelical world.
The film begins where it must, with German bombers over London. The opening sequence also gives us early tension and differences between Peter and Edmund, with Peter why-can't-you-do-what-you're-told-ing his younger brother, a question which will serve as a bookend for the entire film.
The texture of the opening act is strong, and the casting works well. I find myself relaxing a little in my seat. But I wait for the inevitable, for some shifting of motivation, some change in dramatic tension that patronizes Lewis' original. That change does not come for a long while. But it does come.
Lewis himself had complaints about film adaptations. He was a lover of virtually every adventure story that could "introduce the marvelous or supernatural," including such prose-tripe as A Voyage to Arcturus:
Unaided by any special skill or even sound taste in language, the author leads us up a stair of unpredictables. … He builds whole worlds of imagery and passion, any one of which would have served another author for a whole book, only to pull each of them to pieces and pour scorn on it. The physical dangers … here count for nothing: it is we ourselves and the author who walk through a world of spiritual dangers which make them seem trivial.
—Of Other Worlds, "On Stories"
This "marvelous or supernatural" was in fact what he strove to achieve in all of his stories, and is the common attribute of every story he admired critically, from King Solomon's Mines to Paradise Lost. And it was the film version of King Solomon's Mines that bothered him.
Lewis complains that the producer of the film, "for me, ruined the story." This narrative ruin came about through the substitution of one danger for another, and that substitution of danger was an outworking of a literary paradigm of excitement. "Where excitement is the only thing that matters, kinds of dangers must be irrelevant. Only degrees of danger will matter. The greater the danger and the narrower the hero's escape from it, the more exciting the story will be." Lewis goes on to explain that different kinds of dangers produce different kinds of fear—fear with awe, fear with horror, fear with disgust, numbing fear, and a quivering almost pleasurable fear. The imagination responds differently to these fears. They change the personality of a story accordingly.
While I notice simple shifts in description—Why does the white witch have blond dreadlocks? Where are her red lips? What happened to the charismatically, seductively, dangerously beautiful Jadis? What happened to her palace? Why is it made entirely of icicles?—I finally come to the first shift in danger, the first place where the writers felt Lewis lacked "excitement."
The children are in the Beavers' house, and Edmund has left them. In the book, we immediately sense betrayal. Peter wants to follow Edmund, but the Beavers make him see the folly of this, and they all trek off as quickly as possible (leaving behind Mrs. Beaver's sewing machine). The children must trek stealthily, always listening for the bells of the witch's sleigh behind them (the wolves were sent to the Stone Table to discover if Aslan really had returned and to cut off the children if necessary). If you have ever done any sneaking with the fear of followers and ambushes, if you have ever attempted any stealthy and yet speedy treks across the park or the lawn, or simply shifted hiding places from the bedroom to the hall closet, then you know this tension, this sensation of breathless, bottled-up, speedy caution.
But for the film, such understated tension isn't exciting enough. The children follow Edmund to the witch's ice castle, only then deciding to run back to the dam, pack up, and leave. Rather than sending the wolves ahead to the Stone Table, the witch sends them directly to the Beavers' house, and we have our necessary excitement.
The children are inside the house when the wolves begin tearing through the walls. I sit, wondering how the writers expect to believably get them out and all the way to the Stone Table with wolves on their heels. But the writers hand us a minor deus ex machina, and Beaver confesses to his wife that he has a secret tunnel that leads to Badger's house. We are then off on a wolf-chase climaxing on thin ice beneath a thawing waterfall. The waterfall tumbles, the ice shatters, and everyone washes down the frigid river, but nobody drowns, and because Spring is coming, there is no danger of hypothermia.
Certainly, this is more "exciting." But it produces a different danger, a different taste. Like MSG-ridden Chinese food, everything tastes a lot, but all the same. The market of tension becomes glutted, decreasing the value even of our primary conflict.
For myself, I flinch with every minor change of hair color, motivation, and the lack of gifts for the Beavers. I have trouble with the inflation of tension (though not of the battle). Peter's character is too conflicted (he just wants to get Edmund and go home). But at the same time, this film was lovelier than I expected it to be, frequently beautiful, and while it does get a bit distracted, it still communicates the best of what Lewis has to offer.
My own son is three. He knows the story, but Narnia is not yet concrete enough in his imagination to survive such a film. My nephew, son to my sister the bath-bard, is in first grade and is already doing laps through the Narnia Chronicles. When he sees the movie, I expect him to find frustration in the variances. Knowing him, and knowing his mother, his frustration will probably be greater than my own. I never knew any part of it word for word.
N. D. Wilson is a Fellow of Literature of New St. Andrews College and the managing editor of Credenda/Agenda magazine. His first novel for children will release in Spring 2007.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Andrea R. Nagy
How the OED was made.
Like the Bible, the dictionary is a book of weighty authority, and the Oxford English Dictionary is the most weighty and authoritative of all. Conceived in 1857 and published in its first edition between 1884 and 1928, the OED comprised 15,488 pages, 50 million words overall, and two million illustrative quotations. Today, in its updated and uploaded form, the OED defines some 600,000 lemmas, tracing word-by-word the history of our enormous and ever-changing language.
Lost for Words: The Hidden History of the Oxford English Dictionary
Lynda Mugglestone (Author)
Yale University Press
273 pages
$8.24
As a masterpiece of imperial English culture, the OED has been the subject of extensive criticism and analysis. In Caught in the Web of Words (1977), James Murray’s granddaughter recounted the sacrificial devotion of Murray in his 36 years as chief editor of the dictionary. In Empire of Words (1994), John Willinsky documented the Victorian bias toward great white men built into the dictionary. In The Professor and the Madman (1999), Simon Winchester told the story of the murderer in the insane asylum who contributed more than anyone knew to the making of the OED, and in The Meaning of Everything (2003), Winchester completed his story of the OED with anecdotes and personal portraits. Beyond these popular works, numerous scholarly articles and books have uncovered omissions, antedatings, and corrections to the dictionary.
So is there more “hidden history” to be revealed? According to Lynda Mugglestone, there certainly is. Behind the OED’s authoritative text is a history of composition, complete with personalities, debates, and prejudices that shaped its first edition. How were definitions written? How were quotations selected for inclusion? How were spelling and pronunciation decided upon? Does the OED really trace the history of every English word that has ever existed? These questions are the subject of Lost for Words: The Hidden History of the Oxford English Dictionary. Mugglestone has closely examined the editing process of the OED in a way that has not been done before. By poring over a vast archive of annotated proof sheets, as well as letters, reviews, articles, and speeches, she has filled in many details about the editorial decisions that shaped the dictionary at the final stages of publication.
Mugglestone’s research supports much of what we already know about James A. H. Murray. Like Samuel Johnson, he was “a poet doomed at last to wake a lexicographer.” Murray dreamed of creating a fully descriptive, exhaustive, historical record of the language. With the impartiality of a scientist, he would document the story of every English word, whether low or high, old or new, common or esoteric. Such a biography of the language would be a “historical monument” fit for a great nation. But alas, as Mugglestone puts it, “The lexicon could not, in practice, be encompassed by the lexicographer.” Although Murray wished to create an ideal dictionary, he was forced by budget constraints and cultural pressures to edit the text in more prescriptive directions.
The annotated proof sheets reveal that editing primarily meant cutting. Murray was constantly obligated to compromise his descriptive ideal, deleting quotations, definitions, and entire entries. Mugglestone discusses the rationale behind the deletions, confirming that literary language tended to be favored over vulgarisms, established vocabulary over neologisms. Thus quotations from daily newspapers were cut, while the wisdom of poets and bishops was kept. “Linguipotence” was retained because it was a coinage of the poet Samuel Taylor Coleridge, while “greyhoundy” was omitted, being used only in the popular journal Black and White. “Condom” was omitted without much question, while some of the most potent four-letter words were regretfully suppressed after lengthy debate. “Enthuse” was labeled “colloquial,” and “gent” was censured as “vulgar.” The editors made quite a few concessions to Victorian sensibilities.
Of course, with the second edition and the creation of the Oxford English Dictionary online, many of these omissions have since been corrected. The OED now features a complete history of every well-known taboo word in the English language, including a 281-word entry on “condom” with quotations beginning in 1706. A full range of Americanisms is covered, as well as world English from Australia, South Africa, Canada, and other Anglophone countries. Slang, too, is amply represented, from “awesome!” to “yo!”, as are the most obscure technical terms, such as “algology” and “ampelography.”
But even in its first edition, the OED ended up being more descriptive than its cultural milieu was accustomed to. For example, although the delegates of the Oxford University Press specified that the dictionary should avoid scientific terminology, Murray directed the dictionary’s researchers and writers to embrace a wide variety of technical vocabulary. And although he received numerous complaints about the “incorrect” definitions of such words as “arcade” and “abhorrence,” Murray declared, “I am not the editor of the English language,” and he defined these and other words in accordance with the evidence before him. As Mugglestone summarizes it, “The fact that so many letter-writers … saw fit to complain about the undue liberality of the dictionary … serves as a useful index of the level of descriptive impartiality which the dictionary did indeed achieve.” Her careful study of these hitherto unexamined letters and proofs shows exactly where Murray adhered to his principles and where he chose to compromise.
But how much of this “hidden history” needs to be revealed? For a devoted scholar of the OED, perhaps all of it. For the nonspecialist, however, Lost for Words contains too much information. We are given far too many quotations from Murray and his correspondents debating details of spelling and usage, comments that might have been summarized in a few paragraphs with data presented in a table. We are given extensive dictionary definitions of “loss,” “prune,” and “adjust” as background for a discussion about the cutting of entries. And at times Mugglestone belabors the obvious, as when she spends several pages lamenting that the dictionary uses “man” where we in the 21st century would use “person.”
It makes for painful reading, too, when Mugglestone takes on the jargon of a new-historicist literary critic, uncovering “cultural agendas” and “cultural codings” that are invariably “disturbing.” In one of her most impenetrable sentences, she states, “If such socially constructed edicts are in keeping with a self-styled manual on ‘good’ usage, then they can seem disconcertingly normative when they appear within the intentionally objective domain of the OED in which empiricism rather than language attitudes—particularly those based on convictions of one’s place in the social order—had been given categorical pre-eminence.” In other words, although Murray claimed to be fully objective, he was influenced by his culture and values. Why does Mugglestone need to make simple things complicated?
For this book makes essentially a simple argument: in spite of heroic efforts on the part of James Murray, the OED was to some extent shaped by social preferences in favor of high culture. This is not a new argument, but what is new is Mugglestone’s examination of the proof sheets and other contemporary documents, which lend support to this well-established understanding of the development of the OED. It is unfortunate that this point is obscured by superfluous detail and an excessively analytical style.
If you want to know about the short life of “lustricity” or the comparative advantages of “rhyme” and “rime” or the lexicographical debates over the correct usage of “avocation”; how “fray” came to lose its “obsolete” label or how “okonite” was derived from “ok,” then Lost for Words will provide informative reading. On the other hand, if you want to hear the story of the making of the OED, stick with the books of Elisabeth Murray and Simon Winchester.
Andrea R. Nagy has been a project editor for the New Oxford American Dictionary, a citation reader for the Oxford English Dictionary, and the author of scholarly articles on the history of English dictionaries.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Lauren F. Winner
In search of a counterculture for the common good.
If there is one thing that has defined evangelical Christians, it is their volatile relationship to the cultures where they have sojourned. In America, evangelicals have at various times enjoyed everything from near hegemony to internal exile. They have abjured political power and sold pearls of great price to obtain it—often in the same lifetime. They have censored, critiqued, consumed, and copied the fruits of mass culture—sometimes all at once. They have harbored some of the most enduringly radical American voices on social responsibility and racial justice, yet in recent years their most innovative and influential leaders have been found in exurban locales of homogeneous wealth. They have produced notable scholars of history and enthusiastic popularizers of the end of the world.
It would be more honest, though, to say “we” instead of “they.” As a publication of Christianity Today International, Books & Culture is very much part of the ongoing, unpredictable, sometimes combustible evangelical engagement with culture. Over the next three years we will join our sister magazines Christianity Today and Leadership Journal in the Christian Vision Project, an effort to ask three “big questions” that define critical territory in the Christian relationship to culture, mission, and the gospel. In the first year, with the generous assistance of the Pew Charitable Trusts, we focus on the question, How can followers of Christ be a counterculture for the common good? This piquant phrase, which we have borrowed from the Rev. Timothy Keller of Redeemer Presbyterian Church in Manhattan, juxtaposes two neglected themes. We hope the contributions in these pages, on the website ChristianVisionProject.com that will launch in February, and in a series of DVD documentaries will spark much fruitful conversation and action.
We have asked six people to respond to this question in Books & Culture in 2006. All of them are serious and creative Christian thinkers—though not all are evangelical Protestants—and many will be familiar to longtime readers. Perhaps none will be more familiar than our first contributor, Lauren F. Winner, who at 29 is completing a Ph.D. in American religious history from Columbia University while both teaching and studying at Duke Divinity School, and travels widely speaking to audiences in the wake of her book Real Sex. With all this on her plate, perhaps the subject of her answer to our “big question” is natural—but that doesn’t make it any less important.
My subject is the theology of sleep. It is an unusual subject, but I make no apology for it. I think we hear too few sermons about sleep. After all, we spend a very large share of our lives sleeping. I suppose that on an average I’ve slept for eight hours out of twenty-four during the whole of my life, and that means that I’ve slept for well over twenty years. What an old Rip van Winkle I am! But then, what Rip van Winkles you all are, or will one day become! Don’t you agree then that the Christian gospel should have something to say about the sleeping third of our lives as well as about the waking two-thirds of it?
—John Baillie, “The Theology of Sleep,” in Christian Devotion (1962)
Last night, I pulled one of my very few all-nighters. These were not uncommon in my college years, but my capacity to stay up all night and be anything approximating coherent the next morning has declined as I’ve marched through my twenties. So now I stay up all night very rarely, once every two years or so, and only when I am truly desperate.
But the storied all-nighters are just the most extreme example of something many of us do quite a lot: chip away at sleep in order to do something else. Usually that something else is work.
A simple glance at my email inbox tells me that I am not alone in sacrificing sleep in order to squeeze in a few more hours of work. Last Tuesday alone, I received 23 work-related emails that had been sent between 10:00 p.m. and 5:00 a.m. This creeped me out. The next night, in fact, I had some trouble falling asleep. I lay in bed worrying about the correspondence that was accumulating in my email account, the possibly pressing matters I would need to address in the morning, and the number of hours the next morning that I would have to devote not to preparing to teach my afternoon class, but to replying to email. Eventually I rolled over and set my alarm back from 6:30 to 5:00, resolved to use the extra 90 minutes of wakefulness for email.
Wakefulness, actually, may not be the right word. For though I “gained” 90 minutes in which I was awake, I actually lost wakefulness. Sleep specialists are virtually unanimous on this: With some notable exceptions who seem wired to operate on a different schedule (Thomas Edison is a famous example), we human beings cannot lose sleep without decreasing our attention span, our response time, our acuity. I may have been awake for 90 extra minutes, but I was less wakeful all day long.
According to the National Sleep Foundation, the average adult sleeps six hours and 58 minutes per night during the work week. One hundred years ago—before Mr. Edison’s marvelous invention—people slept about nine hours a night. They were right in line with the eight to ten hours of sleep specialists say we need. Now we are a nation of the chronically sleep-deprived.
Adults’ zeal for cutting back on sleep has consequences for children, too—and not just that parents and teachers are crabbier because they’re not well-rested. Children need even more sleep than adults, yet parents now keep them up later and later, possibly because working moms and dads want to “spend quality time” with their children (a phrase laden with many revealing contradictions and falsehoods, but that’s for another day), something that’s just not possible if you arrive home from work at six o’clock and Junior’s in bed by 7:15. Last year the Washington Post reported that naptime is increasingly “a luxury that 4-year-olds no longer can afford.” Many Washington-area schools are eliminating naps from the kindergarten curriculum, so that 45 more minutes can be devoted to instruction. Administrators seem unconcerned that their charges would learn better if they were well-rested, but that may not be the point. In trading nap time for more time spent studying the alphabet, these tots are really learning to value productivity, or at least activity, above all else.
The irony is that although many of us trade sleep for productivity, we would actually be more productive if we slept more. When we don’t get enough sleep, we accumulate “sleep debt” which has to be paid back. (It’s no coincidence that we describe this state with a metaphor drawn from banking, one William Wordsworth nicely turned on its head when he asked, in his poem “To Sleep,” “Without Thee what is all the morning’s wealth?”) We concentrate better and are less easily distracted when well-rested. A study from the University of Minnesota recently showed that when high schools started the day 85 minutes later, at 8:40 A.M. instead of 7:15 A.M., students got more sleep at night, fell asleep in class less often, and got better grades. When we’ve gotten good sleep, we are also happier, nicer, and healthier. Michael Irwin, director of the Cousins Center for Psychoneuroimmunology at UCLA, says, “Even a modest disturbance of sleep produces a reduction of natural immune responses and [production of] T-cell[s],” the cells that combat the effects of viruses and other pathogens on our bodies.
Indeed, sleep deprivation carries great costs, both in dollars and in human life. Tragedies related to sleep deprivation—car wrecks, accidents at the workplace, and so forth—cost Americans more than $50 billion a year, and result in at least 20,000 deaths. The National Highway Traffic Safety Administration says sleep deprivation causes 100,000 traffic accidents a year. (The slower response time of people who’ve not gotten enough sleep accounts in part for the spike in wrecks on the day after the spring shift to Daylight Saving Time, when people often lose an hour of sleep.) Psychologist and sleep specialist Stanley Coren has suggested that the accidents at Chernobyl and Three Mile Island both occurred in part because sleepy employees, dragged down by sleep debt, were “not working at top efficiency and were not motivated to check details closely.” According to Coren, sleep deprivation was also a factor in the Exxon Valdez oil spill. To save money, Exxon had been cutting back on staff, which required the remaining employees to put in longer hours. The oil spill would not have happened had an exhausted third mate not fallen asleep on the job.
When folks from my local church gather for an evening meal or adult education class, we usually close with Compline, the nighttime service from the Book of Common Prayer. This service—in which we pray for a peaceful night and a perfect end, repeating the nunc dimittis (originally uttered by Simeon in a somewhat different context, asking God to let his servant depart in peace)—is helping me to understand sleep as part of faithfulness. For it is sheer hypocrisy to pray with my community for a peaceful night and a perfect end if I know I am going home to put in three or four more hours answering email.
Sleep more: this may seem a curious answer to the question of what Christians can do for the common good. Surely one could come up with something more other-directed, more sacrificial, less self-serving. Or more overtly political—refusing to serve in the current war. Or more communitarian, making a commitment to street and neighborhood that overrides new job offers.
And let’s be honest. Had I instead written a rousing essay calling all Christians to hold vigils against the death penalty next week, the very improbability that anyone would heed my call would let us all off the hook. One of the reasons you may be wishing I hadn’t suggested we Christians sleep more is that sleeping more is something you can choose to do, or not do, this very night.
It was one of the reasons I was tempted to write about protesting capital punishment instead, for I will have a chance this very night to practice what I’m preaching, and it will be much harder than sending a check to Virginians Against the Death Penalty.
All of those things—protesting capital punishment, working with our neighborhood association, and so on—would be good things for us Christians to undertake as well. But for the moment I am sticking with the small, if challenging, task of becoming better rested. Not only does sleep have evident social consequences, not only would sleeping more make us better neighbors and friends and family members and citizens. Sleeping well may also be part of Christian discipleship, at least in our time and place.
It’s not just that a countercultural embrace of sleep bears witness to values higher than “the cares of this world, the deceitfulness of riches, and the desire for other things.” A night of good sleep—a week, or month, or year of good sleep—also testifies to the basic Christian story of Creation. We are creatures, with bodies that are finite and contingent. For much of Western history, the poets celebrated sleep as a welcome memento mori, a reminder that one day we will die: hence Keats’s ode to the “soft embalmer” sleep, and Donne’s observation, “Natural men have conceived a twofold use of sleep; that it is a refreshing of the body in this life; that it is a preparing of the soul for the next.” Is it any surprise that in a society where we try to deny our mortality in countless ways, we also deny our need to sleep?
The unarguable demands that our bodies make for sleep are a good reminder that we are mere creatures, not the Creator. For it is God and God alone who “neither slumbers nor sleeps.” Of course, the Creator has slept, another startling reminder of the radical humility he embraced in becoming incarnate. He took on a body that, like ours, was finite and contingent and needed sleep. To push ourselves to go without sleep is, in some sense, to deny our embodiment, to deny our fragile incarnations—and perhaps to deny the magnanimous poverty and self-emptying that went into his Incarnation.
French poet Charles Péguy makes the point well:
I don’t like the man who doesn’t sleep, says God.
Sleep is the friend of man,
Sleep is the friend of God.
Sleep is perhaps the most beautiful thing I have created.
And I myself rested on the seventh day. …
But they tell me that there are men
Who work well and sleep badly.
Who don’t sleep. What a lack of confidence in me.
Péguy’s words have perhaps never been more fitting: to sleep, long and soundly, is to place our trust not in our own strength and hard work, but in him without whom we labor in vain.
More CVP articles from our sister publications are available on ChristianVisionProject.com. Also check out the Christian Vision Project’s new video documentary, Intersect|Culture. The videos take you into the stories of ordinary believers who, by faith, changed their communities. The set includes a DVD with 6 videos and coordinating group curriculum.
Lauren F. Winner is the author most recently of Real Sex: The Naked Truth About Chastity (Brazos).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.