John Wilson
Wendy, my wife, is trying—again—to persuade me to stop subscribing to the New York Times. (“We could pick up the Sunday paper every week at Starbucks,” she says.) I know that she is mostly thinking of me. She is sure I am suffering from information overload. But also, to preserve her own sanity and the harmony of our long union, she’d like to reduce, even just a little, the flow of printed matter into our home. Quite apart from what’s going into my head, there’s too much wordstuff, books and magazines and journals and newspapers, always threatening to colonize another flat surface.
You may be thinking that there’s a very satisfactory compromise ready at hand: the Times on the web. I do go to the website a number of times in the course of a week, for one reason or another, but, much as I value that resource, it’s no substitute for holding the paper in my hands. The Times and the Chicago Tribune arrive each day in their blue plastic wrappers as surely as the sun rises. When a good friend and fellow editor told me recently that he’d stopped reading the Times, fed up with the smugness and moral vacuity of the paper’s party line, I was stunned. It was a little like hearing that a friend has sold or given away his possessions and gone to live among the poor.
Of course I understand his exasperation. Perhaps he was afraid that reading the Times was tempting him on a daily basis to feel morally superior. That’s certainly a hazard one must reckon with. Consider this headline from the Tuesday Science section (Nov. 29): “A Pair of Wings Took Evolving Insects on Nonstop Flight to Domination.” Can’t you hear that intoned in the slightly menacing voice of a PBS narrator? The article, by the well-known science writer Carl Zimmer (and illustrated with superb photos), lives up to the headline. Here’s my favorite paragraph:
And insects are also ecologically essential. If all humans decided to leave for Mars, taking all vertebrates with them, the disruption of life on Earth would be incomparably less than the catastrophe that would ensue if insects disappeared. Forests would probably collapse, rivers and oceans would be poisoned, and many other animals would starve.
It’s hard not to sense in this passage the implication that insects are somehow virtuous even as they revel in world domination; as Zimmer puts it, no matter how you slice it, “insects still win.”
We’ll return to this theme in a future issue (right now I am reading a fascinating book by Thomas Eisner, For Love of Insects, published by Harvard University Press in 2003). But it would be an impoverished reader who gleaned from the Times only those pieces that come heavy with ideological freight. Last night while Wendy took a bath I read aloud to her from an article by Sarah Lyall, datelined Barry, Wales (Nov. 29). Lyall was reporting on the Mosquito, an ingenious invention of security consultant Howard Stapleton, who drew on an obscure feature of human hearing—that children can hear sounds at higher frequencies than adults can—“to fashion a novel device that he hopes will provide a solution to the eternal problem of obstreperous teenagers who hang around outside stores and cause trouble.”
In its first trial, outside a convenience store in Wales, the device has performed superbly, emitting a “high-frequency pulsing sound” that is extremely irritating to young people but that Lyall herself, she reports, could not hear.
And then there was Howard W. French’s “Kung Pao? No, Gung Bao, And Nix the Nuts” (Nov. 23), which my daughter Anna read aloud to the whole family on Thanksgiving. French visited the city of Guiyang in China’s Guizhou province, the “ancestral home” of the dish Americans know as kung pao chicken but which in Guizhou is called gong bao jiding, “a dish whose perfume wafts through the air, distinctive even over the smell of tobacco smoke.” A celebrated chef in Guiyang, Wang Xingyun, is quoted at length deploring the manner in which the dish is prepared in Sichuan province—especially the use of peanuts. Accompanying the article is a recipe based on Wang’s own, which we haven’t yet had a chance to try.
Just before Thanksgiving, Wendy and I were in Philadelphia for AAR/SBL (the annual meetings of the American Academy of Religion and the Society of Biblical Literature). In a report on the conference for Books & Culture’s website, I described AAR/SBL as a “chaotic marketplace of ideas” and said I found it exhilarating. A friend wondered about that description. Wouldn’t “depressing” be a better word for it, given the high proportion of confusion and sheer untruth?
That was a good question. The convention, like a sprawling city, is both exhilarating and depressing, a site of great energy and variousness—a lively place—and also a place of darkness. Of all the reasons I read the Times, I think the foremost is to taste that variousness, the unpredictable harvest of the day—unpredictable, yes, despite the paper’s ideological grid. “When a man is tired of London,” Dr. Johnson said, “he is tired of life; for there is in London all that life can afford.”
In conjunction with the 50th anniversary of Christianity Today magazine, founded in 1956 by Billy Graham, Christianity Today International, with support from the Pew Charitable Trusts, is embarking in 2006 on what we’re calling the Christian Vision Project. On p. 7 of this issue, Andy Crouch, the project director, outlines this three-year venture and introduces Books & Culture’s first piece under the CVP rubric, an essay by Lauren Winner urging Christians to do something truly countercultural: get more sleep.
Neil Gussman
An American weapon that has never killed an enemy but still claims innocent victims.
Editor's Note: This article about the strange and admonitory history of the chemical weapon lewisite was first published in the January/February 2006 issue of Books & Culture. Less than a week ago, I read a news article about cleanup efforts at Redstone Arsenal in Alabama, where one of the plants produced mustard gas and lewisite. Other sites at the facility contain residue from chemical weapons produced during World War II. The cleanup is projected to last for several decades.
Dew of Death: The Story of Lewisite, America's World War I Weapon of Mass Destruction
Joel A. Vilensky (Author)
Indiana University Press
240 pages
Does the word "chemical" make you uncomfortable? Do you think of "natural" and "chemical" as opposites? When the TV pitchman promises "Cleans faster with no harsh chemicals," do you give him just a little more attention? If so, you are not alone. Most modern people are afraid of chemicals. Which is too bad, because elephants, bacteria, humans, mice, trees, and tigers are all chemical factories so efficient that chemical makers wish they could approach even a small fraction of the efficiency of any living organism.
Chemicals and chemistry were not always the subject of fear and dread. In the 19th century, chemicals became the building blocks for effective drugs and for rapid advances in public health. Clean water, anesthesia, and painkillers—things we take for granted today—were and are the result of advances in chemistry.
It's not hard to trace the beginning of the downhill slide in the image of chemistry. On April 22, 1915, Captain Fritz Haber ordered German troops to open the valves on 6,000 pre-positioned cylinders of chlorine. Within minutes, Algerian and French troops in trenches near the Belgian village of Ypres saw a yellowish-green cloud rolling toward them. As the heavier-than-air gas filled their revetments, the troops who could run did; the rest writhed in agony as the gas burned their throats and eyes and finally drowned them in the fluid of their own lungs.
It is doubly sad that Haber selected chlorine for the debut of gas warfare, because chlorine has made billions of lives better across the globe in the last hundred years. The vast majority of drugs use chlorine in some step of their synthesis, and chlorine is still the most widely used and effective disinfectant for public water systems. The beginning of the 20th century was the beginning of the end of cholera and other water-borne plagues because chlorine kills germs so well. Then, Captain Haber showed that an element that could kill germs could also kill and maim people.
Had the German army pressed its attack on that horrible day, the war might have ended before the United States joined the Allied armies in 1917. But the Germans checked their advance. The French and British counterattacked, and the war dragged on for three more years: years with millions of combat casualties, including hundreds of thousands injured and dead from chemical attack.
If chemistry's reputation was damaged by World War I, the field nevertheless retained much of its luster through the great advances of the mid-20th century. Then Love Canal, Silent Spring, Agent Orange, and Bhopal tarnished the public image of chemistry to the point that, beginning in the 1990s, some chemical companies and even some chemical trade organizations have changed their names to remove the word "chemical." Sometimes the switch is subtle, from chemical to chemistry (focus groups show the public is less afraid of the latter); sometimes the new name is a vague, Latin-derived neologism that hints at science.
And then in 2005, the 90th anniversary of the first use of chemicals as a Weapon of Mass Destruction (WMD) and the 60th year since the first use of the most fearful WMD, the atomic bomb, Joel A. Vilensky published Dew of Death: The Story of Lewisite, America's World War I Weapon of Mass Destruction—a weapon hailed as the deadliest in history, yet one that has never been used by America in combat and likely would not have been effective if it had been employed.
Vilensky's account begins with two American chemists who were born in the same year and were instrumental in the development of lewisite. They never worked together, they did research in very different areas of chemistry, and while they may have corresponded, they probably never spoke to each other.
The development of lewisite began in 1903, when one of these chemists, Father Julius Nieuwland, mixed acetylene and arsenic trichloride and nearly killed himself. Fifteen years later, our second man, Captain W. Lee Lewis, a chemist with no background in poison gas development, volunteered for war service. He was given lab equipment and told to develop a chemical weapon deadly enough to end the war.
Where did Captain Lewis begin his search for the ultimate weapon? In facilities donated to the war effort by Catholic University of America (CUA) and American University. His search ended when a librarian remembered that the first Ph.D. thesis ever approved at CUA included an experiment that put the candidate in the hospital for a week.
Lewis expanded Nieuwland's work and developed the organic arsenic compound that would later bear his name. In testing, this evil concoction killed dogs, donkeys, and goats by the score and had much to recommend it as a chemical weapon, but it also had drawbacks; chief among them was a tendency to break down in water. (As a reader, I wanted to know why this weapon was selected for development and mass production when it had no combat trials. The author wanted to know this also, but the 90-year-old records were sealed shortly after the attack on America on September 11, 2001. Without a change in policy concerning these records, we may never know.)
Could the leaders of the American Chemical Warfare Service really have thought that the Germans had not developed and tested the same compound? Their commitment to secrecy suggests they did believe that the United States was alone in developing lewisite. In fact, however, the German chemical weapons program had synthesized and tested lewisite (along with other organic arsenic compounds) before rejecting it.
In this instance the assumption of U.S. weapons-makers was extremely parochial. During the period in which lewisite was developed, Germany dominated chemistry, especially organic chemistry. (An organic chemical is a compound that contains at least one atom of carbon. Lewisite includes two carbon atoms in its most deadly variant, more carbon in its weaker forms.)
How extensively did Germany dominate organic chemistry? On November 2, 1916, while America remained neutral and World War I raged in Europe, the submarine Deutschland, having crept through the British blockade of American ports, landed at New London, Connecticut. Onboard this unlikely cargo vessel was a shipment of indigo dye and of Salvarsan, the first drug to successfully treat syphilis. America's chemical industry was weak enough and the German need for currency was strong enough that a U-boat carried less ammunition in order to deliver dye to American mills. The following year, America was at war with its dye and drug supplier, and U-boats no longer docked in Connecticut.
Unaware of the German testing and rejection of this purported superweapon, the United States began to produce lewisite on a large scale. The result: hundreds of casualties among soldiers who never left America, poisoned ground in Ohio and Washington, D.C., tons of arsenic-based poison dumped at sea and buried on American soil, and not one enemy casualty.
The story of lewisite shows that even an unused weapon can be lethal. At the end of World War I, the U.S. Army was making lewisite at the rate of at least several tons per day in Willoughby, Ohio. Since the government records are sealed, Vilensky could not determine the exact amount stockpiled by the end of the war, but it is likely that many tons of lewisite and lewisite-contaminated equipment were buried in and around Willoughby as well as near Catholic University and American University, where Lewis and his team did much of their testing and development work.
Although informed opinion concluded that lewisite was impractical for battlefield use, leading American newspapers and magazines reported on lewisite after the war with claims that have their genesis in a Department of the Interior exposition held in Washington. One of the displays included a vial of the "deadliest poison ever known, 'Lewisite.' " On May 25, 1919, the New York Times said ten airplanes carrying lewisite "would have wiped out . . . every vestige of life—animal and vegetable—in Berlin." The article went on to claim "a single day's output [of the Willoughby plant] would snuff out millions of lives on Manhattan Island." Also reporting on the exposition on the same day, the Washington Post said, "one day's output of the lewisite plant was sufficient to kill all four million inhabitants of Manhattan." Less than three weeks later an article in the Cleveland Plain Dealer said lewisite was "72 times as toxic as mustard gas." Mustard gas was in fact used during the war with devastating effect by armies on both sides of the conflict. Vilensky quotes these and many other contemporary sources—in which the sense of horror is trumped by a boasting tone—to paint a picture of an age almost beyond the imaginative reach of most people living today, an age in which the label "scientific" was good and the godlike men of science would lead us to a better future.
But the story doesn't end there. Vilensky goes on to tell how this untried weapon spread across the globe. In the years between the world wars, most military leaders were averse to using chemical weapons but needed to have these weapons in case the other side used them. So most of the major combatants in World War II built up stockpiles of chemical weapons that were never used. As in World War I, the only lewisite casualties in World War II were plant workers who made the poison and soldiers who "volunteered" to test the weapons. The greatest production—and the highest death toll—by far was in the Soviet Union, with stockpiles whose precise extent, Vilensky says, cannot be accurately determined but which surely ran to tens of thousands of tons.
Some of these Soviet stockpiles have been buried or dumped at sea; the rest are still waiting to be neutralized and disposed of. And the tale of horror continues. Japan, China, Canada, and other countries as well as the United States and Russia are still dealing with lewisite-blighted ground and poisoned citizens nearly 90 years after the compound was hailed as the most deadly weapon in the American arsenal.
This brief and thoroughly chilling book shows how men of good intentions under the pressure of war can make errors in judgment that haunt the world long after they are gone. Vilensky connects America's two largest WMD programs to one man: James Bryant Conant. In the summer of 1918, Captain Conant moved lewisite from development to mass production facilities, hoping that lewisite could at last bring an end to the terrible conflict. When the war ended sooner than expected, Conant returned to civilian life and went on to become president of Harvard University. Then during World War II, he became administrative head of the Manhattan Project. In that role, Conant used many of the procedures that he developed as production chief for lewisite to aid in development and production of the first atomic bomb. And this time, Conant oversaw the production of a weapon that was infamously effective.
Vilensky opens a window into the world of science at war: how discoveries become weapons, and how weapons can harm those who wield them. His modest original intent was to explain the origins of a compound called British Anti-Lewisite (BAL), which was developed as an antidote to lewisite. BAL has been used to treat nervous disorders for half a century and has been much more useful in this role than as an antidote to a poison gas never used against British troops. His persistence and curiosity led to a book that will have an important place in the literature on this ghastly form of warfare.
Before launching into the text of the book, the reader will see that politics and WMDs are never far apart. Richard Butler, former head of the United Nations Special Commission to Disarm Iraq, writes in the foreword that chemical weapons were not used in the latter half of the 20th century with "two notable exceptions." They are Iraq in its war with Iran and the United States in Vietnam. Butler says that the United States used defoliants with the intent to harm enemy soldiers and has been lying about that use ever since. From its foreword by a controversial weapons inspector to its very thorough bibliography, Vilensky's book is interesting, provocative, and frightening.
Neil Gussman writes about the history of chemistry for The Chemical Heritage Foundation in Philadelphia.
William Edgar
Rediscovering the witness of Hans Rookmaaker.
In retrospect, romanticism about the 1960s is overstated. Alongside George Harrison’s sermons on Sergeant Pepper about being “all one and life flows on” and Timothy Leary’s League of Spiritual Discovery (LSD) we must set the addictions, the deaths, and the wasted lives from Haight-Ashbury to suburban New York. Alongside the anti-establishment flower power of the hippie movement, the confused lives in the communes. Alongside the Pax Americana, the brutal Realpolitik of American engagement in Vietnam. Alongside the social programs and the war on poverty, the political assassinations in America and student barricades in Paris.
Although things would eventually return to some kind of normalcy, the 1960s represented a sea change, from the relative social conformity of the years after World War II to a multi-layered, conflicted culture, an unprecedented polarization between Left and Right, new and old, rebellion and conformity. Earlier voices in the 1950s had pushed the envelope, from the juvenile delinquents saluted in The Blackboard Jungle to Elvis’ risqué gyrations and Chuck Berry’s celebration of teenage identity, but the full flood of defiance came in the next decade. Elvis joined the army, and rock became profligate. Hopeful Abstract Expressionism gave way to cynical Pop, Op, Neo-Dada, and Happenings. Cary Grant and Doris Day were replaced by Meryl Streep and Dustin Hoffman. Ozzie and Harriet were no longer everyone’s pop and mom. Things were at best confusing. At worst they were dangerous. The arts were both descriptive weathervanes and prescriptive prophecies.
At the center of those times, a rather lost young man, a jazz pianist by night, a sophomore music student at Harvard by day, made his way up the mountain toward Villars, Switzerland, stopping in a tiny village called Huémoz, where his life would be forever changed. That young man was me: after a long journey, I became a follower of Christ. The people I met there, and their message, became the network undergirding my new-found countercultural faith in evangelical Christianity. The year was 1964, not long after John F. Kennedy’s assassination. The first student to don a Beatles haircut had just walked across Harvard Yard to everyone’s amusement. Less amusing was the spread of hallucinogenic drugs around the community. We lived under the threat of the bomb, and of the draft, a conscription which would send us to Asian jungles to fight a war we did not endorse. The Cold War was seething.
Here at l’Abri I had found a place, completely off the beaten path, where enlightened instructors could make some sense out of our disturbed times, based on biblical Christian faith. The major voice in the community was Francis A. Schaeffer. I had not known such exuberance in my college classes as I did under his teaching. It was wide-ranging, imprecise, passionately delivered, and always related to a unifying worldview. But another voice, at first more muted, but which became for me the more significant influence, was that of an idiosyncratic Dutch art historian. I first knew about him from a chart hanging on the wall of Farel House, the name given to a section of Chalet Beausite, where we studied tapes every day. It was a history of African American music, beginning with spirituals and blues, and moving to the jazz era. It was signed Hans Rookmaaker. I had come to expect connections of all kinds at l’Abri, a place dedicated to exploring the relation of Christian faith to just about everything. But jazz music? Could I have arrived at paradise before my time? And who was this man?
I eagerly found my way through the large tape collection to a series on jazz, full of musical illustrations from rare recordings, delivered in beautiful English with a Dutch accent. More careful, less overtly emotional than Francis Schaeffer’s, the voice was clear, compelling, and utterly fascinating. Hans Rookmaaker spoke of the great artistry and authenticity of Victoria Spivey, Texas Alexander, Bumble Bee Slim, Blind Willie Johnson, and a host of other founders of classic black music. Not only was Rookmaaker the European editor of Fontana Records’ series, Treasures of North American Negro Music, but he had been to America and met Thomas A. Dorsey, Mahalia Jackson, and Langston Hughes. What was the attraction of jazz to this Dutch art historian? For that is what he was during his professional career.
He said it often in his lectures and throughout his writings. It put iron into the blood! Discussing his hero, Joseph “King” Oliver, he compares the New Orleans cornetist’s orchestral sounds to the music of J. S. Bach. He finds very similar musical qualities in the baroque polyphony of the Brandenburg Concertos and Oliver’s Creole Jazz Band from the 1920s. Not only the technical structure, but the mood and atmosphere are similar. Especially, he finds in both of them joy, true joy, not romantic escape. In stark contrast to Theodor Adorno’s attacks on jazz, which found it “unruly,” “rebellious,” and “emasculating,” Rookmaaker describes it as orderly, harmonious, and full of vigor. The opposite of joy for him is happiness, or the escapism of those who look for depth in the tragic and ruinous. And the ultimate source of true joy, whether in jazz or any other human expression, is biblical Christian faith, which Bach and Oliver shared.
During his lifetime, Hans Rookmaaker guided a great host of students into a strategy for understanding their times and working within their society with courage and creativity. His best-selling Modern Art and the Death of a Culture (IVP, 1970) was nothing short of a ground-breaking study of the surrounding culture, both in its threats and its promises. He dared to make sense of the steps to modern art by noting the general trend from a theocentric world to an absurd universe that lay behind the pictures. Malcolm Muggeridge, himself a returned prodigal, gave it a ringing endorsem*nt on the pages of Esquire. Following in the tradition of the historian Groen van Prinsterer, the theologian-statesman Abraham Kuyper, and the philosopher Herman Dooyeweerd, Rookmaaker believed there was a spiritual background to Western painting which was the key to unlocking its meaning. However, unlike amateur attempts to reduce art to philosophy, Rookmaaker led the reader on a visit to hundreds of paintings, writings, and musical numbers, pausing to scrutinize their composition and motifs.
While the clarity of his pages has fooled some into thinking he was merely a popularizer, or, more gravely, that he ran slipshod over the inner dynamics of particular works of art in order to discern their message, the truth is that behind every one of his judgments there was considerable research. It’s just that he did not want to miss the forest for the trees. What his critics feared at the time was that he made facile connections between an artistic statement and its philosophical orientation. They worried that he was from a bygone era which had not yet escaped the carelessness and even the paternalism of such judgments.
Perhaps there is some truth to this. In his praise of Groen van Prinsterer, Rookmaaker compares the statesman’s history of Holland to the books of Kings in the Bible, because both are able to discern the hand of God in history. So, there is a hint of providentialism here. Still, we have gone way over to the other extreme. Quite apart from its frequent unfairness, there is something sad about our timid refusal to look for meaning in a text. Have we not become jaded in our over-sensitivity to hermeneutics? Have not our critical requirements turned us into snobs of a different kind? When we read the works of Rookmaaker and others in the previous generation of scholars, we are in a different world. The air is full of oxygen. They are capable of enviable lucidity. Sure, they made their judgments, but these were often well considered, delivered without today’s required guilt feelings for treading on the wrong toes. They are careful and nuanced in their own way, but full of passion and courage. Besides, the final reason for Rookmaaker’s calling as a critic is that he believed in objective truth, while many of his contemporaries were seducing their audiences away from the possibility of truth.
Close to three thousand pages of limpid prose are gathered between the covers of the six volumes of Rookmaaker’s Complete Works, the appearance of which is truly a publishing event. Marlene Hengelaar-Rookmaaker’s editorial loving care, and respect for her father, shine on these pages.
I thought I knew the man and his subject well. He was a mentor, a friend, a correspondent, and a frequent visitor to our home. Reading these pages, though, I realize that I only knew a part of his work. The sheer quantity is a first revelation. It is marvelous to see all of his major books reproduced, including an English translation of Jazz, Blues and Spirituals. But there is so much more, much of it previously unavailable in English.
The second revelation for me is the variety of subjects discussed. Here are technical articles on philosophical aesthetics. Here, too, are gathered personal letters, transcribed tapes of lectures and interviews, revealing the pastoral and emotional side of the scholar. We find studies on various portions of the Bible, some of them daring in their understanding of symbolism and ancient historiography. There are sermons, and writings about patience and suffering in the Christian life. Numerous book reviews are reproduced. Rookmaaker writes about God’s sovereignty over human history, about his favorite Albrecht Dürer, about the nature of culture, Escher’s graphic art, freedom in the Christian life, and myriad other subjects. These pages are simply a feast.
Significantly absent is almost any attention to photography or film. He does comment on them here and there, but usually negatively, worried that they are bound up with a two-dimensional world. In a memorable review of Luis Buñuel’s surrealist sermon Un Chien Andalou he describes the film as hateful, chaotic, meaningless, and then compares it to art and music which came out of suffering, but with Christian hope, such as Schütz’s Psalms, the golden age of Dutch painting, or African American blues. Surely this reluctance to engage with movies and photographs emanates from his concern not to reduce reality to brute facts.
One of the richest portions, not previously familiar to most of us, is the collection of articles in volume 4 entitled “Western Art.” Moving from medieval times to the present, it contains a dazzling array of references and examples. The Danube School, Bruegel, the Anabaptists on art, Raphael’s Sistine Madonna, Vertumnus and Pomona, Jan van Goyen, Daumier, kitsch . . . this journey is simply magical. In the process we are reintroduced to Rookmaaker’s basic commitments. A poem, a piece of music, a painting must have primary aesthetic qualities. But they also teach us something. Not in the moralistic manner of didactic art, but by opening our eyes to something in the world we had not seen before. He insisted that art be given no less, but no more a place in the scheme of things than it is due. In the 19th century, art (small a) became Art (upper case) because it left its proper place and pretended to be revelatory. No, says Rookmaaker: “Art has a function of its own in culture and human life. Just being art. Not autonomous, but bound by a thousand threads to full reality and human life. A thing of beauty is a joy forever, just because it is related to humanness and reality.”
Art history is a task and a calling for today, tracing the engagement of artists whose work contributes to the good or the downfall of humanity. Indeed, at their best, artists are called to “elevate the humanity of those who consider their work.” Certainly Rookmaaker’s life was dedicated to the elevation of the humanity of everyone he encountered, in his profession and in his ministry.
The final volume contains a beautiful biography by Laurel Gasque, entitled “Hans Rookmaaker: An Open Life.” No one could be better qualified for the task. Not only were she and her husband, Ward, close friends of the entire Rookmaaker family, but they shared his vision. She speaks for many of us when she writes: “Hans Rookmaaker never failed to encourage me intellectually and spiritually through friendship or to inspire me to independence of vocation by his creative example and serious conversation. Through his generous gift of time in viewing art and architecture, listening to music, and in discussing vigorously, extensively, and openly issues of culture and meaning with me, he gave a dimension to my education that I could never have obtained by formal means. Hans’s complete confidence in the indissoluble relation between art and reality and his wise understanding of their inter-relatedness have enriched my thinking, and, indeed, my life.”
At the same time, this 130-page life story is not a hagiography. Hans and his wife Anky went through periods of spiritual dryness. Hans was something of a workaholic. He was often restless. Yet in the end, he did what few persons in any generation can do, because he was truly a universal thinker: navigate easily from the study to the living room, from the Bible to the art museum, from learned books to real people with spiritual gifts and needs.
Several aspects of Rookmaaker’s life and thought are particularly worth underscoring. What were his major influences? During World War II, he served in the Dutch navy. He was interned in a prison camp near Nuremberg, then another in Stanislau, doing hard labor. Though not from a believing background, he began to read the Bible upon the recommendation of a friend back home. He became convinced of its truth. He read other books, and wrote papers on prophecy and aesthetics. In prison, he met Captain Johan Pieter Albertus Mekkes, a Christian, who introduced him to the Amsterdam philosophy espoused by Stoker, Vollenhoven, and, especially, Herman Dooyeweerd (1894–1977), whose New Critique of Theoretical Thought revolutionized Rookmaaker’s outlook on epistemology and apologetics. (How many POWs were reading Dutch neo-Calvinist philosophy in their deprived circ*mstances?)
After the war, Rookmaaker devoted much of his early writing to aesthetic theory based on the Cosmonomic Idea, which posited that nothing was neutral, and that meaning was lodged in spheres and laws governing every part of the created world. Accordingly, beauty and harmony were at the center of the aesthetic sphere, while at the same time there was overlapping into others, so that psychology or theology could be beautiful. Students of Rookmaaker’s in the ’60s and ’70s may not have realized how deeply his thinking was permeated by the Amsterdam philosophy. Much of this school of thought is of technical interest only; the originality of Rookmaaker’s contribution lies in applying it to the arts. As he moved into circles where artists and students were asking hard questions, the theoretical language moved into the background, and he became eminently practical. Still, his commitment to the basic contours of the philosophy was always there. It often came out in his reactions to issues. For example, if a student asked him whether God exists, his answer would first be to dismantle a presumed Cartesian presupposition behind the question, and only then attempt a reply, which would assert that everything in the Bible and in the world is a proof of God. Or if an art student expressed preference for Rubens’ robust infants over the grown-up medieval baby in a Madonna and Child, he would say that neither of them really connects to reality. The Rubens baby, with its Herculean musculature, is just as idealized as the medieval adult icon.
Rookmaaker’s lectures at l’Abri, also reproduced here, stress the unity of life. In them he defends the Kuyperian approach to a world-and-life view. He reminisces on his discussions with his closest friend, Francis Schaeffer, about Dooyeweerd, recalling that they both profited from his critique enormously but made a conscious effort not to use his difficult terminology. Rookmaaker was deeply critical of pietism. He believed that the great tragedy of modernity was to have split the world into a sacred and a secular realm. He cautioned against Christian attempts at living in a subculture, because that unwittingly supported the same split world.
Arguably, the central question which characterized all of Rookmaaker’s investigations was the problem of meaning. There were meaning-structures in the world, which he simply called “reality.” He believed that history has been unfolding since the creation of humanity and its purpose in the cultural mandate of Genesis 1:26-31. When artists try to rebel against the laws of creation, they violate its inner structure, and therefore end up in absurdity. But this dilemma cannot last long, as the unfolding process will continue to develop under God’s providence regardless of whether a particular people conforms or not. Even though much in the West has ultimately headed toward “death” (a word found throughout his writings, and heard frequently at l’Abri), the ultimate direction of history is positive. The Reformation was a high point where the Neo-Platonic chain of being was destroyed, to be replaced by a healthier understanding of creation and human dignity. The Dutch landscapists of the 17th century, along with Rembrandt’s œuvre, mark the high points in this unfolding thus far. Since then, the forces of secularization have taken over. But nothing rules out further progress and a new Reformation.
He reflected over and over again on the doctrine of calling. He worried that the modern spirit of revolution, coupled with pietism, would flatten everything out and squeeze any hope for meaning out of the discussion. In his view, we can only combat this with a fully informed worldview, one that recognizes both the dignity of human beings within the creation and the decimation wrought by the Fall. In the booklet The Creative Gift, republished in these volumes, Rookmaaker suggests that much of the effort to solve the problem of Christianity and culture got off on a wrong footing, because it falls into abstraction. “Christianity” does not really have meaning. There are Christians, some good, some weak, but no “Christianity.” And “culture” is not something to be isolated from the universe. Rather, it is an environment where God has placed us, one which he rules despite its pretended revolt. “Creativity” is no special dimension, but is what we should be practicing all the time wherever we find ourselves.
Reading these rich pages will put iron in our blood. And we will remember why we were so grateful for such a unique guide, a prophet, and a friend. His voice still carries today. We need it more than ever.
William Edgar is professor of apologetics, coordinator of the Apologetics Department, and chairman of the faculty at Westminster Theological Seminary in Philadelphia. He is the author most recently of Truth in All Its Glory: Commending the Reformed Faith (P&R).
Allen Guelzo
A matter of conviction.
Michael Lind has made so many, and such glowing, references to me in What Lincoln Believed: The Values and Convictions of America’s Greatest President that I am not sure whether I should have appeared as a co-author of the book rather than its reviewer. So let me say at the outset that there are two things about this book which I think are worth admiring—and one very large questionable thing which may render the admirable parts moot. Those who are satisfied with this as an example of disinterested benevolence are invited to read on in safety.
What Lincoln Believed: The Values and Convictions of America's Greatest President
Michael Lind (Author)
368 pages
As a pundit, a columnist, and a senior fellow at the New America Foundation, Lind is looking for the sort of thing in Lincoln which most people outside the analytical realms of academe look for, and that is some form of guidance about the nature of democracy. You might think that this looking would be better directed to the Founders—to Madison, Hamilton, Washington, and the Revolutionary generation. But we have become accustomed to the notion that an élitist republic rather than a democracy was the real goal of the Founders, and that democracy was something which was happening outside their circle, and not with their approbation. And so people turn, like Mr. Smith at the Lincoln Memorial, to what bearings on democracy Lincoln can give them.
Therein lies one of the great points Lind scores in What Lincoln Believed, because Lind understands how very, very perilous the status of democracy was in Lincoln’s day. In the middle of the 19th century, the United States was the only large-scale, functioning nation-state in the world living under anything that approached the idea of democracy. “In Europe,” Lind begins, “the dominant region of the world, monarchs and aristocrats were securely in command.” And with, apparently, good reason: the most recent attempts at popular self-government—the French revolutionaries of 1789 and 1830, and the German and Austrian revolutionaries of 1848—had collapsed the moment one faction’s notion of self-government differed from another faction’s notion. Democracy seemed to possess a lethal, and unavoidable, centrifugal force, based on the sheer perversity of human nature.
That democracy survived the Civil War has permitted us to forget that it was ever in serious jeopardy, and forced us to explain Lincoln’s goals in more fuddled and contradictory terms—as the Great Commoner who wanted to raise up the little guy, as a willing dupe who paved the way for the emergence of the Robber Barons of the Gilded Age, as a mystical Unionist, as a prophet of the New Deal, as the Great Emancipator. What Lind sees, and sees with hairline accuracy, was that for Lincoln all of these were subordinate to proving to the theater of the world that democracy was fully capable of resisting the pressures democracies generated from within, without losing its democratic soul. “This is essentially a People’s contest,” Lincoln explained to Congress. “On the side of the Union, it is a struggle for maintaining in the world, that form, and substance of government, whose leading object is, to elevate the condition of men.”1 The war was thus more than a war, or even a civil war—it was an ideological test, to see whether the American experiment in self-government, “or any nation so conceived and so dedicated can long endure.”
This much forms Lind’s first cheer for Lincoln; the second cheer emerges at the end of the book, when he extends Lincoln’s defense of democracy as a defense, not of an airy theoretical principle, but of the democratic nation-state. Formed in the mold of Alexander Hamilton, Henry Clay, and Clay’s Whig Party, Lincoln believed profoundly in the right of Americans to self-government. But it was Americans, as Americans, who possessed that right. Lind’s Lincoln is not an internationalist—he is perfectly happy to have other nations follow the American example into democracy, but he does not think that Americans have any special ownership of the idea of democracy, and he has little interest in forcibly exporting it, on the pattern of a Wilsonian or a Rooseveltian internationalism. It was democracy in America, not American democracy, which Lincoln sought to defend, and sought to hold up as the “last best hope of earth.”
Which means that Lind sees in Lincoln no automatic assumption that the huddled masses, everywhere and always, were hungering for the American model, even if they took encouragement from the example of American democracy’s survival. Lincoln’s stand against the expansion of slavery, and then the secession of the Confederacy, was also a stand against the export of American democracy, if that export was tainted with slavery. This is not, we are invited to presume, a Lincoln who would have much interest in neo-conservative unilateralism.
But two cheers do not make a hurrah, and it is in the broad expanse of the book’s middle that Lind’s Lincoln turns peculiarly sour. Because Lincoln was a committed, if domestic, democrat (Lind argues), he could not have been any of the other good things people attribute to him—not the Great Commoner, not the Great Emancipator, and certainly not the almost-Christian mystic. Part of this argument is a fairly reasonable exercise of logical inference on Lind’s part; a larger part of it, I suspect, is visceral, from a man who understands Lincoln’s ideas remarkably well and simply doesn’t much like them, or much like Lincoln’s ideological descendants.
Take, for starters, Lincoln’s nationalism—this gave intellectual stiffening to his Whiggish preferences for high tariffs, government-funded infrastructure investment, and a national banking system. The downside of such nationalism is that it also laid the foundation for the emergence of a swaggering and arrogant corporate capitalism and a kind of human tariff in labor, in the form of exclusionary immigration policies that tried to shut out foreign workers from competition with white Americans. Lincoln was thus responsible for a revolution in American affairs, a “Second Republic” as Lind calls it, a closed economic shop whose one focus was on the cultivation of industrial productivity of, by, and for white Americans. Only with the advent of the New Deal and World War II—what Lind calls “the Third Republic”—did Americans finally throw off the mantle of protectionism and become the arch-proponents of economic globalization, free trade, open immigration, and broadly based civil rights.
The same exclusionary logic that operated in favor of white Americans and against foreigners also operated against non-whites at home. Lind does not doubt the sincerity of Lincoln’s aversion to slavery; what he doubts is whether it amounted to much beyond that, and whether the elimination of slavery operated principally in Lincoln’s mind as a way to eliminate yet another form of competition with free white labor. And true enough, Lincoln was slow to oppose more than simply the expansion of slavery; even when he finally did realize that he had no alternative to abolishing slavery and emancipating Southern blacks, he did so with the clearly enunciated intention of deporting the freed blacks somewhere else and reserving the United States for whites. “I am … in favor of our new Territories being in such a condition that white men may find a home—may find some spot where they can better their condition—where they can settle upon new soil and better their condition in life,” Lincoln said in 1858.2 To be sure, the deportation never happened. But the freed slaves were dumped under the wheels of something nearly as repugnant, in the form of Jim Crow segregation. To this, Lind doubts whether Lincoln would have had much objection, and so the 14th and 15th Amendments would likely never have followed the 13th if Lincoln had served out his second term as president.
This may not seem very consistent with Lind’s previous depiction of Lincoln as the Great Democrat. But Lind’s Lincoln, remember, is a nationalist—American democracy is a virtue for those who can be defined as Americans, and Lincoln does not define as Americans anyone with black skin. This does not mean—and it is this which decisively separates Lincoln from Stephen A. Douglas and the pro-slavery militants—that Lincoln was simply another Romantic racist who denied that blacks were even human, or who claimed that blacks lacked the natural rights that whites possessed. What he doubted was whether the physical markers that defined races would ever allow full civil equality and civil integration of multiple races within a single nation-state. The majority race, by virtue of their majority, had a legitimate power to exclude minority races from civil equality in a democracy; but otherwise, those minority races were perfectly capable of practicing democracy within their own nation. “No sane man will attempt to deny that the African upon his own soil has all the natural rights” everyone else possesses, Lincoln argued, and he fully expected that the blacks who were colonized abroad after emancipation would create model democracies of their own.3 (Lincoln, in fact, took the dramatic step of extending diplomatic recognition to one such black republic, Haiti.) But at the bottom line, Lincoln’s interest in blacks was strictly subordinate to his interests in whites. And that, in turn, explains for Lind the great geo-political shift Lincoln’s Republicans experienced in the 20th century, from being a coalition of Northern capitalists and Western farmers to being a party of Southern whites and Christian fundamentalists. The interests of white people were always at the heart of Republican affairs, and by the election of 2000, without much difficulty, “the party of Abraham Lincoln had become the party of Jefferson Davis.”
After damning Lincoln for racial indifference and running-dog capitalism, there may not be much enthusiasm left for giving the two cheers Lind wants to give Lincoln as the Great Democrat and the Great Nationalist. But more troubling is whether Lind really has the evidence he wants for Lincoln as the Great Racist and the Capitalist Tool. There is no question but that Lincoln resonated fully with Henry Clay’s “American System” (Clay was, after all, his “beau ideal of a statesman”), or that, as his partner William Herndon remarked, Lincoln managed to make quite a good living as a lawyer, representing the interests of big railroad corporations. “Much as we deprecated the avarice of great corporations,” Herndon chuckled, “we both thanked the Lord for letting the Illinois Central Railroad fall into our hands.”4
But in Lincoln’s imagination, the great virtue of capitalism was its power to liberate people from the trammels of status and class, to promote social mobility. “I don’t believe in a law to prevent a man from getting rich,” Lincoln insisted, because laws that prevented a man from getting rich were precisely what aristocrats used to keep power in their own hands. “Free society is such … that there is no fixed condition of labor”; anyone who “starts poor, as most do in the race of life … knows he can better his condition.” And a man who knows he can better his condition is the first and deadliest enemy of every aristocrat, whose future depends on everyone keeping to their own place and not jeopardizing theirs. Lincoln wanted “every man to have the chance—and I believe a black man is entitled to it—in which he can better his condition. … That is the true system … and so it may go on and on in one ceaseless round so long as man exists on the face of the earth!”5 Lincoln was not contradicting his allegiance to democracy by his devotion to capitalist development, whether in the form of tariffs or “internal improvements”; it was precisely the wedding of ambition to “the fuel of interest,” rather than to social rank, which gave democracy its vitality.
Lind’s most egregious failure, however, is his mischaracterization of Lincoln on race. No one needs to mistake Lincoln for a racial equalitarian; they are, in fact, pretty thin on the ground around the world even today. At the same time, no one needs to mistake him for a lily-white bigot, either. It was Lincoln, to the horror of Stephen Douglas, who kicked off the great senatorial campaign of 1858 by saying, “Let us discard all this quibbling about this man and the other man—this race and that race and the other race being inferior … and unite as one people throughout this land, until we shall once more stand up declaring that all men are created equal.”6 Lincoln, likewise, was never the ardent colonizationist Lind makes him out to be (a case so weak that Lind must resort to citing instances of colonizationist talk from Lincoln which he knows to be bogus).7 The one experiment in colonization which Lincoln did sponsor, in 1863, was framed as a purely voluntary, Congressionally funded expedition to the Caribbean, and when it flopped after six months, Lincoln had a warship retrieve the colonists and never raised the subject again. From that point onward, Lincoln progressively talked more and more about integration and voting rights, not colonization. “How to better the condition of the colored race has long been a study which has attracted my serious and careful attention,” Lincoln told New York abolitionist and Union general James Wadsworth in January, 1864. “In assisting to save the life of the Republic, they have demonstrated in blood their right to the ballot, which is but the humane protection of the flag they have so fearlessly defended.”8
Nor is it fair for Lind to suggest that a persistent strain of Lincolnian racism is what turned the South into the stronghold of the Republican party in the 20th century. Lind assumes that the Confederate South has remained, demographically as well as ideologically, the same Confederate South it always was. But this ignores the massive migration of American capital and population from the Northeast to the South and the Sun-Belt beginning in the 1970s, a migration which brought middle-class Northerners to the South in numbers unseen since Reconstruction, and brought with them the Republican party in similar numbers, similarly unseen since 1877.
I suspect that, lurking deep within Lind’s own authorial and political subconscious, is the realization that Lincolnian principles have not only shaped, but continue to shape, a good deal of the political life of the nation—and that these are principles with which Michael Lind has little personal sympathy. He can endorse Lincoln the Great Democrat, but only to the extent of seeing him as a knight of democratic faith; he would prefer not to see this Great Democrat striding through the world like Sir Artegal’s iron man Talus (or George W. Bush) with his righteous flail, and so Lincoln is tailored down to being a domestic democrat rather than an internationalist one. But even the domesticated Lincoln can be something of a threat, which is why I am inclined to think that streaking him with racism and cupidity, as Lind does, is a device to keep people from taking Lincoln too far or too seriously. What we end up with is a singularly lopsided Lincoln—and a flawed but interesting book. A book worth two cheers, yes; but not a hurrah.
Allen C. Guelzo is the Henry R. Luce Professor of the Civil War Era and director of the Civil War Era Studies program at Gettysburg College. He is a two-time winner of the Lincoln Prize for Abraham Lincoln: Redeemer President (2000) and Lincoln’s Emancipation Proclamation: The End of Slavery in America (2004).
1. Abraham Lincoln, “Message to Congress in Special Session” (July 4, 1861), in Collected Works of Abraham Lincoln, ed. Roy P. Basler (Rutgers Univ. Press, 1953), vol. 4, p. 438.
2. AL, “Seventh and Last Debate with Stephen A. Douglas at Alton, Illinois” (October 15, 1858), in C.W., vol. 3, p. 312.
3. AL, “Speech at Carlinville, Illinois” (August 31, 1858) in C.W., vol. 3, p. 79.
4. Herndon’s Life of Lincoln: The History and Personal Recollections of Abraham Lincoln as Originally Written by William H. Herndon and Jesse W. Weik, ed. Paul M. Angle (World, 1942), p. 284.
5. AL, “Speech at New Haven, Connecticut” (March 6, 1860), in C.W., vol. 4, pp. 24–25.
6. AL, “Speech at Chicago, Illinois” (July 10, 1858), in C.W., vol. 2, p. 501.
7. Lind cites as his clinching example of a Lincoln persistent in his determination to deport freed blacks the claim of Benjamin F. Butler, made in 1885, that Lincoln told Butler as late as January, 1865, that he was still looking for ways to effect colonization; the Butler story, however, has been demonstrated to be a fabrication by Mark Neely in “Abraham Lincoln and Black Colonization: Benjamin Butler’s Spurious Testimony,” Civil War History, Vol. 25 (March 1979), pp. 76-83. But Lind, even after acknowledging that “historians have questioned Butler’s veracity,” still steams serenely past them and claims that “there is no reason to doubt his [Butler’s] account of Lincoln’s obsession with the colonization scheme” (p. 225).
8. AL, “To James S. Wadsworth” (January 1864), in C.W., vol. 7, p. 101.
Mary Noll Venables
The rest of the story.
Ireland is one of the few remaining countries where it’s a major news item that Catholics make up less than 90 percent of the population. According to reports last spring, the number of Protestants is edging higher while the number of Catholics is holding steady. The Church of Ireland, Ireland’s largest Protestant denomination and the former established church, gained congregants for the first time in over a century. Presbyterian and Methodist memberships also increased. Meanwhile, many new non-Catholics have recently arrived in Ireland, and groups that still represent only a tiny fraction of the Irish population, such as Muslims and Orthodox Christians, are nevertheless growing rapidly relative to their numbers a decade ago. As a result, only 88.4 percent of residents in the Republic of Ireland are Catholic.1
Making the Grand Figure: Lives and Possessions in Ireland, 1641–1770
Toby Barnard (Author)
520 pages
Changing religious affiliation reflects a changing Ireland. Thanks to the “Celtic tiger” economy, Ireland has become a country that attracts, rather than sends, migrants. Its diversifying population has encouraged many, from political commentators to radio presenters, to ponder what it means to be Irish. Do you have to be born in Ireland to be Irish? Do you need to speak Irish to be Irish? And do you have to be Catholic to be Irish?
Toby Barnard’s work on the often-neglected history of Irish Protestants has something to add to this contemporary discussion. A New Anatomy of Ireland: The Irish Protestants, 1649–1770 outlines who Irish Protestants were; Making the Grand Figure: Lives and Possessions in Ireland, 1641–1770 describes what Irish Protestants owned. Filled with detailed and careful research, Barnard’s books remind us that Protestants have a long history in Ireland and that their history includes more than Oliver Cromwell’s rampage in the 1650s.
In Cork my husband and I often encounter remnants of that forgotten history: a Methodist church (now a clothing store), a Quaker assembly room (now closed), and three Church of Ireland churches that have been turned into a Catholic church, a concert hall, and an office development. Barnard’s books help the reader envision who might have filled such Protestant churches from the 1650s to the 1770s, a period known as the Protestant ascendancy. At this time the Protestant population in Ireland was around 400,000, or a quarter of the island’s population. Catholics outnumbered Protestants, but Dublin and parts of Ulster, the northernmost province, had more Protestant than Catholic residents after 1732. Protestants continued to dominate Ulster demographically, while the Protestant presence in Dublin declined over the eighteenth century. In Cork, 33 to 40 percent of the population was Protestant. Other Irish towns—Limerick, Drogheda, Kilkenny, and Galway—were less than a third Protestant. In any case, Protestants enjoyed disproportionate wealth and influence. The law of the land reserved the upper reaches of Irish society—as well as positions in the church, law courts, and army and navy—for Protestants.
A New Anatomy surveys the Irish Protestant population, from peers to the poor. Barnard organizes the book by social class, but he acknowledges that defining someone’s social standing depended more on perception than on substance. Participating in hunts, which marked “quality,” required an annual income of forty pounds. Beyond appearing on horseback, dress and living arrangements greatly influenced the perception of “quality.”
Barnard’s decision to separate his subjects by class, despite the elusive nature of social definitions, gives the book a sterile feel. The reader learns tidbits about social classes but gains few extended introductions to specific peers, clergy, or barristers. Barnard has compiled so much information in these two volumes that he sometimes loses sight of the people in the study. For example, he notes that rank within the hierarchy of professions depended on the price of training. Therefore practicing law at Dublin’s Four Courts was highly prestigious, since it required studying at the London Inns of Court. However, no one barrister stands out much more than any other in Barnard’s account.
This lack of individuality is a pity, because when Barnard turns to biography, he brings Irish Protestantism to life. He describes four land agents who ran the Boyle estates in southern Ireland to illustrate the varieties of Protestant landowners. Digby Foulke, William Congreve, Roger Power and Richard Bagge came from distinct regions, made different fortunes, and had varying success. Foulke’s parents were tenants on the Boyle estate, and he and many relations continued in Boyle employment. Congreve was from Yorkshire but became fully integrated into Irish Protestant society. Roger Power came from an Old English family near the Boyle estate. He was elected to parliament in 1703. When he died, his estate was estimated at six thousand pounds, a great sum. Bagge, the lowliest in status of the four, left the earl’s employment after he was accused of corruption. The obvious differences in the backgrounds and success of the four men indicate that simple categories such as “land agent” do not tell the whole story.
To be fair, Barnard’s excursions into biography are limited by the records and correspondence that his subjects left. Lady Arbella Denny (1707-1792), a regular letter writer (and a fascinating character in her own right), features prominently in both books. Her wide-ranging accomplishments typify the influence that Protestants had in Ireland during the ascendancy. Lady Denny was the daughter of an earl, the wife of a member of parliament, and the first woman elected to the Royal Dublin Society. She was widely known for her charitable works; she reformed the Foundling Hospital and in 1767 opened the Magdalen Asylum as a refuge for women from good homes who had become prostitutes.
Barnard’s second volume, Making the Grand Figure, describes the possessions that Arbella Denny and other Irish Protestants would have owned. Although the book sometimes feels like a catalogue of country houses, Barnard argues that the materialism of Protestant culture characterized the entire Protestant experience in Ireland. Protestant wealth, particularly elaborate displays of wealth, distinguished Protestant from Catholic. Since maintaining distinctions between privileged Protestants and poor Catholics was at the heart of the Protestant ascendancy, the material goods that Protestants used to reinforce their separation from Catholics are central to the history of the ascendancy.
Barnard begins with Protestant houses, which were built with stone and mortar, in contrast to Catholic dwellings of straw and mud. Protestant houses had high ceilings and wooden floors, while Catholic cottages had low ceilings and mud floors. And Protestants filled their homes and calendars with goods and pursuits that most Irish Catholics could not afford. Barnard devotes a good portion of the book to the production and accumulation of silver. He notes that banking was more difficult in Ireland than in England and that owning silver may have been a convenient way to hold assets. Irish Protestant householders also collected paintings when they had the funds and etchings and engravings when resources were limited. Outdoors Protestants rode, hunted, raised dogs, and planted ornate gardens.
To contextualize the lives of Irish Protestants, Barnard provides occasional comparisons with English society. The greatest similarity between English and Irish society in this period was the monopoly that the established church held. To participate in state functions or to practice most professions required holding membership in the Church of England or the Church of Ireland and receiving the Eucharist at least once a year. Among those who fulfilled the confessional qualifications, Irish Protestants were distinct from their English counterparts. Overall, residents of Ireland were less wealthy, and even Irish “quality” were generally poorer than “quality” in lowland England. Clerical stipends were also lower in Ireland than in England.
Barnard’s comparisons between Irish Protestants and the upper-class English highlight a fundamental question that neither of his books addresses. Is being Protestant the distinguishing characteristic of the people whose lives he describes? Barnard writes about Irish Protestant and Catholic housing, but better descriptors might be rich and poor housing.
Barnard’s neat picture becomes still more complex when we take into account the shifting relationship between Protestantism and national identity. During the ascendancy, Protestants in Ireland largely regarded themselves as English. But the Protestant ascendancy itself started a historical process that led many Irish Protestants to decide they were Irish. Hence the uneasy relationship between English and Protestant and Irish that persists to this day.
As Barnard exhaustively documents, English settlers in Ireland were privileged in their training, careers, houses, furnishings, and leisure. They built grand houses and elaborate gardens, purchased fine silver and family portraits, and tried to impress their neighbors with their dress and comportment. And then, Barnard occasionally hints, at some point they were no longer completely English. Just as new arrivals to Ireland are changing the definition of what it means to be Irish in the 21st century, so English Protestant settlers in Ireland have changed past definitions of Irishness in ways that are still potent. Barnard’s books begin to open up their lives.
Mary Noll Venables recently received her Ph.D. in Early Modern European History from Yale University and is now living in Ireland.
1. Conor Pope, “Major rise in Muslims, Orthodox Christians—Census,” Irish Times (Dublin, Ireland), April 8, 2004; Georgina O’Halloran, “Success story for Church,” Evening Echo (Cork, Ireland), June 3, 2005.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
N. D. Wilson
When I was two, I was inclined to certain misbehaviors in my bath. If memory serves, I believe standing up and fiddling with the knobs was involved. And splashing. During one particular bathing experience my mother had to leave the room briefly. So, she relied on my older sister, who was not yet five, to occupy me.
"Tell him a story," my mother said. And my sister did.
"Once," my sister said, "there were four children whose names were Peter, Susan, Edmund and Lucy. This story is about something that happened to them when they were sent away from London during the war because of the air-raids."
She was reciting, and she recited from the beginning of The Lion, the Witch and the Wardrobe to somewhere around Lucy's second passage through the fur coats. The rendition was abridged, but she hadn't done the abridging. Our cassette tape had. Ian Richardson, narrator, had read an abridged version to us so many times that my sister had a sizeable chunk of it word for word.
The film … well, the film isn't just abridged, and it isn't read by Ian Richardson.
Sitting in a Hollywood screening room, waiting for my advance glimpse of the Disney/Walden rendition of talking beavers and a forest-infested wardrobe, I have a lot of time to think about my relationship with Narnia. I wonder if I am capable of liking any film adaptation. Will I simply spend the entire time noticing small changes, unable to see the film apart from its inspiration? Probably.
Two was a good year for me. I sat through my first readings of Narnia, both abridged and unabridged. I sat in my highchair after dinner and listened to my father read to us as his father had read to him. That year I was introduced to both Lewis and Tolkien. My mother questioned my comprehension, but my father, ever optimistic, pointed out my red and sweaty cheeks, which made their appearance during scenes of battle.
I was marinated in Narnia, and I've been on a slow-roast ever since. I have no way of estimating how many times I have passed through those books, only how recently the last reading came—just last month. I love my mother, and I love Narnia. And if anyone chooses to show me an artistic rendition of either, they can expect criticisms. They can expect me to limber up and become a thorough and enthusiastic picker of nits.
Andrew Adamson, who brought us Shrek, Shrek II, and Shrek in the Swamp Karaoke Dance Party, is the director. Tell me that's promising.
But the story is far from ruined. The primary conflict remains virtually intact. The Stone Table scene is phenomenal. Aslan is effective and easily believable, and Lewis' Christianity has a loud presence. While book-readers like myself might be prone to stress and quibble, I expect this film to have nothing other than a deservedly positive reception in the broader evangelical world.
The film begins where it must, with German bombers over London. The opening sequence also gives us early tension and differences between Peter and Edmund, with Peter why-can't-you-do-what-you're-told-ing his younger brother, a question which will serve as a bookend for the entire film.
The texture of the opening act is strong, and the casting works well. I find myself relaxing a little in my seat. But I wait for the inevitable, for some shifting of motivation, some change in dramatic tension that patronizes Lewis' original. That change does not come for a long while. But it does come.
Lewis himself had complaints about film adaptations. He was a lover of virtually every adventure story that could "introduce the marvelous or supernatural," including such prose-tripe as Voyage to Arcturus:
Unaided by any special skill or even sound taste in language, the author leads us up a stair of unpredictables. … He builds whole worlds of imagery and passion, any one of which would have served another author for a whole book, only to pull each of them to pieces and pour scorn on it. The physical dangers … here count for nothing: it is we ourselves and the author who walk through a world of spiritual dangers which make them seem trivial.
—Of Other Worlds, "On Stories"
This "marvelous or supernatural" was in fact what he strove to achieve in all of his stories, and is the common attribute of every story he admired critically, from King Solomon's Mines to Paradise LostParadise Lost. And it was the film version of King Solomon's Mines that bothered him.
Lewis complains that the producer of the film, "for me, ruined the story." This narrative ruin came about through the substitution of one danger for another, and that substitution of danger was an outworking of a literary paradigm of excitement. "Where excitement is the only thing that matters, kinds of dangers must be irrelevant. Only degrees of danger will matter. The greater the danger and the narrower the hero's escape from it, the more exciting the story will be." Lewis goes on to explain that different kinds of dangers produce different kinds of fear—fear with awe, fear with horror, fear with disgust, numbing fear, and a quivering almost pleasurable fear. The imagination responds differently to these fears. They change the personality of a story accordingly.
While I notice simple shifts in description—Why does the white witch have blond dreadlocks? Where are her red lips? What happened to the charismatically, seductively, dangerously beautiful Jadis? What happened to her palace? Why is it made entirely of icicles?—I finally come to the first shift in danger, the first place where the writers felt Lewis lacked "excitement."
The children are in the Beavers' house, and Edmund has left them. In the book, we immediately sense betrayal. Peter wants to follow Edmund, but the Beavers make him see the folly of this, and they all trek off as quickly as possible (leaving behind Mrs. Beaver's sewing machine). The children must trek stealthily, always listening for the bells of the witch's sleigh behind them (the wolves were sent to the Stone Table to discover if Aslan really had returned and to cut off the children if necessary). If you have ever done any sneaking with the fear of followers and ambushes, if you have ever attempted any stealthy and yet speedy treks across the park, across the lawn, or simply shifting hiding places from the bedroom to the hall closet, then you know this tension, this sensation of breathless, bottled-up, speedy caution.
But for the film, such understated tension isn't exciting enough. The children follow Edmund to the witch's ice castle, only then deciding to run back to the dam, pack up, and leave. Rather than sending the wolves ahead to the Stone Table, the witch sends them directly to the Beavers' house, and we have our necessary excitement.
The children are inside the house when the wolves begin tearing through the walls. I sit, wondering how the writers expect to believably get them out and all the way to the Stone Table with wolves on their heels. But the writers hand us a minor deus ex machina, and Beaver confesses to his wife that he has a secret tunnel that leads to Badger's house. We are then off on a wolf-chase climaxing on thin ice beneath a thawing waterfall. The waterfall tumbles, the ice shatters, and everyone washes down the frigid river, but nobody drowns, and because Spring is coming, there is no danger of hypothermia.
Certainly, this is more "exciting." But it produces a different danger, a different taste. Like MSG-ridden Chinese food, everything tastes a lot, but all the same. The market of tension becomes glutted, decreasing the value even of our primary conflict.
For myself, I flinch at every minor change of hair color and motivation, and at the missing gifts for the Beavers. I have trouble with the inflation of tension (though not of the battle). Peter's character is too conflicted (he just wants to get Edmund and go home). But at the same time, this film was lovelier than I expected it to be, frequently beautiful, and while it does get a bit distracted, it still communicates the best of what Lewis has to offer.
My own son is three. He knows the story, but Narnia is not yet concrete enough in his imagination to survive such a film. My nephew, son to my sister the bath-bard, is in first grade and is already doing laps through the Narnia Chronicles. When he sees the movie, I expect him to find frustration in the variances. Knowing him, and knowing his mother, his frustration will probably be greater than my own. I never knew any part of it word for word.
N. D. Wilson is a Fellow of Literature at New St. Andrews College and the managing editor of Credenda/Agenda magazine. His first novel for children will be released in Spring 2007.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Andrea R. Nagy
How the OED was made.
Like the Bible, the dictionary is a book of weighty authority, and the Oxford English Dictionary is the most weighty and authoritative of all. Conceived in 1857 and published in its first edition between 1884 and 1928, the OED comprised 15,488 pages, 50 million words overall, and two million illustrative quotations. Today, in its updated and uploaded form, the OED defines some 600,000 lemmas, tracing word-by-word the history of our enormous and ever-changing language.
Lost for Words: The Hidden History of the Oxford English Dictionary
Lynda Mugglestone (Author)
Yale University Press
273 pages
$8.24
As a masterpiece of imperial English culture, the OED has been the subject of extensive criticism and analysis. In Caught in the Web of Words (1977), James Murray’s granddaughter recounted the sacrificial devotion of Murray in his 36 years as chief editor of the dictionary. In Empire of Words (1994), John Willinsky documented the Victorian bias toward great white men built into the dictionary. In The Professor and the Madman (1999), Simon Winchester told the story of the murderer in the insane asylum who contributed more than anyone knew to the making of the OED, and in The Meaning of Everything (2003), Winchester completed his story of the OED with anecdotes and personal portraits. Beyond these popular works, numerous scholarly articles and books have uncovered omissions, antedatings, and corrections to the dictionary.
So is there more “hidden history” to be revealed? According to Lynda Mugglestone, there certainly is. Behind the OED’s authoritative text is a history of composition, complete with personalities, debates, and prejudices that shaped its first edition. How were definitions written? How were quotations selected for inclusion? How were spelling and pronunciation decided upon? Does the OED really trace the history of every English word that has ever existed? These questions are the subject of Lost for Words: The Hidden History of the Oxford English Dictionary. Mugglestone has closely examined the editing process of the OED in a way that has not been done before. By poring over a vast archive of annotated proof sheets, as well as letters, reviews, articles, and speeches, she has filled in many details about the editorial decisions that shaped the dictionary at the final stages of publication.
Mugglestone’s research supports much of what we already know about James A. H. Murray. Like Samuel Johnson, he was “a poet doomed at last to wake a lexicographer.” Murray dreamed of creating a fully descriptive, exhaustive, historical record of the language. With the impartiality of a scientist, he would document the story of every English word, whether low or high, old or new, common or esoteric. Such a biography of the language would be a “historical monument” fit for a great nation. But alas, as Mugglestone puts it, “The lexicon could not, in practice, be encompassed by the lexicographer.” Although Murray wished to create an ideal dictionary, he was forced by budget constraints and cultural pressures to edit the text in more prescriptive directions.
The annotated proof sheets reveal that editing primarily meant cutting. Murray was constantly obligated to compromise his descriptive ideal, deleting quotations, definitions, and entire entries. Mugglestone discusses the rationale behind the deletions, confirming that literary language tended to be favored over vulgarisms, established vocabulary over neologisms. Thus quotations from daily newspapers were cut, while the wisdom of poets and bishops was kept. “Linguipotence” was retained because it was a coinage of the poet Samuel Taylor Coleridge, while “greyhoundy” was omitted, being used only in the popular journal Black and White. “Condom” was omitted without much question, while some of the most potent four-letter words were regretfully suppressed after lengthy debate. “Enthuse” was labeled “colloquial,” and “gent” was censured as “vulgar.” The editors made quite a few concessions to Victorian sensibilities.
Of course, with the second edition and the creation of the Oxford English Dictionary online, many of these omissions have since been corrected. The OED now features a complete history of every well-known taboo word in the English language, including a 281-word entry on “condom” with quotations beginning in 1706. A full range of Americanisms is covered, as well as world English from Australia, South Africa, Canada, and other Anglophone countries. Slang, too, is amply represented, from “awesome!” to “yo!”, as are the most obscure technical terms, such as “algology” and “ampelography.”
But even in its first edition, the OED ended up being more descriptive than its cultural milieu was accustomed to. For example, although the delegates of the Oxford University Press specified that the dictionary should avoid scientific terminology, Murray directed the dictionary’s researchers and writers to embrace a wide variety of technical vocabulary. And although he received numerous complaints about the “incorrect” definitions of such words as “arcade” and “abhorrence,” Murray declared, “I am not the editor of the English language,” and he defined these and other words in accordance with the evidence before him. As Mugglestone summarizes it, “The fact that so many letter-writers … saw fit to complain about the undue liberality of the dictionary … serves as a useful index of the level of descriptive impartiality which the dictionary did indeed achieve.” Her careful study of these hitherto unexamined letters and proofs shows exactly where Murray adhered to his principles and where he chose to compromise.
But how much of this “hidden history” needs to be revealed? For a devoted scholar of the OED, perhaps all of it. However, for the nonspecialist, Lost for Words contains too much information. We are given far too many quotations from Murray and his correspondents debating details of spelling and usage, comments that might have been summarized in a few paragraphs with data presented in a table. We are given extensive dictionary definitions of “loss,” “prune,” and “adjust” as background for a discussion about the cutting of entries. And at times Mugglestone belabors the obvious, as when she spends several pages lamenting that the dictionary uses “man” where we in the 21st century would use “person.”
It makes for painful reading, too, when Mugglestone takes on the jargon of a new-historicist literary critic, uncovering “cultural agendas” and “cultural codings” that are invariably “disturbing.” In one of her most impenetrable sentences, she states, “If such socially constructed edicts are in keeping with a self-styled manual on ‘good’ usage, then they can seem disconcertingly normative when they appear within the intentionally objective domain of the OED in which empiricism rather than language attitudes—particularly those based on convictions of one’s place in the social order—had been given categorical pre-eminence.” In other words, although Murray claimed to be fully objective, he was influenced by his culture and values. Why does Mugglestone need to make simple things complicated?
For this book makes essentially a simple argument: in spite of heroic efforts on the part of James Murray, the OED was to some extent shaped by social preferences in favor of high culture. This is not a new argument, but what is new is Mugglestone’s examination of the proof sheets and other contemporary documents, which lend support to this well-established understanding of the development of the OED. It is unfortunate that this point is obscured by superfluous detail and an excessively analytical style.
If you want to know about the short life of “lustricity,” the comparative advantages of “rhyme” and “rime,” or the lexicographical debates over the correct usage of “avocation”; if you want to learn how “fray” came to lose its “obsolete” label or how “okonite” was derived from “ok,” then Lost for Words will provide informative reading. On the other hand, if you want to hear the story of the making of the OED, stick with the books of Elisabeth Murray and Simon Winchester.
Andrea R. Nagy has been a project editor for the New Oxford American Dictionary, a citation reader for the Oxford English Dictionary, and the author of scholarly articles on the history of English dictionaries.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Lauren F. Winner
In search of a counterculture for the common good.
If there is one thing that has defined evangelical Christians, it is their volatile relationship to the cultures where they have sojourned. In America, evangelicals have at various times enjoyed everything from near hegemony to internal exile. They have abjured political power and sold pearls of great price to obtain it—often in the same lifetime. They have censored, critiqued, consumed, and copied the fruits of mass culture—sometimes all at once. They have harbored some of the most enduringly radical American voices on social responsibility and racial justice, yet in recent years their most innovative and influential leaders have been found in exurban locales of homogeneous wealth. They have produced notable scholars of history and enthusiastic popularizers of the end of the world.
It would be more honest, though, to say “we” instead of “they.” As a publication of Christianity Today International, Books & Culture is very much part of the ongoing, unpredictable, sometimes combustible evangelical engagement with culture. Over the next three years we will join our sister magazines Christianity Today and Leadership Journal in the Christian Vision Project, an effort to ask three “big questions” that define critical territory in the Christian relationship to culture, mission, and the gospel. In the first year, with the generous assistance of the Pew Charitable Trusts, we focus on the question, How can followers of Christ be a counterculture for the common good? This piquant phrase, which we have borrowed from the Rev. Timothy Keller of Redeemer Presbyterian Church in Manhattan, juxtaposes two neglected themes. We hope the contributions in these pages, on the website ChristianVisionProject.com that will launch in February, and in a series of DVD documentaries will spark much fruitful conversation and action.
We have asked six people to respond to this question in Books & Culture in 2006. All of them are serious and creative Christian thinkers—though not all are evangelical Protestants—and many will be familiar to longtime readers. Perhaps none will be more familiar than our first contributor, Lauren F. Winner, who at 29 is completing a Ph.D. in American religious history at Columbia University while both teaching and studying at Duke Divinity School, and travels widely speaking to audiences in the wake of her book Real Sex. With all this on her plate, perhaps the subject of her answer to our “big question” is natural—but that doesn’t make it any less important.
My subject is the theology of sleep. It is an unusual subject, but I make no apology for it. I think we hear too few sermons about sleep. After all, we spend a very large share of our lives sleeping. I suppose that on an average I’ve slept for eight hours out of twenty-four during the whole of my life, and that means that I’ve slept for well over twenty years. What an old Rip van Winkle I am! But then, what Rip van Winkles you all are, or will one day become! Don’t you agree then that the Christian gospel should have something to say about the sleeping third of our lives as well as about the waking two-thirds of it?
—John Baillie, “The Theology of Sleep,” in Christian Devotion (1962)
Last night, I pulled one of my very few all-nighters. These were not uncommon in my college years, but my capacity to stay up all night and be anything approximating coherent the next morning has declined as I’ve marched through my twenties. So now I stay up all night very rarely, once every two years or so, and only when I am truly desperate.
But the storied all-nighters are just the most extreme example of something many of us do quite a lot: chip away at sleep in order to do something else. Usually that something else is work.
A simple glance at my email inbox tells me that I am not alone in sacrificing sleep in order to squeeze in a few more hours of work. Last Tuesday alone, I received 23 work-related emails that had been sent between 10:00 p.m. and 5:00 a.m. This creeped me out. The next night, in fact, I had some trouble falling asleep. I lay in bed worrying about the correspondence that was accumulating in my email account, the possibly pressing matters I would need to address in the morning, and the number of hours the next morning that I would have to devote not to preparing to teach my afternoon class, but to replying to email. Eventually I rolled over and set my alarm back from 6:30 to 5:00, resolved to use the extra 90 minutes of wakefulness for email.
Wakefulness, actually, may not be the right word. For though I “gained” 90 minutes in which I was awake, I actually lost wakefulness. Sleep specialists are virtually unanimous on this: With some notable exceptions who seem wired to operate on a different schedule (Thomas Edison is a famous example), we human beings cannot lose sleep without decreasing our attention span, our response time, our acuity. I may have been awake for 90 extra minutes, but I was less wakeful all day long.
According to the National Sleep Foundation, the average adult sleeps six hours and 58 minutes per night during the work week. One hundred years ago—before Mr. Edison’s marvelous invention—people slept about nine hours a night. They were right in line with the eight to ten hours of sleep that specialists say we need. Now we are a nation of the chronically sleep-deprived.
Adults’ zeal for cutting back on sleep has consequences for children, too—and not just that parents and teachers are crabbier because they’re not well-rested. Children need even more sleep than adults, yet parents now keep them up later and later, possibly because working moms and dads want to “spend quality time” with their children (a phrase laden with many revealing contradictions and falsehoods, but that’s for another day), something that’s just not possible if you arrive home from work at six o’clock and Junior’s in bed by 7:15. Last year the Washington Post reported that naptime is increasingly “a luxury that 4-year-olds no longer can afford.” Many Washington-area schools are eliminating naps from the kindergarten curriculum, so that 45 more minutes can be devoted to instruction. Administrators seem unconcerned that their charges would learn better if they were well-rested, but that may not be the point. In trading nap time for more time spent studying the alphabet, these tots are really learning to value productivity, or at least activity, above all else.
The irony is that although many of us trade sleep for productivity, we would actually be more productive if we slept more. When we don’t get enough sleep, we accumulate “sleep debt” which has to be paid back. (It’s no coincidence that we describe this state with a metaphor drawn from banking, one William Wordsworth nicely turned on its head when he asked, in his poem “To Sleep,” “Without Thee what is all the morning’s wealth?”) We concentrate better and are less easily distracted when well-rested. A study from the University of Minnesota recently showed that when high schools started the day 85 minutes later, at 8:40 a.m. instead of 7:15 a.m., students got more sleep at night, fell asleep in class less often, and got better grades. When we’ve gotten good sleep, we are also happier, nicer, and healthier. Michael Irwin, director of the Cousins Center for Psychoneuroimmunology at UCLA, says, “Even a modest disturbance of sleep produces a reduction of natural immune responses and [production of] T-cell[s],” the cells that combat the effects of viruses and other pathogens on our bodies.
Indeed, sleep deprivation carries great costs, both in dollars and in human life. Tragedies related to sleep deprivation—car wrecks, accidents at the workplace, and so forth—cost Americans more than $50 billion a year, and result in at least 20,000 deaths. The National Highway Traffic Safety Administration says sleep deprivation causes 100,000 traffic accidents a year. (The slower response time of people who’ve not gotten enough sleep accounts in part for the spike in wrecks on the day after the spring shift to Daylight Saving Time, when people often lose an hour of sleep.) Psychologist and sleep specialist Stanley Coren has suggested that the accidents at Chernobyl and Three Mile Island both occurred in part because sleepy employees, dragged down by sleep debt, were “not working at top efficiency and were not motivated to check details closely.” According to Coren, sleep deprivation was also a factor in the Exxon Valdez oil spill. To save money, Exxon had been cutting back on staff, which required the remaining employees to put in longer hours. The oil spill would not have happened had an exhausted third mate not fallen asleep on the job.
When folks from my local church gather for an evening meal or adult education class, we usually close with Compline, the nighttime service from the Book of Common Prayer. This service—in which we pray for a peaceful night and a perfect end, repeating the nunc dimittis (originally uttered by Simeon in a somewhat different context, asking God to let his servant depart in peace)—is helping me to understand sleep as part of faithfulness. For it is sheer hypocrisy to pray with my community for a peaceful night and a perfect end if I know I am going home to put in three or four more hours answering email.
Sleep more: this may seem a curious answer to the question of what Christians can do for the common good. Surely one could come up with something more other-directed, more sacrificial, less self-serving. Or more overtly political—refusing to serve in the current war. Or more communitarian, making a commitment to street and neighborhood that overrides new job offers.
And let’s be honest. Had I instead written a rousing essay calling all Christians to hold vigils against the death penalty next week, the very improbability that anyone would heed my call would let us all off the hook. One of the reasons you may be wishing I hadn’t suggested we Christians sleep more is that sleeping more is something you can choose to do, or not do, this very night.
It was one of the reasons I was tempted to write about protesting capital punishment instead, for I will have a chance this very night to practice what I’m preaching, and it will be much harder than sending a check to Virginians Against the Death Penalty.
All of those things—protesting capital punishment, working with our neighborhood association, and so on—would be good things for us Christians to undertake as well. But for the moment I am sticking with the small, if challenging, task of becoming better rested. Not only does sleep have evident social consequences, not only would sleeping more make us better neighbors and friends and family members and citizens. Sleeping well may also be part of Christian discipleship, at least in our time and place.
It’s not just that a countercultural embrace of sleep bears witness to values higher than “the cares of this world, the deceitfulness of riches, and the desire for other things.” A night of good sleep—a week, or month, or year of good sleep—also testifies to the basic Christian story of Creation. We are creatures, with bodies that are finite and contingent. For much of Western history, the poets celebrated sleep as a welcome memento mori, a reminder that one day we will die: hence Keats’s ode to the “soft embalmer” sleep, and Donne’s observation, “Natural men have conceived a twofold use of sleep; that it is a refreshing of the body in this life; that it is a preparing of the soul for the next.” Is it any surprise that in a society where we try to deny our mortality in countless ways, we also deny our need to sleep?
The unarguable demands that our bodies make for sleep are a good reminder that we are mere creatures, not the Creator. For it is God and God alone who “neither slumbers nor sleeps.” Of course, the Creator has slept, another startling reminder of the radical humility he embraced in becoming incarnate. He took on a body that, like ours, was finite and contingent and needed sleep. To push ourselves to go without sleep is, in some sense, to deny our embodiment, to deny our fragile incarnations—and perhaps to deny the magnanimous poverty and self-emptying that went into his Incarnation.
The French poet Charles Péguy makes the point well:
I don’t like the man who doesn’t sleep, says God.
Sleep is the friend of man,
Sleep is the friend of God.
Sleep is perhaps the most beautiful thing I have created.
And I myself rested on the seventh day. …
But they tell me that there are men
Who work well and sleep badly.
Who don’t sleep. What a lack of confidence in me.
Péguy’s words have perhaps never been more fitting: to sleep, long and soundly, is to place our trust not in our own strength and hard work, but in him without whom we labor in vain.
More CVP articles from our sister publications are available on ChristianVisionProject.com. Also check out the Christian Vision Project’s new video documentary, Intersect|Culture. The videos take you into the stories of ordinary believers who, by faith, changed their communities. The set includes a DVD with 6 videos and coordinating group curriculum.
Lauren F. Winner is the author most recently of Real Sex: The Naked Truth About Chastity (Brazos).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Stan Guthrie
How India’s former “untouchables” are finding freedom.
Guruammal, 26, was a member of India’s despised Dalits (formerly known as untouchables). As such, she possessed fewer rights than almost anyone on earth. Working the fields, she earned the equivalent of 44 cents a day. But Guruammal and her husband were glad for the work. She was four months pregnant, and the family would need every bit they could scrape together.
One day in December, the police raided her village. The superintendent called Guruammal a pallachi, a caste name for a prostitute, and unzipped his pants in a sign of utter disrespect. Later that morning, Guruammal complained to another official about the superintendent.
The next morning, the police were back, and they were looking for revenge. Guruammal’s husband hid under the bed. The police broke down all the doors of the villagers’ homes and arrested 53 men, but the superintendent was looking for Guruammal. Finding her in her nightclothes, the police called her a pallachi again and began beating her. The superintendent dragged her, naked, for 100 feet. A 60-year-old neighbor woman asked the officers to stop, and the police beat her, too, fracturing her hands. One of the village men gave Guruammal his wrap so she could cover up.
At the jail, Guruammal asked the officers for help, saying she was pregnant. They simply mocked her for the previous day’s boldness and locked her up. After 10 days, she miscarried the baby. Fifteen days later, they let her go. No charges were filed against the officers.
Prisoners of the Hindu caste system, India’s 250 million Dalits face such indignities on a daily basis. According to Human Rights Watch, nearly 100,000 crimes of hate were committed against Dalits between 1994 and 1996 nationwide—including many cases of murder, rape, and assault as well as lesser crimes. Many more incidents were not reported. Observers believe that with the rise of right-wing Hindu fundamentalists in India, such attacks are increasing in frequency. And apart from physical assault, Dalits face systematic social, economic, and religious exploitation. India’s pernicious caste system dwarfs South African apartheid, both in scale and in effect. Apartheid is gone, but caste remains.
A new book, Dalit Freedom—Now and Forever, chronicles the Dalits’ ages-long plight. Written by an Indian Christian and supplemented by commentary from notable Dalit leaders, it issues a ringing call not only for political liberation but also for spiritual liberation. And it makes the case that these two freedoms go together.
The Aryans, who invaded India more than 3,500 years ago, divided the society into four groups, or castes. Asserting that different groups came from different parts of the body of Purusha (the supreme personification of the god Vishnu), they made themselves the priestly class (Brahmin), followed, in order of dignity, by the warriors and protectors of Hinduism (Kshatriya), the business class (Vaishya), and workers who support the first three castes (Sudra). Outside the caste system are the Dalits, who are thus called outcastes. Hindu religion, enforced by the Brahmins (who constitute about 5 percent of India’s people), makes these distinctions immutable. Education and achievement do not allow a person to escape his or her caste.
The caste system is Jim Crow on steroids. While human-rights activists have campaigned against apartheid in South Africa and genocide in Rwanda, Sudan, and Serbia, they have had surprisingly little to say about caste in India. If divestment was the right approach in freeing blacks in South Africa, why is it not in freeing Dalits in India, which is increasingly tied to the global economy? The upper castes reap almost all the benefits of globalization and thus would have to pay attention if economic sanctions over caste became an issue.
Members of the four castes consider the Dalits of less worth than animals. Dalits are despised as socially polluted and are given the jobs no one else wants: sweeping streets, cleaning latrines, and skinning cows, for example. According to Hindu tradition, they were not allowed even to touch those of other castes (hence the name “untouchables”).
This relentless oppression undermines India’s claim to be the world’s largest democracy, just as the persistence of systematic racial discrimination in the United States long after the abolition of slavery flagrantly contradicted America’s democratic ideals. After India won independence from Britain in 1947, several attempts were made at reform. Mahatma Gandhi, a member of the business caste, treated Dalits with respect and successfully argued that India’s new Constitution should ban the practice of untouchability, but he did not seek the eradication of caste itself.
A contemporary of Gandhi, B.R. Ambedkar, was a Dalit who studied in the United States and trained as a barrister in London. Ambedkar discovered upon his return to India that his credentials and experience could not shield him from the old caste discrimination. He was instrumental, with Gandhi, in seeing that the country’s charter banned untouchability. Ambedkar also fought for quotas for Dalit positions in the government. As a result, 5 million Dalits have dignified jobs today.
But Ambedkar concluded that his people could never truly be free if they remained Hindus. The Brahmins had too much to lose, he believed, to endorse a thorough reformation of the caste system. Ambedkar said, “I was born a Hindu, but I will not die a Hindu.” Near the end of his life, he converted to Buddhism, in part because he appreciated its rejection of caste and polytheism and its embrace of equality.
Why did Ambedkar reject Christianity, which has been in India since the second century? After all, he was familiar with Christianity’s teachings about the God-given dignity of human beings and how it laid the groundwork for political and economic freedom in the West. (Dalit Freedom‘s strongest chapter may be the one comparing the caste system to the Pharisees’ religious system during New Testament times. The book shows how utterly attractive Christ can be to weary and heavy-laden Dalits.)
First, of course, Christianity was associated with colonialism. Second, the Indian churches were rife with caste, although in a milder form than in Hinduism. Unfortunately, although the church in India—less than 3 percent of the national population—is mostly of Dalit background, caste is still a major factor of church life.
Joseph D’souza is attempting to change that tragic reality. An upper-caste Christian from Mangalore, D’souza condemns and has apologized for caste in the Indian church. More than that, D’souza, as president of the All India Christian Council, is standing publicly with the Dalits. The AICC is an interdenominational group that supports Dalit freedom and the rights of religious minorities. (A related group, the Dalit Freedom Network, www.dalitnetwork.org, published this book.)
D’souza does more than write books. He’s out on the front lines. On November 4, 2001, Dalit leader Udit Raj organized a mass conversion event in New Delhi. About 50,000 Dalits who made it to the rally—many more were turned back by police—converted to Buddhism. In a land where interreligious tensions quickly surface, D’souza and other Christian leaders were on hand to encourage Dalits in their quest for freedom, but not to proselytize.1
“Our love for the Dalit people is like the love of Christ for them—unconditional,” D’souza writes. “We love people whether or not they choose to follow Jesus.”
Such sensitivity is no doubt needed in India. As Dalit political consciousness has grown, right-wing Hindu fanaticism—aimed both at the outcastes and at religious minorities, including Christians and Muslims—has grown with it. Violence against those who oppose the complete Hinduization of India has accompanied the rise of the Bharatiya Janata Party as the national ruling party (unexpectedly overturned in 2004 in parliamentary elections). Unfounded charges of bribery-induced conversions continue to dog Christian ministries. Hundreds of Christians have been attacked and killed over the last decade, so the biblical command to be as wise as serpents and as harmless as doves makes sense.
Yet the Bible also commands believers to make disciples. Courageous ones such as Joseph D’souza are clearly leaving the church doors open for any Dalits who want to experience Christ’s promise, “So if the Son sets you free, you will be free indeed” (John 8:36, ESV). With Dalits increasingly looking to break the shackles of caste, Christians in India have the opportunity to offer them real hope—both for now and for eternity. If the Indian church can throw off its own ugly shackles of caste, it may indeed model Christ-centered freedom in a way that could transform that great nation.
Stan Guthrie is a senior associate editor at Christianity Today magazine and author of Missions in the Third Millennium, recently published in a second edition by Paternoster. His website is www.stanguthrie.com.
1. Manpreet Singh, “50,000 Dalits Renounce Hinduism,” Christianity Today, January 7, 2002, p. 25.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Brian Howell
Conversion in a Papua New Guinea community.
There are two big questions for anthropologists examining widespread conversion to Christianity: why and how? Why do people abandon coherent religious systems they have practiced for centuries in favor of a new—often radically new—way of thought? Then, even after people decide to accept the new religious framework, how does that happen? How do people apprehend these entirely new forms of thought if, as so many anthropologists argue, new concepts can only be grasped in terms of a pre-existing cultural framework? And for Christian anthropologists, there may be a third question: how might our interaction with these questions within the disciplinary framework of anthropology be distinctively informed by a theological understanding of the human person and the overarching story of creation, sin, and redemption?
Becoming Sinners: Christianity and Moral Torment in a Papua New Guinea Society (Ethnographic Studies in Subjectivity, Volume 4)
Joel Robbins (Author)
University of California Press
410 pages
$36.92
In answer to the first question, some have pointed to social disruption—colonialism, capitalism and globalization—as the reason people are abandoning traditional beliefs on a massive scale and turning toward Christianity or other non-local religions. Others argue that there are clear material advantages for those who adopt these widespread religions. Both these explanations falter, however, in the face of ethnographic evidence. First, traditional religions are often very capable of adapting to contemporary capitalism and modern citizenship. Just look at what Shirley MacLaine and New Age spirituality did for old-fashioned animism. Second, people frequently sacrifice material or social benefits in joining non-local religions such as Christianity. There is little economic or political incentive for Chinese citizens to join the house church movement, yet it is thought that 100 million may have done so.
Answers to the second question—how people convert—have been even more elusive and unsatisfying. One popular anthropological answer is that, in truth, traditionalists don’t really convert at all. Instances of “conversion” are largely cosmetic changes in form, while the “real” cultural structures remain unchanged. Christianity, in this view, is a thin veneer; scratch an African Anglican and you make a traditionalist bleed. Another view puts Christianity within the power structures of capitalism and the modern state; Christianity is part of a hegemonic cultural system, sometimes resisted with more or less success, but always seeking to supplant traditional systems through the powerful mechanisms of capitalist institutions, discourse, and social formation. Again, these explanations make sense of some cases, but certainly not all. To paint non-Western Christians as either clever imitators of Christianity or victims of a global hegemony is to dismiss all those who would say that they really have become Christians because they want to, or to tell them that they aren’t really Christians anyway. As Christianity spreads throughout much of the world without the assistance of Western missionaries or other “foreign” agents, these depictions of global Christianity ring increasingly hollow.
The best analyses come from scholars who take Christians seriously enough to actually believe them when they say that they are Christians, while finding theoretically satisfying ways of exploring what that entails in both the why and how of their conversion. One of the best studies to come out of this select group is Joel Robbins’ recent ethnographic work on the Urapmin of Papua New Guinea. Robbins seeks to do more than simply describe Christianity among an out-of-the-way people; he explores the largest questions of cultural change:
How does a traditional culture come to motivate change? How can it direct that change without losing its own shape? How can people who start with traditional motivations quickly come to understand a new cultural logic? What kind of culture does this process of cultural doubling produce? And finally, how do people coordinate the two logics that result, and how is the relation between them made livable?
These are fundamental questions for anyone seeking to understand the rapidly changing, globalizing world of traditional peoples anywhere, but they are particularly pertinent for an understanding of worldwide Christianity.
The case of the Urapmin is brilliantly chosen for addressing these questions. Unlike many of their neighbors in the Papuan highlands, the Urapmin were never directly “missionized.” Starting in the 1950s, several young men from the area traveled to neighboring groups (the larger Telefomin) to learn at the Baptist schools there. But the tiny Urapmin community (numbering 396 people in 1990) was never part of the missionary or colonial project. As “development” passed them by, the Urapmin brought Christianity to themselves, and, by the mid-1970s, there was a viable, though minority, church in the Urapmin village. Then, in 1977, a Pentecostal revival swept through their area and the entire population confessed Christian belief. Again, without “outside” influence, the Urapmin radically reorganized their cosmology, “throwing out” their traditional religion, literally and figuratively, as the bones of ancestors and other traditional fetishes were destroyed.
Knowing that many will see this conversion as a consequence of colonial disruption and intrusions of the state, Robbins spends the first part of the book providing a detailed history of the Urapmin to challenge the view that “social disruption” is the best explanation for their desire to embrace Christianity. Instead, he convincingly argues for a cultural cause, in which the leadership sought to reclaim a position within the ritual structure between groups. With the development of other groups, ripples in the ritual arrangement of the whole area caused the Urapmin to feel “cultural humiliation.” Trying to rectify this unease, the Urapmin turned to a religion that did not promise material or social benefit but did seem to offer a new place in the changing ritual world of the Papuan Highlands and new access to spiritual power. The conversions, which by testimony and observation Robbins argues to be culturally authentic and profound, did bring a form of ritual life that addressed much of the anxiety the Urapmin felt.
As Robbins moves from the “why” of conversion to the “how,” he shows how this Pentecostal Christianity demanded a kind of moral reorganization that was both welcome and unsettling. It is the resulting “moral torment” that provides the focus for the second half of the book. Robbins skillfully draws together a complex but accessible conceptual framework to provide a satisfying theoretical structure for his data. He argues that the translation of Christian moral concepts into the Urapmin world was neither “syncretic” nor seamless, but represented subtle cultural shifts and reinterpretation (“hybridity”) producing a recognizable Christianity and a uniquely Urapmin predicament. For example, while “sin” seemed to map easily onto the category of taboo (i.e., things which cannot be done), the translation became troublesome, providing a sense of internal sinfulness without the traditional ritual world that would have cleansed the community. The promised redemption of Jesus looms large, but sin hangs over the community like a troubling and ominous cloud.
This leads the Christian reader to reflect on her own spiritual world. When Robbins asks, “What, as [the Urapmin] see it, is the nature of their sinfulness, and how have they become convinced that they possess it as a quality?”, he moves us into familiar Christian territory. How do Christians anywhere come to see “sinfulness” as something that we have “inside”? Working through the details of Urapmin Pentecostalism—some of which seem quite mundane, while others (notably the episodes of group Holy Spirit possession called Spirit Diskos) have the exoticism that is the stuff of anthropologists’ dreams—Robbins holds up a view of Christianity that is both familiar and strange, providing what good anthropology always does, a window into our own experience through the lives of “exotic Others.”
The Urapmin also reveal some of the surprising—and, at times, disturbing—ways in which global Christianity is actually global. Through examinations of such religio-cultural features of Urapmin Christianity as their racial ideology (many Urapmin are convinced that once Jesus returns they will become white people) and a fixation on signs of the Second Coming carried through every conceivable form of communication with the “outside” world (what Robbins calls their “everyday millennialism”), we learn how these Christians reconcile a theology of spiritual equality with the reality of economic and political marginality.
I recently taught this book in an upper-division anthropology seminar. Students worked through the theoretical arguments, rich historical detail, and ethnographic narrative to come out, almost universally, in praise of the work. Not only did it supply answers to some big questions of global Pentecostalism and cultural change, many gained insights into their own spiritual lives. When starting the book, I doubt that many Americans would expect to find themselves reflected in the lives of a remote tribal society only recently introduced to many innovations of the 20th century. Yet it was without irony that, by the end of the course, one of my students enthused, “It is easy to see the similarities between the Urapmin and Wheaton College students!”
The explosive growth of Christianity outside the West should be more than a cause for celebration or concern. It should be an opportunity to delve deeply into these processes of change and conversion, drawing on the experiences of others to understand ourselves. Robbins’ book provides a perfect opportunity to do just that.
Brian Howell is assistant professor of anthropology at Wheaton College.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.