Issue 55: Links to amazing stuff.
The Physics of Flips (and Twists)
In case you missed it, here’s one of our favorite moments from the Olympics in Rio this year: 16-year-old American gymnast Laurie Hernandez winking at the judges before her gold-medal-winning floor routine. (Gutsy, right? But, as we can see, her confidence was totally justified.)
In this article from Wired, Rhett Allain offers some insight into the physics that goes into pulling off those incredible stunts. The science is awesome, and the GIFs he includes as illustrations—like footage of an astronaut flipping inside Skylab—are pretty neat, too. And for those who are curious, Allain’s article also helps explain another phenomenon (which has nothing to do with the Olympics): how cats always seem to land on their feet.
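If you’re wondering what the key principle is, it’s conservation of angular momentum: a tumbler’s moment of inertia times her spin rate stays fixed in mid-air, so pulling into a tuck speeds up the rotation. Here’s a back-of-the-envelope sketch in Python—the moment-of-inertia and spin figures are assumed round numbers for illustration, not values from Allain’s article:

```python
# Conservation of angular momentum: L = I * omega is constant in mid-air,
# so shrinking the moment of inertia I (tucking) raises the spin rate omega.
I_layout = 15.0     # assumed moment of inertia, body fully extended (kg*m^2)
I_tuck = 5.0        # assumed moment of inertia, tucked (kg*m^2)
omega_layout = 1.5  # assumed spin rate at takeoff, extended (rev/s)

L = I_layout * omega_layout  # angular momentum is set at takeoff...
omega_tuck = L / I_tuck      # ...so tucking triples the spin rate here
print(f"tucked spin rate: {omega_tuck:.1f} rev/s")  # -> 4.5 rev/s
```

The same bookkeeping underlies the falling cat: with zero net angular momentum, the cat twists its front and back halves against each other, rotating to upright without ever violating the conservation law.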
The Terrible Beauty of Wildfires
Fueled by dry conditions and scorching summer temperatures, wildfires are ravaging California again this year. If you’ve lived in an area where they are among the more common natural disasters, you might know firsthand the “dual nature of the dangerous, untamed, and glorious force of wildfire,” in the words of Behemoth author Dorothy Boorse earlier in this issue. In a collection of gripping photos from Getty photographer David McNew, The Atlantic writer Alan Taylor notes a strange kind of “beauty amid the horrible destruction and efforts to battle these blazes.” The Atlantic also showcased two more photo essays of the awe-inspiring beauty of wildfires: one from southern California, and another from Alberta, Canada.
How Jocks and Mathletes Are Alike
Scan a lineup of Olympic athletes, and one thing is clear: Their bodies often look quite a bit different from those of the rest of us, from “bulging biceps to seven-foot wingspans to a striking paucity of fat,” as Nautilus contributor Sarah Zhang writes. They may also look a lot different from each other, since training for different events requires the strength of different muscles and the development of a unique set of skills. You might say Zhang’s article puts the emphasis on the “mind games” behind the Olympic Games: It shows how athletes’ “brains are just as finely tuned to the mental demands of a particular sport” as their bodies are.
The Optimal American Road Trip, Brought to You by Science
As Ted noted in the editor’s note, it’s the 100th anniversary of the National Park Service. Celebrate it while using up the end of your summer vacation with this ridiculously long but carefully calculated American excursion, courtesy of University of Pennsylvania researcher Randy Olson. The route, which forms a complete circle around the contiguous United States, was put together based on data from the Gurobi TSP solver. (TSP stands for Traveling Salesman Problem, which you might remember from this Behemoth piece by Andy Walsh.) While the route leaves out plenty of the nation’s most stunning parks (Hawaii and Alaska didn’t make it on the itinerary), you’ll still see a ton of neat places, from the Everglades to the Rockies to Death Valley. Olson has alternate routes, too.
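For the curious, the underlying math problem is easy to sketch, if not to solve at scale. Below is a toy brute-force version in Python—purely illustrative, with hypothetical landmarks, made-up coordinates, and straight-line distances standing in for Olson’s real road data and Gurobi’s far more sophisticated solving:

```python
# Toy Traveling Salesman Problem: find the shortest closed loop through a
# few hypothetical landmarks by checking every possible ordering.
from itertools import permutations
from math import dist

landmarks = {  # made-up (x, y) coordinates, for illustration only
    "Everglades": (8, 1),
    "Rockies": (3, 5),
    "Death Valley": (1, 3),
    "Niagara Falls": (7, 6),
}

def loop_length(order):
    """Total straight-line length of a round trip visiting stops in this order."""
    stops = [landmarks[name] for name in order]
    return sum(dist(stops[i], stops[(i + 1) % len(stops)])
               for i in range(len(stops)))

# A loop's starting point doesn't matter, so fix the first stop and try
# every ordering of the rest. Brute force means (n-1)! orderings—fine for
# 4 stops, hopeless for a route with dozens, hence the need for a solver.
names = list(landmarks)
best = min(((names[0],) + rest for rest in permutations(names[1:])),
           key=loop_length)
print(best, round(loop_length(best), 2))
```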
LaVonne Neff
The untold aftermath of a once-famous crime.
One hot July afternoon in 1895, Emily Coombes used the landlady’s key to open a locked bedroom door in an east London row house. Inside she discovered the rotting body of another Emily Coombes, her 37-year-old sister-in-law, sprawled on the bed and crawling with maggots. Beside her was a knife, and a truncheon was on the floor. The stench was overpowering.
The Wicked Boy: The Mystery of a Victorian Child Murderer
Kate Summerscale (Author)
Penguin Press
400 pages
$16.75
There was no question about who killed her. Robert Coombes, aged 13, immediately confessed: first to his aunt, then to a police constable, and finally to a police sergeant. The boy’s statements were cool and consistent. “I did it,” he said to the constable. “My brother Nattie got a hiding for stealing some food, and Ma was going to give me one. So Nattie said that he would stab her, but as he could not do it himself he asked me to do it… . I did it with a knife, which I left on the bed. I covered her up and left her.”
“The most dreadful murder of the century,” screamed the News of the World.
There was some question about how to apportion the blame. Was Robert alone guilty, or was Nattie, aged 12, an accessory to murder? What about John Fox, the unemployed, slow-witted dockworker the boys had recruited to help them get money and food during the days between the murder and its discovery? How much did he know? Was he involved?
And there were myriad questions about what had been going on in Robert’s mind. Why would a boy—neither a trouble-maker nor noticeably troubled—commit matricide? Why did he then spend more than a week and quite a lot of money at cricket matches, the theater, coffee shops, and a fishing expedition? Why did he appear unconcerned and even cheerful at his trial?
Newspaper reporters immediately began questioning the neighbors. Were the boys truthful? Did their mother drink? Was Fox seen at the house before the murder? The boys’ father, a ship’s steward who was sailing to America when the murder occurred, learned the grisly news from a hand-delivered newspaper as his ship approached New York. Reporters avidly rushed aboard. “My elder boy had an abnormally developed brain,” Coombes told the man from The New York Times. Nevertheless, he said to the man from the Pittsburgh Commercial Gazette, “I am positive that John Fox had some hand in the deed.”
Newspapers dubbed the crime the “Plaistow Horror” after the Coombes family’s working-class neighborhood. “The most dreadful murder of the century,” screamed the News of the World. “The most awful and revolting crime that we have ever been called upon to record,” proclaimed the Stratford Express. As the boys and Fox were taken to Holloway gaol, questioned at the coroner’s inquest, and tried at the Old Bailey, reporters noted (or invented) the minutest detail, not only about the events of the day but also about what the accused had been doing for the past several years. The ongoing story was great for circulation: newspapers “were being snapped up as quickly as if they had carried updates on a political crisis, a military battle or an important sporting event,” sniffed the Spectator.
Author Kate Summerscale knows newspapers. With a double-first in English from Oxford and an MA in journalism from Stanford, she edited and wrote for several UK newspapers before turning to writing books. In fact, an obituary she was researching for the Daily Telegraph inspired her first book, The Queen of Whale Cay (1997), a biography of the cross-dressing, speedboat-racing, cheroot-smoking Marion “Joe” Carstairs, self-proclaimed ruler of a small Bahamian kingdom.
Summerscale relied heavily on newspapers in writing her second book, The Suspicions of Mr. Whicher (2008), the story of a real-life English country-house murder in 1860 (it won the prestigious Samuel Johnson Prize for nonfiction); and newspaper accounts of a ballyhooed Victorian divorce trial helped her reconstruct the now-lost romantically explicit diary that caused Mrs. Robinson’s Disgrace (2012). “[I’m] a journalist playing historian, and then trying to convert what I’ve found into something that approximates a novel,” she told the Guardian.
The Wicked Boy, like Summerscale’s previous books, is meticulously researched, with newspapers supplying much of the information referenced in over 40 pages of notes. In her capable hands, the wealth of detail does not weigh down the story but rather adds lively context. The book, though about Robert Coombes, is equally a vivid social history of England at the turn of the 20th century.
Robert, for example, was no Oliver Twist abandoned in Dickensian doom. Rather, born during what was known as the “age of progress,” he was the elder son of a married couple who “aspired to the respectable, relatively well-to-do life to which they had been raised—with good clothes for churchgoing, musical instruments for the children, literary magazines, an exotic bird in a cage—[though] they did not own property and employ servants as their parents had done.” He lived with his parents and brother in a modest but well-kept row house (two rooms up, two rooms down, plus wash house and privy, with gas and running water). He had completed his elementary education, was a voracious reader, played the mandolin, enjoyed cricket in the park near his home, and attended church with his family every week. Earlier in the year of the murder, he had even traveled to America with his seafaring father.
His biggest apparent vice was a common one: his collection of “penny dreadfuls,” sensational pulp fiction for adolescent males. These provided “Britain’s first taste of mass-produced popular culture for the young,” Summerscale writes, and were “often held responsible for the decay of literature and of morality” because of their lurid and often violent tales. And Robert loved violent tales, whether fictional or factual. “If he happened to read of a ghastly or horrible murder,” his father told a reporter from the New York Tribune, “his whole mind appeared to become taken up by it, and nothing could divert him. During these morbid spells he would read all the literature of that character that he could obtain.”
Nobody, however, expected him to commit murder. Was he a wicked boy, or was he—as the defense counsel tried to show—“not in his right mind”? The insanity defense had become popular in English trials, and the court examined all the angles. Various witnesses testified to Robert’s frequent headaches, “cerebral irritation,” trauma from a forceps-assisted delivery, “excitable fits,” “voices in his head,” “disordered nervous system,” and alleged “homicidal mania,” while others characterized him as a good student, “well-spoken,” and “of more than average intelligence.”
Much to the judge’s disgust—he favored hanging—the jury found Robert guilty but insane. At age 13, the boy was consigned to Broadmoor, described by Summerscale as “a fortified criminal lunatic asylum that housed [and still houses] the most notorious killers in Britain.” Broadmoor inmates serve no definite sentence; some remain there for life.
Summerscale’s descriptions of life at Broadmoor are riveting. The asylum was surprisingly modern. “Since [the superintendent] believed that madness was at least partly caused by a person’s surroundings and experience,” she writes, he “tried in Broadmoor to foster an environment conducive to sanity.” Care was taken to keep inmates healthy, comfortable, and clean. Attendants were trained to be gentle and sympathetic. Mechanical restraints were not used, and drugs were used only sparingly. One fellow inmate, William Chester Minor, spent his days researching words for the first edition of the Oxford English Dictionary. Minor was given two rooms; he filled the second room with books.
Robert was eventually released from the asylum. In 1914, he emigrated to Australia, and later that year, when war broke out in Europe, he signed up with the Australian Imperial Force. As always, Summerscale includes rich historical detail:
Over the last three months of 1914 Robert trained in a series of camps in south-eastern Australia, taking part in parades, drills, route marches … for up to sixteen hours a day. He was taught to turn in formation, to stand to attention, to form fours. In the absence of uniform, he and his fellow soldiers drilled in shirtsleeves or singlets, dungarees and white hats. They slept twenty-three to a ten-man tent. The diet, everywhere, was meat stew, bread and jam.
At the end of the year, 12,000 Australians and New Zealanders—Robert among them—were shipped first to a training camp in Egypt and then to the killing fields of Gallipoli. Events during Robert’s harrowing tours of duty, both in Turkey and in Northern Europe, bring the story to an almost satisfactory close. But don’t even think of skipping the final chapter.
The Wicked Boy is not a whodunit: the perpetrator is never in doubt. Why Robert killed his mother is the mystery that Summerscale explores. In “Epilogue: Another Boy,” she herself joins the story, meeting people who knew Robert well and learning some things that put the events of 1895 in an entirely new light. I do not appreciate spoilers, so I will say no more.
LaVonne Neff used to blog about politics and religion but has grown too discouraged about both.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Karl W. Giberson
Isaac Newton with contradictions intact.
Nature and Nature’s laws lay hid in night;
God said “Let Newton be!” and all was light.
—Alexander Pope
Alexander Pope’s homage to Isaac Newton reflected the legacy of the great scientist as seen at the time of his death. The oft-quoted couplet is the epitaph Pope composed for Newton in 1730, less than a decade after Newton died. Pope’s mythologizing of Newton came sooner than one might have expected, especially in an era before mass media made it so easy to secure great fame. Nowadays only in North Korea can a direct connection to divinity be established so quickly.
The Newton Papers: The Strange and True Odyssey of Isaac Newton's Manuscripts
Sarah Dry (Author)
Oxford University Press
256 pages
$20.95
Before his death at the age of 83, Newton was widely celebrated as a singular, towering, transcendent intellect, who pushed our rational capacities into unoccupied precincts long considered unreachable. In the preface to a later edition of his most famous work, Principia Mathematica, Newton’s friend Edmond Halley (of cometary fame) wrote a tribute that ends with these glowing words:
Then ye who now on heavenly nectar fare,
Come celebrate with me in song the name
Of Newton, to the Muses dear; for he
Unlocked the hidden treasuries of Truth:
So richly through his mind had Phoebus cast
The radiance of his own divinity.
Nearer the gods no mortal may approach.
Newton wrapped up the Scientific Revolution, securing the place of science in society once and for all—first in Britain, then Europe, and now, of course, throughout the world. If the future of science seemed at all tenuous in 1633, when an elderly and chastened Galileo knelt on the hard marble floors of the Vatican and recanted his astronomy, such concerns were dispelled by the time Newton, born on Christmas day in the year of Galileo’s death, died less than a century later.
Newton’s first biographers, writing awkwardly at a time when biographies were mainly inspirational stories of religious saints, transformed Newton into the great symbol of the Enlightenment—the embodiment of the triumph of science over religion, of hard-nosed facts over mushy superstition, of reason over revelation.
The intervening centuries have witnessed the expansion of the supposed divide between science and religion until now many simply assume that science, reason, secularism, and even atheism are a package deal—you embrace one and the others follow. In his bombastic “documentary” Evolution vs. God, Ray Comfort sticks a microphone in people’s faces and challenges them to justify their belief in evolution and lack of belief in God. He asks an associate professor of anthropology at UCLA, who admits she does not believe in God, to “name a famous atheist.” She responds “Isaac Newton.”
That a presumably well-informed scientist at a major university could think that Newton was an atheist is astonishing. It is also worrisome, given that viewing Newton as an atheist is a self-serving confusion, like thinking that human activities are not causing climate change, or that Obama was born in Kenya. “Newton as atheist” represents successful propaganda of a particularly pernicious sort—the sort that motivated Sam Harris to complain that Francis Collins couldn’t be an effective scientist because he had religious beliefs, or PZ Myers to describe my endorsement of theistic evolution as “halfway to crazytown.”
A great back story illuminates this odd historical puzzle, however, and is ably told by Sarah Dry in The Newton Papers: The Strange and True Odyssey of Isaac Newton’s Manuscripts. Newton, as most scholars who don’t teach anthropology at UCLA now know, was deeply, obsessively, and idiosyncratically fascinated with religion, especially the Bible. He wrote more pages about the Bible than he did about math and physics, but he published only the latter, leaving posterity to puzzle over the disposition of the former.
When Newton died, his voluminous papers on religion began their long, strange trip. A fraction made it into print right away, but most went into hiding, emerged into the light in the early 20th century, and eventually showed up on the Internet. The implications of the papers have yet to seep into our cultural consciousness, where the historical Newton remains a caricature. And frankly, I doubt that the “real” Newton will ever make it beyond the walls of the academy and the pages of a few highbrow journals like this one.
The last project on which Newton worked, using up his final bits of mental energy, was unrelated to physics or math. Titled The Chronology of Ancient Kingdoms, the 87,000-word piece corrected what Newton thought were egregious errors in the accepted histories of his day. In particular, Newton sought to prove that Solomon was the first king in the world and that his temple was the first temple ever built, with subsequent temples being copies. His proposed dates are wildly off, and his Chronology exerted no influence on our ideas of history.
Newton requested that his voluminous papers be sorted after his death, with anything of value being published in due course. His Chronology was published almost immediately and fetched an impressive $50,000 in today’s dollars. Two other publications followed in the next few years: the final volume of the Principia Mathematica—his masterpiece on gravity and the laws of physics—and Observations upon the Prophecies of Daniel and The Apocalypse of St. John. A 2011 publication of the latter is, as of this writing, selling more briskly than any of my books on Amazon.com. Its sales may pick up over the next few decades as it targets 2060 as the date of the apocalypse.
The rest of Newton’s papers passed into the hands of John Conduitt, a confidant who had married Newton’s niece, Catherine. Conduitt set out to write a biography, which never made it past the research stage. We don’t know what derailed him, but in all likelihood it had much to do with Newton’s theological heresies. Newton was convinced—and angry in his conviction—that the Church fathers had wrongly proclaimed in the 4th century that Jesus was fully divine and belonged in the Trinity. St. Athanasius, said Newton, was a theological scoundrel, manipulating the Church into exalting Jesus beyond what the Bible could support. (Newton’s rejection of the Trinity was ironic, since he had spent his career on the faculty at Trinity College in Cambridge!)
In the decades after Newton’s death, biographies and encyclopedia accounts kept him on a steadily rising pedestal of secular scientific enlightenment, as he evolved in the popular imagination into a symbol of the victory of science over religion. Nevertheless, his unpublished work and voluminous correspondence suggested another side to the great scientist. A curious response to this apparent contradiction appeared in an extended 1822 encyclopedia entry by the French scientist Jean-Baptiste Biot. Informed acquaintances of Newton had long known that he had a mental breakdown around 1693, when he was 50 years old. The breakdown was so severe that for two years he could not even understand his own work. During this period of dementia, Newton wrote a strange letter to John Locke accusing the great philosopher of trying to “embroil me with women” and “sell me an office.” Newton even told Locke it would be “better if he were dead.” Locke’s response indicates that he had absolutely no idea what Newton was talking about. Newton penned a letter of apology retracting the odd claims, and blaming them on persistent insomnia.
Biot confirmed Newton’s dementia and used it to sunder Newton into two incompatible halves—a young clear-headed genius interested only in science, and a demented senior far past his prime, obsessed with theological nonsense. Biot’s bifurcation enraged those most familiar with Newton, who knew better, and it offended the English public at large, who objected to this treatment of their national hero. After all, the elderly Newton had been Master of the Mint in England, a complex task he had executed with remarkable skill. But it satisfied those for whom scientific rationality and religious belief were a zero-sum game—where the former could increase only if the latter were to decrease.
Biot’s two Newtons continued to trouble Newton scholars. Newton’s achievement, of course, stood the test of time and became the paradigm for the many sciences that developed in its wake. As late as 1859, Sir John Herschel would object that Darwin’s theory of evolution by natural selection was disappointingly un-Newtonian, calling it the “Law of Higgledy-Piggledy.” The 19th century also saw the steady growth of anti-clericalism in science, waged in large part as a propaganda campaign emphasizing the “war” between science and religion—a campaign that, as we saw above, would ultimately convince many that Newton must have been an atheist.
In The Newton Papers, Sarah Dry tells the remarkable tale of how Newton’s unpublished works, personal correspondence, and some obscure posthumously published material orbited about the great natural philosopher like comets about the sun—occasionally appearing with great drama, but generally out of sight. The story of the manuscripts reaches a quiet, almost invisible, climax when John Maynard Keynes, the great economist, casually arrives at a Sotheby’s auction in London on July 13, 1936.
The auction house, of which Dry provides a fascinating description, had over 300 collections of Newton’s papers, called “lots.” The bidding on the lots was lackluster. Nobody really knew what was contained in the five million words handwritten by the most famous scientist in history. Nobody had seen all of the papers, and only a handful of people had seen even a few of them—and many of those had conspired to ensure their content remained secret. Newton’s papers were auctioned off for about a half million dollars in today’s currency, a pittance considering that five pages of his notes can bring in more than that today. Keynes bought 38 of the lots. The rest went to professional dealers with limited interest in them.
The influential Keynes began to read and was astonished at the Newton that arose from the centuries-old dusty manuscripts. He was only too eager to reveal this new Newton to the world—Newton the alchemist; Newton the biblical scholar; Newton the theologian; Newton the conspiracy theorist; Newton the pre-modern man clad for three centuries in ill-fitting Enlightenment garb.
Keynes saw Newton’s varied and eclectic interests as springing from the common pre-modern intuition that the world was a unified system, where everything affected everything, and all was an expression of the divine. God, by these lights, had placed clues to understand planetary motion in the heavens; clues to understand the alchemical powers of matter in obscure ancient texts; clues to understand when the world would end in the Bible. The same Newton who waited patiently for new observations of the moon to check against his gravitational calculations pored over apocalyptical literature to figure out when the world would end.
“Newton was not the first of the age of reason,” wrote Keynes in the essay that would expose the troubled complexity of the great scientist to the world. “He was the last of the magicians.” Unlike Biot, with his “two Newtons,” Keynes was convinced that all of Newton’s work sprang from the same genius, that the “wayward” investigations into alchemy, the Trinity, and prophecy were not the product of a mind in decline but arose from the same wellspring that gave us the law of universal gravitation.
Dry notes that Keynes’ corrective pendulum may have swung too far in moving Newton from “clear-headed rationalist” to “wild-eyed magician.” Nevertheless, the true Newton was, at least for those with any inclination to look, visible at last. In all of his wild-eyed madness, cold rational analysis, theological obscurantism, and social eccentricity, he now resides in that most public of places—the Internet.
Singular figures like Newton—or Jefferson, or Einstein, or Lincoln—rarely show up with all their contradictions intact. They are historical shape-shifters, too useful as caricatures, easily conscripted into armies on either side of the same battle. Newton spent two centuries as the poster boy for enlightened rationalism, his religious beliefs ignored or dismissed as the product of senility. Now he is recalled as a famous atheist by those fighting that battle. But he is also invoked as an anti-evolutionary creationist and Bible-believing scientist by those who find that version of Newton more attractive. The Institute for Creation Research (ICR) has long used Newton to fight evolution: “Newton was not unacquainted with the atheistic evolutionary theory on origins,” we read on their website.
He was convinced against it and wrote: “Blind metaphysical necessity, which is certainly the same always and every where, could produce no variety of things. All that diversity of natural things which we find suited to different times and places could arise from nothing but the ideas and will of a Being, necessarily existing.” (http://www.icr.org/article/newton/)
Nowhere on the ICR website do we find any acknowledgment that Newton was formally a heretic who denied the full divinity of Jesus.
Hardly a semester goes by that I do not teach a class at Stonehill College covering Newton and the Scientific Revolution in some way. After a week of watching students’ eyes glaze over with discussions of elliptical orbits and gravitational forces, I conclude with a discussion of what Newton has meant to the world. I share many of the accolades that have been heaped on him by everyone from his friend Halley to Stephen Hawking. I conclude the discussion with a winsome image of my three-year-old nephew, Aaron, playing on a beach. The image contains this extraordinary quote from Newton:
I do not know what I may appear to the world, but to myself I seem to have been only like a boy playing on the seashore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.
Karl W. Giberson is professor of science & religion at Stonehill College. The author of many books, he served most recently as editor of Abraham’s Dice: Chance and Providence in the Monotheistic Traditions, published earlier this year by Oxford University Press.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Interview by Todd C. Ream
A conversation with Mark Noll.
Perhaps no living Christian intellectual defies the standard measures of one’s legacy more than Mark Noll, who retired last spring as Francis A. McAnaney Professor of History at the University of Notre Dame. Noll began his career at Trinity College (Deerfield, Illinois) in 1975, leaving three years later for his alma mater, Wheaton College, where he would spend the next 27 years. In 2006, Noll left Wheaton for Notre Dame.
One could try to quantify Noll’s legacy by tallying up the number of articles and books he has written. Doing so would not tell the whole story and, given their sheer volume, may exceed my mathematical skills. Another way to gauge Noll’s legacy is by recounting the awards he received over the course of his career, including the National Humanities Medal in 2006. Perhaps the best way, however, is to turn to colleagues who have come to know him first and foremost as a friend.
In thinking about his former teacher, Timothy Larsen, the McManis Professor of Christian Thought at Wheaton College (the chair Noll once held), said, “I cannot think of another scholar who has reached such extraordinary heights of recognition in the secular academic [circles] who has also given so much of his time and attention to serving the church and speaking explicitly to a lay Christian audience. I also cannot think of another scholar who has been as highly honored and successful in the secular academy, who has also given of himself so freely, generously, thoughtfully, and attentively to his students, his colleagues, and even to random strangers soliciting his advice and input.”
Recognition of Noll’s influence is felt by senior and junior scholars alike. Bruce Kuklick, the Nichols Professor of American History at the University of Pennsylvania, acknowledged that “Mark Noll was a great gift to me. He is an outstanding historian, and the class of the historians of faith. Most of all he has been a model for me because he lives the exhortation to walk humbly with one’s God.”
Molly Worthen, assistant professor of history at the University of North Carolina at Chapel Hill, offered that “More than anything, I think of Mark as a mentor—one of the most generous spirits in the academy, unfailingly openhanded with his time and insight, even to bumbling graduate students who show up out of nowhere to seek his advice (I should know; I was one). Mark has taught us all that while the historian’s vocation may sometimes feel like a monastic enterprise, it is really a collaborative ‘community of saints.’”
Perhaps Thomas S. Kidd, Distinguished Professor of History and the associate director of the Institute for Studies of Religion at Baylor University, summed up these sentiments best when he contended: “Mark Noll has set the pace for what a forthright and humble evangelical witness in academia should look like. He is the epitome of a ‘Christian scholar,’ with equal weight on both of the words in that oft-used term.”
In an effort to capture some of why Noll has become so important to so many people, I sat down with him in his office in Notre Dame’s Decio Hall just prior to the start of his last semester of collegiate teaching.
At what point did you realize you possessed an abiding interest in history?
I read history from the time I started to read and then probably read as much history during my career as an English major in college as I did English. But I’m old enough now that when I studied English, the task of setting literary works in historical context was a central task—that was before the new historicism, and before deconstruction. My interest in literature, reading, and writing was both literary and historical. As long as I have been able to read I have been interested in what happened in the past.
Is there a particular figure (or event) from your childhood that you remember reading about and finding more captivating than others?
I remember going to the library in probably the second, third, or fourth grade, and reading all the sports books I could find. But then reading about Babe Ruth, Ted Williams, or Ty Cobb seamlessly transitioned into reading about D-Day, Abraham Lincoln, the founding of the United States, World War I, and World War II. I really can’t remember a time when reading like that was not just something that I did.
At what point did you realize history was your life’s calling?
Certainly at some stage I knew I wanted to make my living dealing with words. Lecturing and writing articles and books thus came along pretty naturally. I applied to do literary studies in graduate school and was accepted at some graduate programs. I went on to study comparative literature at the University of Iowa, but it became clearer as my own sense of Christian faith developed that I was most interested in things that the Protestant Reformers did and most interested in the historical context of literary questions. When I finished the MA in comparative literature at Iowa, I thought I should study church history. I wanted to understand the faith, and it seemed like history was the obvious way to help me do that. And then you can get into graduate school and, lo and behold, you find out you can get paid for work on such material. I’m sure I could have changed at some point if doors had closed, but by following inertia I ended up being a historian.
What teachers proved to have the greatest influence on your life?
I was glad that I was recently able to publish a memoir, From Every Tribe and Nation: A Historian’s Discovery of the Global Christian Story, with Baker Books, because in that volume I could express public thanks for several teachers who left a deep impression. At Wheaton as an undergraduate, I had remarkable teachers. In that memoir, I mentioned Arthur Holmes and Clyde Kilby—both of whom were inspiring as teachers of subject matter but also inspiring as teachers of undergraduates. Frank Bellinger in the political science department was a practitioner, a member of the DuPage County board, and was for me a real eye-opener, since evangelicals in the 1960s and early ’70s tended to be suspicious of political life (though soon thereafter we’d be in it head over heels). Frank took it in stride and was a good instructor. Bob Warburton in the English department was serious about literature—a really good critic. When I wrote a senior honors paper for him on the novels of Thomas Hardy as tragedy, his patience with a novice went far beyond the call of duty.
During the year I did comparative literature at the University of Iowa, a professor of German, John A. A. TerHaar, was the right kind of instructor for someone who was trying to read Schelling and other Romantic writers. I found out later he was an active supporter of the Christian group on campus and that made it even more meaningful.
At Trinity Seminary, David Wells was a major influence—a theologically minded person, very orthodox in his own Christian faith, who yet understood the importance of historical investigation. It was either my first or second year there when George Marsden came to do a visiting year at Trinity Seminary. George, David Wells, and I met for coffee almost weekly during that year, which in many ways served me as graduate school. Here were two really sharp people, two very seriously committed Christian people, who, as examples more than as preceptors, showed not just the value of the intellectual life, but some of the … I don’t know what you would call it … perhaps the simple joy of intellectual life. I had known George for some time, but not too much really, and I had taken courses from David. Their combined influence was life-transforming. I don’t think I have ever had better models or exemplars for what the Christian academic life might mean. With such teachers I have been genuinely blessed.
What scholars proved most influential in terms of how you understand and do history?
I remain deeply impressed with the first serious books I read on the Reformation—Roland Bainton’s biography of Martin Luther, A. G. Dickens’ history of the English Reformation, Jaroslav Pelikan’s writing on the Reformation and much else. When I shifted to early American history, I was really fortunate to come on the scene when Perry Miller’s star was in its ascendancy. Eventually I did come to see some of the weaknesses and blind spots in his work, but I have never ceased to be impressed by his dedication to the importance of ideas and to the importance of understanding these ideas in their cultural contexts.
I also feel extremely fortunate to work in a field where really good scholars have done great work on topics of interest to me. They were not necessarily Christian believers, though in some cases they were. Yet they all shared a real sensitivity to Christian convictions and the relationship of Christianity to broader social settings. I would put in that group Edmund Morgan, Gordon Wood, David Hall, Daniel Walker Howe, Henry May, and, in his own way, Richard Bushman. (I think Richard Bushman is two historians—one when he is writing about Mormon topics, and one when he is not writing about Mormon topics—both estimable but in different ways.) But these historians, some I have met and some I have never met, were to me great models of outstanding historical scholars. I then felt providentially fortunate to have peers like George and then Nathan Hatch, Harry Stout, Grant Wacker. They are terrific historians. The chance to work with them has been another great blessing.
Would you walk me through the process you undertake in developing a book project?
Most of the books I have written have been requested or suggested by other people. I probably have written no more than three or four books that were my own idea. Usually there is a sense of a problem I’m interested in exploring. With Princeton and the Republic, it was about how believing, confessional, theologically oriented people got along when the landscape shifted so dramatically from being a colony of the British Empire to being an independent nation.
I’m working on what I hope will be this set of histories on the Bible in American public life. One book came out this fall. If I can do the next one in two or three years, I’d be really grateful. But that project started after a conference at Trinity College in 1977 or 1978. We had a little conference of young evangelical historians I was getting to know, and somebody suggested a project examining the uses to which people put the Scriptures—at a time when the evangelical world was being tied in knots disputing the character of the Bible. It was not surprising that at a luncheon with historians somebody said, “Wouldn’t it be great to study how the Bible is being used rather than simply what people say the Bible is supposed to do?” Everybody immediately said, “Yes.” Nathan Hatch was there and said, “I think I know Bob Lynn at the Lilly Endowment well enough that he might give us a small grant.” Bob then did. We had a conference at Wheaton in 1979, “The Bible in American Culture,” and a book of essays from that conference came out in 1982. I’m still trying to work on that larger project to this day.
Of all the historical figures you have explored, which ones have proved to be the most intriguing or inspirational?
I have certainly been privileged to do research on a lot of interesting people. Working on the College of New Jersey or Princeton, I found John Witherspoon, the president, and then his successors, Samuel Stanhope Smith and Ashbel Green, intriguing for very different reasons. Like many people, I have butted up against the impenetrable personality of Abraham Lincoln and very much enjoyed doing that. I don’t think I ever want to write a book on Lincoln because we have such good scholars like Allen Guelzo who have done such splendid work on Lincoln. But I very much enjoyed trying to factor Lincoln into broader subjects. He played a fairly significant role in America’s God and will figure in the 19th-century coverage of The Bible in American Public Life. He is so intriguing and so complex. Abraham Lincoln would be right at the top of my list.
The question about the most inspirational character is also a good one. I very much enjoyed working with my friend Carolyn Nystrom on a book profiling significant non-Western Christian leaders from the fairly recent past. We thought it was a good idea for Western Christians to learn about them, though given the paltry sales of the book that resulted, book buyers did not think it was as good an idea as we did! But work on that project introduced me to some really admirable people.
One who sticks in my mind is Vedanayagam Samuel Azariah, the first Indian Anglican bishop, who was a second-generation Christian trained at Anglican schools. He was active in the YMCA and became a very well-rounded bishop in the southern part of India. He was an educator as well as an evangelist, concerned about various forms of human development. His wife helped with women’s education and reaching out to the Dalit, or untouchable, population. She was active in public life and a genuinely admirable person.
I’ve learned to think hard about historical figures from my colleague here at Notre Dame, Brad Gregory. He has helped me come to a greater appreciation for people willing to be killed for their faith, but also in some ways to kill for their faith.
In what ways has the study of American religious history changed over the course of your career?
The greatest change would be a shift in the center of gravity from church history to American religious history. When I was younger, the main names in our field were people like Sydney Ahlstrom, Robert Handy, Winthrop Hudson, John Wilson at Princeton, and Martin Marty—all of whom were really good historians, coming from church institutions, and all with a deep interest in theology. They were exemplary, I thought, particularly because they were able to thoughtfully relate historical incidents in the church to external, political, religious, and social contexts. To this day, I am very pleased to call myself a historian of Christianity.
The field at large has been enriched by shifting toward “religion in American society”—people coming from sociology, anthropology, sometimes politics, sometimes religious studies. Their focus has been more on what religious developments in the US mean over and against US history. I have benefited much from that kind of scholarship.
You’ve written on The Civil War as a Theological Crisis and God and Race in American Politics. Are there historical insights that might be of use to us amid racial conflict today—perhaps ignored or distorted in much public discourse?
These books are substantially products of the larger effort I mentioned earlier, to study the history of the Bible in American public life. In that history, most of the really difficult questions have concerned Scripture and slavery or Scripture and race. That study has in some ways been extremely disconcerting, for as a Bible-believer myself it is painful to view deeply engrained cultural convictions eviscerate apparently straightforward biblical teaching. (If others whose faith is deeper than my own have, for example, used Scripture to condemn interracial marriage or have ignored the Golden Rule when considering the legitimacy of slavery, where do my own cultural convictions keep me from seeing and following the way of Christ?) After so much study, I should have come up with more than two conclusions, but these are what I have: First, despite abuses of many kinds in putting Scripture to use, the biblical message of liberation in Christ has never faltered, especially among people for whom no one else cares. Second, danger lurks when I move from trusting the biblical message that brought me reconciliation with God to thinking I can definitively proclaim God’s will for other people or the entire society. This danger need not necessarily lead to disaster, though it has far too often resulted in that outcome—and from both the conservative Right and the progressive Left.
If you were to offer a concise definition of evangelicalism, how might it differ from the working definition you started out with?
What has become clearer over time is what I think I was working with intuitively early on in my career. It was useful very early on to do a book of essays called The Gospel in America, where we tried to give a kind of thematic account of several aspects of evangelical life. At the time we were talking about groups that shared a certain code of beliefs and practices and shared also a certain historical background.
Subsequently, a great contribution came from David Bebbington and his full definition of evangelicalism, which works quite well for many purposes: 1) the Bible is supreme authority; 2) the cross is significant and foundational to theology; 3) conversion to Christ is essential; and 4) activity in living the Christian life, particularly with regard to the practice of evangelism. I still think, however, that any kind of programmatic, doctrinal, or behavioral definition of evangelicalism needs to have some sense of historical development.
At the present, and I’m delighted and witness it here at Notre Dame, there is a small minority of Roman Catholics who have all of the evangelical characteristics. Are they evangelicals? Well, from one angle, yes. Do they share the history from the Protestant Reformation to pietism through the evangelical revivals through the 19th century’s more democratic mission enterprises? No, they don’t. These days I’m not really too interested in trying to define things precisely, except as definitions are necessary for people to define their research projects.
In what ways, if at all, has your understanding of Christian scholarship changed over the course of your career?
I don’t think much has changed, though my thinking has deepened over the course of a life in the academy. Certainly, beginning with David Wells and George Marsden, I had a very strong sense that there needed to be a serious commitment to the Christian faith itself, but also a discerning commitment to broader intellectual standards. That can be a tightrope that is difficult to walk. The element that has been added over time is an awareness that Christian scholars need to truly live as Christians in all aspects of their lives. That insight was present maybe in vestigial forms early on. But from our great friend from Canada, George Rawlyk, I think all of us associated with him learned about how important life was alongside scholarship. George was an outgoing Slav instead of a diffident Teuton and very much concerned about the personal lives, including the personal spiritual lives, of his historian friends. It was clear from what he said and what other people reflected on his career, after he had passed away in 1995, that he was a great Christian scholar partly because of the significance of his works, but just as much because of the significance of his life.
What do you have planned after your formal retirement in May?
Like so many people who “retire,” I hope to keep on doing many of the things I’ve always done, with fewer restrictions on my time. I would like to finish the project on the Bible in American public life into the early 20th century. I also have ideas for projects having to do with hymnody and Canada, and several plots for novels are bouncing around in my mind. My wife would say, and I would agree with her, that I’ve got to do something with my books because I’ve got too many of them to move to a smaller accommodation.
I am also very concerned about the placement of PhD students. I feel extremely privileged to have worked with excellent students. But I am also aware of the terrific strain in finding regular employment. So I write a lot of letters of recommendation these days, and I expect to be doing that for quite a few years into the future as the Lord gives me health and keeps my mind sane. I’m very much committed to that enterprise and will not mind at all requests to write letters so long as I can handle a keyboard and have something rational to commend about students.
Todd C. Ream is professor of higher education at Taylor University and research fellow with Baylor University’s Institute for Studies of Religion. His most recent book, Restoring the Soul of the University (with Perry L. Glanzer and Nathan Alleman), is scheduled for release in February by InterVarsity Press.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Naomi Haynes
Church conflict in Papua New Guinea.
That Protestants are prone to schism is no surprise—it’s in the name, for heaven’s sake. However, just what causes schism is not always obvious, apocryphal stories of congregations that split over the color of the new fellowship hall carpet notwithstanding. The driving forces behind schism are a complex tangle of historical, relational, theological, and practical concerns. This means that schism is a kind of knot that social science is especially good at unravelling. Toward this end, Courtney Handman offers us a sophisticated look at denominational conflict in a place many of us would not necessarily think to look for it: the Waria Valley in northern Papua New Guinea (PNG).
Critical Christianity: Translation and Denominational Conflict in Papua New Guinea (Volume 16) (The Anthropology of Christianity)
Courtney Handman (Author)
University of California Press
328 pages
$29.95
Despite its remoteness, the Waria Valley is a highly appropriate location for this study because of PNG’s position in both missionary and anthropological thought. Over the last several decades, PNG has attracted scores of missionaries and anthropologists, thanks primarily to its historic cultural isolation and staggering linguistic diversity—more than 800 languages spoken by a population of fewer than eight million people. The result has been a tremendous amount of ethnography, a great deal of which has explored the consequences of Christian missionization. The most prominent argument to emerge from anthropological treatments of Melanesian Christianity is that conversion requires individualism. Perhaps because this line of analysis has been so central, anthropologists have largely ignored the study of Christian social groups, whether churches or denominations, and it is this lacuna that Handman seeks to address. She argues that schism is a natural part of Christian, and especially Protestant, life: “not the failure of Christianity, but its very practice.”
Handman argues that schism is a natural part of Christian, and especially Protestant, life: “not the failure of Christianity, but its very practice.”
The first Christian missionaries to the Waria Valley were Lutherans, whose greatest challenge was the numerous languages spoken in the region. Their response was to develop a local lingua franca that would enable leaders and laypeople to move throughout the region with greater ease. The mission church would therefore be a church for all people. Just as Lutheran missionaries were not concerned with protecting the boundaries of “every little tribe,” as one missionary brief quoted by Handman puts it, they were likewise not all that interested in maintaining local cultural traditions, much less employing them in Christian worship. While, as we will see, this approach differed markedly from that of subsequent missionaries, it solidified the church as the key Christian group, the social form through which Christian religious life was experienced.
The Lutheran mission was succeeded by Bible translators affiliated with the Summer Institute of Linguistics (SIL), whose work in the Waria Valley was shaped by a unique form of what anthropologists call “language ideology.” SIL was founded on the belief that each person needed to hear the Christian message in his or her mother tongue or “heart language.” The latter is an affectively charged term in which the core of a person’s being—the heart—is inextricably tied to a particular set of linguistic conventions, apart from which understanding will likely, perhaps necessarily, be superficial. SIL translators have historically relied on what in their organization is known as the “dynamic equivalence” model. In this approach, the goal of translation is not to match the text word-for-word but rather to get the message across through an extensive use of local idioms. So for example, a translator might have Jesus proclaiming that he was the yam or taro of life, thereby communicating the message of his sufficiency without the hassle of having to teach people what bread is and what role it had in the 1st-century Palestinian diet. These twin emphases on heart language and dynamic equivalence guided the American missionary who translated the New Testament into Guhu-Samane, a language spoken by 13,000 people across the Waria Valley. The volume was dedicated in 1975. Two years later there was a charismatic revival that eventually resulted in schism with the Lutheran church and the formation of a new congregation that Handman calls New Life.
Whether or not this turn of events would have been viewed as a success in the eyes of the SIL missionaries, who by that time had left PNG, Handman does not say, but she does note that the revival fit within the evangelistic ethos of SIL, such as it is. On principle, SIL does not plant churches; their only objective is to make the Bible available in a community’s heart language, teach people to read it, and encourage them to think through the implications of the text for their community. Put differently, what translation was meant to accomplish was a rigorous and ongoing process of critique through which Guhu-Samane Christians would decide, based on Scripture, which elements of their culture needed to change if they were to follow Jesus. The paradox of SIL’s dynamic equivalence model was therefore that it placed a premium on local cultural knowledge, such as idioms, in order to encourage people to call that knowledge into question. For members of the revival church, New Life, the schism had followed from a critical engagement with the missionary-established Lutheran church, which had not assigned the same value to language and culture that SIL did. In contrast to the Lutheran model, New Life church services featured traditional drums and readings from the Guhu-Samane New Testament, rather than the old hymns and the missionary-established lingua franca.
Once this process of Christian cultural critique was set in motion, it was impossible to stop. In time, New Life came under the same type of scrutiny that the Lutheran church had faced. Amidst criticisms that the group had been too accommodating to traditional and revivalist practices, a third congregation, which Handman calls Reformed Gospel, broke away from the original schismatic denomination. While New Life church members refused to use any text apart from the original Guhu-Samane New Testament, Reformed Gospel leaders used this text alongside Bibles in English and Tok Pisin, one of the official languages of Papua New Guinea. While leadership in New Life followed only from charismatic authority, Reformed Gospel pastors were required to have formal training before entering the ministry. Reformed Gospel services employed acoustic guitars rather than the traditional drums used at New Life. In short, the process of critical engagement facilitated by translation produced a church that was, as Handman notes, “always reforming” in the way that Luther himself advocated.
What are the implications of Handman’s analysis? First I should say what I do not think she has done (or intended to do), namely criticize the efforts of SIL, or the Lutheran mission for that matter. The historical enmity between anthropologists and missionaries is something of a trope in both circles, though in my observation the feelings of animosity are not what they were a generation or two ago. In any case, it is not the role of the anthropologist to make judgments about what has happened in a particular place as much as it is to figure out how it happened. What has made Protestantism, at least in the Waria Valley, so schismatic? In this task Handman succeeds, and we come away from her rich and detailed analysis with a much greater appreciation for the complex social processes that undergird schism, which I have only begun to outline here.
This leads me to what is perhaps the most important accomplishment of Handman’s study. Just as the focus on the individualizing effects of conversion kept anthropologists of Christianity from paying sufficient attention to Christian communities, the tendency among social scientists to view churches as indistinguishable from other social groups resulted in a kind of reductivism when it came to understanding schism. If Christian groups were really just the same as other groups, then the things that caused Christian groups to break apart were no different from the things that caused other groups to fracture. In other words, a schism was just another word for an interpersonal conflict or a political struggle that happened to take place among Christians. Handman argues convincingly that this is not the case. Christian groups are not like other groups because they are sites of mediation, communities that are meant to present Jesus and allow his presence to be experienced. If Christian groups are indeed different from other social groups, then schism is also a different form of social breakdown. Rather than mere politics, schism results from debates about how to mediate the presence of God—how he ought to be best represented and experienced. We should be careful, then, before we resort to common explanations for why churches split, whether the ego of a new leader, the politics of a church budget, or yes, even personal preferences about the new fellowship hall carpet. Schism may be about all these things, but it is not reducible to them. And, if Handman is right in her treatment of Protestantism, it is a necessary fact of life for those who would represent Jesus to the world.
Naomi Haynes is a Chancellor’s Fellow in the Department of Social Anthropology at the University of Edinburgh. Her monograph Moving by the Spirit: Pentecostal Social Life on the Zambian Copperbelt is forthcoming from University of California Press. She is a co-curator for anthrocybib, the Anthropology of Christianity Bibliographic Blog.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Peter T. Chattaway
The many lives of “Ben-Hur.”
- View Issue
- Subscribe
- Give a Gift
- Archives
Today, Ben-Hur is best known as the 1959 film that grossed millions of dollars, won eleven Academy Awards and, coming three years after The Ten Commandments, cemented Charlton Heston’s position as the personification of 1950s Bible epics. But the story of Judah Ben-Hur, and his quest for revenge against the childhood friend who betrayed him, had been immensely popular in a variety of media for almost eighty years before that film came out, and there have been several new tellings of the story in the decades since—culminating this year in a 3D action epic from the producers of The Bible, the Oscar-winning writer of 12 Years a Slave, and the director of such films as Abraham Lincoln: Vampire Hunter.
Bigger than Ben-Hur: The Book, Its Adaptations, and Their Audiences (Television and Popular Culture)
Edited by Barbara Ryan and Milette Shamir
Syracuse University Press
304 pages
$27.50
The new film’s rather eccentric pedigree speaks to how this story has appealed to people on very different levels—some sacred, some unabashedly secular—ever since Lew Wallace’s novel was published in 1880. As I write, the release of the new film is still several weeks away, so I cannot comment on it in detail here, but I just finished reading Bigger than Ben-Hur: The Book, Its Adaptations & Their Audiences, a collection of essays that look at everything from the original novel to the miniseries that aired on Canadian television in 2010. It has been fascinating to see how Wallace’s story has evolved over the years and how the people who adapt it have often sold it as pure spectacle to a mass audience on the one hand while making a special pitch to Christians on the other hand. The story might morph over time, but the marketing has been remarkably consistent.
The parallels come through most clearly in Howard Miller’s essay on the hugely successful stage adaptation that opened on Broadway in 1899 and toured the country until 1920. Consider: the play was marketed heavily to a religious audience that, until then, had avoided and even condemned the theater,[1] while the new film has actively courted the same “untapped audience” that made The Passion of the Christ a big hit 12 years ago; publicists for the play promised that it would function as a “sermon” for its audience, while publicists for the film have talked up its potential as “a powerful evangelism tool”;[2] and renowned evangelists like Billy Sunday endorsed the play in newspaper ads, while the new film has a website filled with endorsements from pastors and parachurch ministry leaders.[3] (One key difference: the publicists for the play took pains to distance it from the Catholic Passion Play tradition, but the new film has been promoted across church boundaries at a time when many evangelicals have openly embraced Mel Gibson’s cinematic contribution to that very same tradition.)
Miller’s essay is one of the book’s highlights, but there are some other fascinating essays here that situate Wallace’s novel within the various historical trends of its time. For example, Jefferson J. A. Gatrall explores how the novel came out near the peak of the Sunday school movement and was featured prominently in its libraries and curricula (back when Sunday schools were still considered a “movement”), while Hilton Obenzinger looks at how the novel’s focus on one Jew’s efforts to return to his homeland anticipated the rise of Zionism by just a few years.
Eran Shalev sets the book’s treatment of the ancient Roman Empire against the evolving attitude toward ancient Rome within American society. He notes that the American founding fathers revered the Roman Republic and sought to emulate its political institutions, but by Wallace’s time people had come to focus on the tyranny of the Roman Empire even as America itself was on the verge of imperial adventures of its own in the Pacific and the Caribbean. Something of the “good Roman” remains in Wallace’s novel, most notably in the character of Quintus Arrius, the nobleman who adopts Judah as his son,[4] but it is the injustices of the Roman Empire that drive the story, from the torture and imprisonment of Judah’s family and friends to the crucifixion of Jesus himself.
Co-editor Milette Shamir makes a similarly compelling case that Wallace’s novel combines two kinds of narratives, both of which were very popular at that time: a future-oriented “progressive” plot, in which Judah achieves maturity as he moves out of Judea and into the heart of Roman society, and a nostalgic “regressive” plot, in which the hero returns home and tries to reunite with his mother and sister. In the end, these two genres are synthesized when Judah uses his wealth to support the budding Christian community in Rome—a plot element that has been left out of every Ben-Hur movie to date—and the reader is reminded that Rome itself, which once represented the future, now lies in the past, while religion, which had been associated with the past, is revealed to be the way of the future.
One thing that comes through fairly clearly in Bigger than Ben-Hur is that Ben-Hur has always been a visual telling of the Jesus story. Wallace’s book was aimed primarily at Protestants, many of whom were uncomfortable with visual depictions of Jesus, but it offered detailed descriptions of the Holy Land (which Wallace would not see for himself until after the book was published) and it even described the appearance of Jesus himself. The 1899 stage adaptation substituted a beam of light for Jesus but was nevertheless filled with visual spectacle, from an onstage chariot race, in which real horses ran on treadmills, to miraculous portents in the heavens achieved through theatrical lighting effects. The transition to film, with its capacity for even larger, bloodier, and more realistic sea battles and chariot races, was inevitable. Audiences of a certain era may have been reluctant to see an actor play the Son of God, but they certainly wanted to feel what it was like to live in Jesus’ world.
The essays that focus on the films, alas, are more of a mixed bag. The first feature-length adaptation of Wallace’s story was produced in 1925, and Thomas Slater draws our attention to the fact that the woman who initiated the development of that film was one of many who lost their influence in Hollywood as the studios entered the corporate mainstream. It’s an important piece of film history, but the excerpts from June Mathis’s script don’t necessarily persuade us that her version of the film would have been better than the version that got made. Richard Walsh makes some insightful points regarding the same film’s treatment of Judah Ben-Hur as both a Christ-figure and a Judas-figure, but he tethers these points to assumptions about supersessionism that not all readers will share.
And Ina Rae Hark provides the obligatory consideration of the homoerotic subtext, intended or otherwise, of the 1959 film and its half-naked galley-slave Heston. Gore Vidal, who worked on the screenplay, famously claimed that Stephen Boyd was privately instructed to play Messala as though he were a spurned ex-lover of Judah’s, and frankly, it’s impossible to see Boyd’s performance in any other light once you’ve heard that story. There is also ample room to discuss the sexual subtexts of ancient gladiator epics and the like in general. But Hark stretches ideas like these to the breaking point when she claims that the various father figures who encounter Judah throughout the story all represent “departures from heteronormativity” because they are never seen in the company of wives or lovers.
The strangest essay by far is a piece by co-editor Barbara Ryan, who argues at length that John Buchan’s novel Sick Heart River (1941) was in some way a response to the Ben-Hur phenomenon, though she has little to go on beyond the fact that a character in Buchan’s novel is named Lew.
Bigger than Ben-Hur concludes with two chapters that take stock of the past and look to the future. In one, Jon Solomon charts the phenomenal popularity of Ben-Hur by listing all the various products and companies that have been named after Wallace’s novel, with or without his estate’s permission,[5] going back to the 1880s, from Ben Hur Cigars and Ben Hur Bicycles to Ben Hur Whiskey and Ben Hur Sewing Machines.[6] And in the other, David Mayer expresses the hope that future films will pay more attention to aspects of the novel that have often been overlooked, such as the central role that wealth plays in the story and a subplot involving a temptress who flirts with Judah before turning to Messala. Whether either of those wishes is fulfilled in this year’s film, I cannot say, but no doubt there will be more opportunities to tell this story down the road.
Peter T. Chattaway is a freelance film critic and blogger at Patheos.com with a special interest in Bible movies. He lives with his family in Surrey, B.C.
1. Miller notes that Christians who had never been to a play before refrained from applauding, because that simply wasn’t how one behaved during a sermon—and on at least one occasion, the regular theatergoers in the audience tried to “rally” the churchgoers “into a vigorous encore.”
2. http://www.benhursimulcast.com/
4. Interestingly, it looks like the new film will be the first major adaptation of the story that drops the Quintus Arrius subplot altogether. Do we find it harder to imagine a “good Roman” now?
5. The first screen adaptation, an unauthorized short film produced in 1907, sparked a copyright-infringement lawsuit that set the precedent for all subsequent film adaptations of published works.
6. Curiously, Bigger than Ben-Hur acknowledges the existence of an animated version of Ben-Hur produced in 1989 but never mentions the animated film that Charlton Heston himself produced in 2003.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Collin Hansen
Who measures up?
- View Issue
- Subscribe
- Give a Gift
- Archives
What makes a good president? It’s a question of perennial interest—and one that may seem particularly relevant (or ironic) just after the Republicans and the Democrats have held their national conventions. Historians don’t have access to infallible criteria. But at least they have the perspective of time. They can judge reality against campaign promises. In the middle of an emotionally charged campaign, two reasonably intelligent voters can believe that one candidate will make America great again and the other person’s choice will send the country into irreversible decline. Depending on whom you ask, you can be sure that either Donald Trump or Hillary Clinton would ruin America as president. But only historians claim the authority to rank someone alongside James Buchanan and Warren Harding.
The American President: From Teddy Roosevelt to Bill Clinton
William E. Leuchtenburg
Oxford University Press
904 pages
$11.94
How, then, do historians judge the 44 US presidents? One interested observer of this debate said, “The mark of a leader is whether he gives history a nudge.” In the rankings, this comment certainly applies to such remarkable executives as Thomas Jefferson, Abraham Lincoln, and Franklin Delano Roosevelt. But it also applies to the man who made the statement: Richard Nixon. He nudged the United States toward relations with Communist China, and also nudged the American population toward cynicism when Watergate revealed his true character.
More conservative observers might urge great presidents to affirm, “The history of liberty is the history of the limitation of governmental power, not the increase of it.” This statement sounds like one of the Founding Fathers, perhaps George Washington or John Adams. Or maybe that most photogenic president, Ronald Reagan, offered these words to support his case that government is the source of our problems. Actually, it was Woodrow Wilson, who by the end of his second term advocated, unsuccessfully in his own country, for a League of Nations.
History, in the eyes of those who write about it, belongs to the doers, the leaders who nudge the world toward progress. Such a standard would seem to benefit liberals, who earn extra credit for expanded constitutional rights and government programs. But the great conservative presidents, such as Lincoln and Reagan, nudged history themselves as commanders-in-chief during times of war, whether civil or cold. Indeed, everyone near the top of most rankings has expanded government or fought a war or both.
Even so, when viewing the presidency through the historian’s eyes, you can see why the country tends to move in a liberal direction. The modern media campaign demands a purpose, a plan. How do you run for an office you don’t intend to use? Why command an army if you don’t want it to fight? And once a program has been started, it’s nearly impossible to take it away. When presidents launch wars, they tend to unleash liberal forces far beyond anyone’s control. Nothing is less conducive to conserving the status quo or budgetary constraint than mass military mobilization. For example, Wilson’s government, re-elected on a promise of peace, spent more money in 19 months on World War I than the entire federal government had spent on everything combined until then. But that conflict pales in comparison to World War II, which cost more than everything the government spent between 1789 and 1940.
When assessing a president, journalists need copy. They’ll write either about your accomplishments or your scandals. And when historians read those contemporary accounts, they need narratives. They need action. That’s why Calvin Coolidge belongs to the era of the silent film, relegated to the dusty attics of forgotten history. He lacked both personality and policy. To rank among the greats and earn a PBS miniseries from Ken Burns, you need both.
William E. Leuchtenburg, author of The American President: From Teddy Roosevelt to Bill Clinton, is professor emeritus at the University of North Carolina at Chapel Hill. He is an expert by virtue of both productivity and longevity. He was born when Harding was president, and he has written or edited eight books about FDR and his era. He seeks in this book to advance his thesis that the modern, active presidency did not begin with FDR but with his older cousin, Theodore Roosevelt, who succeeded the assassinated William McKinley in 1901. So this book that spans the entire 20th century hinges not on an arbitrary date but on a dramatic transformation in what we as Americans want and demand from our presidents.
No doubt the book’s length will turn away many readers. That’s a shame, because Leuchtenburg actually writes with flair. He includes no footnotes, and when you consider how much happened in a century that ranged from the Great War and the Great Depression to Watergate and Whitewater, not even an account of this length allows for more than a brisk tour in which Leuchtenburg offers largely orthodox interpretations of his widely studied subjects. Still, The American President provides perspective only available from the best historians, who help us remember the past as it was and not as nostalgia would have it. Rather than memorializing the 1950s as an interval of peace and prosperity, for example, Leuchtenburg describes the decade as the “most terrifying the country has ever experienced.” It’s hard for those who didn’t live through the 1950s to remember it as a time dominated by the looming threat of nuclear annihilation.
The president most closely associated with the 1950s, Dwight Eisenhower, doesn’t usually rate highly among historians, though his stock has been slowly rising. After all, you don’t get credit for what doesn’t happen. Under Eisenhower the country enjoyed few new rights and few new government programs, at least when compared to the bookend 1940s and 1960s. And the great victor of World War II did not lead the nation into World War III after the Korean conflict ended. The last president born in the 19th century, Eisenhower reflected some of that era’s assumptions, when presidents did not regard it as their responsibility to provide for the unemployed during economic downturns or engineer other social reforms. The electorate did not expect them to do so. Until FDR in the mid-20th century, few Americans bothered to write the White House, because they did not expect such a personal connection with the president. Teddy Roosevelt was ahead of his time when he said, “Better the occasional faults of a government living in the spirit of charity than the consistent omissions of a government frozen in the ice of its own indifference.”
Perhaps the greatest insights from Leuchtenburg come in his masterful if reductionist sense for deploying illustration to capture the mark of a man. Describing the vanity of LBJ, he writes, “When he visited the Vatican, the pope gave him a Renaissance painting. In return, Johnson presented His Holiness with a bust of himself.” Explaining the unprecedented influence of First Lady Hillary Clinton, he tells a story of when Chelsea Clinton needed parental permission at school to take a pill. The administrator offered to call her mother. “Oh, don’t call my mom,” Clinton said. “She’s too busy. Call my dad.”
Bill Clinton was only the last in a long string of 20th-century presidents whose second terms were undone by scandal. Leuchtenburg’s explanation for this discouraging phenomenon comes earlier in words about Teddy Roosevelt: “If Roosevelt showed the value of a strong president who battled for the people, he also demonstrated the danger of unbridled executive power.” By the end of the 20th century, the presidency hardly resembled what the Founding Fathers intended. “Though the Founding Fathers had taken pains to avoid creating another George III when they framed the presidency,” Leuchtenburg writes, “the office in the twentieth century took on aspects of the majesty, even divinity, that doth hedge a king.”
From Roosevelt to Clinton we see that the more power we give the president, the less we trust him. The higher our expectations rise, the greater our disappointments. We’re not content with a chief executive. We want someone who moves the emotional register of the nation with the optimistic confidence of Ronald Reagan and the heartfelt empathy of Bill Clinton. So our presidents get credit for things they don’t do and blame for things they don’t control. We want our presidents strong like Clinton when they stare down the likes of Newt Gingrich in conflict over a government shutdown. But they can’t handle their power, as when amid that high-pressure shutdown Clinton began his infamous affair with Monica Lewinsky.
Hence we vacillate between unrealistic hopes and abject cynicism, as did one American who gave up voting in 1976. “I’m a three-time loser,” the erstwhile voter said. “In 1964 I voted for the peace candidate—Johnson—and got war. In ’68 I voted for the law-and-order candidate—Nixon—and got crime. In ’72 I voted for Nixon again, and we got Watergate.”
So what makes a good president? Maybe we need to change our expectations. “We give the President more work than a man can do, more responsibility than a man should take, more pressure than a man can bear,” the novelist John Steinbeck said. “We wear him out, use him up, eat him up … . He is ours and we exercise the right to destroy him.”
Collin Hansen is the editorial director for The Gospel Coalition, an editor at large of Christianity Today, and the author most recently of Blind Spots: Becoming a Courageous, Compassionate, and Commissioned Church (Crossway).
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Alister Chapman
A robustly counter-revisionist history of England.
- View Issue
- Subscribe
- Give a Gift
- Archives
How do you write the history of your country? According to Thomas Hobbes, “A writer of history ought, in his writings, to be a foreigner, without a country, living under his own law only.” That advice has proved hard to follow. Many date the emergence of the modern discipline of history to Leopold von Ranke, who taught at the University of Berlin in the 19th century. Scholars remember him as a champion of primary source analysis, but as Royal Historiographer to the Prussian Court his contemporary fame rested in large part on the patriotic nature of his work. The first major historian of the United States, George Bancroft, studied with Ranke in Germany, and his work furthered ideas of American exceptionalism. In Victorian England, Whig historians such as Thomas Babington Macaulay stroked English self-approval.
By contrast, historians in the 20th century were more likely to cut their countries down to size. Charles Beard of Columbia University made the Founding Fathers look grimy. The leading advocate of German responsibility for World War I was a German historian, Fritz Fischer. British historians excoriated imperialism.
There have always been exceptions. If you wish to reach a wider audience, you probably need to be one of them. David McCullough, Doris Kearns Goodwin, and Stephen Ambrose have sold hundreds of thousands of copies of books that are more Bancroft than Beard. Britain’s Niall Ferguson has made money applauding empire. Academia’s doyens usually frown—whether out of envy, politics, or concern for the integrity of the guild is hard to say.
Those who believe that God made humanity in his image and that this image is now defaced will expect to find good and bad in any country they study. The best histories make us want to cheer and weep. On this measure, Robert Tombs’ new history of England is a success.
Given that Tombs is a professor of history at Cambridge, it is surprising that his book is as positive about England as it is. English academic historians have long been wary of the idea of English exceptionalism. That Tombs would write a book that calls on his compatriots to take more pride in their country thus requires explanation.
One reason may be that Tombs has made his living as a historian of France. This helps him to see aspects of his country’s story that scholars who focus primarily on England often miss. For example, he remarks that the English state has not suffered a major collapse for a thousand years. There have been no major improvements brought about through violence for more than eight hundred. Tombs has written two books on the Paris Commune of 1871, and he is relieved to find no parallels at home.
The bullishness of Tombs’ account can also be explained by when he wrote it. It’s a good time to celebrate England. For longer than anyone can remember, the English have often referred to Britain simply as England, much to the annoyance of the Scottish and Welsh. In response, historians have tried harder to tell the history of Britain as a whole. Against that background, Tombs’ decision to write the story of England reflects a particular moment in the island’s history. As Westminster has devolved power to national assemblies in Edinburgh and Cardiff, Britain now has the anomaly of parliaments for everyone except the English, while representatives from Scotland and Wales continue to vote on legislation for England. English lips have begun to tremble in mild indignation. The assertion that the English have their own history and that it is a good one is therefore appealing.
The most important reason why Tombs decided to write an apologia for England, however, is that for decades many English people have been convinced that their country is in decline. They still like their country, but it seems a shadow of its former self. Tombs sets out to address this concern by organizing his narrative around the ways in which the English have told their history over the past thousand years. He chooses four themes in particular: the aftermath of the Norman Conquest; the Whig history of progress; the history of empire; and the widespread belief since 1945 that England is a nation in decline. To understand his book, it helps to take these in reverse order.
Most English people know that their country, along with their neighbors in the United Kingdom, used to rule the waves and no longer does. European competitors experienced faster economic growth after World War II and caught up with England. The empire disappeared. Commentators lamented the change and assigned blame. Some suggested that Britain was becoming a shabby, third-rate country.
Tombs has no time for this hand-wringing. Little England was always going to suffer relative decline once larger countries industrialized, but economically the country stands where one would expect in the ranks of its European neighbors. He scarcely laments the decline of manufacturing, buying the idea that England’s move to a service economy is a healthy sign (although he notes that when in 1945 Germany offered England the design for the Volkswagen Beetle as war reparations, industrialists turned it down as inferior to the English-built Morris Minor). Life expectancy has increased along with income. Those who have lived in England since 1945 “have been among the luckiest people in the existence of Homo sapiens, rich, peaceful and healthy.” Tombs wants his compatriots to cheer up.
This desire explains why he is less critical of empire than most historians. He recognizes evils, such as the violent suppression of the Mau Mau uprising in Kenya in the 1950s, but he also cites counter-examples. Many leading politicians were unenthusiastic about imperial ventures. The government turned down requests from people in Ethiopia, Uruguay, Sarawak, and Morocco to join the empire. There was a domestic humanitarian lobby (which included many evangelicals) that campaigned for England to use its power for good in the world—as when, in 1850, the Royal Navy forcibly entered Brazilian ports to destroy slave ships. For all the crimes (Tombs’ word), English hegemony also fostered global communication, trade, travel, parliamentary government, and the rule of law.
Moving on to the third of Tombs’ themes: even if you haven’t heard of the Whig interpretation of history, you probably know it. It is the story of progress and of freedom, and of how the English-speaking peoples have led the world toward political liberty and happiness. Initially, it told the tale of English defense of its representative government against threats from absolutist France and from those Tory kings, Charles I and II. In its English version, it celebrated the powers of the English parliament and provided justification for England’s growing global influence in the 19th century. Transported to the United States, it made its way into textbooks that featured America as the 20th-century standard bearer for liberty and hope in the world. On both sides of the Atlantic, it was a good story but too simple, and the resulting histories and self-perceptions were warped. But Tombs still maintains that England’s longstanding commitment to the rule of law and to parliamentary government has been good for the country and the world it helped shape.
For Tombs, these two principles lie at the heart of English identity. He traces them back to the centuries before the Norman victory of 1066—the most traumatic event in English history. Four thousand English nobles died in the Battle of Hastings or shortly after. Norman soldiers pulled down Anglo-Saxon buildings and built forts. English, which at the time had more copyists than Italian did during the Renaissance, was suppressed. As late as the 18th century, Tom Paine understood the island’s history as an ongoing fight by the English people to rid themselves of the “Norman yoke.”
For all the changes, however, England and English survived. They did so because they had a long, distinct identity that dated back to the Venerable Bede’s Ecclesiastical History of the English People in 731 and beyond. Magna Carta was not new in 1215: it was a restatement of what the English had long believed and practiced, namely that people should participate in politics through courts, tithes, juries, and parliaments. Continuity transcended rupture after 1066. And inasmuch as most would agree that what continued was good, that, for Tombs, is cause for thanksgiving, if not pride.
Tombs provides not only narrative and reflection on how the English have understood their story but also delightful detail. Hippos once swam in what we now call the Thames. The Duke of Norfolk at the time of the Reformation declared that “he had never read the Scriptures, nor ever would, and it was merry in England before this New Learning came up.” Queen Anne frustrated her bishops by losing the paperwork they needed to prosecute heretics. London sociability in the 18th century included a club for the ugly. Taxation in England in the 1770s was twenty-six times higher than in its colonies in North America. During the French Revolution, one English radical was hauled to the local pub and forced to buy 329 gallons of beer. During the Blitz, respondents to a Gallup poll said the weather depressed them more than the bombing. And, because he brings his history right up to the very recent past: after the financial crisis of September 2008, 734 second-hand Ferraris went on sale in the City of London in a week.
For Tombs, there is little question that England’s history and its contribution to history have been positive. He wonders why his compatriots appear reluctant to boast even about the vital part they played in the defeat of Nazism, especially in 1940-41. In the end, Tombs outdoes the Whig historians: his story is not so much one of progress but one of an ancient system of rights, justice, and political participation that endured (despite many follies, domestic and foreign, along the way) and that has been a great gift to the world.
Do the English really need Tombs’ encouragement as much as he supposes? Are they as down on their country as he claims? The recent Brexit decision suggests an enduring national pride. Those who campaigned to leave the European Union argued that Britain was a great country that could be even greater if it stood alone, and millions of voters agreed. (Interestingly, the English seem much less enthusiastic about breaking up with Scotland.) Or perhaps the vote to leave fits with Tombs’ narrative, reflecting a desire to address a prevalent sense of national malaise. Either way, Tombs’ book and this year’s elections in both the UK and the US raise the question of how teachers, politicians, and others can nurture just the right amount of national loyalty—enough to foster gratitude and civic-mindedness but not chauvinism and its almost-always offspring, war.
“We owe respect to the past,” Tombs concludes, “as we do to other societies today, not for the sake of our predecessors, who are beyond caring, but for our own sake. Treating the past as grotesque and inferior is the attitude of the tourist who can see nothing ‘Abroad’ but dirt and bad plumbing. Recognizing the qualities of past societies with resources a fraction of ours may at least deflate our own complacency, and remind us that we have little excuse for our present social and political failings.” A little preachy, perhaps. But given that historians often use the past to advocate for aspects of contemporary righteousness, a sermon on respect makes for a nice change.
Alister Chapman teaches at Westmont College in Santa Barbara. He is a British/English expat.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Michael Ledger-Lomas
Heroic failure often masked ruthlessness.
- View Issue
- Subscribe
- Give a Gift
- Archives
Half overgrown village, half London commuter town, renowned for golf, well-kept inns and Arts and Crafts houses, Chislehurst in Kent is an unlikely place to find a shrine to a dead hero. On the fringes of its common is a large granite Celtic cross, erected, as its inscription says, “In memory of the Prince Imperial and in sorrow at his death … by the dwellers of Chislehurst 1880.” At the Roman Catholic Church is a mortuary chapel that borrows its Renaissance flourishes from the Château d’Amboise in the Loire. Inside, the Prince Imperial rests on his sarcophagus like a medieval knight, hands clasped in prayer, arms around his sword. He was a failure. Kicked out of France when his father Napoleon III was overthrown in 1870 and suddenly an heir to nothing, he needed an occupation and trained as a British army officer. Keen to see action, he enrolled as a “special observer” in the army that entered Zululand in the spring of 1879 to salvage a disastrous campaign against King Cetshwayo. On the morning of 1 June, the Prince’s reconnaissance party was surprised by a party of Zulu warriors. Bungling his getaway, he was speared to death. Yet if he was a failure, he was a celebrated failure. Prints of his demise, showing him waving an ineffectual revolver at muscled spearmen, made him into a noble victim; his friend Queen Victoria treasured a prayer he had penned in his last days, commissioned a painting of the discovery of his corpse, and came to Chislehurst to watch as he was consigned to the grave.
The Prince Imperial does not feature in Stephanie Barczewski’s Heroic Failure and the British, but he perfectly exemplifies the phenomenon it describes: the tendency of imperial cultures to lavish sorrow on their agents rather than their colonized victims. So strong was the desire of the British during the Victorian apogee of their Empire to feel it was they who bore the costs of its expansion that they could shed a tear even for the luckless son of their former rival. For Barczewski, the reason why the British distracted themselves with stories of those who died serving Britain was simple. Along with many recent historians, she argues that the ideology and rhetoric of the Empire was libertarian. The British liked to feel that they were spreading freedom, prosperity, and Christianity around the world and preferred the bits of the Empire—Canada, Australia, New Zealand and the Cape—whose colonization could be presented, misleadingly enough, as expansion by freedom-loving Anglo-Saxons into waste spaces. Yet in Asia and Africa the expansion of Empire depended on wars that toppled kingdoms and broke up societies and was run more by force than persuasion. The historian Bernard Porter has not found much support for his thesis that domestic Britons were “absent-minded imperialists” relatively oblivious to the Empire’s existence. But if not absent-minded, they were perhaps blinkered imperialists: the more ruthless and effective imperial expansion became, the more the British fixed their gaze on their own dead.
Failure had tended to be incidental rather than integral to early 19th-century heroism. It is possible to find brave or even suicidal heroes who died doing their duty: men like Sir John Moore, who died retreating from Napoleon’s army at Corunna in 1809, or the hellraising Rollo Gillespie, shot in the chest while storming an impregnable Gurkha fortress during the 1814 Anglo-Nepalese War. Here, though, the duty, not the death, was the point: the public celebrated fearlessness, not failure. It was not on the battlefield but in undiscovered territories that the shift toward the hero as a man of sorrows emerged. For explorers to be “alibis for Empire,” they should suffer rather than inflict suffering. The explorer Mungo Park went down fighting on his final voyage down the River Niger, shooting dead many natives from his boat before running out of ammunition and drowning in making his escape. The posthumous tributes omitted this detail, just as accounts of the missionary David Livingstone’s demise skated over the fact that his crusade against slavery entailed the deaths of Africans. While Livingstone’s discoverer Henry Morton Stanley was frank about his own violence, it was important that nothing should obscure Livingstone’s presentation as an exemplar of “self-yielding in these all too selfish days.” In a gripping chapter on polar exploration, Barczewski shows how self-sacrifice became all-important. Parry, Ross, and most famously Sir John Franklin all went looking for the Northwest Passage and did so with the latest technology, Franklin setting out in steam-powered icebreaker ships. Yet for the public their failed ventures became medieval exercises in frostbitten “self-immolation,” not scientific coups. What counted was not whether they found the Passage but the character they displayed in the hunt, which kept Britons manly in the enervating intervals between Continental wars. Just as well, for Franklin’s miscalculations cost the lives of 130 people. The Admiralty ended up spending around £600,000 searching for their remains, suggesting that for empires the line between dreams and strategic priorities is a thin one. If the public did not want to know about Park’s muskets, they were just as unimpressed by suggestions that some of Franklin’s crew had eaten their dead mates, suggestions which Charles Dickens angrily dismissed as the lies of Eskimo informants.
It was not far from the useless sacrifice of polar explorers to the “chivalry” displayed by the cavalrymen sent on the suicidal charge of the Light Brigade. The adversaries in the Crimean War were evenly matched, but Barczewski goes on to suggest that the theme of noble sacrifice became strongest where there was a stark asymmetry between British forces and their opponents. Most of the “last stands” toasted in late Victorian poetry and painting happened in conflicts that ended in bloody victories for the British. The massacre of British troops at Isandlwana and the successful stand at Rorke’s Drift both occurred in the Anglo-Zulu War that killed the Prince Imperial. Fifteen Britons were killed at Rorke’s Drift but 350 Zulus, with the survivors massacring another 500 wounded Zulus after the battle. In the battle which avenged Isandlwana, 26 Britons were killed but 2,000 Zulus, with fired-up Britons massacring 500 wounded. The same was true years later in the Sudan. Britons wept over General Gordon, the modern Galahad who in George Joy’s famous painting offered himself in Christlike fashion to the spears of the dervishes who overran Khartoum. No tears were shed for the 11,000 Sudanese killed and 16,000 wounded by British machine guns at the battle of Omdurman, which avenged Gordon. Major-General Kitchener considered that they had got a “thorough dusting.” If heroic failure often masked ruthlessness, then by the early 20th century it articulated fears of imperial overreach. The heroism of Captain Scott’s death in Antarctica disguised British embarrassment at coming second in the race for the South Pole and could not wholly prevent anguished discussion about whether Britons were physically prepared to compete with the agents of other powers.
If the failures Barczewski covers are familiar, then she has plotted them onto a provocative thesis about how the moral imagination worked in the British Empire, which is perhaps how it works in all empires. Even seasoned readers will encounter many arresting details, such as Queen Victoria’s meeting with Bobbie, the mongrel dog who was one of the few survivors of a regiment massacred at Maiwand in Afghanistan. Dogs recur in the story of Scott: sentimental Britons could not forgive his Norwegian rival Amundsen for eating his huskies when they had expended their usefulness. At an English banquet in his honor, Amundsen glowered while Lord Curzon proposed a toast to his dogs.
It is surprising that the book does not reflect much on Christianity’s role in all this. Despite mentioning funerary monuments in churches and the odd sermon, Barczewski misses the bigger point that this was a culture programmed by its faith to celebrate a hero whose peaceable triumph was his death. It is Christianity’s influence over an extensive domestic public which perhaps explains why the voices of the soldiers who eagerly slaughtered Zulus and Afghans were seldom heard at home. If the British clearly wanted to see their heroes as representing a peaceable, almost defensive Empire, rather than one red in tooth and claw, that yearning invites further explanation. Perhaps also the book could have done with fewer summaries of the “boilerplate” tributes that heroes attracted and more controversy. Heroes were controversial figures because so were the enterprises in which they engaged. Queen Victoria for instance wanted a monument to the Prince Imperial put up in Westminster Abbey, to the fury of radicals who regarded him as a “tyrant’s cub” and threatened to dynamite it if she went ahead. There were always pockets of resistance or indifference in British society to imperialism, even when cloaked in “alibis”; Barczewski might have been more precise about who believed the propaganda. Nonetheless, as a rollicking account of a moral quirk in British culture, one which can affect any powerful and aggressive nation, this book deserves a wide readership.
Michael Ledger-Lomas is lecturer in the History of Christianity in Britain at King’s College London.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.
Craig Mattson
On podcasting.
- View Issue
- Subscribe
- Give a Gift
- Archives
Jonah Weiner has argued in Slate that the podcast is friendlier than the book. We experience the super-abundance of audio blogs, he explains, differently from our bedside book stacks. Tomes evoke a Sisyphean obligation. “Podcasts, somehow, are different. I just tallied up the 53 unheard episodes sitting in my podcast library, and, doing so, felt none of the guilt, dread, or FOMO that, say, my clogged Instapaper queue can inspire in me.” Instead, he smiles at the thought of “53 ways to feel like I’m being productive with my time while I’m wasting time.”[1]
Weiner’s comparison of podcasts and books recalls Jim Gaffigan’s comparison of movies and books: “You ever talk about a movie with someone that read the book? They’re always so condescending. ‘Ah, the book was much better than the movie.’ Oh really? What I enjoyed about the movie: no reading.”[2] But it’s not simply the laboriousness of literacy that makes books a weariness to the flesh. Walter Ong has argued that print culture conveys the wearying inescapability of human presence to—humans. “The presence of man to himself over the face of the globe is basically a presence of the word,” Ong observes, adding that we feel this word most in our media.[3] “Today lettered words thrust themselves up like shouting monuments not only in the cities but along country highways as well, and are inscribed in the heavens by skywriters.” In contrast, humans in oral society were “scattered in tiny isolated populations which had lost memory of each other’s existence.” Books and magazines and newspapers contribute to what Ong calls “hominizing” the creation, that is, our “entering into possession of the world, filling it up, becoming the active focus of more and more of its operations.” I am myself a bibliophile. I spend every birthday gift card in the codex section of Amazon. But even I have to admit that books bear our smudge and share our smell in a way that the podcast, with its dear freshness deep down things, does not.
Gerard Manley Hopkins’ imagery, so tactile and olfactory, contrasts sharply with Ong’s notions of how we privilege the visual above our other bodily senses. This modern sensorium has long worried scholars like Ong for reasons he detailed in The Presence of the Word and many other books. The visual spatializes and silences the creation, reducing it to a set of objects, which encourages us to construe the world methodically (usually with lots of charts and arrows) not conversationally (with tropes and stories). Consequently, the visual disinclines us from listening to each other or to God. I’d like to ask in this essay whether the rise of the podcast in the past few years suggests a sensorial shift—and what this shift might mean for our public lives and theologies. Digital media tend to evoke anxiety in us, as suggested by the rise of anti-distraction discourse. (You’ve no doubt seen those tersely titled books, Distracted, Rapt, Mediated, Focus, and The Shallows.) But does the viral popularity of podcasts like Sarah Koenig’s Serial suggest a reorganizing sensorium today? Does it evoke new possibilities for shared concentration and mutual involvement?
Weiner thinks so. Podcasts are as intimate as that subtlest and most wearable of technologies, the earbud. Screens induce guilt (I shouldn’t be looking at this while I’m driving), whereas earbuds evoke a more decorous mode of diffuse attention. What makes digital audio so acceptable in public life, despite its tendency towards alienating individualism? (I mean, given the free choice between what’s in your headphones and what’s in the world around you, you’ll go for the earbud 87 percent of the time, right?) Weiner traces the attractiveness of podcasting to “the form’s special sense of intimacy and even its erotics: the dulcet phonemes of Jad Abumrad, issuing into us from earbuds snugly nestled into our heads.” In Ong’s phenomenology, these erotics entail one person non-violently but insistently sharing interiority with another. Not surprisingly, then, podcast fans like Weiner see in mediated audio near wondrous possibilities for empathy. Noting our inclination to entrust ourselves to people’s voices, Weiner argues that podcasts are both consoling (not least when we find ourselves dislocated by travel) and capable of pushing “us outside of a blinkered comfort zone” so that we grasp other people’s experiences “with a different kind of nuance and immediacy than print can muster.” That voice in your ear, so sexily immanent, inclines you towards empathy in a way that the detached, carefully curated voices of broadcast journalism, announcing the latest Boko Haram atrocity, do not. All Things Considered’s Robert Siegel and his flawless Standard American dialect may occasionally draw our sympathy. But This American Life‘s Chana Joffe-Walt’s verbal clutter and vocal fry woo us into identification and empathy.
Well, sometimes they do. Voices can also evoke unthinking dislike, as I learned in my straight-out-of-college job as a Gulf Coast radio show host. I remember cringing every time the light flared on the in-studio phone. Great. Yet another call about my unsanctified voice. As a Michigan transplant to Florida, I was asked to eradicate my nasality. Why? Perhaps because that particular quality was too heady, too intellectualist, by regional standards. And then, in a quirk of providence, I migrated back to the Midwest in order to teach speech in Chicago, the City with Big Nasal Cavities.
But I’ve had an easy time of it in comparison with the women hosts of my favorite podcast, DoubleX Gabfest, a show I often listen to while vacuuming the house on Saturday mornings. Any misplaced pride in my dudely household-choring is frequently chastened by these witty progressives—most frequently, Hanna Rosin, Noreen Malone, and June Thomas. Perhaps my favorite was their “Broadscasting” episode, where they discuss the snark that women’s voices frequently evoke in listeners.[4] In the show, Invisibilia podcaster Alix Spiegel notes that listeners frequently despise her voice, and Rosin observes that contempt for your voice sometimes entails contempt for you, too. A person’s voice somehow lets us be dismissive of her or him in a way that the face does not.
What was disconcerting about this conversation, at least to me, was its implication that although podcasts can be warmly, empathically, even erotically personal, they can also be strangely impersonal. I shouldn’t have been surprised. This combination of sensuousness and distance actually characterizes the voices of the most popular podcast hosts. Re-listen to Serial or This American Life or Explain Things to Me or Radio Lab NYC, and you’ll hear voices at once intimate and detached. They are sensually close to listeners, indwelling earbuds with quirky, idiosyncratic voices, informally paced, breathily um-cluttered, tentatively inflected. They seem to have bypassed any interpersonal gap, having achieved a kind of immediate access to the listener’s psyche. But even so, their uncanny proximity is offset by being in earshot of things that are not at all about the listener. Attentive to the incongruous, the weird, the singular, these podcasts decenter the listener in the cosmos. And they do so in voices pitched flatly, sounding at once wryly curious and perpetually unsurprised.
There is a gratifyingly broad range of vocalities in the podcast universe—from Ira Glass’ comfortable nasality to Hanna Rosin’s brassy bluntness to Jad Abumrad’s elvish quickness. But for all this idiosyncrasy, there is a disconcerting uniformity. As Weiner notes, if you’re a podcasting addict, you’re probably not listening to voices shaped by the stridency of the ideological right, or to ethnically diverse voices. You’re probably listening to the voices of white progressiveness, a point that not only Weiner but also Jay Caspian Kang has described as a shortcoming in the politics of podcasting.[5] But Kang’s critique of “White Reporter Privilege,” well taken as it is, locates the ideology of white liberalism while missing something even stranger: the speedy mobility and shareability of the podcasting sound. I think rhetorician Eric Jenkins might call this podcasting’s modality. “Modes are collective, emergent phenomena,” he explains, “that express the circulating energies of contemporary existence rather than re-presenting the interests of particular rhetors.”[6] What may be most important about podcasting’s current modality is not the pale progressiveness of podcasters so much as the spreadability of podcasting’s intimately impersonal affect.
But talking about modalities is conspicuously difficult. We might try the language of Michael Polanyi’s tacit dimension to account for how audiobloggers indwell an indeterminate but materially participatable frame. We might talk about Pierre Bourdieu’s notion of habitus. Or we might borrow Gilles Deleuze’s distinction between abstract and virtual realities, distinguishing between the static ideological particularities of a podcaster’s own identity and the affective energies that propel podcasting’s innovation. But however we name the real but only barely palpable comportments that discipline and drive the podcasting sound, we are clearly dealing with a different sort of materialism than the Enlightenment naturalism that Ong criticized.
Ong’s discussion of aurality is most effective as a critique of a Newtonian worldview that spatializes the world, and the persons within it, reducing everything to objects. But this critique is less helpful for explaining the weirdly diffuse collectivity of podcasting, its speediness, placelessness, pathos, and tweet-ableness. As Korean podcasts such as SeoulPodcast or African American podcasts such as Denzel Washington Is the Greatest Actor of All Time Period gain popularity, they will no doubt alter the ideological place of the podcast. But at least for the for-hearable future, their voices will have to chime with this detachedly erotic modality in order to alter it.
This impersonal intimacy raises questions about the ethics of empathy that Weiner hopes could improve liberal democracy and which Ong, in another, more theologically informed way, hoped would prepare late moderns for hearing the divine. I think I agree that podcasts can invite identification, which in Kenneth Burke’s formulation means enabling a kind of rhetorical consubstantiation, a sharing of substance between speakers and hearers. But the podcast’s dependence on voice inevitably adds to this intimacy an impersonality arising from the simple social fact that some substance is unshareable. Sounds, in particular, confront us with qualities that we simply can’t abide, even when we can’t explain our distaste.
This recalcitrance came through to me with special vividness in a DoubleX Gabfest discussion of online misogyny. Here’s my transcription of Noreen Malone’s question about the seeming impossibility of consubstantiation with online trolls:
Previously before Gamergate and, you know, Reddit and 4chan occasionally pop up in the news as enacting horrible things and I sorta read those news stories, but they are not a part of my everyday life. Is this responsible of me to just say, “OK, These are not my people,” and I can sort of compartmentalize them over here and say, “Well these men are horrible. It doesn’t affect me”? Or does it affect me?[7]
The question is a hard one because it locates the limits of everybody’s favorite words: identification, empathy, communication, dialogue. Podcasting does achieve empathy—but only against a backdrop of massive inter-tribal aggressions. In response to this predicament, Ong’s observation (again in The Presence of the Word) of “the curious irenic tone” in contemporary rhetorical exchange sounds quaint some 50 years on. He hopes “that through the reorganization of man’s worldview, enforced by development in the media, some kind of new prospect of peace and understanding is indeed dawning.” Yet today, even in a world in which dialogue is touted constantly, the sad tribalizing of our tongues makes us push our earbuds in deeper.
Perhaps this is an occasion for what Stephen Webb calls “the acoustemology of the church.”[8] How, in other words, does God sound? If Ong is right to say that our technology has fostered a kind of religious deafness, the church is called to build publicly accessible acoustic spaces for hearing the divine. The Gospel narratives offer a starting point, especially the accounts of the post-resurrection Christ. The Jesus biographers record conversations with tear-blinded Mary in the cemetery garden, with Peter and the bleary-eyed fishermen on the beach, with Cleopas and company on the road to Emmaus. In each case, Jesus speaks intimately, but in a voice so strange that no intimates recognize him. We might even say that Jesus offers the two disciples on the road to Emmaus a first-century version of the intimately impersonal podcast. There’s an abruptness to his appearance, and an unfamiliarity—not unlike your experience putting on a pair of headphones to join a vigorous program in progress. With a brisk, avid proficiency, Jesus explains the whole of available Scripture, making the truth about God’s life for the world manageably, accessibly coherent. And like podcast listeners today, the disciples, when talking about it later, described a richly affective involvement with Jesus’ speech: “Were not our hearts burning within us while he was talking to us on the road, while he was opening the Scriptures to us?”
But as we mull over what to say to cyber-misanthropists, perhaps the resurrected voice of Christ offers less of an ethics than an eschatology of voice. Chana Joffe-Walt noted in that “Broadscasting” gabfest referenced earlier that the great challenge of audio performance (in a world of voice-critics at best and voice-haters at worst) is to sound like yourself. But the paschal mystery means that our selves have not fully arrived yet. If we could hear ourselves as we will be, might we not be struck by a resurrection strangeness in our voices? And how might we conjugate our future voices presently using the vocal folds we do not yet fully have, the larynx whose length we have not felt, articulating the sound no ear has yet heard?
Think of the strangeness and the familiarity of Jesus standing in that Emmaus dining room. Perhaps as he shifts the supper loaf from hand to hand, he repeats his upper room words: “I still have many things to say to you, but you cannot bear them right now.” And then he tears the bread and is gone. But the ensuing absence is not a withdrawal, an escape, like the ones many of us are tempted to make in encounters with despoilers of democracy or sinners in church. This going leaves no grievous sense of gap. Instead, like every communicant at Eucharist, Cleopas and his friend take up the bread, take up the Christ. This Word they hold is a strange materiality, neither Newtonian nor Deleuzian, but a mediation sometimes audible as word, sometimes graspable as sacrament. Here is real presence in an acoustics of peace, no less than intimate, quite more than personal.
Craig Mattson is professor and department chair of Communication Arts at Trinity Christian College in Palos Heights, Illinois, where he also serves as Honors Program Director.
1. “What Makes Podcasts So Addictive and Pleasurable?” Slate (December 14, 2014). www.slate.com/articles/arts/ten_years_in_your_ears/2014/12/what_makes_podcasts_so_addictive_and_pleasurable.html.
2. www.cc.com/jokes/tg9h39/stand-up-jim-gaffigan–jim-gaffigan–the-book-vs–the-movie.
3. The Presence of the Word: Some Prolegomena for Cultural and Religious History (Yale Univ. Press, 1967).
5. Kang, “‘Serial’ and White Reporter Privilege,” The Awl, November 13, 2014. www.theawl.com/2014/11/serial-and-white-reporter-privilege
6. “The Modes of Visual Rhetoric: Circulating Memes as Expressions,” Quarterly Journal of Speech, Vol. 100 No. 4 (November 2014), p. 443.
8. Stephen H. Webb, The Divine Voice: Christian Proclamation and the Theology of Sound (Brazos Press, 2004), p. 27.
Copyright © 2016 by the author or Christianity Today/Books & Culture magazine.