In The Atlantic Allen Guelzo argues that “the Civil War made it impossible for religious absolutism to address problems in American life—especially economic and racial ones—where religious absolutism would in fact have done a very large measure of good.” This is an intriguing but deeply flawed argument. Leaving aside the dubious assumption that moral absolutes are good, I want to challenge only one aspect of his argument: his claim that
“From the Civil War onward, American Protestantism would be locked deeper and deeper into a state of cultural imprisonment, and in many cases, retreating to a world of private experience in which Christianity remained of little more significance to public life than stamp-collecting or bridge parties. Appeals to divine authority at the beginning of the Civil War fragmented in deadlock and contradiction, and ever since then, it has been difficult for deeply rooted religious conviction to assert a genuinely shaping influence over American public life.”
Guelzo provides very little evidence for this claim, and he fails to connect the moral angst created by the Civil War to the retreat of religion from public life.
In his new book One Nation Under God, Kevin M. Kruse traces the evolution of the myth that America was founded as a Christian nation from the 1920s through the Cold War. As he points out, “During these years, Americans were told, time and time again, not just that the country should be a Christian nation, but that it always had been one. They soon came to think of the United States as ‘one nation under God.’ They’ve believed it ever since.” I’ll post a review of the book as soon as I’ve read it, but if you want a brief summary of his argument you can find it here:
A Christian Nation? Since When? – NYTimes.com.
I think we all know why the myth persists. This might be wishful thinking, but maybe Peter Manseau’s new book One Nation, Under Gods will persuade some readers unfamiliar with the history that this is not a Christian nation. Here is an excerpt from Laura Miller’s review of the book:
“The Pilgrims might have all called themselves Christians, but some differences among them were seen by their theocratic leaders as profound threats to the spiritual survival of the community. Both Williams and Hutchinson were cast out and created communities of their own. There was literally never a point in the history of the colonies or the U.S. when all or most Americans genuinely shared the same faith. ‘The true gospel of the American experience,’ Manseau writes, ‘is not religious agreement but dissent.’”
No, America Has Never Been a Christian Country — Why Does the Myth Persist? | Alternet.