Are We a Christian Nation?
The debate is about as old as the country itself, and you need not travel far to find supporters on both sides of the argument. Many people insist that although Christianity remains the dominant religion in the United States, we are not, by definition, a “Christian country.” They point to the Constitution, which contains no reference to God and does not advocate for any particular religion over another.
However, there are those who disagree. They contend that because our country was founded on Christian ideals, it must be Christian. Their arguments frequently cite implicit references to religion in the Constitution, such as the phrase “in the Year of our Lord,” which appears near the end of the document.
For obvious reasons, this conversation usually ends in a stalemate. So, let’s throw a wrench into it: were the Founding Fathers actually Christians?
The Faith of the Founders
The Founders were all religious people – that fact is widely acknowledged. What is often disputed, however, is (1) the nature of their faith, and (2) whether they thought America ought to be based on Christian beliefs.
On the first point, certain historians insist that the Founders were not actually Christians – at least not in the traditional sense. According to this theory, their faith was better characterized as Deism – belief in a divine creator of the world, but one who has since withdrawn from human affairs. The Founders were people of faith with strong Christian values, the thinking goes, but may not have identified with Christianity as we commonly understand it.
Of course, this is just a theory. Other experts will argue that the Founders were devout Christians. However, if they were in fact Deists, then it becomes quite difficult to argue that they intended to create a Christian country.
Christian Influence Today
While the Constitution may not explicitly say so, there is plenty of evidence that we live in a Christian country. Consider the elements of Christianity we encounter on a daily basis. The phrase “In God We Trust” appears on all our coins and paper money, raising questions about the supposed separation of church and state. In 1954, Congress voted to add the words “under God” to the United States Pledge of Allegiance, which children in schools across America recite every day – not all school districts require the pledge, but many do.
In the realm of education, Christian beliefs have long been present. Over the years, there have been countless attempts to incorporate the Bible into school curricula. For example, many schools teach creationism as an alternative to evolution. And it doesn’t stop there – sex education courses for young people often invoke conservative religious principles and advocate abstinence-only approaches.
In a recent Pew Research Center study, 32 percent of respondents said people should be Christian to be considered true Americans.
Why Not Endorse Christianity?
You may have heard the common refrain: “The vast majority of Americans have always been Christians, so why shouldn’t we be a Christian country?”
Although Christianity is the dominant religion in the U.S., making it the official religion could have serious consequences – consequences that the Founders understood quite well. After all, many colonists came to the New World to flee religious persecution in Europe. If the government were to endorse a specific religion, laws promoting that religion (and suppressing others) might follow.
In an 1814 letter, Thomas Jefferson famously wrote that “Christianity neither is, nor ever was a part of the common law.”
The debate rages on. These days, the argument for protecting Christian beliefs through law is often couched in terms of religious liberty. Indeed, the free expression of faith enjoys broad support in this country. However, that support can quickly erode when the expression of a religious belief infringes on the rights of others.
What do you think? Is America an “unofficial” Christian country? What does the future hold?