One point being pushed by some folks on the right is the notion that America is a Christian nation. Whether this claim is true or not depends on what is meant by this ambiguous term.
If the term means that Christianity has played an important role in American history and values, and that many Americans profess to be members of this faith, then the answer is an obvious “yes.” Many of the founders were Christian deists and showed a clear belief in God. Also, the political philosophy that America is based upon includes strong Lockean elements, and Locke’s theory rests quite heavily on God. So far, so good and easy.
However, the folks who claim that America is a Christian nation seem to mean more than this. In general, their view seems to involve claims that the founders held the same views that they themselves hold. This does not seem to be supported by the historical evidence. To check on this, do as David Barton advocates: go read all of the original writings of Jefferson, Adams, Paine, Franklin and so on. But do what Barton does not seem to do: be sure to consider the full text of the documents rather than merely focusing on a specific quote or two. You will find references to God, but you will not find the sort of Christianity being endorsed by the likes of Palin, Bachmann and Barton. I do not expect you to take my word on this: get the texts and read them thoroughly and completely with an objective mind.
The folks who claim America is a Christian nation also tend to hold this as more than a description but also as a prescription. To be specific, they contend that since America is a Christian nation, we should change our laws to reflect this. Abortion should be outlawed, same-sex marriage should be banned, and usury should not be allowed. I am, of course, kidding about the last one. Usury is just fine: these folks are not going to shut down the banking industry (which is but one sign of how consistently Christian many of these folks are).
Even if it is assumed that these views are truly Christian, there is an obvious problem. America is a democracy. Now, if it is assumed both that America is a Christian nation and that America is a democracy, it would seem to follow that this Christian nation accepts things that certain Christians claim go against Christianity. If people continue to democratically support views that certain Christians oppose, should we abandon democracy in favor of imposing a certain set of religious views? I, of course, think we should not.
That is one rather serious problem with having an official state religion (or something close to it): it tends to be rather inimical to democracy, something the founders were well aware of. So, insofar as we favor democracy over theocracy, we are not a Christian nation. Rather, we are a democratic nation with the notion of religious freedom (and freedom from religion) as a fundamental principle.