Just posting this quote from Richard Land because I have a feeling that it'll come in handy at some point in the future:
Is America a Christian nation?
America has always been a very religious country, but I don't think America is a Christian nation. I don't think it was founded as a Christian nation. The majority of the country thinks so, and I think the majority of the country is wrong.
As an evangelical, I find the phrase "Christian nation" to be problematic because, for me, being a Christian is an individual decision and a personal relationship.