Ryuzaki
Creator
No Life King
Posts: 2,348
Post by Ryuzaki on Dec 13, 2008 7:38:23 GMT -5
Is the US a "Christian nation," as many US public figures say? Was the US ever "truly" a Christian nation? Was there any point in its history at which it lost whatever it might once have had?
Post by MajinKirby on Dec 14, 2008 0:04:30 GMT -5
My answer: Absolutely not, sir. The United States of America was never in any way a Christian nation. There has always been a mix of cultures and ideals in the country, including Judaism, Buddhism, and a slew of other beliefs. Even back when the USA was founded, there were disputes about who had the "better" beliefs.
Just because Christianity is one of the main beliefs in the USA does not mean that the whole nation is under its influence. So there. =)
humarwhitill
Reminisce VIP Member
Posts: 1,098
Post by humarwhitill on Dec 14, 2008 14:01:16 GMT -5
In a way it was a Christian nation, though not to any large extent. The U.S. was founded on Christian beliefs, and many of the original colonies were formed by groups from specific branches of Christianity.
On the other hand, the founding fathers wanted to make sure that religion could not be forced onto anyone, and because of that they only took the parts of Christian belief that would make you a "good" person, i.e. not stealing, not killing, etc.
Post by boogieknight on Dec 14, 2008 17:29:17 GMT -5
It depends. We are not a religious autocracy, even though some of our more arbitrary laws are influenced by religious traditions (like the ones that don't allow alcohol to be sold on Sundays in certain counties). However, the architects of our government were predominantly Christian and came from nations that were mostly Christian. While the different denominations disagree on many fine points, they are the same religion and share common ideas. To say that this influence was nonexistent sounds more like wishful thinking than a realistic assessment of history.
Let's examine the First Amendment and the context of the times it came out of. In Europe, it was customary to censor, ban, and burn books whose ideas a ruling power believed undermined his authority, or simply because he didn't like them! As the Protestant Reformation caught on, the religious affiliation of the citizenry was often dictated by the beliefs of the ruler. England's people were pulled out of the Catholic Church because Henry VIII was in a quarrel over an annulment, and the nephew of Henry's wife, whose troops had sacked Rome, pressured the Pope to refuse it.
History is full of examples in which crazy shit happens because those in power make politically or doctrinally driven choices that intrude on the spiritual pursuits of other people. The Founding Fathers had no desire to see that sort of insanity in their own country and thought every man should be free to follow his conscience without fear, whether he was a regular guy or a legislator. Some were more devout, and others more pragmatic, but they saw no reason to legislate what others believed or how they practiced it, as long as they didn't break the law.
From the 1800s on, there were tensions with immigrants from the East and from Eastern Europe. There were also hostilities, but as best I can tell, there was no attempt to force conversion on the immigrants. And even though a significant number of people are not affiliated with Christianity, there are still many who claim to be. Does sheer proportion make a nation a Christian nation? Do the laws have to be in exact accordance with religious law, like in Islamic countries? It's been argued that Christ abrogated the Mosaic law, and that the only commandments are "Love the Lord" and "Love thy neighbor as thyself." I doubt that federal or state law was really written with that in mind.
It depends on how you define the terms. We have a strong Christian tradition, with all sorts of revivals and new denominations beginning, or at least taking root, in the United States over the nation's history.