
Anonymous
Anonymous asked in Society & Culture › Religion & Spirituality · 1 decade ago

When did America become a Christian country?

For goodness sake, its founding fathers weren't even Christian; they were against Christianity. So where along the line did it become a Christian country?

Update:

I'm not Christian -_-

32 Answers

  • ?
    Lv 5
    1 decade ago
    Favorite Answer

    It always was a Christian country. The founding fathers were Christians, Episcopalians to be more precise. However, the migrants who have settled there are trying hard to show that it was not a Christian country.

  • Anonymous
    1 decade ago

    If you are asking when did Christian belief influence American political thinking, then I would point you back to the founding of the USA in 1776. If you are asking when did America become a country of, for, and by the Christians, then the answer is that has not happened.

    America was founded on the principles of religious freedom, and as such has never been a nation only for Christians. However, you have to understand that it was Christians of differing beliefs who brought that concept of religious freedom to the foundation of this country.

  • 1 decade ago

    Over 95%, perhaps 98%, of the Founding Fathers were Born Again, Bible-believing Christians.

    Let's focus in on the 56 Signers of the Declaration of Independence:

    3 were Deists

    1 Roman Catholic

    52 Born Again Bible believing Christians

    29 of those 52 were Seminary Graduates

    24 of them served as Pastors when they were elected to serve in the Continental Congress which drafted the Declaration of Independence.

    Similar percentages would be found if you examined the 55 signers of the Articles of Confederation, the US Constitution and the Bill of Rights.

    Whoever says anything different is not telling the truth.

    Every history book written prior to 1920 called the USA a Christian Nation.

    Source(s): 43+ years following a Jewish Carpenter & studying His Book!
  • 1 decade ago

    You are wrong about the founding fathers; many of them were Christian. This list shows the religious affiliation of the signers of the Declaration of Independence. As you can see, many were some denomination of Christianity and some were Deists.

    http://www.usconstitution.net/declarsigndata.html

    However, there is ample evidence that many of the founding fathers did not want religion to be a guiding principle in our country. It's one thing for religion to form the principles of a man, but they wanted the principles of the nation to be based on law and the idea of justice.

  • 1 decade ago

    I believe your information is incorrect. However, the separation of church and state is clearly invoked in the constitution, which has now become a dusty, clouded piece of withering history. Thomas Paine and Benjamin Franklin were opposed to Christianity, but those are the only two I know of who took that stance. Paine was a deist, and Franklin was a citizen of this fine earth. Our country was clearly founded on Christian morals despite the founding fathers' efforts to separate them from politics.

  • 1 decade ago

    I agree with the immigrant answer. A lot of people looked down on immigrants when they started coming (such as Russian Jews, the Irish, Germans and more, like how some people look down on being gay today) and strove for a completely Christian, White community, saying that would be the Utopia.

    The country did not start out as Christian, although Christians were the majority of the population, as they are today, though of course that share has gone down.

  • bekah
    Lv 4
    1 decade ago

    FYI, Ben Franklin was in the Hellfire Club. Around the 40's is when they put "under God" in the Pledge of Allegiance. People forgot about the whole escaping religious persecution and having a rational government that didn't dictate religion thing. There was a revival of faith, and people decided that might makes right and the majority gets what it wants. Since we're in the midst of an evangelical, zealous, crazed revival now, they've decided we're all Christian. I think we're in a sort of dark age right now, but don't despair: they're followed by enlightenments and Renaissances.

    P.S. It depends on which founding fathers you're talking about: the Puritans who just came here and lived, or the really important ones (like Franklin) who had independence wars and wrote nice little declarations and constitutions and such.

  • 1 decade ago

    ...when the first Pilgrim set foot ashore at Provincetown (Cape Cod, Mass.), then on to Plymouth Harbor, MA...

    The Federal Government was founded on Christian principles, not its religion. "In God We Trust", the national motto, is still firmly engraved in our nation... however, you can believe in whatever you wish... or nothing at all.

    You really should do some research on the signers of the Declaration of Independence... "9" of them were ordained Christian ministers... "3" others were founders of major universities of Christian faiths... Princeton, William & Mary and Notre Dame...

    Source(s): American History.
  • Anonymous
    1 decade ago

    It really never has been a Christian country.

    When the 1st group of settlers came over here, it was separation of state and religion; in other words, you could go to church if you felt like it, and you would not be punished if you didn't go. Our founding fathers were Christians; they were not afraid to pray and ask God for guidance, unlike some of our "yellow belly" leaders now.

    Once we get ALL of them out of office and get a true God-fearing person in office, you will see a lot of changes. We don't need "fence straddlers".

  • Lol @ booyah.

    What do you mean by a Christian country? I've never fully understood what people mean. Do you mean that its policies are governed by or made to include a religion? Then yes, during the Bush years, but they shouldn't be. Do you mean the majority of the population is Christian? I don't know.

    Source(s): Agnostic atheist.
  • Anonymous
    1 decade ago

    You Should Re-Read Your History Books (Before They Are Changed). Our Founding Fathers Most Definitely Were Christians (Puritans).
