Why have Americans ruined Christianity?

I believe in God and pray daily but cannot force myself to accept organized religion or Christians specifically. I find Christians, in general, to be downright offensive. The "holier than thou" attitude is disgusting and the exact opposite of neighborly. I would love for Christianity to be what it was intended to be, but it couldn't be further from what it should be. I see people hate constantly in the name of "religion." I see atheists, Jews, Muslims and Buddhists show extreme kindness and generosity to those around them (regardless of religion, creed or nationality), and yet more times than not, when I meet a rude, judgmental, selfish person, they are a devout Christian. I do know some good Christians, but in my experience that is NOT the norm.
Obviously there are bad people in all religions, but it truly saddens me that I can't practice the religion I believe in because the people are so obnoxious and hateful. The saying "you catch more flies with honey" seems completely lost on most Christians. Rather than acting kindly and accepting those around them, most Christians act like they either want nothing to do with those outside of their church, or like those who don't believe what they do need to be "fixed."
Before you say my experiences are limited to one area: I have lived in three different states and currently work on the road (in the Midwest at the moment), but have spent at least a month in nearly every continental US state.
I also spent quite a bit of time in Italy and Ireland and noticed that not a single Christian in either Catholic country harbored this attitude. In fact, I found some of the most open-minded people I have ever met in rural Ireland. What has happened to American Christians? What has caused them to venture so far from "love thy neighbor"?

2012-09-07T17:47:55Z

I live in NY state and am very thankful that my little town is mostly non-religious; there's no way I could stand living in the Midwest. I just wish that the people who chose to devote their lives to religion would focus on the good parts and not the "hate anyone who doesn't believe what you do" side. I had a Mormon teacher who just about ruined all religion for me because he blatantly treated the religious kids better than everyone else. It was disgusting; luckily he was fired and went to teach at a Christian school.
I guess what my question comes down to is: how can anyone who wants to be Christian focus so heavily on hating other religions, gay people, etc., and not on the good?
I also had a boss who did help at food kitchens and donate to the needy but was such a holier-than-thou ****** that people couldn't help but hate her. Maybe many religious people are just bad people trying to mask their wickedness with "God."
In all fairness I have met many wonderful,

Anonymous2012-09-07T17:11:26Z

Favorite Answer

I think you meant to ask "Why have Christians ruined America?"

Ken2014-04-07T05:39:50Z

Yes, it's true that there are many people who call themselves Christians who don't live up to the name, but I do not believe that should be a reason to leave the church. If the foundational teachings of the church you are a part of are still biblical and sound, then that is the right place to be. Jesus never said His people would be perfect, but He did say that His church would be, in the sense of what it teaches. In fact, the church will always attract sinners seeking God. That's what it was designed to do. So if you see people struggling with sin in their lives (or even cherishing it) who are still in the church, point them to the Saviour and pray earnestly for them. That's what our Lord did. He never left the Jewish religion, although there were many hypocrites in it. We are all sinners; it's just that we have different kinds and degrees of sins. Let's learn to love one another and help break the chains of sin. If I were you, I would stay in the church and be God's hands to touch the world around me. Be the light set on a hill. God bless.

Jac2014-01-29T13:06:06Z

It is because of the very foundation that America was based on. We Christians are supposed to be a light that shines in the world. That is done through unconditional love and hospitality to those in need. We are to live among the others, but the founders of America chose to run away instead. They wanted to live far away from the so-called heathens and only in the company of those who share their beliefs. It was kind of like a pseudo-heaven they were trying to accomplish. Today America is a land of many religions, but you still get Conservatives who have that "Us and Them" attitude. They are called conservatives because they want to stay the way they are, like their forefathers before them. However, they have forgotten what it means to be in the service of Christ.

Skaggmo2012-09-07T17:16:55Z

That's funny, I drove trucks OTR for 20 years, stopping at strange new churches at least twice a month. And I NEVER had that problem. I met some of the kindest, gentlest strangers at most of them- people who worked in their communities feeding the poor, giving rides to the poor, etc.

I did find a lot of churches that had no Holy Spirit, or lacked the ambition to be Biblical.

ThatOne2012-09-07T17:14:14Z

Honestly, religions have ruined people. We all fight for our beliefs, but if we believed in just the logical thing, which is reality, then the fighting would stop. Yes, believing in religions is fun, with holidays and parties, but who said we had to do that? All of these beliefs are just figments of our imagination. How do we really know that they are real if we've never seen them? Yes, the religions teach great lessons, but people don't tend to listen to them because they don't know if it's true or not. My words: "If I can't see, hear, or touch it, it's air. Or it has a superpower."
