Saturday, November 28, 2009

Is Christianity the only religion that America acknowledges?

I was looking for a background for my Myspace earlier since Yahoo was down and whatnot. So I figured I'd check out whether there were any religious backgrounds I would like, such as Buddhist or Taoist ones or ones from any other religious background, since I was curious what the site had to offer. I saw around 30 Christian backgrounds, but only 5 Muslim backgrounds and 4 Wiccan backgrounds. What's up with that? When people talk about God, is the Christian God the only god that people know in America or something? Is that the only definition or characteristic of God that matters? Am I missing something here? Is America truly just a Christian nation, where only the religion of Christianity exists and matters?




No, but I think it's the dominant religion. If you figure, the US was basically created as a place for religious freedom, and most of our forefathers and the people who came over on the Mayflower were Christians. A large portion of the immigrants who came to the country were from Spanish-speaking countries (Catholic) and Italy (also Catholic). There was a mix of Irish people who came here (either Protestant or Catholic). These were the people who made up the population of the US in the 1800s and early 1900s. The majority, if you will. In later years, more Asians and Middle Eastern folks started coming to the US. Their populations aren't very large here overall, but there are some large concentrations.



We are a melting pot of religions and should respect all beliefs or non-beliefs. That's what America is all about, but in the end (and I know you're an atheist) our country was founded on the principle of the Christian God. We can look at the face of any of the currency in our pockets and be reminded: "In God We Trust".



EDIT----"The only Real God"...ugh. In church today the whole service was about the "eleventh commandment": "love others as God has loved you". How can these people call themselves believers? It makes me sick, and it was the reason I didn't go to church for so many years. I hate hypocrites....sorry, this isn't the place for me to get so pissed.




I know... If people were more respectful of each other's beliefs, no matter what they are, there would be a lot more harmony in the world.




I disagree. I am sorry, but I think America acknowledges EVERY single religion in the world, from Judaism to Sikhism. I think you would have to see EVERY single Myspace account owned by Americans to tell the difference. It's great that you're bringing this up, but you can't base something like this on about a hundred backgrounds when there are so many Myspace users in the States.




In God (of the Bible) we trust. You can make up your own belief; it's your choice. Make your own background to fit your beliefs if you like.




There are 4 Wiccan backgrounds? As long as they're making things up, why not?



Since Christianity has the only real God, I guess it can have the most variations.
