America is no longer a country; it's become more of a corporation. Yes, it's greedy, it's all about money, money, money, and more money. That's the States, and it sucks. However, I do not feel that America should be religious. This country was built on religious freedom and the right to practice whatever religion you wish. Anyone who says that this country is Christian needs to look back at the Constitution and US history. It may be largely Christian, having been established by people who practiced some form of Christianity, but the US was founded on the concept of religious and spiritual freedom.

So it's not really fair to label this country with any specific religion, since its citizens are free to practice any faith, and many people here practice Islam, Judaism, Buddhism, Wicca, and other religions. One cannot truly call this country 'Christian' when not all of its people are Christian. Many people here aren't; that's why they or their ancestors came here in the first place. Besides, hasn't anyone ever heard of 'separation of church and state'?

(This post is not meant to offend anyone; I hope it doesn't, and if it does, I'm sorry. Please don't flame me over it. These views are the views of a single individual.)