Christianity in Politics

Is America supposed to be Christian? This is a strong feeling for many, a feeling that if we could only get back to the morals of Christianity, we would no longer have the problems we face in this country. Many think back fondly to the 1950s, when, after the war, prosperity rose and people looked put together and happy. This is the "proof" that the white Christian world is the one that brings peace.

What this ignores is the dark side of that time: blind acceptance of racism, domestic violence, back-alley abortions, and the oppression of those who were not white males. It's not all bad or all good. There was, and still is, a mix of advances and oppression we struggle to resolve. Will Christianity as the national religion solve those problems? No.

In America we have the right to freedom of religion. All religions have the potential to bring peace and compassion to those in distress. All religions have the right to be practiced. Some people are twisting history to claim the framers of the Constitution meant for America to be a Christian country. Don't fall for it. Don't allow power seekers to change the fundamental truth of America as Church separate from State.

Sunday mornings can be a time for reflection, peacefulness, and connection with 'what is' in this world. For some this is Nature or Energy or God or the Universe. Today, reflect on the meaning of forcing people to be Christian, of exterminating other religions and demanding that everyone be the same. Take some time to research and read about this issue from sources you don't normally access. If your religion can't take a good hard look at itself, how strong is your stance? Get all the sides on this issue. Imagine this in reverse: someone forces you to send your children to a school where only Buddhism is taught, and all children are raised as Buddhists. Does that seem right? I'd love to hear your thoughts.
