Christianity’s domination of American politics is one of the worst things that could possibly happen to the religion. It is Christians themselves — evangelicals, in particular — who are bringing about the end of their faith.

In an article for Baptist News, Miguel De La Torre writes:

The beauty of the gospel message — of love, of peace and of fraternity — has been murdered by the ambitions of Trumpish flimflammers who have sold their souls for expediency. No greater proof is needed of the death of Christianity than the rush to defend a child molester in order to maintain a majority in the U.S. Senate. … Evangelicalism has ceased to be a faith perspective rooted on Jesus the Christ and has become a political movement whose beliefs repudiate all Jesus advocated.

De La Torre goes on to describe the most recent ways that Christians continue to sabotage their witness: blaming natural disasters on gay people, endorsing Donald Trump, being complicit in white supremacy, and more.

I’m not one to pull out the No True Scotsman card to win an argument and prove that my understanding of the Christian faith is the “correct” one. But when I read the Bible, I have a hard time understanding evangelicals’ idea of who they think Jesus is. The Jesus that I know stood with the poor and the marginalized. He was killed, in part, for breaking societal “rules” about who was acceptable to invite to the dinner table. It’s one of the things I admire most about Him.

It’s pretty easy being a Christian in America. But evangelicals seem to be doing all they can to make their religion look as ugly as possible.


