Something’s gone wrong in American evangelicalism. For nearly two decades I have been telling people that I am not an evangelical. I was raised in southern Churches of Christ, and we were far too sectarian to be evangelicals. I used to find that sectarianism problematic. Now, the older I get, the more grateful I am for it.
For quite some time now, American evangelicalism has been a pernicious, misguided, and toxic effort to co-opt Christianity and the Christian scriptures to gain political power. Even though I have never considered myself an evangelical, the fact that I am both a Christian and an American means I am branded an evangelical anyway and left to deal with the myriad ways evangelicalism has bastardized Christianity. It’s not just me. The entire world, regardless of faith or the lack of it, has to reckon with American evangelicalism. And understanding what evangelicals have done may be more important now than ever before.
If you’ve spent any time in church circles over the past decade, you’ve probably felt it: the tension, the division, the sense that somewhere along the way we traded the radical teachings of Jesus for the fleeting rush of political power. Where Jesus teaches us to love our neighbors, evangelicals have chosen to antagonize and dominate them. Where Jesus invites His followers to suffer, evangelicals choose to visit suffering on others. Where Jesus calls us to love the stranger, as the Good Samaritan did, evangelicals have let their fear of the Other rob people of basic human dignity and legal protections. The list goes on, and at this point the evidence is undeniable.
This is the heart of the problem: American Christians would rather have supremacy, power, and wealth than have Jesus. And they are willing to excuse, baptize, and sanctify whatever ideas promise to deliver those things.