Mainstream American Thought

Throughout the past two years, when I've gotten into online arguments with conservatives, invariably they've said that Democrats (by which they mean liberals) are out of touch with the mainstream of American thought. Ever since the election on Tuesday, I've been wondering about that, and I've come to the conclusion that they may well be right. Not because overall attitudes have changed, but because there's a different group of people making their presence felt these days.

We have to face it--the evangelical movement has made its mark in the political and social arenas, and as a result, the country has lurched rightward on significant issues like abortion rights and gay rights. Because their socially libertarian brethren in the Republican party haven't stood up to them, these evangelicals are now in a position to pull the country even farther right. The mainstream, which was once moving toward greater social justice and overall equality, has shifted back toward the Puritanical.

We're now looking at a group that largely dictates morality in Biblical terms. You'd never know it from watching network or cable TV, but it is happening, and with the help of their brethren in the Republican party, they've managed to fashion an electoral majority. By my lights, that makes them the mainstream, painful as that idea may be.

So what does that mean? It means that I'm damn glad I'm out of the mainstream. If being in the mainstream means that I have to be a gay-hater, that I have to believe that the Bible is absolute truth and the only moral guide to follow, and that it's my duty to legislate that morality for the infidels (read: liberals), then I'll stay over here on the left, well out of the mainstream, thanks just the same.
