Like, it seems, everyone else under the sun who was disappointed after November 2, I've been giving a lot of thought to the future of liberalism lately.
It frustrates me that "liberal" has become a bad word. This seems like it might be a trivial concern, but it isn't. It is central to the problem that I see facing the political landscape of this country. Everywhere, conservatives set the terms of the debate.
They call themselves the Right. What is the opposite of Right? Left... or Wrong?
Democrats have allowed the conservatives to convince them that being liberal is a bad thing. They've moved toward the center, as if that gives them more legitimacy.
What does it actually accomplish?
It creates strong third parties that stand to the left of the Democrats and take away their votes. Democrats then become angry at these voters, believing that they had a right to their votes. Infighting and divisiveness among potential allies ensues.
It robs the Democrats of a unified moral stance. The Republicans appear firm, consistent, and strong. The Democrats appear to concede on things they actually believe in just to move toward the center. (For example: I have no idea what Kerry actually thinks of same-sex marriage, but when he claimed to believe that "marriage is between a man and a woman," he came off, to me, as horribly insincere.)
Liberals act as if they are, at some strange level, convinced of the moral superiority of conservatives. They let conservatives set the moral standards, and then feel obliged to prove themselves by those standards. In the debates, iirc, Kerry claimed he would never say Bush lied (despite plenty of opportunities to do so). Bush felt no such compulsion. Conservatives openly make accommodations for allies of theirs who are criminals (witness today's news about DeLay). Liberals tend not to call them on such things unless the media does so first.
Democrats, of late, have been declaring themselves to have deep faith. Why? The pessimist in me says it isn't because the Democrats have had a resurgence of religion; conservatives have simply shown them that faith matters to voters. It is another move toward the center. Is it inappropriate for liberals to be religious? No... but the manner in which they declare it strikes me as an implicit admission of moral inferiority. "I know my opponent is a man of great faith. I applaud and respect that. I too am a man of faith." - or some such bullshit that effectively comes down to a cry of "me too! (but not as much or as visibly)." Liberals really do need to court religion (and specifically Christianity), though. No matter what Keyes says, if Jesus were around today in the U.S., there is no way he would vote Republican. Someone needs to point that out. Loudly.
We can all gripe about this. I've read enough griping lately. Is there anything that people can actually do?
Part of me wants to do something meaningful.
Perhaps I should start a non-profit organization... some sort of liberal alliance that builds bridges between Democrats, Greens (and other progressive shades), liberal religious groups, and other non-conservative-leaning organizations... that tries to find common ground and a set of (gasp) shared values... that doesn't shrink from pointing out when conservatives act inappropriately... but that, at the same time, doesn't alienate moderate Democrats.