I'm trying to find a book I saw listed on Goodreads earlier today so I can reference it here, but I haven't had any luck yet.
The book was all about the left / Democrats and how the party has pretty much doomed itself.
The political scientist who wrote it argued that the party has gone to the extreme left and abandoned the center, where most of its historic working-class base sits.
And to be fair, the author did say both sides have moved to the extreme ends of the spectrum, which has resulted in our government being too polarized to function. But somehow the Right has a movement and a base behind it as it has shifted further right, whereas the Democrats have abandoned their base in doing so.
I'm just sitting here thinking, what the actual fuck. The left is by no means extreme. They're mostly centrists in my opinion. Neoliberal policies are not far off from what the right pushes for. Only a handful of Democrats could be considered far left, and even they really aren't extreme. They're just pushing left-leaning ideas like universal health care, you know, something the rest of the developed world already has... This is not extremism, people.
Why do we keep hearing about the left moving further left? I just don't see it. If anything, I see a centrist/rightward shift since the '70s with neoliberalism.