There is no "radical left" in this country. There isn't even a mainstream left-wing political party. The Democrats are center-right and the Republicans are far right. That's why anything resembling the center looks "far left" to you: y'all are so far to the right that your perception is skewed.
