by Bas » Sun 15 Jul 2007, 19:43:21
It seems that the whole political spectrum in Europe is tilted to the left, or that of America to the right, at least on a number of issues. What may be considered right-wing politics in Europe may well be a centrist position in America, and what's a left-wing position in America may be considered more centrist in Europe. Having to look at each other with binoculars across an ocean no doubt magnifies these differences, making Europe seem far more leftist than it is in American eyes, and America more right-wing than it really is in European eyes.

For me personally, the result is that on this forum I often feel compelled to "attack" some "right-wing" American points of view, while in real life here in Europe I usually end up defending America against Europeans who, I think, have overly simplistic views of the country. It's a feast of misunderstanding, and I don't see it changing anytime soon, though ending the Iraq war should go a long way towards returning opinions to more favourable levels.

Apart from the Vietnam and Iraq wars, I think the general perception of America has always been positive, and very much so in the years and decades just after WWII. There was also complete support for America just after September 11th, which ended for the most part with the invasion of Iraq, even though a number of countries were part of the "coalition of the willing", among them the country I live in.