Many of the military interventions of the U.S.A., here in the Americas and elsewhere, have been about wealth, not about enhancing the residents' freedoms or rights.
I am in the midst of reading this book and find it fascinating. Here is an interview with the author on NPR.
Ever wonder how and why Hawaii became a U.S. state? How about Puerto Rico? And just why do so many Central Americans, Cubans, and Iranians have a bitter taste in their mouths about the U.S.A.? The common thread in these incidents really is the "bottom line"; it has mostly been about business expansion. The author claims that the major industrialists of the McKinley and (Teddy) Roosevelt administrations saw great profitable opportunities in building a U.S.-centered empire of sorts.
There was also a history of U.S. missionaries focusing on Hawaii, out of a perception that a heathen lifestyle would lead the Hawaiian people to perdition. So even back in the 19th century, there was a Christian influence in U.S. foreign affairs. Many prominent politicians of the era believed that God had carved out a special mission for the U.S.A.: to be a light among nations and lead them to the true faith. Funny how these lofty ideals merged so conveniently with economic ones. In this, the U.S. resembled the typical European imperialist powers, who also managed to neatly converge religious and economic ambitions in establishing colonies.
For me, these points were never covered in high school history. We always thought of the Americans as the good guys, ever since the War of 1812 anyway. We were taught that the U.S.A. stood as a bulwark against this kind of imperialism.