This is not homework... but I was just wondering: was the US government forced into imperialism?

Was it because of the Cold War that the US was dragged into worldly affairs?


The US has been involved in "worldly affairs" from the beginning!

http://www.google.com/search?q=timeline+us+history
Timelines are really interesting, and different people put them together differently! Read through several.
So Americans have been imperialistic since the beginning? Or just since the Cold War lol?
If you google "manifest destiny," you will see that American presidents have been greedy since the beginning of the nation, always wanting to push people out of their lands and take advantage of those people's natural resources.

"(It is) ..our manifest destiny to over spread and to possess the whole of the continent which Providence has given us for the development of the great experiment of liberty"

First it happened with the "Indians": "manifest destiny" was the doctrine that white men must move all the way from the eastern United States to California, taking Indians' land and killing them off along the way.

Once that engine of US economic growth (robbery) had been exhausted, once all of the Indians had been driven off their lands and white men were mining, fishing, hunting, and planting the lands the Indians once owned, the nation (run by white men) had to look outside its borders for ways to extract wealth from others. That's where what you call "imperialism" comes in.

Google this: "The Monroe Doctrine was articulated in President James Monroe's seventh annual message to Congress on December 2, 1823. The European powers, according to Monroe, were obligated to respect the Western Hemisphere as the United States' sphere of interest."

It's just a new way of doing the same old thing: announcing that the white men of the United States are the rightful owners of something they do not yet own, then going to a land where people live, taking over the land, resources, and markets, one way or another, and reaping the profits.

If it seems unfair or combative to use the phrase "white men," just reflect on who was able to vote and hold office at the time: ONLY white men.

American slavery was another example of the same arrogant, megalomaniacal greed. Quoting the Supreme Court's 1857 Dred Scott decision: "The right of property in a slave is distinctly and expressly affirmed in the Constitution."

In that same 1857 decision, the all-white-male US Supreme Court announced that Blacks had "no rights which the white man was bound to respect." Blacks did not even have the right to profit from their own labor, because the US government said no such right existed for them.

That was just another of many announcements that normal concepts of property rights did not apply when the US (with white men and only white men at the helm) was laying claim to what others had, in order to make a profit.