Asked by jere
The United States began to emerge as a world power after
a. the Spanish-American War
b. the Revolutionary War
c. WWI
My answer: b
Answers
Answered by
drwls
This is a matter of opinion, since we were already taking land from Mexico before the Spanish-American War. In the latter war, we started to exert major influence in the Caribbean and the Pacific.
Whatever influence the USA acquired in WW1 was lost when it refused to join the League of Nations and greatly reduced its military forces while pursuing isolationist policies and entering the Depression in the 1930s.
It was really during and after WW2 that America emerged as a major world power.
Answered by
jere
Yes, I had put that as my answer in the booklet, but it was returned to me because the answer "WW2" was marked wrong.
Answered by
renee
And the Civil War, I think.