The United States began to emerge as a world power after

a. the Spanish-American War
b. the Revolutionary War
c. WWI
b


This is partly a matter of opinion, since the U.S. was already taking land from Mexico before the Spanish-American War. It was in that war, though, that the country started to exert major influence in the Caribbean and the Pacific.

Whatever influence the USA acquired in WWI was largely lost when it refused to join the League of Nations, greatly reduced its military forces, pursued isolationist policies, and then entered the Depression in the 1930s.

It was really during and after WWII that America emerged as a major world power.
Yes, I had put that as my answer in the booklet, but it was returned to me because the answer "WWII" was marked wrong.
And the Civil War, I think.