Asked by ally

Many historians argue that the Vietnam War was a profoundly disillusioning experience for Americans. Would you agree? Did the Vietnam experience signal a fundamental shift in American foreign policy? What have been the consequences for American foreign policy in the post-Vietnam era? Have they changed in the post-9/11 era?

Answers

Answered by drwls
Take a look at this documentary, or the summary of it, about the Secretary of Defense during the early stages of that war.

http://en.wikipedia.org/wiki/The_Fog_of_War
