Asked by Anya

What is Nazism and how was it able to gain popularity in Germany after WWI?

Answers

Answered by Ms. Sue
Nazism was the fascist ideology of the German Nazi Party (the National Socialist German Workers' Party).

Germany was left economically and socially devastated after WWI. The Treaty of Versailles imposed heavy reparations, and the hyperinflation of the early 1920s and later the Great Depression deepened the crisis. The Nazis gained support by promising jobs, national pride, and a restoration of Germany's status.

Check this site for more information.

http://en.wikipedia.org/wiki/Nazism


Answered by Anya
Thank you! :)
Answered by Ms. Sue
You're welcome.
