What is Nazism and how was it able to gain popularity in Germany after WWI?

Nazism was a fascist doctrine embraced by the German Nazi Party. Germany was left economically and socially devastated after WWI, and the Nazis came to power by promising jobs, ethnic pride, and social status.

Check this site for more information:
http://en.wikipedia.org/wiki/Nazism
Thank you!:)
You're welcome.