
How Did The Nazis Gain Control Of Germany?

Firstly, the Nazis used Germany's defeat in the First World War (which began in the summer of 1914 and ended in November 1918) to their advantage. The war claimed many lives and changed the lives of those in Germany forever. The years after the First World War saw the rise to power of Adolf Hitler, the most notorious figure of the twentieth century. Hitler promised the Germans that one day he would make Germany great again. He also provided the German people with a scapegoat, claiming that the reason Germany was in great debt and suffering such high unemployment was the Jewish people. This appealed to many Germans because it offered an easy way out rather than taking the blame upon themselves. After Hitler was appointed Chancellor of Germany, the country's basic democratic structure was weakened and then abolished. This created a sense of fear among the German population, who became too scared to fight back. The lives of many millions of people across Europe would be devastated as a result of the beliefs, policies and actions of the Nazis led by Hitler.
Hitler and the Nazis gained power on 30 January 1933; by March of that year Hitler had secured near-total control of the country.
The Nazis took a very systematic approach to gaining control of Germany, one that covered most parts of society. This included German foreign policy, religion, culture, media and propaganda, education and