What Happened To Germany After Hitler Died?


1 Answer

Anonymous answered
They believed that some 'races' were better than others. Racism is a type of hatred, and hatred can easily lead to violence. When the Nazis gained power in 1933 their racist beliefs were well known, but few people expected them to use violence against the people they ruled. Before WWII broke out, however, the first steps towards the Holocaust had already been taken.

In a 1922 letter from Hitler himself to Josef Hell, he wrote: "If I am ever in power, the destruction of the Jews will be my first and most important job. As soon as I have the power I shall have gallows after gallows erected. Then Jews will be hanged one after another, and will stay hanging until they stink."

The Jews were frequently referred to in "Mein Kampf", and Hitler made his hatred for them plain; references to the "filthy Jew" litter the book. In one section, Hitler wrote about how the Jews supposedly planned to "contaminate" the blood of pure Germans: "The Jewish youth lies in wait for hours on end...spying on the unsuspicious German girl he plans to seduce...He wants to contaminate her blood and remove her from the bosom of her own people. The Jew hates the white race and wants to lower its cultural level so that the Jews might dominate."

Once in power, Hitler used his position to launch a campaign against the Jews that culminated in the Holocaust.