
re: When did the German soldiers of WWII realize they were the 'bad guys'?

Posted by TheFonz
Somewhere in Louisiana
Member since Jul 2016
20468 posts
Posted on 7/18/17 at 3:18 pm to
quote:

But did Germany really want to expand, or did they want to simply fortify their defensive position? What, really, was it about?


Google "Lebensraum."

Hitler wanted to unite all Germans under one umbrella, hence the annexation of Austria and the Sudetenland and his demands for the Polish Corridor and Danzig. That was to be followed by conquests in Eastern Europe, where the Slavic peoples, whom the Nazis deemed racially inferior, would be removed, killed, or enslaved, and new German settlers would move in.

Hitler did not have a particular beef with Britain and France beyond the Versailles Treaty. He went to war with them only because they declared war on him. His true hatred was reserved for the Communists and the Jews.

quote:

One of the first things the Nazis did was remake the education system. The adult population may have been willing to accept what benefits the Nazis initially provided, but you wouldn't find many true believers among them. However, by 1939 there was a generation of teen and twentysomething fanatics willing to fight to the end.


“When an opponent declares, ‘I will not come over to your side,’ I calmly say, ‘Your child belongs to us already. … What are you? You will pass on. Your descendants, however, now stand in the new camp. In a short time they will know nothing else but this new community.’” - Adolf Hitler, November 6, 1933