Quote:
Originally Posted by DemolitionRed
I did some of my education in the French school system, where we were taught that the French won the war (virtually single-handedly) and that the Brits were supposed to be our allies but dropped more bombs on French civilians than the Germans ever did.
One of my closest friends was educated in the US, and she was taught to believe that America won the Second World War and stopped the German invasion of the UK.
History is skewed by each and every country. The only exception is probably 'former' West Germany.
I was never taught about the white slave ships that enslaved thousands of Irish people, or the blood on our hands when forming the British Empire.
History is important when we can learn valuable lessons from it.
America did protect the UK from German invasion and, just as importantly, stopped a complete Russian/Soviet invasion of Western Europe. America really did singularly save the world in WW2. That's a fact.