
re: The sun never sets on the British Empire

Posted by Tchefuncte Tiger
Bat'n Rudge
Member since Oct 2004
57595 posts
Posted on 11/18/16 at 10:30 am to
quote:

With that being said, we kicked their asses all the way across the ocean to their tea and crumpets and then bailed their asses out in two wars.


You could almost say that the American Revolution was the beginning of the end of the British Empire, particularly after we thumped them in the War of 1812.
Posted by Tigeralum2008
Yankees Fan
Member since Apr 2012
17170 posts
Posted on 11/18/16 at 10:40 am to
quote:

particularly after we thumped them in the War of 1812.


Umm, we may have won the last battle but they kicked our teeth in

1. We tried to invade Canada and were thwarted
2. They burned our capital and occupied Maine
3. Blockaded our East Coast

Posted by Volvagia
Fort Worth
Member since Mar 2006
51958 posts
Posted on 11/18/16 at 10:43 am to
Given the larger environment, not sure it's fair to say "we thumped them"

On the field it was a wash in tactical results, except the Americans were all hands on deck while the British regarded the whole thing as an afterthought to Napoleon.

And they still crushed the American economy via a continental-scale blockade.

We didn't really accomplish anything either. We never forced them to concede on our reasons for starting the war. They just persisted until Napoleon's fall meant Britain no longer needed to continue.
This post was edited on 11/18/16 at 10:43 am
Posted by STLDawg
The Lou
Member since Apr 2015
3777 posts
Posted on 11/18/16 at 10:44 am to
I'd argue that WWI was the beginning of the end of the British Empire. After losing the 13 colonies, Britain expanded greatly in Africa and India (the jewel of the empire). Additionally, in many of their colonies, there was a paradigm shift from a trading post model to a resource extraction and management model, making the colonies more profitable.
Posted by Tigeralum2008
Yankees Fan
Member since Apr 2012
17170 posts
Posted on 11/18/16 at 10:49 am to
quote:

I'd argue that WWI was the beginning of the end of the British Empire. After losing the 13 colonies, Britain expanded greatly in Africa and India (the jewel of the empire). Additionally, in many of their colonies, there was a paradigm shift from a trading post model to a resource extraction and management model, making the colonies more profitable.


Agreed

WWI began an era of nationalism within colonial regions. Indigenous people saw that they could govern themselves more quickly and more justly than their British colonial overlords could.