The Treaty of Paris
The Treaty of Paris, signed in 1783, brought an end to the hostilities between Britain and its colonies (GMW, 2010). The treaty marked the beginning of American independence, and Britain consented to withdraw its military establishments from American soil. Once the treaty was signed, the new American nation became a sovereign country able to make its own decisions. A significant point to note is that, after the treaty was signed, Britain decided to forgo some of the debts it was owed by America based on the argument that the British colonization of the country ...