U.S. History A29 (11/20)
The American colonies became largely self-governing in practice, yet still considered themselves colonies of the British Empire. This was a result of the British government relaxing its efforts to impose its vision of order on the American colonies.