If you think that Western imperialist powers have actually relinquished control over their former African colonies, think again. Even after many Western nations nominally granted their colonies independence in the mid-20th century, they continue to exploit Africa's vast mineral and natural wealth and to exert de facto influence over the continent's financial systems.