Western influence on Africa

Historically, Africa's wildlife, natural resources, and cultures have made the continent highly valuable to the Western world, drawing the attention of Western tourists, explorers, and imperialists alike. As a result, Africa has been heavily shaped over time by Western interests.

