Does Africa Need a New Image?

Interesting article in the Economist about Africa’s image in the West. It is rare to see Western media coverage of Africa that does not feature starving children, wars, famine, or safari holidays. No wonder many Westerners are so contemptuous of the continent. Is it time for Africans themselves to start promoting more positive images of their continent?