HOW THE WESTERN WORLD VIEWS AFRICA
Posted by Ebunoluwa Broomes on
As we approach the millennium, with the triumph of the United States over Russia, of capitalism over socialism and of the market over planning, the depiction of Africa through the centuries has not changed considerably.
The euro crisis, double-dip recessions, Occupy protests and Libor corruption scandals aside, it seems that capitalism is alive and well – at least in Africa. Africa is 'Rising', westerners are often told these days, after decades of economic ruin, civil war and governmental mismanagement. Impressive economic growth statistics, the "burgeoning African middle class", mushrooming mobile phone and internet use – these things are all proudly trumpeted, "reminding the world of the capitalist way". But why all this 'good news' now?
The seemingly obvious answer is that things are indeed improving in Africa and that western commentators are now, quite simply, reporting what is happening. But to properly understand the Africa Rising narratives, we also need to look at what they are a response to – the much older, and much more negative, Dark Continent narratives that have dominated western discourses on Africa for centuries.