It is alarming how Africa is perceived by Westerners; mention the name "Africa" and the first thoughts that come to the minds of most Westerners will be along the lines of poverty, HIV/AIDS, war, and every other negative stereotype you can fathom. While it is true that the continent has been impoverished by its bad leaders, it is not all bad news, as the Western media have painted it to be.
We have beautiful places, intriguing cultures spread across the continent, and wonderful, hospitable people, to mention just a few, but the Western media refuse to show that side of Africa. For this reason, scholars and critics have laid the blame for the misconception of Africa at the door of the Western media. Do you agree with this position? If so, can you suggest ways this can be corrected?