In terms of how we (in Britain at least) are taught in schools and universities, the attitude towards the British Empire has changed drastically over the past 50 years, and media and academic discussion has shifted in parallel. The British Empire, once the world's largest formal empire, has gone from being viewed as a triumph to being viewed as a travesty.
However, I still don't think it's all clear-cut. So...fellow ADISCers:
Is colonising another country, especially one that is resistant to foreign influence, always wrong?
Does the fact that many countries which were reasonably stable under the Empire - such as Nigeria and Rhodesia (now Zimbabwe) - have since deteriorated justify, or even promote, the idea that colonialism can be a good thing?
And does bringing more 'formal' Western culture and moral attitudes to culturally different countries create more progress or more problems?