We hear about “Western involvement”, “Western values”, and “Western interests” in the media. People say that the West is the best, or that the West is in decline. Some country is either Westernising or hates the West's way of life.

The West is the set of countries with democracies and free markets, right? Or the countries that are part of Western Civilisation? Then what about Latin America: is it Western? What does “the West” even mean, and what exactly is Western Civilisation? Let's find out.