A Liberal History
Liberalism became the dominant ideology of the West when it was adopted by Britain and the United States. But its roots lie elsewhere.