imperialism

noun im·pe·ri·al·ism \im-ˈpir-ē-ə-ˌli-zəm\

: a policy or practice by which a country increases its power by gaining control over other areas of the world

: the effect that a powerful country or group of countries has in changing or influencing the way people live in other, poorer countries

Full Definition of IMPERIALISM

1 :  imperial government, authority, or system
2 :  the policy, practice, or advocacy of extending the power and dominion of a nation especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas; broadly :  the extension or imposition of power, authority, or influence <union imperialism>
im·pe·ri·al·ist \-list\ noun or adjective
im·pe·ri·al·is·tic \-ˌpir-ē-ə-ˈlis-tik\ adjective
im·pe·ri·al·is·ti·cal·ly \-ti-k(ə-)lē\ adverb
First Known Use of IMPERIALISM

1800

Other Government and Politics Terms

agent provocateur, agitprop, autarky, cabal, egalitarianism, federalism, hegemony, plenipotentiary, popular sovereignty, socialism
