American Imperialism

American imperialism has been a part of United States history ever since the American Revolution. Imperialism is the practice by which powerful nations or peoples seek to expand and maintain control or influence over weaker nations or peoples. Throughout the years there have been many instances where the Americans have.