Life in the U.S. After World War I

World War I, known as the Great War and as the war to end all wars, finally came to an end in 1918, changing life in many countries.