Feminism In America

Feminism is defined in the dictionary as the theory of the political, economic, and social equality of the sexes. It can be described more plainly as the desire for equality between men and women. This is a strong desire, as both sexes have.