Since the 1960s, the role of women in America has changed dramatically. From politics to business to academics to sports, women have gained positions of prominence that would have been unimaginable to earlier generations. Much of this change can be attributed to the feminist movement, which brought attention to significant issues in women's lives and encouraged women (and men) to rethink the role of women in our society.
While one major goal of the movement, the Equal Rights Amendment, was never realized, other legislation and an overall change in cultural attitudes have had a substantial impact. Yet many objectives of the movement remain unfulfilled, and some in America believe that feminism has had a negative impact on the family and on our society.