Feminism in America (in 2017)
By definition, feminism is "the advocacy of women's rights on the basis of the equality of the sexes." In other words, feminism is the belief that women should have the same rights as men. With this definition I agree wholeheartedly. Men and women should have equal rights (to a reasonable extent; maternity leave is one exception). However, the current 'feminism' movement across the Western world, or at least North America, does not follow the true meaning of the word. Modern 'feminists' are mostly SJWs trying to raise women above men and push men down. Most women polled across the States no longer even label themselves feminists because of these radical 'feminists', who constantly attack men, get triggered by trivial things, and are unable to recognize where rights are already balanced. I disagree with you that true feminism is no longer needed in the US, because there are still instances where women are at a disadvantage compared to men. The pay gap is a myth, yet sexism and sexual harassment of women still exist, and I do NOT support either. In these areas feminism is still needed. In summary: I am completely for true feminism, but I think the 'feminism' movement right now is toxic and detrimental to society.