Feminism is necessary in the modern-day United States.
In the United States of America, the government has given both men and women the same rights, yet feminism remains widespread. This would not be a problem if feminists stuck to their stated principles, but while the dictionary definition of feminism is the advocacy of women's rights on the grounds of political, social, and economic equality to men, individual feminists have demonstrated otherwise and have turned this once-proud movement into a cult.