Somehow, and I am sure it is mostly from men, the word “feminist” has become synonymous with a woman who hates men, doesn’t shave her legs, maybe burned her bra, and wants to “wear the pants” in relationships.
Guess what? That’s freaking wrong, and if you truly believe that, then you didn’t take the .05 seconds it would take you to look up the ACTUAL definition, and the actual practice, of feminism.
The definition of feminism from Merriam-Webster:
1: the theory of the political, economic, and social equality of the sexes
Those damn feminists. They simply want equality for themselves.
Not to TAKE AWAY from what you have, but simply to be ALLOWED to have the same things (rights, money, etc.).
Oh no! What is a man to do? I know!
Let’s try to make “feminism” into a dirty word with a negative connotation instead of — I’m just throwing this out there — wanting the other half of the human race to flourish.
What more could be accomplished if women didn’t have to constantly fight the patriarchy for equal pay, equal rights, and the right to make decisions about our own damn bodies?
In 2017, almost a full 100 years after women got the right to vote (which took decades of marches, protests, and women wanting to slap some sense into men), we still have idiot men kicking congresswomen off the floor, the word “mansplaining” still has to exist, and the government is defunding Planned Parenthood — which legally does not use federal funds for abortions anyway. I know! Let’s keep women from affordable birth control (totally her responsibility, of course) and cancer screenings and prenatal health care (stupid babies!).