Recently, I was out with two (male) friends of mine. The subject of relationships inevitably came up, and I made a comment that led one friend to note, “That’s a very feminist stance on dating.”
My other friend immediately turned to me and replied, “But…you’re not a feminist.”
A feminist is someone who is concerned with the rights, interests, and legal protection of women. That’s it. While a whole host of different movements, theories, and opinions fall under the umbrella of “feminism”, fundamentally the word signifies a belief in (as trusty Merriam-Webster puts it) “the political, economic, and social equality of the sexes”.
Yet the word has been appropriated to mean any number of things, many of them pejorative. It is frequently attached, especially in pop culture, to portrayals of man-hating, granola-eating “womyn” who refuse to shave their legs. Just think of the character in Legally Blonde who wants Harvard to start calling its terms “ovesters”. (Incidentally, “semester” is derived from the Latin “semestris”, meaning a six-month period, and has nothing to do with semen!)
Why has a word for someone who defends their own rights and/or the rights of others become an insult? Why did my friend assume I wasn’t a feminist just because I’m not a walking stereotype?
Shouldn’t the majority of educated, aware, compassionate people be feminists almost by default? Wouldn’t it be nice if more men wore these shirts (and didn’t look as surprised by the message as that male model does)?
Of course, we can always turn to Reverend Pat Robertson for his sage thoughts on the matter:
“Feminism encourages women to leave their husbands, kill their children, practice witchcraft, destroy capitalism, and become lesbians.”