The subject was feminism.
Do you have to be a feminist if you are a woman?
If you are a woman and you don’t want to be a feminist, are you a bad woman?
If you are pro woman but hate the label of feminist, can you decline it?
Can you be a feminist and be male?
Are all feminists raging, radical man-haters who think women are better than men?
These are interesting questions, and judging by the staggering number of comments on the threads I was following, they are far more complex than anything that can be distilled into a quick and easy sound bite.
I would wager that the answers would vary from feminist to feminist as well.
One idea that kept coming up, though, was that the reason feminism got started in the first place has passed… that we as a society have moved on, and so we don’t really need feminism or feminists anymore.
Now, I don’t agree, but instead of linking to studies dealing with wage-gap issues, domestic violence issues, women’s health issues, women’s rights issues, etc… I’ll just drop this little photo.
I’m going to say that as long as a publisher of children’s books still thinks it is okay to teach girls that they need to be pretty while teaching boys that they need to be smart, we still need feminists to stand up and say “Bad!”