Personally speaking, I think I was born a feminist. I always thought traditional roles were ridiculous. I keep coming back to the burning question: Why couldn't women work in the ABC Stores? (They couldn't until I was well into my adult life.) You know, I never got a credible answer other than that they would have to deal with drunks. I suppose Captain Obvious forgot that women go to parties with men?
But I digress… What does it mean to be a feminist? Has the definition changed over the years? Are feminists bad people in your mind?