I’m in high school, and up until this point (sophomore year), I had either never experienced sexism or never registered it. This year, however, perhaps due to the mainstreaming and acceptance of previously unthinkable actions and words, I’ve begun to understand what women and girls are talking about. I have heard my male classmates, who are for the most part respectful and intelligent, call girls the b-word, degrade them as objects and essentially promote rape culture. I’ve heard a female classmate try to justify sexual assault, and listened to a health teacher (and my male classmates) talk about how women who are sexually assaulted shouldn’t be wearing “slutty” clothing (because obviously we shouldn’t show an inch of skin…but oh wait, full facial covering is associated with Islam, and that’s definitely not okay either). I’ve been told that there is no such thing as sexism anymore, that reproductive rights shouldn’t be guaranteed to women, that we should accept and be thankful for the rights we already have and not push for more, and that girls need to be dominated by a stronger male. I’ve listened to boys talk about “moving onto a girl” or brag about “banging” x amount of chicks; I’ve heard “x profession is no place for a girl” and “go back to the kitchen”; and I’ve had guys encroach on my personal space or stare at me when I’m obviously creeped out. Basically, I’ve found it kinda sucks to be a girl today. So much for progress!