sexism in movies

anonymous

I remember watching two films, one in school and one at home. Both of them involved a scene where a man told women that they are not unwanted and therefore don’t need to be depressed. It angers me that they treated a woman, an individual, as “something that will only live a better life if she is wanted by a man”. Another issue is that almost everyone around me says they are not a feminist, because they believe feminists are people who think women are better than men. I tried to explain that feminism means gender equality, but no one agreed.