Original Reddit post

I bring this up because I feel like sexism towards men is constantly ignored. It bothered me today when I saw an advertisement for a TV show in which men aren't allowed out after dark, are instantly accused of murdering a girl, and are treated with open sexism by the whole village. I feel like if it were the other way around, there would be absolute outrage. Does anyone else feel like society just doesn't care about men's suffering or mental health anymore? I'm constantly seeing that people don't care. Our English lessons also taught us that men should just start killing themselves for women, and most of my class agreed. They taught this to 14-year-olds, by the way.

Originally posted by u/Phantom_Hyde on r/AskMen