You always hear about how easy men have it and how the world is handed to us. But honestly, as an average guy, I feel invisible most of the time. Nobody cares about your mental health, if you're broke you're treated like you don't exist, and society expects you to just "man up" and deal with it. What's a so-called "privilege" that people keep talking about that you've literally never experienced?
Originally posted by u/sphinxUx on r/AskMen
