Is it bad experiences? Are your doctors informative, kind, and good listeners? Do they shrug you off? Does it make you feel lesser? I honestly just want to know the reason. I know many men do go to the doctor. But many avoid it until they're genuinely sick. What's the reason?
Originally posted by u/AdhesivenessFun7097 on r/AskMen
