Most people will actually smile back at you if you're even just somewhat pleasant to be around.
I feel like this is a dumb question. I know it's common, I just fear "how common".
I'm a guy, so I have no clue, but how often are women told to "smile"?
Early on during covid, when most of us were wearing masks, I had a guy look at me and tell me I should smile.
I stared at him with my mask-covered face.
He realized suddenly and fucking skedaddled like the dingus he was. I laughed about it at home later.
Nah. You be you. I be me. Smile and laugh at whatever you feel like. I have no desire to be in control because control does not matter between equals.
Not just women. I (M) get that too sometimes (always from guys who think they're just being nice). I don't owe you my smile, asshole.
But if the people around me aren't smiling, doesn't that add cognitive dissonance to the illusions I maintain as a prophylactic between me and the world I refuse to fix and regularly make worse?
I think it does, and it makes me sad when they do, so it's a personal attack.