Why You Shouldn't Tell a Woman to Smile


Nothing says happiness like seeing someone smile from ear to ear. So naturally, people tell others to smile, hoping the cheer will rub off on them and they will suddenly appear happy. However, many women say there is nothing more uncomfortable and annoying than a man telling them to smile. While he is probably attempting to lift a woman's mood, it does the exact opposite: it doesn't make her happy, and it genuinely bothers her. The same thing happens to people constantly under scrutiny in the spotlight; celebrities who simply choose not to smile in pictures face criticism for that choice.

The main point is that men rarely face the same judgement; it is women who are criticised for not smiling. If a woman isn't smiling, it doesn't automatically mean she is unhappy, and this is what many people fail to understand.

Society vs Women 

Society, and women especially, have come to recognize the expectations placed on women. When a man tells a woman to smile, he may do it out of genuine care, or he may be trying to exert control. Even when the intention is good, both can come across as controlling.

Not everyone is aware of this, which is why women are calling the issue to men's attention now more than ever. Roles in society are changing: women are being empowered by other women, and they are no longer afraid to speak their minds when something isn't right, or even when their own partner seems to be controlling them.

Men Smiling vs Women Smiling

Both popular belief and research suggest that women who smile can appear vulnerable or even weak to others. For the same reason, some men deliberately limit their smiles and facial expressions. Yet for men this is treated as normal: they are expected to look tough and serious, while society demands that women appear more approachable and vulnerable to the public eye.

Does smiling make women look more approachable?

Absolutely, and this is one of the main reasons women aren't convinced about smiling all the time: it makes them look more approachable. Sometimes women don't want to look approachable, and they don't want people to feel free to reach out to them for no reason. This applies especially to strangers; a man who believes a woman is smiling at him might feel invited to start a conversation. That is one of the primary reasons women don't like being told to smile. It doesn't mean they're unhappy; they simply don't want to be perceived as weak, vulnerable, or approachable.
