Non-verbal clusters November 12, 2009
In an earlier post, I wrote about how to improve your lie detection. When I say lie detection, I also mean detecting everything else a human can signal. Truth detection, for example, is much harder than lie detection.
Emotion detection is also a useful skill. You may think it’s obvious when someone is emotional, and you’re right. However, we often find meaning where there is none.
Imagine you walk into a room, and an acquaintance of yours is sitting alone with tears running down their cheeks. Are they happy or sad? We don’t know. Chances are they’re sad, but an individual isn’t a statistic, so we can’t tell just from the tears.
Now, if there are tears and they are frowning, that increases the likelihood that they are sad. If their shoulders are also hunched, they’re hiding their face in their hands, and their breathing is shallow, the likelihood goes up even more. These single non-verbal messages go together – they cluster. The more matching messages clustered together, the more certain you can be (though never 100%) of the emotion.
Another way to put this: every single human gesture has multiple possible meanings. If you see one gesture on its own, it could mean any of them. When you see two gestures at the same time, their meanings overlap, and the meanings that don’t overlap can be ignored. The more gestures you spot, the more confident you can be about the meaning.
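If you like thinking in code, the overlap idea above is just set intersection. Here’s a toy sketch – the gestures and their meaning sets are made-up examples, not a real catalogue of body language:

```python
# Each gesture maps to a set of possible meanings (illustrative only).
# Observing several gestures narrows things down to the intersection
# of their meaning sets.
GESTURE_MEANINGS = {
    "tears": {"sadness", "joy", "irritated eyes"},
    "frown": {"sadness", "concentration", "anger"},
    "hunched shoulders": {"sadness", "cold", "fatigue"},
}

def likely_meanings(observed):
    """Intersect the meaning sets of every observed gesture."""
    meanings = None
    for gesture in observed:
        possible = GESTURE_MEANINGS[gesture]
        meanings = possible if meanings is None else meanings & possible
    return meanings or set()

print(likely_meanings(["tears"]))           # still three possibilities
print(likely_meanings(["tears", "frown"]))  # narrows to sadness alone
```

One gesture leaves several candidate meanings; each additional gesture prunes the ones that don’t overlap – which is exactly why clusters are more trustworthy than single signals.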