MIT Demos Smartwatch That Can Read Your Emotions | Wearables

When people are in an intense conversation, the vocal sounds, volume, gaze and body gestures often matter as much as the words we speak.
The signals carried by these cues can be misinterpreted by people who struggle to understand non-verbal communication.
That's what prompted researchers Tuka AlHanai and Mohammad Mahdi Ghassemi at MIT to develop software that could take the ambiguity out of what people say and what they do.

The pair built an algorithm that can analyze both speech and tone.
This data is crunched to work out roughly what emotion a person is feeling for every five-second block of conversation. In one example, a person recalls a memory of their first day at school, and the algorithm identifies the moment the tone shifts from positive, through neutral, down to negative.
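The idea of labeling a conversation block by block can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual system: the MIT work extracts real acoustic and text features, whereas here a toy per-second "valence score" and the threshold values stand in for that analysis.

```python
def classify_block(scores):
    """Label one block by its mean valence score (toy thresholds)."""
    mean = sum(scores) / len(scores)
    if mean > 0.2:
        return "positive"
    if mean < -0.2:
        return "negative"
    return "neutral"

def label_conversation(valence_per_second, block_size=5):
    """Split per-second scores into five-second blocks and label each."""
    labels = []
    for start in range(0, len(valence_per_second), block_size):
        block = valence_per_second[start:start + block_size]
        labels.append(classify_block(block))
    return labels

# A story drifting from positive through neutral to negative:
scores = [0.8, 0.7, 0.6, 0.5, 0.4,      # upbeat opening
          0.1, 0.0, -0.1, 0.0, 0.1,     # neutral middle
          -0.5, -0.6, -0.7, -0.6, -0.8] # negative ending
print(label_conversation(scores))  # → ['positive', 'neutral', 'negative']
```

The fixed five-second window mirrors the granularity described above; a real classifier would replace the mean-score thresholds with a trained model.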
The researchers used an iPhone 5S to record the audio portion of the conversations, but made each test subject wear Samsung's Simband. That's the company's developer-only wearable platform that runs Tizen and has space for various additional sensors. It's not the most elegant of implementations, but the pair built the system with an eye toward incorporating it into a single wearable device that needs no outside help.

Right now, the implementation is rough around the edges and too basic to be used more widely. But the pair believe it could be the first step on the road to building a social coach for people with anxiety disorders or conditions like autism. It's early days, but if a device meant an end to awkward conversations, it would probably be quite popular.
Engadget
