
Can A Computer Actually Read Your Emotions?

With the advent of AI and machine learning, computers are quickly gaining the ability to perform extraordinary tasks that were once unthinkable.

The ability to solve complex math problems and puzzles, or even to analyze medical symptoms and laboratory data to arrive at a diagnosis, are but some of the awesome feats that are now possible. Alexa's ability to respond to voice commands to execute online searches and to complete tasks such as playing music and ordering products on Amazon is certainly impressive.

One area that is even more daunting—deciphering the complexity and nuances of human emotion—is a frontier where researchers are beginning to see progress, according to new research published in the journal Science Advances.

The question is this: can a computer discern the difference between a happy image and one that is sad? In a few milliseconds, can it pick up nuances of human emotion based on interpretation of pixelated images?

Based on the researchers' preliminary findings, the answer seems to be yes, and the computer may be as good as a human.

“Machine learning technology is getting really good at recognizing the content of images – of deciphering what kind of object it is,” said Tor Wager, a Professor of Psychology and Neuroscience at the University of Colorado Boulder at the time of the research, in a press release. “We wanted to ask: could it do the same with emotions? The answer is yes.”

The research, combining machine learning with human brain imaging, represents an important advance in the development and application of neural networks to study human emotion. A neural network is a computer system that is modeled after the human brain.

But the study also exposes the importance of how and where images are represented in the human brain, implying that what we see—even transiently—might have a more significant effect on our emotions than we may realize.

“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system,” said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. “We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”

The Birth of EmoNet

As part of the research, Kragel utilized an existing neural network known as AlexNet, which allows computers to recognize objects. He also relied upon previous research that pinpointed stereotypical emotional responses to images, and reconfigured the network to predict how a person would feel when they see a given image.
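To make the idea concrete, here is a minimal sketch in Python of how an object-recognition network like AlexNet can be repurposed this way: keep the pretrained visual features and replace the final object classifier with a new head that predicts emotion categories. The use of PyTorch and the training details are assumptions for illustration, not the authors' actual code.

import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 20  # the study grouped responses into 20 emotion categories

# Start from AlexNet pretrained on ImageNet object recognition.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Freeze the pretrained convolutional features; only the new head will learn.
for param in model.features.parameters():
    param.requires_grad = False

# Swap the final 1000-way object classifier for a 20-way emotion classifier,
# which would then be trained on images labeled with human emotional responses.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_EMOTIONS)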

Subsequently, Kragel introduced his new network, dubbed EmoNet, to 25,000 novel images ranging from erotic photos to simple nature scenes and asked it to sort them into 20 categories, including awe, surprise, craving, sexual desire, and horror.

While EmoNet was able to accurately and reliably categorize 11 of the emotion types, it was better at recognizing certain types of emotion than others. It identified photos that exemplify sexual desire or craving with greater than 95% accuracy, but it was less accurate at identifying more nuanced or subjective emotions such as confusion, awe, and surprise.
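As a hypothetical illustration of this kind of per-category evaluation (the function and labels below are illustrative, not taken from the paper), one could tally accuracy separately for each emotion:

from collections import Counter

def per_category_accuracy(true_labels, predicted_labels):
    # Count, for each emotion category, how often the prediction was right.
    correct, total = Counter(), Counter()
    for truth, guess in zip(true_labels, predicted_labels):
        total[truth] += 1
        if guess == truth:
            correct[truth] += 1
    return {emotion: correct[emotion] / total[emotion] for emotion in total}

# A category like "craving" might score above 0.95 while "confusion" or "awe"
# scores far lower, matching the pattern the study reported.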

What was even more remarkable was that exposure to something as simple as a color could register an emotion: when EmoNet was shown a black screen, it recorded anxiety. Red registered as craving. Puppies produced amusement, and when two of them appeared together, it selected romance. EmoNet was also able to consistently rate the intensity of images, identifying not only the emotion but how strongly it was expressed.

EmoNet was also shown movie trailers, and 75% of the time it was able to categorize them correctly as romantic comedies, action films, or horror movies.
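The article does not spell out how frame-by-frame predictions become a single verdict for a whole trailer, but one plausible approach (purely an assumption for illustration, not the paper's stated method) is to average EmoNet's per-frame emotion scores over the clip and map the dominant emotion to a genre:

import numpy as np

def classify_trailer(frame_scores, emotion_names):
    # frame_scores: (num_frames, num_emotions) array of per-frame outputs
    mean_scores = np.asarray(frame_scores).mean(axis=0)  # pool over time
    top_emotion = emotion_names[int(mean_scores.argmax())]
    # Illustrative mapping from dominant emotion to genre (an assumption).
    genre_for = {"horror": "horror movie",
                 "amusement": "romantic comedy",
                 "excitement": "action film"}
    return genre_for.get(top_emotion, "other")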

What You See Is How You Feel

But the key part of the study was when the researchers introduced 18 human participants to EmoNet.

While functional magnetic resonance imaging (fMRI) recorded and measured their brain activity, the participants were shown 4-second flashes of 112 images. EmoNet viewed the same images and was considered the 19th participant in the study.

The results were intriguing: when activity in the neural network was compared with activity in the participants' brains (based on the fMRI data), the patterns were similar.

“We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so,” said Kragel.
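One standard way to quantify such a correspondence is representational similarity analysis: build an image-by-image similarity matrix from EmoNet's emotion units and another from the occipital-lobe voxel responses, then correlate the two. The sketch below is a simplified illustration of that general approach; the paper's own analysis pipeline may differ, and the variable shapes are assumptions.

import numpy as np
from scipy.stats import pearsonr

def representational_similarity(emonet_units, brain_voxels):
    # emonet_units: (n_images, n_units) activations from EmoNet's emotion layer
    # brain_voxels: (n_images, n_voxels) fMRI responses from the occipital lobe
    net_sim = np.corrcoef(emonet_units)    # image-by-image similarity (EmoNet)
    brain_sim = np.corrcoef(brain_voxels)  # image-by-image similarity (brain)
    # Correlate the upper triangles of the two similarity matrices.
    upper = np.triu_indices(net_sim.shape[0], k=1)
    return pearsonr(net_sim[upper], brain_sim[upper])  # (r, p-value)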

The fMRI data also revealed that even a brief exposure to a simple image, such as a face or an object, could trigger emotion-related signals in the visual cortex of the brain. In addition, different types of emotions highlighted unique regions of the cortex, implying anatomical specificity for emotions.

“This shows that emotions are not just add-ons that happen later in different areas of the brain,” said Wager, now a Professor of Neuroscience at Dartmouth College. “Our brains are recognizing them, categorizing them and responding to them very early on.”

The researchers believe that neural networks like EmoNet may help people analyze and select positive and negative images when designing homes, offices, or other community settings. It may also help to advance research on emotions and on how humans relate to computers more generally.

This study helps to underscore the importance of visual cues and input in our surroundings, and how integral they are to our emotional well-being or, conversely, to emotional distress.

To improve our emotional well-being, the research suggests, it is important to pay attention to our surroundings.

Source: www.forbes.com

