Columbus, Ohio – The area of the brain responsible for recognizing facial expressions appears to sit on the right side of the head, just behind the ear. A study published Tuesday in the Journal of Neuroscience found that neural patterns within this region, known as the posterior superior temporal sulcus (pSTS), seem to be programmed to identify movement in specific parts of the face.

While one neural pattern detects a furrowed brow, another recognizes the upturned lips of a smile, according to the paper.

“That suggests that our brains decode facial expressions by adding up sets of key muscle movements in the face of the person we are looking at,” study author Aleix Martinez said in a university press release, as reported by CBS News.
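Martinez’s “adding up” description can be illustrated with a toy sketch. The Python snippet below scores each candidate expression by how many of its key muscle movements have been detected and picks the best match; the action-unit names and expression templates are invented for illustration and are not taken from the study.

```python
# Toy illustration of decoding an expression by "adding up" key muscle
# movements. The action units and templates are hypothetical examples,
# not values reported in the Journal of Neuroscience paper.

EXPRESSION_TEMPLATES = {
    "happy":             {"lip_corner_puller"},
    "happily_surprised": {"lip_corner_puller", "brow_raiser", "jaw_drop"},
    "angrily_surprised": {"brow_lowerer", "brow_raiser", "jaw_drop"},
}

def classify_expression(detected_units: set) -> str:
    """Score each expression by how many of its key movements are present
    and return the best match."""
    scores = {
        name: len(units & detected_units)
        for name, units in EXPRESSION_TEMPLATES.items()
    }
    return max(scores, key=scores.get)

# A face with raised lip corners, raised brows and a dropped jaw
print(classify_expression({"lip_corner_puller", "brow_raiser", "jaw_drop"}))
# -> happily_surprised
```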


Martinez, a cognitive scientist who teaches electrical and computer engineering at Ohio State, added that humans use a significant number of facial movements, along with other non-verbal signals and language, to express emotion. Still, people recognize another person’s expression almost immediately, apparently without being fully aware of the process.

The study is significant because scientists had long wondered how the brain can decode the information carried by a facial expression so efficiently. They now know that the small pSTS is dedicated to this task.

The study’s foundation

Ohio State University researchers tracked the brain activity of 10 college students while they were shown over 1,000 pictures of other people making a variety of facial expressions, which fell into different categories: happily surprised, fearfully surprised, angrily surprised, disgusted, happily disgusted, fearfully disgusted and sadly fearful.

Using the brain scans from the first nine students, the research team trained a machine-learning model to predict which facial expressions the tenth person was looking at, based on the activity registered in their pSTS. The algorithm decoded the facial expressions with roughly 60 percent accuracy, suggesting that this process works much the same way in nearly everyone’s brain, Martinez said.
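For readers curious how such a train-on-nine, test-on-the-tenth analysis is typically set up, here is a minimal sketch in Python with scikit-learn, run on synthetic data. The classifier (a linear SVM), the leave-one-subject-out scheme, and the voxel and trial counts are assumptions for illustration; the article does not describe the study’s actual pipeline.

```python
# Hedged sketch of cross-subject expression decoding: train on nine
# subjects, predict the held-out tenth, via leave-one-subject-out
# cross-validation. Data are synthetic; the real study's features and
# classifier are not specified in the article.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

n_subjects = 10      # ten student participants
n_trials = 100       # images per subject (placeholder number)
n_voxels = 200       # pSTS voxels per scan (placeholder number)
n_classes = 7        # expression categories listed in the article

# Fake pSTS activity patterns and the expression shown on each trial.
X = rng.normal(size=(n_subjects * n_trials, n_voxels))
y = rng.integers(0, n_classes, size=n_subjects * n_trials)
subject_id = np.repeat(np.arange(n_subjects), n_trials)

# Each fold trains on nine subjects and tests on the remaining one.
scores = cross_val_score(LinearSVC(), X, y,
                         cv=LeaveOneGroupOut(), groups=subject_id)
print(f"mean decoding accuracy: {scores.mean():.2f}")  # ~1/7 on random data
```

On random data the accuracy hovers around chance; the roughly 60 percent reported by the study reflects genuine shared structure in the participants’ pSTS responses.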

This study could have several applications beyond helping researchers understand how the human brain processes facial expressions. For example, the work could also provide insight into how this process may differ in people diagnosed with autism, according to study co-author Julie Golomb, an assistant professor of psychology and director of the university’s Vision and Cognitive Neuroscience Lab.

The research is part of ongoing work in neuroscience to understand how humans and other animals recognize one another. Using fMRI scans, researchers have found activity in several areas of dogs’ brains indicating that they can process not only human faces, but also human expressions.

Source: CBS News