As you can see from the picture here, we naturally tend to see objects that resemble faces everywhere. Be it clouds, tortillas, bits of toast or odd-shaped hills on Mars, the pattern-recognition engine between your ears does an amazing job. What is interesting is that it is just a resemblance: you can, at the same time, recognise that it is not an actual face. Researchers have picked up on this, and some fascinating work has now been published.

“You can tell that it has some ‘faceness’ to it, but on the other hand, you’re not misled into believing that it is a genuine face,” observes Pawan Sinha, professor of brain and cognitive sciences at MIT.

OK, so what is going on here? How is it possible to sort of see a face, yet at the same time not see an actual face?

Well, we now have an answer from a new study published by Sinha and his colleagues, entitled “Lateralization of face processing in the human brain”. As reported by MIT News, they have discovered that two separate brain regions perform the analysis.

First, the fusiform gyrus — an area long associated with face recognition — on the left side of the brain carefully calculates how “facelike” an image is.

Then the right fusiform gyrus appears to use that information to make a quick, categorical decision about whether the object is, indeed, a face.

This division of labor is one of the first known examples of the left and right sides of the brain taking on different roles in high-level visual-processing tasks, although hemispheric differences have been seen in other brain functions, most notably language and spatial perception. It also explains why we can sort of see a face while not being fooled into thinking it is an actual one.

Face versus nonface

Sinha and his students set out to investigate how that brain region decides what is and is not a face, particularly in cases where an object greatly resembles a face.

To help them do that, the researchers created a continuum of images ranging from those that look nothing like faces to genuine faces. They found images that very closely resemble faces by examining photographs that machine vision systems had falsely tagged as faces. Human observers then rated how facelike each of the images was by doing a series of one-to-one comparisons; the results of those comparisons allowed the researchers to rank the images by how much they resembled a face.
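The paper's exact scoring procedure isn't spelled out here, but the basic idea of turning one-to-one comparisons into a ranking can be sketched with a simple win-fraction tally. The item names and judgements below are entirely hypothetical:

```python
from collections import defaultdict

def rank_by_wins(comparisons):
    """Rank items from pairwise judgements.

    comparisons: list of (winner, loser) pairs, where the winner was
    judged more facelike in a one-to-one comparison.
    Returns items sorted from least to most facelike by win fraction.
    """
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in comparisons:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return sorted(appearances, key=lambda item: wins[item] / appearances[item])

# Hypothetical judgements: a real face beats toast, toast beats a cloud, etc.
judgements = [
    ("face", "toast"), ("face", "cloud"),
    ("toast", "cloud"), ("face", "toast"),
    ("toast", "cloud"),
]
print(rank_by_wins(judgements))  # ['cloud', 'toast', 'face']
```

More sophisticated models (such as Bradley–Terry) fit a latent score to the same kind of data, but the win-fraction version conveys the principle: many local comparisons yield one global ordering.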

The research team then used functional magnetic resonance imaging (fMRI) to scan the brains of research subjects as they categorized the images. Unexpectedly, the scientists found different activity patterns on each side of the brain as follows:

On the right side, activation patterns within the fusiform gyrus remained quite consistent for all genuine face images, but changed dramatically for all nonface images, no matter how much they resembled a face. This suggests that the right side of the brain is involved in making the categorical declaration of whether an image is a face or not.

Meanwhile, in the analogous region on the left side of the brain, activity patterns changed gradually as images became more facelike, and there was no clear divide between faces and nonfaces. From this, the researchers concluded that the left side of the brain is ranking images on a scale of how facelike they are, but not assigning them to one category or another.

“From the computational perspective, one speculation one can make is that the left does the initial heavy lifting,” Sinha says. “It tries to determine how facelike is a pattern, without making the final decision on whether I’m going to call it a face.”
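That speculation — a graded analysis feeding a separate categorical decision — can be caricatured as a toy two-stage pipeline. The features, weights and threshold below are invented purely for illustration and have nothing to do with how the fusiform gyrus actually computes:

```python
def faceness_score(features):
    # Stage 1 (left-hemisphere analogue): a graded "faceness" value.
    # Hypothetical feature weights, chosen arbitrarily for this sketch.
    weights = {"eyes": 0.4, "mouth": 0.3, "nose": 0.2, "outline": 0.1}
    return sum(weights[f] for f in features if f in weights)

def is_face(features, threshold=0.8):
    # Stage 2 (right-hemisphere analogue): a categorical yes/no
    # decision made from the graded score.
    return faceness_score(features) >= threshold

# A power outlet has "eyes" and a "mouth": plenty of faceness,
# yet the categorical stage still says it is not a face.
print(is_face({"eyes", "mouth"}))                      # False
print(is_face({"eyes", "mouth", "nose", "outline"}))   # True
```

The point of the separation is that the first stage can report partial resemblance — which is exactly the experience of pareidolia — while the second stage remains free to reject the image as a genuine face.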

The researchers also found that activation in the left side of the fusiform gyrus preceded that of the right side by a couple of seconds, supporting the hypothesis that the left side does its job first and then passes information on to the right side.

Sinha says that, given the sluggishness of MRI signals (which rely on blood-flow changes), the timing does not yet constitute definitive evidence, “but it’s a very interesting possibility because it begins to tease apart this monolithic notion of face processing. It’s now beginning to get at what the constituents are of that overall face-processing system.”

What happens next?

The researchers hope to obtain more solid evidence of temporal relationships between the two hemispheres with studies using electroencephalography (EEG) or magnetoencephalography (MEG), two technologies that offer a much more precise view of the timing of brain activity.

They also hope to discover how and when the right and left sides of the fusiform gyrus develop these independent functions by studying blind children who have their sight restored at a young age. Many such children have been treated by Project Prakash, an effort initiated by Sinha to find and treat blind children in India.

Links

Ref.: Ming Meng et al., Lateralization of face processing in the human brain, Proceedings of the Royal Society B, 2012 [DOI: 10.1098/rspb.2011.1784]

MIT News Article

Some Examples of Pareidolia

