April 30, 2014
Understanding Face/Place Memory
Memory is a tricky thing, and trying to understand just how it works is apparently even trickier. There are still a great many mysteries about how the brain functions and how we process information inside our own heads. How are we able to recall things? How are we able to pick out a single face from a crowd of people? How are we able to distinguish one place from another based on thought and memory alone? All questions begging to be answered, and though we still do not possess the answers in full, researchers have taken another great step forward in our understanding of just how memory works.
A new study by researchers at the Massachusetts Institute of Technology (MIT) has shown just how the brain achieves the sort of attention required to pick a face out of a crowd. Despite the apparent simplicity of the task, doing so is actually quite complicated. It involves our brains retrieving the memory of the face we are looking for and holding it in mind while we scan the crowd, identifying similarities and comparing them to what we have stored in our biological archive. Our brains do this using a part of the prefrontal cortex known as the inferior frontal junction, or IFJ. At present, the scientific community knows much less about this kind of attention, known as object-based attention, than it does about spatial attention, which is what we use to focus on what is happening in a particular location. What the researchers found is that these two types of attention share many of the same mechanisms and involve related regions of the brain. In both, the prefrontal cortex, the control center for most of our cognitive functions, directs the brain’s attention and regulates the relevant parts of the visual cortex, which receives sensory input.
What this new study found is that the IFJ works with a region of the brain that processes faces, known as the fusiform face area or FFA, as well as another region that interprets information about places, known as the parahippocampal place area or PPA. Both regions were first identified in humans by Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience at MIT. Using magnetoencephalography, or MEG, the researchers asked subjects to look for faces – showing them overlapping images of faces and houses so that the brain could not use spatial information to distinguish them – and found that activity in the FFA and the IFJ became synchronized, suggesting communication between the two. When subjects were asked to look for houses instead, the IFJ synchronized with the PPA rather than the FFA. The researchers also found that communication between these regions was initiated by the IFJ, with a delay of about 20 milliseconds – roughly the time it takes neurons to convey information from one region to another. This has led to the conclusion that the IFJ holds onto the idea of the object the brain is scanning for and directs the relevant region – either the FFA or the PPA – to look for it.
A truly remarkable and fascinating discovery.
Image Credit: Thinkstock