A simple way to compare the responses of monkey neurons with psychophysical detection performance is to use the periodicity of the amplitude modulation in their spike trains. This permits a direct comparison of neurometric and psychometric curves (Fig. 3C, red and black curves, respectively).
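As a toy illustration of this idea (not the study's actual analysis), one way to quantify the periodicity of a spike train is the fraction of its spectral power concentrated at the stimulus frequency; the function and spike trains below are hypothetical:

```python
import numpy as np

def periodicity_index(spike_times, stim_freq, duration, bin_ms=2.0):
    """Fraction of a spike train's spectral power at the stimulus frequency.

    spike_times : spike times in seconds
    stim_freq   : vibration frequency in Hz
    duration    : trial length in seconds
    """
    bin_s = bin_ms / 1000.0
    n_bins = int(round(duration / bin_s))
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, duration))
    counts = counts - counts.mean()            # remove the DC component
    power = np.abs(np.fft.rfft(counts)) ** 2
    freqs = np.fft.rfftfreq(n_bins, d=bin_s)
    target = np.argmin(np.abs(freqs - stim_freq))
    total = power.sum()
    return power[target] / total if total > 0 else 0.0

# A spike train phase-locked to a 20 Hz vibration scores much higher than a
# random train with the same number of spikes.
rng = np.random.default_rng(0)
locked = np.arange(0.0, 1.0, 1 / 20) + rng.normal(0, 0.002, 20)
random_train = rng.uniform(0.0, 1.0, 20)
assert periodicity_index(locked, 20, 1.0) > periodicity_index(random_train, 20, 1.0)
```

A neurometric curve can then be built by computing this index per trial and asking how well it discriminates stimulus-present from stimulus-absent trials.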
Recent 14C-deoxyglucose autoradiography experiments indicate that area 45B in the prearcuate cortex, areas LST and LB1, and area LIPa in the lower bank of the STS are involved in action observation. Moreover, human and monkey data suggest that parietal neurons encode not only observed actions but also their social context.
When a monkey watches another individual grasp an object, the observation activates the same neurons that fire when the monkey performs the grasp itself. These neurons, called mirror neurons, first came to light in the 1990s, when researchers at the University of Parma in Italy identified them. Since then, researchers have studied the role of mirror neurons in a variety of tasks and have found that they may help explain how people learn, how humans read others' emotions, and even how language develops.
Researchers have also extended mirror-neuron work to the study of emotion, finding that some of the same brain areas that are active during an emotion are involved in regulating that emotion; such circuits may, for example, contribute to the ability to regulate anger. Using fMRI and electrophysiological recordings, scientists have shown that some of the same neurons are activated when a monkey moves its own hand and when it sees an object placed in the same position as its hand.
The scientists recorded the activity of individual neurons in the macaque monkey's ventral premotor area F5 as the animal performed a transitive grasping task, observed an experimenter perform the same task, and then performed an observation condition in which the objects were presented behind a barrier. A computer-controlled apparatus governed the presentation of each object and determined whether the monkey was allowed to pull on the object when it became available (Fig. 1).
They compared the responses of canonical neurons to the various objects and actions across trials, and found that the most important variable was whether the monkey could act on the object: a canonical neuron's response to the presentation of an object was significantly stronger when the object could be pulled than when the barrier made it inaccessible. This suggests that these neurons do more than passively encode object affordances; their object responses depend on the possibility of action, consistent with a role in visuomotor transformations for grasping.
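The comparison described above can be sketched with a simple permutation test on trial-by-trial firing rates. The numbers below are hypothetical, chosen only to show the analysis shape, not taken from the study:

```python
import numpy as np

def permutation_test(a, b, n_perm=10_000, seed=0):
    """Two-sided permutation test on the difference of mean firing rates."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                    # break the condition labels
        diff = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
        count += diff >= observed
    return count / n_perm

# Hypothetical trial-by-trial rates (spikes/s) for one canonical neuron
reachable = [34, 41, 38, 45, 39, 42, 37, 44]   # object can be pulled
barrier   = [21, 25, 19, 27, 23, 22, 26, 20]   # object behind the barrier
p = permutation_test(reachable, barrier)
assert p < 0.01   # reachable-object responses are reliably stronger
```

A permutation test is used here because it makes no distributional assumptions about the firing rates, which is convenient for small trial counts.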
Visualization is a technique used by many elite athletes to improve performance. It involves mentally rehearsing the specific movements needed in a given event before actually performing them, and it is believed to help engage neuroplasticity to overcome the negative effects of injury or illness. Patients who have experienced a stroke, for example, may use visualization to help overcome their limitations and recover function. Visualization can also help a patient manage pain and anxiety.
Behavioral, neurophysiological and imaging studies have shown that visualizing a desired result improves the ability to achieve it. For example, a runner who concentrates on a goal, such as reaching the finish line, is more likely to complete the race successfully. This effect is thought to occur because the brain registers the positive outcome of a certain action and reinforces that action so it is repeated.
To investigate whether the activity of monkey somatosensory neurons correlates with the stimulus or with the animal's perceptual report, researchers performed a series of experiments in which they recorded cell activity in the ventral posterior lateral nucleus (VPL) of the thalamus while monkeys performed a task that required them to attend to and detect mechanical vibrations delivered to the fingertip of their restrained hand. The vibration was a sinusoid whose amplitude varied across trials, and the monkeys were trained to report whether the stimulus was present or absent (Fig. 1).
The results showed that VPL neurons responded to vibration of the fingertip with firing-rate modulations that were highly correlated with the amplitude of the vibration. Interestingly, these responses were independent of the monkey's internal state, suggesting that neural activity in this area reflects the sensory input rather than the representation of a particular decision or motor response.
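The amplitude-response relationship described above can be quantified with a simple Pearson correlation. The trial data below are hypothetical, invented only to illustrate the calculation:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    x = x - x.mean()
    y = y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

# Hypothetical VPL data: vibration amplitude (um) per trial and the
# neuron's mean firing rate (spikes/s) on that trial.
amplitude = [0, 2, 4, 6, 8, 12, 16, 24]
rate      = [8, 9, 12, 15, 19, 24, 30, 41]
r = pearson_r(amplitude, rate)
assert r > 0.95   # firing rate tracks stimulus amplitude closely
```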
To increase the likelihood of obtaining clear contrast images, the scientists employed a scanning method called quantitative susceptibility mapping (QSM). This MRI-based technique estimates the local magnetic susceptibility of tissue, which enables neuroscientists to visualize deep gray matter structures that are poorly distinguished in T1-weighted and T2-weighted MR scans. In this study, QSM significantly increased the contrast-to-noise ratio (CNR) of subcortical structures in macaque brains, including the ventral pallidum (VP), the external and internal segments of the globus pallidus (GPe and GPi), the substantia nigra, and the dentate nucleus of the cerebellum. In addition, the CNR of these structures was correlated with the age of the monkeys.
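A CNR of the kind reported here is typically computed as the difference between the mean intensity of a structure and a reference region, divided by the noise standard deviation. The sketch below uses synthetic voxel values (the ROI names and numbers are assumptions, not the study's data):

```python
import numpy as np

def contrast_to_noise(roi, reference, noise_sd):
    """CNR between a structure and a reference region.

    roi, reference : voxel intensity arrays (e.g. QSM susceptibility values)
    noise_sd       : standard deviation of the background noise
    """
    return abs(np.mean(roi) - np.mean(reference)) / noise_sd

# Synthetic example: an iron-rich nucleus has higher susceptibility than
# surrounding white matter, so its CNR on a QSM image is high.
rng = np.random.default_rng(1)
gpi = rng.normal(0.18, 0.02, 500)   # hypothetical GPi voxels (ppm)
wm  = rng.normal(0.02, 0.02, 500)   # hypothetical white-matter voxels
cnr = contrast_to_noise(gpi, wm, noise_sd=0.02)
assert 7 < cnr < 9   # roughly (0.18 - 0.02) / 0.02
```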
Perception involves transforming low-level sensory information into higher-level information such as the shapes of objects or the location of a touch. The brain must also be able to connect these pieces of information with prior knowledge and expectations to make sense of them, a process called top-down processing. Moreover, the perception of an object or event must be combined with the ability to plan, decide and execute actions accordingly.
During the 1990s, researchers studying monkeys found that certain neurons in an area of the brain called the premotor cortex fired when the monkey performed a particular action, such as grasping a peanut. These neurons, which are now referred to as mirror neurons, fired in the same way when the monkey watched another individual perform the same action. Scientists believe that the activity in these neurons allows the brain to "mirror" the behavior of others and learn from it.
When the same team of scientists studied a set of neurons in the temporal lobe, they found that these neurons were associated with the sense of familiarity. Stimulating these neurons at 10-15 Hz caused the monkeys to treat even unfamiliar images as familiar, while stimulating them at 30-40 Hz had only a partial effect, with familiar and novel images treated as equally familiar on some trials.
The same team has recently shown that single VPL neurons can signal stimulus presence using a simple coding scheme based on the periodicity of their spike trains. They compared neurometric curves, derived from the amplitude modulation of the evoked responses, with the monkeys' psychometric curves in a simple sensory detection task in which a probe was touched to the skin. The mean psychometric and neurometric curves were very similar in shape, but the monkeys displayed a greater sensitivity to changes in stimulus amplitude than the individual neurons.
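A standard way to build a neurometric curve of this kind is to compute, for each stimulus amplitude, the area under the ROC curve comparing the neuron's responses on stimulus-present trials against stimulus-absent trials. The sketch below uses hypothetical firing rates, not the study's data:

```python
import numpy as np

def auc(noise, signal):
    """Area under the ROC curve: P(signal response > noise response)."""
    noise = np.asarray(noise, float)
    signal = np.asarray(signal, float)
    greater = (signal[:, None] > noise[None, :]).mean()
    ties = (signal[:, None] == noise[None, :]).mean()
    return greater + 0.5 * ties

def neurometric_curve(responses_by_amp, noise_responses):
    """One detection probability per stimulus amplitude."""
    return {amp: auc(noise_responses, r) for amp, r in responses_by_amp.items()}

# Hypothetical firing rates: stimulus-absent trials vs three amplitudes.
noise = [5, 7, 6, 8, 5, 7]
by_amp = {2: [6, 8, 7, 9], 8: [7, 9, 8, 11], 24: [18, 20, 19, 22]}
curve = neurometric_curve(by_amp, noise)
assert curve[2] < curve[8] < curve[24]   # detectability grows with amplitude
assert curve[24] == 1.0                  # large stimuli perfectly detectable
```

Plotting these detection probabilities against amplitude, next to the monkeys' proportion of "yes" reports, yields the neurometric and psychometric curves being compared.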
These findings support the hypothesis that the brain is a complex information processor, taking in all kinds of input and linking it together in various ways to produce an accurate representation of the world around us. But, as the Necker cube and Rubin vase demonstrate, perception is more than simply the passive reception of these signals. The brain transforms these signals into percepts shaped by the physiology of the individual, their previous experience and expectations, and restorative and selective mechanisms such as attention and memory.
In some cases, a decision must be made that requires weighing costs and benefits to oneself or others. Social decisions often require evaluating the impact on other people, a process that involves the amygdala. Other times, a decision must be made about risky behavior or the value of specific outcomes. In such cases, dopamine signaling in the nucleus accumbens can reinforce decisions that lead to better-than-expected outcomes and discourage those that lead to worse-than-expected ones.
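The "better-than-expected" logic above is the classic reward prediction error. A minimal sketch, assuming a simple Rescorla-Wagner style value update (a standard textbook model, not a claim about the specific studies discussed here):

```python
def update_value(V, reward, alpha=0.1):
    """One learning step driven by a dopamine-like prediction error.

    A better-than-expected outcome gives a positive delta, so the value
    of the action grows; a worse-than-expected one shrinks it.
    """
    delta = reward - V            # reward prediction error
    return V + alpha * delta, delta

V = 0.0
for _ in range(50):               # outcome repeatedly better than expected
    V, delta = update_value(V, reward=1.0)
assert V > 0.99                   # value converges toward the true reward
```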
The neural correlates of such decisions have been difficult to pinpoint, however. fMRI and single-cell electrophysiology studies have shown that activity in the orbitofrontal cortex (OFC) and striatum correlates with the choice an animal will go on to select, but not necessarily with the relative values of the different options. The most convincing evidence to date comes from a series of experiments that measured activity in OFC neurons while monkeys performed a two-alternative forced-choice task. These data were used to train a decoder that could distinguish the neural representations of the different options. The result was that OFC neurons mainly represent the value of one option at a time, with the population alternating between the available options rather than encoding them in parallel.
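A decoder of the kind described can be sketched with a minimal nearest-centroid classifier on population activity vectors. Everything below (neuron count, noise levels, labels) is a synthetic assumption made for illustration:

```python
import numpy as np

def train_centroids(X, labels):
    """Mean population pattern per option: a minimal linear decoder."""
    return {lab: X[labels == lab].mean(axis=0) for lab in np.unique(labels)}

def decode(x, centroids):
    """Assign a population vector to the nearest option centroid."""
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

# Hypothetical OFC population activity: 20 neurons, two option states,
# 40 noisy samples of each state's template pattern.
rng = np.random.default_rng(2)
pattern_a = rng.normal(10, 1, 20)            # "option A" template
pattern_b = rng.normal(10, 1, 20)            # "option B" template
X = np.vstack([pattern_a + rng.normal(0, 0.5, (40, 20)),
               pattern_b + rng.normal(0, 0.5, (40, 20))])
labels = np.array(["A"] * 40 + ["B"] * 40)
cents = train_centroids(X, labels)
acc = np.mean([decode(x, cents) == lab for x, lab in zip(X, labels)])
assert acc > 0.9   # the two option representations are separable
```

In a real analysis the decoder would be evaluated on held-out trials; training and scoring on the same samples here just keeps the sketch short.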
It might seem necessary for humans to consider the values of all available options in parallel in order to make good decisions in complex and unfamiliar domains. However, such parallel consideration becomes infeasible when the goal is important enough to demand all available cognitive resources and the available time is limited.
We propose that cortex and basal ganglia implement a canonical decision circuit in which cortex proposes a potential plan and linked regions of the basal ganglia act as a critic, assessing the value of the proposed action on the basis of prior experience in similar contexts. The outcome is then rewarded or punished, and a new iteration of the circuit considers the next candidate action.
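The serial propose-and-evaluate loop can be caricatured as a tiny reinforcement-learning program. This is a toy sketch of the proposed scheme, not an implementation of the authors' model; the action names and parameters are invented:

```python
import random

def run_episode(q_values, actions, true_reward, epsilon=0.2, alpha=0.2, steps=200):
    """Serial decision loop: 'cortex' proposes one candidate plan at a time;
    the 'critic' scores it from learned values and updates from the outcome."""
    for _ in range(steps):
        # cortex proposes a single candidate (mostly the best-known plan,
        # occasionally a novel one)
        if random.random() < epsilon:
            proposal = random.choice(actions)
        else:
            proposal = max(actions, key=lambda a: q_values[a])
        reward = true_reward[proposal]              # outcome of executing it
        # critic updates its estimate from the prediction error
        q_values[proposal] += alpha * (reward - q_values[proposal])
    return q_values

random.seed(3)
actions = ["reach-left", "reach-right", "wait"]
true_reward = {"reach-left": 0.2, "reach-right": 1.0, "wait": 0.0}
q = run_episode({a: 0.0 for a in actions}, actions, true_reward)
best = max(actions, key=lambda a: q[a])
assert best == "reach-right"   # the loop converges on the rewarded plan
```

The key property mirrored here is that only one option is evaluated per iteration, so good decisions emerge from repeated propose-evaluate cycles rather than from parallel comparison of every alternative.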
Our model differs from many previous theories of how cortex and basal ganglia contribute to decision-making, but it may offer an alternative explanation of how we acquire the skill of making good decisions in highly variable environments. This alternative may be especially important for the challenges that are common in the real world, where complete information about the environment is rarely available.