Neurons in the monkey ventral premotor cortex (area F5) and the inferior parietal cortex (area PFG) were recorded while the animals observed different videos. These neurons showed high selectivity for specific stimuli.
For example, the neuron shown in Fig 2c responded exclusively to the observation of a monkey grasping an object, from both the first- and third-person perspective, with peak discharge occurring before movement onset.
Neurons in the primary visual cortex (V1) respond to a wide range of visual stimuli. Because visual stimuli are often accompanied by auditory stimulation, the response of V1 neurons to such combinations can be examined. Using this approach, researchers have found that some neurons respond differently to the same visual stimulus under different behavioral conditions. For example, the response latency of a V1 neuron to a visual target is shorter in a bimodal visuo-auditory condition than in a visual-only condition.
Another example of this effect is seen in the response of a neuron to a visual target presented as part of a saccadic task, in which the animal is encouraged to make a saccade toward a visual target whose onset is paired with a sound. Neurons showing this type of response are often called saccadic receptive cells.
Several studies have demonstrated that the activity of V3A neurons in monkeys is modulated by both visual and auditory stimuli. For example, the activity of these neurons is elevated during a visual search task that requires the animal to move its eyes to locate a pop-out stimulus. This suggests that the activity of these neurons reflects both the visual stimulus and the eye movement needed to detect it.
These findings have been replicated in other experiments. For example, one study used a simple passive fixation control task in which the color of the fixation point (FP) informed the monkey whether it was to maintain passive central fixation or make a saccade. The results showed that the activity of V3A neurons increased during the saccade epoch and was suppressed during the presaccadic epoch.
Another important finding is that the activity of V1 neurons is modulated by auditory stimuli in monkeys trained to perform a covert visual search task. These results indicate that V1 activity reflects both the visual stimulus and the movement needed to detect it, and they support the hypothesis that saccade-related signals are represented in area V1 within the calcarine sulcus.
In addition to visual and tactile stimuli, monkey neurons can also be activated by auditory stimuli. For example, when a monkey sees another monkey making a gesture with its hand, the observer's auditory cortex can be activated by this action. This activation reflects a cross-modal association between vision and sound, in which the visual representation of the gesture is linked to the auditory representation through a series of neural pathways. Such cross-modal associations are a fundamental feature of the primate brain and can help elucidate the neural basis of action understanding.
In a study conducted by Ferrari and Lepage, monkeys were trained to passively fixate on a visual target presented at the fixation point (FP). In half of the trials, the target was accompanied by a 25 ms sound in a bimodal visuo-auditory condition. The researchers found that the response latency of the target-responsive cell was significantly shorter in the visuo-auditory condition than in the visual-only condition.
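The latency comparison described above can be illustrated with a minimal sketch. The latency values, trial counts, and variable names below are illustrative assumptions, not data from the study; a real analysis would also apply an appropriate statistical test:

```python
import statistics

# Hypothetical per-trial response latencies (ms) of one neuron to the
# visual target in the two conditions; values are placeholders only.
visual_only = [92, 88, 95, 90, 97, 93, 91, 96]
visuo_auditory = [78, 74, 80, 76, 82, 79, 75, 81]

# Mean latency reduction in the bimodal condition relative to vision alone
shift = statistics.mean(visual_only) - statistics.mean(visuo_auditory)
print(f"mean latency reduction: {shift:.1f} ms")
```

A positive `shift` corresponds to the shorter latency reported for the visuo-auditory condition.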
This reduction in latency suggests that the MNs code more than simple body-part displacements; rather, they code the purpose of an observed action. A similar finding was recently reported in orang-utans in a natural context: automatic imitation of conspecific facial expressions of play occurs within one second of observing the display.
One of the most interesting phenomena associated with MNs is their ability to code an observed action at an abstract level, independent of what is actually seen on the screen. For example, MNs in ventral premotor area F5 can discharge during the observation of a grasping motor act even when the final part of the action is hidden behind a screen and must be inferred (Umilta et al. 2001), or when the action can be recognized only from its characteristic sound.
Another intriguing phenomenon is that MNs can also encode speech and other non-visual communicative signals. This type of multisensory integration may be mediated by projections from sensory cortices to motor cortex, which have been described in both human and monkey brains.
In some prefrontal areas, neurons are activated by the observation of motor actions performed by humans or monkeys. These actions are often goal-directed, such as grasping a ball or mimicking another individual's hand movements. These neurons are thought to play a role in exploiting the context of the action for planning and guiding behavioral responses. However, it is not clear whether the neurons that respond to these videos are directly involved in planning or executing specific motor actions. This study was designed to address that question. The monkey was passively exposed to a series of 12 videos depicting either human or monkey actions. Four of the videos were identical to those used in the basic task (MGI, MGIII, HG, and HM). The remaining eight were modified versions of these video clips in which the first or second phase of the video was obscured by black shading. The goal was to determine which visual features were critical for eliciting neuronal responses.
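The 12-clip stimulus set described above can be sketched as a simple enumeration. The clip abbreviations follow the text; the occlusion labels are assumptions introduced for illustration:

```python
# Four basic clips used in the main task (abbreviations from the text)
basic = ["MGI", "MGIII", "HG", "HM"]

# Eight modified versions: each basic clip with either its first or
# second phase obscured by black shading
occluded = [f"{clip}_{phase}_occluded"
            for clip in basic
            for phase in ("phase1", "phase2")]

stimuli = basic + occluded
print(len(stimuli))  # 12 clips in total
```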
Several studies have shown that neurons in ventral premotor area F5 and in the inferior parietal areas PFG and AIP, as well as VLPF neurons, are activated by the observation of biological hand movements.
These experiments used videos of monkeys performing various tasks, including grasping a ball and mimicking other individuals' actions. Each video was presented for a randomized period of 500 to 900 ms, during which the monkey was required to keep its eyes on the fixation point; the monkey was rewarded for maintaining fixation. The experiment was controlled by custom-made LabVIEW software that generated digital output signals for the onset and offset of the fixation point, the beginning and end of stimulus presentation, and reward delivery.
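The trial sequence controlled by the software can be sketched as an ordered event timeline. This is a minimal illustration, not the authors' LabVIEW code; the event names and the 300 ms pre-stimulus fixation delay are assumptions, while the 500-900 ms stimulus duration follows the text:

```python
import random

def build_trial(seed=None):
    """Return the ordered digital-output events (name, time in ms) of one trial."""
    rng = random.Random(seed)
    # Stimulus duration randomized between 500 and 900 ms, as in the task
    stim_ms = rng.randint(500, 900)
    return [
        ("fixation_on", 0),
        ("stimulus_on", 300),             # assumed pre-stimulus fixation delay
        ("stimulus_off", 300 + stim_ms),
        ("fixation_off", 300 + stim_ms),
        ("reward", 300 + stim_ms + 100),  # delivered if gaze stayed on the FP
    ]

events = build_trial(seed=1)
```

Seeding the generator makes a trial reproducible, which is convenient when aligning neural recordings to event times offline.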
The HS neurons were grouped according to their statistically highest response to a given stimulus, based on the epoch of the video showing that particular behavior. For example, one neuron that responded to the video of a monkey grasping in first- and third-person perspective displayed a preference for Video Epoch 1, suggesting that it most likely codes early aspects of the observed monkey's hand movement, such as its context, its onset, or the hand-object contact.
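The epoch-based grouping can be sketched as follows. The firing rates are fabricated placeholders, and a real analysis would rely on a statistical comparison across epochs rather than a raw maximum:

```python
# Mean firing rate (spikes/s) of each neuron in three video epochs;
# the unit names and numbers are illustrative placeholders, not recorded data.
neuron_rates = {
    "unit_01": {"epoch1": 24.0, "epoch2": 11.5, "epoch3": 9.0},
    "unit_02": {"epoch1": 8.0,  "epoch2": 19.5, "epoch3": 12.0},
}

# Assign each neuron to the epoch in which its mean rate is highest
preferred = {unit: max(rates, key=rates.get)
             for unit, rates in neuron_rates.items()}
print(preferred)
```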
In addition to coding contextual information, monkey neurons also code biological stimuli; specifically, they code the actions and body movements of other individuals. This kind of coding has been observed in the orbitofrontal and medial prefrontal cortex. These cells are particularly sensitive to the movements of other monkeys: they can distinguish the identity of a particular action or biological movement and can even predict what another monkey will do in a specific situation. Such predictions can be used for control, coercion, or diverting the attention of other monkeys.
This is important because it means that social stimulation can trigger a specific behavioral response. In fact, this is what happens when a monkey encounters another monkey with a food reward, or when it perceives a predator approaching its group. These social stimuli can trigger a specific response in the other monkey, such as retreat or attack. This is the basis of the well-known "mass behavior" of monkeys, in which many members of the group abandon their tasks to pursue a single food item or to protect their mates.
The emergence of this phenomenon was accompanied by the spread of a "language" consisting of minute tactile signals. These signals were transmitted through the entire group and signaled different situations such as danger, food, or coercion. The emergence of this language can be interpreted as an attempt to communicate intentions and plans to other monkeys in order to avoid conflict or to reach a joint decision.
A number of studies have shown that human subjects synchronise better with auditory than with visual stimuli in a variety of synchronisation tasks. However, it is difficult to explain why this difference exists. It might arise because the auditory modality performs better in tasks involving temporal processing. Alternatively, a social stimulus may be more likely to engage the participants' attention and thus attract their cognitive resources. Further research is needed to establish whether the effect of a social versus non-social stimulus depends on the response modality and the type of task.