Bi-directional hypothesis of language and action

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Novasdid (talk | contribs) at 23:41, 28 April 2017 (Action --> language, movement).

The bi-directional hypothesis of language and action proposes that the sensorimotor and higher cognitive areas of the brain exert reciprocal influence over one another[1]. Under this hypothesis, brain areas involved in movement and sensation, as well as movement itself, influence cognition and the cognitive areas of the brain; conversely, language comprehension influences movement and sensation. The theory that sensory and motor processes are coupled to cognitive processes stems from action-oriented models of cognition[2]. These models, such as the embodied and situated theories of cognition, propose that cognitive processes are rooted in brain areas involved in movement planning and execution, as well as areas responsible for processing sensory input, collectively termed sensorimotor areas or areas of action and perception[3].

Proponents of the bi-directional hypothesis of language and action conduct and interpret linguistic, cognitive, and movement studies within the framework of embodied cognition and embodied language processing. Embodied language developed from embodied cognition, and proposes that sensorimotor systems are not only involved in the comprehension of language, but that they are necessary for understanding the semantic meaning of words. According to action-oriented models, higher cognitive processes evolved from sensorimotor brain regions, thereby necessitating sensorimotor areas for cognition and language comprehension[4].

Development of the bi-directional hypothesis

Effects of Language Comprehension on Systems of Action

Language comprehension tasks can exert influence over systems of action at both the neural and behavioral level: language stimuli influence both electrical activity in sensorimotor areas of the brain and actual movement.

Neural activation

Language stimuli influence electrical activity in sensorimotor areas of the brain that are specific to the bodily association of the words presented. This is referred to as semantic somatotopy: activation of sensorimotor areas specific to the bodily association implied by the word. For example, when processing the meaning of the word "kick," the regions of the motor and somatosensory cortices that represent the legs become more active[5][6]. Boulenger et al. (2009)[6] demonstrated this effect by presenting subjects with action-related language while measuring neural activity using fMRI. Subjects were presented with action sentences associated either with the legs (e.g. "John kicked the object") or with the arms (e.g. "Jane grasped the object"). The medial region of the motor cortex, known to represent the legs, was more active when subjects were processing leg-related sentences, whereas the lateral region of the motor cortex, known to represent the arms, was more active with arm-related sentences. This body-part-specific increase in activation was exhibited about 3 seconds after presentation of the word, a time window thought to indicate semantic processing; in other words, the activation was associated with subjects comprehending the meaning of the word. This effect held true, and was even intensified, when subjects were presented with idiomatic sentences. Abstract language implying more figurative actions was used, associated either with the legs (e.g. "John kicked the habit") or the arms (e.g. "Jane grasped the idea"). Increased neural activation of leg motor regions was demonstrated with leg-related idiomatic sentences, whereas arm-related idiomatic sentences were associated with increased activation of arm motor regions. This activation was larger than that elicited by more literal sentences (e.g. "John kicked the object"), and was also present in the time window associated with semantic processing.

Action language not only activates body-part-specific areas of the motor cortex, but also influences neural activity associated with movement. This has been demonstrated during an Action-Sentence Compatibility Effect (ACE) task, a common test used to study the relationship between language comprehension and motor behavior[7]. This task requires the subject to perform movements to indicate understanding of a sentence, such as moving to press a button or pressing a button with a specific hand posture, that are either compatible or incompatible with the movement implied by the sentence[7]. For example, pressing a button with an open hand to indicate understanding of the sentence "Jane high-fived Jack" would be considered a compatible movement, as the sentence implies an open-handed posture. Motor potentials (MPs) are event-related potentials (ERPs) originating in the motor cortex and associated with the execution of movement; enhanced MP amplitudes have been associated with the precision and speed of movements[1][8][9]. Re-afferent potentials (RAPs) are another form of ERP, used as a marker of sensory feedback[10] and attention[11]. Both MPs and RAPs have been demonstrated to be enhanced during compatible ACE conditions[1]. These results indicate that language can have a facilitatory effect on the excitability of neural sensorimotor systems. This has been referred to as semantic priming[12], indicating that language primes neural sensorimotor systems, altering excitability and movement.

Movement

The ability of language to influence neural activity of motor systems also manifests behaviorally by altering movement. Semantic priming has been implicated in these behavioral changes and has been used as evidence for the involvement of the motor system in language comprehension. The Action-Sentence Compatibility Effect (ACE) is indicative of these semantic priming effects: understanding language that implies action may invoke motor facilitation, or prime the motor system, when the action or posture performed to indicate comprehension is compatible with the action or posture implied by the language. For example, moving the hand away from the body to press a button upon comprehension of the sentence "He closed the drawer," which implies movement away from the body, would be considered a compatible ACE task. Compatible ACE tasks have been shown to lead to shorter reaction times[1][7][13]. This effect has been demonstrated for various types of movements, including hand posture during button pressing[1], reaching[7], and manual rotation[13].
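The ACE described above is quantified as a difference in mean reaction times between compatible and incompatible conditions. A minimal sketch of that computation follows; the reaction-time values are invented for illustration and are not taken from the cited studies:

```python
# Hypothetical reaction-time data (ms) for an ACE task; the values are
# invented for illustration, not drawn from the cited experiments.
rt_ms = {
    "compatible":   [612, 598, 605, 590, 620],  # response matches implied action
    "incompatible": [655, 641, 660, 648, 637],  # response conflicts with it
}

def mean(xs):
    return sum(xs) / len(xs)

# The ACE effect: how much faster responses are when the movement is
# compatible with the action implied by the sentence.
ace_effect = mean(rt_ms["incompatible"]) - mean(rt_ms["compatible"])
print(f"compatible responses ~{ace_effect:.0f} ms faster")
```

A positive `ace_effect` corresponds to the facilitation reported in compatible conditions; real studies would additionally test this difference for statistical significance across subjects.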

Language stimuli can also prime the motor system simply by describing objects that are commonly manipulated. In a study by Masson et al. (2008), subjects were presented with sentences that implied non-physical, abstract action with an object (e.g. "John thought about the calculator" or "Jane remembered the thumbtack")[14]. After presentation of the language stimuli, subjects were cued to perform either functional gestures, those typically made when using the object described in the sentence (e.g. poking for calculator sentences), or volumetric gestures, those more indicative of whole-hand posture (e.g. a horizontal grasp for calculator sentences)[14]. Target gestures were either compatible or incompatible with the described object, and were cued at two different time points, early and late. Response latencies for performing compatible functional gestures significantly decreased at both time points, whereas latencies were significantly lower for compatible volumetric gestures only in the late-cue condition[14]. These results indicate that descriptions of abstract interactions with objects automatically (at the early time point) generate motor representations of functional gestures, priming the motor system and increasing response speed[14]. The specificity of enhanced motor responses to the gesture–object interaction also highlights the importance of the motor system in semantic processing, as the enhanced motor response was dependent on the meaning of the word.

A study by Olmstead et al. (2009)[15] demonstrates more concretely the influence that the semantics of action language can have on movement coordination. This study investigated the effects of action language on the coordination of rhythmic bimanual hand movements. Subjects were instructed to swing two pendulums, one with each hand, either in-phase (pendulums at the same point in their cycle, a phase difference of roughly 0 degrees) or anti-phase (pendulums at opposite points in their cycle, a phase difference of roughly 180 degrees)[15]. Behavioral studies have established that these two relative phase states, 0 and 180 degrees, are the two coordination patterns that produce stable movement[16]. The pendulum-swinging task was performed as subjects judged sentences for plausibility; subjects were asked to indicate whether or not each presented sentence made logical sense[15]. Plausible sentences described actions that could be performed by a human using the arms, hands, and/or fingers ("He is swinging the bat"), or actions that could not be performed ("The barn is housing the goat")[15]. Implausible sentences also used similar action verbs ("He is swinging the hope"). Plausible, performable sentences led to a significant change in the relative phase shift of the bimanual pendulum task[15]. The coordination of the movement was altered by the action language stimuli, as the relative phase shift that produced stable movement was significantly different from that in the non-performable-sentence and no-language-stimuli conditions[15]. This development of new stable states has been taken to imply a reorganization of the motor system used to plan and execute the movement[15], and supports the bi-directional hypothesis by demonstrating an effect of action language on movement.
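The relative phase measure used in such bimanual coordination studies can be sketched as follows. This is an illustrative computation of the 0-degree (in-phase) and 180-degree (anti-phase) patterns, not the authors' analysis code:

```python
# Illustrative sketch (not from the cited study): relative phase between two
# rhythmically swinging pendulums, the coordination measure used in the task.

def relative_phase(phase_a_deg: float, phase_b_deg: float) -> float:
    """Circular phase difference in degrees, wrapped to (-180, 180]."""
    d = (phase_a_deg - phase_b_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

samples = range(0, 360, 10)  # instantaneous phases over one movement cycle

# In-phase: both pendulums at the same point of their cycle (~0 degrees).
in_phase = [relative_phase(t, t) for t in samples]

# Anti-phase: pendulums at opposite points of their cycle (~180 degrees).
anti_phase = [relative_phase(t, (t + 180) % 360) for t in samples]

print(sum(in_phase) / len(in_phase))  # 0.0   (stable in-phase pattern)
print(abs(anti_phase[0]))             # 180.0 (stable anti-phase pattern)
```

In the study, a shift of the mean relative phase away from these two stable values under performable-sentence conditions is what indicates that language altered the coordination dynamics.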

Effects of Systems of Action on Language Comprehension

The bi-directional hypothesis of action and language proposes that altering the activity of motor systems, either through altered neural activity or actual movement, influences language comprehension. Neural activity in specific areas of the brain can be altered using transcranial magnetic stimulation (TMS), or studied in patients with pathologies leading to specific neural sensory and/or motor deficits. Movement is also used to alter the activity of neural motor systems, increasing the overall excitability of motor and pre-motor areas.

Neural activation

Altered neural activity of motor systems has been demonstrated to influence language comprehension. One study demonstrating this effect was performed by Pulvermüller et al. (2005)[17]. TMS was used to increase the excitability of either the leg region or the arm region of the motor cortex[17]. The authors stimulated the left motor cortex, known to be more closely involved in language processing in right-handed individuals, and the right motor cortex, and also applied a sham stimulation in which stimulation was prevented by a plastic block placed between the coil and the skull[17]. During the stimulation protocols, subjects were shown 50 arm words, 50 leg words, 50 distractor words (no bodily relation), and 100 pseudowords (not real words)[17]. Subjects were asked to indicate recognition of a meaningful word by moving their lips, and response times were measured[17]. Stimulation of the left leg region of the motor cortex significantly reduced response times for recognition of leg words as compared to arm words, whereas the reverse was true for stimulation of the arm region[17]. Stimulation of the right motor cortex, as well as sham stimulation, did not exhibit these effects[17]. Therefore, somatotopically specific stimulation of the left motor cortex facilitated word comprehension in a body-part-specific manner, where stimulation of the leg and arm regions led to enhanced comprehension of leg and arm words, respectively[17]. This study has been used as evidence for the bi-directional hypothesis of language and action, as it shows that manipulating motor cortex activity alters language comprehension in a semantically specific manner[17].

A similar experiment has been performed on the articulatory motor cortex, the mouth and lip regions of the motor cortex used in the production of words[18]. Two categories of words were used as language stimuli: words that involve the lips in their production (e.g. "pool") or the tongue (e.g. "tool")[18]. Subjects listened to the words, were shown pairs of pictures, and were asked to indicate which picture matched the word they heard with a button press[18]. TMS was used prior to presentation of the language stimuli to selectively facilitate either the lip or tongue region of the left motor cortex; these two TMS conditions were compared to a control condition in which TMS was not applied[18]. Stimulation of the lip region of the motor cortex led to significantly decreased response times for lip words as compared to tongue words[18]. In addition, during recognition of tongue words, reduced reaction times were seen with tongue TMS as compared to lip TMS and no TMS[18]. Although the same effect was not seen with lip words, the authors attribute this to the greater complexity of tongue movements relative to lip movements, and the greater difficulty of tongue words relative to lip words[18]. Overall, this study demonstrates that activity in the articulatory motor cortex influences the comprehension of single spoken words, and highlights the importance of the motor cortex in speech comprehension[18].

Lesions of sensory and motor areas have also been studied to elucidate the effects of sensorimotor systems on language comprehension. One example is the patient JR, who has a lesion in areas of the auditory association cortex implicated in processing auditory information[19]. This patient shows significant impairments in conceptual and perceptual processing of sound-related language and objects[19]. For example, processing the meaning of words describing sound-related objects (e.g. "bell") was significantly impaired in JR as compared to non-sound-related objects (e.g. "armchair")[19]. These data suggest that damage to sensory regions involved in processing auditory information specifically impairs processing of sound-related conceptual information[19], highlighting the necessity of sensory systems for language comprehension.

Movement

Movement has been shown to influence language comprehension. This has been demonstrated by priming motor areas with movement, increasing the excitability of motor and pre-motor areas associated with the body part being moved[20]. It has been demonstrated that motor engagement of a specific body part decreases neural activity in language processing areas when processing words related to that body part[20]. This decreased neural activity is a feature of semantic priming, and suggests that activation of specific motor areas through movement can facilitate language comprehension in a semantically-dependent manner[20].

Movement can also inhibit language comprehension tasks, particularly tasks of verbal working memory[21]. When subjects were asked to memorize and verbally recall four-word sequences of either arm or leg action words, performing complex, rhythmic movements after presentation of the word sequences interfered with memory performance[21]. This performance deficit was body-part-specific: movement of the legs impaired recall of leg words, and movement of the arms impaired recall of arm words[21]. These data indicate that sensorimotor systems exert cortically specific "inhibitory causal effects" on memory for action words[21], as the impairment was specific to the motor engagement and the bodily association of the words.

Current Theories on the Organization of Neural Substrates

Identifying the neural substrates or correlates of cognition is a key challenge in neuroscience, as quantifying subjective cognitive processes is difficult. Areas of the central nervous system that are active during certain motor or cognitive tasks can be observed via brain imaging, by altering brain activity with stimulation or movement, or through observations of patients with neural pathology. The neural substrates of grounded cognition are often studied using cognitive tasks of object recognition, action recognition, working memory, and language comprehension.

Circuit organization

Evidence for shared neural networks

Language interfering with and causing reorganization of the motor system

Criticisms

See also

References

  1. ^ a b c d e Aravena, Pia; Hurtado, Esteban; Riveros, Rodrigo; Cardona, Juan Felipe; Manes, Facundo; Ibáñez, Agustín (2010). "Applauding with Closed Hands: Neural Signature of Action-Sentence Compatibility Effects". PLOS ONE. 5 (7): e11751. doi:10.1371/journal.pone.0011751. ISSN 1932-6203. PMC 2911376. PMID 20676367.
  2. ^ Kilner, J.; Hommel, B.; Bar, M.; Barsalou, L.W.; Friston, K.J.; Jost, J.; Maye, A.; Metzinger, T.; Pulvermüller, F.; et al. (2016). "Action-oriented models of cognitive processing: A little less cogitation, a little more action please". The Pragmatic Turn: Toward Action-Oriented Views in Cognitive Science. Cambridge, MA: MIT Press. pp. 159–173.
  3. ^ Barsalou, L.W. (2008). "Grounded Cognition". Annual Review of Psychology. 59: 617–645.
  4. ^ Wolpert, Daniel (July 2011). "The Real Reason for Brains". TED. Retrieved 4 April 2017.
  5. ^ Hauk, O.; Johnsrude, I.; Pulvermüller, F. (2004). "Somatotopic representation of action words in human motor and premotor cortex". Neuron. 41: 301–307.
  6. ^ a b Boulenger, V.; Hauk, O.; Pulvermüller, F. (2009). "Grasping ideas with the motor system: Semantic somatotopy in idiom comprehension". Cerebral Cortex. 19: 1905–1914.
  7. ^ a b c d Glenberg, A. M.; Kaschak, M.P. (2002). "Grounding language in action". Psychonomic bulletin & review. 9: 558–565.
  8. ^ Hatta, A.; Nishihira, Y.; Higashiura, T.; Kim, S.R.; Kaneda, T. (2009). "Long-term motor practice induces practice-dependent modulation of movement-related cortical potentials (MRCP) preceding self-paced non-dominant handgrip movement in kendo players". Neuroscience Letters. 459: 105–108.
  9. ^ Slobounov, S.; Johnston, J.; Chiang, H.; Ray, W.J. (2002). "Motor-related cortical potentials accompanying enslaving effect in single versus combination of fingers force production tasks". Clinical Neurophysiology. 113: 1444–1453.
  10. ^ Deecke, L. (1987). "Bereitschaftspotential as an indicator of movement preparation in supplementary motor area and motor cortex". Ciba Foundation Symposium. 132: 231–250.
  11. ^ Smith, A.L.; Staines, W.R. (2006). "Cortical adaptions and motor performance improvement associated with short-term bimanual training". Brain Research. 1071: 165–174.
  12. ^ Grisoni, L.; Dreyer, F.R.; Pulvermüller, F. (2016). "Somatotopic Semantic Priming and Prediction in the Motor System". Cerebral Cortex. 26: 2353–2366.
  13. ^ a b Zwaan, R.A.; Taylor, L.J. (2006). "Seeing, acting, understanding: motor resonance in language comprehension". Journal of Experimental Psychology: General. 135.
  14. ^ a b c d Masson, M.E.; Bub, D.N.; Newton-Taylor, M. (2008). "Language-based access to gestural components of conceptual knowledge". The Quarterly Journal of Experimental Psychology. 61: 869–882.
  15. ^ a b c d e f g Olmstead, A.J.; Viswanathan, N.; Aicher, K.A.; Fowler, C.A. (2009). "Sentence comprehension affects the dynamics of bimanual coordination: Implications for embodied cognition". The Quarterly Journal of Experimental Psychology. 62: 2409–2417.
  16. ^ Kugler, P.; Turvey, M. (1987). Information, natural law, and the self-assembly of rhythmic movement. Hillside, NJ: Routledge.
  17. ^ a b c d e f g h i Pulvermüller, F.; Hauk, O.; Nikulin, V.; Ilmoneimi, R.J. (2005). "Functional links between motor and language systems". European Journal of Neuroscience. 21: 793–797.
  18. ^ a b c d e f g h Schomers, M.R.; Kirilina, E.; Weigand, A.; Bajbouj, M.; Pulvermüller, F. (2014). "Causal influence of articulatory motor cortex on comprehending single spoken words: TMS evidence". Cerebral Cortex. 25: 3894–3902.
  19. ^ a b c d Trumpp, N.M.; Kliese, D.; Hoenig, K.; Haarmeier, T.; Kiefer, M. (2013). "Losing the sound of concepts: Damage to auditory association cortex impairs the processing of sound-related concepts". Cortex. 49: 474–486.
  20. ^ a b c Mollo, G.; Pulvermüller, F.; Hauk, O. (2016). "Movement priming of EEG/MEG brain responses for action-words characterizes the link between language and action". Cortex. 74: 262–276.
  21. ^ a b c d Shebani, Z.; Pulvermüller, F. (2013). "Moving the hands and feet specifically impairs working memory for arm-and leg-related action words". Cortex. 49: 222–231.