Bi-directional hypothesis of language and action

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Novasdid (talk | contribs) at 22:32, 27 April 2017 (Expanded on language --> action, movement section. Started action --> language, neural). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

The bi-directional hypothesis of language and action proposes that the sensorimotor and higher cognitive areas of the brain exert reciprocal influence over one another[1]. This hypothesis argues that brain areas involved in movement and sensation, as well as movement itself, influence cognition, and, conversely, that cognitive processes influence movement and sensation. The theory that sensory and motor processes are coupled to cognitive processes stems from action-oriented models of cognition[2]. These theories, such as the embodied and situated cognitive theories, propose that cognitive processes are rooted in areas of the brain involved in movement planning and execution, as well as areas responsible for processing sensory input, termed sensorimotor areas or areas of action and perception[3].

Proponents of the bi-directional hypothesis of language and action conduct and interpret linguistic, cognitive, and movement studies within the framework of embodied cognition and embodied language processing. Embodied language processing developed from embodied cognition, and proposes that sensorimotor systems are not only involved in the comprehension of language, but that they are necessary for understanding the semantic meaning of words. According to action-oriented models, higher cognitive processes evolved from sensorimotor brain regions, thereby necessitating sensorimotor areas for cognition and language comprehension[4].

Development of the bi-directional hypothesis

Effects of language comprehension on systems of action

Language comprehension tasks can exert influence over systems of action, at both the neural and the behavioral level. This means that language stimuli influence both electrical activity in sensorimotor areas of the brain and actual movement.

Neural activation

Language stimuli influence electrical activity in sensorimotor areas of the brain that are specific to the bodily association of the words presented. This is referred to as semantic somatotopy: activation of the sensorimotor areas specific to the bodily association implied by the word. For example, when processing the meaning of the word “kick,” the regions in the motor and somatosensory cortices that represent the legs become more active[5][6]. Boulenger et al. (2009)[6] demonstrated this effect by presenting subjects with action-related language while measuring neural activity using fMRI. Subjects were presented with action sentences associated either with the legs (e.g. “John kicked the object”) or with the arms (e.g. “Jane grasped the object”). The medial region of the motor cortex, known to represent the legs, was more active when subjects were processing leg-related sentences, whereas the lateral region of the motor cortex, known to represent the arms, was more active with arm-related sentences. This body-part-specific increase in activation was exhibited about 3 seconds after presentation of the word, a time window thought to indicate semantic processing. In other words, this activation was associated with subjects comprehending the meaning of the word. The effect held true, and was even intensified, when subjects were presented with idiomatic sentences. Abstract language implying more figurative actions was used, associated either with the legs (e.g. “John kicked the habit”) or the arms (e.g. “Jane grasped the idea”). Increased neural activation of leg motor regions was demonstrated with leg-related idiomatic sentences, whereas arm-related idiomatic sentences were associated with increased activation of arm motor regions. This activation was larger than that demonstrated by more literal sentences (e.g. “John kicked the object”), and was also present in the time window associated with semantic processing.

Action language not only activates body-part-specific areas of the motor cortex, but also influences neural activity associated with movement. This has been demonstrated during the Action-Sentence Compatibility Effect (ACE) task, a common test used to study the relationship between language comprehension and motor behavior[7]. This task requires the subject to perform movements to indicate understanding of a sentence, such as moving to press a button or pressing a button with a specific hand posture, that are either compatible or incompatible with the movement implied by the sentence[7]. For example, pressing a button with an open hand to indicate understanding of the sentence "Jane high-fived Jack" would be considered a compatible movement, as the sentence implies an open-handed posture. Motor potentials (MPs) are event-related potentials (ERPs) stemming from the motor cortex, and are associated with the execution of movement. Enhanced MP amplitudes have been associated with the precision and quickness of movements[1][8][9]. Re-afferent potentials (RAPs) are another form of ERP, and are used as a marker of sensory feedback[10] and attention[11]. Both MPs and RAPs have been demonstrated to be enhanced during compatible ACE conditions[1]. These results indicate that language can have a facilitatory effect on the excitability of neural sensorimotor systems. This has been referred to as semantic priming[12], indicating that language primes neural sensorimotor systems, altering excitability and movement.

Movement

The ability of language to influence neural activity of motor systems also manifests itself behaviorally by altering movement. Semantic priming has been implicated in these behavioral changes, and has been used as evidence for the involvement of the motor system in language comprehension. The Action-Sentence Compatibility Effect (ACE) is indicative of these semantic priming effects. Understanding language that implies action may invoke motor facilitation, or prime the motor system, when the action or posture performed to indicate language comprehension is compatible with the action or posture implied by the language. For example, moving the hand away from the body to press a button upon comprehension of the sentence "He closed the drawer," which implies movement away from the body, would be considered a compatible ACE task. Compatible ACE tasks have been shown to lead to shorter reaction times[1][7][13]. This effect has been demonstrated for various types of movement, including hand posture during button pressing[1], reaching[7], and manual rotation[13].
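The compatible-versus-incompatible scoring logic of an ACE task can be illustrated with a short sketch. The trial records and reaction times below are hypothetical illustrations, not data from the cited studies:

```python
from statistics import mean

# Hypothetical ACE trials: the direction of motion implied by the sentence,
# the direction of the required button-press movement, and a reaction time
# in milliseconds. All values are illustrative, not data from the studies.
trials = [
    ("away", "away", 512), ("away", "toward", 568),
    ("toward", "toward", 505), ("toward", "away", 571),
    ("away", "away", 498), ("toward", "toward", 520),
]

def condition(implied_direction, performed_direction):
    """A trial is compatible when the performed movement matches the
    movement implied by the sentence."""
    if implied_direction == performed_direction:
        return "compatible"
    return "incompatible"

# Group reaction times by compatibility condition.
reaction_times = {"compatible": [], "incompatible": []}
for implied, performed, rt in trials:
    reaction_times[condition(implied, performed)].append(rt)

# The ACE prediction: compatible trials show shorter mean reaction times.
for cond in sorted(reaction_times):
    print(cond, round(mean(reaction_times[cond]), 1))
```

The direction labels and millisecond values are placeholders; a real experiment would also counterbalance sentence order and control for response hand.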

Language stimuli can also prime the motor system simply by describing objects that are commonly manipulated. In a study performed by Masson et al. (2008), subjects were presented with sentences that implied non-physical, abstract action with an object (e.g. "John thought about the calculator" or "Jane remembered the thumbtack")[14]. After presentation of the language stimuli, subjects were cued to perform either functional gestures, those typically made when using the object described in the sentence (e.g. poking for calculator sentences), or volumetric gestures, those more indicative of whole-hand posture (e.g. a horizontal grasp for calculator sentences)[14]. Target gestures were either compatible or incompatible with the described object, and were cued at two different time points, early and late. Response latencies for compatible functional gestures decreased significantly at both time points, whereas latencies for compatible volumetric gestures were significantly lower only in the late cue condition[14]. These results indicate that descriptions of abstract interactions with objects automatically (at the early time point) generate motor representations of functional gestures, priming the motor system and increasing response speed[14]. The specificity of the enhanced motor response to the gesture-object interaction also highlights the importance of the motor system in semantic processing, as this enhancement was dependent on the meaning of the word.

A study performed by Olmstead et al. (2009)[15] demonstrates more concretely the influence that the semantics of action language can have on movement coordination. This study investigated the effects of action language on the coordination of rhythmic bimanual hand movements. Subjects were instructed to move two pendulums, one with each hand, either in-phase (pendulums at the same point in their cycle, phase difference of roughly 0 degrees) or anti-phase (pendulums at opposite points in their cycle, phase difference of roughly 180 degrees)[15]. Robust behavioral studies have revealed that these two phase states, with phase differences of 0 and 180 degrees, are the two stable relative phase states, or the two coordination patterns that produce stable movement[16]. The pendulum-swinging task was performed as subjects judged sentences for their plausibility; subjects were asked to indicate whether or not each presented sentence made logical sense[15]. Plausible sentences described actions that could be performed by a human using the arms, hands, and/or fingers ("He is swinging the bat"), or actions that could not be performed ("The barn is housing the goat")[15]. Implausible sentences used similar action verbs ("He is swinging the hope"). Plausible, performable sentences led to a significant change in the relative phase shift of the bimanual pendulum task[15]. The coordination of the movement was altered by the action language stimuli, as the relative phase shift that produced stable movement differed significantly from that in the non-performable sentence and no-language conditions[15]. This development of new stable states has been taken to imply a reorganization of the motor system used to plan and execute the movement[15], and supports the bi-directional hypothesis by demonstrating an effect of action language on movement.
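Relative phase, the quantity tracked in the bimanual pendulum task, is simply the signed difference between the two pendulums' phase angles. A minimal sketch (the wrapping convention into (-180, 180] is an assumption for illustration, not taken from the cited study):

```python
def relative_phase_deg(phase_a_deg, phase_b_deg):
    """Signed difference between two oscillator phases (in degrees),
    wrapped into the interval (-180, 180]."""
    diff = (phase_a_deg - phase_b_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

# In-phase coordination: both pendulums at the same point in their cycle.
print(relative_phase_deg(45.0, 45.0))    # 0.0
# Anti-phase coordination: pendulums at opposite points in their cycle.
print(relative_phase_deg(225.0, 45.0))   # 180.0
# A systematic shift away from 0 or 180 degrees, as reported in the study,
# marks a change in the stable coordination pattern.
print(relative_phase_deg(45.0, 80.0))    # -35.0
```

In practice the instantaneous phase of each hand would be estimated from recorded pendulum trajectories before computing this difference.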

Effects of systems of action on language comprehension

Altering the activity of motor systems, either through altered neural activity or actual movement, influences language comprehension. Neural activity in specific areas of the brain can be altered using transcranial magnetic stimulation, and its disruption can be studied in patients with pathologies that produce specific sensory and/or motor deficits. Movement also alters the activity of neural motor systems, increasing the overall excitability of motor and pre-motor areas.

Neural activation

Altered neural activity of motor systems has been demonstrated to influence language comprehension. Pulvermüller et al. (2005)[17] demonstrated this effect using transcranial magnetic stimulation (TMS). Subthreshold TMS was applied to arm and leg regions of the left motor cortex while subjects performed a lexical decision task on arm- and leg-related action words. Stimulation of arm regions sped responses to arm-related words, whereas stimulation of leg regions sped responses to leg-related words, indicating that activity in body-part-specific motor areas contributes to comprehension of the corresponding action words[17].

Movement

Movement can influence language comprehension; proposed examples include speech therapy and object manipulation.

Current theories on the organization of neural substrates

Identifying the neural substrates, or correlates, of cognition is a key challenge in neuroscience, as quantifying subjective cognitive processes is difficult. Areas of the central nervous system active during particular motor or cognitive tasks can be observed via brain imaging, by altering brain activity with stimulation or movement, or through observations of patients with neural pathology. The neural substrates of grounded cognition are often studied using object recognition, action recognition, working memory, and language comprehension tasks.

Circuit organization

Evidence for shared neural networks

Language interfering with and causing reorganization of the motor system

Criticisms

See also

References

  1. ^ a b c d e f Aravena, Pia; Hurtado, Esteban; Riveros, Rodrigo; Cardona, Juan Felipe; Manes, Facundo; Ibáñez, Agustín (2010-07-28). "Applauding with Closed Hands: Neural Signature of Action-Sentence Compatibility Effects". PLOS ONE. 5 (7): e11751. doi:10.1371/journal.pone.0011751. ISSN 1932-6203. PMC 2911376. PMID 20676367.
  2. ^ Kilner, J.; Hommel, B.; Bar, M.; Barsalou, L.W.; Friston, K.J.; Jost, J.; Maye, A.; Metzinger, T.; Pulvermüller, F.; et al. (2016). "Action-oriented models of cognitive processing: A little less cogitation, a little more action please". The Pragmatic Turn: Toward Action-Oriented Views in Cognitive Science (PDF). Vol. 18. Cambridge, MA: MIT Press. pp. 159–173.
  3. ^ Barsalou, L.W. (2008). "Grounded Cognition". Annual Review of Psychology. 59: 617–645.
  4. ^ Wolpert, Daniel (July 2011). "The Real Reason for Brains". TED. Retrieved 4 April 2017.
  5. ^ Hauk, O.; Johnsrude, I.; Pulvermüller, F. (2004). "Somatotopic representation of action words in human motor and premotor cortex". Neuron. 41: 301–307.
  6. ^ a b Boulenger, V.; Hauk, O.; Pulvermüller, F. (2009). "Grasping ideas with the motor system: Semantic somatotopy in idiom comprehension". Cerebral Cortex. 19: 1905–1914.
  7. ^ a b c d Glenberg, A. M.; Kaschak, M.P. (2002). "Grounding language in action". Psychonomic bulletin & review. 9: 558–565.
  8. ^ Hatta, A.; Nishihira, Y.; Higashiura, T.; Kim, S.R.; Kaneda, T. (2009). "Long-term motor practice induces practice-dependent modulation of movement-related cortical potentials (MRCP) preceding self-paced non-dominant handgrip movement in kendo players". Neuroscience Letters. 459: 105–108.
  9. ^ Slobounov, S.; Johnston, J.; Chiang, H.; Ray, W.J. (2002). "Motor-related cortical potentials accompanying enslaving effect in single versus combination of fingers force production tasks". Clinical Neurophysiology. 113: 1444–1453.
  10. ^ Deecke, L. (1987). "Bereitschaftspotential as an indicator of movement preparation in supplementary motor area and motor cortex". Ciba Foundation Symposium. 132: 231–250.
  11. ^ Smith, A.L.; Staines, W.R. (2006). "Cortical adaptions and motor performance improvement associated with short-term bimanual training". Brain Research. 1071: 165–174.
  12. ^ Grisoni, L.; Dreyer, F.R.; Pulvermüller, F. (2016). "Somatotopic Semantic Priming and Prediction in the Motor System". Cerebral Cortex. 26: 2353–2366.
  13. ^ a b Zwaan, R.A.; Taylor, L.J. (2006). "Seeing, acting, understanding: motor resonance in language comprehension". Journal of Experimental Psychology: General. 135.
  14. ^ a b c d Masson, M.E.; Bub, D.N.; Newton-Taylor, M. (2008). "Language-based access to gestural components of conceptual knowledge". The Quarterly Journal of Experimental Psychology. 61: 869–882.
  15. ^ a b c d e f g Olmstead, A.J.; Viswanathan, N.; Aicher, K.A.; Fowler, C.A. (2009). "Sentence comprehension affects the dynamics of bimanual coordination: Implications for embodied cognition". The Quarterly Journal of Experimental Psychology. 62: 2409–2417.
  16. ^ Kugler, P.; Turvey, M. (1987). Information, natural law, and the self-assembly of rhythmic movement. Hillside, NJ: Routledge.
  17. ^ Pulvermüller, F.; Hauk, O.; Nikulin, V.; Ilmoneimi, R.J. (2005). "Functional links between motor and language systems". European Journal of Neuroscience. 21: 793–797.