
Representational systems and submodalities (NLP)

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by FT2 (talk | contribs) at 15:10, 9 June 2006. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


Representational systems (also known as modalities, and abbreviated to VAKOG or the 4-tuple) is a Neuro-linguistic programming model of how the human mind processes information. It states that, for practical purposes, information is (or can be treated as if it is) processed through the senses. Thus people say one talks to oneself (the auditory sense) even if no words are emitted, one makes pictures in one's head when thinking or dreaming (the visual sense), and one considers feelings in the body and emotions (known as the kinesthetic sense).

NLP holds that it is crucial in human cognitive processing to recognize that the subjective character of experience is strongly tied to, and influenced by, how memories and perceptions are processed within each sense in the mind. It considers that expressions such as "It's all misty" or "I can't get a grip on it" can often be precise, literal, unconscious descriptions from within those sensory systems, communicating unconsciously where the mind perceives a problem in handling some mental event.

Within NLP, the various senses, in their role as information processors, are known as "representational systems", or modalities. The model itself is known as the VAKOG model (from the initial letters of the five senses) or, since taste and smell are so closely connected, sometimes as a 4-tuple, meaning its four-way sensory-based description.

Submodalities are the various structural parameters by which impressions are accessed within a sensory representation. Thus a picture (of whatever kind) is by nature coloured or black-and-white when accessed, and a sound is by nature mono or stereo when accessed. NLP posits that the submodalities with which a sensory impression is recalled are not arbitrary or unimportant, but are in fact often related to associated emotions, related memories, felt-sense perceptions such as "importance", and so on. Submodalities are therefore seen as offering a valuable therapeutic insight (or metaphor) and working method into how the human mind internally organizes and 'views' events.

More

NLP's representational systems model examines the verbal and non-verbal cues that people exhibit in their behavioral patterns. Certain patterns exist between the sensory modalities, primarily visual, auditory, and kinesthetic (touch), and sometimes gustatory (taste) and olfactory (smell).

When we think about the world or about our past experiences we represent those things inside our heads. For example, think about the holiday you went on last year. Did you see a picture of where you went, tell yourself a story about what you did, feel the sun on your back and the wind in your hair? Can you bring to mind the smell of your favourite flower or the taste of a favourite meal?

The use of the various modalities can be identified by learning to respond to subtle shifts in breathing, body posture, accessing cues, gestures, eye movements, and language patterns such as sensory predicates.[1][2]

NLP proponents found that pacing and leading the various cues tended to build rapport, and allowed people to communicate more effectively. Exercises in NLP training involve learning how to calibrate and respond to the various cues in real time. [citation needed]

4-tuple, VAKOG, First Access

The 4-tuple, <V, A, K, O>, is used to denote the four primary representational systems (Visual, Auditory, Kinesthetic, Gustatory/Olfactory). It is also known as First Access (John Grinder)[3] or primary experience (Freud).

Representation systems and eye movements ("accessing cues")

Grinder and Bandler identified patterns of relationship between the sensory-based language people use in general conversation and, for example, their eye movements (known as "eye accessing cues").[4]

A common (but not universal) style of processing in the West is shown in the attached chart, where "eye flickers" in specific directions often seem to tie into specific kinds of internal (mental) processing. NLP also suggests that sometimes (again not universally) such processing is associated with sensory word use: for example, a person asked what they liked about the beach may flick their eyes briefly in some characteristic direction (visual memory access, often upwards), and then also use words that describe it in a visual sense ("the sea looked lovely", and so on). Likewise, asked about a problem, someone may look in a different direction for a while (kinesthetic access, typically downwards) and then look puzzled and say "I just can't seem to get a grip on things". Taken together, NLP suggests such eye accessing cues (1) are idiosyncratic and habitual for each person, and (2) may form significant clues as to how a person is processing or representing a problem to themselves unconsciously.

The most common arrangement for eye accessing cues in a right-handed person.

Note: NLP does not say it is 'always' this way, but rather that one should check whether reliable correlations seem to exist for an individual, and if so, what they are.

Common (but not universal) Western layout of eye accessing cues:

  • Upwards -- Visual -- "I can imagine the big picture"
  • Level (left/right) -- Auditory -- "Let's tone down the discussion"
  • Down -- Kinesthetic -- "I can grasp a hold of it"
  • Down-left -- Auditory internal dialogue -- (talking to oneself inside)

Eye movements to the left or right for many people seem to indicate if a memory was remembered (past) or constructed (future). Thus remembering an actual image is associated more with up-left, whilst imagining one's dream home tends (again not universally) to be more associated with up-right.

Submodalities

The presupposition underlying the concept of submodalities, which arose in the field of neuro-linguistic programming (NLP) and particularly in the work of its co-creator Richard Bandler, is that human beings code internal experiences using aspects of their different senses, or 'representation systems', or 'sensory modalities': the ways in which we experience the world through our senses.

Submodalities are the subdivisions within any one representation system. For example, in visual: brightness, degree of colour (saturation), size, distance, sharpness, focus, and so on; in auditory: loudness, pitch, tonal range, distance, clarity, timbre, and so on. Ordinarily, one would elicit these by asking: "This image, is it bright or dim? Coloured or black and white? How much colour? Is it big or small? Is it near or far? In focus, or out of focus?" And: "This sound, is it loud or soft? Is it high pitched or low pitched? Does it have a range? Is it near or far? Is it one point source or spread out? Is it clear or muffled? Is it a pure tone or ..." The interesting discovery is that voluntary change of these on the part of the subject alters the concomitant 'feeling' response, paving the way for a number of change techniques based on deliberately changing internal representations.

Thus, the modality of vision has submodalities including brightness (of one's mental image), location (where the image is in relation to its viewer), and so on. Submodalities of auditory experience include volume and timbre, and kinesthetic submodalities include pressure and temperature. NLP co-originator Richard Bandler in particular has made extensive use of submodality manipulations in the evolution of his work.

Practical applications include, for instance, changing the auditory qualities of an internal voice that a client responds to with fear, e.g. hearing such a voice as Mickey Mouse rather than as an angry partner. For an example of political voter manipulation using anchoring via submodalities, see Persuasion uses of NLP: Political persuasion.

To match these internal distinctions, Eric Robbie discovered in 1984 that submodalities could be detected through external behaviour: in the case of visual submodalities, through combinations of subtle changes in the eye and the facial muscles surrounding the eye, and, in the case of auditory submodalities, through subtle changes in the muscles surrounding the ears. The interesting question of the last few years is: do these commonly expressed distinctions, or submodalities, map to specific areas of the brain?

NLP modalities (also referred to as senses, perception, or sensory systems) in Neuro-linguistic programming are divided into:

  • Visual
  • Auditory
  • Kinesthetic
  • Olfactory/Gustatory

The senses are closely related to VAK or VARK learning styles. When discussing learning styles, some scientists also add a fourth group:

  • Learning by processing text

Criticism

The NLP developers Robert Dilts et al.[1] proposed that eye movements (and sometimes gestures) correspond to accessing cues for representation systems, and connected them to specific regions in the brain.[citation needed] Sharpley[5] found little support for a preferred representational system (PRS) observable in the choice of words or direction of eye movements. The concept of a "preferred" representational system, and the categorization of people "as" visual, auditory or kinesthetic, has been dropped within NLP in favor of the view that people will usually use any or all of these, in combination.

Notes and References

  • Bandler's Using Your Brain for a Change (Real People Press, 1985)
  • Zeki's Inner Vision: An Exploration of Art and Brain (OUP, 2000)
  1. ^ a b Dilts, Robert B., Grinder, John, Bandler, Richard & DeLozier, Judith A. (1980). Neuro-Linguistic Programming: Volume I - The Study of the Structure of Subjective Experience. Meta Publications. pp. 3-4, 6, 14, 17.
  2. ^ Dilts, Robert B. & DeLozier, Judith A. (2000). Encyclopedia of Systemic Neuro-Linguistic Programming and NLP New Coding. NLP University Press. pp. 75, 383, 729, 938-943, 1003, 1300, 1303. ISBN 0970154003.
  3. ^ Grinder, John & Bostic St Clair, Carmen (2001). Whispering in the Wind. CA: J & C Enterprises. pp. 127, 171, 222, ch. 3, Appendix.
  4. ^ Bandler, Richard & Grinder, John (1979). Frogs into Princes: Neuro Linguistic Programming. Moab, UT: Real People Press. pp. 15, 24, 30, 45, 52.
  5. ^ Sharpley, C. F. (1987). "Research Findings on Neuro-linguistic Programming: Nonsupportive Data or an Untestable Theory". Journal of Counseling Psychology, Vol. 34, No. 1: 103-107.