Attention schema theory
The attention schema theory (AST) of consciousness is a neuroscientific and evolutionary theory of consciousness (or subjective awareness) developed by neuroscientist Michael Graziano at Princeton University.[1][2] It proposes that brains construct subjective awareness as a schematic model of the process of attention.[1][2] The theory is a materialist theory of consciousness. It shares similarities with the illusionist ideas of philosophers like Daniel Dennett, Patricia Churchland, and Keith Frankish.[3][4]
Graziano proposed that an attention schema is like the body schema. Just as the brain constructs a simplified model of the body to monitor and control its movement, it also constructs a simplified model of attention to help monitor and control its own attention. The information in that model, portraying an incomplete and simplified version of attention, leads the brain to conclude that it has a non-physical essence of awareness. Thus subjective awareness is the brain's efficient but imperfect model of its own attention. This approach aims to explain how awareness and attention are similar in many respects yet sometimes dissociated, and how the brain can be aware of both internal and external events; it also provides testable predictions.[2]
In the theory, an attention schema necessarily evolved due to its fundamental adaptive uses in perception, cognition, and social interaction.
Description
The AST describes how an information-processing machine can claim to have a conscious, subjective experience,[5] while having no means to discern the difference between its claim and reality.
In the theory, the brain is an information processor captive to the information constructed within it. In this approach, the challenge of explaining consciousness is not, "How does the brain produce an ineffable internal experience," but rather, "How does the brain construct a quirky self-description, and what is the useful cognitive role of that self-model?"
In other words, because we claim to be conscious, some mechanism in the brain must have computed the requisite information about consciousness to enable the system to output that claim. AST proposes that this information serves an adaptive function: it acts as an internal model of one of the brain's most important processes, attention.
A crucial aspect of the theory is model-based knowledge. The brain constructs rich internal models that lie beneath the level of higher cognition and language. Cognition has only partial access to those internal models, and the content of those models is reported as literal reality.
The AST can be summarized in three broad points:[1]
- The brain is an information-processing device.
- The brain has a capacity to focus its processing resources more on some signals than on others. That focus may be on incoming sensory signals or internal information such as recalled memories. This ability is called attention.
- The brain also builds a set of information, or a representation, descriptive of its own attention. This internal model is the attention schema.
The attention schema allows a machine to make claims about its consciousness. When it claims to be conscious of concept X (to have a subjective awareness or a mental possession of X), the machine is using higher cognition to access an attention schema, and reporting the information therein.
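These three points can be illustrated with a deliberately minimal sketch in Python. It is not drawn from the AST literature; the class names, the use of a simple "pick the strongest signal" rule as a stand-in for attention, and the report wording are illustrative assumptions only.

```python
# Toy sketch: signals, a selective focus ("attention"), and a simplified
# self-description of that focus ("attention schema") that a report
# function can access.

from dataclasses import dataclass

@dataclass
class AttentionSchema:
    """Simplified, schematic description of the agent's current attention."""
    target: str      # what is being attended to
    intensity: str   # coarse label, not the underlying signal values

class ToyAgent:
    def __init__(self):
        self.schema = None

    def attend(self, signals):
        """'Attention': devote processing resources to the strongest signal."""
        target, strength = max(signals.items(), key=lambda kv: kv[1])
        # The schema records a simplified description of that process,
        # omitting the mechanistic details of the signal competition.
        self.schema = AttentionSchema(
            target=target,
            intensity="vivid" if strength > 0.5 else "faint")

    def report(self):
        """Higher cognition accesses only the schema, not the mechanism."""
        return f"I am aware of the {self.schema.target}; my awareness is {self.schema.intensity}."

agent = ToyAgent()
agent.attend({"apple": 0.9, "background hum": 0.2})
print(agent.report())   # -> I am aware of the apple; my awareness is vivid.
```

In this toy, the claim of awareness is generated entirely from the contents of the schema, which is the sense in which AST says a machine can "report the information therein."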
Example
Suppose a person looks at an apple. When the person reports, "I have a subjective experience of that shiny red apple," three items are linked together in that claim: the self, the apple, and a subjective experience.
- The claim about the presence of a self depends on cognitive access to a self model. Without a self model, and its requisite information, the system would be unable to make claims referencing itself.
- The claim about the presence of, and properties of, an apple depends on cognitive access to a model of the apple, presumably constructed in the visual system. Again, without the requisite information, the system would be unable to make any claims about the apple or its visual properties.
- In AST, the claim about the presence of subjective experience depends on cognitive access to an internal model of attention. That internal model does not provide a scientifically precise description of attention, complete with the details of neurons, lateral inhibitory synapses, and competitive signals. The model is silent on the physical mechanisms of attention. Instead, like all internal models in the brain, it is simplified and schematic for the sake of efficiency.
Accessing the information within these three linked internal models, the cognitive machinery claims that there is a self, an apple, and that the self has a mental possession of the apple. The mental possession is invisible and has no physically descriptive properties, but it has a general location inside the body and a specific anchor to the apple. This mental essence empowers the self to understand, react to, and remember the apple. The brain, as a machine relying on its incomplete and inaccurate model of attention, claims to have a metaphysical consciousness of the apple.
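A hypothetical sketch of these three linked models, continuing the toy Python style above, shows why the resulting claim lacks physical detail: the report function can only read what the models contain, and the attention model simply has no entry describing its own mechanism. The dictionaries and wording are illustrative assumptions, not part of the theory.

```python
# Three linked internal models: a self model, an apple model, and an
# attention model that relates the two but omits all mechanistic detail.

self_model = {"name": "I", "location": "inside this body"}
apple_model = {"kind": "apple", "color": "red", "sheen": "shiny"}
attention_model = {
    "subject": "self",
    "object": "apple",
    "relation": "mental possession",
    "physical_description": None,   # the schema simply lacks this information
}

def cognitive_report(self_m, object_m, attention_m):
    """Cognition reads only what the models contain; missing mechanistic
    detail is reported as an absence of physical properties."""
    claim = (f"{self_m['name']} have a subjective experience of that "
             f"{object_m['sheen']} {object_m['color']} {object_m['kind']}.")
    if attention_m["physical_description"] is None:
        claim += " That experience has no physical properties I can describe; it just is."
    return claim

print(cognitive_report(self_model, apple_model, attention_model))
```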
Subjective experience
In the AST, subjective experience (consciousness, or mental possession of an object or experience) is a simplified construct that describes the act of attending to something.
The internal model of attention is not constructed at a higher cognitive level; it is not a learned, cognitive self-theory. Instead, it is constructed beneath the level of cognition and is automatic (and necessary), much like the internal models of the apple and of the self. The attention schema is a perception-like model of attention, which distinguishes it from higher-order cognitive models such as beliefs or intellectually reasoned theories.
This explains how a machine with an attention schema contains the requisite information to claim to have a consciousness of something, whether of an apple, a thought, or of itself. The machine can understand consciousness in the same way that we do: on accessing its internal information, it finds no explanatory meta-information indicating that it is computing a conclusion or accessing an internal model; it learns only the narrow contents of those internal models. In AST, humans are machines of that sort.
Connection with illusionism
AST is consistent with the perspective called illusionism.[4] The term "illusion", however, may have connotations that are not quite apt for this theory. Three issues with that label arise.
- In the AST, the attention schema is a well-functioning internal model, which is not normally in error. This differs from the common view of an 'illusion' as dismissible or harmful.
- An illusion is often equated with a mirage, falsely indicating a presence that does not actually exist. If consciousness is an illusion, then by implication nothing real is present behind the illusion. But in the AST, consciousness is a good, if detail-poor, account of attention, which is a physical and mechanistic process emergent from the interactions of neurons. When people claim to be subjectively conscious of something, they are reporting a schematized version of a physical reality.
- An illusion is experienced by some agent. When calling consciousness an illusion, one needs to be careful to define what is meant by "experience" so as to avoid circularity. The AST is not a theory of how the brain has experiences, but rather how a machine can make the claim to have experiences. By being stuck in a logic loop, or captive to its own internal information, an intelligent agent cannot avoid making such a claim.
Proposed functions of the attention schema
The central hypothesis in AST is that the brain constructs an internal model of attention, the attention schema. Its primary adaptive function is to enable better, more flexible control of attention. Two main types of functions have been proposed for the attention schema: control of attention[2] and social cognition.
Control of attention
In the theory of dynamical systems control, a fundamental principle is that a control system works better and more flexibly if it constructs an internal model of the item it controls. An airplane autopilot works better if it incorporates a model of the dynamics of the airplane. A climate-control system for a building works better if it incorporates a rich, predictive model of the building's airflow and temperature dynamics. Similarly, the brain's controller of attention should work better by constructing an internal model of what attention is, how it changes over time, what its consequences are, and what state it is in at any moment.
Thus the brain's controller of attention should incorporate an internal model of attention – a set of information that is continuously updated and that reflects the dynamics and the changing state of attention. Since attention is one of the most pervasive and important processes in the brain, the proposed attention schema, helping to control attention, would be of fundamental importance to the system.
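The control-engineering principle invoked here can be made concrete with a short, self-contained Python sketch. It is only an analogy under stated assumptions (a single controlled variable and a constant disturbance), not a model of neural attention: a controller that carries an internal model of the disturbance cancels it, while a purely reactive controller is left with a persistent error.

```python
# Model-based versus purely reactive control of a single variable.

SETPOINT = 0.0
DRIFT = 0.3            # a systematic disturbance acting on the controlled variable

def step(x, u):
    """The real 'plant': the variable responds to the command u but is also
    pushed by a drift that the controller may or may not model."""
    return x + u + DRIFT

def reactive_controller(x):
    """No internal model: just push back against the current error."""
    return SETPOINT - x

def model_based_controller(x, internal_model_of_drift=DRIFT):
    """Internal model: predict the disturbance and compensate for it."""
    return SETPOINT - x - internal_model_of_drift

for name, controller in [("reactive", reactive_controller),
                         ("model-based", model_based_controller)]:
    x = 1.0
    for _ in range(20):
        x = step(x, controller(x))
    print(f"{name:12s} controller, final error: {abs(x - SETPOINT):.3f}")
# The reactive controller settles with a residual error equal to the drift;
# the model-based controller drives the error to zero.
```

By analogy, AST proposes that the attention schema plays the role of this internal model, giving the brain better and more flexible control over its own attention.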
A growing body of behavioral evidence supports this hypothesis. When subjective awareness of a visual stimulus is absent, people can still direct attention to that stimulus, but that attention loses some aspects of control: it is less stable over time and less adaptable when perturbations are introduced during training.[2][6] These initial findings support the proposal that awareness acts like the internal model used for the control of attention.
Social cognition
A second proposed function of an attention schema is social cognition – using the attention schema to model the attentional states of others as well as of ourselves.[1] In effect, just as humans attribute awareness to themselves, they also attribute it to others.
An advantage of this use of an attention schema is in behavioral prediction, which can aid the survival of intelligent social agents. An agent's attention shapes its behavior: what an agent attends to, it is likely to act on, and vice versa. An internal model of attention, its dynamics, and its consequences would therefore be useful for predicting behavior.
An intelligent agent can also plan its own future partly by predicting its own actions.
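As a rough illustration of this predictive use, the following hypothetical Python fragment attributes a state of attention to another agent from an observable cue and derives a behavioral prediction from it. The gaze cue and the prediction rule are assumptions made for the example, not claims of the theory.

```python
# Attribute attention to another agent and predict its behavior.

def attribute_attention(observed_gaze_target):
    """Model the other agent's attention from an observable cue (here, gaze)."""
    return {"attending_to": observed_gaze_target, "aware": True}

def predict_behavior(attributed_schema):
    """What the agent attends to, it is likely to act on."""
    return f"likely to act on the {attributed_schema['attending_to']}"

other_agent_schema = attribute_attention("apple")
print("Prediction:", predict_behavior(other_agent_schema))
# -> Prediction: likely to act on the apple
```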
Research into AST therefore focuses on the overlap between one's own claims of awareness and one's attributions of awareness to others. Initial research using brain scanning in humans suggests that both processes recruit cortical networks that converge on the temporoparietal junction.[7][8]
Analogy to the body schema
AST was developed in analogy to the psychological and neuroscientific work on the body schema, an area of research to which Graziano contributed heavily in his previous publications.[1] In this section, the central ideas of AST are explained by use of the analogy to the body schema.
Example
Suppose a person, Kevin, has reached out and grasped an apple, and is asked what he is holding. He can say that the object is an apple, and can describe its properties. This is because Kevin's brain has constructed a schematic description of the apple, here called an internal model. This internal model is a set of information, about size, color, shape, and location, that is constantly updated as new signals are processed. This model allows Kevin's brain to react to the apple and even predict how it may behave in different circumstances.
Kevin's brain has constructed an apple schema. His cognitive and linguistic processors can access this internal model of an apple, and thus he can answer questions about it.
Now Kevin is asked, "How are you holding the apple? What is your physical relationship to the apple?" Once again Kevin can answer.
The reason is that, in addition to an internal model of the apple, Kevin's brain also constructs an internal model of his body, including his arm and hand. This internal model (the body schema) is a set of information, constantly updated as new signals are processed, about the size and shape of Kevin's limbs, how they are hinged, how they tend to move, and their state at each moment and in the near future.
The primary purpose of this body schema is to allow Kevin's brain to control movement. Because he knows the state that his arm is in, he can better guide its movement. A side effect of this internal body schema is that he can explicitly talk about his body. His cognitive and linguistic processors can access this body schema, and therefore Kevin can answer, "I am grasping the apple with my hand, while my arm is outstretched."
However the body schema is limited. If Kevin is asked, "How many muscles are in your arm? Where do they attach to the bones?" he cannot answer based on his body schema. He may have intellectual knowledge learned from a book, but he has no immediate insight into the muscles of his particular arm. The body schema is a reduced model which lacks that level of mechanistic detail.
AST takes this analysis one step further, and includes Kevin's ability to pay attention to the apple, in addition to physically grasping it.
By AST's definition of attention, Kevin's brain has focused processing resources on the apple. The internal model of the apple has been 'boosted in strength', and as a result Kevin's brain processes the apple more deeply and is more likely to store it in memory or to trigger a response to it. In this definition, attention is a mechanistic, data-handling process: a relative deployment of processing resources to a specific signal.
Now if Kevin is asked, "What is your mental relationship to the apple?", he can answer this question too. According to AST, this is because Kevin's brain constructs not only an internal model of the apple and his body, but also an internal model of his attention. This attention schema is a set of information describing what attention is, its basic properties, dynamics, consequences, and its state at a particular moment. Kevin's cognitive and linguistic machinery has access to this internal model, and therefore Kevin can describe his mental relationship to the apple. However, just as in the case of the body schema, the attention schema lacks information about its mechanistic details. It does not contain information about neurons, synapses, or electrochemical signals that make attention possible. As a result, Kevin reports an experience lacking clear physical attributes. He says, "I have a mental grasp of the apple. That mental possession, in and of itself, has no physical properties. It just is. It's vaguely located inside me. It is what allows me to know about that apple, remember it, and react to it. It's my mental self taking hold of the apple – my experience of the apple."
Here Kevin describes a subjective, experiential consciousness of the apple, which seems (from his point of view) to transcend any physical mechanism. Again, this is only because it is an incomplete description of the physical reality: Kevin's account of his consciousness is a simplified, schematic description of his state of attention.
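The contrast drawn in Kevin's example, between a mechanistic account of attention and the schematic account he can actually report, can be summarized in one more hypothetical Python fragment. The numbers and the report wording are illustrative assumptions.

```python
# Mechanistic level: attention as a relative allocation of processing
# resources among competing signals (a stand-in for the neural competition).
signals = {"apple": 2.0, "chair": 0.5, "window": 0.5}
total = sum(signals.values())
allocation = {item: strength / total for item, strength in signals.items()}

# Schematic level: the attention schema records only a coarse summary and is
# silent on the competition, the arithmetic, and the underlying signals.
attention_schema = {"focus": max(allocation, key=allocation.get)}

print("Mechanistic allocation:", allocation)
print(f"Kevin's report: I have a mental grasp of the {attention_schema['focus']}; "
      "it has no physical properties that I can describe.")
```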
The example given above relates to a consciousness of an apple. The same reasoning can be applied to other concepts, such as consciousness of a sound, a memory, or oneself as a whole.
See also
- Integrated information theory of consciousness
- Global workspace theory of consciousness
References
- [1] Graziano MS (2013). Consciousness and the Social Brain. OUP USA. ISBN 978-0-19-992864-4.
- [2] Webb TW, Graziano MS (2015). "The attention schema theory: a mechanistic account of subjective awareness". Front Psychol. 6: 500. doi:10.3389/fpsyg.2015.00500. PMC 4407481. PMID 25954242.
- [3] Graziano MS (2016). "Consciousness Engineered". Journal of Consciousness Studies. 23 (11–12): 98–115.
- [4] Frankish K (2016). "Not Disillusioned: Reply to Commentators". Journal of Consciousness Studies. 23 (11–12): 256–289.
- [5] Graziano MS (2017). "The Attention Schema Theory: A Foundation for Engineering Artificial Consciousness". Frontiers in Robotics and AI. 4: 60. doi:10.3389/frobt.2017.00060.
- [6] Webb TW, Kean HH, Graziano MS (2016). "Effects of Awareness on the Control of Attention". J Cogn Neurosci. 28 (6): 842–851. doi:10.1162/jocn_a_00931. PMID 26836517.
- [7] Kelly YT, Webb TW, Meier JD, Arcaro MJ, Graziano MS (2014). "Attributing awareness to oneself and to others". Proc. Natl. Acad. Sci. U.S.A. 111 (13): 5012–5017. doi:10.1073/pnas.1401201111. PMC 3977229. PMID 24639542.
- [8] Webb TW, Igelström KM, Schurger A, Graziano MS (2016). "Cortical networks involved in visual awareness independent of visual attention". Proc. Natl. Acad. Sci. U.S.A. 113 (48): 13923–13928. doi:10.1073/pnas.1611505113. PMC 5137756. PMID 27849616.
Further reading
- Graziano MS, Kastner S (January 2011). "Human consciousness and its relationship to social neuroscience: A novel hypothesis". Cogn Neurosci. 2 (2): 98–113. doi:10.1080/17588928.2011.565121. PMC 3223025. PMID 22121395.
- Michael S. A. Graziano (19 September 2013). Consciousness and the Social Brain. OUP USA. ISBN 978-0-19-992864-4.
- Webb TW, Graziano MS (2015). "The attention schema theory: a mechanistic account of subjective awareness". Front Psychol. 6: 500. doi:10.3389/fpsyg.2015.00500. PMC 4407481. PMID 25954242.
- Graziano M, Webb T (2017). "From Sponge to Human: The Evolution of Consciousness" (PDF). Evolution of Nervous Systems. pp. 547–554. doi:10.1016/B978-0-12-804042-3.00098-1. ISBN 9780128040966.
- Graziano MS, Guterstam A, Bio BJ, Wilterson AI (September 2019). "Toward a standard model of consciousness: Reconciling the attention schema, global workspace, higher-order thought, and illusionist theories". Cogn Neuropsychol. 37 (3–4): 155–172. doi:10.1080/02643294.2019.1670630. PMID 31556341. S2CID 203441429.
- Michael S. A. Graziano (17 September 2019). Rethinking Consciousness: A Scientific Theory of Subjective Experience. W. W. Norton. ISBN 978-0-393-65262-8.
External links
- How Consciousness Works. And Why We Believe in Ghosts (21 August 2013) – Michael Graziano – Aeon
- Consciousness and the Unashamed Rationalist (30 August 2013) – Michael Graziano – HuffPost
- Are We Really Conscious? (10 October 2014) – Michael Graziano – The New York Times
- Can We Make Consciousness into an Engineering Problem? (10 July 2015) – Michael Graziano – Aeon
- Rethinking Consciousness: A Q&A with Michael Graziano (29 July 2015) – Evan Nesterak & Michael Graziano – The Psych Report
- What is Consciousness? Dr. Michael Graziano on Attention Schema Theory (4 February 2018) – Isabel Pastor Guzman & Michael Graziano – Brain World