KEYWORDS: psychological complementarity, causality, consciousness,
first person, third person, self, thing-itself, mind, conscious process
Abstract of 1991 target article: Investigations of the function of consciousness in human information processing have focused mainly on two questions: (1) where does consciousness enter into the information processing sequence and (2) how does conscious processing differ from preconscious and unconscious processing. Input analysis is thought to be initially "preconscious," "pre-attentive," fast, involuntary, and automatic. This is followed by "conscious," "focal-attentive" analysis which is relatively slow, voluntary, and flexible. It is thought that simple, familiar stimuli can be identified preconsciously, but conscious processing is needed to identify complex, novel stimuli. Conscious processing has also been thought to be necessary for choice, learning and memory, and the organization of complex, novel responses, particularly those requiring planning, reflection, or creativity. The present target article reviews evidence that consciousness performs none of these functions. Consciousness nearly always results from focal-attentive processing (as a form of output) but does not itself enter into this or any other form of human information processing. This suggests that the term "conscious process" needs re-examination. Consciousness appears to be necessary in a variety of tasks because they require focal-attentive processing; if consciousness is absent, focal-attentive processing is absent. From a first-person perspective, however, conscious states are causally effective. First-person accounts are complementary to third-person accounts. Although they can be translated into third-person accounts, they cannot be reduced to them.
The commentaries by Neill, Glicksohn, Navon, Habibi & Bendele, and Rao largely address different issues. Neill challenges my claim that focal attention replaces consciousness in third-person perspective models of processing (section 1). Glicksohn and Navon agree with me that phenomenal awareness cannot be reduced to information processing (section 2), although much depends on how one defines "consciousness" or "awareness". Rao argues that "consciousness" and "awareness" are equally ambiguous, and defines "consciousness" in terms of "subjectivity" (section 3.1); Glicksohn attempts to show that the three senses in which a process can be said to "be conscious" (in my target article) reduce, without loss, to one sense (section 3.2); by contrast, Navon suggests that "awareness" has a dual sense (section 3.3). I oppose these suggestions. A recurring theme in my reply to the commentaries is the observer-relative nature of consciousness and its causal effects, resulting in a "psychological complementarity principle" and an account of "mixed-perspective" explanations. This provides a nonreductionist alternative to Neill's reduction of consciousness to the contents of working memory (section 1). It also enables a simple explanation of Libet's paradoxical findings regarding the subjective time of occurrence of events, which contrasts with Habibi & Bendele's attempt to account for the same findings in terms of relativity theory (section 4). Rao's thoughtful exploration of my suggestions provides a welcome opportunity to expand on the nature of psychological complementarity and its implications for the causal nature of consciousness (sections 5 and 6). References to my target article and original reply to commentaries are indicated by TA and R respectively.
1. Does focal attention replace consciousness as a causal agent in processing? In cognitive psychology it is commonly taken for granted that focal attention is a causal agent in processing. However, according to Neill, I am mistaken in adopting this assumption. In his view, focal attention might facilitate the processing of a selected stimulus (A) relative to a competing stimulus (B), or it might inhibit the processing of competing stimuli, enabling selected ones to be processed without interference. For Neill, facilitation of (A) relative to (B) may be said to causally influence (A), but inhibition of (B) does not causally influence (A) - it merely "enables" it. Neill reviews some interesting evidence from negative priming experiments which suggests that inhibition of (B) does occur in some situations. Consequently, he argues that focal attention is not "a causal agent in processing" - and concludes that the critical role of "organizing and executing behaviour towards objects in the environment" is performed by consciousness.
However, Neill's case provides little support for his conclusion. Take, for example, his distinction between "enabling" and "causing". According to Neill, inhibition of (B) enables (A) but does not causally influence it. But selective attention theories commonly assume that inhibition of nonselected stimuli enables additional processing resources to be focused on selected stimuli, thereby altering the way they are processed. If so, inhibition of (B) causally influences the processing of (A), albeit indirectly. Even if the processing options available for (A) were to remain unchanged by inhibition of (B), inhibition of (B) would cause the system as a whole to respond preferentially to (A). Consequently, even if focal attention did nothing more than inhibit nonselected stimuli, it would be a causal agent in processing.
More importantly, the tactic of Neill's argument against focal attention having a causal role is to redefine "focal attention" in a way not intended in my target article. He simply implies at the outset that the only role of focal attentive processing is stimulus selection - for example, in his assertion that "In the absence of a competing stimulus B, processing of A may not require attention at all". Having eliminated focal attention from post-selection processing, Neill hopes to leave the field free for consciousness. However, since their inception, models of selective attention have been concerned not only with how and when stimulus selection takes place, but also with the way the processing of selected stimuli differs from that of nonselected ones (see, for example, Broadbent, 1958; Cherry, 1953). Traditionally, stimuli that have been selected are thought to be subject to "focal attentive processing", which is relatively slow, limited in processing capacity, voluntary and flexible. Nonselected stimuli are given pre-attentive processing which is fast, greater in processing capacity, automatic and inflexible. In my target article, I have argued that pre-attentive processing may be more sophisticated than generally thought (TA, 2.1, 2.2). Nonetheless, I have adopted the conventional view that focal attentive processing follows stimulus selection, rather than being confined to selection, as Neill appears to assume (see review of attention theories in TA, 1.1 to 1.4). As pointed out in TA, 3, stimulus selection is preconscious. If so, consciousness cannot be responsible for stimulus selection. Consequently, it would have made no sense for me to claim that 'focal attention' carries out the functions attributed to consciousness in stimulus selection (and I made no such claim).
In common usage, "focal attentive processing" refers to whatever processing takes place for stimuli currently at the focus of attention. The nature of such processing is still under active investigation. It might, for example, be involved in the integration of stimuli (La Berge, 1981; Kahneman & Treisman, 1984), the dissemination of processing results throughout the processing system (Baars, 1989; Kahneman & Treisman, 1984) and the control of complex, novel responses (see TA, 5). Neill does not deny that after stimuli are selected, further complex processing takes place; he simply chooses not to call that processing "focal attentive" - and then goes on to confuse a difference in definition with a difference in theoretical substance. For example, Neill opts for a "production system" model of post-selection processing, consisting of a procedural knowledge base, a declarative knowledge base, and a working memory. Whether or not this model turns out to be the best way of specifying the nature of "focal attentive processing" (in the sense I intended) is tangential to whether or not such processing performs all the functions attributed to consciousness.
The target article argued that the contents of consciousness are a late-arising product of focal attentive processing, and should not be confused with the processing itself. By contrast, Neill goes on to equate working memory with consciousness. In this he fails to address the different senses in which a process may (or may not) be said to "be conscious" (see section 3.2 below). He also ignores the extensive case against reducing the phenomenology of consciousness to a state or function of the brain, discussed in the target article, the commentaries, and the reply (see also Velmans, 1990; 1992a,b; 1993).
As an alternative to reductionism, I have argued that each conscious experience has a distinct neural correlate (a neural representational state) which encodes the same information as that which is manifest in conscious experience, albeit in a different format. Each neural representation "is an integrated state, combining information relating to current events, with expectations, needs, goals, and so forth, in a way that allows the dissemination of that information throughout the processing system" (R9.3, p717). If Neill's model of post-selection processing is correct, such neurally encoded information may reside in "working memory". From a third-person, external observer's perspective, such neurally encoded information performs the processing functions which, from a first-person perspective, are attributed to consciousness (for example, it would encode the ongoing results of the arithmetic operations discussed by Neill). However, first-person causal accounts remain complementary to third-person accounts and cannot be reduced to them. Neill's bland reductionism ignores these complex relationships; but they are discussed in TA9.3, R9, and I discuss them further in sections 3.3, 4 and 5 below.
Neill closes his commentary on a rhetorical note, claiming that without consciousness human behaviour would be controlled by the strength of S-R connections, or by mere habit and instinct. This ignores the extensive evidence (reviewed in TA 1 to 7) for sophisticated, flexible, nonconscious processing in all the main phases of human information processing ranging from encoding, storage, retrieval and transformation, to output.
2. Dissociating consciousness from human information processing. The differences between the theoretical positions of Navon, Glicksohn and my own are more subtle. For example, Glicksohn accepts that "One must distinguish between the constructs of cognition and consciousness," that "preconscious processing seems to predominate" (in the brain), and that it is not possible to put consciousness into an information processing 'box'. In Navon's view, "nobody would dispute ... that 'phenomenal experience is something that a person has, and that it does not make sense to attribute it to any construct in an information processing model, high-level as it may be.'" I reached the same conclusions in my target article - although many others would dispute these claims (see, for example, target article commentaries by Mandler, Baars, Block, Sloman, the commentary by Neill above, and many other forms of functionalism mentioned in TA Note 2, p668). I also agree with Navon that having phenomenal experience is "not required for having information", that "information is the functional currency a mind deals with" (viewed from a third-person perspective), and that "experience is a phenomenon that neither universally emerges from the processes of computation nor is functionally required for them" (this is one of the main thrusts of my target article).
Glicksohn doubts that theories of consciousness actually advance. Rather, "We have been running a fast race along a short round track, rediscovering problems and part solutions along the way, inflating the literature with various notions of consciousness and cognition without being able to agree on our definitions." However, the dissociation of consciousness from information processing (agreed on by Glicksohn, Navon, and myself) is neither circular nor trivial. It poses problems for reductionism, and it suggests that information processing models that deal solely with the way input is converted into behavioral output are incomplete - surely a matter of some interest to cognitive psychology and philosophy of mind! If the suggestion were trivial, it would hardly have upset so many of our esteemed colleagues (see the commentaries following the target article).
3. Disputes about definitions. Glicksohn is right to note that we cannot agree on our definitions. This arises, for example, in some of my current points of dispute with Glicksohn, Navon, Rao, and Neill (discussed above). This is to be expected; definitions are "theory laden"; our different definitions reflect fundamental differences in theory.
3.1 Definitions of "consciousness". Rao reminds us that the term "consciousness" can be used to refer to many different things. In my target article I refer to consciousness in the sense of "awareness", or "experience" - the most common everyday usages of the term. The main contrast I draw is between conscious events and nonconscious events. Events of which we are conscious or aware, or which we experience, are contrasted with events of which we are not conscious or aware, and which we do not experience. In my usage, one cannot be "aware" of a nonconscious event; one can speak of "subliminal perception" and "unconscious memories", but "subliminal awareness" and "unconscious experience" are contradictions in terms. This makes it possible to define the various senses in which a process may be said to "be conscious" with precision (see 3.2 below).
According to Rao, the term "awareness" is no less ambiguous
than "consciousness". However, the examples he cites largely refer to the varied things of which one can be aware (inner events in "introspective awareness", external events in "perceptual awareness", and so on). They do not bear on the legitimacy of identifying consciousness with awareness or on the conscious/nonconscious distinction that I draw. Rao takes the defining feature of consciousness to be "subjectivity". I agree that human consciousness is essentially subjective, but in my own ontology, "subjectivity" contrasts with "intersubjectivity" and "objectivity". One can ask, for example, whether reports of conscious experience are exclusively subjective, or whether they can also be intersubjective or objective. I examine these questions in depth in Velmans (1993). However, the subjectivity of consciousness is tangential to the main issue addressed in the target article. The question was not "Is human information processing subjective?" but "Is human information processing conscious?"
The confusion that follows from Rao's redefinition becomes apparent in his consequent suggestion that "It may be trivially true that awareness follows from focal-attentive processing, if ... Awareness ... is no more than a quality bestowed on experience" and that "Awareness as a quality of experience can hardly be regarded as a process. It is not surprising therefore that in this sense consciousness does not enter into information processing in any causative role." But in my usage, awareness is not a "quality" of experience any more than experience is a "quality" of awareness; these terms are co-extensive. My suggestion was that focal attentive processing has a causal role in converting stimuli from being preconscious to conscious, not that focal attentive processing converts "non-aware experience" to "aware experience" (it makes no sense to speak of "non-aware experience"). The association of consciousness with late-arising products of focal-attentive processing may turn out to be erroneous, but it is not trivial. Current understanding of how pre-attentive processing differs from focal-attentive processing, and of the detailed stages of focal-attentive processing, is incomplete, but it is nevertheless a hard-won empirical advance.
3.2 Definitions of a "conscious process". In TA 9.1, I suggest that the term "conscious process" requires re-examination, as common usage frequently conflates three meanings of this term. In the psychological literature a process is sometimes said to be "conscious"
(a) in the sense that one is conscious of the process
(b) in the sense that the operation of the process is accompanied by consciousness (of its results) and
(c) in the sense that consciousness enters into or causally influences the process.
For example, thinking (but not visual perception) may be conscious in sense (a). Both thinking and visual perception (but not subliminal perception) may be conscious in sense (b). According to the target article, no information processing is conscious in sense (c). If valid, these distinctions have obvious relevance to the question my target article addressed (Is human information processing conscious?).
Glicksohn, however, denies the need for such distinctions. For him, a "conscious process" is just "a cognitive process whose contents enter awareness." In my view, this confounds the information contents of a process with its detailed operations. For example, according to Glicksohn, answering the question "what is your name?" is nonconscious, because the name simply pops out of memory with no awareness of how it is done. I agree. But the process is conscious in the sense that the output is conscious (one is conscious of the name) - unlike, say, memory in an implicit memory task. Such important differences in processing need to be reflected in definitional distinctions (recalling one's name is nonconscious in sense (a), but conscious in sense (b)). Glicksohn's definition also sidesteps the issue of whether consciousness causally influences processing - a central question addressed by the target article and reply.
3.3 Defining the function of "awareness" and its relation to "information". Navon, by contrast, introduces two distinct usages of the term "awareness". He suggests that "it makes a difference whether we ask what is the function of awareness in the sense of information or we ask what is the function of phenomenal experience proper." (my italics). The former enters into information processing; the latter does not. While I agree that information needs to be distinguished from phenomenal experience, to call these both "awareness" muddles the issue. For example, Navon goes on to identify the function of "awareness in the sense of information" with information dissemination. But Navon (1991) himself points out that information dissemination "can also be presented in terms of information processing without any reference to consciousness." So why should one call this "consciousness" or "awareness"?1
Having separated information from phenomenal experience conceptually, Navon suggests that the two are actually confounded "in any empirical investigation of the issue" - for example, in my target article review. Much of the review, however, examined the surprising sophistication of nonconscious and preconscious processing, in which the dissociation of information processing from consciousness is clear. Admittedly, dissociating consciousness from its own neural correlates is a hard nut to crack (see the discussion of blindsight and similar phenomena in TA, 8, p665). But the strategy of attempting experimental dissociations enables one to close in on the processes most closely associated with consciousness - and I agree with Navon that, on the present evidence, these seem to have to do with information dissemination, a late stage of focal-attentive processing.
Having denuded phenomenal experience of any information content, Navon is forced to be obscure about its function. He concludes that, "Actually it might have a function, but not at the level that Velmans is interested in. As argued here, any particular phenomenal experience may not affect the processing to which it corresponds at a given moment. However, that does not entail that having the property of being capable of phenomenal experience in general does not have any function for the being that has that property. But that is a topic for another discussion."
I agree that conscious experience may have some general function for the being that has it, for without it few humans would wish to go on living (R9.4, p718). However, my conclusions regarding the functions of particular experiences are very different to those of Navon. According to R9.3, p717, each experience has a distinct neural correlate which encodes identical information. From a first-person perspective this information is displayed in the form of a structured, phenomenal experience (a conscious representation of some event); from a third-person perspective, information (about the same event) is displayed (we assume) in a neural format. Viewed from a first-person perspective, particular experiences have innumerable causal influences on subsequent experiences and behaviour. Viewed from a third-person perspective the same influences may be accounted for in the way (corresponding) neurally encoded information enters into subsequent human information processing. Consequently, my target article conclusion was "not that consciousness is causally ineffective. Viewed from a first-person perspective consciousness is central to the determination of human action" (TA, p667). The mistake is to look for a first-person function in a third-person information processing model of the brain (R9, p716-718).
4. Observer-relativity and the theory of relativity in physics. The causal effects of consciousness are observer-relative. Viewed from a third-person perspective, epiphenomenalism appears true; viewed from a first-person perspective, epiphenomenalism appears false. From a third-person external observer's perspective the causes of a subject's actions appear to be explainable entirely in terms of how those causes appear when observed from that perspective, in terms of neurally encoded information, brain processing and so on. From the subject's perspective the causes of the same actions may be entirely explainable in terms of what he experiences. Such differences are to be expected whenever the external observer and the subject have asymmetrical access to the events which determine the subject's actions. If the causal determinants are within the subject's body or brain, the external observer and subject are located differently with respect to the events in question, and the observing systems they employ are different; the external observer, for example, uses exteroceptors, supplemented perhaps by instrumentation; the subject must rely on biologically given interoceptive systems. Neither perspective is automatically privileged or incorrigible (cf Velmans, 1993; R9, p716-718).
According to Habibi & Bendele, Einstein's relativity theory may provide a simpler way of understanding observer-relative effects, at least in the case of Libet's findings regarding the temporal order of perceived events. As they note, the temporal ordering of events in physical space may be entirely dependent on the distance of an observer from the events, given that the speed of light is fixed; for example, if events A and B are simultaneous from the perspective of an observer equidistant from A and B, then for an observer closer to A, A precedes B, and for an observer closer to B, B precedes A.
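The light-travel-time reasoning Habibi & Bendele appeal to can be sketched in a few lines of code. This is purely illustrative; the speed, distances, and observer positions below are invented numbers chosen to make the point, not values from their commentary:

```python
C = 1.0  # signal speed in arbitrary units (distance per unit time)

def arrival_time(emission_time, distance, speed=C):
    """Time at which a signal emitted at emission_time, from a source at
    the given distance, reaches the observer."""
    return emission_time + distance / speed

# Events A and B occur simultaneously (t = 0).
# An observer equidistant from both receives the two signals together...
equidistant = (arrival_time(0, 10), arrival_time(0, 10))
# ...but an observer closer to A receives A's signal first, so for that
# observer A appears to precede B.
closer_to_A = (arrival_time(0, 4), arrival_time(0, 16))

assert equidistant[0] == equidistant[1]
assert closer_to_A[0] < closer_to_A[1]
```

The perceived temporal order thus depends only on observer position and a fixed signal speed, which is the sense in which Habibi & Bendele hold that such differences involve no paradox.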
There is no paradox in such differences; nor, according to Habibi & Bendele, is there any paradox in Libet's data. From a first-person perspective, the urge to initiate a voluntary action precedes the action, but Libet found that the readiness potential preceded the urge by around 350 milliseconds. Habibi & Bendele claim that this data "may be viewed as data on perception of temporal order on two events: the urge to initiate action and the initiation of the voluntary action itself. The data are that the experimenter's perception of the temporal order of events differs from that of the subject's." If so, there are no consequences one way or the other for the causal status of consciousness.
In assessing this claim, one should note that Libet (1985) did not find a different temporal ordering of the urge and the initiation of action, when viewed from a third- versus a first-person perspective. From a first-person perspective the urge preceded the initiation of action; from a third-person perspective the unconscious readiness potential preceded the (reported) urge, but the urge still preceded the initiation of action (by around 150 milliseconds).2 The central point I stressed in my discussion of this data (TA 3, p658) was that the readiness potential nevertheless preceded the urge. From a third-person perspective, readiness to act is signalled by the readiness potential; if this precedes the (reported) conscious urge, preparedness to act must be preconscious. From a first-person perspective, preparedness to act is manifested not as a readiness potential, but in the form of a conscious volition or urge. Hence, from a first-person perspective conscious volition appears causal, whereas from a third-person perspective it does not.
However, apparent time reversals do occur; for example, reaction time to a tactile stimulus may be as little as 100 milliseconds, but awareness of that stimulus does not arise until at least 200 milliseconds after it projects onto the cortical surface. So, from the perspective of the experimenter, the subject's awareness appears to arise after his response. For the subject, on the other hand, awareness of the stimulus occurs before his response. Libet explains the latter by suggesting that the experienced time of occurrence of the stimulus is subjectively referred "backwards in time" to the instant it first projected onto the cortical surface.
Habibi & Bendele hope to account for such data purely in terms of differences in experimenter versus subject 'location'. If, for example, the experimenter is observationally 'closer' to the cortical activity associated with initiating an action than to the urge to act, then the initiation of action will seem to precede the urge; similarly, if the subject is observationally 'closer' to the urge than the initiation of action, then for the subject, the urge will seem to precede the initiation of action. In their various references to the "relativistic nature of time" Habibi & Bendele clearly intend their relativistic account to be taken literally. However, relativity theory cannot plausibly be applied to Libet's data, where observed differences in the time of occurrence of a given event (as observed by the experimenter and the subject) may be in the order of hundreds of milliseconds. For example, in Libet's 1985 experiment above, the readiness potential observed by the experimenter preceded the urge to act experienced by the subject, by around 350 milliseconds. Let us suppose the readiness potential and the urge are actually observations of the same event, which appear temporally different due to the relativistic nature of time. If the experimenter and subject are stationary in relation to each other and the speed of light is 186,000 miles per second, this discrepancy would occur only if the experimenter were 65,100 miles 'closer' to the events in the subject's brain than the subject (not a suggestion to be taken seriously)!
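The arithmetic behind the 65,100-mile figure is simply the distance light travels in 350 milliseconds:

```python
SPEED_OF_LIGHT_MPS = 186_000  # miles per second (approximate)
discrepancy_s = 0.350         # readiness potential precedes the urge by ~350 ms

# Difference in distance required for light-travel time alone
# to produce a 350 ms discrepancy between the two observers
required_distance = SPEED_OF_LIGHT_MPS * discrepancy_s
print(required_distance)  # 65100.0 miles
```

Since no experimenter sits tens of thousands of miles 'closer' to a subject's brain than the subject, a literally relativistic reading of the data fails.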
If Habibi & Bendele intend the speed of neural impulses to be substituted for the speed of light, their suggestion that observed time of occurrence depends on the 'location' of the subject's consciousness vis-à-vis the location of an external observer must be judged on its own merits (rather than on the merits of relativity theory). Read this way, their suggestion still faces serious problems. At the end of their commentary, for example, they write that "consciousness is treated as a sensor at a location at which a signal arrives, similar to points on the scalp recording cortical activity". However, in their explanation of relativistic effects in the brain, they find it difficult (not surprisingly) to specify a location for consciousness (in their Figure 1). More to the point, some of Libet's findings on subjective time of occurrence cannot be understood in terms of the time taken for neural impulses to travel to a consciousness 'location'. Libet et al. (1979), for example, found that whether a stimulus arriving at the cortex was referred backward in time depended on whether its arrival at the cortex was "marked" by an early evoked potential. Only "marked" stimuli were referred backwards in time. Assuming that the 'distance' of consciousness from the cortical surface is fixed, this cannot be explained in Habibi & Bendele's terms.
My own account of such first- versus third-person perspectival differences is consistent with the account given by Libet. As noted above, the experimenter and the subject are not only located differently with respect to events taking place within the subject, they also rely on different observing "apparatus" which accesses such events in different ways. What the experimenter and the subject observe results from the way information relating to a given event is processed by their observing apparatus. For the experimenter, preparedness to act (in Libet's 1985 experiment) takes the form of a readiness potential which he observes via his visual system, supplemented by physical equipment. The interoceptive system of the subject represents the same event in the form of a conscious urge or volition. Consequently, from the experimenter's point of view the initiating cause of action appears to be the readiness potential; from the subject's point of view the initiating cause appears to be the experienced urge. Neither representation is "true" at the expense of the other. They are simply alternative representations of the same event. The subjective representation (of an urge) is delayed (from the experimenter's perspective) because the subject's brain requires time to construct a conscious experience.
Similarly, for input stimuli, the brain requires time to identify, select, and otherwise process stimuli prior to updating long-term memory, disseminating the results of prior processing throughout the processing system, and entry of the (processed) stimuli into consciousness. Consequently, a delay between arrival of the stimulus at the cortex and consciousness of that stimulus is unavoidable. In order to represent the time of occurrence of the stimulus accurately the brain needs to compensate for the duration of its own processing operations - a problem analogous to that faced by many forms of physical measuring system whenever the operation of the system substantially alters what is being measured.3 The brain's technique is to use information about when the stimulus arrives at the cortex (provided, according to Libet, by an early evoked potential) to represent the time of occurrence of the stimulus, rather than its arrival time at some 'consciousness location' as Habibi & Bendele propose (see also Dennett, 1991; Dennett & Kinsbourne, 1992 on this point). This neurally encoded cortical arrival time is eventually translated into experienced time. Consequently, for the subject, experienced time of occurrence is the time of arrival at the cortical surface. "Backwards referral in time" takes place from an external observer's point of view only in the sense that the time of arrival at the cortex is represented in the subject's consciousness rather than the observed time of occurrence of the conscious representation itself. A similar "referral" occurs in the conscious representation of space; from an experimenter's point of view, information about spatial location of external stimuli is encoded within the brain - but the subject experiences a three-dimensional phenomenal world outside his brain (see Velmans, 1990; 1992a,b; 1993). 
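The compensation scheme described above can be caricatured as a timestamping rule. This is only a sketch of the logic; the function, its parameters, and the 200 ms delay are my illustrative choices, not quantities from Libet's papers:

```python
def experienced_time_of_occurrence(cortical_arrival_time,
                                   awareness_delay=0.2,
                                   marked=True):
    """Sketch of 'backwards referral': a stimulus becomes conscious only
    after a processing delay, but if its cortical arrival was 'marked'
    (by an early evoked potential), the experience is timestamped with
    the arrival time rather than the time awareness itself arose."""
    awareness_time = cortical_arrival_time + awareness_delay
    if marked:
        # The brain compensates for its own processing delay:
        # experienced time = cortical arrival time.
        return cortical_arrival_time
    # An unmarked stimulus is not referred backwards.
    return awareness_time

# A marked stimulus arriving at t = 0.0 s is experienced as occurring
# at t = 0.0 s, even though awareness arises ~200 ms later.
```

On this view the "referral" is not a movement of anything backwards in physical time; it is simply which timestamp the constructed experience carries.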
Matters only become paradoxical if one forgets that from an experimenter's third-person perspective one only has access, in principle, to the information and processing used by the brain to construct conscious experiences, while from a first-person perspective one only has access to the conscious experiences themselves.
5. Is consciousness causal or isn't it? Rao subjects my account of third-person versus first-person observations to a thorough examination. If consciousness is causal only from a first-person perspective, "Does consciousness have a causal primacy of its own?" Rao accepts that there is "no need to invoke consciousness to explain neural activity or to invoke neural activity to explain conscious experience" (in many circumstances it makes sense to explain conscious experiences in terms of prior experiences). At the same time he accepts that there are many situations in which consciousness and the brain appear to causally interact; it makes perfect sense to investigate the neural causal antecedents of given conscious experiences, and to investigate the ways in which different experiences causally influence the brain, body and behaviour. Taken together, these relationships constitute a "causal paradox" which any adequate theory of consciousness needs to address (R9.2, p716).
My resolution of this paradox introduced a "psychological complementarity" principle and an account of "mixed-perspective explanations". It should be stressed that psychological complementarity (of first- and third-person perspective observations) is not the wave-particle complementarity of quantum mechanics (see TA note 18, p669); in electrons, wave-like and particle-like behaviour are both observable from a third-person perspective. Other differences relating to the mutual "translatability" of the two perspectives are discussed in section 6 below. Unlike Jahn and Dunne (1987) (cited by Rao), I do not assume that "consciousness may have the option of a wave-like or particulate function."
However, in some respects, the parallels with the complementarity principle in physics are exact. In particular, both principles stress that observations cannot be divorced from the conditions of observation. The causes of action observed by a subject and an experimenter may be very different if they observe those causes from very different 'locations' via very different observing apparatus (e.g. exteroceptors versus interoceptors). Take, for example, Libet's paradoxical findings discussed above. From a third-person perspective, preparedness to act may appear in the form of a "readiness potential", whereas from a first-person perspective, it is experienced as an "urge" - hence from one perspective consciousness is causal, and from the other it is not.
But Rao asks, "what does it mean to say that consciousness is causally efficacious from the first-person perspective and not from the third-person perspective? Does it mean that the effects of consciousness can be observed only from the first-person perspective? This question is not answered by the assertion that the observations from the first-person perspective are as "real" as those from the third-person perspective. If conscious states cause effects that can be observed from a third-person perspective, they cannot be explained away as a consequence of our viewing the phenomena from a mixed perspective."
Taking Rao's last point first, it should be noted that while the effects attributed to conscious causes can sometimes be viewed from a third-person perspective, conscious states cannot be observed to cause those effects (from this perspective); conscious states only appear to be causal from a first-person point of view. Consequently, when we do invoke (first-person) conscious causes to explain third-person perspective effects, we are tacitly switching perspectives. Making such "mixed perspective" switches explicit helps to clarify the nature of such explanations; it is not intended to 'explain the phenomena away'.
Understandably, Rao wants to know whether consciousness has causal efficacy or whether it merely appears to have it (from a first-person perspective). In my view, there are two possible answers to this question, depending on whether it is meaningful in this context to distinguish what is observed to be "real" from what is "real". Suppose one asks the same question for wave-particle duality. Are electrons really waves when observed under some conditions and really particles when viewed under other conditions, or does their wavelike nature merely appear to be "real" under some conditions (perhaps electrons are really particles after all)? According to the convention adopted by Niels Bohr (cf. Folse, 1985), observations in physics define the limits of what can be known of the physical world - and physical observations cannot be divorced from the conditions of observation. Consequently, for Bohr, it makes no sense to suggest that electrons only appear to be waves when viewed under certain conditions. If they behave like waves under certain conditions, they are waves (under those conditions), and if under other conditions they behave like particles, they are particles. Observations of wave-like and particle-like behaviour are complementary and mutually irreducible. Taken together, they provide a complete description of electrons.
Suppose now that we could observe the interface of consciousness and the brain - the neural correlates, say, of a visual experience of a cat, out in the world. Such correlates encode the same information as that which is manifest in conscious experience (see above). To an external observer, this neurally encoded information is the subject's representation of the entity out in the world (the cat); to the subject, the entity out in the world is the cat he experiences. In this situation, it does not make sense to ask whether the neural representation is more "real" than the conscious representation - the subject's representation of a cat simply appears differently when viewed from a third- versus a first-person perspective. These third- and first-person observations are complementary and mutually irreducible. Taken together they provide a complete description of the subject's representation (of a cat). On this view, what is observed to be "real" is "real".
On the other hand, if one takes it as axiomatic that observations are always observations of something, then it is meaningful to ask, "What is it that appears as a wave under some conditions and a particle under others?". For Bohr this question went beyond physics, but other physicists (e.g. Bohm, 1952; Bohm & Hiley, 1987) disagree. Similarly, one can ask, "What is it that appears as neurally encoded information to an external observer, but as a conscious experience from a first-person point of view?" For Spinoza, these may be dual aspects of some deeper "nature", for Kant some "noumenon" or "thing-itself", and for Jung some deeper "self". Looked at in this way, first-person and third-person causal accounts may be thought of as alternative representations of a causal process within the "thing itself." This provides an alternative to the "unadulterated psychological determinism" which Rao rejects. In the latter part of his commentary, Rao also moves in this direction, although his claims for the causal efficacy of consciousness do not always clearly distinguish (manifest) conscious experience from its grounding conditions (in the "self" or "thing itself"). For example, Rao appears to be referring to the grounding conditions when he claims that "Subjectivity is not the construction of the brain. Rather it is something we must presuppose as a necessary condition of our experiences. It is the essence of self and consciousness in its most appropriate sense." Later, he blurs the manifest consciousness/grounding conditions distinction in his suggestion that "Consciousness is self-determining and self-manifesting. It is this principle of consciousness that renders it in principle irreducible to brain states." I will not comment on the details of these suggestions as they move well beyond the scope of my target article (see note 4). For Kant, and for Bohr, the nature of the "thing itself" is, in any case, beyond scientific enquiry (a view shared, no doubt, by many readers).
In my own view, however, it is far too early to close this account. If observations are always of something other than themselves, then what is observed to be "real" only represents what is "real" (cf. Velmans, 1990; 1992a,b). If so, we will always be tempted to move beyond current observations to peer more deeply into the nature of the "thing-itself".
6. Is psychological complementarity genuine? Rao also asks for clarification of my suggestion that while first-person accounts can be translated into third-person accounts, they cannot be reduced to them. He suggests that if all first-person accounts can be translated, their significance would be lost (and genuine complementarity would no longer exist). My target article stressed, however, that sentience must be dissociated from functioning within the human brain. While accounts of input-output functioning viewed from a first-person perspective can be translated, in principle, into third-person accounts of processing, the existence of first-person conscious experience (encompassing an entire phenomenal world) provides an irreducible residue. For example, a conscious representation of a cat could be translated into neurally encoded information for the purposes of understanding how such representations enter into cerebral processing, but the phenomenal cat cannot be reduced to a state or function of the brain (cf. Velmans, 1990). As I note in R9.3, p717, "Conscious states no more reduce to neurophysiological ones (or the reverse) than pictures on television screens reduce to the magnetic codings on videotapes that encode identical information."
Rao wonders whether third-person accounts can also be translated into first-person accounts. This is an empirical rather than a theoretical question, as it depends on the extent to which events accessible from a third-person perspective have some manifestation in a subject's experience. As noted above, the apparatus available to subjects to know about their own physical or mental states is biologically given, whereas third-person observations may be aided by physical equipment of many different kinds. Consequently, some events are far more accessible from a third-person perspective, for example, the details of brain processing. Some processes (thinking, problem solving, etc.) are partly available to consciousness, but many are not (for example, the processes involved in stimulus analysis, memory storage and retrieval, and so on). On the other hand, some events are more readily accessed from a first-person perspective. The first indication that all is not well in physical or psychological functioning is usually (but not always) the pain or discomfort experienced; some utopian third-person science might be able to assess when all physical and psychological functions were operating well, but the final test would still have to be whether this 'translated' into an experienced feeling of well-being. Physical instrumentation enables more of nature to be observed from a third-person perspective, but as Shevrin (1991) points out, exploration of one's own (unconscious) nature can also take place from a first-person perspective, using a variety of introspective therapeutic techniques. These include the varied meditation techniques referred to by Rao. In short, the complementarity I intend is a genuine one.
Note 1. This point of difference with Navon has already been discussed in R9.3, p717.
Note 2. Libet (1985; 1991) hoped to salvage a third-person causal role for consciousness within this 150 millisecond period (but see discussion in R1.6, p705).
Note 3. An ammeter, for example, has to draw off current in order to measure current; this is compensated for in the way the ammeter is calibrated.
Note 4. Space limitations also prevent my commenting on Rao's analysis of the interaction of intentionality and attention.
Baars, B.J. (1989) A cognitive theory of consciousness. Cambridge University Press.
Bohm, D. (1952) A suggested interpretation of the quantum theory in terms of hidden variables. Physical Review 85:166-189.
Bohm, D. & Hiley, B.J. (1987) An ontological basis for the quantum theory. Physics Reports 144:323-348.
Broadbent, D.E. (1958) Perception and communication. Pergamon Press.
Cherry, C. (1953) Some experiments on the reception of speech with one and with two ears. Journal of the Acoustical Society of America 25:975-979.
Dennett, D.C. (1991) Consciousness explained. Little, Brown.
Dennett, D.C. & Kinsbourne, M. (1992) Time and the observer: The where and when of consciousness in the brain. Behavioral and Brain Sciences 15:183-200.
Folse, H.J. (1985) The philosophy of Niels Bohr: The framework of complementarity. Elsevier Science Publishers.
Kahneman, D. & Treisman, A. (1984) Changing views of attention and automaticity. In Varieties of attention, ed. R. Parasuraman & D.R. Davies. Academic Press.
La Berge, D. (1981) Automatic information processing: A review. In Attention and performance IX, ed. J. Long & A. Baddeley, pp. 173-186. Erlbaum.
Libet, B. (1985) Unconscious cerebral initiative and the role of conscious will in voluntary action. Behavioral and Brain Sciences 8:529-566.
Libet, B. (1991) Conscious functions and brain processes. Behavioral and Brain Sciences 14:685-686.
Libet, B., Wright, E.W., Jr., Feinstein, B. & Pearl, D.K. (1979) Subjective referral of the timing for a conscious experience: A functional role for the somatosensory specific projection system in man. Brain 102:193-224.
Navon, D. (1991) The function of consciousness or of information? Behavioral and Brain Sciences 14:690-691.
Shevrin, H. (1991) A lawful first-person psychology involving a causal consciousness: A psychoanalytic solution. Behavioral and Brain Sciences 14:693-694.
Velmans, M. (1990) Consciousness, brain and the physical world. Philosophical Psychology 3:77-99.
Velmans, M. (1992a) Synopsis of 'Consciousness, brain and the physical world'. Philosophical Psychology 5:155-157.
Velmans, M. (1992b) The world as-perceived, the world as-described by physics, and the thing-itself: A reply to Rentoul and Wetherick. Philosophical Psychology 5:167-172.
Velmans, M. (1993) A reflexive science of consciousness. In Experimental and theoretical studies of consciousness (Ciba Foundation Symposium No. 174). Wiley (in press).