Velmans, Max (1995). The Relation of Consciousness to the Material World. Journal of Consciousness Studies, 2(3), 255-265.
 

THE RELATION OF CONSCIOUSNESS TO THE MATERIAL WORLD

(a commentary on Chalmers, D. (1995) Facing up to the problem of consciousness, JCS, 2(3), 200-219)
 
 

Max Velmans
Department of Psychology
Goldsmiths
University of London
Lewisham Way
London
SE14 6NW
England

Email m.velmans@gold.ac.uk
URL: http://www.gold.ac.uk/academic/ps/velmans.htm
 

KEYWORDS: consciousness, Chalmers, dualism, reductionism, mind/body problem, dual-aspect, information, functionalism, complementarity, blindsight, cortical implant, qualia
 
 

Summary

Many of the arguments about how to address the hard versus the easy questions of consciousness put by Chalmers (1995) are similar to ones I have developed in Velmans (1991a,b; 1993a). This includes the multiplicity of mind/body problems, the limits of functional explanation, the need for a nonreductionist approach, and the notion that consciousness may be related to neural/physical representation via a dual-aspect theory of information. But there are also differences. Unlike Chalmers I argue for the use of neutral information processing language for functional accounts rather than the term "awareness." I do not agree that functional equivalence cannot be extricated from phenomenal equivalence, and suggest a hypothetical experiment for doing so - using a cortical implant for blindsight. I argue that not all information has phenomenal accompaniments, and introduce a different form of dual-aspect theory involving "psychological complementarity." I also suggest that the hard problem posed by "qualia" has its origin in a misdescription of everyday experience implicit in dualism. 

1. Why consciousness is puzzling.

The relation of consciousness to the material world is a puzzle which has its origin in dualism, a philosophy of mind that posits their fundamental separation. Dualism, in turn, has its roots in folk wisdom. The belief that humans are more than bodies and that there is something in human nature that survives bodily death has its origins in prehistory; it became explicit in the mythology of Ancient Egypt and Assyria and was formulated into a philosophical position in the Platonic thought of Ancient Greece. But the contemporary view that the interaction of consciousness with matter poses a problem which may be beyond scientific understanding can be traced to a clearer formulation of dualism proposed by Descartes.

According to Descartes (1644) the Universe is composed of two fundamentally different substances, res cogitans, a substance which thinks, and res extensa, a substance which extends in space. Res extensa is the stuff of which the material world is made, including living bodies and brains; res cogitans is the stuff of consciousness. Descartes maintained that, in humans, res cogitans and res extensa interact via the pineal gland, located in the center of the brain. However, even in the seventeenth century, the causal interaction of substances as different as these was thought by some to pose an insuperable problem.

Leibniz (1686), for example, argued that only physical events could cause physical events and only mental events could cause mental events. Fortunately, he thought, God has arranged physical events and mental events into a pre-established harmony so that given sequences of mental and physical events unfailingly accompany each other ("parallelism"). Consequently, there is an apparent causal interaction of mind with body rather than an actual one. This view resurfaces in the contemporary assumption that for every conscious experience there is a distinct neural "correlate." However, attribution of such correspondences to the workings of a munificent Deity has little appeal to the modern scientific mind.

Within twentieth century philosophy and science it is far more fashionable to reduce dualism to a form of materialism, for example to assume or attempt to show that consciousness is nothing more than a state or function of the brain (physicalism or functionalism). If either form of reduction is successful the explanatory gap left by dualism disappears, because all that needs to be explained can then be explained within the domain of natural science. Fashion, however, is beginning to change (see, for example, the debates between Dennett, Fenwick, Gray, Harnad, Humphrey, Libet, Lockwood, Marcel, Nagel, Searle, Shoemaker, Singer, Van Gulick, Velmans, and Williams in Ciba Foundation Symposium 174, 1993). The reasons for this are many - but in essence they have to do with the realization that once one has explained everything that there is to explain about the material structure and functioning of brains, one will still be left with the problems of consciousness. To put matters crudely, one cannot find consciousness by any conceivable histological examination of the brain. Nor, as Nagel (1974) puts it, can one know what it is like to be something from a physical description alone. In Velmans (1991a) I have considered functional explanations of consciousness, tracing functional models of the mind from input to output, and concluded that consciousness cannot be found within any information processing "box" within the brain. Consciousness accompanies or results from certain forms of processing but can be dissociated conceptually, and in most cases empirically, from the processes with which it is commonly identified in the cognitive literature (perception, learning, memory, language, creativity and so on). The same can be said of models of functioning couched in other terms, such as parallel distributed processing or the language of neurophysiology (Gray 1995; Velmans 1995a).

In short, while it is likely that consciousness will eventually be found to be associated with given forms of processing, it looks increasingly likely that consciousness cannot be reduced to such processing. Or, to put matters another way, "first-person perspective facts" cannot be fully reduced to "third-person perspective facts" (cf Goldman 1993; Velmans 1991a,b, 1993a). In his recent "keynote" article (this issue), Chalmers (1995) comes to the same conclusion.

But if consciousness cannot be reduced to a state or function of the brain, how might one fill the explanatory gap left by dualism? Logically, it might be possible to reduce matter to forms of existence in the mind, for example to argue along with Berkeley (1710) that material events only exist in so far as they are perceived to exist (idealism). Idealism has its modern defenders, for example in some interpretations of the observer effect in quantum mechanics (the view that the Schrödinger wave function only collapses into an actuality once an observation is made). In the macroworld it may also be true that the world as-perceived only exists if there are perceivers (Velmans 1990). However, as a general theory of the ontology of macroevents this position has its own well-known problems. It might be that the material world cannot have an appearance without perceivers, but it seems counterintuitive that its very existence is similarly vulnerable. Closing one's eyes, for example, does not seem to be enough to make unpleasant events go away.

Given the implausibilities of both materialist and mentalist reductionism, it is important to consider nonreductionist forms of monism (ways of healing the split induced by dualism, without reducing either consciousness or the material world to something other than they appear to be). An early version of this is the "dual-aspect theory" of Spinoza (1677) - the view that mind and matter are manifest aspects of something deeper in Nature, which appear to interact by virtue of some unfolding, grounding process within Nature itself. This view may be thought of as a precursor of the contemporary proposal that consciousness and its correlated brain states may be thought of as dual aspects of a particular kind of "information," which is in turn, a fundamental property of nature (Velmans 1991b; 1993a; Chalmers 1995).

2. The interface of consciousness and brain.

We do not yet have precise knowledge of the events which form the interface of consciousness with the brain. However, in Velmans (1991a,b; 1993a) I considered what the general character of such events might be. Conscious experiences are representational (they are about something); consequently it seems likely that the neural or other physical correlates of conscious experiences will also be representational (i.e. that they will be representational states). It seems plausible to assume that for every distinct conscious experience there will be a distinct neural or other physical correlate. If so, any information (about something) manifest in experience will also be encoded in the correlates in a neural or other physical form. It follows from these assumptions that a given conscious experience and its physical correlates encode identical information, formatted in different ways.

For human beings, of course, the physical correlates of consciousness are formed within living brains with multiple functions, and there has been much research and theory about where, in this complex system, consciousness "fits in" (cf Velmans 1991a). Information at the focus of attention usually enters consciousness (like this sentence). Conversely, information which is not attended to does not enter consciousness (like the pressure of your tongue against your upper teeth). So the relation of consciousness to focal attention is close. But once information at the focus of attention enters consciousness it has already been analyzed. For example, the processes involved in reading and comprehending this text are extremely complex - involving visual pattern recognition, syntactic analysis, semantic analysis, relating of input information to global knowledge in long-term memory, judgements about plausibility, and conversion into some consciously experienced visual form and, if it is accompanied by inner speech (verbal thoughts), into phonemic imagery. But all that enters consciousness is the result of such processing (in the form of seen words, phonemic imagery, feelings of comprehension and so on).

Given this, I have argued that consciousness relates closely to a late-arising product of focal-attentive processing. Clues about the nature of this late-arising product come from those rare situations where focal attention is devoted to input analysis without accompanying consciousness. One striking example is blindsight, a condition in which subjects are rendered blind in one half of their visual field as a result of unilateral striate cortex damage. If stimuli are projected to their blind hemifield, subjects cannot see them in spite of the fact that their full attention is devoted to the task. As they cannot see the stimulus they maintain that they have no knowledge about it. However, if they are persuaded to make a guess about the nature of the stimulus in a forced choice task, their performance may be very accurate. For example, one subject investigated by Weiskrantz et al (1974) was able to discriminate horizontal from vertical stripes on 30 out of 30 occasions although he could not see them. In short, the subject has the necessary knowledge but does not know that he knows. In information processing terms, it is as if one (modular) part of his system has information which is not generally available throughout the system. On the basis of this and other evidence I have argued that consciousness relates closely to information dissemination (Velmans 1991a). Although Chalmers (1995) does not refer to this and related papers, he again makes the same suggestion (a similar suggestion regarding information dissemination has also been made by Baars (1988) and Navon (1991)).
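
To make the notion of undisseminated information concrete, the following toy sketch (an illustration of the general idea only; the class and method names are hypothetical and not drawn from the paper or from Baars' model) shows a system in which a visual module encodes a stimulus that is never broadcast to the rest of the system. Verbal report, which draws only on globally disseminated information, fails; a forced-choice probe that taps the module directly succeeds, so the system "knows" without knowing that it knows.

    # Toy sketch: local encoding without global dissemination.
    # All names here are illustrative assumptions.

    class System:
        def __init__(self):
            self.visual_module = None   # information held locally in one module
            self.workspace = []         # globally disseminated information

        def present_stimulus(self, stimulus, disseminate):
            self.visual_module = stimulus        # encoded locally in either case
            if disseminate:
                self.workspace.append(stimulus)  # made available system-wide

        def verbal_report(self):
            # Report draws only on globally disseminated information.
            return self.workspace[-1] if self.workspace else "I see nothing"

        def forced_choice(self, alternatives):
            # A forced-choice probe taps the module's local information directly.
            return self.visual_module if self.visual_module in alternatives else alternatives[0]

    s = System()
    s.present_stimulus("vertical stripes", disseminate=False)  # 'blind' hemifield
    print(s.verbal_report())    # -> "I see nothing": does not know that it knows
    print(s.forced_choice(["vertical stripes", "horizontal stripes"]))  # correct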

3. Points of agreement with Chalmers (1995)

In sum, the resemblance of Chalmers' main proposals to my own (in Velmans 1991a,b; 1993a) is striking - although the arguments used are sometimes different (and Chalmers puts the case in a very elegant way). I will not quote from my earlier papers in detail; however, points of agreement include:

1) that the mind/body problem is not one problem but many (see also Velmans 1995b, 1996a).

2) that exploration of the physical and functional causes of consciousness can be undertaken by conventional scientific procedures, but that consciousness as such cannot be captured by such procedures.

3) that the proper route through this is to adopt a nonreductive approach, which relates first-person accounts of experience to third-person accounts of physical structure and functioning (psychophysical bridging laws) without seeking to reduce consciousness to brain or vice versa.

4) that consciousness may be fundamental, in the sense that matter and energy are fundamental (if so, a reduction of consciousness to matter is unlikely to be successful).

5) that at the interface of consciousness with the brain, the structure of conscious representations can be related to the structure of physical representations in the brain through the notion of information and, in particular, to information dissemination.

6) that this amounts to a form of dual-aspect theory in which conscious experiences and their physical correlates may be thought of as dual manifestations of something more fundamental than either - the information structure of some thing-itself (there are also some differences between us on this point; more on this in section 4.4 below).

4. Points of disagreement with Chalmers (1995).

4.1 Definitions.

Within his keynote article Chalmers suggests that it would be useful to distinguish "consciousness" from "awareness." The term "consciousness" refers to "experience," "qualia," or "phenomenal experience." "Awareness," he suggests, should refer to phenomena associated with consciousness, such as the ability to discriminate, categorize, and react to environmental stimuli, the integration of information by a cognitive system, and so on (p6). Contributors to this issue have been urged to follow this usage - and, if there is consensus, this might become standard within the field as a whole. Given this, it is important to examine Chalmers' usage with care.

As noted earlier, I have also argued that consciousness in the sense of "experience" should be clearly distinguished from the information processing functions with which it is commonly associated (Velmans 1991a); however, Chalmers' choice of the term "awareness" for such functions is unfortunate. In common usage, the terms "consciousness," "awareness," and "experience" are often interchangeable, as are the terms "conscious awareness" and "conscious experience." More seriously, the use of the term "awareness" for information processing functions is not theoretically neutral. It suggests (subtly) that information processes associated with consciousness are themselves in some sense "sentient." Indeed, Chalmers goes on to argue that information processing of certain kinds is inevitably associated with consciousness of certain kinds, whether or not such processes are embodied in the brain (p19). This might, or might not, be so (we return to this below). But the matter cannot be decided by defining information processing functions in such a way that they already have something of the quality of consciousness (in the form of "awareness").

Blurring of the functioning/sentience distinction also leads to confusion about what scientific or conceptual advances might illuminate our understanding of consciousness. For example, many of the "easy" problems of consciousness listed by Chalmers (pp5-6) are, strictly speaking, not problems of consciousness at all. The ability to discriminate, categorize, and react to environmental stimuli, the integration of information in a cognitive system, and the ability of a system to access its own internal states, can be accounted for (in principle) in information processing terms which make no reference to consciousness or awareness. For example, personal computers which print error messages on a screen have access, of a kind, to their own internal states. But this rudimentary form of meta-representation reveals nothing about PC consciousness or awareness.
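
The point can be made concrete with a minimal sketch (a hypothetical illustration; the device and its messages are invented for the example). The system below accesses and reports its own internal states in purely information processing terms, and nothing in it requires, or reveals anything about, consciousness or awareness.

    # Minimal sketch: rudimentary meta-representation without awareness.
    # The device and its messages are hypothetical.

    class Printer:
        def __init__(self):
            self.paper = 0     # sheets loaded
            self.toner = 100   # percent remaining

        def self_check(self):
            # The system accesses its own internal states and reports them,
            # much as a PC prints error messages on a screen.
            errors = []
            if self.paper == 0:
                errors.append("ERROR: out of paper")
            if self.toner < 5:
                errors.append("ERROR: toner low")
            return errors or ["STATUS: ready"]

    for message in Printer().self_check():
        print(message)  # -> ERROR: out of paper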

For reasons such as this a number of other recent reviewers of the field have tended to use the terms "consciousness," "awareness" and "experience" interchangeably, and to express information processing functions in neutral information processing terms (see Farthing 1992; readings in Velmans 1996b). This retains the convention within cognitive psychology that human information processing can be investigated without making any presumptions about the extent to which it enters consciousness or awareness. It allows the possibility that various forms of information processing can take place in humans without "conscious awareness" (in blindsight, visual masking experiments etc.). It also leaves open the possibility that machines can perform many of the functions performed by humans without any awareness of what they do.

Chalmers notes that, "The ambiguity of the term 'consciousness' is often exploited by both philosophers and scientists writing on the subject" enabling them to present theories of functioning as if they are theories of consciousness (p7). His suggestion that the term "awareness" should be reserved for such information processing functions perpetuates this exploitation for theoretical reasons of his own.

4.2 Do all systems with the same functional organization have qualitatively identical experiences?

According to Chalmers, systems with the same fine-grained functional organization have identical experiences and, if the functional organization associated with consciousness is defined to be the system's "awareness," this controversial claim becomes more plausible. But Chalmers' case is not simply a matter of definition. His principal argument is a thought experiment which demonstrates that functionally isomorphic systems could not themselves distinguish between having experiences of different kinds. Conversely, if they could make such distinctions (and noticed or reported different experiences) they would not be functionally isomorphic.

However, whether functionally isomorphic systems could notice experiential differences must be distinguished from whether experiential differences in such systems exist. A totally nonconscious machine, for example, would have no way of noticing that it was totally nonconscious (the conscious/nonconscious distinction would have no meaning for it). It could nevertheless be made to respond as if it were conscious - for example, it could be programmed to report only on information at the focus of attention being disseminated throughout its system (assuming that in humans only such information is conscious). If the machine could be made functionally isomorphic to a conscious human being there would be no way of distinguishing the machine from a human by its behaviour alone. But that would not alter the fact that in the machine there is nobody at home.

According to Chalmers there is no empirical way to distinguish between such functionally isomorphic systems. Perhaps for completely isomorphic systems he is right. However, in human beings, that is not how one would attempt to distinguish conscious from nonconscious functioning.

A cortical implant for blindsight. In typical psychology experiments on consciousness, one requires subjective reports, for example about whether a subject has experienced a stimulus or not. One has to accept that these two potential outcomes are associated with functional differences, if only because they are associated with different verbal reports. Usually, there are additional functional differences in these two situations. Suppose, however, that we are able to eliminate these additional differences so that the only functional difference between the two situations is that in one case there is a conscious experience (of the stimulus) with a corresponding verbal report, and in the other case there is not.

Suppose also that consciousness is most closely associated with information dissemination (as suggested above). Recall that in blindsight, subjects can identify stimuli projected to their blind hemifields without conscious experience, but information about the stimuli does not appear to be disseminated throughout the brain. Consequently, subjects do not "know that they know" what the stimuli are and have to be forced to guess.

Imagine that we could implant a cortical network constructed out of silicon chips which restores information dissemination (of stimuli in the blind hemifield) so that in terms of all functions other than the appearance of consciousness and its associated verbal report blindsighted patients can no longer be distinguished from normals. Equipped with their new implants blindsighted patients now identify stimuli presented to their blind hemifields with confidence, learn from them, behave appropriately towards them and so on. But suppose that they still report that only stimuli presented to their nonaffected hemifields are consciously experienced. If so, information dissemination is dissociable from consciousness.

We cannot second-guess the outcome of this experiment, and in practice this bit of medical wizardry may turn out to be impossible. But the experiment is not unthinkable. And if it is not unthinkable, then Chalmers is wrong to argue that functional equivalence (in this case, in terms of information dissemination) cannot be empirically extricated from phenomenal equivalence.

4.3 Is all information associated with consciousness?

Chalmers is open to the possibility that not all information has a phenomenal aspect but argues that it is both plausible and elegant to suggest that it does so. If so, not just mice, but even thermostats might have some maximally simple experience (p22). If his argument that functional equivalence entails phenomenal equivalence were correct, it would follow that machines that function like humans would also have the experiences of humans.

While I have also argued that consciousness may be related to certain forms of information in a fundamental way, there are very good empirical grounds for supposing that this does not apply in every case. In the human brain, for example, massive amounts of information are encoded, but at any given moment only a small proportion of this has any manifestation in conscious experience. There is, for example, a vast store of knowledge in long-term memory which at any given moment remains unconscious. Of the currently processed information, only that which is at the focus of attention ultimately reaches consciousness (like this sentence); the details of current information processing do not themselves enter consciousness (like the processing required to read this sentence). In fact, very few information processing details ever enter consciousness (Velmans 1991a). Given that human information processing operates in a largely nonconscious fashion, it is plausible to assume that it operates in a similarly nonconscious fashion in machines.

It follows from this that we need to constrain the theory that information (always) has phenomenal aspects. This leaves open the possibility that information of certain kinds has phenomenal aspects. But what characterizes such information?

This is an empirical question to which there are no confident answers (as yet). It could be that information only has a phenomenal aspect if it is embodied in a neurochemical form. But this would not be a sufficient constraint as most neural representations have no manifestation in consciousness. An additional constraint might be that neural representations only become conscious when they have been activated above a given threshold. Another common hypothesis is that information only becomes conscious when it is passed to a particular region or circuit in the brain. A third possibility is that information only becomes conscious when it is subjected to a certain kind of processing (such as information dissemination, as suggested above). A fourth possibility is that information in the brain becomes conscious only when the conditions above combine in some way - for example, if information activated above some threshold is disseminated throughout the brain.

These suggestions share the assumption that for information to become conscious something additional has to happen. There is, however, another intriguing possibility. It could be that information has a phenomenal aspect unless it is prevented from doing so. In the human brain, for example, it could be that information and its accompanying consciousness are massively inhibited, to prevent information (and consciousness) overload. Much neural activity is known to be inhibitory. To enable coherent, adaptive response it might be that only the information at the focus of attention, which is disseminated throughout the system, is released from inhibition.

Release from inhibition has been hypothesised to explain certain psychological effects, for example the sudden improvement in short-term memory performance when, after a series of trials with similar stimuli, one changes the features of the to-be-remembered stimuli (Wickens 1972). There is also evidence that selective attention operates, in part, by the inhibition of nonattended stimuli (cf Arbuthnott 1995). This is consistent with the view that the brain may act as a filter (as well as an organizer) of information (e.g. Broadbent 1958). If consciousness is a naturally occurring accompaniment of neurally encoded information (which is massively inhibited), Chalmers' suggestion that information generally has a phenomenal aspect could not be so easily dismissed.
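
As a toy sketch of the release-from-inhibition idea (under the assumptions just stated; the representations, activation values and threshold are invented for illustration), one might model every encoded representation as inhibited by default, with only focally attended, sufficiently activated information released:

    # Toy sketch: representations are inhibited by default; only focally
    # attended information activated above threshold is released from
    # inhibition (and, on this hypothesis, has a phenomenal aspect).
    # Activation values and threshold are invented for illustration.

    representations = {
        "this sentence": 0.9,
        "tongue against teeth": 0.4,
        "long-term memory trace": 0.1,
    }

    def released_from_inhibition(representations, focus, threshold=0.5):
        released = []
        for name, activation in representations.items():
            inhibited = not (name == focus and activation > threshold)
            if not inhibited:
                released.append(name)
        return released

    print(released_from_inhibition(representations, focus="this sentence"))
    # -> ['this sentence']: everything else remains inhibited, hence nonconscious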

One would still, of course, have to establish whether consciousness can be decoupled from neurochemistry - for example, by experiments such as the cortical implant for blindsight suggested above. And, even if it were found that consciousness accompanies information embodied in silicon and other physical substances, one would still have to be cautious about the forms of consciousness that might accompany different forms of information.

Chalmers, for example, suggests that thermostats have some minimal experience. But, of what? For human purposes, thermostats convey information about temperature. Feelings of hot and cold also convey information about temperature. Given this, it is possible (in principle) to devise a thermostat which controls temperature in a closed environment in a way that would be indistinguishable from that of a human controller responding to feelings of hot and cold. So for these purposes one could create a form of man/machine functional equivalence. But it would be facile to assume that thermostats (in this situation) feel hot and cold in the way that humans do.

In humans, feelings of hot and cold are mediated by free nerve endings embedded in the skin which pass their information to complex circuitry in the thalamus and cerebral cortex. A simple thermostat might be constructed out of a bimetal strip. Because the expansion coefficient of one metal is greater than that of the other, the strip bends as temperature increases. If the strip bends enough it opens (or closes) a circuit which controls a source of heat. In short, differential expansion in metals is used by human beings for the purposes of human beings - to convey information about and control temperature. But the bimetal strip simply expands and bends. If it does experience something that relates to how it expands and bends, there is no reason to suppose that this will relate in any way to the feelings of hot and cold experienced by human beings. As I have argued above, functional equivalence does not guarantee phenomenal equivalence.
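
The functional equivalence at issue can be made explicit with a brief sketch (set-point, rules, and test temperatures are arbitrary assumptions for the example). The two controllers below issue identical control decisions over the same inputs, so for this purpose they are functionally equivalent; nothing in that equivalence settles what, if anything, either controller feels.

    # Illustrative sketch: two functionally equivalent temperature controllers.
    # Set-point and rules are arbitrary assumptions for the example.

    def bimetal_controller(temperature_c, set_point=20.0):
        # The strip bends with temperature; past the set-point it opens the
        # heating circuit. Modelled here simply as a comparison.
        return "heat off" if temperature_c >= set_point else "heat on"

    def feeling_based_controller(temperature_c, set_point=20.0):
        # A controller that acts on categorized 'feelings' of warm and cold.
        feeling = "warm" if temperature_c >= set_point else "cold"
        return "heat off" if feeling == "warm" else "heat on"

    for t in [15.0, 19.9, 20.0, 25.0]:
        assert bimetal_controller(t) == feeling_based_controller(t)
    print("indistinguishable on all tested inputs")
    # Identical input/output behaviour; the equivalence says nothing about
    # whether either controller feels hot or cold.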

4.4 Dual-aspect theory or naturalistic dualism?

One can only speculate about thermostat consciousness, but few would deny the existence of human consciousness, or its close association with activities in brains. As noted above, it is also plausible to link different forms of consciousness to different forms of neural or other physical encoding (via the notion of information). But this does not explain how, in an otherwise physical system, consciousness comes to exist. We have seen that the neural conditions for the appearance of consciousness can in principle be determined empirically. For example, it might turn out that given combinations of activation, brain location, and information processing are involved. Isolation of the necessary and sufficient conditions for the appearance of consciousness in the brain would provide a form of causal explanation. But that would not provide an understanding of why these neural activities are accompanied by something so unlike neural activities.

It is worth bearing in mind that nature does not always bother to arrange things so that they are immediately obvious to humans. Sometimes, for example, apparently dissimilar phenomena (to our eyes) are fundamentally linked. For example, electric current passing through a wire produces a surrounding, spatially distributed magnetic field. Conversely, moving a wire through an existing magnetic field produces current in the wire. In Velmans (1991b; 1993a) I have suggested that there is some similarly fundamental process taking place in the brain that unifies certain forms of neural functioning with conscious experience.

As noted in section 1 above, there are different ways in which one can construe the nature of this consciousness/brain relationship, for example in dualist or reductionist terms. However, the simplest nonreductionist way to capture the intimate linkage of consciousness to its neural correlates is to suppose that they are two aspects of one representational process (dual-aspect theory). As noted in section 2 above, it is plausible to think of conscious experiences and their neural correlates as representational states, and to assume that the information encoded in a given conscious experience and its physical correlates is identical. Given this underlying identity, why does the information embodied in conscious experiences and accompanying neural states appear so different? In Velmans (1991b, 1993a) I have argued that the appearance of this information depends entirely on the perspective from which it is viewed - i.e., it depends on whether the information in question is being viewed from an external observer's third-person perspective or from the first-person perspective of the subject.

Suppose a subject S is asked to focus his attention on a cat, while an experimenter E tries to determine what is going on in the subject's brain. Using his visual system aided by sophisticated physical equipment E might, in principle, be able to view the information about the cat being formed in S's brain. Given our current knowledge of brains, it is reasonable to assume that this information will be embodied in a neurophysiological or other physical form. We also know that for S, the same information will be displayed in the form of a phenomenal cat out in the world. We do not know how representations in S's brain are translated into a conscious experience, but we can safely assume that the biological arrangements whereby information (about the cat) in S's brain is translated into S's experience are very different to the experimental arrangements used by E for accessing the same information in S's brain. Consequently, it is not surprising that for S and E the same information appears to be formatted in very different ways. Note that there is nothing unusual about identical information appearing in very different formats. This happens, for example, to the information encoded in the form of magnetic variations on videotape once it is displayed as a moving picture on a TV screen.
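
The videotape point, that identical information can appear in radically different formats, can be illustrated in a few lines (the encoding chosen here is an arbitrary stand-in for the magnetic and pictorial formats):

    # Illustrative sketch: one body of information, two very different formats.
    # The bit-string format stands in for the videotape's magnetic variations,
    # the decoded text for the displayed picture.

    data = "CAT".encode("ascii")                        # one body of information
    tape_like_format = [format(byte, "08b") for byte in data]
    displayed_format = data.decode("ascii")

    print(tape_like_format)   # ['01000011', '01000001', '01010100']
    print(displayed_format)   # CAT: the same information, formatted differently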

But this is only an analogy. The manner in which different arrangements for observing the information in S's brain produce such dramatically different phenomenal consequences is more closely reminiscent of wave/particle duality in quantum mechanics. In Velmans (1991b, 1993a) I have sketched out the beginnings of a "psychological complementarity" principle (operating at the interface of consciousness with the brain). This resembles complementarity in physics in its stress on the dependence of the observation on the entire observation arrangements. Psychological and physical complementarity are also similar in requiring both (complementary) descriptions for completeness. Both a wave and a particle description are required for a complete understanding of photons. Likewise, both a (third-person) neural/physical and a (first-person) phenomenal description are required for a complete psychological understanding of subjects' representations.

But there the similarities end. I do not suggest, for example, that consciousness is wavelike, or that neural correlates are like particles. And crucially, wave/particle manifestations are both accessible to an external observer, whereas it is a unique feature of the consciousness-brain relationship that the complementarity obtains between what is accessible to an external observer versus an experiencing subject.

This account of psychological complementarity is only a first sketch, but it attempts to grapple with how a fundamental representational process might have such different manifestations. While Chalmers also argues that experience is a fundamental feature of the world, his analysis of how it relates to physical features is quite different. According to Chalmers, experience is a fundamental feature alongside mass, charge, and space-time. Consequently he suggests that the relationship of experience to physical processes may be thought of as a kind of "naturalistic dualism" (p15). Although elsewhere (p4, p20) Chalmers considers his theory to be a form of "double-aspect" theory, and he refers to the intrinsic versus the extrinsic nature of information, the relation of this double-aspect theory to naturalistic dualism is not made clear.

5. How to bridge a nonexistent gap.

There can be little doubt that some of the questions about consciousness are hard - and that not all the questions are just the result of conceptual confusion. For example, there really are conscious experiences and we have every reason to suppose that there really are neural or other physical accompaniments of these. Working out the details of the brain/consciousness interface is consequently important.

But I believe that there are some "hard" questions that are the result of conceptual confusion. It must be remembered that the "mind/body problem" has resulted, in part, from the splitting of mind from body implicit in dualism. Consciousness, in Descartes' formulation, is nonmaterial and has no location or extension in space. Conscious states are commonly thought to be private and subjective in contrast to physical states which are public and objective. Given this, it is hardly surprising that consciousness has been thought to pose an insuperable problem for science - with a consequent shift towards materialist reductionism within twentieth century philosophy of mind.

In Velmans (1990, 1993b) I have reviewed evidence that Descartes' way of describing consciousness and its relation to the brain accords neither with science (the nature of perception) nor with the everyday experiences of human beings. That is to say, the mind/body problem results in part from a description of everyday experience that does not correspond to everyday experience. I have also taken issue with the private vs public and subjective vs objective distinctions in the forms that they usually appear.

It is not possible to enter into the detailed analysis relating to these basic points in the limited space available here. But, to put matters very briefly, very few aspects of consciousness have the nonextended character suggested by Descartes (thoughts, some feelings, some images and so on). Classical "mental" events such as pains nearly always have a location and distribution in one's body, and the same applies to other tactile sensations. Auditory and visual phenomena are generally perceived to be outside the body and distributed in three-dimensional space. In short, the contents of consciousness include not just "inner events" such as thoughts, but body events, and events perceived to be in the external phenomenal world. This external phenomenal world is what we normally think of as the "physical world." That is, what we normally think of as the "physical world" is part of the contents of consciousness (part of what we experience). Consequently, there is no unbridgeable divide separating "physical phenomena" from the "phenomena we experience."

Consider how this description applies to the situation described in 4.4 above, in which an experimenter observes the brain of a subject. While the experimenter focuses on the subject, his phenomenal world includes the subject, and if his experimental arrangement is successful, the representational states in the brain of the subject. While the subject focuses on the cat his phenomenal world includes the cat. It is fashionable (at present) to think of E's "observations" (of the subject's brain) as public and objective. S's "experiences" of the cat, by contrast, are private and subjective. Indeed this radical difference in the status of E and S is enshrined in the different terminology applied to what they perceive; that is, E makes "observations," whereas S merely has "subjective experiences."

But suppose they turn their heads, so that E switches his attention to the cat, while S switches his attention to what is going on in E's brain. Now E is the "subject" and S is the "experimenter." Following the same convention, S would now be entitled to think of his observations (of E's brain) as public and objective and to regard E's observations of the cat as private and subjective. But this would be absurd - as nothing has changed in the character of the observations of E and S other than the focus of their attention.

I cannot do more in a few lines than to sow a few seeds of doubt. A full pursuit of such simple points requires a reanalysis of idealism versus realism, of private versus public, of subjectivity, intersubjectivity and objectivity, of the relation of psychology to physics, and of the proper nature of a science of consciousness (Velmans 1990, 1993b; Harman 1993).

Suffice it to say that, in the new analysis, the problem of how to incorporate conscious "qualia" into empirical science disappears. All phenomena in science are seen to be aspects of the phenomenal worlds of observers, and therefore part of the world that they experience. "Qualia" are in there from the beginning. Indeed, the whole of science may be seen as an attempt to make sense of the phenomena that we observe or experience.

References

Arbuthnott, K.D. 1995. Inhibitory mechanisms in cognition: Phenomena and models. Cahiers de Psychologie Cognitive 14(1):3-45.

Baars, B.J. 1988. A Cognitive Theory of Consciousness. Cambridge University Press.

Berkeley, G. 1710. The principles of human knowledge. Reissued by T. Nelson & Sons: London, 1942.

Broadbent, D.E. 1958. Perception and communication. Pergamon Press.

Chalmers, D. 1995. Facing up to the problem of consciousness. Journal of Consciousness Studies 2(3):200-219.

Ciba Foundation Symposium 174. 1993. Theoretical and Experimental Studies of Consciousness. Wiley, Chichester.

Descartes, R. 1644. Treatise on Man. Trans. by T.S. Hall. Harvard University Press, 1972.

Farthing, W. 1992. The Psychology of Consciousness. Prentice-Hall.

Goldman, A.J. 1993. Consciousness, folk psychology, and cognitive science. Consciousness and Cognition 2:364-382.

Gray, J. 1995. The contents of consciousness: A neuropsychological conjecture. Behavioral and Brain Sciences (in press).

Harman, W. 1993. Towards an adequate epistemology for the scientific exploration of consciousness. Journal of Scientific Exploration 7(2):133-143.

Leibniz, G.W. 1686. Discourse on Metaphysics, Correspondence with Arnauld, and Monadology. Trans. by M. Ginsberg. London: Allen & Unwin, 1923.

Nagel, T. 1974. What is it like to be a bat? Philosophical Review 83:435-451.

Navon, D. 1991. The function of consciousness or of information? Behavioral and Brain Sciences 14:690-691.

Neill, W.T. 1993. Consciousness, not focal attention, is causally effective in human information processing. Behavioral and Brain Sciences 16(2):406-407.

Spinoza, B. 1677. The Ethics. Reprinted in The Ethics of Benedict Spinoza, New York: Van Nostrand, 1876.

Velmans, M. 1990. Consciousness, brain, and the physical world. Philosophical Psychology 3(1):77-99.

Velmans, M. 1991a. Is human information processing conscious? Behavioral and Brain Sciences 14(4):651-669.

Velmans, M. 1991b. Consciousness from a first-person perspective. Behavioral and Brain Sciences 14(4):702-726.

Velmans, M. 1993a. Consciousness, causality, and complementarity, Behavioral and Brain Sciences 16(2):404-416.

Velmans, M. 1993b. A reflexive science of consciousness. In Experimental and theoretical studies of consciousness. Ciba Foundation Symposium No.174, Wiley, Chichester.

Velmans, M. 1995a. The limits of neurophysiological models of consciousness. Behavioral and Brain Sciences (in press).

Velmans, M. 1995b. Theories of consciousness. In (M. Arbib, ed.) The Handbook of Brain Theory and Neural Networks. Bradford Books/The MIT Press (in press).

Velmans, M. 1996a. An introduction to the science of consciousness. In (M. Velmans, ed.) The Science of Consciousness: Psychological, Neuropsychological and Clinical Reviews, Routledge (in press).

Velmans, M., Ed. 1996b. The Science of Consciousness: Psychological, Neuropsychological and Clinical Reviews, Routledge (in press).

Weiskrantz, L., Warrington, E.K., Sanders, M.D. & Marshall, J. 1974. Visual capacity in the hemianopic field, following a restricted occipital ablation. Brain 97:709-728.

Wickens, D.D. 1972. Characteristics of word encoding. In (A.W. Melton & E. Martin, eds.) Coding Processes in Human Memory, John Wiley & Sons.