PRECONSCIOUS FREE WILL
Max Velmans, Department of Psychology, Goldsmiths, University of London, New Cross, London SE14 6NW, England.
ABSTRACT. This paper responds to continuing commentary on Velmans (2002a) “How could conscious experiences affect brains,” a target article for a special issue of JCS. I focus on the final question dealt with by the target article: how free will relates to preconscious and conscious mental processing, and I develop the case for preconscious free will. Although “preconscious free will” might appear to be a contradiction in terms, it is consistent with the scientific evidence and provides a parsimonious way to reconcile the commonsense view that voluntary acts are freely chosen with the evidence that conscious wishes and decisions are determined by preconscious processing in the mind/brain. I consider Libet’s and Mangan’s alternative interpretations of how “conscious free will” might operate, and respond to doubts, raised by Claxton and Bouratinos, about the extent to which the operations of mind are revealed in consciousness. In reconciling commonsense attributions of freedom and responsibility with the findings of science, preconscious free will can be shown to have practical consequences for adjudications in law.
In everyday life, we assume that we have a self that is in control of our voluntary acts. But what kind of “self” is in control? Our inner sense of self is closely tied to our conscious self. Yet, the evidence that our mental processing is largely preconscious or unconscious is extensive. How the conscious self relates to mental processing and the precise sense in which such processing might “be conscious” therefore has to be examined with care.
In Velmans (1991a) I have suggested that a mental process might be conscious (a) in the sense that one is conscious of it, (b) in the sense that it results in a conscious experience, and (c) in the sense that consciousness causally affects that process. We do not have introspective access to how the preconscious cognitive processes that enable thinking produce individual, conscious thoughts in the form of “inner speech.” However, the content of such thoughts and the sequence in which they appear do give some insight into the way the cognitive processes (of which they are manifestations) operate over time in problem solving, thinking, planning and so on. Consequently such cognitive processes are partly conscious in sense (a), but only in so far as their detailed operation is made explicit in conscious thoughts, thereby becoming accessible to introspection. Many psychological processes are conscious in sense (b), but not in sense (a)—that is, we are not conscious of how they operate, but we are conscious of their results. This applies to perception in all sense modalities. When consciously reading this sentence, for example, you become aware of the printed text on the page, accompanied, perhaps, by inner speech (phonemic imagery) and a feeling of understanding (or not). But you have no introspective access to the processes that enable you to read. Nor does one have introspective access to the details of most other forms of cognitive functioning, for example to the detailed operations that enable “conscious” learning, remembering, engaging in conversations with others and so on.
Crucially, having an experience that gives some introspective access to a given process, or having the results of that process manifest in an experience, says nothing about whether that experience carries out that process. That is, whether a process is “conscious” in sense (a) or (b) needs to be distinguished from whether it is conscious in sense (c). Indeed, it is not easy to envisage how the experience that makes a process conscious in sense (a) or (b), could make it conscious in sense (c). Consciousness of a physical process does not make consciousness responsible for the operation of that process (watching paint dry does not actually make it dry on the wall). So, how could consciousness of a mental process carry out the functions of that process? Alternatively, if conscious experience results from a mental process it arrives too late to carry out the functions of that process.
In Velmans (1991a, 2000, 2002a) I have suggested that such doubts over the causal role of conscious phenomenology apply even to conscious free will. We commonly experience wishes, desires, decisions to act, or not to act, and take it for granted that it is the conscious experiences themselves that exercise control over our consequent acts. However, Libet (1985) found that one’s brain prepares to act not just before one acts, but even before one experiences a wish to act. For example, a simple motor act such as flexing one’s wrist is preceded, by around 550 milliseconds, by a negative-going readiness potential (RP) recorded at the scalp. Surprisingly, the readiness potential also precedes the experienced wish to act by around 350 milliseconds. This suggests that, like the act itself, the experienced wish (to flex one’s wrist) may be one output from the (prior) cerebral processes that actually select a given response rather than being the cause of those processes.
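Taking the onset of the motor act as t = 0, the timing relations at issue can be laid out on a single timeline (the figures are approximate averages from Libet, 1985; the position of W is derived from the reported intervals, and Libet’s corrected estimate places it nearer −150 ms):

```latex
% Approximate Libet (1985) timings, with the motor act at t = 0
\begin{align*}
t &\approx -550~\text{ms} && \text{onset of the readiness potential (RP)}\\
t &\approx -200~\text{ms} && \text{experienced wish to act (W)}\\
t &= 0~\text{ms} && \text{motor act}
\end{align*}
```

On these figures the RP precedes W by roughly $550 - 200 = 350$ ms, while W precedes the act by only some 150–200 ms, the narrow window within which any conscious intervention would have to operate.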
These and many other findings (summarized in Velmans, 1991a, 2000, 2002a) challenge a well-established view about the way preconscious processing relates to conscious processing—that preconscious processing has to be involuntary and inflexible, and conscious processing is voluntary and flexible. Such findings also seem to challenge a well-established, conventional understanding of the functions of the conscious self. As noted above, I normally take it for granted that my conscious self is responsible for my voluntary acts. But, given the evidence, is it my conscious self that is responsible for my voluntary acts or is it my preconscious brain?
Answers to such questions have clear consequences for our ethical and legal systems. I feel responsible for my voluntary acts and am likely to be held responsible for them by the courts. But, if my conscious self is not responsible for my acts and if the act is determined by preconscious processing, can’t I plead, in mitigation, that I could not have chosen to do otherwise as the acts were controlled by my preconscious brain? Adjudication of such a plea requires one to make sense of one of the hardest of the “hard” problems of consciousness: how to understand the causal interaction of consciousness and brain.
In Velmans (2000, 2002a,b) I have suggested a way of understanding “conscious causation” that both accommodates the scientific findings and defends our common sense view of freedom and responsibility. How could an act that is executed preconsciously be my act? Because I (the agent) include the operations of my unconscious and preconscious mind as well as my conscious sense of self. How could a preconsciously determined act be “voluntary”? Voluntary acts imply the possibility of choice (albeit choice within constraints). Voluntary acts are also potentially flexible and capable of being novel. In the psychological literature such properties are traditionally associated with controlled rather than automatic processing and with focal-attentive rather than pre-attentive or non-attended processing. I do not deny that voluntary processes are controlled and focal-attentive. Nor do I deny that they are conscious. They are conscious in sense (b) and, to a lesser degree, sense (a) above. They are merely not conscious in sense (c). In Libet’s experiments the conscious wish to act appears around 350 milliseconds after the onset of preconscious preparations to act that are indexed by the readiness potential. This says something about the timing of the conscious wish in relation to the processes that generate it and about its restricted role once it appears. But it does not argue against the voluntary nature of that preconscious processing. On the contrary, the fact that the act consciously feels as if it is voluntary and controlled suggests that the processes that have generated that feeling are voluntary and controlled, as conscious experiences generally provide reasonably accurate representations of what is going on. 
Such feelings of being in control contrast sharply with the feelings of constraint where voluntary choice is absent—for example, where one’s range of action is controlled by others, or where voluntary control is not possible, as in the muscular twitches induced by Parkinson’s disease.
In sum, the feeling that we are free to choose or to exercise control is compatible with the nature of what is actually taking place in our own mind/brain, following processes that select amongst available options, in accordance with current needs, goals, available strategies, calculations of likely consequences and so on. While I assume that such processes operate according to determinate principles, the system architecture that embodies them enables the exercise of the choice, flexibility and control that we experience—a form of determinism that is compatible with experienced free will.
Ben Libet’s commentary
Given the importance of Ben Libet’s research findings to the issue of conscious versus preconscious volition, I am particularly grateful for his critical comments, as they allow my own suggestions to be compared and contrasted with his own carefully considered approach. Libet is concerned, as I am, to reconcile the conviction that we are consciously responsible for our acts, with experimental evidence that a wish to act results from preconscious processing in the brain. However his own route through the problem is to stress that the volitional process is merely initiated unconsciously. In his experiment, the conscious wish to act (W) still preceded the final motor act by around 150 milliseconds, which provided an opportunity for the conscious will to control the outcome of the volitional process by blocking or ‘vetoing’ the process (Libet, 1985). As Libet notes, “The existence of a potentiality to veto is not in doubt. Everyone has experienced having a wish or urge to perform an act, but vetoed the actual performance of the act. That presumably occurs when a given W is recognized as being incompatible with social acceptability and with one’s personality.” (p x)
How do these vetoes operate? According to Libet, the conscious decision takes the form of a non-physical field, arising from the activity of the brain subsequent to the conscious wish, which supervenes over and controls the activity from which it arises (see below). Libet accepts that in Velmans (1991a, 2002a) I have raised an obvious question for this position: “Why doesn’t the veto decision have its own unconscious antecedents, just as appears to be the case for W itself? If the veto were developed by preceding unconscious processes, that would eliminate conscious free will as the agent for the veto decision.” (Libet, p x) His reply is that this does not necessarily follow: “The content of the awareness of the volitional urge to act could include a conscious content of factors that may affect the conscious veto decision. However, the conscious decision to veto could be made without direct specifications for that decision by the preceding unconscious processes.” (p x)
I accept, of course, that, in tasks requiring a quick decision (within 150 milliseconds of the response), a decision to act or not to act appears to arise in consciousness spontaneously, without preconscious antecedents. But this does not settle the matter. Viewed from a first-person perspective, all conscious phenomena appear to arise spontaneously, and Libet’s own findings suggest that, on close investigation, apparently ‘spontaneous’ conscious phenomena can have preconscious antecedents. We seem, for example, to feel a surface as soon as we touch it. But Libet et al. (1979) established that at least 200 milliseconds of processing time is required before preconscious processing of a tactile stimulus becomes adequate to support a conscious, tactile sensation. A conscious wish to do something also seems to arise spontaneously, but around 350 milliseconds of preconscious processing is required before such a wish appears, as we have seen above.
While such findings do not rule out Libet’s contention that decisions are arrived at in an entirely conscious way, his analysis does suggest a curious asymmetry between so-called “conscious” wishes and “conscious” decisions. A voluntary decision to act, or not to act, “based on personal and social factors” calls for an assessment of possible consequences, which requires one to access information about such consequences (stored in long-term memory), or to construct alternative scenarios in real time. However, if there were no preconscious antecedents to arriving at such a conscious decision, less time would be available for its development and execution than for the development of a simple wish to do something (only 150 versus 350 milliseconds). Given its similar (or perhaps greater) complexity, why should a conscious decision to carry out an act (or not) take much less time to develop than a conscious wish?
In any case, just as a conscious decision to do something (such as flex a wrist) requires preconscious neural preparation to activate musculature for carrying out a given task, a decision to refrain from action is likely to require preconscious neural preparation to inhibit musculature appropriate for that task. In Velmans (2000, 2002a) I suggested that there may already be some evidence that bears on this issue. Karrer, Warren and Ruth (1978) and Konttinen & Lyytinen (1993), for example, found that refraining from irrelevant movements is associated with slow positive-going readiness potentials that follow a similar time course to the more classical negative-going RPs associated with preparation to carry out an act studied by Deecke, Scheid & Kornhuber (1969). Libet, however, begs to differ. According to him, a careful reading of those articles indicates
(a) that their evidence is not directly relevant to the veto issue and
(b) that “There was no evidence bearing on the possibility that the positivity tendency was related to inhibition of irrelevant movements”. (p x)
Given our difference of opinion and the potential importance of this evidence, I give a more detailed account of it below.
Positive-going readiness potentials and muscular inhibition
The studies of motor acts mentioned above are part of two research programmes that have been carried out by Karrer and his colleagues at the University of Illinois, and by Konttinen, Lyytinen and colleagues at the University of Jyväskylä in Finland respectively. In essence, they investigate the idea that a motor act requires both muscular activation and inhibition. As Chisholm and Karrer (1988) note, “In performing an action, one must not only activate the appropriate muscles of the target movement but coordinate gross skeletal-postural musculature and inhibit those muscle systems that interfere with effective target movement. Action therefore, is a system of discrete movements within the context of other bodily movement and nonmovement. After practice and maturation, this action system may become highly differentiated and automatic. It is often the case that many nontarget associated movements produce only minor, if any, interference with target performance, and therefore, require no inhibitory effort during performance. Nevertheless, it is also clear that an action requires active inhibition of some processes and activation of other processes.” (p. 131)
In a series of studies using button press or hand squeeze movements Karrer and his co-workers found that the polarity of the motor readiness potential (MRP) depends on the development and/or general level of motor skill of the participants. Normal adults typically have negative-going waveforms (cf Deecke, Scheid & Kornhuber, 1969; Vaughan, Costa, & Ritter, 1968). However, Karrer, Warren, & Ruth (1978), Warren & Karrer (1984a), Warren & Karrer (1984b), Chisholm & Karrer (1983), and Chisholm, Karrer & Cone (1984) found that children, preadolescents and mentally retarded young adults have prominent positive components in the MRP waveform prior to motor onset that superimpose on and confound the negative waveforms. A similar finding has also been obtained for elderly adults (Deecke, 1980).
Warren & Karrer (1984b) proposed that these positive components preceding a motor response reflect processes that are related to the development of motor control and the inhibition of extraneous movement. Positive-going MRPs appear to reflect the effort needed to inhibit irrelevant movements associated with a given response when automatic response differentiation and automatic control of the required response are lacking. Chisholm & Karrer (1988) tested this by comparing the ability to lift a target finger while leaving the other fingers at rest in adults (ages 24-50) and three groups of children (aged 12-13, 8-9, and 5-6). In this task participants placed both hands palms down on the recording apparatus and the four fingers of each hand were taped to metal levers attached to potentiometers so that an upward finger lift caused a rotation of the potentiometer. Each trial was initiated by the experimenter pointing to the finger targeted for lift. Subjects were instructed to lift each finger when pointed to, in their own time, and not to move nontarget fingers (or make other irrelevant movements such as eye movements or blinks). A global analysis of finger lifts with or without other associated movements showed different MRPs. MRPs in trials on which associated movements were successfully inhibited (NAM trials) had greater positivity than trials on which associated movements were present (AM trials). AM trials did not differ significantly across age groups. However, NAM trials did differ across age groups, the younger groups having greater variability and overlapping positivity. According to Chisholm & Karrer, MRP positivity is associated with the effort required to inhibit unwanted movements. The adults required little effort to produce discrete finger movements or to inhibit unwanted movements, resulting in more negative MRPs. In the young, control has not yet become automated; consequently, their MRPs were more variable and showed greater positivity.
The studies by Konttinen, Lyytinen and their colleagues focused on the combined muscular activation and inhibition required for skilled rifle shooting. As Konttinen, Lyytinen & Viitasalo (1998) point out, “In precision shooting, the trigger pull is preceded by a period of increased preparedness during which the final adjustments in the pre-trigger set are made. One of the most important factors in this set is the shooter’s ability to reach and maintain a state of alertness and steadiness until the pull of the trigger. There has been a consensus of opinions that this type of alert immobility not only plays an important role in the success of shooting performance, but also reflects strategic differences between highly skilled and less skilled shooters.” (p78)
To dissociate the effects of readying oneself to pull a trigger from the need to control irrelevant movements, Konttinen & Lyytinen (1992) recorded the slow potentials (SPs) at the frontal, central, and occipital sites of subjects whose rifles were partly fixed, thereby reducing the need to control irrelevant movements. In this situation, negative SP shifts preceded the trigger pull by up to several seconds. According to Konttinen & Lyytinen this enhanced SP negativity reflects an increased preparedness and arousal as the moment of the trigger pull draws closer, not just the changes associated with the motor act (the trigger pull) itself.
In a second study, Konttinen & Lyytinen (1993) differentiated the visual aiming and motor components of shooting performance, using conditions in which aiming and the need to steady the rifle were doubly dissociated. In condition 1 (rifle steadying without aiming), subjects were asked to pull the trigger when the rifle was completely steady, pointing in the direction of a homogeneous white background, in the absence of any target. In condition 2 (visual aiming and steadying), subjects were required to pull the trigger when the rifle was completely steady and on target. In condition 3 (no aiming or steadying), subjects were simply asked to perform a self-paced trigger pull, without aiming, in a situation where the rifle was fixed. In condition 4 (aiming without steadying), the rifle was fixed, but there was a visual target with 15 parts (that could be moved from trial to trial), and subjects were required to pull the trigger as soon as they located the numbered part of the target on which the rifle was aimed. Overall, visual targeting was found to be associated with enhanced SP negativity and steadying of the rifle with enhanced SP positivity, the latter being the dominant factor in situations where these were combined. This was confirmed in another study by Konttinen, Lyytinen, & Konttinen (1995), which investigated the relation between competitive rifle shooters’ preparatory performance and the concomitant frontocentral SP changes in an actual precision shooting task in which the shooters also had to allocate resources to motor control. In contrast to Konttinen & Lyytinen (1992), the trigger pull was now preceded by positive frontal and occipital shifts. In a study that compared pre-elite shooters with the Finnish national team, Konttinen, Lyytinen & Viitasalo (1998) also found that increasing positivity in frontal (Fz) slow wave potentials correlated with increasing rifle stability for pre-elite shooters. By contrast, elite shooters achieved rifle stability with less effort and showed less frontal (Fz) positivity.
Konttinen, Lyytinen & Era (1999) suggest that, overall, these results support an effort hypothesis. When a shooter exerts psychomotor effort on bodily control, the preparation is associated with increased SP positivity. Conversely, if assuming a rigid shooting position requires less effort there is less SP positivity. They conclude that “on the basis of the past research on marksman rifle shooting, there seems to be good grounds for the statement that in a realistic shooting situation, the trigger pull is preceded by an increase in positivity that reflects the sustained voluntary effort required to inhibit irrelevant motor activity.” (p 12) They also note that their results and conclusions are consistent with those of Karrer’s group.
Given his view that vetoes (and consequent muscular inhibition) are carried out by consciousness, it is not surprising that Libet challenges the thesis that positivity in preconscious SPs indexes effortful, preconscious muscular inhibition. According to Libet, Karrer, et al (1978) merely “speculated that the positive type RPs were associated with inhibition of irrelevant motor activity in producing a non-cued voluntary button press with the thumb.” (p x) He also argues that in the paper by Konttinen & Lyytinen (1993) “There was no evidence bearing on the possibility that the positivity tendency was related to inhibition of irrelevant movements. Indeed, it seems likely that the movements in motor stabilization of a rifle were responses to sensory feedback from proprioceptive organs in the muscles and tendons.” (p x)
Libet’s alternative hypothesis, however, is not in conflict with the views of Karrer, Konttinen and their colleagues. While feedback from proprioceptive organs in the muscles and tendons is required to monitor motor and rifle stability, effortful muscular activation and inhibition may be required to control it. “Responses to sensory feedback from proprioceptive organs in the muscles and tendons” are nevertheless controlled responses, and the findings suggest that the greater effort required to achieve control over irrelevant movements at decreasing levels of skill is associated with increased RP positivity.
Libet goes on to ask, “if the positivity reflects inhibition of irrelevant movements producing this quick motor act, how is it possible for the subject to produce any act at all? The recorded positivity occupies the entire period of a second or more preceding the act, in those subjects showing a positive RP.” (p x) However, as both research groups point out, the RP (also known as MRP or SP) is a global indicator of cortical activity that can combine preparation and activation of relevant muscular groups indexed by negative shifts, with inhibition of irrelevant movements indexed by positive shifts that superimpose on the negative shifts (usually dominating them if the movements are not well practiced).
Libet notes finally that, “In any case, the conditions did not involve our kind of veto to block the ability of a voluntary process, initiated endogenously, to produce the motor act.” (p x) Here I agree. But, to the best of my knowledge, Libet’s own work has not included a conscious veto condition either, making his claims about the conscious operation of such a veto entirely speculative. As I have noted above, there are good grounds for suggesting that such a veto would have preconscious antecedents that may be detectable by shifts in RP prior to a voluntary motor act (or prior to a decision not to act). Given the relevance of this prediction to the possibility of “preconscious free will”, it would be interesting to put it to the test. While we must await the outcome of more focused investigations to settle this issue, the most parsimonious interpretation of the existing evidence suggests that, under the time constraints of Libet’s experiments, “conscious” decisions, like “conscious” wishes, are preconsciously generated.
According to Libet my complementary dual-aspect theory is an identity theory: “In this, there is a single substrate (for the brain in this case), which exhibits an ‘inner quality’ (accessible only to the individual) and an ‘outer quality’ (observable only to an external observer). There is no way of knowing what the ‘substrate’ is, and it is difficult to explain how both such qualities can be exhibited by the same substrate. Velmans offers some analogies for complementarianism, but these involve two different physically observable phenomena. This doesn’t solve the unique nature of the ‘mind-brain’ relationship, in which one feature is conscious subjective experience that cannot be observed by any external physical means.” (p x)
I have dealt with most of these points in my earlier reply to commentators (Velmans, 2002b), so I will only give a brief response here. My dual-aspect theory does suggest a form of identity between conscious experiences and their neural correlates, in that I suggest that conscious experiences and their neural correlates encode identical information. But this should not be mistaken for a physicalist identity theory that claims conscious experiences to be nothing more than states of the brain. And this does not make the underlying substrate (“the nature of mind” or “mind-itself”) unknowable. On the contrary, the nature of mind can be and is known through its aspects (its first- and third-person manifestations). From a first-person perspective, the nature and workings of mind appear in the form of conscious experiences. To an external observer the nature and workings of mind appear to be the nature and workings of brain. As the nature of mind encompasses both of its aspects, its nature can be thought of as psychophysical—or to be more precise, a psychophysical form of information processing that develops over time (Velmans, 2000, ch11). While I accept that it is difficult to explain how both physical and conscious qualities can be exhibited by the same substrate, I contend that this is nevertheless a given, readily observable fact of nature—in this case the nature of the mind.
Libet’s alternative is his conscious mental field (CMF) theory, in which consciousness is a form of non-physical field that arises from the activity of the brain and supervenes over and controls the activity from which it arises. Libet (1994) believes this theory to be testable, if one can demonstrate a mode of intracerebral communication that can proceed without requiring connecting neural pathways. Aficionados of consciousness studies will recognize this to be a strong form of property dualism (a) because Libet’s conscious field emerges from the activity of the brain and is in that sense a property of it, and (b) because he insists that this emergent field is non-physical. Given that the CMF is non-physical, this proposal runs into the same problems as more traditional substance dualism: How could something entirely non-physical arise from something physical? And, if the physical world is causally closed, how could anything non-physical causally affect it? While it is true that field theories are testable in principle, a test of a CMF would have to rule out all other forms of indirect, physical communication. For example, intracerebral communication via means other than direct neuronal connections might be produced by an electromagnetic field of the kind proposed to be the substrate of consciousness in this journal by McFadden (2002).
Is free will an illusion?
Libet accepts that I am not a reductionist, although I assume that “the processes giving rise to conscious experience follow deterministic physical laws.” His account of my analysis of free will is nevertheless misleading. According to Libet, “Velmans then offers the view that the unconscious neural processes that lead to a conscious wish to act could be regarded as an expression of free will, because we feel that we have free choice and control over the act. Clearly, such a view does not represent genuine free will. The voluntary act is, in this view, not free of the inexorable adherence to deterministic physical processes. In this view, the feeling of an independent freedom of choice and control is merely an illusion.” (p x)
In my analysis, the freedom or lack of freedom of an act is not determined by the accompanying feeling. On the contrary, once the conscious feeling arises the relevant preconscious choice has already taken place. Given this, the impression that the conscious feeling itself makes the choice is indeed an illusion. An act cannot be free because we feel it to be free. Rather we feel and judge it to be free, and are generally right to do so, because our conscious representations are usually accurate. If I feel free to choose between alternatives I usually am, and if I feel that my choice was constrained or absent, it usually was. Does this make free will an illusion? No. There are circumstances under which the feeling might be misleading (for example if I feel that I am free to move a phantom limb). But in the many circumstances where the feeling is accurate, free will is not an illusion.
How could genuine free choice be compatible with a preconscious, “inexorable adherence to deterministic physical processes”? By combining an expanded sense of self (to include the operations of the unconscious and preconscious mind as well as the conscious mind) with a proper understanding of the flexibility and options that are open to decision making systems in the mind, embodied in the brain, once they achieve a sufficient level of complexity. As I have noted above, human freedom does not come without constraints (both physical and social). However, any model of the unconscious, preconscious and conscious operations of mind that does not match the degree of freedom and choice that we actually experience and manifest in our behaviour, needs to go back to the drawing board.
Causal symmetry or asymmetry between the brain and consciousness?
According to Mangan, this nevertheless results in an absolute causal asymmetry between the brain and consciousness. “The brain has causal power over consciousness at least to the degree that the brain is able to shape the contents of consciousness, which include representations of the brain's own non-conscious neural activity and representations of the external world.” But, “Velmans asks us to believe that causal activity which flows so constantly in one direction never flows in the other; that consciousness has no power whatsoever to affect the brain.” (p x) I disagree. As I have noted repeatedly in Velmans (1996, 2000, 2002a,b), ontological monism combined with epistemological dualism does not result in an absolute causal asymmetry between brain and consciousness. Rather, consciousness and brain are dual-aspects of something more fundamental: the nature of mind, viewable from first- and third-person perspectives respectively. And, “If consciousness and its physical correlates are actually complementary aspects of a psychophysical mind, we can close the “explanatory gap” in a way that unifies consciousness and brain while preserving the ontological status of both. It also provides a simple way of making sense of all four forms of physical (P) and mental (M) causal interaction. Operations of mind viewed from a purely external observer’s perspective (P→P), operations of mind viewed from a purely first-person perspective (M→M), and mixed-perspective accounts involving perspectival switching (P→M; M→P) can be understood as different views (or a mix of views) of a single, psychophysical information process, developing over time. In providing a common psychophysical ground for brain and experience, such a process also provides the “missing link” required to explain psychosomatic effects.” (Velmans, 2000, p 251).
However, Mangan goes on to claim that this conception of the mental/physical distinction in terms of first-person/third-person contrasts “points to a serious mistake of emphasis: The relationship of consciousness to the rest of the physical world is clearly not as disjoint as Velmans, and many others, take it to be.” According to him, I treat “the relationship between mental and physical in oppositional terms” and “as decidedly more incompatible than integrated.” (p x) Given my fairly extensive writings on the complementary nature of mental and physical causal accounts (dating back to Velmans, 1991a,b) and my suggestion that the mental and physical are dual-aspects of the one mind, readers might find this puzzling. The reason for his claim nevertheless becomes clear in what follows: he hopes to establish an oppositional relationship in my own approach (whether it is there or not) in order to establish an advantage for his own approach—which tries to avoid a consciousness/brain opposition in an entirely third-person, physicalist way. His “presumption is that the relationship between “mental” and “physical” is in one sense more integrated, and in another more hierarchical … (than in my own approach) … the mental is taken to be a small subset of entities and processes among the much more inclusive set of entities and processes that, in toto, constitute the material world.” (p x)
I have discussed the problems of physicalism in Velmans (2000 ch3, 2002a Appendix) and the particular problems of hierarchical, “nonreductive physicalism” in my response to Torrance, Van Gulick, and Chrisley & Sloman in Velmans (2002b). Mangan does, however, make a number of additional points in its defence:
To begin with, he challenges my argument that, unlike physical phenomena, first-person consciousness cannot be observed from the outside, and makes the point that “in at least one “conventional, third-person sense” of physical phenomena, unobserved physical entities are accepted by physics all the time. Sub-atomic particles are a standard example.” Of course, I agree. But that misses the implicit thrust of my argument: it is true that, like inferred entities in physics, first-person consciousness cannot be observed from a third-person perspective; but unlike physical phenomena, it can be observed from a first-person perspective (that is the reason for calling it “first-person consciousness”).
Mangan goes on to suggest that we can “go much further than just entertain the possibility that consciousness is something physical. For if the brain causes consciousness, then consciousness is presumptively physical. From the most straightforward, conventional, and completely third-person scientific standpoint, if a physical process is taken to cause something, then that something is presumed to be physical as well.” This is the standard “causation argument”, discussed at length in Velmans (1998, 2000, 2002a). As I have noted in these prior discussions, it is not obvious to many thinkers that conscious experiences are nothing more than physical states of the brain, even if they accept their physical causes to be in the brain, in which case such ‘adjudication by presumption’ begs the question. A non question-begging, empirical investigation of consciousness has to start with an accurate description of its phenomenology—i.e. with its first-person phenomenology and its observable properties. If these properties can be demonstrated to be ontologically identical to an equivalent set of third-person physical properties so be it. But merely identifying their antecedent physical causes won’t achieve this, for the reason that causation and ontological identity are very different relationships. This should be obvious from the fact that dualist-interactionism, epiphenomenalism, dual-aspect theory and physicalism agree that brain states can have causal effects on conscious experiences, while having entirely divergent views about the nature of those experiences. Consequently the identification of causes in the brain won’t settle the issue—and Mangan offers us nothing else.
As I have noted above, I accept that brain states can be said to cause conscious experiences. But I also accept that earlier conscious experiences can be said to cause later ones (in everyday accounts of our mental states), and that conscious experiences can be said to have causal effects on brain and behaviour (for example in psychosomatic effects). Mangan agrees that “we can think of first-person consciousness as having the brain as its cause, and we can think of first-person consciousness as having causal power. We can think of the third-person brain as causing first-person consciousness, and of first-person consciousness as having causal power over the brain.” (p x). He also agrees that, “The concepts “caused by” and “causing” are equally at home in the third-person scientific sense, and in the first-person phenomenological sense.” (p x). However, Mangan does not really come to grips with why all these accounts make sense (the subject of Velmans, 2002a). Contrary to evidence and without any valid, supporting argument, he simply presumes first-person consciousness to be nothing more than a third-person state of the brain.
Mangan's defence of physicalism is driven by the conviction that consciousness plays an important role in human life (a conviction that I share). However, in Mangan’s view, such a role for consciousness can only be secured if its first-person phenomenology can be shown to play a significant third-person causal role in the operations of the brain. While I have no doctrinal objection to the possibility of such a causal role, the evidence adduced in Velmans (1991a, 2000, 2002a) weighs decisively against it. Limited introspective access to mental processing exacerbates the problem. If one is not conscious of one’s own brain/body processing, how could there be conscious control of such processing?
In defence of his position, Mangan claims that volitional phenomenology does contain sufficient information to serve as a basis for conscious control—a point that he hopes to secure by contrasting my own analysis of hesitation pauses in speech (associated with the formulation of ideas and the choice of appropriate words) with a similar analysis by William James. I suggest that, “During a hesitation pause, one might experience a certain sense of effort (perhaps the effort to put something in an appropriate way). But nothing is revealed of the processes that formulate ideas, translate these into a form suitable for expression in language, search for and retrieve words from memory, or assess which words are most appropriate.... The fact that a process demands processing effort does not ensure that it is conscious. Indeed, there is a sense in which one is only conscious of what one wants to say after one has said it!”
In a similar vein, James writes, “Has the reader never asked himself what kind of mental fact is his intention of saying a thing before it has been said? ... How much of it consists of definite sensorial images, either of words or of things? Hardly anything! Linger and the words and the things come to mind, the anticipatory intention, the divination is no more.” However, James goes on to note that, “as the words that replace them arrive, it welcomes them successively and calls them right if they agree with it, it rejects them and calls them wrong if they do not.” (James, 1890, p. 253) James also notes that similar feelings are present during a tip-of-the-tongue experience: “Suppose we try to recall a forgotten name. The state of our consciousness is peculiar. There is a gap therein; but no mere gap. It is a gap that is intensely active. A sort of a wraith of a name is in it, beckoning us in a particular direction, making us at moments tingle with the sense of closeness, and then letting us sink back without the longed for term.” (Ibid., p. 251)
James’ writings on such “fringes” of consciousness are particularly sensitive and Mangan has done special service in resurrecting these writings within cognitive psychology. However James was not a physicalist, and Mangan’s views about what can be concluded from James’ descriptions are his own. Mangan accepts that James and I are in substantial agreement about the sensory poverty of hesitation pauses. As Mangan notes, however, James shows that these pauses, along with TOT states, also contain non-sensory experiences—and I agree. How does this bear on the issue of how fringe consciousness functions? According to Mangan (but not James) “Non-sensory experiences represent precisely the kind of information that consciousness would need if it does exercise voluntary control over its own activity and over non-conscious processing…. non-sensory experiences work to represent the gist of what we want to say before we say it, and signal how well the words that we actually speak fit our antecedent intention. In more abstract functional terms, non-sensory experiences at least appear to represent summary context information in consciousness, and mediate voluntary retrieval by providing “targets” that, in conjunction with shifts in attention, call new information into consciousness.” (p x)
This needs careful unravelling. I agree that, viewed from a first-person perspective, fringe conscious experiences seem to function in the ways that Mangan describes. We have a feeling of what we want to say before we say it and in this sense the feeling provides an implicit target. We also have a sense of whether our words fit our meaning, indicating whether they are ‘on target’ and so on. But where do these feelings about what we want to say come from? Where do the words that make our ideas concrete come from? And where do the judgements about whether or not our words are on target come from? Are the “target feelings” generated by consciousness, the concretising words retrieved from memory by consciousness, and the judgements about being on target made by consciousness? No. A “target feeling” (of having an idea to express) is represented in consciousness, but nothing is revealed of the processes that formulate such ideas; words appear in covert or overt speech that express our ideas in language without any conscious knowledge of how to search for and retrieve such words from memory; a feeling of rightness or goodness of fit arises (if our expression goes well) without any explicit knowledge of how we assess such fit—and once the feeling arises the judgement has already been made.
In sum, I agree with Mangan that the fringe of consciousness contains feelings and judgements about material at the focus of attention thereby providing context in a highly compressed form. But this does not resolve the issue of how such feelings arise or how first-person experiences could have third-person causal effects on the brain. If fringe and other experiences can be reduced to physical states of the brain there is no mind/body problem. If conscious experiences cannot be reduced to such physical states (as I contend) having compressed information represented as feelings does not explain how such feelings could affect third-person neural processing. Nor does being “fringe” alter the relationship between such conscious experiences and preconscious mind/brain processing. Like conscious wishes and decisions, feelings of having a thought to express, verbal expressions, and consequent feelings of ‘good fit’ represent the global state of our current cognitive processing and for everyday purposes we can treat these as if they constitute our mental processing (in first-person accounts of what is going on). But like wishes, decisions, and experiences at the focus of attention, they are the result of preconscious processing in the mind/brain.
If we are to believe Claxton, consciousness cannot be understood by means of conscious thought. Rather, “its attempts to explicate itself in conscious representations are to be marvelled at and enjoyed, not to be taken as serious candidates for the ‘truth’ about human nature or indeed the mind-body problem.” (p x) If true, “serious” studies of consciousness in Western philosophy and science might have to close up shop!
But, perhaps this is just a bit of light entertainment. Unless I have missed something, Claxton’s claim is itself an attempt by Claxton’s consciousness “to explicate itself in conscious representations.” Given the self-referential nature of his claim, he faces a conundrum: If his statement is true, we cannot take it seriously. And it is only if it is false that we can take it seriously.
What leads Claxton into this cul-de-sac? He observes that there is often a mismatch or at least a delay between the wishes and decisions that pass through the conscious mind and subsequent action. “Mind and body are loosely correlated, not tightly coupled.” He then goes on to suggest that the absence of a tight coupling between experiences that pass through the conscious mind and actions rules out any possibility of finding any clear solutions to the problems of consciousness. Why? Because conscious experiences (including complex abstract trains of thought) are intermittent products of complex brain and body interactions conditioned by given contexts, and not windows through which these workings can be directly observed. In such circumstances, he suggests, “it is much more profitable – assuming one is free to do so – to look for a third entity, perhaps not directly visible itself, which can be seen as the causal progenitor of both the mind state and the body state.” (p x)
As it happens, some of these observations have also been central to my own work. There is far more to human information and other biological processing than is manifest in conscious experience (Velmans, 1991a, 2000). The human mind is also adept at multitasking, so the correlation between what passes through conscious experience and immediate action is a loose one. Like Claxton I find recent embodied, enactive approaches to mind interesting and potentially useful—although I do not think that they address the classical problems of consciousness. And it should be obvious from the above (and from Velmans, 2000, 2002a,b) that the dual-aspect monism that I adopt assumes the existence of a third entity, “psychophysical mind,” that is the causal progenitor of both brain states and conscious experiences.
However, none of this prevents a systematic search for the neural correlates of given experiences, or prevents a systematic enquiry into how first-person experiences relate to third-person processing. Nor does a loose correlation between experience and action undermine the potential veracity and utility of first-person accounts in everyday life or affect my analysis of such accounts in Velmans (2002a). Claxton worries about the mismatch between his conscious intentions and actions: “When I say to myself ‘I’m going to get up now’, the truth is I often don’t. Statistically, the lag between intention and act is highly variable: sometimes I do get up immediately; sometimes it takes an hour. Often, what I observe is that I experience the intention, no related act occurs, my mind drifts onto other things, and at some unpredictable time later I find myself in the bathroom half way through shaving.” This leads him to doubt the veracity of such accounts. I don’t see why. His account of the lag between his intentions and actions is probably accurate. Conscious thoughts and intentions represent momentary states of one’s cognitive and motivational systems that may or may not be immediately expressed in action. But what I consciously think or feel may represent something true and important about what is going on in my own mind, whether I choose to act on it immediately or not.
Emilios Bouratinos’ thoughtful review (of my target article, the eight commentaries and the response) is much more accepting of “attempts by consciousness to explicate itself” and provides a nice counterpoint to Claxton. Bouratinos makes the point that what arises in consciousness is one objectification (of many potential objectifications) that results from the dynamic operations of the preconscious mind. Like Claxton, he stresses that such conscious contents point to the realities that they objectify, but they do not mirror them in any complete sense. However, for Bouratinos, the consequence is not that we should abandon conscious self-reflection or a systematic examination of how conscious experiences relate to preconscious processing. Rather, we should engage in these investigations more deeply—following a tradition dating back to Socrates, Plotinus and Goethe. Pushed to its limits this involves an examination of “pre-epistemic processes”, requiring a first- and third-person investigation of how our preconscious minds fashion the so-called ‘objective’ data that forms the basis for our subsequent concepts and theories. I entirely agree.
What are we to conclude? Mental states such as intentions and decisions and how they affect our actions are not just of interest to academics, but also of central concern to practical men in all walks of life. As David Bakan notes:
“Politicians concern themselves with the mental states of their constituents and others. Military commanders are particularly concerned with the mental states of those against whom they are warring, as well as the mental states of those on whom they spy. The mental events in the minds of Einstein, Fermi, Szilard, and other physicists, in connection with atomic energy, were of no small moment with respect to the physical world. Deceivers are very concerned with the mental states of those whom they deceive and vice versa. Lenders are concerned with the mental states of those who borrow. Salesmen and advertising agents are concerned with the mental states of potential and actual customers. Everybody has an interest in the mental states of motor vehicle operators.” (Bakan, 1980, p127)
The state of mind accompanying an illegal act that makes it a crime (mens rea) is also central to the decisions arrived at by the courts:
“Judges concern themselves with the mental state of the accused. They are interested in whether there was an intention to murder or not. A United States Supreme Court decision on discrimination ruled that disproportionality itself could not be taken as discrimination. The court ruled there had to be evidence of intention to discriminate.” (Ibid, p127)
And whether an act is “voluntary” or “involuntary” is a matter of equal concern in such adjudications. It is generally assumed that conscious free will (evidenced by conscious intentions, choices and acts) underwrites legal responsibility, and according to Libet and Mangan, it is conscious free will that distinguishes voluntary from involuntary acts, giving consciousness a potent causal role in the activities of body and brain. However, as Claxton and Bouratinos point out, what is represented in consciousness is only a pale reflection of the complex operations of the preconscious and unconscious mind/brain, and in the present analysis I have given reasons to doubt both Libet’s (dualist) and Mangan’s (physicalist) way of making sense of consciousness/brain causal interactions.
Can we nevertheless reconcile the commonsense view that we are responsible for our voluntary acts with the findings of science? I think that we can. When preconscious mental processes have an appropriately complex architecture they can make informed choices (within the constraints of heredity and environment) in the light of inner needs, goals and external contingencies. Conscious experiences arise from such preconscious mental operations. Once they arise, they usually represent those operations in a highly compressed, global way—but nevertheless faithfully, and for practical purposes we can take them to be those operations. If I know that an act is unlawful, but consciously choose to commit it, this reflects my state of mind irrespective of whether my conscious awareness of that state is determined by preconscious mental processing. “I” include my unconscious and preconscious mind/brain as well as my conscious experience. This allows one to establish mens rea and legal responsibility. I could plead that my conscious decision to commit a crime can’t be held responsible, as it was determined preconsciously, by my brain. But then the judge could say: “The court accepts that your conscious decision is not guilty, but we will have to jail your brain!”
Bakan, D. (1980) On the effect of mind on matter, in R. W. Rieber (ed.) Body and Mind: Past, Present and Future, New York: Academic Press.
Chisholm, R.C. and Karrer, R. (1983) Movement-related brain potentials during hand squeezing in children and adults. International Journal of Neuroscience, 19, 243-258.
Chisholm, R.C. and Karrer, R. (1988) Movement related potentials and control of associated movements. International Journal of Neuroscience, 42, 131-148.
Chisholm, R. C., Karrer, R. and Cone, R. (1984) Movement-related ERPs in children during right versus left hand squeeze: Effects of age, motor control, and independence of components. In R. Karrer, J. Cohen and P. Tueting (eds). Brain and information: Event related potentials, Vol. 425, New York: New York Academy of Sciences.
Deecke, L. (1980) Influence of age on human cerebral potentials associated with voluntary movement. In D. G. Stein (ed) The Psychobiology of Aging: Problems and Perspectives, Amsterdam: Elsevier.
Deecke, L., Scheid, P. and Kornhuber, H. H. (1969) Distribution of readiness potential, premotion positivity, and motor potential of the human cerebral cortex preceding voluntary finger movements. Experimental Brain Research, 7, 158-168.
James, W. (1890). The Principles of Psychology. New York: Holt.
Karrer, R., Warren, C. and Ruth, R. (1978) Steady potential activity of the brain preceding non-cued and cued movement: Effects of development and mental retardation. In D.Otto (ed) Multidisciplinary Perspectives in Event-Related Potential Brain Research. Washington, D.C.: U.S. Government Printing Office, NTIS, 322-330.
Konttinen, N. and Lyytinen, H. (1992) Physiology of preparation: Brain slow waves, heart rate, and respiration preceding triggering in rifle shooting. International Journal of Sport Psychology, 23, 110-127.
Konttinen, N. and Lyytinen, H. (1993) Brain slow waves preceding time-locked visuo-motor performance. Journal of Sport Sciences, 11, 257-266.
Konttinen, N., Lyytinen, H. & Era, P. (1999) Brain slow potentials and postural sway behavior during sharpshooting performance. Journal of Motor Behavior, 31(1), 11-20.
Konttinen, N., Lyytinen, H. & Konttinen, R. (1995) Brain slow potentials reflecting successful shooting performance. Research Quarterly for Exercise and Sport, 66, 64-72.
Konttinen, N., Lyytinen, H. and Viitasalo, J. (1998) Rifle-balancing in precision shooting: behavioural aspects and psychological implication. Scandinavian Journal of Medicine and Science in Sports, 8, 78-83.
Libet, B. (1985) Unconscious cerebral initiative and the role of conscious will in the initiation of action. Behavioral and Brain Sciences, 8, 529-566.
Libet, B., Wright Jr., E.W., Feinstein, B. and Pearl, D.K. (1979) Subjective referral of the timing for a conscious experience: A functional role for the somatosensory specific projection system in man, Brain 102, 193-224.
McFadden, J. (2002) Synchronous firing and its influence on the brain’s electromagnetic field: evidence for an electromagnetic field theory of consciousness. Journal of Consciousness Studies, 9(4), 23-50.
Vaughan, H. G., Costa, L. and Ritter, W. (1968) Topography of the human motor potential. Electroencephalography and Clinical Neurophysiology, 25, 1-10.
Velmans, M. (1991a) Is human information processing conscious? Behavioral and Brain Sciences 14(4), 651-701.
Velmans, M. (1991b) Consciousness from a first-person perspective. Behavioral and Brain Sciences 14(4), 702-726.
Velmans, M. (1996) Consciousness and the “causal paradox.” Behavioral and Brain Sciences, 19(3), 537-542.
Velmans, M. (1998) Goodbye to reductionism. In S.Hameroff, A.Kaszniak & A.Scott (eds) Towards a Science of Consciousness II: The Second Tucson Discussions and Debates. MIT Press, pp 45-52.
Velmans, M. (1999) Intersubjective science. Journal of Consciousness Studies, 6(2/3), 299-306.
Velmans, M. (2000) Understanding Consciousness, London: Routledge/Psychology Press.
Velmans, M. (2002a) How could conscious experiences affect brains? Journal of Consciousness Studies, 9(11), 3-29.
Velmans, M. (2002b) Making sense of causal interactions between consciousness and brain. Journal of Consciousness Studies, 9(11), 69-95.
Velmans, M. (2003) How could conscious experiences affect brains? Exeter: Imprint Academic.
Warren, C. and Karrer, R. (1984a) Movement-related potentials in children: A replication of waveforms and their relationships to age, performance, and cognitive development. In R. Karrer, J. Cohen and P. Tueting (eds). Brain and information: Event related potentials, Vol. 425, New York: New York Academy of Sciences.
Warren, C. and Karrer, R. (1984b) Movement related potentials during development: A replication and extension of relationships to age, motor control, mental status and IQ. International Journal of Neuroscience, 24, 81-96.
 If a voluntary act can be initiated preconsciously, then how can preconscious processing be entirely involuntary?
 The following discussion focuses only on the problem of free will. Readers interested in the full analysis of conscious causation should refer to the special issue of JCS, 11(9), 2002, reprinted in full as Velmans (2003).
 We can only choose to act within the range of human possibility, constrained by heredity and environment, past experience, inner needs and goals, available strategies, current options offered by physical and social contexts and so on.
 Libet prefers to call this initiation “unconscious” rather than “preconscious” for the reason that subjects have no reportable awareness or intuitive feeling that the brain has started a process before their conscious wish/urge to act appears. However this difference in our use of terms has more to do with a difference in emphasis than any real theoretical difference on this point. I call the initiation preconscious to stress the point that it occurs before consciousness of the wish to act appears, but I agree with Libet that “Unconscious initiation of the voluntary process appeared to mean that conscious free will could not actually ‘tell’ the brain to begin its preparation to carry out a voluntary act.” (p x)
 The alternative view, that conscious decisions (like wishes) are initiated/generated preconsciously, avoids such asymmetries.
 Libet’s suggestion that preconscious wishes always activate action while conscious decisions merely allow it or inhibit it introduces another questionable asymmetry. The reverse might also apply. One might, for example, have a strong wish or desire not to do something and nevertheless decide to do it (e.g. for social reasons). In sum, even if we were to accept Libet’s two-stage model in which a preconsciously generated wish is followed by a conscious decision, there are four distinct scenarios:
a) a wish to do something is followed by a decision to do it (allow)
b) a wish to do something is followed by a decision not to do it (veto)
c) a wish not to do something is followed by a decision not to do it (allow)
d) a wish not to do something is followed by a decision to do it (veto).
It should be clear that these scenarios can in principle dissociate preconscious preparations to carry out or to avoid acts, which are presumably associated with conscious wishes to do or not to do something, from the activation or inhibition of musculature consequent on decisions to carry out or to veto those wishes (see discussion below). For example in d) a conscious veto would activate appropriate muscle groups while in b) a veto would be inhibitory.
 Konttinen and Lyytinen refer to SPs (slow potentials), whereas Karrer and his colleagues call the evoked potentials that they studied MRPs (movement-related potentials). Both programmes, however, studied slow potentials preparatory to a motor act, making their results comparable in spite of differences in preferred terminology. Libet prefers to refer to RPs (readiness potentials). In the present paper I use these terms interchangeably.
 It should be possible to doubly dissociate decisions to act or not to act from wishes to do or not to do something (see note 3 above).
 See, for example, the discussion of analogies in Velmans (2002b). In fairness to Libet, it should be noted that his commentary was written in response to Velmans (2002a) prior to my (2002b) reply.
 A full discussion of the sense in which a thing-itself is unknowable is given in Velmans (2000) ch7.
 See also the discussion of the primacy of the physical versus the mental and the “problem of origins” in Velmans (2002b).
 The latter claim distinguishes his view from forms of “nonreductive physicalism”, which accept that higher order properties of the brain may have functions that resemble the functions of mind, while nevertheless remaining physical.
I give a much fuller analysis of the problems of dualism in Velmans (2000) chapter 2.
 I prefer to call these particular laws psychophysical rather than physical, but this does not affect the assumption that the laws are deterministic.
 To support his case that feelings of volition actually cause free choice rather than merely represent it, Mangan claims that, unlike other conscious experiences, conscious volitional contents are not representational. But he asserts this without explanation or argument. There is an obvious sense in which such contents are representational. They represent the degree to which choices are free (within human limits) or are constrained, ranging from feeling entirely free to choose and act, to feeling entirely constrained, with many gradations in between.
 As with Libet, Mangan’s commentary was written prior to the publication of Velmans (2002b).
 Mangan is wrong to suggest that I “assume from the start that consciousness cannot be a physical entity or process.” (p x) Unlike Mangan, I just don’t presume from the start that it is a physical process.
 If A is identical to B, it follows that B is identical to A (symmetry) and all the properties of A are the same as all the properties of B (obeying Leibniz’s Law). If A causes B, it does not follow that B causes A (asymmetry), nor are all the properties of A identical to the properties of B (causation does not obey Leibniz’s Law). See Velmans (1998, 2000, 2002a) for a more detailed discussion.
 Mangan goes on to admit that third-person science draws on the first-person evidence of consciousness all the time, and that one of the main aims of psychophysics is to establish the relationships between first- and third-person standpoints (see his note 1). I have often made the same points, concluding that first- and third-person perspectives are complementary and mutually irreducible (Velmans, 1991a)—a theme that I have developed at length in Velmans (1999, 2000 ch8). Mangan does not appear to realise that to establish the “relationships between first- and third-person standpoints”, psychophysics requires valid first-person observations as well as third-person observations. Conversely, in challenging the veridical nature of first-person observations, physicalism undermines psychophysics. It also saws away the (first-person) branch on which much of third-person science sits.
 In Velmans (2000) I develop the suggestion that consciousness enables us to real-ize the world (in becoming aware of things they become subjectively real for us). Without consciousness, life would be like nothing.
 Given that the physical world is causally closed one would in any case expect the neural correlates of a fringe feeling to encode the same compressed information, and, viewed from a purely third-person perspective, these correlates would already do the cognitive work ascribed by Mangan to that first-person feeling.
 They are in essence extended third-person approaches that explicate functioning and behaviour of the mind and body in terms of their dynamic interactions with the surrounding context. As such they do not directly address the puzzle of how first-person experiences relate to third-person functioning.
 If the act can be shown to be involuntary or unintentional, for example in cases of brain damage or other forms of diminished responsibility, this can be entered as a plea in mitigation.