Degrees of Consciousness

An interesting blog post by William Lycan gives a brisk treatment of the question of whether consciousness comes in degrees, or is the kind of thing you either have or don’t. In essence, Lycan thinks the answer depends on what type of consciousness you’re thinking of. He distinguishes three: basic perceptual consciousness, ‘state consciousness’ where we are aware of our own mental state, and phenomenal consciousness. In passing, he raises some intriguing questions about perceptual consciousness. We can assume that animals, broadly speaking, probably have perceptual consciousness but not state consciousness, which seems primarily if not exclusively a human matter. So what about pain? If an animal is in pain, but doesn’t know it is in pain, does that pain still matter?

Leaving that one aside as an exercise for the reader, Lycan’s answer on degrees is that the first two varieties of consciousness do indeed come in degrees, while the third, phenomenal consciousness, does not. Lycan gives a good ultra-brief summary of the state of play on phenomenal consciousness. Some just deny it (that represents a ‘desperate lunge’ in Lycan’s view); some, finding it undeniable, lunge the other way – or perhaps fall back? – by deciding that materialism is inadequate and that our metaphysics must accommodate irreducibly mental entities. In the middle are all the people who offer some partial or complete explanation of phenomenal consciousness. The leading view, according to Lycan, is something like his own interesting proposal that our introspective categorisation of experience cannot be translated into ordinary language; it’s the untranslatability that gives the appearance of ineffability. There is a fourth position out there beyond the reach of even the most reckless lunge, which is panpsychism; Lycan says he would need stronger arguments for that than he has yet seen.

Getting back to the original question, why does Lycan think the answer is, as it were, ‘yes, yes, no’? In the case of perceptual consciousness, he observes that different animals perceive different quantities of information and make greater or lesser numbers of distinctions. In that sense, at least, it seems hard to argue against consciousness occurring in degrees. He also thinks animals with more senses will have higher degrees of perceptual consciousness. He must, I suppose, be thinking here of the animal’s overall, global state of consciousness, though I took the question to be about, for example, perception of a single light, in which case the number of senses is irrelevant (though I think the basic answer remains correct).

On state consciousness, Lycan argues that our perception of our mental states can be dim, vivid, or otherwise varied in degree. There’s variation in the actual intensity of the state, but what he’s mainly thinking of is the degree of attention we give it. That’s surely true, but it opens up a couple of cans of worms. For one thing, Lycan has already argued that perceptual states come in degrees by virtue of the amount of information they embody; now state consciousness, which is consciousness of a perceptual state, can also vary in degree because of the level of attention paid to that perceptual state. That in itself is not a problem, but to me it implies that the variability of state consciousness is really at least a two-dimensional matter. The second question is, if we can invoke attention when it comes to state consciousness, should we not also be invoking it in the case of perceptual consciousness? We can surely pay different degrees of attention to our perceptual inputs. More generally, aren’t there other ways in which consciousness can come in degrees? What about, for example, an epistemic criterion, i.e. how certain we feel about what we perceive? What about the complexity of the percept, or of our conscious response?

Coming to phenomenal consciousness, the brevity of the piece leaves me less clear about why Lycan thinks it alone fails to come in degrees. He asserts that wherever there is some degree of awareness of one’s own mental state, there is something it’s like for the subject to experience that state. But that’s not enough; it shows that you can have no phenomenal consciousness or some, but not that there’s no way the ‘some’ can vary in degree. Maybe sometimes there are two things it’s like? Lycan argued that perceptual consciousness comes in degrees according to the quantity of information; he didn’t argue that we can have some information or none, and that therefore perceptual consciousness is not a matter of degree. He didn’t simply say that wherever there is some quantity of perceptual information, there is perceptual consciousness.

It is unfortunately very difficult to talk about phenomenal experience. Typically, in fact, we address it through a sort of informal twinning. We speak of a red quale, though the red part is really the objective bit that can be explained by science. It seems to me a natural prima facie assumption that phenomenal experience must ‘inherit’ the variability of its objective counterparts. Lycan might say that, even if that were true, it isn’t what we’re really talking about. But I remain to be convinced that phenomenal experience cannot be categorised by degree according to some criterion or other.

15 thoughts on “Degrees of Consciousness”

  1. There is experience of consciousness, and there is evidence of consciousness. One depends on the other. I can attribute consciousness to an amoeba; it is unlikely that the amoeba can do the same to me (or even itself). So, if consciousness does not come in degrees, then am I an “apex consciousness”? Unlikely. The book ‘The Origin of Consciousness in the Breakdown of the Bicameral Mind’ discussed a type of consciousness wherein the hemispheres of the brain are less integrated and communicate by a kind of messaging. That would be consciousness of an entirely different kind from the one we are accustomed to.

    Sometimes I think there is a tendency in philosophy to ‘over-philosophize’ what is otherwise pretty straightforward. If a hard-line reductive materialist flat-out denies consciousness, why argue the point? Who would you be arguing with…?

  2. Aren’t categories like quantum and qualia, subjectivity and objectivity the means for meanings of conscious…
    …isn’t also, consciousness would proceed any dimension any time…

    Every moment we have, a result of past and future moments…

  3. I’m a bit surprised to find myself typing this, but I think Lycan is right. To understand why he’s right, though, we have to remember what phenomenal consciousness is. It’s consciousness from the inside, the subjective side. As such, it’s a construction of perceptual consciousness and state consciousness. It only exists subjectively.

    Importantly, our ability to assess our own consciousness is limited by our own consciousness. If that consciousness is in a lesser state, our ability to perceive its lesser state is also lessened. And subjectively, we either perceive ourselves to be having subjective experience, or we don’t perceive anything at all (being unconscious, asleep, anesthetized, etc.).

    So, purely from the phenomenal perspective, the only perspective where phenomenal consciousness exists, phenomenal consciousness seems all or nothing.

    On state consciousness, I don’t think it’s a human-only capability. But humans have more of it than any other animal (probably). As Lycan discusses, most vertebrates will self-administer painkillers when injured, indicating they know they’re in pain. They have affect consciousness. But most of them don’t know that they know; that is, they don’t have metacognition, particularly not to the recursive degree humans do.

  4. Very reasonable as usual, Peter. I would go a little further and suggest that the fact of having or not-having phenomenal consciousness is itself fuzzy, like baldness or non-baldness. (I guess non-baldness is fuzzier than baldness 😀 ) I think of phenomenal consciousness as, roughly speaking, interoceptive information that is poised to be broadcast to a global workspace. Which depends among other things on a creature’s having a global workspace. Which is a fuzzy matter, when you get right down to it.

  5. Paul, I like the direction you take with consciousness being poised to broadcast to a global workspace. It aligns with the notion of consciousness as a systemic feature, which I’m certain is true – distributed cognition. Consciousness can only be measured with respect to…other consciousness.

  6. Sadly, the piece seems like nothing more than Lycan’s personal opinions about what is “extreme” and what arguments are compelling.

    It’s also odd that he doesn’t see that answers to the Hard Problem are rooted in these mere aesthetic preferences, given that he wrote ‘Giving Dualism Its Due’.

  7. I looked up “state (of) consciousness” and it seems to be about the development of attention…
    …toward sensations, emotions and mentations inside oneself…

    My question: why isn’t philosophy interested in one’s sensations and emotions…
    …mentally we say they are here in us…

    But we do not try to observe, study or write about them as objects; we treat them as subjects…
    …that we sense, feel and think is more than just thinking…

  8. I’ve been using the analogy of laser light for consciousness.

    Firstly, it requires specific physical materials in specific conditions to emit coherent photons at all.

    Secondly, given a system that *can* lase, depending on the specifics, the *amount* of laser light produced varies.

    So there is a need for a specific system, but within that there can be degrees of operation.

  9. It’s not so much a question of whether there are degrees of consciousness as whether there are degrees of reflective awareness—reflective awareness of differences in experience, to which an entity could reliably and intelligently react. Question to our worthy host: Might our kind of intelligence enable our kind of consciousness, or might our kind of consciousness enable our kind of intelligence—or are the matters inextricably entwined?

  10. I was chasing around looking for experiments that might test whether nonhuman animals show evidence that they can remember what they were thinking about. There are a couple that purport to show they can remember what they think the other animal is thinking about, e.g. where one ape works out that a second ape has a false belief about where something is hidden. One human strategy for this is imagining oneself in the position of the other and working out the conclusions one would have reached – that has to be some type of consciousness.

    The other situation might be carrying out a long-term, non-instinctual plan, e.g. trekking 30 minutes to get the right grass for fishing out ants, then returning to the ant nest. One is thinking of the goal, and of the facts of what one has already done.

  11. I think there is good reason to be skeptical of more than just the claim that the third category is without degree; I doubt the validity of the third category itself, and not because I deny the existence of those experiences that are called phenomenal.

    I agree that many animals can experience pain without knowing that they are in pain, while some go beyond that to know they are doing so, and I agree that this is a significant difference. The animals in the latter category, however, know what it is like to be in pain.

    Interestingly, Lycan quotes Nagel essentially making this point, and then writes “Where there is any degree of awareness of one’s own mental state, there is something it’s like for the subject to experience that state.” Am I missing something here, or has he not just made a clear case that the so-called ‘phenomenal consciousness’ is just an aspect of state consciousness?

  12. A “food chain” of plants to animals to being for understanding degrees of consciousness…

    …positing plants are for animals, animals are for consciousness, consciousnesses for what is…

    Then illusory consciousness can be set aside and worked with directly…

  13. It’s hard to separate consciousness from the rest of your body. An eagle is more conscious of its visual surroundings than a person because it has better visual hardware. A bear is less so, as its vision is poorer. There are a host of physical capabilities in different animals that would affect what they are conscious of.

    Another source for differing degrees of consciousness is the neural wiring of the brain itself. Humans have the capacity for language, which allows wide-ranging thoughts to be developed and to appear in our consciousness. No language, no philosophy. Other animals have internal capabilities that humans don’t have. A mouse on the forest floor is very attuned to every vibration and sound, while we humans blissfully tromp around in the woods missing most of what is going on.

    Even what we select from one or more streams of awareness to bring into the conscious mind is likely not a conscious decision.

    So overall, there are degrees of consciousness in all areas, but no evidence of degrees of consciousness in consciousness itself.

  14. I think Frans de Waal’s chapter on consciousness in his new book (Mama’s Last Hug) would definitely be relevant to this discussion. Studying non-human entities is key to our understanding of consciousness and its degrees.
