Vacchablogga

January 1, 2020

Anthropic bias and the basis of consciousness

Functionalists think that every system that has the right functional organization is conscious.1 For example, they think that an advanced computer that has the same functional organization as my brain would be conscious, even if the computer is made out of silicon instead of grey matter. Non-functionalists deny this. They think that in addition to having the right functional organization, a system must be made out of the right material (grey matter, or whatever) to be conscious.

I have what I used to think was a good argument for functionalism. Now I think the argument is mistaken, for reasons I’ll explain below.

The argument goes like this. Suppose that you pick a material at random, give it the functional organization of our brains, and find that it supports consciousness. That counts in favour of the view that most or all materials support consciousness. (If only, say, 1% of materials supported consciousness, then there is only a 1% chance that you would have picked one that supports consciousness.) But we can think of ourselves as such a random sample. After all, there are many different materials our brains could have been made out of. Evolution doesn’t care which one our brains are made out of, so long as it gets the job done.
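To spell out the update the argument relies on, here is a minimal numerical sketch in Python. The candidate fractions and the uniform prior are my own illustrative choices, not part of the original argument; the point is only that, taken naively, observing a consciousness-supporting sample shifts credence toward hypotheses on which most materials support consciousness.

```python
# A minimal sketch of the intended Bayesian update (illustrative numbers only).
# f is the unknown fraction of materials that support consciousness.

candidate_fractions = [0.01, 0.25, 0.5, 0.75, 1.0]
prior = {f: 1 / len(candidate_fractions) for f in candidate_fractions}

# Naive likelihood: the chance that a randomly picked material supports
# consciousness is just f itself.
likelihood = {f: f for f in candidate_fractions}

evidence = sum(prior[f] * likelihood[f] for f in candidate_fractions)
posterior = {f: prior[f] * likelihood[f] / evidence for f in candidate_fractions}

for f in candidate_fractions:
    print(f"f = {f:4.2f}: prior {prior[f]:.2f} -> posterior {posterior[f]:.2f}")
```

Run as written, the posterior concentrates on the higher fractions, which is exactly the inference the argument wants to draw.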

The problem with the argument is that it fails to take into account observation selection effects. However limited the range of consciousness-supporting materials is, every conscious observer will find that the material that they are made out of supports consciousness. If it didn’t support consciousness, then they wouldn’t be able to make any conscious observations. Given this, we are at best justified in thinking of ourselves as a random sample of the conscious systems that are functionally organized in the relevant way. And nothing interesting follows from the trivial fact that a random sample of these conscious systems is conscious.
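To see why the correction matters, consider a toy simulation (again my own illustrative setup, not something from the post): only systems made of a consciousness-supporting material ever get to make the observation, so the observed frequency of “my material supports consciousness” comes out at 100% whatever the true fraction is.

```python
import random

# Toy model: systems are made of random materials; only those made of a
# consciousness-supporting material become conscious observers and get to
# check what they are made of.

def fraction_of_observers_finding_support(true_fraction, n_systems=100_000):
    observers = 0
    observers_finding_support = 0
    for _ in range(n_systems):
        supports_consciousness = random.random() < true_fraction
        if supports_consciousness:
            observers += 1
            observers_finding_support += 1  # every observer finds support
    return observers_finding_support / observers if observers else float("nan")

for true_fraction in (0.01, 0.5, 0.99):
    print(true_fraction, fraction_of_observers_finding_support(true_fraction))
```

Since the observation has probability 1 under every hypothesis about the fraction, it cannot favour one hypothesis over another.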

To see this point more clearly, consider the following parody argument:

Suppose that you pick a planet at random and find that it is habitable. That counts in favour of the view that most or all planets are habitable. But we can think of Earth as such a random sample. So the fact that Earth is habitable counts in favour of the view that most or all planets are habitable.

This argument makes the same mistake as the above argument for functionalism. However limited the range of habitable planets is, every observer will find that the planet they inhabit is habitable. If it wasn’t habitable, then they wouldn’t be able to observe themselves inhabiting it because they wouldn’t exist.2

Of course, this doesn’t mean that functionalism is false. I think that a different argument for it probably succeeds. But I am now a bit less sure of functionalism than I used to be, since some of my confidence came from thinking that the argument in this post succeeded.

Complications

Above I talked about conscious observers. But what about unconscious observers? Why can’t we think of ourselves as a random sample of all observers—both the conscious and the unconscious ones?

Flat-footed reply: Unconscious “observers” aren’t really observers.

A more interesting reply: We can, but even the unconscious observers wouldn’t find that they lack consciousness. For example, a computer that has the same functional organization as my brain would believe that it is conscious. If it didn’t believe this, then its functional organization would differ from the functional organization of my brain, since I believe that I am conscious. So every observer with the relevant functional organization will at least believe that they are conscious, even if they aren’t.


  1. This is a very rough characterization of functionalism. For an explanation of what I mean by ‘functional organization’, see the first section of Chalmers' ‘Absent Qualia, Fading Qualia, Dancing Qualia’ (1995). Sometimes the term ‘functionalism’ is used to refer to the much stronger view that mental states are identical to functional states. As I’m using the term ‘functionalism’ here, it refers to the more modest view that there is a lawful correlation between mental states and functional states. ↩︎

  2. This example is from chapter 1 of Bostrom’s Anthropic Bias (2002). ↩︎