Tuesday, November 1, 2011

Is the mind a computer?

Is the mind a machine?  Daniel Dennett seems to think so.

The brain is an organ.  Organs are organic machines, components in a composite machine (the composite machine being the animal itself).  One can probably analyze the brain the same way and see that it, too, is a composite machine, composed of smaller machines.  Maybe the mind is simply one of the smaller machines that compose the brain.

Is the mind a computer?  Dennett calls the mind a “virtual machine”.  The term “virtual machine” can mean different things in technical contexts: the JVM (Java Virtual Machine) and a VMware image (VM here stands for “virtual machine”) are two very different things.  Mostly what Dennett is trying to say by using this term is that the mind is software as opposed to hardware.  This distinction is very important to Dennett, and he uses it to make some interesting points, but I find it problematic and sometimes distracting.  So for the moment I want to ignore the hardware/software distinction.

Computers, unlike many other machines, deal with data.  A computer stores data in memory (and on disk, and in registers, but let’s ignore these distinctions for the moment).   The contents of memory change over time.  This gives rise to a distinction that I do not want to ignore: the distinction between data and behavior.  A computer does things, but computer memory doesn’t do anything.  The computer’s CPU does things.  The memory is there for the CPU to play with.
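
To make the distinction vivid, here is a small sketch (my own illustration in TypeScript, not anything from Dennett): the memory is a plain array that does nothing on its own, and all of the behavior lives in a CPU-like function that manipulates it.

// The memory: inert data that does nothing on its own.
const memory: number[] = [3, 1, 4, 1, 5];

// The “CPU”: the behavior lives here, reading and writing the memory.
function step(mem: number[]): void {
  for (let i = 0; i < mem.length; i++) {
    mem[i] = mem[i] + 1;  // the CPU acts; the memory is merely acted upon
  }
}

step(memory);  // memory is now [4, 2, 5, 2, 6]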

The CPU can do things, but without the memory, it doesn’t know what to do!  It looks to the memory for instructions on what to do next.  This adds complexity, so some computers and some programs segment the memory into a part that contains instructions and a part that contains everything else, the data.
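
Here is an equally rough sketch of that arrangement (again my own toy illustration, not a real architecture): the instruction memory tells the CPU what to do, and the data memory is what it does it to.

// Instructions the CPU understands in this toy machine.
type Instruction =
  | { op: "inc"; addr: number }  // add 1 to a cell of data memory
  | { op: "halt" };

// Instruction memory: what to do.
const instructionMemory: Instruction[] = [
  { op: "inc", addr: 0 },
  { op: "inc", addr: 1 },
  { op: "halt" },
];

// Data memory: what it is done to.
const dataMemory: number[] = [10, 20];

// The CPU: it looks to the instruction memory to find out what to do next.
function run(): void {
  let pc = 0;  // program counter
  while (instructionMemory[pc].op !== "halt") {
    const instr = instructionMemory[pc];
    if (instr.op === "inc") {
      dataMemory[instr.addr] += 1;
    }
    pc += 1;
  }
}

run();  // dataMemory is now [11, 21]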

If we model the mind as a computer, consciousness should be modeled as the data memory of the computer.  A mind can be conscious of different kinds of things.  The simplest contents of consciousness may be sensory perceptions, but the contents certainly include other things as well: emotions, thoughts, and so on.

I sometimes like to use JSON (JavaScript Object Notation) as a semi-formal, or pseudocode, way of visualizing the contents of data memory.  Here is a pseudocode model of a conscious state:

{
  "visualField": "Some bitmap with a computer screen",
  "desires": [
    "Solve the mind-body problem",
    "Publish a philosophical paper"
  ],
  "volition": "typing",
  "beliefs": [
    "The mind is a computer",
    "God exists"
  ]
}

At any other point in time, the data memory of this mind probably has contents that differ to some extent or another, for instance:

{
  "visualField": "Israeli salad",
  "desires": [
    "Solve the mind-body problem",
    "Publish a philosophical paper"
  ],
  "volition": "eating",
  "beliefs": [
    "The mind is a computer",
    "God exists"
  ]
}

This “JSON object” represents the contents of consciousness, which is the data memory of the mind.  Presumably there is also something analogous to the CPU and the instruction memory, something that controls or influences the transitions between conscious states, but those mechanisms are not themselves contents of consciousness.
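
To make that a bit more concrete, here is a sketch (my own, in TypeScript; the transition rule is invented purely for illustration) in which the conscious state is pure data, and the rule that moves the mind from one state to the next lives outside the state, the way instructions live outside data memory.

// A conscious state: the “data memory” of the mind, shaped like the JSON above.
interface ConsciousState {
  visualField: string;
  desires: string[];
  volition: string;
  beliefs: string[];
}

// The analogue of the CPU and instruction memory: a rule that drives the
// transition from one conscious state to the next.  The rule itself never
// appears inside the state.
function transition(state: ConsciousState): ConsciousState {
  // A toy rule, invented for illustration: typing eventually gives way to lunch.
  if (state.volition === "typing") {
    return { ...state, visualField: "Israeli salad", volition: "eating" };
  }
  return state;
}

const before: ConsciousState = {
  visualField: "Some bitmap with a computer screen",
  desires: ["Solve the mind-body problem", "Publish a philosophical paper"],
  volition: "typing",
  beliefs: ["The mind is a computer", "God exists"],
};

const after = transition(before);  // after.volition === "eating"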

Many have emphasized the “higher-order” nature of consciousness, which means that the mind can think about itself.  Some, like Hofstadter (and my father), seem to suggest that it is this “higher-order” nature that makes consciousness what it is.  Dennett is a bit more cautious about this point, but he too finds great importance in this higher-order-ness.  Higher-order-ness can also be modeled in JSON, and I think modeling it in this way can help clarify what higher-order-ness means:

{
  "visualField": "Bitmap with a computer screen",
  "thoughts": [
    "I see a computer in front of me",
    "There is a computer in front of me",
    "I think therefore I am"
  ]
}

When the mind thinks about itself, it does so with a very high degree of accuracy.  There is nothing logically necessary about this; it’s just how the mind works.
