A few days ago, when I told him I thought his skepticism of mind uploading was a bit overly pessimistic, Massimo Pigliucci pointed out that mind uploading implies dualism and seemed to see this as a strike against it. (The relevant comments are scattered on this thread at his blog. Search for “selfaware” to find the snippets. If you peruse it, don’t miss Disagreeable Me’s contribution.)
Normally, when we say “dualism”, meaning mind-body dualism, we’re referring to the dualism of the non-material soul as opposed to the physical body. This is the substance dualism envisaged by the philosopher René Descartes and many religions.
But as fellow blogger ausomeawestin pointed out to me some time ago, it’s not the only form of dualism. Philosophers contemplate other forms such as predicate dualism and property dualism, which hold that there is only physical substance but that dualism remains a useful logical concept. I’m not sure if what I’m about to describe counts as either predicate or property dualism, but there do seem to be similarities.
In the computer world, there is a type of dualism, although no one calls it that. It’s the dualism between software and hardware, between logical or virtual constructs and physical ones. The computer I’m typing this post on is a Macbook Pro. It has a hardware existence with an Intel processor, memory, storage, etc. But it also has a software identity, running Mac OS X and the applications I’ve installed along with copies of my data.
Being in IT, I use a wide variety of devices, but the software they’re running is usually the more important part. The software is the personality, the “soul” of the experience of using that device. Software isn’t physical. I can’t point to Mac OS X, or iOS, or Android, or Microsoft Windows. Oh sure, I can point to a device running the operating system, or to the box holding the media that it came on (or at least I could before everything went to digital distribution), but none of those things actually is Mac OS X, or Windows, or Linux, or whatever your OS of choice is.
Software is a non-physical entity. It is information, patterns, structures that exist and have effects in the world. It can have several physical instantiations, including as magnetic encodings on a disk, as electrical signals in a cable or as WiFi radio signals when it is being downloaded, or as transistor states in a memory chip or processor when it is executing.
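The point that one pattern can have many physical instantiations can be made concrete. The sketch below (in Python, with illustrative placeholder bytes standing in for a program) holds the same byte pattern in memory and on disk, then confirms the two physically distinct copies embody an identical pattern:

```python
import hashlib
import os
import tempfile

# A tiny stand-in "program": for this sketch, the software just is this
# sequence of bytes. (The bytes are illustrative, not a real binary.)
program = b"\xde\xad\xbe\xef\x00\x01\x02\x03"

# Instantiation 1: the pattern held in RAM as a Python bytes object.
in_memory = bytes(program)

# Instantiation 2: the same pattern written to and read back from disk.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "copy.bin")
    with open(path, "wb") as f:
        f.write(program)
    with open(path, "rb") as f:
        on_disk = f.read()

# Two physically distinct instantiations -- magnetic or flash storage
# versus RAM -- yet one identical pattern:
same = hashlib.sha256(in_memory).digest() == hashlib.sha256(on_disk).digest()
print(same)  # True
```

The hash comparison is the point: what makes both copies “the same software” is nothing physical they share, only the pattern they both encode.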
Teams of programmers are paid to create and maintain the operating systems, applications, and games that we use and play. Before I switched to the dark side (known by some as management) I spent countless hours of my life creating and maintaining software applications. I occasionally pay money to use certain ones, and companies consider software to be valuable intellectual property.
So there’s no doubt that software/hardware dualism exists in computers. The question is, does it exist in biological brains? Is the mind the brain’s software? If so, can the mind be copied and executed somewhere else? How similar is the brain to a computer?
Broadly speaking, both take input, process information according to programming (computers) or instinct and learning (brains), and produce output. For a computer, input typically arrives through the keyboard, mouse, and data ports. Output goes out through the screen and data ports, which may connect to peripheral devices.
In the case of the brain, it receives input through the nervous system, from senses such as sight, hearing, smell, touch, and taste. Output comes in the form of electrical signals sent through the nervous system to muscles throughout the body.
But there are radical differences. Computers store information in transistors that are engineered to be in one of two discrete voltage states. These two states are interpreted as 1s and 0s, which form the backbone of digital processing. The connections between transistors are orderly and determined by designers. Memory usually resides in chips, or regions of a chip, separate from the logic gates that do the actual processing. Everything about the computer’s design is engineered to be computable and discrete.
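The idea that 1s and 0s are an interpretation imposed on continuous voltages can be sketched directly. The threshold and voltage values below are illustrative, not taken from any real chip specification:

```python
# Toy illustration: digital hardware treats a continuous voltage as one
# of two discrete states by thresholding. (Threshold and sample values
# are illustrative, not from any real chip spec.)
V_THRESHOLD = 1.5  # volts: at or above reads as 1, below reads as 0

def read_bit(voltage: float) -> int:
    """Interpret an analog voltage as a discrete binary digit."""
    return 1 if voltage >= V_THRESHOLD else 0

# Eight noisy voltage samples -- the physical reality is analog...
voltages = [0.1, 3.2, 2.9, 0.4, 0.2, 3.1, 0.3, 2.8]

# ...but the interpretation is digital: a clean string of bits.
bits = [read_bit(v) for v in voltages]
print(bits)  # [0, 1, 1, 0, 0, 1, 0, 1]

# Pack the bits into a byte and read it as a character.
byte = 0
for b in bits:
    byte = (byte << 1) | b
print(byte, chr(byte))  # 101 e
```

The noise in the voltages vanishes in the reading: as long as each sample stays on the right side of the threshold, the discrete interpretation is perfectly reliable, which is exactly why engineers design for it.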
Brains are far messier. They appear to store information in synapses, the connections between neurons, the cells that fire electrical signals. Synapses, which come in different types, such as electrical and chemical synapses, can vary in the strength with which they transmit signals, and that strength varies continuously rather than in discrete steps. The brain is not a digital processor but an analog one, and its processing is intermixed with its storage. There is no one central processing area of the brain. Rather, processing is distributed throughout, although in modules dedicated to specific functions and purposes.
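The contrast with a digital gate can be illustrated with a standard artificial-neuron abstraction, a rate model rather than a biological simulation, and with made-up numbers. The synaptic weights are continuous values, and the output shifts smoothly as they change instead of snapping between two discrete states:

```python
import math

# A standard artificial-neuron abstraction (a rate model, not a
# biological simulation). All numbers here are illustrative.
def neuron(inputs, weights, bias=0.0):
    """Weighted sum of inputs passed through a smooth response curve."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid response: the output glides between 0 and 1 rather than
    # snapping between two discrete states.
    return 1.0 / (1.0 + math.exp(-total))

inputs = [1.0, 0.5, 0.2]
weights = [0.8, -0.35, 0.12]   # continuous "synaptic strengths"

a = neuron(inputs, weights)
# Nudge one weight slightly: the output shifts slightly too --
# analog behavior, not a discrete flip.
b = neuron(inputs, [0.81, -0.35, 0.12])
print(a, b)
```

Note how processing and storage blur together here: the “memory” of this little unit just is its weights, which is one way to read the claim that the brain mixes its processing with its storage.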
Most importantly for this discussion, a modern computer is architected to load different software into its memory and clear it out on demand. Brains are not arranged this way. A brain’s programming can be modified to some degree by what input it receives, but it has no mechanism to record a new wholesale personality or to allow one to be copied out. Brains evolved for certain purposes, and loading or exporting a mind is not one of them.
There is no data port to copy information in or out. Any process to copy a mind is probably going to be destructive, at least with foreseeable technology. One prospective process is serial sectioning and imaging of the brain to record the states of all the synapses, neurons, and glial cells for use in a brain simulator or duplicated brain. Of course, the destructive nature of this pretty much guarantees that no one will want to do it until their natural death is imminent, at least not at first.
A key question is, how much detail is needed to copy a mind with fidelity? Is recording the connectome, the synapses and their strengths, enough to make an accurate copy of the mind? If so, then processes such as serial sectioning might eventually be able to record a mind. But if recording a mind accurately means going down to the arrangement of atoms, or even worse, to the subatomic level, then the idea of capturing and modeling a mind starts to look a bit hopeless.
That said, the neuroscience I’ve read talks in terms of neural circuitry, in terms of neurons and synapses. Usually the only people I see mention atomic arrangements are those looking for a reason to be skeptical of the idea of copying a mind. Of course, even if we don’t have to go all the way down to the atomic level to get an accurate copy, we would still have to have a far better understanding of how the brain works than we currently do to make a functional copy.
But another closely related question is, how much fidelity would we demand before we’d consider a copied mind to be the same person? It might be possible to make a copy that is functional but not a flawless duplicate of the original. Indeed, unless we make an identical brain, there would almost certainly be differences. How tolerant would we be of those differences if the new mind, for the most part, had the memories and qualities of the original?
In addition to accuracy issues, who we are is defined by how our brain functions, including its flaws. It’s easy to imagine that an uploaded mind could actually function better in some respects, but doing so would change the person. Imagine removing a person’s “flaws” such as excitability or innate melancholy. There are bound to be friends who would miss the exact mix that made up the original person.
Pondering all this, it’s easy to see why many people conclude that the mind is the brain, and that any talk of copying the mind is misguided. But unless the definition of the mind does go down to the atomic level, I think copying it is a matter of neuroscience and technology. Nature doesn’t make it easy for us, but I can’t see that it has made it impossible.
Some may regard the unavoidable differences between the original and copied minds as a crucial flaw. But this has to be balanced against the knowledge that our mind today is different from our mind of a year ago, or ten years ago. Indeed, someone who has been through a traumatic or moving experience may be noticeably different afterwards. That they might be somewhat different after being uploaded or copied might be a transition event we’d learn to live with. And if the alternative is non-existence, it’s easy to see many people opting for it.
Now, as I said the other day, I do think there is substantial room for skepticism that mind uploading will happen in twenty years along the lines often envisioned by the singularity prophets. (Although given the unpredictable nature of technological advance, I’d be pretty cautious about making any absolutist statements on this.)
Often, the thought is that AIs will come along and take care of this for us, but AIs themselves are not going to come about solely from increasing computing power. Their development will be as much a matter of software design as hardware. Both AIs and mind uploading will require that we understand far more about how a mind works than we currently do. And that may take decades or maybe even centuries.
This conclusion is often resisted by all sides in this debate. If mind uploading can’t happen in our lifetime, then it might be more comfortable to conclude that it’s permanently impossible. Understandably, no one is enthusiastic about being part of the last mortal generation. But if mind uploading does happen in our lifetime, it won’t be accomplished by people who have decided that it’s impossible.
Getting back to the original question, can the mind have an existence separate from the brain? I think it depends on how rigid or flexible we want to be in our definitions.