There are few things that everyone who ponders consciousness can agree on. It’s a topic where debates on the very definition of the subject are common. The only definitions that seem to command near universal assent are the ones oriented toward phenomenology, such as “subjective experience” or “something it is like.” And even then, the question of whether these are real or illusory is hotly debated.
Moving beyond phenomenology, many people still hold to substance dualism, the idea that the mind cannot be explained by mere physics, chemistry, or biology, and that something else is needed. We appear to have a strong innate intuition for this view. I think it comes from the fact that our mental model of a mind bears little relation to our model of the physical brain. It leads to the “hard problem of consciousness.”
But the hard problem appears to be, in the end, a psychological one: a difficulty in accepting what more than a century and a half of neuroscience is telling us, that there is no evidence for dualism.
Many people accept the above logic intellectually, but still retain latent dualistic intuitions. Well, I guess we all retain those intuitions to some extent, but not everyone remembers to discount them the way we discount our intuitions that the earth is stationary, that humans aren’t animals, or that space and time are absolute.
In summary, there is no evidence for a spiritual ghost in the machine, nor is there any for an electromagnetic ghost, a quantum ghost, or even a physical one in the sense of a particular location in the brain holding the soul or psyche. There is just the machine and what it does.
You could make the case that there is an overall informational ghost, but that would be true only to the extent that the “ghost” of Microsoft Windows is in the laptop I’m typing this post on.
This has implications for the concept of consciousness that I think many resist, even many stone cold materialists. We have subjective experience that is generated by the capabilities of our nervous system. Our own experience is the only one we ever get access to. We can only infer the existence of similar experiences in other systems. (In philosophy, this is known as the problem of other minds.)
Consciousness is a label we affix to a collection of capabilities that the information system we call our mind possesses (the exact composition of which is itself a matter of ongoing debate). When we ask if something else is conscious, I think what we’re really asking is whether it processes information similar to the way we do and has similar drives.
So, when Bob ponders whether Alice is conscious, he’s basically thinking about how much Bob-ness she has. When Alice ponders Bob’s consciousness, she’s thinking about how much Alice-ness he has. When humans ponder animal consciousness, we’re wondering how much human-ness they have. And when we ponder machine consciousness, we’re wondering how much life-ness they might have.
This, incidentally, is very natural for us as social creatures. Pondering how much another entity thinks like us likely goes back at least to the earliest social species. Perhaps earlier animals even had an incipient theory of mind for prey and predators. This mode of thinking, to widely varying degrees, may be very ancient.
But it’s always a matter of judgment because no two systems process information in exactly the same way. Even different members of the same species are going to vary. And the further from mentally complete humans we move, the less like us they process information, and the more in doubt their us-ness is.
This is just a special case of the fact that whether a particular system implements a particular function is always a matter of judgment. To say that it isn’t is to invoke teleology, the idea that natural systems have some inherent purpose. But teleology was abandoned in science centuries ago, because it could never be objectively demonstrated. Function is an interpretation.
From the similarities, we decide how much moral consideration a particular system should have. If we decide that it deserves such consideration, we tend to think of it as conscious. Consider all the cases where someone argues that a creature is conscious or sentient, that it’s like us, in order to make the case that it should be treated better. But if there is no objective morality, then it follows that there is no objective consciousness.
A commonly expressed objection to this is that it’s circular and subject to infinite regress. But the same can be said of any evolved trait. How could a trait, particularly a social one, get started if it first had to exist in a parent or a partner? The answer is generally that the trait evolved gradually. The same can be said for consciousness. There was never a first conscious creature, just increasing capabilities until a point was reached where we might be tempted to apply the label “conscious.” But the first animal “worthy” of that label would not have been dramatically different from its parents.
All of which is to say, I think asking whether a system is conscious, as though consciousness is a quality it either possesses or doesn’t, is meaningless. Such a question is really about whether it has a soul, an inherently dualistic notion. Our judgment on this will come down to how much like us it is, how human it is. When put that way, the answer seems somewhat obvious. Some species, such as chimpanzees, are obviously a lot more like us than others, such as fish or snails, but all are currently much closer to us than any technological system.
This raises the question of whether we would ever consider a machine intelligence to be conscious unless it had very human-like, or at least life-like, qualities. When Alan Turing proposed his famous test (now known as the Turing Test), he did so to move the debate on whether machines could think from philosophy to science. But he may have identified the only true measure of other minds we can ever employ. Some critics complained that Turing was really testing for how human-like a system was, but that may have been the very point.
It seems that whether any given system is “conscious” is something that lies in the eye of the beholder.
Unless of course I’m missing something?