Why embodiment does not make mind copying impossible

A while back, I highlighted a TEDx talk by Anil Seth in which he discussed the idea that cognition is largely a matter of prediction.  Apparently Seth more recently gave another talk at the full TED conference, one that is receiving rave reviews.  Unfortunately, that talk doesn’t appear to be online yet.

But one article reviewing the talk focuses on something Seth purportedly said in it: that uploading minds is impossible because the mind and body are tightly bound together.

Seth’s work has shown compelling evidence that consciousness doesn’t just consist of information about the world traveling via our senses as signals into our brains. Instead, he’s found that consciousness is a two-way street, in which the brain constantly uses those incoming signals to make guesses about what is actually out there. The end result of that interplay between reality and the brain, he says, is the conscious experience of perception.

“What we perceive is [the brain’s] best guess of what’s out there in the world,” he said, explaining that these guesses are constantly in flux.

…“We don’t passively see the world,” he said, “we actively generate it.” And because our bodies are complicit in the generation of our conscious experience, it’s impossible to upload consciousness to some external place without somehow taking the body with it.

Everything Seth describes is consistent with most of the neuroscience I’ve read.  To be clear, the brain is indeed tightly bound to the body.  Most of it is tightly focused on interpreting signals sent to it from throughout the peripheral nervous system, and much of the rest is focused on generating movement or hormonal changes.  The portion involved in what we like to think of as brainy stuff (mathematics, art, culture, etc.) is relatively small.

And the brain appears to have very strong expectations about the body it’s supposed to be in.  That expected body image may actually be genetic.  In his book The Tell-Tale Brain, neuroscientist V.S. Ramachandran describes a neurological condition called apotemnophilia, in which patients want part of their body removed because they don’t feel like it should be there (to the extent that 50% of people with this condition go on to actually have the body part amputated).  It’s as though their expected body image has become damaged in some way, missing a part of their actual physical body.

If apotemnophilia is a standard brain mechanism gone awry, then a normal human mind is going to have very strong expectations of what kind of body it will be in.  This makes science fiction scenarios of removing someone’s brain and installing it as the control center of a machine an unlikely prospect, at least without dealing with far more complex issues than successfully wiring the brain into a machine.

But ultimately, I don’t think this makes copying a mind impossible, although it does put constraints on the type of environment the copied mind might function well in.  If the mind has strong expectations about its body, then a copied mind will need to have a body.  A mind uploaded into a virtual environment would need a virtual body, and a mind installed in a robotic body would need that body to be similar to its original body.  (At least initially.  For this discussion, I’m ignoring the possibility of later altering the mind to be compatible with alternative bodies.)

But doesn’t the tight integration require that we take the entire body, as Seth implies?  We could insist that copying a mind requires copying the person’s entire nervous system.  That would raise the difficulty considerably, since the entire body, rather than just the brain, would have to be copied.

Alternatively, a new nervous system could be provided, one that sends signals similar to the original’s.  This requires that we have an extremely good understanding of the pathways and signaling going to and from the brain.  But if we’ve developed enough knowledge and technology to plausibly copy the contents of a human brain, understanding those pathways seems achievable.

The question is, what exactly is needed to copy a person’s mind?  If we omit the peripheral nervous system, are we perhaps leaving out something crucial?  What about the spinal cord?  When pondering this question, it’s worth noting that patients who’ve suffered a complete severing of their upper spinal cord remain, mentally, the same person they were before.

Such patients still have a functioning vagus nerve, the connection between the brain and the internal organs.  But in the past, patients with severe peptic ulcer conditions sometimes underwent vagotomies, in which the vagus nerve to the stomach was partially or completely severed, without compromising their mental abilities.

Certainly the severing of these various nerve connections might have an effect on a person’s cognition, but none of them seem to make that cognition impossible.  Every body part except the brain has been lost by somebody who continued to mentally be the same person.  The human mind appears to be far more resilient than some scientists give it credit for.

Indeed, the fact that a person can remain partially functional despite damage to various regions of the brain demonstrates that this resilience doesn’t stop at the spinal cord.  Which raises an interesting question: does the entire brain have to be copied to copy a human mind?

The short answer appears to be no.  The lower brain stem seems to operate well below the level of consciousness and is very tightly involved in running the autonomic functions of the body.  In a new body, it could probably be replaced.

The same could be said for the cerebellum, the compact region at the lower back of the brain involved in fine motor coordination.  Replace the body, and there’s no reason this particular region would need to be preserved.  In fact, patients who have suffered catastrophic damage to their cerebellum are clumsy, but appear to remain mentally complete.

That leaves the midbrain region and everything above, including the entire cerebrum.  Strangely enough, of the 86 billion neurons in the brain, these regions appear to contain fewer than 25 billion of them.  (Most of the brain’s neurons are actually in the cerebellum.  Apparently fine motor coordination takes a lot of processing capacity.)  It’s even conceivable that the lower levels of the cerebral sensory processing regions could be replaced to match the new sensory hardware in a new body without destroying human cognition.

Obviously all of this is very speculative, but then people are often content to entertain concepts like faster-than-light spaceships, which would require new physics, as merely a matter of can-do spirit.  All indications are that mind copying wouldn’t require new physics, only the ability to keep studying the physics of the brain.

Unlike the singularity enthusiasts, I doubt this capability will happen in the next twenty years.  It seems more likely to be something much farther in the future, although it’s unlikely to be developed by those who’ve already concluded it’s impossible.

But is there an aspect of this I’m missing?  Something (aside from incredulity) that does in fact make mind copying impossible?

30 thoughts on “Why embodiment does not make mind copying impossible”

    1. It’s actually kind of interesting (disturbing?) how limited that portion might be. Of course, an athlete might insist on having their cerebellum copied, since they spent so much time fine-tuning it, although that would only give them what they wanted if they were put into a near-perfect clone of their original body.

  1. “But is there an aspect of this I’m missing?”

    Well, you missed … no wait, you got that. But then there was … oh right, you got that too. Nice job!

    Oh wait! There was this: “Unlike the singularity enthusiasts, I doubt this capability will happen in the next twenty years.”

    As you may know, I consider myself a full-fledged singularitarian [Super-human intelligence by 2045]. But I fully agree that mind copying will come long after that, if ever. It will certainly be possible, but the question is whether it would be useful. I kinda think it will only be done by a certain kind of ego, and that kind of ego is the kind that today has a full-body portrait of themselves hung in their house, or somewhere.

    I would also like to point out a difference that I have not seen anyone address, specifically the difference between mind copying and uploading. Uploading implies simulating the mind on a general-purpose computer. In contrast, mind copying could happen by duplicating the neural processing on special-purpose hardware. So, for example, theoretically you could replace each neuron in your brain with a silicon version. But then, you could simply create the silicon version in toto. The significant difference is that the uploaded version of the mind will ALWAYS be much slower than the special-purpose hardware version, and so the latter will always be preferable. The uploaded copy would only be useful as a stored copy that could be instantiated in special-purpose hardware when necessary.

    1. Thanks James! It’ll be interesting to see if we make the 2045 date, although that’s getting close to the outer range of my life expectancy (I’ll be 79), so I may never find out.

      Maybe I’m egotistical, but I’d take mind copying if I were approaching the end of my natural life and the technology were available. What would I have to lose? But I’m not expecting to have that opportunity.

      Good point about the distinction between uploading and copying. I agree.

      I actually see uploading as a special case of copying, which is why I’ve started using “copy” more often than “upload”. I agree that copying to specific specialized hardware is more likely, although technically someone could get “uploaded” into a specific artificial brain in a data center that is networked into hardware generating a virtual environment.

  2. Not that it’s particularly relevant to your bottom line, but … the brain’s activities matter to perception in another very important way, not mentioned above. Specifically, it moves your eyes and ears and hands around, to better explore your environment. When psychologists held rats’ heads in one position and flashed images before them, they learned some things about rat vision – which have little application in the real world. A rat in a vise is a much poorer seer than a rat free to move. The same goes for us, obviously.

    1. I read something a while back that said that if our eyes were held rigidly immobile, we wouldn’t be able to see much of anything. Our stable field of vision is a construction by visual circuitry in the brain that receives the signals coming in from the constant and rapid eye movements, most of which we’re not even aware of.

      Obviously if that circuitry were copied with the rest of the brain, the new environment or machinery would have to recreate a similar signal flow to it. It’s conceivable the technology doing the copying might simply replace that lower level circuitry with something compatible with the new sensory hardware, although given that there are only about a million fibers in the optic nerve, as opposed to who knows how many billions of connections between brain regions, recreating the communications along the optic nerve bundle might be the easiest approach.

      That’s assuming that the retina isn’t included in the copy. Technically, the retina is part of the central nervous system (unlike the other senses), so it might make sense to include it in the copying. If so, then the technology “only” needs to stimulate the copied rods and cones to recreate visual input.

  3. “if I were approaching the end of my natural life”

    Clearly you are not a singularitarian. By 2045, there will be no such thing as “the end of my natural life”. Aging is just an engineering problem.

  4. To me you keep switching over to ‘copy’ rather than ‘upload’ and that’s what you’re missing. What Seth has referred to is uploading. I think he’s saying you can’t just kind of lift the mind out of the brain and put it somewhere else – that there is no brain/mind difference as we commonly describe it. You are the mechanism your brain is – you and I don’t have a mind to upload to begin with.

    Also there seems to be an omission in your copying talk – if it’s copying, then the original you is still there, you’ve just been duplicated. There’s no transferal type thing happening there, you’ve only been xeroxed.

    1. I’m not sure if I quite grasp your first point. Are you saying Seth would say copying was okay but uploading isn’t, or that the mind is an illusion?

      On your second point, absolutely, although any foreseeable scanning process is likely to be destructive. But once that initial copy happened, copies could be made at will. Someone could conceivably make ten copies of me off the original, if they were so inclined.

      A society with that technology would have to come up with rules and norms to deal with that fact. Who is married to the wife of the original, or owes alimony to the ex? Who gets the checking account? Who is guilty of a crime committed by one of the copies? If only the copy who actually did the crime, how do we handle criminals who make copies of themselves explicitly to avoid culpability?

      1. I’m not sure what Seth would say about copying, as he is just addressing ‘uploading’ from what I can see.

        “On your second point, absolutely, although any foreseeable scanning process is likely to be destructive.”

        What do you mean ‘although’? It’s somehow different because it kills you?

        Also I’m not sure how you can avoid culpability…if you make a copy of yourself, why would it want to go to jail any more than you do? And when a regular twin commits a crime, I don’t think they have trouble figuring out which one goes to jail (not philosophically, anyway).

        Anyway, if you want some controversy, I’d say copying isn’t possible, as copying even with a photocopier has never been possible. ‘Copying’ is just a heuristic of the human mind where something is forced into a shape roughly resembling another shape, enough that the low-resolution perception of the humans involved can’t tell the difference. Just as you can’t create matter, you can’t create a copy. You wouldn’t have copies of yourself, you’d have knock-offs.

        1. The “although” was in response to “then the original you is still there”. If the initial copy is destructive, then the original actually isn’t still there. Of course, it would still be there for any subsequent copies, or if some way can be found to do the original copy non-destructively.

          On culpability, it’s not that hard to imagine a murderer psyching themselves to make a copy and, as that copy, commit the murder then kill themselves, theoretically absolving the original person of any guilt. The problem, from the criminal’s point of view, is that if we have the technology to copy a mind, we probably have the technology to examine its contents and determine the original’s motives. I can imagine prosecutors obtaining court orders to examine the mind of suspects.

          Copying an organic mind will never be perfect, but perfection is always a false standard. The question is whether it will be effective, whether it will be similar enough to convince friends and relatives, not to mention the new mind itself, that the copy is effectively the same person as the original.

          That said, recorded music never perfectly captures a live performance, but subsequent copies of a digital recording are effectively perfect. It seems like that would be true of minds as well.

          1. “On culpability, it’s not that hard to imagine a murderer psyching themselves to make a copy and, as that copy, commit the murder then kill themselves”

            It does seem rather hard to imagine to me?

            Anyway, it’s kind of like hiring a hitman, isn’t it? Except you’re growing a hitman.

            “I can imagine prosecutors obtaining court orders to examine the mind of suspects.”

            Sounds dystopian AF, as the kids would say! And who examines the minds/watches the minds of the examiners/watchmen?

            “Copying an organic mind will never be perfect, but perfection is always a false standard. The question is whether it will be effective, whether it will be similar enough to convince friends and relatives, not to mention the new mind itself, that the copy is effectively the same person as the original.”

            It doesn’t matter if it’s true, just whether it’s effective?

    2. “It doesn’t matter if it’s true, just whether it’s effective?”

      How would you define “true” in this context? If another system has my memories, dispositions, emotions, what determines whether it’s “truly” me?

      1. Well, you seem to be saying it is ‘true’ that it has your memories, it is ‘true’ that it has your dispositions, it is ‘true’ that it has your emotions? It kind of feels like you’ve used ‘true’ yourself several times. When you say to me it has your memories, are you saying that to convey something you find to be true? If so, doesn’t your use of ‘true’ sort of need to be unpacked to begin with?

        To me, genuine immortality rides on more than semantic ambiguities.

        1. Actually, I should have asked, “If another system appears to have my memories, dispositions, and emotions, what determines whether it’s ‘truly’ me?” Is there any way we could judge whether they were my true memories, dispositions, or emotions other than observing the new system’s behavior? And if that new system wasn’t a perfect copy, but my friends and family still psychologically felt like it was me, wouldn’t that make it an effective copy?

          Of course, someone could say the new system is a philosophical zombie. But then you and I could say that about each other. We can never know beyond a doubt that other people are conscious. We shouldn’t hold a copied mind to a higher standard than the original could have ever met.

          1. Yeah, but if you buy a pet that looks just like the one that you accidentally stepped on and the kids don’t notice, does that carry some kind of significance? That you know there’s a difference, that you think there’s a difference, but they don’t feel a difference, how is that accomplishing something? Intellect tricking feelings doesn’t feel like immortality to me. Just seems like a variation of sitting on your hand until it’s numb.

  5. Hey, I wanted to let you know I’ve nominated you for the Mystery Blogger Award. I realize not everyone’s into these kinds of things, so please don’t feel like I’m pressuring you to do this if you’re not interested. If you do choose to accept the award, the rules are up on my blog.

    The important thing is you’ve taught me a lot and inspired me to keep learning, so keep up the good work!

    https://planetpailly.com/2017/05/08/mystery-blogger-award/

    1. Thanks James! I generally don’t participate in these award chains, not out of any philosophical objection to them, but just because it isn’t really my cup of tea. But I’m extremely grateful for the nomination and the link.

  6. An interesting start to answering such questions could be the classic sci-fi/horror prop of the “brain in a jar.” Could a human brain in isolation be fed information through the brain stem, leading it to believe it was still in a body? While I do not think we could do this now, I suspect that it is possible. The best definition of “mind” I have found states that a mind is a set of mental attributes or functions, and those functions seem to be functions of the brain. If so, could those functions be mapped into a virtual space, including provision for the inputs learned about in the “brain in a jar” experiments? I am close to saying this sounds plausible.

    The scary thing, potentially at least, is that if this could be done, then we have to think about what duplicated minds mean in terms of identity (I have myriad copies of the same computer file in my numerous computers and backup devices – which is the “real” file?) and also about what a completely synthetic mind could mean.

    1. I think your final points get at why so many people are made uneasy by this subject. It would dramatically complicate our understanding of self. If there are multiple copies of the same mind, who gets to control the checking account? Who is married to the original’s spouse? If one of the copies commits a crime, are all copies guilty or just that one? If only that one, what stops a criminal from forking themselves just before committing a crime? If I erase one of the copies, have I committed murder, or just battery?

  7. Lots of interesting questions to ponder. Great article, yet again!

    I hope my question hasn’t already been covered, but here goes:

    “Every body part except the brain has been lost by somebody who continued to mentally be the same person. The human mind appears to be far more resilient than some scientists give it credit for.”

    What if that person’s mind was preserved relatively intact because they already had the experience of living with those body parts? It seems that in these examples the brain could have already learned a significant amount from its embodiment, and maybe after having had that fullest experience, it doesn’t lose the memories afforded by the now-missing body part? Perhaps the experience of having had that body part projects into the future even after it’s gone? In other words, maybe a better case could be made if you find examples of people who never had these nerve connections, rather than someone who later had them damaged.

    Another question: If the body doesn’t turn out to be crucial for copying, in what sense is it not crucial? In other words, what exactly is being copied then? Identity as we commonly conceive of it does seem to be closely tied to the body. For instance, gender, race, social status, etc., all seem to be important in our self-narratives, and these are tied to our specific bodies and environments. If I had the opportunity to copy my mind into another body, biological or otherwise, would I remain me? Would my personality as I conceive of it remain or continue intact? I find this hard to believe, and I can’t imagine having a strong enough incentive to do this. We’re talking about immortality, after all, and this idea of copying into an entirely different body brings to my mind the Frankenstein problem…although maybe the changes would be delayed as my brain would presumably retain the same basic shape it had before, and maybe I would slowly morph into someone else. On the other hand, a virtual version of me seems more appealing, as that doesn’t seem as open to the sorts of changes I imagine would happen if my personality depended on my specific body.

    BTW, I might have mentioned to you the show Black Mirror on Netflix (a series of Sci-Fi short stories). Well I recently watched one that I thought you’d be interested in called “San Junipero”. I won’t tell you about it in case you plan on watching it. You might find things to object to in the technology of it, but I loved the storytelling and the question it brought up in the end. I’d be curious to hear your take on it.

    1. Excellent questions!

      For body parts, I think a huge part of becoming who we are and our worldview is indeed tied up in our body. It does matter when we lose the body part. I think someone born blind has a different perception of the world than someone who lost their sight later in life. The person who lost it later is probably trying to figure out what they would be seeing based on their other senses. But the person born blind never had sight, so it seems like their model of the environment would be totally in terms of the senses they grew up with. (This seems borne out by the fact that a child born blind who gets their sight fixed too late, after a certain age (after the “critical period”), will never be able to see properly.)

      But what happens if we start a human mind off completely without access to a human body? I suspect the resulting mind, assuming it were functional, would be pretty alien. Greg Egan addresses this in his fiction, and I give him kudos for tackling it, but I strongly suspect he makes the resulting entities too human-like. Someone might argue that the mind would still have the genetically prescribed structure of a regular human’s, but genes work very interactively with the environment (initially the womb, later the world overall), not as a blueprint, and the environment for a mind that starts in virtual would be radically different.

      Definitely I suspect most people would want to be in a body similar to their original (albeit idealized). Most, but not all. (Transgendered people, for instance, wouldn’t. Ramachandran speculates that their condition of feeling like they’re in the wrong body might have similar neurology to the body image issues of apotemnophilia sufferers.)

      Although it might be interesting to “try out” other body types, just to see what it’s like to experience them. But apotemnophilia seems to show that most people’s brains have a strong body image, and I suspect long term mental health would probably mean satisfying that image, perhaps in virtual, or figuring out some way to do psycho-surgery to modify the image.

      I haven’t seen too many Black Mirror episodes. Most of them are too dark for my tastes. But I’ll look that one up and watch it. (I did see an earlier episode where an uploaded copy of a woman is forced to be the center of a home automation system to meet the needs of her original self: dark, but interesting.) The latest Doctor Who episode touches on a similar theme.

      1. Interesting, I can’t imagine what it would be like to have a mind without a body (ever) or writing about that in fiction. It would be very alien, I would think. What would this mind think about? If it couldn’t use any senses, what information would come in except, maybe, the passage of time in darkness? (Maybe without any awareness of the concept of time?)

        On Black Mirror, yeah, I found some of those too dark…and I can usually handle dark. This episode is the opposite. It’s actually an unusual romance with a happy ending. And lots of 80s music. 🙂

        1. The mind would have to have some form of sensory input. I’m not sure it could develop in any meaningful way if it never had any. Egan described a new mind being born and bootstrapping itself into self-awareness, going on to become the main protagonist of his novel Diaspora. Incandescence has a character who grew up in virtual and is thus completely comfortable with all kinds of alien body plans. I suspect Egan wasn’t familiar with apotemnophilia, at least not when he wrote those stories.

          Just watched San Junipero. Sweet story. The 80s setting and music, along with one spot of late 70s music, were nice. (I came of age in the 70s and 80s.) I’d watch more of the Black Mirror stories if they were like that.

          1. I was thinking the same thing, actually! Without some sort of sensory input, well, I can’t imagine what sort of experience that would be, or if we could call it experience at all.

            I’m glad you liked San Junipero. I thought you might find it interesting. I liked that it brought up the question of whether or not we would want to continue living virtually if that meant we wouldn’t (or might not) get to experience a natural death. Of course, those who don’t believe in an afterlife shouldn’t have a problem in making such a decision, but the show’s premise makes the choice practically relevant, and I imagine skeptics would think long and hard before making that gamble.

            I also liked the 80’s flashback. And I agree, I’d be more interested in the series if they were like that.

          2. It actually brought up something that I think even skeptics might have to wrestle with. When we lose loved ones, even if we don’t buy into the afterlife, we know we only have to miss them for the rest of our life. But if the rest of our life is effectively forever, then living with the pain of that separation might drive some people to forego immortality. It would be bad enough knowing we’d never see our parents again, but knowing you’d never see your tragically dead child again? Potentially until the heat death of the universe? I don’t think you’d have to be religious to maybe wonder whether it was worth it.

          3. Yes, definitely! I was thinking this felt like a new take on Nosferatu, in that there was a potential for experiencing a sense of horror in immortality. Also, in the show there’s a hint of that possibility in the night club scene—the one that looked like a Dantesque circle of hell, filled with people who looked very much out of place for that time—to which desperate souls would go to feel anything at all. I got the sense from the show that the happy ending wasn’t quite the real ending, but it was subtle. I appreciated that subtlety.

          4. I think most agree that they’d only take immortality with the option to later end it at the time of their choosing. Although once there’s a recording of your mind, how sure can you ever be that someone won’t wake it up again at some point in the future for their own purposes?

            A lot of the sci-fi I’ve read that posits a society of immortals shows most of the people in that society eventually self-terminating (albeit after a long life by our standards). Neal Asher, in his Polity books, has most people encounter an “ennui barrier” around 200 years of age, after which they become increasingly self-destructive in desperate attempts to have novel experiences. Greg Egan, who is always a bit more utopian in these matters, shows a couple of 10,000-year-old characters going on one last adventure before they satisfactorily self-terminate.
