Why You Should Upload Yourself to a Supercomputer

We’re still decades — if not centuries — away from being able to transfer a mind to a supercomputer. It’s a fantastic future prospect that makes some people incredibly squeamish. But there are considerable benefits to living a digital life. Here’s why you should seriously consider uploading.

via Why You Should Upload Yourself to a Supercomputer.

An interesting article at io9 relaying the orthodox singularitarian / transhumanist vision of the benefits of uploading our minds into a computer. As I posted the other day, this orthodox vision makes some assumptions that I think shouldn't be accepted as given.

Among them is the assumption that we'd have all the processing power and capacity we'd want, so that we could clone as many copies of ourselves as convenient and explore as many alternate universes as we'd like, all at processing speeds that would let virtual months or years pass in an objective day.

The article also mentions the fact that we could modify ourselves in any way we'd want, reducing or enhancing emotions, removing some, adding new ones, etc. It only tangentially mentions how radically different this might make our conceptions of self.

The top reader comments raise the standard objection to this view: the uploaded version wouldn't be you, but a copy of you. It has to be remembered that all of the atoms in our brain are constantly being replaced by metabolic maintenance processes. The you of today is not, physically, the you of a few years ago. The you of today is a copy (an altered one) of the you from years ago. What persists is the pattern, the information that makes up you.

Of course, the you of a few years ago is connected to the you of today by a perceived continuity of existence that wouldn’t exist between the organic you and the digital version.  Would that be enough to stop you from uploading yourself?

What if the two versions of you could exchange memories, perhaps via some sort of augmentation to the organic version? If the organic version could remember the experiences of the digital version, would that make the digital version seem more like us?

And, of course, if the organic you was near death, wouldn’t the prospect of a digital you surviving be of some comfort?

25 thoughts on “Why You Should Upload Yourself to a Supercomputer”

  1. It’s a good thought experiment. Is “you” the atoms in your brain & body? Is “you” the pattern? What makes the pattern in your brain (copied forward from one moment to the next) different to the pattern in the supercomputer (copied forward in a discontinuous jump)? Are they really different, or is that an illusion? Do we cease to exist constantly, only to be reborn anew the next second? If we sleep or lose consciousness do we die?

    1. Exactly. We don’t fear falling asleep because we remember past instances of falling asleep followed by awakening. (Imagine how terrifying the idea of losing consciousness might be if you had never experienced it before.)

  2. I think you touch on some great points here. And I believe you are right to push people on their assumptions about self. I am one of those people who just ‘feels’ like the supercomputer version of me wouldn’t be quite “me.” But I wrestle with this. Why not?

    I haven't been able to answer this yet, but the 'feeling' persists. As you've said, it's linked to the feeling of continuity we experience from moment to moment, day to day, year to year. Recently I've been trying to break this down for myself and am pushing towards the (tentative) conclusion that I really am NOT the same person that I was when I was born, or when I turned 18, or really yesterday. Those 'other people' are like ripples in a lake, getting further and further from the point of agitation in the water (the 'right now me' if you will). I'm still connected to those 'other selves' through the water (which I suppose can be either 'memory' or 'experience' or 'existence'), but they are fading away over time.

    I know that you are interested in AI from your posts and from comments that you’ve left on my blog, so I’d like to recommend (without any agenda) seeing the movie “Her” that was just released. I think it sets the stage for a lot of people to question their own understandings of self and also how AI relates to that.

    Cheers!
    JB

    1. A friend of mine convinced me that we must in effect die every time we go to sleep, and the person who wakes in the morning is a kind of doppelganger. Believing this can make you terrified of going to sleep, but the inevitability of sleep eventually pushes you into acceptance of your fate. I’m no longer certain what I believe. Too much thinking makes my head hurt!

      1. I like your friend's thought experiment. When adopted, it breaks up some of the indefensible assumptions we have about our "self." It's also a great thought exercise for accepting "fate," or perhaps more accurately, for accepting things that are outside of your personal control. We feel like we are at the helm of life and we are taught that by being careful we can avoid or prevent bad things from happening (to us or to others). Sometimes there is nothing we can do to prevent or stop such events, and this "letting go" can free you up to better deal with whatever just occurred. Plus, if you can get to the "letting go" stage you may find that some anxieties are relieved and you gain a "peace of mind," if you will.

  3. Do you think that being uploaded to a computer is different to losing consciousness? Suppose someone uploaded you to a computer while you were sleeping – both versions of “you” experienced a discontinuity of some kind. What’s your opinion on these situations?

    1. Interesting question. It seems like a lot would depend on the nature of the scanning process. If it's non-destructive and I'm awake during it, then my digital self might perceive only an instant of time between its organic stream of consciousness and its digital one.

      More likely though, at least at first, is that it would be destructive, meaning that our digital self might recall losing consciousness as our organic self, and waking up in the digital environment (or as an android or something along those lines). Of course, our organic self wouldn’t be perceiving anything since it wouldn’t exist anymore.

      Without the memory synching I mentioned in the post, I agree with your comment the other day; I don’t think I’d want to be the organic version after the digital version was up and running. It would seem like a fairly bleak place to be.

  4. I can think of a couple reasons why the possibility of being uploaded after my demise would give me some comfort. First, if I was engaged in some kind of project (writing a really good blog, for example) that I thought seriously contributed to a better world, it would be comforting to know that my work would continue. Second, if people I care about would derive some comfort from a digital version of me still being around. At bottom, however, these two reasons come down to pretty much the same thing. Would other people, whether I know them now or not, benefit from there being a digital version of me? In this sense, having a digital version of myself would be like leaving a will to be executed after my death.

    Of course, there are lots of weird thought experiments that are difficult to categorize as some future thing being “me” or not. I don’t think there are definite answers to such questions, since our criteria for a person continuing to exist don’t cover those odd cases. What if my mind was transferred to another human body, for example, and this body continued all of my projects, took care of the people I care about, had my same likes and dislikes, the same memories and intentions, looked like me, etc. etc.? As someone may have suggested in the comments above, what if that happens every night when we’re asleep now? Would it make any difference to us even if we eventually learned what was going on?

    If the case we're considering stays as "being uploaded to a supercomputer," it doesn't sound like a "me" that I would think of as still being me. But the more we make the computer like a human being (adding consciousness, for example, if that were ever possible) and give it the ability to eat and drink, age, sleep, etc., the more comfort I'd take in the thought that I was still around.

    1. I’d take more comfort in the idea that “I” was still around if it was an exact organic copy of me, but in some fundamental sense, it just wouldn’t seem like it was still me, because of a lack of continuity. I want my particular sub-atomic particles to keep going!

      1. What's interesting to contemplate is how much the final version of you on your deathbed (hopefully far in the future) will have in common with the current version, or the current version with the ten-year-old one.

        Then how much would an uploaded or copied version have in common with its original self after several centuries? If the digital self "forks", that is, creates separate clones, how much would those clones have in common with each other after a few centuries?

        Two clones, separated by centuries of divergent experiences, might not want anything to do with each other. Would we have to have some sort of divorce-like process for these situations?

        1. I think these personal identity questions are among the most difficult philosophical questions because personal identity is important to us, yet there are no answers. It doesn’t seem so important whether the ship of Theseus is the same ship after all the wood has been replaced, but with personal identity we’re talking about “me”! If we keep making technological progress, however, we (maybe not you and I, but future generations) will have to agree on some answers.

        2. To address your “two clones, separated by centuries of divergent experiences,” I think there are some clues being found in studying identical twins separated at birth. There are many studies on them now and the results are fascinating. I don’t have any specific examples to direct you towards at this moment, though.

          1. Thanks. I've read some materials on twins before. From what I understand, separated twins are remarkably similar to each other in appearance and mannerisms, more so than twins raised together, since twins raised together have an incentive to differentiate themselves.

            Excellent point!

  5. What strikes me in this discussion is how similar it is to traditional religious or spiritual discussions. In most traditions, there is a distinction between the soul (that which endures) and the body (the vehicle the soul resides in during its time on earth). Each religion or tradition has a path that is designed to enable the soul to continue after the death of the body. It seems that the discussion about uploading the mind is the secular version of these paths.

    1. Good point. I actually think a useful name for the information that makes up the mind is ‘soul’ since it incorporates the basic idea. Of course, it’s not quite the soul that many religious people believe in, since it’s not a mysterious irreducible core. It can be studied, and eventually, hopefully, copied, possibly enabling immortality in this world, instead of having to count on it in the next.

  6. It strikes me that, as with religious beliefs, it is unlikely that any new scientific fact will emerge to convince people whether uploading is a continuation of life or merely a copy. People will form beliefs about the matter based on their personal outlook.

    For me, a solution that seems to avoid the problem entirely is to upgrade instead of upload. By enhancing our biological brains, perhaps massively, we can achieve all the benefits of uploading without any risk of dying in the process.

    1. You may be right, but it occurs to me that no physical body is guaranteed to be safe from destruction. Maybe upgrade and augmentation, with backup in case it’s needed?

      I'd also be interested in your thoughts on how having access to your digital copy's memories might affect someone's beliefs about whether or not they're the same person. If, when you wake up, it takes you a quick second to ascertain whether you are in your organic body or your digital one, it seems like believing the digital copy is you becomes easier.

      1. What I have in mind is an augmentation that moves us well beyond the physical body. If you begin with a wireless connection to an offline computer, you can gradually enhance your cognitive powers by adding more processing power and without any loss of identity. Over time you would learn to be at home in a much larger processing environment. The physical body would become less important. You might even add multiple synthetic or biologically-based bodies to your overall self, so that you can be in more than one place at a time. Indefinite lifespan would be achieved by distributing the self. Ultimately the loss of the original biological body would not be a catastrophe, but perhaps more like the loss of a limb, and replaceable.

        I’m not sure that I understand your second question.

        1. Actually I think we’re mostly on the same page. My second question was envisioning an ability to synchronize memories between instances of one’s self, and how that might affect our attitude towards those other selves. But your gradual enhancement into the cloud might effectively be the same thing.

          I don’t know if you go in for science fiction, but you might like Ann Leckie’s ‘Ancillary Justice’. The story is partially a thought experiment in this area.

          1. I get it. Multiple copies of the same self that can sync with each other. That would make them feel like a team, I guess. That would be cool. I once started writing a novel incorporating that idea, but that project is parked.

  7. I can recall thinking this would be an incredible, ideal future after seeing the first “Tron” and then again years later after reading Greg Bear’s “Eon”. But I’m brought back to reality by remembering that the metaphor of our brains as computers or processing devices is a flawed one. It may be the current popular way to imagine it, but the actual human mind operates on completely different principles than a computer. At least, the current manifestation of them. It’s not at all clear that this will be possible centuries from now, or ever.
    Still, existing in a form that would be potentially immune to infirmity or disease, and would allow an exchange of information and experiences unlike anything we know now? It still has a certain appeal.

    1. Good point. The brain is certainly not a digital computer. It doesn't store information in discrete states. Synapses vary in strength smoothly, making it more of an analog processor. It doesn't have a mechanism to copy information in or out, or any real concept of a software / hardware division.

      Still, it’s hard to imagine that copying that information won’t eventually be possible, through post-mortem examination if in no other fashion.
