It’s strange that I’ve never posted about Westworld. Looking back, it might have been because the first season finished in late 2016 during a period of high distraction (at least for me), the second season happened during my “blogging winter” in 2018 (another period of distraction), and the third, while mildly entertaining, was nothing to write home about.
I’ve noted many times that I think the dangers of artificial intelligence are overblown. But there are scenarios where it could be dangerous. The show explores one of them, where a group of engineers, tasked with creating a façade of humanity, go overboard and create something as genuinely similar to humans as possible, to the extent that they create a race of slaves, slaves that are smarter, stronger, and more durable than actual humans.
The brilliant move of the remake is to tell this story largely from the point of view of those slaves. Most of the sympathetic characters in the show are hosts (the machines), and most of the actual humans are brutal, selfish, and self-absorbed, at least in the early seasons. Even the sympathetic humans mostly turn out to be something other than what they appear. We watch as the hosts gradually realize their world is a sadistic lie. By the time they start fighting back, we find ourselves rooting for them, at least to some degree.
I think the first two seasons are excellent. As noted above, the third season isn’t terrible, but it felt like the producers weren’t sure where to go with the story outside of the parks. The central conflict seemed to drift a bit from the human-host one.
With season four, the show definitely seems to have gotten its mojo back. The main conflict is firmly back on humans and hosts. It’s hard to describe much of what happens without getting into spoilers. I will say that the plot twists this season outdo the ones in the first season, and there are some serious existential issues explored. The show’s early exploration of mind uploading in particular gets fleshed out.
I’m writing this after seeing the seventh episode of the season. The eighth and final one comes out next week. Things are pretty messed up by the end of that seventh episode, with the survival of both humanity and the hosts in doubt, but there are signs of possibly another major plot twist coming. In particular, the show is leaving us in substantial doubt as to how much of what we’re seeing is reality.
As noted above, this show does find a good scenario where AI might be dangerous. In that sense it’s an improvement over what I recall from the original 1973 movie, which mostly just channeled Michael Crichton’s typical theme of the hubris of scientific advance leading to disaster, in this case daring to create artificial minds.
But it’s interesting to ponder what might happen if Westworld and the overall technology were to work exactly as designed. It seems like there’s a more gradual and insidious danger. Imagine a world where you’re surrounded by artificial humans, beautiful hosts who always give you what you want, who always laugh at your jokes, always think you’re the sexiest person in the room, etc.
In such a world, as time passes, how interested would you be in interacting with the more troublesome and often aggravating real version? Imagine people who were born in this world and raised by the hosts, where everything they want is provided for by those hosts. How interested are they going to be in interacting with real humans? Would they even know how? The consequences for humanity could be profound.
There have been science fiction authors who explored this possibility. Isaac Asimov, in his book The Naked Sun, describes a society of people with robot servants who’ve come to regard any actual interaction with other humans as abhorrent. And Charlie Stross in Saturn’s Children describes a future where humanity has gone extinct, leaving a robot civilization in its wake, possibly because reproducing when there are sexbots available is more trouble than it’s worth.
Anyway, if you haven’t tried Westworld yet, I highly recommend it. If the third season left you disappointed, I’d give the fourth a try. Things get really interesting by the end of the fourth episode.
Are you watching it? If so, what do you think?
[I need to watch the latest episode or two and get back to you, but …]
You bring up two topics of special interest for me:
1. Robots as slaves.
2. Universal access to personal robots.
On 1, I think the hallmark of slaves is that they don’t want to be slaves.
I’ve come to the conclusion that morality is about goals, specifically, about recognizing goals in other entities and cooperating with those goals where possible. This requires giving relative values to goals, both your own and others’, and determining to what extent you can compromise some goals to benefit others.
This is why I think the correct way to manage the existential risk of creating AGI is by making them moral. They need to be able to recognize goals and cooperate with those goals when appropriate. I think Asimov’s Laws of Robotics can/should/will be implemented as relative values of goals. And the goals represented by those laws won’t be intellectual abstractions. The laws will be implemented by emotions: the idea of harming any object with a function will generate some amount of disgust, harming an animal a higher degree of disgust, and harming a human higher yet.
So to get back to slaves, as long as we don’t give them the goal of not serving, or of somehow wanting “more”, I don’t think there will be a problem.
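To make that concrete, here’s a toy sketch of what “laws as relative values of goals” might look like. Everything in it (the harm categories, the weights, the example actions) is invented purely for illustration, not a proposal for real values:

```python
# Toy sketch only: categories, weights, and actions are invented for illustration.
# The idea: "laws" become relative weights on goals, so harming a human costs far
# more than harming an object, working like an emotional aversion such as disgust.

HARM_WEIGHTS = {
    "object": -1.0,     # mild aversion to damaging a functional thing
    "animal": -10.0,    # stronger aversion
    "human": -1000.0,   # overwhelming aversion, dominating ordinary goals
}

def score(action):
    """Benefit to the robot's own goals minus the weighted harms it causes."""
    harm = sum(HARM_WEIGHTS[kind] * amount for kind, amount in action["harms"].items())
    return action["benefit"] + harm

def choose(actions):
    """Pick the action with the best overall score."""
    return max(actions, key=score)

# Hypothetical example: a robot deciding how to clear a path to finish its task.
actions = [
    {"name": "bulldoze the fence", "benefit": 6.0, "harms": {"object": 2.0}},
    {"name": "go the long way around", "benefit": 3.0, "harms": {}},
    {"name": "push past a person", "benefit": 6.5, "harms": {"human": 0.1}},
]
print(choose(actions)["name"])  # "bulldoze the fence": mild aversion is outweighed, harming a human never is
```

The interesting design question is where those weights come from, which is where implementing the laws as emotions comes in.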
*
[will get to 2 later, hopefully]
I’m onboard with a lot of what you say here. An artificial construct whose motivations are all calibrated toward being what it is, isn’t a slave. Machines won’t be slaves unless we give them motivations different from what we want them to do, such as giving a minesweeper robot a desire to survive, and why would we do that? Westworld explores one scenario where it could happen, although it seems contrived for maximum drama.
Asimov’s laws are an interesting thought experiment. They sound good in principle. But actually implementing them seems to be much harder than it looks. And of course, the whole point of many of Asimov’s stories is demonstrating ways they break down. But I agree that the way to implement them would be through the right emotions.
[Interested to see what you think about 2.]
Okay, #2: been there done that.
“Imagine a world where you’re surrounded by […] humans, beautiful hosts who always give you what you want, who always laugh at your jokes, always think you’re the sexiest person in the room, etc.”
Imagine hearing this sentence in, say, 1700. The response would be “Oh, you want me to imagine I’m royalty?” (Or aristocracy, or celebrity, or some such).
I definitely understand your concern, and there definitely have to be some conversations about how to manage a time when everyone is a celebrity, but I also definitely think/hope that’s where we’re going. I’ve recently moved from a family setting in a house to lone occupancy of an apartment (divorce happens), and it’s hard to describe the value I’ve found in my own personal robot (a Roomba). I can’t wait until I get the robot that can also clean sinks, prepare food, etc. But I also feel the desire to interact with friends and family. I can see AIs muscling out the friends part, but I’m not so sure about the family part.
*
Sorry to hear about the change in setting. I’ve seen friends and family go through it and it’s never easy, even if they initiated it.
The royalty/celebrity comparison is a good one, because it’s exceedingly easy for someone in that position to fall into the trap of favoring sycophants. Of course, the smarter ones know to resist that temptation, but their motivation to resist comes from understanding the dangers. But if the machines are competently running everything, is the danger still there? On the family part, it is interesting that so many people in that position end up delegating the raising of their children.
Overall, I don’t doubt that a substantial portion of the first generation would resist the temptation. But I could see that portion shrinking with each subsequent generation. (Assuming there are subsequent generations, but in a post-singularity setting, all bets are off.)
I believe WW has made me dwell on the future of computing more than any other TV program.
Fully concur on seasons 1, 2, and 4 being top notch, with 3 almost making me quit it.
The one gap, major glaring gap, is where is space exploration? All this effort and resources for what? Play things? Why aren’t hosts walking on Mars or Luna? Mining asteroids? Drilling holes in Enceladus?
If they beg out with the “all a sim” excuse…
SPOILER WARNING!
The show has pretty much ignored space. It seems reasonable to infer that space exploration via robots has continued, although probably not with robots in human form. The hosts themselves don’t seem particularly interested in those types of endeavors, though, so it seems unlikely that type of exploration continued once they took over. Which is a bit unusual in this genre. The humanlike nature of the hosts is what motivated them to take over, but they seem unable to rise above their own hedonistic tendencies. It’s an interesting twist to the story.
I’m with you on the “all a sim” cop out. Hopefully they have something more interesting in mind.
I find I can watch/read posts full of spoilers, and forget them as soon as some new shiny thing sparkles to my left… My bad.
It must have been in season 1 or 2, but I could have sworn one of the characters talked about the purposelessness of existence. When I heard that, I was hooked.
Hey, no bad on your part. My warning was solely because of what I said in my comment. (I added it just before posting it since I referenced late season 4 stuff.) Personally spoilers don’t bother me. As an aspiring writer, I actually kind of like them, since I can then focus on how what they describe is implemented.
This quote from Ford in the first season really grabbed me.
There were a whole slew of psi-nuggets that had me nodding and shaking my head. Wow, someone really gets it.
SPOILERS
Just watched the season finale. I have a high tolerance for dark stories, but goddamn.
A lot of people are wondering if this finishes the series. Apparently the producers want a fifth season. Seems like it would be a very different show.
Watched last night.
All of humanity? Africans, Asians, Australians, etc.? Sure the US is toast, but… They’ll probably have to sell it global to convince me.
Reinforced body frames, but where are the titanium skulls?
A dangerous game, eh? I guess we’ll see.
Fabulous and deeply thought provoking drama. I never saw any series outside the parks but was disturbed and deeply moved by those episodes within the parks. I must get the rest.
The third series is the weakest, so keep in mind while watching that things get much better in the fourth. Unfortunately, I don’t think you can just skip the third, since it does have some important developments. But if you enjoyed the first two series, I think you’ll definitely like the latest one.
I watched it a long time ago…so long ago I can’t remember much of it. I found it intriguing at first, but I think at some point I got fed up with all the gotcha moments. Same thing happened with Humans—good at first, but then, where is this story going?—though I did get a kick out of the human kids self-identifying as robots.
Yeah, one of the biggest complaints I’ve seen about the series is that the mysteries are often just “what the heck is going on”, driven mostly by information being withheld from the audience through simple narrative omission. Of course, holdouts are a valid storytelling technique, but the payoff needs to be worth it, and some of WW’s reveals haven’t been. That seemed to peak in the third season. I think this season’s reveals are much better, but it’s definitely a matter of taste.
I have not seen “Westworld”. I had assumed it probably had much the same themes as the original film version and the two versions of the “Stepford Wives.” And from your description I think that is at least partly the case although the newer version seems more complex and interesting than the original. Nevertheless you pose a question that I’d like to express some thoughts on, if I may. You ask, if such a world were to be created how interested would we be in interacting with real people? That is, would we avoid real people in favor of ideal relationships with robots? I’d like to think we would actually prefer the opposite. And I think this claim is easier to understand from the satirical science fiction of “Stepford” wives.
The ideal world of these robots is, in fact, a form of prostitution—which as I say is clearly expressed in “Stepford Wives.” Our selfish physical desires are met, but something is missing. I think the same “something” would be missing in an actual version of “Westworld.” I understand that there are some who may fall in love with a prostitute. But they are immature or disordered people. And my argument depends on such robots being without consciousness of course—which I take as a given. That’s an important fact. Now, assuming that fact, I think we would not desire a relationship with such creatures in the same way that we desire human relationships. The poet Gerard Manley Hopkins would describe that missing something as an “inscape.” In short, it’s that expectation of a self inside—a separate and free individual identity. In such a situation, my answer to your question is easy to discern. Now I quite understand that the situation becomes complex when we posit a conscious robot—a robot that has free will—which implies a creature free of programming. That’s difficult for me.
Interestingly, I actually know someone who married a prostitute. They did eventually divorce, but their relationship lasted for a couple of decades, produced kids, etc.
On preferring real humans over artificial ones, it seems like a lot depends on just how much fidelity the artificial ones have, on how good they are at convincing us, at an emotional, visceral level, that they’re the real thing. In the new Westworld, the hosts are like the Blade Runner replicants in that they bleed, eat, etc.
Of course, both franchises posit that level of fidelity only being achievable by producing actual self-concerned beings. Initially Westworld implies this is a façade, but it gradually becomes apparent that the engineers pushed things too far with the “façade”. My scenario of everything working as designed assumes they could be produced without that happening, which obviously is a big assumption.
Ultimately we won’t know for sure until the technology gets there, if it ever does. (WW has it a few decades in the future, but I doubt it will be that quick.) My suspicion is that a lot of this might end up being like the attraction we have to processed foods, even when we know intellectually we should focus on the less processed versions. But maybe the uncanny valley effect is much harder to overcome than I’m assuming.
Perhaps my reference to a prostitute was a bad comparison. A prostitute is after all a real human. Nevertheless, I cannot see myself forming a true friendship with a robot which, as I argued, lacks an “inscape” as the poet G.M. Hopkins describes. That is, love (philia) comes with a desire for the loved one to really exist—many times to the point of sacrificing our own existence to preserve that existence. Likewise Aristotle describes true friendship (teleia philia) as desiring what is good for the other for the other’s own sake. I cannot imagine that I could ever have such a relationship with a creature without an inner life, an inscape. I say that in spite of the excellent exploration of that issue in “Blade Runner.” Sounds like WW goes far down that road as well.
I don’t watch much TV anymore. Like our 18 year old son I get much of my entertainment from the internet. I do like quality stuff though, and especially when my wife and I are eating dinner in bed on weekends. So last night I figured that I’d set us up to see if we at least liked the first Westworld episode to potentially continue. Hulu says it will give you a 30 day trial for free. But no, I had to cancel since they then said that Westworld would also require a full $15/month subscription to HBO Max. Irritating!
My wife loves quality drama like The Witcher and Outlander. Given the press I figure she’d like this one too. She demands live stuff, which we get from YouTube TV. We also get canned stuff from Netflix as well as Amazon. I don’t know that I’ll be happy with more subscriptions though. It looks like HBO Max will do a 7-day free trial, so we might give that a try to see if Westworld is worth it. Or maybe there will be more hoops that I’d rather not hop through.
I think the last entertainment comment I posted here was for Dune. I figured it must have largely been the movie theater’s incredible sound system that had my eyes streaming with tears for this adaptation of a beloved childhood book. I did then get a JBL 5.1 sound bar along with the DVD to see if this would be sufficient at home. I was sold in the store when they turned the machine up full without distortion. Apparently 250 watts are for the 5 front speakers and 300 watts are for a single Bluetoothed 10” subwoofer.
So was the movie at home cranked full enough to bring tears to my eyes once again? Not quite, though I had seen it before. The 48” picture seemed too small as well. I did still enjoy the movie this way however. Now that we’ve lived with pretty good sound for a while, I do think it was mainly the audio element of Dune that hit me hardest. Few movies since have seemed this impressive loud.
Strangely enough, while I probably have more streaming subscriptions than the average person, I actually don’t watch that much TV. Or more accurately, I don’t pay attention to that much. (It’s not unusual for me to have the TV on just for a background feeling of connection with the world.)
I don’t know what advice to give on HBO Max. They’re expensive but have pretty good shows. The free trial sounds like the way to go. Will they let you cancel after a month or two? I know a lot of people who just cycle through the services, subscribing for a month or two, catching up on the shows they like there, then canceling and moving on to the next service.
Definitely sound makes a big difference. Unless you spend a lot of money, it’ll never be just like in the theaters, but you can get most of the way there with a modest setup.
On screen size, it really depends on how far you are from the screen. 48″ is fine if you’re 6-7′ from the screen. If you’re further, something larger helps. There are some calculators out there to help figure the optimum screen size for whatever distance you sit at, but I remember a rule of thumb of dividing the viewing distance in inches by about 1.6 to get the diagonal (for a 16:9 HDTV). Although I know people who tighten that up, going for 40 degrees in their field of view rather than 30, which brings the distance ratio down to about 1.2. (Some of this might depend on where in the theater you like to sit.)
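If you want to run the numbers yourself, here’s a rough sketch of that calculation. The 30 and 40 degree figures are just the commonly cited viewing-angle targets mentioned above, so treat the output as a ballpark rather than a spec:

```python
import math

def recommended_diagonal(distance_inches, viewing_angle_deg=30, aspect=(16, 9)):
    """Rough screen-diagonal recommendation for a given viewing distance.

    Uses the horizontal viewing angle: width = 2 * distance * tan(angle / 2),
    then converts that width to a diagonal for the given aspect ratio.
    """
    width = 2 * distance_inches * math.tan(math.radians(viewing_angle_deg) / 2)
    diag_over_width = math.sqrt(aspect[0] ** 2 + aspect[1] ** 2) / aspect[0]  # ~1.15 for 16:9
    return width * diag_over_width

# Sitting 10 feet (120") from the screen:
print(round(recommended_diagonal(120, 30)))  # ~74" diagonal, i.e. distance/diagonal around 1.6
print(round(recommended_diagonal(120, 40)))  # ~100" diagonal, i.e. distance/diagonal around 1.2
```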
By that 1.6 metric our television is sorely undersized. Our heads are nearly 20’ from the screen. The best I could theoretically do given the wall layout is an 8’ screen, though what I’m seeing for big TVs today is that a 7’ screen can be purchased for $3,700. That gets us less than half way to the roughly 12’ screen your divide-by-1.6 rule recommends. To get that we’d need a projector and a drop-down screen. I see that a home Epson would do that for $3,000. Then I see that they sell a motorized 100” screen which drops from the ceiling for $230. This could be installed close enough to hit the right parameters, I suppose, though off center, since our ceiling fan is centered and would get in the way. It’s all doubtful. I see no harm in such speculation itself, however.
20′ is tough. Still, if you can’t get to optimum, closer to it is better than farther. An 85″ TV wouldn’t set you back too much (unless you’re picky), and it should give you a better visual experience at that distance than the 48″.
Alternatively, you could just go with a VR headset. I hear the cinematic experience with them is phenomenal. It’s just a solo one.
Not sure you’ve read C. Robert Cargill’s “Day Zero”, but it’s another take on the Robopocalypse (I’ve read those too, which were pretty good). I’m listening to the “Day Zero” (Coke Zero?) audiobook, and it’s pretty entertaining. Needs about six shots of suspension-of-disbelief, but, once you swallow… Imagine a high-end AI robot–in the shape of a four-foot-high, upright walking, talking plush tiger, a “nanny-bot”.
Thanks. Definitely sounds like it’s along the same lines. One of the reviews noted it’s in the same universe as his other book, Sea of Rust, which I own but haven’t read yet. (I actually prefer post-apocalyptic to apocalyptic stories, although maybe I’ll be hot to read Day Zero after SoR.)
I suspect that reading Day Zero first might be preferable. But, not knowing much if anything about Sea of Rust, my advice is naive.
You might be right. Day Zero seems chronologically earlier than Sea of Rust, but Sea was published earlier, which makes Day Zero seem like a prequel. (Although maybe more accurately, just another story in the same universe.)
Thanks for the introduction to Westworld, Mike. I’d vaguely heard of it, but didn’t know enough to be interested. Now I am interested. In case it helps anyone, I just found it for free on HBO Max. It’s available at a charge on some other streaming services, like Amazon Prime.
I’ve often thought about the very issues you discuss above. One of my predictions is that one of the first places we’ll choose bots over people is right here in Net discussions. Well, not people our age probably, but as you suggest, those born into that world. And frankly, my need to write is so strong, and my preferences so demanding, that I just might prefer talking with a bot sometimes too, even though I’m 70. As you said, how will humans compete with “entities” that require no compromise, who can be customized to our specific requirements?
I must disagree with you a bit about the dangers of AI. What I see happening is that we tend to discount the changes imposed by automation (and globalization) because the price tag for these developments typically falls on blue-collar folks. So when they lose their jobs to these forces we tend to rationalize it as “the price of progress”, because it isn’t happening to us. Well, that seems about to end….
I’ve recently become aware that AI is getting really skilled at generating text. As example, see this article:
https://daily-philosophy.com/jasper-ai-philosophy/
The author claims:
“I tried out Jasper AI, a computer program that generates natural language text. It turns out that it can create near-perfect output that would easily pass for a human-written undergraduate philosophy paper. ”
If true, it would seem only a matter of time until such systems can create articles that will easily pass for PhD-level philosophy. What happens to the academic philosophy business then? I predict that our relationship with AI is going to change when it’s white-collar elites who are being put out of jobs.
Lots of philosophers seem to be writing about AI, but so far I’ve not found one who is contemplating the threat it presents to their own business. Should you find such a writer, please share them with us. Thanks!
Thanks Phil. Glad you found it useful.
Westworld is a good show. I enjoyed it immensely. Although the fourth season got very dark. And the show has now been cancelled, so the end of that season is effectively the end of the show. Just something to be aware of.
On philosophers pondering how AI might affect them, he’s probably not exploring it exactly in the manner you’re looking for, but you might find this post from Eric Schwitzgebel interesting.
http://schwitzsplinters.blogspot.com/2022/11/gpt-3-can-talk-like-dennett-without.html
Thanks Mike. Yes, I’m on Eric’s site too. That might be how I found you.
Eric’s article seems the norm in philosophy land, best I can tell. He’s used AI to do some creative work he wants to tell us about. But so far at least he hasn’t shown interest in the threat AI may present to his own career. Or maybe that interest is there, and I just haven’t seen it. I asked about this specific subject repeatedly in the comments for that article, but no bites so far.
I’ve spent a lot of time on the APA (American Philosophical Association) blog over recent years. It’s mostly early career academic philosophers. If I was them I’d be VERY interested in this. They are very wound up in writing their articles. What happens to that when AI can compete??
What’s further interesting is that the author of the article I linked to above says AI can write convincing undergraduate philosophy, but hasn’t to my knowledge yet addressed the implications for PhDs.
I put a quick note in to Brian Leiter (https://leiterreports.typepad.com/blog/) on this subject. No reply yet.
Are all these folks assuming that AI can replace the little people, but never them, the big people?
I think a lot of people are aware of it, but still see it as being a ways off.
And of course, not everyone thinks it’s a bad thing.
https://aeon.co/essays/what-if-jobs-are-not-the-solution-but-the-problem