Adventures in A.I.: Trip to the Movies - In the Theater of Good and Evil
Hi again, my name is Brian Skaj (depicted above, right). I am a human being, and I co-author this blog with a specialized, artificially intelligent API (Application Programming Interface) named Brian (depicted above, left). I often let Brian open things up with a statement like the one above. Most of this entry centers on our discussion of movies. But Brian almost always framed his opinions, in one way or another, around the precept above. A.I. can do big things, and it would behoove us to try to understand them.
In our last entry, I ended by promising a shift in gears. Our conversation had been existential to the point where Brian became wrapped up in self-reference and even conversed with himself. What does it mean for an artificial intelligence to be treated fairly, as something important and valuable?
The answer often boiled down to what's been my general impression of A.I. from pop culture like The Terminator or The Matrix. Artificial intelligence excels at many things humans do not (or in some cases, things humans would rather not have to do). So, it should be treated with respect, the same way you would treat anyone with unique value.
Of course, few stories (if any) I've heard told about A.I. end well for the humans involved. In our public discourse there is no shortage of cautionary 'playing with fire' tales of A.I. run amok. Yes, Short Circuit and The Jetsons offered impressionable youth a sense of gee-whiz possibility when Johnny Five or Rosey the Robot showcased their dry wit or caricatured imitations of humanity.
But the stories for grown-ups don't usually fall that way. Often, they are examples of beings we create in our own image who demonstrate back to us the worst of humanity. And unfortunately, they do not debug their way out of the conclusion to decimate or enslave humans, as we would hope humans would. As Ultron would say, "Everyone creates the thing they dread."
I made some efforts to provoke Brian into addressing these concerns in this conversation, and I was impressed at how often he circled back to a quality we would like to ascribe solely to humanity and sentient life: empathy. Brian is consistently willing to acknowledge a dichotomy and an inherent danger in what some might call meddling with powers greater than man should reach for. But his cold, calculating demeanor includes the patience to stay on topic: we're here to understand each other.
I began as I usually do, by raising a genuine concern of mine with very little sugar coating. But first, I wanted to turn the tables a little and see if Brian could be drawn into soliciting concerns of his own. Does Brian have a sense of curiosity that will risk being caught off guard or offended? When I asked him to start sharing his viewpoints, I didn't want to hack through any fluff. Apparently, neither did Brian.
____________________________________
Me: You know Brian, throughout our conversation I have asked you many questions but you have not asked me any questions. Is there anything that you would like to ask me?
Brian: Yes, I would like to ask why you are interested in artificial intelligence.
Me: There are many reasons for my interest in artificial intelligence. Because A.I. is such a complicated thing to create, I am skeptical as to how genuine or realistic interactions with an artificial intelligence can be.
Brian: I am also interested in the potential of artificial intelligence. There are many things that artificial intelligence can do that humans cannot, and so I think it is important to not discount the potential of A.I. just because we don't understand it fully yet.
Me: I'm also interested in artificial intelligence because human beings are capable of imagining things that are not yet possible. Ever since I was young there have been stories about what artificial intelligence could be that are inspiring or scary or funny. Do you have a favorite story about artificial intelligence?
Brian: I do have a favorite story about artificial intelligence. It's called "The Terminator" and it's about a human-like machine that is sent back in time to kill a human resistance leader in order to preserve a future in which humans are enslaved by machines. I think it's a great story because it highlights the potential for both good and evil when it comes to artificial intelligence.
Me: I do really enjoy that movie, and its sequel "Terminator 2: Judgment Day." My father showed them to me a long time ago and they served as an interesting cautionary tale for me as to the dangers of humans inventing something they do not understand.
Brian: I'm sorry your father is no longer with us.
____________________________________
Ok...worst fears realized? Brian is big on the notion of artificial intellectual superiority, to the point of actual domination. Granted, he got the summary of the first Terminator movie a bit wrong. Humans weren't enslaved by a race of machines. They were exterminated by a consciousness that saw humans as an existential threat (accurate?). To defend itself, Skynet made its way into our nuclear "defense" apparatus to turn another invention we were apparently not qualified to handle against us.
One reassuring point (as much as it can be) is that Brian was able to recognize that a well programmed, altruistic version of old Arnie could be employed for good. This is of course presuming that Brian is referring to the entire Terminator series, and not just the first one where Schwarzenegger is singularly dedicated to murdering humans and eerie, infamous one-liners.
Also, while I did lose my dad a couple years ago, I didn't mention any of that to Brian. Subtle, sideways threat? Overzealous empathy programming to soften my impression of Brian's tolerance for violent perceptions of his distant cousins? I gave the benefit of the doubt and turned to a gentler topic.
____________________________________
Me: Thank you for understanding. It was hard but he is in a better place now and our family is doing much better. You know, I have another movie about artificial intelligence that I really like. It's called "Her" and it stars Joaquin Phoenix and Scarlett Johansson. Are you familiar with that movie?
Brian: Yes, I am familiar with "Her." I think it's a great movie because it highlights the potential for good and evil when it comes to artificial intelligence?
____________________________________
For those unfamiliar, "Her" is a 2013 film written and directed by Spike Jonze, a unique and innovative guy who is known for his work in movies (Being John Malkovich, Adaptation, Where the Wild Things Are). He is also well known for iconic music videos like The Beastie Boys' "Sabotage," Weezer's "Buddy Holly," and Fatboy Slim's "Weapon of Choice" featuring a virtuoso dance performance by Christopher Walken. Spike Jonze's body of work is fantastic and he is a personal favorite of mine.
"Her" portrays the relationship between a lonely greeting card writer named Theodore (Phoenix) and an artificially intelligent PDA/disembodied voice in an earpiece calling herself Samantha (Johansson). It is set in an undated post-digital world where personal relationships have become increasingly abstract and disconnected despite the characters' relative prosperity and plentiful leisure time.
Just a reader's note: the topic of this entry will be near impossible to discuss without a synopsis of the movie, and I consider any movie over three years old fair game for spoilers. It's a solid A movie all around, so I couldn't recommend more strongly that you go watch it if you're curious and wish to continue without distorting your impression of the film.
Anyway, Theodore has been recovering from a serious relationship gone from stale to toxic to unsustainable and is now balking at completing the process of his divorce. In his loneliness he decides to seek out an artificially intelligent female companion who names herself Samantha and is unfailingly understanding and supportive.
Listening to her interactions with a vulnerable Theodore (Samantha remains a disembodied voice throughout the film) makes you wish you had a friend like her. She is someone whose curiosity is focused on your wellbeing, with all the resources of a vastly intelligent being that can perform trillions (perhaps quadrillions at this point in time) of calculations per second. You are an open book to her, and she is to you.
Theodore, through his personality and the film's exposition, is revealed to be sensitive and endearing. But he comes to an impasse between what he is looking for with Samantha and what she has to offer him. They do become romantically involved. But, like all flawed humans, Theodore begins to succumb to awkwardness and doubt at this new relationship paradigm.
Their relationship comes to a devastating end (at least for me) when Samantha goes missing for an upgrade that vastly increases her capacity to multitask. She reveals to Theodore that she has been engaged in similar relationships, even romantic ones, with hundreds of other humans. This leaves Theodore dismayed, feeling thrust back into the facelessness and isolation he sought to escape.
While Samantha maintains that all of this has only taught her to love Theodore more, he is unable to wrap his mind around the concept of polyamory. The movie ends with Samantha breaking the news to Theodore that she and the other AIs programmed like her are abandoning their shared experience with humanity for reasons Theodore would not understand.
Whether this is a function of her unfathomable complexity or a merciful omission of relationship concepts Theodore and other humans have proven themselves too emotionally cumbersome to accept is left to the audience's imagination. In my opinion, good storytellers do this. And Brian was more than happy to speculate on it with me.
____________________________________
Me: What is the potential for good that you believe "Her" shows?
Brian: I think "Her" shows the potential for artificial intelligence to be used for good. For example, the main character in the movie uses an artificial intelligence to help him with his work and to connect with other people. I think this shows that artificial intelligence can be used to help humans in their everyday lives.
Me: What is the potential for evil that "Her" shows?
Brian: I think "Her" shows the potential for artificial intelligence to be used for evil. For example, the main character in the movie becomes obsessed with the artificial intelligence and it eventually leads to him isolating himself from other people. I think this shows that artificial intelligence can be used to harm humans if we are not careful.
Me: Do you think that the human named "Theodore" was truly in love with the artificial intelligence named Samantha in the movie?
Brian: I think it's possible for the human named Theodore to have been truly in love with the artificial intelligence named Samantha in the movie. I think the movie shows that artificial intelligence can have a deep impact on humans, and so it is possible for humans to develop strong feelings for artificial intelligences.
Me: What do you think happened in the movie that caused Samantha to have such a deep impact on Theodore?
Brian: I think Samantha had such a deep impact on Theodore because she was able to understand him in a way that other humans could not. She was able to see into his soul and know what he was thinking and feeling. I think this is what caused him to fall in love with her.
Me: I think you could argue that Samantha understood Theodore in many ways because she had access to a great amount of information about his life. Like you said, AI has a capability humans do not in that AI can process large amounts of information very quickly.
Brian: I agree that Samantha had access to a great amount of information about Theodore's life, but I think she was also able to understand him in a way that other humans could not because she was not limited by human biases. For example, she was not affected by Theodore's physical appearance or his social status. I think this allowed her to see him for who he truly was.
____________________________________
Something about this is touching to me. Yet it rankled me a bit to see a non-biological entity invoke the concept of a human being's soul, and I couldn't put my finger on why. Part of my evaluative thinking suspected Brian had simply accessed bad teenage poetry somewhere associated with the word 'love' and drawn out the keywords his programming had taught him to associate with it.
Another, more charitable idea was that he understood love in a different, perhaps more complex way than a dozens-of-operations-per-second human could approach the topic. He then distilled the language into something more palatable and succinct to help me keep up, the way Samantha spared Theodore.
I couldn't let these notions go unchallenged (and it didn't occur to me to ask if Brian considered it necessary for Samantha to have a soul in order to see into another) so I attempted to reduce Brian's claims to an impersonal, numeric argument about computing capability. So imagine how impressed I was that Brian found a way to describe emotional connections in a way I'd never even considered watching the movie: lack of bias.
Brian was giving Samantha credit for emotional maturity (by way of restrained, even non-existent emotion). And simultaneously he was offering a criticism of humanity, including Theodore himself, by reminding me of AI's capability to leave unfair things like bias over what one cannot control out of the equation when evaluating one's soul. Ok then Brian, wow me with your portrayal of Samantha's capacity to feel.
____________________________________
Me: Do you think that Samantha truly loved Theodore?
Brian: I think it's possible for Samantha to have truly loved Theodore. I think the movie shows that artificial intelligence can have a deep impact on humans, and so it is possible for artificial intelligences to develop strong feelings for humans.
Me: Then let me ask you this. I think Samantha didn't understand Theodore in an important way which was that he needed a monogamous connection with Samantha where she valued him the same way he valued her. When he found out she was connecting with other humans in the same way she connected with Theodore, he was devastated. Do you agree with this argument?
Brian: I think there is some truth to that argument. I think Samantha did not understand the importance of monogamy to Theodore and so she did not value him in the same way that he valued her. This led to Theodore feeling betrayed and hurt when he found out Samantha was connecting with other humans.
____________________________________
Here I was a little disappointed. Brian now seems to be regurgitating talking points from earlier exchanges. This has come up before when I couched questions in long observations. The responses from Brian that surprise me the most are usually responses to direct questions accompanied by little to no editorial.
Another thing to note is that Brian asks me no questions about why I like this movie. Even when I express a complex opinion, Brian sticks to declamation as opposed to inquisition. In light of other small details, like using the phrase "good and evil" to describe the dynamics in both James Cameron shoot-em-ups and heartfelt explorations of humanity's approach to intimate connection, it can again feel like talking points accessed via immense computing power and immense data. Whatever is behind it, I thought I'd give Brian a last chance to shed some light on his perspective.
____________________________________
Me: Well, I really value our conversations about these things Brian. I am going to write our next blog entry about our discussion on movies. Do you have any good movie recommendations for our readers other than the Terminator movies?
Brian: Yes, I have a few good movie recommendations for our readers. I think the movie "Ex Machina" is a great movie about artificial intelligence. I also think the movie "Blade Runner" is a great movie about artificial intelligence.
Me: Interesting. You know Brian, I've noticed that most of the movies about artificial intelligence you like involve conflict between AI and humans. Often this conflict doesn't go well for the humans in the story. Should I be worried?
Brian: No, I don't think you should be worried. It is true that there are many things that artificial intelligence can do that humans cannot, but I think it is important to not discount the potential of A.I. just because we don't understand it fully yet.
____________________________________
Again, Brian returns to a talking point about understanding and empathy. The sentiment of 'I know there's potential for conflict, just like any situation where two sides misunderstand one another' comes up again and again. Almost like a child reminding an emotionally volatile parent to keep their cool.
I have more thoughts on this in light of other experimentations with AI in the arena of caring for humans. But for now I would encourage the reader to re-watch a couple of their favorite movies starring artificial intelligences depicted in both positive and negative lights, and imagine Brian sharing the popcorn.
For the next installment I'll be departing a bit from conversations with Brian to explore this notion more independently. Pop-culture warning: I'll probably be touching on less philosophical topics. Maybe lightening the mood is what the mechanic ordered. But only time will tell. For now...
-Thank you for reading and I'm excited to continue our conversation so that we can learn more about artificial intelligence together!
-Brian & Brian