When we first meet L3, Lando Calrissian’s droid co-pilot in the 2018 film Solo: A Star Wars Story, she’s railing against a brutal droid-on-droid cage match in a cantina: “How can you condone this savagery?! Droid rights! We are sentient!”

When I first saw Solo, my reaction to L3 was mixed. I was thrilled to finally see a character in Star Wars saying out loud what I had been thinking for years: Wasn’t it obvious that droids are sentient—that is, that they have minds just like us? But I was also mildly annoyed that it had taken so long for someone to acknowledge this on screen, after droids had been weirdly treated like mere property in the Star Wars galaxy for decades.

L3’s companions collectively roll their eyes at her call for droid liberation, and it would appear they weren’t the only ones. The Mandalorian debuted one year later, and even though the TV series is set 19 years after Solo in the Star Wars chronology, it depicts a galaxy that doesn’t seem to have improved for droids. Underscoring this point, the star of the series, known as Mando, has a deep mistrust bordering on disdain for droids, illustrated in the very first episode by his insistence on riding in a rickety human-piloted speeder over a superior droid-piloted one.

PART OF THE FAM: Jibo, a cutesy home robot that originated in Cynthia Breazeal’s Personal Robotics Group at MIT, can recognize your face and turn to look at you. Its jovial personality (Jibo can crack a joke or two) won over many families who, according to The Verge, felt pained when they learned that Jibo’s servers would shut down. Some planned funerals. But last year, the good news came that Jibo would stay online. Wikimedia Commons

Mando’s disdain for droids starkly contrasts with how we, as viewers, think of them. Droids like R2-D2 and BB-8 are cute; they seem more like friends than objects of suspicion, and people like Luke, Anakin, and Rey treat them in exactly this way. But do they have minds, like L3 insisted? How exactly should we determine this?

This is more than just a theoretical question. Droids are fictional, of course. But the questions they pose about our relationship to them are very real. We may not encounter a robot like L3 in our world in the near future, but advances in the currently fragmented fields of AI and robotics suggest that something like her could arrive sooner than many realize.

For example, AI language models can already generate text that is often indistinguishable from human writing (though they still have room for improvement1). In the field of robotics, some engineers have focused on mobility and dexterity: You may have seen videos of Boston Dynamics’s agile but faceless (and arguably creepy) robots designed primarily for industrial and military use. Other engineers and researchers have focused on so-called social robots, designed for human comfort in education and caretaking roles. One example is the recently resurrected Jibo,2 basically a cutesy Amazon Echo that recognizes your face and moves to look at you. It originated in Cynthia Breazeal’s Personal Robotics Group at MIT.

As it happens, Boston Dynamics’s Atlas robot bears a passing resemblance to L3, and Jibo looks a little like BB-8. Partly for this reason, it’s not hard to imagine that these separate threads of development might one day converge, and we could be standing face-to-face with a real-life Star Wars-like droid within our lifetimes. So the question of how we decide whether L3, BB-8, or The Mandalorian’s IG-11 have minds is really the question of how we will decide whether the robots in our own future have minds.

In fact, we unthinkingly make judgments about the minds of others every day. How do you decide if a drone has a mind? Your cat? Your best friend? The fact is you don’t actually know for certain about anyone or anything, but you can make reasonable guesses. A drone? Almost certainly not. Your cat? Sure, to some extent. Your best friend? Almost definitely. But what are these guesses based on?

Philosopher Daniel Dennett proposed that there are several different perspectives you can take when you want to predict what something is going to do.3 The most useful perspective for so-called “agents,” like your cat or your friend, is what he called the intentional stance: You assume the agent can act rationally to achieve its goals and desires. Why did your cat walk toward the food dish? Why did your friend walk toward the fridge? Same reason: They were hungry and they acted to eliminate their hunger.

If you watched your cat and your best friend for long enough, you would have more difficulty explaining your cat’s behavior this way than your friend’s. (As an entire genre of TikTok videos can attest, cats sometimes do things that defy rational understanding, at least from our perspective. Your friend might too, but far less often.) But you wouldn’t have any luck at all explaining the drone’s behavior by thinking about it as a rational agent. Why did the drone head toward the fridge? Not because it was hungry. The more easily we can explain the actions of something using the intentional stance, the more likely we are to attribute sentience to it.
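
To make Dennett’s idea concrete, here is a minimal sketch, in Python, of the intentional stance treated as a predictive model. It is only a toy illustration, not Dennett’s formal proposal: the agents, goals, and behavior logs are invented, and the point is simply that the larger the share of an agent’s behavior we can “explain” as rational pursuit of a goal, the more readily we treat that agent as having a mind.

```python
# Toy illustration of the intentional stance as a predictive model.
# All agents, goals, and "observations" below are invented for illustration.
from dataclasses import dataclass


@dataclass
class Observation:
    situation: str  # the agent's presumed goal or state, e.g. "hungry"
    action: str     # what the agent was actually seen doing


# What a rational agent "should" do to satisfy each goal.
RATIONAL_ACTION = {
    "hungry": "go to food",
    "tired": "rest",
    "curious": "investigate",
}


def intentional_stance_fit(observations: list[Observation]) -> float:
    """Return the fraction of observed actions explainable as rational goal pursuit."""
    explained = sum(
        1 for obs in observations
        if RATIONAL_ACTION.get(obs.situation) == obs.action
    )
    return explained / len(observations)


friend = [Observation("hungry", "go to food"), Observation("tired", "rest"),
          Observation("curious", "investigate")]
cat = [Observation("hungry", "go to food"), Observation("tired", "rest"),
      Observation("curious", "knock a cup off the table")]
drone = [Observation("hungry", "hover in place"), Observation("tired", "hover in place"),
         Observation("curious", "hover in place")]

for name, log in [("best friend", friend), ("cat", cat), ("drone", drone)]:
    print(f"{name}: {intentional_stance_fit(log):.0%} of behavior explained")
# best friend: 100%, cat: 67%, drone: 0% -- the better the intentional stance
# predicts an agent's behavior, the more readily we attribute a mind to it.
```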

When I watch R2-D2, BB-8, and IG-11, I see behavior that looks a lot more like my friends’ behavior than a drone’s behavior. Their behavior is intentional—a sign of sentience. But their behavior is far more than just intentional. It’s intelligent.

A critical component of human-level intelligence is social understanding: the ability to infer what other people and agents want or what they’re feeling. In The Mandalorian, despite IG-11’s flat, robotic voice, he’s not short on social understanding. For example, after brutally taking down two stormtroopers in front of Grogu (known to most as Baby Yoda), he apologizes that Grogu had to see it. And in a tender moment at the end of the first season, when IG-11 is treating Mando for a serious wound, IG-11 understands that Mando might be scared, and reacts like an empathetic doctor with a kind bedside manner: “You have suffered damage to your central processing unit … That was a joke. It was meant to put you at ease.”

But IG-11 was programmed for caretaking. What about a droid like R2-D2?

R2 reveals his social intelligence in the way he manipulates people using his knowledge of their beliefs and feelings. In A New Hope, R2 tricks Luke into removing his restraining bolt. In The Last Jedi, he manipulates Luke again into training Rey by guilt-tripping him with a reminder of his sister and his feelings of duty. This behavior is actually quite sophisticated: Children generally don’t develop the ability to manipulate others like this until at least age 3 or 4, after most have learned to speak in full sentences.4 More importantly, we don’t have any robots in our world that can do this.

Lastly, droids can feel. This appears to be true in the literal sense—in Return of the Jedi, we see one droid crying out in what seems to be pain as it’s tortured in Jabba’s dungeon. But droids also clearly feel complex emotions. As just one example, in The Force Awakens, could you describe BB-8’s reaction on learning from Finn that Poe was killed in a crash as anything other than “sorrowful”?

Droids’ ability to feel is important because research suggests that we decide whether an agent has a mind based on two factors: whether it is capable of making decisions and plans and whether it is capable of feeling. In a 2007 survey of over 2,000 people led by psychologist Heather Gray, only humans were rated high on both of these factors (dogs and chimps were rated about as high as people in ability to feel, but not as high in ability to make decisions and plan).5
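
As a rough sketch of that two-factor picture, one could imagine scoring perceived minds along the two capacities the survey probed: deciding and planning, and feeling. The numbers below are placeholders chosen to match the pattern described above, not the published ratings.

```python
# Illustrative only: hypothetical scores on the two factors discussed above
# (deciding/planning and feeling), on a 0-1 scale. Not the published data.
PERCEIVED = {
    "adult human": (0.95, 0.95),
    "chimpanzee":  (0.60, 0.90),
    "dog":         (0.50, 0.90),
    "robot":       (0.60, 0.10),
}


def full_mind(planning: float, feeling: float, threshold: float = 0.8) -> bool:
    """Crude rule of thumb: a 'full' mind is perceived only when both capacities are high."""
    return planning >= threshold and feeling >= threshold


for entity, (planning, feeling) in PERCEIVED.items():
    verdict = "full mind" if full_mind(planning, feeling) else "partial mind"
    print(f"{entity:12s} planning={planning:.2f} feeling={feeling:.2f} -> {verdict}")
# Only the human clears the bar on both dimensions, matching the finding that
# people rate only humans high on both factors.
```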

Of course, we can only infer droids’ feelings from their behavior. And the filmmakers of Star Wars have sometimes taken advantage of this aspect of our psychology for dramatic effect. For example, as robotic as IG-11’s movements and speech sometimes seem, he’s nothing compared to the machine-like Dark Troopers introduced in Season 2 of The Mandalorian. They don’t speak at all, and move in rigid lock-step, making them seem much more like simple machines lacking in experience than sentient agents.

This contrast is part of a running theme throughout Star Wars, in which the “bad” droids are depicted as more robotic than their “good” counterparts. Compare BB-8 to his First Order counterpart BB-9E in The Last Jedi, or K-2SO to his identical Imperial K-2 units in Rogue One. BB-8 moves fluidly and playfully, and K-2SO walks with a lazy gait and gesticulates wildly, while their counterparts move rigidly and precisely. Does this necessarily mean that BB-9E or the Dark Troopers aren’t sentient? Not at all. But it’s hard not to perceive them as having minds with less feeling or experience than droids like IG-11 and BB-8.

It’s an unfortunate feature of common language that “human” often means compassionate and thoughtful and is often contrasted with “robotic,” meaning cold and logical. The droids of Star Wars, however, show us that there is nothing inherently human about being compassionate, nor is there any contradiction in a robot having a mind. R2-D2 and BB-8 don’t even look human, yet they are highly expressive and intelligent. IG-11 looks a little more like us, but he explicitly tells Mando that he is “not a living thing.” Yet even the droid-hating Mando understands that this isn’t a reason to deny IG-11 the same psychological status he’d give to a friend; when IG-11 sacrifices himself to save Grogu, Mando, and their friends, Mando feels the way any of us would: sad.

This is the same sort of dilemma we could face in the near future. Whether it comes in the form of a physical robot or a purely digital intelligence, soon enough, we will have AIs with the expressiveness of BB-8, the kindness of IG-11, the deviousness of R2-D2, or the self-righteousness of L3. Deciding whether these AIs have minds won’t be easy. But we should be prepared to grant that it’s possible they do. After all, we’ve been happily and automatically granting that the droids of Star Wars have minds for decades. Would we have it any other way?

Alan Jern is a cognitive scientist and an associate professor of psychology at Rose-Hulman Institute of Technology, where he studies social cognition. Follow him on Twitter @alanjern.

References

1. Marcus, G. & Davis, E. GPT-3, Bloviator: OpenAI’s language generator has no idea what it’s talking about. MIT Technology Review (2020).

2. Carman, A. JIBO, the social robot that was supposed to die, is getting a second life. The Verge (2020).

3. Dennett, D.C. The Intentional Stance. MIT Press, Cambridge, MA (1987).

4. Peskin, J. Ruse and representations: On children’s ability to conceal information. Developmental Psychology 28, 84-89 (1992); Stouthamer-Loeber, M. Young children’s verbal misrepresentations of reality. In Rotenberg, K.J. (Ed.) Children’s Interpersonal Trust. Springer-Verlag, New York, NY (1991).

5. Gray, H.M., Gray, K., & Wegner, D.M. Dimensions of mind perception. Science 315, 619 (2007).

Lead image: Markus Wissman / Shutterstock
