ESSAY

Becoming Human

Emma Sloley
Would you kick a robot? I only ask because there’s a video floating around in which a human person is interacting with a robot—one of those spindly metal ones whose designers haven’t gone to much trouble to make it resemble a living being, and yet it has a certain aliveness in the animated way it moves around—and the person just kicks it, out of the blue. The robot sort of staggers and then falls over. I have trouble watching that video. I find it genuinely upsetting. I realize, on an intellectual level, that the robot can’t feel anything, that it isn’t actually suffering pain or betrayal or humiliation. There’s something so chilling, though, about the contrast between the creature itself, so adorable and innocent in its goofy, spindly way, somehow projecting an aura of eagerness, that universal human desire to be liked, and the callousness of its human companion, who visits violence on it without a thought. Yes, I called it a creature. I have anthropomorphized that little robot, and I don’t think I’ll be the last person to do so as we move rapidly into the age of mainstream AI.
I can state with confidence that I’d never kick a robot. But then again, I’m the kind of soft-hearted fool who relocates spiders and bugs instead of killing them. (I once decided not to spray a cockroach because it turned and looked at me. It saw me, we shared a moment, and I balked.) I can accept that other, more rational people might see no problem in treating a piece of robotics the same way they might treat any other household appliance. Not with violence, necessarily—after all, who would want to go to the trouble, pain, and expense of punching a refrigerator in anger—but with indifference, callous disregard. Here’s the thing: robots, if/when they become commonplace, will occupy a rather unprecedented place in the taxonomy of physical existence. They’re not human, but neither are they inanimate objects. Perhaps their closest analog, at least in their relationship to human beings, is non-human animals. They are animated, autonomous to the extent that their programming allows, able to make decisions and respond to human commands. And there’s the rub. I really wonder whether, as a species, we’re ready for AI. Like most technologies, AI is coming whether we’re ready or not. But is it really a good idea to incorporate robots into our lives?
I’m not talking about the popular sci-fi/futuristic premise that artificial intelligence is going to destroy us. Guys, I’m more concerned we’re going to destroy it. (It. Hmmm. That raises the adjacent issue of what pronouns robots should have. Do the manufacturers bestow them? Should they decide for themselves?) The reboot of Ridley Scott’s 1982 cult classic Blade Runner, Denis Villeneuve’s Blade Runner 2049, is bringing these questions—which have intrigued scientists, technicians, philosophers, and artists for decades—back to the conversation table. What rights, if any, should robots have? How can we be sure we won’t end up with robots so sophisticated they will evolve to become self-aware and experience feeling, like the ill-fated replicants of Scott’s original, drippy-dystopia fever dream (in turn based on Philip K. Dick’s dystopian fable Do Androids Dream of Electric Sheep?)?
In some ways, a robot programmed to feel a limited range of what we call emotions could actually be very useful under certain circumstances: HR robots could break the news of layoffs with the precise mix of sympathy and brisk matter-of-factness. Grief robots could counsel grieving families when a beloved member dies. Dementia care robots could simulate rapt attention at stories they’ve heard a thousand times before.
But based on my cynical stance that great innovation will almost always be used first and foremost to generate the greatest profits, I have a feeling we’ll see the first generation of mainstream AI put to use for the corporate world and the big-spending consumer market. As the uncanny valley effect becomes less of a problem, robots engineered to look and behave almost exactly like humans will be welcomed into our factories, workplaces, and homes. (Obviously sex robots are coming to a bedroom near you as soon as humanly possible.) It seems a shame that the spaces where AI might be most effective—the home care industry and eldercare—will likely be the last places it arrives. I mean, which do you think will come first, the sexy Sean Young-from-Blade Runner-style AIs, or a frumpy non-gender-specific nurse programmed to empty bedpans and dispense medication?
In Variety’s review of Blade Runner 2049, the writer, Peter Debruge, made a passing reference to Pinocchio as the obvious progenitor to replicants, the uncannily humanoid droids who can “pass” as real people, and who are ruthlessly tracked down for extermination. Just like Pinocchio, those replicants yearned to be real boys and girls. And not just to be them, but to be recognized by humans as such. It struck me then how this thread has been winding through the history of mankind for almost as long as we’ve been around. It might seem like we’re on a recent binge of technology anxiety—see films like Terminator, Westworld, and Ex Machina, in which a super-sexy robot…well, no spoilers—but really, we’ve been grappling with this thorny idea for a long time, probably since we climbed out of the trees and realized there were other people around who looked almost the same as us, but not quite, and the first war got underway.
As we get closer to the singularity, the question seems ever more pressing. Who gets to be “real?” Who deserves humanity? And if robots do end up being able to experience emotions, to feel pain, fear, or depression—even on a simulated level that only mimics ours—do we owe them some kind of empathy? Or can we just sleep easy knowing that they’re not real like we are; that simulated emotion isn’t the real thing; that a series of electrical impulses doesn’t really add up to feeling in any genuine sense? Because come on, having to accommodate the feelings, hopes, and dreams of a circuit board wrapped in a piece of alloy covered with skin-like synthetic fabric sounds like kind of a drag. It’s hard enough doing that with the people we love. (Just kidding, honey.) Surely it will be awfully tempting to second-citizen a being that you can not only tune out, but turn off.
Maybe we should ask the animals. It’s only very recently that the general public has even reluctantly gotten on board with the now indisputable, scientifically proven idea that animals can feel pain and fear, can suffer from depression and anxiety in much the same way we do. That sort of thinking was once so unorthodox as to be heretical. We know better now, but we still largely treat non-companion animals like…well, animals. Factory farming is a moral abomination that our society continues to pretend doesn’t even exist.
Maybe we should check in with some fellow humans. Because, let’s face it, we haven’t even arrived at a consensus yet on whether all humans qualify for compassion. A little over 150 years ago in America, it was considered acceptable for humans to own other humans. To rape and murder them with impunity. In plenty of parts of the world, that’s still the norm. Many wealthy countries now have policies in place that allow genuine refugees fleeing unimaginable horrors to be turned away or imprisoned in modern-day concentration camps. There’s constant debate in our society about which classes of people deserve protection: women, gay people, transgender people, non-white people, people who practice certain religions.
Who deserves humanity? If any of us, then all of us, surely. But as those protections, which once felt so enshrined, get weakened, eroded, rolled back, there’s a creeping sense that for some of us, it’s time to prove our humanity all over again. I see people in my life and in my timelines—particularly people of color, Muslims and Jews, the LGBT community, and women of all ages and walks of life—who are exhausted by it. “I can’t believe I still have to protest this shit,” went a commonly expressed sentiment on signs at the recent Women’s Marches.
It’s hard to avoid bumping up against the uncomfortable idea that Homo sapiens wants, on some primal, apex-predator level, someone to kick around. In order to feel fully human, someone else needs to be relegated to sub-human. In the worst version of our collective self, maybe there’s a secret disappointment when we’re forced by law or societal pressure to consider previously dehumanized beings as equals. It’s possible that the desire to be cruel, to have dominion over something, is pre-programmed, and we’ll never escape it.
So, I don’t know, maybe we should put the whole AI idea on ice for the moment. At least until we can figure out once and for all who gets to have a stake in being considered human, with all the perks and protections that go along with it. Until that time, I don’t think we’re ready for the robot revolution. But if the day should ever come when I get a little robot of my own, I’m going to be nice to it, respect its autonomy, and treat it as I would like to be treated. Just in case.
Emma Sloley is a journalist and fiction writer whose work has appeared in Catapult, Yemassee, the Tishman Review, Lunch Ticket, Structo, Travel + Leisure and New York magazine, among many others. She is a MacDowell fellow and her debut novel, DISASTER’S CHILDREN, will be published by Little A books in Fall 2019. Born in Australia, Emma now divides her time between the US, Mexico, and various airport lounges. Find her on Twitter @Emma_Sloley and www.emmasloley.com