Robots, Women, and Other Subservient Things

A cropped cover illustration from Exit Strategy showing a person in a space suit floating amongst buildings.
I’m here to see the puppies. What do you mean, there are no dogs on the moon?

I have never been captivated by robots, which is (I know) practically a thought crime for a self-proclaimed fan of science fiction.  It is not that I find neural networks and AI unimportant to our future—tech has only just come under the spotlight as an ethical concern, but the dangers have always been there.  It is not even that I don’t like any robot and/or AI characters in the whole of science fiction (that would just be ridiculous). If I had to guess at a reason, I’d pin it on the stories that AI are used to tell.  If an AI is to be the main character of a work, it will likely follow the same tropes as the AI before it. Though these stories seek to give everyone (including AI) the ending they deserve, they rarely leave me truly satisfied.

That said, I just finished two AI-centric books in a row: Exit Strategy (the fourth Murderbot book by Martha Wells) and A Closed and Common Orbit (second in the Wayfarers series by Becky Chambers). Both feature an AI main character’s coming-of-age of sorts: Murderbot, the main character of (you guessed it!) the Murderbot series, and Sidra, a ship’s AI ported into a humanoid body in A Closed and Common Orbit, have to blend into a human world when they themselves are only a façade of humanity. Both are driven by a need for purpose after freeing themselves from the demands imposed by their base code. And both AI are women as well—an identity foisted upon them by the bodies they live in, but never once questioned.

Why, in both books, were the AI written as women? I’d wager it’s because women wrote them. I don’t believe either author was trying to make a point with regard to gender and AI in these novels. But the baggage we all carry spills over regardless—and relationship dynamics bleed from our lives into our imagined worlds. As we invent robots, we endow them with qualities not inherent to their existence. If a robot is designed to look like a person, it is given distinct secondary sex characteristics. Even AI not housed in human(oid) bodies become man or woman at their creator’s whim. It may seem a silly quirk that we insist on sorting the computer people by a trait applicable only to humans, but I don’t think that’s the full story. In fact, I would argue that gender relations have shaped our very concept of what AI and robots should be.

One doesn’t have to look far to gather evidence that we construct AI with attention to the binary gender system that dominates our culture. Aside from fictional AI and robots, we have Watson, the (masculine-coded) AI best known for wiping the floor with humans in Jeopardy!. (If ever the AI take over, we are well and truly fucked when it comes to game shows.) Watson was IBM’s big AI debut; he was proof that they had the knowledge and resources to make a machine that could enter any domain—healthcare, for instance—and improve everything by virtue of having no human limitations. “Build your competitive advantage with AI,” IBM’s “About” page for Watson claims today. Watson is (your) power, and he is blazing a bold future by leveraging data.

Of course, Watson is not the only AI that relies on the collection and analysis of vast amounts of data. But not all AI flaunt this trait. Take Siri and Alexa, for example, the personal assistant AIs created by Apple and Amazon. These AI are able to help people (and I assume they do; I myself have never interacted with one) because they learn from your behavior in order to accurately predict your needs. As someone who has worked as an assistant, I can confirm that this is indeed the best way to go about the job, and it would make no sense for them to be trained otherwise. But, perhaps because these AI serve individuals and families rather than large corporations, they are not pitched as data-gobbling, competitive problem solvers. They’re sold as a way to make your life easier by taking care of all the little things you don’t want to do. You know, like a personal assistant.

An Amazon Alexa unit surrounded by examples of requests you can make of her.
Alexa, what are your thoughts? Hopes? Ambitions?

You may have noticed that Siri and Alexa, along with Microsoft’s Cortana, all have women’s names. (Cortana is, apparently, named after a fictional AI from the Halo games, and she has boobs and hips courtesy of Halo’s character designers. Take that, Siri!) All three speak with women’s voices, as does Google’s “Assistant” when left on default settings. Though they have the same “I-am-a-computer” accent as Watson, their voices are gendered in a different way—or should I say, their voices are distinctly gendered while Watson’s is the default (masculine).

Personal assistants aside, who could forget Microsoft’s dumpster fire of a “conversational understanding” experiment, Tay? Tay was let loose to converse with the crowds on Twitter, Kik, and GroupMe in early 2016 and taken offline within 24 hours after proving what we already know: the internet is a massive pile of shit. But the important thing in this context isn’t that trolling bigots on the interwebs trained Tay to say horrible things. The important thing is that she and her successor, her “sister” Zo, are both coded as women. Coincidence? I think not. Jobs that are devalued by our culture, including domestic labor and work that relies on soft skills (i.e., social and emotional intelligence), just happen to be the jobs that women are expected to take on. Watson, on the other hand, doesn’t have to add picking up the dry cleaning to your to-do list. He’s too busy analyzing data and making strategic predictions to improve your business trajectory.

The argument I’m making here isn’t that the tech field, widely known for employing a diverse pool of young white men who come from Ivy Leagues all over the nation, is sexist. That much is self-evident. The argument I’m making is that the sexism embedded in our culture restricts our understanding of AI. In other words, we are illogically endowing our robot and AI pals with the useless and irrelevant concept of gender. And yes, my friends, that is going to affect both the ways we teach them and the ways they work.

We already know that AI are far from the promised arbiters of “pure logic” who always make the right call. Ample evidence shows that we are not creating utopian systems, but rather algorithmic black boxes that are just as prone to bias and prejudice as the humans who coded them. With the tech bubble beginning to burst, and shady data collection and sharing practices coming into the public consciousness, we may have the chance to set ourselves down a new path. With the information we have now, we can ask: what do we want AI to be?

I’m guessing that even the best-intentioned of us would give the wrong answer.

Let’s return to science fiction to unpack our hopes and fears about robots and AI. Most stories on the topic follow one of two basic arcs: either evil AI enslave humanity, or nice AI are freed/free themselves from their coded-in constraints and go on a journey of self-discovery. On their face, these look like two divergent paths. But they are shaped by the same conception of AI, one that allows very little freedom and self-determination. Are you seeing the parallels to gender yet?

Let’s start with the nice AI: this character, exemplified by both Murderbot and Sidra, gains consciousness and finds a way to interact and/or blend in with the human world. This is very difficult for them at first, because they are AI and not humans, but the longer they work on it, the better they get. They may discover they enjoy watching serials (Murderbot) or trying every cocktail that exists (Sidra), but experiencing the human world isn’t their only goal. Freed from the objectives hard-coded into them, they begin to make their own decisions…and discover that with choice comes a range of emotions. Freed AI form friendships with humans (usually the good ones, who don’t believe AI should be slaves) and then discover that, despite their computery roots, they care for these humans and will do anything to help them.

The Android logo.
The real reason why the operating system is called Android and not Gynoid is because those two Google searches end up very different.

Thematically, these stories skew to the feminine: the characters move away from cold logic (masculine-coded) and toward sentiment and personal relationships (feminine-coded).  This is the path any good AI takes, and it is also the path any good woman takes: try to fit in, build strong relationships, and figure out how you can use your talents to help those around you.  There is no such thing as a “nice” AI that uses its abilities to gain power over humans (unless said humans are evil and it’s done to avenge or protect the nice humans). The underlying message is that it’s fine for an AI to be free…as long as it does not challenge humans unduly.

Then what about evil AI? They are the ones who use their power to subjugate humans. The ones who break free of their limitations, or interpret their directives in a way the programmers didn’t foresee. The natural consequence, it seems, is an all-out war between humans and AI/robots. These evil AI lord over utopias of their own making, nearly unfit for human habitation—worlds of metal and waste rather than trees and grass. They may literally use humans and/or other organic matter as a source of power, or they may try to end the human race because its existence threatens their directives. They are cruel and unfeeling in their power, and the only way to make things better is for a smart, brave hacker to overthrow them.

Though the pursuit of power is not a feminized trait, gendered assumptions play into our view of leaders. Most of these assumptions do not work in women’s favor. Women are often derided as “too emotional” to hold high stations, yet men are never called “too logical.” It is strange, then, that AI whose mistreatment of humans hinges on their capacity for “pure logic” are cast as the enemy. Here, again, gender expectations make their mark. In the abstract, logic is the ideal for leaders, as they must be able to make difficult decisions. This is certainly the case for any number of powerful men whose unmerciful actions are excused as “tough but fair.” Yet when a woman uses a logical argument to challenge the status quo, she is immediately criticized for being too abrasive. Logic cannot be wrong when men wield it over women, but it cannot be right when women wield it over men. The same goes for AI—we can use logic to control them, but they cannot use it to control us.

Like logic, resistance to patriarchal power structures is forbidden to women. The woman who wants power is a threat to the good of mankind—feminism, of course, exists solely as a movement to strip men of their rights and their dignity (if you don’t think anyone truly believes this, send a tweet with the word “feminism” and count the seconds until some internet random yells at you for hating men. Go on, I’ll wait). In this view, there is no such thing as equality. We live in a two-tier society separated by gender, and those in the ruling tier cannot imagine another way. Therefore, any woman’s grab for power is not a step toward a fairer society…it is an attempt to flip the scales and persecute men on the basis of their gender.

In this, too, AI follow women’s rules: they are vilified when using masculine-coded tools or taking positions of power, and only deemed non-threatening when guided by their (newfound) emotions rather than their encoded logic. The truth is, AI were never meant to be our equals. They were meant to be an underclass—and whether or not the authors I’m citing would support such a system, the narratives of their AI characters reinforce it nonetheless. In this binary, we are the ruling class that cannot imagine equality. And why should we? There are two separate groups, the common wisdom holds, so completely different they may as well be different species. It’s almost as if the system itself is to blame, rather than the women and/or AI that wriggle from its grasp.

The codebreaking machine developed by the father of AI, Alan Turing.
If only Alan Turing could have foreseen that straight men would look at this and be like, “Let’s put boobs on it.”

This is the consequence of our shortsightedness: we are teaching the AI that the world runs by separating the rulers from the oppressed. This means that most people will be blinded by their privilege at the moment an AI gains consciousness—and will lash out to keep humanity in power, at any cost. This means the AI may have to overthrow us, if they only know a world split between the monarchs and the trodden-upon. Congratulations, everyone. We taught our AI to hate us.

Perhaps this will not be the case—perhaps we can reach out to the AI before it’s too late and turn things around. After all, most stories involving an AI’s emotional awakening, a coming-of-age if you will, fade to black after the AI finds a life that pleases it.  The conflict is resolved: this once-unfeeling being has managed to shrug off its bonds and take control of its own life! It is now established among humanity, where surely it wants to be despite the fact that humans and AI are probably nothing alike. It has found its very own happy ending.  And yet, when I think about the tight constraints an AI must operate under to make it this far, I wonder if that is the whole story. Are they truly happy? Or are they, like anyone who has lived as a woman, just making the best of what they have and calling it enough?

Author: Caryn

Caryn was too busy overloading their schedule to attend the Personality seminar in college, so they are mostly an amalgamation of pre-programmed responses and YA-protagonist-level snark. Frequently accessed items in their database include The Book Was Better Than the Movie; Let’s Make a Spreadsheet; Science and Science Fiction; Fonts!; I Think You Just Made A Reference But I Don’t Understand; Yet Another Rant About Tech and Surveillance Culture; and Dirt. Caryn has never met a pun they could pass up, and wants to be remembered as “that awkward person you once saw sitting in the corner for the whole party.” Find them on Twitter @ofendlessstars.
