February 19, 2016

Do androids tell themselves stories in order to live?

by Chad Felix

Don’t give this book to your robot, human!

Joan Didion famously wrote:

We tell ourselves stories in order to live … We look for the sermon in the suicide, for the social or moral lesson in the murder of five. We interpret what we see, select the most workable of the multiple choices. We live entirely, especially if we are writers, by the imposition of a narrative line upon disparate images, by the “ideas” with which we have learned to freeze the shifting phantasmagoria which is our actual experience.

Now, according to a paper entitled “Using Stories to Teach Human Values to Artificial Agents” and thanks (thanks?) to some newfangled technology, robots are telling themselves stories in order to live, too.

As The Guardian’s Alison Flood reports, two professors at the School of Interactive Computing, Mark Riedl and Brent Harrison, have developed a “prototype system” by which robots become socialized by interacting with stories. They call the system—wait for it—“Quixote.”

Now, depending on where you stand regarding the grand (grand?) plight of technology (and the stories it is telling us), this is either great news, or, well … quite terrifying. I mean, have you read any books lately? Do we really want robots, which are no doubt stronger than many if not all of us, learning from books like this one? Or this one? Or this one?

But Riedl has anticipated our concerns and says, “Worry not.”

The AI … runs many thousands of virtual simulations in which it tries out different things and gets rewarded every time it does an action similar to something in the story. Over time, the AI learns to prefer doing certain things and avoiding doing certain other things. We find that Quixote can learn how to perform a task the same way humans tend to do it. This is significant because if an AI were given the goal of simply returning home with a drug, it might steal the drug because that takes the fewest actions and uses the fewest resources. The point being that the standard metrics for success (e.g., efficiency) are not socially best.

Under ideal circumstances, Quixote never performs actions that would be considered psychotic, harmful, or antisocial. This is significant because we never told Quixote what is right or wrong.

That being said, Riedl and Harrison also note that “it may not be possible to prevent all harm to human beings.” In other words: Please, worry on.


Chad Felix is the Director of Library and Academic Marketing at Melville House, and a former bookseller.
