May 16, 2016

“If you can fall in love with a statue, I don’t see why you couldn’t fall in love with a neural network trained on romance novels.”

by Simon Reichley

This is your brain on Google Brain on romance novels

This quote from Google’s Andrew Dai pretty much says everything you need to know about the search giant’s latest effort to humanize its consumer-facing AI interfaces. But these posts are supposed to be at least a few hundred words long, so let’s fill in some of the details.

Google Brain is the deep-learning branch of the tech-hegemon’s research efforts. The AI technologies being developed there are used in Google’s voice recognition software, as well as in video and photo searching. Also in Deep Dream, a flashback-inducing, ’70s-sci-fi-esque art project involving image recognition and neural-net learning techniques. It’s pretty far-out stuff.

Now, in order to create a sexier, more approachable AI interface for search and email apps, Google researchers have been force-feeding their AI programs thousands of romance novels. Because there’s no better representation of authentic American vernacular than the smutty sweet nothings of Nora Roberts.

Also because romance novels typically follow more or less the same narrative structure, allowing the deep-learning algorithms to sleeplessly trawl a great sea of ripped bodices and airbrushed biceps for variations in tone, style, and syntax. Which is the point: to make our machines sound totally fuckable, er, more human.

According to the as-yet-unpublished paper the team released, the program was actually fed more than 11,000 different texts, including the aforementioned romance tomes. The goal was to teach the AI to generate sentences in natural succession, so that, given an opening and a closing statement, it could produce a sequence of language that transitioned smoothly between them, as happens organically in human conversation. For example, it might be prompted to begin with “it made me want to cry” and talk its way to “the man asked.” The result is weird, which should not be all that surprising, given the artistic output of Deep Dream. Thu-Huong Ha at Quartz usefully excerpted some of the more entertaining examples, which we’ve included below. The first and last statements in each sequence are the prompts given by the research team.

“i want to talk to you.”
“i want to be with you.”
“i don’t want to be with you.”
i don’t want to be with you.
she didn’t want to be with him.

no.
he said.
“no,” he said.
“no,” i said.
“i know,” she said.
“thank you,” she said.
“come with me,” she said.
“talk to me,” she said.
“don’t worry about it,” she said.

he was silent for a long moment.
he was silent for a moment.
it was quiet for a moment.
it was dark and cold.
there was a pause.
it was my turn.

What exactly is this good for? I’m not sure, but here’s the abstract from the paper:

The standard unsupervised recurrent neural network language model (RNNLM) generates sentences one word at a time and does not work from an explicit global distributed sentence representation. In this work, we present an RNN-based variational autoencoder language model that incorporates distributed latent representations of entire sentences. This factorization allows it to explicitly model holistic properties of sentences such as style, topic, and high-level syntactic features. Samples from the prior over these sentence representations remarkably produce diverse and well-formed sentences through simple deterministic decoding. By examining paths through this latent space, we are able to generate coherent novel sentences that interpolate between known sentences. We present techniques for solving the difficult learning problem presented by this model, demonstrate strong performance in the imputation of missing tokens, and explore many interesting properties of the latent sentence space.
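
The interesting bit buried in that jargon is the “paths through this latent space” trick: the model squeezes each sentence into a single vector, and the eerie chains above come from walking a straight line between two of those vectors and decoding every stop along the way. Here’s a minimal Python sketch of that interpolation step, under the assumption of an already-trained model; `encode` and `decode` below are hypothetical stand-ins, not Google’s actual code.

```python
import numpy as np

def encode(sentence: str) -> np.ndarray:
    """Stand-in for the trained VAE encoder, which maps a sentence to
    a point in latent space. (Hypothetical: a real model returns a
    learned vector, not this toy hash-seeded one.)"""
    rng = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return rng.normal(size=32)

def decode(z: np.ndarray) -> str:
    """Stand-in for the trained VAE decoder, which greedily unrolls a
    latent vector back into words. (Hypothetical.)"""
    return f"<sentence decoded from a vector with norm {np.linalg.norm(z):.2f}>"

def homotopy(start: str, end: str, steps: int = 5) -> list[str]:
    """Walk a straight line between the latent codes of two sentences
    and decode each point on the path, the trick behind the examples
    above (the paper calls these paths 'homotopies')."""
    z_start, z_end = encode(start), encode(end)
    sentences = []
    for t in np.linspace(0.0, 1.0, steps):
        # Linear interpolation in latent space between the two codes.
        z = (1 - t) * z_start + t * z_end
        sentences.append(decode(z))
    return sentences

for line in homotopy("i want to talk to you.",
                     "she didn't want to be with him."):
    print(line)
```

With a real trained model in place of the stand-ins, each intermediate vector would decode into sentences like the ones excerpted above, morphing step by step from one prompt into the other.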

Andrew Dai gave a somewhat more straightforward answer to BuzzFeed last week, saying, “Hopefully with this work, and future work, [the Google app] can be more conversational, or can have a more varied tone, or style, or register.”

I have only one question: did it read A Gronking to Remember?

Simon Reichley is the Director of Operations and Rights Manager at Melville House.
