Going Down Swinging: Poesis Ex Machina: Non-Human Language and Mechanic Poetry

It’s time for our October swap with Going Down Swinging!

One of Australia’s oldest and strangest literary publishers, Going Down Swinging was conceived in 1979. It now produces print anthologies, audio recordings, multimedia publications, live events and a very busy website.

We’re happy to be able to team up with Going Down Swinging and introduce Australian writers to our PRISMers – and vice versa. We’ll be swapping articles and interviews once a month, so keep an eye out!

This month, Rafael S. W. writes about poetry, humans and robots…

Poetry has long been a way of trying to come to terms with something we don’t fully understand. In the past this has been death, or love, or war.

Now it’s Horse Ebooks and Google Autofill.

[Images: Google Poetics screenshots of autocomplete poems]

Some of these are poetic in that traditionally horrible way – esoteric babble aiming to stupefy through pretty-sounding phrases. But some are genuinely poetic. (Though the question remains as to the criteria of ‘poetry’, and whether an accident of clumsy typing is ‘genuine’.) The only metric we really have to go on is the one thing we still have over robots – feelings.

Does this poem feel like it was written by a human or a computer? is the central question asked by Bot or Not, an experiment designed by Oscar Schwartz and Benjamin Laird for the Digital Writers’ Festival earlier this year. Described as “a Turing test for poetry”, it challenges you to distinguish between a poem written by a person and one written by a robot. According to their Leaderboard, it seems like Gertrude Stein and Corey Wakeling are the most robotic humans, while this poem by the program Janus Node is the most human.

The methods used to create non-human poems range from rule-based systems and collages of original texts to language models built from databases of authors’ works. One example of these bots being loosed upon the world comes from @Pentametron, which is best explained in its own words:

With algorithms subtle and discrete
I seek iambic writings to retweet.
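@Pentametron’s filter can be sketched in miniature: keep only the lines whose syllable stresses scan as iambic pentameter. A real system would lean on a full pronouncing dictionary for its stress patterns; the tiny hardcoded lexicon and sample lines below are purely illustrative.

```python
# Toy Pentametron-style filter: a line passes only if its stresses
# form the iambic pentameter pattern 0101010101 (0 = unstressed,
# 1 = stressed). The stress lexicon here is a made-up miniature;
# real systems use a full pronouncing dictionary.
STRESS = {
    "the": "0", "cat": "1", "will": "0", "sit": "1",
    "upon": "01", "mat": "1", "today": "01",
    "hello": "01", "world": "1",
}

def is_iambic_pentameter(line):
    words = line.lower().replace(",", "").replace(".", "").split()
    try:
        pattern = "".join(STRESS[w] for w in words)
    except KeyError:
        return False  # unknown word: skip the tweet entirely
    return pattern == "0101010101"

print(is_iambic_pentameter("the cat will sit upon the mat today"))  # True
print(is_iambic_pentameter("hello world"))                          # False
```

Anything the lexicon doesn’t recognise gets thrown away – which is also roughly why such bots stay silent on most of Twitter and retweet only the rare accidental sonnet line.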

While we can take solace in the fact that we’re at least still writing the programs that generate the poetry, it’s clear that these sonnets have some of the creativity (and all of the voice) of our generation. Chris Baraniuk, in his essay ‘Evil Robots and Their Way with Words’, unintentionally describes the future of robo-poetry when talking about the dialogue of replicants in Blade Runner: “the things he says could easily be uttered by a human experiencing the same sense of fracture.”
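The crudest of the methods mentioned earlier – a language model built from a database of an author’s works – can be sketched as a word-level Markov chain. The corpus here is a toy stand-in, not any real database:

```python
import random

# Toy language model: record which word follows which in a corpus,
# then walk the chain to generate new text. Real systems train on
# large databases of an author's works; this corpus is illustrative.
def build_chain(text):
    words = text.split()
    chain = {}
    for prev, nxt in zip(words, words[1:]):
        chain.setdefault(prev, []).append(nxt)
    return chain

def generate(chain, start, length, rng):
    out = [start]
    for _ in range(length - 1):
        options = chain.get(out[-1])
        if not options:
            break  # dead end: no word ever followed this one
        out.append(rng.choice(options))
    return " ".join(out)

corpus = "the rose is a rose is a rose is a rose"
chain = build_chain(corpus)
print(generate(chain, "the", 8, random.Random(1)))
```

The output is grammatical-ish nonsense in the author’s cadence – which, depending on your feelings about Gertrude Stein, may or may not count as poetry.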

The fight to distinguish between humans and robots has already moved on from poetry and is coming to a Twitter feed near you. Botornot – not to be confused with Schwartz and Laird’s website – gets users to rank tweets and compares them to 16,000 users who are ‘verified humans’ (which sounds like something you’d put at the bottom of your dating site profile). While Motherboard is quick to jump to the conclusion that, based on these tests, Obama is a robot, it’s still worth learning how to distinguish between tweets, especially as robots have already started seducing my friends.

[Image: a Tinder spambot conversation]

While spambots on Tinder are hardly poetic, the fact is that communication is evolving, so we may as well get a common tongue.

At the moment, this evolution is heading in two directions. In the first, we develop robots that adopt our languages; in the second, we try to learn languages optimised for robots.

Robots interacting with humans

ASIMO is pretty much the poster child of the robots-resembling-humans race. As the first self-regulating humanoid walking robot, ASIMO has been guided by Honda through various stages of personal growth for the last fourteen years. And now they’ve taught him, sorry, it, sign language.

“Previous generations of Asimo have demonstrated incredible fluidity and speed of movements,” states Satoshi Shigemi, the ASIMO project’s chief engineer. But newer iterations of the robot are adding much greater flexibility and new means of communication. Andra Keay, managing director of Silicon Valley Robotics, believes robotics can go even further.

“We’re entering a rich age for deep machine learning from humans,” she said.

Just like that Animal Farm quote, “One language good, nineteen languages better”, French robotics company Aldebaran has created a ‘companion robot’ that can speak in nineteen different languages. Even more impressively, after teaming up with the voice technology company Nuance, this new companion – NAO – aims to be able to learn new languages too. Thanks to cloud-based software and integrations with Nuance’s voice recognition and text-to-speech programs, the NAO robot can also “walk on varying surfaces, track and recognize faces and objects, express emotions and react to touch.”

But who needs any of that when you could just make it dance to ‘Gangnam Style’?

Humans interacting with robots

Singing to them in binary could be a good start, but a specific ‘Robot Interaction Language’ might be even better. ROILA, as it’s known, is a language developed by the Netherlands’ Eindhoven University of Technology specifically for talking to robots. Like many conlangs it aims to be much more accessible and logical than natural languages, with the added bonus of being optimised for recognition by robots. This is thanks to an algorithm designed specifically to create a vocabulary free of ambiguous or confusing words. According to their website, humans shouldn’t have much trouble either, as “the simple grammar has no irregularities and the words are composed of phonemes that are shared amongst the majority of natural languages.”
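That vocabulary-building idea can be sketched in miniature. This is not Eindhoven’s actual algorithm – just an illustration of the principle: generate simple consonant-vowel words and reject any candidate that sounds too much like a word already in the lexicon, so a speech recogniser can always tell them apart.

```python
import itertools

# Toy ROILA-style vocabulary builder (an illustration, not the real
# Eindhoven algorithm): accept a two-syllable consonant-vowel word
# only if it differs from every accepted word by at least two edits.
CONSONANTS = "bfjklmnpstw"
VOWELS = "aeiou"

def edit_distance(a, b):
    # Classic dynamic-programming Levenshtein distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def build_vocabulary(size):
    words = []
    for c1, v1, c2, v2 in itertools.product(CONSONANTS, VOWELS,
                                            CONSONANTS, VOWELS):
        candidate = c1 + v1 + c2 + v2
        if all(edit_distance(candidate, w) >= 2 for w in words):
            words.append(candidate)
        if len(words) == size:
            break
    return words

print(build_vocabulary(5))
```

Every word that survives is guaranteed to be at least two edits from every other, which is the ‘free of ambiguous or confusing words’ property in its simplest possible form.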

While learning this inhuman tongue might seem strange, the rewards are rich – currently ROILA can be used with Roomba vacuum cleaners and Lego Mindstorms’ NXT robot.

My own personal favourite, though, is the language spoken by chess engines. It is a poetry of probability, where instead of every line being crafted into a whole, lines are pruned down until you just have a few letters and numbers per million phrases analysed.

[Image: chess engine analysis output]

“15…Bb7,” the chess engine will say to me, and even though we’re both using a common tongue, and the intermediary medium of the chessboard, it is highly unlikely I’ll be able to follow all its reasoning for the moves suggested. Indeed, chess computers have a reputation for quiet, seemingly subtle moves. If Grandmasters play chess with the finesse of a surgeon wielding a scalpel, then chess computers use laser eye surgery.
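That ‘pruning’ is, in most engines, alpha-beta search: whole branches are abandoned the moment they provably can’t change the result. A toy sketch on a hand-built game tree (the numbers are leaf evaluations; this is an illustration of the technique, not a real engine):

```python
# Minimax with alpha-beta pruning on a toy game tree.
# A node is either a numeric leaf evaluation or a list of child nodes;
# players alternate between maximizing and minimizing the score.
def alphabeta(node, alpha, beta, maximizing):
    if isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # prune: the opponent will never allow this branch
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:
                break  # prune
        return value

tree = [[3, 5], [2, [9, 1]], [0, -1]]
print(alphabeta(tree, float("-inf"), float("inf"), True))  # prints 3
```

The engine never finishes reading most of its million phrases – it just proves they don’t matter and moves on, which is why its suggestions can feel so inhumanly terse.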

It’s entertaining, too, to hear the human dialogue around chess computers. Players talk about certain chess positions requiring you to “think like a computer”, or they might say, in post-game analysis, “the computer wants me to play Bishop e2, but I went with my gut and played Bishop b5.” Instinct, creativity: these are the only advantages we hold over computers, and it’s still not enough to help us win.

Robots interacting with robots

Communication between machines has been a cornerstone of the twenty-first century, but where before it was purpose-driven, now it’s experimental. By crafting these interactions and analysing how robots develop and learn, we gain a greater understanding of the complexities of all languages – not just machine ones.

Lingodroids look like a child’s dump truck crossed with a dismembered thong, but according to the University of Queensland developers they are “language learning robots that play location language games to construct shared lexicons for places, distances, and directions.”

Rather than the typical transfer of data that happens between machines, the Lingodroids have been specially crafted to communicate concepts that can be both concrete and abstract. For example, they will observe a room (using 360-degree cameras, laser range finders, and sonar) and, if a space is unfamiliar to them, they will name it with a pairing of two syllables. One robot will then teach this new phrase to another and, through playing games, reinforce the understanding of this location.
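The naming game can be sketched as a toy simulation – an illustration of the idea rather than the UQ implementation. A robot that finds itself somewhere it has no word for invents a two-syllable name and teaches it to its partner; repeated games build a shared vocabulary of places:

```python
import random

# Toy Lingodroid-style naming game (illustrative only): the speaker
# invents a two-syllable name for an unnamed place, and the hearer
# adopts it, so repeated games converge on a shared lexicon.
SYLLABLES = ["ku", "zo", "pi", "ja", "re", "fo"]

def invent_name(rng):
    return rng.choice(SYLLABLES) + rng.choice(SYLLABLES)

def play_game(speaker_lexicon, hearer_lexicon, place, rng):
    if place not in speaker_lexicon:
        speaker_lexicon[place] = invent_name(rng)
    hearer_lexicon[place] = speaker_lexicon[place]  # hearer adopts the word

rng = random.Random(0)
robot_a, robot_b = {}, {}
for place in ["kitchen-corner", "doorway", "charging-dock"]:
    play_game(robot_a, robot_b, place, rng)
print(robot_a == robot_b)  # True: both robots now share the same lexicon
```

The real robots layer games on top of this – quizzing each other about distances and directions between named places – but the kernel is the same: a word exists because two speakers agreed on it.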

Through observing their processes of interaction, researchers gain a greater understanding of how we describe the world around us, as well as developing processes which will, one day, allow robots to give and receive complex directions, even to places they’ve never been before.

In the end though, it doesn’t matter if you’re a human, robot or somewhere in between – knowing how to express yourself, and understanding how others do so, is key to your sense of self-identity. But, in the wise words of Horse Ebooks, “unfortunately, as you probably already know, people”.

Rafael S. W. is a graduate of creative writing and one of the founding members of Dead Poets’ Fight Club. He writes every single day and has been published in Voiceworks, Going Down Swinging No. 33, the current print/audio edition No. 35, and Dot Dot Dash. He also competes in poetry slams and giant-sized chess games.