One of the most interesting things about human language is the way that what we say is merely an indicator of what we do not say - of deeply shared understandings of which our language is only a signifier. In a sense, words and sentences are shorthand for a vastly larger meaning space, one that never needs to be explicitly represented because every one of us carries it around in our head.
We can only really appreciate this mutual understanding when we see attempts to replicate it in software - attempts that often fail in interesting ways. For example, today I learned, from a lecture by my friend Noah, about “Tale-Spin”, a software system developed by James Meehan to automatically generate stories. Most of the generated stories are fairly uninteresting, with the notable exception of the ones that fail - the stories in which the artificial intelligence inference logic broke down somewhere along the way. Here is one of my favorite Tale-Spin failures:
Henry Ant was thirsty. He walked over to the river bank where his good friend Bill Bird was sitting. Henry slipped and fell in the river. Gravity drowned.
What on earth is going on here? How can gravity drown? It turns out that the story system understands that Henry is being pulled into the river by gravity. But it doesn’t understand that gravity is not, in fact, a character. In the internal logic that generated this story, Henry Ant survives his mishap because his good friend Bill Bird pulls him out in time.
Alas, gravity is not so lucky. Treated by the system as a character in the story, and having nobody to pull it out of the river, poor gravity dies. Or, as Noah phrased it so poetically in his talk, “Gravity has no friends.”
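To make the failure mode concrete, here is a toy reconstruction in Python - emphatically not Meehan's actual code (Tale-Spin predates Python by decades), just a sketch of the kind of inference slip described above: the drowning rule quantifies over every entity in the river, and "gravity" is mistakenly modeled as an entity like any other.

```python
def tell_story():
    # The intended characters and their friendships.
    friends = {
        "Henry Ant": {"Bill Bird"},
        "Bill Bird": {"Henry Ant"},
    }
    in_river = set()

    # Henry falls in. The bug: the *cause* of the fall, gravity,
    # is added to the world model as if it were another character.
    in_river.add("Henry Ant")
    in_river.add("gravity")

    # The rescue rule: anyone in the river with a friend is pulled
    # out; anyone without one drowns. Gravity has no friends.
    events = []
    for entity in sorted(in_river):
        if friends.get(entity):
            events.append(f"{entity} was pulled out by a friend.")
        else:
            events.append(f"{entity} drowned.")
    return events
```

Run the sketch and Henry survives while gravity, friendless, drowns - the same absurd conclusion the real system reached, because nothing in the rule distinguishes a character from a physical force.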