Trust Me

I’ve been thinking about a paper by researchers at the University of Zurich, published in the journal Nature, entitled “Oxytocin increases trust in humans.”

Scientists had already demonstrated that the natural chemical oxytocin (it is found, for example, in mother’s milk) plays a key role in the formation of social attachments among non-human mammals. This new study demonstrated that it works the same way in humans.

The researchers placed a volunteer in a room with a computer and asked them to play a trading game with somebody on-line. The way the game was set up, the player could opt for one of two strategies: either “We cooperate and both win” or “I win but you lose”. The first strategy would earn more points, but only if the other player cooperated. Basically, the more you were willing to trust the other player, the more you could win.
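The payoff structure described above is a classic trust/cooperation game. Here’s a minimal sketch of that structure in Python, with invented point values (the post doesn’t give the actual numbers from the study):

```python
# Hypothetical payoff table for the two strategies described above.
# Point values are made up for illustration; the actual experiment
# used monetary transfers between players.

def payoff(my_choice, their_choice):
    """Return (my_points, their_points) for one round.

    Choices: "trust" (we cooperate and both win) or
             "defect" (I win but you lose).
    """
    if my_choice == "trust" and their_choice == "trust":
        return (3, 3)   # mutual cooperation earns the most overall
    if my_choice == "trust" and their_choice == "defect":
        return (0, 5)   # trusting a defector costs you
    if my_choice == "defect" and their_choice == "trust":
        return (5, 0)   # defecting against a truster pays off short-term
    return (1, 1)       # mutual defection: safe but low

# Trusting beats playing it safe -- but only if the other player cooperates:
assert payoff("trust", "trust")[0] > payoff("defect", "defect")[0]
assert payoff("trust", "defect")[0] < payoff("defect", "defect")[0]
```

The key property, mirrored in the assertions, is that trust is the higher-earning strategy only when it is reciprocated, which is exactly the dilemma the oxytocin appeared to tilt.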

The experiment was tried both with and without secretly pumping airborne oxytocin into the room. The researchers reported that subjects were significantly more likely to choose the high-trust strategy when the oxytocin was present.

And here is where it starts to get really interesting. They repeated the experiment, changing only one thing: This time, they told the volunteer that the other player was a computer program. The result? The oxytocin no longer had any effect on player behavior.

This suggests that we have one model in our heads for “human” and a completely different one for “acts like a human but I know it isn’t”. Which has enormous implications for all kinds of evolving media, like computer games, on-line communities, virtual storytelling, The Sims, World of Warcraft, Second Life, Facebook (the list could go on and on), and eventually maybe our kids’ android companions.

We may always feel an emotional chasm, a fundamental lack of true empathy, toward any virtual thing that we know is our artificial creation, no matter how believable it seems, no matter how advanced the technology ever gets.

And that might be a good thing.

3 thoughts on “Trust Me”

  1. the idea of airborne oxytocin freaks me out.

    WoW, Second Life, Facebook, and other communities actually do have humans on the other side — aren’t humans controlling those who are “acting” like humans in those environments? In those examples, the artificial is controlled by the human in real time.

    In those cases, I would argue that “acts like a human but isn’t” doesn’t totally apply here…

  2. Excellent point, Sally! Here’s what I was getting at, in more detail:

    Non-player characters always get surreptitiously introduced into on-line environments at some point, like the fictional “Sarah Tuttle” in Friendster. My view is that the fundamental difference in how we feel toward “human” versus merely “acts like human” will ensure that there will always be enormous societal pressure in social networks to out such fakes, no matter how convincing they are.

    There are some ethical distinctions we don’t care about, but this one is built right into our brains, so crossing that human/non-human line without informing the observer will always be considered a form of fraud.
