I’ve been thinking about a paper by researchers at the University of Zurich, published in the journal Nature, entitled “Oxytocin increases trust in humans.”
Scientists had already demonstrated that the natural chemical oxytocin (found, for example, in mother's milk) plays a key role in forming social attachments among non-human mammals. This new study demonstrated that it works similarly in humans.
The researchers placed a volunteer in a room with a computer and asked them to play a trading game with an anonymous partner on-line. The way the game was set up, the player could opt for one of two strategies: either "We cooperate and both win" or "I win but you lose". The first strategy would earn more points, but only if the other player cooperated. Basically, the more you were willing to trust the other player, the more you could win.
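The incentive structure the players faced resembles a standard economic "trust game". Here is a minimal sketch of that payoff logic; the specific numbers and the function name are illustrative, not the exact stakes used in the study:

```python
def trust_game_payoffs(endowment, sent, multiplier, returned):
    """Payoffs in a simple trust game (illustrative parameters).

    The investor starts with `endowment` points and sends `sent` of
    them to the trustee. The sent amount is multiplied by `multiplier`
    (so trusting grows the total pie), and the trustee then chooses how
    much to send back (`returned`).
    """
    investor = endowment - sent + returned
    trustee = sent * multiplier - returned
    return investor, trustee

# No trust: investor keeps everything, trustee gets nothing.
print(trust_game_payoffs(endowment=10, sent=0, multiplier=3, returned=0))
# Full trust plus a fair return: both end up better off.
print(trust_game_payoffs(endowment=10, sent=10, multiplier=3, returned=15))
```

The point of the structure is that trusting fully and being repaid fairly leaves both players ahead of the no-trust outcome, but only if the other player actually cooperates.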
The experiment was run both with and without oxytocin, which was administered via a nasal spray (each participant received either oxytocin or a placebo, without knowing which). The researchers reported that subjects given oxytocin were significantly more likely to choose the high-trust strategy.
And here is where it starts to get really interesting. They repeated the experiment, changing only one thing: This time, they told the volunteer that the other player was a computer program. The result? The oxytocin no longer had any effect on player behavior.
This suggests that we have one model in our heads for “human” and a completely different one for “acts like a human but I know it isn’t”. Which has enormous implications for all kinds of evolving media, like computer games, on-line communities, virtual storytelling, The Sims, World of Warcraft, Second Life, Facebook (the list could go on and on), and eventually maybe our kids’ android companions.
We may always feel an emotional chasm, a fundamental lack of true empathy, toward any virtual thing that we know is our artificial creation, no matter how believable it seems, no matter how advanced the technology ever gets.
And that might be a good thing.