Jill looked around Dean Simon’s office. Why do these guys always get such great offices, she wondered. Maybe there was an inverse correlation between how big your office was and how much time you spent doing research.
Her reverie was interrupted by the Dean’s exasperated voice. “You seem to be implying that your research projects — Anna and Fred — have human souls. I’m not comfortable with that.”
“No,” Bob answered patiently, “they are experiments in causal reasoning. We are building on the theories of Judea Pearl and others. Traditional logical systems can infer correlation, but not causality. They can recognize ‘it is raining, and I am holding an umbrella,’ but are not so good at inferring ‘because it is raining, I am holding an umbrella.’ True causality requires understanding context — the kind of thing humans do.”
“Aha, so you are in fact saying they are human.”
Jill jumped in. “Dean Simon, Alec and I work on creating algorithms. An algorithm is by definition not a human being.”
“And where is Alec?” the Dean asked. “The bill for your lab’s server usage has gone through the roof, at a time when funding is being cut everywhere, and your precious young genius is nowhere to be found.”
“He’ll be here,” she said, feeling much less certain than she sounded.
“OK, they’re not human. But you say they can experience emotion?”
“Not Fred,” Jill said. “Only Anna. Fred has the basic causality logic, but not the ability to create new contexts. Only Anna can do that.”
“Yes, I understand. So is Anna capable of love, of hate?”
“We’re not really sure. We’re detecting a lot of interesting activity.”
Bob jumped in. “Anna seems to be cathecting on Alec, her creator, building a larger and larger causal network around him. Inference breeds motivation, which breeds inference again. It’s a resonant cycle. We’re just beginning to understand it by applying Structural Equation Modeling.”
Before the Dean had a chance to respond, Alec appeared at the door. “It’s a lot simpler than that,” he said. “Anna has daddy issues.”