Patterns

The concept I described yesterday — the difference between those we love and the generic idea we have of those we love — is a kind of pattern. You can transcribe this pattern to many different contexts, and yet it remains surprisingly robust. Othello’s tragedy was linked to his problem in seeing Desdemona — he was too invested in his idea of Desdemona.

Similarly, we often have trouble distinguishing between our leaders — with all of their strengths and foibles — and our idea of a leader. This is one reason we often feel so betrayed when they turn out to have the same human weaknesses that we regularly observe — and tolerate without difficulty — in others.

There are many such patterns in the relationships between people, from the way we tend to see idealized images of those we love to the ways we often reenact our (possibly dysfunctional) childhoods in relationships with friends, lovers and co-workers. It is easy to see how one could build an architecture of human relations from such patterns and the larger patterns that form as they combine.

Which puts me in mind of “A Pattern Language”, the groundbreaking 1977 book of Christopher Alexander and others. That work pulled together many connected ideas about architecture — buildings, gardens, communal spaces, windows, entrances, pathways, and all of the many elements in our built world — to create an entire language for describing good architectural design.

Perhaps we can do the same with patterns of relationships between people. I wonder, would it be possible to build “A Pattern Language” to describe the many pathways, roads and bridges, hidden rooms and secret fortresses that we build upon the human heart?

You are not my heart

I was having a conversation this evening with a friend and she described an odd sensation that can occur, after she has gone away from home for a while (say, on a several month trip to another country). My friend said that sometimes there is a moment, when she first sees her lover after such an absence, in which things can seem a bit strange. She gives her partner a kiss, which is very pleasant, but somehow something seems not right. It’s as though this person she knows so well has become somehow vaguely unfamiliar.

She went on to say that this experience is extremely transitory. After a few moments, things tend to snap back into place, and the old feeling of familiarity returns. Yet she wonders at that moment of temporary weirdness.

I shared with her my theory: That she is seeing the difference between her lover and “her lover”. They seem to be one and the same, but they are not. The former is a flesh and blood person, whereas the latter is a concept in one’s head.

Since childhood, we each harbor some notion of romantic attachment. When we are ten years old, this might go no further than holding hands. Even then, the underlying emotions are all there, albeit in a nascent state. One day we meet someone, and these emotions become projected onto that individual. The person becomes our lover, and the traits of the real person become merged with the concept of “my lover” we’ve already been carrying around in our head.

After a long absence, our mental model of our lover can drift, perhaps just a little bit. We don’t have the actual person with us, so we fall back upon the idea of that person. Some part of our mind recalls not the individual, but the concept we’ve been carrying around with us since childhood.

This can lead to the odd moment or two. Fortunately, the sensation is fleeting, which makes sense. After all, the entire purpose of the lover we’ve carried within our heart since childhood is to prepare us for the real person — the unique individual who has won our heart.

Serious games

Today I was in a discussion with a group of people who study games from a literary perspective. The question was floated as to why people play games, since they are not useful.

Now, my perspective on things in general is that if something is highly pleasurable, then it is biologically connected to something that has had survival value for our species. We derive great pleasure from the taste of food because eating is essential to our survival. The same goes for sex, and for our feelings about our children. The greater the inherent survival value of a particular behavior, the greater is the associated pleasure.

Of course this can all be subverted. The pleasure principle associated with consuming food, when it runs amok, can kill you, and sexual appetites can become highly self-destructive. The point is not that things can’t go wrong, but rather that the feeling of pleasure wouldn’t even exist in the first place, unless there were some underlying biological trait that had been selected for over an evolutionary time scale.

Which is why today’s conversation reminded me of an experience I had in the Seattle Zoo about ten years ago. I was in the part of the zoo where they keep the baby animals. In one enclosed yard were the young of some species of exotic deer. The males of this species have antlers when they are adults, but the males in this particular group just had stubby little knobs, where antlers were yet to grow.

In the middle of the yard was a big rock, about the size of a smallish table. I was surprised to see that the male baby deer had organized themselves to play a game around this rock. Half the deer lined up on one side of the rock, and the other half lined up on the other side. All of a sudden the two deer at the front of their respective queues would run up, leap onto the rock, and butt heads with each other. Then these two deer would each jump off the rock, circle back around, and get in the back of their respective lines.

For as long as I watched, the male baby deer continued to play this game. They never seemed to tire of it, and they were all clearly having a great time.

In that moment I realized that “fun” for the young of a species (including humans) comes from activities that exercise and develop skills that will become not only useful, but deadly serious, once the individual grows up. Young deer butt heads, and young humans play with dolls, play fireman, and play soldiers.

I am fairly confident that we are drawn to games (not only as children but also as adults) because they exercise and develop skills that allow us to function better. We can’t always recognize or identify what those specific skills are, but if the game is fun, that’s a very good indicator that important skill building is taking place.

In this very fundamental sense, all games are serious games.

Community

Today I paid an all-day visit to an academic department at a nearby university. Gave a talk, met with students and faculty, and generally got a sense of the place.

There is something ineffably delightful about encountering an entire community for the first time. It’s not so much the physical buildings and rooms, although of course these make their own impression. It’s more a feeling you get from the people, the way they fit together with each other, the way you can sense each individual’s particular sense of belonging, of being part of a tribe, of having a shared purpose.

To the students I suppose I was a bit of an exotic animal, a visitor they’d heard of only by reputation. It took a while to break through the formalities, to get to the point where we got past roles and were simply sharing great conversation. Around the time I found myself and a group of students engaged in a spirited debate about the role of circular causality in the invention of the neural-net processor in the Terminator movies, I realized everything was going to be ok.

I love the way people naturally form themselves into little communities, the way a shared sense of identity — and a shared sense of pride — can emerge from the pooled energies of disparate individuals. This lovely ability we have, to weave a communal tapestry from the threads of our respective unique individual selves, is just about one of the most delightful of human traits.

World of Goop

I was fascinated — and delighted — that no sooner did I post a poem yesterday that expressed a mood of gloom than a comment appeared that rewrote the poem ever so slightly, so that it expressed a mood of hope. This act of rewriting was an implicit assertion that my original poem (or any original work) was merely one fixed point in a universe of potential creations.

Intrigued by this manifesto of remix, I then wrote a third poem as an answering comment, which expressed yet another mood, which got me thinking how the space of potential works can in some ways be more interesting than the written canon.

Coincidentally, an article in today’s New York Times featured an examination of the trend toward the use of deliberate and unapologetic appropriation in literature. The article even quotes James Joyce’s memorable line “I am quite content to go down to posterity as a scissors and paste man.”

There is a tension here, of course, between those who see appropriation as an aesthetic right, and those who claim ownership over their original works, viewing unauthorized appropriation as theft of property. Of course there are powerful arguments on both sides. As with most interesting debates, god is in the details.

But suppose content creators were to fling open the doors. Suppose we started designing literature to be a target for remix and appropriation, from the ground up. We could, in fact, develop software that would enable this process. Suppose my goal was to write not a single original poem, but rather a procedural universe of poems for you to use — a kind of “generative oracle of poetry” (Goop). Readers could request different moods, and out of the Goop would emerge variants of the core poetic idea that expressed corresponding shades of emotion.
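A minimal sketch of what such a Goop might look like in code. Everything here — the template, the mood lexicons, the particular word choices — is invented purely for illustration; a real poetic oracle would of course need far richer generative machinery:

```python
import random

# A toy "generative oracle of poetry" (Goop): one core poetic idea,
# rendered in different moods by swapping mood-dependent word choices.
# The template and lexicons below are hypothetical illustrations.

TEMPLATE = "The {adj} {noun} of evening {verb} across the sky"

LEXICON = {
    "gloom": {"adj": ["fading", "ashen"],
              "noun": ["light", "silence"],
              "verb": ["dies", "drowns"]},
    "hope":  {"adj": ["golden", "gentle"],
              "noun": ["light", "promise"],
              "verb": ["rises", "blooms"]},
}

def goop(mood, seed=None):
    """Return one variant of the core poem in the requested mood."""
    rng = random.Random(seed)  # seed makes a given variant reproducible
    words = LEXICON[mood]
    return TEMPLATE.format(
        **{slot: rng.choice(choices) for slot, choices in words.items()})

print(goop("gloom", seed=1))
print(goop("hope", seed=1))
```

The point of the sketch is that the “original work” is no longer any single output line, but the template plus its lexicons — exactly the inversion the OuLiPo writers had in mind.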

The concept of procedural literature is certainly not new. The OuLiPo movement — founded in 1960 by Raymond Queneau and François Le Lionnais — treats any given creative work as merely one instance of a set of generative rules and constraints for creating potential literature. In this view, the true original work lies in the creation of this underlying set of rules and constraints.

But I think such ideas will continue to be of only limited interest if such poetic oracles are one-offs, with each writer’s work existing in its own isolated universe. Suppose there were a coherent OuLiPo universe, in which many Goops were naturally linked. My generative creation could deliberately incorporate the generative power of yours, so that anyone who sought to pull out a customized result from my poetic musings would find echoes of your muse nestled within.

This way of doing things is very familiar to software designers. Generally speaking, each of us does not implement our own version of a high dimensional matrix inverter, or Voronoi diagram builder, or 3D physics simulator (unless we are doing it for practice and self-education, as one might, say, build a cigar box banjo as a craft exercise).

Perhaps it is time, given the rapidly increasing power and accessibility of computers, to apply this ethos of shared OuLiPo communities to prose and poetry. Collaborative building and sharing of libraries for algorithmic expression have become a mainstay of scientific progress. Why shouldn’t the arts community benefit from such twenty-first-century tools?

Dreams

The dreams you never know you dream
Still echo in the day
The inner eye, the unseen hand
Will lead you far away

Within the dark of night are found
Those shadows of your soul
That hide within a secret place
Until they take their toll

Voices whisper in the wind
And seeds of fate are sown
Where all the dreams that you forgot
Are waiting to be known

Faces, continued

The discussion yesterday was so interesting that I thought it would be a good idea to spend another day on this topic. When it comes to the question of ubiquitous face recognition (i.e., technological assistance that lets you recognize the face of anyone, anywhere, anytime), I think there are two issues that somehow got entangled in the comments.

One is the issue of prosthesis, and the other is the issue of privacy. When talking about computer-assisted face recognition as a prosthetic, it is useful to note that there is nothing even remotely unnatural about such a technology. In fact, humans have rather sophisticated machinery within our brains that allows us to recognize faces (it turns out that reasoning power alone is insufficient — to identify the face of another person, you actually need that dedicated hardware you’ve got in your cerebrum).

So using software to enhance facial recognition is no more unnatural than contact lenses, hearing aids or prosthetic shoes. In each case, a natural human ability is being augmented through technology.

Privacy only becomes a concern when we ask what becomes of those images we are all taking with our digital cameras. And here is where it might become useful to push the discussion a bit further into the future.

Let’s skip ahead another thirty years or so. In 2040, the iPhone has become a relic of the past, interesting only as an arcane cultural artifact, like a TI-99 might be in the year 2010. We all have implants in our corneas that allow our eyes to see whatever cybervisions we wish, all networked wirelessly at very high data rates.

In the world of 2040, it is a given that your eyes have automatic computer-assist for recognizing faces, and many other things besides. Your cyber-enhanced eyes and ears are gathering data all the time, and your implanted personal CPUs are continually sifting through that data, for things that you personally would find of interest.

But here’s the rub: The moment that data leaves your body, there is an issue. The fact that whatever you see can instantly be transmitted to the world has all of a sudden become somebody else’s business — the business of the person you happen to be looking at.

I predict that a body of legal rulings will eventually be built around the question of what allowable limits there may be on my right to surreptitiously broadcast what I see with my own eyes. And those rulings will not decide in favor of unrestricted, unconditional rebroadcast of captured reality.

If we keep this scenario in mind, the issues become clear: I should have unlimited right to use iPhone-based face recognition as a prosthetic — i.e., for my own use in recognizing faces.

But once I start broadcasting my captured images to the world, identifying exactly who was where and when they were there, then I’ve crossed the line into potential invasion of privacy. That’s when the legal questions will start — and where the fuzzy line of what society finds acceptable will eventually become defined not through technology, but through case law.

There’s a face for that

It may not yet be common knowledge, but the technology already exists to figure out who a person is from a photo of that person. In particular, if you have a database of images of people’s faces, each tagged with the name of that person, then there are fairly reliable algorithms that can identify any one of those people from a new photo.
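The core of such algorithms can be sketched abstractly: map each tagged photo to a numeric feature vector (an “embedding”), then identify a new photo by finding the nearest tagged vector. The three-number embeddings and names below are toy stand-ins — a real system would compute high-dimensional features with a trained face-recognition model:

```python
import math

# Sketch of tag-database face identification. Each known person is stored
# as a feature vector plus a name tag; a new photo is identified by
# finding the nearest stored vector, if it is close enough to trust.

def distance(a, b):
    """Euclidean distance between two embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(query, database, threshold=1.0):
    """Return the tagged name nearest to the query embedding,
    or None if no stored face is within the match threshold."""
    name, best = min(database.items(),
                     key=lambda item: distance(item[1], query))
    return name if distance(best, query) <= threshold else None

# Toy tagged database: name -> embedding computed from a labeled photo.
db = {"Alice": [0.9, 0.1, 0.3],
      "Bob":   [0.2, 0.8, 0.5]}

print(identify([0.85, 0.15, 0.35], db))  # prints Alice
print(identify([10, 10, 10], db))        # prints None: no close match
```

The threshold is what keeps such a system honest: without it, every stranger would be “identified” as whoever in the database happened to look least unlike them.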

Meanwhile, millions of people are walking around carrying iPhones. There is already a culture in place whereby people use their iPhone to surreptitiously take photos of other people (you pretend you’re reading something on your screen, when in reality you are aiming your iPhone and clicking the shutter).

Logically, you would think people would use an iPhone App to tell them who that guy or gal is at a party, or professional conference, or gallery opening. I mean, these are exactly the situations in which you are dealing with a known group of people, for whom tagged photos are likely to already exist.

I confess I’m one of those people who “knows” hundreds of people (at least) from conferences and other professional situations — if by “know” we mean that I recognize their faces and realize that I’ve spoken with them before, and have probably even shared a beer or two with them at one time. But I could not even begin to connect most of those familiar faces to their respective names, let alone to their professional affiliations.

Yes, I know that conferences hand out badges to attendees. But any conference attendee knows the limitations of that technology. Half the time people have their badge flipped around backwards, and during the evening parties (which is when you really get to talk with people), more than half have ditched their badges altogether — and the people who don’t wear nerdy badges at parties are probably just the people you’d rather talk to.

And of course you won’t generally find people wearing name tags at purely social gatherings or downtown performance events or art gallery openings.

It’s not just a question of name recognition. Your iPhone (which is, after all, a network appliance) could actually tell you something useful about that person — like the fact that they are working on precisely the research problem for which you’ve been seeking an expert. Or — in a slightly more sophisticated version — that they have just put a posting on Craig’s List to unload exactly that model of used netbook you’ve been desperate to find (and which they might be carrying with them right now). You get the idea.

I did a little searching around the Web. There are indeed some iPhone Apps out there that do face recognition. One of them even integrates with Facebook pictures. And yet I’ve never seen anyone use one.

Why is that?

Robots in your house

Today I was having a wonderful conversation with my friend Heather (who makes robots) about robots in your house. I don’t mean thermostats and dishwashers and automatic garage door openers and all of those other practical robots that have been busily keeping your life in order for years while you weren’t looking.

I’m talking about Rosie from “The Jetsons”. I’m talking about R2D2 and C3PO from “Star Wars”, Robbie from “Forbidden Planet”, Huey, Dewey and Louie from “Silent Running”. In your house, making your bed, greeting you at the door, cooking your favorite meal or just hanging out. Maybe a bit like a beloved dog, except this one can play a mean game of chess.

This is, of course, well-trodden territory. Asimov’s “Robot” series laid it all out for us decades ago. But in our culture something always seems to go wrong — eventually it all turns into Karel Čapek’s metaphor about repressed workers, and then things get bad. Somehow the robots figure out a way around Asimov’s three laws of robotics that are supposed to guarantee no harm to humans. Or Cylons spin out of control and start hunting us down. Or the Borg get really creative with used radio parts from Canal Street and end up looking like Maker Faire in hell.

This isn’t the case in Japan by the way. They love their robots, and every Japanese kid’s fondest wish is to have his or her own electromechanical friend that truly understands them, is up for going on adventures, and fighting bad guys. And not just kids. Grownup Japanese people want one too.

So what’s up with us? Why do our robots turn into scary monsters?

Heather was quoted today in a New York Times article with a very plausible explanation:

“The Japanese have always been more comfortable with it, but particularly in the West, there’s this whole Frankenstein thing that if we try to make something in the image of man, to make a new creature, we’re stealing the role of God, and it’s going to turn out wrong because that’s not our role.” – Heather Knight, quoted in The New York Times, Feb 24, 2010

I completely concur. Our entire Judeo-Christian tradition tells us that we’re not supposed to create life (other than through the usual, um, channels). This all goes back a lot further than Mary Shelley’s Frankenstein. The myth of the Golem and similar ancient European tales have shared this cautionary theme. Use your human intelligence to make other intelligent creatures, and you’re screwed (notice, by the way, how the Roomba has cleverly slipped in under our paranoia radar by channeling the whole “inoffensive pet” thing).

I suspect our cultural robo-paranoia goes back even further. It’s a close cousin of the Greek notion of hubris. Prometheus gives us fire, and the gods punish him for handing out one of their own divine powers like a party favor. Icarus, delighted by his newfound god-like ability to fly, forgets he’s not really a god. And you know what happens next.

So why did this particular flavor of technophobia emerge in the West, but not in the East? I’d love to hear any theories.

Expository writing

The subject of education reform brought me back to the single most useful class I took in high school (with the arguable exception of touch typing). Mr. Merkin’s expository writing class in the first semester of my senior year was quite unlike any other class I’ve ever attended.

The idea was simple. After a brief introductory lecture, we would be given a short story or essay to read, and each student would then write a one page essay, in pen, about what we had just read. Mr. Merkin would gather up our papers at the end of the class.

The next time we met he would hand back our papers, all marked up with red ink. Of course he would make grammatical corrections, but the more interesting corrections were structural — showing where our argument was veering off-target, where we had used a misleading metaphor, or pointing out an inadequate introduction or conclusion.

After we’d had time to absorb the returned paper, he would give us a lecture about some aspect of expository writing — the need to avoid overly lengthy descriptions, the structural trinity of introduction / exposition / conclusion, or the uses of a catchy lead-in.

Then we’d get another short story or essay to read, and we’d each write another one page expository essay. This would happen every class — three times a week. By the end of the semester, each of us had written dozens of short essays.

I distinctly recall that at the start of that semester I could not write worth a damn, and by the end of the semester I could. All I’d really needed was a set of short manageable goals, knowing that someone I respected was watching, and continual practice with good feedback at every step.

In my own teaching I have emulated Mr. Merkin’s methods. I never bother with exams — only homework. I give a homework assignment every week, always due before class the next week, and all of the assignments are learn-by-doing (mostly short programming assignments). I try to make each assignment self-contained, so that doing that assignment gives each student a sense of satisfaction and accomplishment.

I also try to make sure to structure the assignments so that every student is expected to add their own personal aesthetic spin to their work. Not only does this allow each student to express his or her individuality, but it also makes it essentially impossible to cheat.

It took me a while to realize that I was channeling my old high school teacher. Once I did, I felt an enormous sense of delight. My students tend to really enjoy these classes, and they seem to derive a great deal of pride and sense of ownership from their work. I learned from Mr. Merkin that a teacher’s job is to provide a properly structured ladder for each student to climb, step by step, by virtue of their own efforts.

When the ladder is designed properly, the student will generally succeed in climbing all the way to the top. The view they get, looking back over their own accomplishments at the end of the semester, is magnificent.