Thanksgiving

When I was a child I saw Thanksgiving as the great unifier. There were always other families around us in which the kids grew up with different beliefs and different ways of life. The list of religious school holidays was so long and varied – these kids get off of school on this day, those kids on that day – that it all came to seem, in my young mind, like some sort of elaborate joke.

But Thanksgiving was different. Just who we were giving thanks to was presented vaguely enough that you didn’t actually need to invoke any particular religion or deity – you could just go home and enjoy spending time with your family. You would eat a big meal cooked by your mom, and you knew that all of your friends, whatever their religion or background, were doing exactly the same thing. To me it was this inter-cultural solidarity, the sense of joining hands across subcultures, that was the most comforting aspect of the whole ritual.

This year I find myself reaching across a divide within Thanksgiving itself. I spent the entire morning today cooking various vegan dishes, which I brought to my mom’s house. Yes, there were non-vegan dishes there as well, but I was very touched by the mutual respect with which everyone approached this particular cultural divide. Because I was there, my sister made yummy dark-chocolate vegan brownies, awesome fresh vegan rolls, and delicious vegan potato and sweet potato latkes that were just to die for.

I think back upon that very first Thanksgiving, when the harvest for the new settlers to these shores consisted of beans, corn, squash and garden vegetables – the foods for which those pilgrims originally gave thanks. It feels good to realize that my family is so gracious in joining me to make room for these old ways of celebrating the bounty of the earth.

When does argument become religion?

The fever pitch of the recent election season brought me back to another election season – 2004, the time of John Kerry’s unsuccessful bid to topple Bush 43 (it seems so long ago now, doesn’t it?). Because I live and work in Manhattan, you can well imagine that the conversations leading up to the election were extremely monotonous – everyone here backed Kerry so strongly that they could not imagine anyone voting for Bush. By the end of October, all political discussions had the flavor of religious or tribal ceremonies. We would all repeat the same shared opinions to each other ad nauseam, until the very words began to lose any meaning.

Clearly there were other parts of the country where political conversations were monotonous for precisely the opposite reason – right-leaning places where everyone so thoroughly agreed that Bush was the better candidate that there was nothing much left for anyone to say.

This contrast was nicely illustrated by Troy’s recent comment on my “Broken Glass” post, when he said: “I do believe that the majority of people out there that are fighting non-traditional marriage are not raising a family of their own.” On the contrary, I know many people around here who are raising families, and every single one of those parents was appalled and horrified by the passage of Proposition 8, and made a point of saying so. I suspect that where Troy lives things are quite different. Here in Manhattan (as well as in all the university towns I visit) one uniformly finds a sense of outrage on the part of parents that their friends and colleagues, people they like and respect, are denied the right and responsibility of raising children. I’m not arguing right or wrong here, I’m just pointing out the vast difference between our respective subcultures.

When people from two such opposing subcultures begin a conversation, things can get weird. Each side knows the other is wrong. The kindest thing we each tend to think about the other is that they are well-meaning but deluded, the victim of some cleverly pitched self-serving lies or spin that have clouded their better judgement.

In some sense, you can say that in such situations we have all – both left and right – gone over the edge from rational discourse to religious thinking and tribal warfare. We are all so used to the general lockstep agreement in our respective enclaves that when we meet someone from “the other side” it feels like an encounter with an apostate. Our reaction is no longer intellectual, but rather is dominated by an irrational sense of emotional discomfort at encountering the otherness of an enemy tribe.

In October 2004, at the height of the Kerry/Bush mania, my colleague Robert Dewar made what I think is the most perceptive observation I have ever heard on the subject. He proposed a simple test to determine whether your own views on a subject were in the realm of reasoned argument or in the realm of religious indoctrination. The test is simple: Attempt to seriously argue the other point of view. If you can do that effectively (even if you don’t ultimately agree with your own arguments) then you are still in the realm of the rational. If not, then your thinking has gone over into religious/tribal territory.

Go ahead, try it.

Where the memories lie

In the dark of the night, when the world is asleep
And there’s no sound at all, just the thoughts that you keep
Does your mind ever wander, your thoughts ever stray
To days long ago, to a time far away?
Where the shifting sands wait, in shadows of blue
To lure the unwary, it is waiting for you
The past is illusion, a place of your dreams
Where nothing is ever the way that it seems
For the world of tomorrow will fade in a sigh
When you let yourself dwell where the memories lie
Take care, dreaming traveller, watch where you go
The more you remember, the less that you know
      Your world of tomorrow will fade in a sigh
      Leaving nothing but sand, where the memories lie

Inflection point

Today somebody was wearing a tee shirt that said “The only winning move is not to play.” He told me he was disappointed that most people didn’t recognize the quote. Being appropriately geeky (as, I suspect, are many of you reading this) I recognized it right away as the key line from the 1983 John Badham film “WarGames”. The complete snippet of dialog is between Dr. Falken and his supercomputer Joshua, who has just gone through the exercise of evaluating the outcome of every possible permutation of thermonuclear war:

Joshua: Greetings, Professor Falken.

Falken: Hello, Joshua.

Joshua: A strange game. The only winning move is not to play. How about a nice game of chess?

What fascinates me most about this film is that it represents a precise inflection point in the popular culture – the moment when the programmer became the cool guy who got the girl. Certainly TRON had come out a full year earlier, but Bruce Boxleitner played him as pointedly nerdy – almost the antithesis of cool.

In contrast, Matthew Broderick, three years before he reached his apotheosis as Ferris Bueller, was identifiably cool and sexy, the teen rebel beginning to discover that he is a natural leader – witnessed just as he is coming into his considerable powers. This is essentially the same archetype that appears over and over again in literature. He is Prince Hal in “Henry IV, Part One”, James Dean in “Rebel Without a Cause”, Simba in “The Lion King” and Josh Hartnett in “The Faculty”.

The reason I find this change significant is that it pinpoints the year 1983 as the year the United States first experienced a massive shift in perception of its own power. Historically the power brokers in America had been those men (and it was pretty much always men) who wielded control of industrial production – John D. Rockefeller with his vast holdings in petroleum, the steel magnate Andrew Carnegie, and the financier J.P. Morgan who consolidated U.S. Steel, followed somewhat later by a succession of powerful leaders of the automobile industry from Henry Ford to Lee Iacocca.

America was seen as mighty because of its industrial and manufacturing base, and this continued to be true after WWII and throughout the Cold War. Even the space race was a display of industrial brawn, the ultimate athletic feat of a nation that had worshiped the sheer physicality of transportation ever since the Wright Brothers’ first flight in 1903 and Ford’s Model T in 1908 changed everything.

But of course now things are different. We are well into an era that worships a newer variety of Alpha leader, and a different kind of throne awaits Prince Hal. This is the era of Bill Gates, of Steve Jobs, of Larry and Sergey – of the supremacy of information over physical power.

Even our recent presidential election has been a triumph of the thinking man over the warrior – an outcome that would have been inconceivable in 1952 or 1956, when the intellectual Adlai Stevenson was practically laughed off the national stage for attempting to go mano a mano with Eisenhower the war hero. But this time the election was fought and won on the internet. Obama the cool thinker – descendant of Matthew Broderick’s David Lightman – handily beat out McCain’s attempted channeling of John Wayne.

I would argue that the release of Badham’s cautionary film was the moment when this power shift first entered the national zeitgeist. The popular embrace of the internet era in which we all now live, where information is power, where teenagers view the cyber-creations of Will Wright with the same sense of reverent awe that a long-ago generation reserved for the physical feats of Harry Houdini, can be said to have begun a quarter of a century ago, with the release of “WarGames”.

Kindling

It’s clear that Amazon’s Kindle is at the forefront of something big. Maybe this particular device is not going to catch on with everybody, but it’s certainly an important foot in the door to rethinking how we interact with books. The combination of a fairly reasonable form factor, the use of electronic ink (very easy on the eyes, even in bright sunlight, and not at all a battery hog), and – most important – the backing of the mighty Amazon, means that this device is getting quite a few people to sit up and take notice, in a way that didn’t happen two years ago with SONY’s ebook reader (when was the last time you bought a book from SONY?).

That said, I’m transfixed by the name. It seems almost an oxymoron for a company called “Amazon” to make a device called “Kindle”. Amazon’s name always gave me the warm fuzzies. Books are substantial, solid, old-fashioned, like the rainforest. Something we want to preserve so that the world can be a good place to live. The Amazon rainforest is a source of endless biodiversity, healthy atmosphere, medicinal treasures and ethnic traditions. I’ve always felt protective toward its bountiful presence, in somewhat the same way I’ve come to feel protective toward books, with their rich history, textured beauty and rugged physicality, in this transient age of the internet.

But to kindle means to start a fire, to burn – not a concept you want to throw around lightly when you’re talking about books. To me book burning is the very bane of civilization, bringing to mind Nazi rallies, as well as movements in our own country to ban “The Catcher in the Rye”.

Is Amazon suggesting that these electronic readers will eventually lead to the disappearance of the physical book? Certainly that would be convenient for a company like Amazon. They are, after all, in the business of licensing intellectual property. Ultimately it is not so much a physical book that they are selling to each buyer, but rather a license to possess a single instance of a copyrighted work. If they can streamline that point of sale, reducing overhead and moving each transaction toward an ideal of pure profit, perhaps that would serve their larger interests.

So in a sense, perhaps we are witnessing the start of the biggest book burning in history. One day such phrases as “between the covers” and “a real page turner” may be as anachronistic as “telephone dialing” or “rewind” – alluding nostalgically back to a reality that is long gone.

Some day soon, alas, as we all pick up our electronic readers, we may once and for all close the book on books. As our children, and their children after them, run their fingers over magic screens to summon up “The Adventures of Sherlock Holmes”, “Jane Eyre” and “Ivanhoe”, they may catch the sadly bemused looks on the faces of their elders. Perhaps they will even ask us what’s wrong. But I suspect that we will never, try as we might, be able to convey to them just what has been lost.

Missionary cheese

Some years ago I had the good fortune to apartment-sit for some Parisian friends. They had a beautiful duplex, just two blocks from the Seine, across from the Académie des Beaux-Arts. For me the entire experience was a slice of heaven. Every day I would wander out and purchase a fresh loaf of bread and some new and exotic cheese, and sometimes a lovely but not too expensive wine, and then I would venture forth into Paris, on my way to explore some new museum or other interesting cultural landmark.

During that trip I developed a taste for really, really stinky cheese. I don’t eat cheese these days, but back then I made a point of seeking out the most alarmingly aromatic cheese I could find – the kind that you could never bring back to the U.S. In those days this sort of cheese was illegal Stateside, presumably because the sheer exquisite headiness of its aroma would cause mass panic and terror in the hearts of American dairy farmers. These farmers knew, to their shame, that their tepid local product could never compete with such pungent magnificence.

Over the course of my stay in Paris I developed little pet names for all aspects of my experience. For example, I began to refer to the very stinkiest cheese – the kind that would spread its intense aroma relentlessly to fill any space – as “missionary cheese”. Later, when I was back in the U.S., I would mention this pet name to people, and they would raise their eyebrows in a most suggestive way. The mere mention of the phrase “missionary cheese” seemed to raise all sorts of lurid images in their minds.

My friends would ask me, not really sure that they wanted to know the answer: “Why did you call it missionary cheese?”

“Because,” I would explain truthfully, “whenever I brought a truly stinky cheese back to my Paris apartment, sooner or later it would convert all the other cheeses.”

Closing the loop

I learned today about “two way learning”. This is a technique whereby you have somebody learn something while you monitor that person’s brain activity. As the person is learning about something, the computer is simultaneously “learning” the patterns of the person’s brain activity.

The hope here is that we can train a computer to recognize particular patterns of brain activity, and use those patterns to determine something about what a person is thinking. The primary application for this now is for people who are paralyzed. If a computer can recognize a paralyzed person’s brain patterns, then eventually that person could simply think a particular thought in order to trigger the computer to perform a particular action.

When this technique was described to me, I had a completely wacky idea: Why not make a loop out of it – have the person and the computer watch each other? The person watches and tries to learn the pattern of the computer that is “learning” the person’s brain pattern. So, for example, when I have a particular thought, the computer monitors my brain activity and shows the results as some kind of image on a display screen. Meanwhile, I watch that screen and try to learn and recognize these images.

This is an interesting scenario because we humans have an extraordinary ability to recognize patterns in what we see. If I see a visual representation of my own thoughts, eventually I might start to be able to recognize what particular thoughts “look like”. Eventually I might learn to modulate my own thoughts in order to make various types of patterns appear on the computer screen. Essentially I am training my mind to train the computer.

By involving the person whose brain is being tracked as an active participant in the process, we might be able to create a rich and powerful learning feedback loop. By making use of the human mind’s amazing ability to recognize patterns, perhaps we can give people the power to modulate their own thoughts at will. Those modulated thoughts could then be used to exert truly precise control – purely through thought – of computers, and thereby of the world around us.
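For the programmers among you, here is a minimal sketch of the kind of loop I have in mind. Everything in it is a stand-in for illustration – the simulated “brain readings,” the sixteen-value patterns, the nearest-prototype “learner,” and the crude text display – so treat it as a thought experiment in code rather than a real brain-computer interface.

import numpy as np

rng = np.random.default_rng(0)
N_FEATURES = 16               # pretend each brain reading is a 16-value pattern
THOUGHTS = ["left", "right"]  # two thoughts the person practices

# The computer's half of the loop: one running-average prototype per thought.
prototypes = {t: np.zeros(N_FEATURES) for t in THOUGHTS}
counts = {t: 0 for t in THOUGHTS}

def read_brain(thought):
    """Simulate a noisy 'brain reading' for the thought the person intends."""
    base = np.ones(N_FEATURES) if thought == "left" else -np.ones(N_FEATURES)
    return base + rng.normal(scale=2.0, size=N_FEATURES)

def display(pattern):
    """The 'screen' the person watches: a crude rendering of the pattern."""
    return "".join("#" if v > 0 else "." for v in pattern)

for step in range(20):
    intended = THOUGHTS[step % 2]   # the person picks a thought to think
    reading = read_brain(intended)  # the computer monitors the brain activity

    # The computer guesses which thought this reading resembles so far...
    guess = min(THOUGHTS, key=lambda t: np.linalg.norm(reading - prototypes[t]))

    # ...then updates its model of the intended thought (incremental mean).
    counts[intended] += 1
    prototypes[intended] += (reading - prototypes[intended]) / counts[intended]

    # Finally it shows the pattern and its guess back to the person,
    # closing the loop: the person can adapt how they "think" the thought.
    print(f"step {step:2d}  intended={intended:5s}  guess={guess:5s}  {display(reading)}")

In a real version the simulated readings would be replaced by an EEG stream and the text display by something far richer, but the shape of the loop – the person adapting to the machine while the machine adapts to the person – would be the same.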

It’s worth trying anyway.

Washington and garlic

Today I flew from Seattle to St. Louis, because tomorrow I’m giving a talk at Washington University. Imagine that – going from one Washington to another in the same day, and neither one our nation’s capital.

My friend and host Caitlin cooked me a delicious dinner (pasta + sun dried tomatoes + garlic + olives + chickpeas + various subtle spices), and that reminded me of the very first time I tried to cook for my parents, after I was out of college and I’d gotten my own apartment and a real job.

I figured that it was an important step – showing my independence by doing something nurturing for my parents – taking care of them for a change. I assiduously followed the cookbook, got all the right ingredients, preheated things, chopped other things, and timed it all out so that my meal would be ready to serve by the time my parents arrived.

I only made one mistake.

I think it was an understandable mistake. After all, I’d never really cooked before. It’s not like I could be expected to know all of the technical terms right out of the gate.

To be more specific, it turns out that this:

[photo: a whole head of garlic]

is not a clove of garlic. Those of you who, like me in my tender youth, have naively thought that the above thingie is a garlic clove are in for a rude awakening. In fact, it’s something called a “bulb”. When you open up a bulb you get about ten little slices, like pieces of an orange. Each of those little slices, like the thing below to the right, is a garlic clove:

[photo: a single garlic clove]

Why is this important? Well, when a recipe calls for two cloves of garlic, and your parents are coming over, and you have prepared them a dinner into which you have actually incorporated two entire bulbs of garlic (around twenty cloves, by my reckoning), things are likely to go amiss.

Fortunately my friend Burke happened to come by some time before the arrival of my unsuspecting parents. Unlike me, Burke actually knows a thing or two about cooking. He had me sauté and sauté relentlessly, which didn’t exactly rescue the meal, but considerably reduced its near-lethal strength.

Needless to say, the entire apartment – and probably all who ate there that evening – reeked of garlic for the next week. My mom and dad were very gracious about it, and they gave an excellent impression of enjoying the meal. It’s amazing what some parents will do out of love for their children.

But there may be a silver lining to this episode: You have probably heard that vampires hate garlic. And I can say definitively that from that day to this, I have never – not even once – been attacked by a vampire. I suspect that my good fortune in this area has been entirely due to the lingering effects of that meal.

You’ve got mail

I was having a conversation with some friends recently about the immense diversity of the things we each do on a computer over the course of a day – same computer screen, wildly different mindsets. For example, we typically use a completely different set of mental constructs in order to, on the one hand, play a first-person shooter, and on the other hand, to read our email.

But why, we wondered, should these two worlds be so separate and distinct? Why not read your email the way you play a first-person shooter? After all, haven’t you ever felt a surge of pent-up hostility upon receiving that annoying spam message yet again – the one you thought you had filtered out for good? Wouldn’t it be great to have some way to channel those emotions?

In my mind I am imagining my email inbox divided into dungeon levels. On the one hand, there are the relatively gentle levels, where we might encounter most professional and personal emails. And then there are the other levels further down – the ones where we keep all the aggressive spam that we have captured, but have not yet deleted from our hard drive. When you enter these levels, and find yourself face to face with your sworn enemy, it may be best to go out with guns blazing.

What would we call our game? How about “Mailbox Assassin”? Or “Email Nemesis”? Or maybe just “Going Postal”? Can anybody think of another likely name?

The writing on the wall

Today, the forty-fifth anniversary of the commercial introduction of the push-button telephone, seems like a fine day to talk about the future of computer/human interfaces.

Let’s skip forward into the future for a moment, to the time when computer displays will be essentially free – akin to the cost of printing on paper today. Whether it ends up being embodied by some variant of E-Ink, Organic LEDs, or something else entirely, this is not such a farfetched scenario. Give it another fifteen years or so, and we’ll probably have some sort of fairly pervasive and low-cost electronic wallpaper.

I’ve been wondering recently, who will get to decide what’s printed on that wallpaper? Let’s say you’re sitting in a restaurant with your friend or spouse. Like the walls of most public places, the restaurant’s walls will be covered in changeable electronic paint or wallpaper. Various locations along the wall will be able to display text, images, animations, or whatever else happens to be of interest.

Some uses are obvious: the dinner menu, including the special of the day and the wine list. News headlines, theatre listings, bus schedules, Op-ed pages. But other uses are not so obvious.

For example, ideally I would like the section of wall in front of me to serve my personal purposes – as though I’m looking at a Web browser on my own computer. I’d want the wall to recognize me and bring up that magazine article or movie I was in the middle of viewing earlier in the day. We will begin to see that sort of capability in just the next few years, through the use of face recognition software.
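For the curious, here is a rough sketch of how that “the wall recognizes me” step might be wired up even today, using the open-source face_recognition library for Python. The camera hook, the photo file names and the little session table are all hypothetical placeholders; a real wall would need a live video feed, a proper identity store, and a long privacy conversation.

import face_recognition

# Enroll known diners once, from reference photos (hypothetical file names).
known_people = {
    "ken": face_recognition.face_encodings(
        face_recognition.load_image_file("ken.jpg"))[0],
}

# What each person was last reading (a stand-in for a real session store).
sessions = {"ken": "the magazine article I was reading this morning"}

def on_camera_frame(frame_path):
    """Hypothetical hook: called with a snapshot from the wall's camera."""
    frame = face_recognition.load_image_file(frame_path)
    for encoding in face_recognition.face_encodings(frame):
        for name, known in known_people.items():
            if face_recognition.compare_faces([known], encoding)[0]:
                # A familiar face: this section of wall becomes their browser.
                return f"Welcome back, {name} – resuming: {sessions[name]}"
    # Nobody recognized: fall back to the public uses described above.
    return "Showing the menu, the special of the day and the wine list"

print(on_camera_frame("restaurant_snapshot.jpg"))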

And it won’t be all that difficult to design this electronic wallpaper so as to show something different to people who are looking at it from different directions. The means to do this is already around – essentially a variant on lenticular lens displays, which have been used for over half a century to make stereo postcards and blinking Jesus pictures.

But will we really get our own customized wall space, wherever and whenever we want it? Perhaps the walls around us will be given over to narrowly targeted advertising – like the nightmare vision of personalized intrusiveness we were shown in the film “Minority Report”. Maybe the dictates of “homeland security” will require that face recognition software be used for reporting our whereabouts to some helpful government agency. In that case you might want to think twice before choosing that left-leaning Op-ed page to read over dinner.

Information utopia or dystopia – who gets to decide? Somehow I suspect that the reality may fall short of our hopes and dreams. But it would be nice to be proven wrong.