Readers and writers

Speaking of being surprised … not all that long ago I was at a National Science Foundation workshop with about fifty educators, mostly from universities around the U.S. In one of the talks the speaker asked who among us uses the Wikipedia. Not surprisingly, about fifty hands went up. As we all know, everyone uses the Wikipedia.

But then the speaker asked who has edited an article in the Wikipedia. I put my hand up, not really thinking about it. And then after a moment I realized that only two of us in the room had our hands up. She and I looked at each other – we, apparently, were unusual.

So what’s up with this? Why is it that in a room of highly educated people – people who teach our college and graduate students – almost none have ever edited an article on Wikipedia, a resource that they themselves use every day? I would have asked them directly, but I couldn’t figure out a suitably non-confrontational way to frame the question.

It seems to me like the most natural thing in the world: If I happen to see an error I fix it, just to make it a little better for the next reader. It’s very easy to do, and it seems like the right way to treat a common resource.

Do I just have a fundamentally different view of the Wikipedia than most people?


In his comment on my Nov 28 post, Ross made the following sensible suggestion:

I boil it down to one question: can you list three cogent pros/cons for each candidate? If the answer is “no”, then you’ve fed for too long at the trough of Hannity or Olbermann. If the answer is “yes”, then let’s talk.

After reading this, I discussed it with somebody I know who tends to be quite levelheaded, a brilliant man whose opinions I generally respect quite a lot. He seemed to agree with Ross’s suggestion, until I said “for example, it would be interesting to try to come up with three cogent reasons why John McCain might have been a good candidate to vote for.” In about ten seconds, my conversant went into what can only be described as a controlled rage, incensed at the very suggestion that there could be a “cogent argument” for McCain, and clearly quite annoyed at me for proposing such a thing.

I tried to tell him that if you’re going to sway people who are on the fence on an issue, you need to understand what parts of the opposing arguments they are buying into, at least well enough to counter those arguments. But he was no longer listening. Within about thirty seconds he had angrily fled the room.

I was surprised, to say the least. I am sticking to my guns on this one – I might strongly disagree with the opinions of one hundred million of my fellow citizens, but I’m not willing to simply – or dismissively – label them all as deluded idiots. Some of these people are thoughtful, intelligent individuals, however much I may disagree with them. I think I need to understand how they reached their conclusions, even if only to understand my own conclusions with greater clarity and perspective.

Are there really so few of us who are willing to reach across the aisle?

Falling expectations

This being Thanksgiving weekend, tradition dictated that we take my nephews to a truly silly action movie. The film du jour is the new Bond flick – “Quantum of Solace”. It’s mostly an excuse to see a gun-wielding Daniel Craig, as well as an army of stunt doubles and computer-graphic stand-ins, run, jump, leap from burning building to speeding boat to flying plane to swinging girder to whatever fast-moving object looks really cool in the shot. Nothing else in the movie really matters, but then, nothing else in the movie is really supposed to matter. Yes, of course various bad guys and beautiful women get killed and slept with (actually, only the women get slept with – the bad guys just get killed), but that’s all just a kind of background window dressing for the real action: watching James Bond do all these amazing feats of running, leaping, etc., while somehow managing to not drop his gun.

It all worked splendidly for my nephews, and for me as well, except for one place in the movie. There was a scene where James and his beautiful yet mysterious lady of the moment are falling out of an airplane, one parachute pack between them. There’s a tense moment when they try to reach each other while plunging through the air, and then – just in time – they come together, the chute opens, and they land without getting smashed like bugs.

The problem for me is that, unlike just about everybody in the targeted audience, I’ve actually been skydiving. So unfortunately I know first-hand that if you’re not around ten thousand feet up in the air when you pull the cord to open the chute, you’re going to get squashed like a bug anyway. In the movie they were about twelve feet in the air when the chute opened. The odd thing about this for me was the realization that if I had not actually ever been skydiving, this entire sequence would have worked perfectly for me. I wouldn’t have given this flagrant violation of the known laws of physics a second thought. “It’s James Bond,” I probably would have told myself. “Of course he can land safely in a parachute that has just opened a mere twelve feet off the ground.”
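For the curious, a quick back-of-envelope calculation shows just how flagrant the violation is. All the numbers below are round figures I’m assuming for illustration – terminal velocity of a falling body, a survivable landing speed, and a generous constant deceleration once the chute opens:

```python
# Rough sketch: how much altitude does a parachute need to slow a
# free-falling skydiver to a survivable landing speed? All values
# are assumed round figures, not precise aerodynamics.

G = 9.8            # gravitational acceleration, m/s^2
v_fall = 55.0      # terminal velocity in free fall, m/s (~120 mph, assumed)
v_land = 5.0       # survivable landing speed, m/s (assumed)
decel = 3 * G      # constant deceleration under canopy, m/s^2 (generous)

# Constant-deceleration kinematics:
# v_land^2 = v_fall^2 - 2 * decel * d  =>  d = (v_fall^2 - v_land^2) / (2 * decel)
d = (v_fall ** 2 - v_land ** 2) / (2 * decel)

print(round(d), "meters")  # → roughly 51 meters, versus the movie's twelve feet (~3.7 m)
```

Even with an instantly opening chute and a punishing three-g deceleration, you would need on the order of fifty meters of altitude – more than ten times what the movie allows.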

And of course that realization calls into question all the other parts of the fantasy – the jumping in and out of speeding boats, the falling onto cars from the top of a building, the getting blown out of a fiery bad-guy hotel just as it’s about to explode. Maybe even James Bond couldn’t do such things, I start to wonder, and I start to feel the entire edifice of my willing suspension of disbelief beginning to crumble.

And then I remember the most important thing that makes it all ok: It’s only a movie.


When I was a child I saw Thanksgiving as the great unifier. There were always other families around us in which the kids grew up with different beliefs and different ways of life. The list of religious school holidays was so long and varied – these kids get off of school on this day, those kids on that day – that it all came to seem, in my young mind, like some sort of elaborate joke.

But Thanksgiving was different. Just who we were giving thanks to was presented vaguely enough that you didn’t actually need to invoke any particular religion or deity – you could just go home and enjoy spending time with your family. You would eat a big meal cooked by your mom, and you knew that all of your friends, whatever their religion or background, were doing exactly the same thing. To me it was this inter-cultural solidarity, the sense of joining hands across subcultures, that was the most comforting aspect of the whole ritual.

This year I find myself reaching across a divide within Thanksgiving itself. I spent the entire morning today cooking various vegan dishes, which I brought to my mom’s house. Yes, there were non-vegan dishes there as well, but I was very touched by the mutual respect with which everyone approached this particular cultural divide. Because I was there, my sister made yummy dark-chocolate vegan brownies, awesome fresh vegan rolls, and delicious vegan potato and sweet potato latkes that were just to die for.

I think back upon that very first Thanksgiving, when the harvest for the new settlers to these shores consisted of beans, corn, squash and garden vegetables – the foods for which those pilgrims originally gave thanks. It feels good to realize that my family is so gracious in joining me to make room for these old ways of celebrating the bounty of the earth.

When does argument become religion?

The fever pitch of the recent election season brought me back to another election season – 2004, the time of John Kerry’s unsuccessful bid to topple Bush 43 (it seems so long ago now, doesn’t it?). Because I live and work in Manhattan, you can well imagine that the conversations leading up to the election were extremely monotonous – everyone here backed Kerry so strongly that they could not imagine anyone voting for Bush. By the end of October, all political discussions had the flavor of religious or tribal ceremonies. We would all repeat the same shared opinions to each other ad nauseam, until the very words began to lose any meaning.

Clearly there were other parts of the country where political conversations were monotonous for precisely the opposite reason – right-leaning places where everyone so thoroughly agreed that Bush was the better candidate that there was nothing much left for anyone to say.

This contrast was nicely illustrated by Troy’s recent comment on my “Broken Glass” post, when he said: “I do believe that the majority of people out there that are fighting non-traditional marriage are not raising a family of their own.” On the contrary, I know many people around here who are raising families, and every single one of those parents was appalled and horrified by the passage of Proposition 8, and made a point of saying so. I suspect that where Troy lives things are quite different. Here in Manhattan (as well as in all the university towns I visit) one uniformly finds a sense of outrage on the part of parents that their friends and colleagues, people they like and respect, are denied the right and responsibility of raising children. I’m not arguing right or wrong here, I’m just pointing out the vast difference between our respective subcultures.

When people from two such opposing subcultures begin a conversation, things can get weird. Each side knows the other is wrong. The kindest thing we each tend to think about the other is that they are well-meaning but deluded, the victim of some cleverly pitched self-serving lies or spin that have clouded their better judgement.

In some sense, you can say that in such situations we have all – both left and right – gone over the edge from rational discourse to religious thinking and tribal warfare. We are all so used to the general lockstep agreement in our respective enclaves that when we meet someone from “the other side” it feels like an encounter with an apostate. Our reaction is no longer intellectual, but rather is dominated by an irrational sense of emotional discomfort at encountering the otherness of an enemy tribe.

In October 2004, at the height of the Kerry/Bush mania, my colleague Robert Dewar made what I think is the most perceptive observation I have ever heard on the subject. He proposed a simple test to determine whether your own views on a subject were in the realm of reasoned argument or in the realm of religious indoctrination. The test is simple: Attempt to seriously argue the other point of view. If you can do that effectively (even if you don’t ultimately agree with your own arguments) then you are still in the realm of the rational. If not, then your thinking has gone over into religious/tribal territory.

Go ahead, try it.

Where the memories lie

In the dark of the night, when the world is asleep
And there’s no sound at all, just the thoughts that you keep
Does your mind ever wander, your thoughts ever stray
To days long ago, to a time far away?
Where the shifting sands wait, in shadows of blue
To lure the unwary, it is waiting for you
The past is illusion, a place of your dreams
Where nothing is ever the way that it seems
For the world of tomorrow will fade in a sigh
When you let yourself dwell where the memories lie
Take care, dreaming traveller, watch where you go
The more you remember, the less that you know
      Your world of tomorrow will fade in a sigh
      Leaving nothing but sand, where the memories lie

Inflection point

Today somebody was wearing a tee shirt that said “The only winning move is not to play.” He told me he was disappointed that most people didn’t recognize the quote. Being appropriately geeky (as, I suspect, are many of you reading this) I recognized it right away as the key line from the 1983 John Badham film “WarGames”. The complete snippet of dialog is between Dr. Falken and his supercomputer Joshua, who has just gone through the exercise of evaluating the outcome of every possible permutation of thermonuclear war:

Joshua: Greetings, Professor Falken.

Falken: Hello, Joshua.

Joshua: A strange game. The only winning move is not to play. How about a nice game of chess?

What fascinates me most about this film is that it represents a precise inflection point in the popular culture – the moment when the programmer became the cool guy who got the girl. Certainly TRON had come out a full year earlier, but Bruce Boxleitner played him as pointedly nerdy – almost the antithesis of cool.

In contrast, Matthew Broderick, three years before he reached his apotheosis as Ferris Bueller, was identifiably cool and sexy, the teen rebel beginning to discover that he is a natural leader – witnessed just as he is coming into his considerable powers. This is essentially the same archetype that appears over and over again in literature. He is Prince Hal in “Henry IV, Part One”, James Dean in “Rebel Without a Cause”, Simba in “The Lion King” and Josh Hartnett in “The Faculty”.

The reason I find this change significant is that it pinpoints the year 1983 as the year the United States first experienced a massive shift in perception of its own power. Historically the power brokers in America had been those men (and it was pretty much always men) who wielded control of industrial production – John D. Rockefeller with his vast holdings in petroleum, the financier J.P. Morgan and the steel magnate Andrew Carnegie, followed somewhat later by a succession of powerful leaders of the automobile industry from Henry Ford to Lee Iacocca.

America was seen as mighty because of its industrial and manufacturing base, and this continued to be true after WWII and throughout the Cold War. Even the space race was a display of industrial brawn, the ultimate athletic feat of a nation that had worshiped the sheer physicality of transportation since the Wright Brothers and Ford had changed everything in 1908.

But of course now things are different. We are well into an era that worships a newer variety of Alpha leader, and a different kind of throne awaits Prince Hal. This is the era of Bill Gates, of Steve Jobs, of Larry and Sergey – of the supremacy of information over physical power.

Even our recent presidential election has been a triumph of the thinking man over the warrior – an outcome that would have been inconceivable in 1952 or 1956, when the intellectual Adlai Stevenson was practically laughed off the national stage when he attempted to go mano a mano with Eisenhower the war hero. But this time the election was fought and won on the internet. Obama the cool thinker – descendant of Matthew Broderick’s David Lightman – handily beat out McCain’s attempted channeling of John Wayne.

I would argue that the release of Badham’s cautionary film was the moment when this power shift first entered the national zeitgeist. The popular embrace of the internet era in which we all now live, where information is power, where teenagers view the cyber-creations of Will Wright with the same sense of reverent awe that a long-ago generation reserved for the physical feats of Harry Houdini, can be said to have begun a quarter of a century ago, with the release of “WarGames”.


It’s clear that Amazon’s Kindle is at the forefront of something big. Maybe this particular device is not going to catch on with everybody, but it’s certainly an important foot in the door to rethinking how we interact with books. The combination of a fairly reasonable form factor, the use of electronic ink (very easy on the eyes, even in bright sunlight, and not at all a battery hog), and – most important – the backing of the mighty Amazon, means that this device is getting quite a few people to sit up and take notice, in a way that didn’t happen two years ago with SONY’s ebook reader (when was the last time you bought a book from SONY?).

That said, I’m transfixed by the name. It seems almost an oxymoron for a company called “Amazon” to make a device called “Kindle”. Amazon’s name always gave me the warm fuzzies. Books are substantial, solid, old-fashioned, like the rainforest. Something we want to preserve so that the world can be a good place to live. The Amazon rainforest is a source of endless biodiversity, healthy atmosphere, medicinal treasures and ethnic traditions. I’ve always felt protective toward its bountiful presence, in somewhat the same way I’ve come to feel protective toward books, with their rich history, textured beauty and rugged physicality, in this transient age of the internet.

But to kindle means to start a fire, to burn – not a concept you want to throw around lightly when you’re talking about books. To me book burning is the very bane of civilization, bringing to mind Nazi rallies, as well as movements in our own country to ban “The Catcher in the Rye”.

Is Amazon suggesting that these electronic readers will eventually lead to the disappearance of the physical book? Certainly that would be convenient for a company like Amazon. They are, after all, in the business of licensing intellectual property. Ultimately it is not so much a physical book that they are selling to each buyer, but rather a license to possess a single instance of a copyrighted work. If they can streamline that point of sale, reducing overhead and moving each transaction toward an ideal of pure profit, perhaps that would serve their larger interests.

So in a sense, perhaps we are witnessing the start of the biggest book burning in history. One day such phrases as “between the covers” and “a real page turner” may be as euphemistic as “telephone dialing” or “rewind” – alluding nostalgically back to a Victorian reality that is long gone.

Some day soon, alas, as we all pick up our electronic readers, we may once and for all close the book on books. As our children, and their children after them, run their fingers over magic screens to summon up “The Adventures of Sherlock Holmes”, “Jane Eyre” and “Ivanhoe”, they may catch the sadly bemused looks on the faces of their elders. Perhaps they will even ask us what’s wrong. But I suspect that we will never, try as we might, be able to convey to them just what has been lost.

Missionary cheese

Some years ago I had the good fortune to apartment-sit for some Parisian friends. They had a beautiful duplex, just two blocks from the Seine, across from the Académie des Beaux-Arts. For me the entire experience was a slice of heaven. Every day I would wander out and purchase a fresh loaf of bread and some new and exotic cheese, and sometimes a lovely but not too expensive wine, and then I would venture forth into Paris, on my way to explore some new museum or other interesting cultural landmark.

During that trip I developed a taste for really really stinky cheese. I don’t eat cheese these days, but back then I liked to make a point of finding the most alarmingly aromatic cheese I could find – the kind that you could never bring back to the U.S. In those days this sort of cheese was illegal Stateside, presumably because the sheer exquisite headiness of its aroma would cause mass panic and terror in the hearts of American dairy farmers. These farmers knew, to their shame, that their tepid local product could never compete with such pungent magnificence.

Over the course of my stay in Paris I developed little pet names for all aspects of my experience. For example, I began to refer to the very stinkiest cheese – the kind that would spread its intense aroma relentlessly to fill any space – as “missionary cheese”. Later, when I was back in the U.S., I would mention this pet name to people, and they would raise their eyebrows in a most suggestive way. The mere mention of the phrase “missionary cheese” seemed to raise all sorts of lurid images in their minds.

My friends would ask me, not really sure that they wanted to know the answer: “Why did you call it missionary cheese?”

“Because,” I would explain truthfully, “whenever I brought a truly stinky cheese back to my Paris apartment, sooner or later it would convert all the other cheeses.”

Closing the loop

I learned today about “two way learning”. This is a technique whereby you have somebody learn something while you monitor that person’s brain activity. As the person is learning about something, the computer is simultaneously “learning” the patterns of the person’s brain activity.

The hope here is that we can train a computer to recognize particular patterns of brain activity, and use those patterns to determine something about what a person is thinking. The primary application for this now is for people who are paralyzed. If a computer can recognize a paralyzed person’s brain patterns, then eventually that person could simply think a particular thought in order to trigger the computer to perform a particular action.

When this technique was described to me, I had a completely wacky idea: Why not make a loop out of it – have the person and the computer watch each other? The person watches and tries to learn the pattern of the computer that is “learning” the person’s brain pattern. So, for example, when I have a particular thought, the computer monitors my brain activity and shows the results as some kind of image on a display screen. Meanwhile, I watch that screen and try to learn and recognize these images.

This is an interesting scenario because we humans have an extraordinary ability to recognize patterns in what we see. If I see a visual representation of my own thoughts, eventually I might start to be able to recognize what particular thoughts “look like”. Eventually I might learn to modulate my own thoughts in order to make various types of patterns appear on the computer screen. Essentially I am training my mind to train the computer.

By involving the person whose brain is being tracked as an active participant in the process, we might be able to create a rich and powerful learning feedback loop. By making use of the human mind’s amazing ability to recognize patterns, perhaps we can give people the power to modulate their own thoughts at will. Those modulated thoughts could then be used to exert truly precise control – purely through thought – of computers, and thereby of the world around us.
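Just to make the idea concrete, here is a toy simulation of that feedback loop. Everything in it is invented for illustration – the “brain” is a noisy two-dimensional signal per thought, the “computer” is a nearest-centroid classifier that updates as it watches, and the “person” nudges their own signal away from the competing pattern after seeing the feedback display. It is a sketch of the co-adaptation idea, not a real brain-computer interface:

```python
import random

random.seed(1)

# The person's side: the (drifting) mean signal for each of two thoughts.
signal_means = {"left": [0.4, 0.6], "right": [0.6, 0.4]}
# The computer's side: one running centroid per thought, initially uninformative.
centroids = {"left": [0.5, 0.5], "right": [0.5, 0.5]}

def noisy(mean):
    """A single noisy 'brain reading' around the given mean signal."""
    return [m + random.gauss(0, 0.05) for m in mean]

def nearest(x):
    """The computer classifies a reading by its nearest centroid."""
    return min(centroids,
               key=lambda k: sum((a - b) ** 2 for a, b in zip(x, centroids[k])))

def accuracy():
    """How often the computer currently guesses the intended thought."""
    trials = [(t, noisy(signal_means[t])) for t in ("left", "right")
              for _ in range(50)]
    return sum(nearest(x) == t for t, x in trials) / len(trials)

before = accuracy()

for step in range(200):
    thought = random.choice(["left", "right"])
    x = noisy(signal_means[thought])
    # Computer learns: pull this thought's centroid toward the new sample.
    centroids[thought] = [c + 0.1 * (xi - c)
                          for c, xi in zip(centroids[thought], x)]
    # Person learns: seeing the display, push their own signal for this
    # thought away from the OTHER thought's centroid, sharpening the contrast.
    other = "right" if thought == "left" else "left"
    signal_means[thought] = [m + 0.01 * (m - c)
                             for m, c in zip(signal_means[thought], centroids[other])]

after = accuracy()
print(before, after)  # classification accuracy should rise as both sides adapt
```

The point of the sketch is the coupling: neither update rule is interesting on its own, but because each side adapts to the other’s current state, the “thoughts” and the computer’s model of them get pulled apart together.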

It’s worth trying anyway.