Nothing’s ever truly lost

Idle thoughts in random moments
Drift upon the vacant air
They flit about in lazy circles
Floating here, alighting there

Nothing’s ever truly lost
All the thoughts we’ve had remain
To fill an evening with regret
Or echo some forgotten pain

In woven mists of tender dark
They haunt the hollows of your sleep
In dreams they whisper soft and low
Of all the secrets that you keep

But in the mornings, then they gather
Memories of smoke and lace
Forming haloes ’round your head
To fill your day with light and grace

Moral sanitation workers

In “Jurassic Park” the lawyer character was eaten by a Tyrannosaurus rex while sitting on a toilet in an outhouse. In the theatre where I saw the movie, everybody cheered.

Isn’t there something about this scenario that bothers you? In our hearts we find room for so many different ideologies, ethnicities, ways of thinking and being. Hell, last year Tom Cruise played a sympathetic Nazi, ferchristsakes. But lawyers? No, not lawyers. Those folks be dinosaur bait. When bad things happen to them we laugh, we cheer, we run around the room and do the antler dance. We wait with barely restrained glee for terrible events to befall lawyers everywhere they may appear in pop culture, whether it be movies, books, theatre, comic books or bubble gum cards.

But what exactly is their crime? Why the intensely focused cultural hatred toward our advocationary class?

I submit that we are actually engaged in a collective act of deflected self-hatred. We use lawyers to do our dirty work, and then we blame them. Heaven forbid we should blame ourselves. Particularly in America, where lawsuits are only slightly more common than bathroom breaks.

We sue each other in our courts of law, and then go out for drinks together afterward. When questioned, we shrug our shoulders ruefully and say “well, you know, those lawyers.” It’s no wonder they are paid so well. They are our ethical buffers, cleaning up the Augean stables of our collective litigious excess and then conveniently taking the blame.

Not to put too fine a point on it, lawyers are our moral sanitation workers.

The optimists

The other day I was invited to a party where almost everyone was a philosopher. I don’t mean amateur philosopher, armchair philosopher, or reflective soul with a philosophical bent. I mean they were professional philosophers – people who do this for a living. Many of them were connected with the NYU Philosophy Department (one of the top philosophy departments in the world, as it turns out) and others were colleagues and collaborators of these folks from other institutions of higher learning around the world.

I found out, in the course of conversation, that a rather high percentage of these people focus on questions surrounding “theory of mind” – in which one looks at questions on the order of what is a human mind, what is consciousness, what is thought, what is self?

The friend/colleague who invited me to the party is something else – a psychologist. Therefore he looks at theory of mind questions from a different angle, one more related to the sorts of questions we ask in computer science: How the mind operates from a somewhat cybernetic perspective, as an extremely advanced sort of computational device. If I understand correctly, it seems that an essential difference between the philosophical and psychological views of humanity comes down to the question of “can we build one?”

I don’t mean can we build one now. Enough is already known about how the human brain functions to make it clear that in 2009 there is simply not enough computational power in all the world’s silicon chips to replicate the functioning of even a single brain. But of course that might not always be true. So psychologists are tempted to look at a time in the future – perhaps 50 years from now, perhaps 500 years from now – when something on the order of the brain’s level of functional complexity can be replicated in silico.

Philosophers, unlike psychologists, are not exactly interested in the mechanism itself, but rather in what that would mean. Would we be replicating the essential nature of the brain, the aspect that we think of as humanity, and if so, would that mean we can codify humanity the way we currently codify computer software?

I also found that both psychologists and philosophers ponder the future implications of this question in a very specific way: If human brain functioning – “thought”, if you will – could one day be replicated in computer circuitry, then could those future electronic humans make their own cyber-progeny, second generation artificial thought machines? And would their progeny then go on to make third, fourth, fifth generation machines, ad infinitum?

And if so, at what point would the descendants no longer be recognizably human? At what point would such creatures cease to feel any need to keep us silly humans around, even as quaint biological specimens of an outdated ancestral brain?

Here’s the kicker: On the above subject, it seems that there are “optimists” and “pessimists”. The optimists believe that it is indeed possible to create such generative species of artificially intelligent creatures. The pessimists believe that it is highly unlikely such a thing will happen in the foreseeable future.

The friend who invited me to the party is an optimist, and so he is quite morose on the subject. He believes it may be only a matter of time before our human species is replaced by an uncaring cyber-progeny that has evolved beyond our limited powers of recognition, a meta-species that will ultimately cast us aside altogether, once we no longer serve its unfathomable purposes.

I, on the other hand, find that I am a pessimist on the subject. And so I remain quite happy and carefree, fascinated as I may be by the gloomy and dire predictions of my sad friends, the optimists.

Surviving childhood

Recently in a conversation with a group of colleagues, I complimented one colleague on his ingenious way of putting together simple things to make remarkably new and innovative discoveries. Graciously he deflected attention from himself by talking about people who had been tinkerers as kids. He pointed out that most individuals who grow up to be inventors started out in childhood, and probably had some experience performing dangerous experiments with chemistry sets or some equivalent.

We all mused that perhaps there would have been more such people in the world, but that some of the more daring young would-be inventors had actually succeeded in blowing themselves up at an early age, and had therefore never made it out of childhood alive.

At this point the conversation took a curious turn, as each person related something they had done in their experimentally inclined youth that might have put them at risk.

When it was my turn, I talked of the day – I think when I was somewhere around seven years old – that I became curious about the electrical outlet, and wanted to find a more “hands on” way of exploring its properties. I did this by taking a wire coat hanger from my parents’ closet, bending it into a U shape, sticking one end of the hanger into one terminal of the 120V wall socket, and then gingerly poking the free end of the wire into the other hole, to see what would happen.

At this point in the story my colleagues were all looking at me with concern. Possibly they were wondering why I was even now alive to tell the tale. “Well?” one of them asked, “What happened? Did all the lights in the house go out?”

I explained that the lights had managed to stay on, but that a spray of very impressive sparks had immediately shot out of the wall outlet, creating black scars on the wooden floor of my bedroom. As soon as the sparks started to fly, I had pulled the wire back out, discretion having finally overcome curiosity within my young brain.

I don’t think I understood back then just what kind of fire I was playing with. I realized only when I was older, looking back on that experience, that what had saved me was the fact that my body was never actually in the path of the high voltage electricity. The short circuit had gone entirely through the wire – an excellent conductor – rather than through me. Had I used two coat hangers – one in each hand – instead of the one, I would have been very efficiently electrocuted, and that day would have marked my final experiment.

I’ve never told my parents about this little escapade – I think it would only have worried them. The floor in that room of their house has long since been covered by carpeting, beneath which I suspect one would still find the tell-tale burn marks on the floor near one electrical outlet, evidence of the early career of a very lucky young scientist.

The advantages of vinyl

I’m glad to see that people in their twenties are beginning to go back to vinyl records, a trend I started noticing about a year ago. Now that technology is allowing music to become completely disembodied, there is something lovely about an artifact as old-fashioned and analog as an LP with actual physical grooves on its surface.

There are certain qualities inherent in vinyl that simply cannot be duplicated in software. For example, I remember back to when I was very young, and my good friend and colleague Josh was heartbroken because a young lady had just unceremoniously dumped him for a mutual friend of ours. Technically, for a former mutual friend of ours. The man was still a friend of mine, but Josh’s views on the guy had just changed rather decisively, if you see what I mean.

Anyway, Josh was going through a spiral of self-destruction – not sleeping, unable to concentrate on work, making entirely too much use of various artificial substances to self-medicate away his pain. One day I suggested to Josh that we take a break from work, and go on a long walk. I was thinking that maybe talking it all out with a sympathetic buddy would help him cope with his grief.

We found a nice bucolic spot, and proceeded to walk around together, while comparing notes on the unfairness of love and the fickle nature of women. At some point, I remember, I asked him “So just how bad does it hurt right now?”

Josh paused for a few moments to collect his thoughts. Then he asked me a question. “Ken, have you ever played one track on a record over and over again, so many times that the grooves on that one track became completely worn down?”

“Of course,” I replied, with no hesitation. “‘For No One’, by the Beatles”.

Josh stared at me, somewhat startled. “Have I told you about this before?” he asked.

“No,” I replied, “but there was a girl when I was sixteen, and a week when I did nothing but sit in my room, play solitaire, and cue up that track over and over again. You should check out my copy of the Beatles’ ‘Revolver’ album. That song is worn clear down.”

That was a long time ago, and Josh and I have remained great friends ever since. There are some moments that just bind two men together for life, and this was one of them.

Just one of the advantages of vinyl.

Movie logic

Tonight, in a spirited conversation about movies prompted by having just seen “A Serious Man” (the wonderful new film by the Coen brothers), I mentioned that I had recently seen Neil Jordan’s intriguing film “The Brave One” – in which Jodie Foster plays the part of Erica Bain, a liberal New York City radio show host who turns into a decidedly unliberal vigilante after her fiancé is killed by vicious thugs.

It is clear that Jordan and screenwriters Roderick and Bruce Taylor are not just trying to put us inside the mind of someone whose grief leads her to become a killer. They want us to sympathize with her choice. Whether this is intended as a political statement or merely an aesthetic exercise is something you’ll need to decide for yourself – the movie doesn’t say. But I am not surprised to see this kind of extreme experiment in bringing the audience to strange places, given that this is the same director who gave us “The Butcher Boy” – a film in which the highly sympathetic protagonist is an extremely likeable child who gradually transforms (while never once losing our sympathy) into a mass murdering psychopath.

What concerns me here are the methods the filmmakers use to bring the audience along in “The Brave One”, as Erica Bain transforms before our eyes from sappy liberal to resolute vigilante killer. The key was provided by my friend, who recalled that not only had the thugs murdered Bain’s fiancé, they had also killed the fiancé’s dog.

For me that was the “aha” moment. In a movie, you can kill people all you want, and that’s ok. You can blow them up, stab them, throw them off buildings, set them on fire, yadda yadda. Audiences take that sort of stuff in stride. You may be a murdering fiend, but in movie logic – as in dream logic – that doesn’t make you a bad person. Maybe you were misunderstood as a child. Maybe you’ll realize the error of your ways and find a way to say you’re sorry before the end credits start to roll.

But if you kill a dog, well then my friend, you have crossed the line. You’ve just bought yourself a one way ticket to Hell, with no refunds allowed. It’s all very ironic, since in real life people kill dogs all the time. We use nice euphemisms like “put to sleep” to make ourselves feel all cozy inside, yet still we kill them – something we’d never dream of so casually doing to humans – and it’s all perfectly legal.

But in the dream logic of movies, audiences understand that killing a dog is evil because a dog is innocent. Theoretically a human can defend himself, is more or less on an equal level with his assailant. But a movie dog is a kind of holy vessel, a creature of God, not to be messed with lightly (except if it’s a comedy – then you can kill them by the bucketload). Millions of people watched stone-faced in “Independence Day” as large parts of our planet’s population were snuffed out by alien monsters. But in the midst of all of the horror and mass carnage, the film showed a dog getting away, and audiences cheered.

In other words, without the dog murder at the start of “The Brave One”, it wouldn’t have worked out quite right. If, in the climactic scene, Jodie Foster had pointed a gun at a nasty thug merely because he had murdered her fiancé in cold blood, had stared down her assailant and thought about pulling the trigger, audiences might very well have reacted negatively. “Get over yourself, lady,” they might have thought, “He may be a cold blooded murderer, but that doesn’t mean he deserves to die. Go find a therapist before you end up hurting somebody.”

But by having the dog get killed, the filmmakers have effectively short-circuited the logic centers in our brains. We’re no longer thinking “Gosh, is taking a life for a life really a wise policy?” No, we’re thinking “You killed a dog. You killed a dog. Die f*cker!”

I’m not saying it’s right. I’m not saying it makes any sense. I’m saying it works precisely because it doesn’t make any actual sense at all.

It’s movie logic.

Planck’s constant

It doesn’t happen all that often, but every once in a while the Universe conspires to give me a great straight line. I like to think that this happens because I’ve been a good boy, have been polite to others, kept out of trouble, cleaned up my room.

But I don’t really kid myself – when it happens it’s just dumb good luck.

Today I had one of those moments of rare epigrammatic grace. It was during a technical talk. The invited speaker was showing the results of some very interesting research into algorithms that convert three dimensional computer graphics into line drawings. This is not such a simple problem. It turns out that it is very easy to make a bad line drawing out of a 3D computer graphic model, but not nearly so easy to make one that contains the nuance and great choices to be found in a drawing by a decent artist wielding an old fashioned pen or pencil.

To illustrate his results the speaker was showing a 3D computer graphic model of the head of the great physicist Max Planck. Next to this image he showed a very impressive algorithmically generated line drawing of the same model.

One of our colleagues in the audience didn’t seem satisfied with this successful result. He wanted to know whether the technique could also deal with a textured model of Max Planck. The speaker tried to explain that the goal of the research was to convey shape, not texture, but the questioner was having none of it. Undeterred by the speaker’s reasonable answer, he continued on in his expansive line of questioning: “Well, suppose Max Planck has red lips. Could you convey something about that?”

I thought this was unfair. It’s one thing to question the results of someone’s research, but something else entirely to suggest that the research itself should have been on a different topic altogether.

The invited speaker patiently tried to explain that the goal – the whole point of his research project – was to effectively convey shape through line drawings. The questioner started to object again, clearly taken with his own opinions about what the research should have been about.

Which is when I decided to come to the speaker’s rescue. Hoping to assist in getting the talk back on track, I jumped in helpfully and told the insistent questioner: “No, Planck’s constant.”

That did the trick.


I have made up my mind
To refuse to decide
And now know for certain
I must let things ride

I shall not choose a side
I will not take the bait
I’m decisively choosing
To stand by and wait

It is such a relief
To have taken a stand
To just watch the ball drop
From right out of my hand

While others might dither
And think they must choose
I have heard the alarm
And have set it on snooze

For life is adventure
And life is a choice
Knowing when to step forward
And raise up one’s voice

My friends, I have opted
With calm and precision
To choose one true path.
In a word: Indecision



It is now only a few weeks from November 1, the start of the “write a novel in a month” month. I am very excited. Doubly so because a friend and I decided we would tag team it – she and I will create the novel together, each writing on alternate days. We each have our respective lead characters, with their respective imaginary worlds trailing behind them like so much confetti, and we are now wrapping our heads around how our characters will get along with each other.

There is probably an optimum amount of pre-planning, and with any luck we won’t miss the mark by too much. Going into the writing itself, the ideas need to be just the right amount of half-baked. Too raw, and we’ll end up with nothing but a sprawling mess. Too overcooked, and there won’t be any room for the thing to breathe, for our characters to reveal true selves and hidden destinies.

A certain amount of pre-planning is good, but let’s face it, characters truly come alive only once they are written. Your protagonist needs to meet the Buddha on the road, not in the motel room before even getting in the car. And in that spirit, we’re filling the old think tank with gas, firing up our creative spark plug, and mixing our metaphors like there’s no tomorrow.

Nanowrimo here we come!

A line of chalk

Ten years ago today, according to an official and exceedingly unscientific proclamation, the human population of the world reached six billion. Of course this assertion is silly, if taken as a literal statement of fact. Knowledge of human birth and death is based on statistical approximation, and the tools available for tracking these statistics are rather imprecise. We do not have a little chip in every human body that tracks the whereabouts and disposition of each individual upon the earth.

Nonetheless, it became clear ten years ago that the mark was going to be passed, and by human reckoning it was a significant mark – being a “nice round number” with lots of trailing zeroes. When you think about it, the whole idea of a six billionth baby being significant in any way is rather mystical – not all that different from the numerological esoterica of the Kabbalah, or various myths surrounding the Number of the Beast.

Then again, celebrations of New Year’s or birthdays are equally mystical. Nothing really happens at these moments, other than the passing of an arbitrary and culturally defined reference. It’s all rather like drawing a line of chalk in the road, coming back a day later to step over this line, and then celebrating our achievement.

But still it’s fun. Our brains work this way, and we have no way of perceiving the world around us other than through these numerically obsessed minds of ours. And so, on the more or less arbitrary day of October 12, 1999, it was declared by the Secretary General of the United Nations that little Adnan Nevic was the six billionth human.