A Serious Movie

Erwin Schrödinger introduced his famous “Schrödinger’s cat” thought experiment to illustrate the apparent absurdity of one of the key implications of quantum theory: namely, that something could simultaneously exist and not exist. Basically, down at the quantum level, a particle can remain in a quasi-state, both existing and not existing. The particle stays this way until it is observed by an outside system. At that moment it instantly “snaps” into one of the two definite states.

Schrödinger’s complaint was that this can lead to absurd outcomes, since you could easily tie a macroscopic object – say a house cat – to the fate of a single quantum particle (recipe: place the cat in a sealed box with a Geiger counter; when the counter detects a single random quantum event, kill the cat). Quantum theory states that the cat is literally both alive and dead at the same time. Until, that is, somebody opens the box, at which point the cat instantly snaps out of its quasi-state into one of two definite states: It becomes either a fully alive cat or a fully dead cat.

Yes, this sounds absurd, and people not familiar with quantum theory often respond by saying that we’re just describing the probability that the cat is alive or dead at any given moment. In fact, they say, the cat must always be completely alive or completely dead. But that turns out not to be the case. Strange as it seems, Schrödinger’s objection was wrong – quantum theory’s prediction has been experimentally verified. If you run various experiments with actual particles, the “cat is either completely alive or dead” assumption gives you the wrong answer. If you instead assume this crazy-sounding “quasi-state” of an object both existing and not existing at the same time, the results you get match the experimental data perfectly.
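To make that last claim concrete, here is a small sketch of my own (not from the original post) of a Mach–Zehnder interferometer, the tabletop experiment usually used to demonstrate the point. A photon enters a 50/50 beam splitter, the two possible paths are recombined at a second beam splitter, and detectors watch the two exit ports. Tracking both quasi-existing paths at once (the quantum picture) predicts that every photon exits the same port; assuming the photon is always definitely on one path or the other predicts a 50/50 split. Real experiments agree with the first prediction. The beam-splitter phase convention below is one standard choice, assumed for illustration.

```python
from math import sqrt

def beam_splitter(a, b):
    """50/50 beam splitter acting on the amplitudes of the two paths.
    Reflection picks up a 90-degree phase (the factor of 1j)."""
    return ((a + 1j * b) / sqrt(2), (1j * a + b) / sqrt(2))

def detect(a, b):
    """Probability of finding the photon on each path."""
    return (abs(a) ** 2, abs(b) ** 2)

# Quantum picture: carry BOTH path amplitudes through both splitters,
# with no observation in between.
a, b = beam_splitter(1, 0)     # photon enters on one port
a, b = beam_splitter(a, b)     # both quasi-existing paths recombine
quantum = detect(a, b)         # (0.0, 1.0): every photon exits one port

# "Fully one or the other" picture: after the first splitter the photon
# is definitely on some path (each with probability 1/2), and each
# definite path hits the second splitter on its own.
pA, pB = detect(*beam_splitter(1, 0))   # (0.5, 0.5)
outA = detect(*beam_splitter(1, 0))     # photon definitely on path A
outB = detect(*beam_splitter(0, 1))     # photon definitely on path B
classical = tuple(pA * x + pB * y for x, y in zip(outA, outB))  # (0.5, 0.5)
```

The interference term that empties one exit port only appears if both paths are carried along simultaneously; assuming a definite path at every moment washes it out, which is exactly what the detectors refute.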

Joel and Ethan Coen’s recent film “A Serious Man” is actually a treatise on this very subject, in disguised form. It starts with a reference to Schrödinger’s famous thought experiment, and then proceeds to show – in a very elegant fashion – that even in the domain of human actions, an object can be in a quasi-state of simultaneously both existing and not existing, up until the moment an observer forces the question of whether the object exists or not. At which point the object instantly snaps to one of these two states, as though it had been in that single state all the time.

I won’t spoil the movie for you by saying any more (many have not seen it yet – and I suspect it has not yet been released in various parts of Europe), but I wanted to pay tribute to a moment of cinematic genius: A moral fable that transposes one of the most difficult concepts of quantum theory into human terms, with perfect clarity.

When you see the film, see if you can spot what the “quasi-existing” object is.

Wrong-way Oreo

The other day, for the first time ever, I encountered a wrong-way Oreo. For those of you who don’t know, that’s an Oreo cookie that has one of its two dark chocolate wafers somehow turned around, so that its engraved outer side ends up on the inside, pressing inward to form a tell-tale impression, in perfect mirror-reverse, upon the snowy white cream filling.

I hadn’t been expecting it. In fact, I hadn’t even been aware that such a thing existed. Perhaps there are people who go around and speak of wrong-way Oreos, swapping tales of this arcane mystery in the same hushed and knowing tones they use when speaking of Bigfoot sightings or the alligators that dwell in the sewers of New York. Not that I have ever been in such a conversational group. Until now.

Today I asked various people if they had ever seen a wrong-way Oreo. My friend Charles said he saw one once, a few years back. Several other people reported having seen one as well. Charles has the theory that some part of the manufacturing process involves the chocolate wafer dropping downward, and that every once in a great while a wafer lands the wrong way. He may very well be right.

But as I contemplated my oddball Oreo, I couldn’t help thinking there might be some deeper meaning here. Was this perhaps some sort of sign or omen? And if so, why was I chosen to get this cookie on this particular day? Would it still have counted if I had just eaten the cookie without ever looking at it? Or would fate then have conspired to place another wrong-way Oreo in my path?

And if fate were to deliver more wrong-way Oreos to me, what would happen if I were so oblivious that I just kept eating the darned things without ever noticing? Would fate then need to keep feeding me cookie after cookie, hoping against hope that one day I would become less oblivious? Would I one day find myself mysteriously eating entire boxes of Oreos, consuming vast quantities of the things until I became as round as – well – as an Oreo cookie?

These are metaphysical questions, far out of my league I am afraid. My feeble brain can contemplate only one wrong-way Oreo at a time. But even one cookie can have significance. Am I, perhaps, one of the few lucky humans, chosen by alien invaders, set apart by this secret sign from billions of less fortunate earthlings? I can envision a day dawning, after our planet’s ignominious defeat at the hands of the Lepusian space invasion force, perhaps sometime after the dust has settled, when the broken slag heaps of what had once been great earth cities lie smoking in ruins, and the once mighty suburbs of New Jersey have been reduced to desolate wastelands by beams of phase disruptor particles from the Lepusian imperial mothership. The few dazed remnants of a defeated human race slowly emerge, stunned, out of their hiding places, only to be picked off by precision laser fire from the dreaded roving lepudroids. On that day I shall stand triumphant, proud and free, ready to take my rightful place as a citizen of the galactic empire, holding my wrong-way Oreo cookie high for all to see, my ticket to a new world.

On the other hand, there is a chance that might not happen.

Weighing the promise of one day living a life of fabulous adventure roaming the galaxy far and wide in search of new civilizations, against the prospect of eating an Oreo cookie now, my internal struggle was brief.

Reader, I ate it.

Nothing’s ever truly lost

Idle thoughts in random moments
Drift upon the vacant air
They flit about in lazy circles
Floating here, alighting there

Nothing’s ever truly lost
All the thoughts we’ve had remain
To fill an evening with regret
Or echo some forgotten pain

In woven mists of tender dark
They haunt the hollows of your sleep
In dreams they whisper soft and low
Of all the secrets that you keep

But in the mornings, then they gather
Memories of smoke and lace
Forming haloes ’round your head
To fill your day with light and grace

Moral sanitation workers

In “Jurassic Park,” the lawyer character was eaten by a Tyrannosaurus rex while sitting on a toilet in an outhouse. In the theatre where I saw the movie, everybody cheered.

Isn’t there something about this scenario that bothers you? In our hearts we find room for so many different ideologies, ethnicities, ways of thinking and being. Hell, last year Tom Cruise played a sympathetic Nazi, ferchristsakes. But lawyers? No, not lawyers. Those folks be dinosaur bait. When bad things happen to them we laugh, we cheer, we run around the room and do the antler dance. We wait with barely restrained glee for terrible events to befall lawyers everywhere they may appear in pop culture, whether it be movies, books, theatre, comic books or bubble gum cards.

But what exactly is their crime? Why the intensely focused cultural hatred toward our advocationary class?

I submit that we are actually engaged in a collective act of deflected self-hatred. We use lawyers to do our dirty work, and then we blame them. Heaven forbid we should blame ourselves. Particularly in America, where lawsuits are only slightly more common than bathroom breaks.

We sue each other in our courts of law, and then go out for drinks together afterward. When questioned, we shrug our shoulders ruefully and say “well, you know, those lawyers.” It’s no wonder they are paid so well. They are our ethical buffers, cleaning up the Augean stables of our collective litigious excess and then conveniently taking the blame.

Not to put too fine a point on it, lawyers are our moral sanitation workers.

The optimists

The other day I was invited to a party where almost everyone was a philosopher. I don’t mean amateur philosopher, armchair philosopher, or reflective soul with a philosophical bent. I mean they were professional philosophers – people who do this for a living. Many of them were connected with the NYU Philosophy Department (one of the top philosophy departments in the world, as it turns out) and others were colleagues and collaborators of these folks from other institutions of higher learning around the world.

I found out, in the course of conversation, that a rather high percentage of these people focus on questions surrounding “theory of mind” – in which one looks at questions on the order of what is a human mind, what is consciousness, what is thought, what is self?

The friend/colleague who invited me to the party is something else – a psychologist. Therefore he looks at theory of mind questions from a different angle, one more related to the sorts of questions we ask in computer science: How the mind operates from a somewhat cybernetic perspective, as an extremely advanced sort of computational device. If I understand correctly, it seems that an essential difference between the philosophical and psychological views of humanity comes down to the question of “can we build one?”

I don’t mean can we build one now. Enough is already known about how the human brain functions to make it clear that in 2009 there is simply not enough computational power in all the world’s silicon chips to replicate the functioning of even a single brain. But of course that might not always be true. So psychologists are tempted to look at a time in the future – perhaps 50 years from now, perhaps 500 years from now – when something on the order of the brain’s level of functional complexity can be replicated in silico.

Philosophers, unlike psychologists, are not exactly interested in the mechanism itself, but rather in what that would mean. Would we be replicating the essential nature of the brain, the aspect that we think of as humanity, and if so, would that mean we can codify humanity the way we currently codify computer software?

I also found that both psychologists and philosophers ponder the future implications of this question in a very specific way: If human brain functioning – “thought”, if you will – could one day be replicated in computer circuitry, then could those future electronic humans make their own cyber-progeny, second generation artificial thought machines? And would their progeny then go on to make third, fourth, fifth generation machines, ad infinitum?

And if so, at what point would the descendants no longer be recognizably human? At what point would such creatures cease to feel any need to keep us silly humans around, even as quaint biological specimens of an outdated ancestral brain?

Here’s the kicker: On the above subject, it seems that there are “optimists” and “pessimists”. The optimists believe that it is indeed possible to create such generative species of artificially intelligent creatures. The pessimists believe that it is highly unlikely such a thing will happen in the foreseeable future.

The friend who invited me to the party is an optimist, and so he is quite morose on the subject. He believes it may be only a matter of time before our human species is replaced by an uncaring cyber-progeny that has evolved beyond our limited powers of recognition, a meta-species that will ultimately cast us aside altogether, once we no longer serve its unfathomable purposes.

I, on the other hand, find that I am a pessimist on the subject. And so I remain quite happy and carefree, fascinated as I may be by the gloomy and dire predictions of my sad friends, the optimists.

Surviving childhood

Recently in a conversation with a group of colleagues, I complimented one colleague on his ingenious way of putting together simple things to make remarkably new and innovative discoveries. Graciously he deflected attention from himself by talking about people who had been tinkerers as kids. He pointed out that most individuals who grow up to be inventors started down that path in childhood, and probably had some experience performing dangerous experiments with chemistry sets or some equivalent.

We all mused that perhaps there would have been more such people in the world, had not some of the more daring young would-be inventors actually succeeded in blowing themselves up at an early age, never making it out of childhood alive.

At this point the conversation took a curious turn, as each person related something they had done in their experimentally inclined youth that might have put them at risk.

When it was my turn, I talked of the day – I think when I was somewhere around seven years old – that I became curious about the electrical outlet, and wanted to find a more “hands on” way of exploring its properties. I did this by taking a wire coat hanger from my parents’ closet, bending it into a U shape, sticking one end of the hanger into one terminus of the 120V wall socket, and then gingerly poking the free end of the wire into the other hole, to see what would happen.

At this point in the story my colleagues were all looking at me with concern. Possibly they were wondering why I was even now alive to tell the tale. “Well?” one of them asked, “What happened? Did all the lights in the house go out?”

I explained that the lights had managed to stay on, but that a spray of very impressive sparks had immediately shot out of the wall outlet, creating black scars on the wooden floor of my bedroom. As soon as the sparks started to fly, I had pulled the wire back out, discretion having finally overcome curiosity within my young brain.

I don’t think I understood back then just what kind of fire I was playing with. I realized only when I was older, looking back on that experience, that what had saved me was the fact that my body was never actually in the path of the high voltage electricity. The short circuit had gone entirely through the wire – an excellent conductor – rather than through me. Had I used two coat hangers – one in each hand – instead of the one, I would have been very efficiently electrocuted, and that day would have marked my final experiment.

I’ve never told my parents about this little escapade – I think it would only have worried them. The floor in that room of their house has long since been covered by carpeting, beneath which I suspect one would still find the tell-tale burn marks on the floor near one electrical outlet, evidence of the early career of a very lucky young scientist.

The advantages of vinyl

I’m glad to see that people in their twenties are beginning to go back to vinyl records, a trend I started noticing about a year ago. Now that technology is allowing music to become completely disembodied, there is something lovely about an object as old-fashioned and analog as an LP, with actual physical grooves on its surface.

There are certain qualities inherent in vinyl that simply cannot be duplicated in software. For example, I remember back to when I was very young, and my good friend and colleague Josh was heartbroken because a young lady had just unceremoniously dumped him for a mutual friend of ours. Technically, for a former mutual friend of ours. The man was still a friend of mine, but Josh’s views on the guy had just changed rather decisively, if you see what I mean.

Anyway, Josh was going through a spiral of self-destruction – not sleeping, unable to concentrate on work, making entirely too much use of various artificial substances to self-medicate away his pain. One day I suggested to Josh that we take a break from work, and go on a long walk. I was thinking that maybe talking it all out with a sympathetic buddy would help him cope with his grief.

We found a nice bucolic spot, and proceeded to walk around together, while comparing notes on the unfairness of love and the fickle nature of women. At some point, I remember, I asked him, “So just how bad does it hurt right now?”

Josh paused for a few moments to collect his thoughts. Then he asked me a question. “Ken, have you ever played one track on a record over and over again, so many times that the grooves on that one track became completely worn down?”

“Of course,” I replied, with no hesitation. “‘For No One,’ by the Beatles.”

Josh stared at me, somewhat startled. “Have I told you about this before?” he asked.

“No,” I replied, “but there was a girl when I was sixteen, and a week when I did nothing but sit in my room, play solitaire, and cue up that track over and over again. You should check out my copy of the Beatles’ ‘Revolver’ album. That song is worn clear down.”

That was a long time ago, and Josh and I have remained great friends ever since. There are some moments that just bind two men together for life, and this was one of them.

Just one of the advantages of vinyl.

Movie logic

Tonight, in a spirited conversation about movies prompted by having just seen “A Serious Man” (the wonderful new film by the Coen brothers), I mentioned that I had recently seen Neil Jordan’s intriguing film “The Brave One” – in which Jodie Foster plays the part of Erica Bain, a liberal New York City radio show host who turns into a decidedly unliberal vigilante after her fiancé is killed by vicious thugs.

It is clear that Jordan and screenwriters Roderick and Bruce Taylor are not just trying to put us inside the mind of someone whose grief leads her to become a killer. They want us to sympathize with her choice. Whether this is intended as a political statement or merely an aesthetic exercise is something you’ll need to decide for yourself – the movie doesn’t say. But I am not surprised to see this kind of extreme experiment in bringing the audience to strange places, given that this is the same director who gave us “The Butcher Boy” – a film in which the highly sympathetic protagonist is an extremely likeable child who gradually transforms (while never once losing our sympathy) into a mass murdering psychopath.

What concerns me here are the methods the filmmakers use to bring the audience along in “The Brave One”, as Erica Bain transforms before our eyes from sappy liberal to resolute vigilante killer. The key was provided by my friend, who recalled that not only had the thugs murdered Bain’s fiancé, they had also killed the fiancé’s dog.

For me that was the “aha” moment. In a movie, you can kill people all you want, and that’s ok. You can blow them up, stab them, throw them off buildings, set them on fire, yadda yadda. Audiences take that sort of stuff in stride. You may be a murdering fiend, but in movie logic – as in dream logic – that doesn’t make you a bad person. Maybe you were misunderstood as a child. Maybe you’ll realize the error of your ways and find a way to say you’re sorry before the end credits start to roll.

But if you kill a dog, well then my friend, you have crossed the line. You’ve just bought yourself a one way ticket to Hell, with no refunds allowed. It’s all very ironic, since in real life people kill dogs all the time. We use nice euphemisms like “put to sleep” to make ourselves feel all cozy inside, yet still we kill them – something we’d never dream of so casually doing to humans – and it’s all perfectly legal.

But in the dream logic of movies, audiences understand that killing a dog is evil because a dog is innocent. Theoretically a human can defend himself, and is more or less on an equal level with his assailant. But a movie dog is a kind of holy vessel, a creature of God, not to be messed with lightly (except if it’s a comedy – then you can kill them by the bucketload). Millions of people watched stone faced in “Independence Day” as large parts of our planet’s population were snuffed out by alien monsters. But in the midst of all of the horror and mass carnage, the film showed a dog getting away, and audiences cheered.

In other words, without the dog murder at the start of “The Brave One”, it wouldn’t have worked out quite right. If, in the climactic scene, Jodie Foster had pointed a gun at a nasty thug merely because he had murdered her fiancé in cold blood, had stared down her assailant and thought about pulling the trigger, audiences might very well have reacted negatively. “Get over yourself lady,” they might have thought, “He may be a cold blooded murderer, but that doesn’t mean he deserves to die. Go find a therapist before you end up hurting somebody.”

But by having the dog get killed, the filmmakers have effectively short-circuited the logic centers in our brains. We’re no longer thinking “Gosh, is taking a life for a life really a wise policy?” No, we’re thinking “You killed a dog. You killed a dog. Die f*cker!”

I’m not saying it’s right. I’m not saying it makes any sense. I’m saying it works precisely because it doesn’t make any actual sense at all.

It’s movie logic.

Planck’s constant

It doesn’t happen all that often, but every once in a while the Universe conspires to give me a great straight line. I like to think that this happens because I’ve been a good boy, have been polite to others, kept out of trouble, cleaned up my room.

But I don’t really kid myself – when it happens it’s just dumb good luck.

Today I had one of those moments of rare epigrammatic grace. It was during a technical talk. The invited speaker was showing the results of some very interesting research into algorithms that convert three dimensional computer graphics into line drawings. This is not such a simple problem. It turns out that it is very easy to make a bad line drawing out of a 3D computer graphic model, but not nearly so easy to make one that contains the nuance and great choices to be found in a drawing by a decent artist wielding an old-fashioned pen or pencil.

To illustrate his results the speaker was showing a 3D computer graphic model of the head of the great physicist Max Planck. Next to this image he showed a very impressive algorithmically generated line drawing of the same model.

One of our colleagues in the audience didn’t seem satisfied with this successful result. He wanted to know whether the technique could also deal with a textured model of Max Planck. The speaker tried to explain that the goal of the research was to convey shape, not texture, but the questioner was having none of it. Undeterred by the speaker’s reasonable answer, he continued in his expansive line of questioning: “Well, suppose Max Planck has red lips. Could you convey something about that?”

I thought this was unfair. It’s one thing to question the results of someone’s research, but something else entirely to suggest that the research itself should have been on a different topic altogether.

The invited speaker patiently tried to explain that the goal – the whole point of his research project – was to effectively convey shape through line drawings. The questioner started to object again, clearly taken with his own opinions about what the research should have been about.

Which is when I decided to come to the speaker’s rescue. Hoping to assist in getting the talk back on track, I jumped in helpfully and told the insistent questioner: “No, Planck’s constant.”

That did the trick.

Indecisive

I have made up my mind
To refuse to decide
And now know for certain
I must let things ride

I shall not choose a side
I will not take the bait
I’m decisively choosing
To stand by and wait

It is such a relief
To have taken a stand
To just watch the ball drop
From right out of my hand

While others might dither
And think they must choose
I have heard the alarm
And have set it on snooze

For life is adventure
And life is a choice
Knowing when to step forward
And raise up one’s voice

My friends, I have opted
With calm and precision
To choose one true path.
In a word: Indecision

😉