Wagging the dog

I was talking with some colleagues today about the misconceptions people have about what they see on TV, movies, and other media, and I suddenly remembered an odd moment I had when I saw the first run of the 1997 film “Wag the Dog”.

It was during a scene in which Dustin Hoffman, as the Hollywood producer, is using the techniques of film magic to manufacture the video illusion of a war between the U.S. and Armenia. He’s in the post-production studio with his CIA client, played by Robert DeNiro.

In this particular scene, the young Kirsten Dunst is a young actress hired to pretend to be a traumatized child in a war zone. Hoffman shows DeNiro the magic of special effects by giving the young girl a box of cereal and then having her run across a bare blue-screen set. He then uses a digital console to replace the bare stage behind her with an Armenian village under fire — a scene of war and terror.

DeNiro, rather sensibly, questions why she is running through this ersatz village with a box of cereal under her arm. Dustin Hoffman’s character then proceeds to twiddle some dials on the console, and in the video feed the cereal box is magically replaced by a kitten.

So far so good. We’re watching a fantasy of Hollywood special effects in action. Of course special effects don’t really work that way, but it’s perfectly legitimate for a movie to spin such a fantasy. It’s all part of the same “willing suspension of disbelief” that allows us to accept a movie star as a great physicist or a distinguished politician.

Except that on this particular day, in this particular movie theatre, something rather odd happened. Just as Dustin Hoffman’s character twiddled those dials to turn the cereal box into a kitten, the woman in the row just in front of us turned to her companion and said — in a rather loud voice — “That’s amazing!”

Frankly, her comment made a bigger impression on me than anything I was watching on-screen. Clearly she thought that the instant transformation from cereal box to kitten was real. But why?? Did she believe we were watching some sort of documentary? Suddenly I started to worry that all across America, moviegoers might be unable to distinguish reality from movie fantasy.

Do people actually believe that the house in “Up” could really float in the air from the buoyancy of a bunch of party balloons?

Do people really come away from Oliver Stone’s “JFK” believing that our 35th president was done in by a secret homosexual cabal led by Tommy Lee Jones?

And did people really leave “The Matrix” believing that we are all living in a fantasy dreamscape created by evil robots who are only keeping us alive to be used as spare Energizer batteries?

I had always assumed, before this incident, that audiences would know how to draw a firm line between the tall tales on the silver screen and the reality of their actual lives. After all, basing your ideas of how reality works upon what you see in a Hollywood movie would be — for want of a better phrase — the tail wagging the dog.
Wouldn’t it?

World cup

The general hoopla around the World Cup — especially here in Paris, where I am spending the week — reminds me of the window that soccer opened in my own life. It was 1994, the year that Brazil barely edged out Italy to win the championship, which happened while I was spending some months in São Paulo. Needless to say, by the time I got back to NY I was completely immersed in all things soccer (or, as almost everyone in the world calls it, football).

It was the first time Brazil had won after a dry spell of 24 years, and several months in that atmosphere had converted me from a mere clueless Americano to a fan. When I returned to Manhattan, I wanted to share the excitement with everyone in my research lab at NYU. Yet the people in our lab at that time fell into two categories: Americans and Italians. The Americans had no idea what I was talking about — they just looked at me blankly when I started talking about the greatness of the Brazilian team and its achievement. The Italians were even worse. They all just gave me a tragic and baleful look and pretended not to know what I was talking about.

But that was the year that I discovered the other New Yorkers. Not my fellow intellectuals in their ivory tower, but the guys at the coffee shop, the taxi drivers, the men behind the counter at the Greek diner. Everywhere I went, ordinary working New Yorkers — immigrants from just about every part of the globe — were excited that I had just come back from Brazil, and were eager to talk about the World Cup and its dramatic outcome.

There was one man — a really sweet guy from Greece who worked in the deli down the block — who had asked me, months before, to bring him back a soccer shirt sporting Pelé’s retired number 10. I had remembered the request, and upon my return from Brazil I presented him with the coveted shirt. For years after, that guy was my best buddy — he would light up in a huge smile whenever I came into the deli.

And so, thanks to the magic of soccer, I learned the shared language of the vast network of immigrant New Yorkers — the ones I had never before thought to get to know — who form the lifeblood of the city where I live.

G.I. Jane Eyre

Today someone was showing me sequences from various violent action computer games. I was impressed with the high level of realism in the combat scenes, the rapid-fire editing, the authentic explosions, the powerful dynamics and convincing sound as bullets slammed into body armor.

And I thought to myself, what a shame that so much loving care, so much detail, is being poured into a genre that focuses so little on the human element, the deeper emotions, the psychological back and forth. Anyone who has read Jane Austen or the Brontës knows that a voice speaking calmly in a drawing room, over tea and biscuits, can convey depths of cruelty, of psychic violence, beyond anything the players of Halo or Half-Life 2 could ever imagine.

And so, perhaps it is time to combine these genres — to render in flesh and blood the depths of psychic violence that lurk within the romance novel. If we could get the brilliant minds that brought us Assassin’s Creed and BioShock to incorporate the remorseless human drama that lies just below the affable surface of a Jane Austen novel, we might achieve a new synthesis.

Perhaps we would then see such fine crossover games as “Pride and Extreme Prejudice”. Or maybe “G.I. Jane Eyre”.

I wonder what kind of ESRB rating these games would get.

Father’s Day at 100

Today is the hundredth anniversary of the celebration of Father’s Day in America. Father’s Day in this country was first celebrated in Spokane, Washington, in 1910, on the third Sunday in June, although it took another sixty-two years for the U.S. Government to officially declare it an annual event.

I suspect that official recognition took so long because the political cost was too high until views about the roles of men and women shifted, a change that came to a head during the late sixties and early seventies. In the decades prior to the sixties, reactions to the proposal ranged from laughter to derision.

We now live in a world where it is understood that fathers can be an important part of their children’s everyday lives. But it wasn’t all that long ago — not even half a century — when the conventional wisdom was that the gender roles were set in stone: mothers raise the children and fathers bring in the money to support them.

People generally think of “feminism” as a movement that centers around women, including such issues as equal pay for equal work, access to birth control and freedom from sexual harassment. But the history of Father’s Day is a testament to the fact that this is an incomplete view.

Thanks to the changes wrought by the feminist movement, we now take it for granted that men too have the right to spend quality time with their children. If a man wants to take time off to take care of his newborn child, people no longer call him crazy, and his continued employment is no longer threatened.

The permanent official recognition of Father’s Day was signed into law not by some liberal crusader, but by Richard M. Nixon, who happened to be president at the historical moment when society finally accepted men as valid caretakers of their own children.

But who knows? Maybe it had to be Nixon. In the words of the old Vulcan proverb: “Only Nixon could go to China.”

 

Star Trek VI: The Undiscovered Country, scene 2

Shadows of the moon

These fragile vessels are gone away too soon
Oh do not ask me why

What are you and I, but shadows of the moon
That dance under the sky?

These moments we have, like raindrops in your hand
Which sparkle for a day

Are gentle dreams of beauty we barely understand
But dreams must fade away

From eternity we purchase a thimbleful of years
And oh, how high the price

Yet every moment, though paid for with our tears,
Is worth the sacrifice

Word

My good friend Andy and I recently found ourselves on the subject of words that refer to themselves. I first recall encountering such a word when I was about twelve years old. Leafing through Webster’s Dictionary, I came upon the wonderful word “logomachy”. It’s the sort of word you could imagine people fighting over, with one person saying “oh, there is no such word”, and someone else insisting that the word indeed exists. Eventually of course they resort to looking it up, only to encounter this definition:

Pronunciation: \lō-ˈgä-mə-kē\
Function: noun
Etymology: Greek logomachia, from log- + machesthai to fight
Date: 1569
1 : a dispute over or about words

For the person who had gallantly defended this word’s existence, could victory possibly be more sweet?

There are plenty of examples in literature of such self-referential word usage. From Roald Dahl’s delightfully sly use of the word “epexegetically” in his wonderful short story “The Great Automatic Grammatizator”, which I talked about a while back, to the decision by the music group R.E.M. to name one of their albums “Eponymous” — possibly the most clever album title in the history of pop music.

Yet, as my friend Andy and I discussed, there are words that let you down. “Palindrome” is, sadly, not a palindrome. And no anagrams have yet been found for “anagram”. “Onomatopoeia” is not onomatopoetic, except by the most tortured interpretation of that word.

And so it is a delight when one comes upon words that are satisfyingly self-referential. “Noun” and “adjective” work as examples of themselves, although “verb” does not. The word “short” is self-descriptive, in a way that “minuscule” and “monosyllabic” are not.

The word “grandiloquent” is, well, grandiloquent. And I’ve always particularly adored the word “gargantuan”. Just saying it out loud makes the whole world seem somehow roomier (go ahead, try it). As opposed to the word “cramped”, a word that is all too self-descriptive.

“Mellifluous” describes itself rather perfectly. As do “abstruse”, “recondite” and “sesquipedalian”, although these last three are somewhat overly lexiphanic (you could look it up).

Word to the max.

Ageless

When we think of Beethoven, we don’t generally think of him as being one particular age. The same is true for Charles Dickens, Virginia Woolf, Pablo Picasso, William Shakespeare, or Emily Dickinson. The creative output of those wonderful geniuses did not coincide with one particular point in their lives, but rather was produced over a range of ages.

In this way, such artists have escaped the tyranny of age discrimination, and society’s odd views about age in general. They have instead been celebrated for their talent, for possession of a singular voice, for the output of a magnificent mind.

This is not the case for actors. Without even thinking about it, we make a clear distinction between the young Bette Davis and the older one, or the Nicholson of “Five Easy Pieces” and of “About Schmidt”. There is a vast gulf in our minds between Ingrid Bergman in “Casablanca” and in “A Woman Called Golda”. We can’t help it — age discrimination is deeply indoctrinated into us, and we cannot simply wish it away.

Yet this will change. Some time in the next twenty years — perhaps sooner — all movie production will go entirely digital. The person you see up on the screen will be a simulacrum, an artificial person digitally puppeteered by its real counterpart. And at that point actors will be able to be whatever age is most convenient. Only talent and commitment will matter, not mere accidents of chronology. An actor of twenty-five will easily be able to portray himself or herself at the age of seventy-five, and vice versa.

In that historical moment, the actor will become as ageless as the painter, the sculptor, the playwright and the poet.

Rejoycing

Today, as many of you know, is Bloomsday. On this day every year James Joyce aficionados everywhere re-enact the fictional June 16, 1904 walkabout through Dublin taken by Leopold Bloom in Joyce’s “Ulysses”. It is a glorious tradition, with just the right degree of nuttiness to hold our attention. Any such celebration needs to calibrate its insanity carefully, for as Saint-Gaudens once said: “What garlic is to salad, insanity is to art.”

Which is why I am puzzled as to why all famous works of narrative fiction do not inspire similar celebrations. Where are the fans retracing Holden Caulfield’s journey around Manhattan? Why do we not see hordes of young women in little black dresses dining at Tiffany’s at 5am each morning? And why is nobody getting on a raft to follow the path laid down by Huckleberry Finn and his friend Jim?

Wouldn’t it be fun to recreate the journey to the heart of darkness that Joseph Conrad described so vividly — or at least its cinematic imitation by Francis Ford Coppola? And why aren’t football fans everywhere revisiting that Longest Yard? I am referring of course to the real version from 1974, not whatever the hell Adam Sandler thought he was doing in 2005.

Who wouldn’t want to travel from Paris to Marseille, across the Mediterranean Sea to Oran, Algeria, then across French Morocco to Casablanca, only to nobly give up one’s exit visa to Lisbon and the New World? I know I would. If you’re going to hand off the woman you love to some other guy just to show her how much you love her, Rick Blaine was way more practical about it than, say, Sydney Carton.

If you see what I mean.

There are limits, to be sure. I can see why there is no annual day of re-enactment of Stanley Kubrick’s “2001: A Space Odyssey”. The expense alone would be astronomical. And I am really glad nobody is trying to re-enact the final scene of “Dr. Strangelove”. At least, I hope they’re not.

Pun play

I’ve been wondering whether it would be possible to write a play in which every line of dialog contains at least one pun. I don’t mean that the characters would deliberately be punning. In fact, they’d have no idea any of this is happening. Rather, as they play out their scenes, controversies, resolutions, shifts in power and allegiance, they would just happen to say things in puns.

The audience would experience such a play on two completely different levels. The pun or puns contained in each line of dialog would be a sort of music dancing around the lyrics of the play. One could imagine dialog along these lines:


Character 1: I never thought you’d hatch such a bold scheme.
Character 2: Yes, I’ve decided to come out of my shell.
Character 1: I’m so glad. You’ve always been a good egg.

You get the idea. I suspect the above example is too brazen — you’d need to be more subtle about it. If it’s done right, the resulting play might be very satisfying, or it might be unbearably awful. Many members of the audience might not even realize that there was a second “pun” layer to the proceedings.

It’s impossible to say at this point just how such a mad scheme would come across. But it would certainly be interesting to try.

Trees 3

I had always thought of photosynthesis in a fairly simple way: A plant acts as a factory for converting water and carbon dioxide into glucose and oxygen. The plant then converts that glucose into cellulose, starches, and all that other good stuff we get from plants. But now that I needed to know where the mass of cellulose comes from, I had to look more closely at how this factory works.

The weight ratio between single atoms of carbon, hydrogen and oxygen is (more or less) 12 to 1 to 16, respectively. So, for example, in a molecule of water (H2O), almost all of the weight comes from the oxygen. If you’re trying to figure out where the weight comes from in tree cellulose, you can almost ignore the hydrogen — it just doesn’t weigh very much. So the real question is: When H2O and CO2 combine to make C6H12O6 (glucose), does the oxygen come more from the water, or from the carbon dioxide?
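As a quick sanity check on those ratios, here is a back-of-the-envelope calculation in Python (a sketch using the rounded atomic weights above — 12, 1 and 16 — rather than exact values):

```python
# Rounded atomic weights, as in the text: carbon 12, hydrogen 1, oxygen 16.
C, H, O = 12, 1, 16

water = 2 * H + O                  # H2O:     2 + 16 = 18
glucose = 6 * C + 12 * H + 6 * O   # C6H12O6: 72 + 12 + 96 = 180

# In water, nearly all of the mass is oxygen.
print(f"oxygen share of water's mass: {O / water:.0%}")             # 89%

# In glucose, the twelve hydrogen atoms barely register.
print(f"hydrogen share of glucose's mass: {12 * H / glucose:.1%}")  # 6.7%
```

So by weight, water is almost all oxygen, and hydrogen contributes only a sliver of a glucose molecule — which is why the question above comes down to where the oxygen comes from.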

Well, it turns out that a plant is actually two factories. The first factory is in the business of converting photons (which provide energy) and water into hydrogen. All living things have molecules called ADP, which act as batteries. When sunlight hits its leaves, a plant charges up these batteries, pulling the hydrogen out of the water, and adding it to the ADP, which then turns into another molecule called ATP.

Basically, ADP is a microscopic uncharged solar cell, and ATP is the same solar cell, all charged up.

While this first factory needs to be bathed in sunlight (or some other light source), the second factory doesn’t need any light at all. This second factory takes in ATP (those already-charged batteries) and carbon dioxide. From the ATP it gets those charged up hydrogen atoms, and combines them with the carbon dioxide. One of the two oxygen atoms in each CO2 molecule is released into the atmosphere, and the other one is used to make glucose — C6H12O6.

So it turns out that none of the oxygen from the water actually gets used to make glucose (and therefore cellulose) — the glucose contains only the oxygen from the carbon dioxide in the atmosphere.

When you add up the numbers, it turns out that only about six percent of the weight of wood cellulose (the hydrogen) comes from the roots — the other ninety-four percent of that mass comes from carbon dioxide.
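The arithmetic behind that six percent figure can be checked directly. Cellulose is a chain of C6H10O5 units; using the same rounded atomic weights as before, hydrogen’s share of the mass works out as follows (a sketch, not a chemistry reference):

```python
# A cellulose monomer unit is C6H10O5; rounded atomic weights as above.
C, H, O = 12, 1, 16

monomer = 6 * C + 10 * H + 5 * O   # 72 + 10 + 80 = 162

# Hydrogen arrives via water from the roots;
# carbon and oxygen arrive via CO2 from the air.
from_roots = 10 * H / monomer
from_air = (6 * C + 5 * O) / monomer

print(f"from water (roots): {from_roots:.0%}")   # 6%
print(f"from CO2 (air):     {from_air:.0%}")     # 94%
```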

Which means that the wood of even the most massive tree comes almost entirely from pure air.