When Shakespeare was only 28

I am staying at Trinity College at the moment. My gracious hosts have put me in a lovely little room that is all old-fashioned, old-world charm. Looking out my window at the beautiful and ancient campus, I am wondering which academic visitors from an earlier era might have been offered this very room — perhaps using some of this very furniture.

I can picture Charles Dodgson, hard at work on his sequel to “Alice in Wonderland”. Or C.S. Lewis, sitting at this very desk while writing of the adventures of gallant Reepicheep on the Dawn Treader. Or maybe Tolkien, up from Oxford for a seminar, working out Tom Bombadil’s casually metrical banter.

Of course this place goes much further back than even those esteemed worthies. The University was founded in 1592, an event officially presided over by the first Queen Elizabeth. One of the famed Darnley portraits of Her Majesty hangs in the faculty sitting room where this afternoon I had a spot of tea.

It would be wonderful to be transported back to that time, if only for a day, when this august university was new, when the world was younger, when Shakespeare was only 28.

Seven billion wonders

A friend and I were walking down a Paris street yesterday, looking at all the people, when my friend expressed a sense of wonder at all the unique minds. And of course my friend was right.

Each person’s mind is an entire world unto itself. As far as we can know for sure, each of those minds is as complex and wonderful a thing as we have yet discovered in nature.

We tend to take this for granted. With seven billion people in the world — many of them living in very difficult circumstances — we can forget that each of those individuals is a vast universe of thoughts, memories, perceptions transformed into ideas.

Familiarity can sometimes breed contempt, but just a little reflection can bring us back to the deeper truth: That each individual human mind is a marvel, a true wonder of the Universe.

Not important, but essential

As I walked around Paris today, a thought came back to me that I’ve had many times about places I know and love well: Their reputation, their place in the world, as it were, is a separate thing from the specifics of their existence.

There are a million little details that come to define a city for you, once you spend any time there. Very few of these details would show up in the movie version. For example, the fact that you go one way on the 4 line to get to Porte d’Orléans, and the other way to get to Porte de Clignancourt. Or that when you walk south on the Boulevard de Sébastopol, you arrive at the Fontaine du Palmier.

New York, Paris, London, Berlin, any city you can name, is full of such details of happenstance. These details add very little, if anything, to the mythic stature of a great city, but to anyone who lives there, they end up being the very soul of the place.

It’s such little details that make a place real. In the grand scheme of things, they may not be important, but if you live there, they are essential.

C.P. Snow would weep

In today’s New York Times, William Grimes reviewed “Marks of Genius,” an exhibition of works from the Bodleian Library currently on view at the Morgan Library & Museum.

Most of the review was fun to read, but one sentence I found completely horrifying.

Speaking of Euclid’s “Elements” and Newton’s “Principia Mathematica”, and thinking back on his own failure to learn geometry and calculus, he remarked that these “immortal works, beautifully printed and bound, are, in the end, math books.”

I am sorry that Mr. Grimes had a bad experience in high school, but personal experiences of adolescent trauma have no place in a discussion of works of unsurpassed intellectual beauty and genius.

After all, if you were once beaten up in tenth grade by some angry Hassids, does that mean you should dismiss the work of Arthur Miller, because the great playwright was “in the end, a Jew”?

The Political Party at the End of the Universe

U.S. House Majority Leader Eric Cantor, one of the leading forces in the Republican Party — and, because of his long tenure, one of its most politically powerful figures — was unexpectedly defeated in his own party’s primary by an unknown Tea Party candidate.

Apparently he was insufficiently zealous in his portrayal of Barack Obama as an evil force intent on nothing less than the total destruction of America and our way of life.

I cannot resist the temptation to misquote Douglas Adams:

“There is a theory which states that if ever anyone discovers exactly what the Republican Party is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable.

There is another theory which states that this has already happened.”

A little noise

Near the start of my career in computer graphics, I came up with some techniques that played with the boundary between order and chaos. It seems that people have found those techniques to be useful.

When it comes to visual stimuli, humans love order, and we also love chaos. But we especially like our signal and noise together, in just the right mix.

I wonder whether this principle can be extended to all human thought. Maybe we look at all things — politics, music, personal relationships — in terms of an optimum balance of order and chaos.

We may not always know how to find that balance, but we can generally feel it when we’ve hit the sweet spot. And perhaps our need to strike that balance drives much of our decision making.

When life feels too chaotic and out of control, we seek order. But when everything seems to be going well, with perhaps a bit too much clockwork precision, we might feel a powerful urge to create some chaos.

Who hasn’t had that urge at one time or another — just to mix it up a bit, to add a little noise?
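For anyone who wants to play with the idea, here is a toy sketch in Python. It is not the original graphics technique, just an illustration of blending an orderly signal with random noise, with the mix ratio as the only knob:

    import math
    import random

    def mixed_signal(t, mix=0.25):
        """Blend order (a sine wave) with chaos (random noise).

        mix = 0.0 is pure order, mix = 1.0 is pure noise;
        somewhere in between is where things get interesting.
        """
        order = math.sin(t)                 # perfectly predictable
        chaos = random.uniform(-1.0, 1.0)   # perfectly unpredictable
        return (1.0 - mix) * order + mix * chaos

    # Sample the blend at a few points: mostly signal, with a little noise.
    print([round(mixed_signal(i * 0.1), 3) for i in range(10)])

Turn the mix all the way down and the result is dull; turn it all the way up and it is mush; the sweet spot lives somewhere in between.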

Ethics toward post-humans

I’ve been thinking about CC’s comments on my recent post about computers and artificial intelligence. And it brings up an interesting question in ethics:

Suppose we had every reason to believe, due to some unforeseen breakthrough in artificial intelligence research, that computers would, in our own lifetime, first reach and then far surpass our own intelligence (and here I mean “intelligence” in the human sense).

Would we have an ethical obligation to teach those emerging entities, to protect them, guide them, help them as they travel along their path? After all, in a very real sense we would be their parents.

Or would we have a greater obligation to ourselves, our own human kind? If we knew that in a few short decades their intelligence would be to ours as our intelligence is to that of a rat, would we try to block their development — or even their very existence?

One reason this is an intriguing question is that humans have come to highly value nature’s experiment in human intelligence. Naturally enough, we see our own intellectual capacity as a kind of pinnacle of evolution. So in one sense we might be inclined to see that experiment go as far as it can.

On the other hand, we might just decide “To hell with this — I’m not going to let my species get replaced by some machine.” That too would be a very human response. 🙂

Self and other

Connections between people are tricky things. If you and I are friends, then who am I to you, and who are you to me?

Clearly my sense of you is vastly different from your experience within your own head. No matter how close we are, you remain — in my universe — a construct, a set of theories about who you might actually be.

Things get even more confusing in the case of a love relationship. There is so much more room for projection in a romantic entanglement, more yearning for the illusion that “you complete me”.

One great thing about merely being friends, rather than lovers, is that you generally don’t need to deal with that extra layer of confusion. Of course, you also miss out on a lot of fun. 🙂

So how can we ever be sure that we know another person — I mean the real other person, not the construct that our own mind creates around them?

Maybe we can’t, and maybe that is what makes it all so interesting.

Crossover

I was delighted to see, at this evening’s Tony Awards, Meredith Willson’s The Music Man finally recognized as an early progenitor of rap music. Hugh Jackman brought out LL Cool J and T.I. to join him in rapping “Rock Island”, the brilliant opening number of Willson’s masterpiece.

The moment was all the sweeter when you consider Willson’s full history. More than ninety years ago he was a member of John Philip Sousa’s band, playing a style of music that, in our collective cultural consciousness, could not be further from the edgy, streetwise milieu of rap. Which makes the achievement of “Rock Island” — first widely heard in 1957 — all the more impressive.

Of course Meredith Willson’s association with edgy modern popular music long predates this year’s Tony Awards. In 1963 — more than half a century ago — the most famous rock band of them all, the Beatles, recorded “Till There Was You”, also from The Music Man.

We have always had a vague sense in American culture that rap is the successor to rock and roll. It’s fascinating that Willson’s music has managed to connect the two, more than half a century apart, after having first emerged out of the era of silent movies.

And there is at least one more connection here between rap and the era of classic rock: In order to use “Rock Island” in this evening’s Tony Awards broadcast, the show’s producers would have needed permission from the person who has long held the rights to Meredith Willson’s entire catalog.

That would be none other than Sir Paul McCartney.