The evolution of evolution

Two interesting, related events happened on this day in history. On November 24, 1859, Charles Darwin published On the Origin of Species.

Then on November 24, 1974, Donald Johanson and Tom Gray discovered “Lucy”, a forerunner of Homo sapiens who lived 3.2 million years ago, and the earliest of our known forebears believed to have used stone tools.

The Origin of Species was, in its way, also an evolution of the human use of tools — in this case, the tool of objective, evidence-based science. The subsequent discovery of Lucy, like many other such discoveries, was solid empirical confirmation of the predictive power of that science.

I think of the publication of Darwin’s book as an event both humbling and exalting. It was humbling because it asked us to set aside cherished myths about our uniqueness in the order of things.

It was exalting because that very humility was a call to evolve into our best selves. We were being asked not to flinch from objective evidence, but rather to embrace it. The courage to embrace truth, wherever that truth may lead, is one of the noblest qualities a human may possess.

Thanksgiving 2017

From the time I was a teenager, I understood that we are a morally compromised country. I suspect we have that in common with many countries.

Our society is built on a legacy of the twin horrors of slavery and genocide. What we invaders from Europe did to non-Europeans was unspeakable — both the non-Europeans who were already living here and the ones we imported to be our pack animals.

But I’d sort of made my peace with it, because the ideal of America was something that made sense to me. This ideal includes a notion of equality: a welcoming attitude toward strangers, and a society where people are treated equally, have a voice in their government, and can believe in a fundamental principle of fairness and opportunity.

We didn’t always manage to achieve that ideal, but it was there as a guiding principle, so I thought it was possible to reach a moral compromise with our nation’s troubled history. As Oscar Wilde once said, “We are all in the gutter, but some of us are looking at the stars.”

Yet the monstrous current inhabitant of the White House has shaken my faith in that compromise. It’s not that he is a venal narcissist, a con man, a vile bully desperate for attention. That’s unfortunate, but it’s not the problem.

It’s more that his policies reflect our ugly past rather than any hopeful future. The vicious attacks upon Blacks, Latinos, immigrants and others who are vulnerable — not so much the ugly words (although those are bad), but the hurtful policies.

The cabinet appointments that all read as somewhere between a bad joke and a deliberate insult, the naked attempt in this current tax bill to line the pockets of the extremely wealthy by hollowing out the middle class, the approach to healthcare that pretty much says “if you’re poor, you die.” The sheer blatant cruelty of it all.

I was able to find a way to live with our blood-stained past by telling myself we are building toward a meaningful present and future. But what does Thanksgiving mean if we are moving toward defining ourselves as a nation of cruel and selfish monsters?

A taxing situation

Vladimir Putin told the Orange One that he had not interfered in our elections. Despite quite a bit of evidence to the contrary amassed by our own government, that was good enough for you-know-who.

More recently, Roy Moore has been denying the various allegations against him of sexually assaulting teenagers and children, despite quite a bit of evidence to the contrary. Similarly, his denials seem to be good enough for you-know-who.

After all, why would anybody who was actually guilty of a crime insist on maintaining their innocence? That would not be logical.

I can sort of see the point. Once you start down the road of assuming that people who insist on their innocence might have something to hide, all sorts of mischief could follow. They might even be required to turn over their tax returns.

Algorithms are fractals

Today I decided to explain to my class a simple technique I use in my research. I had worried that it would be too simple, and that there would then be lots of time left at the end of the class.

Well, it didn’t work out that way. The “simple” technique turned out to be composed of other techniques that I had forgotten I’d implemented. And each of those techniques had subtleties of its own that I had forgotten about.

By the time we were done, we had used up two solid hours of explanations, discussions, examples, math equations, source code and whiteboard drawings. What had seemed like a simple topic had turned out to be a deep dive into software design, algorithms, aesthetics, data structures and GPU techniques.

Maybe the easiest way to think of it is that algorithms are fractals. I wonder now why I ever thought such a thing would be simple to explain. On the positive side, we sure had fun.
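The post doesn’t say which technique the class was about, so purely as an illustration of that “fractal” structure, here is a minimal sketch in Python (the names value_noise, fade and hash_int are hypothetical, not taken from the class): even a tiny procedural-noise routine turns out to contain an integer hash, a smoothing curve and an interpolation, each with subtleties of its own.

    import math

    def hash_int(i):
        # Sub-technique 1: integer hashing, so each lattice point gets a
        # repeatable pseudo-random value in [0, 1]. The particular constants
        # matter more than they appear to.
        i = (i ^ 61) ^ (i >> 16)
        i = (i * 9) & 0xFFFFFFFF
        i = i ^ (i >> 4)
        i = (i * 0x27D4EB2D) & 0xFFFFFFFF
        return (i ^ (i >> 15)) / 0xFFFFFFFF

    def fade(t):
        # Sub-technique 2: a smoothing curve whose derivative is zero at both
        # endpoints, so adjacent cells join without visible creases.
        return t * t * (3 - 2 * t)

    def lerp(a, b, t):
        # Sub-technique 3: linear interpolation, the workhorse hiding inside
        # almost everything else.
        return a + t * (b - a)

    def value_noise(x):
        # The seemingly simple technique: blend the hashed values at the two
        # nearest lattice points, weighting the blend with the fade curve.
        i = math.floor(x)
        return lerp(hash_int(i), hash_int(i + 1), fade(x - i))

    print(value_noise(3.7))

Each of those helpers can be unpacked further (better hashes, higher-order fade curves, vectorized or GPU versions), which is exactly the fractal quality described above.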

Future phones

This weekend I was hanging out with a group of friends, and we were talking about the disruptive social effects of smartphones. People seem to find those colorful little rectangles so compelling these days that it’s hard to spend time with anybody without continual interruption.

One of my friends took out his phone and showed us his new strategy for cutting down on smartphone use. “I’ve switched it to black and white mode,” he explained. “Everything I see on the screen is now shades of gray. So I don’t have the experience of bright colorful images continually luring me to look at the screen.”

He told us that this strategy was very effective. He noticed that since going to black and white, his smartphone usage had gone down quite a bit. He still uses it for essential things, but not for random moments of diversion.

That gave me an idea. “Imagine,” I told everyone, “some future phone technology designed to minimize distraction. Unlike today’s phones with their clunky screen displays, this future phone would use audio only.”

Everybody stopped to think about this for a moment. Then they burst into laughter.

IP Gerrymandering

This week I was discussing Apple’s patent on pinch-to-zoom with a colleague. I remarked on how surprised I was that Apple was able to successfully sue Google for infringing that patent with the Android operating system, given that pinch-to-zoom was invented by Myron Krueger around 1972.

My colleague pointed out that Apple’s patent was more clever than that. They knew they couldn’t patent pinch-to-zoom itself, so instead they patented the use of any data structure within a computer program that supports multitouch gestures like pinch-to-zoom.

Since Google’s software (which the court could examine) used such a data structure, Apple was able to successfully claim that it fell under that patent. If Google had implemented pinch-to-zoom without the use of a specific data structure to support it, Apple couldn’t have successfully sued them.
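As a side note, it may help to see what “a data structure that supports pinch-to-zoom” can mean in practice. Here is a hedged sketch in Python; the names GestureState and TouchPoint are made up for illustration, and this is neither Apple’s claimed structure nor Android’s actual code. The structure is simply the set of active touches plus the finger spacing recorded when the pinch began.

    from dataclasses import dataclass, field

    @dataclass
    class TouchPoint:
        touch_id: int   # which finger this is
        x: float
        y: float

    @dataclass
    class GestureState:
        # The data structure: all currently active touches, plus the finger
        # spacing recorded at the moment a second finger went down.
        touches: dict = field(default_factory=dict)   # touch_id -> TouchPoint
        start_distance: float = 0.0
        zoom: float = 1.0

        def _distance(self):
            # Distance between the first two active fingers.
            a, b = list(self.touches.values())[:2]
            return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

        def touch_down(self, touch_id, x, y):
            self.touches[touch_id] = TouchPoint(touch_id, x, y)
            if len(self.touches) == 2:
                self.start_distance = self._distance()

        def touch_move(self, touch_id, x, y):
            if touch_id in self.touches:
                self.touches[touch_id] = TouchPoint(touch_id, x, y)
            if len(self.touches) >= 2 and self.start_distance > 0:
                # Pinch-to-zoom: the zoom factor tracks the ratio of the
                # current finger spacing to the spacing when the pinch began.
                self.zoom = self._distance() / self.start_distance

        def touch_up(self, touch_id):
            self.touches.pop(touch_id, None)

    # Two fingers land, then spread apart, and the zoom factor doubles.
    g = GestureState()
    g.touch_down(0, 100, 100)
    g.touch_down(1, 200, 100)
    g.touch_move(1, 300, 100)
    print(g.zoom)   # 2.0

The point of the paragraph above is that code organized this way, with an explicit structure tracking the touches, was what fell under the patent, not the pinch gesture itself.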

This highlights the differences between inventing and patenting. An invention creates new possible intellectual property, whereas a patent is a claim of ownership of intellectual property — not the same thing at all.

To use an analogy with land, invention is discovering new territory, whereas patenting is claiming where the property lines should go. If you’re a really talented lawyer, you can carve up the property lines in previously discovered intellectual territory in ways that nobody ever thought of before.

You’re not actually discovering new territory, you’re just putting fences in clever places. You are creating new property for yourself right smack in the middle of a parcel of land that somebody else thought was theirs.

It’s kind of like the way gerrymandering works in politics. Even if your opponent has more votes, you can still win simply by redrawing the boundaries between districts.

A history of failed film techniques

Every medium has techniques that are understood to work for its intended audience. Film, for example, has adopted quite a few conventions that have been shown to be effective in support of clear storytelling.

We now know what those are. They include establishing shots, two shots, close-ups, cut on action and the 180° rule, among many others.

In the earliest days of filmmaking, not all of those conventions had been worked out. It wasn’t so much a question of any technical limitation as of understanding what works for human viewers.

After all, the set of all possible movies is incredibly vast — it consists of anything you can capture with a camera, edit together, and show on a screen. Yet the set of movies that can actually be comprehended by human beings is a relatively tiny subset of this much larger set.

With that in mind, it would be interesting to compile a list of techniques that filmmakers tried which ultimately failed. An obvious example would be films which broke the 180° rule: an edit which moves the camera position to the other side of the actors.

This kind of cut doesn’t work because the “screen right” direction in the first shot becomes the “screen left” direction in the second shot. When you do that, audiences lose track of which way the actors are facing, and they become disoriented.

I’ve tried to search online for a history of failed film techniques: that is, attempts to add to the cinematic vocabulary that failed not because of any technical difficulty, but because the minds of human viewers would simply reject them or respond with confusion.

Does anybody know where such a list might exist?

That guy

The day before yesterday I was having a conversation with some of my students at NYU about why people go out to the movies. We focused on the whole tribal aspect of it.

When you are in a movie theater, surrounded by other people, most of those people are strangers. Yet you still feel a sense of being in a tribe, and your sense of immersion is amplified by that feeling.

It’s a powerful primal feeling, and you get the same boost when you go out to see live theater, or to a concert, or to a football game. As an audience, we all collectively manage to heighten the sense of emotional involvement for each other.

I noted, for completeness, that this isn’t always how people see movies in a movie theater. “Sometimes,” I pointed out, “when a big movie mogul is screening a film, it’s just an empty theater, and he or she is the only one watching.”

Of course that is a statistically rare occurrence. I mean, who is that guy? Have you ever met him? We decided it wasn’t really a point worth dwelling on.

Then today I visited Lucasfilm/ILM to give a talk and discuss possible research collaboration. The first thing my hosts did, before having me meet with anybody, was sit me down in a big empty movie theater.

They directed me to the center of row H, because that is the best seat in the house. For the next 38 minutes, all by myself, I watched Lucasfilm/ILM movie trailers and special effects reels on the big screen, which was a totally awesome experience.

I suddenly realized that, at least today, I was that guy.

Nostalgia

This week I was talking to a student I had just met, and we were happy to discover that we had both been born in New York City. She grew up in the East Village, so we reminisced about how radically that part of the city has changed over the years.

It was so good to meet a fellow native New Yorker, and to realize we had that in common. But it was even better to do what proud New Yorkers always do — complain about New York.

We groused about the way the true character of the East Village has eroded away over time, replaced by something more commercial.

“St. Marks has never been the same,” I said, “after they opened the Gap Store.”

“Wait,” she said, “there’s never been a Gap Store on St. Marks. I would have remembered.”

“I am sure there was a Gap Store,” I insisted. “It’s closed now, but I remember when it first opened, we all thought that was the beginning of the end.”

So we went to the Internet. Sure enough, there was indeed a Gap Store on St. Marks. It opened in 1988.

“That,” the student said, “was before I was born.”

Ah.