Math notation and 3D, part 2

Yesterday Stephan’s comment made a valid point about the sense in which mathematical notation is one dimensional. I think the disagreement, if there is one, is about which semantic level to focus on.

There are statements that are certainly true, yet not at all useful in particular contexts. For example, it is certainly true that humans are made of atoms, but that fact doesn’t provide very much insight about why Romeo and Juliet was a tragedy.

Similarly, I think the statement “all mathematical statements can be expressed as a one dimensional string,” while certainly true, is not useful in most contexts. Clearly it is useful at the meta-level, where Gödel’s incompleteness theorem resides.

But when you are using mathematical notation to communicate some concept or relationship to a fellow human being, you are rarely operating on that meta-level. In such cases, which are by far the great majority of cases, you want to optimize for readability and clarity of thought, and your math notation should ideally express how multiple dimensions of ideas interact with each other.

After all, it is certainly true that if I send you a digital photograph of my cat, the transmitted data can be represented by a one dimensional array of pixel values. And as Stephan points out, such a representation is perfectly adequate for performing a Fourier Transform, digital convolution, or various other mathematical operations.

Yet if we insist on keeping things at that level of interpretation, you may never realize that you are looking at a picture of my cat.

Math notation and 3D

I was talking today with a student about mathematics. The student was interested in new ways of using gesture to create mathematical notation.

Since I am interested in augmented reality, I suggested that perhaps we can try to move mathematical notation off of the flat page, and into three dimensions.

The student protested that mathematical notation is actually one dimensional. I could see where he was coming from. After all, everything you can write down in math can be expressed as a single text string — which is indeed one dimensional.

I disagreed, and as an example, I wrote the following formula on the whiteboard:

x² + y²
───────
   2


“You see?” I said. “Sure, this can be expressed as a single text string, but that doesn’t capture what we mean. When we look at this expression, we see something two dimensional: We read the addition on top horizontally, while the relationship between the numerator and the denominator of the division is expressed vertically.”

Then I took a closer look at the expression I had just written. “Wait a second,” I said, “this is actually three dimensional. Each superscripted ‘2’ is really orthogonal to everything else. By making that text smaller and shifted up and to the right, we are essentially conveying the existence of an operation (raising to the power of 2) that is orthogonal to both the addition and the division.”

It dawned on me that when we write in mathematical notation, we are actually packing a lot of dimensions onto that flat page. For example, subscripts represent yet another dimension, orthogonal to all the others. Suddenly, the idea of writing mathematical expressions in three dimensional space seemed a whole lot more interesting.
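Both views can be made concrete with the same expression. As a sketch, using LaTeX purely as one familiar way of flattening notation into a string (the particular formula here is illustrative, not necessarily the one from the whiteboard), the source is a one dimensional character sequence, while the rendered result spreads the division vertically and pushes the exponents into their own visual direction:

```latex
% One-dimensional source string...
\[ \frac{x^2 + y^2}{2} \]
% ...which renders as a two-dimensional fraction, with each
% superscript occupying yet another "orthogonal" visual dimension.
```

The student’s claim holds for the source string; the claim about extra dimensions holds for what the reader actually sees.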

Looking at the stars

I spent much of yesterday talking to people at the MIT Media Lab, and then much of today talking to people at the Bose Corporation. In both places, we spent most of our time talking about possibilities for the future.

Of course those possibilities are not all roses and sunlight. After all, as our access to ever advancing technologies increases, there is a concomitant increase in opportunities for invasions of privacy, identity theft, and other social ills we might not yet even have words for.

As with all technological advancements, with great power comes great responsibility (thank you, Uncle Ben). With that idea firmly in mind, while we did discuss the very real potential pitfalls of advancing communication technology, our conversations mostly steered toward the positive.

People are marvelous and terrible creatures. We humans have so much potential for either good or evil, and things can go very wrong. Yet I find myself siding with that great observer of human nature, Oscar Wilde: “We are all in the gutter, but some of us are looking at the stars.”

Unpacking the travel ban, part 2

Continuing from yesterday — with many thanks to an extremely zealous reader…

The larger problem with Trump’s politically motivated travel ban is not that it denies individuality. The larger problem is that his ludicrous version of “policy” is taking the place of the serious work of screening for very real terrorists.

My message to Trump supporters who enjoy pretending that their Reality Show star hero playing to his base is engaged in some sort of actual policy: Try not to wallow overly long in your own deceptive feeling of self-importance or, worse, self-defined rationality.

Banning the populations of entire countries, which is completely irrelevant to the very real and serious problem of national security, while refusing to engage in the difficult work of screening for truly dangerous individuals, is simply irresponsible.

Distracting everybody with a sideshow while not screening for the actual people who have declared, usually while holding severed heads, that they want to destroy America or some other country is a textbook definition of suicide. Demanding that others join you in the mass suicide is called mass murder.

Unpacking the travel ban

What does it mean to ban travel to our country from entire nations, regardless of individual identity? It means, at core, that there are no individuals. There is no scholar, there is no artist, there is no musician or scientist.

The greatest gift you have as a human being is the ability to tell yourself that your individual life matters, that you have something unique to give, that you are more than simply a tiny dot in a random swarm. Yet it is now the policy of the United States that none of that is true.

Your belief that you are an individual is, apparently, an illusion. There is no promising musician, no aspiring scientist, no brilliant architect, no graceful athlete or brave political dissident. There is only the Puerto Rican, the Somali, the Jew.

Whoever you may have thought you were, we now know that you are not an individual, you are merely a faceless and undifferentiated member of a tribe. The world will no longer make the mistake of conferring upon you the dignity of individuality.

Or at least, as of yesterday, that is the official policy of the United States of America.

The beer comes to you

When the Web first came out in 1993, very few people could have predicted Google, Facebook, YouTube or Wikipedia. The Web’s radically different means of distribution of information changed not only the answers to how we communicate, but the very questions.

Similarly, when the SmartPhone arrived in 2007, very few people could have predicted Lyft or Airbnb, Instagram or SnapChat. The SmartPhone changed digital usage patterns so radically, obsoleted so many long-held assumptions about information scarcity, that the very economy itself was transformed.

Soon, when wearables become ubiquitous, perhaps four years from now, something analogous will transpire. But this change won’t just transform our information world — it will transform our physical world.

Once we have been freed from the tyranny of the screen, our digitally augmented interactions with that physical world will change radically. After we have stopped peering into display screens, we will once again focus our attention on our actual surroundings. The Internet of Things will have begun in earnest.

Pretty much everything around us will become robotically actuated, somewhat the way automobiles are already becoming robots. We will take it for granted that furniture will arrange itself at our bidding, that the lights and sound in our houses will adjust dynamically.

Our children will look back with amusement on those bygone days when their parents needed to leave the living room, go into the kitchen, and open a refrigerator door, just to have a beer — in that long ago time before unobtrusive robots carried such objects around for us.

After all, our children will be living in an age when such things are no longer even thought about. It will all seem so obvious to them: You don’t go to the beer — the beer comes to you.

A dog year is the square root of a computer year

Today at our lab we were comparing the performance of the Samsung S8 phone to the Galaxy Note 4. As it happens, my first SmartPhone was a Galaxy Note 4.

I only got it because it worked with the newly emerging GearVR, and in the second half of 2014 I wanted our lab to do untethered shared virtual reality. Back then, sticking OptiTrack motion capture markers on GearVRs was the easiest way.

Now, three years later, we are still using the GearVR for this, except the OptiTrack markers have been replaced by Vive trackers (available only in the last few months), which are a lot less expensive. Of course we’ve upgraded to the latest model SmartPhone — the S8.

The S8 is vastly faster than the Note 4 was. Which means everything works a lot better — graphics rendering, character movement, position tracking — all of the qualities that make for a good and immersive shared VR experience.

Today I was trying to convey to my students just how quickly all of this technology is advancing, thanks to good old Moore’s Law. “You know how a year is around seven dog years?” I said.

The students nodded. They all knew about that.

“Well,” I explained, “a dog year is basically around seven computer years. Which means that a year is around forty-nine computer years.”

“In other words,” I continued, trying to put it into terms that would resonate with computer science students, “a dog year is the square root of a computer year.”
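The arithmetic behind the joke can be sketched in a few lines (the factor of seven per year is of course a playful exaggeration riffing on Moore’s Law, not a measured rate):

```python
import math

# Playful aging-rate arithmetic (illustrative numbers, not measurements).
DOG_YEARS_PER_CALENDAR_YEAR = 7   # one calendar year ~ seven dog years
COMPUTER_YEARS_PER_DOG_YEAR = 7   # one dog year ~ seven computer years

# Chain the two rates: one calendar year ~ forty-nine computer years.
computer_years_per_calendar_year = (
    DOG_YEARS_PER_CALENDAR_YEAR * COMPUTER_YEARS_PER_DOG_YEAR
)
print(computer_years_per_calendar_year)  # prints 49

# The punchline: relative to a calendar year, the dog-year factor
# is the square root of the computer-year factor.
assert DOG_YEARS_PER_CALENDAR_YEAR == math.sqrt(computer_years_per_calendar_year)
```

The “square root” line is really a statement about the two rates measured against the same calendar year: 7 is the square root of 49.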

Vichy revisited

During World War II, the collaborateurs in the Vichy government held positions of power, while members of the French Resistance were hunted down. Yet after the war, the situation reversed. If you had participated in the Vichy government in any way, the best path to salvaging your reputation lay in convincing people that you had secretly belonged to the Resistance.

I see a rough parallel emerging in America today. People who are currently collaborating in the process of callously dismantling so much that is beautiful and kind and noble about this country will one day claim that they had actually been part of the resistance. They will quite likely protest that they were resisting from the inside.

Yet there are a few individuals fortunate enough to have indisputable evidence — right now — that they remained patriots through these dark times. Those individuals will be able to prove that they stayed true to our nation’s ideals, and had had the presence of mind to understand what the American flag really stands for when our nation is at its best.

Currently there is one certain way that such patriotic individual Americans can be identified, because (conveniently enough) they are being identified by name. Yesterday Stephen Curry became one of those individuals.

Everything is new

I was talking to an old friend this evening who, like me, teaches at New York University. We were comparing notes about the incredible energy and enthusiasm among our students.

She was telling me how disheartening it is to see that people who are somewhat older don’t seem to have that energy and fire. It really does seem to be a trait of the young.

I told her that it was exactly this quality of younger people that makes me love teaching. But somehow that didn’t seem like a complete answer. Why, I wondered to myself, do younger people have this quality?

And then I had it. “When you are young,” I told my friend, “you understand that everything is new.” She agreed completely.

After we said goodnight, I kept thinking about our conversation. The challenge, I found myself thinking, is to maintain that understanding as one gets older — the ability to see that everything is new.

Dinner with Shakespeare

Suppose you were given the opportunity, through some metaphysical event, to have dinner with William Shakespeare. Would you?

I know many people who would say, without even needing to think about it, “Hell yes!” After all, a conversation with the greatest writer in the history of the English language is a sort of dream come true.

But what if Shakespeare, the actual person, were to end up deviating dramatically from Shakespeare the written presence? What if he should turn out to be boorish — or even worse, boring?

Would you rather know this, or would you regret having discovered such an uncomfortable truth? Would you end up wishing that your dreams of greatness had remained forever unsullied?

I don’t think there is any right or wrong answer to this question. Yet I suspect that how you choose to answer this question might say a lot about your particular view of reality.