Being the dragon

It isn’t until near the very end of “A Game of Thrones”, the first book of George R. R. Martin’s “A Song of Ice and Fire” series, that honest-to-goodness dragons appear. Before that, it seems to be more or less a story about some rather dysfunctional medieval kingdoms, and maybe bad weather.

But then the dragons show up, and suddenly it’s all a lot less Barbara Tuchman and a lot more J. R. R. Tolkien.

Today I looked around at the shaders people have been writing with WebGL. Things are kind of boring, until you get to the shaders that mix it up with procedural noise. And then suddenly the visual results get really interesting.

I looked at various people’s shader code for noise, and to my surprise I saw copyright notices with assorted people’s names — both for the original noise function and for simplex noise. Which is weird, because both of those are my algorithms. So I’m not really sure exactly what those copyrights are for.

In any case, it feels as though shaders with noise functions are like those dragons in “A Game of Thrones”. When you see them, things become a lot less like an academic exercise and a lot more about cool shapes and images.

Except in this case, in a weird way, it seems that I am the dragon!
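For readers curious what a noise function even looks like, here is a toy one-dimensional value-noise sketch in Python. It is only a simplified illustration of the basic idea (pseudo-random values at lattice points, smoothly interpolated between them), not the actual gradient-noise or simplex-noise algorithms; the hash constants here are arbitrary choices for demonstration:

```python
import math

def fade(t):
    # The smooth interpolant 6t^5 - 15t^4 + 10t^3, which has zero
    # first and second derivatives at t = 0 and t = 1.
    return t * t * t * (t * (t * 6 - 15) + 10)

def lattice_value(i):
    # Hypothetical integer hash producing a repeatable
    # pseudo-random value in [-1, 1] at lattice point i.
    i = (i * 2654435761) & 0xFFFFFFFF
    return (i % 2001) / 1000.0 - 1.0

def noise1d(x):
    # Interpolate between the pseudo-random values at the two
    # integer lattice points surrounding x.
    i = math.floor(x)
    t = x - i
    a, b = lattice_value(i), lattice_value(i + 1)
    return a + fade(t) * (b - a)
```

In a fragment shader the same idea runs per pixel, usually in two or three dimensions, which is what produces those organic-looking clouds, marble and fire.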

Much faster

I’ve been playing around with WebGL. To many people reading this, the word “WebGL” might not mean very much. But to some it is a very big deal.

You see, computers have been getting faster at an exponential rate, a phenomenon commonly known as “Moore’s Law” (strictly speaking, an observation about transistor counts, but the practical effect is the same). So computations which were out of reach only a few years ago, because they were just too gosh darned slow, one day become easy, and soon after that you can do them in real time.

The general-purpose processor on your computer can do a lot of different things, but that very generality means it can only go so fast. It’s not allowed to cut corners, because it has to be general.

But those little graphics processor chips that also come with your computer have no such responsibility. They don’t need to run an operating system, or a file system, or support your text editor or spreadsheet. All they need to do is make graphics happen blazingly fast. And that means they can indeed cut corners. As a result, they can do certain calculations hundreds of times faster than your computer’s main processor.

WebGL is a standard that lets you access all that raw power directly from your Web browser. It’s not yet supported by all Web browsers, but it will be.

Which means that I and others who make cool graphics things can, right in your Web browser, show you — and let you play with — stuff that is far cooler and more intricate than anything you’ve ever seen before.

I’m going to start posting examples of this stuff soon, as soon as I make something I’m really happy with.

What’s in a name?

Rod Brooks, the great robotics pioneer and innovator, founded the company iRobot, which created the Roomba, thereby introducing household robotics into many homes. More recently, Brooks founded another company, Rethink Robotics, which introduced Baxter, a friendly, low-cost, general-purpose robotic factory worker with some built-in common sense.

Somewhere Karel Čapek is looking bemused, while Isaac Asimov is wondering whether he can still sue.

But where did the name “Baxter” come from? I have a theory.

In 1962 Hanna-Barbera first aired the cartoon “The Jetsons”. As many of you know, it was a vision of a future where family cars had been replaced by flying saucers, meals could be created at the touch of a button, and pretty much all of our techno-fantasies had come true. The joke was that nothing had really changed: Our hero, George Jetson, was just as much the put-upon everyman as his predecessor, Hanna-Barbera’s even more popular everyman Fred Flintstone.

One of the most popular characters on “The Jetsons” was the robotic maid Rosie, a working-class robot with an accent straight out of Brooklyn. The wise-cracking Rosie, who referred to her employer as “Mr. J”, never let George Jetson get the upper hand. While technically she worked for him, she always made sure her upwardly striving white-collar boss knew that she was several steps ahead of him.

Hanna-Barbera had a habit of riffing off and borrowing from whatever was popular in the contemporary culture. In this case the borrowing was from a very popular sitcom that had premiered the previous year. “Hazel” starred the brilliant Academy Award-winning Shirley Booth. She played a working-class maid (complete with Brooklyn accent) in an upper-middle-class household. Hazel was always several steps ahead of her employer, whom she always referred to as “Mr. B”.

This was all when Rod Brooks was around fifteen years old, a very impressionable age for a young roboticist.

As it happened, Hazel’s upwardly striving white-collar boss also had the first name George.

And his last name was Baxter. You do the math.

Warm / cold

In popular culture we often find duos of men who differ from each other in a very specific way: One of them is a romantic who fundamentally sees the world through a lens of warmth and emotion, and the other is a cold-eyed realist, who looks at things more cynically.

Some examples of this pairing (among many) are James T. Kirk and Mr. Spock, Lou Costello and Bud Abbott, Stan Laurel and Oliver Hardy, Martin Luther King and Malcolm X, Paul McCartney and John Lennon, John Watson and Sherlock Holmes, Jean Valjean and Inspector Javert.

I find it striking how often this particular trope shows up — the list above could go on and on. Yet I can’t think of nearly as many examples where the two are women. There’s Mary Beth Lacey and Christine Cagney. Or maybe Andrea Sachs and Miranda Priestly, but now I’m reaching.

Why the apparent gender disparity? Hmm.

Six plays

This evening I went to see six short one-act plays, all new original works by young playwrights.

The results were all over the map. One was pure schtick, another turned out to be a very entertaining joke with an unexpected punchline. A third was a wry and knowing examination of the complexities of friendship.

Yet another was a full frontal assault on the very concept of narrative theatre, which pretty much took a metaphorical Uzi to the unities of Aristotle, and blasted the hell out of them.

What was wonderful was that the whole thing was even happening — new works written, directed and acted by young people in New York, in a tiny but well maintained theatre space at affordable prices.

The odd thing is that earlier in the day I had seen one of the hottest tickets on Broadway — an SRO showing of a wildly expensive, sumptuously produced, perfectly executed juggernaut of musical theatre. The big show aimed very high, and delivered on all its promises.

But a part of me, against all economic reason, liked the little fledgling experimental one-acts better.

Plato’s caveat

The acquisition of the venerable Washington Post by Jeff Bezos seems like one more sign that the internet is devouring printed media. Where will it all lead?

In the “Phaedrus”, Plato took on the voice of his teacher Socrates to deride the rise of the written word. His ersatz Socrates argued that the ability to write everything down would be the death of memory, of direct human transmission of culture and wisdom, of personal intellectual responsibility.

Plato was being coy, for he knew full well that he was using the written word to make these arguments. They have, in fact, come down to us thanks to the ability of written language to transmit human culture across a span of centuries.

One day, sometime in the future, people may laugh at the absurdity of fretting over the death of print. “How silly people were to worry,” they might say, as they gaze into a re-constructed past through implanted lenses. “We wouldn’t even know about any of this, if it weren’t for the internet!”

Pun Wars

Last week I was with my friend David, who likes bad puns (something we have in common). We were discussing space flight and the possibility of colonizing Mars, and I pointed out a certain irony.

At which point David said “That makes sense, because Mars is irony.”

Which is true — Mars is red because it contains iron. But it was also, as those of you who pun will know, a signal that our conversation had left the realm of substantive discussion, and had entered a Pun War — a strange sort of combat in which all that matters is how outrageous a pun you can make. The contest generally continues until somebody comes up with a pun that can’t be topped.

“I was merely testing your mettle,” I said.

“How elementary!” he replied. “Maybe we should table these puns.”

“I’m willing to do that,” I said, “for a small fee.”

At which point he gave in, for David recognized that “Fe” is the symbol for iron in the periodic table of the elements.

The next day I was relating this episode to my friends Craig and Lisa, and their fifteen-year-old daughter Dana happened to be there. As I got to the punchline I started to worry that perhaps this was all going over Dana’s head.

“Did you understand that?” I asked her.

“Of course I understood that,” she sniffed. “After all, I’m fe-male. And you know what that means, don’t you? I’m Iron Man!”

I happily conceded defeat.

Security

In his comment yesterday, J. Peterson sensibly argued that in a future world where your very appearance is mediated by cyber-technology, someone who means you harm could conceivably hack into the system and alter the way people see you.

I would argue that there is solid historical precedent against this. Society has been more or less tolerant of hacking when the consequences are low. But once the stakes get high enough, various legal, social and technological mechanisms start to kick in. This response is not dependent upon any particular technology or historical era, but is instead a fundamental characteristic of all functioning societies down through the ages.

For example, if you are like most people in Western cultures, the bulk of your wealth is kept in a bank. These days, there is no actual pile of cash representing the size of your bank account. Rather, your money is represented as a set of binary digits in a secure computer account, in a bank that answers to government regulation and oversight.

Both the government and the citizenry are well aware that if our collective bank accounts were successfully hacked, the result would be chaos and perhaps worse. And so safeguards are put into place.

The same thing will happen when your very identity depends upon cyber-security. In theory somebody could hack into the database and erase or modify your perceived identity. But in practice we as a society are not going to let that happen, short of an event that leads to complete social and legal breakdown.

And if that were to happen, we would have bigger things to worry about.

Naked

In a restaurant today I was saying to a friend that if somebody were to walk in the door stark naked, people would probably be very upset. Furthermore, the proprietor would most likely call the cops, and the indecently exposed citizen would be issued a summons and perhaps carted away.

“And yet,” I continued, “a state of being naked is certainly more natural than a state of being clothed. So what is legal and socially accepted is, in this instance, precisely what is not natural.” My friend agreed upon this last point.

I said all this by way of leading up to a conjecture on the future of augmented reality glasses. “There may come a day, after everyone is wearing such glasses, when your appearance, from a social perspective, will be entirely mediated by the eyewear worn by others. After all, if everyone sees you visually transformed in a consistent way, then that transformation effectively becomes clothing.”

“And should this future come to pass, then it might become illegal to go about in public without wearing your AR glasses. To look upon people in their natural state would be considered indecent, an intrusion on their right to privacy. The man who walks around with naked eyes might find himself arrested for disturbing the peace, and perhaps thrown in jail.”

My friend seemed to find this prospect disturbing, but he did not argue against its plausibility.

Inverse search

Sometimes a friend will send me a cool link to a video or animation or other noteworthy object of interest on the Web. Later I may want to show that same thing to somebody else, but sometimes I can’t find it.

It’s true that I could carry around a smartphone, use it to dig through my emails, and either retrieve the link and type it into my new friend’s browser, or else forward the email to them, so they can click on the link.

But why should I need to do that? Why can’t I just type something like “Animated Taiwan deranged 3D characters act out news” into a search window?

It seems to me there should be some sort of inverse search facility, which starts with a URL and converts it into an easy-to-remember phrase. When you type in that phrase, the top search hit is the site you want.
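One way such a facility might work, sketched here in Python: deterministically hash each URL into a short human-friendly phrase, and keep an index mapping phrases back to URLs. The word lists and the naming scheme below are purely hypothetical, just enough to show the idea:

```python
import hashlib

# Hypothetical word lists; a real system would use much larger dictionaries.
ADJECTIVES = ["amber", "brisk", "cobalt", "dusty", "eager", "fuzzy", "gilded", "hollow"]
NOUNS = ["falcon", "garden", "harbor", "island", "jigsaw", "lantern", "meadow", "nebula"]

def url_to_phrase(url: str) -> str:
    # Hash the URL, then use digest bytes to pick a short memorable phrase.
    digest = hashlib.sha256(url.encode("utf-8")).digest()
    adj = ADJECTIVES[digest[0] % len(ADJECTIVES)]
    noun = NOUNS[digest[1] % len(NOUNS)]
    tag = digest[2]  # a small number to reduce collisions
    return f"{adj}-{noun}-{tag}"

def build_index(urls):
    # The "search" side: a lookup table from phrase back to URL.
    return {url_to_phrase(u): u for u in urls}
```

Since the phrase is derived from the URL itself, anyone who remembers the phrase can recover the link, with no central naming authority needed.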

Is that asking too much?