Attention versus impact

Certain events, such as the Academy Awards or the World Cup games, attract an almost insane amount of focus from the world. Even randomly weird and relatively meaningless events can captivate the attention of millions, like a suggestive dance between Miley Cyrus and a large foam rubber finger.

Yet every day significant things happen that will have a long term impact on all of our lives but that somehow pass below our collective radar, lost in all the noise. Such influential events can take many forms: A law enacted, a disease cured, a more lethal handgun perfected.

If there were some way to spot such game-changing events early on, surely that would be a good thing.

If we look back over the years, the wisdom of hindsight can sometimes allow us to see such long term impacts with greater clarity. For example, it didn’t seem to occur to anybody in the 1950s that the massive expansion of roadways out of New York City by Robert Moses would result in entire industries moving out of the city, leading to the city’s economic collapse by the 1970s.

Perhaps it would be interesting to chart, going back in time, what sorts of events had a particularly high “long term impact” versus “initial attention” ratio. That might make it easier to learn what to look for while such events are occurring, rather than discovering their import only years later.

Movies 2.0

Continuing the thought from yesterday, we don’t need to wait 100 years to see a sensory evolution of the protagonist driven linear narrative.

Technologies are already emerging that allow movies to be seen from many different angles. For example, Total Cinema 360 develops software for shooting a movie using the same “see in all directions” camera that Google uses for Google Street View. Viewers can then put on an Oculus Rift and look around to see the movie in any direction.

Some computer games are a bit like movies with a user controllable camera. But games are usually more about making choices to affect the outcome than about conveying a traditional linear narrative. Probably because of this focus, the “acting” by non-player characters generally leaves much to be desired.

But game-related technology can be used another way. Suppose we just want to make a movie that can be wandered through — observed from any location and angle. Even today we can use motion capture and 3D graphical modeling, animation and rendering to create all the digital assets that would be needed to make such an immersive movie. Using emerging technologies like the newest version of the Microsoft Kinect, motion capture doesn’t even need to be prohibitively expensive.

But this is where we get to something that is not quite a movie as we know it: If the viewer can wander around the room and see things from any angle (as in immersive theatre pieces like “Tamara”, “Tony and Tina’s Wedding” and “Sleep No More”) then many of the traditional means of subliminal signaling used by filmmakers would no longer work.

The creators of such “immersive film worlds” cannot use many of the traditional filmmaker’s techniques for creating subjective experiences: The interplay between establishing shots, two-shots and close-ups, the choice of lens power and depth of focus, placing key and fill lights for a particular shot, and so forth.

New and different techniques will need to be developed, which do not rely on camera placement. Over time these new techniques will mature and evolve, and then we will truly have a new medium — Movies 2.0.

After movies

The progression from novel to movie is not really paralleled by anything in interactive media. To say that “Just as we moved from words to images, as the novel gave way to the film, now we are moving to interactivity as the film gives way to the computer game” doesn’t quite sit right.

It’s not that I think of games as a lesser medium. Quite the contrary. Computer games are glorious and exciting in their vast possibility, and they are still in their infancy. No, that’s not it.

It’s more that the progression from page to screen is within the long tradition of protagonist driven linear narrative, and I don’t think that’s going to be replaced. Linear narrative seems to emerge from how our minds work, and it is how we have always told our stories of emotional truth.

And it’s not just novels and films that work this way. The theatre can be thought of as a kind of hybrid of novel and film. It privileges words the way a novel does, yet like cinema it also privileges the visceral quality of physical human presence.

So I am wondering what will be the future of the protagonist driven linear narrative — a form that has existed in human history for as far back as we can see, and that shows no signs of going away. What will it be like in, say, a century from now?

Will it be some form of immersive holodeck, in which we find ourselves seemingly co-present with the characters of a compelling story — seeing what they see, hearing what they hear, touching what they touch?

Or will it be something even beyond that — a direct transposition of their most subtle and fleeting thoughts and emotions onto our own brain, as though these thoughts and feelings were our own, emerging from within the core of our being?

The garden of pure ideology

It’s interesting to think back, from a distance of thirty years, on the once iconic quote from 1984 that I posted yesterday (I changed only one word). Obviously the people who wrote those words were deliberately echoing George Orwell, and riffing on the significance of the year 1984.

But those were more innocent times. The Web was still a good decade away, and few could have predicted that a clever ad for a personal computer — sold as a symbol of personal choice and an icon of freedom from the hegemony of Corporate America — would actually prefigure a very different future.

In the wake of the Snowden revelations, we are all reassessing that dream. In this country, conservatives tend to mistrust power in the hands of government, and liberals tend to mistrust power in the hands of corporations. But now we all have common cause — there is plenty of mistrust to go around. Somebody has our data, and we’re trying to figure out just how scary that is.

The idea that more technology is better is indeed, as that ad from thirty years ago put it, a garden of pure ideology. Alas, we don’t always get to decide what grows in the garden.

Happy birthday you-know-who

“Today, we celebrate the thirtieth glorious anniversary of the Information Purification Directives. We have created, for the first time in all history, a garden of pure ideology — where each worker may bloom, secure from the pests purveying contradictory truths. Our Unification of Thoughts is more powerful a weapon than any fleet or army on earth. We are one people, with one will, one resolve, one cause. Our enemies shall talk themselves to death, and we will bury them with their own confusion. We shall prevail!”

Naming things

The last two posts have gotten me thinking about our need to name things.

I guess it’s logical that humans need to put a name on something before they can see it as something of value. After all, for the last several hundred thousand years our species has been developing this astonishing facility with language. Our particular way with words seems to be unique among nature’s wonders, at least as far as we know.

And yet there is a contradiction running through the very core of this way of valuing things. After all, what do we truly value, when it comes right down to it? Here are some things that come to mind: Friendship, our children, courage in the face of danger, our attraction to our chosen partner.

None of these things require words. In fact, there is every reason to believe that we evolved these emotional traits long before we evolved our highly developed sense of language. After all, these are traits we share with other species.

And therein is one of the great contradictions of being human: We are creatures obsessed with naming things. In fact we fill our lives with constant chatter. Yet the things we value most in our hearts are beyond words.

The privatization of language

The provisional trademarking of the word “Candy” for computer games might seem silly, but it’s actually deadly serious. Buried amidst its claim to exclusive use of the word for recording equipment, computer games and every conceivable type of clothing (an astonishingly exhaustive list), the King Limited trademark also lists this:

“Educational services, namely, conducting classes, seminars, workshops in the field of computers, computer games; Training in the field of computers”

Just as I did in 2010, and Vi Hart did before that, many people have used the iconography of candy in on-line learning experiences that get kids interested in math and computation. If this trademark goes through, such socially positive uses of the word would be prohibited.

But the implications are far larger than one word. Such a trademark would create legal precedent for a lexical land grab. Corporations could trademark equally broad usage of any word — mom, dad, friendship, love — effectively turning these words into private property.

Among the many reasons that’s a bad idea, here’s one: Once the line is crossed to prevent the free use of common positive words for educational purposes, it is the beginning of the end of what people like Vi and me do — helping to make learning more fun and enjoyable. Think about that the next time you play Candy Crush Saga.

Amazingly, the right to use our own language is being taken away from us. And it’s all happening so easily.

Like taking candy from a baby.

Crush saga

Today a friend told me that a game company has managed to get the European Union trademark for a very common word in the English language, when that word is used in the title of a game or article of clothing.

This is significant for those of us on the other side of the pond because the U.S., which generally honors E.U. trademark decisions, has issued a provisional trademark for such uses of this word.

And the lawyers at Apple Inc. are already enforcing that decision, since Apple distributes the game on its devices.

I probably shouldn’t use the word here, because some clever lawyer at Apple could plausibly imply that a “blog” is a kind of “game”, and that therefore I would be in violation of trademark laws. Remember, these are the people who recently proved in a court of law that Myron Krueger’s use of pinch-to-zoom in 1983 was actually first invented twenty-five years later by Apple Inc. But that’s another crush saga.

So here is fair warning: If you are foolish enough to use that word in the title of a game you write or an article of clothing you sell, thinking that somehow you get a free pass because you played Candyland as a kid or like to wear candystripe pants, then you’ve got another think coming.

But what I can do is provide some alternate titles for frustrated game designers who would like a good name for their game, but are not allowed to use a certain word that starts with “C”. Herewith some possible titles, none of which have been trademarked, and all of which I would be happy to grant to any and all under an open source Creative Commons license:

Bubble Gum Crush Saga
Butterscotch Crush Saga
Caramel Cream Crush Saga
Cherry Balls Crush Saga
Chocolate Crush Saga
Cinnamon Sticks Crush Saga
Circus Peanuts Crush Saga
Gobstopper Crush Saga
Gumdrop Crush Saga
Gumball Crush Saga
Gummi Bear Crush Saga
Gummi Worm Crush Saga
Jawbreaker Crush Saga
Jellybean Crush Saga
Lemon Drop Crush Saga
Licorice Crush Saga
Lollipop Crush Saga
Marshmallow Crush Saga
Peanut Butter Cup Crush Saga
Salt Water Taffy Crush Saga

All of the above titles are, as far as I know, legally free and clear for the taking. Just in case you’d like to use one as the title of your game. Or your article of clothing.

You are very welcome.

The mysterious drawing

This morning in our lab conference room I had a meeting with my students, during which we never quite looked at the whiteboard at the front of the room.

Only after the meeting did I notice an odd little drawing in the lower left corner of the whiteboard. I have no idea who drew it:

[Photo of the whiteboard drawing]

I found the drawing a bit puzzling. Clearly it’s a picture of a dragon, and drops of water are coming down from a cloud, and the dragon is breathing fire on the drops of water. Beyond that its meaning seemed a mystery.

Suddenly I had a revelation. I called over the students, and pointed to the board excitedly. “Do you know what that is?”

None of them had an answer.

“That,” I explained, much too pleased with myself, “is a dragon drop interface!”

From dust to dust

Our experience of reality is fairly continuous. We see surfaces everywhere, and underneath those surfaces are solid or fluid volumes. It all seems like pretty connected stuff.

But this all breaks down when things get either much bigger than us or much smaller than us. In a neat bit of symmetry, “bigger” and “smaller” in this case both mean about a factor of a billion.

Once you get to the size of a typical star (like our sun), which is about a billion times bigger than we are, the Universe starts to look like a bunch of little specks with a whole lot of empty space between them. This pattern continues up to the largest “things” we know about, galactic superclusters, which are about 10^24 times bigger than we are.

If you look in the other direction, pretty much the same thing happens. Everything seems fairly continuous until you get down to the level of small molecules, which are about a billion times smaller than we are.

Any smaller than that, and everything is little specks inside vast empty spaces, first at the level of atoms within molecules, then nuclei within atoms, and all the way on down to neutrinos, the smallest “things” we can measure, which are about 10^24 times smaller than we are.**

So in a sense we are in the middle of a kind of island. When you look at the entire span of scales in the known Universe, it’s mostly dust to dust, with just a little patch of land right in the middle. That little patch is where we are.


** Strings in string theory can get about 100 billion times smaller than neutrinos, but we have no direct evidence that they exist, let alone any way to measure them.
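For anyone who wants to check the arithmetic, here is a little back-of-the-envelope sketch in Python. The lengths are round-number stand-ins for the ratios described above and in the footnote, not precise measurements:

    # Rough characteristic sizes, in meters. These are round-number
    # stand-ins for the ratios described in the post, not measurements.
    human        = 1e0              # a person, on the order of a meter
    star         = 1e9              # a typical star: about a billion times bigger
    supercluster = 1e24             # galactic superclusters, the largest "things" we know
    molecule     = 1e-9             # small molecules: about a billion times smaller
    neutrino     = 1e-24            # the smallest "things" we can measure
    string       = neutrino / 1e11  # strings: about 100 billion times smaller still

    print(supercluster / human)     # ~1e24: the factor going "up"
    print(human / neutrino)         # ~1e24: the same factor going "down"
    print(human / string)           # ~1e35: but only if strings actually exist

The symmetry described above is just the fact that the first two printed numbers come out the same.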