Now and Then

I have been listening to the Beatles song “Now and Then” periodically throughout the day. It’s a sad song, but it makes me incredibly happy. Every time I hear it, it seems to get even better.

I noticed that there are quite a few online commentators trashing this song. It seems that some people don’t like the fact that cutting-edge technology was used to help bring it to life.

But when didn’t the Beatles use cutting-edge technology to create their music? Wasn’t that an integral part of their identity?

Pay attention, people.

Visual music

This evening I modified my extended reality piano program to create rising notes. Every time you play a note on the keyboard, a shape emerges from that key, and rises up into the air.

I first tried black and white shapes, but those were kind of boring, so now I am using rainbow-colored shapes. I am making the black-key notes darker, but the general scheme is red, orange, yellow, green, blue, indigo, violet for the notes C, D, E, F, G, A, B.

I suppose I could have started the rainbow with A instead of C, but in my mind the scale really begins with C. I can always change it.
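For anyone curious what that mapping might look like in code, here is a minimal sketch in TypeScript. It is just an illustration of the scheme described above, not the actual code from my piano program, and the specific RGB values and the blend-and-darken rule for the black keys are my own guesses.

```typescript
// Illustrative sketch of the note-to-color scheme described above.
// Pitch classes: 0 = C, 2 = D, 4 = E, 5 = F, 7 = G, 9 = A, 11 = B are
// the white keys; the values in between are the black keys.

type RGB = [number, number, number];

// Rainbow colors for the white keys C, D, E, F, G, A, B.
// (The RGB values for indigo and violet are rough approximations.)
const WHITE_KEY_COLORS: Record<number, RGB> = {
  0:  [1.0, 0.0, 0.0],  // C -> red
  2:  [1.0, 0.5, 0.0],  // D -> orange
  4:  [1.0, 1.0, 0.0],  // E -> yellow
  5:  [0.0, 1.0, 0.0],  // F -> green
  7:  [0.0, 0.0, 1.0],  // G -> blue
  9:  [0.3, 0.0, 0.5],  // A -> indigo
  11: [0.6, 0.0, 0.8],  // B -> violet
};

// For a black key, blend its two white-key neighbors and darken the
// result, so shapes rising from the black keys read as darker.
function noteColor(midiNote: number): RGB {
  const pc = midiNote % 12;
  const white = WHITE_KEY_COLORS[pc];
  if (white) return white;
  const below = WHITE_KEY_COLORS[pc - 1];
  const above = WHITE_KEY_COLORS[(pc + 1) % 12];
  const darken = 0.5;
  return [0, 1, 2].map(i => darken * 0.5 * (below[i] + above[i])) as RGB;
}

// Example: middle C (MIDI 60) is red, C sharp (61) a darker red-orange.
console.log(noteColor(60), noteColor(61));
```

Starting the rainbow at A instead of C would just mean rotating that little table.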

This is just a first step. What I really want to do is make multicolored butterflies rise up out of the keyboard as you play. At that point I think it will really start to feel like visual music.

Virtual sculpture

I am starting to build sculptures in XR. You can’t see them unless you are wearing your XR specs. But if you are wearing them, they seem perfectly real, like any other object in the room.

But there is a difference. These sculptures are dynamic. They can change their shape over time.

It’s a kind of hybrid art form — sculpture, but also animation — totally crossing the lines between genres. I wonder how many other categories of artistic expression will start to become jumbled and recombined, as XR becomes a part of our everyday reality.

Pet dragon

I am starting to get used to XR in my everyday life, thanks to my handy-dandy Meta Quest 3. So I can finally fulfill a childhood dream — having my own pet dragon.

I’m not talking here about one of those big scary critters. I’m envisioning a little fellow who can fly around and keep me company as I cook breakfast or putter around the apartment.

Unfortunately, you can’t just go out and buy a pet dragon. So I am going to need to build one, as a kind of DIY project.

I wonder what it will be like to play catch with a pet dragon.

If Harry Potter were a musician

This afternoon I plugged a MIDI keyboard into my MacBook. I connected it to a Web-based audio interface, used some samples of piano notes that I had lying around, and was soon able to play music in my Web browser.

Then I added that code to my WebXR system, so I could use my music keyboard to create graphics in XR. At some point I realized that I could just plug the MIDI keyboard directly into my Meta Quest 3, using the Web MIDI interface in its browser.
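In case it is useful to anyone trying this at home, here is a minimal sketch of the kind of wiring involved, using the standard Web MIDI and Web Audio APIs. The sample file name and the simple playback-rate pitch trick are assumptions for illustration, not my actual code.

```typescript
// Minimal sketch: receive MIDI note-on messages in the browser via the
// Web MIDI API, and play a piano sample for each one via Web Audio.

const audioCtx = new AudioContext();  // may need a user gesture to resume

// Fetch and decode one piano sample (hypothetical file name).
async function loadSample(url: string): Promise<AudioBuffer> {
  const bytes = await (await fetch(url)).arrayBuffer();
  return audioCtx.decodeAudioData(bytes);
}

async function main() {
  const piano = await loadSample('piano-middle-C.wav');

  // Ask the browser for access to any attached MIDI devices.
  const midi = await navigator.requestMIDIAccess();
  midi.inputs.forEach(input => {
    input.onmidimessage = msg => {
      const data = msg.data;
      if (!data) return;
      const [status, note, velocity] = data;
      const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
      if (!isNoteOn) return;

      // Pitch the single sample up or down by resampling relative to
      // middle C (MIDI 60), and scale loudness by key velocity.
      const src = audioCtx.createBufferSource();
      src.buffer = piano;
      src.playbackRate.value = Math.pow(2, (note - 60) / 12);
      const gain = audioCtx.createGain();
      gain.gain.value = velocity / 127;
      src.connect(gain).connect(audioCtx.destination);
      src.start();

      // This is also the natural hook for the XR side: spawn a rising
      // shape above whichever key was just played.
    };
  });
}

main();
```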

All of a sudden I can play the piano while making 3D graphics animate in the air above the keyboard — not in VR, but in video-passthrough XR. So I can still see my room, and I can see my real hands playing music on the keyboard.

But now I can also create magical 3D graphics in mid-air just by playing the piano. Like Harry Potter, if he were a musician.

The doors of perception

As humans, we have a collective belief in a single shared reality. We trust the evidence of our senses — the Sun moves across the sky, a kitchen table seems firm and solid.

But since we are all human, we all share the same perceptual limitations. If I can’t perceive something, for example a four-dimensional object, then for the most part you can’t perceive it either, since we were all born with pretty much the same perceptual equipment.

Which means that there is very likely a large universe out there — or perhaps right here with us — that is outside of our shared understanding of what the universe is. Which leads me to two questions:

(1) How can we all collectively perceive things that are outside the range of our natural human perception, perhaps with the use of the right technology?

(2) If we manage to do that, will we be able to use that enhanced perception to build a more true and accurate model of the Universe around us?

Future XR dinner

I had a meeting today with some NYU colleagues in which we talked about the future of extended reality (XR), and what our research can do to help make it happen. Afterward, we all went out to a nice dinner at a nearby restaurant.

At some point during the dinner I turned to the people next to me and said “I think I know how we will be able to tell when XR has finally arrived.”

“How?” they asked.

“We will know,” I said, “because we will all be wearing our XR glasses while eating a nice meal together at a restaurant like this one. And it will all be perfectly normal.”

Mornings and evenings

I am pretty sure that I am effectively a different person in the mornings and in the evenings. In the early morning, I can get an entire day’s work done in an hour or so, and I really enjoy it.

By the evening, all I want to do is chill out. I can manage to get work done, but it doesn’t feel like much fun. And the results are not that great either, because my heart just isn’t in it.

So you could say that morning me is supporting evening me. Both of me are just fine with that.