I am starting to build sculptures in XR. You can’t see them unless you are wearing your XR specs. But if you are, they seem perfectly real, like any other object in the room.
But there is a difference. These sculptures are dynamic. They can change their shape over time.
It’s a kind of hybrid art form — sculpture, but also animation — totally crossing the lines between genres. I wonder how many other categories of artistic expression will start to become jumbled and recombined, as XR becomes a part of our everyday reality.
Just because you’re lucky doesn’t mean you’re safe.
I am starting to get used to XR in my everyday life, thanks to my handy-dandy Meta Quest 3. So I can finally fulfill a childhood dream — having my own pet dragon.
I’m not talking here about one of those big scary critters. I’m envisioning a little fellow who can fly around and keep me company as I cook breakfast or putter around the apartment.
Unfortunately, you can’t just go out and buy a pet dragon. So I am going to need to build one, as a kind of DIY project.
I wonder what it will be like to play catch with a pet dragon.
This afternoon I plugged a MIDI keyboard into my MacBook. I connected it to a Web-based audio interface, used some samples of piano notes that I had lying around, and was soon able to play music in my Web browser.
Then I added that code to my WebXR system, so I could use my music keyboard to create graphics in XR. At some point I realized that I could just plug the MIDI keyboard directly into my Meta Quest 3, using the Web MIDI interface in its browser.
All of a sudden I can play the piano while making 3D graphics animate in the air above the keyboard — not in VR, but in video-passthrough XR. So I can still see my room, and I can see my real hands playing music on the keyboard.
But now I can also create magical 3D graphics in mid-air just by playing the piano. Like Harry Potter, if he were a musician.
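The keyboard-to-browser pipeline described above could be sketched roughly like this. In a browser, the Web MIDI API (`navigator.requestMIDIAccess`) delivers raw three-byte messages; the pure helpers below decode them and map note numbers to frequencies using the standard equal-temperament formula. The function names are my own illustration, not the post's actual code:

```typescript
// A minimal sketch of the keyboard-to-browser pipeline. The Web MIDI API
// delivers raw 3-byte messages; these pure helpers decode them and map
// note numbers to frequencies. (Names here are illustrative only.)

type NoteEvent =
  | { kind: "noteOn"; note: number; velocity: number }
  | { kind: "noteOff"; note: number };

// Decode a raw MIDI message. Status 0x9n = note-on, 0x8n = note-off;
// by convention a note-on with velocity 0 is treated as a note-off.
function decodeMidi(data: Uint8Array): NoteEvent | null {
  const status = data[0], note = data[1], velocity = data[2];
  const command = status & 0xf0;
  if (command === 0x90 && velocity > 0) return { kind: "noteOn", note, velocity };
  if (command === 0x80 || (command === 0x90 && velocity === 0)) return { kind: "noteOff", note };
  return null; // ignore other message types (control change, pitch bend, ...)
}

// Standard equal-temperament mapping: MIDI note 69 = A4 = 440 Hz.
function midiToFrequency(note: number): number {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// In a browser (including the Quest 3's browser) the wiring would be:
//   const access = await navigator.requestMIDIAccess();
//   for (const input of access.inputs.values()) {
//     input.onmidimessage = (e) => {
//       const ev = decodeMidi(e.data);
//       if (ev?.kind === "noteOn") {
//         const freq = midiToFrequency(ev.note);
//         // ...trigger a piano sample and spawn a 3D graphic here...
//       }
//     };
//   }
```

The nice property of keeping the decoding pure is that the same note events can drive both the audio samples and the mid-air 3D graphics.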
As humans, we have a collective belief in a single shared reality. We trust the evidence of our senses — the Sun moves across the sky, a kitchen table seems firm and solid.
But since we are all human, we all share the same perceptual limitations. If I can’t perceive something, for example a four-dimensional object, then for the most part you can’t perceive it either, since we were all born with pretty much the same perceptual equipment.
Which means that there is very likely a large universe out there — or perhaps right here with us — that is outside of our shared understanding of what the universe is. Which leads me to two questions:
(1) How can we all collectively perceive things, perhaps with the use of the right technology, that are outside of the range of our natural human perception?
(2) If we manage to do that, will we be able to use that enhanced perception to build a truer and more accurate model of the Universe around us?
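As a toy example of question (1): a computer can let us "see" a four-dimensional object by rotating it in 4D and projecting it down into our 3D world, the same way a drawing projects 3D down to 2D. This sketch is my own illustration, not anything from the post:

```typescript
type Vec4 = [number, number, number, number];
type Vec3 = [number, number, number];

// Rotate a 4D point by angle theta in the x-w plane — a rotation with
// no 3D analogue, which is part of what makes 4D hard to picture.
function rotateXW([x, y, z, w]: Vec4, theta: number): Vec4 {
  const c = Math.cos(theta), s = Math.sin(theta);
  return [c * x - s * w, y, z, s * x + c * w];
}

// Perspective-project 4D -> 3D: divide by distance along w, just as a
// camera projects 3D -> 2D by dividing by depth.
function projectTo3D([x, y, z, w]: Vec4, cameraW = 3): Vec3 {
  const k = cameraW / (cameraW - w);
  return [k * x, k * y, k * z];
}
```

Feeding the sixteen vertices of a hypercube (all sign combinations of [±1, ±1, ±1, ±1]) through these two functions while theta animates yields the familiar rotating "tesseract" — a shared, technologically assisted glimpse of something none of us can perceive directly.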
I had a meeting today with some NYU colleagues in which we talked about the future of extended reality (XR), and what our research can do to help make it happen. Afterward, we all went out to a nice dinner at a nearby restaurant.
At some point during the dinner I turned to the people next to me and said “I think I know how we will be able to tell when XR has finally arrived.”
“How?” they asked.
“We will know,” I said, “because we will all be wearing our XR glasses while eating a nice meal together at a restaurant like this one. And it will all be perfectly normal.”
I am pretty sure that I am effectively a different person in the mornings and in the evenings. In the early morning, I can get an entire day’s work done in an hour or so, and I really enjoy it.
By the evening, all I want to do is chill out. I can manage to get work done, but it doesn’t feel like much fun. And the results are not that great either, because my heart just isn’t in it.
So you could say that morning me is supporting evening me. Both of me are just fine with that.
The other day, I posted a particular teaching challenge:
“Suppose you are giving a lecture on a favorite topic of your choice. You and everybody else in the room are wearing XR glasses, so that you can make anything at all magically appear for everybody to see and hear, as though it were actually in the room.
“What would you choose as your lecture topic? And what would your audio-visuals be?”
And I realized that I needed to ask myself this question, and discover what my own answer would be. After a few days’ thought, it came to me.
I would show people how to make my noise function. My visuals would evolve gradually from the algorithm to a 3D model of the resulting noise itself.
Today I started building this on my handy-dandy Quest 3. I don’t know how it’s going to turn out, but I am having fun!
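For readers following along, here is a minimal one-dimensional toy in the spirit of that noise function — not the actual algorithm, which works in higher dimensions with hashed gradient vectors, but it keeps the same three ingredients: a permutation-style hash, a gradient at each integer lattice point, and a smooth fade curve blending between them. Everything below is my own simplified sketch:

```typescript
// Toy 1D gradient noise, in the spirit of the classic algorithm but
// greatly simplified. Not the real implementation.

// Deterministic pseudo-permutation of 0..255 (31 is odd and coprime to
// 256, so i*31+7 mod 256 visits every value exactly once).
const perm: number[] = [];
for (let i = 0; i < 512; i++) perm[i] = (i * 31 + 7) & 255;

// Improved fade curve 6t^5 - 15t^4 + 10t^3: zero first and second
// derivatives at t = 0 and t = 1, so the result is smooth across cells.
function fade(t: number): number {
  return t * t * t * (t * (t * 6 - 15) + 10);
}

// Gradient: the hash selects a slope of +1 or -1 at each lattice point.
function grad(hash: number, dx: number): number {
  return (hash & 1) === 0 ? dx : -dx;
}

function noise1D(x: number): number {
  const i = Math.floor(x) & 255;      // which lattice cell
  const f = x - Math.floor(x);        // position within the cell
  const a = grad(perm[i], f);         // contribution from left lattice point
  const b = grad(perm[i + 1], f - 1); // contribution from right lattice point
  return a + fade(f) * (b - a);       // smooth interpolation
}
```

By construction the value is exactly zero at every integer x and varies smoothly in between, which is why — summed at several frequencies, or "octaves" — this kind of function reads as natural texture.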
I notice that for me there are two distinct steps to writing useful software. The first step is to just get it working. If it doesn’t work, then it’s not going to be useful to anybody.
But then after that, there is the question of making it useful to people who are not me. Not only should it be useful for the particular problem that I am trying to solve, but it should also be able to help other people to solve different problems.
To make things useful for other people, I need to pull out all of the independently useful bits, give each one an interface that is clean and easy to use, and all the while make sure that I haven’t broken anything. When all that is done, then I can go on to working on other projects.
So every time I solve a problem for myself, I am also creating new tools that can help other people to get things done. Most other programmers also think and work this way.
Which means that for most programmers, the act of coding is inherently generous. The ability to continually build useful tools for the community is one of the wonderful things about programming.
Today, October 21, is the day in history when Thomas Edison first successfully tested his version of the electric light bulb. Which was, in many ways, the canonical invention. In fact, the light bulb is the most often used symbol for the idea of invention itself.
Today is also the day in history, in fact exactly 40 years ago, when a length of one meter was first officially defined as the distance that light travels through a vacuum in a precisely defined period of time — 1/299,792,458 of a second.
I can’t help but feel that those two events are connected. Each of these bright ideas, in its way, illuminates the other.