Dynamic furniture

There are two things we generally care about when it comes to furniture: How it looks and how it feels. Because of the limitations imposed by physical materials, these two are often linked.

However, there may come a point where we are mainly seeing the furniture in many rooms through the lens of extended reality. In public places, in particular, it may eventually be considered rude to remove one’s XR eyewear, and so in such places we will be “wearing” all the time.

When that happens, how a chair looks and how that same chair feels can be completely decoupled from one another. But is this a good thing or a bad thing?

In the words of Rufus E. Miles, “Where you stand depends on where you sit.”


I hate deadlines, but I admit that without them I would probably not get much done. So you could say it’s a love/hate relationship.

That final crunch can help you a bunch.
The thing you dread is what gets you ahead.
What makes you scream puts you on the team.
What drives you crazy helps you not be lazy.
What prods your ass makes you top of the class.

I deal with my fear, as deadlines near,
And the pressure climbs, by making rhymes. 🙂

Future board games

In order to play a board game like Monopoly or Chess or Scrabble, you need just the right equipment. Alas, most of the time when you are hanging out with friends, you don’t have a Monopoly or Chess or Scrabble board with you.

But soon that won’t be a problem. As soon as you and your friends put on your XR specs, the board will materialize on the table in front of you.

On the one hand, this seems like a step backward. Instead of being a tangible experience, these games will become ephemeral.

Yet there is another way of looking at it. Many more people will be able to play them. And that can’t be a bad thing, right?

A room with a view

Right now the value of real estate varies tremendously with whether or not it has a good view. Alas, no matter how much money you pay, you always get the same view.

At some point, when extended reality specs become as numerous as smartphones are today, that will change. You will be able to decide what view you want out of your window on any given day, whether of the Eiffel Tower, or of the Grand Canyon, or of a lunar landscape.

I wonder what that will do to the value of real estate.

A.I. Etiquette, part 2

Our understanding that we are dealing with a fellow human is not something intellectual. It is instinctive, innate, part of our biology.

We don’t reject the humanity of chatbots because they are insufficiently capable. We reject their humanity because they are not human.

It means nothing to us if they are turned off, or duplicated, or altered in various ways, because there is nothing really at stake.

In contrast, we view each human life as inherently precious, and the loss of a human life as a tragedy. This is not intellectual. It is tribal, it is primal, and it is baked into our DNA.

The rate of A.I. development is not relevant here. In this realm, there are larger forces at work.

A.I. Etiquette, part 1

Today I wanted to confirm whether a payment on a bill was due yet, so I called the number written on the bill statement. Not surprisingly, the call was answered by a virtual person.

“She” was very polite, and she asked me some questions to verify it was really me, guiding me through the process. At some point she said “Your bill is not due until November 30. Would you like to pay now?”

At that point, I just hung up the phone. The bill was not yet due, so I didn’t need to pay anything, and there was no point in continuing.

Had I been talking to a real person, I would have exchanged some sort of pleasantries before hanging up. Presumably I would have thanked the person for their time, wished them a good Thanksgiving holiday, and so forth. But in this case, since there was no actual person on the other end, I simply hung up.

Afterward, I found myself wondering whether A.I. will ever advance far enough to change my behavior. In other words, in that same situation, given a sufficiently advanced A.I. agent, would I ever feel the need to first exchange pleasantries with that agent, rather than simply hanging up the phone?

I suspect that the answer is no, and I think the reasons are profound and important. More tomorrow.

Sometimes four there are

When you and I are having a face-to-face conversation, the spatial dynamics are fairly simple. Both of us are facing directly toward the other, and at any point in the conversation it’s clear where the focus of attention is.

If we were to use advanced extended reality technology to visually place virtual objects into the scene, we could pretty much always place them halfway between you and me, and everything would make perfect sense.

But when four people are sitting around a square table, things are a little more complex. The two people at one corner might be engaged in their own conversation, or three people might be sharing a conversation while the fourth is checking their notes.

So where should those virtual objects go? Does the system need to actively interpret what is going on within our conversation — perhaps who is paying attention to whom — and then make dynamic decisions based on that?
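One simple heuristic, sketched below with entirely hypothetical names and data (this is an illustrative assumption, not any real XR system's API), would be to place a shared virtual object at the centroid of whichever participants are currently attending to the same conversation:

```python
# Hypothetical sketch: place a shared virtual object at the centroid
# of the participants currently engaged with one another. All names
# and coordinates are invented for illustration.

def placement(positions, attending):
    """positions: dict mapping participant name -> (x, y) seat location.
    attending: set of names in the active conversational group.
    Returns the centroid of that group's positions."""
    pts = [positions[n] for n in attending]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return (cx, cy)

# Two people face to face: the object lands halfway between them.
two = {"you": (0.0, 0.0), "me": (2.0, 0.0)}
print(placement(two, {"you", "me"}))  # (1.0, 0.0)

# Four at a square table, with three sharing a conversation while the
# fourth checks their notes: the object drifts toward the three.
four = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (1.0, 1.0), "d": (0.0, 1.0)}
print(placement(four, {"a", "b", "c"}))
```

Of course, the hard part is not the arithmetic but deciding, moment to moment, who belongs in that attending set.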

And what if you are talking to me, but I am ignoring you because I am listening to somebody else around the table? What are the best visuals for that situation?

I suspect that much of this will become clear sometime in the future, when multi-participant extended reality has been seamlessly integrated into our everyday conversations.

Always two there are

Soon after October 10, the launch day for the Meta Quest 3, I took to carrying one around with me when traveling. If I got on an airplane, my handy Quest 3 would go with me.

Previously I had been toting around a Quest Pro, but those are big and clunky. In happy contrast, the Quest 3 takes up hardly any room at all.

But now I no longer carry around a Quest 3 when I travel. Instead, I always carry two Quest 3s.

One of them is for me, and the other is for whatever colleague I am meeting with. I put on one, hand the other to my colleague, and say “try this.”

This lets me properly road test what it feels like for two people to have a face-to-face conversation in mixed reality. Maybe it will feel like the future.