Eventually (although not soon), we will be able to use a combination of visual, audio and haptic feedback to create a multi-sensory experience that feels just like reality. In a sense, the challenge here is to pass something akin to the Turing test.
The test would go something like this: If I am collaborating with two people, one sitting directly across a table from me and the other 1,000 miles away, can we create an experience of presence with sufficient fidelity that I cannot tell which is which?
For example, if the person sitting directly across from me passes an object to me across the table, I should be able to see it, hear it slide across the table, and feel it as I take the object from my collaborator. I might also feel a slight resistance as the other person lets go of the object.
Can I replicate this experience with a person who is 1,000 miles away using multi-sensory passthrough? At what point does the combination of visual, audio and haptic passthrough match the fidelity of physical co-presence closely enough that I can no longer tell the difference?
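One way to make "can no longer tell the difference" measurable is a standard two-alternative forced-choice (2AFC) discrimination experiment: on each trial the participant guesses which of the two collaborators is the remote one, and the passthrough "passes" if guessing accuracy is statistically indistinguishable from chance. The sketch below scores such an experiment with an exact binomial test; the 2AFC framing is my assumption, not something the passage specifies, and the trial counts are hypothetical.

```python
# A minimal sketch of scoring the "presence Turing test", assuming a
# 2AFC design: each trial, the participant guesses which collaborator
# is remote. Chance performance is 50%; if we cannot reject chance,
# the participant could not reliably tell which is which.

from math import comb

def two_sided_binomial_p(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: the probability, under chance
    performance p, of an outcome at least as extreme as k out of n."""
    pk = comb(n, k) * p**k * (1 - p)**(n - k)
    # Sum the probability of every outcome no more likely than the
    # observed one (the standard "small p-values" two-sided method).
    return sum(
        comb(n, i) * p**i * (1 - p)**(n - i)
        for i in range(n + 1)
        if comb(n, i) * p**i * (1 - p)**(n - i) <= pk + 1e-12
    )

correct_guesses = 29   # hypothetical: trials where the remote person was identified
total_trials = 50      # hypothetical trial count

p_value = two_sided_binomial_p(correct_guesses, total_trials)
print(f"accuracy = {correct_guesses / total_trials:.0%}, p = {p_value:.3f}")
if p_value > 0.05:
    print("Cannot distinguish remote from local at this sample size.")
else:
    print("Participant can still tell which collaborator is remote.")
```

Note that failing to reject chance only shows indistinguishability at a given sample size; a rigorous study would also fix the number of trials and an equivalence margin in advance.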
I don’t know the answer. But I think this would be a worthy goal to strive for, and research in this area would be genuinely exciting.
Add taste and nutrition synthesis to the mix and the world will be forever changed: social distance a quaint notion; social obligation, as we know it now, intensified (or perhaps rendered meaningless).