Taking the red pill

I’ve noticed, in recent discussions with my students about the potential future of virtual reality, that the “Matrix” question comes up rather frequently.

I mean, the question of whether there is any way to know if you are experiencing an excellent computer simulation of reality, rather than reality itself.

In the original film, Neo was given the choice of taking a blue pill or a red pill. If you take the blue pill, then you remain blissfully unaware that you are living within an illusion. But if you take the red pill, then you wake up to the reality outside the emulation.

Which leads to the following question: If you suspect you are in the Matrix, and you really, really want to take the red pill, what would be your best strategy to figure out whether the world you see around you is just a simulation?

I suspect the answer would have something to do with the topics I discussed yesterday. Some things are much, much harder to emulate than others, so your best bet might be to identify the most computationally expensive thing to emulate, and then test for flaws in that.
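To make this strategy a little more concrete, here is a whimsical toy sketch (purely my own illustration, not anything from the film): chaotic dynamics are expensive to emulate faithfully, because any rounding shortcut gets amplified exponentially. So one imaginary "experiment" would be to run a chaotic computation at two different precisions and watch how quickly they diverge — a simulation that cut corners on precision would betray itself here first.

```python
# A toy sketch of "test the most computationally expensive thing":
# iterate the chaotic logistic map x -> 4x(1-x) at two precisions and
# measure the divergence. Chaos amplifies any rounding shortcut, so
# after enough steps a low-precision run decorrelates completely from
# a high-precision one.

from decimal import Decimal, getcontext

def logistic_orbit(x0: str, steps: int, digits: int) -> Decimal:
    """Iterate the logistic map at a given decimal precision."""
    getcontext().prec = digits
    x = Decimal(x0)
    four, one = Decimal(4), Decimal(1)
    for _ in range(steps):
        x = four * x * (one - x)
    return x

# Same starting point, same number of steps, different precision.
low = logistic_orbit("0.1", 200, 15)
high = logistic_orbit("0.1", 200, 60)
divergence = abs(low - high)
print(divergence)  # large: the two orbits have completely decorrelated
```

Of course, this only tests whether *our* hardware cuts corners; as the next paragraph points out, a sufficiently attentive overseer could simply fake the result.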

Then again, if there is an A.I. agent monitoring your experience, intent on keeping you on a blue pill diet, then it can simply warp your perception of whatever experiment you try to perform, thereby maintaining the illusion of a perfect emulation.

So maybe you would need to design an experiment that takes such an A.I. agent into account. Which might not be so easy. 🙂

4 thoughts on “Taking the red pill”

  1. See the thing is…

    Both happen within one reality.

    There is only one reality. Blue pill reality and red pill reality — all in the same reality.

    Your personal perception of its dimensions might change due to what substance you pick, but really, all happening within one reality.

  2. Sally, you and I are talking about two different things. I agree with you completely, as you already know, about the highly virtual nature of what we call our physical existence. But I was asking a different question.

  3. Then the agent might warp your sense of logic such that you believe you’ve created the perfect experiment, when it really has just the right holes the Agent needs to work. So maybe you might try to design an experiment to take /that/ into account 🙂

    Do this long enough, and the AI might run outta room for the infinitely recursive experimental arms-war. Or the AI might send a meteor to your house..
