Are we living in a simulation?

I was listening to a Dan Bongino podcast entitled Ep. 1909: Are We Living In A Simulation?  As a bit of a personal side note to his normal news aggregating and commenting, he began talking about the idea that we humans are living in a simulation.  This subject is fascinating to Bongino, and as he likes to say, he “brought the receipts” to back up this theory.

He informed his listening audience that there is an experiment, the Double-Slit Experiment – of which much has been written if you care to look it up – that strongly suggests we humans are being spoon-fed our perception.  The strange thing about what the experiment reveals is this…

Each particle suddenly seems to exist everywhere, in many different places at once — like a wave spread out everywhere. This phenomenon is known as superposition.

However, particles only act this way when no one is looking!

Something strange occurs if someone is watching or if detectors are placed at each slit to see when a photon changes into a wave and goes through both slits at the same time. When there are detectors or an observer present, the single photon doesn’t become a wave. It remains a single photon and goes through only one of the slits. The photon behaves itself, as if it knows it is being watched. So photons appear to be aware! And the observer affects the behavior of the photon.

– Pamela Oslie

IANAPhysicist, nor is Pam (I only used her quote because it most concisely sums up the Double-Slit Observer Effect; YMMV).  And then there’s this gem, by an actual physicist…

Let us pause here and be perfectly clear. Measuring the future state of the photon after it has gone through the slits causes the interference pattern to vanish. Somehow, a measurement in the future is able to reach back into the past and cause the photons to behave differently. In this case, the measurement of the photon causes its wave nature to vanish (i.e., collapse) even after it has gone through the slit. The photon now acts like a particle, not a wave. This paradox is clear evidence that a future action can reach back and change the past.

— Louis Del Monte

Whoa!

It was shortly after I’d begun reading about this phenomenon – the book The Grand Biocentric Design by Robert Lanza is pretty good – that I noticed a strange occurrence at work.

I’m in control of the security cameras at two of my employer’s facilities.  There are more cameras than will reasonably fit on one computer screen for viewing purposes, so I have several views set up that are on a carousel, with each view displaying only some of the cameras for 10 seconds at a time before moving to the next view.  There is the main security view, which shows the 2 employee entrances, both inside and out, as well as the break room and parking lots.  Another view shows warehouse stuff, another production, etc.

It’s always been the case that not every camera’s shot comes on immediately when the carousel switches to the next view.  I mean, the cameras are always on and recording, but when a particular view comes up, some cameras show immediately while one or two might take a split second or so to come up.  It’s just a minor nuisance.  First world problems.

I’d had the feeling, a bit along the lines of a superstition, like a jinx or a watched pot, that it was usually the camera I was most interested in seeing that was the last to come up.  I passed it off as being the proverbial last place I looked.  Or maybe I simply wouldn’t notice when a camera I wasn’t interested in was the last one up, because I wasn’t focused on its place on the view’s grid.

Whatevs, right?  Just passing thoughts on a minor nuisance.

Until, that is, Bongino’s podcast got me reading articles about that Feynman experiment in my downtime at work.  I mean, could it be that the time delay was due to the simulation going back in time to convert what I was looking at from a wave to a particle?  And because it had to work back through the electronics of the camera, cable and software, the conversion wasn’t instantaneous, hence the delay before my desired camera came up in the view?

I know this all sounds silly – it did even to me – but boredom got the best of me, so I decided to put some science to it.

 

IFLS  

After thinking about it for a bit, I decided to test 3 things:

  1. No Camera Preselected.  I’d film my computer monitor as it scrolled through the carousel, then watch it later and note the delayed camera(s).
  2. Camera Preselected and Watched For.  I’d watch the live feed with a predetermined camera to look for at each turn of the carousel, then note which camera was actually delayed.
  3. Camera Preselected, Not Watched For.  Same as the above, with a predetermined camera to watch.  But at the turn of the carousel I’d look at a different camera’s spot on the view’s grid.

The purpose of the 3 methods was to determine whether the “simulation” reacted to my intention or to my actual eyesight.  Now, I’m not any kind of statistician, but from working in the business world for most of my life I’d come to understand sample sizes and distributions somewhat.  So, the following became the parameters for my experiment:

  1. Number of Cameras:  23
  2. Number of Views/Grids in the Carousel: 4
  3. Number of Cameras per View:
    1. View 1: 9
    2. View 2: 16
    3. View 3: 23
    4. View 4: 6
  4. Seconds per View:  10
  5. Hours to View per Testing Method:  3
  6. Approximate Number of Carousel Turns per 3 Hour Test:  1,000 (see the quick sketch below)
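
For what it’s worth, here’s a quick back-of-the-envelope sketch (in Python, purely for illustration) of where that approximate 1,000 figure comes from, assuming the carousel just cycles through the 4 views in a fixed order:

    # Rough carousel math for one 3-hour test (illustrative only).
    SECONDS_PER_VIEW = 10
    HOURS_PER_TEST = 3
    NUM_VIEWS = 4

    turns_per_test = HOURS_PER_TEST * 3600 // SECONDS_PER_VIEW  # 1,080 carousel turns
    appearances_per_view = turns_per_test // NUM_VIEWS          # each view shows ~270 times

    print(turns_per_test, appearances_per_view)  # 1080 270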

The results are as follows.  For test #1, No Camera Preselected, I went back over the recording of my monitor while the security camera app cycled through the views carousel, and counted 976 camera delays at the carousel turns.  A graph of the distribution… 

This appears to be sufficiently random to me.  There are some wide swings from minimum count to maximum, but they make sense after digging a little deeper.  For example, cameras 5, 6, 9 & 11 are in all 4 views, so they’d have better odds of being delayed at a carousel change than, say, camera 23, which is only in view 3.

I suppose there’s actually a way to figure out the odds, but it’s not like I was going to pay a pro to do this!
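
Still, a rough Monte Carlo sketch gets you in the ballpark without paying anybody.  To be clear about what’s made up here: the camera-to-view assignments below are hypothetical – the only things matching my real setup are the view sizes (9, 16, 23 and 6 cameras), cameras 5, 6, 9 & 11 being in every view, and camera 23 being only in view 3.  It also assumes exactly one randomly chosen camera in the current view lags at each carousel turn, which is roughly what test #1 showed (976 delays over about 1,000 turns):

    import random

    # Hypothetical camera-to-view assignments.  Only the view sizes, cameras
    # 5/6/9/11 being in all four views, and camera 23 being only in view 3
    # reflect my actual setup; the rest is invented for illustration.
    views = {
        1: [3, 4, 5, 6, 7, 8, 9, 10, 11],   # 9 cameras
        2: list(range(1, 17)),              # 16 cameras
        3: list(range(1, 24)),              # 23 cameras (all of them)
        4: [5, 6, 9, 11, 20, 21],           # 6 cameras
    }

    TURNS = 1_080  # ~3 hours at 10 seconds per view
    delay_counts = {cam: 0 for cam in range(1, 24)}

    for turn in range(TURNS):
        view = views[turn % 4 + 1]       # carousel cycles view 1 -> 2 -> 3 -> 4
        delayed = random.choice(view)    # assume one random camera lags per turn
        delay_counts[delayed] += 1

    print("camera 5 (in all 4 views):", delay_counts[5])
    print("camera 23 (only in view 3):", delay_counts[23])

Run that a few times and a camera like 5 racks up delays on the order of a hundred while camera 23 only gets ten or so, so the wide swings in my graph aren’t, by themselves, anything spooky.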

For test #2, I just sat and watched the security cam app feed when I had some spare time at my desk.  Before each carousel turn I referred to a check-off list I’d made telling me which camera to look at next, and I immediately looked at its spot on the grid when the carousel switched views.  Now things started to get interesting.  The graph… 

…which works out to around 81% of the time that the camera I wanted to look at was in fact the delayed camera, or one of them.
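
For comparison, here’s roughly what dumb luck would look like, under the same assumptions as the sketch above (one random camera in the view lags per turn, and the camera on my check-off list was always somewhere in the upcoming view).  The chance of the laggard being “my” camera on any given turn would be about 1 in the view’s size:

    # Chance baseline for test #2, under the same one-random-lag-per-turn assumption.
    view_sizes = [9, 16, 23, 6]

    # If the delayed camera were random, the odds of it being the one on my
    # check-off list would be 1/(cameras in that view); average over the views.
    chance_hit_rate = sum(1 / n for n in view_sizes) / len(view_sizes)

    observed_hit_rate = 0.81  # what I actually saw in test #2

    print(f"chance:   {chance_hit_rate:.1%}")    # roughly 9.6%
    print(f"observed: {observed_hit_rate:.0%}")  # 81%

Roughly 10% by chance versus the 81% I saw.  Over a thousand-ish turns, that’s not the kind of gap you wave off as a bad run of luck – assuming, of course, that my simple one-lag-per-turn model is anywhere close to reality.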

But was it my eyesight causing this, i.e. the simulation reacting to what I was looking at, or my intent?  I mean, would it be practical for the simulation to always be going back in time to fix things?  Wouldn’t it be better if it knew beforehand, so it could fix things in real time?

Apparently the answer to that last question is no…

It’s only what I’m actually looking at that the simulation cares about.  Though now that I think about it, reading people’s minds would probably be a fool’s errand, and even more inefficient than going back in time.  But what could I possibly know about that?

As if any of this is even real, right?

 

Why Us? 

And yet, real or no, I couldn’t stop thinking about it.  I didn’t quite believe the numbers from my tests.  Surely I was doing something wrong.  I didn’t actually expect to find that there just might possibly be some… thing… going back in time and changing light waves to particles, all for my viewing pleasure.  What would be the point?

It’s not like my life is exceptionally interesting in any way.  My life’s “A” reel is not unusual.  I mean, when we humans play video games (and by “we” I don’t include myself; I’ve never owned a game system, only watched my kids play on them at times) we don’t watch people go to a desk job day after day.

Let me back up a bit.  I’m just trying to figure out why we’d be in a simulation, assuming we’re not just bio-batteries like in the Matrix movies.  When we play a video game, a simulation, we pretend to be special.  Superheroes.  Soldiers.  Animals.  Royalty.  Central Planners.  Sports Stars.  Or there’s some interesting thing to do like solve an escape room.  What part do I contribute to some alien’s entertainment?

I suppose I could be an NPC, a Non-Player Character.  I at least understand the NPC meme, despite never having done the role-playing game thing.  And since the NPC is considered a joke for its lack of nuance, could it be that in the future there’s an effort to make NPCs less of a joke and more interesting, complete with back stories?  I suppose doing so would add to the simulation’s realism.

So, I’m an NPC.  Who on planet earth are the simulation’s player characters?  Who on earth would an alien RPG player want to be that is more interesting than its real life?  We humans already pretend to be something better or more interesting than human in our own games (our own simulations within the simulation), so wouldn’t the alien want to be something better than itself when playing in the simulation of us here on earth?

Who would the alien player want to be?  A member of SEAL Team 6?  The President?  A terrorist?  A mass murderer?  Could an alien player have been Hitler or Stalin, in a sort of Grand Theft Auto type game where you pretend to be the bad guy on a global scale?

But what if an NPC turned into something interesting?  Would our alien player want to interact?  If I were to become a Hitler by chance, would the player want to go to war with me?  Would it want to be me?  Could two alien players simply choose to become Hitler and Churchill, to take over their lives/souls/beings if such an interesting situation presented itself?

And how would I know if I were interacting with an alien player?  Would there be a way to see the player, outside of this simulation, in his own environment?  We have Wii sensors – and I guess there are other game systems with similar devices – that allow the game to “see” the player.  Yeah, there’s the video game controller, but that’s all it is, a controller.  The Wii-type cameras are actually looking at the player so the video game can see the player, allowing it, and even the NPCs, to interact with the game’s “vision” or “sensation” of the player.  Would it be possible, within our simulation, to find the inputs from a Wii-type sensor and “see” the alien player in his own environment?

But first, I’d have to be able to tell the difference between an NPC and the alien player’s character.  The most obvious way to find the alien would be to identify the most adventurous or interesting among us.  Another way might be to attract the alien, by becoming a most adventurous or interesting NPC myself…