A collection of evidence in support of simulation theory.

Visual Interface

Tags: Direct, Digital, Testimonial, Manipulation, Surveillance, Google
I am shown a pre-alpha YouTube feature.

Eye-Tracking


SIGNAL

At the very beginning of the video, we see an “on-off” toggle. The very moment my eyes landed on this toggle and watched it swipe to “off,” my iPhone vibrated.

ANALYSIS

This was immediately striking, because it FELT as if I had controlled the interface with my eyes. It FELT as if I had performed that action myself. When I refreshed the page to see whether the vibration would happen again, it did not. By then, however, it had already left its mark.

It would be entirely possible to build a visual interface controlled by your eyes. Today’s eye-tracking software is accurate enough that a toggle like that could be driven by nothing more than where your gaze rests, and for how long.
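To give a sense of how little machinery this would take, here is a minimal sketch of a dwell-based gaze toggle: it flips when the gaze rests on it for half a second. The GazeToggle class, the coordinates, and the 500 ms threshold are all hypothetical stand-ins for whatever a real eye-tracking SDK would provide; this is an illustration, not Google’s implementation.

```python
import time

DWELL_SECONDS = 0.5  # assumed dwell time required to trigger the toggle


class GazeToggle:
    """A UI toggle that flips when the user's gaze dwells on it."""

    def __init__(self, x, y, width, height):
        self.bounds = (x, y, x + width, y + height)
        self.on = True
        self._dwell_start = None  # when the gaze first landed on the toggle

    def _contains(self, gx, gy):
        x1, y1, x2, y2 = self.bounds
        return x1 <= gx <= x2 and y1 <= gy <= y2

    def update(self, gx, gy, now=None):
        """Feed one gaze sample; returns True if the toggle just flipped."""
        now = time.monotonic() if now is None else now
        if not self._contains(gx, gy):
            self._dwell_start = None      # gaze left the toggle: reset the timer
            return False
        if self._dwell_start is None:
            self._dwell_start = now       # gaze just arrived: start timing
            return False
        if now - self._dwell_start >= DWELL_SECONDS:
            self.on = not self.on         # dwell satisfied: flip the switch
            self._dwell_start = None
            return True
        return False


# Example: simulated gaze samples resting on the toggle long enough to flip it.
toggle = GazeToggle(x=100, y=40, width=60, height=30)
for t in (0.0, 0.2, 0.4, 0.6):
    toggle.update(gx=120, gy=55, now=t)
print(toggle.on)  # False: the half-second dwell flipped it from "on" to "off"
```

In a real interface, the loop would be fed live samples from the tracker instead of simulated timestamps, but the decision logic would be essentially this simple.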

It would not surprise me at all if Google is testing this feature. And given everything else I’ve seen, it would not surprise me at all if they gave me the first look at it.

This could have been a random vibration, but it was timed so well that I’m inclined to think it was planned. No, I wasn’t controlling it with my eyes; they simply “scheduled” the vibration to occur at precisely that moment, to evoke the exact feelings it did.