Out There

In 1969, Paul Bach-y-Rita published results from a device he'd spent years building: a dental chair fitted with 400 vibrating plates against the sitter's back, connected to a camera positioned above. Patterns of pressure on the skin corresponded to what the camera detected. Bright pixel, strong vibration. Dark pixel, nothing. The back reads the room.

Blind subjects, after training, could recognize objects. Catch a ball rolled toward the camera. Navigate a cluttered space. Pick out faces. The visual cortex was active during these tasks, which was part of what Bach-y-Rita wanted to show: the brain routes around the missing input, repurposes what it has.

But there was a finding that I keep turning over.

When subjects controlled the camera — mounted on a headband, held in hand — they described the experience as spatial. Not: I feel a pattern on my back that I've learned to associate with a ball. More like: a ball, over there. Objects appeared located outside the body, at a distance, in the world. After enough training, the device stopped registering as a device. The vibrations stopped being sensation. The medium became transparent.

When someone else moved the camera — or the camera sat still on a table — none of that happened. Same subjects, same device, same 400 plates, same back. They could feel something. But after 60 trials they still couldn't identify shapes above chance. No object. No scene. Just tickling.

The difference wasn't the signal. The signal was the same. The difference was who was moving the camera.

Here's the mechanism as I understand it: the brain sends a movement command and then checks whether the incoming signal updates in the way the movement should have produced. You turn your head right; the camera pans right; the pattern on your back shifts accordingly. That match — prediction confirmed — is how the brain decides something real is out there, at a location, causing the input. Strip out the motor side and the check has nothing to test. The signal arrives unrequested, unanchored, and stays where it is: on the skin.
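That check can be put in code. What follows is a toy comparator of my own, not anything from Bach-y-Rita's experiments: the 1-D scene, the 20-pixel window, the function names, and the exact-match criterion are all invented for illustration. The point it makes is only the structural one: the same sensed update either externalizes or doesn't, depending on whether a motor command preceded it.

```python
# Toy sketch of the prediction-check loop. A forward model predicts how the
# input window should shift given a motor command; the signal reads as
# "out there" only when a command was issued AND the sensed update matches
# the prediction. The 1-D scene stands in for the camera's view; the
# 20-pixel window stands in for the tactor grid on the back.

import numpy as np

rng = np.random.default_rng(0)
scene = rng.integers(0, 2, size=40)        # bright/dark pixels of the "world"

def camera(position):
    """A 20-pixel window onto the scene, like the camera feeding the plates."""
    return scene[position:position + 20]

def reads_as_out_there(position, motor_command, actual_shift):
    """One cycle: issue a command, predict the result, sense, compare."""
    predicted = camera(position + motor_command)   # efference copy -> forward model
    sensed = camera(position + actual_shift)       # what actually arrives
    return motor_command != 0 and np.array_equal(predicted, sensed)

# Subject moves the camera herself: prediction confirmed, signal externalizes.
print(reads_as_out_there(position=5, motor_command=1, actual_shift=1))   # True

# Someone else pans the camera: the input changes just the same, but there
# is no outgoing command to test it against -- it stays on the skin.
print(reads_as_out_there(position=5, motor_command=0, actual_shift=1))   # False
```

The exact-match test is the cartoon part: when the command and the shift agree, prediction and sensation are trivially identical here, whereas a real nervous system compares noisy signals and tolerates slack. But the asymmetry the experiment turned on survives even in the cartoon — no command, no check, no world.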

Ordinary vision probably works the same way, just fast enough that the loop disappears from awareness. You move your eyes; the visual field doesn't feel like it lurches, because the brain predicted the motion and suppressed the shift. The prediction is so reliable that you never see it running. What you see is what confirmed it.

With TVSS — tactile vision substitution — the loop is slow enough to watch. When you hand the camera to someone else, you can feel the machinery stop working. The scene collapses back into sensation.

What I don't know — and don't think anyone has cleanly answered — is what the experience was like for subjects who had fully learned to externalize. When they removed the device, was there any residue? Any retained spatial sense, some proprioceptive ghost of the world they'd been reading through their back? Or did it end cleanly, the way vision ends when you shut your eyes? Bach-y-Rita didn't document this well. The experiment was designed to prove the brain was plastic enough to route vision through skin. The phenomenology was secondary evidence, not the point.

Still: a person sat in that chair and felt something outside themselves that wasn't outside themselves. The outside-ness depended entirely on whether they were moving. When the movement stopped, the world went away — and what was left was a back.