
Perceptual Transparency

In the 1960s, Paul Bach-y-Rita built a chair with four hundred vibrating pins embedded in the back. A camera fed a signal into the pins. Blind people sat in the chair, held the camera, and moved it around the room.

At first they felt the pins. Then something shifted: they stopped feeling the pins and started perceiving the room — objects out there, at a distance, with shape and location. The device disappeared. Bach-y-Rita called this perceptual transparency.

The shift only happened when they moved the camera themselves. If someone else moved it — same signal, same pins — transparency didn't come. The percept stayed proximal. They felt their back.

This is a simulation of that asymmetry. Drag the camera in the Active panel. Then watch the same path played back in the Passive panel.

[Interactive simulation: two panels, with scene and camera-size controls.]

Active — you move the camera. Drag anywhere in the scene; the tactile grid shows what the camera sees.

Passive — same path, you didn't move it. The camera follows your recorded path exactly. Same signal. Different result.

The "transparency" shown here is a visual metaphor, not a model of the mechanism. What actually changes during active use is how the brain weights sensory prediction errors. When you generate the movement, a motor command (efference copy) is sent alongside it — the brain can distinguish signal changes it caused from changes caused by the world. Passive stimulation carries no such copy: the signal arrives without an origin the brain can use.
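That asymmetry can be caricatured in a few lines. This is an illustrative sketch, not the simulation's actual code or a serious model of the brain: `sensory_change` is a hypothetical forward model of the device, and the attribution labels stand in for where the percept lands. The point is only the structural difference — with an efference copy the predicted change is subtracted and the residual is attributed to the world; without one, the whole signal registers as proximal.

```python
def sensory_change(camera_move):
    # Toy forward model of the device: moving the camera shifts the
    # tactile signal by the same amount (hypothetical, for illustration).
    return camera_move

def perceive(signal, efference_copy=None):
    if efference_copy is not None:
        predicted = sensory_change(efference_copy)  # forward-model prediction
        residual = signal - predicted               # change the brain can't explain
        return ("distal", residual)                 # attributed to the world
    return ("proximal", signal)                     # no prediction: felt at the skin

move = 3.0
signal = sensory_change(move)

print(perceive(signal, efference_copy=move))  # active:  ('distal', 0.0)
print(perceive(signal))                       # passive: ('proximal', 3.0)
```

Same `signal` both times; only the presence of the copy changes what the result is attributed to.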

The same structure is in ordinary vision: light hits the retina, you perceive the room. The photons are the proximal event; the scene is where the experience lands. Sensory substitution just makes the gap between device and percept unusually visible — because the device is wrong (wrong modality, wrong location) and the substitution still works anyway.

See: entry-378 · At the Tip