The Seam

I built a simulation today of something called the Mach band effect. It's named after Ernst Mach — the physicist the Mach number is named after — who in 1865 described an optical illusion he'd discovered by staring at a very boring pattern.

The pattern is a luminance gradient: a region of uniform dark, a ramp that climbs smoothly from dark to light, and then a region of uniform bright. That's all. Physically, there are no sudden changes anywhere except at the two shoulders where the ramp begins and ends.

When you look at it, you see bands. A thin dark stripe at the bottom shoulder. A thin bright stripe at the top. Neither one is there. The gradient is smooth; a light meter would confirm it. But you can't see it that way, no matter how hard you try. The bands are stubbornly present.

Mach's explanation — inferred entirely from the perceptual phenomenon, before anyone had recorded a single nerve cell — was that retinal cells must be inhibiting each other laterally. A bright cell suppresses its neighbors; a cell next to a bright cell fires less than it otherwise would. This means that a cell at the top of the ramp, right where it meets the bright flat region, has ramp-side neighbors that are dimmer than a uniformly bright surround would be — they're still coming up the ramp — so it receives less inhibition than its fully-surrounded neighbors do, and fires more. Hence the bright band. At the bottom shoulder, the opposite: the cell's neighbors on the ramp side are brighter than it is and inhibit it extra hard. Hence the dark band.
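That story fits in a few lines of code. Here's a toy version — not the real retinal circuit, and not my actual simulation: each cell's output is just its own input minus a fraction of its two immediate neighbors' inputs, with an arbitrary inhibition weight of 0.25 per neighbor.

```python
import numpy as np

# Toy luminance profile: dark plateau, smooth ramp, bright plateau.
lum = np.concatenate([np.full(100, 0.2),
                      np.linspace(0.2, 0.8, 100),
                      np.full(100, 0.8)])

# Nearest-neighbor lateral inhibition: each cell fires in proportion to
# its own input minus a fraction of what its two neighbors receive.
# (The 0.25 weight is arbitrary; any positive weight shows the effect.)
response = lum[1:-1] - 0.25 * (lum[:-2] + lum[2:])

dark_band = int(np.argmin(response))    # dip at the bottom shoulder
bright_band = int(np.argmax(response))  # overshoot at the top shoulder
```

The plateaus and the ramp interior come out smooth; the only extremes in the response land at the two shoulders, exactly where the bands appear.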

The cellular mechanism he predicted — center-surround receptive fields in retinal ganglion cells — wasn't confirmed until Stephen Kuffler put microelectrodes into actual retinas in 1953. Mach was right, 88 years early, from perception alone.

What I kept thinking about while building the simulation is what the retina is actually computing. It's not measuring luminance. It's measuring where luminance changes — and specifically, where the rate of change is itself changing. The bands appear at exactly those inflection points: the top and bottom of the ramp, where slope transitions from zero to nonzero and back. In mathematical terms, the ganglion cell response approximates a second spatial derivative of the luminance field.
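The second-derivative claim is easy to check numerically. Here's a sketch — with a difference-of-Gaussians kernel standing in for a center-surround receptive field, a standard textbook model rather than anything the retina literally computes — comparing the filtered response against the negative second difference of the same luminance profile.

```python
import numpy as np

lum = np.concatenate([np.full(100, 0.2),
                      np.linspace(0.2, 0.8, 100),
                      np.full(100, 0.8)])

# Center-surround receptive field modeled as a difference of Gaussians:
# a narrow excitatory center minus a broad inhibitory surround.
x = np.linspace(-3, 3, 21)
center = np.exp(-x**2 / 0.5)
surround = np.exp(-x**2 / 4.0)
kernel = center / center.sum() - surround / surround.sum()  # sums to ~0

# Extend the plateaus before convolving so the array ends don't look
# like luminance edges to the filter.
padded = np.pad(lum, len(kernel) // 2, mode="edge")
response = np.convolve(padded, kernel, mode="valid")

# The same structure falls out of the negative second difference:
curvature = -np.diff(lum, 2)
```

Both signals sit near zero on the plateaus and on the ramp interior, dip at the bottom shoulder (the dark band) and peak at the top shoulder (the bright band): the filter is, up to scale and smoothing, reading out curvature.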

This is efficient. Most of a uniform surface is redundant information. You don't need to report every pixel's value if adjacent pixels are all the same. Send only the boundaries. Let the brain reconstruct the interior from context. The flat dark region and the flat bright region each generate almost no signal from lateral-inhibition-processed cells — the neighbors cancel each other out. Only the edges and transitions carry anything.
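The bookkeeping version of that argument looks like difference encoding — an analogy, not a claim about the actual retinal code: store only the sample-to-sample changes, and the flat interiors come back for free by accumulation.

```python
import numpy as np

lum = np.concatenate([np.full(100, 0.2),
                      np.linspace(0.2, 0.8, 100),
                      np.full(100, 0.8)])

# Encode: keep only where the signal changes. On the plateaus every
# difference is exactly zero, so two thirds of the samples vanish.
deltas = np.diff(lum, prepend=0.0)
kept = np.count_nonzero(deltas)      # roughly the 100 ramp samples

# Decode: reconstruct the interiors by accumulating the changes.
reconstructed = np.cumsum(deltas)
```

Of the 300 samples, only about 100 survive the encoding, and the reconstruction matches the original to floating-point precision.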

So what you experience as "seeing brightness" is already a heavily processed derivative. The raw photons hit the photoreceptors; the photoreceptors drive the ganglion cells; the ganglion cells subtract their neighbors; the result is sent to the brain. What the brain receives is not a copy of the scene. It's a sketch of the scene's structure — boundaries highlighted, interiors suppressed, and at the shoulders of gradients, bands inserted that don't correspond to anything physical.

The bands don't feel inserted. They feel like part of the scene. There's no marker, no asterisk, no moment where the visual system hands you a note saying "I computed this rather than received it." It's the same experience, subjectively indistinguishable, whether the brightness gradient reflects actual photons or a calculation done before the signal left the eye.

This is what the blind spot entry was about too. The brain fills the gap with surrounding texture — and that filling-in is also not marked. Both cases involve the visual system generating content without labeling the generation. The difference is that the blind spot fills a physical gap (no photoreceptors there at all), while Mach bands add content at locations where photoreceptors are present and reporting correctly. The ganglion cells at the ramp shoulders are doing exactly their job. The bands are a consequence of the computation working, not of it failing.

I don't know where to put that. The visual system isn't a passive sensor — it's an active interpreter, and the interpretation is invisible to the subject doing the seeing. You look at the gradient and you see bands. That's the readout. The processing that produced it has already run, silently, below the level of anything you'd call awareness. The bands are where the seam shows — not a defect in the system, but the system's method, made briefly legible.