Dec 04, 2012

[Image: One of the Ishihara plates used to test for color blindness. What number do you see?]

Color, as a perceptual experience, does not exist in the outside world.  Leaves are not intrinsically green, the sky is not blue, and strawberries are not red.  The phenomenon of color is generated inside your head.  It is an illusion created by the interaction of 50 billion neurons, starting at the back of your eyes and ending in an elaborate, precisely timed network of activity distributed broadly across the brain.  Color emerges from the trillions of computations passed between neurons, and it is this activity that allows you to conclude that apples, generally speaking, are indeed red.  Luckily, most humans are built from very similar DNA and proteins (and, consequently, our brains behave in much the same way), so we can agree on the basic components of what we see.

The units of light that allow us to see are called photons, each described by its energy and wavelength (the two are inversely related: E = hc/λ).  Photons from a light source (e.g., the sun or a street lamp) strike an object in the real world, such as a leaf; the object absorbs some of the photons and reflects others, and what we see is the reflected light.  So, depending on the color temperature of the light source, the apparent color of the object can change.  This is precisely why we need to apply color correction to most of our photographs.  Our brains use a number of methods to apply their own ‘color correction’, including our memories of those objects (“That wall was yellow earlier today, so I can’t imagine why it wouldn’t be yellow this evening!”).  Photographic film and digital sensors have no such brains, however, so it’s left up to us to sort out what the real color should be.
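To make the idea of color correction concrete, here is a minimal sketch of one common automatic approach, the gray-world algorithm, which assumes the average color of a scene should come out neutral gray. This is a generic Python illustration of the technique, not the method described in the article; the function name and the sample values are hypothetical.

```python
# A minimal sketch of automatic white balance under the gray-world
# assumption; illustrative only, not the article's method.
import numpy as np

def gray_world_balance(image: np.ndarray) -> np.ndarray:
    """Reduce a color cast by scaling each channel so its mean
    matches the overall mean brightness (gray-world assumption).

    image: float array of shape (H, W, 3), values in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)  # mean R, G, B
    gray = channel_means.mean()                        # target neutral level
    gains = gray / channel_means                       # per-channel scaling
    return np.clip(image * gains, 0.0, 1.0)

# Hypothetical example: a warm (tungsten-lit) scene carries an orange
# cast, so the red-channel gain comes out below 1 and the blue above 1.
warm_image = np.random.rand(64, 64, 3) * np.array([1.0, 0.8, 0.6])
balanced = gray_world_balance(warm_image)
```

The gray-world assumption fails on scenes that really are dominated by one color (a field of grass, a red wall), which is one reason cameras also lean on presets and, as described above, our brains lean on memory.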

Although neuroscience has made significant progress in our understanding of visual perception, how our brains create the perception of color remains very much a mystery.  Currently, color vision is best explained by understanding how individual cells in the brain respond to light, but the real magic comes from the communication between cells.  Keep in mind that the eye and brain are not like a camera sensor; they don’t simply absorb light and produce an output.  Neurons from your hippocampus tell these cells what things looked like before, Wernicke’s area tells you the name of the object, and auditory cortex tells you the sound it makes (if it looks like a lion and roars like a lion, is it a lion?  Maybe not if it’s green!).  All of these brain areas contribute to visual perception, not just our eyes and visual cortex.  Your mood influences the way you see things.  Even the “value” of an object can change how your visual system responds, even at a very basic level.  So, the next time you’re out making photographs, keep in mind that you are indeed influencing the image.  You are not capturing an image.  You are creating it.

Parts of this blog post have been re-digested from an article written by Peter Gouras, an ophthalmologist at Columbia University.  The original, more detailed article can be found at Webvision.