Science Explains Why You Saw the Dress as Blue or White

Newly published experiments suggest it’s all about the assumptions our brains use to filter out extraneous material.

Remember the dress? Back in February, practically everyone with an Internet connection weighed in on the question of whether a seemingly ubiquitous photograph was of a blue dress with black stripes, or a white one with gold stripes.

The controversy clearly disturbed a lot of people, throwing into serious doubt their previously unquestioned assumption that we are all seeing pretty much the same thing.

So what was going on? The new edition of the journal Current Biology presents three perspectives, based on scientific experiments conducted in the wake of the kerfuffle.

While they differ in their specifics, all suggest that the blue color of the dress was a key factor. One team of researchers, led by University of Nevada-Reno psychologist Michael Webster, referred to “the special ambiguity of blue.”

It turns out we automatically adjust our perception of color to assume an object is being bathed in either bluish or yellowish light. If we assume bluish lighting (that of a sunlit sky on a bright blue day), we automatically compensate for the distortion it causes, bleach out the blue, and see the color white.

But if we believe the object is illuminated by yellowish light such as a standard bulb, we assume the blue is the color of the dress itself, and therefore see it that way.

Since the photo of the dress provided no clues as to whether it was taken indoors or outdoors, in sun or in shade, each of us went to our default setting—which, it turns out, differs from person to person.
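The discounting process described above resembles what vision scientists call von Kries-style chromatic adaptation: divide the light reaching the eye by the assumed illuminant to recover the surface color. The sketch below is purely illustrative (the pixel and illuminant values are made up, not taken from the studies); it shows how the same bluish pixel comes out near-white under an assumed bluish light but stays blue under an assumed yellowish one.

```python
# Minimal sketch of von Kries-style illuminant discounting (an illustration,
# not the researchers' actual model). All RGB values below are hypothetical.

def discount_illuminant(pixel_rgb, illuminant_rgb):
    """Divide each channel by the assumed illuminant, then renormalize
    so the brightest channel is 1.0 (a crude lightness normalization)."""
    adapted = [p / i for p, i in zip(pixel_rgb, illuminant_rgb)]
    peak = max(adapted)
    return tuple(round(c / peak, 2) for c in adapted)

# A bluish pixel, loosely like one sampled from the dress photo.
pixel = (0.45, 0.52, 0.70)

# Viewer A assumes cool, bluish daylight: the blue is attributed to the
# light and discounted away, so the surface comes out near-white.
print(discount_illuminant(pixel, illuminant_rgb=(0.60, 0.70, 0.95)))
# → (1.0, 0.99, 0.98)

# Viewer B assumes warm incandescent light: the blue cannot be blamed on
# the illuminant, so the surface itself is seen as blue.
print(discount_illuminant(pixel, illuminant_rgb=(0.95, 0.80, 0.60)))
# → (0.41, 0.56, 1.0)
```

The point of the sketch is that identical sensory input plus different default illuminant assumptions yields different surface colors, which is the ambiguity the studies describe.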

“As Newton remarked, color is not a property of an object,” writes a second research team, led by psychologist Karl Gegenfurtner of the University of Giessen in Germany. “It arises when a surface is illuminated and light is reflected into the eye of an observer, who interprets the light distribution of the whole scene and assigns a color to the object.”

Even though this stimulus is “constantly changing,” we are normally “very good at assigning constant colors to objects,” the German researchers note. It’s one of the shortcuts the brain uses to simplify things for us: Once we decide a fire truck is red, we continue to perceive it as red, no matter the weather or time of day.

But the dress proved different. A trio of researchers from Wellesley College and M.I.T. surveyed 1,410 people about the image, and found a substantial number—45 percent of those who had previously seen the image in news reports or through social media—subsequently changed their minds regarding its colors.

Overall, they found 57 percent of participants described the dress as blue with black stripes, while 30 percent saw it as white and gold, 11 percent as blue and brown, and two percent as some other combination of colors. “Reports of white/gold over blue/black were higher among older people and women,” they report.

The age difference makes sense when you consider that younger people are more likely to be “night owls,” and thus more likely to have artificial, “warm” light as their default setting. Their unconscious assumption that the dress was illuminated by an incandescent bulb made them more likely to see it as blue/black.

People who spend more of their time in “cool,” blue-sky sunlight would likely use the opposite assumption, leading to their view of the dress as white/gold. Backing up this notion, the researchers found they could flip people’s perceptions “by introducing overt cues” regarding the light source.

Further testing showed that making the image larger, which gave viewers a better sense of the material, made people more likely to see the dress as white/gold. On the other hand, blurring the image made them more likely to see it as blue/black.

The German researchers, after conducting a much smaller lab study featuring 15 people, similarly found the differences in color perception “are mainly due to the perceived differences in lightness.”

“The bright blue tones present in the image could equally well be due to a dark bluish illumination on a white dress, or to a blue dress under a neutral bright light,” they write. “Under conditions of high uncertainty … observers may differ quite substantially” as to which interpretation they adopt, which in turn shapes the colors they perceive.

The controversy is a good reminder that the visual images we perceive are filtered through the biases of our brains. As Gegenfurtner and his team put it: “We make assumptions about the world that guide the interpretation of sensory data, and these assumptions can be quite different for different individuals.”

Findings is a daily column by Pacific Standard staff writer Tom Jacobs, who scours the psychological-research journals to discover new insights into human behavior, ranging from the origins of our political beliefs to the cultivation of creativity.
