How odd are some even numbers? Are some triangles more “triangular” than others? Why is purple a fruit? These are fairly surrealistic ways to ask about things that are either yes or no. For example, 798 indisputably is an even number … kinda.
New research from psychologist Gary Lupyan at the University of Wisconsin points out the imprecision of human understanding of black-and-white categories—an imprecision that stems “from a failure to fully abstract from the details of the input”—and compares these fuzzy human algorithms with the precise, context-free algorithms of computers. In a new issue of the journal Cognition, he addresses “why brains make mistakes computers don’t.”
Lupyan describes a trio of experiments demonstrating how the “rules” of human thinking are fast and loose—the looseness probably an assist to the speed—compared to the strict and essentially flawless determinations we would expect from a computer. The result is a finding, obvious yet nuanced, that human thinking is different from computer calculation.
Usually when we compare ourselves to a computer, we assume it is doing something we could do—working through lots of complex mathematical equations, say—but don’t want to, because it would be slow or tedious. We also assume the computer won’t get bored, whereas we know we would, and that mistakes would likely result. Lupyan’s findings, however, suggest that people operate by a different set of hard-and-fast rules than the ones they program into computers, even though those computer rules are part of an attempt to replicate and accelerate human performance.
“People’s performance,” he writes, “is not simply slower and more error prone, but qualitatively different, displaying inherent sensitivity to aspects of the input that are formally irrelevant to the operation being performed.”
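To see just how context-free the computer’s side of this comparison is, consider the parity check itself. The Python sketch below is my own illustration, not code from Lupyan’s paper; it shows that the formal operation consults only the remainder after division by two, so nothing about a number’s size, spelling, or coloring can enter into the answer:

```python
def is_even(n: int) -> bool:
    """Parity is strictly binary and context-free: the answer
    depends only on n % 2, never on magnitude, font, color,
    or whether the number is spelled out."""
    return n % 2 == 0

print(is_even(400))  # True
print(is_even(798))  # True -- exactly as even as 400
```

By this operation, 798 cannot be “less even” than 400; the graded judgments Lupyan observed are supplied by the human, not by the computation.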
The three experiments asked participants to determine whether a number was even or odd, whether a given shape was a triangle, and who qualified for a contest open only to grandmothers. Variations were added to the experiments—adding time pressure, spelling out numbers, color-coding individual digits—to suss out whether the answers could be attributed to anything other than a peculiarity of human cognition. All told, the 13 variations drew 1,117 participants: 35 of them university undergrads and the balance volunteers from Amazon’s Mechanical Turk, a crowd-sourced marketplace where strangers take on advertised online “human intelligence tasks.” All were adults, with varying amounts of education, located in the United States and India.
On the whole, the humans did a credible job of answering correctly, but a sizable minority still made what might be called intentional missteps (and Lupyan factored out inattention and misperception). When Lupyan asked about the mistakes, only once did someone admit they goofed. Instead, he got answers that some numbers were more even than others, 400 compared to 798 for instance, or that “Those perfect sexy equilateral triangles are the most ‘trianglest.’” Lupyan found that someone who thought 400 was “more even” than 798 was likely to introduce other perceptual gradations into otherwise binary answers.
But what does it mean for 400 to be more even than 798? Is it that 400 looks more even? At issue, I think, is not perceptual similarity but overall representational similarity. People mistake 798 for an odd number not because it looks like an odd number. Rather, the reason 798 looks odd (at least odder than 400) is that the representation of 798 is closer to the representations of odd numbers than that of 400 is. 798 is almost odd.
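What might such a graded representation look like? Purely as a speculative toy model of my own (nothing like it appears in the paper), imagine scoring a number by the fraction of its digits that are themselves even. On that naive measure, 400 is maximally even while 798, despite being even, sits much closer to the odd numbers, which is roughly the ordering people’s “more even” judgments suggest:

```python
def evenish(n: int) -> float:
    """Hypothetical graded 'evenness': the fraction of digits that
    are even. Illustrative only -- true parity is binary, but a
    graded representation like this would make 798 feel almost odd."""
    digits = [int(d) for d in str(abs(n))]
    return sum(d % 2 == 0 for d in digits) / len(digits)

print(evenish(400))  # 1.0     -- every digit even
print(evenish(798))  # 0.33... -- mostly odd digits, though 798 is even
```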
The oddest part of the results, at least to me, is that I instinctively understand how someone might see a right triangle as more triangle-y than a scalene one, or a woman in her 60s with five grandchildren as more grandmother-y than a 39-year-old whose daughter just gave birth. There are canonical examples of each, and the instances that adhere most closely to those are the more, umm, authentic. There are certainly philosophical antecedents for this thinking; the more masochistic among us may attempt to recall Plato’s “theory of forms,” while the idea that some even numbers are more even than others has a whiff of Orwell about it: “All animals are equal, but some animals are more equal than others.” (Lupyan specifically explained he was looking more at the psychology of the idea of a “concept” than at its philosophy.)
That intuitive feeling about degree has been observed before; almost a half-century ago, Stanford’s Robert Moyer and Thomas Landauer showed that people make more mistakes in determining that five is greater than four than in determining that five is greater than one. But, as in the present case, to deny that three-sided figures are triangles or that women with grandchildren are grannies—that’s a step beyond, Lupyan argues.
Is this good or bad? Well, Lupyan fuzzily answers, it’s human. Computers aren’t distracted by extraneous information, and yet sometimes that extraneous information is important to the task at hand. And sometimes it’s not.
“This input-sensitivity is critical for obtaining the enormous behavioral flexibility that humans have, but may come at a cost to the ability to perform certain kinds of formal computations,” he explained in a release from Cognition’s publisher. “More broadly, the results tell us that the metaphor of the brain as a digital computer running formal algorithms is seriously misleading.” It’s an appropriate conclusion for a study that, in its own way, is about the breakdown of metaphor.