When Stereotypes Collide

When applicants are considered for a tech job, race and gender matter.

If you’re an Asian woman looking to get hired in the technology world, your last name is an asset, but your first name is a liability.

That’s the implication of a newly published study, which finds that subtly emphasizing one identity or the other influences men’s perceptions of an applicant’s skill level, as well as her value to a company.

Three experiments conducted by a research team led by Aneeta Rattan of the London Business School provide evidence that “the same individual can receive different evaluations, depending on which of her stereotyped identities is salient.”

The first experiment featured 43 male undergraduates at an American university. Each was assigned to interview and evaluate a woman of Asian ancestry who was applying for a “user assistant computer technician position.”

For one-third of these sessions, the application turned in by the purported candidate listed her name as “Gloria.” “Sex was provided in a large, centered box,” the researchers note, “and race was to the side in smaller print.”

For another third, “the application listed the confederate’s name as ‘Chia-Jung Gloria Tsay,’” and noted that she spoke Chinese as well as English. In addition, “race was provided in a large, centered box, and sex was to the side, in smaller print.”


For the final third, “the application listed her name as Gloria Tsay,” and “both race and sex were provided in smaller print to the side.”

Otherwise, the applications were identical, and the “candidates” (two research assistants) provided the exact same responses to the standard interview questions participants were instructed to ask.

Afterward, the men rated the applicant’s skills; noted on a one-to-nine scale how willing they would be to hire her; and indicated the starting pay rate they would recommend.

The results: The applicants received the most positive ratings, and were ranked as the most hirable, when their Asian identity was front and center. In contrast, they placed at the bottom of both of those rankings when their gender was emphasized.

In addition, the recommended starting salary averaged $2,267 more per year for an applicant who emphasized her Asian ancestry than for one who emphasized her gender.

Two larger, online studies essentially replicated these results. In addition, one found this dynamic did not apply to a position not subject to gender stereotyping. Emphasizing either race or gender did not alter men’s perceptions of a female applicant’s skills or hireability when they were told she was applying for a job teaching English literature.

The results, published in the journal Group Processes & Intergroup Relations, suggest Silicon Valley—and, arguably, American business in general—has a lot of work to do to ensure non-traditional applicants are evaluated fairly. Those who make hiring decisions need to be schooled in the realities of unconscious bias, and how to overcome it.

Until that day arrives, the findings suggest a strategy if you’re trying to break into a field where most workers don’t look like you. Hone your skills. Do your research. And put your best stereotype forward.
