University Assault Audits Don’t Tell the Full Story

A new study shows that sexual assault reports filed under the Clery Act rise during government audits and fall back afterward, suggesting drastic routine underreporting.

Sexual assaults, a serious problem at American universities for decades, are drastically underreported, even compared with sexual assaults among the general population. Worse, a new study shows, periodic government audits aren’t doing much, if anything, to improve the situation.

Among other things, the Clery Act requires colleges and universities to report the number of rapes and other sex crimes recorded on campus each year. And because the Department of Education conducts occasional audits of campus crime reporting systems, schools are, in theory, pressured into following through with that reporting. Whether those audits actually promote better campus sexual assault reporting, however, is doubtful. For one thing, studies have found that about one in five women are sexually assaulted during college, yet Clery Act statistics put the number at around one percent, suggesting that some schools may be deliberately underreporting sexual assaults to the government and to their own students.

One way to check that hypothesis, reasoned University of Kansas School of Law professor Corey Rayburn Yung, was to look at the number of reported sexual assaults before, during, and after a DoED audit or investigation. If reports go up during an investigation (which begins with an in-person audit lasting a few days but can continue for a year or more) and come back down afterward, that suggests colleges and universities are on their best behavior while under scrutiny, then revert to their wayward ways once it has passed.

Analyzing Clery Act reports and data on DoED audits from 2001 to 2012, Yung found that, among schools the DoED investigated, the reported rate of sexual assaults rose by an average of 44 percent during an investigation compared with the rates reported beforehand. After the inquiries ended, however, reporting dropped back to levels statistically indistinguishable from those pre-investigation rates.
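
To make the before/during/after comparison concrete, here is a minimal sketch in Python of the kind of check Yung describes. The figures and the helper function are invented for illustration only; Yung's actual analysis used the full 2001 to 2012 Clery Act dataset and formal statistical tests.

```python
# Hypothetical illustration of a before/during/after audit comparison.
# The figures below are invented for illustration; they are NOT Yung's data.

from statistics import mean

# Reported sexual assaults per 100,000 students at one hypothetical school,
# grouped by audit period.
before = [20, 22, 19]   # years preceding the DoED investigation
during = [29, 31]       # years the investigation was open
after = [21, 20, 22]    # years after the investigation closed

def pct_change(baseline: float, value: float) -> float:
    """Percent change of `value` relative to `baseline`."""
    return 100.0 * (value - baseline) / baseline

print(f"before -> during: {pct_change(mean(before), mean(during)):+.0f}%")
print(f"before -> after:  {pct_change(mean(before), mean(after)):+.0f}%")
# A large jump during the audit followed by a return to roughly the
# pre-audit baseline is the pattern Yung observed across schools.
```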

Some of the jumps strain credulity. Before the revelations about former assistant football coach Jerry Sandusky, Pennsylvania State University reported around 20 sexual assaults per 100,000 people, itself an implausibly low figure. During the subsequent investigation, Penn State's reported sexual assault rate increased 14-fold, to roughly 280 per 100,000.

While it could be that students simply report more sexual assaults during audit periods, that's unlikely to explain the data, according to Yung. Penn State's investigation notwithstanding, DoED audits tend to be fairly low-profile events: prior to 2009, the year the Sandusky scandal broke, they had garnered just 17 mentions in the media database LexisNexis. That suggests students are largely unaware of DoED investigations and, as a result, are unlikely to report assaults at a higher rate during those inquiries.

“The study results indicate that the sexual assault data supplied by schools is likely severely undercounting the number of reported incidents on campuses,” Yung writes, suggesting that greater penalties for Clery Act violations, more frequent audits, and probation for offending schools “would help abate the current pattern of schools returning to apparent undercounting practices as soon as the DoED is no longer applying high levels of scrutiny.”
