Can Algorithms Put a Stop to Partisan Gerrymandering?

Researchers are developing algorithms to draw non-partisan district maps.

The Supreme Court heard arguments on Tuesday in a case of partisan gerrymandering that could reshape the future of American democracy.

In Gill v. Whitford, the justices will determine whether federal courts can strike down district maps for being too partisan. The challengers chastise Wisconsin Republicans for monopolizing the 2012 state elections through gerrymandering, a practice in which state legislatures redraw district lines to favor one party.

After hearing arguments on Tuesday, the justices seemed to agree that an extreme case could cross a constitutional line, but they remained divided on where to draw it.

“Gerrymandering is distasteful,” said Justice Samuel A. Alito Jr., “but if we are going to impose a standard on the courts, it has to be something that’s manageable.”

States have long searched for this “manageable” method, as Steffanee Wang wrote in Pacific Standard last year. Political scientists agree there is no “one-size-fits-all” method; instead, states have tested reforms on an individual basis.

To eliminate partisanship, Arizona and California turned to independent redistricting commissions run by non-partisan, unelected civilians. Though members were vetted for party allegiance and conflicts of interest, even these commissions were not immune. In an increasingly polarized political climate, political scientists note that it's hard to find neutral citizens. Worse, political players can hijack the system, as one Democratic commission chairperson did in Arizona.

Other states had their own takes on reform. In New Jersey's commission, a chairperson appoints party leaders and pushes them to compromise; Iowa relies on an independent advisory group.

But in the past year, researchers have touted a new solution, which takes people out of the equation: using math to draw the map.

This satirical map, dated 1813, reflects the origin of the word gerrymander.

(Photo: Wikimedia Commons)

In a paper released in September, data scientists from the University of Illinois–Urbana-Champaign proposed using an algorithm to analyze census blocks and create congressional districts based on specific parameters. Led by computer science professor Sheldon Jacobson, they built on decades of research to make their algorithm more accessible and equitable.

This is the most recent development in the growing field of algorithmic redistricting. In 2016, another team from the University of Illinois–Urbana-Champaign developed an algorithm that could generate every possible district map and evaluate the level of partisanship. Duke University professor Jonathan Mattingly did a similar analysis in 2017. On the Internet, free open-source programs let people draw district maps optimized for compactness without a supercomputer.
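The enumerate-and-evaluate idea can be illustrated with a toy model. The sketch below is a simplification of the approach described above, not any team's actual code: it assumes census blocks sit along a one-dimensional strip (real algorithms work on two-dimensional block adjacency graphs) and that every resident votes, then enumerates every way to cut the strip into contiguous districts and counts how many districts one party would win under each map.

```python
from itertools import combinations

# Toy data: six census blocks, each as (population, party-A votes).
# Party A holds exactly half the votes overall (300 of 600).
blocks = [(100, 60), (100, 40), (100, 45), (100, 70), (100, 30), (100, 55)]

def all_maps(n_blocks: int, n_districts: int):
    """Yield every partition of blocks 0..n_blocks-1 into contiguous districts."""
    for cuts in combinations(range(1, n_blocks), n_districts - 1):
        bounds = (0,) + cuts + (n_blocks,)
        yield [list(range(bounds[i], bounds[i + 1])) for i in range(n_districts)]

def seats_won(district_map):
    """Count districts where party A holds a strict majority of votes."""
    wins = 0
    for district in district_map:
        votes_a = sum(blocks[i][1] for i in district)
        pop = sum(blocks[i][0] for i in district)
        wins += votes_a * 2 > pop  # assumes everyone votes
    return wins

# Across all ten 3-district maps, the same electorate yields 1 or 2 seats
# for party A -- the spread of outcomes is what reveals gerrymandering.
seat_counts = sorted({seats_won(m) for m in all_maps(len(blocks), 3)})
```

Even at this tiny scale, where the maps are drawn determines whether an evenly split electorate gets one seat or two; comparing an enacted map against the full distribution of possible maps is the core of this style of analysis.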

These types of algorithms have been used for decades to aid in partisan gerrymandering, but recent projects seek to put them to work for the public good.

Even Jacobson’s new algorithm, however, will need to be programmed to optimize for certain goals—and so we’re back to the problem of “standards.” Early models prioritize compactness and equal population, but these standards can be just as partisan. Tufts University mathematician Moon Duchin, who studies the compactness metric for gerrymandering, has said the field is a “mess.”
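Part of the "mess" is that compactness itself can be measured many ways. One widely cited measure (offered here as an illustration, not as the metric any particular team uses) is the Polsby-Popper score, 4πA/P², which compares a district's area to that of a circle with the same perimeter: a perfect circle scores 1, and contorted shapes score near 0.

```python
import math

def polsby_popper(area: float, perimeter: float) -> float:
    """Polsby-Popper compactness: 4*pi*A / P^2.
    Ranges from near 0 (highly contorted) to 1 (a perfect circle)."""
    return 4 * math.pi * area / perimeter ** 2

# A square district: area 1, perimeter 4 -> pi/4, about 0.785.
square_score = polsby_popper(1.0, 4.0)

# A long sliver (1 x 0.01): area 0.01, perimeter 2.02 -> far less compact.
sliver_score = polsby_popper(0.01, 2.02)
```

The catch, as Duchin's work emphasizes, is that a district can score well on one compactness measure and badly on another, so the choice of metric is itself a contestable standard.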

Determining these criteria is political, not scientific. This is why some political scientists propose using a combination of algorithms and independent commissions. Others look to the Supreme Court.

On Tuesday, Chief Justice John Roberts left the court with another warning: The justices can embrace science and mathematical theories, but the public will still say “baloney.”
