
Why Shaky Data Security Protocols for Apps Put LGBTQ People at Risk

For the LGBTQ community, the digital age should have opened an age of freedom. Instead, it has opened them to new threats across the globe.
Gay, lesbian and transgender activists react to the unanimous decision by the Iowa Supreme Court earlier in the day recognizing same sex marriage as a civil right during a celebration on April 3, 2009 at the University of Iowa in Iowa City, Iowa.

In 2016, Egyptian citizen Andrew Medhat was sentenced to three years in prison for "public debauchery." But he hardly engaged in acts that were debaucherous. Rather, police found out that Medhat was planning to meet up with another man, and officers were able to locate him through the gay hook-up app Grindr and arrest him. Being gay isn't illegal in Egypt. Not technically. But under the hazy guise of "debauchery," the police there have managed to bend the law in a way that allows them to intrude on the privacy of an especially vulnerable group of people.

For the LGBTQ community, the digital age should have opened an age of freedom. In the old, analog days, finding a relationship often involved risking exposure at a time when such exposure could lead to harm, or even death. Dating apps promised a chance to connect privately. But that promise is false if the state can access the data, or even the location, of someone via the app. Indeed, this group, long criminalized and pathologized, is often an afterthought when it comes to user privacy and regulations—which has resulted in a precarious digital landscape.

It feels important to note here that technology isn't inherently good; neither is it inherently evil. It's neutral and at the will of those who use it. That will can be malicious, as we saw with Egypt's use of Grindr—popular for the way it can connect gay men through their geolocation information. At first glance, this seemingly harmless method yields no direct consequences. But a deeper look reveals just how easily the app can be misused.

Consider how, within the past five years, attacks coordinated via Grindr—among other location-based applications—have repeatedly compromised the security of gay men. Cases have ranged from a serial killer in the United Kingdom, who used Grindr to lure unsuspecting gay men to him before killing them, to a case in the Netherlands last year, when Grindr was used to locate and attack two gay men in the town of Dordrecht. And this past January, two men in Texas were charged with conspiracy to commit hate crimes after they used Grindr to physically assault and rob at least nine gay men.

On the one hand, it's certainly true that anti-gay hate crimes like these can, and do, happen without location-based apps. After all, it's not just in the context of these hook-up apps that gay men in particular are more vulnerable; men who have sex with men have always been more vulnerable. This is due in no small part to ambient, state-sanctioned homophobia that has historically forced this sort of intimacy underground, where there has been little protection. (The professor and cultural historian James Polchin gets at this dynamic in his forthcoming book, Indecent Advances: A Hidden History of True Crime and Prejudice Before Stonewall.)

Still, it's also true that apps have opened up new avenues for these sorts of crimes to be committed, however unintentional that may have been on the part of the apps themselves.

I'd argue that there are two main reasons for this broader issue. First: wobbly privacy. It's fairly easy to pinpoint a user's location without its being explicitly—or consensually—given. This can occur through a process known as "trilateration." In short, if three people want to determine someone's location with a fair degree of precision, all they need is their own three locations, plus their respective distances from the person they're all in contact with. Then, using basic geometry, they can "trilaterate" this data to find the location of the unsuspecting person. (This was, essentially, the tack that the police in Egypt took to find Medhat.)
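To see why this attack requires nothing more than "basic geometry," consider the sketch below. It is an illustration, not Grindr's actual code or the Egyptian police's actual method: it assumes three observer positions on a flat (x, y) plane and exact reported distances, and solves the pair of linear equations that fall out of subtracting the three circle equations from one another.

```python
import math

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Estimate a target's (x, y) position from three known observer
    positions p1, p2, p3 and their reported distances d1, d2, d3 to
    the target. Each observation defines a circle; subtracting the
    circle equations pairwise yields two linear equations in (x, y)."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Coefficients of the 2x2 linear system A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    # Solve by Cramer's rule; a zero determinant means the observers
    # are collinear and the position is ambiguous.
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("observers are collinear; position is ambiguous")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Demo: three "observers" at known points, each knowing only its
# distance to a target actually located at (3, 4).
target = (3.0, 4.0)
obs = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(o, target) for o in obs]
print(trilaterate(obs[0], dists[0], obs[1], dists[1], obs[2], dists[2]))
```

In practice an attacker doesn't even need three accomplices: one person spoofing their own GPS position three times, recording the app's "distance away" reading at each spot, gets the same three circles.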

This first issue leads to a second—and in some ways more alarming—problem: Grindr's own terms of service actually spell out this security flaw. The privacy policy states that "sophisticated users who use the Grindr App in an unauthorized manner, or other users who change their location while you remain in the same location, may use this information to determine your exact location and may be able to determine your identity." But this admission is buried deep within the app's privacy policy page—itself part of the already lengthy terms of service.

When I recently examined the terms of service page, it wasn't only long—it was also littered with terms that may not be immediately understood by users outside the technology or privacy fields. Put another way, it's unlikely that users will take the time to read terms of service that are at once lengthy and phrased in a dense, inaccessible way. Instead, far too many users "consent" to the terms without fully understanding how their safety—their lives—may be at risk.

Indeed, the questions to ask, which have no direct answers, are these: Is it consent, truly, if users don't know what it is they're consenting to? Is it their fault if they don't bother to read the information given to them? Or do companies share some of the responsibility too—especially when it's a vulnerable, long-marginalized group that has to deal with the consequences?

Of course, this is an issue that permeates innumerable aspects of technology, not just apps like Grindr. Moreover, I'm not arguing that Grindr is the root of the problem. My point, rather, is that any piece of technology can be used in a way that inflicts harm on its users, and it's prudent to take these considerations into account when we have broader conversations on tech safety.

So, what to do about this?

For one, apps that use location services ought to be more cognizant of the implications that attend their use. This could take the form of limiting the ability to trilaterate users—for instance, by coarsening or randomizing the distances an app reports—and of encrypting location data so that it can't be intercepted. It's also crucial to present terms of service in an easily digestible way, for instance by jettisoning unnecessary jargon so that people, particularly those who might be at greater risk, can make informed decisions. And lawmakers, for their part, could be more forceful about holding app companies accountable when it becomes clear that there are safety shortcomings in their products that affect their users.
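One way an app could limit trilateration—purely a sketch under assumed parameters, not any app's actual implementation—is to stop reporting precise distances at all: snap each reported distance into wide buckets and add jitter before bucketing, so the three "circles" an attacker collects are too blurry to intersect at a point. The bucket and jitter sizes below are arbitrary illustrative values.

```python
import random

def obscured_distance_km(true_km, bucket_km=0.5, jitter_km=0.2):
    """Report a user's distance only approximately: add random jitter,
    then round to the nearest coarse bucket, clamping at zero.
    Hypothetical mitigation for illustration only."""
    jittered = true_km + random.uniform(-jitter_km, jitter_km)
    return max(0.0, round(jittered / bucket_km) * bucket_km)

# A user 3.27 km away is shown only as "about 3 km" or "about 3.5 km".
print(obscured_distance_km(3.27))
```

One caveat worth noting: fresh random jitter on every query can be averaged away by a patient attacker, so a more careful design would fix the jitter per pair of users rather than re-rolling it each time.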

Examples of putting this into action are already on display. In Europe, the General Data Protection Regulation seems to be changing the face of data privacy on a global scale. "Big U.S. firms are already required to comply with the GDPR for European markets, so it makes sense to extend a similar approach to the U.S.," says Marc Rotenberg, president of the Electronic Privacy Information Center, a Washington, D.C.-based advocacy group.

This European Union law on data and consumer rights was once deemed hard to implement. But as privacy breaches continue to evolve with technology, it makes sense to think critically about the breaches that may be ahead and put into practice laws to protect the otherwise unprotected.

Both online and beyond, it's clear that the rights of some groups, like those of gay men, are more tenuous than others'. Why not reaffirm our commitment to the protection of all citizens?

This story originally appeared in New America's digital magazine, New America Weekly, a Pacific Standard partner site.