While Europe Cracks Down on Data-Collection Practices, U.S. Consumers Remain as Vulnerable as Ever

Data-protection regulations in the E.U. require transparency in data mining. In the U.S. no such rules exist, but some states are working to change that.
European Parliament President Antonio Tajani welcomes Facebook's Mark Zuckerberg at the European Parliament prior to his testimony on the Cambridge Analytica data scandal in May of 2018. Under a European regulation enacted that month, France recently fined Google €50 million.

On January 21st, France's data privacy watchdog, the National Data Protection Commission (CNIL), levied a €50 million fine (roughly $57 million) against Google for what the authority described as a "lack of transparency, inadequate information and lack of valid consent" over the way the company appropriated user data for personalized ads. The commission determined that Google had run afoul of the General Data Protection Regulation, which the European Union enacted in May of 2018 in the interest of safeguarding user data from corporate privacy abuses. Notably, no such regulations exist at the federal level in the United States.

According to a statement the CNIL put forth announcing the penalty, if a user wanted to know what Google was doing with the geo-tracking data that its Android smartphones were collecting, or which categories of personal data were being compiled for ad personalization purposes, they would have found it "excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information ... accessible after several steps only." Under the new E.U. regulation, technology companies that fail to provide users with clear or comprehensive information about which personal data is being culled in the interest of churning out hyper-targeted ads are subject to hefty fines, which can run as high as 4 percent of their global annual revenue.

"What the GDPR says is that, before a company does this, a company needs to notify consumers and get consumers' consent to do so," says Adam Schwartz, a senior staff attorney at the Electronic Frontier Foundation. Because of a lack of regulatory oversight in the U.S., however, companies can compile data sets of where a person goes, who they talk to, and what they buy with relative impunity.

In Google's case, the €50 million penalty it weathered this week can be fairly characterized as more of a flesh wound than anything else, especially when considering that Google's parent company, Alphabet, reported nearly $34 billion in revenue in just the last quarter. Nevertheless, the penalty marks the most significant blow any tech giant has faced under the GDPR thus far. While Google still has the option to appeal the fine with French administrative authorities, its initial response was to offer up the standard mea culpas and promises to renew its efforts not to violate the new guidelines moving forward.

"People expect high standards of transparency and control from us," a Google spokesperson told Recode in a statement. "We're deeply committed to meeting those expectations and the consent requirements of the GDPR. We're studying the decision to determine our next steps."

The GDPR has had a ripple effect throughout the tech industry in recent months, with platforms like Twitter and Snapchat publishing blog posts aimed at advertising their willingness to comply with Europe's new regulations.

But Schwartz says that the platforms' lobbying efforts in the U.S. tell a different story. "For many years privacy advocates have wanted stronger privacy laws," he says, "and for many years these companies have resisted them."

Schwartz says that Google, like other companies, "is essentially harvesting information bread crumbs from consumers across a larger number of services that [the company] is providing, bringing all of that info together to create one profile of a user, and then slinging ads based on that profile."

There is no one U.S. law that requires companies to get consent before acquiring information, but Schwartz described a patchwork of state laws, passed in recent years, that have sought to make it harder for companies to harvest user data without limits.

In 2008, the Illinois general assembly passed the Biometric Information Privacy Act, which mandates that companies obtain consent from individual users before collecting their biometric data, including fingerprints and facial-recognition data. And in California, the passage of the landmark California Consumer Privacy Act last year ushered in an unprecedented level of data protection for consumers, including the right to request deletion of any personal information that a company has collected.

While states have been slowly advancing their own legislation to bolster privacy protections, efforts have not fared as well on the federal level. Although the Obama administration debuted a "Consumer Privacy Bill of Rights" in 2012 with the intention of creating a "comprehensive blueprint to improve consumers' privacy protections," the proposal was criticized for its willingness to let Silicon Valley draft its own playbook, and the legislation designed to codify the proposal's privacy protections eventually fizzled out in Congress in 2016.

However, the din of voices in the U.S. calling for legislative solutions is only growing louder as Europe cracks down on big data collection. Since 2016, a series of high-profile data breaches has left consumers more scared than ever about how their personal information is being used and protected. In 2018 alone, disastrous leaks dominated headlines: Facebook was notoriously accused of knowingly exposing the personal data of up to 87 million users to data-mining firm Cambridge Analytica, and Marriott International saw its reservation system fall prey to a hack that left an estimated 500 million people exposed.

The widespread leaks have eroded public trust in organizations' ability to protect consumers' personal data. One Pew Research Center study published in 2017 found that roughly 64 percent of Americans had personally experienced a data breach, and more than half of respondents reported that they did not trust key institutions, including the federal government and social media sites, to safeguard their information.

And at least thus far, data protection seems to be low on President Donald Trump's immediate policy agenda, despite rumblings last year that the administration was working on crafting a set of policy proposals aimed at cracking down on unregulated tech giants. U.S. consumers hoping for data protections can pray for a miracle thawing of Washington, D.C., gridlock, but it's safe to assume that tech giants, which have been reaping the spoils of zero federal regulation up until now, won't come quietly.
