Privacy Rights and Corporate Accountability in the Age of "Big Data"

The American Library Association's current celebration of "Choose Privacy Week" highlights the diminishing scope, yet increased importance, of this choice. After all, privacy is a fundamental human right, preserving space for thinking, reading, writing, bodily integrity, identity, individual autonomy, and self- and community development, free from use, abuse, discrimination, exploitation, or interference by the government, companies, or other people. Privacy is integrally related to other fundamental rights, ranging from free expression, association, and assembly to freedom from coercive interrogation, torture, and extrajudicial killing, by drones or otherwise.

With ubiquitous surveillance, however, and privacy violations by major corporations like Facebook (whose very business model has involved fooling people into giving up their privacy) and Google (whose Street View cars sucked up email content as well as passwords), truly voluntary choice recedes. All the more important, then, to understand and assert your rights to "choose privacy" wherever possible.

Facebook and Google are only the most familiar businesses premised on scooping up personal data and manipulating it (and you) in order to make more money by selling "you" to their advertisers. "You" are defined by your "data exhaust" (i.e. the digital trails and datapoints you leave behind). Readers should know that e-book readers, like Amazon's Kindle, are increasingly doing the same thing with your reading habits, notes, highlights, and preferences. Beyond mere marketing, the potential harms include identity theft and more insidious manipulation and rights violations.

Such companies also have ongoing relationships with government. The better ones (e.g. Google) transparently disclose requests and insist on privacy and due process protections, like requiring a warrant beforehand; the worst in this regard (e.g. Apple, Amazon, and mobile phone companies Verizon and AT&T) routinely fail to protect users' data, essentially allowing government at times to "outsource" illegal, warrantless searches.

The unfolding "Big Data" trend, built on exponential increases in computing power and decreases in storage and processing cost, also unleashes social value beyond profit. Google's use of search terms to predict flu epidemics comes to mind (although that didn't work so well this past flu season).

In their new book on the Big Data "Revolution," however, Viktor Mayer-Schönberger and Kenneth Cukier acknowledge that Big Data dramatically increases risks to privacy (as well as separate risks of "Minority Report"-style harms from probabilistic prediction). They note that prior protective approaches don't work given the realities and scale of the new data trend.

The "notice and consent" approach falls short when people don't read ever-changing privacy policies, or when data holders routinely engage in mission creep: often beneficial new uses not contemplated let alone consented to at the outset, yet defensible in context. (Note however that many would still insist on user's rights to consent, at least to materially different secondary uses of personal information, and not merely for sensitive health, financial, geolocation, or children's information). The "Big Data" authors also note that "opt-out" approaches can be unfamiliar and difficult, sometimes leaving traces that could still harm users. And anonymization fails when it's so much easier than thought to identify or re-identify someone from their data.

Instead of such approaches, they propose, in essence, a corporate responsibility and accountability approach (applicable to government as well):

We envision a very different privacy framework for the big-data age, one focused less on individual consent at the time of collection and more on holding data users accountable for what they do. In such a world, firms will formally assess a particular reuse of data based on the impact it has on individuals . . .

. . . sloppy assessments or poor implementation of safeguards will expose data users to legal liability, and regulatory actions such as mandates, fines, and perhaps even criminal prosecution.

External and internal enforcers ("Algorithmists") within corporations and government would help protect privacy. Though the authors seem not to know it, this proactive risk-management approach resembles that of the recent UN Framework for Business and Human Rights and Guiding Principles, which require "Human Rights Risk Assessments" and corporate "due diligence."

The problem is that companies like Facebook and Google, while trumpeting the "shared responsibility" of users, government, and companies to protect privacy, in fact push responsibility onto users: to "leverage" the privacy "tools" companies give them, which, as we've seen, aren't fairly presented, understood, or used in practice. The companies tend to rationalize that privacy is already, essentially, dead. Aside from Mark Zuckerberg's famous pronouncement to that effect, Google Chairman Eric Schmidt's new book (written with Google Ideas director Jared Cohen, which probably accounts for its sometimes schizophrenic character) argues:

Since information wants to be free, don't write anything down you don't want read back to you in court or printed on the front page of a newspaper . . . In the future this adage will broaden to include not just what you say and write, but the websites you visit, who you include in your online network, what you "like," and what others who are connected to you do, say, and share.

Whereas the above UN Framework and Guiding Principles put the "due diligence" obligation on companies, Schmidt and Cohen (at 65) put it on citizens and consumers: to read and rely on those horrendous policies! And they cave in on such principles as the right to be forgotten at some point ("you cannot assume there is a simple delete button"). This is unsurprising, since losing that data would cost Google a great deal of money.

Not a pretty picture. So in the meantime, what to do? Don't give up just yet. Here are some of my personal high-level tips, none foolproof but all worthwhile:

  • Have a thoughtful Facebook and social media strategy, whether that means scrubbing your profile of truly personal information, wrestling with privacy settings, taking a break, narrowing your use, or, finally, if you can't take it anymore, leaving completely.
  • Don't rely on "opt-out" cookies, which are themselves problematic and don't work if you clear cookies at each logoff (as you probably should); instead, modify your cookie settings to refuse at least third-party cookies (maybe all), and install a browser privacy add-on such as EFF's HTTPS Everywhere. (What counts as "third-party" is illustrated in the sketch after this list.)
  • Don't leave geo-location monitoring "always on"; enable it only when you actually need it (e.g. while navigating).
  • Use anti-spyware tools like Spybot Search & Destroy and Ad-Aware.
  • Don't believe what the powers-that-be tell you: remain skeptical, educate yourself, and monitor privacy developments, including via the ALA's Choose Privacy Week website and guidance from EFF and others.
  • Support broader legal privacy protections and become more active with privacy-protective human rights civil society groups, such as EPIC, EFF, CDT, the ACLU, and Amnesty International, to resist privacy intrusions and encourage genuine governmental and corporate responsibility and accountability.
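
As for the cookie tip above, the idea of a "third-party" cookie is simple enough to show in a few lines. Here is a minimal sketch in Python (not any browser's actual logic, and the domain names are made up): a cookie is third-party when the domain setting it is not the site you are actually visiting.

    # A toy classifier, not any browser's real implementation: a cookie is
    # "third-party" when its domain differs from the site shown in the
    # address bar. All domains below are invented for illustration.
    from urllib.parse import urlparse

    def is_third_party(page_url, cookie_domain):
        """True if cookie_domain doesn't match the visited site's host."""
        page_host = urlparse(page_url).hostname or ""
        domain = cookie_domain.lstrip(".")  # ".ads.example" and "ads.example" match alike
        return page_host != domain and not page_host.endswith("." + domain)

    # The news site's own cookie is first-party; the ad network's is not.
    print(is_third_party("https://somenews.example/story", "somenews.example"))   # False
    print(is_third_party("https://somenews.example/story", ".adtracker.example")) # True

Refusing cookies that fail this first-party test is roughly what the "block third-party cookies" setting in mainstream browsers does, which is why it cuts off much cross-site ad tracking while leaving your logins on the sites you actually visit intact.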
Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.