The American Library Association's current celebration of "Choose Privacy Week" highlights the diminishing scope - yet increased importance - of this choice. After all, privacy is a fundamental human right.
With ubiquitous surveillance, however, and privacy violations by major corporations like Facebook (whose very business model has involved fooling people into giving up their privacy) and Google (whose Street View cars sucked up email content as well as passwords), truly voluntary choice recedes. All the more important, then, to understand and assert your rights to "choose privacy" wherever possible.
Facebook and Google are only the most familiar businesses premised on scooping up personal data and manipulating it (and you) to make more money by selling "you" to their advertisers. "You" are defined by your "data exhaust," i.e., the digital trails and data points you leave behind. Readers should know that e-book readers, like Amazon's Kindle, are increasingly doing the same thing with your reading habits, notes, highlights, and preferences. Beyond mere marketing, the potential harms include identity theft and more insidious manipulation and rights violations.
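As a rough illustration of how little "exhaust" it takes, the sketch below aggregates a handful of routine events into the kind of interest profile that gets sold to advertisers. All events, topics, and field names are invented for illustration; real profiling systems are far more elaborate.

```python
# Toy illustration of "data exhaust": a few routine online events
# aggregated into an advertising interest profile. All data invented.
from collections import Counter

events = [
    {"type": "search",          "topic": "running shoes"},
    {"type": "page_view",       "topic": "running shoes"},
    {"type": "ebook_highlight", "topic": "marathon training"},
    {"type": "like",            "topic": "fitness"},
    {"type": "page_view",       "topic": "knee pain"},
]

# Advertisers buy access to the inferred interests, not the raw events.
profile = Counter(event["topic"] for event in events)
print(profile.most_common(3))
# [('running shoes', 2), ('marathon training', 1), ('fitness', 1)]
```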
Such companies have ongoing relationships with government. The better ones (e.g., Google) transparently disclose requests and insist on privacy and due process protections, such as requiring a warrant beforehand; the worst in this regard (e.g., Apple, Amazon, and mobile phone companies Verizon and AT&T) routinely fail to protect users' data, essentially allowing government to "outsource" illegal, warrantless searches.
The unfolding "Big Data" trend, built on exponential increases in computing power and decreases in storage and processing cost, also unleashes social value beyond profit. Google's use of search terms to predict flu epidemics comes to mind (although that didn't work so well this past flu season).
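To make the prediction idea concrete, here is a minimal sketch with invented numbers and a deliberately crude model. Google's actual system screened tens of millions of candidate queries; this shows only the shape of the idea: fit reported flu cases against weekly search volume, then extrapolate.

```python
# Minimal sketch of flu-trends-style prediction: fit reported flu
# incidence against weekly counts of flu-related search queries.
# All numbers are illustrative, not real CDC or Google data.
weekly_query_counts = [120, 340, 560, 910, 1300]  # searches for "flu symptoms"
reported_cases      = [15, 40, 70, 115, 160]      # clinic-reported flu cases

# Ordinary least-squares fit of cases = a * queries + b.
n = len(weekly_query_counts)
mean_x = sum(weekly_query_counts) / n
mean_y = sum(reported_cases) / n
cov = sum((x - mean_x) * (y - mean_y)
          for x, y in zip(weekly_query_counts, reported_cases))
var = sum((x - mean_x) ** 2 for x in weekly_query_counts)
a = cov / var
b = mean_y - a * mean_x

# Predict next week's cases from next week's query volume.
next_week_queries = 1500
print(f"predicted cases: {a * next_week_queries + b:.0f}")
```

As the 2013 flu season showed, a correlation like this can drift badly once search behavior changes, which is exactly why the aside in the paragraph above matters.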
In their new book on the Big Data "Revolution," however, Viktor Mayer-Schönberger and Kenneth Cukier acknowledge that Big Data dramatically increases risks to privacy (along with separate risks of "Minority Report"-type harms from probabilistic prediction). They note that prior protective approaches don't work given the realities and scale of the new data trend.
The "notice and consent" approach falls short when people don't read ever-changing privacy policies, or when data holders routinely engage in mission creep: often beneficial new uses not contemplated let alone consented to at the outset, yet defensible in context. (Note however that many would still insist on user's rights to consent, at least to materially different secondary uses of personal information, and not merely for sensitive health, financial, geolocation, or children's information). The "Big Data" authors also note that "opt-out" approaches can be unfamiliar and difficult, sometimes leaving traces that could still harm users. And anonymization fails when it's so much easier than thought to identify or re-identify someone from their data.
In place of such approaches, they propose, in essence, a corporate responsibility and accountability framework (applicable to government as well):
We envision a very different privacy framework for the big-data age, one focused less on individual consent at the time of collection and more on holding data users accountable for what they do. In such a world, firms will formally assess a particular reuse of data based on the impact it has on individuals . . .
. . . sloppy assessments or poor implementation of safeguards will expose data users to legal liability, and regulatory actions such as mandates, fines, and perhaps even criminal prosecution.
External and internal enforcers ("Algorithmists") within corporations and government would help protect privacy. Though the authors seem not to know it, this proactive risk-management approach resembles that of the recent UN Framework and Guiding Principles on Business and Human Rights, which call for human rights risk assessments and corporate "due diligence."
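As a concrete (and entirely hypothetical) illustration of what such a formal reuse assessment might look like in practice, the sketch below scores a proposed data reuse on invented dimensions and an invented threshold; the recorded assessments form the audit trail an "Algorithmist" or regulator would review.

```python
# Hypothetical sketch of the authors' accountability idea: before
# reusing data, a firm scores the impact on individuals and records
# the result. Dimensions, weights, and thresholds are invented.
from dataclasses import dataclass

@dataclass
class ReuseAssessment:
    """One firm's record of a proposed data reuse (hypothetical schema)."""
    purpose: str
    identifiability: int  # 0 = fully aggregated .. 3 = directly identifies people
    sensitivity: int      # 0 = innocuous .. 3 = health/financial/location/children
    safeguards: int       # 0 = none .. 3 = strong (encryption, access limits, audits)

    def risk_score(self) -> int:
        return self.identifiability + self.sensitivity - self.safeguards

    def approved(self) -> bool:
        return self.risk_score() <= 2  # invented threshold, for illustration only

# The log of assessments is what a sloppy firm would be held liable for.
audit_log = []
proposal = ReuseAssessment("ad targeting from e-book highlights", 2, 2, 1)
audit_log.append(proposal)
print(proposal.approved())  # False: escalate, add safeguards, or abandon the reuse
```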
The problem is that companies like Facebook and Google, while trumpeting the "shared responsibility" of users, government, and companies to protect privacy, in fact push responsibility onto users (to "leverage" the privacy "tools" companies give them, tools which, as we've seen, aren't fairly presented, understood, or used in practice). The companies tend to rationalize that privacy is already, essentially, dead. Aside from Mark Zuckerberg's famous pronouncement to that effect, Google Chairman Eric Schmidt's new book (written with Google Ideas director Jared Cohen, which probably accounts for its sometimes schizophrenic character) argues:
Since information wants to be free, don't write anything down you don't want read back to you in court or printed on the front page of a newspaper . . . In the future this adage will broaden to include not just what you say and write, but the websites you visit, who you include in your online network, what you "like," and what others who are connected to you do, say, and share.
Whereas the above UN Framework and Guiding Principles put the "due diligence" obligation on companies, Schmidt and Cohen (at 65) put it on citizens and consumers - to read and rely on those horrendous policies! And they cave on such principles as the right to have one's data forgotten at some point ("you cannot assume there is a simple delete button"). This is unsurprising, since deleting that data would cost Google real money.
Not a pretty picture. So in the meantime, what to do? Don't give up just yet. Here are some of my personal high-level tips, none foolproof but all worthwhile: