"When I came public it wasn't to single-handedly change the government. I wanted to inform the public so they could make their own decision," Snowden told the audience. "I took an oath to support and defend the Constitution and I saw that the Constitution was being violated on a massive scale."
You can see his complete appearance at SXSW here.
Monday's session was moderated by Ben Wizner, director of the ACLU Speech, Privacy and Technology Project, and a legal advisor to Edward Snowden. The day before he left for SXSW, Wizner spoke with Moyers & Company's senior writer Michael Winship.
Special thanks to John Light for recording and editing, and to Helen Brunner, director of the Media Democracy Fund, for helping to make this interview possible.
Winship: Tell us about the [American Civil Liberties Union's] Speech, Privacy and Technology Project. What's its agenda?
Wizner: We're trying to plant our flag at the point where science and technology and civil liberties intersect. The rapid developments in science and surveillance technologies really do have an impact on a whole range of rights, not just privacy, and we want to have the institutional expertise to be able to identify what those issues are and shape sophisticated responses. So the project has lawyers, but not only lawyers. We have two full-time computer scientists on our staff who are technologists, who are experts in encryption, secure communications, surveillance technologies. And they've been able, not only to help us respond with more intelligence, I would say, to the intelligence surveillance scandal, but also to help identify issues that we might not have intuitively understood to be civil liberties issues.
We want to be looking at issues that are not ready to be litigated, but might be important issues in five years, or eight or ten. One of the things I like to do is go into a room of scientists, maybe neuroscientists, and say, "What do you think the ACLU should know and be worried about?" and just see what kind of conversation comes out of that. And it's a really amazing kind of exercise. You might have somebody say, "You need to be worried about fMRI brain scanning. The research on this is really scary. It makes the polygraph seem like child's play. This is going on in classified Pentagon labs and you need to be thinking about this right now..." It might be somebody talking about other biometrics that are not at the forefront, or what's going to happen or what the dangers are of universal DNA screening at birth and what protections are necessary.
So sometimes we might end up writing a white paper about an issue like this. Other times, we might prepare litigation, and other times we might use our network of 50 ACLU state affiliates to push for immediate legislation through the state legislature.
Winship: So, really, what you're saying is there's something in almost everything that everybody does that's affected by the work you're doing.
Wizner: The reality is that, in the last year in particular, and really the last two or three years, there's been a disproportionate focus on digital privacy and government surveillance. And that focus has certainly exploded in the eight or nine months since Edward Snowden began to dominate the global headlines with revelation after revelation about the scope of government surveillance. And so it's sort of an "all hands on deck" moment for that issue right now. A debate like this may come along once in a generation, an opportunity to kind of hit the reset button and for us to really reinvigorate our oversight mechanisms, and also encourage changes in the technology community to correct some of the asymmetries in power between the state and the citizens.
Winship: How did you first learn and know about Edward Snowden? When did that happen?
Wizner: I had been a friend, and at times advisor, to Laura Poitras, and a friend, and at times, source, to Glenn Greenwald. These were people with whom I was in very regular communication, and still am. And so when Laura Poitras first received a communication from someone who claimed to be in a senior position in the intelligence community, who claimed to have access to documentary evidence of illegal, unconstitutional and very, very troubling government surveillance activities, I was one of the people to whom Laura reached out for advice about what the next steps were. At that time, she had been subjected to so much surveillance herself that it was at the front of her mind that this might be an effort to entrap her, and so she wanted to make sure that she didn't put herself in any legal jeopardy or unnecessary risk. And so over the course of the next several months, leading up to the day in late May or June when Laura Poitras and Glenn Greenwald got on the plane to go meet Snowden, I had been having conversations with them. But I didn't know Edward Snowden's name or his identity until you did, until The Guardian put on its website that first video interview, in which Glenn Greenwald asks Edward Snowden who he is, what he's done and why he's done it. We were gathered around computers in our office watching that, just as I imagine people were gathered around computers in offices in Silicon Valley, in Fort Meade [National Security Agency headquarters], in Washington and in a lot of other places where people watch these issues closely.
Winship: How would you characterize what he has revealed?
Wizner: Well, maybe the best way to answer that question is to remember what President Obama said in the first week after the revelations began to appear on front pages. He said Americans shouldn't be too worried about these disclosures because all three branches of government had blessed the programs and activities that were being disclosed. That was a true statement. That was also exactly the problem. And it's worth looking at what those same three branches of government have done since Edward Snowden's disclosures, since the public was brought into this conversation.
So let's look at the courts. Now, it's true that a court called the Foreign Intelligence Surveillance Court had approved, in secret, some of these programs. It's a court that hears only from the government, does not have the benefit of adversarial briefing, didn't get to hear what our objections would have been. It's also a court that was set up to give warrants, not to write opinions on whether surveillance programs in general were lawful. And when we tried to bring challenges to these programs in open federal courts, we got as far as the Supreme Court, but every court turned us away without even considering the legality of the programs. The government said, "These plaintiffs have no right to be in court. They can't show that they were subjected to these surveillance programs, and therefore they don't have standing. And they're not allowed to use the discovery process to learn that, because that would be a state secret." The result being that no one has the right to go into federal court to challenge the legality of these programs.
Edward Snowden was watching this. In our very first conversation, one of his first questions to me was, "Have these documents that have been published so far given you standing to go back in court?" To him, the idea that a court would not answer the question, "Is this program legal? Is it constitutional?" but instead would contort itself in order to not answer that question seemed like a failure of oversight, and he was right.
What's happened since his disclosures? We have now taken some of these documents, gone back into federal courts, where our standing is really much harder to question. Two federal judges have now considered, for example, the constitutionality of the government's collection of all telephone metadata. They've so far come to different conclusions on the legal question, but both said that the plaintiffs have standing to be in court. So one thing that he's done is he's reinvigorated judicial oversight.
Now, what about Congress? To me, the signal moment in Congress is [Senator] Ron Wyden asking [Director of National Intelligence] James Clapper, "Is there any kind of information that you collect on millions or hundreds of millions of Americans?" And Clapper says, "No, sir, not wittingly." We like to call this Clapper lying to Congress, and it's certainly that. But it would be much more accurate to say that Clapper was lying to the American people, because Senator Wyden knew that the answer was false. He didn't -- he felt like he couldn't -- correct the answer. No one else on the committee corrected the answer. Clapper didn't correct the answer, no one on his staff, no one in the Administration. So what we had was a lie being told to Congress and no one in any branch coming forward to say that a lie had been told. And Snowden was watching that, too. And what's happened in Congress since the public disclosures? The issue has come out of the intelligence committees and into the full Congress. There is historic bipartisan legislation that would end bulk collection of Americans' data, that would create an adversarial process in the Foreign Intelligence Surveillance Court. This is the kind of legislation that would've been absolutely unthinkable before Snowden.
The direction has been one-way since the late 1970s. The Deep State has more authority, not less. The opposite is going to happen now. Now, whether it's something that seems more cosmetic or something that really is historic, well, that's really up to the people to decide. We will see. But there's been an earthquake in the congressional oversight of these programs, and that's because of Snowden.
And even the executive branch, which said, "Nothing to see here" -- you know, the president appointed his own review board, which included former very high-ranking intelligence community officials and other close friends of his. I think it's fair to say that the civil society organizations expected a whitewash. But that's not what we got. The conclusions were -- more politely stated -- that the NSA had essentially gotten out of control, that it allowed its technological capabilities to drive its practices, rather than having its practices constrained by laws and values, and even wisdom. And there were dozens and dozens of recommendations that went not only to giving Americans greater protections, but also to protecting people abroad. And you heard the president in January, in his big speech about the NSA, say -- first time for any president, I think -- that we need to be concerned about the privacy rights of people outside the US who are not protected by the Constitution.
So all three branches of government are now doing the oversight that the Constitution wants them to do, that they were not doing before Edward Snowden. To me, that is his most significant contribution.
Winship: And you feel that the route he took, via journalists, was the one and only way he could go?
Wizner: I guess at times I wonder what people mean when they say he should have gone through a traditional route rather than going through journalists. Sometimes, the kinds of people who we call whistleblowers -- and I don't use that term as a term of art -- are people who uncover unquestionably illegal conduct that's been hidden away and they just need to bring it to the attention of an overseer, call up an inspector general, call up a member of Congress and say, "Look what I found," and then the system will take care of itself. But sometimes, someone comes upon a system of global dragnet surveillance that the oversight system deems perfectly legal. This is not something that Congress was unaware of. This is not something that courts were unaware of, at least the courts that were set up to review these practices. What was Edward Snowden supposed to do, call up the Senate Intelligence Committee and say, "Hi, I'm a 29-year-old contractor who works in Hawaii, and I'm calling to report to you about the programs that you have approved in secret?" This was a very, very different kind of situation. There was no one to report to who had not been part of the system of approval. And even those who were in the Congress who shared Snowden's view about the propriety and maybe legality of this were unwilling to talk. Senator Wyden was on the floor of the Senate with his hair on fire, saying, "If the American people knew what I knew, they would be angry and they would be shocked." Well, that turned out to be true, but we didn't learn it from Senator Wyden. We learned it from Edward Snowden.
And one more point about what he did. You know, the number of documents that Edward Snowden has made available to the public is zero. What he did is give information to journalists, with the instruction that they and their editors, in consultation, where necessary, with government officials, decide what was in the public interest to publish, and to withhold information that would be harmful to publish. He wanted to create a protocol that would correct for his own biases. He was someone who had spent the last almost ten years in the intelligence community. He didn't think that his own judgments -- and he has very strong judgments about what should or should not be public -- were adequate to this moment and wanted to make sure that the institutions that had the experience in doing this -- and these are our newspapers, who have long experience competing with the government over access and control of secret information -- that that be the way that the information got published. And many people have not noticed this. In an interview that Snowden gave to Time magazine when he was runner-up to the Pope for Person of the Year, he said he hasn't always agreed with the public interest determinations of the journalists, but that that's precisely why he needed to do it this way. He didn't want and didn't think that he should have the responsibility to decide which of these documents should be public. He wanted to appeal to the traditions, the institutions, the expertise of the media in helping to make those important judgments. That's what we want whistleblowers to do. We don't want them to unilaterally substitute their judgment for everybody else's. We want them to go through these institutions that funnel and channel that information and have longer experience in making these kinds of decisions.
Winship: And yet, Keith Alexander, the outgoing head of the NSA, made a speech at Georgetown a few days ago in which he said that journalists don't have the proper ability to analyze these materials, and he said that Snowden's leaks had caused "grave, significant and irreversible damage to our nation."
Wizner: Those words are the classic weasel words of the Deep State. That sentence could have been lifted from the United States government's brief to the Supreme Court in the Pentagon Papers case, where they said if the Court allowed The New York Times and Washington Post and others to publish the papers they would be responsible for "grave and irreversible damage to the national security." It's exactly the same kind of language.
You know, I wonder if General Alexander really believes that our democracy would be stronger and better off if journalists deferred in every case to the expertise and interests of the executive branch in deciding what to publish. I mean if you just look back at the last few years and consider what the public would not have known on that model, we would not have known that the case for war in Iraq was cooked and based on deliberate exaggerations and lies. We wouldn't have known that American soldiers tortured and sexually humiliated prisoners in Abu Ghraib. We wouldn't have known that the CIA set up a network of secret prisons around the world and used an extraordinary rendition program to kidnap people off the streets of cities, chain them to the floors of planes and fly them to those places. We wouldn't have known about an enhanced interrogation program, known to the rest of the world as torture, where prisoners were waterboarded, suspended from ropes, beaten and treated in ways that the US has always considered criminal when it was done to our own soldiers. We wouldn't have known that the Bush administration decided that the rules for foreign intelligence surveillance collection were too cumbersome and that they should just throw aside the statute and collect whatever they wanted to, under the president's own authority, leading to the near resignation of the attorney general and the head of the FBI.
All of this stuff was classified. Not just classified; it was classified at the highest level. These were the secrets that the government said were most critical to keep. But what kind of democracy would we be if the public had never learned of this information?
I'm also not saying that journalists alone should decide what the public sees. I mean the government's voice in this debate is an important one. It's a back and forth. It's always been a back and forth. I don't believe a single story based on Snowden documents has yet been published without consultation with the government, without giving the government an opportunity to strenuously object and to point out things that might cause harm in their view. And that's why I don't think there's been any credible evidence at all of real harm to national security from these leaks.
Winship: From your position here at the Speech, Privacy and Technology Project, were you, like Senator Wyden, shocked by the extent of the surveillance?
Wizner: The short answer is yes. And I'm not easy to shock. And I question whether I was being naive, but it never occurred to me that the NSA or Department of Justice could think that a provision like Section 215 of the Patriot Act, which is similar to a provision giving grand juries subpoena authority, could be used to get the records of every single American phone call every day. It simply didn't make any sense. Or that the government could think that it was a valid constitutional argument to say that the Fourth Amendment basically is silent when it comes to intercepting and storing all of our communications -- it has nothing to say, it's not even implicated -- that only when someone decides to query that database in search of a particular person does the Fourth Amendment have even a limited role, that they're allowed to collect everything without having to consider the Fourth Amendment. It seems to me that that turns the Fourth Amendment on its head, that we do search and seizure before suspicion, rather than having suspicion before search and seizure.
This stuff, the extent of it, was shocking, and I was also shocked by some of the NSA's efforts to undermine the security of the Internet in our global communications in order to facilitate their mass surveillance. The idea that they would deliberately weaken commonly used encryption standards that are used to protect our financial records, that are used to protect our medical records, our networks -- and they would do this even as other parts of the government are telling us that the cyber threat is greater than the terrorism threat right now. Well, in the name of fighting terrorism, the offensive surveillance that they're doing is weakening our cyber defenses against those other kinds of threats. This is going on in the very same government. And the amount of that, the way that offense had just been sort of given everything at the expense of defense, did come as something of a shock, and I think not just to people in the ACLU, but to people in the technology community, many of whom did not consider themselves political, saw politics as messy, liked the elegance of ones and zeroes and code, but now find themselves having to treat the NSA as a different kind of threat model when they're thinking about security of networks and systems.
Winship: And yet, certain of these companies were also selling information to the NSA, or sharing information.
Wizner: I think the relationship between the NSA and these companies is complicated. It's not one relationship, it's many relationships. And I do think that, in particular, the Silicon Valley companies -- that have a sort of self-image as being libertarian; they have the ethos of being against evil and on the side of freedom -- certainly want to minimize the ways in which they facilitated the kinds of mass surveillance that's gone on.
On the other hand, I think there is a difference between a program like PRISM, where the government goes to the technology companies with orders from a court and then they open up their systems in response to those orders, and other programs that we read about where the NSA looks for the places in these companies where data is moving around without encryption -- so transfers of data between data centers overseas -- and then hacks into those connections. I don't think they ask permission to do that, and I think that it's not all hypocrisy when the companies say, "I can't believe they did that." Their attitude is, "You could get what you wanted with a court order. The law is so favorable to you now, why are you hacking us? Why are you weakening us?"
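To make the technical distinction concrete: the gap Wizner describes existed because data replicated between data centers traveled over private links in the clear. A minimal sketch of the kind of fix that closes it -- encrypting each record before it crosses the link, so a passive tap sees only ciphertext -- might look like the following. This is purely illustrative, assumes the third-party Python cryptography package, and is not how any particular company actually implemented transit encryption; all names are placeholders.

```python
# Purely illustrative sketch: seal each record before it crosses an
# inter-datacenter link, so that a passive tap on the link sees only
# ciphertext. Assumes the third-party "cryptography" package; the key
# handling and names here are placeholders, not any company's design.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # shared by sending/receiving sites
aead = AESGCM(key)

def seal_for_transit(record: bytes, link_id: bytes) -> bytes:
    """Encrypt a record for replication; the link id is bound as associated data."""
    nonce = os.urandom(12)                  # unique 96-bit nonce per message
    return nonce + aead.encrypt(nonce, record, link_id)

def open_from_transit(blob: bytes, link_id: bytes) -> bytes:
    """Decrypt at the receiving data center; raises if the blob was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, link_id)

if __name__ == "__main__":
    sealed = seal_for_transit(b"user-mailbox-delta", b"dc-a->dc-b")
    assert open_from_transit(sealed, b"dc-a->dc-b") == b"user-mailbox-delta"
```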
And I do think that now that the debate is out in the open, people around the world -- known to these companies as markets -- are very nervous about American companies, because they have seen how the NSA treats those companies like a silver platter for dragnet surveillance. You know, the companies face a kind of existential threat. They're either going to have to do more to protect their customers' data -- to give them end-to-end encryption -- or they're going to lose those customers. And so maybe it was easier for the companies when this debate was in the shadows or when it wasn't really known. Now they really are going to have to make these choices. They can't present the same smiley face to the public and to government.
Winship: And it's really a global issue, not just a domestic one; the companies have clients all over the world and there's pushback from those clients.
Wizner: And you see the way that the NSA operates. They might tell Germany, "We're not going to take data of Germans in Germany," so they'll wait until it leaves Germany and transits to Denmark and take it there. And they might say the same thing to Denmark, "We're not going to take information from the people of Denmark in Denmark," but they'll get it when it transits to Germany. And because of the NSA's access to these global communications systems, the issue is global, as you say. No amount of domestic reform litigation is going to be equal to the size of the challenge, so there is a sort of broader challenge to free societies about how to deal with the question of mass surveillance. We're talking -- when I say mass surveillance, I'm saying basically collecting everything, tapping into the backbone of the Internet and siphoning off everything and storing it, which we can do now because storage is cheap. We used to have to decide who we were going to follow. Now we can follow everybody. And there's no question that having all of that information is going to be useful to governments in some way. You create a database that can be a surveillance time machine. It can allow governments to hit rewind and recreate our lives in ways that would surely help solve some crimes.
It also is, I think, a real danger to give governments that much power over their citizens. And so I think there has to be a sort of law and policy debate that takes place at a global level, at least among free societies. But also, remember that when Google switched its Gmail traffic from being unencrypted by default to being encrypted by default, something it did in 2010, that affected hundreds of millions of people instantly, at the flip of a switch, and not just people in the United States. Google just made it almost impossible for the government of Iran to get Gmail information from Iranians who are using the service. So the technology companies, the fixes that they put in place will be global fixes and not just for Americans.
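The "flip of a switch" Wizner describes is, mechanically, a provider-side default: serve every session over TLS and push any plain-HTTP request onto the encrypted channel. As a rough, generic illustration only (not Google's implementation; the hostname and port are placeholders), such a front-end default might be sketched like this:

```python
# Generic sketch of "encrypted by default": a plain-HTTP listener whose only
# job is to redirect every request to the HTTPS version of the site.
# Hostname and port are placeholders; this is not any provider's actual setup.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent redirect to the encrypted endpoint
        self.send_header("Location", "https://mail.example.com" + self.path)
        self.end_headers()
        # In a real deployment the HTTPS responses would also carry a
        # Strict-Transport-Security header so browsers stop trying HTTP at all.

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), RedirectToHTTPS).serve_forever()
```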
Winship: But in terms of Americans, what do you say to people who say, "Look, they're just trying to protect me. What's the harm? I haven't done anything wrong. I've got nothing to hide"?
Wizner: I think that many Americans do feel that way. I do think that, when you talk about the dangers of mass surveillance, they can seem abstract. They can seem futuristic, they can seem science fiction-y. They're not as easy to digest as a villain like J. Edgar Hoover, who was blackmailing Martin Luther King and infiltrating movements, or a Stasi [former East German secret police] keeping detailed files on neighbors and all of that. And I start by saying we are not a society like that yet. We are fundamentally, for most people, at least, a free society. I mean there are communities like the Muslim community that have experienced a much more invasive form of surveillance, with informants in their mosques and people in their community, provocateurs. But for the most part, that's not what it feels like now to live in America.
Having said that, without the right kinds of democratic controls, these kinds of technologies frighten me. And they frighten me for a number of reasons. As I said, our principal privacy protection in the past was probably more cost than law. Governments simply had to make decisions about what they were going to collect and what they were going to store, because it was expensive to do those things. Now that those costs are plunging to zero and Moore's law [1] marches on, it's technologically and financially feasible for governments to collect and store everything about us, all of our movements, all of our communications, all of our associations, all the information they need to construct portraits of our lives that might be more detailed even than things that we know about ourselves. And if you add to this other kinds of surveillance that surely are coming -- drones in the sky with ARGUS cameras that can record entire cities second by second and reconstruct them -- we will be living in a different kind of panopticon, and I do think that that is going to cast a chill on what it means to be a human being, if we know that every moment that we consider private now is actually residing somewhere, and someone somewhere can hit rewind and see it, even if we ourselves are not suspected of doing anything wrong.
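To put rough numbers on why "collect and store everything" has become financially trivial, here is a back-of-the-envelope calculation; every figure in it is an assumption chosen for illustration, not data from the interview:

```python
# Back-of-the-envelope sketch of the storage cost of nationwide call metadata.
# Every number below is an illustrative assumption, not a reported figure.
population       = 250_000_000   # assumed number of phone users
calls_per_day    = 10            # assumed calls/texts per person per day
bytes_per_record = 150           # assumed size of one metadata record
usd_per_tb_month = 20            # assumed monthly cost to store one terabyte

records_per_year   = population * calls_per_day * 365
terabytes_per_year = records_per_year * bytes_per_record / 1e12
cost_per_year      = terabytes_per_year * usd_per_tb_month * 12

print(f"{records_per_year:,} records/year, about {terabytes_per_year:,.0f} TB")
print(f"rough storage bill: ${cost_per_year:,.0f} per year")
# Roughly 140 TB a year and on the order of tens of thousands of dollars to
# keep it -- trivial next to an intelligence budget, which is the point above.
```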
If that's too abstract, I think that when governments amass this kind of information and unleash upon it computer algorithms that are trying to make predictions and judgments -- you know, who's dangerous, who might be a terrorist, who should we be more worried about? We've seen what happens on the commercial side every day with these kinds of processes, because of course supercomputers that belong to insurance companies and others are trying to calculate what kind of risk we are. You have a credit score. If anybody's ever made a mistake with a credit score of yours, I bet it was not that easy to have it resolved, and in the meantime, you might not have gotten a loan, you might've been redlined, you might've had much higher car insurance costs. I think when this tendency moves over to government -- and it's happening already -- we already have watch lists that say that some people can fly and some people can't, or that they can go in this lane or they can go in that lane. But that's only going to proliferate as there's more and more data and faster and faster computers and more confidence that they can make these predictions about us. I worry about due process. I worry about basic fairness. And I worry about a world -- not a world that looks like Orwell. The law professor Daniel Solove has said that maybe the better analogy for this, or metaphor, is not Orwell, but Kafka, where a big data state makes judgments that can be permanent and irreversible, and don't seem fair, and we don't have a chance to speak back.
And then, finally, I worry that the current state of play with these technologies cannot last, that we're being told, "Don't worry, all of this data just stays in a box and we only look at it a couple of hundred times a year." That's true now, but it will not be true in two years, and it will definitely not be true in five, and in ten years, every law enforcement agency is going to have access to this data. Today it's the NSA, but it's going to be the FBI, if it isn't already, and the DEA, and joint terrorism task forces that include local sheriffs and your local police. And very soon, the cop on the corner, through his handheld device, is going to have access to all of this data -- and he will, because we'll be presented with scenarios where, if he had had it, it would've stopped this crime or that attack, and the restrictions will be blamed. And, you know, Pandora's box is going to be opened.
I think if we don't, right now, when we hold the world's attention focused on this debate, put in stronger democratic controls, it becomes more and more likely that, gradually, through mission creep, we're going to be living in a different kind of surveillance state, and one that people feel and experience much more directly.
Winship: It seems to me, the other thing we're talking about is the chilling effect on speech, that people are self-censoring, even, because they're afraid to be surveilled like this and to have everything that they're saying covered and documented.
Wizner: Yes. I think "chill" is something that we worry about a lot, and maybe we should worry about it even more. I sometimes think about poor Professor Petraeus [former General and CIA Director David Petraeus] at the City University of New York. One woman sent a nasty anonymous email to a second woman, the second woman called the FBI, and in a short time, the FBI was reading the content of many, many, many thousands of emails. And perhaps they justified doing that -- I think it was outrageous -- because once they saw the CIA director's name, they thought maybe this is a national security threat. Well, I hope that that scared every member of Congress, that the FBI could use national security as a pretext to pore through the CIA director's private email communications. He was not even suspected of wrongdoing, but because our digital trails are basically permanent and we can't erase them, it doesn't even require the arrow to be pointed at us. It could be pointed at somebody next to us and then our lives can be turned upside down like this.
So I think people are beginning to experience this. We see these sad cases where a Facebook photo of somebody drinking at a fraternity party when she was 20 results in her not being able to get hired as a teacher when she's 26. Stories like that are going to proliferate, where the digital footprints that we leave by living ordinary lives constrain choices down the road and do chill. And I think they already do chill. I think they chill things in a way that is going to give us worse politicians. Right now, anybody who wants to be president who's 18 years old is probably living a life of walking on eggshells and trying to avoid the kinds of experiences that could be embarrassing. But do we want that person to be our president, someone who from the age of 16, 17 and 18 is doing everything possible to avoid engaging with our world so as not to be in a position where his or her life can be distorted down the road? I mean, to me, these are really, really important questions that go far beyond law and get at what it's going to be like to be a person in a world of universal collection. And the kinds of controls that we put on access to this, our ability to hit delete on things that might actually be useful and not to make national security the first, second and third argument in deciding how to make these hard decisions as a society, it's going to be very interesting to see how all of that plays out.
Winship: Are you hopeful?
Wizner: I think there was a period of time after the 9/11 attacks, and it was a much longer period of time than it should've been, where the talismanic invocation of terrorism or national security was enough to sort of silence opposition to the expansions of the government's surveillance authority. And I think that one of the remarkable developments in the last eight or nine months has been how the surveillance state has lost control of the argument and the plot, that their efforts to say that they have stopped any number of terrorist attacks with these programs, and that we would all be in great danger if they lost these authorities, just don't seem as credible anymore, and partly, I think that that's a function of the distance from 9/11 and the fact that very, very, very few Americans are injured or killed by terrorism -- you know, a tiny handful, compared to other threats. And so people are beginning to be able to contextualize that. Partly, there is a generation, and particularly a digital generation, that was not as politically traumatized by 9/11, and doesn't jump when the intelligence agencies tell them to. And I think partly it's because people are seeing how these technologies are proliferating at the local level. Even if someone is willing to trust the president and the Defense Department and the NSA, most people are a lot less trusting of the cop on the corner, who might use this technology to stalk his ex-wife. And we don't want him to be able to open up his handheld device and have location information for everyone in the city because a license plate-reading camera has snapped that and he can feed right into that database. Some people call this the little brother problem, as opposed to the big brother problem.
But I do think that there is pushback. There is bipartisan pushback. This is an issue that is at least as important to people on the libertarian right as it is to people on the liberal left. There is this fundamentally American notion of being left alone unless you do something wrong that is jeopardized by dragnet surveillance, which captures information on everyone, in case we might do something wrong.
So, yes, I'm about as upbeat as I've been. I'll tell you that two states have passed legislation requiring law enforcement to go to a judge and get a warrant before they can track your location using your cellphone. Those states were not New York and California; they were Montana and Maine. So there's something going on in the country where the question will be, at the national level, whether this will be a strong enough coalition to push reform through the House and the Senate against very, very, very strong opposition from a very well-funded security state. But the fact that we're adding dozens and dozens of sponsors to these kinds of bills that would've been unthinkable just a year ago makes me optimistic.
[1] Moore's law, named after Intel co-founder Gordon Moore, is the observation that the number of transistors on a chip -- and with it computing capacity -- roughly doubles every 18 months to two years.
Having said that, without the right kinds of democratic controls, these kinds of technologies frighten me. And they frighten me for a number of reasons. As I said, our principle privacy protection in the past was probably more a cost than law. Governments simply had to make decisions about what they were going to collect and what they were going to store, because it was expensive to do those things. Now that those costs are plunging to zero and Moore's law [1] marches on, it's technologically and financially feasible for governments to collect and store everything about us, all of our movements, all of our communications, all of our associations, all the information they need to construct portraits of our lives that might be more detailed even than things that we know about ourselves. And if you add to this other kinds of surveillance that surely are coming -- drones in the sky with ARGUS cameras that can record entire cities second by second and reconstruct them -- we will be living in a different kind of panopticon, and I do think that that is going to cast a chill on what it means to be a human being, if we know that every moment that we consider private now is actually residing somewhere, and someone somewhere can hit rewind and see it, even if we ourselves are not suspected of doing anything wrong.
If that's too abstract, I think that when governments amass this kind of information and unleash upon it computer algorithms that are trying to make predictions and judgments -- you know, who's dangerous, who might be a terrorist, who should we be more worried about? We've seen what happens on the commercial side every day with these kinds of processes, because of course supercomputers that belong to insurance companies and others are trying to calculate what kind of risk we are. You have a credit score. If anybody's ever made a mistake with a credit score of yours, I bet it was not that easy to have it resolved, and in the meantime, you might not have gotten a loan, you might've been redlined, you might've had much higher car insurance costs. I think when this tendency moves over to government -- and it's happening already -- we already have watch lists that say that some people can fly and some people can't, or that they can go in this lane or they can go in that lane. But that's only going to proliferate as there's more and more data and faster and faster computers and more confidence that they can make these predictions about us. I worry about due process. I worry about basic fairness. And I worry about a world -- not a world that looks like Orwell. The law professor Daniel Solove has said that maybe the better analogy for this, or metaphor, is not Orwell, but Kafka, where a big data state makes judgments that can be permanent and irreversible, and don't seem fair, and we don't have a chance to speak back.
And then, finally, I worry that the current state of play with these technologies cannot last, that we're being told, "Don't worry, all of this data just stays in a box and we only look at it a couple of hundred times a year." That's true now, but it will not be true in two years, and it will definitely not be true in five, and in ten years, every law enforcement agency is going to have access to this data. Today it's the NSA, but it's going to be the FBI, if it isn't already, and the DEA, and joint terrorism task forces that include local sheriffs and your local police. And very soon, the cop on the corner, through his handheld device, is going to have access to all of this data, and he will because we'll be presented with scenarios where if he had had it, it would've stopped this crime or stopped this attack, and the restrictions will be blamed. And, you know, Pandora 's Box is going to be opened.
I think if we don't, right now, when we hold the world's attention focused on this debate, put in stronger democratic controls, it becomes more and more likely that, gradually, through mission creep, we're going to be living in a different kind of surveillance state, and one that people feel and experience much more directly.
Winship: It seems to me, the other thing we're talking about is the chilling effect on speech, that people are self-censoring, even, because they're afraid to be surveilled like this and to have everything that they're saying covered and documented.
Wizner: Yes. I think "chill" is something that we worry about a lot, and maybe we should worry about it even more. I sometimes think about poor Professor Petraeus [former General and CIA Director David Petraeus] at the City University of New York. One woman sent a nasty anonymous email to a second woman, the second woman called the FBI, and in a short time, the FBI was reading the content of many, many, many thousands of emails. And perhaps they justified doing that -- I think it was outrageous -- because once they saw the CIA director's name, they thought maybe this is a national security threat. Well, I hope that that scared every member of Congress, that the FBI could use national security as a pretext to pour through the CIA director's private email communications. He was not even suspected of wrongdoing, but because our digital trails are basically permanent and we can't erase them, it doesn't even require the arrow to be pointed at us. It could be pointed at somebody next to us and then our lives can be turned upside down like this.
So I think people are beginning to experience this. We see these sad cases where a Facebook photo of somebody drinking at a fraternity party when she was 20 results in her not being able to get hired as a teacher when she's 26. Stories like that are going to proliferate, where the digital footprints that we leave by living ordinary lives constrain choices down the road and do chill. And I think they already do chill. I think they chill things in a way that are going to give us worse politicians. Right now, anybody who wants to be president who's 18 years old is probably living a life of walking on eggshells and trying to avoid the kinds of experiences that could be embarrassing. But do we want that person to be our president, someone who from the age of 16, 17 and 18 is doing everything possible to engage in our world so as not to be in a position where his or her life can be distorted down the road? I mean, to me, these are really, really important questions that go far beyond law and get at what it's going to be like to be a person in a world of universal collection. And the kinds of controls that we put access to this, our ability to hit delete on things that might actually be useful and not to make national security the first, second and third argument in deciding how to make these hard decisions as a society, it's going to be very interesting to see how all of that plays out.
Winship: Are you hopeful?
Wizner: I think there was a period of time after the 9/11 attacks, and it was a much longer period of time than it should've been, where the talismanic invocation of terrorism or national security was enough to sort of silence opposition to the expansions of the government's surveillance authority. And I think that one of the remarkable developments in the last eight or nine months has been how the surveillance state has lost control of the argument in the plot, that their efforts to say that they have stopped any number of terrorism attacks with these and we would all be in great danger if they lost these authorities just don't seem as credible anymore, and partly, I think that that's a function of the distance from 9/11 and the fact that very, very, very few Americans are injured or killed by terrorism -- you know, a tiny handful, compared to other threats. And so people are beginning to be able to contextualize that. Partly, there is a generation, and particularly a digital generation, that was not as politically traumatized by 9/11, and doesn't jump when the intelligence agencies tell them to. And I think partly it's because people are seeing how these technologies are proliferating at the local level. Even if someone is willing to trust the president and the Defense Department and the NSA, most people are a lot less trusting of the cop on the corner, who might use this technology to stalk his ex-wife. And we don't want him to be able to open up his handheld device and have location information for everyone in the city because a license plate-reading camera has snapped that and he can feed right into that database. Some people call this the little brother problem, as opposed to the big brother problem.
But I do think that there is pushback. There is bipartisan pushback. This is an issue that is at least as important to people on the libertarian right as it is to people on the liberal left. There is this fundamentally American notion of being left alone unless you do something wrong that is jeopardized by dragnet surveillance, which captures information on everyone, in case we might do something wrong.
So, yes, I'm about as upbeat as I've been. I'll tell you that two states have passed legislation requiring law enforcement to go a judge and get a warrant before they can track your location using your cellphone. Those states were not New York and California; they were Montana and Maine. So there's something going on in the country where the question will be, at the national level, whether this will be a strong enough coalition to push reform through the House and the Senate against very, very, very strong opposition from a very well funded security state. But the fact that we're adding dozens and dozens of sponsors to these kinds of bills that would've been unthinkable just a year ago makes me optimistic.
[1] Moore's law is the idea that data density doubles every 18 months. Named after Gordon Moore, co-founder of Intel.
"When I came public it wasn't to single-handedly change the government. I wanted to inform the public so they could make their own decision," Snowden told the audience. "I took an oath to support and defend the Constitution and I saw that the Constitution was being violated on a massive scale."
You can see his complete appearance at SXSW here.
Monday's session was moderated by Ben Wizner, director of the ACLU Speech, Privacy and Technology Project, and a legal advisor to Edward Snowden. The day before he left for SXSW, Wizner spoke with Moyers & Company's senior writer Michael Winship.
Special thanks to John Light for recording and editing, and to Helen Brunner, director of the Media Democracy Fund, for helping to make this interview possible.
Winship: Tell us about the [American Civil Liberties Union's] Speech, Privacy and Technology Project. What's its agenda?
Wizner: We're trying to plant our flag at the point where science and technology and civil liberties intersect. The rapid developments in science and surveillance technologies really do have an impact on a whole range of rights, not just privacy, and we want to have the institutional expertise to be able to identify what those issues are and shape sophisticated responses. So the project has lawyers, but not only lawyers. We have two full-time computer scientists on our staff who are technologists, who are experts in encryption, secure communications, surveillance technologies. And they've been able, not only to help us respond with more intelligence, I would say, to the intelligence surveillance scandal, but also to help identify issues that we might not have intuitively understood to be civil liberties issues.
We want to be looking at issues that are not ready to be litigated, but might be important issues in five years, or eight or ten. One of the things I like to do is go into a room of scientists, maybe neuroscientists, and say, "What do you think the ACLU should know and be worried about?" and just see what kind of conversation comes out of that. And it's a really amazing kind of exercise. You might have somebody say, "You need to be worried about fMRI brain scanning. The research on this is really scary. It makes the polygraph seem like child's play. This is going on in classified Pentagon labs and you need to be thinking about this right now..." It might be somebody talking about other biometrics that are not at the forefront, or what's going to happen or what the dangers are of universal DNA screening at birth and what protections are necessary.
So sometimes we might end up writing a white paper about an issue like this. Other times, we might prepare litigation, and other times we might use our network of 50 ACLU state affiliates to push for immediate legislation through the state legislature.
Winship: So, really, what you're saying is there's something in almost everything that everybody does that's affected by the work you're doing.
Wizner: The reality is that, in the last year in particular, and really the last two or three years, there's been a disproportionate focus on digital privacy and government surveillance. And that focus has certainly exploded in the eight or nine months since Edward Snowden began to dominate the global headlines with revelation after revelation about the scope of the surveillance data. And so it's sort of an "all hands on deck" for that issue right now. A debate like this may come along once in a generation, an opportunity to kind of hit the reset button and for us to really reinvigorate our oversight mechanisms, and also encourage changes in the technology community to correct some of the asymmetries in power between the state and the citizens.
Winship: How did you first learn and know about Edward Snowden? When did that happen?
Wizner: I had been a friend, and at times advisor, to Laura Poitras, and a friend, and at times, source, to Glenn Greenwald. These were people with whom I was in very regular communication, and still am. And so when Laura Poitras first received a communication from someone who claimed to be in a senior position in the intelligence community, who claimed to have access to documentary evidence of illegal, unconstitutional and very, very troubling government surveillance activities, I was one of the people to whom Laura reached out for advice about what the next steps were. At that time, she had been subjected to so much surveillance herself that it was at the front of her mind that this might be an effort to entrap her, and so she wanted to make sure that she didn't put herself in any legal jeopardy or unnecessary risk. And so over the course of the next several months, leading up to the day in late May or June when Laura Poitras and Glenn Greenwald got on the plane to go meet Snowden, I had been having conversations with them. But I didn't know Edward Snowden's name or his identity until you did, until The Guardian put on its website that first video interview, in which Glenn Greenwald asks Edward Snowden who he is, what he's done and why he's done it. We were gathered around computers in our office watching that, just as I imagine people were gathered around computers in offices in Silicon Valley, in Fort Meade [National Security Agency headquarters], in Washington and in a lot of other places where people watch these issues closely.
Winship: How would you characterize what he has revealed?
Wizner: Well, maybe the best way to answer that question is to remember what President Obama said in the first week after the revelations began to appear on front pages. He said Americans shouldn't be too worried about these disclosures because all three branches of government had blessed the programs and activities that were being disclosed. That was a true statement. That was also exactly the problem. And it's worth looking at what those same three branches of government have done since Edward Snowden's disclosures, since the public was brought into this conversation.
So let's look at the courts. Now, it's true that a court called the Foreign Intelligence Surveillance Court had approved, in secret, some of these programs. It's a court that hears only from the government, does not have the benefit of adversarial briefing, didn't get to hear what our objections would have been. It's also a court that was set up to give warrants, not to write opinions on whether surveillance programs in general were lawful. And when we tried to bring challenges to these programs in open federal courts, we got as far as the Supreme Court, but every court turned us away without even considering the legality of the programs. The government said, "These plaintiffs have no right to be in court. They can't show that they were subjected to these surveillance programs, and therefore they don't have standing. And they're not allowed to use the discovery process to learn that, because that would be a state secret." The result being that no one has the right to go into federal court to challenge the legality of these programs.
Edward Snowden was watching this. In our very first conversation, one of his first questions to me was, "Have these documents that have been published so far given you standing to go back in court?" To him, the idea that a court would not answer the question, "Is this program legal? Is it constitutional?" but instead would contort itself in order to not answer that question seemed like a failure of oversight, and he was right.
What's happened since his disclosures? We have now taken some of these documents, gone back into federal courts, where our standing is really much harder to question. Two federal judges have now considered, for example, the constitutionality of the government's collection of all telephone metadata. They've so far come to different conclusions on the legal question, but both said that the plaintiffs have standing to be in court. So one thing that he's done is he's reinvigorated judicial oversight.
Now, what about Congress? To me, the signal moment in Congress is [Senator] Ron Wyden asking [Director of National Intelligence] James Clapper, "Is there any kind of information that you collect on millions or hundreds of millions of Americans?" And Clapper says, "No, sir, not wittingly." We like to call this Clapper lying to Congress, and it's certainly that. But it would be much more accurate to say that Clapper was lying to the American people, because Senator Wyden knew that the answer was false. He didn't; he felt like he couldn't correct the answer. No one else on the committee corrected the answer. Clapper didn't correct the answer, no one on his staff, no one in the Administration. So what we had was a lie being told to Congress and no one in any branch coming forward to say that a lie had been told. And Snowden was watching that, too. And what's happened in Congress since the public disclosures? The issue has come out of the intelligence committees and into the full Congress. There is historic bipartisan legislation that would end bulk collection of Americans' data and that would create an adversarial process in the Foreign Intelligence Surveillance Court. This is the kind of legislation that would've been absolutely unthinkable before Snowden.
The direction has been one-way since the late 1970s. The Deep State has more authority, not less. The opposite is going to happen now. Now, whether it's something that seems more cosmetic or something that really is historic, well, that's really up to the people to decide. We will see. But there's been an earthquake in the congressional oversight of these programs, and that's because of Snowden.
And even the executive branch, which said, "Nothing to see here" -- you know, the president appointed his own review board, which included former very high-ranking intelligence community officials and other close friends of his. I think it's fair to say that the civil society organizations expected a whitewash. But that's not what we got. The conclusions were -- more politely stated -- that the NSA had essentially gotten out of control, that it allowed its technological capabilities to drive its practices, rather than having its practices constrained by laws and values, and even wisdom. And there were dozens and dozens of recommendations that went not only to giving Americans greater protections, but also to people abroad. And you heard the president in January, in his big speech about the NSA, say -- first time for any president, I think -- that we need to be concerned about the privacy rights of people outside the US who are not protected by the Constitution.
So all three branches of government are now doing the oversight that the Constitution wants them to do, that they were not doing before Edward Snowden. To me, that is his most significant contribution.
Winship: And you feel that the route he took, via journalists, was the one and only way he could go?
Wizner: I guess at times I wonder what people mean when they say he should have gone through a traditional route rather than going through journalists. Sometimes, the kinds of people who we call whistleblowers -- and I don't use that term as a term of art -- are people who uncover unquestionably illegal conduct that's been hidden away and they just need to bring it to the attention of an overseer, call up an inspector general, call up a member of Congress and say, "Look what I found," and then the system will take care of itself. But sometimes, someone comes upon a system of global dragnet surveillance that the oversight system deems perfectly legal. This is not something that Congress was unaware of. This is not something that courts were unaware of, at least the courts that were set up to review these practices. What was Edward Snowden supposed to do, call up the Senate Intelligence Committee and say, "Hi, I'm a 29-year-old contractor who works in Hawaii, and I'm calling to report to you about the programs that you have approved in secret?" This was a very, very different kind of situation. There was no one to report to who had not been part of the system of approval. And even those who were in the Congress who shared Snowden's view about the propriety and maybe legality of this were unwilling to talk. Senator Wyden was on the floor of the Senate with his hair on fire, saying, "If the American people knew what I knew, they would be angry and they would be shocked." Well, that turned out to be true, but we didn't learn it from Senator Wyden. We learned it from Edward Snowden.
And one more point about what he did. You know, the number of documents that Edward Snowden has made available to the public is zero. What he did is give information to journalists, with the instruction that they and their editors, in consultation, where necessary, with government officials, decide what was in the public interest to publish, and to withhold information that would be harmful to publish. He wanted to create a protocol that would correct for his own biases. He was someone who had spent the last almost ten years in the intelligence community. He didn't think that his own judgments -- and he has very strong judgments about what should or should not be public -- were adequate to this moment and wanted to make sure that the institutions that had the experience in doing this -- and these are our newspapers, who have long experience competing with the government over access and control of secret information -- that that be the way that the information got published. And many people have not noticed this. In an interview that Snowden gave to Time magazine when he was runner-up to the Pope for Person of the Year, he said he hasn't always agreed with the public interest determinations of the journalists, but that that's precisely why he needed to do it this way. He didn't want and didn't think that he should have the responsibility to decide which of these documents should be public. He wanted to appeal to the traditions, the institutions, the expertise of the media in helping to make those important judgments. That's what we want whistleblowers to do. We don't want them to unilaterally substitute their judgment for everybody else's. We want them to go through these institutions that funnel and channel those judgments and that have longer experience in making these kinds of decisions.
Winship: And yet, Keith Alexander, the outgoing head of the NSA, made a speech at Georgetown a few days ago in which he said that journalists don't have the proper ability to analyze these materials, and he said that Snowden's leaks had caused "grave, significant and irreversible damage to our nation."
Wizner: Those words are the classic weasel words of the Deep State. That sentence could have been lifted from the United States government's brief to the Supreme Court in the Pentagon Papers case, where they said if the Court allowed The New York Times and Washington Post and others to publish the papers they would be responsible for "grave and irreversible damage to the national security." It's exactly the same kind of language.
You know, I wonder if General Alexander really believes that our democracy would be stronger and better off if journalists deferred in every case to the expertise and interests of the executive branch in deciding what to publish. I mean, if you just look back at the last few years and consider what the public would not have known on that model, we would not have known that the case for war in Iraq was cooked and based on deliberate exaggerations and lies. We wouldn't have known that American soldiers tortured and sexually humiliated prisoners in Abu Ghraib. We wouldn't have known that the CIA set up a network of secret prisons around the world and used an extraordinary rendition program to kidnap people off the streets of cities, chain them to the floors of planes and fly them to those places. We wouldn't have known about an enhanced interrogation program, known to the rest of the world as torture, where prisoners were waterboarded, suspended from ropes, beaten and treated in ways that the US has always considered criminal when it was done to our own soldiers. We wouldn't have known that the Bush administration decided that the rules for foreign intelligence surveillance collection were too cumbersome and that they should just throw aside the statute and collect whatever they wanted to, under the president's own authority, leading to the near resignation of the attorney general and the head of the FBI.
All of this stuff was classified. Not just classified; it was classified at the highest level. These were the secrets that the government said were most critical to keep. But what kind of democracy would we be if the public had never learned of this information?
I'm also not saying that journalists alone should decide what the public sees. I mean the government's voice in this debate is an important one. It's a back and forth. It's always been a back and forth. I don't believe a single story based on Snowden documents has yet been published without consultation with the government, without giving the government an opportunity to strenuously object and to point out things that might cause harm in their view. And that's why I don't think there's been any credible evidence at all of real harm to national security from these leaks.
Winship: From your position here at the Speech, Privacy and Technology Project, were you, like Senator Wyden, shocked by the extent of the surveillance?
Wizner: The short answer is yes. And I'm not easy to shock. And I question whether I was being naive, but it never occurred to me that the NSA or Department of Justice could think that a provision like Section 215 of the Patriot Act, which is similar to a provision giving grand juries subpoena authority, could be used to get the records of every single American phone call every day. It simply didn't make any sense. Or that the government could think that it was a valid constitutional argument to say that the Fourth Amendment basically is silent when it comes to intercepting and storing all of our communications -- it has nothing to say, it's not even implicated -- that only when someone decides to query that database in search of a particular person does the Fourth Amendment have even a limited role, that they're allowed to collect everything without having to consider the Fourth Amendment. It seems to me that that turns the Fourth Amendment on its head, that we do search and seizure before suspicion, rather than having suspicion before search and seizure.
This stuff, the extent of it, was shocking, and I was also shocked by some of the NSA's efforts to undermine the security of the Internet in our global communications in order to facilitate their mass surveillance. The idea that they would deliberately weaken commonly used encryption standards that are used to protect our financial records, that are used to protect our medical records, our networks -- and they would do this even as other parts of the government are telling us that the cyber threat is greater than the terrorism threat right now. Well, in the name of fighting terrorism, the offensive surveillance that they're doing is weakening our cyber defenses against those other kinds of threats. This is going on in the very same government. And the amount of that, the way that offense had just been sort of given everything at the expense of defense, did come as something of a shock, and I think not just to people in the ACLU, but to people in the technology community, many of whom did not consider themselves political, saw politics as messy, liked the elegance of ones and zeroes and code, but now find themselves having to treat the NSA as a different kind of threat model when they're thinking about security of networks and systems.
Winship: And yet certain of these companies were also selling information to the NSA, or sharing information.
Wizner: I think the relationship between the NSA and these companies is complicated. It's not one relationship, it's many relationships. And I do think that, in particular, the Silicon Valley companies -- that have a sort of self-image as being libertarian; they have the ethos of being against evil and on the side of freedom -- certainly want to minimize the ways in which they facilitated the kinds of mass surveillance that's gone on.
On the other hand, I think there is a difference between a program like PRISM, where the government goes to the technology companies with orders from a court and then they open up their systems in response to those orders, and other programs that we read about where the NSA looks for the places in these companies where data is moving around without encryption -- so transfers of data between data centers overseas -- and then hacks into those connections. I don't think they ask permission to do that, and I think that it's not all hypocrisy when the companies say, "I can't believe they did that." Their attitude is, "You could get what you wanted with a court order. The law is so favorable to you now, why are you hacking us? Why are you weakening us?"
And now that the debate is out in the open, people around the world -- known to these companies as markets -- are very nervous about American companies, because they have seen how the NSA treats them like a silver platter for dragnet surveillance. You know, the companies face a kind of existential threat. They're either going to have to do more to protect their customers' data, to give them end-to-end encryption, or they're going to lose those customers. And so maybe it was easier for the companies when this debate was in the shadows or when it wasn't really known. Now they really are going to have to make these choices. They can't present the same smiley face to the public and to government.
Winship: And it's really a global issue, not just a domestic one; the companies have clients all over the world and there's pushback from those clients.
Wizner: And you see the way that the NSA operates. They might tell Germany, "We're not going to take data of Germans in Germany," so they'll wait until it leaves Germany and transits to Denmark and take it there. And they might say the same thing to Denmark, "We're not going to take information from the people of Denmark in Denmark," but they'll get it when it transits to Germany. And because of the NSA's access to these global communications systems, the issue is global, as you say. Domestic reform litigation alone is not going to be equal to the size of the challenge, so there is a sort of broader challenge to free societies about how to deal with the question of mass surveillance. We're talking -- when I say mass surveillance, I'm saying basically collecting everything, tapping into the backbone of the Internet and siphoning off everything and storing it, which we can do now because storage is cheap [see the back-of-envelope sketch after this answer]. We used to have to decide who we were going to follow. Now we can follow everybody. And there's no question that having all of that information is going to be useful to governments in some way. You create a database that can be a surveillance time machine. It can allow governments to hit rewind and recreate our lives in ways that would surely help solve some crimes.
It also is, I think, a real danger to give governments that much power over their citizens. And so I think there has to be a sort of law and policy debate that takes place at a global level, at least among free societies. But also, remember that when Google switched its Gmail traffic from unencrypted by default to encrypted by default, something it did in 2010, that affected hundreds of millions of people instantly, at the flip of a switch, and not just people in the United States. Google just made it almost impossible for the government of Iran to get Gmail information from Iranians who are using the service. So the fixes that the technology companies put in place will be global fixes and not just fixes for Americans.
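[A minimal back-of-envelope sketch, to make the "storage is cheap" point in Wizner's answer concrete. Every figure in it -- population, call records per person, bytes per record -- is an assumption chosen purely for illustration, not a number drawn from this interview or from any disclosed program; only the order of magnitude matters.]

    # Illustrative back-of-envelope only: every figure below is an assumption.
    PEOPLE = 320_000_000               # assumed rough US population
    RECORDS_PER_PERSON_PER_DAY = 10    # assumed average call-detail records per person, per day
    BYTES_PER_RECORD = 200             # assumed record size: numbers, timestamps, duration, tower

    daily_bytes = PEOPLE * RECORDS_PER_PERSON_PER_DAY * BYTES_PER_RECORD
    yearly_terabytes = daily_bytes * 365 / 1e12

    print(f"~{daily_bytes / 1e9:,.0f} GB of call metadata per day")
    print(f"~{yearly_terabytes:,.0f} TB per year -- a few dozen commodity hard drives")

[Even under these generous assumptions, a year of nationwide call metadata comes to a few hundred terabytes, a volume that fits in a single equipment rack -- which is why cost no longer acts as a practical brake on collection.]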
Winship: But in terms of Americans, what do you say to people who say, "Look, they're just trying to protect me. What's the harm? I haven't done anything wrong. I've got nothing to hide"?
Wizner: I think that many Americans do feel that way. I do think that, when you talk about the dangers of mass surveillance, they can seem abstract. They can seem futuristic, they can seem science fiction-y. They're not as easy to digest as a villain like J. Edgar Hoover, who was blackmailing Martin Luther King and infiltrating movements, or a Stasi [former East German secret police] keeping detailed files on neighbors and all of that. And I start by saying we are not a society like that yet. We are fundamentally, for most people, at least, a free society. I mean there are communities like the Muslim community that have experienced a much more invasive form of surveillance, with informants in their mosques and people in their community, provocateurs. But for the most part, that's not what it feels like now to live in America.
Having said that, without the right kinds of democratic controls, these kinds of technologies frighten me. And they frighten me for a number of reasons. As I said, our principal privacy protection in the past was probably more a matter of cost than of law. Governments simply had to make decisions about what they were going to collect and what they were going to store, because it was expensive to do those things. Now that those costs are plunging to zero and Moore's law [1] marches on, it's technologically and financially feasible for governments to collect and store everything about us, all of our movements, all of our communications, all of our associations, all the information they need to construct portraits of our lives that might be more detailed even than things that we know about ourselves. And if you add to this other kinds of surveillance that surely are coming -- drones in the sky with ARGUS cameras that can record entire cities second by second and reconstruct them -- we will be living in a different kind of panopticon, and I do think that that is going to cast a chill on what it means to be a human being, if we know that every moment that we consider private now is actually residing somewhere, and someone somewhere can hit rewind and see it, even if we ourselves are not suspected of doing anything wrong.
If that's too abstract, I think that when governments amass this kind of information and unleash upon it computer algorithms that are trying to make predictions and judgments -- you know, who's dangerous, who might be a terrorist, who should we be more worried about? We've seen what happens on the commercial side every day with these kinds of processes, because of course supercomputers that belong to insurance companies and others are trying to calculate what kind of risk we are. You have a credit score. If anybody's ever made a mistake with a credit score of yours, I bet it was not that easy to have it resolved, and in the meantime, you might not have gotten a loan, you might've been redlined, you might've had much higher car insurance costs. I think when this tendency moves over to government -- and it's happening already -- we already have watch lists that say that some people can fly and some people can't, or that they can go in this lane or they can go in that lane. But that's only going to proliferate as there's more and more data and faster and faster computers and more confidence that they can make these predictions about us. I worry about due process. I worry about basic fairness. And I worry about a world -- not a world that looks like Orwell. The law professor Daniel Solove has said that maybe the better analogy for this, or metaphor, is not Orwell, but Kafka, where a big data state makes judgments that can be permanent and irreversible, and don't seem fair, and we don't have a chance to speak back.
And then, finally, I worry that the current state of play with these technologies cannot last, that we're being told, "Don't worry, all of this data just stays in a box and we only look at it a couple of hundred times a year." That's true now, but it will not be true in two years, and it will definitely not be true in five, and in ten years, every law enforcement agency is going to have access to this data. Today it's the NSA, but it's going to be the FBI, if it isn't already, and the DEA, and joint terrorism task forces that include local sheriffs and your local police. And very soon, the cop on the corner, through his handheld device, is going to have access to all of this data, and he will, because we'll be presented with scenarios where if he had had it, it would've stopped this crime or stopped this attack, and the restrictions will be blamed. And, you know, Pandora's box is going to be opened.
I think if we don't, right now, when we hold the world's attention focused on this debate, put in stronger democratic controls, it becomes more and more likely that, gradually, through mission creep, we're going to be living in a different kind of surveillance state, and one that people feel and experience much more directly.
Winship: It seems to me, the other thing we're talking about is the chilling effect on speech, that people are self-censoring, even, because they're afraid to be surveilled like this and to have everything that they're saying covered and documented.
Wizner: Yes. I think "chill" is something that we worry about a lot, and maybe we should worry about it even more. I sometimes think about poor Professor Petraeus [former General and CIA Director David Petraeus] at the City University of New York. One woman sent a nasty anonymous email to a second woman, the second woman called the FBI, and in a short time, the FBI was reading the content of many, many, many thousands of emails. And perhaps they justified doing that -- I think it was outrageous -- because once they saw the CIA director's name, they thought maybe this is a national security threat. Well, I hope that that scared every member of Congress, that the FBI could use national security as a pretext to pore through the CIA director's private email communications. He was not even suspected of wrongdoing, but because our digital trails are basically permanent and we can't erase them, it doesn't even require the arrow to be pointed at us. It could be pointed at somebody next to us and then our lives can be turned upside down like this.
So I think people are beginning to experience this. We see these sad cases where a Facebook photo of somebody drinking at a fraternity party when she was 20 results in her not being able to get hired as a teacher when she's 26. Stories like that are going to proliferate, where the digital footprints that we leave by living ordinary lives constrain choices down the road and do chill. And I think they already do chill. I think they chill things in ways that are going to give us worse politicians. Right now, anybody who wants to be president who's 18 years old is probably living a life of walking on eggshells and trying to avoid the kinds of experiences that could be embarrassing. But do we want that person to be our president, someone who from the age of 16, 17 and 18 is doing everything possible to avoid engaging in our world so as not to be in a position where his or her life can be distorted down the road? I mean, to me, these are really, really important questions that go far beyond law and get at what it's going to be like to be a person in a world of universal collection. The kinds of controls that we put on access to this, our ability to hit delete on things that might actually be useful, our willingness not to make national security the first, second and third argument in deciding these hard questions as a society -- it's going to be very interesting to see how all of that plays out.
Winship: Are you hopeful?
Wizner: I think there was a period of time after the 9/11 attacks, and it was a much longer period of time than it should've been, where the talismanic invocation of terrorism or national security was enough to sort of silence opposition to the expansions of the government's surveillance authority. And I think that one of the remarkable developments in the last eight or nine months has been how the surveillance state has lost control of the argument and the plot; their claims that they have stopped any number of terrorist attacks with these programs, and that we would all be in great danger if they lost these authorities, just don't seem as credible anymore. Partly, I think that that's a function of the distance from 9/11 and the fact that very, very, very few Americans are injured or killed by terrorism -- you know, a tiny handful, compared to other threats. And so people are beginning to be able to contextualize that. Partly, there is a generation, and particularly a digital generation, that was not as politically traumatized by 9/11, and doesn't jump when the intelligence agencies tell them to. And I think partly it's because people are seeing how these technologies are proliferating at the local level. Even if someone is willing to trust the president and the Defense Department and the NSA, most people are a lot less trusting of the cop on the corner, who might use this technology to stalk his ex-wife. And we don't want him to be able to open up his handheld device and have location information for everyone in the city because a license plate-reading camera has snapped that and he can feed right into that database. Some people call this the little brother problem, as opposed to the big brother problem.
But I do think that there is pushback. There is bipartisan pushback. This is an issue that is at least as important to people on the libertarian right as it is to people on the liberal left. There is this fundamentally American notion of being left alone unless you do something wrong that is jeopardized by dragnet surveillance, which captures information on everyone, in case we might do something wrong.
So, yes, I'm about as upbeat as I've been. I'll tell you that two states have passed legislation requiring law enforcement to go to a judge and get a warrant before they can track your location using your cellphone. Those states were not New York and California; they were Montana and Maine. So there's something going on in the country where the question will be, at the national level, whether this will be a strong enough coalition to push reform through the House and the Senate against very, very, very strong opposition from a very well-funded security state. But the fact that we're adding dozens and dozens of sponsors to these kinds of bills that would've been unthinkable just a year ago makes me optimistic.
[1] Moore's law is the observation that computing capacity (often described as data density) roughly doubles every 18 months to two years. It is named after Gordon Moore, co-founder of Intel.
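[For readers who want footnote [1] in numbers, here is a minimal sketch of the compounding it implies, assuming a perfectly steady 18-month doubling period, which real hardware only roughly follows.]

    # Doubling arithmetic behind footnote [1]: growth = 2 ** (years / doubling period).
    def moores_law_growth(years: float, doubling_period_years: float = 1.5) -> float:
        """Multiplicative growth after `years`, doubling every 1.5 years by default."""
        return 2 ** (years / doubling_period_years)

    for years in (3, 6, 10, 15):
        print(f"after {years:>2} years: ~{moores_law_growth(years):,.0f}x the capacity")

[Under that assumption, a decade of steady doubling is roughly a hundredfold increase -- the scale of change behind Wizner's point that collection which was once prohibitively expensive is now routine.]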