[01:10.060 --> 01:18.540] Good morning, everyone, and thank you so much for coming this morning, and thank you as well to the Association for inviting me here to speak. [01:18.540 --> 01:34.380] I, over the last 20 months, have spoken at countless events in literally dozens of countries around the world, and the opportunity to participate in this event has generated some of the greatest excitement for me. [01:34.380 --> 01:53.860] And I say that because I really believe that the epicenter of the debate that has been enabled by the disclosures of Edward Snowden lies with the people in this room, which I realize is a fairly heavy and even dramatic responsibility to place on your shoulders, [01:53.860 --> 01:58.620] especially so early in the morning, but I actually think it's really true. [01:59.060 --> 02:14.800] Whenever I talk about the NSA debate, I'm invariably asked two questions far more commonly than any others, one of which I'll address in a few minutes, but the most common of which, by far, is the question of, well, what has really changed? [02:15.180 --> 02:24.860] And like many questions, that question is actually an argument masquerading as a question, and the argument is essentially one that posits the following. [02:25.620 --> 02:36.060] The disclosures we've been able to make, the reporting we've been able to do, have certainly generated lots of media attention, people who believe this viewpoint will say. [02:36.060 --> 02:45.340] They'll say there's been a great debate around the world, but nothing has really fundamentally changed about the spying powers on which you've reported. [02:45.340 --> 02:47.440] There hasn't actually been much change. [02:47.440 --> 02:53.300] The spying continues more or less in the same form as it was two years ago.
[02:53.320 --> 03:12.660] And what they mean by that is that if you drive 45 minutes away from here to Fort Meade, you can see the headquarters of the NSA; the building is still standing, the doors are still open, people are going in and out of it, and that building is still the venue for a great deal of electronic surveillance. [03:13.140 --> 03:28.160] And what they also mean is that if you drive much closer, five minutes away, to the Capitol, you will search in vain, in complete futility, for any new laws that have been enacted in the wake of this reporting that limit what it is that the U.S. [03:28.160 --> 03:33.920] government and its allies can do when it comes to spying either on American citizens or foreign nationals. [03:33.920 --> 03:40.800] And all of that is absolutely true as far as it goes, but I don't think it goes very far at all. [03:40.800 --> 03:50.480] In fact, I think that it actually fundamentally misapprehends the nature of this debate specifically, and of political change more generally. [03:51.160 --> 04:07.960] It is certainly true that the NSA continues to stand and that the NSA continues to spy, but I think it's always important to realize that the intelligence community of the United States is one of the most powerful factions within what is still the world's most powerful government. [04:07.960 --> 04:21.040] And when you talk about factions that exercise a great deal of power, they aren't simply defeated, or even undermined, quickly or easily; that takes a significant and long struggle. [04:21.520 --> 04:30.200] But I think more important is where statutory change within Washington fits into the question of what has actually changed.
[04:30.200 --> 04:45.700] When I was in Hong Kong and meeting with Edward Snowden along with my colleague Laura Poitras, and we were thinking about all of the possible outcomes that might arise from the reporting we were about to do and the way in which mass surveillance might be reined in, [04:45.700 --> 04:49.580] the question of what laws might be passed by the U.S. [04:49.580 --> 04:51.860] government to limit the power of the U.S. [04:51.860 --> 04:58.680] government is one of the issues on which we spent the smallest amount of time and mental energy. [04:59.120 --> 05:05.320] And the reason for that is that I just never thought that the way in which the power of the U.S. [05:05.320 --> 05:10.920] government to spy on people would be limited would come from limitations that the U.S. [05:10.920 --> 05:13.300] government would place on itself. [05:13.600 --> 05:18.020] Because that just isn't the nature of human beings and it's not the nature of political power. [05:18.020 --> 05:22.540] People don't walk around thinking about how they can limit their own power. [05:22.540 --> 05:24.100] That was never the expectation. [05:24.100 --> 05:37.240] And I think the question of what laws have been passed in Washington is probably the least interesting and least relevant metric in trying to understand whether there have been substantial changes as a result of this debate. [05:37.840 --> 05:53.220] There have been extremely substantial changes that have arisen over the last 20 months in terms of the ability of the United States and its partners and other governments around the world to spy on our activities and communications on the Internet. [05:53.220 --> 05:53.940] And I think the U.S. [05:53.940 --> 06:00.240] government itself would probably be the very first party to acknowledge that to be true. [06:00.240 --> 06:02.100] It's true in all different ways.
[06:02.300 --> 06:24.500] There are numerous countries around the world which have learned for the first time about the extent to which the privacy of their citizens is being compromised, and which are taking all sorts of steps, from creating international regimes, to physically reconstructing how the Internet functions so it doesn't rely as exclusively on the United States, to working together to prevent U.S. [06:24.500 --> 06:32.920] hegemony over the Internet, that pose very substantial barriers to the ability of spy agencies to engage in mass surveillance. [06:33.280 --> 06:57.400] Much more important, I think, is the fact that individuals around the world now realize for the first time the threat posed to their privacy by mass surveillance, and they are not just able but are in fact now exploiting all sorts of tools, such as encryption and browsers that enable anonymity, to protect their communications from being spied on. [06:57.400 --> 07:13.880] And if you look at all sorts of remarkable studies, they all conclude that there are extremely significant increases in the number of human beings on continents all over the world who are using encryption for the first time. [07:14.260 --> 07:19.780] Encryption is a very serious barrier to the ability of the U.S. [07:19.780 --> 07:24.600] government and its partners to spy on your communications. [07:24.600 --> 07:27.840] It's not invulnerable, but it's a very serious problem for them. [07:28.400 --> 07:49.820] And the way that I know that it's a serious problem is that, as you might have heard, I happen to have many tens of thousands of their most secret documents, and they spend a great deal of time very worried about the fact that people will use encryption if they understand the extent to which surveillance has been implemented.
[07:49.820 --> 08:06.700] So individuals turning more and more to encryption is a very real and very important change that can provide a great barrier to the ability of the United States and other governments and stateless organizations to invade privacy on the Internet. [08:06.820 --> 08:27.280] But I think the most important change is the change in the incentive scheme that now exists for companies that collect and store and manage people's private data, and the behavioral changes that we've seen on the part of those companies as a result of the alterations in those incentive schemes, [08:27.280 --> 08:32.760] by which I mean things like the very vitriolic conflicts between the U.S. [08:32.760 --> 08:50.840] and British governments on the one hand and companies like Facebook and Apple on the other over those companies' decisions to embed end-to-end encryption in services like WhatsApp or to embed very serious encryption products in some of their new and most used products. [08:51.300 --> 09:06.460] Now, whenever I talk about the changes in behavior on the part of tech companies as a way in which our privacy can be better protected in the future, it meets with a great deal of cynicism from people all over the world. [09:06.460 --> 09:08.580] And I think with very good reason. [09:09.040 --> 09:25.040] Some of that cynicism is just about a general ideological view that the nature of private companies or corporations is that they're geared not to doing social good like protecting people's privacy but instead to maximizing their own wealth and their own profit. [09:25.040 --> 09:34.860] But some of it is actually based in more specific evidence about the role that tech companies have played when it comes to compromising people's privacy.
[09:34.960 --> 09:50.560] There are a whole series of, I think, fairly disturbing statements that have come from leading Silicon Valley executives about views of privacy, going back 15 years to when a Sun Microsystems executive said privacy is dead, get over it. [09:50.560 --> 10:06.160] Or the rather sneering response that was given by the CEO of Google, Eric Schmidt, when asked in an interview about the ways Google invades privacy, and he said, if you're doing something you don't want others to know about, maybe that's a good reason not to be doing it. [10:06.160 --> 10:12.660] Or the proclamation by the CEO of Facebook, Mark Zuckerberg, that privacy is no longer a social norm. [10:12.660 --> 10:22.840] This series of statements along those lines has created a perception that Silicon Valley is not particularly interested in protecting privacy and is in fact willing to undermine it. [10:23.040 --> 10:47.920] And then more specifically, a lot of the Snowden documents contain very clear evidence, from before the world understood what the surveillance system was, of not just Silicon Valley companies but financial companies and especially telecoms actively collaborating with the NSA to undermine the privacy of their clients and customers and users. [10:47.920 --> 10:57.600] Often only to the extent that the law required them to do so, but in many cases far beyond what the law required. [10:57.740 --> 11:08.320] And the reason for that was that there were all sorts of benefits two years ago and before for companies to actively cooperate with the government to undermine their users' privacy. [11:08.320 --> 11:18.480] There were all kinds of benefits from creating positive relationships with the government at almost no cost because so much of it was being done in the dark. [11:18.480 --> 11:24.000] And I think what the Snowden revelations have changed more than anything else is that calculus.
[11:24.400 --> 11:31.440] So now it is no longer true that there is no cost to collaborating with the U.S. [11:31.440 --> 11:37.020] government and its partners to undermine the privacy of people's data or their activities. [11:37.020 --> 11:41.680] There is a huge cost to doing so because people around the world are now aware of it. [11:41.840 --> 11:58.960] And I think tech companies, for the first time, are rationally and genuinely petrified that they will lose an entire generation of users to South Korean or Brazilian or German companies who say, don't use American tech companies because they'll give your data to the NSA. [11:58.960 --> 12:01.860] You should use ours instead. [12:02.240 --> 12:15.920] There are a lot of people inside Silicon Valley companies and tech companies and banking companies who are genuinely devoted to privacy, not for business reasons, but as a societal belief, as a societal value. [12:15.920 --> 12:20.320] I've met some of them at this conference who work at very high levels in those companies. [12:20.320 --> 12:35.260] But the nature of companies is that if privacy is seen as an antithetical force to business prosperity, it will be a huge uphill battle to get these companies to really meaningfully protect privacy. [12:35.260 --> 12:41.740] And what has changed is that privacy is no longer viewed as an adversary to business prosperity. [12:41.740 --> 12:43.460] It is now a great complement to it. [12:43.460 --> 12:51.740] It's actually urgently necessary for private companies to demonstrate to the public that they are serious about protecting their privacy. [12:51.740 --> 12:57.380] And that, to me, has been the greatest change in terms of the Snowden revelations. [12:58.080 --> 12:59.540] And that's not just true for me. [12:59.540 --> 13:00.040] The U.S.
[13:00.040 --> 13:10.880] government and the British government and their allies clearly view that change in incentive scheme as a serious threat to their ability to engage in ongoing mass surveillance. [13:10.880 --> 13:19.300] I think one of the most extraordinary changes of the last six to nine months has been the rhetoric that has been unleashed on the part of British and U.S. [13:19.300 --> 13:26.200] officials towards companies like Facebook and Apple and others who are starting to embed serious encryption into their products. [13:26.200 --> 13:39.280] They're now all but accusing them of being the things they used to accuse only journalists and human rights activists of being: friends of the terrorists, or aiders and abettors of the terrorists. [13:39.280 --> 13:43.500] That is the real front in this battle that's taking place right now. [13:43.500 --> 13:44.220] The U.S. [13:44.220 --> 13:47.880] government and the British government want to change that incentive scheme again. [13:47.880 --> 13:53.800] They don't want tech companies to view it as being in their interest to protect privacy. [13:53.800 --> 13:59.640] They want tech companies to view it as dangerous to protect people's privacy and to resist surveillance. [13:59.640 --> 14:26.580] And their tactic for doing that is to threaten to demonize these companies as being at fault if there is a terrorist attack, or as allies or aiders and abettors of people who want to engage in violence, and thereby to coerce and pressure them to cooperate once again, to reenter that system of collaboration, upon pain of having those labels and accusations applied to them.
[14:26.820 --> 14:44.960] And this question that tech companies now face is one that journalists face, that activists have long faced, that all sorts of people face who want to resist government orders or dictates or the exercise of power, which is the extent to which you're willing to stand in defense of values you believe in, [14:44.960 --> 14:49.780] even upon pain of being subjected to those kinds of accusations. [14:50.480 --> 15:04.600] And I think the reason that the NSA reporting that we have done has triggered such an intense and enduring debate, not just in the United States but around the world, is the nature of the Internet. [15:05.040 --> 15:16.340] The Internet, I think everybody agrees, is one of the most extraordinary inventions ever, but the ends to which it will be applied are very much in question. [15:16.500 --> 15:28.980] By which I mean that the promise of the Internet has always been that it could be this extraordinary tool of liberalization and democratization, and it still can be that, and we see examples of that all the time. [15:29.400 --> 15:31.260] But it could also be the opposite. [15:31.260 --> 15:39.680] It could also be the most extraordinarily potent tool of oppression and coercion and control of all time. [15:39.780 --> 15:45.700] And I really think we're at a crossroads in determining how the Internet will be used. [15:45.700 --> 15:49.620] And the question of how that will be resolved, I think, is very much open. [15:49.620 --> 15:52.080] It's not fatalistically predetermined. [15:52.080 --> 15:53.940] It's a byproduct of our choices. [15:53.940 --> 16:05.520] The choices that we make every day about what we do in our lives, and the values that we defend and the ones we're not willing to defend, will determine what the resolution of that question is.
[16:05.520 --> 16:26.920] And I really do believe that the choices that you make as privacy professionals or privacy activists or people working within the companies that now control so much of our data will play as big a role as, if not a bigger role than, any other single factor in determining what the Internet actually will be. [16:27.720 --> 16:46.740] Now, I just want to spend a little bit of time talking about my experiences working with Edward Snowden, because I think that there are all kinds of really critical lessons to draw from examining and understanding the actions that Edward Snowden took and the choices that he made. [16:46.740 --> 16:57.660] I know that the lessons that I learned working with him will profoundly shape and influence everything that I think about, and the way that I look at things, for the rest of my life. [16:57.660 --> 17:03.520] And so I just want to share with you a little bit of that personal experience and some of the thoughts I have about it. [17:03.520 --> 17:09.600] I think it relates quite directly to what we just discussed, as well as being valuable on its own. [17:10.560 --> 17:26.340] When Laura Poitras and I boarded a plane in June of 2013 to go to Hong Kong from New York, we knew we were going to meet a source who claimed to have extremely incriminating evidence about the mass surveillance programs engaged in by the United States, [17:26.340 --> 17:28.780] but we knew almost nothing else about him. [17:28.780 --> 17:36.380] We didn't know where he worked or how old he was or what he looked like or even what his name was. [17:36.780 --> 17:45.060] And everybody has had that experience where you have an exclusively online interaction with somebody and then you meet them for the first time in person. [17:45.060 --> 17:49.980] What they actually look like is completely different from what you imagined them to be. [17:49.980 --> 17:52.620] That was definitely our experience as well.
[17:52.740 --> 18:01.520] We had been assuming that the person with whom we were communicating was in his 50s or 60s, near the end of his career, for all kinds of reasons. [18:01.520 --> 18:05.420] And when we got to Hong Kong, we saw someone who looked like a kid. [18:05.420 --> 18:09.280] He was 29 years old at the time, but he looked much, much younger. [18:09.280 --> 18:11.020] He looked 22 or 23. [18:12.200 --> 18:30.580] And so when I sat down with him for the first time and spent the first two or three days with him, my number one priority by far was to try and understand what his actual motive was in making this choice that he had made, which, if he went through with it, was going to send him to prison for the rest of his life. [18:30.580 --> 18:42.760] Because it was crucial to us not to be part of somebody's unraveling unless it was really grounded in genuine autonomy and agency, a truly rational thought process. [18:42.760 --> 18:53.180] And so it was vital for me to apprehend what his actual motive was and not to accept a kind of preordained script that was superficial, but to actually get to his motives. [18:53.180 --> 18:55.500] And I spent many hours doing it. [18:55.920 --> 19:03.100] And much of what I learned over the course of the two or three days that I explored his motive was the following. [19:03.160 --> 19:06.740] These are the things that I actually came to believe genuinely motivated him. [19:07.000 --> 19:24.280] One is that he talked about the value of the Internet in his life and the fact that, as somebody who grew up in a lower middle class environment in northern Virginia with a father who worked for the federal government, for the Coast Guard, he felt like he didn't have an opportunity to explore the world.
[19:24.280 --> 19:45.660] The Internet, and the ability to function on the Internet anonymously, let him explore the world, explore who he was, interact with people who he would never otherwise interact with, and engage in an exploration of alternative ideas and identities that was only possible because he was able to do that anonymously, without the record of what he was doing being permanently attached to him. [19:45.660 --> 20:05.360] And he said the thought that that might be lost, that something so crucial to him and the people he loved might be permanently lost, was something that he regarded as a truly profound injustice, one that he was not willing to allow to happen without taking whatever steps he could to do something about it. [20:05.680 --> 20:20.700] More generally, when we talked about what the likely consequences would be for him, specifically the near certainty that he would end up spending decades in an American prison, and why he was willing to engage in that behavior, what he said to me, [20:20.700 --> 20:26.000] and this is what I ultimately came to believe was really driving him, was: actually, there's a little bit of selfishness in this. [20:26.000 --> 20:46.840] And the selfishness is that if I have to spend the rest of my life knowing that I confronted this grave injustice, what I regard as a grave injustice, and believed I had the ability to do something about it but chose not to out of fear or some other concern for my immediate material well-being, [20:46.840 --> 21:05.660] the pain of having that sit on my conscience for the rest of my life would be much, much worse than anything that the United States government could do to me, including putting me in prison for the rest of my life as punishment for engaging in what I regard as an act of conscience. [21:06.800 --> 21:10.820] He also talked a great deal about a sense of betrayal that he felt.
[21:10.820 --> 21:23.440] This was somebody who had been inculcated from the time of childhood to believe certain things about how his government behaves in the world, about the kind of political culture that we have, about what democracy actually means. [21:23.440 --> 21:24.760] He joined the U.S. [21:24.760 --> 21:31.260] Army voluntarily to go fight the war in Iraq, thinking that this was a war of liberation in defense of freedom. [21:31.260 --> 21:54.760] And it was not suddenly but gradually, over many years, that he began to think differently about what the role of the United States government in the world actually was, about what kind of democracy we really have if so much of what was being done by our government, of the greatest significance, was being concealed from the American people rather than disclosed to them. [21:54.760 --> 22:07.160] And he felt as though his duty as a citizen was no longer to comply with unjust law but rather to break it, in pursuit of what he regarded as his duty as a citizen. [22:07.160 --> 22:09.560] And these were the motives that he talked about. [22:09.620 --> 22:19.200] And the lesson for me that I drew more than any other, one that I think everybody can actually benefit from at least thinking about, even if you don't ultimately accept it, is this. [22:19.940 --> 22:22.400] I've been writing about politics for ten years now. [22:22.400 --> 22:23.500] I was a lawyer before that. [22:23.500 --> 22:25.400] I started writing about politics in 2005. [22:25.400 --> 22:37.400] And one of the things I immediately saw and have seen ever since is that paying attention to politics very easily breeds this sense of defeatism and powerlessness. [22:37.700 --> 22:52.520] It's very easy, if you think about injustices or view things that you regard as unjust in the world, to tell yourself that you would like to do something about this but you're simply somebody who lacks the power to really do anything about it.
[22:52.520 --> 23:01.380] That these forces are so formidable and powerful that there's simply no way for you to take meaningful steps in resistance to them. [23:02.320 --> 23:09.580] And to me the story of Edward Snowden should be the permanent antidote to that kind of defeatism. [23:09.780 --> 23:18.180] It's actually tempting to think that we're too powerless to confront injustice, because it actually relieves us of the obligation to do so. [23:18.180 --> 23:21.100] We have an incentive to accept that. [23:21.380 --> 23:30.360] And yet for me the Edward Snowden story, and having seen it up close, illustrates what a fiction that is that we can sometimes tell ourselves. [23:30.360 --> 23:31.060] What a fraud. [23:31.060 --> 23:34.960] Edward Snowden is somebody who grew up completely powerless. [23:34.960 --> 23:39.120] I mean, when I met him he was as ordinary a person as it gets. [23:39.120 --> 23:40.480] He was in his 20s. [23:40.480 --> 23:42.240] He was a high school dropout. [23:42.240 --> 23:47.200] He was an obscure employee of a huge mega corporation, Booz Allen. [23:47.200 --> 23:53.700] He was somebody who had no family connections, no wealth of any kind, as ordinary as it gets. [23:54.300 --> 23:59.400] And yet whatever else you think of him, he did change the world. [23:59.400 --> 24:00.600] He changed the world. [24:00.600 --> 24:17.820] He caused hundreds of millions of people, if not more, around the world to think in radically different ways about not just surveillance and the value of privacy in the digital age, but the danger of government secrecy and the role of journalism in a democracy and the role that the United States plays in the world.
[24:18.420 --> 24:45.320] And if you look at the most significant and unexpected instances of societal change, you typically find that they're actually catalyzed by extremely ordinary people, whether it's the civil rights movement in the United States being catalyzed by the refusal of an obscure woman to sit at the back of the bus or the Arab Spring challenging the most entrenched despots because an anonymous, [24:45.320 --> 24:50.700] completely ordinary street vendor in Tunisia set himself on fire in protest. [24:50.700 --> 25:10.060] These should be permanent antidotes to the temptation to think that we actually lack the power to do things about injustices that we perceive, and we don't need to go as far as Rosa Parks or the Tunisian street vendor or Edward Snowden or other whistleblowers in sacrificing and unraveling our lives. [25:10.060 --> 25:27.400] There are choices that we confront every single day in our lives as citizens, and in your jobs, that pose these same questions about the values that you believe in, and not just the ones you say you believe in, but the actions that you're willing to take in their defense. [25:27.400 --> 25:37.840] And whatever choices you make, you should never allow yourself to believe that you lack the power to have an impact on all of these questions. [25:38.780 --> 25:43.800] So the last point I just want to talk about is the other question that I'm asked most frequently. [25:43.800 --> 25:47.200] I began by talking about the one I'm asked most frequently, which is, well, what has changed? [25:47.200 --> 26:00.940] The second most frequent question I'm always asked, in any media interviews I do or events like this, is: you've done all these stories from the Snowden archive, you've done all these disclosures of documents, what has been the most shocking story to you? [26:00.940 --> 26:03.220] What has been the most shocking revelation?
[26:03.760 --> 26:23.060] And what I always say is that the most shocking revelation that I encountered when I began working on this material, and to this day I regard as the most significant, is the overall explicit goal of the NSA and its allies, which is captured by a phrase, [26:23.220 --> 26:32.800] a motto, that the NSA uses over and over and over again in multiple documents that it creates when describing its own institutional purpose. [26:32.800 --> 26:36.640] And that motto is, collect it all. [26:36.940 --> 26:40.480] That is the institutional mandate of the NSA. [26:40.480 --> 26:52.840] Not to collect lots of communications about terrorists or to collect lots of communications in the world, but to collect all electronic communications that take place by and between other human beings. [26:52.840 --> 26:57.660] And that's not me saying that, that's their own documents which have been published over and over. [26:57.840 --> 27:07.560] And that's another way of saying that the goal of the NSA and its allies is the elimination of individual privacy in the digital age. [27:07.560 --> 27:12.160] That is not hyperbole, that is a literal description of their institutional mandate. [27:12.160 --> 27:20.040] And they are not regarding this as some future pipe dream, but as an institutional goal which they are very close to fulfilling. [27:20.040 --> 27:28.200] The NSA and its partners collect billions, billions of communication events from around the world every single day. [27:28.200 --> 27:40.040] Their greatest challenge by far, institutionally, is where to store the gargantuan amount of data they are collecting, even though huge amounts of data can now be stored on tiny little thumb drives. [27:40.040 --> 27:46.860] And they are building new facilities in places like the UK and in Utah to store the information they're collecting. [27:46.860 --> 27:54.500] They want to collect the internet as a whole so that they can monitor that which they want to review. 
[27:54.880 --> 28:00.740] Now that poses obvious and very serious threats to individual privacy. [28:00.780 --> 28:03.460] And I don't really have time to talk about why that's so significant. [28:03.460 --> 28:17.440] I actually did a TED talk, for anyone who's interested, about why privacy matters and why not just terrorists and criminals, but every single human being in this room and outside of it, values privacy and has things to hide. [28:17.440 --> 28:33.700] And the things we lose as individuals when we allow governments or anybody else to subject us to the possibility of constant monitoring, the knowledge that we no longer have a place where we can go and think and explore without judgmental eyes being cast upon us. [28:33.700 --> 28:36.040] And you can go and look at that if you're interested. [28:36.040 --> 28:48.880] But beyond the subversion of privacy that that policy obviously poses, I think even more significant is the subversion of democracy that it poses. [28:49.520 --> 28:59.060] I mean, we can debate and reasonable people can debate and do debate the extent to which some of the details of these programs ought to be kept secret. [28:59.060 --> 29:06.480] And there are people who think we've published too much, and there are a lot of people, although they're not heard from as much, who think we've published too little. [29:06.480 --> 29:10.320] And if forced to choose, I would actually say that the latter critique is the more valid one. [29:10.320 --> 29:13.300] But those are reasonable debates that one can have. [29:13.480 --> 29:27.440] But what I think is not reasonable is to take the position that we would have been better off if we had remained completely ignorant of the broad strokes of what our government is doing. [29:27.440 --> 29:38.760] You can call your system of politics democracy and point to the fact that every four years, we get to go to the ballot booth and punch one of the two names to determine who occupies certain offices.
[29:38.760 --> 29:40.840] But democracy is really not meaningful. [29:40.840 --> 29:53.400] It's really a process that's illusory if the most consequential actions that the people who wield political power are taking are concealed from us in their entirety. [29:53.660 --> 30:14.640] And whatever else you think about these questions of mass surveillance, and the need to monitor, and all of those other questions, if the government is going to do something as profoundly and obviously consequential as turn the internet into a realm of indiscriminate mass surveillance to adopt a motto, [30:14.640 --> 30:22.080] collect it all, that's certainly something that in a healthy democracy, citizens had a right to know. [30:22.180 --> 30:37.620] And I think the question that Edward Snowden faced when he decided whether or not he should come forward and unravel his life to disclose this information to journalists, and the question that we as journalists faced when deciding whether or not to report on this, [30:37.620 --> 30:43.880] was the extent to which democracy required this information to be known. [30:44.100 --> 30:47.820] And for me, that is only the first step. [30:47.820 --> 30:53.040] We do now know the extent to which surveillance is being pursued. [30:53.040 --> 30:59.220] And I actually think that the question is now in all of your hands about what will be done about that. [30:59.220 --> 31:17.060] Do we actually want the internet to become this tool of unprecedented monitoring and coercion and surveillance, or do we think it's critical that individuals retain the ability to do things without others monitoring and watching what they're doing? [31:17.060 --> 31:29.540] And all I ask is that as you make the choices that will determine the outcome of that debate, that you think about those questions very seriously, as well as your duties as a citizen and as a member of a democracy. [31:29.540 --> 31:30.680] Thanks very much for listening. 
[31:30.680 --> 31:31.860] I really appreciate it.