[00:09.860 --> 00:11.180] Thank you. [00:11.180 --> 00:12.700] I'm absolutely thrilled to be here. [00:12.700 --> 00:14.140] Thanks so much. [00:14.140 --> 00:20.960] Two of the biggest names in entertainment are Steven Spielberg and Steven Bochco. [00:20.960 --> 00:29.000] And back in 1971, they teamed up on the very first regular episode of the TV series, Columbo. [00:29.000 --> 00:33.120] Spielberg directing and Bochco, who went on to create Hill Street Blues, writing. [00:33.120 --> 00:35.800] The episode was called Murder by the Book. [00:35.800 --> 00:43.180] Jack Cassidy was the guest star, playing one half of a murder mystery fiction writing team. [00:43.180 --> 00:51.400] But when his partner, who actually did all the writing, threatened to out Jack as a fraud, Jack killed him. [00:51.780 --> 01:02.740] Jack then wrote a letter implicating himself on his IBM Selectric typewriter and sent it off, trusting that no one could figure out that he had written the letter. [01:02.760 --> 01:09.800] But wily old Lieutenant Columbo eventually realized that he could hack Jack's communication. [01:09.800 --> 01:20.880] The electric typewriter Jack owned used a film ribbon, you see, and if you held it up to the light, you could read precisely what had been typed on it. [01:20.920 --> 01:30.820] That episode, back in 1971, was the first time I can recall the issue being raised of mail being hacked based on a flaw in technology. [01:30.820 --> 01:33.180] What's changed between then and now? [01:33.180 --> 01:44.720] Well, the biggest change, I think, is that Jack Cassidy had no idea that his correspondence was vulnerable to being recovered by someone on the machine he'd used to produce it. [01:44.720 --> 01:58.720] But nowadays, every savvy citizen is aware, at least in principle, that their email, their search history, their bank accounts, their instant messages, and everything else they do electronically can be intercepted. 
[01:58.720 --> 02:06.400] Indeed, just last night, I was watching the extended cut of Ron Howard's movie version of Dan Brown's novel, The Da Vinci Code. [02:06.400 --> 02:13.460] As he's walking through the Louvre, Tom Hanks notes the security cameras and says, are any of those real? [02:13.460 --> 02:17.160] And the French detective accompanying him says, of course not. [02:17.160 --> 02:20.740] And then, ah, so you know something about security procedures? [02:20.760 --> 02:27.880] And Hanks' reply is, well, I know that surveillance in a museum this size is cost prohibitive. [02:27.880 --> 02:49.760] In other words, even a decade ago, 2006, we were so used to the idea that we were being watched that we'd gone from having no idea, as in the Columbo episode, that our activities could be tracked, to complete acceptance that it was going to happen, [02:49.760 --> 02:51.000] at least in public. [02:51.000 --> 02:57.080] And now, of course, thanks to Moore's Law, those security cameras are real, ubiquitous, everywhere. [02:57.800 --> 03:01.440] But what about what happens in private? [03:01.440 --> 03:06.180] Don't we still want, nay, demand, privacy in our personal lives? [03:06.180 --> 03:15.140] Glenn Greenwald was one of the first reporters to see the files Edward Snowden had leaked, and he's gone on to become an articulate and very vocal advocate for the need for privacy. [03:15.140 --> 03:30.700] In his 2014 TED talk, he cites examples of people who thought they were alone doing silly dances, or singing off-key, becoming mortified when videos of them doing those things were uploaded to YouTube. [03:30.700 --> 03:41.600] He argues that this simple joy of making a fool of oneself, when you think no one is around, will disappear under a surveillance society. [03:41.660 --> 03:43.720] But is that true? [03:43.720 --> 03:45.680] I don't think so. 
[03:45.680 --> 03:52.060] Rather than being mortified by videos of them doing silly things, young people today upload them themselves. [03:52.060 --> 03:59.580] Instead of surveillance constraining silly behavior, we become exhibitionists, reveling in sharing it. [03:59.580 --> 04:02.220] We revel in reality television. [04:02.220 --> 04:08.420] Privacy is shucked off because we'd rather have an even more elusive commodity, attention. [04:09.320 --> 04:13.440] The greatest science fiction editor who ever lived was John W. [04:13.440 --> 04:14.360] Campbell, Jr. [04:14.360 --> 04:26.560] Besides writing the story upon which all the movie versions of The Thing were based, he also edited Astounding Science Fiction, later renamed Analog Science Fiction and Fact, the leading science fiction magazine. [04:26.560 --> 04:30.580] He edited it from 1937 until he died in 1971. [04:30.580 --> 04:36.740] Campbell had a saying: the future doesn't happen one at a time. [04:36.740 --> 04:38.760] And that's exactly right. [04:38.760 --> 04:43.520] No social or technological innovation happens in isolation. [04:43.520 --> 04:48.320] If privacy norms change, society changes. [04:48.320 --> 05:03.420] Sure, there have long been frequent front page stories about privacy breaches or security lapses; the battle between Apple and the FBI over whether Apple should unlock the San Bernardino terrorist Syed Farook's iPhone is a recent example. [05:03.520 --> 05:06.540] But that's not happening in isolation. [05:06.960 --> 05:10.160] What else is front page news these days? [05:10.160 --> 05:14.920] Well, for one thing, Canada is looking to decriminalize marijuana possession. 
[05:14.920 --> 05:31.920] And we're seeing a lot of tsuris, to use a good Yiddish word, about who should be allowed to use which washrooms, that is, which public restrooms, with many opponents finally throwing in the towel on the transgender issue, and gender-neutral restrooms rapidly becoming the norm on university campuses, [05:31.920 --> 05:36.360] which are, as we all know, often at the cutting edge of social change. [05:36.360 --> 05:41.020] And they'll doubtless, within a decade, become the norm everywhere. [05:41.020 --> 05:55.420] That change comes on the heels of the United States finally coming to recognize the rights of gay people to be married, to publicly be a couple, something we've recognized in Canada for 13 years now, without society falling apart. [05:55.940 --> 05:56.820] I'm an author. [05:56.820 --> 06:00.500] I know well that people in the publishing industry tend to be poorly paid. [06:00.500 --> 06:11.620] But in 2012, Random House USA gave every single employee who had been with the company more than a year a $5,000 Christmas bonus. [06:11.620 --> 06:12.300] Why? [06:12.300 --> 06:21.800] Because of the massive runaway success of one book, Fifty Shades of Grey, which is a novel of BDSM erotica. [06:21.800 --> 06:29.260] That sort of soft porn has always existed, but it was bought in plain brown wrappers and hidden away. [06:29.260 --> 06:30.600] But not anymore. [06:30.600 --> 06:34.460] People were happily seen buying and reading that book in public. [06:34.560 --> 06:40.180] Meanwhile, all sorts of people are watching Game of Thrones, based on the novels by my friend George R. [06:40.180 --> 06:40.280] R. [06:40.280 --> 06:46.120] Martin, which is peppered with not just graphic violence, but with nudity and sexuality. [06:46.120 --> 06:50.780] Stuff that would have been banned on TV a decade ago is mainstream now. [06:51.800 --> 06:54.140] What have all these things got in common? [06:54.440 --> 06:56.360] Legalizing pot smoking? 
[06:56.360 --> 06:59.040] Not having to hide your transgendered status? [06:59.040 --> 07:00.480] Being openly gay? [07:00.480 --> 07:02.780] Smut becoming a mass consumer item? [07:02.780 --> 07:07.880] People freely admitting that they watch movies and TV with nudity? [07:07.880 --> 07:10.820] It's exactly what editor John W. [07:10.820 --> 07:12.220] Campbell was talking about. [07:12.220 --> 07:16.060] The future not happening one at a time. [07:16.060 --> 07:26.540] If we can't keep secrets anymore, if no one can, then we start reducing the number of things that are considered shameful. [07:26.540 --> 07:35.800] We do triage on the mortifying, and we reject as irrelevant things that heretofore had been verboten. [07:35.800 --> 07:45.000] Yes, a loss of privacy on a society-wide level changes behavior, but it doesn't necessarily make it less open. [07:45.000 --> 07:51.140] It can, and, as I have cited above, demonstrably does, make it more open. [07:51.260 --> 07:55.820] I don't smoke pot myself, maybe you don't either, but lots of people do. [07:55.820 --> 08:11.560] And so, as a society, we've said, well, if that's something lots of people used to have to do in private because it was considered bad or a moral failing, I remember Reefer Madness, we shouldn't consider it to be those things anymore. [08:11.560 --> 08:21.600] And sure, there are still good and valid reasons why some gay people stay in the closet, but millions, millions have come out of the closet. [08:21.600 --> 08:27.700] Our Prime Minister, Justin Trudeau, will march in Toronto's Pride Parade this year on July 3rd. [08:27.700 --> 08:31.400] The secret stairwell for pot smoking is gone. [08:31.400 --> 08:34.600] The closet door is opening wide. [08:34.600 --> 08:37.860] The plain brown wrapper is coming off. [08:37.860 --> 08:45.820] See, the group being most discommoded by lack of privacy isn't those who would do things they've been schooled to believe are aberrant. 
[08:45.820 --> 08:54.460] Rather, it's the scolds, the controlling few who wanted to hound and hurt others for simply being human. [08:54.460 --> 08:57.920] They want you to be afraid of exposure. [08:57.920 --> 09:11.980] It's a fact that all three Speakers of the House involved with the impeachment of Bill Clinton for an extramarital affair with Monica Lewinsky ended up being outed as having sex scandals of their own. [09:11.980 --> 09:25.160] And it's become a cliché for a senator or a congressperson or a lawmaker who violently opposes something on moral grounds to turn out to be equally guilty of the same acts themselves. [09:25.160 --> 09:28.320] Except, guilty is the wrong word. [09:28.320 --> 09:33.540] The people they hounded and excoriated as guilty aren't guilty of anything. [09:33.540 --> 09:46.000] And the scolds themselves were guilty in most cases of little more than hypocrisy, a human trait harder and harder to get away with as we become a more open society. [09:46.000 --> 09:49.660] Good riddance to hypocrisy, I say. [09:49.740 --> 10:02.780] The more we're watched, the more we see that there are countless others just like us, that the things we've been taught are shameful or private are natural, the better off we all are. [10:02.780 --> 10:23.140] The Victorian era ended officially in 1901, 115 years ago, before commercial radio, before powered flight, before computers, before TV, before nuclear power, before space flight, heck, before electricity and indoor plumbing in most Canadian homes. [10:23.140 --> 10:26.780] But finally, we're shucking off its vestiges. [10:26.780 --> 10:28.360] We're saying, enough! [10:28.360 --> 10:30.780] You can't make me feel guilty. [10:30.780 --> 10:32.800] You can't shame me. [10:32.800 --> 10:35.180] You can't suppress me. [10:35.180 --> 10:37.980] It's my world, too. [10:38.220 --> 10:41.700] Sure, some behaviors are curtailed by surveillance. 
[10:41.700 --> 10:52.480] Even just putting a picture of a pair of watching eyes above the coffee kitty in an office lunchroom cuts down on those who steal coffee from their co-workers. [10:52.480 --> 10:56.780] Speeders who recklessly endanger all of us are now routinely caught. [10:56.780 --> 11:04.800] And yes, millions more who might otherwise have recklessly endangered all of us don't speed at all. [11:04.800 --> 11:09.120] Muggings and rapes happen in the dark corners, not out in the light. [11:09.120 --> 11:15.080] Do we really want to protect the right to steal, to hurt, to harm, to kill? [11:15.140 --> 11:16.360] Of course not. [11:16.360 --> 11:28.520] As for peccadilloes, well, turns out that maybe not everyone is doing the same thing, but so many are that it's pointless to object. [11:29.160 --> 11:33.260] Let me flash back a few decades, or a decade and a half anyways. [11:33.260 --> 11:44.780] 16 years ago, the year 2000, Eastman Kodak down in Rochester, New York, approached me and a few other science fiction writers to come do an ideation session for them. [11:44.780 --> 11:54.520] They were concerned about the long-term future of their core business, which was still image photography, and they wanted new ideas for imaging products. [11:54.520 --> 12:00.840] I said to them, look, the biggest problem with your business model is that it's reactive rather than proactive. [12:00.840 --> 12:07.640] For a third of a century at that point, 2000, they'd been using the slogan, that's a Kodak moment. [12:08.040 --> 12:24.680] But the truly crucial moments, when a punk sticks a gun in your ribs, when another car sideswipes yours, when you accidentally leave your favorite hat somewhere, when your baby utters its first word, go unrecorded simply because we didn't know they were about to happen. 
[12:24.940 --> 12:35.520] I proposed to them what's since come to be known as life logging: constant recording by an individual of everything he or she does, tied into GPS tracking. [12:35.540 --> 12:36.020] Why? [12:36.020 --> 12:36.800] Simple. [12:36.800 --> 12:47.620] It's way easier to record everything and pluck out what you want after the fact than to try to guess when something significant is going to occur. [12:48.160 --> 12:53.780] Kodak did not take my suggestion, but if they had, they might still be in business. [12:53.860 --> 13:03.920] But I went on and wrote a trilogy of novels about a world in which everyone is doing life logging: the Hugo Award-winning Hominids and its sequels, Humans and Hybrids. [13:03.920 --> 13:07.900] And the sort of world I portrayed is inevitable, I'm convinced. [13:07.900 --> 13:15.620] The current Russian trend of buying dash cams as protection against police overreach is just the tip of the iceberg. [13:15.620 --> 13:23.260] As battery technology improves and electronics require less and less power, soon our phones will be listening 24-7. [13:23.260 --> 13:29.480] Heck, they already do if you have Siri or another personal assistant set to be voice activated. [13:29.480 --> 13:32.500] More than that, the cameras will always be on. [13:32.500 --> 13:38.780] Storage space is rapidly approaching the point where it's effectively free in unlimited quantities. [13:38.780 --> 13:43.020] We're already moving from handheld computers to wearable computers. [13:43.020 --> 13:46.440] In a decade, we'll be looking at implanted computers. [13:46.440 --> 13:48.980] They'll be part of you. [13:48.980 --> 14:06.500] And with those sorts of wearables or implantables, plus the move toward cloud computing, storage of your data off your device at remote locations, it makes the alibi archives, as I call them, not only plausible, but inevitable. 
[14:06.840 --> 14:15.100] No one but you, or if you disappeared, your family or the police, should be able to access the contents of your personal off-site black box. [14:15.100 --> 14:24.480] But if you did disappear, kidnapped, lost, fallen down a hole, wandering aimlessly because of Alzheimer's, you could be quickly found. [14:24.480 --> 14:28.000] No more missing persons, no more desperate searches. [14:28.000 --> 14:29.880] That sounds useful, doesn't it? [14:29.880 --> 14:33.360] Now what about adding a constant transmission of your vital signs? [14:33.360 --> 14:39.080] If they indicated you were having a heart attack or a stroke, an ambulance could be automatically dispatched. [14:39.120 --> 14:40.260] And it gets better. [14:40.260 --> 14:48.420] If everyone's actions were recorded for their eyes only, unless a proper court order demanded otherwise, think of the reduction in crime. [14:48.420 --> 14:55.960] Who would assault, murder, or rape if they knew that the victim would have a complete off-site record of the event? [14:56.440 --> 14:59.980] Okay, I can see some of you squirming out there. [14:59.980 --> 15:05.040] Yeah, but, you want to say, citing any number of objections to what I've just said. [15:05.180 --> 15:09.300] But have you been following the science news this week? [15:09.320 --> 15:19.820] Earlier this week, NASA announced that its Kepler satellite had discovered 1,284 additional planets outside our solar system. [15:19.820 --> 15:21.920] Now, what has that got to do with privacy? [15:21.920 --> 15:27.860] And why should we think twice before we defend the rights of people to do whatever they want? [15:27.880 --> 15:37.780] Well, see, there's a long-standing problem in astronomy called the Fermi paradox, named for Enrico Fermi, who first proposed it in 1950. 
[15:37.780 --> 15:49.080] If the universe should be teeming with life, as all of our indications about biology and evolution and chemistry and physics suggest, then, asked Fermi quite reasonably, where are all the aliens? [15:49.080 --> 15:52.020] The question is even more vexing today. [15:52.020 --> 16:00.600] SETI, the Search for Extraterrestrial Intelligence, with radio telescopes, has utterly failed to turn up any signs of alien life. [16:00.600 --> 16:03.660] They've been looking for 56 years now. [16:03.660 --> 16:05.280] Why has it failed? [16:05.280 --> 16:25.980] One chillingly likely possibility, that is, one reasonable solution to the Fermi paradox, is that as the ability to wreak damage on a grand scale becomes more readily available to individuals, soon enough just one malcontent or one lunatic will be able to destroy an entire world. [16:25.980 --> 16:38.140] Perhaps countless alien civilizations have already been wiped out by single terrorists who had been left alone to work unmonitored in their private homes and laboratories. [16:38.220 --> 16:45.220] We've already seen on Earth what one crazed suicide bomber can do with early 21st century technology. [16:45.220 --> 16:54.440] Imagine the devastation he or she might manage with the ordnance and genetic capabilities that will be freely available within the next few decades. [16:54.440 --> 17:01.800] We can be sure that those who wish society harm will be taking full advantage of advanced technologies. [17:01.800 --> 17:07.680] Why shouldn't we take advantage of oversight technology to protect ourselves? [17:07.740 --> 17:11.160] But what about the bogeyman of totalitarianism? [17:11.160 --> 17:18.500] Again, it was privacy that made Hitler's Final Solution come within a hair's breadth of succeeding. 
[17:18.500 --> 17:32.440] But it was the lack of privacy, the openness of communication through the internet, that prevented the Chinese government from covering up the 1989 massacre in Tiananmen Square or from trying anything similar since. [17:32.440 --> 17:34.900] But what about military secrets? [17:34.900 --> 17:40.740] Well, it's the aggressors who benefit from the ability to do things clandestinely. [17:40.740 --> 17:57.740] If the Japanese had been privy to the July 16, 1945 A-bomb test explosion in Alamogordo, New Mexico, I doubt they would have needed to be surprised by bombs dropping on Hiroshima and Nagasaki before surrendering. [17:58.200 --> 18:06.140] President Obama apparently isn't going to apologize for keeping the bomb secret when he visits Hiroshima later this month, becoming the first sitting U.S. [18:06.140 --> 18:07.800] president ever to do so. [18:07.800 --> 18:19.340] But in reality, some transparency, we have this device now and here's how much damage it could do, might well have saved over 100,000 civilian lives. [18:19.400 --> 18:33.460] Whether we want American-style life, liberty, and the pursuit of happiness, or Canadian peace, order, and good government, clinging to privacy at all costs is the worst thing we can do. [18:33.460 --> 18:45.440] For as the silence from the stars attests, not only is an unexamined life not worth living, it may be that unexamined lives are too dangerous for us to allow them to be lived. [18:45.440 --> 18:52.640] The very future of humanity may depend on giving up some of our outmoded views on privacy. [18:52.660 --> 18:58.860] It's a fundamental part of growing up, becoming accountable for your actions. [18:58.860 --> 19:09.040] To invoke the title of one of the greatest science fiction novels of all time, as far as privacy is concerned, we may finally be at Childhood's End. [19:09.040 --> 19:15.080] Meanwhile, dance like no one is watching, even if everyone is. 
[19:15.080 --> 19:20.740] We're all human, and that's a fact none of us can hide any longer. [19:20.740 --> 19:22.200] Thank you very much. [19:28.230 --> 19:29.290] Thank you. [19:30.550 --> 19:32.430] We have a digital timer here. [19:32.430 --> 19:37.250] I have five minutes and 54 seconds to take questions, and there is somebody going around. [19:37.250 --> 19:43.190] Mindy has a mic over here, and I will do my best to riposte and parry. [19:44.410 --> 19:46.410] I greatly enjoyed your talk. [19:46.410 --> 19:46.930] Thank you. [19:46.930 --> 19:51.390] And I greatly enjoy your writings, but I have to tell you, I couldn't disagree with you more. [19:51.390 --> 20:06.030] To suggest that we have to give up our privacy in order to benefit from other things and be accountable and transparent, that dated zero-sum proposition of either/or, will completely eliminate not only privacy, but our freedom, innovation, [20:06.030 --> 20:07.010] and prosperity. [20:07.010 --> 20:11.970] To suggest that it has to be one or the other, oh my god, that is what is going to tank us. [20:11.970 --> 20:15.730] Yes, I sat in on your talk this morning and I realized we were going to disagree. [20:15.730 --> 20:21.570] The point, of course, of having a symposium is to have a plurality of viewpoints. [20:21.570 --> 20:23.310] I agree that you and I disagree. [20:23.310 --> 20:29.610] The policy makers, the regulators, which you used to be, will make their decisions as time goes on. [20:29.610 --> 20:37.030] But there has to be, and I'm all in favor of, non-zero-sum solutions. [20:37.250 --> 20:41.110] I didn't say, I don't think I ever once said, the words eliminate privacy. [20:41.110 --> 20:44.450] I did say, however, that there's a balance that has to be achieved. 
[20:44.450 --> 21:00.410] And I think it's important, when we have people who are selling privacy solutions as a way of making money, when we have regulators whose portfolios depend on the continuation of privacy as a form of job security, that we look at a larger policy issue here. [21:00.410 --> 21:04.110] So I know you're going to disagree completely and totally with me. [21:06.310 --> 21:06.930] No, no, no, no. [21:06.930 --> 21:08.850] But we have an industry of privacy. [21:08.850 --> 21:10.790] We have an industry of privacy now. [21:10.790 --> 21:14.310] So it has to be looked at in the context of the societal whole. [21:14.310 --> 21:29.310] I don't think anybody else here, for instance, suggested during the course of these two days that the changes in society that we've seen in the 21st century go hand-in-hand with the general reduction of privacy and the increase in accountability. [21:29.310 --> 21:31.770] That's actually, I think, worth considering. [21:31.770 --> 21:33.330] I understand you disagree with me completely. [21:33.330 --> 21:35.010] You did have your time on this stage. [21:35.010 --> 21:36.530] I thank you for your comment. [21:43.070 --> 21:57.790] So when you invoked Godwin's law and talked about the Nazis, I can't help but say that privacy and transparency represent social power. [21:57.790 --> 22:03.830] And privacy is an attempt by the people that were oppressed to maintain their power. [22:03.830 --> 22:05.730] So that makes it a fundamental right. [22:05.730 --> 22:14.410] The transparency you talk about is the transparency of the powerful, where they have to reveal to the citizens what it is they're doing. [22:14.410 --> 22:24.130] But at the same time, the citizens have to maintain their potential for anonymity, have to maintain their multiple identities, have to maintain their privacy. 
[22:24.130 --> 22:31.970] So in that sense, I agree with what you say about transparency, but it's transparency for the powerful and privacy for the powerless. [22:31.970 --> 22:34.750] And I think that's where we want to go. [22:35.490 --> 22:37.170] And I don't disagree. [22:37.290 --> 22:40.650] In fact, of course, I'm a huge advocate of transparency. [22:40.690 --> 22:46.170] When one invokes Godwin's law, one is saying that basically Hitler was sui generis and it can never happen again. [22:46.170 --> 22:55.670] We are about to have an election in the United States that might, in fact, suggest that it was not a one of a kind and that comparisons are apropos in the current political climate. [22:55.670 --> 23:01.810] So don't be quite as quick to dismiss any comparison to what went wrong in the past. [23:01.950 --> 23:09.930] Privacy and transparency are also not, I think, as completely divided into one half and the other half as you've indicated. [23:10.010 --> 23:16.570] There's transparency on the individual level and privacy on the corporate level or on the government level as well. [23:16.990 --> 23:20.150] And we need to find a middle ground. [23:20.510 --> 23:33.550] One of the reasons to have a provocateur, or somebody who has a position completely at the other end, is that hopefully in the middle we find a ground that actually represents all stakeholders' positions and concerns. [23:33.650 --> 23:48.250] And the bottom line is, we are conducting an enormous worldwide experiment in how society is going to live in an era of mass data, mass telecommunications, mass surveillance. [23:48.250 --> 23:58.790] And we do not actually know, no one, not me, not her, not anybody else, can assert that we're going to survive the 21st century with the solutions that we choose to make. 
[23:58.790 --> 24:07.570] We are a precarious civilization at the moment, because of international terrorism, because of all sorts of concerns about financial collapse and so forth. [24:07.570 --> 24:11.990] And we are not, any one of us, certain of the outcome. [24:11.990 --> 24:14.930] If you're a betting person, I invite you to place your bets. [24:14.930 --> 24:22.230] We can all reconvene at the 2116 Privacy Summit, if there's a world in 2116. [24:22.390 --> 24:24.590] We've got time for one more very quick question. [24:28.650 --> 24:29.070] Oh, we don't? [24:29.070 --> 24:30.370] My timer says one minute. [24:30.370 --> 24:31.850] Okay, thank you all very, very much. [24:31.850 --> 24:32.510] Thank you.