[00:32.430 --> 00:37.090] So this talk will be a little different. It's an idea that I've been kind of [00:37.090 --> 00:40.910] mulling around in my brain for the past six months or so. So this is [00:40.910 --> 00:44.470] the first time I'm testing it out with an audience. You're gonna help me [00:44.470 --> 00:49.390] crowdsource it. So my name is Anita Nikolich. I just finished up a tour at the [00:49.390 --> 00:54.250] National Science Foundation. I was a program officer for cybersecurity. NSF, [00:54.250 --> 00:58.150] the National Science Foundation, funds most of the basic research in the U.S., and [00:58.150 --> 01:03.390] they fund almost 90% of computer science research. So security sits under the [01:03.390 --> 01:08.590] umbrella of computer science. Specifically, I was a program officer for [01:08.590 --> 01:14.390] networking security, social networking, and anti-censorship. So that's just to give you [01:14.390 --> 01:17.390] some background. And prior to that, my background is mostly in networking [01:17.390 --> 01:24.070] security. So I've been reading this book about amateur scientists, [01:24.070 --> 01:27.810] professional scientists, and how the twain sometimes meet, sometimes don't. I [01:27.810 --> 01:32.090] saw this cool quote from Arthur C. Clarke. He's the author of 2001: A Space [01:32.090 --> 01:34.950] Odyssey. And I thought this was really interesting. I definitely encountered [01:34.950 --> 01:38.850] this a lot in my time at NSF, because there are a lot of distinguished [01:38.850 --> 01:42.510] scientists that come back and forth through the halls. So I thought this was [01:42.510 --> 01:48.150] very interesting. So here's the proposition. And as you'll see, I [01:48.150 --> 01:53.530] ran this past a bunch of people, and the semantics change depending on what your [01:53.530 --> 01:57.590] preference is.
But the proposition comes from, you know, the past four years that [01:57.590 --> 02:01.150] I've spent in the government at NSF. And prior to that, I was at a university. [02:01.150 --> 02:05.890] Prior to that, I was just in IT operations. There is a lot of money [02:05.890 --> 02:11.250] spent doing academic security work. A lot of this work never gets any place. And [02:11.250 --> 02:14.150] we'll talk a little bit about that. It kind of lingers in papers that are behind [02:14.150 --> 02:17.750] paywalls. At the same time, there's a lot of cool stuff I hear at these [02:17.750 --> 02:22.410] conferences that sometimes doesn't make it, you know, the full way to [02:22.410 --> 02:27.350] either improving security in a product or getting it out to people who [02:27.350 --> 02:30.850] could use it. That doesn't happen either. Sometimes it does. But the proposition is: [02:30.850 --> 02:34.450] how can we get, I'll call them non-professional, you know, us basically, [02:34.450 --> 02:40.410] non-professional researchers, operators, and official academic researchers to [02:40.410 --> 02:44.170] work together somehow? So some things to consider. Is there really even a [02:44.170 --> 02:47.290] distinction between these two? Is it becoming meaningless, particularly for [02:47.290 --> 02:51.190] security and privacy? Should it become meaningless, or are there roles for each? [02:51.190 --> 02:55.590] Should we have academics who, you know, sometimes can't push the limits as [02:55.590 --> 03:01.050] much as we can, not being in academia? Or is research just research? Before about [03:01.050 --> 03:06.590] 1970, academics could use money very freely. You saw it from World War II [03:06.590 --> 03:10.210] through the 50s and 60s: they were given money, and they [03:10.210 --> 03:13.410] could explore things. They didn't have to worry so much about publishing and [03:13.410 --> 03:17.510] tenure as they do now.
Modern day, there's something called peer review. And if [03:17.510 --> 03:20.370] you're not familiar, I'll talk a little bit about it. But peer review [03:20.370 --> 03:25.110] means these ideas come to a panel. People like me, we invite distinguished experts [03:25.110 --> 03:29.890] to come and listen and be thoughtful about ranking these ideas. And peer [03:29.890 --> 03:34.690] review is kind of designed to prevent major shifts. Everything is very [03:34.690 --> 03:38.610] incremental, in that when you bring up a novel idea or something really [03:38.610 --> 03:43.150] transformative, your fellow peers often say, well, you know, that's not [03:43.150 --> 03:48.510] safe, that may not work. And science has become more like a business. So what [03:48.510 --> 03:51.830] prompted me to think about this? This isn't really anything new. I'll give you [03:51.830 --> 03:56.050] some examples of where it has been happening. I remember back in high [03:56.050 --> 03:59.130] school one of my teachers talking about, and I don't have the [03:59.130 --> 04:02.470] correct name of it, this great books program in New York City. Basically, it [04:02.470 --> 04:07.350] was ordinary people talking with Columbia professors about, you know, the [04:07.350 --> 04:10.530] great books. So they would bring people in. It's more common now, you see [04:10.530 --> 04:14.790] meetups, but this was back in the 60s or 70s. And the insight was: wow, all these [04:14.790 --> 04:18.650] average people who read these classic works have [04:18.650 --> 04:23.490] insights that are pretty thoughtful. So that always stuck with me. The [04:23.490 --> 04:28.290] past four years I've been at NSF. NSF spends, and this is just one agency, [04:28.290 --> 04:32.410] there are others, DARPA and others, NSF spends 80 million dollars a year on [04:32.410 --> 04:37.470] basic privacy and security research.
Many other agencies spend millions on [04:37.470 --> 04:42.490] applied work and on basic work. So why are there still so many security problems? [04:43.290 --> 04:47.510] One of the programs that I started there was this Transition to Practice program. [04:48.390 --> 04:52.870] I stole the name from Homeland Security, but basically what I wanted to address was, [04:52.870 --> 04:56.950] you know, that with all this money being spent, and the program that I was on was [04:56.950 --> 05:01.210] around for ten years, so that's what, eight hundred million dollars, some [05:01.210 --> 05:05.450] of the same issues are still there year after year. So what can we do to [05:05.450 --> 05:09.910] transition some academic work and get it into companies, get it in the hands of people [05:09.910 --> 05:13.970] who can use it? Many academics were very upset about this. They said, we want money [05:13.970 --> 05:17.770] to just do basic security work, and that's gonna take away from it. Making [05:17.770 --> 05:22.210] it more practical, we can't do that. So it was kind of a mixed bag, and it evolved [05:22.210 --> 05:25.950] over, you know, the four years I was there, to where people accepted that we're [05:25.950 --> 05:29.970] not taking away your right to do basic crypto work, but there's a lot of work [05:29.970 --> 05:33.390] that's sitting there that never gets into the hands of people who can use it. [05:33.750 --> 05:37.670] Another thing I noticed as I went to a lot of these conferences, and I was [05:37.670 --> 05:41.370] lucky, for my job I got to go to a lot of them: if you look at the agendas, and [05:41.370 --> 05:46.450] I didn't have time to do this, but in, say, the past five to [05:46.450 --> 05:50.450] six years, you look at the agendas for these kinds of cons and academic [05:50.450 --> 05:55.450] conferences, and they're very similar.
The topics are very similar, the approaches [05:55.450 --> 05:59.810] are very similar, but neither kind of wants to be seen at the other's [05:59.810 --> 06:03.290] conferences. So when I'd say, you know, I'm gonna put in my travel, let's go to [06:03.290 --> 06:07.090] DEF CON, and ask any of the, we call them principal investigators, are any of [06:07.090 --> 06:12.190] you going? Many academics just will not go there. Can that be overcome, [06:12.190 --> 06:16.490] so that, you know, people attend each other's events? Some talks that inspired [06:16.490 --> 06:22.530] me: at THOTCON last year, maybe the year before, the Cyber Squirrel talk. Cris Thomas [06:22.530 --> 06:26.950] talked about, you know, the premise being that squirrels are a bigger threat to ICS [06:26.950 --> 06:30.350] systems than actual hackers. I thought that was really interesting, so I brought [06:30.350 --> 06:34.450] him into NSF, where we usually get, like, Nobel Prize winners and serious talks, [06:34.450 --> 06:38.370] and he gave this talk. And it's interesting: half the room walked out, and [06:38.890 --> 06:42.050] half the room just thought it was awesome. I thought, you know, there's [06:42.050 --> 06:45.430] something to that. This project is kind of [06:45.430 --> 06:48.110] tongue-in-cheek, but there's a lot of quantitative metrics and [06:48.110 --> 06:51.850] approach to it that's very interesting. So that struck a nerve with me; [06:51.850 --> 06:55.770] there's something to it.
I'm sure many of you know, you know, Jay Radcliffe, this [06:55.770 --> 07:00.490] hacker researcher. He talked about his project a few years ago, hacking his [07:00.490 --> 07:04.330] own insulin pump and going to Johnson & Johnson and getting them to fix it. And [07:04.330 --> 07:07.790] he had a nice talk last year, and that really inspired me to [07:07.790 --> 07:12.350] crowdsource this: you know, can we inspire this community to maybe [07:12.350 --> 07:17.290] use more of a scientific method, to get some of our things that are not [07:17.290 --> 07:21.250] academic research to get traction? Maybe take things a little more [07:21.250 --> 07:29.430] seriously, maybe use a little more methodology. And do we want to? So at NSF [07:29.430 --> 07:35.930] we were asked a lot, you know, when an idea came across our desks or something, [07:35.930 --> 07:40.050] what does the community think? What does the community say about this? You [07:40.050 --> 07:44.430] know, and me not being an academic, I kind of landed by circumstance at NSF, I'd say, [07:44.430 --> 07:47.070] well, who is the community? What do you mean by that? I mean, I came from security [07:47.070 --> 07:51.190] operations. When you say the community, who are you talking about? If it's the [07:51.190 --> 07:55.430] same people, you know, writing the same papers for the same conferences, is that [07:55.430 --> 08:00.190] the community? Does that include operators? So, you know, what does that really mean? So this [08:00.190 --> 08:02.930] isn't a new problem. It got me to thinking, you know, are there other [08:03.430 --> 08:07.510] disciplines, other areas besides security, where this might have occurred? So I'm in [08:07.510 --> 08:13.010] one of these meetups for science books. One of the books they had us read was [08:13.010 --> 08:17.390] about Mendel. And I'm not a biologist, not familiar with it, but it's a very [08:17.390 --> 08:22.870] interesting book.
And I learned about Mendel. If you recall your [08:22.870 --> 08:26.490] high school biology, he's the father of Mendel's Laws of Inheritance, which [08:26.490 --> 08:31.010] explain heredity and how traits are passed down. Well, he grew up on a [08:31.010 --> 08:35.290] farm. He worked as a gardener and a beekeeper, a quiet guy. He ended up as a [08:35.290 --> 08:42.890] monk in this abbey in Brno. And what he did for many years was grow pea [08:42.890 --> 08:47.070] plants. He literally just had pea plants in the garden, and he did these [08:47.070 --> 08:51.450] experiments. And he kept very meticulous notes. He wanted to develop new color [08:51.450 --> 08:55.670] variants and examine hybridization. And this had never been done before. So he [08:55.670 --> 08:58.810] did this for many, many years and wrote a paper, which he sent to these [08:58.810 --> 09:03.030] proceedings. He sent it to the Royal Society, the Linnean Society, the [09:03.030 --> 09:07.870] Smithsonian. And just because he wasn't a professional scientist, the [09:07.870 --> 09:12.510] paper wasn't taken seriously. It was put into this very obscure journal [09:12.510 --> 09:17.690] and forgotten about. Well, years later this Dutch botanist got the paper from a friend who was [09:17.690 --> 09:21.130] cleaning out his stuff prior to moving, and he said, this is [09:21.130 --> 09:26.230] amazing. So it took many years. It wasn't until 1909 that all these things were [09:26.230 --> 09:30.790] finally pinned to the discoveries Mendel first described in that paper, which [09:30.790 --> 09:35.170] just by happenstance happened to be found. So that got me [09:35.170 --> 09:39.210] thinking, you know, it seems like there's a similar thing going on in our [09:39.210 --> 09:44.970] community. Another example, which I love because I live in Chicago: this is Sue, [09:44.970 --> 09:48.870] the T. rex Sue at the Field Museum.
It's the biggest, best preserved; if you've [09:48.870 --> 09:52.590] never seen it, it's this amazing-looking T. rex. Well, Sue Hendrickson, it was named [09:52.590 --> 09:57.790] after her, she found it. And she was a high school dropout. She moved to Florida [09:57.790 --> 10:02.730] to go diving. She lived with her uncle, and she was an explorer, an adventurer. She [10:03.210 --> 10:08.270] caught fish to sell to aquariums; rare ones she would just sell at cost [10:08.270 --> 10:12.390] to museums. And she hooked up with some explorers and said, I'll help you dig up [10:12.390 --> 10:18.010] fossils. One day their tire was flat, so they had to spend the night [10:18.010 --> 10:21.410] there. She said, you know, there's this ridge we haven't explored yet. And it was [10:21.410 --> 10:25.290] on her instinct, from doing this for many years, that she, not the [10:25.650 --> 10:30.150] paleontologist, not the geologist, an untrained explorer, had the tenacity, and [10:30.150 --> 10:37.210] she found this fossil. So that's another example. You know, I think astronomy is [10:37.210 --> 10:42.130] another great example. Although in computer science and security we typically have the [10:42.130 --> 10:44.890] tools at our disposal, versus something like [10:44.890 --> 10:47.990] astronomy. But there's a couple of interesting examples. I never knew much about [10:47.990 --> 10:53.010] the Shoemaker-Levy comet, which of course many of us have heard about. I didn't [10:53.010 --> 10:58.110] realize that Carolyn Shoemaker, who is one of its discoverers, at one point [10:58.110 --> 11:02.070] had found the greatest number of these asteroids and comets of any [11:02.070 --> 11:05.710] person. I think that's been superseded. But she was not formally trained. She was [11:05.830 --> 11:10.070] a housewife. She didn't start her observations until she was 51 years old. [11:10.070 --> 11:15.790] She just kind of was a fan.
Her husband was an astronomer. She had no [11:15.790 --> 11:18.670] training in this at all, but she made a really big difference in this [11:18.670 --> 11:25.190] field. So I thought this was interesting. I'm sorry my slides are a little all [11:25.190 --> 11:30.590] over the place. But I put together, side by side, just randomly grabbed from a few [11:30.590 --> 11:35.330] years ago, some talks from DEF CON and some from USENIX Security. [11:35.330 --> 11:39.410] USENIX Security is a big event people like to publish in. So I [11:39.410 --> 11:42.870] put them next to each other, and, you know, I wonder if people can really [11:42.870 --> 11:48.870] tell which is which. Maybe some of you have been, but if you take a look at it, [11:48.870 --> 11:54.290] look at the topics. I don't know if anybody has a guess, or if you maybe know [11:54.290 --> 12:00.010] this already. So the one on the left is USENIX. The one on the right is DEF CON. [12:00.610 --> 12:04.970] You look through the titles and the topics, and you could almost switch the [12:04.970 --> 12:10.510] two around and you could be at one or the other conference. So, some things [12:10.510 --> 12:14.050] we're not talking about here. Citizen science. There's this really good [12:14.050 --> 12:17.030] article, which I know is a little hard to read on the bottom, but I could send you [12:17.030 --> 12:20.530] the link. A really good article on citizen science. Citizen science is where [12:20.530 --> 12:25.890] scientists set up experiments and people help contribute to them. And mostly the [12:25.890 --> 12:29.150] scientists get the benefits. They're the ones that get the fame. [12:29.150 --> 12:32.930] The same with SETI. That's where you donate your compute power to [12:32.930 --> 12:36.590] look for extraterrestrial intelligence. Bug bounties.
I think that's a whole [12:36.590 --> 12:41.070] topic in and of itself that has an ecosystem set up already. Same with [12:41.070 --> 12:45.210] these, I'll call them professional hacker platforms, such as HackerOne, which [12:45.210 --> 12:50.450] helps facilitate bug bounty programs. I'm also not really talking about [12:50.910 --> 12:55.330] research being done of your own accord. What I really would like to [12:55.330 --> 13:01.190] get to is: how can these communities share better across the boundaries? So a [13:01.190 --> 13:05.270] lot of people call this the valley of death: from, you know, a really cool idea to making [13:05.270 --> 13:08.330] something happen or getting it into a product. This is the gap between [13:09.210 --> 13:13.270] research, whether it's in academia or elsewhere, and how it's translated into [13:13.270 --> 13:17.770] better products, marketable products, something that can [13:17.770 --> 13:21.350] be useful. This isn't new. There are a lot of these incubators. There are these [13:21.350 --> 13:27.370] industry-university collaborations to, you know, generate commercial products. But a [13:27.370 --> 13:32.190] lot of companies are eliminating or seriously scaling back their research [13:32.190 --> 13:38.110] arms. I mean companies like, you know, Microsoft and RSA and Dell and these [13:38.110 --> 13:40.730] kinds of places. They're scaling back, and what they're doing instead is giving [13:40.730 --> 13:44.950] $50,000 grants to academics and saying, you know, we don't want to keep [13:44.950 --> 13:49.410] people full-time on staff. That kind of discovery work was many times done [13:49.410 --> 13:55.050] by the Bell Labs of the world and these big companies. That's not the case so much anymore. So I [13:55.050 --> 14:00.270] ran some of these ideas past four different groups: academics, industry, I'll [14:00.270 --> 14:04.930] say underground, kind of, you know, the hacker community, and government.
My informal [14:04.930 --> 14:08.670] crowdsourcing: about 70% of people were kind of excited, like, that's a good idea, [14:08.670 --> 14:13.630] and about 30%, mostly academics, were like, no way, it would just ruin my [14:13.630 --> 14:19.290] credibility, you know, I don't see that I would get anything out of it. But I did [14:19.290 --> 14:23.050] have quite a number of people who thought, and I'll go into some examples [14:23.050 --> 14:28.650] that have been done in the past, that, you know, as long as you can assure there's no [14:28.650 --> 14:32.750] damage done, and as long as you can assure this is done in a pretty ethical [14:32.750 --> 14:36.950] way, it's a cheap way to get good research, and it's a good way to get [14:37.890 --> 14:43.890] people involved in something official. So, you know, many of us have day [14:43.890 --> 14:47.890] jobs but do kind of interesting researchy stuff on the side. It's a great [14:47.890 --> 14:53.270] way if you want to get promoted or get something, you know, on your CV. It's a [14:53.270 --> 14:56.050] great way if you just have altruistic motives and really want to make a [14:56.050 --> 14:59.570] difference. And it's a great way if you just want, you know, five grand to get [14:59.570 --> 15:03.090] some stuff to tinker around with and help somebody. I mean, I think there's a [15:03.090 --> 15:12.270] lot of different needs that can be satisfied. So just a short overview, if you [15:12.270 --> 15:16.690] don't know how academic security and privacy studies are funded. You [15:16.690 --> 15:22.350] know, I learned a lot going to NSF. But as I mentioned, before about [15:22.350 --> 15:26.490] 1970, the funding was just, you know, you went to your university, you were a professor [15:26.490 --> 15:32.170] there, you got money. One example in the UK is what's called the Royal [15:32.170 --> 15:36.430] Commission for the Exhibition of 1851. It's a granting agency.
They used to [15:36.430 --> 15:39.490] allow awardees to just pursue research. They'd give money and say, pursue whatever [15:39.490 --> 15:42.530] research you want, wherever it goes, that's fine, you'll come up [15:42.530 --> 15:46.670] with something useful. Now you need to have a proposal judged. So say you [15:46.670 --> 15:49.790] go to NSF. You come in, and there's kind of a proposal machine, right? You [15:49.790 --> 15:53.430] come in with an idea, you get a committee of people, they evaluate it, they [15:53.430 --> 15:57.430] give you the money. And private foundations also give a lot of grants. [15:57.450 --> 16:00.970] MacArthur and Gates give a lot of grants. And they put very few obligations on the [16:00.970 --> 16:04.630] academic researchers. It's almost like a gift, as long as you can get the [16:04.630 --> 16:11.550] money. But the metrics of success, by and large, are papers. I'm gonna skip [16:11.550 --> 16:18.710] this just because it's late in the day. So, some obstacles, and I'm sure there [16:18.710 --> 16:22.350] are many; these are some obstacles I have found to engaging the academic [16:22.350 --> 16:26.790] side. A lot of the workshops, academic workshops, are invite-only. You don't hear [16:26.790 --> 16:31.430] about them. Right before I left, I sponsored a really cool one, which was [16:31.990 --> 16:37.190] great. It was great timing: on trustworthy algorithms, you know, fake news. And it was [16:37.190 --> 16:42.470] fascinating. But very explicitly, my boss, kind of the government folks, wanted just [16:42.470 --> 16:46.850] academics. And of course, if you do that, you're not necessarily going to get [16:46.850 --> 16:52.150] interesting opinions and ideas. Failure is considered bad. You'd think in science [16:52.150 --> 16:56.350] that failure is encouraged in experimentation.
But failure is considered [16:56.350 --> 17:00.650] bad, because then you don't have a paper that says you did something interesting. [17:00.650 --> 17:04.930] Whereas if it's really, really incremental, you know, what we'd call just a [17:04.930 --> 17:09.590] hack of something, that's praised because it's gone well. Conferences: if [17:09.590 --> 17:12.590] you haven't been to an academic conference, it's mostly people reading papers that [17:12.590 --> 17:16.890] were already put online. There's not a lot of socializing, except for maybe [17:16.890 --> 17:20.530] finding a collaborator you by and large knew from grad school or some other [17:20.530 --> 17:24.270] conference. The incentive in that world, and I think we have to be very [17:24.270 --> 17:28.650] mindful of what the incentive is, is tenure, if you're an academic, which requires [17:28.650 --> 17:34.890] papers at top conferences, publishing often. At many places, you know, the MITs and Stanfords [17:34.890 --> 17:40.950] nowadays, in about the past five years, they want you to show that you have some [17:40.950 --> 17:46.230] form of startup or entrepreneurial experience. And of course, getting grants [17:46.230 --> 17:50.570] from funding agencies; it's all about how much money you can bring in. So the [17:50.570 --> 17:55.410] driving factor, you know, if you look at this in a negative way, is not necessarily [17:56.170 --> 18:01.370] making security and privacy better in a tangible way. You know, we can look at [18:01.370 --> 18:04.670] that and say, well, that is what they're doing, but in a very tangible way the [18:04.670 --> 18:08.890] goal is to get tenure, and that's by doing these things. Some obstacles to [18:09.750 --> 18:13.390] engaging, I'll call them, non-academic researchers: cons like this are [18:13.390 --> 18:17.370] culturally a little hard to navigate if you don't know people, or you're [18:17.370 --> 18:23.530] introverted, or, you know, maybe you don't have something to do at one of the villages.
[18:23.530 --> 18:26.790] So I think for a lot of people, they're culturally hard to get [18:26.790 --> 18:32.650] into. A lot of people don't attend these events, so their ideas or their work is a [18:32.650 --> 18:36.170] little hard to track down. And of the people who do attend, many don't give talks; [18:36.170 --> 18:39.910] they don't, you know, broadcast their ideas. Of course, we know not all motives [18:39.910 --> 18:47.290] are altruistic. A lot of the work, however, you know, I think is done by many [18:47.290 --> 18:50.710] people for personal satisfaction, for the challenge. It's not necessarily that [18:50.710 --> 18:55.910] you want, you know, car companies to fix things; it's for your own [18:55.910 --> 19:00.130] challenge, to show that you had an idea and it came to fruition. And I think a [19:00.130 --> 19:03.150] lot of people lack the interest and time for collaborating with any kind of [19:03.150 --> 19:09.770] official entity. I won't read all of this, but one question here is, [19:09.770 --> 19:13.110] well, can I just read the research and get inspired? And it's interesting. I've [19:13.110 --> 19:16.410] noticed the past four to five years, and I've heard it already today, and I've [19:16.410 --> 19:20.750] only been here since mid-afternoon, that a lot of people do read academic papers or, [19:20.750 --> 19:26.190] you know, want to be able to get to the original source. So: academic conferences [19:26.190 --> 19:29.690] might publish the full papers online. Workshops occasionally have the white [19:29.690 --> 19:33.830] papers online. Those two things, actually, I find easier to digest, [19:33.830 --> 19:42.450] oftentimes because they're shorter.
Since 2013, agencies [19:42.450 --> 19:48.890] that fund over 100 million dollars a year in R&D, research and development, have to [19:48.890 --> 19:54.210] make publicly funded research data accessible to search, retrieve, and analyze. They're [19:54.210 --> 19:59.530] still figuring out how, but it's via this memo. Journals, however, are almost always [19:59.530 --> 20:03.170] behind a paywall. It's a big problem with a lot of debate. There's [20:03.170 --> 20:06.630] one publishing company where you have to pay them to publish, and you have to pay them [20:06.630 --> 20:11.830] to read. So there's a workaround called Sci-Hub, which has 64 million [20:11.830 --> 20:16.750] articles up there, and they claim to have 85% of the paywalled scholarly articles [20:16.750 --> 20:20.170] up there for you to read. So a lot of people use that as a workaround through [20:20.170 --> 20:25.310] the paywall. And I found this really interesting quote: people have, you [20:25.310 --> 20:28.490] know, time to become experts on quackery and pop science. Wouldn't it be nice to [20:28.490 --> 20:31.690] start seeing them take up actual science as a hobby and be able to read [20:31.690 --> 20:38.190] these papers? There's a really cool article in The [20:38.190 --> 20:41.090] Atlantic from a few weeks ago along this line, saying the more [20:41.090 --> 20:45.030] sophisticated science becomes, the harder it is to communicate results. So if you [20:45.030 --> 20:48.710] try to read some of these papers, the concept is often very simple, but the [20:48.710 --> 20:53.710] communication of it can be very onerous. So it's a [20:53.710 --> 20:57.390] fascinating article. It's kind of a long read, but very interesting.
I just noted, [20:57.390 --> 21:00.750] you know, if you're not familiar with these folks, I wasn't, but there's some [21:00.750 --> 21:04.370] really interesting work that's been produced. I just plucked out a [21:04.370 --> 21:08.390] handful of people with some cool projects, who I think would be [21:08.390 --> 21:12.690] interesting takers on this idea. Yoshi Kohno, who's at the University of Washington: he [21:12.690 --> 21:18.550] had the article on the top, about encoding malware into a strand of DNA. [21:18.550 --> 21:22.770] Damon McCoy does all this cool stuff looking at the Silk Road and the dark [21:22.770 --> 21:26.530] web, and analyzing Craigslist rental scams and Nigerian gangs and things like [21:26.530 --> 21:31.370] that. And reshipping and mule scams: how your Visa card gets used in reshipping [21:31.370 --> 21:37.210] scams. Stefan Savage has this awesome project on measuring cybercrime, [21:37.210 --> 21:40.190] actually how much money is made off the dark web and cybercrime. People talk [21:40.190 --> 21:44.130] about this, you know, but he's actually measuring how much money is made there. [21:44.310 --> 21:50.490] He also did the remote car hacking in 2010. So what if we brought the two sides [21:50.490 --> 21:56.050] together? You know, we see this time gap between academic stuff [21:56.050 --> 21:59.790] and stuff you see at these cons. So, you know, if you went to DEF CON last year, [21:59.790 --> 22:03.930] the Voting Village had tons of press attention saying it was a first-ever look [22:03.930 --> 22:07.650] at voting security, and they had all these voting machines for the first time, and [22:07.650 --> 22:11.010] people hacked into them. But people have actually been doing this work for [22:11.010 --> 22:15.230] years. It's just not published. It's not publicized very well.
There are [22:15.230 --> 22:21.910] tons of, you know, kind of obscure journal articles and different workshops on how [22:21.910 --> 22:25.990] people have hacked voting machines. But again, you know, were we to get that [22:25.990 --> 22:30.230] out sooner, were we to marry it up with people who could make a splash in [22:30.230 --> 22:35.330] the press, you know, perhaps that could make a difference. This has nothing to do [22:35.330 --> 22:38.590] with security, but I thought this was a super cool project. This guy came to talk [22:38.590 --> 22:43.550] to us. He's a computer scientist at Brandeis. It's a project called Digital [22:43.550 --> 22:48.870] Amati. And he is a computer scientist, a mathematician, and for whatever reason, [22:48.870 --> 22:51.330] even though he doesn't play an instrument, he wanted to create the perfect [22:51.330 --> 22:55.650] cello. So he spent like five years tracking down these artisans and people [22:55.650 --> 23:01.250] who make cellos. They're more into the art of it, not so much the [23:01.250 --> 23:04.330] equations of it. Well, he's really into computational thinking. So he thought, [23:04.330 --> 23:08.030] what if he could bring together his computational thinking and Euclidean [23:08.030 --> 23:14.250] geometry with the art of the cello and the science of instrument design? [23:14.350 --> 23:18.550] So he built this thing called a geometry engine, and he wanted to design the [23:18.550 --> 23:22.990] perfect cello, to sound like these, you know, ones that were made in the 18th [23:22.990 --> 23:27.730] century. And it took him years before people would even let him near this famous one. He wanted [23:27.730 --> 23:31.910] to do a CT scan to see, you know, what does it really look like? It took him [23:31.910 --> 23:34.990] like three or four years. He had to fly to Italy on his vacation time and [23:34.990 --> 23:38.690] convince them that he was worthy of doing it.
They said, you don't even play [23:38.690 --> 23:41.730] the cello, why do you want to do this? But it was his fascination with the [23:41.730 --> 23:45.810] perfect shapes and angles. And as it turns out, they now have a wonderful [23:45.810 --> 23:50.530] collaboration of art and music and computation. So I thought [23:50.530 --> 23:56.250] that was just kind of an inspiring example. Some prior efforts: at DARPA, [23:56.550 --> 24:00.710] when Mudge was there, there was a program called Cyber Fast Track that gave micro [24:00.710 --> 24:05.610] grants to just kind of average people. He noticed that cyber [24:05.610 --> 24:09.670] incidents kept increasing over the five years prior to his [24:09.670 --> 24:14.190] going to DARPA, but federal spending on security was increasing [24:14.190 --> 24:17.910] too. So what can we do? One of his answers was to give [24:17.910 --> 24:21.970] out these micro grants as an alternative to traditional funding, and [24:21.970 --> 24:27.530] they averaged a week from proposal to handing over the funds. One of the projects [24:27.530 --> 24:32.230] that came out of it was the car hacking research that was presented in 2015. [24:33.370 --> 24:36.830] Other efforts, I just wanted to note these: there are things like [24:36.830 --> 24:39.850] experience tracks at academic conferences, where you're not an academic [24:39.850 --> 24:45.390] but you have experience as an operator, and case study tracks at human- [24:45.390 --> 24:51.630] computer interaction and AI conferences. So there's other thinking [24:51.630 --> 24:56.730] along these lines, but nothing really formal. Some more efforts: bug bounties. I Am The [24:56.730 --> 25:00.650] Cavalry is kind of a nascent, or I guess not so nascent, effort promoting [25:01.230 --> 25:06.370] public safety research.
There's a lot of public-private threat sharing, the ISACs [25:06.370 --> 25:10.290] and different places that share threat indicators. I thought this was [25:10.290 --> 25:15.130] cool, this workshop in London where they were trying to get [25:15.130 --> 25:19.830] academics and business people together. They had a number of ideas, like [25:19.830 --> 25:23.190] pairing up a business person with an [25:23.190 --> 25:28.670] academic to bounce ideas off each other, and hackathons extended beyond just the coding phase, so [25:28.670 --> 25:32.390] you have an opportunity to learn and pitch and talk [25:32.390 --> 25:36.390] to each other, things like that to make it more sustainable than [25:36.390 --> 25:40.630] just a hackathon weekend. So what can we do within the current model? Some [25:40.630 --> 25:44.190] adventurous faculty have said they'd be happy to sponsor non- [25:44.190 --> 25:49.190] academics to participate and give them micro grants, encouraging particularly [25:49.190 --> 25:52.930] undergrads, who don't have a lot of these boundaries in the way they think. A lot [25:52.930 --> 25:56.090] of the government agencies could easily include this underground [25:56.090 --> 26:01.090] component, and I think we should be inviting non-academics to workshops. So in this [26:01.090 --> 26:04.450] underground area, what can be done? Some of these [26:04.450 --> 26:08.970] academics who have been pushing the boundaries, [26:08.970 --> 26:14.050] explicitly involve them. They cannot do offensive research with their grants; [26:14.050 --> 26:18.150] they're banned from doing that. So this is an opportunity. Many of the ones I [26:18.150 --> 26:22.270] spoke to said, this is a great way: we can't do it, but we can give you a micro grant [26:22.270 --> 26:26.630] and you can try it and tell us what happened.
So from their [26:26.630 --> 26:30.550] point of view, this is a win-win. Maybe there's some matchmaking that could [26:30.550 --> 26:35.430] happen at these different cons. It's interesting: more than [26:35.430 --> 26:40.350] a handful of academics I've run across, [26:40.350 --> 26:45.570] and some of them are crypto experts, say that not only do they not use PGP, [26:45.570 --> 26:50.190] they can't figure out many of the open source tools that many of us use, [26:50.190 --> 26:54.330] simple things like LastPass and other tools, because they don't have the practical hands-on [26:54.330 --> 26:59.110] experience. So can we school them on the practical usability design of [26:59.110 --> 27:03.690] some of these tools? I think part of the challenge for us, [27:03.690 --> 27:08.930] as a kind of non-professional community, is how do we broadcast [27:08.930 --> 27:14.090] what we know and find out. Many in this world are [27:14.090 --> 27:18.150] media-savvy people. Some are not; they don't want to be broadcasting things. Is [27:18.150 --> 27:21.530] there some kind of middle ground to help broadcast some of the [27:21.530 --> 27:27.710] results? Are there things that haven't been tried? A journal? With journals, the [27:27.710 --> 27:31.130] challenge is, who would peer review? It takes a lot of time. Who would put the time [27:31.130 --> 27:36.510] into organizing it and getting reviews? It's a great place to show off your work, but it [27:36.510 --> 27:40.830] takes a lot of work to write, to read, to peer review. Maybe a virtual journal. [27:41.510 --> 27:47.490] Maybe for some of these CTFs, bringing academics and hackers together as equals: [27:48.770 --> 27:53.250] setting up, participating, doing the actual grunt work is where, you [27:53.250 --> 27:58.030] know, we learn a whole lot.
Maybe there's a new joint neutral venue or [27:58.030 --> 28:02.810] conference, physical meetings and virtual meetings, to bring people together. [28:04.190 --> 28:07.310] Does the model work elsewhere? I just want to bring this up. I was [28:07.310 --> 28:11.910] involved in the bodybuilding world, and this is very prevalent in sports [28:11.910 --> 28:15.650] performance, at least in bodybuilding. I put up there a picture of Mr. [28:15.650 --> 28:20.250] Olympia contenders and supplements. They have people called gurus, which [28:20.250 --> 28:22.610] makes you think of the traditional guru. The gurus are the [28:22.610 --> 28:26.110] guys with the green check, guys who are really good at [28:26.110 --> 28:32.530] figuring out supplements, training, timing, all this stuff. So that's a very [28:32.530 --> 28:35.810] interesting convergence: you have people formally trained [28:35.810 --> 28:40.470] in this stuff, but they're not necessarily the ones the athletes are using. Then there's [28:40.470 --> 28:44.750] the biohacking-type stuff, quantified self. This guy Larry Smarr is an [28:44.750 --> 28:48.710] academic computer science professor who does a lot of this quantified self work, and [28:48.710 --> 28:52.990] he's got the tools at hand, a lot of expensive tools to [28:52.990 --> 28:57.150] measure different things about his body. And you have people who go to meetups [28:57.150 --> 29:00.690] doing a lot of the same kind of thing, just on a different level. [29:00.690 --> 29:07.170] Could we get a feedback loop between the two? So that's my idea. Thank [29:07.170 --> 29:09.990] you for sticking with me and listening through it. If you have comments or [29:09.990 --> 29:15.010] thoughts, or think it's stupid or great, or want to make something happen, let me know.