[00:55.520 --> 00:56.780] Hi, everybody. [00:56.780 --> 00:57.660] Thanks for joining. [00:58.420 --> 01:02.600] So we are going to get started with this pretty scant room. [01:03.200 --> 01:04.540] We're the EFF. [01:05.020 --> 01:08.000] The spotlight makes it exciting, though, if you're up here. [01:09.100 --> 01:14.660] Typically we have a lawyer present with us, since the EFF is a law firm. [01:14.880 --> 01:21.140] Unfortunately, we did not have a lawyer who was able to join us this time, but we do have representatives from the technology team. [01:21.140 --> 01:22.580] And the activism team. [01:22.580 --> 01:23.280] Yeah. [01:24.080 --> 01:29.760] So a bit about the EFF, if you're not familiar with us: we are the leading digital civil liberties organization. [01:29.760 --> 01:33.800] We have been fighting for people's rights online for more than 30 years. [01:33.800 --> 01:39.860] That looks like various thrusts of work in activism, legal work, and technology. [01:39.860 --> 01:55.720] So that looks like sometimes suing the government for warrantless surveillance of citizens, going after law enforcement agencies for doing the same thing, representing security researchers to make sure that you all have the right to do your work and be protected and not get caught under shitty lawsuits, [01:55.720 --> 01:59.940] or creating open source technology projects, which I'll cover a little bit in my section. [01:59.980 --> 02:02.940] And then activism work, which Jose will cover as well. [02:02.940 --> 02:03.200] Yeah. [02:03.200 --> 02:04.160] So my name is Jose. [02:04.160 --> 02:06.920] I've been at EFF for four years. [02:06.920 --> 02:10.940] I'm one of the grassroots advocacy organizers on the activism team. [02:11.380 --> 02:22.660] And generally speaking, the activism team does advocacy at the government level, federal, state, and municipal, as well as at the international level. [02:22.660 --> 02:32.620] We have staff in the European Union and staff in Latin America who pay very close attention to mostly international agreements that are happening in those realms. [02:32.620 --> 02:49.760] And then we also do lots and lots of advocacy around institutional policy, corporate policy, and governance, and then try to push the best practices or push against the worst actors in those kinds of realms. [02:51.860 --> 02:57.860] I'll just say my issue areas are mostly what we call street-level surveillance. [02:57.860 --> 03:06.500] Street-level surveillance is this massive, almost all-encompassing (it feels like, for me) question of law enforcement surveillance tech. [03:06.500 --> 03:19.160] So that's mostly police at the municipal level and the state level, but it also includes border-based technology and some level of technology in the carceral system as well. [03:19.500 --> 03:23.300] You know, I also work on electronic monitoring, for example, within the justice system. [03:23.300 --> 03:29.500] And then I also work on workplace tech and surveillance in the workplace as well. [03:31.420 --> 03:35.500] And on the technology side, well, the public interest technology team is what we call it. [03:35.500 --> 03:37.920] I am a senior staff technologist on that team. [03:37.920 --> 03:46.460] I was originally hired to help maintain one of our open-source software projects, Privacy Badger, which is a browser extension that blocks online tracking.
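For context on how that blocking works: Privacy Badger is heuristic rather than list-based. It watches which third-party domains appear to track you across distinct first-party sites, for example by setting cookies, and once a domain crosses a small threshold (three sites, in Privacy Badger's published documentation), it starts blocking that domain. Below is a minimal Python sketch of the idea; the function names and the simplified "sets a cookie" test are hypothetical stand-ins, not Privacy Badger's actual code.

```python
# Hypothetical sketch of Privacy Badger-style heuristic tracker blocking:
# a third-party domain seen tracking on enough distinct sites gets blocked.
from collections import defaultdict

BLOCK_THRESHOLD = 3  # Privacy Badger's documented default is three sites

# third-party domain -> set of first-party sites it was seen tracking on
sightings: defaultdict[str, set[str]] = defaultdict(set)

def observe(first_party: str, third_party: str, sets_cookie: bool) -> None:
    """Record a third-party request that looks like tracking (here: it set a cookie)."""
    if sets_cookie and third_party != first_party:
        sightings[third_party].add(first_party)

def should_block(third_party: str) -> bool:
    """Block once the domain has been caught tracking on BLOCK_THRESHOLD sites."""
    return len(sightings[third_party]) >= BLOCK_THRESHOLD

# tracker.example follows a user across three unrelated sites...
for site in ("news.example", "shop.example", "blog.example"):
    observe(site, "tracker.example", sets_cookie=True)
assert should_block("tracker.example")  # ...so it now gets blocked
```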
[03:46.520 --> 03:58.460] I've since transitioned off of that team, and I'm now sort of a broad-spectrum technologist, where I inform a lot of our policy issues, or I basically tell politicians when they don't understand the internet correctly. [03:58.580 --> 04:09.100] And I do co-lead our Reproductive Justice Working Group, where we talk about health information, data privacy rules, and what the technology looks like. [04:09.100 --> 04:14.120] And then I do a bunch of privacy and security trainings, or resourcing, for at-risk communities. [04:14.120 --> 04:26.000] So that involves people who are involved in abortion access, but also journalists, activists, some security researchers who are in niche areas whose OPSEC is threatened by state actors, and so on. [04:26.020 --> 04:33.340] And then one of the main ways that I get my hands dirty during the workday is I work with what we call the Electronic Frontier Alliance. [04:33.340 --> 04:48.640] So the Electronic Frontier Alliance is a network, a completely decentralized network, of local domestic groups in the United States, only in the United States and its colonies, that do local programming and local advocacy. [04:49.240 --> 05:03.580] You know, some of them are hacker and maker spaces, some of them are student and campus groups, and some of them are specifically advocacy groups that want to push their local governments or their state governments on surveillance and tech issues, on broadband access, [05:03.580 --> 05:06.240] on right-to-repair legislation. [05:06.340 --> 05:21.240] And then there's no money that goes back and forth between these two things, but EFF really does need the alliance, needs these kinds of local levers, because it not only allows us to communicate to local groups what we're talking about, what we're thinking about, [05:21.240 --> 05:29.420] and what we're seeing, the trends that we're seeing, either nationally or in local areas, but it really is our ears on the ground, right? [05:29.420 --> 05:38.640] So we get tons of groups around the country who then come back to us and say, this is legislation that we're seeing crop up in Atlanta or, you know, in Minneapolis-St. [05:38.640 --> 05:39.280] Paul. [05:39.420 --> 05:48.520] Or, you know, this is an issue that we're having, and we don't yet know how to do advocacy around it, so can you give us basic advocacy training or support? [05:48.880 --> 06:04.060] We want to do more Know Your Rights workshops or OPSEC workshops, and we don't yet know how to do that. Or we're running into a roadblock with people who either feel like there's no reason to do it because we don't have privacy anyway, or, [06:04.060 --> 06:11.020] of course, the other side of it, which is people who feel like we need to be extremely paranoid all the time instead of being functional. [06:11.020 --> 06:13.140] So we talk with local groups all the time. [06:13.140 --> 06:20.360] We offer advice and advising, and sometimes get our hands dirty right in the local struggles that they're engaged with. [06:20.980 --> 06:25.260] I should mention, on that, that the EFA is fantastic. [06:25.620 --> 06:45.260] If you are working with an organization or any kind of collective of technologists or people who are impacted by digital surveillance, just check out the EFA and consider joining, because it's sort of like mutual aid, where the collective benefit is larger than everyone's individual input.
[06:45.840 --> 06:59.160] It's incredible. And, as a digital privacy and security trainer, or someone who does a lot of education, I sometimes get pulled into these conversations, because one of the things that we lead with, the few of us at EFF who do that type of work, [06:59.160 --> 07:13.960] the digi-sec trainings or the OPSEC trainings, is not parachuting in out of nowhere, not being a community member, not understanding the struggles those people are going through, and instead enabling them to understand the technical concepts or the legal risks that they're facing, [07:13.960 --> 07:16.280] and basically training the trainer. [07:16.280 --> 07:28.960] That's exactly right. And one of the problems is that if you do have a parachute-in kind of mentality, you can occasionally win victories. Less so when it comes to law enforcement surveillance tech, that's a very, very difficult place to win a victory, [07:28.960 --> 07:41.200] but there are victories that you can win. But you're not going to be able to maintain them. A lot of our work, and a lot of the kinds of policy angles that we sometimes have, requires there to be local experts, [07:41.200 --> 07:44.880] local expertise, and that means that somebody's got to be paying attention. [07:44.880 --> 07:46.140] Is there enforcement? [07:46.860 --> 07:48.600] Has it been defanged? [07:48.600 --> 07:52.820] Has legislation, or a policy at a university or other institution, been defanged? [07:52.820 --> 08:03.080] So you really need people on the ground who are steadily maintaining the victory, even when you get the victory, to make sure that you don't lose it, because it's very, very easy to lose. [08:03.080 --> 08:06.520] So we value these kinds of groups incredibly. [08:06.520 --> 08:22.040] Just a few of the ones that we have in the region: we're pretty strong in Minnesota and Illinois, we're still working on Wisconsin, although we have some interest from a few different groups here, but we work with Lucy Parsons Labs and the Citizen Network of Protection in Chicago, [08:22.040 --> 08:24.320] we work with Privacy Watch in St. [08:24.320 --> 08:44.740] Louis, we work with Restore the Fourth Minnesota and the Minneapolis Tech Network, and then we also work with a lot of groups that are locally based but create tech for movements, for communities, for users and consumers, including Calyx and May First. [08:44.740 --> 08:47.780] And Calyx, for example, does incredible work. [08:47.780 --> 08:55.920] We talk to them a lot about their work to create an OS that is much more privacy-focused and privacy-directed, and also not-for-profit. [08:55.980 --> 09:04.340] All of the EFA members are not-for-profit, like all the rest, and that means that it really is a labor of passion. [09:04.340 --> 09:12.440] They're doing it for their passion, because they believe in privacy and they believe in making sure that all users have access to it, not just people who really are in the know. [09:12.440 --> 09:14.240] And I am not one of the people who's in the know. [09:14.240 --> 09:21.220] I'm not a technologist, so... Well, that's our spiel about what we do, sort of on the day-to-day. [09:21.220 --> 09:24.920] I should say, though, that this is a Q&A forum right now, right? [09:25.060 --> 09:28.920] I'm going to open it up in just a moment, but I should say that neither of us are lawyers.
[09:28.940 --> 09:32.840] Please don't come to us for legal advice or any sort of expertise in that area. [09:32.840 --> 09:38.060] If you do need legal representation or advice, please go through the regular intake forms we have at EFF. [09:38.220 --> 09:47.020] You can use intake@eff.org, which is one email address that answers those questions, or you can reach out to us directly at our email addresses, which are on the EFF website. [09:47.020 --> 09:49.080] But just, this is not the forum for that. [09:49.400 --> 09:51.400] Okay, so let's open it up. [09:51.400 --> 09:53.640] Any questions you have about the EFF and our work? [09:54.040 --> 09:54.560] Yes? [09:54.560 --> 09:57.740] So, I know you all mentioned that you do a lot of work. [10:03.680 --> 10:05.600] We're a C3. [10:05.600 --> 10:07.160] We're non-profit, yeah. [10:07.160 --> 10:10.720] But we are not... we don't take partisan issues. [10:10.720 --> 10:11.980] We don't take partisan stances. [10:11.980 --> 10:13.700] We can't endorse candidates. [10:13.760 --> 10:17.280] And EFA members, there are some EFA members that do. [10:17.280 --> 10:24.360] We obviously don't collaborate with them on anything that's political, and their politics run the gamut, because we're non-partisan. [10:24.360 --> 10:30.300] It's really, you know, it's a question of fighting for the basic principles locally, whoever's doing it. [10:30.680 --> 10:33.140] But we are a C3, so. [10:33.140 --> 10:37.860] But we do do a lot of lobbying for, like, privacy bills that are good. [10:37.860 --> 10:44.960] And then, more often, against a lot of bills that are bad, that are impinging on people's right to privacy, so. [10:44.960 --> 10:52.140] And we also occasionally, in the EFA, will do trainings, for example, on administrative questions. [10:52.140 --> 11:00.740] And one of the big questions we often get is incorporation questions, and then questions of how much can a 501(c)(3) do in terms of advocacy? [11:00.740 --> 11:02.900] And, you know, actually it's quite a lot. [11:02.900 --> 11:10.300] If you do a lot of other stuff, and if you track the numbers and you pay attention, then there's a decent amount that a 501(c)(3), like EFF, can do. [11:10.300 --> 11:22.100] And since we have so much litigation, and since we have so much tech and public interest tech that happens at EFF, our advocacy never reaches the limit for the IRS to care about. [11:22.100 --> 11:27.000] So we track it and we pay attention to it, just as we'd advise any other C3 to do. [11:30.120 --> 11:30.960] Yes? [11:30.960 --> 11:32.320] So, it seems [11:38.680 --> 11:42.760] to be difficult, since we can't get Congress to decide on anything. [11:43.480 --> 11:48.700] It looks like the last few years we have seen a lot of movement at the state level, from various parties. [11:48.700 --> 12:00.600] Do you anticipate that you're going to see a break in the logjam at the federal level any time soon, or is the state level where we anticipate the action to actually be? [12:01.720 --> 12:10.800] I think we can both answer this a little bit, but on the federal level, it usually depends on the regulatory framework. [12:10.800 --> 12:23.680] So on the regulatory end, you can get the FTC, the FCC, the CFPB, the NLRB to issue guidance, sometimes to create rules that go through a whole democratic process. [12:24.260 --> 12:27.660] And sometimes you can't.
[12:27.660 --> 12:33.200] And right now is one of those moments where the regulatory framework is not going to be how we win any victories in Washington. [12:33.240 --> 12:49.520] But there certainly are situations where there may be legislation that comes up in Washington, and some Congress members may come to us, or we may try to talk to their staff to just clue them in on some of the technical details, and some of the legal technical details. [12:50.260 --> 12:58.900] But you're right, Washington is very infrequently the place to get things done from a democratic place. [12:58.920 --> 13:03.880] Small-d democracy, by the way; I'm not a Democrat myself. [13:04.840 --> 13:09.620] On the other hand, I think at the state level, we do see a lot of policies come up. [13:09.620 --> 13:13.300] We do see a lot of proposals in certain states, right? [13:13.300 --> 13:16.900] And it is a pretty small collection of states. [13:16.900 --> 13:28.440] And usually what happens is that there will be an important state, California or Virginia or Texas, that introduces a new piece of legislation, and then other states will start to glom onto it. [13:28.440 --> 13:38.000] So what they start to introduce is this piece of privacy, or health care privacy, or data privacy legislation that might have been introduced in Virginia. [13:38.020 --> 13:43.200] And then we'll say, well, that makes our job a little bit easier, because we already know we don't like the Virginia legislation. [13:43.200 --> 13:47.660] So now let's read this, and we read a lot of state legislation all the time. [13:47.860 --> 13:56.420] A lot of stuff that EFA members send to us, a lot of stuff that we find on our own, and then we have a very strong California-based team, because we're San Francisco-based. [13:56.420 --> 13:58.780] I'm in New York, but we're San Francisco-based. [13:58.780 --> 14:02.080] A very strong team in California. [14:02.580 --> 14:12.240] So there's new legislation all the time, and there is new legislation that has gotten passed in Illinois and New York and California a little bit. [14:12.240 --> 14:20.680] I think a lot of the problem is that the state legislatures don't know the tech any more than most politicians do. [14:20.680 --> 14:25.160] So they all wanted to do, for the last couple of years, AI legislation. [14:25.160 --> 14:27.720] They all wanted to create something around AI. [14:27.720 --> 14:28.840] They didn't know what. [14:28.840 --> 14:44.460] That was about the level of technical expertise that they had, even knowing what AI was, mostly because AI is full of little cottage industries, vendors and firms that want to sell a product. [14:44.600 --> 14:52.220] And so that's who the state legislatures are hearing from if they're not hearing from local community groups, technical groups, and groups like EFF. [14:52.220 --> 15:07.860] So it's starting to trend in a few different places, but you have to be very watchful, because there are frequently lots and lots of exemptions on data privacy questions, sometimes in realms where there really shouldn't be, like health care or education and protection of youth. [15:08.080 --> 15:20.880] And we also see a lot of anti-privacy legislation, which tends to make a much bigger wave, with a lot more states starting to look at it. [15:22.850 --> 15:35.370] To be honest, I don't understand the political system well enough, certainly not enough, to augur the outcome or likelihood of comprehensive federal data privacy legislation passing.
[15:35.370 --> 15:44.610] I know that's what we need, and I do work with communities and coalition groups enough to understand that everyone kind of wants that. [15:44.610 --> 15:59.230] No one quite yet shares the same language about what it looks like, and we are deliberately misled to believe that fighting for privacy is somehow a partisan issue, because it's given to us in various issue spaces that are caught up in culture war topics. [15:59.230 --> 16:05.390] But when you talk to anyone about the impacts of things like data brokers, they get fired up and they don't like it. [16:05.390 --> 16:09.130] And that's a common experience that everyone shares. [16:09.530 --> 16:12.690] Non-partisan, yeah. [16:13.350 --> 16:18.210] So I can't predict the likelihood of it happening. [16:18.210 --> 16:19.410] I know we need to. [16:19.410 --> 16:27.750] I know what it looks like to get there, which is public education around these issues and demystifying the bullshit that we're fed about what privacy looks like online. [16:28.730 --> 16:42.930] I think one other thing, on the question of partisanship: tech and privacy, in my experience at least, is not something where you can ever assume one party or the other is going to be correct. [16:42.930 --> 16:51.530] So there are some states where you're going to get one party really gung-ho behind something and the other party pushing back, and there are other states where it's the opposite. [16:52.030 --> 16:58.710] And sometimes that's because an industry is based in that state, or there's some other kind of local interest. [16:58.710 --> 17:00.450] Some of it is just ignorance. [17:01.050 --> 17:04.270] But anything's partisan if you phrase it right. [17:04.270 --> 17:10.210] And privacy, transparency, and security, insofar as they're partisan, can go both ways. [17:10.210 --> 17:20.330] They could be partisan for the Democratic Party, the Republican Party, or someone else altogether, because there are libertarians everywhere; there are people who care about civil liberties and civil rights everywhere. [17:20.330 --> 17:30.710] The point is to bring them together, get them in coalition, and then actually understand the technical aspects of the legislation that they're looking at, to get it right. [18:09.620 --> 18:20.100] Yeah, I think there have been a few things that I've been paying attention to, and that some of the other EFF staff I collaborate with have been paying attention to. [18:20.100 --> 18:23.620] Some of it is big on the data privacy front. [18:23.820 --> 18:37.120] Some of it is big on the actual federal workers, and we're talking about hundreds of thousands of workers, you know, whose incomes sometimes support other workers around them where they live. [18:37.260 --> 18:46.520] And so there's some question of how much workplace privacy and workplace protection these workers in the federal government can expect. [18:46.520 --> 18:49.960] And then there are also hits to, like I said, the regulatory framework. [18:49.960 --> 19:07.180] We had about three or four agencies that were thinking very concretely about how to use existing law, legislation that's been passed in Congress and therefore isn't something that's untouchable, to protect workers in the workplace on data questions, so that workers have access to their data. [19:07.180 --> 19:11.980] If there is disciplinary action, workers can push back on it.
[19:11.980 --> 19:17.780] And if there are any other kinds of privacy rights that a worker has at the federal level, they get enforced. [19:17.780 --> 19:20.060] And enforced by the right agencies. [19:20.180 --> 19:21.500] That's been cleared out. [19:21.500 --> 19:27.080] So in the last few months, that's not something that we're going to be able to see. [19:27.140 --> 19:30.360] There are civil service workers who are working really, really hard. [19:31.140 --> 19:35.200] I used to do a lot of stuff around the CFPB in a previous job. [19:35.680 --> 19:42.700] And so there are people still in the CFPB, and a number of these other agencies, who have been able to fight to protect their jobs and are civil service workers. [19:42.700 --> 19:59.060] So if somebody complains, for example, about a data issue, that a bank leaked data or there's a security issue with their accounts, there are still people in those agencies who are trying to enforce the current law and enforce the current regulations. [19:59.880 --> 20:02.860] But it's taken a huge hit. [20:02.860 --> 20:16.700] And I think when you gut the workforce, and you especially gut enforcement and a lot of the sections of these agencies that they've been gutting, the law is just not going to get enforced. [20:19.320 --> 20:32.020] I would add to that that recently I was at a conference, an international digital rights conference, that features a lot of groups who do digital rights work internationally. [20:32.420 --> 20:32.500] And [20:36.420 --> 20:41.940] the temperature of the rooms that I was in was very hot and sad. [20:41.940 --> 20:50.320] People were angry and upset that their funding had been cut and that these were probably their last moments in this line of work. [20:50.360 --> 20:57.280] And that is just emotionally devastating to be around, and to try to gauge what the impact of it will be. [20:57.280 --> 21:09.780] What I do know is that it makes my work significantly more difficult, because then I feel burdened with the work that is left behind as these people lose funding. [21:09.780 --> 21:12.160] These digital rights organizations are losing funding. [21:12.840 --> 21:16.460] And that sucks, because I have enough work. [21:16.580 --> 21:18.860] And the people I work with have enough work. [21:18.860 --> 21:20.540] We have plenty to do. [21:20.940 --> 21:25.420] So we have yet to really measure the impacts of it, other than I know it will be bad. [21:25.420 --> 21:28.680] We are also... we're not the legal team, again. [21:28.680 --> 21:32.900] That is almost half of... about half of the program staff at EFF. [21:32.980 --> 21:41.760] But the legal team has joined lawsuits specifically on data, and the data collection that DOGE has tried to pull from various agencies. [21:42.900 --> 21:44.520] We are suing DOGE. [21:45.060 --> 21:48.340] And that is something that literally affects us all. [21:48.340 --> 21:49.700] I still haven't done my taxes. [21:49.700 --> 21:54.000] But at some point... well, it will be very soon. [21:54.100 --> 21:55.340] Yeah, exactly. [21:55.340 --> 21:56.600] Plenty of time. [21:57.020 --> 22:00.160] So we are actually in the fray a bit. [22:00.420 --> 22:02.640] We may not be the best people to talk about that. [22:02.640 --> 22:06.720] But we are in the fray on the... specifically on the DOGE question. [22:06.720 --> 22:09.080] That's personal data for everybody. [22:09.080 --> 22:09.600] Literally. [22:12.500 --> 22:13.020] So... [22:18.500 --> 22:19.540] LPRs? [22:19.840 --> 22:20.420] Cool.
[22:20.420 --> 22:21.400] I've never heard it like that before. [22:21.400 --> 22:22.100] I like that. [22:22.100 --> 22:22.700] Alpert. [22:22.700 --> 22:23.580] It's a good name. [22:46.000 --> 22:56.180] We're basically urging the California AG to enforce the data privacy laws that prohibit sharing of ALPR data across state lines. [22:56.180 --> 23:07.320] So, people who have to migrate into California to get an abortion or other sorts of criminalized healthcare, which you'll actually hear me talk about if you see me downstairs in the big room at 3. [23:07.320 --> 23:09.520] I'll be ranting about this a lot. [23:10.380 --> 23:17.760] But we caught a bunch of law enforcement agencies sharing ALPR data with out-of-state agencies, which is illegal. [23:17.940 --> 23:26.080] So we started sending letters to them, and to Bonta, the AG in California, to say: hey, this is illegal, this is not okay. [23:26.300 --> 23:31.000] And we are definitely seeing that it's working. [23:31.440 --> 23:36.920] Some law enforcement agencies have stopped doing that, which is good, because it's illegal. [23:37.320 --> 23:40.980] So I know we're kind of like a bee in their bonnet, but that's a good thing. [23:40.980 --> 23:42.360] That's what our job is. [23:42.360 --> 23:44.500] That's why we're a successful organization. [23:44.860 --> 23:45.120] Right. [23:45.120 --> 23:55.260] We have a lot of staff who go to trade shows and have relationships with law enforcement around the country, and those relationships are important, of course. [23:55.260 --> 24:00.420] Obviously we have a lot of relationships with people who are subject to law enforcement activity. [24:00.420 --> 24:03.580] So we kind of hear both sides of these kinds of things. [24:03.680 --> 24:11.100] On the activism team, I can say that when it comes to automated license plate readers, we don't have a formal position for them, [24:11.100 --> 24:13.680] and we don't have a formal position opposed to them. [24:13.680 --> 24:30.520] But we do work a lot with local groups whenever there is a push for local guardrails on them, or a limitation, especially on certain companies, because part of the problem with law enforcement surveillance is that it's not just that local law enforcement agency on their server. [24:30.520 --> 24:32.580] They often don't even have a server. [24:32.580 --> 24:35.540] It's not necessarily completely in their control. [24:35.540 --> 24:44.680] It's shared with a fusion center, which would share it with other law enforcement agencies across the state, at the state level, and then at the federal level. [24:45.000 --> 24:53.480] And they also just have tons and tons of vendors that the data has to go through, or it goes through these various data brokers. [24:53.960 --> 24:59.200] And so when that kind of thing happens, a lot of it is a push of: this is not a safe company. [24:59.200 --> 25:02.760] This is not a company that your municipality can trust. [25:03.320 --> 25:06.040] And it may be around questions of unprotected data. [25:06.040 --> 25:13.640] It may be around questions of a city or a state not criminalizing something that another city or state is criminalizing. [25:13.680 --> 25:17.620] But it's also: how long are they retaining this data, right?
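To make that retention question concrete before the next answer picks it up: the 30-day (or shorter) limits described below are mechanically trivial; the hard part is winning the policy, not writing the code. A minimal sketch against a hypothetical schema, not any agency's real system:

```python
# Hypothetical sketch of a retention-limit purge for ALPR plate reads:
# anything older than the policy window is deleted, warrant or no warrant.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # the kind of cap mentioned in the next answer

db = sqlite3.connect("alpr.db")  # placeholder database of plate reads
db.execute("CREATE TABLE IF NOT EXISTS reads (plate TEXT, seen_at TEXT)")

# Timestamps are stored as UTC ISO 8601 text, so string comparison sorts correctly.
cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()
db.execute("DELETE FROM reads WHERE seen_at < ?", (cutoff,))
db.commit()
```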
[25:17.620 --> 25:38.180] We push back a lot on questions of long-term data retention, such that we've gotten some places to limit their retention of ALPR data to 30 days, or even down to a few days, which, you know, if you leave it up to the law enforcement agencies, they're going to keep it in perpetuity, or they're going to keep it at least for six months. [25:38.400 --> 25:43.300] Even if it has nothing to do with an investigation, an active investigation, even if there's never been a warrant. [25:43.360 --> 25:58.280] And then you add to that that we have these rental companies, and the rental companies, car rental companies, will give license plate numbers to the local law enforcement or state law enforcement that have the ALPRs. [25:58.280 --> 26:16.520] And now they have a hot list of license plate numbers that, A, is riddled with incorrect data, with errors, and, B, has cars on it far, far past the point at which the car was recovered, or the car was returned a day late, you know, or something else to that effect. [26:16.520 --> 26:26.160] So now you've got all these other drivers who get hassled because of this, and their movements are being tracked by all these ALPRs because of these hot lists. [26:26.160 --> 26:35.980] So again, you know, it's pushing back on this kind of data retention and data collection, because the best and safest data is the data you don't collect, right? [26:41.270 --> 26:42.810] Love them? [26:44.570 --> 26:46.230] Yeah, no. [26:48.230 --> 26:51.810] No, it's a despicable business model. [26:52.790 --> 26:55.050] Yeah, yeah, yeah, we're with you. [26:56.690 --> 26:58.270] Other questions? [27:27.320 --> 27:31.220] I would say, at the trade shows, I think that, you know, it's a lot of firms, right? [27:31.220 --> 27:33.000] These are cottage industries. [27:33.000 --> 27:50.240] And so as soon as it's not just, you know, AI or ALPRs, but face recognition software, as soon as this kind of software is discussed, then a company wants to say, how can we create a product, in some cases, though not nearly in all, how can we create a product that we can market this way? [27:50.380 --> 27:54.860] It doesn't necessarily actually do the thing, but it's marketed this way. [27:54.860 --> 28:02.240] And then how do we get it into the hands of the right agencies, the right departments, to make sure that we can get a contract eventually? [28:02.300 --> 28:10.020] And so, for example, we saw this with, we don't have any lawyers present, but we saw this with Evolv, for example. [28:10.020 --> 28:14.940] Evolv, without the final "e," has a weapons detection system instead of a metal detector. [28:14.980 --> 28:25.080] And what they did was they, you know, loaned them to the New York MTA, the Metropolitan Transportation Authority, and to the NYPD. [28:25.080 --> 28:39.020] And they stationed them at some museums, they stationed them at some theaters, they stationed them at some train stations, specific train stations, and for Evolv this would have been big bucks, because think of how many museums and train stations across the city they would have gotten eventually. [28:39.760 --> 28:48.060] So, you know, they said, well, we've got this technology, we're using AI to be able to detect weapons without hassling most of your customers. [28:48.060 --> 28:51.460] Your customers walk right by; they don't have to take the metal out of their pockets. [28:51.500 --> 28:56.920] And the truth is that at the industry trade shows, Evolv was a joke, right?
[28:56.920 --> 29:04.160] We went to those, and they were discussed, this kind of technology was discussed, and people knew very well that it was a joke. [29:04.160 --> 29:13.360] Shortly after that, there was a class action lawsuit by some of the customers and clients of Evolv, because they didn't really have the AI integration that they were claiming. [29:13.360 --> 29:19.600] They were marketing something, usually to government actors and to institutional actors, that they didn't have, right? [29:19.600 --> 29:25.820] A product that, you know, is all marketing; it's not actually the technologists who are expressing it. [29:25.880 --> 29:30.020] So, you know, fortunately in that case, New York pulled back. [29:30.020 --> 29:47.680] I personally went through them a few times in museums and theaters, because I'm a New Yorker, and one time, I think it was at the Guggenheim, the thing went off, and I showed them, you know, everything I had at the security booth, and I had obviously no weapon of any sort, [29:47.680 --> 29:52.000] but also the security folks there were like, yeah, we know this doesn't work. [29:52.100 --> 29:58.380] Nobody who goes through and sets the alarm off has ever had a weapon, and we know about the class action lawsuit. [29:58.960 --> 30:01.900] You know, we're just the personnel here at the moment, you know? [30:01.900 --> 30:05.640] So I think some of it for sure is ignorance, right? [30:05.640 --> 30:24.500] Because as the tech moves fast, it's very easy for us lay people to hear a word or a phrase or a buzzword, and then hear the basic explanation from something on TikTok or on the Internet, you know, that is like an AI voice, and it's not accurate, and then we think we know what we're doing, [30:24.500 --> 30:26.440] and policymakers can be just the same. [30:26.440 --> 30:39.180] But some of it is also that there are vendors out there that are always marketing stuff, and they don't want the folks that they're trying to sell to, or trying to get contracts with, to really understand the technology, because if they understood the technology, [30:39.180 --> 30:43.360] they'd know that this particular product doesn't actually meet the marketing claims. [30:45.200 --> 30:45.780] Wow. [30:45.780 --> 30:46.860] What was that company called? [30:46.860 --> 30:47.460] Evolve? [30:47.460 --> 30:47.760] Evolv, yeah. [30:47.760 --> 30:48.120] Wow. [30:48.120 --> 31:10.280] So that reminds me of a company that, again, to allude to what I'll be talking about at 3 o'clock about bodily autonomy, there's this company called Thorn that does anti-trafficking technologies for both consumer and law enforcement systems, and they have been caught so many times lying about the efficacy of their technologies, [31:10.280 --> 31:18.720] and they've also positioned themselves as a, like, necessary component in legal compliance, but they're a for-profit private company. [31:18.900 --> 31:28.740] It's fucked up, and I think that's a pretty good example of, like, deliberate disinformation put out by them, just, like, lying over and over about the efficacy of their products.
[31:28.960 --> 31:49.120] I think an example of misinformation that I encounter quite often, in the work I do on, like, OPSEC advising for either individuals or organizations who are worried about their digital privacy and security, is this common fear or malaise that they have, what we call, [31:49.120 --> 31:56.500] like, security nihilism, where there's nothing to do, Big Brother knows it all anyway, so why bother doing anything? [31:56.560 --> 32:13.820] Or they are scared off by these, like, extremely sophisticated and confusing, inaccessible technologies that are sold to them as these, you know, these sort of cure-all solutions to their OPSEC concerns, and they don't know what to do, they don't know where to start, [32:13.820 --> 32:30.020] and so they don't do anything, and they do all of their organizing on, like, Google Drive and SMS, you know. And that is hard, because that is a product of misinformation that I think is also kind of pushed by a not-great, for-profit tech industry, [32:30.020 --> 32:35.840] and just a misunderstanding of what information security actually looks like. [32:35.940 --> 32:42.440] But that's, you know, that's just misinformation; it's just an accidental thing that happens, and we work to cure it. [32:42.440 --> 32:57.860] And then one other thing of note, I think, is that some of these firms, you know, the bigger firms, they have legal departments, and those legal departments will go after local press, local activists, and communities that put out the data on how ineffective, [32:57.860 --> 33:17.620] say, Evolv is, because I'll stick to just attacking one company's reputation. But basically, you know, if a big enough company hears that the press is pushing back, or a reporter is saying that there's a 98% ineffectiveness rate for your tech, then, you know, [33:17.620 --> 33:19.460] you need to quiet that down. [33:19.460 --> 33:23.200] And so they do cease-and-desist letters, they do other kinds of litigation. [33:23.280 --> 33:34.080] Sometimes we have stepped in, and we certainly are more likely to step in at the amicus stage of that kind of case. [33:34.080 --> 33:36.260] But, you know, it can be very silencing. [33:36.260 --> 33:49.140] And so some of what we have to do is depend on the industry trade magazines, for example, Police1; you know, police have their own trade magazines, and some of them are very, very honest that this is a product that doesn't work. [33:49.140 --> 33:56.940] And they'll put out the information, because they have the police chiefs and the sheriffs and the various agencies in St. [33:56.940 --> 34:16.140] Louis or Chicago or wherever else, they have those ears, so they can get the data that, you know, this is the level of ineffectiveness of this particular product, and, you know, this is how costly it is nevertheless, and how it's not actually helping with gun violence or whatever issue it's marketing itself as dealing with. [34:16.140 --> 34:27.920] So on the litigation side, we also step in, and certainly refer; we have a wide referral network to try to defend reporters, local activists, and communities in those kinds of cases as well. [34:32.990 --> 34:34.510] Any other questions?
[34:37.470 --> 34:52.350] I'll just note, while people are thinking up questions, if you are, that we also have a few resources that I want to point you to in particular. They come out of the activism team in part, and a lot of it also comes out of the tech team in the case of the first one, [34:52.350 --> 34:54.310] the Surveillance Self-Defense guide. [34:54.310 --> 35:07.590] The Surveillance Self-Defense guide is our most up-to-date and regularly updated tech security guide and website, especially for lay people. [35:07.590 --> 35:10.410] It's ssd.eff.org. [35:10.410 --> 35:11.590] We're very proud of it. [35:11.590 --> 35:14.530] We work really hard on it, especially my colleague Thorin. [35:14.530 --> 35:23.290] And then we formerly ran the Security Education Companion, which you can find at that long phrase dot org. [35:23.290 --> 35:34.310] We no longer update that, but it may still have some helpful curriculum for anybody in this room who does any workshops and trainings around your rights and basic security tech questions. [35:34.310 --> 35:39.250] sls.eff.org is our street-level surveillance hub. [35:39.250 --> 35:44.270] And we actually do use that to good benefit for policymakers who don't understand the tech. [35:44.270 --> 36:00.010] We have a list of different kinds of mostly law enforcement-based technology, and then long, long, long explanations of them, links to cases and case history, links to our own blogs and responses to it. [36:00.010 --> 36:05.710] And then we're also very proud, always, and always in need of help, with the Atlas of Surveillance. [36:05.770 --> 36:23.490] The Atlas of Surveillance is a U.S.-based domestic database of every contract that we can find for law enforcement agencies at the municipal, state, and federal level, all of the contracts that they have, say, with ShotSpotter or Evolv or whomever else. [36:23.490 --> 36:27.730] The only way to keep that up to date is research, is volunteers. [36:27.730 --> 36:31.910] So we always accept volunteers helping us in that way. [36:32.390 --> 36:35.070] And also our collaborations with EFA groups. [36:35.150 --> 36:41.050] Sometimes they pass us contracts if they've done FOIA requests at the local level. [36:41.050 --> 36:56.090] And then we are happy to make sure everybody knows about it, so that you can look up Milwaukee, Milwaukee Police Department, or you can look up Wisconsin and then look up a specific form of law enforcement tech, and see if we know, at least, about that form of tech in your area. [36:58.890 --> 37:03.710] I was plugging again SSD, the Surveillance Self-Defense guides. [37:03.730 --> 37:23.710] Regarding that question that was asked earlier about misinformation and disinformation, where I talked a bit about security nihilism: that is the guide or site I reference most often, that I point people to, to learn about information security and privacy and how to defeat that security nihilism. [37:23.710 --> 37:29.070] It really is so beginner-friendly and so essential for people who want to learn more about that. [37:51.130 --> 37:55.030] It's also sec.eff.org, but we don't maintain it anymore. [37:55.030 --> 37:57.890] A couple years ago, we released it. [37:57.890 --> 38:03.310] I would say that the guides in that resource are a bit more stable.
[38:04.190 --> 38:21.450] The SEC is a bit more stable and has a longer shelf life than SSD does, because in SSD we talk about specific technologies and how to use them and implement them in your practice, where the SEC is more about that train-the-trainer thing I talked about, like how to be a digital security expert in your community. [38:21.610 --> 38:30.670] And so it's a bit more high-level, paradigmatic thinking about how to be a good educator when it comes to that stuff. [38:34.410 --> 38:36.250] Any other questions? [38:48.100 --> 38:50.140] I just got excited. [38:50.380 --> 38:51.380] Definitely. [38:51.380 --> 38:56.660] I mean, being on the public interest technology team, I think the stuff that we put out is just so cool. [38:56.660 --> 39:04.820] The two that I would plug: Privacy Badger, that's the tracker-blocking browser extension, a plug-and-play thing that anyone can use. [39:04.820 --> 39:10.740] It really is made for the least tech-savvy people you know, because they don't have to know how it works. [39:10.740 --> 39:13.040] They don't have to do anything to configure it to make it work. [39:13.040 --> 39:16.720] They literally just install it, and it makes everything better. [39:16.720 --> 39:23.320] It is one of those few privacy tools that is actually easy to install and makes your life better because of it. [39:23.500 --> 39:29.000] It literally makes your browsing speed faster. [39:29.000 --> 39:35.180] It blocks a lot of ads as a byproduct of blocking all those trackers, and it just reduces your digital footprint. [39:35.180 --> 39:40.500] So it's great for people in your life who just don't want to think about something, but they need a little bit more privacy. [39:40.720 --> 39:41.200] Fab. [39:41.200 --> 39:42.260] Great product. [39:42.320 --> 39:52.020] Long before I was at EFF, I had Privacy Badger and HTTPS Everywhere on my browsers every single time, and now I don't have to have HTTPS Everywhere on my browsers. [39:52.020 --> 40:01.860] I used to teach about Privacy Badger in the digital privacy trainings I would do in my community before I worked at EFF, and then when I got hired to work on Privacy Badger, it was like, great. [40:02.220 --> 40:11.540] The other tool I would recommend is Certbot, which is our free tool for getting certificates from Let's Encrypt, the free certificate authority. How many millions of sites now do we offer free certs to? (There's a quick usage sketch after this answer.) [40:11.940 --> 40:12.900] It's incredible. [40:12.900 --> 40:13.760] It's free to use. [40:13.760 --> 40:14.240] It's fab. [40:14.240 --> 40:17.280] If you're like a web admin of any kind, check out Certbot. [40:17.900 --> 40:21.540] It's great, and we're encrypting the Internet through it, so it's fab. [40:28.920 --> 40:33.100] I don't know what time we're at, but if there are no more questions, we can wrap it up. [40:33.100 --> 40:34.540] Oh, okay. [40:49.420 --> 40:50.980] No. [41:04.760 --> 41:08.800] Perhaps some other technologists on the team feel differently than I do. [41:11.000 --> 41:27.060] Either forking a known browser engine, or coming up with our own and then maintaining it to an extent where it's actually usable and not just a sort of political product or political action, is well beyond the breadth of what we have available at EFF. [41:27.060 --> 41:33.720] We seem like a huge, mighty organization, and we are in our impact, but our technology team is like ten technologists. [41:33.720 --> 41:36.140] That would be crazy to do. [41:36.460 --> 41:37.980] That's not going to happen.
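The quick Certbot sketch promised above: Certbot automates obtaining and renewing HTTPS certificates from Let's Encrypt. The commands and flags below are real Certbot options, but the domain, the email, and the idea of driving it from a Python provisioning script are illustrative assumptions; it presumes certbot and nginx are already installed and the domain already points at the machine.

```python
# Hypothetical provisioning sketch: shell out to Certbot to obtain a
# Let's Encrypt certificate and wire it into nginx. Names are placeholders.
import subprocess

subprocess.run(
    [
        "certbot", "--nginx",            # get a cert and update the nginx config
        "-d", "example.com",             # domain to certify (placeholder)
        "--non-interactive", "--agree-tos",
        "-m", "admin@example.com",       # expiry/renewal contact (placeholder)
    ],
    check=True,
)

# Renewal usually runs from a packaged cron job or systemd timer;
# this just verifies that renewal would succeed.
subprocess.run(["certbot", "renew", "--dry-run"], check=True)
```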
[41:38.300 --> 41:44.280] Privacy Badger is one of those rare things where it was developed as a political cudgel. [41:44.300 --> 41:54.860] It was like: if you don't respect this flag that we're putting out, the Do Not Track signal, saying that you should respect our privacy and not track us from site to site, we're going to enforce it anyway at the network level (there's a short note on that flag at the very end). [41:56.220 --> 41:58.380] And it just continued to work. [41:58.380 --> 42:08.980] Now we're stuck in the problem where we have to maintain it indefinitely because it is so useful, but that's such a rare thing, where we can run a project like that with only two developers and it has five million users. [42:09.200 --> 42:13.100] So something as large and complex as a browser engine? [42:13.100 --> 42:14.300] I'm sorry. [42:16.220 --> 42:17.780] Not likely. [42:18.200 --> 42:20.840] I'll note two ways that you can support our work. [42:20.840 --> 42:32.600] One is, if you have a local group, if you're a faculty advisor, for example, of a student group, you can go to efa.eff.org and learn all about the Electronic Frontier Alliance and the possibility of joining. [42:32.600 --> 42:40.180] And then we also have a table downstairs, right across from reception, where we're selling merchandise and taking in new members. [42:40.180 --> 42:45.960] Or if you want to renew your membership, Christian is sitting at the table until, I think, about 4 or 5 o'clock. [42:45.960 --> 42:53.400] I might take a shift from Christian, but it's right across from reception; you can't miss it, with a lot of shirts on the table. [42:53.500 --> 42:59.100] There's also a ton of free literature on the table about the different projects and software products that we make. [42:59.100 --> 43:04.620] So if you just want to learn more and read about it for free, just go visit the table across from registration. [43:07.100 --> 43:08.300] Last question? [43:08.780 --> 43:12.860] If there are no other questions, I guess we can duck out a little bit early. [43:14.120 --> 43:18.920] Like I said before, catch me at 3 o'clock in the Circle Room talking about the war on bodily autonomy. [43:20.020 --> 43:21.600] Thank you very much. [43:21.760 --> 43:23.260] Thanks for the great questions.
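A short note on the "flag" mentioned in the Privacy Badger answer above: the Do Not Track signal is just an HTTP request header, DNT: 1, that the browser sends and that sites are free to ignore, which is why Privacy Badger enforces the preference by blocking rather than asking. Below is a toy sketch of what honoring it server-side could look like; this is an illustrative stand-in, not anything EFF ships.

```python
# Toy server sketch: the Do Not Track preference arrives as the "DNT" header.
# Nothing in the protocol forces a site to branch on it, and many don't.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        dnt = self.headers.get("DNT")  # "1" when the user opted out of tracking
        body = b"tracking disabled" if dnt == "1" else b"tracking enabled"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```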