[00:25.500 --> 00:27.340] Remember the message. [00:27.920 --> 00:30.260] The future is not set. [00:59.090 --> 01:00.910] Can I just jump into it? [01:01.250 --> 01:02.690] Okay, great. [01:02.890 --> 01:03.850] Hi everyone. [01:03.850 --> 01:04.890] Thanks for joining. [01:04.890 --> 01:14.670] I am really glad to be here right now because I have a presentation that feels a little bit out of left field for conferences like this. [01:15.290 --> 01:20.090] I am a senior staff technologist at the Electronic Frontier Foundation. [01:20.250 --> 01:25.870] And I'm here to talk about the digital front lines of the war against bodily autonomy. [01:25.990 --> 01:28.730] And I call that systems of dehumanization. [01:30.490 --> 01:32.230] Is there feedback? [01:36.350 --> 01:37.630] Okay. [01:38.630 --> 01:44.870] So, if you're not familiar with the EFF, we are the leading digital civil liberties organization. [01:44.930 --> 01:52.930] We have more than 30 years in the fight, fighting for privacy, free expression, and digital rights online. [01:53.070 --> 02:07.470] As a senior staff technologist at the EFF, for me, when I first joined, that looked like being a developer on one of our open source software projects, Privacy Badger, which is a tracker-blocking browser extension. [02:07.690 --> 02:09.910] It's free and easy to use. [02:09.910 --> 02:17.190] If you want to learn more about that, go check out the booth across from the registration out there. [02:17.570 --> 02:27.150] But now, at EFF, I have transitioned out of working on Privacy Badger, and I co-lead our reproductive justice working group, which focuses on health privacy. [02:27.150 --> 02:32.390] And I provide digital privacy and security trainings for at-risk communities. [02:32.730 --> 02:42.490] That looks like journalists, activists, local health care advocates, or just individual people who are worried about their operational security.
[02:43.130 --> 02:53.270] And before arriving at EFF, I led countless digital privacy and OPSEC workshops for a sex worker advocacy collective named Hacking Hustling. [02:53.270 --> 03:00.850] And before that, I founded a trans-forward tech justice collective called T for Tech. [03:01.310 --> 03:14.970] And it was through both of those projects that I collaborated for years with the Digital Defense Fund, which is the premier American digital security organization for the abortion access movement. [03:15.270 --> 03:20.150] So the reason I bring up those different issue spaces is because that's what I'm going to be discussing today. [03:20.150 --> 03:35.070] Not just reproductive rights, but also trans rights, systemic transphobia, and the struggle for liberation, sex worker rights, and the digital rights issues, surveillance, and tech-enabled oppression that are intersecting with and impacting all of those. [03:36.070 --> 03:53.550] Some issue spaces that I'm not discussing, though I think they're relevant to the fight for bodily autonomy and want to call them into this space because they often intersect with it: immigrant justice, carceral technologies, disability justice, and harm reduction. [03:58.110 --> 04:13.690] And just a quick note before we get into the thick of my material, a brief content warning: I will be making some brief and nonspecific references to sexual assault, domestic violence, and violent crime. [04:13.690 --> 04:21.390] They're nonspecific and just in passing, but, you know, if those are particularly sensitive issues for you, just do what you have to do to take care of yourself. [04:24.780 --> 04:35.140] Okay, so, if you were to tune out entirely for the rest of this talk I'm about to give, there's one kind of overarching point that I want you to take away with you.
[04:35.140 --> 04:44.680] That's the TL;DR of this talk, which is that nobody's right to self-determination and bodily autonomy should be hindered by digital surveillance and tech-enabled oppression. [04:44.680 --> 04:55.950] What I mean by that is everyone here has a body, online and off, and you should fight for the right to determine how that body exists. [04:56.400 --> 05:04.890] So, it is in your best interest to eliminate any and all efforts that restrict others' rights to bodily autonomy and self-expression. [05:06.440 --> 05:25.520] Because what is criminalized today may not directly impact you, and if so, great, lucky you, but experts like myself and others have traced how this penumbra, the lens of criminality, has continued to expand and envelop more and more people over time. [05:25.980 --> 05:29.140] So, if it's not today, it will be tomorrow. [05:29.720 --> 05:36.980] And overall, privacy, free expression, and self-determination are fundamental human rights, online and off. [05:39.420 --> 05:52.540] Okay, so here is a brief and kind of haphazard timeline I put together of various events that I think are relevant to the conversation, or firehose of information, I'm about to give to you. [05:52.820 --> 06:02.960] Starting 150 years ago with Comstock: there was this guy, Anthony Comstock, in New York, who was this sort of puritanical freak obsessed with obscenity. [06:03.160 --> 06:08.140] And the Comstock Law, an obscenity law named after him, was passed. [06:08.140 --> 06:20.980] The reason I'm bringing it up is because it informs how a lot of our legal frameworks and thinking about obscenity are framed today, 150 years later, based on Victorian-era morals.
[06:21.200 --> 06:38.580] It has actually been revived a lot by legal experts recently, in analysis of how surveillance of the mail system is deeply entangled with the Comstock Law, especially regarding Mifepristone and Misoprostol, which are the two medications used for self-managed abortion at home. [06:38.640 --> 06:42.940] And so when those are surveilled for in the mail system, it's thanks to Comstock. [06:45.350 --> 06:48.110] And then jumping ahead 100 years, there's Roe v. [06:48.110 --> 06:48.490] Wade. [06:48.490 --> 06:52.050] That was a privacy ruling handed down in 1973. [06:52.190 --> 06:57.450] It did not codify the legal right to abortion into law, as we learned in 2022. [06:57.450 --> 07:04.530] It was just a privacy ruling that afforded people who needed an abortion privacy from the prying eyes that would want to criminalize it. [07:06.210 --> 07:12.270] 20 years later, 1996 was a big year for Internet and digital law. [07:13.030 --> 07:14.610] HIPAA was passed. [07:14.610 --> 07:18.390] Now HIPAA, you're probably familiar with, at least in passing by name. [07:18.390 --> 07:26.390] It is the legal framework by which we talk about how health information can be passed across computerized systems. [07:27.010 --> 07:32.390] I think it's important to note here that the P in HIPAA stands for portability, not privacy. [07:32.510 --> 07:35.210] That's an often misunderstood point. [07:35.210 --> 07:45.270] And it is the interoperability of healthcare systems, of the information passed between these different systems, that is the benefit of portability. [07:45.270 --> 07:46.650] It's the benefit of HIPAA. [07:46.650 --> 07:50.150] But it's when you criminalize healthcare that that becomes a real problem. [07:51.570 --> 07:56.750] The same year, Section 230 of the Communications Decency Act was passed. [07:56.750 --> 07:58.510] I'll cover that in more detail in a bit. [07:58.510 --> 08:00.270] I have a whole rant on it.
[08:01.210 --> 08:08.670] And in 1997, there was the Nuremberg Files website, an unfortunately named website that sort of set a precedent for doxing. [08:08.670 --> 08:10.730] I'll cover that also. [08:10.930 --> 08:13.830] Similar to 2018 FOSTA-SESTA. [08:13.830 --> 08:16.410] I'll go deep on that one. [08:17.070 --> 08:24.270] And then 2022, I bring up the Don't Say Gay bill that was passed in Florida because it was... [08:25.270 --> 08:36.530] It set a precedent for unconstitutional impediments on people's right to free expression, particularly in schools, based on this sort of idea around obscenity. [08:37.070 --> 08:46.870] And it really paved the way for 2022, 2023, and 2024, which have all had record numbers of anti-trans laws, each surpassing the last. [08:46.870 --> 08:57.930] And in 2025, we are already nearing 2024's number, even though we're barely through the first quarter of the year. [08:59.150 --> 09:04.110] Oh, and in 2022, this hopefully isn't news to many people, but Roe v. [09:04.110 --> 09:05.330] Wade was overturned. [09:05.330 --> 09:10.750] And that came out through a Supreme Court draft opinion leak, bad OPSEC on their part. [09:10.750 --> 09:21.250] And we learned about that, and then a lot of people, you know, became galvanized in the fight and began to think about how digital rights and the right to abortion access were deeply connected. [09:25.590 --> 09:31.730] Okay, so, jumping into the first big issue space and the sort of terrible things that are happening there, let's talk about reproductive rights. [09:31.770 --> 09:37.410] And here's a great and terrifying graphic that one of my colleagues designed. [09:39.570 --> 09:43.430] So, some digital surveillance and tech risks that are endangering the movement. [09:43.430 --> 09:45.130] This is non-exhaustive, by the way. [09:45.130 --> 09:50.190] These are just things that I have noticed over the years as a security researcher and someone working in the space.
[09:50.190 --> 09:53.810] So, these are just what I hear about from the communities that I'm working with. [09:55.010 --> 10:01.010] Number one being passive surveillance mechanisms built into social media and pervasive ad technology. [10:01.010 --> 10:10.610] I'm not just talking about data brokers, I'm talking about like data retention policies, or sometimes analytics software, or just shitty privacy policies in general. [10:13.850 --> 10:19.410] Abuse of public records that are weaponized in doxing campaigns and coordinated harassment. [10:19.870 --> 10:22.290] You will hear that a lot throughout this talk. [10:22.290 --> 10:23.810] That happens quite often. [10:26.530 --> 10:27.950] Data brokers. [10:28.390 --> 10:34.510] It is a vampiric shithole of an industry that needed to be taken care of, like, yesterday. [10:34.510 --> 10:36.810] There's no reason that it should exist the way it does. [10:36.810 --> 10:44.170] It provides warrantless surveillance of citizens to both state and non-state actors. [10:44.350 --> 10:47.020] It, frankly, should be criminal. [10:49.880 --> 10:52.720] And then there are street-level surveillance threats. [10:52.800 --> 11:00.610] Things like automatic license plate readers that are tracking people's locations, especially around health care clinics that offer abortion services. [11:02.720 --> 11:07.500] And, as I already mentioned before, there's the interoperability of medical health records. [11:07.500 --> 11:11.020] And third-party apps that are supposed to be covered by HIPAA. [11:11.020 --> 11:19.220] HIPAA covers the sort of first-party communications between patient and provider and like the one system of communication they use. [11:19.220 --> 11:28.320] But there are a lot of these third-party apps, kind of necessary with today's technologies, that are supposed to be covered through these business associate agreements. [11:28.380 --> 11:33.200] But HIPAA is just laughably misunderstood by everyone.
[11:33.200 --> 11:38.500] And it's understood even less by these companies that are supposed to be compliant under these BAAs. [11:41.720 --> 11:44.100] And there's a lot of financial censorship. [11:44.120 --> 11:55.980] When I say this, I'm talking about being de-platformed from... well, being de-banked, but also de-platformed from Venmo and Cash App and PayPal and the like. [11:55.980 --> 11:58.480] This happens a lot with abortion funds. [12:01.400 --> 12:06.020] Then, there's abuse of private health data by crisis pregnancy centers. [12:06.020 --> 12:11.440] If you're not familiar with CPCs and you want to have a really bad time, learn about them. [12:11.440 --> 12:19.500] They basically pose as health clinics for people who are usually seeking alternatives for their pregnancy outcomes. [12:19.560 --> 12:23.460] They pretend to be health clinics that will say, are you looking for an abortion? [12:23.460 --> 12:24.420] Come talk to us. [12:24.420 --> 12:27.760] And they'll offer some things like STI screenings and whatnot. [12:27.800 --> 12:34.740] But then, instead of actually offering abortion-related care, they'll feed people a bunch of anti-abortion propaganda. [12:34.740 --> 12:44.280] They're usually funded... well, they're funded by public funds, which is why their abuse of private health data is literally criminal. [12:44.280 --> 12:48.560] And they're being investigated for it, and they shouldn't be using taxpayers' dollars for this. [12:49.400 --> 12:53.040] But they're usually run out of like church basements and stuff. [12:54.940 --> 13:02.020] And then there is, finally, censorship of reproductive health care material by fidgety tech companies. [13:02.020 --> 13:07.920] These are companies whose content moderation policies usually kind of sway with the cultural discourse of the time. [13:07.920 --> 13:15.320] We've seen a lot of censorship on regular platforms just around abortion or reproductive health care materials in general.
[13:15.320 --> 13:18.900] Because that's obscene, according to them. [13:20.780 --> 13:24.840] Okay, so I sort of mentioned this earlier in that timeline thing. [13:24.840 --> 13:29.100] The Nuremberg Files was this website that was launched in '97. [13:29.100 --> 13:47.800] It's an early example of doxing: surfacing public records that are available to anyone, but collating them into one space and operating under the guise of plausible deniability, so that others with worse intentions than just revealing information can access it and choose who they're going to shoot, which clinics to burn down, [13:47.800 --> 13:49.420] et cetera. [13:49.660 --> 13:52.440] That happened because of this. [13:53.420 --> 13:57.500] And it really set a precedent for this type of website to exist. [13:58.540 --> 14:01.720] That's a screenshot of what it looked like back in the day. [14:01.720 --> 14:04.460] And that smug-looking guy in the corner is who built it. [14:05.080 --> 14:07.340] He died, like, a few years ago. [14:09.720 --> 14:13.460] Today, though, there is this website abortiondocs.org. [14:13.460 --> 14:15.020] It's currently functioning. [14:15.040 --> 14:17.160] It basically does the same thing. [14:17.160 --> 14:21.760] It has a sort of veneer of credibility to it, but it's essentially the same. [14:21.760 --> 14:33.220] You probably can't read the tagline underneath the website address at the top, but it says, the largest collection of documents on America's abortion cartel. [14:33.340 --> 14:37.840] So if you look closely at the language of some of these places, you kind of get the vibe of what they're doing. [14:41.060 --> 14:47.300] Here is an example of some tech suppression of information, some censorship online. [14:48.240 --> 14:54.700] Microsoft's Bing search engine was found to be quietly suppressing abortion-related content.
[14:54.700 --> 15:04.660] And this was only found by an independent researcher who realized that when they mistyped or misspelled the word abortion, they suddenly got much more relevant search results. [15:05.960 --> 15:07.280] That sucks. [15:07.320 --> 15:08.920] That is so dumb. [15:09.120 --> 15:15.700] They've changed that since, but it just sort of shows what happens when you have these opaque policies that no one can understand. [15:18.520 --> 15:23.900] Now we'll have a couple of examples of digital evidence being used in abortion-related cases. [15:24.040 --> 15:33.120] This is a pretty famous case that came out of Nebraska, where a mother was sentenced to two years in prison for helping her daughter navigate a self-managed abortion. [15:33.460 --> 15:37.020] And it was their Facebook Messenger logs that were used as key evidence. [15:39.340 --> 15:42.100] This was happening right around Roe v. [15:42.100 --> 15:43.700] Wade being overturned. [15:43.700 --> 15:51.480] I think the subpoena that Facebook got from law enforcement happened before Roe was overturned. [15:51.480 --> 15:54.020] But then it all sort of played out just after. [15:54.020 --> 16:05.620] So Facebook officials said that their response to the law enforcement subpoena was just them being compliant, and that it was no reflection of any sort of anti-abortion sentiment within the company. [16:06.300 --> 16:18.820] Okay, but to me, this is what happens when tech companies have really shitty privacy policies and so-called neutral stances on political issues like the criminalization of healthcare. [16:19.660 --> 16:21.660] People go to prison for this. [16:22.500 --> 16:27.040] The data isn't that important, and their bottom line is not that important, at least to me. [16:27.120 --> 16:33.080] And as we'll see a bit later, Facebook's stance on the matter is not as neutral as you would think.
[16:36.080 --> 16:47.680] Just another example: some cell phone location data was used in a criminal case where a mom and her son were charged with kidnapping because they helped a young girl who was with them consensually go get an abortion. [16:50.200 --> 16:52.940] So I mentioned street-level surveillance. [16:53.040 --> 17:07.480] Automatic license plate readers are these devices that are placed on, well, often cop cars, but sometimes they're up on traffic lights and stuff, and they can mark and ID and tag license plates, like hundreds in a matter of a minute, you know, very quick, [17:07.480 --> 17:08.620] very sophisticated. [17:10.540 --> 17:11.920] It sucks. [17:12.060 --> 17:25.520] And these law enforcement agencies, particularly in California, which has pretty good privacy rules baked into the California Constitution, we found that they were sharing that ALPR data with out-of-state agencies, which is illegal. [17:26.680 --> 17:34.900] And this is particularly concerning for people who are traveling into California to get an abortion from more restrictive states, which happens a lot. [17:34.900 --> 17:45.060] So we have been urging the California AG, Rob Bonta, to enforce the rules here. [17:45.060 --> 17:53.320] We've been sending letters out to these agencies who we found breaking the law, saying, like, hey, we're on to you, quit breaking the law. [17:53.940 --> 17:55.960] And there's been, like, middling success. [17:55.960 --> 17:57.040] Some have stopped. [17:59.700 --> 18:13.600] So some companies, gauging the sentiment or the need for increased privacy when, you know, the lens of criminalization expands to include more people, actually do change things. [18:13.600 --> 18:26.480] So Google actually promised to remove location history from Google Maps for sensitive locations, such as abortion clinics or clinics that offer gender-affirming care or the like.
[18:28.540 --> 18:32.120] Unfortunately, it was found that they were lying about that. [18:32.120 --> 18:40.560] So 18 months after they made that promise, an independent research company tested the results and found that nothing had changed. [18:40.560 --> 18:42.800] This actually happened twice. [18:43.100 --> 18:51.180] So they were tested for it after 18 months, found to be lying, and then Google was like, shit, sorry, sorry, we'll change it. [18:51.280 --> 18:58.080] And then three months later, the same independent researcher did it again, and they were like, nothing has changed. [18:58.080 --> 18:59.560] Google, what gives? [19:00.320 --> 19:02.540] I think they have changed, though, now. [19:02.540 --> 19:06.640] But it really goes to show how much trust we can put in Google. [19:06.640 --> 19:13.020] A couple other examples of digital evidence being used in criminal cases. [19:13.020 --> 19:17.200] There's a woman here whose text messages were used to prosecute her for feticide. [19:17.720 --> 19:25.540] And then there was a quite famous case of a woman named Latice Fisher whose Google search history was used to prosecute her for murdering her infant. [19:25.540 --> 19:27.160] And this was actually pre-Roe v. [19:27.160 --> 19:32.500] Wade being overturned. [19:32.820 --> 19:34.860] Okay, so a bit about data brokers. [19:36.060 --> 19:38.200] I've already ranted a little bit about data brokers. [19:38.200 --> 19:45.000] But here's a really pernicious example of what happens when they provide this sort of warrantless surveillance of people. [19:45.020 --> 19:58.800] Senator Ron Wyden actually revealed an investigation that his office had been running on this company called Near Intelligence, a data broker who was offering people's location data and selling it to whoever wanted it. [19:58.960 --> 20:03.280] Location data collated with the types of businesses they were traveling to.
[20:03.280 --> 20:20.880] So people who were traveling in and around abortion clinics, all of their device IDs were captured and bought by an anti-abortion group who, well, the only surfaced example of what they actually did with it was deliver a bunch of anti-abortion propaganda to their devices. [20:20.980 --> 20:28.020] But we're sure that it was sold off elsewhere and just sort of collected into databases where who knows how else it's being abused. [20:29.300 --> 20:36.880] Oh, and if you want to have another really bad time, like another rabbit hole of doom, look up Fog Data Science. [20:36.880 --> 20:49.180] It's not immediately related to the repro rights space, but they provide real-time location data from your phone to cops for free, without warrants. [20:53.030 --> 20:54.310] Okay. [20:54.590 --> 20:57.730] So let's jump into sex worker rights. [20:58.430 --> 21:03.910] Some of the digital surveillance and tech risks facing or endangering this movement. [21:04.130 --> 21:06.150] Things are starting to be familiar now. [21:06.150 --> 21:07.650] I'm going to repeat a few of them. [21:07.650 --> 21:11.270] Like passive surveillance mechanisms built into social media. [21:12.330 --> 21:15.750] Abuse of public records that are used in doxing campaigns. [21:16.630 --> 21:18.490] Data brokers, again. [21:19.690 --> 21:24.110] Street-level surveillance, again, but this time I'm not highlighting ALPRs. [21:24.110 --> 21:26.070] I want to talk about surveillance towers. [21:26.550 --> 21:43.370] These are like literal turrets that are put up in areas that have a lot of street-based sex work, which are, well, A, an imposition on the communities in which they're placed, but B, their efficacy rates are insanely low, and they're really expensive. [21:43.410 --> 21:47.310] So it's just this ludicrous sort of hardware that is expensive but doesn't do anything.
[21:47.310 --> 21:54.130] And we'll talk in a moment about why there is a huge increase in street-based sex work now compared to, say, ten years ago. [21:56.070 --> 21:58.970] Financial censorship, of course, there's a lot of that. [22:00.850 --> 22:04.810] Removal of public health information due to terrible internet law. [22:04.870 --> 22:06.270] That's FOSTA-SESTA. [22:06.270 --> 22:08.130] I'll cover that too in just a moment. [22:09.730 --> 22:13.530] Facial recognition software is a big one in the sex worker space. [22:13.530 --> 22:28.230] There are various facial recognition companies who scrape public profiles of escorts and then use that either in collaboration with law enforcement or with just like anti-sex work people who want to identify sex workers in their communities. [22:30.750 --> 22:33.190] So all of this is a bit ironic. [22:34.350 --> 22:53.990] Anyone who's at all knowledgeable about the history of the internet, and how it was shaped by niche industries, should be familiar with how sex workers really built the internet as we know it today, especially blogs and monetized blogs. [22:54.070 --> 23:04.190] So there was this porn performer in the 90s, Danni is her name, and she became a web developer just as a hobby. [23:04.190 --> 23:12.170] She made a blog, we didn't have that word then, but she created a website to host her material and she monetized it. [23:12.170 --> 23:14.370] You had to pay to enter a bit later. [23:14.470 --> 23:23.250] It was so wildly popular that her service provider's servers broke because there were so many people accessing the page. [23:23.370 --> 23:26.390] And the same thing happened with her financial institutions. [23:26.390 --> 23:38.330] So she had to work with them for a little more reliability in the systems they were providing her, and it also created an industry understanding of what happens when something like this, what we'd now call a blog, becomes really popular.
[23:39.090 --> 23:55.030] Also, you probably can't see it, but on the bottom of the screen, all this blurry pink text is really early SEO strategy, so she just has a bunch of text words on there like boobies and nipples and stuff, and it gets her a higher ranking on the search engines. [23:58.950 --> 23:59.990] Okay. [24:00.450 --> 24:07.690] Now for my miniature rant and history lesson on Section 230 and why you're probably wrong about it. [24:09.070 --> 24:10.930] Now, I'm not actually singling you out. [24:11.850 --> 24:18.710] I harbor a lot of anger and resentment around people's perception of Section 230, and people are often wrong about it. [24:18.710 --> 24:26.010] I'm not a lawyer, so I'm not here to give you really sophisticated legal analysis, but I do, unfortunately, know a lot about Section 230. [24:26.910 --> 24:37.390] What it essentially says is that people are responsible for their own speech online, not the Internet companies that host the services, the intermediaries. [24:37.390 --> 24:38.210] Okay? [24:39.150 --> 24:40.450] That's it. [24:40.510 --> 24:55.750] If Internet intermediary companies are held responsible for individual behavior, like if there are illegal activities happening and they're held responsible for them, it is essentially a death knell for free expression and privacy online. [24:55.750 --> 25:06.810] Because it means they would have to wipe out all content that could potentially be illegal, so any sort of dissident speech or marginalized expression, gone. [25:06.810 --> 25:07.630] That's cut. [25:07.630 --> 25:12.150] So there's no real room for radical progress, especially in the realm of politics. [25:13.010 --> 25:15.310] Privacy, also, gone. [25:15.310 --> 25:30.470] We have seen a lot of proposed carve-outs to Section 230 that say things like, oh, children are being abused via these services online, therefore we need a backdoor built into your encryption systems.
[25:30.830 --> 25:32.790] And that's bullshit. [25:33.030 --> 25:41.510] That's not going to prevent the crime from happening, for one, and these encryption schemes are useless if the government has a backdoor into them. [25:42.450 --> 25:50.250] Also, if you've thought a couple of steps ahead in this process, content moderation actually isn't threatened by Section 230. [25:50.250 --> 25:52.010] It's emboldened by it. [25:52.010 --> 26:04.270] Because when we, as people that exist online, get to say what we want and be held responsible for it, we also get to engage with the companies who have the kind of content moderation policies that we want to engage with. [26:04.270 --> 26:09.610] So they have the agency to determine their own business model, and we choose where we go. [26:09.790 --> 26:12.510] That wouldn't be the case if Section 230 were gone. [26:13.410 --> 26:20.850] And lastly, just again, let me say, cutting or amending Section 230 does not prevent violence or crime. [26:20.850 --> 26:25.890] Don't believe it when duplicitous lawmakers try to tell you that. [26:26.810 --> 26:35.150] As my coworker once said, carving out Section 230 won't make the Internet a nicer, safer place. [26:39.260 --> 26:44.600] So, unfortunately, there was a carve-out to 230 in 2018. [26:44.800 --> 26:56.120] It is this sort of package bill, FOSTA-SESTA, the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act, if I remember correctly. [26:56.120 --> 27:16.540] It was this package bill that carved out 230 and made Internet companies, these intermediaries, newly responsible for any kind of content appearing on their platforms that could be culpable in trafficking-related charges or trafficking activities. [27:16.540 --> 27:23.100] It is overly broad and it has had a lot of impacts on the sex worker rights space. [27:23.100 --> 27:28.020] Of course, no sex worker rights groups were consulted in the passing of this bill.
[27:28.460 --> 27:44.360] This group that I worked with for a long time, Hacking Hustling, they've since sunset, unfortunately, actually reported on the impacts FOSTA-SESTA had on public health and sex worker rights, in this study, which is still available online. [27:44.360 --> 27:49.980] Removing public health information doesn't just hurt sex workers, it hurts everyone. [27:50.260 --> 28:03.760] The types of materials being removed are not just academic and research studies on public health outcomes when sex work is decriminalized, but things like bad date lists, where sex workers share with each other who the known rapists are in their community, [28:03.760 --> 28:08.560] who are the guys who are hiring the workers, drugging them and raping them. [28:08.560 --> 28:13.600] When that's taken offline, when we don't have access to it, it's dangerous. [28:15.600 --> 28:34.080] And this is the bill that pushed a lot of sex work offline, because so many websites shut down, because they didn't have the legal resources to fight any sort of charges that might come from FOSTA-SESTA, so a lot of people were pushed off the internet and onto the streets, [28:34.080 --> 28:37.540] which is unsafe for everyone. [28:40.040 --> 28:56.320] And this was a really popular issue, FOSTA-SESTA was really popular, it had bipartisan support from everyone, we had a lot of Republicans signing onto it, we had Bernie Sanders signing onto it, and a lot of celebrities were stepping out to do these commercial campaigns, [28:56.320 --> 29:05.120] propaganda pieces, saying you can buy a child online as easily as you can buy pizza, so we should amend Section 230.
[29:26.610 --> 29:43.030] Tumblr had the best and worst response to this, where they implemented this really haphazard, clumsy moderation system that removed all sexually explicit content, including pictures of uncooked chicken breast in recipes. [29:43.030 --> 29:44.490] Classical art. [29:44.490 --> 29:49.070] They even accidentally removed their own content moderation policy blog post. [29:53.810 --> 30:09.970] So then there's this company called Thorn, started by Ashton Kutcher and Demi Moore in 2012. They're an anti-trafficking technology company, they work with both law enforcement and consumer products, and they pushed really hard for FOSTA because, [30:09.970 --> 30:20.630] well, they provide the solution that makes companies compliant with these new rules, and since they're a private for-profit company, that's a good idea for them, right? [30:21.550 --> 30:39.070] It would be great if their products worked, because we are all on the same side here. Maybe this isn't a well-known thing in this space, but there is a difference between human trafficking and consensual sex work. [30:39.650 --> 30:43.570] Consensual sex workers want to end human trafficking. [30:45.210 --> 30:52.190] So, it would be great if products like Thorn's worked really well to prevent human trafficking. [30:52.190 --> 30:58.750] Unfortunately, there have been numerous reports on how they frequently lie about their numbers and the efficacy of their products. [30:59.210 --> 31:08.570] This great piece by the journalist Violet Blue really covered the gamut of how many times they have lied, and it's pretty damning. [31:08.570 --> 31:15.390] They really overblow their numbers and it gets them contracts with law enforcement. [31:16.950 --> 31:22.250] An example: this is an arrest report that came out actually just this week. [31:22.410 --> 31:29.330] It's on MuckRock right now, if you're interested, and it mentions one of Thorn's products.
[31:29.330 --> 31:42.890] You might not be able to read it, but it says that they arrested an escort after running her face through facial recognition technology called Spotlight, which identified her as a known sex worker in the area, and that led to her arrest. [31:47.210 --> 31:54.410] So, let's talk about trans and gender non-conforming liberation and the threats facing that movement. [31:54.450 --> 32:00.630] At this point, all of the different threats I've already talked about in the previous two sections apply to this one as well. [32:00.810 --> 32:18.950] So: passive surveillance mechanisms built into tech platforms, abuse of public records for doxing campaigns, data brokers, the interoperability of medical health records with third-party apps that don't understand business associate agreements, and censorship of public health materials on government websites. [32:19.110 --> 32:26.470] That last one is a little bit new compared to the other points I've listed, and it's really hot and happening right now. [32:26.890 --> 32:28.810] I'll go into it. [32:29.770 --> 32:40.970] And, as I alluded to in the previous section on FOSTA-SESTA, there's a ton of LGBTQ+ content that frequently gets suppressed online. [32:40.970 --> 32:59.450] Study after study after study has shown that LGBTQ+ content frequently gets flagged as adult or obscene, both by machine, meaning algorithm-driven content moderation systems, and by human bias, meaning manual removal systems. [33:00.290 --> 33:03.490] And most platforms use a combination of both. [33:04.670 --> 33:20.170] So, talking about public health records that are being purged right now: following Trump's anti-trans executive orders, of which I think there are six right now, a lot of government websites began to comply in advance, before those executive orders turned into any kind of law. [33:20.170 --> 33:26.990] And they're removing all material on their public websites that has to do with "gender ideology," a.k.a.
[33:26.990 --> 33:29.170] any mention of trans people existing. [33:29.370 --> 33:44.650] So that looks like decades of research studies, health records, HIV- and AIDS-related research, safety guidelines for LGBTQ+ people in various sectors, and so on. [33:45.070 --> 33:49.110] So there's this project which you might be familiar with, the Wayback Machine. [33:49.110 --> 34:03.010] It is a project from the Internet Archive that set out to archive the entire Internet. They routinely spider out, scan websites, snapshot them, and record them for anyone to access. [34:03.010 --> 34:04.870] It's a sort of archivist project. [34:05.150 --> 34:18.430] Their reach is not as far as we would like, though: of every website I checked in the Wayback Machine that was purged in this data removal of the past couple of months, none were captured. [34:22.920 --> 34:30.140] A couple of examples of this in real time. Here's a safety guideline for LGBTQ+ people in the U.S. [34:30.140 --> 34:34.220] and considerations they might take when traveling internationally. [34:34.220 --> 34:51.760] They just removed any and all instances of transgender-related material from it, which is ironic, because the rest of the world does acknowledge that trans people exist, and there are specific risks that trans people face, separate from our cisgender counterparts in the LGBTQ community, in various places around the world.
[34:55.140 --> 35:30.060] Same thing with the Social Security website: anything on there that had to do with trans people is now gone. And on the left there is a particularly ironic one that just happened, where the National Park Service website for the Stonewall Monument in the West Village in New York removed any mention of "transgender" from the site. That's ironic because the Stonewall riots happened in 1969 as a response to the police brutality taking place against the LGBT people in the bar there, and the riots were started by a Black trans woman. [35:34.240 --> 35:43.600] So it's not just erasing decades of research and materials that existed on these government websites from before. [35:43.600 --> 35:46.280] It's erasing the possibility of them happening in the future. [35:46.280 --> 35:57.720] The Trump administration cut 125 million dollars from LGBTQ+ health funding, which supported the places that were producing that research, and that is devastating for many. [35:57.720 --> 36:06.180] And it's about half of the money spent on anti-trans commercials for the Trump presidential campaign in 2024. [36:10.420 --> 36:22.280] Okay, so here's another example of one of those websites that operates under the guise of plausible deniability, just resurfacing or uplifting public documents that anyone can access and collating them. [36:22.280 --> 36:29.120] This one is the Stop the Harm database, which is from a collective called Do No Harm. [36:29.120 --> 36:39.160] Their intent is to document any medical facilities and doctors, including their home addresses, that offer gender-affirming care. [36:39.500 --> 36:50.940] This is obviously another tactic in stochastic terrorism, with the unexpressed but obvious purpose of causing real-life violence to these people.
[36:50.940 --> 37:03.160] And I personally have taken calls from people whose clinics have been burned down, or who have received threats of gun violence over the phone, gunshots being fired at them. [37:03.740 --> 37:06.700] And that's because of this type of stuff. [37:08.440 --> 37:16.000] So I should mention that the group who made this one collaborates quite frequently with a couple of other groups who are known in the space. [37:16.080 --> 37:23.080] There's the Alliance Defending Freedom, the ADF, and they collaborate with the Heritage Foundation, which wrote Project 2025. [37:24.260 --> 37:29.180] And then there is this group, oh sorry, this is the ADF. [37:29.820 --> 37:33.500] They had this incredible leak last year. [37:33.700 --> 37:40.560] Can we call it a leak when someone's OPSEC is so bad that they leave a Google Drive open, unsecured for the public to access? [37:41.060 --> 37:42.180] I'm not sure. [37:42.180 --> 38:00.120] However, it did reveal all of their strategy and planning, the emails back and forth about how to coordinate the anti-trans legislation they were planning along with the anti-abortion strategies, and how these things play well together toward their Christian fundamentalist goals. [38:04.670 --> 38:15.970] So, down to the state level: at the time of writing this talk, which was a couple weeks ago, there were 809 proposed bills in the U.S. [38:15.970 --> 38:18.030] that are anti-trans in nature. [38:18.030 --> 38:20.090] I'm sure it's gone up since then. [38:20.090 --> 38:27.330] But one of the worst ones is only proposed right now, and it probably won't pass, but the precedent is pretty scary. [38:27.810 --> 38:32.170] Texas is saying that just identifying as transgender would be a felony. [38:32.250 --> 38:37.810] Meaning felony jail time, prison, just for expressing that you are trans.
[38:37.810 --> 38:49.930] Not doing any of the other things that are also criminalized in Texas, like going to the bathroom, or getting healthcare, or talking about the fact that LGBT people exist in a school. [38:52.250 --> 39:12.490] Also in Texas, one of the bathroom bans did pass; this is law right now in Odessa, Texas. If a trans person uses a bathroom in a public place, they can be taken to court, and whoever sues them can get a minimum reward of $10,000. [39:13.250 --> 39:13.930] Minimum. [39:13.930 --> 39:18.550] Plus the legal fees, paid by the person who was trying to use the bathroom. [39:18.810 --> 39:21.230] Oh, and then that person gets sent to jail. [39:22.710 --> 39:28.290] So the reason I'm bringing these things up is because those are real-life violences that take place. [39:28.290 --> 39:35.430] But the digital realm, as I've talked about, not just online but also street-level surveillance systems, impacts these spaces, as we've seen. [39:35.430 --> 39:45.370] In fact, the Department of Homeland Security at the federal level just lifted a ban on surveillance based on sexual orientation and gender identity. [39:45.370 --> 39:55.610] Meaning, if you are of a diverse sexual orientation or gender, you can be spied on by DHS solely based on that. [39:55.610 --> 39:59.930] And you no longer have a constitutional right to protection from it. [40:00.830 --> 40:10.130] This is probably due in part to some language in the executive orders that Trump has put out saying that "woke ideology" is anti-American. [40:10.670 --> 40:17.350] So there's this framing of a terrorist threat: you're anti-American just by existing as a queer person. [40:20.740 --> 40:34.920] So this is particularly concerning for, well, it should be for everyone, but especially trans people who are currently incarcerated, or who now face a very likely threat of incarceration for this reason. [40:34.920 --> 40:40.340] There's this act called the Prison Rape Elimination Act that governs these facilities.
[40:40.660 --> 40:49.880] Currently, there's the expressed purpose of migrating all trans inmates into the facilities that correspond with their gender assigned at birth. [40:53.780 --> 41:04.080] Study after study after study has shown that trans women are most frequently the victims of rape in facilities where they're placed with men. [41:04.360 --> 41:29.080] And in fact, in some PREA-compliant facilities (that's that prison rape law), there is this tactic known as V-coding, which is prevalent enough that it's been cited in multiple research studies, where trans women are used as rape objects to lower the temperature of the social upheaval in the facility. [41:29.080 --> 41:33.360] So they're placed with more violent inmates so that they can be raped. [41:36.600 --> 41:38.500] Okay, back to the tech. [41:38.780 --> 41:56.640] So, in social media land, these companies have of course begun complying in advance with these new rules around what's allowed on their platforms, following the content moderation policies expressed by X and the like. [41:56.640 --> 42:13.680] Facebook released this hilariously bad video of Zuckerberg talking about lifting restrictions on hate speech, including, well, he didn't name it, but it was shown in the changelog that certain insults towards women are now allowed again, [42:13.680 --> 42:17.940] and you won't be censored for it, so you could be called property if you're a woman, [42:17.940 --> 42:19.160] or a freak if you're transgender. [42:19.160 --> 42:23.780] And the language around immigrants, he did actually say outright in the video. [42:26.040 --> 42:33.860] YouTube, literally just yesterday this was reported on, removed gender identity from their hate speech policy. [42:33.860 --> 42:35.820] I don't think they released a statement on this. [42:35.820 --> 42:40.540] It was just noted in the changelog of their content moderation policies.
[42:43.300 --> 42:48.700] Here's an example from the so-called free speech platform; X has sort of been leading the way on this. [42:49.240 --> 42:59.560] Elon Musk likes to call it a free speech platform, and I think that happened after he said, actually, you know what, you can say the n-word on there. [42:59.780 --> 43:07.280] You can say a lot of Nazi stuff on there, and we won't ban you for it, because we want to allow all free speech. [43:07.280 --> 43:17.360] But to this day, the word "cis" is such a slur to him that it is suppressed, enough so that the user is actually made aware of it. [43:17.360 --> 43:25.360] It's not just shadow banned; the user is told that this violates their policy, because that language is so harmful and dangerous. [43:28.320 --> 43:34.300] Okay, so as I mentioned, a lot of these anti-trans laws that are passing are focused on minors, right? [43:34.300 --> 43:39.900] Like the Don't Say Gay bill that I referenced from 2022. [43:40.360 --> 43:48.060] I was really thinking about it a lot at this time, because I had been researching student-issued devices that often come pre-loaded. [43:48.060 --> 43:51.560] It's basically a surveillance rootkit on the device. [43:51.560 --> 43:54.020] There's a ton of companies that do this. [43:54.040 --> 43:57.860] GoGuardian, Bark, Gaggle, the list goes on. [43:57.860 --> 44:09.940] And so I was researching what it looks like for students, and how these devices intersect with spaces like this, where certain types of speech or self-expression are now criminalized in the school space. [44:09.940 --> 44:11.080] What happens? [44:11.720 --> 44:14.500] As predicted, it was really bad. [44:14.500 --> 44:25.700] Some co-workers and I got together, and about a year later we ran a research project where we filed hundreds and hundreds of public records requests to school districts around the U.S.
that had these devices. [44:28.140 --> 44:35.360] At that point it was about 60% of schools, probably more now, that issue these devices to students. [44:35.420 --> 44:43.960] And we just wanted to see what types of content the machines were flagging as explicit or inappropriate for kids to be looking at. [44:43.960 --> 44:45.760] It was bad. [44:45.920 --> 44:50.160] A lot of history articles and such; a lot of Wikipedia articles are banned. [44:50.160 --> 44:53.360] Predictably, there's a lot of sexual health information. [44:53.380 --> 44:57.860] So any sort of medical research documents: also banned. [44:57.860 --> 45:02.120] But also, really commonly, LGBTQ+ content: banned. [45:02.880 --> 45:13.320] What was particularly terrible about it is this: when kids are beginning to navigate their identity and think about things like, am I gay? [45:13.320 --> 45:19.360] Am I experiencing feelings that are different from my classmates'? [45:19.360 --> 45:29.680] They're often outed by these technologies, and not even to their parents, who can themselves be the threats of abuse in their social structure. [45:29.680 --> 45:33.660] It's to the school facilitators, the school administrators of these systems. [45:33.660 --> 45:34.940] These are not social workers. [45:34.940 --> 45:38.380] These are not people who are equipped to deal with these issues. [45:38.380 --> 45:45.520] These are people who are incentivized, because they can go to jail for it, to report those students to law enforcement. [45:45.580 --> 45:58.520] And actually it got so bad that Senator Elizabeth Warren's office released a report demanding that these companies be investigated for the abuse they were perpetrating on students.
[45:58.520 --> 46:07.280] And they found that the increased likelihood of contact with law enforcement was disproportionately high for queer students and students of color. [46:11.620 --> 46:13.200] Ah, okay. [46:13.420 --> 46:15.380] So that's a lot of doom and gloom. [46:15.380 --> 46:29.320] What I didn't include in the slides here, I should probably say, is that most of my work is OPSEC, privacy, and security advising for organizations or individuals, activists who are in these issue spaces. [46:30.840 --> 46:40.840] And one thing I have noticed, especially post-Roe, is cross-movement collaboration between these spaces. [46:41.420 --> 46:49.520] Because they share so many of the same vectors of surveillance and oppression that I covered before, a lot of shared strategies are emerging. [46:49.520 --> 46:55.040] And we're understanding that we share the same oppressors, the same bad guys, the same threats. [46:55.160 --> 46:57.120] So there is a lot happening. [46:57.120 --> 46:59.480] It's not totally hopeless. [47:00.700 --> 47:04.400] But I do want to say that these things are not inevitable. [47:04.400 --> 47:09.260] The creep of criminalization is not necessary. [47:10.000 --> 47:11.820] We don't have to have that. [47:12.400 --> 47:16.120] We may disagree on certain types of healthcare being a human right. [47:16.120 --> 47:17.600] And if so, you're wrong. [47:17.600 --> 47:20.560] But the creep of criminalization is undeniable. [47:20.780 --> 47:22.240] That is happening. [47:22.240 --> 47:24.560] It is including more and more people. [47:24.560 --> 47:25.940] And that should worry you. [47:25.940 --> 47:27.820] You should fight against it. [47:30.040 --> 47:43.960] So if you work at a tech company with any kind of leverage in the policy or product design space, or whatever that looks like in your work, do some things like reviewing your privacy policies.
[47:44.560 --> 47:46.660] What kind of encryption standards do you have? [47:46.660 --> 47:48.860] I mean, maybe you can't do full end-to-end. [47:48.860 --> 47:50.900] But maybe you have good encryption at rest. [47:50.900 --> 47:53.140] Maybe you have good data retention policies. [47:53.480 --> 47:58.540] What kind of documentation do you have in place for how to respond to law enforcement? [47:58.540 --> 48:09.980] Because if there's one thing that we have learned over the past 10 years in this space, it's how often companies that think they're not implicated in these issue spaces actually are. [48:09.980 --> 48:19.280] And they hand over data to law enforcement that results in people either going to prison or dying. [48:20.180 --> 48:22.660] So you could fight against that. [48:23.480 --> 48:25.160] And, you know, some basic stuff. [48:25.160 --> 48:27.760] Review what trackers your website uses. [48:27.780 --> 48:29.200] Consider, [48:29.840 --> 48:33.980] of course, removing the ad technology and the data brokers that are on your website. [48:34.000 --> 48:35.480] But there are some other options. [48:35.480 --> 48:38.700] Like privacy-forward alternatives to back-end analytics tools. [48:38.700 --> 48:42.440] Maybe go with Matomo instead of Google Analytics, and so on. [48:44.520 --> 48:50.640] And if you are a developer, and maybe you have less control in the product design or policy design space... [48:56.100 --> 48:57.940] just consider where you're working. [48:58.360 --> 49:01.620] Like, I know that the job market is pretty bad right now. [49:01.620 --> 49:05.060] But there are some questions you should be asking yourself as a worker in the tech space. [49:05.560 --> 49:08.800] Is it worth spending your life to make that company richer? [49:08.800 --> 49:11.000] Which industries are you benefiting? [49:11.000 --> 49:13.920] And who's being exploited to make those people richer?
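[Editor's aside on the Matomo-instead-of-Google-Analytics suggestion above: a minimal sketch of what the client-side swap can look like. The `_paq` command queue is Matomo's standard embed API; the self-hosted instance at `matomo.example.com` and site ID `'1'` are placeholder assumptions, not anything named in the talk.]

```javascript
// Minimal sketch of a Matomo embed replacing a Google Analytics tag.
// Assumptions: a self-hosted Matomo instance at matomo.example.com with
// site ID '1' (both placeholders). Visit data goes to a server you
// control, so the data retention policy is yours too, not Google's.
var _paq = (typeof window !== 'undefined' && (window._paq = window._paq || [])) || [];
_paq.push(['disableCookies']);   // optional: run cookieless
_paq.push(['setTrackerUrl', 'https://matomo.example.com/matomo.php']);
_paq.push(['setSiteId', '1']);
_paq.push(['trackPageView']);    // in the browser, matomo.js drains this queue
```

[In a real page you would also load `matomo.js` from that same self-hosted origin; the point is that the tracker endpoint, and therefore the data, stays under your control.]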
[49:14.060 --> 49:15.760] Is it worth it to you? [49:18.200 --> 49:21.480] And if you don't know what else to do or how else to help... [49:22.400 --> 49:27.140] you can help those of us whose job it is to do that kind of work. [49:27.140 --> 49:33.140] Go to act.eff.org or eff.org slash no. [49:33.720 --> 49:39.200] Those are places where we have an action center covering a slew of bad bills that are happening. [49:39.260 --> 49:43.780] Things in the same vein as FOSTA-SESTA are happening. [49:43.780 --> 49:45.300] More carve-outs to Section 230. [49:45.320 --> 49:50.500] Bad encryption backdoor bills, bad privacy rules, et cetera. [49:50.500 --> 49:53.060] We have a whole suite of things you can do there. [49:53.500 --> 49:55.840] And I should also say, just reach out. [49:56.200 --> 50:01.380] You can email intake@eff.org if you have any sort of legal questions. [50:01.380 --> 50:04.120] Or if you want to speak to the technologists, like myself. [50:04.960 --> 50:09.240] And you can just contact me, if you like, at daly@eff.org. [50:11.280 --> 50:12.060] Okay. [50:12.160 --> 50:14.040] That is the end of my presentation. [50:14.140 --> 50:15.720] We're a few minutes early. [50:15.720 --> 50:18.720] So I guess... is it okay if I take a question or two? [50:18.720 --> 50:19.120] Okay. [50:19.120 --> 50:19.740] Great. [50:20.280 --> 50:21.640] Anyone have any questions? [50:23.920 --> 50:24.800] Yeah. [50:43.150 --> 50:44.230] Good question. [50:44.230 --> 50:48.710] There is... you've kind of identified it, I guess, by the nature of your question. [50:48.710 --> 50:54.610] So the question was how to mitigate the dangers caused by these websites that are surfacing public records. [50:55.290 --> 50:59.830] Under the guise of, well, public records laws are for the public good. [50:59.830 --> 51:01.110] They're for the public interest. [51:01.230 --> 51:01.790] True. [51:02.850 --> 51:03.650] Yes.
[51:03.990 --> 51:05.770] This is what happens sometimes in this work. [51:05.770 --> 51:06.770] I don't know how to answer it. [51:06.770 --> 51:27.110] Except that this is something I encounter pretty often at EFF: there is an inherent tension between what should be publicized, what should be open and accessible information, versus what I think needs to be kept private for people's safety, especially for at-risk or traditionally oppressed peoples. [51:28.610 --> 51:30.110] I don't know. [51:30.530 --> 51:31.840] There are some groups who... [51:33.070 --> 51:34.850] No, I can't say that like... [51:34.850 --> 51:36.510] Well, how do I say this? [51:37.490 --> 51:48.290] There are some great security people, some great hackers, who have gone after some of the groups that have coordinated those things. [51:48.310 --> 51:52.790] So I didn't mention it, but there is the American College of Pediatricians. [51:52.790 --> 52:01.030] They also work with the ADF and the Do No Harm coalition, the anti-trans, anti-abortion people. [52:02.150 --> 52:20.670] The American College of Pediatricians had some insider threat in their organization, and from there a hacker got access to 2,600 pages of emails between them and lawmakers and other people at the Heritage Foundation and whatnot, talking about their strategies for how to do this stuff. [52:20.990 --> 52:26.250] So that is one way you could directly attack the bad actors. [52:26.250 --> 52:28.250] I'm not saying you should do it, because I, you know... [52:29.890 --> 52:31.210] But you could. [52:33.310 --> 52:34.250] Yeah. [52:37.390 --> 52:39.970] Oh yeah, you had good questions earlier. [53:00.830 --> 53:01.890] Great question. [53:04.370 --> 53:05.970] Great question. [53:06.250 --> 53:12.370] So the question is: what misinformation or disinformation is being used in these areas to push other agendas? [53:13.270 --> 53:15.650] Or, why are they doing it?
[53:16.590 --> 53:20.510] So, you know, I mentioned I had that spiel about 230. [53:20.530 --> 53:36.790] We see a lot of disinformation, willful disinformation spread by lawmakers in that space, because they want to carve away at Section 230, because it affords us as citizens, as people who are being governed, the right to free expression and privacy online. [53:36.790 --> 53:42.910] And so they keep giving us these bills. EARN IT was one that is still sort of in limbo. [53:42.910 --> 53:45.750] It's not quite dead, but it may come back. [53:45.750 --> 53:46.370] Who knows? [53:46.370 --> 53:48.450] There's the Kids Online Safety Act. [53:48.450 --> 53:56.170] These are bills made to garner public sympathy along the lines of: kids are being abused online. [53:56.170 --> 53:58.050] And, yeah, that's terrible. [53:58.050 --> 54:02.050] But taking away our privacy and free expression is not going to stop that from happening. [54:02.050 --> 54:11.050] But they use this disinformation, or misinformation, around how the systems work and what privacy supposedly does to afford bad actors safety. [54:11.050 --> 54:18.630] And they're trying to sell us, basically, them having even more control and warrantless access over our lives. [54:18.630 --> 54:26.550] So that's a really deliberate tactic taken by lawmakers and think-tanky people who are advocating for these things. [54:29.670 --> 54:35.270] And if you go to act.eff.org, you'll find our action center on how to fight KOSA and the rest. [54:35.270 --> 54:36.630] So please do that. [54:42.460 --> 54:43.520] Okay, great. [54:44.280 --> 54:45.340] No more questions. [54:45.460 --> 54:47.520] I'm going to unplug, and I'll be around. [54:47.520 --> 54:48.580] Please feel free to ask questions. [54:48.580 --> 54:52.120] Oh, and go over to the EFF booth, they might still be open.
[54:52.400 --> 54:55.000] Just across from the registration desk, you can learn more there.