Developer-friendly cryptography is what I'm going to get started on this morning. How many of you would call yourselves developers in some capacity? All right. How many of you work with developers? A couple more. Okay, very good. Because we have this small screen, I was moving things around last night to make it larger, but certainly feel free to come forward. This link at the top will get you a copy of the slide deck in PDF format, so if you want to follow along on your laptop, feel free.

My name is Bryce Williams. I work for a consulting firm called SysLogic, based here in Milwaukee, and I lead their managed security services team. My group provides application security guidance on a variety of topics for organizations. We review lots of code — source code assessments and security assessments on everything from large systems to small embedded devices — and we train thousands of developers. That takes me all over the place, talking to developers specifically about application security concerns, which includes cryptography. Personally, I've been in the field for 20 years, first as an application developer and architect, then transitioning into the application security space. Everything I do is around application security and working with the developer community.

I want to start with this quote from Dr. Neumann, which may be something you've seen before.
There are several different variations of it. This was a statement Dr. Neumann gave in 2001 for a New York Times article. The article discussed an announcement by a Harvard professor of a new type of "unbreakable" encryption technique that he and his team had put together, based on a key drawn from a stream of random numbers. It was a pretty novel technique, and from an academic and theoretical standpoint it was actually pretty cool. The point Dr. Neumann and some other commenters made was: this is really great from that perspective, but we often see weaknesses when it comes to the implementation of truly great cryptographic techniques. Keep that in mind as I go through this discussion: you can have great cryptography, but putting the pieces together — actually, practically putting it into place — is where we see weaknesses.

To get started, I want to look at cryptographic best practices for 2018. Don't worry if these are unfamiliar to you or if you can't read them. Now that we're in 2018, we have some updated cryptographic best practices. If you were in Zach Grace's talk yesterday, there's a little bit of overlap between what he covered and what I'm talking about today.
The best practices are similar because they're derived from the same sources — cryptographic professionals who know this topic inside and out. So we have things like: random data from kernel-based CSPRNGs, your cryptographically secure random number generators. Use of authenticated symmetric encryption. Symmetric signatures using HMAC. With your hash functions, make sure you're using ones that avoid length extension attacks — that's an important one. Some of this I'll come back to, but at a high level, these are the best practices that cryptographic professionals promote and encourage folks to use. Then: password storage with password-based KDFs. For asymmetric encryption, preferring the use of elliptic curve cryptography, ECC, over RSA. For asymmetric signatures, preferring EdDSA or deterministic ECDSA per RFC 6979. And then secure communications — TLS everywhere, or something along those lines: a good end-to-end encrypted communication stream using TLS 1.2 or now 1.3.

I forgot I had this. I want to get us started with field observations. These are examples my team and I reviewed — things we have uncovered in our assessments. Actually, hold on: let's start with some common mistakes first, then we'll get into the actual examples. Some of the things we have observed fall into these categories. Things like weak password storage.
Use of reversible encryption — this is all too common. AES or RSA used for storage of passwords: not ideal. Use of hash functions without salts: a little better, but again, not ideal. Insufficient work factor to prevent brute force: often there might be some work factor utilized, or what we call a KDF, a key derivation function, but not enough thought has been put into the work factor.

We also see poor key management. Keys not stored securely, hard-coded, or placed in source control — a pretty common issue. Loose access control or overshared keys. Lack of granular key usage and periodic rotation. All sorts of related key management issues.

We also see a general lack of authenticated encryption. If you're not familiar with that term, don't worry — most developers aren't either. They've never heard of authenticated encryption and don't realize it's something they need to be concerned about. In their minds, they've heard of AES and that's good enough — let's just use AES, because we've heard that's military grade. It's a good start; AES isn't necessarily bad, but you need to know a little bit more than that. Or they may know they need authenticated encryption, so they go down the path of developing their own authenticated encryption construction based on something — but it's not the preferred encrypt-then-MAC approach.
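The "hash functions without salts" problem mentioned a moment ago is easy to see in a small sketch (Python here for brevity; the password value is made up): without a salt, two users with the same password store identical digests, which makes precomputed-table attacks and cross-user correlation trivial.

```python
import hashlib
import os

password = b"hunter2"

# Unsalted: every user with this password stores the exact same digest,
# so one rainbow-table lookup (or one cracked hash) breaks them all.
user_a = hashlib.sha256(password).hexdigest()
user_b = hashlib.sha256(password).hexdigest()
assert user_a == user_b  # identical, and linkable across users

# Salted: a unique random salt per user makes the stored digests differ
# even for identical passwords.
salt_a, salt_b = os.urandom(16), os.urandom(16)
salted_a = hashlib.sha256(salt_a + password).hexdigest()
salted_b = hashlib.sha256(salt_b + password).hexdigest()
assert salted_a != salted_b
```

A plain salted hash is still far too fast for password storage, which is exactly the "insufficient work factor" point: for real systems you'd reach for a password-based KDF such as PBKDF2, bcrypt, scrypt, or Argon2.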
We also see misuse of keys, initialization vectors, and nonces: reuse of keys, hard-coded keys, some sort of strange obfuscation used to protect them. Use of passwords as encryption keys — another big issue. A password is not designed to be an encryption key. You can turn a password into an encryption key, but a password itself should never be used directly as one. Also weak or non-random values for keys, IVs, and nonces, or simply reuse of values — there are certain constructions where reuse of a nonce is a big no-no.

So here's my first example: use of "not encryption," as I call it. If you're in the back, you might not be able to read all of this, but essentially — this is C code; in fact, all of my code examples are in C, from actual assessments that were performed — we ran across this function, I think I ran across this one, called encrypt password. Just skimming through it, I knew right away we had a problem, because there's no mention of any cryptographic algorithms or ciphers, and there's no encryption key. There's just this interesting line that's essentially shifting some bits around. That's not encryption; that's obfuscation, essentially. And you might think, well, this is a pretty crazy example — this wouldn't exist that often in the field, or maybe it's super old code.
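A hypothetical reconstruction of that kind of "encrypt password" function (sketched in Python rather than the original C; the function names and the exact bit-twiddling are invented for illustration): there's no cipher and no key, just a fixed byte rotation, so anyone who can read the code can invert it with no secret material at all.

```python
def encrypt_password(password: bytes) -> bytes:
    # "Encryption" that is really obfuscation: rotate each byte left 3 bits.
    # No algorithm, no key -- just bit shuffling.
    return bytes(((b << 3) | (b >> 5)) & 0xFF for b in password)

def recover_password(obfuscated: bytes) -> bytes:
    # The inverse rotation. Note that no key is required to reverse it.
    return bytes(((b >> 3) | (b << 5)) & 0xFF for b in obfuscated)

stored = encrypt_password(b"hunter2")
assert stored != b"hunter2"                      # looks scrambled...
assert recover_password(stored) == b"hunter2"    # ...but trivially reverses
```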
But I think you might be surprised at how often you run across examples of encryption that look something like this.

In this next example, we have a random number generator being seeded, or initialized. The comment says it all: "Need to seed the RNG with a hash of the MAC address." If you can read the code here, it's doing a Murmur2 hash over some pieces of a MAC address. And this is just messed up. A MAC address is a static value for a device. A random number generator, on the other hand, can't be deterministic; it needs to be unique and unpredictable. So a MAC address should never be used to seed a random number generator — it shouldn't be any part of it.

We also see a lot of what I call overly complex encryption: just extra layers of things. I remember running across this one for the first time, right away going, what is going on here? — and then breaking down and reverse engineering the logic to understand what was happening. It looked like this: generate a random value; store that random value in a database; split the random value in two; combine the second half with the plaintext; reverse the combined value; hash it with SHA-256; and on and on. One of my favorite sections is line two here, where it creates a UUID, takes its five parts, reorders them, and puts it back together. And it's like — who dreamed this up? And why?
Essentially, that second step is the cause for concern. There's an actual weakness in this approach, apart from the fact that it's just weird: the random value should never have been stored in the database alongside the encrypted data. I forgot to mention — this was for storing data in a database using reversible encryption; they wanted to be able to recover the plaintext later. So they used this approach, which on its face is crazy. But the reason I think the designers put this in place is that storing the random value in the database maybe felt wrong to them. They knew there was some sort of weakness, so they added additional layers of obfuscation, essentially to make it more difficult to reverse engineer. Obviously, we figured it out, and an attacker with enough information or access could do the same thing — ultimately reverse engineering it and recovering the plaintext values stored in the database. With access to the entire database, they'd have all the detail they need to recover the encrypted information. So any time I see extra layers of complexity in an encryption technique, it clues me in that it warrants some study, because there's probably a reason those extra layers are there: they're hiding something.

Here's another example. It's not entirely cryptography-based, but it comes from looking at a system.
This is a cloud-hosted, typical web system with a database backend, an API front end, and a JavaScript UI. We highlighted three different issues in this particular implementation. First, missing authentication on the API endpoint: there's a function called "get two random security questions by user" that allowed an anonymous user — no authentication — to ask, based on a user ID, for two of that user's three secret questions and answers. Then, interestingly enough, when I saw this I realized we had a big problem: validation of the answers given by the end user was performed on the client side. The JavaScript was what actually decided whether they had input the correct answer for a question — yes or no. Obviously an attacker could bypass that step. And this was a brand new system — I think we looked at it about two months ago — a brand new feature they had added, so clearly the team wasn't really thinking about the security aspects. My third point is that even the backend had issues: the answers to the secret questions were stored in the database using symmetric, reversible encryption. Ideally, you want to store secret question answers the same way you store passwords, using a key derivation function, because there's really no reason to recover the actual plaintext of an answer. You only need it for comparison purposes.
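That last fix can be sketched with Python's stdlib PBKDF2 (the iteration count and the answer normalization here are illustrative choices, not a recommendation): store a salt plus a derived digest, and only ever compare — never decrypt.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # work factor: tune for your hardware

def store_answer(answer: str) -> tuple[bytes, bytes]:
    # Normalize so "Fluffy " and "fluffy" match, then derive with a fresh salt.
    normalized = answer.strip().lower().encode()
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", normalized, salt, ITERATIONS)
    return salt, digest  # persist both; the plaintext is never recoverable

def verify_answer(candidate: str, salt: bytes, digest: bytes) -> bool:
    normalized = candidate.strip().lower().encode()
    candidate_digest = hashlib.pbkdf2_hmac("sha256", normalized, salt, ITERATIONS)
    # Constant-time comparison, done server-side -- never in client JavaScript.
    return hmac.compare_digest(candidate_digest, digest)

salt, digest = store_answer("Fluffy")
assert verify_answer("  fluffy ", salt, digest)
assert not verify_answer("rex", salt, digest)
```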
So we looked at a few issues my team and I have uncovered, and highlighted some basic problems. If you compare those against the cryptographic best practices, we're way off the mark. We keep uncovering the same basic issues over and over — never mind whether developers are able to address the more advanced, modern topics; in general, they're not there yet.

This slide looks at recent cryptography issues, in this case some items highlighted by Matthew Green, a professor at Johns Hopkins who is pretty well known in the cryptography space. So what did he highlight? DUHK: FortiGate hardcodes a key that makes every VPN session crackable. You might remember that scenario — these issues came out in the press within the past two or three years. A pretty big deal, and all related to cryptography mistakes. Next: Juniper hardcodes a similar key and then gets hacked by the Chinese, who change that key to one of their own choosing. That was obviously a big deal. Every major browser manufacturer and a number of websites make TLS vulnerable to practical decryption attacks — that was the FREAK attack. And similarly, browsers and websites make TLS vulnerable to practical decryption attacks yet again with the DROWN attack. Apple uses crypto wrong in their iMessage encryption for a billion users — that was a pretty big deal. And remember, all of these are big manufacturers.
These are the folks with big, dedicated security teams and FIPS certifications. So what kinds of issues is everyone else running into? Obviously my team and I have seen some, including the examples I highlighted. Many of Green's items are more complicated concerns — more advanced, subtle mistakes. Subtle mistakes are certainly bound to happen; it's the basic mistakes we see so often that concern me.

So why are mistakes so common? Here are three metrics from different studies, just to back up the statement that cryptographic issues are a concern. 61.5% of applications scanned had one or more cryptographic issues — that's from Veracode's report last year, and they scan a lot of code. 66% of the most popular cryptocurrency mobile apps: there was a study of cryptocurrency mobile apps, and 66% of them contained hard-coded sensitive data, including passwords or API keys. That's really unfortunate. And then look at this last one: it highlights that 17% of the bugs examined were in the cryptographic libraries themselves, whereas the remaining 83% were misuses of cryptographic libraries. That last point is what I want to emphasize. Cryptographic libraries certainly can have mistakes, and you need to be mindful of those libraries and make sure you're using the latest versions — and I'll talk more about libraries.
But the implementation of cryptography is more often where we see problems being introduced. So let's look at a few developer challenges — again, why are mistakes made? What do developers have to fight with? For those of us who do development, this will ring true. You've got to make it work: it needs to work first. You have to ensure the product actually works before you make it secure — I'd argue you need to do both at the same time, but clearly the product has to work; that's important. You also have to meet delivery dates, and often delivery dates take priority over adding a certain level of security or getting extra expertise, another pair of eyes, to review it. You might have performance or usability concerns as a result of certain security choices, and maybe performance or usability wins out. There might be inadequate security testing — maybe security controls don't get tested with sufficient expertise. Lack of crypto knowledge: developers might not get the training they need, or access to knowledge sources that give them accurate information about cryptography. It might also be a problem of poor library or API support, either in the crypto libraries being utilized or in the programming languages that provide cryptographic APIs.

I want to look at those last two items: training and API use.
Just to highlight a few things: first, I looked at the top 10 cryptography courses on Pluralsight. This is not a dig at Pluralsight — they're a beverage sponsor here at CypherCon — and you'll find this issue with others as well. If you're not familiar with Pluralsight, they're an excellent online video-based training vendor, with thousands of different technical training topics available. So I looked at the top 10 courses that come up when you search for cryptography — the ones focused on cryptography — and found some interesting highlights. They generally provide good history on cryptography and basic concepts, but generally lack practical engineering guidance. There's no discussion of authenticated encryption — remember, I mentioned authenticated encryption, AE; that's a pretty big deal, that's important — no discussion at all. Also no discussion of secure key management or key storage options. No mention of kernel-based cryptographic random number generators versus user-space RNGs. Two of the ten courses made a passing mention of ECC, but they didn't get into any details: why would you want to use ECC versus RSA, what are some preferred curve choices, that sort of thing. I think some of it is just because the content is a little dated in some of them. Not to say this couldn't be addressed, but it's indicative of the cryptographic training material that's out there.
Whether it's Stack Overflow posts or training materials like these, you have to be careful, because knowledge is good, but when you give people wrong or dated knowledge, it can actually work against you. And so on and so forth: nearly all the hash examples use MD5, which we know is broken. And there's quite a bit of odd advice — things like "you should double your PBKDF2 iterations every year," which I've never seen anywhere else; I don't know where that came from. Or "maybe you should use this obscure Tiger hashing function instead of SHA-256, because SHA has been shown to have weaknesses in the past" — SHA-1, for example. That's the kind of stuff I don't want developers to focus on — thinking, oh, maybe I should double my PBKDF2 iterations every year, until they've got a gazillion iterations and it doesn't work anymore.

So, what are some popular cryptographic libraries? These are general-purpose cryptographic libraries that, in my opinion, provide too many options to the average developer: Botan, Bouncy Castle, Crypto++, Libgcrypt, OpenSSL, wolfCrypt — just a few of them, of course. You may be familiar with some of these; you may use some of them. It's not that they're bad — they just provide a lot of different options: encryption algorithms, hash functions. I may not have these numbers completely accurate; I went through the documentation and tried to summarize things based on interfaces and API endpoints.
So you, as a developer, go in and have to choose a hash function — and you've got 27 different choices in Libgcrypt. Signature schemes: for whatever reason, Bouncy Castle goes crazy on the signature schemes at your disposal. As a developer, that increases your chances of choosing something that may not be in your best interest. Oops.

Then there are landmines in cryptographic libraries. In addition to having lots of options, they also have some bad or insecure options you need to be aware of. Things like RSA with insecure padding — an area where cryptographers have a lot to say. Not only should you maybe not use RSA at all and prefer ECC; if you do need RSA, make sure you're using preferred padding. A lot of implementations provide insecure padding as the default, and all of these libraries provide at least the option of RSA with insecure padding. AES in ECB mode — we know that's faulty. Broken ciphers — RC4, for example, or RC2; really outdated, clearly broken ciphers are there in every single one. A lot of it is to provide backwards compatibility or interoperability. But if those options are there, a well-meaning developer might pick one — maybe they've copied something off the Internet. I was looking at a blog article just the other day where someone was recommending algorithms that were basically defunct in the mid-90s. Same with hash functions: old, broken hash functions.
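The ECB problem just mentioned is easy to demonstrate. The stand-in "block cipher" below is just a keyed hash truncated to 16 bytes — not a real cipher — but it exposes the structural flaw, which belongs to the mode rather than the cipher: ECB encrypts each block independently, so identical plaintext blocks produce identical ciphertext blocks, leaking patterns in the data.

```python
import hashlib

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Stand-in for a 128-bit block cipher: deterministic per (key, block).
    return hashlib.sha256(key + block).digest()[:16]

def ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # ECB mode: each 16-byte block encrypted independently -- no IV, no chaining.
    assert len(plaintext) % 16 == 0
    blocks = [plaintext[i:i + 16] for i in range(0, len(plaintext), 16)]
    return b"".join(toy_block_encrypt(key, b) for b in blocks)

key = b"sixteen byte key"
ciphertext = ecb_encrypt(key, b"ATTACK AT DAWN!!" * 2)  # two identical blocks
# The repetition in the plaintext is plainly visible in the ciphertext:
assert ciphertext[:16] == ciphertext[16:32]
```

This block-level determinism is exactly what produces the famous "ECB penguin" image, and why any chained or counter-based mode — ideally an authenticated one — is preferred.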
Then there are user-space RNGs. All of these libraries — other than wolfCrypt, I guess — have this user-space RNG concept that can get you into trouble without your even realizing it, because it's the default behavior.

Another thing is implementation challenges. Here's a specific example around the encrypt-then-MAC approach. Say you have some knowledge — you know you need to use authenticated encryption — but you're working with a library or programming language that doesn't provide it. You can build it yourself with the built-in primitives, so you put together an encrypt-then-MAC construction. To do it correctly, though, you have to keep all of this in mind. You've got to use a different key for encryption than for authentication, preferably deriving both from a single master key using a KDF. You have to make sure all your string comparisons run in constant time — you can't use standard string comparison functions. You definitely have to use an HMAC and ensure it covers the ciphertext, your additional authenticated data, the initialization vector, and the encryption method — all of those have to be packaged inside the HMAC. And the fields passed into the HMAC must use a format that unambiguously delineates them. If you leave out one of those fields, or don't use a correct format, you are essentially introducing a weakness.
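The MAC side of that checklist can be sketched with Python's stdlib (the encryption step itself is omitted here, and the two-label key derivation is a simplification of a real KDF like HKDF): separate keys, length-prefixed fields so the encoding is unambiguous, and a constant-time tag comparison.

```python
import hashlib
import hmac
import struct

def derive_keys(master_key: bytes) -> tuple[bytes, bytes]:
    # Simplified labeled derivation; a real design would use HKDF.
    enc_key = hmac.new(master_key, b"encryption", hashlib.sha256).digest()
    mac_key = hmac.new(master_key, b"authentication", hashlib.sha256).digest()
    return enc_key, mac_key

def encode_fields(*fields: bytes) -> bytes:
    # Length-prefix every field so (b"ab", b"c") can't collide with (b"a", b"bc").
    return b"".join(struct.pack(">I", len(f)) + f for f in fields)

def compute_tag(mac_key: bytes, method: bytes, iv: bytes,
                ciphertext: bytes, aad: bytes) -> bytes:
    # The HMAC covers the method, IV, ciphertext, and additional data together.
    return hmac.new(mac_key, encode_fields(method, iv, ciphertext, aad),
                    hashlib.sha256).digest()

def verify_tag(mac_key: bytes, method: bytes, iv: bytes,
               ciphertext: bytes, aad: bytes, tag: bytes) -> bool:
    expected = compute_tag(mac_key, method, iv, ciphertext, aad)
    return hmac.compare_digest(expected, tag)  # constant-time, never ==

# The unambiguous encoding is the easy part to get wrong:
assert encode_fields(b"ab", b"c") != encode_fields(b"a", b"bc")
```

Even this sketch has to get several details right at once, which is the point: a ready-made AEAD mode spares you the whole exercise.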
And you don't necessarily realize it, because creating these encrypt-then-MAC constructions isn't your bread and butter. I've seen so many examples where well-meaning developers know they need to go down the path of authenticated encryption and try to put all this together — and admittedly, it's complicated.

We also have weaknesses in standards and protocols. Many of you may have worked with the JavaScript Object Signing and Encryption (JOSE) suite of standards. Take JSON Web Signatures, JWS: even a compliant implementation — one that meets the standard correctly — is vulnerable to an attacker passing in the "none" or HS256 algorithms when in reality you're expecting an RSA signature, and it will happily work with those. It's a fairly well-known attack vector by now. So you want to make sure you're using not just a standards-compliant implementation, but one that's aware of these weaknesses and has put in extra safeguards to protect against them. Same thing with JSON Web Encryption: JWE allows that RSA-with-insecure-padding choice, which is unfortunate. It also allows ECDH with NIST curves, which introduces the risk of invalid-curve attacks — a little more rare, but something to be concerned about. It's unfortunate that these choices exist in the standard.

Then there's OAuth 2; you may be familiar with it.
Technically it's an authorization protocol, but it's used in the authentication and authorization space, and it's a great standard. It does, however, provide several weak workflows that ideally should never be used in modern systems — and I see this all the time: use of the client credentials and password grant types where they aren't recommended. There's also an optional state parameter — optional in the standard, but it really should always be used. And there's no explicit access token specification in OAuth 2, so you'll see some interesting implementations as a result; there's just no specific guidance. Good implementations of OAuth 2 do a great job, but they're going above and beyond the standard because they have additional knowledge and experience. I always recommend that anyone working in the OAuth 2 space read the Threat Model and Security Considerations RFC — if you search for "OAuth 2 threat model," you'll find it. It goes into a lot of detail about particular threats and countermeasures you should be aware of. But most developers just aren't aware of it, and as a result they don't think about certain types of attacks in this space.

So I ask myself this all the time: should developers stop using cryptography? We do a lot of training, and at one point I asked a client: we've done all these assessments.
We see all these, you know, consistent issues with [25:41.000 --> 25:46.060] cryptography. Maybe it makes sense to start telling your developers to stop doing cryptography. [25:46.840 --> 25:51.360] Because clearly, at least in everything I'm looking at, no one's doing it right. So does that make [25:51.360 --> 25:55.780] sense? Should we tell developers to stop using cryptography? Clearly, you know, one pro of that [25:55.780 --> 26:01.080] approach is to have fewer security implementation flaws. The con is, of course, then your systems will [26:01.080 --> 26:04.380] lack even basic protections. They're not gonna use any kind of cryptography. And so then [26:04.380 --> 26:09.800] it's more obvious that you've got, you know, a lack of security protection. So what if security [26:09.800 --> 26:14.120] pros do all the crypto, you know, cryptography work instead? And in certain organizations, that [26:14.120 --> 26:19.620] might make sense. You might be able to do that. But generally, security pros don't scale well. You [26:19.620 --> 26:25.580] can't have, you know, one or two people writing all of this code. You know, anything involving [26:25.580 --> 26:31.960] cryptography, which admittedly is an increasing amount of features and system aspects these days [26:31.960 --> 26:36.860] that need to take advantage of crypto. So that's just gonna slow down progress if you have, [26:36.860 --> 26:47.830] you know, just folks with specific expertise take care of that. So, assuming we can't do that, what [26:47.830 --> 26:53.530] are some other options? What is this idea of developer-friendly cryptography? Has anyone ever [26:53.530 --> 26:57.650] heard of that concept before? Developer-friendly cryptography? I don't think I dreamed it up. Not [26:57.650 --> 27:05.650] you. So how to define developer-friendly? Here are four different things that, to me, mean it's [27:05.650 --> 27:10.630] developer-friendly. Like, takes it to that next level.
There's no need to select ciphers or key [27:10.630 --> 27:18.030] sizes. There is automatic generation of encryption key, initialization vector, salt, and nonce values. [27:18.310 --> 27:25.710] There are simple, clear APIs that provide high-level outcome-based functionality. So rather [27:25.710 --> 27:32.250] than using cryptographic primitives, where, you know, you have to choose ciphers and algorithms, you [27:32.250 --> 27:37.730] instead work at a high level. I'll show you some examples of what I mean by this. And also [27:37.730 --> 27:45.050] important, there are no insecure or low-security options. So as much as possible, [27:45.050 --> 27:54.170] you avoid the scenario where developers shoot themselves in the foot. So, to start with, let's look [27:54.170 --> 27:59.970] at some developer-friendly libraries. Those that I feel are kind of candidates for this concept of [27:59.970 --> 28:06.170] developer-friendly. We've got Libsodium. And that's an important one to remember. If you remember [28:06.170 --> 28:12.570] anything from this talk, remember Libsodium. It is a cross-platform, compilable, you know, module [28:12.570 --> 28:19.930] based on the NaCl package. It's got language bindings for most languages out there. It's very [28:19.930 --> 28:26.510] popular in the security professional and cryptographer space. It's generally easy to work [28:26.510 --> 28:31.610] with from a developer standpoint. And it supports things like authenticated encryption, digital [28:31.610 --> 28:40.190] signatures, you know, hashing. It's performance optimized. And it doesn't give you so many options; [28:40.190 --> 28:45.030] I mean, it gives you enough options as a developer to get most things done. If you need [28:45.030 --> 28:48.970] something certainly more advanced or more unique, you may have to move to a more general purpose [28:48.970 --> 28:54.330] library.
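To make outcome-based concrete: the outcome a developer wants is "store this password safely," not "pick a KDF and tune it." Here's a minimal sketch of that API shape using Python's stdlib scrypt; the wrapper names and cost parameters are illustrative (in Libsodium, crypto_pwhash plays this role):

```python
import hashlib
import hmac
import os

# Illustrative scrypt cost parameters; a developer-friendly wrapper pins
# vetted values so callers never have to choose them.
_N, _R, _P = 2**14, 8, 1
_MAXMEM = 64 * 1024 * 1024  # headroom for the ~16 MiB scrypt needs here

def hash_password(password: str) -> bytes:
    """Outcome-based API: hand in a password, get back an opaque record.
    Salt generation is automatic, not a caller decision."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=_N, r=_R, p=_P, maxmem=_MAXMEM)
    return salt + digest

def verify_password(password: str, record: bytes) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    salt, digest = record[:16], record[16:]
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=_N, r=_R, p=_P, maxmem=_MAXMEM)
    return hmac.compare_digest(candidate, digest)
```

Notice there is no way to call this with a weak salt or to skip the constant-time comparison; the insecure paths simply don't exist in the API.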
But this is the type of library that I feel comfortable recommending to any developer in [28:54.330 --> 29:00.070] any sort of environment that they're working in. Libhydrogen is kind of like the younger brother [29:00.070 --> 29:08.210] of Libsodium. Designed for the embedded system space. It's written in C, C99. So it, you know, [29:08.210 --> 29:15.550] it's fairly versatile. It has the same general API, the same interfaces, as Libsodium. But implemented [29:15.550 --> 29:20.970] with essentially just two cryptographic primitives. So that keeps it very small, very lightweight. Not [29:20.970 --> 29:25.530] as tried and not as well tested as Libsodium at this point. But definitely something that you [29:25.530 --> 29:30.630] should keep your eye on if you're working in the embedded system space. Monocypher is another one. [29:30.630 --> 29:36.970] Pretty nice, though not something I would necessarily recommend over the other two. But at the [29:36.970 --> 29:40.810] same time, if I was, you know, performing an assessment against a system and I saw use of [29:40.810 --> 29:46.910] Monocypher, Tink, or ASP.NET Core, I would feel more comfortable than with a general-purpose library. [29:47.410 --> 29:52.770] Tink is another library put together by some Google engineers that is designed, again, to provide that [29:52.770 --> 29:58.230] high-level kind of outcome-based functionality. It doesn't have password hashing, and it's [29:58.230 --> 30:02.870] not as performance optimized. And it only works with Java and C++. But it's still a decent option. [30:02.910 --> 30:06.650] And then I also want to highlight, you know, ASP.NET Core, which is an interesting one in this [30:06.650 --> 30:12.790] list, in that it is, you know, only a subset of Microsoft's new .NET Core framework. And it, again, [30:12.790 --> 30:16.750] only provides authenticated encryption and password hashing.
And it does provide some key [30:16.750 --> 30:21.610] management functionality as well. But it's kind of nice in that it's built-in. So for developers [30:21.610 --> 30:26.250] that can leverage this, I like the direction Microsoft is going with this cryptographic API. [30:27.450 --> 30:34.730] So here's an encryption example from Libsodium. This is C code again. You don't really have to [30:34.730 --> 30:40.530] read the detail and you probably can't if you're in the back. Other than that it is fairly simple. You'll [30:40.530 --> 30:45.590] notice, if you, you know, view this later, for example, there's no mention of any cryptographic [30:45.590 --> 30:53.470] algorithms here. No choice of cipher. No key size selected. No, you know, worrying about correctly generating an [30:54.010 --> 30:59.130] initialization vector or nonce. I mean, you still have to, you know, create the [30:59.130 --> 31:05.190] key and create the nonce for the encryption. But it provides functions to do this in a very [31:05.190 --> 31:13.170] simplistic manner. Where a developer is not generally going to make, you know, a wrong [31:13.170 --> 31:23.960] selection. Here's Libhydrogen. This is a public key signature example. Where again, it's [31:23.960 --> 31:30.940] creating a public-private key pair and using that to generate a signature to sign some [31:30.940 --> 31:37.420] data and then later verify the signature. So again, very straightforward, very simplistic for a [31:37.420 --> 31:48.910] developer to leverage. So that's all great. But what about FIPS 140? Anyone familiar with FIPS [31:48.910 --> 31:56.590] 140-2? A couple folks. So FIPS stands for the Federal Information Processing Standard. And [31:56.590 --> 32:01.890] publication 140 has to do with this particular space, and the two means it's version two. [32:01.890 --> 32:07.970] That's the current version of FIPS 140.
So cryptographic libraries can be FIPS 140-2 level [32:07.970 --> 32:13.810] one validated. So the software library itself can be level one validated. Level two validation is [32:13.810 --> 32:20.530] reserved for hardware devices. But that validation requires some time and money. You actually [32:20.530 --> 32:28.010] have to ship your, you know, your library off for testing under the CMVP, the Cryptographic Module Validation Program. They run it [32:28.010 --> 32:32.570] through its paces, you know, look at a whole bunch of things with it. Obviously it takes a lot of [32:32.570 --> 32:37.690] time and you have to pay them. But then you get a certificate that says, okay, we tested this [32:37.690 --> 32:41.990] particular library on this specific set of hardware, and it actually will list out, you know, maybe [32:41.990 --> 32:47.750] two or four different hardware-based environments. You know, this particular one, like [32:47.750 --> 32:54.550] it's Windows 2008 with this particular version on it. So they certify that it will work properly [32:54.550 --> 33:02.670] according to FIPS-approved standards in that particular environment. Use of FIPS validated [33:02.670 --> 33:07.770] modules is mandatory for US and Canadian government agencies. Others may use it as well. But [33:07.770 --> 33:11.950] definitely if you're working in the federal government space, this is most likely going to [33:11.950 --> 33:16.910] come up for them. It's important that you have a FIPS validated library that you're utilizing. [33:16.910 --> 33:20.210] It doesn't mean you have to create one and get it validated. It just means that any of your [33:20.210 --> 33:24.890] cryptographic choices, you know, your cryptographic code, is utilizing a library that's gotten this [33:24.890 --> 33:30.130] sort of validation. Keep in mind that FIPS validation doesn't necessarily mean that it's [33:30.130 --> 33:33.190] more secure. That that library is more secure.
It just means it's gone through this validation [33:33.190 --> 33:39.970] process. It's been reviewed essentially by the CMVP against FIPS standards. So if you are [33:39.970 --> 33:45.270] working in this space, you do need a FIPS validated library. But clearly you can't use any of the, [33:45.270 --> 33:48.170] well, I shouldn't say clearly because I didn't state it, but any of those libraries we looked [33:48.170 --> 33:52.630] at before, Libsodium, Libhydrogen, none of those are FIPS validated libraries. And there's reasons [33:52.630 --> 33:56.170] for that. You know, as open source libraries, they're really not in a position where they want or need to [33:56.170 --> 34:01.510] pursue, you know, this validation. Generally you're looking [34:01.510 --> 34:05.650] at a commercial library that's gone through this process in this case. So they're out there. [34:05.650 --> 34:11.890] There's some good libraries like WolfCrypt for example, NanoCrypto by Mocana. Even OpenSSL has a [34:11.890 --> 34:17.170] FIPS validated version of the library that you could look at. I do recommend that if you need [34:17.170 --> 34:23.170] one of these, that you take the time to consider writing a developer-friendly wrapper around it. [34:23.250 --> 34:28.350] Similar to, you know, providing interfaces similar to Libsodium or others. Or maybe get some [34:29.250 --> 34:34.150] expertise, some outside help for example, to put this wrapper together for your developer [34:34.150 --> 34:40.050] group. That way they're not having to worry about the specific cryptographic choices and so on. [34:43.610 --> 34:48.650] Also, key management best practices. We talked a little bit obviously about, you know, general [34:48.650 --> 34:53.770] encryption and cryptographic function best practices. Key management is another area that I [34:53.770 --> 35:02.350] mentioned is a fairly big weakness.
Developers in code that I've reviewed and my team [35:02.350 --> 35:07.850] has reviewed, they can at times do a very good job of putting together cryptographic functions. [35:07.990 --> 35:13.950] But then the keys are, you know, stored in source control, or they're even, you know, [35:13.950 --> 35:19.150] hard-coded in the system itself. So, some best practices around key management. If we had an [35:19.150 --> 35:24.750] ideal system, it would have all of these. No key or secret should be stored in clear text. Keys [35:24.750 --> 35:30.370] should have a defined limited lifetime based on usage. The keys should be refreshed automatically [35:30.370 --> 35:35.270] where possible. There should be a method to manually revoke keys. You always want a way that, [35:35.270 --> 35:38.590] you know, once you generate a key, you can revoke it. Even if it's, you know, kind of a [35:38.590 --> 35:45.190] manual process that you have to go through. Keys should never have an unlimited lifetime. Access [35:45.190 --> 35:49.610] to clear text keys should be limited through authorization or permission. So, being able to [35:49.610 --> 35:57.210] place some access control around it is important. [35:57.210 --> 36:05.290] And all key lifecycle and [36:05.290 --> 36:10.610] access events should be audited. So, ideally, anytime there's, you know, access to a key, you [36:10.610 --> 36:15.710] know, you pull it out of a vault, for example, there's a log event that occurs recording, you know, this [36:15.710 --> 36:22.450] particular application or this particular user. You see this a lot with code signing.
You know, [36:22.450 --> 36:27.970] if you've got a code signing kind of process set up within your organization, you [36:27.970 --> 36:33.450] generally want to have a more robust handling of the key material around use of code signing. So, [36:33.450 --> 36:42.460] only certain individuals have access to that. That usage is fully audited and so on. It sounds [36:42.460 --> 36:48.300] like we're in an elevator. So, some key management solutions. You might not be able to read this [36:48.300 --> 36:53.780] slide, but I'll highlight a few solutions that are, you know, some pre-made kind of solutions [36:53.780 --> 36:58.740] rather than having to roll your own, which is generally not ideal. So, you've got HashiCorp [36:58.740 --> 37:03.980] Vault or Keywhiz. These are good for distributed systems, you know, client-server environments, [37:03.980 --> 37:09.220] anywhere where you've got more, you know, more components involved in your particular system [37:09.220 --> 37:14.920] environment. Vault is considered the gold standard in this space. They have a very comprehensive key [37:14.920 --> 37:24.660] management strategy and allow a variety of topologies in how you deploy your key management [37:24.660 --> 37:33.300] infrastructure and so on. So, it's a great tool in this space. You may also leverage a hardware [37:33.300 --> 37:37.060] security module. And hardware security modules often can be used with a lot of these solutions [37:38.840 --> 37:44.380] as a hardware backing, essentially. So, you could get it in a chipset form if you're working in, you [37:44.380 --> 37:48.980] know, an embedded environment, an embedded system, for example. Maybe as a USB device or even as [37:48.980 --> 37:56.060] like a rack mount appliance that you could have in your own data center. An HSM provides a very [37:56.060 --> 38:03.280] protected form of kind of a vault, if you will, you know, an actual hardware vault to store keys.
[38:03.280 --> 38:08.380] And to do so in a way where it's tamper resistant, very difficult for someone to pull keys out of [38:08.380 --> 38:12.780] there. The keys actually get generated on that device. And those keys, at least, you know, in the case of [38:12.780 --> 38:18.000] private keys, for example, never leave that device. They always stay there. If you're working [38:18.000 --> 38:24.800] in the cloud, you've got things like Amazon KMS, Azure Key Vault, Google Cloud KMS, and OpenStack's [38:24.800 --> 38:30.480] Barbican. These are all HSM as a service, essentially, which are great, especially if you [38:31.140 --> 38:36.760] maybe only need to use an HSM on a more periodic basis. You don't have the funds [38:36.760 --> 38:42.040] to get a full HSM appliance. You can just leverage one of these systems and do so in a fairly [38:42.040 --> 38:46.300] inexpensive manner with good, you know, robust backing. I mean, on the back end, they're essentially using [38:46.300 --> 38:53.860] those HSM appliances themselves. If you're working with either Ansible or Chef in automation [38:53.860 --> 38:59.140] processes, there's an Ansible Vault feature, and Chef has a vault feature as [38:59.140 --> 39:03.520] well, which is above and beyond just Chef data bags. Chef Vault is more robust for use with [39:03.520 --> 39:08.360] secrets and key management. They do lack more advanced features; so, for example, if you're [39:08.360 --> 39:11.900] using Ansible Vault, you're not going to get all the feature set that you would with HashiCorp [39:11.900 --> 39:15.860] Vault. But because it's built in, it can be obviously very convenient. It's a good place to [39:15.860 --> 39:22.900] start. If you want integrated secret and workflow management, you might want to consider Docker [39:22.900 --> 39:29.440] with SwarmKit or DC/OS.
They have commercial offerings of these that offer even more advanced [39:29.440 --> 39:34.860] features, but you will get some, you know, a basic feature set in the free versions, which is [39:34.920 --> 39:40.000] good. Again, not as robust as, like, a Vault implementation, but that integration, of course, is [39:40.000 --> 39:48.600] nice. The other ones I'll mention: Knox, Tink, and ASP.NET Core. Those are all examples of [39:48.600 --> 39:53.260] integrated, application-level key management. You're not necessarily going to get the more [39:53.260 --> 39:57.680] advanced features, and they have more limited language bindings. For example, obviously [39:57.680 --> 40:03.340] ASP.NET Core only works with ASP.NET Core. Tink, only Java and C++. I forget now what Knox works [40:03.340 --> 40:12.540] with. Probably Go. But because you can implement them at your application [40:12.540 --> 40:18.220] level, you've got more flexibility in how those keys are managed. Often, they will also work with [40:18.220 --> 40:29.090] an HSM backend. So if it's present, you know, you can leverage the HSM hardware. So, high-level [40:29.090 --> 40:35.410] developer-friendly recommendations. Transport layer security. So if someone, you know, if we're [40:35.410 --> 40:39.530] reviewing code or someone comes to me and asks about what kind of algorithms they should use for [40:39.530 --> 40:44.570] secure communications, I say, let's not talk algorithms. You know, we're [40:44.570 --> 40:49.870] gonna go nowhere with this. Someone's already done that work for you. Just use TLS or, essentially, [40:49.870 --> 40:56.370] you know, a robust communications protocol that is already gonna take into consideration all the [40:56.370 --> 41:01.230] things that you don't need to worry about. And do so in, you know, the right way. In a secure [41:01.230 --> 41:08.030] way.
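In practice, "just use TLS" means leaning on your platform's vetted defaults rather than hand-picking protocol versions and ciphers. A minimal Python sketch of that idea:

```python
import ssl

# create_default_context() is the "someone already did that work" API:
# certificate validation, hostname checking, and sane protocol versions
# are all on by default, with no cipher selection required.
ctx = ssl.create_default_context()

# Optionally make the protocol floor explicit; everything else stays default.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

A client would then wrap its socket with `ctx.wrap_socket(sock, server_hostname="example.com")`; the hostname argument is what makes the certificate check meaningful.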
On the storage side, you want to consider built-in storage encryption solutions where [41:08.030 --> 41:12.110] possible. You know, certainly take advantage of these if you have the opportunity to. Things like [41:12.110 --> 41:16.890] Microsoft's transparent data encryption. Again, you're just getting disk-level encryption at this [41:16.890 --> 41:21.290] point. But it's so easy to use, there's almost no reason not to. And it gives you the ability [41:21.290 --> 41:25.530] to protect your data at rest, your database backups in SQL Server, and so on. Same [41:25.530 --> 41:30.250] thing with, like, Amazon's EBS encryption, full disk encryption. Those are just good options to [41:30.250 --> 41:34.690] leverage where you as a developer don't have to do anything. You don't even have to know it exists. [41:34.690 --> 41:40.570] It just kind of works for you in those specific scenarios. We mentioned developer-friendly [41:40.570 --> 41:47.470] libraries. You definitely want to check out use of those libraries, even above and beyond, you know, [41:47.470 --> 41:54.510] your built-in programming APIs. For example, .NET has a pretty decent cryptographic API [41:54.510 --> 41:59.890] that is included with the framework. But I've seen so many misuses of it. My general [41:59.890 --> 42:04.350] recommendation to development teams as we go through training is, wherever possible, [42:04.350 --> 42:07.810] use Libsodium. If you need to get to the point where you need to use [42:08.530 --> 42:12.610] cryptography, you need to use some encryption, like you can't leverage TLS and, you [42:12.610 --> 42:17.930] know, it's not covered by any of those kind of built-in solutions, take advantage of Libsodium. [42:17.930 --> 42:23.910] You know, libsodium-net is the .NET language binding of Libsodium. And then key [42:23.910 --> 42:28.630] management solutions.
You always want to consider how the keys are being managed. That includes [42:28.630 --> 42:32.830] not only, you know, your production environment, but also your test environments, your local [42:32.830 --> 42:39.550] development. Anytime I see a key checked into source control that is not specific to a development [42:39.550 --> 42:44.010] environment or, you know, kind of a small test environment, we always want to flag that and say, [42:44.010 --> 42:52.610] hey, this is not an ideal solution. Oversharing of keys is all too common, and it's very easy for things [42:52.610 --> 42:58.310] to leak out. I mean, we've probably all seen examples of things that get into GitHub or get [42:58.310 --> 43:05.170] into areas where they're just more accessible. Logs, too; a lot of times keys get logged. [43:05.170 --> 43:09.730] And those logs will get aggregated into log management systems, or, [43:09.730 --> 43:15.130] you know, sucked in via syslog. And so then you've got more people with access to your [43:15.130 --> 43:20.170] logs. And then if they get keys or credentials that way, then you've opened up a door, and so [43:20.170 --> 43:31.750] on. Questions? I left a little time here. So any specific questions? Yeah, feel free. These links [43:31.750 --> 43:36.050] are, again, in the slide deck. And I specifically put in three links because I wanted to [43:36.050 --> 43:42.970] keep these three tools, if you will, or libraries, as what I would consider the most [43:42.970 --> 43:48.210] important. These have a general purpose in the sense that they have broad applicability, [43:48.210 --> 43:53.770] broad, you know, usage in the case of Libsodium and Libhydrogen. Fairly decent, like, language [43:53.770 --> 44:01.730] bindings so that they'll work in most scenarios. So if you're working in PHP, in fact, PHP now ships [44:02.190 --> 44:05.670] Libsodium as a default cryptographic library, which is pretty exciting.
They're, they're kind of ahead [44:05.670 --> 44:15.810] of the game, of all things PHP. Well, if you're working in Ruby, Go, Rust, .NET, Java, you name it, [44:15.810 --> 44:26.850] there's probably a Libsodium binding for you. Questions? Yes? Are there good [44:26.850 --> 44:32.610] code review options for cryptography specifically? Yeah. My team, obviously, enjoys looking at [44:32.610 --> 44:37.030] cryptography. There are other firms, I think, that kind of specialize in looking at cryptography. But [44:37.030 --> 44:42.330] from an automated standpoint, there are some things. I have a leg in [44:42.330 --> 44:47.590] the static analysis space as well, and spend time looking at what tools are capable of in [44:47.590 --> 44:51.650] determining cryptographic issues. And it is an interesting space. Clearly they can identify [44:51.650 --> 44:57.570] usages of old, outdated functions. But being able to determine if you're using even newer [44:57.570 --> 45:01.610] functions in a proper manner can be complicated. You know, you've got, like, an encrypt-then-MAC [45:01.610 --> 45:07.290] solution. It might be great. It might be that you've missed something, and there's [45:07.290 --> 45:11.670] just not enough context for an automated tool to figure that out. So I can't point to one specific [45:11.670 --> 45:18.010] example. I think most, obviously the big-name commercial static analysis tools probably [45:18.010 --> 45:25.890] all attempt to do something in this space. Lighter weight tools, you know, open source tools, it varies; [45:25.890 --> 45:31.570] it's mixed, I think, based on the language. Java has some very interesting default cryptographic [45:31.570 --> 45:37.130] choices. Things like, if you encrypt with AES, you're gonna get ECB mode by default. So that's [45:37.130 --> 45:47.860] obviously not cool. So a lot of tools will pick that up. Some Java-based concerns. So yeah. Cool.
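That Java default, where Cipher.getInstance("AES") silently means AES/ECB/PKCS5Padding, is exactly the kind of pattern a lightweight checker can flag. A toy, string-level sketch in Python; real static analysis tools track values across variables, and this only catches the literal call:

```python
import re

# Matches Cipher.getInstance calls with a string-literal transformation.
_CIPHER_CALL = re.compile(r'Cipher\.getInstance\(\s*"([^"]+)"\s*\)')

def flag_ecb(java_source: str) -> list:
    """Flag transformations that omit an explicit mode (Java then
    defaults to ECB) or that request ECB explicitly."""
    findings = []
    for match in _CIPHER_CALL.finditer(java_source):
        transformation = match.group(1)
        parts = transformation.split("/")
        if len(parts) == 1 or parts[1].upper() == "ECB":
            findings.append(transformation)
    return findings
```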
[45:47.860 --> 45:51.160] All right. I think we're good. Thanks, everyone.