Developer-friendly cryptography is what I'm going to get started on this morning. How many of you would call yourselves developers in some capacity? All right. How many of you work with developers? A couple more. Okay, very good. Because we have this small screen, I was moving things around last night to make it larger, but certainly feel free to come forward. The link at the top here will get you a copy of the slide deck in PDF format, so if you want to follow along on your laptop or something, feel free. My name is Bryce Williams. I work for a consulting firm called SysLogic, based here in Milwaukee, and I lead their managed security services team. My group provides application security guidance on a variety of topics for organizations. We review lots of code: source code assessments and security assessments on everything from large systems to small embedded devices. We've trained thousands of developers, which takes me all over the place talking to developers specifically about application security concerns, including cryptography. Personally, I've been in the field for 20 years, first as an application developer and architect, then transitioning into the application security space. So everything I do is around application security and working with the developer community. I want to start with this quote from Dr. Neumann. This may be something you've seen before; there are several different variations of it. This was a statement Dr. Neumann gave in 2001 for a New York Times article. The article discussed an announcement by a Harvard professor of a new type of unbreakable encryption technique that he and his team had put together, based on a key derived from a stream of random numbers. A pretty novel technique.
From an academic and theoretical standpoint, it was actually pretty cool. But the point Dr. Neumann and others who commented made was: this is really great from that perspective, but we often see weaknesses when it comes to the implementation of truly great cryptographic techniques. So keep that in mind as I go through this discussion: you can have great cryptography, but putting the pieces together, actually practically putting it into place, is where we see weaknesses. To get started, I want to look at cryptographic best practices for 2018. Don't worry if these are unfamiliar to you or if you can't read them. Now that we're in 2018, we have some updated cryptographic best practices. If you were in Zach Grace's talk yesterday, there's a little bit of overlap with what I'm talking about today; the best practices are derived from the same sources, from cryptographic professionals who know this topic inside and out. So we have things like: random data from kernel-based CSPRNGs, your cryptographically secure random number generators. Use of authenticated symmetric encryption. Symmetric signatures using HMAC. With your hash functions, make sure you're using ones that avoid length extension attacks; that's an important one. Some of this I'll come back to, but just keep in mind, at a high level, these are the best practices that cryptographic professionals promote and encourage folks to use. For password storage, password-based KDFs. For asymmetric encryption, prefer the use of elliptic curve cryptography, ECC, over RSA. For asymmetric signatures, prefer Ed25519 or deterministic ECDSA per RFC 6979. And then secure communications: TLS everywhere is great.
Something along those lines: a good end-to-end encrypted communication stream, TLS 1.2 or now 1.3. I forgot I had this. So I want to get us started with field observations. These are examples my team and I reviewed, things we have uncovered in our assessments. Let's start with some common mistakes first, then we'll get into the actual examples. Some of the things we have observed fall into these categories. Weak password storage: use of reversible encryption is all too common; AES or RSA used in storage of passwords is not ideal. Use of hash functions without salts: a little better, but again, not ideal. Insufficient work factor to prevent brute force: often there might be some work factor utilized, through what we call a KDF, a key derivation function, but there hasn't been sufficient thought put into the work factor. We also see poor key management: keys not stored securely, hard-coded, or placed in source control, which is a pretty common issue; loose access control or overshared keys; lack of granular key usage and periodic rotation; all sorts of related key management issues. We also see a general lack of authenticated encryption. If you're not familiar with that term, authenticated encryption, don't worry; most developers aren't either. They've never heard of authenticated encryption and don't realize it's something they need to be concerned about. In their minds, they've heard of AES, and that's good enough: let's just use AES, because we've heard that's military grade. It's a good start; AES isn't necessarily bad, but you need to know a little bit more than that. Or they may know they need to use authenticated encryption, so they go down the path of developing an authenticated encryption construction of their own.
But it's not the preferred encrypt-then-MAC approach. We also see misuse of keys, initialization vectors, and nonces: keys are reused, hard-coded, or protected with some sort of strange obfuscation. Use of passwords as encryption keys is another big issue: a password is not designed to be an encryption key. You can derive an encryption key from a password, but a password itself should never be used directly as an encryption key. Also, weak or non-random values for keys, IVs, and nonces, or just reuse of values; there are certain constructions where reuse of a nonce is a big no-no. So here's my first example: use of "not encryption," as I call it. If you're in the back, you might not be able to read all of this, but essentially we have, in C code (in fact, all of my code examples are in C), a function from an actual assessment, I think I ran across this one, called encrypt password. Just skimming through it, right away I knew we had a problem, because there's no mention of any cryptographic algorithms or ciphers, there's no encryption key, just this interesting line here that's shifting some bits around. That's not encryption; that's obfuscation, essentially. And you might think, well, this is a pretty crazy example, this wouldn't exist that often in the field, or maybe it's super old code. But I think you'd be surprised at how often you run across examples of encryption that look something like this. In the next example, we have a random number generator being seeded, or initialized. Here the comment says it all: "Need to seed the RNG with a hash of the MAC address." If you can kind of read the code here, it's computing a Murmur2 hash of pieces of the MAC address. And this is just messed up.
A MAC address is a static value for a device. A random number generator seed, on the other hand, can't be deterministic; it needs to be unique and unpredictable. So a MAC address should never be used to seed a random number generator; it shouldn't be any part of it. We also see a lot of what I call overly complex encryption: just extra layers of things. I remember running across this one for the first time and right away going, what is going on here? Then breaking it down, reverse engineering the logic to understand what was happening. It looked like this: generate a random value; store that random value in a database; split the random value in two; combine the second half with the plaintext; reverse the combined value; hash it with SHA-256; and on and on. One of my favorite parts is line two here, where it creates a UUID, takes the five parts, reorders the parts, and puts it back together. Who dreamed this up? And why? Essentially, the second step here is the cause for concern. There's an actual weakness in this approach, regardless of the fact that it's just weird: the random value should never have been stored in the database alongside the encrypted data. I forgot to mention, this was for storing data in a database using reversible encryption; they wanted to be able to recover the plaintext later. So they used this approach, which on its face is crazy. But the reason I think the designers put this in place is that they recognized that storing the random value in the database felt wrong. They knew there was some sort of weakness, so they added additional layers of obfuscation, essentially to make it more difficult to reverse engineer. Now, obviously, we figured it out.
And an attacker with enough information or access could do the same thing, and ultimately, potentially, reverse engineer and recover the plaintext values stored in the database. If they could get access to the entire database, they would have all the detail they need to recover the encrypted information. So anytime I see extra layers of complexity in encryption techniques, it clues me in that it warrants some study, because there's probably a reason those extra layers are there: they're hiding something. Here's another example. It's not entirely cryptography based. This is a cloud-hosted, typical web system with a database backend, an API front end, and a JavaScript UI. We highlighted three different issues in this particular implementation. First, missing authentication on the API endpoint: there's a function called "get two random security questions by user" that allowed an anonymous user, with no authentication, to query by user ID and get back two of that user's three secret questions and answers. Second, and when I saw this I realized we had a big problem, validation of the answers given by the end user was performed on the client side. The JavaScript was what actually decided whether the user input the correct answer for a question, yes or no. Obviously an attacker could bypass that step. And this was a brand new system; I think we looked at it two months ago or so, a brand new feature they had added, so the team clearly wasn't thinking about the security aspects. Third, even on the backend there were issues: the answers to the secret questions were stored in the database using symmetric, reversible encryption.
Ideally you want to store your secret question answers the same way you store passwords, using a key derivation function, because there's really no reason to recover the actual plaintext of the answer; you just need it for comparison purposes. So we've looked at a few issues my team and I have uncovered, highlighting some basic problems. If you compare those to the cryptographic best practices, we're way off the mark. We keep uncovering the same basic issues over and over; developers in general aren't yet in a position to address the more advanced, modern topics. This slide looks at recent cryptography issues, in this case some items highlighted by Matthew Green, a professor at Johns Hopkins who's pretty well known in the cryptography space. So what did he highlight here? DUHK: Fortinet hardcodes a key that makes every FortiGate VPN session crackable. You might remember that scenario; these issues came out in the press in the past two or three years. Pretty big deal, and all related to cryptography-based mistakes. Next: Juniper hardcodes a similar key and then gets hacked by the Chinese, who change that key to one of their own choosing. That was obviously a big deal. Every major browser manufacturer and a number of websites make TLS vulnerable to practical decryption attacks: that was the FREAK attack. And similarly, browsers and websites make TLS vulnerable to practical decryption attacks yet again with the DROWN attack. Apple uses crypto wrong in their iMessage encryption for a billion users; that was a pretty big deal. And remember, all of these are big manufacturers. These are the folks with big dedicated security teams. They have FIPS certification. So what kind of issues is everyone else running into?
Obviously my team and I have seen some of those kinds of problems too, in some of the examples I highlighted. But those are more complicated concerns, more advanced, subtle mistakes, and subtle mistakes are certainly bound to happen. It's the basic mistakes we see so often that concern me. So why are mistakes so common? Here are three metrics from different studies, just to back up the statement that cryptographic issues are a concern. 61.5% of applications scanned had one or more cryptographic issues; that's from Veracode's report last year, and they scan a lot of code. 66% of the most popular cryptocurrency mobile apps, from a study that looked specifically at those, contained hard-coded sensitive data, including passwords or API keys. That's really unfortunate. And look at this last one: 17% of the bugs they looked at were in the cryptographic libraries themselves, whereas the remaining 83% were misuses of cryptographic libraries. That last point is what I want to emphasize. Cryptographic libraries can certainly have mistakes, and you need to be mindful of those libraries and make sure you're using the latest versions; I'll talk more about libraries. But the implementation, the use, of cryptography is more often where problems get introduced. So let's look at a few developer challenges: again, why are mistakes made? What do developers have to fight with? For those of us who do development, this will ring true. You've got to make it work first; you have to ensure the product actually works before you make it secure. I'd argue you need to do both at the same time, but clearly you have to make the product work.
That's important. You also have to meet delivery dates, and delivery dates can often take priority over adding a certain level of security or getting an extra pair of expert eyes to review things. You might have performance or usability concerns resulting from certain security choices, and maybe performance or usability wins out. You might have inadequate security testing: security controls don't get tested with sufficient expertise. Lack of crypto knowledge: developers might not get the training they need, or access to knowledge sources that give them accurate information about cryptography. It might also be a problem of poor library or API support, either in the crypto libraries being utilized or in the programming languages that provide cryptographic APIs. I want to look at those last two items, training and API use, to highlight a few things. First, I looked at the top 10 cryptography courses on Pluralsight. This is not to dig at Pluralsight; they're a beverage sponsor here at CypherCon, and you find this issue with others as well. If you're not familiar with Pluralsight, they're an excellent online video-based training vendor, with thousands of different technical training topics. So I looked at the top 10: if you search for cryptography, these are the ten courses focused on cryptography. Some interesting highlights. They generally provide good history on cryptography and basic concepts, but generally lack practical engineering guidance. There's no discussion of authenticated encryption. Remember, I mentioned authenticated encryption, AE; that's a pretty big deal, that's important. No discussion at all. Also no discussion of secure key management or key storage options. No mention of kernel-based cryptographic random number generators versus user-space RNGs.
Two of the courses, two out of ten, made a passing mention of ECC, but they didn't really get into any details: why would you want to use ECC versus RSA? What are some preferred curve choices? That sort of thing. I think some of it is just that the content is a little dated. Not to say this couldn't be addressed, but it's indicative of the cryptographic training material that's out there, whether it's Stack Overflow posts or training materials like these. You have to be careful, because knowledge is good, but when you give people wrong or dated knowledge, it can actually work against you. Nearly all the hash examples use MD5, which we know is broken. And there's quite a bit of odd advice. Things like: you should double your PBKDF2 iterations every year, which I've never seen anywhere else and don't know where it came from. Or: maybe you should use this obscure Tiger hash function instead of SHA-256, because SHA has been shown to have weaknesses in the past, SHA-1 for example. That's the kind of stuff I don't want developers to focus on: "maybe I should double my PBKDF2 iterations every year," and now you've got a gazillion iterations and it doesn't work anymore. So, what are some popular cryptographic libraries? These are general-purpose cryptographic libraries that, in my opinion, provide too many options to the average developer: Botan, Bouncy Castle, Crypto++, Libgcrypt, OpenSSL, wolfCrypt. That's just a few of them, of course. You may be familiar with some of these; you may use some of these. It's not that they're bad; they just provide a lot of different options, things like encryption algorithms and hash functions. I may not have these numbers completely accurate; I went through the documentation and tried to summarize things based on interfaces and API endpoints.
So you, as a developer, go in and have to choose a hash function: you've got 27 different choices in Libgcrypt. Signature schemes: Bouncy Castle goes crazy on the number of signature schemes at your disposal. As a developer, that increases your chances of choosing something that may not be in your best interest. Oops. Landmines in cryptographic libraries: in addition to having lots of options, they also have some bad or insecure options you need to be aware of. Things like RSA with insecure padding; that's an area cryptographers have a lot to say about. Not only should you maybe not use RSA at all, preferring ECC, but if you do need RSA, make sure you're using preferred padding. A lot of implementations provide insecure padding as the default, and all of these libraries provide at least the option of RSA with insecure padding. AES in ECB mode; we know that's faulty. Broken ciphers: things like RC4 and RC2, some really outdated, clearly broken ciphers, and they're there in every single one. A lot of it is to provide backwards compatibility or interoperability. But if it's an option, a well-meaning developer might use it; maybe they've copied something off the Internet. I was looking at a blog article just the other day where someone was using algorithms that were basically defunct in the mid-90s. Same with hash functions: old, broken hash functions. Then user-space RNGs: all of the libraries, other than wolfCrypt I guess, have this user-space RNG concept that can get you into trouble without you even realizing it, because it's the default behavior. Another thing is implementation challenges. Here's a specific example around the encrypt-then-MAC approach. So you have some knowledge.
You know you need to use authenticated encryption, but maybe you're working with a library or a programming language that doesn't provide it, though you can build it yourself with the built-in primitives. So you put together an encrypt-then-MAC construction. To do it correctly, though, you have to keep all of this in mind. You've got to use a different key for your encryption than for your authentication, preferably deriving both keys from a single master key using a KDF. You have to make sure all your comparisons run in constant time; you can't use standard string comparison functions. You definitely have to use an HMAC and ensure it covers the ciphertext, your additional authenticated data, the initialization vector, and the encryption method; all of those have to be fed into the HMAC. And the fields passed into the HMAC must use a format that unambiguously delineates them. If you leave out one of those fields, or don't use a correct format, you're essentially introducing a weakness, and you won't necessarily realize it, because building encrypt-then-MAC constructions isn't your bread and butter. I've seen so many examples where well-meaning developers know they need to go down this path of authenticated encryption, but they try to put all this together, and admittedly, it's complicated. We also have weaknesses in standards and protocols. Many of you may have worked with the JavaScript Object Signing and Encryption (JOSE) standard suite. Things like JSON Web Signatures, JWS: even a compliant implementation, one that meets the standard correctly, is vulnerable to an attacker passing in the "none" or HS256 algorithms when in reality you're expecting an RSA signature, and it will happily work with those. That's a fairly well-known attack vector by now.
So you want to make sure not only that you're using a standards-compliant implementation, but that it's aware of these weaknesses and has put in extra safeguards to protect against them. Same thing with JSON Web Encryption: JWE allows that insecure RSA padding choice, which is unfortunate. It also allows ECDH with NIST curves, which introduces the risk of invalid curve attacks; a little more rare, but something to be concerned about. It's unfortunate these choices exist in the standard. You may also be familiar with OAuth 2. It's a great standard, technically an authorization protocol, but used in the authentication and authorization space. It provides several weak workflows that really should never be used in modern systems, but I see them all the time: use of the client credentials and password grant types where they're not recommended. There's also the state parameter, which is optional in the standard but really should always be used. And there's no explicit access token specification in OAuth 2, so you'll see some interesting implementations as a result; there's just no specific guidance. Good implementations of OAuth 2 do a great job, but they're going above and beyond the standard because they have additional knowledge and experience. I always recommend that anyone working in the OAuth 2 space read the Threat Model and Security Considerations RFC; if you search for "OAuth 2 threat model," you'll find it. It goes into a lot of detail about particular threats and the countermeasures you should be aware of. But most developers just aren't aware of it, and as a result, they don't think about certain types of attacks in this space. So I ask myself this all the time: should developers stop using cryptography? Because we do a lot of training.
At one point I asked a client: we've done all these assessments, we see all these consistent issues with cryptography; maybe it makes sense to start telling your developers to stop doing cryptography, because clearly, at least in everything I'm looking at, no one's doing it right. So does that make sense? Should we tell developers to stop using cryptography? One pro of that approach is fewer security implementation flaws. The con, of course, is that your systems will then lack even basic protections; they won't use any kind of cryptography at all, and the lack of security protection becomes that much more obvious. So what if security pros do all the cryptography work instead? In certain organizations, that might make sense; you might be able to do that. But generally, security pros don't scale well. You can't have one or two people writing all the code involving cryptography, which, admittedly, is an increasing share of features and system aspects these days. That's just going to slow down progress if only folks with specific expertise can take care of it. So, assuming we can't do that, what are some other options? What is this idea of developer-friendly cryptography? Has anyone ever heard of that concept before? Developer-friendly cryptography? I don't think I dreamed it up. So how do we define developer-friendly? Here are four things that, to me, make a library developer-friendly, that take it to that next level. There's no need to select ciphers or key sizes. There's automatic generation of encryption key, initialization vector, salt, and nonce values. There are simple, clear APIs that provide high-level, outcome-based functionality, rather than cryptographic primitives where you have to choose ciphers and algorithms.
You instead state, at a high level, the outcome you want; I'll show you some examples of what I mean. And, also important: there are no insecure or low-security options. As much as possible, you avoid the scenario where developers shoot themselves in the foot. So let's look at some developer-friendly libraries, those I feel are candidates for this concept. We've got Libsodium, and that's an important one to remember; if you remember anything from this talk, remember Libsodium. It's a cross-platform, compilable module based on the NaCl package, with language bindings for most languages out there. It's very popular in the security professional and cryptographer space, and it's generally easy to work with from a developer standpoint. It supports things like authenticated encryption, digital signatures, and hashing, and it's performance optimized. It gives you enough options as a developer to get most things done without giving you too many; if you need something more advanced or more unusual, you may have to move to a more general-purpose library. But this is the type of library I feel comfortable recommending to any developer in any sort of environment they're working in. Libhydrogen is kind of like the younger sibling of Libsodium, designed for the embedded systems space. It's written in C99, so it's fairly versatile, and it has the same general API and interfaces as Libsodium, but implemented with essentially just two cryptographic primitives, which keeps it very small and very lightweight. It's not as tried and tested as Libsodium at this point, but definitely something to keep your eye on if you're working in the embedded systems space. Monocypher is another one: pretty nice, though not something I would necessarily recommend over the other two.
At the same time, if I were performing an assessment against a system and saw use of Monocypher, Tink, or the ASP.NET Core cryptographic APIs, I would feel more comfortable than with a general-purpose library. Tink is a library put together by some Google engineers, designed, again, around that high-level, outcome-based functionality. It doesn't have password hashing, it's not as performance optimized, and it only works with Java and C++, but it's still a decent option. I also want to highlight ASP.NET Core, which is an interesting one in this list in that it's part of Microsoft's new .NET Core framework. It only provides authenticated encryption and password hashing, though it does provide some key management functionality as well. It's nice in that it's built in, for developers who can leverage it; I like the direction Microsoft is going with this cryptographic API. So here's an encryption example from Libsodium. This is C code again. You don't really have to read the detail, and you probably can't if you're in the back, other than to see it's fairly simple. You'll notice, if you view this later, there's no mention of any cryptographic algorithms: no choice of cipher, no key size selected, no hand-rolled initialization vector or nonce generation. You still have to create the key and the nonce for the encryption, but it provides functions to do this in a very simple manner, where a developer is generally not going to make a wrong selection. Here's Libhydrogen, a public-key signature example: it creates a public-private key pair and uses it to sign some data and later verify the signature. Again, very straightforward and very simple for a developer to leverage.
So that's all great. But what about FIPS 140? Anyone familiar with FIPS 140-2? A couple folks. FIPS stands for Federal Information Processing Standard; publication 140 has to do with this particular space, and the -2 means it's version two, the current version of FIPS 140. Cryptographic libraries can be FIPS 140-2 Level 1 validated; the software library itself can be Level 1 validated, while Level 2 validation is reserved for hardware devices. That validation requires some time and money. You have to ship your library off to a testing lab under the CMVP, the Cryptographic Module Validation Program. They run it through its paces and look at a whole bunch of things with it. Obviously it takes a lot of time, and you have to pay them. But then you get a certificate that says: we tested this particular library on this specific set of hardware, and it will actually list out the specific hardware environments, for example Windows 2008 with a particular version of the module on it. So they certify that it will work properly, according to FIPS-approved standards, in that particular environment. Use of FIPS validated modules is mandatory for US and Canadian government agencies; others may use it as well. But definitely if you're working in the federal government space, this is most likely going to come up, and it's important that you're utilizing a FIPS validated library. It doesn't mean you have to create one and get it validated yourself; it just means your cryptographic code has to use a library that's received this sort of validation. Keep in mind that FIPS validation doesn't necessarily mean a library is more secure. It just means it's gone through this validation process and been reviewed, essentially by the CMVP, against FIPS standards.
So if you're working in this space, you do need a FIPS-validated library, and none of the libraries we looked at before, libsodium, libhydrogen and so on, are FIPS validated. There are reasons for that: as open-source libraries, they're really not in a position where they need to pursue this validation. Generally you're looking at a commercial library that's gone through the process. They're out there; there are some good ones, like wolfCrypt, for example, or Mocana's NanoCrypto. Even OpenSSL has a FIPS-validated version of the library that you could look at. If you need one of these, I do recommend that you take the time to consider writing a developer-friendly wrapper around it, providing interfaces similar to libsodium or the others, or maybe getting some outside expertise to put that wrapper together for your developer group. That way they're not having to worry about the specific cryptographic choices and so on. Also: key management best practices. We talked a little bit about general encryption and cryptographic function best practices; key management is another area that, as I mentioned, is a fairly big weakness. In code that my team and I have reviewed, developers can at times do a very good job of putting together cryptographic functions, but then the keys are stored in source control or even hard-coded in the system itself. So, some best practices around key management. An ideal system would have all of these: no key or secret stored in clear text; keys with a defined, limited lifetime based on usage; keys refreshed automatically where possible; and a method to manually revoke keys.
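The wrapper idea above can be sketched very briefly. This is a hypothetical facade, not any vendor's API: the internals here delegate to HMAC-SHA-256 from the Python standard library purely for illustration, whereas a real wrapper would delegate to your FIPS-validated module. The point is what the development team sees: two verbs and no algorithm names.

```python
import hmac
import hashlib

class CryptoFacade:
    """Minimal sketch of a developer-friendly wrapper: callers use
    sign()/verify() and never touch algorithm names or parameters.
    A real version would delegate to a FIPS-validated library."""

    def __init__(self, key: bytes):
        self._key = key

    def sign(self, message: bytes) -> bytes:
        # Algorithm choice lives here, in one reviewed place.
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        # Constant-time comparison to avoid timing side channels.
        return hmac.compare_digest(self.sign(message), tag)
```

If the underlying module ever needs to change, it changes inside the facade, not in every team's application code.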
You always want a way, once you generate a key, to revoke it, even if it's a manual process you have to go through. Keys should never have an unlimited lifetime. Access to clear-text keys should be limited through authorization and permissions, so being able to place some access control around them is important. And all key lifecycle and access events should be audited. Ideally, anytime there's access to a key, say you pull it out of a vault, a log event occurs recording that this particular application or this particular user accessed it. You see this a lot with code signing. If you've got a code-signing process set up within your organization, you generally want more robust handling of the key material: only certain individuals have access, that usage is fully audited, and so on. (It sounds like we're in an elevator.) So, some key management solutions. You might not be able to read this slide, but I'll highlight a few pre-made solutions, rather than having to roll your own, which is generally not ideal. You've got HashiCorp Vault or Keywhiz. These are good for distributed systems, client-server environments, anywhere you've got more components involved in your system environment. Vault is considered the gold standard in this space: it has a very comprehensive key management strategy and allows a variety of topologies in how you deploy your key management infrastructure. So it's a great tool in this space. You may also leverage a hardware security module, and HSMs can often be used with a lot of these solutions as a hardware backing, essentially.
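Those lifecycle rules, limited lifetime, revocation, audited access, are easy to express in code. Here's a toy, self-contained sketch (all names hypothetical, the audit "log" is just a list standing in for a real logging pipeline):

```python
import time
from dataclasses import dataclass
from typing import List, Optional

AUDIT_LOG: List[str] = []   # stand-in for a real audit/log pipeline

@dataclass
class ManagedKey:
    key_id: str
    material: bytes
    created_at: float            # epoch seconds
    max_age_seconds: float       # no unlimited lifetimes
    revoked: bool = False

    def is_valid(self, now: Optional[float] = None) -> bool:
        """Check lifetime and revocation; every check is audited."""
        now = time.time() if now is None else now
        ok = (not self.revoked) and (now - self.created_at) < self.max_age_seconds
        AUDIT_LOG.append(f"access key={self.key_id} valid={ok}")
        return ok

    def revoke(self) -> None:
        """Manual revocation path; also audited."""
        self.revoked = True
        AUDIT_LOG.append(f"revoke key={self.key_id}")
```

Real solutions like Vault implement this far more completely, but even this much, an expiry, a revocation flag, and an audit trail, beats a key sitting in a config file forever.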
You could get one in chipset form if you're working in an embedded environment, for example, or as a USB device, or even as a rack-mount appliance in your own data center. An HSM provides a very protected form of vault, an actual hardware vault to store keys, and does so in a way that's tamper-resistant, so it's very difficult for someone to pull keys out of there. Keys actually get generated on that device, and, in the case of private keys at least, they never leave it. If you're working in the cloud, you've got things like AWS KMS, Azure Key Vault, Google Cloud KMS, and OpenStack's Barbican. These are all essentially HSM-as-a-service, which is great, especially if you only need an HSM on a more periodic basis and don't have the funds for a full HSM appliance. You can leverage one of these systems fairly inexpensively, with robust protection; on the back end, they're essentially using those HSM appliances themselves. If you're working with Ansible or Chef in your automation processes, there's an Ansible Vault feature, and Chef has a vault feature as well, above and beyond just Chef data bags; chef-vault is more robust for use with secrets and key management. They do lack more advanced features: if you're using Ansible Vault, you're not going to get the full feature set you would with HashiCorp Vault. But because it's built in, it can obviously be very convenient. It's a good place to start. If you want integrated secret and workflow management, you might want to consider Docker with SwarmKit, or DC/OS.
They have commercial offerings that add even more advanced features, but you get a basic feature set in the free versions, which, again, is not as robust as a Vault implementation, but that integration, of course, is nice. The other ones I'll mention are Knox, Tink, and ASP.NET Core. Those are all examples of integrated, application-level key management. You're not going to get the more advanced features necessarily, and they have more limited language bindings: obviously ASP.NET Core only works with ASP.NET Core, Tink only with Java and C++. I forget now what Knox works with; probably Go. But because you can implement them at your application level, you've got more flexibility in how those keys are managed. Often they will also work with an HSM backend, so if one is present, you can leverage the HSM hardware. So, high-level developer-friendly recommendations. Transport layer security: if we're reviewing code and someone comes to me and asks what kind of algorithms they should use for secure communications, I say, let's not talk algorithms. We're going to go nowhere with that. Someone's already done that work for you. Just use TLS, essentially a robust communications protocol that already takes into consideration all the things you don't need to worry about, and does so in a secure way. On the storage side, you want to consider built-in storage encryption solutions where possible; certainly take advantage of these if you have the opportunity. Things like SQL Server's Transparent Data Encryption. Granted, you're just getting disk-level encryption at that point, but it's so easy to use there's almost no reason not to, and it gives you the ability to protect your data at rest and your database backups in SQL Server, and so on.
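The "just use TLS, someone already made the choices" advice shows up concretely in Python's standard library: `ssl.create_default_context()` hands back a context with certificate verification, hostname checking, and a maintained protocol/cipher selection already turned on, so the developer never picks an algorithm.

```python
import ssl

# The defaults are the secure choices: certificate verification on,
# hostname checking on, modern protocol and cipher selection
# maintained by the Python/OpenSSL teams.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED   # certs verified by default
assert ctx.check_hostname is True             # hostnames checked by default

# Typical use (not executed here): wrap a socket for a given host.
# tls_sock = ctx.wrap_socket(raw_socket, server_hostname="example.com")
```

The design lesson is the same one the developer-friendly crypto libraries apply: make the secure path the zero-configuration path.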
Same thing with Amazon's EBS encryption: full-disk encryption. Those are just good options to leverage where you as a developer don't have to do anything; you don't even have to know it exists. It just works for those specific scenarios. We mentioned developer-friendly libraries. You definitely want to check out those libraries, even above and beyond your built-in programming APIs. For example, .NET has a pretty decent cryptographic API included with the framework, but I've seen so many misuses of it. My general recommendation to development teams as we go through training is: wherever possible, use libsodium. If you get to the point where you need to use encryption directly, where you can't leverage TLS and it's not covered by any of those built-in solutions, take advantage of libsodium; libsodium-net is the .NET language binding. And then key management solutions: you always want to consider how the keys are being managed, not only in your production environment, but also in your test environments and local development. Anytime I see a key checked into source control that is not specific to a development environment or a small test environment, we flag that and say, hey, this is not an ideal solution. Oversharing of keys is all too common, and it's very easy for things to leak out. We've probably all seen examples of things that get into GitHub or into areas where they're just more accessible. Keys also get logged a lot of the time, and those logs get aggregated into log management systems or sucked in via syslog, and then you've got more people with access to your logs.
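The "key checked into source control" problem is one you can at least partially automate away in CI. Here's a toy scanner; real tools dedicated to this are far more thorough, and the patterns below are illustrative examples I've chosen, not an exhaustive set.

```python
import re
from typing import List

# Toy patterns for secrets that should never appear in source control.
SECRET_PATTERNS = [
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),   # PEM private keys
    re.compile(r"AKIA[0-9A-Z]{16}"),                           # AWS access key ID shape
    re.compile(r"(?i)(api[_-]?key|secret)\s*=\s*['\"][^'\"]{16,}['\"]"),
]

def find_secrets(text: str) -> List[str]:
    """Return the patterns that matched, so a CI job can fail the build."""
    return [p.pattern for p in SECRET_PATTERNS if p.search(text)]
```

Run over every changed file in a pre-commit hook or CI step, this catches the easy, high-damage leaks before they ever reach GitHub or a log aggregator.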
And if they get keys or credentials that way, you've opened up a door, and so on. Questions? I left a little time here, so any specific questions? Yeah, feel free. These links are, again, in the slide deck. I specifically put in three links because I consider these three tools, or libraries, the most important. They're general-purpose in the sense that they have broad applicability and, in the case of libsodium and libhydrogen, broad usage, with fairly decent language bindings so they'll work in most scenarios. If you're working in PHP, in fact, PHP now ships libsodium as a default cryptographic library, which is pretty exciting; they're kind of ahead of the game, of all things, PHP. If you're working in Ruby, Go, Rust, .NET, Java, you name it, there's probably a libsodium binding for you. Questions? Yes? Are there good code review options for cryptography specifically? Yeah. My team obviously enjoys looking at cryptography, and there are other firms that kind of specialize in it. From an automated standpoint, there are some things. I have a foot in the static analysis space as well and spend time looking at what tools are capable of in determining cryptographic issues, and it's an interesting space. Clearly they can identify usages of old, outdated functions. But determining whether you're using even the newer functions in a proper manner can be complicated. You've got, say, an encrypt-and-MAC solution; it might be great, or it might be that you've missed something, and there's just not enough context for an automated tool to figure that out. So I can't point to one specific example. The big-name commercial static analysis tools probably all attempt to do something in this space.
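As a sketch of what the simpler class of automated check looks like, here's a toy rule, in the pattern-matching style these tools use for "outdated or dangerous usage" findings, that flags Java's `Cipher.getInstance` when it's called with a bare algorithm name and no explicit mode (in Java, `"AES"` alone defaults to ECB). The rule itself is my illustration, not taken from any particular tool.

```python
import re

# Flag Cipher.getInstance("AES") / ("DES") with no explicit mode string:
# a bare algorithm name means the provider's default mode (ECB for AES).
INSECURE_CIPHER = re.compile(r'Cipher\.getInstance\(\s*"(AES|DES)"\s*\)')

def flag_default_ecb(java_source: str) -> bool:
    """True if the Java source requests a cipher with no explicit mode."""
    return bool(INSECURE_CIPHER.search(java_source))
```

This is exactly the easy case: a syntactic pattern. Judging whether a full encrypt-and-MAC construction is assembled correctly needs semantic context that simple matching can't provide, which is the limitation described above.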
Lighter-weight, open-source tools vary; it's mixed, I think, based on the language. Java has some very interesting default cryptographic choices. For example, if you encrypt with AES using the defaults, you're going to get ECB mode, which is obviously not cool, so a lot of tools will pick that up, along with some other Java-based concerns. So, yeah. Cool. All right, I think we're good. Thanks, everyone.