When you're broke. Lots of companies and public organizations, like the one I work for, are dealing with budget constraints right now. It's a very real thing. You need a tool, or you think you need a tool, and it costs a lot of money. So how do you get the job done? You need knowledgeable people, problem solvers, who can come up with those kinds of solutions. After all my experience in the field, and poking around and playing with different stuff, I believe I've found quite a few tools out there that can pretty much get the job of forensics done for a company. Which is kind of amazing: you don't have to pay $5,000 for something. There's another group of people I wanted to speak to as well: anybody that's new to forensics, or thinking about getting into forensics. I want to show you what the forensic process looks like, what it's all about, what you're actually striving to do, and what kind of tools are out there for each step of the way. So I hope you get something out of that. This isn't a presentation that lists every free tool out there and trains you on how to use them. It's more: this is what computer forensics looks like, these are the things you need to watch out for, and here are some tools along the way that can help you get each step of the process done.
I'm also going to demonstrate a few of those things. They're pretty simple demos; you'll see them along the way. So let's get into the meat of it. The goals of forensics. What do you think the goal of forensics is? I want to make this an interactive class, so feel free to raise your hand, okay? What's forensics? You can think CSI, whatever. Yeah: evidence. Evidence is definitely part of it. What's the aim of forensics? Figuring out what happened. Good, those are accurate answers. Some people think forensics means analyzing evidence to prove a point. I don't like that answer. I don't like that answer at all. I don't think forensics is about proving points. I think forensics is about reviewing the evidence and presenting the facts of a case, and points can be made based off those facts. So yeah, good answers. You're trying to answer the who, what, when, why, and how, those kinds of questions. We're going to go into the process of forensics more, but basically the way it works is: somebody is going to have a question about something. For example, let's say I'm working at the university and somebody says that so-and-so down the hall had nude ladies up on a screen last week, or something like that. Well, that rolls down to me, and that's an allegation.
That allegation has a specific time and date stamped on it for me. So my goal at that point is to determine: did he look at that material at that point in time? And present the facts around that. That's one example of something you'd see: explicit content. Another really practical one for enterprise environments: if you have a SIEM and you receive an alert that there's malicious activity on some segment of your network, it's blowing up traffic, and it's coming from this IP address. That's another thing I would go and analyze: try to see what's on that box, what's causing it, and preserve the evidence there. So those are the goals of forensics. The process. I came up with this; it isn't the official forensic process. It's just what I do every time I work on a case. It starts with communication. What I mean by that is, it starts somewhere. Someone asks for my service. If any forensic examiners are out there: someone is going to ask for your service to help answer a question. That's what I mean by communication. You might receive a telephone call asking you to do something, that kind of thing. So you've got the first alert. Then, preserving the evidence. I would argue that preserving the evidence is probably the most important part of the forensic process.
And why is that? What happens if I get it wrong? What happens if I don't preserve the evidence properly? Yeah, exactly: that's the foundation of your investigation. The integrity is shot; the credibility is out the window. If you do it wrong, then down the road, if you end up in litigation and you have to stand before a judge, what are you going to say about the integrity of the evidence? What's the attorney on the other side going to say about you? I'll get into that; I'm going to lay out in more detail exactly what I mean and do some demos on it. All right, let's move on. Oh, sorry, actually: processing. You preserve the evidence and then you process it. Processing has specific goals in mind; I'm going to get into that. You analyze the evidence to try to answer the questions. And then, lastly, you report the findings to the person who asked you to do it in the first place. That's the forensic process in a nutshell. All right. Preserving the evidence. Looks like that slide got more powerful. We already talked about why we do it, why it's important. There are some free tools out there: FTK Imager; a forensic Linux distribution such as CAINE, which stands for Computer Aided INvestigative Environment; Volatility and Redline are also wonderful tools. Preserving the evidence.
Yes, it's important: your credibility rides on it. I want to talk about a couple of forensic concepts. Write blocking. Does anybody know what write blocking is? I'm assuming we've got newbies here, middle of the road, seasoned veterans. So, can somebody answer: what's write blocking? Yeah, back there. Right, yeah. So, write blocking. It's exactly what it sounds like: you're blocking the ability to write to the piece of evidence. It's not possible in all scenarios; I'd use cell phone forensics as one example where it's not always possible. FTK Imager is a good tool to create a forensic image of something that's write blocked. Think about traditional forensics and handling a crime scene. You wouldn't go walking around a crime scene moving things around. You wouldn't be placing new things on the crime scene. That's essentially why we use a write blocker: we don't want to change anything about the piece of evidence, because, like I said, it's the whole foundation of our investigation. Some Linux distributions: I mentioned CAINE. It's just a personal favorite of mine; there are others out there. CAINE is a forensic Linux distribution that enables write blocking by default when you boot into it. So, for example, let's say I have a laptop that has a solid-state drive.
One of those that, I'm not a hardware guy, looks almost like a memory stick and doesn't interface well with traditional SATA connections. I'm not really able to plug that into anything. So what I would do in that scenario is boot into something like CAINE or another forensic Linux distribution on the laptop itself. It's going to write-block everything, and then I plug an external drive into it and image it. That's just how it goes. Speaking of imaging, does anyone know what forensic imaging is? You're not saying it? Okay. Yep. Right, yeah. At traditional crime scenes you see the forensic folks, I'm just going to call them that, taking pictures of the evidence, putting numbers by the evidence and so on. We're creating a forensic image of a hard drive, or whatever the digital device is: making an exact bit-by-bit copy of that drive, preserving it exactly how it was. So that's what forensic imaging is. A couple of other tools I listed there, Volatility and Redline, come into play for memory forensics, and that's a whole other ballgame. We're going to be talking more about traditional hard drive forensics today and doing my labs through that. But if you know anything about memory, RAM, whatever you want to call it: it's a volatile piece of evidence.
That means as soon as you turn that computer off, you've cleared out the memory. I won't go into all the tricks you can do to actually preserve memory, using freezing techniques and whatnot, but for all intents and purposes, the data is lost once the device gets turned off. So let's take a look at a demo. I'm going to turn the mic off here and fire up FTK Imager. It's a really simple demo, but I wanted to show you how easy it is to use a free tool like FTK Imager to create a forensic image of a hard drive, keeping in mind that we're using a write blocker and that this is the whole foundation of our investigation. I'm just going to show you how easy it is to do it right. So, yes, that's a really good question: how do software write blockers compare to hardware write blockers? I've played around with that a little bit; I haven't actually used a paid piece of software. My first inclination is to go with a hardware write blocker every time, because that keeps bugs and things out of the scenario for the most part. If writes are being cut off at the hardware level, I feel much more comfortable. If I don't have a hardware write blocker available, I would try to implement a technique like booting into a Linux distribution where the policy is set in the distro to automatically write block.
I've played with things like changing one value in the Windows registry to turn off the ability to write to USB drives. It seemed to work in my testing; I just don't have 100% confidence in it, though. So I hope that answers your question. All right, let's get into the lab. So, FTK Imager. We've got it going here. I'm basically just going to go to File and say Create Disk Image; we're creating an image of a hard drive. I'm going to select Physical Drive. I might run over on time if I keep asking questions like this, but: what's the difference between a physical drive and a logical drive? Can someone answer that? Yes, they're partitions. With a logical drive you'd be taking a forensic image of a partition, not everything that's on the physical hard drive. So we want Physical Drive. We're just going to select that. All right, so we've got the physical drives listed there. We would then find the piece of evidence we're dealing with. You'd see physical drive zero, one, two, three, four, five, however many drives you have, and you'd pick the relevant one. Hit Finish. It's going to ask where you want to put the forensic image and what format you'd like to store it in. It gives you a few options here. You have the ability to store it as a raw DD image: bit by bit, everything that's on the disk. There's no compression, nothing like that.
So if you're doing a forensic image of a two-terabyte drive, your raw forensic image is two terabytes as well. You've got to be smart about that. Typically the formats you're going to see are raw DD and E01. The E01 format is the EnCase Expert Witness format, and it has compression capabilities. You can apply anything from a little bit of compression all the way up to the maximum, and that will change how long it takes for your forensic image to be made. So we're just going to select Raw. At this point it's going to ask for the case number, evidence number, description, examiner name, all that kind of stuff. It's good to fill in as much as you can. Especially if you have to produce a forensic image to the other side in a courtroom and they want to analyze it, they will care about this kind of documentation. Next, select the destination, give it a name, and you're finished. That's all there is to creating a forensic image. Okay, as you can see, it's very simple. Processing the evidence. This is really where I think experience and forensics training come into play, because you're trying to figure out: okay, what kind of question am I answering here? What's the whole reason I'm doing this case? Like we said, are we looking for explicit content on somebody's computer on a certain date? That's the goal of my investigation.
Am I going to check and see whether there was that kind of activity on that date? By reviewing the goals and keeping them in mind, you're setting yourself up to process the evidence in an efficient way. Give me a second here. Okay. So, some free tools to do this; you can write them down if you want: log2timeline, Redline, Volatility (I mentioned those already), RegRipper, Wireshark, Snort, Scalpel, PhotoRec, ClamAV, VirusTotal, and LiME. Obviously I'm not taking credit for any of these tools whatsoever. There are amazing authors out there in the forensic community, some of whose names I can't even pronounce, like the log2timeline folks. Some of the concepts I want to go over. Timeline analysis. Why would you think this is important? Yeah, you're trying to show: did something happen at a certain time on a computer? Also, for example, if we're dealing with a piece of malware, you might see a series of events occur on a computer, in which case timeline analysis is super helpful for figuring out the whole story of that piece of malware. Deleted file recovery. I hope we all know that when we delete something on a hard drive, it's not gone forever. Metadata analysis. We're reviewing pieces of information about a specific file. A piece of metadata would be something like a timestamp: the created time of a file, the modification time of a file. Those could be important.
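As a small sketch of that file-system side of metadata analysis: real examiners use dedicated tools (or document-aware parsers that also read embedded fields like a PDF's author), but the basic timestamp pull can be illustrated with Python's standard library. This runs against a throwaway file rather than real evidence, and the path is purely illustrative:

```python
import os
import tempfile
from datetime import datetime, timezone

def file_timestamps(path):
    """Return the basic file-system timestamps for a file as UTC datetimes."""
    st = os.stat(path)
    to_dt = lambda t: datetime.fromtimestamp(t, tz=timezone.utc)
    return {
        "modified": to_dt(st.st_mtime),            # last content change
        "accessed": to_dt(st.st_atime),            # last read (often coarse or disabled)
        "changed_or_created": to_dt(st.st_ctime),  # metadata change on Unix, creation on Windows
    }

# Demo against a throwaway file standing in for a piece of evidence.
with tempfile.NamedTemporaryFile(delete=False, suffix=".pdf") as f:
    f.write(b"placeholder")
    demo_path = f.name

for name, ts in file_timestamps(demo_path).items():
    print(name, ts.isoformat())

os.remove(demo_path)
```

Note the comment on `st_ctime`: its meaning differs by platform, which is exactly the kind of detail that matters when timestamps end up in a report.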
I worked on a case where someone was trying to forge a will to gain sole ownership of a company that a man was passing along in his will. Simple metadata analysis on a PDF file: who the author was, the last modified time, the creation time. Just answering simple questions like that is what helped solve the case. Memory forensics: we talked about that. Instead of taking a forensic image of a hard drive, you capture the live memory of a computer and perform forensics on that capture afterward. Tools like Volatility and Redline can assist you there; even FTK Imager can actually capture memory. Network forensics: that's usually important when you're dealing with a piece of malware, you have an outbreak of an infection on your network, and you're trying to see what's communicating with what. Data exfiltration can potentially be seen through network forensics as well. And lastly, file carving. Does anybody know what file carving is? Yeah, go ahead. Sure. Two examples I have of file carvers are Scalpel and PhotoRec. The way file carving works: you have to think, okay, what is a file? How does the computer know what an Excel file is? How does a computer know what a PDF document is? And so on. Has anyone heard of magic numbers at all? Yeah. Magic numbers.
So, I don't call them magic numbers; I usually call them headers and trailers. They're hexadecimal values that identify the signature of a file type. They help the operating system know what an Excel file is, what a PDF file is. Tools like Scalpel and PhotoRec use those magic numbers to scan the hard disk, or whatever piece of evidence you're working with, find matches, and pull them out: carve them out. So what could you do with that? If you have lots of deleted files on a computer and you're not able to recover them by conventional means, you can carve them out based on their file signatures. They're still there as long as they haven't been overwritten by other data on the hard drive. So, goodness gracious. Let's do a demo of log2timeline real quick. We're just going to do a quick timeline generation. Okay, so this is how easy it is. I want to make a timeline of all of the data on a computer, and it's seriously just one command. You may notice that the command, hopefully that's easy to see for you, says psteal.exe. log2timeline is actually part of a set of tools called Plaso; I believe that's how you pronounce it. psteal is one particular piece of functionality from the Plaso toolset. Basically, you're saying: generate the entire timeline of absolutely everything on this piece of evidence.
Whereas with log2timeline itself, you can customize it a little more, say, I only want artifacts in the Windows System32 folder or something like that, and it'll generate a timeline based on that. psteal.exe just says: the full kitchen sink, give it all to me. So the command is just psteal.exe, --source to point at my piece of evidence, and then -w to tell it where to write the output, just a text file in CSV format. So, here we go. And this is what you end up with. It spits out a CSV file with these different column headers. You've got the timestamp; the timestamp description, which tells you what that timestamp even means and where it's coming from; the source, whether it's just file system metadata like creation time or last access time; and if you go down here, there are different types of metadata and it lists them out. It says meta, so there you go: it tells you the source, and it tells you the message. The message, let me expand it out here, is going to be the name of the file and a little bit of information about it. This particular file is a PowerPoint file, so it's pulling a piece of metadata: the description of the file. It also tells you which log2timeline parser it used to get that information, and the display name of the file. This one, you can see, is just "secret project design concept" dot PowerPoint. So, there you go.
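That spreadsheet-style review can also be scripted. Here's a minimal, hedged sketch: the toy CSV below only mirrors the columns described on screen (timestamp, description, source, message, parser, display name), so treat the header names, file names, and dates as made-up stand-ins; real Plaso output headers can differ between versions.

```python
import csv
import io

# Toy stand-in for a psteal CSV export; all rows and headers are invented.
timeline_csv = """datetime,timestamp_desc,source,message,parser,display_name
2015-03-02T09:14:22,Creation Time,FILE,Secret Project design concept,olecf,/Users/bob/secret_project.ppt
2015-03-02T09:20:05,Content Modification Time,FILE,Quarterly report,mft,/Users/bob/report.docx
2015-03-04T17:45:10,Creation Time,FILE,Secret Project budget,mft,/Users/bob/budget.xlsx
"""

def filter_timeline(csv_text, day=None, keyword=None):
    """Return timeline rows matching an optional date prefix and message keyword."""
    hits = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if day and not row["datetime"].startswith(day):
            continue
        if keyword and keyword.lower() not in row["message"].lower():
            continue
        hits.append(row)
    return hits

# "Did anything Secret-Project-related happen on March 2nd?"
for row in filter_timeline(timeline_csv, day="2015-03-02", keyword="secret"):
    print(row["datetime"], row["parser"], row["display_name"])
```

The point is simply that once the timeline is a CSV, answering "did X happen on date Y" is a few lines of filtering, the same thing you'd do with a spreadsheet filter.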
Where does it pull all the information from? Who asked the question? Great, I can just look at you. So if I took a forensic image of a hard drive and then ran psteal or log2timeline against it, it goes through the image, finds the partitions, Windows partitions, recovery partitions, whatever it sees, and then pulls the information out; the master file table is one source. The cool thing about log2timeline, which I actually really like over a lot of enterprise tools, is that it expands things out. For example, it will automatically throw Windows event logs in there alongside regular file system metadata. So yeah, it pulls from the master file table, and other files, such as Office documents written in XML format, have some different metadata fields and it will pull those as well. I hope that answered your question. Okay, cool. That's awesome. Sadness: I'm going to have to rob you guys of the slideshow. So, going back to the forensic process, the next step would be analyzing the evidence. Basically, when I analyze output like that, you can see it's just a spreadsheet. It's not some fancy tool. If you have access to Google Docs, like what I just showed you, you can analyze the timeline of a computer.
We analyze in spreadsheets, in a notepad, a terminal, a command prompt, looking at the output of certain commands. Now, when you're using all these different tools, what is something to consider, let's say you're working on a case that might go to litigation, what's an important consideration? Yes? Has it been validated in court? Have I validated it myself? That has some power in itself: hey, I ran my own tests on this. When you think of traditional forensics, CSI or whatever, what are they doing in those shows? They're running tests, all the time. It's got to be the same thing here. You have to validate that kind of stuff. You have to know what you're talking about. You have to be accurate; accuracy is so important. I worked on a case back in California where I was the forensic examiner on one side and there was one on the opposing side as well, and we were talking about USB timestamps: when a USB drive was plugged in, when it was unplugged, files being moved to it. It was a theft of trade secrets case. I used a particular tool to analyze the USB timestamps; he used a different one, and we came up with different results. Hmm, how does that happen? We're both forensic examiners; we should both be right somehow, right? So my boss is like, what happened? Looking at me like, you messed up.
That kind of thing. So I go into the Windows registry and pull out the timestamps manually. I'm not relying on a tool to do it for me. I pull them out manually and validate that the timestamps I originally produced were accurate. So now I'm thinking, okay, where is this guy getting his timestamps from? Eventually they produced their report, and I take a look at it and I can tell: oh, I've used this tool before somewhere. I do some googling; yep, it's this tool. So I go on their website, start playing around with it, and start reading the fine print. And it was something as silly as "not compatible with versions of Windows after Vista," something like that. That was it. It was a Windows 7 computer. He used that tool, and it wasn't compatible with that version of Windows. Sure enough, we go and try to validate that, and we came up with the same timestamps he did on the same piece of evidence. So you really have to consider the accuracy of these tools, and how important that can be: whether it makes or breaks the case. So yeah. Right, encrypted drives. A lot of tools nowadays, man, how do I answer that? It depends on your tool set for sure. There are some tools that can take an image of an encrypted hard drive. Let's say it's encrypted with BitLocker, and you point that forensic tool at the image.
It says: oh yeah, it's encrypted with BitLocker, give me the recovery key. You punch in the recovery key and voila, it's opened up. With some of the tools we deal with, you might have to get kind of creative. I imagine that if you mounted a BitLocker-encrypted disk image in Windows (how many minutes do I have? 20? Thank you), Windows might see it as a BitLocker-encrypted drive. I've never done that, so I can't answer that accurately. But yes, some of these tools do handle encrypted hard drives, or at least let you move forward from there. There are different ways to solve the same kind of problem; you just have to figure out how you want to approach taking the forensic image. That is a very good question, though, especially if you get handed a laptop: ask right off the bat, does this laptop have its hard drive encrypted? So yeah, good point. The last thing I want to talk about in the forensic process is reporting. That's kind of the whole point of this: we're giving the answers. I'm not going to talk too much about reporting, because this isn't a class on how to write a report; you can figure that out on your own. There are tools out there to do it: OpenOffice, LibreOffice, Google Docs. Whenever you're writing any kind of technical report, you have to consider who your audience is.
Who am I talking to? Am I talking to the chief information [34:27.530 --> 34:32.230] security officer? Am I talking to an attorney? Am I talking to an executive? [34:32.830 --> 34:37.630] Depending on who the person is, there's some assumption of technical [34:37.630 --> 34:42.850] knowledge there. An attorney is probably gonna know less about techie [34:42.850 --> 34:47.690] stuff than a CISO is. So you have to consider who your audience is, [34:47.690 --> 34:52.670] and man, is that a challenge of computer forensics: communicating super [34:52.670 --> 34:58.970] technical information in a layman's-terms kind of way. Your report could go [34:58.970 --> 35:06.070] to the courtroom, could be put in front [35:06.070 --> 35:10.570] of the courtroom, and people need to understand it and learn how to process [35:10.570 --> 35:19.270] that evidence. So yeah, reporting is a huge part. So all that said, I'm kind of [35:19.270 --> 35:25.250] winding down here. We have all these free tools out there. We have the [35:25.250 --> 35:29.430] knowledge of the forensic process. Where does that leave us now? What are [35:29.430 --> 35:33.830] the capabilities there? And this is what I've been experimenting [35:33.830 --> 35:39.250] with lately: with a little programming and scripting [35:39.250 --> 35:47.610] knowledge, am I able to make a very nice, fluid process to create a [35:47.610 --> 35:51.990] forensic image, process it the way I want it processed, and produce the exact [35:51.990 --> 35:55.930] kind of information that I care about? The answer is absolutely yes, you can do [35:55.930 --> 35:59.410] that. With a little bit of scripting knowledge, a little bit of [35:59.410 --> 36:05.170] programming knowledge, you can automate just about any kind of forensic task out [36:05.170 --> 36:11.570] there.
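As a minimal sketch of that kind of automation: one of the first steps any scripted imaging pipeline should handle is re-hashing the acquired image and comparing it against the hash recorded at acquisition time. The function names and chunked-read approach here are my own choices, not from any particular forensic tool.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a disk image in 1 MiB chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify_image(image_path: str, expected_hash: str) -> bool:
    """Re-hash an acquired image and compare against the acquisition hash."""
    return sha256_of(image_path) == expected_hash
```

A script like this can sit between the imaging step and the processing step, so nothing downstream ever runs against an image whose hash no longer matches.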
Another thing to consider, especially with memory forensics, which I [36:11.570 --> 36:17.710] didn't talk too much about today, is baselining your evidence. You [36:17.710 --> 36:23.530] might have one particular image running on all your Windows desktops at [36:23.530 --> 36:27.910] your company. They're all gonna have the same kind of processes sitting in [36:27.910 --> 36:33.010] memory. Can we baseline that to make our investigation a little more [36:33.010 --> 36:39.330] efficient down the road? Yeah, we absolutely can. Another thing I want to [36:39.330 --> 36:43.950] make clear before I wrap up, and if you guys have questions, I left some time for [36:43.950 --> 36:49.770] that, but another thing I want to make clear: I love forensic tools that [36:49.770 --> 36:54.010] you pay for, so I'm not bashing them today. I just wanted to [36:54.010 --> 36:59.710] highlight the possibilities of these free tools out there, and where [36:59.710 --> 37:03.350] they fit in the forensic process. And hopefully I did a good job of that. [37:03.350 --> 37:07.350] Hopefully you guys got a good understanding, or at least a good [37:07.350 --> 37:12.710] overview, of the forensic process, how these tools can be useful to you, and [37:12.710 --> 37:16.650] how we can even automate them after we've mastered using [37:16.650 --> 37:23.310] them. So yeah, there's plenty of enterprise-level, paid-for [37:23.310 --> 37:28.350] forensic tools out there that I use. They're awesome. They have their purpose. [37:29.170 --> 37:38.490] So yeah, anybody have some questions for me? Yeah? What's that? [37:43.150 --> 37:48.710] That's a great question: how do you do forensics in the cloud? Wow. So I've [37:48.710 --> 37:53.690] come across this a few times. It really depends on what kind of piece of [37:53.690 --> 38:00.290] evidence it is.
So let's take an Amazon web server, for example. It depends [38:00.290 --> 38:07.530] on what kind of service that server is hosting, and the way I'm going [38:07.530 --> 38:12.310] to connect to it. Am I just gonna traditionally authenticate to it, like I [38:12.310 --> 38:18.990] normally would with an Amazon web server? Some tools out there have that built [38:18.990 --> 38:26.010] into them, where you say, point me to a server, an Amazon web server, or a Gmail [38:26.010 --> 38:32.210] account even. You pop in the credentials and it's going to pull down [38:32.210 --> 38:42.450] the data from there. Now, considering that, it makes you wonder: okay, we talked [38:42.450 --> 38:46.830] about preserving the evidence, write-blocking things, earlier. What [38:46.830 --> 38:49.870] does that mean for cloud information? Does [38:49.870 --> 38:53.330] that totally get rid of the credibility of [38:53.330 --> 38:59.590] the evidence at that point? Not necessarily, no. What we [38:59.590 --> 39:04.930] try to do as forensic examiners is be honest about the data. [39:04.930 --> 39:09.350] For example, if we made a mistake in collecting the data, it's not good to lie. [39:09.350 --> 39:13.530] It's not good to say, oh, well, that's normal behavior, or, [39:13.530 --> 39:19.090] you know, other forensic examiners do that. No, we mess up, we have to own it. [39:19.090 --> 39:23.190] We've got to be responsible. We have to say, this is what I changed about the [39:23.190 --> 39:28.150] evidence, take responsibility for it, and move forward from there. Same kind [39:28.150 --> 39:33.990] of thing with cloud data. We have to recognize it for what it is. We show the [39:33.990 --> 39:38.870] steps that we took to collect that data. We document everything so it can be [39:38.870 --> 40:00.730] reproduced.
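That documentation habit can itself be scripted. Here's a minimal sketch of a collection log that records each item pulled down from a cloud source, with its hash, collection time, and a free-text note where any mistakes get owned up to. The class and field names are hypothetical, not from any real tool.

```python
import hashlib
import json
from datetime import datetime, timezone

class CollectionLog:
    """Append-only record of what was collected, when, by whom, and its hash."""

    def __init__(self, examiner: str):
        self.examiner = examiner
        self.entries = []

    def record(self, source: str, data: bytes, note: str = "") -> dict:
        entry = {
            "source": source,  # e.g. a mailbox, bucket, or API endpoint
            "sha256": hashlib.sha256(data).hexdigest(),
            "collected_utc": datetime.now(timezone.utc).isoformat(),
            "examiner": self.examiner,
            "note": note,  # the place to document anything that changed the evidence
        }
        self.entries.append(entry)
        return entry

    def dump(self) -> str:
        """Serialize the whole log so the collection can be reviewed or reproduced."""
        return json.dumps(self.entries, indent=2)
```

The point is not the format; it's that every pull from the cloud leaves a timestamped, hashed trail that someone else can follow.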
Same thing with memory forensics too. Yeah, go ahead. Sure, chain [40:00.730 --> 40:06.250] of custody. Yeah, that's a really good question. I don't use any [40:06.250 --> 40:10.110] particular software to handle chain of custody. For chain of custody, I [40:10.110 --> 40:14.830] guess I could say Adobe PDF or whatever. You could make a PDF [40:14.830 --> 40:20.430] document of a chain of custody form, and depending on what organization you work [40:20.430 --> 40:25.250] with, or if you're just a private contractor, you could dish those out to [40:25.250 --> 40:31.290] people along the way. As soon as that evidence comes into play, let's say [40:31.290 --> 40:34.870] you're gonna be working with the server admin. That's gonna be the [40:34.870 --> 40:39.190] first name on the chain of custody, and the server room is gonna be where it [40:39.190 --> 40:42.610] came from. That kind of thing. Does that answer your question at all? [40:52.070 --> 40:57.390] Right, right, yeah. And you really want to be involved in the process from [40:57.390 --> 41:01.830] the start, and try to butt your way in there as soon as you can. [41:02.150 --> 41:06.530] Because if any of this information is new to you, it's new to other people [41:06.530 --> 41:11.150] out there too. They don't know how to preserve things, and they might change [41:11.150 --> 41:15.950] things about a piece of evidence before it even comes to you. And you need to [41:15.950 --> 41:19.890] know that information. A really common thing is: I'm [41:19.890 --> 41:26.150] just gonna shut the server down. Well, it might be important, you know. So [41:26.150 --> 41:44.670] yeah. Any other questions? Yeah, I'm sorry, an online service? Yeah, okay. [41:56.230 --> 42:11.470] I'm not sure I understand the question. I'm sorry. Sure, okay. Gotcha, okay, yeah. [42:21.070 --> 42:34.770] Okay, okay, right.
Oh, I gotcha. Okay, that's an excellent question. [42:35.390 --> 42:38.130] Yeah, if you're dealing with something like child pornography, [42:38.130 --> 42:45.370] that's super serious. I've worked with a few of those cases, and typically, [42:45.950 --> 42:50.530] because I'm not law enforcement, it gets passed along pretty quickly. It's my [42:50.530 --> 42:56.470] obligation, if I'm dealing with child pornography or any kind of case like that, [42:56.470 --> 43:00.810] as soon as I know that's what I'm dealing with, it's gone. Like, bye-bye. [43:00.810 --> 43:05.730] Calling the police, calling the FBI, whatever. It's going to them. So to answer [43:05.730 --> 43:12.530] your question, they would have to subpoena the ISP to get that information.
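To close out the chain-of-custody question from earlier: the form really can be that simple. Here's a sketch of the same record as a data structure, the kind of thing a PDF form or a script could capture. Every field name here is my own illustration, not a standard.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CustodyTransfer:
    item: str          # evidence identifier, e.g. "server system drive"
    released_by: str   # e.g. the server admin, the first name on the form
    received_by: str
    location: str      # where the evidence came from, e.g. "server room"
    timestamp: str     # when the hand-off happened
    purpose: str       # why custody changed, e.g. "imaging"

@dataclass
class ChainOfCustody:
    case_id: str
    transfers: List[CustodyTransfer] = field(default_factory=list)

    def hand_off(self, transfer: CustodyTransfer) -> None:
        self.transfers.append(transfer)

    def current_custodian(self) -> str:
        """The last person to receive the evidence, or unassigned if none."""
        return self.transfers[-1].received_by if self.transfers else "unassigned"
```

Whether it lives in a PDF, a spreadsheet, or code, the essentials are the same: who released it, who received it, where, when, and why, for every hand-off.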