I tend to prefer having a large selection of tools to pick from so that I can be highly specific in solving problems. The fewer options I have, the more likely I am to do an “OK” job instead of a “good” job. Worse still, without the right tools, I may not be able to do the job at all. With that, I say things like, “The more tools in your toolbox, the more problems you can solve.”
This doesn’t mean tools solve problems. It doesn’t mean that anyone with many tools can solve problems. It doesn’t mean that any tool can solve any problem. And it doesn’t mean that merely having lots of tools means you can solve lots of problems. I simply mean that by having a choice of tools that I am competent to use, I can pick the right tool for the right job.
To elaborate a little more, I do not believe that having every tool in existence is a reasonable plan. There are far too many tools available for any one person to be competent in all of them (even at the lowest level of competence!), or to keep them updated, or to validate them, or to use them often enough to remember which button (or command) does what. It is just too much.
Each person has their own sweet spot for how many tools they need for their job. For some, this may be a few. For others, it may be more. And over time, the sweet spot changes. Even with different jobs, the sweet spot will change. You have to find your sweet spot, and no one else can tell you what that is. Did I mention that your sweet spot of tools will change?
Related experience on having too many things: In my younger life, days after I checked into 2/3, I was headed to the field for a month (PTA). What did I do to prepare for 30 days in the field? I went to the PX and bought a bunch of nicely packaged and enticing junk marketed as “the top 10 things that Marines need to bring to the field.” I brought this junk back to the barracks and started packing and strapping it to my ALICE pack. The old salts in my squad ribbed me pretty hard about wasting my money on ‘junk’. Everything I bought promised to make field life easier for Marines. The packages said as much! Days into the field, I cursed every little junk item I bought because all it did was add weight to everything I was carrying and did nothing to make fieldwork easier. Instead, I learned from the salty Marines what actually works in the field, knowledge built on a couple hundred years of the Corps’ existence, not some toy from the PX. Some of the NCOs that I mirrored were magicians in the field. Literally, they performed magic. The things I learned that a person can do with the right tools showed me that it is never just the tool, but the right person appropriately employing the right tool. I have carried that lesson from that first month in the field all the way to this blog post.
"It's not the machine, but the examiner, that does the work" - Brett Shavers
Some questions have one answer. What is 2+2? Is fire hot? Is ice cold? Other questions have more than one answer. That is why this blog post is titled, “What’s the best way to get to Spokane from Seattle?” No one can answer this question for someone else. There are too many variables involved for there ever to be one answer to the best way to get to Spokane from Seattle. Variables such as: How soon do you need to be there? How much are you willing to pay for travel expenses? Do you prefer to drive, fly, or ride a bus or train? Do you like a window seat or aisle? And hundreds of other personal questions that affect the best way for you to get to Spokane from Seattle. Point being, best for you may not necessarily be best for me.
So, when I hear “the best forensic tools” or “this is the best forensic tool,” I assume that the person stating or writing such a thing is speaking solely for themselves, as there is no way that they are speaking for me. It is impossible. The variables in choosing a software tool are no different from those in choosing something to eat for lunch. It depends on everything. Time to eat. Money to spend. Locations available. Type of food available. Allergies. Food preferences. Tastes. Desires for a certain food at that very moment. The person eating with you and their preferences. Any “top list” of anything is useless except to the person creating the list.
The point: You choose the best tool for you that can solve the problem.
I have written things about forensic software, but never stated that one tool is better than another, or that you only need one tool. The closest thing I have written that could be interpreted as such was in the X-Ways Forensics Practitioner’s Guide. And even then, I simply stated that if you are proficient in X-Ways Forensics, then you probably know your stuff and probably know how to use other tools too.
https://www.amazon.com/gp/product/0124116051/ref=dbs_a_def_rwt_bibl_vppi_i1
The appropriate number of tools is however many (1) you can maintain competence in and (2) you actually need. If you can’t maintain competence in a tool, get rid of it. If you don’t need a tool, why do you have it? Don’t overwhelm yourself with tools that you can’t use competently or don’t need to use at all.
I had an email recently asking my opinion on validation. I happened to be extremely busy when I saw the message on my phone and didn’t have the time to respond appropriately. In short, the question was: how do you validate tools, how often, and do you use test images that are online? Wow. Tough question, actually.
Without writing a book on forensic tool validation, all I can say is that I have spoken to many about this subject and have found that it is a rare examiner who validates all of their tools, and even then, rarely does so regularly. The vast majority simply buy (or download) a tool and use it. Validation happens when they use another tool to check the work of the first tool on finding an artifact…in real cases. The result is that many of us use a tool (that we didn’t validate) to find evidence, and then we use another tool (that we also didn’t validate) to validate our findings. Hint: I test my tools against other tools and against known data sets.
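As a rough illustration of that kind of cross-check, here is a minimal Python sketch that compares two tools’ exports against each other and against a known data set. The file names and CSV columns are hypothetical; adapt them to whatever your tools actually export.

```python
# Minimal sketch: cross-check two tools' artifact exports against each other
# and against a known data set. File names and column names are hypothetical.
import csv

def load_artifacts(csv_path):
    """Load a set of (file_path, sha256) pairs from a tool's CSV export."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return {(row["path"], row["sha256"].lower()) for row in csv.DictReader(f)}

tool_a = load_artifacts("toolA_export.csv")   # first tool's file listing
tool_b = load_artifacts("toolB_export.csv")   # second tool's file listing
known = load_artifacts("known_dataset.csv")   # what you KNOW is on the image

# Disagreements between the two tools are leads to chase down; disagreements
# with the known data set are findings about the tools themselves.
print("Found by A but not B:", sorted(tool_a - tool_b))
print("Found by B but not A:", sorted(tool_b - tool_a))
print("Missed by A:", sorted(known - tool_a))
print("Missed by B:", sorted(known - tool_b))
```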
I’m not a software developer, but I am aware of what software testing is and how to do it. There are books on how to test software, with processes that range from simple to complex. I recommend picking up one or more of these books to get a handle on software validation before you get asked about it on the stand. Seriously. Check into it, because it happens, or at least it happened to me. At a minimum, I suggest taking advantage of Paraben's free ebook on validating forensic tools (https://paraben.com/validation/).
As for the online test images, I believe that they have their place if they were developed for testing. There are freely downloadable images that came from used and discarded computer systems purchased on the private market, and we really have no idea what happened on those systems other than what the tools tell us. I find this very exciting, but only for the sake of curiosity: what data did people throw out without knowing the risk? In my opinion, these are the worst images to use as test images, because we have to trust the tools to tell us what happened on the systems. How can you test a tool against data that you cannot validate, when your only knowledge of that data comes from trusting a tool to be correct?
Test images should be images where you know exactly what data is on them and exactly how the data was created. If you don’t have documentation of the activity that occurred on the image, then the only thing you are testing is your patience while the software runs. If you don’t know for a fact what happened on an image (prior to imaging, of course), then how do you know your tool is performing correctly? You don’t, because you are trusting the tool to be accurate about data that you can’t validate, in order to validate the tool you are testing…
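To make that concrete, here is a minimal sketch of what documenting a test image as you build it can look like: every file you plant gets logged with a timestamp and a hash, so a tool’s output can later be checked against facts you created yourself. The paths and manifest name are illustrative, not a prescribed format.

```python
# Minimal sketch: log every planted file with a UTC timestamp and SHA-256
# hash while building a test image. Paths and file names are illustrative.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

MANIFEST = Path("test_image_manifest.json")

def record_planted_file(path, note, manifest=MANIFEST):
    """Append one documented action (a planted file) to the manifest."""
    data = Path(path).read_bytes()
    entry = {
        "path": str(path),
        "sha256": hashlib.sha256(data).hexdigest(),
        "size": len(data),
        "recorded_utc": datetime.now(timezone.utc).isoformat(),
        "note": note,  # what you did and why, in your own words
    }
    entries = json.loads(manifest.read_text()) if manifest.exists() else []
    entries.append(entry)
    manifest.write_text(json.dumps(entries, indent=2))

# Example: document a file immediately after planting it on the staging drive.
record_planted_file("E:/staging/resume.docx", "Copied from USB at step 4")
```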
**EDITED** 12/21/18
To clarify my initial thoughts on test images and tool validation, I think it better to state that the tools may be accurate in parsing the data on images for which you have no assurance of the activity, but the results may still be incorrect or inconclusive. What I mean, by way of one example, is that the tool may parse the data correctly, but the data itself may have been manipulated with anti-/counter-forensic techniques. To show this, I've created a trick question in classes where I planted anti-/counter-forensic data (user-created files) onto a drive purposely to throw off an analysis. Tricks are unfair in teaching, but this sort of exercise makes several points, such as how to state conclusions in a report, why a file and its supporting metadata may or may not exist, and why you should not jump to conclusions at the first sight of evidence.
The steps I took were simple:
The result was that most of the class assumed that the files were created by the logged-on user account, on the dates and times of the documents/files. Some of the class questioned how the files were created (not downloaded, no evidence of the applications being run, no USB connections, no LNK files, etc.). The tools were correct in pulling the data, but the conclusions were wrong about the data itself. The point being made was that finding the evidence comes first. Next is to validate the evidence (is it really evidence or not?), and then to come to some conclusion about how the evidence was most likely created, supported by other corroborating evidence (other than the actual data file itself). The result was that every time the 'evidence file' was found in classwork, students really worked to make sure they grabbed as much supporting evidence on that file as possible.
*The user-created files were time stomped to match Internet activity dates/times on the drive.
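For anyone building a similar exercise, here is a hedged sketch of what that kind of time stomping can look like on a staged drive. Note that Python's os.utime() only rewrites the modified and accessed times; on NTFS, the creation time requires a platform API (such as pywin32's SetFileTime), and the $FILE_NAME timestamps are not touched at all, which is exactly the kind of mismatch a careful examiner can catch. The target path and date are made up.

```python
# Sketch: back-date a planted file's timestamps to blend in with other
# activity on the drive. The target path and chosen date are hypothetical.
import os
from datetime import datetime, timezone

target = "E:/staging/resume.docx"  # planted file from the exercise
stomp_to = datetime(2018, 3, 14, 9, 26, 53, tzinfo=timezone.utc)

epoch = stomp_to.timestamp()
os.utime(target, (epoch, epoch))   # rewrites (atime, mtime) only
print(f"Stomped {target} to {stomp_to.isoformat()}")
```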
I have a set of test images that I have created over the years. For each image, I have extensive documentation of everything I did on that image, with date and time. It is a lot of work. A serious amount of work, but I now have a library with different OSs and different types of evidence planted on the images. When I run a tool on an image, I compare the result of the tool with my notes. It should match exactly, and if it does not, either I used the tool wrong or the tool doesn’t work. I know that because I planted the evidence and know exactly what the evidence is. I know how it got on the disk. I know when it was put on the disk. I know because I did it and documented it as I was doing it. My test images are validated by me, for me. You can’t do that with an image you find online. You can’t even do it with an image someone gives you, because you are trusting someone else with validation of the data on the image! At best, you are trusting the creator of an image not only to give you accurate information about the image, but also to have accurately documented the creation of the data on it. Think about that for a moment.
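Comparing tool results with those notes can be scripted, too. Building on the hypothetical manifest sketched earlier, this is roughly what checking a tool's export against your own documented ground truth could look like; again, the file names and columns are assumptions.

```python
# Sketch: compare a tool's CSV export against the documented manifest.
# "Missed" items are planted evidence the tool failed to report.
import csv
import json

with open("test_image_manifest.json", encoding="utf-8") as f:
    expected = {(e["path"], e["sha256"]) for e in json.load(f)}

with open("tool_export.csv", newline="", encoding="utf-8") as f:
    found = {(row["path"], row["sha256"].lower()) for row in csv.DictReader(f)}

missed = expected - found  # should be empty if the tool (and I) did the job
extra = found - expected   # may be legitimate OS files, but worth a look

print(f"{len(missed)} planted item(s) missed by the tool")
for path, sha in sorted(missed):
    print(f"  MISSED: {path} ({sha[:12]}...)")
```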
One question that I saw on Twitter a while back concerning the software listings on dfir.training, was something to the effect of “are all these tools validated?” This is a legitimate question because there are over 1,300 software listings. The only accurate answer is that none of the software is validated. Not a single one. Not a single tool on Github is validated. Nothing on SourceForge is validated. Not a single commercial suite that costs thousands of dollars is validated. No open source programs are validated either. None of them. Nada. Zip.
The only tools that are validated are the tools that you personally test. Out of that 1,300+ tool listing, whatever you download and use is up to you to validate. Out of that 1,300+ tool listing, you may only ever need 5 or 50 or 500 of those tools in your lifetime. Again, that is totally up to your situation and needs, and validation falls upon you. Sorry, but that is the way it works.
Everyone is different, because we are. Every scenario is different, because they are. Tools are different, because they are developed by different people and for different scenarios. All of this adds up to an infinite number of solutions, with each person deciding which tools to pick for specific scenarios.
Here is how I pick tools (keeping it simple…):
That’s it. For every single scenario, I go through the same process. Some are quick and easy to figure out. If the job is imaging an easy-to-access single hard drive without any encryption in a desktop, then the choice is quick and simple. Scenarios beyond that add a bit of complexity with each additional obstacle to overcome. This process covers everything from basic imaging to a full-fledged network breach that is bleeding data like a stuck pig. There comes a point where I can’t handle a problem because it is way outside the scope of what I know and of the time I would need to learn what is needed. If I came across a problem that required me to be a program developer, I could do it if I had the time and the problem could wait while I got a degree in computer programming. But I know my limits, and I know how long it takes me to learn a new application to a competent level if that can be the solution.
Back in the day, we didn’t really have much in the way of software choices. If you started back in the Norton Disk Editor days…you were really limited in choices overall. Today, we have many (too many?) to choose from. Then we have personal preferences. I know examiners who swear by one particular forensic suite (name any suite and I’ll show someone that swears only by it). Others won’t touch a suite because they prefer to use small tools to solve problems. By small tools, I mean those forensic tools that do one specific thing rather than a suite of functions. Some demand push-button only, others want CLI only. Some only use Windows-based, others only Mac, and believe it or not, some only use Linux-based forensic applications. Many use a combination of all of these, because it depends on the problem to solve coupled with competence in specific tools.
I never question someone’s preferences in tool selection or tool development, because preferences. As long as the problem can be solved, personal preferences don’t really matter.
It is only when personal preferences interfere with problem solving that it matters. When someone keeps trying to force a solution that keeps failing or is obviously inappropriate, then the problem is never solved and gets worse. If a tool is not working on a problem, and you can’t fix it, then quickly move to something that works.
You should be able to flow from tool to tool, solving problem after problem.
Accept now that one tool does not do it all. This includes reporting. I have seen comments about forcing one suite to accept reports from other suites and tools because the examiner wants to press ‘print’ and have it all done in one go.
In reality, each suite creates its own reports, and many (most?) small tools don’t create reports at all. They will spit out the data, but not so much a report of the data like a suite will. Unless you are only using one suite for a case, you will be hodge-podging a report together from multiple suites and the output of small tools. Yes, some suites allow for easy importing of other reports, but as for me, I am combining small-tool outputs with suite reports, adding software logs, pasting screenshots, and typing statements and summaries to form one report. When I take a course in a tool where the provider touts its reporting feature as the be-all and end-all, I kinda tune out, because I’ve heard that song before.
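As one example of what that hodge-podging might look like when scripted, here is a minimal sketch that stitches small-tool text outputs and logs into a single Markdown report. Every path and section name is hypothetical; screenshots and written summaries would still be added by hand.

```python
# Sketch: assemble small-tool outputs and logs into one Markdown report.
# All paths and section titles are placeholders for your own case files.
from datetime import date
from pathlib import Path

SECTIONS = [
    ("Examiner Summary", Path("notes/summary.txt")),
    ("Tool A Output", Path("output/toolA_results.txt")),
    ("Tool B Output", Path("output/toolB_results.txt")),
    ("Acquisition Log", Path("logs/imaging.log")),
]

parts = [f"# Case Report - {date.today().isoformat()}"]
for title, path in SECTIONS:
    parts.append(f"\n## {title}\n")
    if path.exists():
        parts.append(path.read_text(errors="replace"))
    else:
        parts.append(f"(missing: {path})")  # flag gaps instead of hiding them

Path("case_report.md").write_text("\n".join(parts), encoding="utf-8")
```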
I do my best to avoid technical writing, except for a few pieces that I want to put out that others may not be aware of. The only reason I do not want to put out technical pieces is that so many others are doing fantastic work in publishing their research. David Cowen’s test kitchens are the most innovative online forensic videos that I’ve seen. There are several on Twitch doing the same sort of thing (hacking mostly), but Dave’s fits the area where I work in forensics, so I really appreciate what he is doing. Others (too many to mention) are writing blog posts with some juicy technical forensic info. What I find missing is the investigative aspect, the principles and concepts of forensics, and the personal facets of forensic work. That’s what I tend to focus on when teaching and writing. I believe we can all learn the technical aspects, particularly when we have some outstanding researchers sharing their knowledge! My objective is to push the other side of the coin, the side that focuses on using your brain to make decisions, to think things through, and to solve any problem that has multiple solutions by knowing how to think.
There are many ways. There are side roads, service roads, flight paths, and train tracks. Rather than assume that you can only drive to Spokane, evaluate the options and pick the solution that fits your situation at that moment, because driving may be best today, but flying may be best next week. No different with forensics. Today’s solution may be different from tomorrow’s.
I tend to take a path to Spokane that starts with the goals of my analysis, with the next step being the development of an analysis plan. Many times, I'm able to achieve my analysis goals with FTK Imager, tools to parse data sources into timeline format, and Notepad++.
Another hidden nugget in there: knowing your goals exactly lets you pick exactly what you need (a specific socket wrench, not an entire box of wrenches).
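For the timeline part of that workflow, here is a minimal sketch of normalizing several parsers' CSV exports into one chronological file you can eyeball in a text editor. The input file names and column mappings are assumptions; each real tool needs its own mapping.

```python
# Sketch: merge several tools' CSV exports into one sorted timeline.
# File names and column names are hypothetical stand-ins.
import csv

SOURCES = {
    "mft_parsed.csv": ("timestamp", "description"),
    "browser_history.csv": ("visit_time", "url"),
    "eventlog_parsed.csv": ("time_created", "message"),
}

rows = []
for filename, (time_col, detail_col) in SOURCES.items():
    with open(filename, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Assumes each tool exports ISO-8601 UTC timestamps, which
            # sort correctly as plain strings.
            rows.append((row[time_col], filename, row[detail_col]))

rows.sort()
with open("timeline.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_utc", "source", "detail"])
    writer.writerows(rows)
```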
Be sure to check out my DFIR Training website for practically the best resources for all things Digital Forensics/Incident Response related.