Interview with Brian Carrier, author of “File System Forensic Analysis”

April 18, 2005

Brian Carrier specializes in digital forensics and is the author and maintainer of The Sleuth Kit and Autopsy, two open source forensic tools. In this interview with IT-Observer he discusses digital forensics as well as his book.

Who is Brian Carrier? Introduce yourself to our readers.

I love to figure out how things work and how to find digital evidence. I write and maintain The Sleuth Kit and Autopsy, which are open source forensic tools that can be used to find and recover digital evidence from disks and file systems. I find the challenge of determining what happened on a system exciting, and that is why I wrote File System Forensic Analysis (FSFA).

I am also a Ph.D. candidate at Purdue University / CERIAS and was previously a consultant at @stake Inc. in Boston where I led the incident response team and forensic lab. I frequently speak and teach about digital forensics and am a member of the Honeynet Project. I also do academic things like publish papers and review them for journals and conferences.

How did you gain interest in computer security?

I first became interested in computer security while working as a system administrator. When the company got its first Internet connection I was involved with the installation of the firewall and I became hooked. From then on my focus has been on security, including cryptography and digital forensics.

When did you start working with digital media evidence (computer forensics)?

I first became interested in digital forensics about 5 years ago after attending a presentation on the topic. Later, Dan Farmer and Wietse Venema released a collection of open source forensic tools called The Coroner’s Toolkit (TCT), which I used to learn about low-level Unix-based file system details. In early 2001, I released a set of tools called TCTUTILs, which added new features to TCT. At the same time, I released Autopsy, which was a graphical interface to the command line tools. Over time, more features and file system types were added to TCTUTILs and I split them from TCT to create The Sleuth Kit (TSK).

What made you decide to write File System Forensic Analysis?

I wrote FSFA because of the lack of documentation on file systems. While writing TSK, I frequently needed to refer to file system source code to figure out how things worked. The book bridges the gap between high-level descriptions and source code: it gives an overview of the concepts along with the data structures and hexdumps from actual disks. Many current digital forensic books describe the important concepts of file systems, but they do not help an investigator testify about how his analysis tools work. With the knowledge from this book, an investigator can testify more intelligently about the evidence and can better test his analysis tools.
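As a rough illustration of the kind of low-level, data-structure-level work described here (this sketch is not from the book; the field offsets come from the published FAT specification), a forensic tool decodes on-disk structures such as a FAT boot sector field by field:

```python
import struct

# Build a synthetic 512-byte FAT boot sector for illustration;
# in a real investigation these bytes would come from a disk image.
sector = bytearray(512)
sector[3:11] = b"MSDOS5.0"               # OEM name (offset 3, 8 bytes)
struct.pack_into("<H", sector, 11, 512)  # bytes per sector (offset 11, u16 LE)
struct.pack_into("<B", sector, 13, 8)    # sectors per cluster (offset 13, u8)
struct.pack_into("<H", sector, 14, 32)   # reserved sector count (offset 14, u16 LE)
sector[510:512] = b"\x55\xaa"            # boot sector signature

# Decode the same fields, as an analysis tool would.
oem = sector[3:11].decode("ascii")
bytes_per_sector, = struct.unpack_from("<H", sector, 11)
sectors_per_cluster, = struct.unpack_from("<B", sector, 13)
reserved, = struct.unpack_from("<H", sector, 14)

print(oem, bytes_per_sector, sectors_per_cluster, reserved)
```

Being able to verify a tool's output against the raw bytes like this is exactly what lets an investigator testify about how the tool reached its conclusions.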

How long did it take you to complete “File System Forensic Analysis” and what was it like?

I am the sole author of FSFA and it was released about 2 years after I started to outline it. Prior to this book, I wrote two chapters on forensics for the Honeynet Project’s Know Your Enemy 2nd Edition and learned many things from that process, which made writing FSFA easier. One of the challenges that I had while writing FSFA was that no similar books existed to use as a template, so I revised the outline several times in the process.

What are your favorite security and computer forensic tools?

As an author of digital forensic tools, I prefer mine. I like knowing how things work and the design of TSK (which includes nearly 20 command line tools) makes it easier (in my opinion) to understand what steps are needed to find evidence. The FSFA book is not about my tools, but they are used to illustrate examples. I think that open source tools have an advantage in digital forensics because they allow a lab or investigator to verify what a tool is doing.

What is, in your opinion, the biggest challenge in digital media investigation?

I think two of the bigger challenges that companies face with respect to digital investigations are downtime and storage sizes. In some cases, critical systems cannot be taken down when they are involved in an incident, and performing a live analysis on the system introduces new challenges with respect to the trustworthiness of the evidence. With respect to storage sizes, it is common to make duplicate copies of all desktop systems before they are analyzed, but that is not always practical when dealing with terabyte arrays on servers.

What are your future plans?

My next major goal is to finish my Ph.D. dissertation (on digital forensics) at Purdue.
