Monday, February 25, 2008

Getting Started, pt II

Okay, in the face of recent (and completely bullsh*t) claims by Sen. Clinton that Sen. Obama plagiarized speeches (so the guy used some phrases...so what?), I thought it would be best to be up-front and come clean...I did not have se...oh, geez...wait a sec...

I was on the e-Evidence site this morning and saw a paper listed from Kennesaw State University entitled "Digital Forensics on the Cheap: Teaching Forensics Using Open Source Tools", by Richard Austin. This paper goes right along with what I was referring to in my earlier post, but also takes things a step further with regard to using specific tools, in this case Helix and Autopsy. It's a great read and definitely very useful.

So, you're probably wondering...what's the point? Well, lists of free and open-source tools, along with documents that describe their use, can provide a solid foundation in the fundamentals (and even in more advanced information and techniques) of computer forensic analysis. Some college (community college as well as university) courses may not have the budget for the more expensive tools, but they can provide the time and impetus necessary for folks wanting to learn and develop skillz to do so.

Access to sample images, to tools for creating and acquiring images, and to tools for analysis also provides a foundation for training programs aimed at developing more advanced skill sets. Not only that, but new areas of computer forensic analysis can be explored...for example, it's not terribly difficult to locate malware on a system, but one area that often isn't explored is how it got there in the first place. Training sessions and brown-bag or white-board discussions all lend themselves very well to advancing the knowledge base of any group of forensic analysts, and the availability of these tools and images puts the basis for such training within reach of anyone with a Windows system and some storage space.

One final thought to close out this post (but not this subject)...has anyone thought about using these resources as part of an interview process? I can easily see three immediate ways of doing so...
  • 1. Query the interviewee with regard to their familiarity with the tools and/or techniques themselves; if familiarity is mentioned or discovered during the interview process, ask probing questions about the use of the tools (Note: this requires the interviewer to be prepared).

  • 2. Prior to the actual interview, have a candidate perform an exercise...point them to a specific image, and give them instructions on what tools to use (or not to use). Part of the interview can then be a review of their process/methodology.

  • 3. If an interview is conducted on-site, with the candidate coming into the facility (rather than a remote interview), have the candidate sit down at a workstation and solve some problem.
The whole point of using these tools and techniques as training and evaluation resources would be to get analysts thinking and processing information beyond the point of "Nintendo forensics"...that is, beyond just pushing a button to get information. After all, how do you know whether the information you receive is valid? Does it make sense? Is there a way to dig deeper, or a technique that will validate your data?

When First Responders Attack!!

It still happens...an "event" or "incident" occurs within an organization, and the initial response from the folks on-site (most often, the organization's IT staff) obliterates some or all of the "evidence" of the event. Consultants are called in to determine "how they got in", "how far they got" into the infrastructure, and "what data was taken", and are unable to answer those questions completely (if at all) due to what happened in the first hours (or, in some cases, days) after the incident was discovered.

Check out Ignorance wrecking evidence, from AdelaideNow in Australia. It's an excellent read from the perspective of law enforcement, but a good deal of what's said applies across the board.

One of the things consultants see very often is a disparity between what first responders say they did during an initial interview and what the analyst sees during an examination. Very often, the consultant is told that the first responders took the system offline but didn't do anything else. However, analysis of the image shows that anti-virus and anti-spyware tools were installed and run, files were deleted, and files were even restored from backup. A great deal of this can be seen once the approximate timeline of the incident is determined...very often, you'll see an administrator log in, install or delete/remove things, etc., even though they said they didn't do anything.

Why would this matter? Let's take a look...

Many analysts still rely on traditional examination techniques, focusing on file MAC times, etc. An admin logs into a system and runs an AV or anti-spyware scan (or both...or just 'pokes around'...something that happens a LOT more than I care to think about...), and now all of the file access times on the system have been modified, and perhaps some files have been deleted. Anyone remember this article on anti-forensics that appeared in CIO Magazine? Why worry about that stuff when there is more activity of this nature occurring due to the operating system itself, or to regular, day-to-day IT network ops?
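To see what this means for an examination, consider how easy it is to put together a rough timeline. Here's a minimal Perl sketch that walks a mounted image (the mount point is a placeholder, and the image should be mounted read-only, of course) and sorts the MAC times into a single timeline. Run something like this against an image after an admin has run a full AV scan, and you'll see thousands of access-time entries collapse into one narrow window:

  #!/usr/bin/perl
  # Minimal sketch: build a rough MAC-time timeline from a mounted
  # (read-only!) image. The mount point below is a placeholder.
  use strict;
  use warnings;
  use File::Find;

  my $mount = shift || 'F:/';    # placeholder read-only mount point
  my @events;

  find(sub {
      return unless -f $_;
      my ($atime, $mtime, $ctime) = (stat($_))[8, 9, 10];
      # Note: on Windows, Perl reports the file creation time
      # in the ctime slot
      push @events, [$mtime, 'M', $File::Find::name],
                    [$atime, 'A', $File::Find::name],
                    [$ctime, 'C', $File::Find::name];
  }, $mount);

  foreach my $e (sort { $a->[0] <=> $b->[0] } @events) {
      printf "%s  %s  %s\n", scalar gmtime($e->[0]), $e->[1], $e->[2];
  }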

So what's the solution? Education and training, starting with senior management. They have to make it important. After all, they're the ones that tell IT that systems have to stay up, right? If senior management were really aware of how many times (and how easily) their organization got punked or p0wned by some 15-year-old kid, then maybe they'd put some serious thought and effort into protecting their organization, their IT assets, and more importantly, the (read: YOUR) data that they store and process.

Thursday, February 21, 2008

Important Memory Update

I ran across this info today, and thought that I'd post it...it seems quite important, in that it pertains to the use of the contents of physical memory (RAM) to defeat whole disk encryption (WDE), in what the authors refer to as "cold boot attacks on disk encryption".

This looks like very cool stuff. Give it a read, and let me know what you think.

Don't forget that TechPathways provides a tool called ZeroView, which can reportedly be used to detect WDE.
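I haven't looked at exactly how ZeroView does what it does, but the basic idea behind this kind of check is simple enough to sketch in Perl: read sector 0 of the physical drive (live, with Administrator rights) and let the examiner eyeball it. A normal MBR ends in the 0xAA55 boot signature and contains recognizable structures; a replaced boot loader or unrecognizable contents can be a hint that WDE is in play. The device path below is the usual one for the first physical disk:

  #!/usr/bin/perl
  # Sketch: dump sector 0 of the first physical disk so an examiner
  # can look for signs of whole disk encryption. Run live, as admin.
  use strict;
  use warnings;

  my $dev = '\\\\.\\PhysicalDrive0';
  open(my $fh, '<', $dev) or die "Cannot open $dev: $!";
  binmode($fh);
  sysread($fh, my $sector, 512) == 512 or die "Short read from $dev: $!";
  close($fh);

  # Hex dump so the examiner can eyeball the sector contents
  for (my $i = 0; $i < 512; $i += 16) {
      my $chunk = substr($sector, $i, 16);
      (my $ascii = $chunk) =~ s/[^\x20-\x7e]/./g;
      printf "%04x  %-48s %s\n", $i,
          join(' ', unpack('(H2)*', $chunk)), $ascii;
  }

  # One data point: a standard MBR ends in 0xAA55
  my $sig = unpack('v', substr($sector, 510, 2));
  printf "\nBoot signature: 0x%04X %s\n", $sig,
      ($sig == 0xAA55) ? '(present)' : '(missing...possible WDE or wiped MBR)';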

Wednesday, February 20, 2008

Getting started, or forensic analysis on the cheap

Quite often, I'll see posts or receive emails from folks asking how to get started in the computer forensic analysis field. What most folks don't realize is that "getting into" this field really isn't so much about the classes you took in college or the fact that you have a copy of EnCase. What it's about is how well you know your stuff, what you're capable of doing, and whether you're capable of learning new things.

For example, who would you want to hire or work with...someone who only knows how to use one tool (for example, EnCase), or someone who can explain how EnCase does what it does (such as file signature analysis) and can come up with solutions for the problems and challenges that we all run into?
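To make "file signature analysis" concrete: at its core, it's nothing more than comparing a file's extension to the "magic number" at the beginning of the file. Here's a toy Perl sketch of the idea...the signature table is a tiny sample for illustration, nowhere near the set that a commercial tool carries:

  #!/usr/bin/perl
  # Toy file signature analysis: compare a file's extension to its
  # magic number. The signature table is a small illustrative sample.
  use strict;
  use warnings;

  my %magic = (
      "\xFF\xD8\xFF" => 'jpg',
      "\x89PNG"      => 'png',
      'GIF8'         => 'gif',
      '%PDF'         => 'pdf',
      'MZ'           => 'exe',
      "PK\x03\x04"   => 'zip',
  );

  my $file = shift or die "Usage: $0 <file>\n";
  open(my $fh, '<', $file) or die "$file: $!";
  binmode($fh);
  sysread($fh, my $hdr, 8);
  close($fh);

  my ($ext) = $file =~ /\.(\w+)$/;
  $ext = lc($ext || '');
  foreach my $sig (keys %magic) {
      if (substr($hdr, 0, length($sig)) eq $sig) {
          my $flag = ($ext ne $magic{$sig}) ? '  <-- mismatch!' : '';
          print "$file: header says '$magic{$sig}', extension is '$ext'$flag\n";
          exit;
      }
  }
  print "$file: no match in this (very small) signature table.\n";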

What I've decided to do is compile a list of free (as in "beer") resources that can be used by schools and individuals to develop labs, training exercises, etc., for the purpose of providing an educational background in the field of computer forensic analysis. With nothing more than a laptop and an Internet connection, anyone interested in computer forensic analysis can learn quite a lot without ever spending any $.

Imaging
FTK Imager 2.5.3 (and Lite 2.5.1)
George M. Garner, Jr's FAU
dcfldd - Wiki
dc3dd

Image/File Integrity Verification
MD5Deep

Images/Analysis Challenges
Lance's Forensic Practicals (#1 and #2) (no EnCase? Use FTK Imager to convert the .E0x files to dd format)
NIST Hacking Case
DFTT Tool Testing Images
HoneyNet Project Challenges
VMWare Appliances (FTK Imager will allow you to add these - most of which are *nix-based - as evidence items and create dd-format images)

Analysis Applications
TSK 2.51 (as of 10 Feb 2008...includes Windows versions of the tools, but not the Autopsy Forensic Browser - see the Wiki for how to use the tools)
NOTE: DFLabs is developing PTK, an alternative Sleuthkit interface, and they are reportedly working on a full Windows version, as well!
ProDiscover 4.9 Basic Edition
PyFlag

Mounting/Booting Images
VDK & VDKWin
LiveView (ProDiscover Basic will allow you to create the necessary .vmdk file for a dd-format image)
VMPlayer

Analysis Tools
Perl ('nuff said!!) - my answer for everything, it seems ;-)

File Analysis
MiTec Registry File Viewer - import Registry hive files
TextPad
Rifiuti - INFO2 file parser
BinText - like strings, but better
Windows File Analyzer

File Carving
Scalpel

Browser History
WebHistorian

Archive Utilities
Universal Extractor
jZip
PeaZip

AV and Related Tools
Miss Identify - identify Win32 PE files (different from an AV scan)
GriSoft AVG Free Edition anti-virus
Avira AntiVir PersonalEdition anti-virus
McAfee Stinger - standalone tool to scan for specific malware
ThreatFire (requires live system, best when used w/ AV)
GMER Rootkit Detection (requires live system)

Packet Capture and Analysis
PacketMon
WireShark

Other Tools
According to Claus at the GSD blog, Mozilla uses SQLite databases to store information, so if you're doing browser analysis, you may want to take a look at SQLite DB Browser, or SQLiteSpy. If you want to create your own databases in SQLite, check out SQLite Administrator. You can use these tools not only for analysis of the Mozilla files, but also for creating your own databases for use with other tools (i.e., Perl).
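And since Perl is my answer for everything, here's a minimal sketch of pulling browser history out of a Firefox 3-style places.sqlite using DBI and DBD::SQLite. The file name and the moz_places schema are assumptions based on the Mozilla storage mentioned above...and as always, work on a copy of the file, never the original:

  #!/usr/bin/perl
  # Sketch: query a copy of a Firefox 3-style places.sqlite via
  # DBI/DBD::SQLite. The file name and schema are assumptions.
  use strict;
  use warnings;
  use DBI;

  my $db  = shift || 'places.sqlite';   # a copy exported from the profile
  my $dbh = DBI->connect("dbi:SQLite:dbname=$db", '', '',
                         { RaiseError => 1 });

  my $sth = $dbh->prepare('SELECT url, title, visit_count FROM moz_places ' .
                          'ORDER BY visit_count DESC');
  $sth->execute();
  while (my ($url, $title, $count) = $sth->fetchrow_array()) {
      printf "%5d  %s  (%s)\n", $count, $url,
          defined $title ? $title : '-';
  }
  $dbh->disconnect();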

Please keep in mind that this is just a list...and not an exhaustive one...of technical resources that are available. There are many, many other tools available.

Also, all of the technical tools and techniques are for naught if you (a) cannot follow a process, and (b) cannot document what you do.

Jesse rides again!

Jesse Kornblum has done it again! Jesse's one of those guys who releases some really amazing tools for use in the IR and forensic analysis space, and his latest is "Miss Identify".

Miss Identify is a tool to look for Win32 applications through the use of file signature analysis. By default, it looks for Win32 apps (per the PE header) that do not have executable file extensions. As with Jesse's other tools, Miss Identify is rich with features, all of which are configurable from the command line.

So, you're probably thinking...okay, so what? You can already do this sort of thing with other tools, right? What makes this tool so Super Bad, McLovin?? Well, right now, there are a number of ways that a forensic analyst can identify malware in an acquired image, including checking the logs of any AV application that is already installed, or mounting the image and running an AV scanner or a hash set comparison tool. However, two issues arise with these approaches...one is that there are legitimate tools that can be, and are, used for malicious purposes. The other is that signatures (AV signatures, hashes, etc.) don't always work. However, there is one thing that all malware must be, and that is executable!
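The core of that idea is straightforward enough to sketch in Perl...note that this is just an illustration of the technique, not a stand-in for Jesse's tool, which handles far more cases. Check for the 'MZ' DOS signature, follow the e_lfanew pointer at offset 0x3C, look for the "PE\0\0" signature, and flag anything that passes but doesn't carry an executable extension:

  #!/usr/bin/perl
  # Sketch of the Miss Identify idea: flag Win32 PE files that don't
  # have an executable extension. Illustration only.
  use strict;
  use warnings;

  sub is_pe {
      my $file = shift;
      open(my $fh, '<', $file) or return 0;
      binmode($fh);
      sysread($fh, my $dos, 64) == 64 or return 0;
      return 0 unless substr($dos, 0, 2) eq 'MZ';        # DOS signature
      my $e_lfanew = unpack('V', substr($dos, 0x3C, 4)); # PE header offset
      sysseek($fh, $e_lfanew, 0) or return 0;
      sysread($fh, my $sig, 4) == 4 or return 0;
      close($fh);
      return $sig eq "PE\x00\x00";
  }

  my %exec_ext = map { $_ => 1 } qw(exe dll sys ocx com scr drv cpl);

  foreach my $file (@ARGV) {
      next unless -f $file;
      my ($ext) = $file =~ /\.(\w+)$/;
      if (is_pe($file) && !$exec_ext{lc($ext || '')}) {
          print "$file is a PE file without an executable extension!\n";
      }
  }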

Miss Identify can also print the strings found in the files. This is great because you may find an executable file in the system32 directory that has a Microsoft-sounding name, but does not contain the MS copyright info embedded in the resource strings. This would be a "clue".

The use of Miss Identify doesn't replace other analysis and data reduction techniques, but instead augments them. This is without a doubt a useful tool, and one that should be considered for use by sysadmins and first responders, as well as forensic analysts.

A round of applause for Jesse, everyone!

Also, I love the "Hollywood teaser" Jesse used to let everyone know what was coming! Speaking of teasers, isn't IronMan coming out soon....? Can you think of a better way to get Marvel Comics and Black Sabbath to come together??? ;-)

Addendum: I reached out to Jesse and mentioned to him that it might be useful to parse out the file version information from an executable, rather than all of the strings. Also, reading through the comments to Jesse's blog, there are some very useful tips pointed out...for example, finding an executable file in a user's browser cache might be considered by some examiners to be a "clue"... ;-)

Friday, February 15, 2008

CIO article on the need for forensics

CIO Magazine out of the UK has an interesting article titled In-depth Investigation that discusses the need for computer forensics capabilities. While it is from across the pond, the message of the article is extremely applicable here in the US, as well.

I know that as soon as I say I agree with it, many folks are going to think, "well, yeah, you're a consultant...of course you agree with this article, because it recommends that companies hire you!" And yes, that's true...I am a consultant, and in most cases a company would have to hire someone like me to come in and do the kind of work that is recommended.

However, even taking e-discovery out of the equation for a moment, with the increase in state notification laws (goin' federal in the near future...), as well as the regulatory stuff (SEC, PCI Council, FISMA, HIPAA, etc.), a forensics capability is being mandated. Until now, the decision has been left to organizations, and they've opted not to develop the capability...now many organizations are being told that they have to have it.

My personal thought on this is that ideally what an organization would want to do is develop an in-house capability for tier 1 response...trained folks whose job it is to respond to, triage, and diagnose a technical IT incident. By "trained", I mean in the basics, such as NSM, incident response, troubleshooting, etc...enough to be able to triage and accurately diagnose level 1 and 2 incidents, as well as preserve data until outside professionals can respond to level 3 or 4 incidents.

That leads to one other thought...many times when folks like me recommend that an outside third party be called in to perform incident response and/or computer forensic activities, it's not so much because we want your money (well, that IS part of it...). Look at it this way...if your organization is mandated (by the PCI Council, for example) to have a pen test performed, how well do you think the results will be accepted when your report says that your own IT employees performed the pen test against the systems they set up, and found no way to get in? Having an outside third party do this kind of thing adds credibility to the report...besides, this is what we do all the time. ;-)

New Docs at SWGDE

The Scientific Working Group on Digital Evidence (SWGDE) has released some new documents, the most notable of which are the Vista Technical Notes, and the document on "Live Capture".

The document on Live Capture was very interesting! At only 5 pages in length (the first page is formal disclaimer stuff...), there isn't a whole lot of detail, and the timeliness of the document may be questionable, but it does reference the benefits of performing "live capture"...a term which encompasses three different activities. The document spends only a small paragraph discussing RAM dumps, and in that paragraph refers to "DD" as a software tool that can be used for collecting the contents of memory...on Windows systems, this is no longer the case (unless you have an old copy of a version of dd.exe that could still read the \\.\PhysicalMemory device object sitting around). Further, this article in Forensic Magazine mentions the use of dcfldd (version 1.3.4 was reportedly used when writing the article) to dump RAM from a Windows system...however, the command line listed in the article no longer seems to work (although for some odd reason, on a Windows XP SP2 system, replacing "\\.\PhysicalMemory" with "/dev/mem" seems to get something). Oddly enough, the document doesn't mention ProDiscover (which had the ability to collect RAM and volatile data before EnCase did), nor does it mention Nigilant32.
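For reference, the style of command line in question looked something like the following (the output drive letter and file name are just illustrative):

  dd.exe if=\\.\PhysicalMemory of=F:\memdump.img bs=4096 conv=noerror
  dcfldd if=\\.\PhysicalMemory of=F:\memdump.img bs=4096 conv=noerror

...and per the XP SP2 oddity mentioned above, substituting /dev/mem for \\.\PhysicalMemory in the dcfldd version seems to get something.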

The section of the document that addresses live acquisition is also extremely short and bereft of any real content...I'd love to know what "careful planning" they're referring to, and I'm sure others reading the document who've never done a live acquisition are wondering the same thing.

But hey...don't get me wrong...I think it's a great thing that the document is out. The more these techniques and methodologies are discussed and presented, the more likely they are to be used and then become part of standard procedures.

Thursday, February 07, 2008

DFRWS 2008 Announcement

The DFRWS 2008 CfP and Challenge have been posted!

The CfP invites contributions on a wide range of subjects, including:
  • Incident response and live analysis
  • File system and memory analysis
  • Small scale and mobile devices
  • Data hiding and recovery
  • File extraction from data blocks (“file carving”)
And here are a couple that should be interesting:
  • Anti-forensics and anti-anti-forensics
  • Non-traditional approaches to forensic analysis
The submission deadline is 17 Mar, with author notification about 6 weeks later.

I may submit something on Registry analysis...we'll have to see. This may be a good segue into a book...I've been thinking that based on some new tools I've been working on, as well as data collected since Windows Forensic Analysis was published, I may have enough to put together a book just on Registry analysis.
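Speaking of which...if you want to start poking at Registry hive files programmatically right now, James Macfarlane's Parse::Win32Registry module (on CPAN) is a good place to start. Here's a minimal sketch...the hive file and key path are just examples:

  #!/usr/bin/perl
  # Sketch: offline Registry analysis with Parse::Win32Registry.
  # The hive file and key path are examples.
  use strict;
  use warnings;
  use Parse::Win32Registry;

  my $hive = shift || 'NTUSER.DAT';   # hive file exported from an image
  my $reg  = Parse::Win32Registry->new($hive)
      or die "Cannot parse $hive\n";
  my $root = $reg->get_root_key;

  # Example: the user's Run key, with its LastWrite time
  my $path = 'Software\\Microsoft\\Windows\\CurrentVersion\\Run';
  if (my $key = $root->get_subkey($path)) {
      printf "LastWrite: %s\n", scalar gmtime($key->get_timestamp);
      foreach my $value ($key->get_list_of_values) {
          printf "  %-20s %s\n", $value->get_name, $value->get_data;
      }
  }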

This year's challenge is similar to 2005's, except that this time the issue is Linux memory analysis.

This year, the conference is in Baltimore ("Bahlmer"), MD, 11-13 Aug 2008.