Sooner or later -- when applications crash, servers fail or suspected hacking activity appears -- the IT department will need to sort through logs to find critical information; this includes application logs, operating system logs, security logs and many others.
There are dozens of commercial products -- from companies such as Splunk Inc., LogLogic Inc., Q1 Labs, ArcSight (now part of Hewlett-Packard Co.), RSA, the security division of EMC, and so forth -- intended to make this task easier, but most are expensive and require substantial effort to implement correctly. These products are most useful when log analysis is a sustained, ongoing and funded task.
So, what options exist for companies where log analysis is more tactical in nature, not a sustained IT project, and has zero budget? Previously, I've described how Microsoft Excel is frequently overlooked when it comes to analyzing logs. Today, I will examine Log Parser, a free Microsoft log analysis tool, with some in-depth Log Parser examples.
Microsoft describes Log Parser as providing "universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system and Active Directory." So, what does this really mean? With Log Parser, it's possible to take most log files and perform SQL searches against them. If you are familiar with SQL, you know how valuable this capability is. If you aren't familiar, then this tip will show you why it's worth it to gain fundamental SQL query skills.
To illustrate one use of this powerful free log analysis tool, let's pretend we want to know if the Time service (W32Time) on our Windows computer is failing to synchronize time. We could open the Event Viewer, view the System Log and look for those messages, or we could use Log Parser.
Figure 1 - Query for Time Synchronization Errors
The command in Figure 1 instructs Log Parser to examine the System event log, show events from the source W32Time, and show only events whose message does not contain the text "is now synchronizing." I further limit this to the three most recent events with the query parameters "TOP 3" and "ORDER BY TimeGenerated DESC." The optional "-q:ON" enables quiet mode. If we wanted all of the failed messages and the output to go directly to a CSV file, we could modify the command as shown in Figure 2.
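Since the figure is an image, here is a sketch of a command along the lines Figure 1 describes, assuming Log Parser 2.2 syntax and the standard event log fields TimeGenerated, EventID, SourceName and Message:

```shell
LogParser.exe -i:EVT -q:ON "SELECT TOP 3 TimeGenerated, EventID, Message FROM System WHERE SourceName = 'W32Time' AND Message NOT LIKE '%is now synchronizing%' ORDER BY TimeGenerated DESC"
```

The "-i:EVT" switch names the event log input format explicitly, though Log Parser can usually infer it from the "FROM System" clause.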
Figure 2 - Generating CSV output
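A CSV-producing variant along the lines of Figure 2 drops the "TOP 3" limit, names an output file with "INTO" and selects the CSV output format; the file name w32time-errors.csv is illustrative:

```shell
LogParser.exe -i:EVT -o:CSV "SELECT TimeGenerated, EventID, Message INTO w32time-errors.csv FROM System WHERE SourceName = 'W32Time' AND Message NOT LIKE '%is now synchronizing%' ORDER BY TimeGenerated DESC"
```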
To make this even more useful, let's also examine the event log on another computer ("myserver") by adding "\\myserver\" before the log name ("FROM System"), as shown in Figure 3. This requires that whoever runs the command has privileges to view the System event log on the remote computer. You can even query multiple computers with the syntax "\\computer1\system, \\computer2\system."
Figure 3 - Searching a remote computer
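Pointing the same query at a remote machine is just a matter of prefixing the log name; in this sketch, "myserver" stands in for any host whose System event log you are allowed to read:

```shell
LogParser.exe -i:EVT "SELECT TOP 3 TimeGenerated, Message FROM \\myserver\System WHERE SourceName = 'W32Time' AND Message NOT LIKE '%is now synchronizing%' ORDER BY TimeGenerated DESC"
```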
Now that we've been introduced to Log Parser, let's use it to examine a log file. The following is a log file from a Web proxy that shows websites accessed by a specific user, along with the category of each website (shopping, gambling, news, miscellaneous, etc.). The format of the log, as opened in Excel, is shown in Figure 4.
Since this file is a CSV file with a header row defining names for each column, Log Parser will automatically detect the column names and allow you to query them. To see which users (the "User" field) appear most often in the log, use the command shown in Figure 5.
Figure 5 - Top Users from Websense log
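A query along the lines of Figure 5 aggregates with GROUP BY and COUNT(*); in this sketch, proxy.csv stands in for the Websense log file and "User" is the column name from its header row:

```shell
LogParser.exe -i:CSV "SELECT User, COUNT(*) AS Hits FROM proxy.csv GROUP BY User ORDER BY Hits DESC"
```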
Log Parser can output results in several ways: straight to the screen (stdout), to text files in various formats, to a GUI "datagrid," to a syslog server or to a chart.
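For example, the aggregate query above can be sent to the interactive datagrid or rendered as a chart image simply by switching the output format. This is a sketch: the -o:DATAGRID and -o:CHART formats are part of Log Parser 2.2, while the chart type, file names and TOP limit are illustrative (chart output also requires the Office Web Components to be installed):

```shell
LogParser.exe -i:CSV -o:DATAGRID "SELECT User, COUNT(*) AS Hits FROM proxy.csv GROUP BY User ORDER BY Hits DESC"

LogParser.exe -i:CSV -o:CHART -chartType:Column3D "SELECT TOP 10 User, COUNT(*) AS Hits INTO top-users.gif FROM proxy.csv GROUP BY User ORDER BY Hits DESC"
```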
The possibilities for this tool are limited only by your imagination, data and time. SQL queries allow you to sort, filter and aggregate your data in many different ways. The bundled help file (logparser.chm) provides many examples, a SQL reference and documentation on the various input and output formats. You can create batch files to process large numbers of files and then run further queries against the combined output. You can also send the results to a database rather than a file.
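The batch-file idea can be sketched in a few lines of Windows batch scripting; the directory layout and file names here are hypothetical, and each day's proxy log gets its own per-user summary CSV:

```shell
@echo off
rem Summarize hits per user for every daily proxy log (hypothetical C:\logs layout)
for %%F in (C:\logs\proxy-*.csv) do (
  LogParser.exe -i:CSV -o:CSV "SELECT User, COUNT(*) AS Hits INTO summary-%%~nF.csv FROM %%F GROUP BY User ORDER BY Hits DESC"
)
```

The "%%~nF" modifier expands to the current file's base name, so proxy-20230101.csv produces summary-proxy-20230101.csv.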
In summary: If your job ever requires the analysis of logs, this often overlooked (and free) tool is well worth your time.
About the author:
Tom Chmielarski is a former senior consultant at GlassHouse Technologies.