Saturday 4 June 2011

XSLT Transformation to analyze MSSQL Server 2005 traces



During the final user validation of a new packaged application, we were getting complaints of bad performance, and the usual suspect, the database, apparently was perfectly fine. This is where the SQL Server 2005 trace facilities shine. You can trace what is going on inside the database engine with quite a high level of detail, including filtering out the "noise" generated by other users, other application components, or yourself.

After running a trace while users were exercising the application, I became completely convinced that the database was not at fault. However, I wanted to put my conclusions on paper, and for that there is no replacement for issuing a formal report. The problem was, to create that document you need to take the trace data and use it to generate charts, rankings and whatever else you need to support the conclusions.

This was going to be easy: just import the trace data into a spreadsheet and play around with it. Let's start with the trace data itself. The first thing you need is to save it to a file. The SQL Server trace tool only gave me the option of saving it in its own proprietary format, intended to be used by the tool itself, or as XML.

I chose XML; after all, this is something close to a universal standard, isn't it? Yes, it is. I took my XML file and tried to import it into Excel, only to discover that Excel basically expects a "flat" XML file, where the rows of your dataset are child nodes. The SQL Server 2005 trace format is not like that: it wraps every value of every event in a generic "Column" node with an identifier and a name.
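
To give an idea of the shape (this is a simplified sketch, not a verbatim trace; the ids, names and values here are just illustrative), each event in the trace file looks more or less like this:

<TraceData xmlns="http://tempuri.org/TracePersistence.xsd">
  <Events>
    <Event id="12" name="SQL:BatchCompleted">
      <Column id="1" name="TextData">SELECT ...</Column>
      <Column id="13" name="Duration">4562</Column>
      <Column id="12" name="SPID">53</Column>
    </Event>
    <!-- ...one Event node per traced event... -->
  </Events>
</TraceData>

What Excel wants instead is one node per row, with one child element per column; that flat layout is exactly what the transformation below produces.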

No problem, I said to myself. This is XML, so a simple XSLT transformation should get me the data in the columnar layout that Excel expects. Simple, right? So simple that I tried a couple of Google searches before convincing myself that either the SQL trace tool is not that popular or simply nobody had wanted to do this before.

And in the end, simple it was. Ages ago I wrote quite a few XSLT transformations, so I was familiar with what I wanted to do and how to do it. However, memory fades quickly when you don't touch a subject for a while. And there are always new things to learn, which is to say that there are surprises around the corner, even when you use something as standard and plain vanilla as XML. In the end, it took me the best part of four hours to get this right.

So much more time than I expected that I'm documenting it here for future generations, and for my lousy memory as well. And because, to be honest, I expected that someone had already done this.

First, the XML output of the SQL Server trace tool contains an innocent-looking line at the beginning:

<TraceData xmlns="http://tempuri.org/TracePersistence.xsd">

It took me a while to discover that my XSLT processor was trying to fetch that XSD, silently failing at it, and as a result never walking beyond the root node. So the first thing you need to do is to edit the trace file and leave the opening tag as just:

<TraceData>
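
If you have the usual Unix command line tools around, a one-liner does the edit; this is just a sketch assuming GNU sed and a trace saved as mytrace.xml (the file name is a placeholder):

sed -i 's| xmlns="http://tempuri.org/TracePersistence.xsd"||' mytrace.xml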

Almost there. Then you create an XSLT file with this content:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">

  <!-- Extracts the SQL Server trace data to columnar format -->
  <!-- Author: consultuning@gmail.com -->
  <!-- Last updated: 4/6/2011 -->

  <xsl:output method="xml"/>
  <xsl:strip-space elements="*"/>

  <xsl:template match="/">
    <Events>
      <xsl:apply-templates select="TraceData/Events" />
    </Events>
  </xsl:template>

  <xsl:template match="Event">
    <Event>
      <Name><xsl:value-of select="@name" /></Name>
      <xsl:for-each select="Column">
        <xsl:variable name="ColumnId" select="@name" />
        <xsl:element name="{$ColumnId}">
          <xsl:value-of select="." />
        </xsl:element>
      </xsl:for-each>
    </Event>
  </xsl:template>

</xsl:stylesheet>

Then use your favorite XSLT processor to generate the transformed XML:

xsltproc [xslt file] [xml trace file] >[output XML file]
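
With the sample event shown earlier, the result looks roughly like this (the element names simply follow whatever column names appear in your trace; indentation added here for readability):

<Events>
  <Event>
    <Name>SQL:BatchCompleted</Name>
    <TextData>SELECT ...</TextData>
    <Duration>4562</Duration>
    <SPID>53</SPID>
  </Event>
  <!-- ...one Event node per traced event... -->
</Events>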

And that's all. Well, you'll get some data truncation warnings from Excel if the traced SQL statements are too long, but overall you'll be able to import this into Excel without problems.

The worst part of all of this is that, after looking at the trace, I discovered that there were some users getting seriously bad response times. But that is the subject of another post.
