Parse Deadlock Graph from System_Health session

Way back in October 2014 I published the post: XML, Glad that is over (yeah, I know, such a good title). In that post I shared a script built from researching how to get the deadlock report out of the Extended Events system_health session. Think of this session as the default trace of SQL Server 2005, but way cooler. Anyway…

It was recently pointed out, on an answer I provided, that I needed a 2014 version. In fact the change is required starting with SQL Server 2012, because the event XML output changed in that release. You can find an excellent write-up on this from Jonathan Kehayias here.

I moved the script to my GitHub repository, and provided an update so it can be run on SQL Server 2012 and higher. The comments now include what has to be changed to make it run on SQL Server 2008.
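For context, a stripped-down version of the idea looks something like the snippet below. This is only a sketch and not the script from the repository: it assumes SQL Server 2012 or higher, uses the ring_buffer target of the system_health session (the GitHub version reads the event file and handles more edge cases), and assumes the Invoke-Sqlcmd cmdlet is available; the instance name is a placeholder.

# Rough sketch only, not the GitHub script: pull xml_deadlock_report events
# from the system_health ring_buffer target. Assumes SQL Server 2012+ and
# Invoke-Sqlcmd from the SQLPS/SqlServer module; 'localhost' is a placeholder.
$query = @"
SELECT XEvent.query('(event/data/value/deadlock)[1]') AS DeadlockGraph
FROM (
    SELECT XEvent.query('.') AS XEvent
    FROM (
        SELECT CAST(target_data AS xml) AS TargetData
        FROM sys.dm_xe_session_targets AS st
        JOIN sys.dm_xe_sessions AS s
            ON s.address = st.event_session_address
        WHERE s.name = N'system_health'
          AND st.target_name = N'ring_buffer'
    ) AS Data
    CROSS APPLY TargetData.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]') AS XEventData (XEvent)
) AS src;
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $query -MaxCharLength 65535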


Build your own StackExchange Database


Source: Flickr (adesigna)

Have you ever wanted to be like Mike Brent and have your own local, sample StackExchange database? Do you know what PowerShell is?

If you answered yes to at least the first and last question, I am here to help. I published a PowerShell script on GitHub and created a dedicated page to show you how I solved the problem with PowerShell. Travel over to that page and you will see what I am talking about.

Is it the best PowerShell script for doing the job? Possibly. It accomplishes what I want it to, which is enough for me. I caveat this on the page, but to not let your hopes get to high I was not able to load the StackOverflow data with this script. It is more than likely just my laptop could not handle that much data. If I get a chance, I will try using this on a more beefy server to see if it will work.
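To give a feel for the general approach (this is not the script from GitHub, just a bare-bones sketch): each table in a StackExchange dump is one XML file where every row is a <row/> element with the values stored as attributes, so you can read it and push it into a pre-created table with SqlBulkCopy. The paths, instance, and column list below are placeholders, and loading the whole file into memory is exactly the kind of thing that falls over on a dump the size of StackOverflow.

# Illustrative sketch only: load Posts.xml from an extracted StackExchange dump
# into dbo.Posts. Paths, instance, and columns are placeholders; the real table
# has many more columns than this.
Add-Type -AssemblyName System.Data

$connString = 'Server=localhost;Database=StackExchange;Integrated Security=true'
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($connString)
$bulk.DestinationTableName = 'dbo.Posts'

# Build an in-memory table matching the destination columns
$table = New-Object System.Data.DataTable
[void]$table.Columns.Add('Id', [int])
[void]$table.Columns.Add('PostTypeId', [int])
[void]$table.Columns.Add('CreationDate', [datetime])
[void]$table.Columns.Add('Body', [string])

# Every row in the dump is a <row Id="..." PostTypeId="..." ... /> element
[xml]$posts = Get-Content -Path 'C:\StackExchange\Posts.xml' -Raw
foreach ($row in $posts.posts.row) {
    [void]$table.Rows.Add($row.Id, $row.PostTypeId, [datetime]$row.CreationDate, $row.Body)
}

# Map columns by name so the destination table can have extra columns
foreach ($col in $table.Columns) {
    [void]$bulk.ColumnMappings.Add($col.ColumnName, $col.ColumnName)
}

$bulk.WriteToServer($table)
$bulk.Close()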

Good luck and happy data-fishing.

Don’t just do it, love doing it

I attended a meeting today with the founder of the company I am now employed by. It was a session to hear the story and background of the company and of the founder himself, and it was very interesting. One part of the discussion near the end touched on a subject that I think can define a person's character and demeanor to some extent: dealing with the policies and tools required for the various processes used in my day-to-day responsibilities.

There are things we all have to learn when we first start with a company. Every company does things differently, whether it is HR policy or the tools the company gives you to do your job. It always takes time to get familiar with those processes before they become second nature. In some cases you may still wonder, or think, "this is stupid even knowing why, and there is no reason to do it." I personally think that understanding the "why" and keeping an open mind can make work life much simpler. A person with the right character and professionalism will not dismiss any process or tool as stupid, regardless of the reasoning behind it. Whether prompted by an event, an incident, or an idea, anything can have room for improvement. Every process or tool does, and if you don't allow for that you will never get anywhere, in life or in your career. We all have room for improvement, and processes and tools do as well. If you put out a product at version 1.0 and never improve it to 1.5 or 2.0, how long do you think it is going to last?

I have worked with folks in various places who did just what they needed to get from 8:00 AM to 5:00 PM so they could go home. I have seen folks come to work with their heads down and no passion for what they are doing. It really had nothing to do with the company; it was the individual. If I had followed that same routine, I do not think my career would be where it is today. I love what I do for a living, no matter the company. SQL Server (and PowerShell) never brings about a boring day… maybe a slow one at times, but never boring. Of course, it is only never boring if you make it that way and do work you love.

I have worked for companies that never used me to my full potential (at least, the potential I think I have), and even some that refused to let me help with tasks I was capable of doing so the work could get done more efficiently. (Think of a two-heads-are-better-than-one type of situation.) I believe I am finally with a company that offers me an environment where I can reach my full potential, and go beyond it.

Here is to the future!

To be mentored or not to be

This happens to be my first post of 2015, and actually one of the few non-technical posts I have written. Paul Randal recently posted on Twitter offering to mentor a select number of individuals.

I have read that he also does this for a select few who go through the SQL Skills courses. Just one more thing to put on that "How great is Paul Randal" list we all keep.

I can say that I have been through one mentoring program (or more of an experiment) in my career, back in 2011. That program introduced me to Tim Mitchell (blog | twitter), and I am very grateful for the time spent on phone calls and emails. My employer at the time did not really put me in a situation where I could take full advantage of the advice I received, but it helped nonetheless. I was employed as a DBA but was not doing a great deal of "dirty" work in SQL Server.

I have changed employers a few times since then, and in June of 2013 I decided to leave the confines of an office and start doing full-time SQL Server consulting work from home for a company out of Wisconsin. Today is actually my last day with that company. I have decided to make a change and will be going to work at Pythian starting in February. Any new job will offer new challenges starting off, but I think Pythian has an environment in which I can expand my knowledge of SQL Server and of other technologies as well.

So, should I be mentored by Paul?

Who wouldn't want to be mentored by Paul Randal? Do I deserve to be? You can always second-guess yourself, thinking you are too far down the totem pole to deserve Paul's time. I try not to look at things like that, most of the time. I have held a lot of different jobs in my career, along with numerous part-time jobs. I am hoping to be in this new position for some time, or at least I plan to be. My head is full of little tidbits of information, and I know it is full of holes too. I think being mentored by someone at Paul's level, someone who has seen it all by now, would help me narrow down my goals: the goals that make sure what I am doing lines up with where I want to go (or maybe should go).

I have always just tried to learn anything and everything I can about SQL Server. Some of it I can retain and some of it I can't; there are little bits I recall but have to go back and look up again. I have heard folks state that SQL Server is a large product and no one person is going to learn it all to an expert level. I began to realize that more while doing consulting work, along with just how much I don't know. It would help to settle on the path that best matches my personality and what I want to become (both in training and in my career).

If it is not to be, then it is what it is. I will keep chugging along in hopes that one day I make it to an IEPT01 class.

PowerShell and SQL Server: Search-SqlErrorLog

I have had a thought for some time on how I might write this function, and I finally decided to just sit down and do it.

When you have alerts set up, or just need to troubleshoot a given instance of SQL Server, one of the first steps is always looking through the ERRORLOG files for the instance. You can do this via SSMS, or via T-SQL with a known (but undocumented) stored procedure, sp_readerrorlog. Either method is useful if you are only looking through one log at a time. However, I configure most of the servers I manage to keep at least 15 error logs, and I have a few that are set to the maximum of 99. Trying to search through all of those for one phrase or word manually can be extremely time consuming. I guess you could do this with T-SQL, but this is one of those things PowerShell shines at by allowing more flexibility: I can easily wrap this function in a few lines that let me search for a phrase against multiple servers all at once (a quick example follows the function below).

The script is provided below and includes help information so you can use the help cmdlet to pull out the examples or information about each parameter.

function Search-SqlErrorLog
{
	<#
	.SYNOPSIS
		Allows you to search the error logs of a given SQL Server instance
	.DESCRIPTION
		Utilizes enumeration methods within SMO to iterate through all or some of the error logs on an instance
	.PARAMETER server
		the SQL Server instance
	.PARAMETER logNumber
		number of the specific error log you want to search, or an array of them
	.PARAMETER all
		switch to indicate just search through all error logs of the instance, this is the default
	.PARAMETER value
		the text to search for
	.EXAMPLE
		Search-SqlErrorLog -server MyServer -value "Severity: 25"
		Searches all the logs of an instance for the text "Severity: 25"
	.EXAMPLE
		Search-SqlErrorLog -server MyServer -lognumber 0 -value "Severity: 25"
		Searches through the latest error log of an instance for the text "Severity: 25"
	.EXAMPLE
		Search-SqlErrorLog -server MyServer -lognumber 2,5 -value "Severity: 25"
		Searches through two specific error logs of an instance for the text "Severity: 25"
	#>
	param (
		[Parameter(Mandatory = $true, Position = 0)]
		[string]$server,

		[Parameter(Mandatory = $false, Position = 1)]
		[int[]]$logNumber,

		[Parameter(Mandatory = $false, Position = 2)]
		[switch]$all = $true,

		[Parameter(Mandatory = $true, Position = 3)]
		[string]$value
	)

	# Assumes the SMO assemblies are already loaded (e.g. via the SQLPS module)
	$s = New-Object Microsoft.SqlServer.Management.Smo.Server $server

	# If specific log numbers were passed in, only search those logs
	if ($logNumber.Count -gt 0) { $all = $false }

	if ($all)
	{
		# Enumerate every error log the instance currently keeps
		$collection = $s.EnumErrorLogs() | Select-Object -ExpandProperty Name
		$i = 0
		foreach ($l in $collection)
		{
			$i++
			Write-Progress -Activity "Reading logs" -Status "Reading log number $l" -PercentComplete ($i / $collection.Count * 100)
			$s.ReadErrorLog($l) | Where-Object { $_.Text -match $value } |
				Select-Object @{ Label = "LogNumber"; Expression = { $l } }, LogDate, ProcessInfo, Text
		}
	}
	elseif ($logNumber.Count -eq 1)
	{
		$s.ReadErrorLog($logNumber[0]) | Where-Object { $_.Text -match $value } | Select-Object LogDate, ProcessInfo, Text
	}
	else
	{
		$i = 0
		foreach ($l in $logNumber)
		{
			$i++
			Write-Progress -Activity "Reading logs" -Status "Reading log number $l" -PercentComplete ($i / $logNumber.Count * 100)
			$s.ReadErrorLog($l) | Where-Object { $_.Text -match $value } | Select-Object LogDate, ProcessInfo, Text
		}
	}
}
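And here is the kind of wrapper I was referring to above: a couple of lines to run the search against several instances at once. The instance names and search pattern are just placeholders for illustration.

# Example only: search a handful of instances for high-severity errors.
# The instance names are placeholders; -value is matched as a regex.
$servers = 'SQL01', 'SQL02', 'SQL03'
$servers | ForEach-Object {
    $srv = $_
    Search-SqlErrorLog -server $srv -value 'Severity: 2[0-5]' |
        Select-Object @{ Label = 'Server'; Expression = { $srv } }, *
}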

SQL Sentry Plan Explorer and Deadlock XML Report

OK, I just found one of those things that I can remember reading about but had forgotten until I saw it again. Back in October I had the privilege of being the recipient of a SQL Sentry Plan Explorer Pro license via one of their contests.

If you are not familiar with SQL Sentry, you should be. One of the products they are well known for in the SQL Server community is Plan Explorer. They also offer one of the coolest monitoring products, Performance Advisor, which can monitor just about anything you want in your SQL Server environment. Among its features is graphical deadlock analysis, and it can export those deadlock graphs much like the deadlock files you can obtain from SQL Server Profiler. Well, one of the many features in the Pro version of Plan Explorer is the ability to view those deadlock graphs graphically. As some may know, deadlocks are one of the more common things you troubleshoot in an environment. Being able to easily review those graphs can save a lot of time and headache if you are a graphical person, and I do like pretty pictures.

Well, I don't have access to Performance Advisor for every client I work with (although it would be on the wish list), but I can pull the XML deadlock report from the system_health session in SQL Server 2008 and above. I wondered if I could save that XML data out to a file and open it up in Plan Explorer Pro. So I forced a deadlock on my local instance and pulled the report out of the system_health event file. I saved that XML into a file with an .xml extension and opened it up in Plan Explorer… voilà:


How freaking cool!!!
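If you want to try the same thing, the snippet below is roughly what I mean. It is only a sketch under assumptions: it reads xml_deadlock_report events from the system_health event file target on SQL Server 2012 and higher, and the instance name and output folder are placeholders.

# Sketch only: read deadlock reports from the system_health event file target
# (SQL Server 2012+) and save each graph as an .xml file that Plan Explorer Pro
# can open. 'localhost' and C:\temp are placeholders.
$query = @"
SELECT CAST(event_data AS xml).query('(event/data/value/deadlock)[1]') AS DeadlockGraph
FROM sys.fn_xe_file_target_read_file(N'system_health*.xel', NULL, NULL, NULL)
WHERE object_name = N'xml_deadlock_report';
"@

$i = 0
Invoke-Sqlcmd -ServerInstance 'localhost' -Query $query -MaxCharLength 2097152 |
    ForEach-Object {
        $i++
        $_.DeadlockGraph | Out-File -FilePath "C:\temp\deadlock_$i.xml" -Encoding Unicode
    }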

Tidbit – SQL Server Maintenance Plan Logging

Whether you are an advocate of Maintenance Plans or not, they have their place in some situations. However, when you do use them there is one thing you should understand that is turned on by default with every maintenance plan you create: reporting and logging. I am not sure why, but Microsoft decided that any sub-plan that exists within a maintenance plan should have a log file created for it by date.

I have lost count of how many times I have come across servers and reviewed their default log directory only to find thousands of small text files of maintenance plan logs going back years. How large the files are depends on what actions are being performed within the sub-plan of the maintenance plan.
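Just to put a number on it, something like this quick check will show how bad it has gotten on a given server. This is purely illustrative; the path is a placeholder for your instance's default Log folder.

# Illustrative only: count the maintenance plan text logs piling up in the
# instance's default Log directory. The path is a placeholder.
$logDir = 'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Log'
Get-ChildItem -Path $logDir -Filter '*.txt' |
    Measure-Object |
    Select-Object -ExpandProperty Count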

You will generally find the log directory with files that are similar to this:


Now, to see how this is configured: if you open up a maintenance plan you will find a button at the top of the window pane that opens the reporting setup:



On every occasion I will simply uncheck the box for "Generate a text file report," because I do not need hundreds of files created that simply show "successful" attempts:


If my maintenance plan is failing, I will just use the output file that is configured on a SQL Agent job step, under the Advanced page. There, by default, it will do the one thing you cannot have the reporting and logging in a maintenance plan do: overwrite my log file each time. I only need one log file to check in the event something fails.

