Get-Deadlock counts

Deadlocks can be captured in several different ways. When trace flag 1222 (TF1222) is enabled, the deadlock information is written to the SQL Server error logs. Getting that information back out, however, is not always as easy as capturing it.

I created this simple PowerShell script to get the counts, and only the counts, of how many deadlocks have occurred since the logs were last recycled. I recycle my logs every night at midnight.

The script can be modified to fit your needs. To use it now, just copy and paste it and change 'your_server_name' to your actual server name.


Download the file from here.
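If you just want the gist, here is a minimal sketch of the idea using SMO to read the current SQL error log. This is an illustration, not the downloadable script itself: 'your_server_name' is a placeholder, and the 'deadlock' search string may need adjusting to match what TF1222 writes on your build.

```powershell
# Sketch: count deadlock entries in the current SQL error log via SMO.
# 'your_server_name' is a placeholder server name.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'your_server_name'

# ReadErrorLog(0) returns the current log; filter rows mentioning deadlocks
$deadlocks = $server.ReadErrorLog(0) | Where-Object { $_.Text -match 'deadlock' }
"Deadlock entries since the last log recycle: " + @($deadlocks).Count
```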



PowerShell and Perfmon Counters = Pure Goodness

A few weeks ago I ran into an issue after setting up perfmon (Performance Monitor) on one of my servers and trying to save the data to a .csv file for further analysis. I was getting a weird error: "Data source C:\PerfLogs\test.csv is either invalid or cannot be found. The specified log type has not been installed on this computer."

OK. No problem. I knew I did not have MS Excel on the server. Maybe that was the cause of the error? I decided to try running the counters from one of my test servers that did have MS Excel installed. I still got the same error. When I posted a question about this error on Twitter, only one person replied: Aaron Nelson ( twitter | blog ). He mentioned using PowerShell to get the data and send it right to Excel (or CSV). So I started exploring that idea.

Finding the Counters

There are certain counters I want to use, but I was not at all sure how to use or even call them with PowerShell. This is just one of the areas where PowerShell is so awesome. When I start using a new cmdlet, one of the first things I like to do is use the built-in help to get some details. You do that with the following command.
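A minimal example, using PowerShell's built-in help system:

```powershell
# Show the full built-in help for Get-Counter, including parameter details and examples
Get-Help Get-Counter -Full
```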


Out spills a wealth of information. I highly recommend trying this with any commands you use in PowerShell. OK, now that we have a little more knowledge about "Get-Counter", let's try a few of the commands we just found.
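For example, one of the parameters the help surfaces is -ListSet, which enumerates the counter sets on the machine:

```powershell
# List every performance counter set available on this machine
Get-Counter -ListSet *
```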


OK. Now that's a lot of information. It is kind of confusing, if not intimidating, at first glance. We don't need all of it for now, so let's narrow down the output to make things easier to read and digest.
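One way to trim the output is to select just the set names:

```powershell
# Return only the names of the counter sets, one per line
Get-Counter -ListSet * | Select-Object CounterSetName
```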


Now, that’s what we want to see, a nice clean and easy to read list.

Drilling down

Now we have a list of all the counter sets by name, and I hope you agree this is easier to view. Each of these sets has specific (sub) counters and paths we need to find under the covers. So, let's expose them a bit more with the following command.
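For example, to expose the individual counter paths under the Memory set (substitute whichever counter set you are interested in):

```powershell
# Expand the counter paths that live under the Memory counter set
Get-Counter -ListSet Memory | Select-Object -ExpandProperty Paths
```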


Now we are getting somewhere. This list is much easier to work with to figure out what we need. But we still want to narrow things down further and get only specific memory counters. Try this one.
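One way to filter, shown here against the Memory set (the "Available" match string is just an example):

```powershell
# Keep only the Memory counter paths that mention "Available"
(Get-Counter -ListSet Memory).Paths | Where-Object { $_ -like '*Available*' }
```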


Now that we have the "path" needed for the counter, we will return some of the counter's data to the screen.
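For example, using one of the memory paths found above:

```powershell
# Take a single sample of the counter and display it
Get-Counter '\Memory\Available MBytes'
```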


You can also have the data continuously sent to the screen (or a file). We make that happen with the following.
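The -Continuous switch keeps sampling until you interrupt it (Ctrl+C stops it):

```powershell
# Sample the counter continuously until interrupted
Get-Counter '\Memory\Available MBytes' -Continuous
```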


Pulling the Counters Together

After you determine the counters you want to monitor, you can send the data to many different file formats, like a text file or a csv file. This next bit of code is a little more advanced, but not too far out of reach. First we set the counter path in a variable named "page_Counter".

$page_Counter = "\Paging File(*)\% Usage"

Then, using that variable, we assign the output of Get-Counter to another variable called "pagedata".

$pagedata = Get-Counter $page_Counter

After that we use Select-Object to return only the properties we want to see. Then we pipe the results to a text file and a csv file. Keep in mind the code needs to be on a single line.

$pagedata.CounterSamples | Select-Object TimeStamp, CookedValue, RawValue, SecondValue | Format-Table -Auto | Out-File "C:\test1.txt"


$pagedata.CounterSamples | Select-Object TimeStamp, CookedValue, RawValue, SecondValue | Export-Csv "C:\test1.csv" -NoTypeInformation


That's all there is to it. This solved my error problem, and I learned some new stuff in the process. I hope this provides some instruction and inspiration for using PowerShell to gather PerfMon data.


PowerShell and Active Directory

Just the other day I read a blog post by Jeffrey Hicks ( twitter | blog ), written back in April, about using a free tool made by Quest Software that makes working with Active Directory so easy. It is called "ActiveRoles Management Shell for Active Directory" and is FREE to download. I decided to try it on my test server at work, and the next day I was able to solve a problem another project was having trouble with.

Problem at hand

I was talking with another DBA on our team about one of his projects. Seems they had some sort of licensing issue, and they needed to get the users from several of the groups on the domain and output the data to a file periodically.

Well, I told him about this new PowerShell plug-in from Quest. I thought the tool was so cool, I bet it would do what he needed. He said one of the SAs was working on a solution and to talk with him to see if he needed any help. The SA in question happened to reside in the cubicle next to me at the office. Was this fate or something? Everything was just falling into place.

The next morning I talked with him (my SA neighbor) for a few minutes to see what he was trying to do and where he was in the process. Seems the driving factor was, as the DBA had mentioned, related to vendor licensing per user on the system. But he was having trouble getting what was needed. He went on to say he had been working with Microsoft to get some help, but really was not getting what he needed, either help or the data.

All he needed was the domain/username and name of the user, like this.

PRW\prwsmithjj, Smith, John J.

I told him about what I was doing the day before with the tool from Quest Software and if he wanted I would see what I could figure out. All I needed from him was the names of the groups and we could go from there.

Turns out I had exactly what he needed within 10-15 minutes, and most of that was logging onto my servers. He was so happy I heard him drop my name on a conference call later that morning, saying, "All the credit goes to AJ. He really knocked it out of the park." True, I wrote the code needed, but I am not taking all the credit. Most goes to Quest Software for creating and distributing such cool (and FREE) software, and some goes to Jeff for writing his post.

So the following is a short guide to what I did and the code needed to resolve the issue.

Install the Management Shell

You can download the management shell from here. There is a License Agreement that must be accepted before downloading. You have the option of downloading a .zip file or the .msi binary. I highly suggest downloading the Admin Guide as well for future reference. Once you download the tool, installation is straightforward.

I think it is important to note the Management shell uses the permissions of the user executing the code. Read more about the permissions on page 26 of the Admin Guide.

Getting User Information

Before we can use any of the cmdlets, we need to load the snap-in by running this.

Add-PSSnapin Quest.ActiveRoles.ADManagement

Next get information about a user (this code is so simple you are going to love this).

Get-QADuser $env:username

That will return some basic (default) information on the current user running the query (you). Mainly your name and your CN. Something like this:

CN=Smith\, John J., OU=PRW Solutions, OU=Contractor,…

You can also query by any of the fields in AD like city, company, or lastname.

Get-QADUser -LastName 'Sm*'

Just like any query using a wildcard: the less specific your filter, the wider your dataset; the more specific your filter, the smaller the return.

It is important to note that there are many fields in Active Directory, and as you might expect, every organization is different. So to get all the columns it is a little more complicated. You add a pipe and a "SELECT" statement at the end of the line, like so.

Get-QADUser Smith | SELECT *

OK, so it is not really complicated, I was trying to add some drama. As you know this statement will return all the columns. Probably not something you want to do all the time. So, now that we reviewed all the columns returned we can narrow our SELECT statement to our desired output.

Get-QADUser Smith | SELECT firstname, lastname, city

Now that we have an idea of what we are doing let’s move on to the issue at hand.

Getting Groups

Getting group information is just as simple as getting user information. If you don't know the full spelling of the group name, you can use the wildcard character to search, just like we did for the username above.

Get-QADGroup 'PR*'

Getting Group Members

Get-QADGroupMember 'PRW'

This returns all the users in the group. As before, we limit the columns returned with the following line.

Get-QADGroupMember 'PRW' | Select NTAccountName, Name

This returns exactly what we need. The only additional thing we need to do is save this to a .csv file for tracking and archiving. That is done by adding a small bit of code and piping the output to a file.

Get-QADGroupMember 'PRW' | Select NTAccountName, Name | Export-Csv C:\domainUsersdata\domainuserdata.csv
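Since the original problem involved several groups, here is a sketch extending the same line to loop over a list. The group names below are illustrative placeholders, not the actual groups from the project.

```powershell
# Export the members of several AD groups to a single CSV file.
# Requires the Quest ActiveRoles snap-in; group names are hypothetical.
Add-PSSnapin Quest.ActiveRoles.ADManagement

$groups = 'PRW', 'PRW-Admins'   # placeholder group names
$groups |
    ForEach-Object { Get-QADGroupMember $_ } |
    Select-Object NTAccountName, Name |
    Export-Csv C:\domainUsersdata\domainuserdata.csv -NoTypeInformation
```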

That's all there is to it. Hopefully this will help someone else along the way.


Presenting at CACTUSS

I don't usually write about myself, but this one is different: a little shameless self-promotion.

I’ll be giving my first ever SQL presentation tomorrow at CACTUSS in Austin. Am I scared? Hell yes! Will I survive? There is no doubt in my mind. I plan on having fun and doing the best I can. If or when I make a mistake I will adjust and move on.

You see, it may be my first SQL talk, but I am no stranger to public speaking. I am an elected official of a fraternal organization in our church, and every month I get up in front of a bunch of grumpy old men and run the meeting. I don't always know the answers, but I keep my cool (usually) and work through any issues the best I can.

My presentation, "Maintenance Plans or Not", is based on replacing the "canned" Maintenance Plans in SQL Server Management Studio with jobs running PowerShell scripts. There is still some work to be done as I move forward with the concept, but I have a start and will continue to improve it.

The idea, or concept, is to have "options". If you, as a DBA, want to manually create the Maintenance Plans, no problem. If you want to use T-SQL, OK. If you want to use PowerShell, then (wink, wink, nudge, nudge) I have scripts for that! Not all projects and server installs are the same. They can be very similar, but this way we have several options and are not "locked" into only one solution.

So, if you are in the Austin metropolitan area, want some free food, and want to heckle me, then stop on by tomorrow night.
Here are the details and the CACTUSS web site.

We meet at:
Quarry Oaks II
10900 S. Stonelake Blvd.
Suite 225
Austin, TX 78759
5:30PM to 6:00PM – What problems are you having?
6:10PM to 7:00PM – Feature presentation
7:00PM to 7:30PM – Q&A, follow up, and discuss the next meeting.


Error running SQL 2008 SMO SQLBackup

The actual error I was getting is this.

Exception calling “SqlBackup” with “1” argument(s):

I am going to explain what I was doing, the cause of the error and the fix. The purpose of this little exercise is to inform and hopefully reduce the stress level of others.

A little background.

I have been working with PowerShell, creating scripts to replace our maintenance plans. So, naturally, backups were on the list. After creating the scripts at home (yeah, I'm like that; work is never done, but can you call this work?) and testing and confirming they worked, I took them to work to start replacing different tasks in the Maintenance Plans with jobs run by SQL Agent.

Since we back up system databases, user databases, and user logs (if the Recovery Model is set to "Full"), I created three separate jobs and set a schedule for them to run. I then moved on to the next task to be replaced in the Maintenance Plans: removing old backup files.

I finished writing the script, and during testing I needed to add some database backup files (.bak) and log backups (.trn) to the directory to confirm the results. I ran the backup job, no problem. When I ran the log backup, *BAM*, the job failed. Reading through the error message in the Job History was, as usual, very informative. Not! So I decided to take a look at the script in the PowerShell ISE.

Again, I ran the backups with no error, but every time I ran the log backup I got this error.

Exception calling “SqlBackup” with “1” argument(s): On server “MyServerName”

What the heck was going on here? It ran fine on my machine at home. (Never heard that one before, huh?)

I searched Google, but did not find much help there. There were a few things, but, as usual, they were similar yet different. So I kept digging. I then turned to my next source of good information: Twitter. For those of you that don't use Twitter (what are you waiting for?), there are several help tags that all the smart (and cool) people monitor, like #sqlhelp for SQL Server related problems, issues, or questions, and #poshhelp for the same related to PowerShell. Unfortunately, there wasn't much help there either.

Then I got a direct message (DM); again, for those of you not on Twitter, that is a private message. It was from Aaron Nelson, a.k.a. SQLvariant ( blog | twitter ). He asked me if I wanted some help with the error. Knowing Aaron (through Twitter; I haven't actually met him. Yet.), I promptly took him up on the offer and sent him my scripts.

Long story short, he said they worked in his testing environment. One thing he did ask: was I loading the plug-ins for SQL Server (more on those later)? He said he had seen weird behavior in the past, and loading them into the profile had fixed his issue; besides, it couldn't hurt. So I loaded the plug-ins, and no luck, same error. Aaron then said something that sparked the light bulb: "It's probably some kind of security 'feature' that causes the problem."

I searched Google again, this time for SqlBackup (this is using SMO) and security related issues. I found a post from someone saying they had the same error, and after setting the correct permissions on the database the error went away. SQL BOL (Books Online) stated the following:

To run the SqlBackup method, users must have BACKUP DATABASE or BACKUP LOG permissions on the database, or be a member of the db_owner and db_backupoperator fixed database role and the sysadmin fixed server role.

Am I reading that correctly? To create backups I need permissions in addition to my sysadmin permissions? Hmm. Confusing, to say the least. I sent an email to our Microsoft rep, and then we talked on the phone. He agreed the wording is confusing and suggested doing some additional testing related to permissions to confirm. We are working with him and our security group to update post-install scripts that remove guest CONNECT permissions on databases. I asked if he thought that might be causing the issue; he said, "That is a strong possibility".

So, armed with this knowledge, I went to the server I was originally testing my scripts on and granted CONNECT permissions on msdb for the user I was logged in as.
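For reference, a sketch of what such a grant might look like through SMO. The server name is a placeholder, and whether you grant to guest or to a specific login depends on your environment; run it with sufficient permissions.

```powershell
# Sketch: restore the guest CONNECT permission on msdb via SMO.
# 'MyServerName' is a placeholder; swap 'guest' for a specific login if needed.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'MyServerName'
$server.Databases['msdb'].ExecuteNonQuery('GRANT CONNECT TO guest')
```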

Then I tested the log backup script and it ran without any errors. YeeHaa! (that’s a Texas thing)

So there you have it. If you are getting the error above, you may want to check the permissions for the user you are logged in as, and see whether the guest CONNECT permission on msdb has been removed. Removing these permissions can cause all kinds of issues.

If you are removing guest CONNECT permissions based on old/bad information in BOL, like we were, I suggest you read more about why it is a bad idea in Buck Woody's ( blog | twitter ) post. Here are a few additional links you can check, here and here, about this as well.

I hope this helps someone else if/when they run into this issue.


Using Powershell SQL Database Backup and Compression

Recently I have been working with PowerShell, a lot. My goal is to create PowerShell scripts to replace the maintenance plans we have at work. I know, maintenance plans, yuck! But they are better than having nothing at all.

Anyway, I wanted to demonstrate a really cool feature added to SQL Server in the 2008 version: backup compression. You may have read about it before, but I am going to show you some hard numbers and let you decide if this is a feature you want to start using.

This script is from SQLvariant, a.k.a. Aaron Nelson's ( blog | twitter ), PowerShell presentations.

Using this script:

## This script backs up all user databases
## The original code for this came from Aaron Nelson @SQLvariant
## Modified 9/25/2010
## By AJ Mendo @SQLAJ

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMOExtended") | Out-Null

# define the instance name
$instance = 'Server_Name'
$s = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $instance

# We define the folder path as a variable
$bkdir = "D:\Backup"
$dbs = $s.Databases
foreach ($db in $dbs)
{
    if ($db.IsSystemObject -eq $False) # We don't want to back up the system databases
    {
        $dbname = $db.Name
        $dt = Get-Date -Format yyyy_MM_dd_HH_mm # Used to build a file name from the timestamp
        $dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
        $dbBackup.Action = "Database"
        $dbBackup.Database = $dbname
        $dbBackup.Devices.AddDevice($bkdir + "\" + $dbname + "_db_" + $dt + ".bak", "File")
        $dbBackup.SqlBackup($s) # run the backup
    }
}

Here is a screen shot of the size of the files after we run this script.

Now we will add the compression Property to the script.

If you add this line:

$dbBackup.CompressionOption = "1"

above the last line in the code


Then run the code again and look at the sizes of the databases now.

Much smaller, roughly 25% smaller. Now that is impressive and can save on disk space.

Hopefully you can, or already do, use this feature on your SQL Server 2008 or 2008 R2.


Get ExecutionPolicy

After my laptop crashed last week, I moved to the Windows 7 OS, and since it was a clean install (actually an image) I had to install a lot of software. One bit of software I did not have to install was PowerShell, because it is part of the default install. Yeah!

One of the things I forgot about when first starting to work with PowerShell and running scripts is the ExecutionPolicy. When trying to run a script to start SQL Server, I ran into this little error.

startService.ps1 cannot be loaded because the execution of scripts is disabled on this system.
Please see “get-help about_signing” for more details.

I had forgotten about the default execution policy being set to Restricted. This is a security feature built into PowerShell. You can check your own execution policy by running this command.

# get the execution policy
Get-ExecutionPolicy

#This is what PS returned:

Restricted
PS C:\Users\sqlaj>

How do we fix this, you ask? That is the easy part; just run this command in PowerShell.

Set-ExecutionPolicy Unrestricted   # (other options are RemoteSigned and AllSigned)

Note, this will allow you to run all scripts, even unsigned ones, so it poses a potential security risk to your PC or server. To get more detailed information on what "signed" and "unsigned" scripts mean, type the following command.
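If Unrestricted feels too loose, RemoteSigned is a common middle ground: local scripts run, while scripts downloaded from the internet must be signed.

```powershell
# Allow local scripts to run; require signatures on downloaded scripts
Set-ExecutionPolicy RemoteSigned
```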

Get-Help About_Signing

All the information and code commands I described here can be found on MS TechNet.