SQL Server GROUP BY in PowerShell

I usually don't have any specific reason to play with PowerShell other than the sheer joy of doing it, but this time I did. I wanted to go through a client's server and find out how many tables each database had.

Now something like this can easily be done in T-SQL for a single database with this bit of code:

-- count of user tables in the current database only
SELECT COUNT(*) AS TotalCount
FROM sys.all_objects
WHERE is_ms_shipped = 0 AND type = 'U'

The above works, but then you have to write additional code to execute it against every database on a given server. I prefer to use PowerShell one-liners to get things like this done, at least when I can.

In this instance I figured out how, by introducing you to the Group-Object cmdlet. You can pass a property to this cmdlet and it will return a count of the objects it finds, grouped by that property. So what was done above can easily be accomplished with these three lines (OK, so not quite a one-liner):

Import-Module SQLPS -DisableNameChecking
$s = New-Object Microsoft.SqlServer.Management.Smo.Server 'ORKO'   # your instance name
($s.Databases).Tables | Group-Object Parent | select Count, Name

The above will output the below against my local instance of SQL Server 2012:

[Screenshot: Group-Object output listing each database and its table count]
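
One thing to note: the SMO Tables collection is not filtered the way the T-SQL above is with is_ms_shipped = 0, so the counts can differ slightly on databases like msdb. If you want the two to line up, a small tweak (only verified against my own instance) is to filter on the IsSystemObject property first:

($s.Databases).Tables | where {-not $_.IsSystemObject} | Group-Object Parent | select Count, Name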

Max User Connections set to one (oops)

I frequent the Database Administrators forum over on the Stack Exchange Network, and came across a question that intrigued me enough to play around with the maximum number of concurrent connections setting. What happens when you set this to one, and how do you set it back to the default?

As a small note, the method this OP tried to use just to get a database restored was completely the wrong way to go about it. You should not be making changes to server-level configurations without knowing the full effect they can have, especially on a production instance.

Now, if you read the documentation on this setting, the main sentence that should grab you is:

The user connections option specifies the maximum number of simultaneous user connections that are allowed on an instance of SQL Server.

The default value of zero (0) means unlimited, so it goes without saying that if you set this to anything above zero you are limiting how many connections you can have. I would say that setting this to one is roughly equivalent to putting the instance in single-user mode.

I set this to one on my local SQL Server 2012 instance and restarted the instance, because any current connections are not going to be closed automatically just by changing this value. Once I tried to connect back to my instance I received this error:

[Screenshot: user connections error when trying to connect]

Now what do you do to get access again? The highest-voted answer to the question suggests using the DAC (Dedicated Administrator Connection). Alright, let us see what happens:

[Screenshot: DAC connection attempt from SSMS]

I will note that the above is from clicking "New Query" in SSMS, because the initial connection prompt you get when opening SSMS does not support a DAC connection. Doing this let me connect, but it is not guaranteed to work in every situation. So what is another method, you ask? PowerShell, I answer.

You can make modifications to SQL Server configurations using SMO and PowerShell. The example provided is actually what pointed me to these lines of code to fix my predicament:

Import-Module SQLPS -DisableNameChecking
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'ORKO'
$srv.Configuration.UserConnections.ConfigValue = 0   # 0 = unlimited, which is the default
$srv.Configuration.Alter()

I executed the above, restarted the instance, and was back in business. Now I will stipulate that I ran the code immediately after the instance restarted, so I was sure I grabbed the only connection allowed. If you have other applications or systems hitting your instance you may need to just try the DAC. If the DAC is disabled in your environment (some security standards want to see it disabled) then you will need to do a bit more work to grab that single connection.
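
If you want to double-check what the instance is configured with versus what it is currently running with before and after the restart, the same SMO object exposes both values; a quick sketch:

$srv.Configuration.UserConnections | select ConfigValue, RunValue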

I am Biml (Bacon Integrated Markup Language)

Wouldn't it be awesome if that is what it really stood for? Maybe not. So what does Biml really stand for? Business Intelligence Markup Language.


Small Background

Varigence is the company that created Biml, particularly Scott Currie, back in 2008. You can check their page to get the full story; it is worth a read. It is pretty remarkable that they took a thought and turned it into a full technology that includes both a proprietary and an open source product. I can remember hearing about it within the past year or so but it never clicked with me at the time (it does now). I think around 2012 it started to get an uptick in exposure, and over the past year it has begun showing up more and more in sessions at PASS Summit and at SQL Saturdays.

Now BimlScript is the use of C# (by default) code within your Biml, which offers an enormous amount of flexibility in what you can do. This is where you can get into dynamically creating things within an SSIS project or even an SSAS project.
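
To give a rough idea of what that looks like (this sketch is mine, not pulled from any of the resources below, and the table names are made up), BimlScript is plain Biml XML with C# nuggets embedded in <# #> delimiters; here a simple loop emits one empty package per table:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <# foreach (var table in new[] {"Customer", "Order", "Product"}) { #>
    <Package Name="Load_<#=table#>" ConstraintMode="Linear" />
    <# } #>
  </Packages>
</Biml>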

The Potential

I think back to a contract job I had with the Army handling ETL design. Through FOIA requests I was responsible for making packages that would pull data out of the source and dump it into an Access database or Excel spreadsheet. The design pattern I had to use there was creating a package per table of data that needed to be pulled, and then a master package that would simply iterate through all of those "table" packages.

We used an Agile development process, so one request would generally take me a few days to build out the project, deploy it into test, then staging, and then finally production. Depending on what amount of data was being requested, it could take up to a week to get everything designed and finalized to send the file to the requestor.

If I could have used Biml for this process I would have cut down the initial design time considerably, with it providing the ability (using C#, which I would have had to learn as well back then) to iterate over the table list of a given source and generate all the Data Flow Tasks. Man, think of the clicks you go through just setting up one Data Flow Task and how much time this would have saved.

Biml Products

The most common product referenced in the community alongside Biml, at least that I have seen, is BidsHelper. BimlScript is just one part of what is included in BidsHelper; there are a ton of little things added in there that the creators of BIDS left off. If you are using SSDT-BI now, a few of those things may be obsolete. You can look at the list of what is included under its documentation. I actually listed some of the ones I find worth noting, even with SSDT-BI, on my Tools of the Trade page.

BidsHelper is used within Visual Studio (BIDS or SSDT-BI) and is an excellent place to get started using Biml. If you find you need to go to the next step, Varigence has an IDE called Mist that provides a very robust environment for developing SSIS and SSAS. I link to a few videos below that show off what this product can do, very cool stuff! The other product they have is Vivid, an add-on for Excel users that handles data analysis.

Resources to Note

When I wanted to learn more about Biml I found myself asking "where do I start?", similar to how a lot of folks ask when they want to get into being a DBA or Database Developer. Below are some links to articles and videos that can help you get started.

Articles:

SQLServerCentral’s Stairway to Biml (yes, you should register…just do it)

Bill Fellows' blog post on Biml (some good nuggets if you learn by seeing)

BimlScript Walkthroughs, Getting Started

Biml Language Reference (similar to MSDN for SQL Server)

Biml Snippets (ton of examples to learn from here)

Videos that I found the most helpful:

SQLug.se – Peter Hansen’s – Biml, session 1

SQLug.se – Scott Currie’s – Biml, session 2

Using BIML as an SSIS Design Patterns Engine (PASS DW/BI Virtual Chapter) (Andy Leonard does a great walkthrough of Biml)

Biml Introduction and Fundamentals Webinar (Peter Avenant with Varigence, great video showing Mist and Biml, you should subscribe to their channel)

Working with BimlScript to ease and automate your SSIS development (Session done by Jeff Mlakar, just this month. Some excellent tips and examples)

PowerShell: Backup Files to Tape

I had a maintenance plan on a client's server that was having issues running successfully each night because it ended up filling the drive. This, as it turns out, came down to the cleanup task being set to delete backups older than 1 day. You would think that was sufficient, however in this instance it was not working. The time stamps on the previous day's backups were within a minute of the next run of the maintenance plan, so I took it to mean the task was not actually seeing them as over a day old and therefore skipped them. I decided to change the cutoff to 21 hours instead of one day, and that seemed to work for this client.

Either way, I needed to delete the old backups that I knew were already backed up to tape. To do this I wrote up this function, which simply returns a bit of information on each file: name, last access, last write, and attribute values. The attributes will show "Normal" if the file has been backed up (or the archive bit otherwise cleared) and "Archive" if it still has to be backed up.

Now I would caution you to do your own verification as to whether the file actually exists on your backup media (tape, disk, etc.). There are multiple things that can modify the archive bit, so use at your own discretion.

function Verify-FileBackedUp ($rootdir, [switch]$recurse)
{
 # if the value "Archive" is shown the file has not been backed up
 # if the value "Normal" is shown the file has been backed up, or the archive bit was cleared
 if ($recurse)
 {
  dir $rootdir -Recurse | where {-not $_.PSIsContainer} |
  select FullName, LastAccessTime, LastWriteTime,
   @{Label="FileBackedUp";Expression={(Get-ItemProperty $_.FullName).Attributes}}
 }
 else
 {
  dir $rootdir | where {-not $_.PSIsContainer} |
  select FullName, LastAccessTime, LastWriteTime,
   @{Label="FileBackedUp";Expression={(Get-ItemProperty $_.FullName).Attributes}}
 }
}

A sample output would look something like below:

[Screenshot: sample output from Verify-FileBackedUp]
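
As a usage sketch (the path and the one-day cutoff here are made up, so adjust to your own setup), this lists the files whose archive bit says they have already been backed up and previews deleting anything older than a day; leave -WhatIf in place until you trust the list:

Verify-FileBackedUp -rootdir 'D:\SQLBackups' -recurse |
 where {$_.FileBackedUp -notmatch 'Archive' -and $_.LastWriteTime -lt (Get-Date).AddDays(-1)} |
 ForEach-Object {Remove-Item $_.FullName -WhatIf}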

SQL Server Data Tools: A spork of sorts (part two)

In the previous post (a bit of a rant, too) I went over SSDT and how Microsoft has evolved it into SSDT (database development) and SSDT-BI (Business Intelligence development). In this post I wanted to go over how you actually get this installed so you can start the fun of working with SQL Server 2014.

During the CTP phases of the SQL Server 2014 release you may have caught the blog post on SQL Server Data Tools – Business Intelligence (SSDT BI) for SQL Server 2014 CTP1 (the responses in the comments are another interesting read). That is where it was announced that you will not be able to install SSDT-BI from the SQL Server 2014 installation media; it became a separate download. You get this download from the Microsoft SQL Server Data Tools page on MSDN; scroll down to the bottom of the page to find the SSDT-BI download links for VS 2012 and VS 2013.

I opted to just go ahead and move to the Visual Studio 2013 version and downloaded that from here. If you have installed the VS 2012 version before, the installation process has not changed. I will go over the installation of the VS 2013 version to provide a walkthrough for those that may be just starting out.

You start out with a file named "SSDTBI_x86_ENU.exe" (yeah, still no 64-bit version). After the download is complete just double-click it. This is basically a click-Next-through-the-wizard installation, so I will just point out a few notes from my experience installing it on a new Windows 7 machine.

  • The file you downloaded is going to extract its contents to a directory of your choice. If you end up having to cancel and start over, go into that extracted directory and run the setup.exe file. You will actually get the SQL Server Installation Center, similar to this:
    [Screenshot: SQL Server Installation Center]
    Just click on “New SQL Server stand-alone installation…” and it will bring you to the starting point of accepting the license terms.
  • The feature selection states that .NET Framework 4.5 is installed through this media. In my instance it did not actually install it and the setup failed a feature rule check, so it might be worth verifying it is installed if you are on a new machine.
    [Screenshot: feature rule check failure]
  • Also note above what else it is going to install: Visual Studio 2012. If you don’t believe it, check out my own laptop:
    [Screenshot: installed programs on my laptop showing Visual Studio 2012]

Once you get done with the installation you are ready to get started with Business Intelligence development.


Happy learning!

SQL Server Data Tools: A spork of sorts (part one)

Prior to SQL Server 2012 the Business Intelligence (BI) toolset was made up of just Business Intelligence Development Studio (BIDS), which was easy to install since it was part of the SQL Server installation media. For database development tools, though, you had to go the route of getting a license for Visual Studio, which allowed you to create database projects. Oh, how that is all changing now, and in some ways going back in time.

SQL Server Data Tools (SSDT) brought about changes in both BI and database development. First, database development changed to SQL Server projects and no longer required a fully licensed version of Visual Studio; it used the integrated shell. It stayed this way up until Visual Studio 2013, with which it now appears to have gone back to requiring a licensed version of VS. A bit more on this part below.

Business Intelligence development also changed tools, from BIDS to SQL Server Data Tools. Wait, wasn't SSDT for database development? It is actually for both, but they require separate installs. The BI side is now being referred to as SSDT-BI. Nice work there, Microsoft.

If you have installed SSDT for SQL Server 2012 you will note, by viewing the new project window, that it actually only installs SSDT-BI:

[Screenshot: the New Project window]

What?!!?? You want me to install the same thing again? Well, it appears that even though they share the same application title, they are now separate. Read the blog post on MSDN, SSDT and Visual Studio Versions, and note the comments. I particularly liked this one:
[Screenshot: comment from the MSDN blog post]

What that means is SSDT now encompasses two things, database development and BI development (think fork + spoon = spork), but two different teams are developing each one. Glad that confusion is over.

Now if you go to the page for Microsoft SQL Server Data Tools it provides a link to "Download Visual Studio 2013 with SQL Server Tooling" that basically points you to the Visual Studio 2013 product page. I have VS 2013 for Desktop installed, and nowhere can I find how to make it update to include SQL Server tooling. If you visit the Visual Studio 2013 product comparison to Visual Studio Online and look under "Development Platform Support", you will notice that SSDT only shows up for the Professional, Premium, and Ultimate editions. So they went back to 2010? Right now it appears that if you want to do database development you should stick with VS 2012 or earlier. I think VS 2012 includes support for SQL Azure development, but I have not dug into that as of right now.

In the next post I will go over some things with SSDT-BI and how installation for this changed.

SQL Server Prerequisites .NET Framework

As of SQL Server 2012, the .NET 3.5 SP1 prerequisite is no longer installed for you by the SQL Server installer if it is found to be missing from the server. With Windows Server 2008 R2 SP1 or higher this is simply a matter of enabling the feature within Windows Server, not an actual installation, and it can easily be accomplished with a quick PowerShell script run prior to installing SQL Server.

The Windows PowerShell cmdlets used in this instance are Get-WindowsFeature and Install-WindowsFeature (on Windows Server 2008 R2 you will use Add-WindowsFeature instead).

The .NET 3.5 SP1 feature in Windows Server is referenced as “Net-Framework-Core” in PowerShell. You can find this by calling the command:

Get-WindowsFeature Net*

[Screenshot: Get-WindowsFeature Net* output showing NET-Framework-Core]

The script below will first verify the feature is not installed and then enable it (or technically install it, I guess). In order to do this you will need the OS media, as you have to pass in the source path to "<drive letter>:\sources\sxs". Note: you have to execute this in an elevated PowerShell console, so "Run as Administrator".

# verify whether it is installed first
(Get-WindowsFeature Net-Framework-Core).Installed

# add the feature
Install-WindowsFeature Net-Framework-Core -Source 'G:\sources\sxs'

When you run this command you will see a status “bar” of sorts appear while it is performing the installation. Once completed you should see output that lets you know it was successful.

Now if you happen to see any errors you might reference KB 2734782 – .NET Framework 3.5 installation error, as there are various reasons why this command might fail.
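
If you prefer to wrap this up as a single pre-install step, here is a small sketch (the drive letter is whatever your mounted OS media happens to use) that only kicks off the install when the feature is missing:

# enable .NET 3.5 SP1 only if it is not already installed
if (-not (Get-WindowsFeature Net-Framework-Core).Installed)
{
    Install-WindowsFeature Net-Framework-Core -Source 'G:\sources\sxs'
}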

A small update to this post…

I recently came across a server where I was getting the "0x800f0906" error, and went through the article above verifying Internet access from the server and that I had mounted the ISO for the OS properly. I came across a blog post that brings up the possibility that you need to update the local source media on the server.
