Category Archives: SQL 2012

Interview with Warwick Rudd

 


Traveling to various events and being part of the SQL Community means you get to meet some pretty awesome professionals. I was fortunate enough to run into Warwick Rudd (B|T) at one of the PASS Summit events held in Seattle, and he definitely lives up to all the hype.

Warwick is a SQL Server MVP, Microsoft Certified Master – SQL 2008, MCT, Founder and Principal Consultant at SQL Masters Consulting. He’s definitely an avid blogger, talented speaker, and a leader in our SQL Community.

After PASS Summit 2015 we kicked around the idea of sharing a few questions and answers; the timing finally aligned, so without further ado:

  • How did you get your start working with SQL Server?

I was working as a UNIX scripting developer on an in-house scripting language. The company had a couple of web developers who had installed SQL Server 6.5, and the company needed someone to look after the SQL Server environment. I moved in with the Oracle DBAs, as there were no SQL Server DBAs, and my first training course was delivered by Greg Low. Look where things have led me now!

  • If there were another occupation you could see yourself doing, what would it be and why?

Physiotherapy – I have played a lot of sports, some to a very high level. I find sports and sports remediation interesting and just naturally enjoy learning about them.

  • Being in technology, we do play some pranks on our colleagues. What is one from your past that you are willing to share?

I was working in a bank, and at the time we did not have PCs but dumb terminals. We disconnected the keyboard and put sticky tape over the connection before seating the connector back just enough to make it look plugged in, so it would pass an initial inspection of why the keyboard was not working.

  • Where is one place that you would love to speak at someday (conference, SQL Saturday, event, etc.)?

Ha-ha, this is a tough one, as there are so many different things to take into consideration. But I guess I would love to speak at SQL Saturday in Colorado if it were ever held in winter, as I love being in the snow and snowboarding – I would then get to do two things I enjoy. There are some bigger events that would be so humbling to be selected for, if I ever got the opportunity to speak at them, but I will keep those close to my chest so as not to jinx myself 🙂

  • For those out there who have not heard of the SQL Community, what three words would you say describe it?

Friendly, Supportive, Intelligent

Big thanks to Warwick for allowing us to take a glimpse into some of his thoughts. If you are ever at an event make sure you stop by and say hi to him; just a stellar individual.

Reports In SSISDB

I was approached with an issue in our shop where a group could not view the execution reports in the SSISDB. The reason was the security standards in place at the shop; the users could not be allowed to have the SSIS_Admin role.

In order to allow the user group access to review the Integration Services reports on package executions, the SSIS_Admin role would have to be given. This role allows the following capabilities:

· Import Packages

· Delete own packages

· Delete all packages

· Change own package roles

· Change all package roles

This role can also elevate privileges to sysadmin. This elevation of privilege can occur because these roles can modify Integration Services packages, and Integration Services packages can be executed by SQL Server using the sysadmin security context of SQL Server Agent.

What this post provides is a solution that avoids granting the SSIS_Admin role but still allows the necessary entities access to the SSIS execution reports, without granting any write, execute, create, update, or insert access.

The Catalog Views

From what I have found, Microsoft has two catalog views that make up the report access for SSIS packages (2012 and later). We will look at both of these views and alter them to comment out the where clause. Doing so removes the restriction that only SSIS_Admin members and sysadmins can see the rows, allowing other users access to these reports.

Catalog.Event_Messages

The first catalog view we will look at is catalog.event_messages. This view displays information about messages that were logged during operations. As Microsoft ships it, the view only allows those with the SSIS_Admin or sysadmin privilege to see its rows. In our case we would like other groups to have the ability to review the messages only, so we need to alter the catalog view.

To access the catalog view, navigate to the SSISDB database on the SQL instance, right-click the catalog view, and script it as ALTER:

[Screenshot: scripting the catalog.event_messages view as ALTER in SSMS]

Next, comment out the where clause and execute the alter statement to update the catalog view appropriately.

[Screenshot: the altered view with the where clause commented out]
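For reference, the security predicate sits at the bottom of the scripted view. The sketch below shows the shape of the edit from memory of the 2012 definition, so verify it against what SSMS actually scripts for you; the column list and internal tables are elided here and should be left exactly as scripted:

ALTER VIEW [catalog].[event_messages]
AS
SELECT ... -- column list exactly as SSMS scripted it
FROM ...   -- internal tables exactly as SSMS scripted them
--WHERE [opmsg].[operation_id] IN (SELECT [id] FROM [internal].[current_user_operations])
--   OR (IS_MEMBER('ssis_admin') = 1)
--   OR (IS_SRVROLEMEMBER('sysadmin') = 1)
GO

The same predicate pattern appears in catalog.executions below, so the identical edit applies there.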

Catalog.executions

The second catalog view, catalog.executions, can be accessed in the same manner as the first by repeating the same methodology of altering the view. It displays the instances of package execution in the Integration Services catalog; packages that are executed with the Execute Package task run in the same instance of execution as the parent package.

After right-clicking and scripting the view as ALTER, the where clause will need to be commented out as shown below:

[Screenshot: catalog.executions with the where clause commented out]

After the where clause is commented out, execute the alter command to update the catalog view.

Granting Catalog Access

After altering the catalog views we are left with a manual but needed process. Within the Integration Services Catalogs node, the following steps will need to be completed:

1. Right-click on the folder and go to Properties

2. Once the Folder Properties dialog box opens, go to Browse in the upper right-hand corner


3. Once the Browse All Principals dialog box opens, select the public database role and click OK


4. Ensure read access is granted to the database role and click OK

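If you would rather script the grant than click through the dialogs, SSISDB also exposes the catalog.grant_permission procedure. A minimal sketch, assuming a hypothetical folder named SSISReports and a hypothetical reporting role in place of public:

USE SSISDB;
GO

-- Look up the folder to expose (folder name is hypothetical)
DECLARE @folder_id BIGINT =
    (SELECT folder_id
     FROM [catalog].[folders]
     WHERE name = N'SSISReports');

-- Grant READ on the folder to the database role.
-- object_type 1 = folder, permission_type 1 = READ.
EXEC [catalog].[grant_permission]
     @object_type     = 1,
     @object_id       = @folder_id,
     @principal_id    = DATABASE_PRINCIPAL_ID(N'ssis_report_readers'), -- hypothetical role
     @permission_type = 1;

Whether the procedure accepts the public role's principal ID directly is worth verifying; a dedicated role keeps the grant easier to track either way.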

Conclusion

By taking the above steps the users can now see the reports, and that is it. In testing I've found that users cannot do the following:

· Create new environment variables

· Create new packages

· Create new folders

· Add, insert, update, or delete anything in the SSISDB or the Catalog Folders

· Initiate any SSIS SQL Agent jobs

· Execute any queries against the SSISDB

And I've found users can do the following:

· See package names

· Right-click and select All Executions

· View the reports

**As with anything, do not take code from the web and blindly implement it in your production environment.

Why is Count(*) Taking So Long

Phone rings…… I answer…… a DBA from a third-party vendor has supplied someone with two scripts. The first script is a simple insert with a where clause:

insert into [databasename].[dbo].[tablename]
    (column1, column2, column3, column4)
select column1, column2, column3, column4
from [databasename].[dbo].[tablename]
where column1 < [integeramount]
  and column1 > [integeramount]

First question I ask: how much data is loading into the table? Answer: millions of records. The vendor gave us a script to see if the number is increasing. What is the script; do you have it? Sure… script below:

select COUNT(*) from tablename

I was reminded of something I came across several years ago about this very scenario, so I figured why not put it to the test. I will try to explain to the best of my ability why the second query was taking an hour to run.

The first problem I see right off hand is that the COUNT(*) statement has to do a full scan; it must touch every row to calculate the result. Take this statement, run it against a table with several million rows and a ton of activity against it, and you have a recipe for sitting, watching, and waiting for the result set to return.

How Do I Get Around This?

It's not that difficult, and here is a nice trick that provides a quick solution. I'm a huge fan of DMVs, and it just so happens that you can utilize one to return row counts for all tables in a database, or for specific tables:

SELECT o.name,
       ddps.row_count
FROM sys.indexes AS i
INNER JOIN sys.objects AS o
    ON i.object_id = o.object_id
INNER JOIN sys.dm_db_partition_stats AS ddps
    ON i.object_id = ddps.object_id
   AND i.index_id = ddps.index_id
WHERE i.index_id < 2
  AND o.is_ms_shipped = 0
--AND o.name = N'[table name]' /*UNCOMMENT AND PLUG IN TABLE NAME FOR SPECIFIC TABLE INFO*/
ORDER BY o.name

The result will give you each table name with its row count:

[Screenshot: result set listing table names and row counts]

Don't be alarmed by querying the system objects. Unlike some older system tables, the row count here does not depend on statistics being updated, so the count is accurate. On this 132 million record table the result set returns immediately.
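If you only care about one table, a similar shortcut against sys.partitions returns just as quickly; the table name below is a placeholder to swap for your own:

-- Sum the rows across partitions of the heap (index_id 0)
-- or clustered index (index_id 1)
SELECT SUM(p.rows) AS row_count
FROM sys.partitions AS p
WHERE p.object_id = OBJECT_ID(N'dbo.tablename')
  AND p.index_id IN (0, 1);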

Next time you get stuck waiting on a COUNT(*) statement to run, think about using a DMV; for a listing, check out the categorized list Microsoft provides.

It is always nice to have some tricks up your sleeve, especially when dealing with outside vendors.

PowerShell Tactics

Last month's T-SQL Tuesday party dealt with PowerShell. I debated whether or not to post a topic, but in the end I didn't feel I should post about something I had never really tried before. So I spent a few weeks going through some basic PowerShell scripting, and what I ended up with was a way to check my databases from a list of servers contained in a text file, along with retrieving failed job history based on the same server listing.

I was impressed enough that I started to use some of this on my on-call days, where I have automated reports dumped out into Excel waiting for me for review when I get in.

While there are many variations of this out there, I researched and then put some of my own twists in when creating the scripts. Today I only plan to show one, which uses the SMO assembly to retrieve basic database information.

The Script

First I want to open Excel:

#Create a new Excel object using COM
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
$Workbook = $Excel.Workbooks.Add()
$Sheet = $Workbook.Worksheets.Item(1)

Now that Excel is open I can begin to retrieve the list of servers from the text file and create all the header information:

#Counter variable for rows
$intRow = 1

#Read through the contents of the Servers.txt file
foreach ($instance in Get-Content "C:\Server\Servers.txt")
{

#Create column headers
$Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
$Sheet.Cells.Item($intRow,2) = $instance
$Sheet.Cells.Item($intRow,1).Font.Bold = $True
$Sheet.Cells.Item($intRow,2).Font.Bold = $True

$intRow++

$Sheet.Cells.Item($intRow,1) = "DATABASE NAME"
$Sheet.Cells.Item($intRow,2) = "OWNER"
$Sheet.Cells.Item($intRow,3) = "AUTOSHRINK"
$Sheet.Cells.Item($intRow,4) = "RECOVERY MODEL"
$Sheet.Cells.Item($intRow,5) = "LAST DATABASE BACKUP"
$Sheet.Cells.Item($intRow,6) = "SIZE (MB)"
$Sheet.Cells.Item($intRow,7) = "SPACE AVAILABLE (MB)"

#Format the column headers
for ($col = 1; $col -le 7; $col++)
{
$Sheet.Cells.Item($intRow,$col).Font.Bold = $True
$Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
$Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
}

$intRow++

Now the fun part; time to get all the information and watch your Excel file go:

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null

# Create an SMO connection to the instance
$s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance

$dbs = $s.Databases

#$dbs | SELECT Name, Owner, AutoShrink, RecoveryModel, LastBackupDate, Size, SpaceAvailable

#Formatting using Excel

ForEach ($db in $dbs)
{

#SpaceAvailable is reported in KB; divide by 1KB (1024) to convert to MB
$dbSpaceAvailable = $db.SpaceAvailable/1KB

$Sheet.Cells.Item($intRow, 1) = $db.Name
$Sheet.Cells.Item($intRow, 2) = $db.Owner

#Change the background color of the cell depending on the AutoShrink property value
if ($db.AutoShrink -eq $True)
{
$fgColor = 3
}
else
{
$fgColor = 0
}

$Sheet.Cells.Item($intRow, 3) = $db.AutoShrink
$Sheet.Cells.Item($intRow, 3).Interior.ColorIndex = $fgColor

#SMO RecoveryModel enum values: 1 = Full, 2 = BulkLogged, 3 = Simple
if ($db.RecoveryModel -eq 1)
{
$Sheet.Cells.Item($intRow, 4) = "FULL"
}
elseif ($db.RecoveryModel -eq 3)
{
$Sheet.Cells.Item($intRow, 4) = "SIMPLE"
}
elseif ($db.RecoveryModel -eq 2)
{
$Sheet.Cells.Item($intRow, 4) = "BULK-LOGGED"
}

#SMO reports DateTime.MinValue for a database that has never been backed up
if ($db.LastBackupDate -eq [datetime]::MinValue)
{
$Sheet.Cells.Item($intRow, 5) = "Never"
}
else
{
$Sheet.Cells.Item($intRow, 5) = $db.LastBackupDate
}
$Sheet.Cells.Item($intRow, 6) = "{0:N3}" -f $db.Size

#Change the background color of the cell depending on the SpaceAvailable value
#(compare the numeric value before formatting it as a string)
if ($dbSpaceAvailable -lt 1.00)
{
$fgColor = 3
}
else
{
$fgColor = 0
}

#Format the result to a number with three decimal places
$Sheet.Cells.Item($intRow, 7) = "{0:N3}" -f $dbSpaceAvailable
$Sheet.Cells.Item($intRow, 7).Interior.ColorIndex = $fgColor

$intRow++

}

$intRow++

}

$Sheet.UsedRange.EntireColumn.AutoFit()
cls

Conclusion:

I added and removed various objects in the information I was pulling back. There are a lot of great posts out there that show you what the objects are, and many books that provide the syntax to accomplish all sorts of things with PowerShell.

I was not happy to miss the party last month, but I am glad I went through the exercise of diving in and doing some things with PowerShell. I look forward to doing some more involved work with it as I move forward ~ I've only begun to scratch the surface!

Select * Syndrome

Something that I have seen lately over and over again, and even ran into this morning, is a practice I would call a pretty bad habit in SQL… the dreaded Select * syndrome.

This method is heavily used for ad hoc querying, and I've seen it used in some troubleshooting scenarios, but in my case I don't have room for it in a production environment, embedded in functions, procedures, or views.

To me it is a wasteful tactic that brings back more than what is needed; it can produce unwanted scans or lookups when in some cases all that is needed is index tuning. I'm a big fan of bringing back only what you need instead of a tremendous amount of data. One can also make an argument about all the unused overhead it produces.

I cannot begin to tell you how many times I have deployed something only to find out the schema has changed and that a select * in a view, left in place years ago, is the culprit.

For example; one that I have seen within the past couple of months is a view:

Select *

From table 1

Union All

Select *

From table2

This was being used quite frequently and is just asking for trouble and poor performance. There will always be excuses as to why it wasn't done differently, but in the end you will go back in and clean it up, so it is best to think the process through at the beginning instead of at the end. A safer shape for the same view is sketched below.
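For comparison, here is a version of that view with the columns pinned down; the view, table, and column names are all hypothetical:

-- Naming the columns means a schema change in either table surfaces
-- as an error right away instead of silently shifting data around.
CREATE VIEW dbo.vw_CombinedRows
AS
SELECT col1, col2, col3
FROM dbo.table1
UNION ALL
SELECT col1, col2, col3
FROM dbo.table2;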

Can I ever use it?

Sure… I've seen it used in IF EXISTS clauses many times over, and from my research and what I know, SQL Server does not expand the column list there; the engine only probes for a row, as in the sketch below. If you leverage SQL the correct way, it is more than powerful enough to handle what you need.
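A quick illustration, with a hypothetical table and predicate:

-- The * inside EXISTS is harmless here: the engine only checks
-- whether at least one row satisfies the predicate.
IF EXISTS (SELECT * FROM dbo.table1 WHERE col1 = 42)
    PRINT 'At least one matching row exists.';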

Tools to fight the good fight………

My source control is TFS, and I like the newest version, as you can set up controls so that if a Select * is found it will break the build in dev, forcing it to be resolved.

If you haven't already downloaded the free version, check out SQL Plan Explorer provided by SQL Sentry. Execute the query both ways, with the select * and with pulling back designated columns, and review the execution plans; you will be surprised at the outcome. And if you are old school, that is fine too – analyze it in SSMS and see what you find.

Microsoft SQL Server 2012 SP1 Release

I'm a little behind in getting this information out; I wish I could say it was because I was in Seattle last week for the PASS Summit 2012, but that was not the case. In any case, Microsoft has released Service Pack 1 for SQL Server 2012.

The last list of fixes I saw for what Service Pack 1 contains can be found here.

You can access the download center here.

I'm interested to know how many of you in the SQL Community are utilizing SQL Server 2012, and what some of the features are that you like and dislike about it. If you get time, drop me a line and let me know.

 

SQL Server 2012 Best Practices – The Book

People I have spoken with over a period of time know that I have dubbed this "The Book". There are several reasons why I am stoked to get my hands on a copy (it should be in transit to me as I type!), but one of the main reasons is that I get to check out what four of the authors, whom I follow daily from a DBA standpoint, have to say.

I am certain that all of the authors deserve kudos for their efforts in getting their respective pieces completed; personally, I wanted to give a big shout-out to Grant Fritchey, Gail Shaw, Jason Strate, and Chris Shaw. These four people have helped me tremendously in my career thus far through their efforts in the community and their sharing of knowledge.

Looking forward to seeing what the book has to offer. I suggest you give it a shot if you haven't thought about buying it yet.

You can find the book on Amazon here.