Category Archives: SQLServerPedia Syndication

Removing Backup Media By Database

Have you ever inherited a server and found that its msdb database is rather large? Or maybe you have never set up any maintenance on msdb with regard to retaining backup history? Okay, okay, perhaps you don’t fall into those categories, but maybe you have policies in place, you take a database offline for whatever reason, and those policies start to fail because no backup was taken in the last 24 hours. Maybe you fall into one of these categories or maybe you fall into a different one; whatever the case may be, there are options for removing backup history cleanly and methodically without going rogue in SQL.

Microsoft has provided a couple system stored procedures within the msdb database for removing backup history, but I will only focus on one, sp_delete_database_backuphistory.

This little gem, if used properly, will remove the backup history for a particular database. It came in very handy recently when taking a database offline to be decommissioned started tripping the policy checks in place, particularly the one I mentioned above that verifies a database has been backed up within the last 24 hours.

The Code

/*This code deletes backup history for a specific database*/
use msdb
go
exec sp_delete_database_backuphistory 'yourdatabasename'
go

The Tables Affected

  • backupfile
  • backupfilegroup
  • backupmediafamily
  • backupmediaset
  • backupset
  • restorefile
  • restorefilegroup
  • restorehistory
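
Before removing anything, it can help to see how much history msdb is actually holding. Below is a minimal sketch against the backupset table listed above; run it first so you know what you are about to delete.

/*Hedged sketch: how much backup history exists per database before removing it*/
use msdb
go
select database_name,
       count(*) as backup_count,
       min(backup_start_date) as oldest_backup,
       max(backup_finish_date) as newest_backup
from dbo.backupset
group by database_name
order by backup_count desc
go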

Conclusion

There are tools at one's disposal to help in situations like this. Microsoft has provided a number of system stored procedures, but it would not be prudent to start running scripts or executing procedures without first knowing what they are going to do to your systems. Never run anything in production, or any environment for that matter, without first testing it in your local sandbox.

Add this script to your repository; you never know when you will need it.

Adversity – How Do You Handle It?

Adversity, by definition, is a difficult situation or condition; a misfortune or tragedy. When that time comes, how do you handle it?

If replication breaks, causing the business to not get the data they need in a timely fashion, and you are the lone conductor driving the train down the tracks while everyone is standing at your desk, how do you handle it?

If your backups failed the previous night, and for some reason the failure notification never reached you and you had no knowledge of the situation until days later, how do you handle it?

Someone calls and says they can't access their systems, and you find out something is causing tempdb to fill up; how do you handle it?

Better yet, you find out that a certification you were working toward, the MCM, has been retired; how do you handle it?

Response

I can only speak for myself and no one else. I fell into the same category as a lot of people across the nation: working toward the MCM certification. Let's face it, last I checked only a very small percentage ever obtained it, and being the competitive person I am, I wanted to reach that goal; not for anyone else, but for myself and the standards I set for myself.

At first I was upset, and like so many others, not so much about the retiring of the program as about how the news was delivered by Microsoft. I purposefully waited to write a blog post, as I didn't want emotions to cloud my view of the topic.

Adversity has now hit a lot of people. For me in particular, I choose not to dwell on it but to wait and see what Microsoft offers next, restructure my goals, and move forward. I am a bit biased, but I get to work with some of the best technology day in and day out and am constantly in learning mode. The things we can control are how we handle adversity when it rears its ugly head, the attitude with which we conduct ourselves through it, and the character we build from it.

Am I happy about the retirement? No. At the end of the day, though, I have a lot to be thankful for. Looking back on the studying, lab testing, and hours of working toward the certification, it pushed me to learn more than I knew the previous day and the day before that. I'm eager to see where the next chapter of my learning leads.

Adversity: how will you handle it? Each of us is different; how we handle it doesn't make one person better than the next, but I do encourage others to push through adversity when it comes. You'll be a stronger person for doing so.

Back to Basics

I can sum up this blog post in two words: the basics.

What do you mean?

I cannot be the only one in this boat. There have been times where I, as a DBA, have overcomplicated resolutions when there was a very simple answer. Come on, you know what I'm referring to: the basics. Looking up the exact meaning, the dictionary tells me it is the "fundamental or basic principle". I took that meaning this week, looked deeper into how I attack some DBA-related items, and I keep coming back to "keep it simple" and "get back to the basics".

The Basics

When I say basics, I'm coming at it from a DBA standpoint. Do you have your own checklist that you go by for daily, monthly, quarterly, and yearly checks? If not, this is as good a time as any to start.

What are some of the things to include in your basic checklist?

Some items to include, but not limited to, are:

  • Backup processes and alerting upon failure (a quick last-backup check is sketched just after this list)
  • Review of jobs (if any have failed)
  • Anything in the SQL Error Logs?
  • If you use Policy Based Management (PBM) and receive reports – any action to take?
  • Security logs – have you checked?
  • If issues are found, how do you handle them? Just don't sweep them under the rug
  • Hopefully you have something in place that tells you when you are running low on storage; if not, you might want to start thinking about implementing something. By the way, how is the space looking?
  • Is your index fragmentation process working properly?
  • Remember those backups you checked? Are you spot checking any restores to validate them?
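
Here is a minimal sketch of the kind of last-backup check I have in mind; the type 'D' filter limits it to full backups, and it assumes msdb still holds the backup history.

/*Hedged sketch: most recent full backup per database, assuming msdb still holds the history*/
SELECT d.name AS database_name,
       MAX(b.backup_finish_date) AS last_full_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
    ON b.database_name = d.name
    AND b.type = 'D'
GROUP BY d.name
ORDER BY last_full_backup;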

Summary

The list above is just a simple one to get started with; get back to the basics of DBA work. It's a fun and enjoyable ride, but keep one thing in mind as you traverse the SQL terrain: don't overcomplicate things. You will find many checklists on the web by renowned DBAs that you could model a standard from if you don't already have one. I'm a big believer in automation; automate what you can to help you become more efficient and streamlined. Get the reports emailed to you when you start your day, and at the very least make sure notifications are set up on the running jobs in case of failure.

Keep it simple, get some standards in place, and above all remember the basic principles. Don't overcomplicate issues, and when you run across them, don't sweep them under the rug and wait for the next DBA to come along and fix them; take the time to fix them correctly.

Giving Credit Where Credit Is Due

Have you ever stopped and looked at the SQL Community as a whole and all it has accomplished? Better yet, have you ever researched a problem and found that someone else had already experienced it and provided a solution? If you blog or post on social media and present that solution as if it were your own, is that right? The answer to the last question is no.

The Community

I have never been associated with a community like the SQL Community, where everyone is eager to share knowledge or advice toward an answer or solution to an issue. More often than not, solutions are provided on a blog similar to this one, in a newsletter, or on Twitter; it is very easy to take that research, use it, and pass it off as your own work. I've seen some do it and not think twice about it.

The Cost

The SQL Community has a plethora of great minds, some of which you will find over to the right in the DBA Blog section. Think about the countless hours everyone puts into figuring out solutions to issues and then sharing them. If someone's work is taken and used for another's gain time and time again, eventually the well might dry up.

The Call to Give Credit Where Credit Is Due

The solution is basic and simple: if you use something that someone else has written, give them credit by referencing it. Below are some ways to do this:

  • Script – think of the countless scripts that have been provided over the years; some that come to mind offhand are Brent Ozar’s sp_Blitz, Adam Machanic’s sp_whoisactive, Kendra Little’s sp_BlitzIndex, and Glenn Berry’s awesome diagnostic queries. It doesn’t have to be the ones I’ve mentioned here; it can be any script someone has provided, however big or small it may be. The point is to reference their work, because they are the ones who provided it.
  • Blog Information – a vast majority of my DBA colleagues have blogs they update daily, weekly, or monthly with a ton of information on them. If you are using this information or passing it along, just note where you got it.
  • Email the Author – email the author and ask whether it is okay to use their work on your site if you reference their name. A couple of reasons I mention this: 1) it is a display of respect, and 2) it shows the author that their efforts in sharing their knowledge are appreciated.

The Awareness

Not everyone is perfect, I understand that, but over the course of the last few months I have seen many occurrences that could have been avoided, where hard-working data professionals have been bitten by their work being taken and used for someone else’s personal gain. Think about this question: we are data professionals in some form or fashion, so how is using someone else’s work without referencing it professional? Let’s keep our community strong and thriving.

Closing with the Thanks

A big thanks to the whole community for the relentless time, effort, and countless hours spent creating solutions for those of us who seek them for everyday issues. If we make a mistake along the way, because we are human, may we own up to it, learn from it, and move on with integrity and character.

How’s Your Database Mail?

From time to time I field questions regarding Database Mail usage within SQL Server. Questions come from all over about how to identify what the mail process is doing or has done. Sure, you can send some test mails through the nice GUI, but that is not what this post is about. I enjoy T-SQL and looking inside SQL the old-fashioned way, so to speak, so I use some simple queries that a colleague of mine recommended to me.

Database Mail in and of itself is a useful tool; it allows for notifications of failed SQL Agent jobs, for instance. The messages themselves can contain a plethora of information that can assist in troubleshooting a variety of issues. Microsoft describes it this way: “Database Mail is designed for reliability, scalability, security, and supportability.”

**NOTE** Database Mail is not enabled by default; it has to be configured and turned on. The information below assumes that Database Mail is already set up. For information on how to set up Database Mail you can go here.
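
With a profile configured, a quick end-to-end test can also be done from T-SQL rather than the GUI. This is just a hedged sketch; the profile name and recipient address below are placeholders, not values from any real setup.

/*Hedged sketch: a simple test send; swap in your own profile name and recipient*/
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'YourMailProfile',
    @recipients = 'dba@yourdomain.com',
    @subject = 'Database Mail test',
    @body = 'If you can read this, Database Mail is delivering.';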

To give a brief overview, the script below is broken out into 7 mini scripts; these consist of checks against Database Mail along with the procedures for stopping and restarting it. Please note the disclaimer; I hope this helps with some of the questions I've received thus far regarding Database Mail.

/**************************************************************************************************************

Disclaimer: Do not execute code found on the internet without testing on your local or testing environment. Running any code in a production environment that you find on the internet is not an acceptable practice and this site is not responsible for any repercussions that may follow if you choose to do so.

Scripts below are numbered; the corresponding numbers will give you a description of what they are utilized for.

1. The status of the Database Mail. Possible values are Started and Stopped (msdn article on sysmail_help_status_sp)

2. Stops the database mail queue that holds outgoing message requests (msdn article on sysmail_stop_sp)

3. Starts the database mail queue that holds outgoing message requests (msdn article on sysmail_start_sp)

4. Shows all the mail items

5. Shows all the unsent mail items

6. Shows all the sent mail items

7. Shows all the failed mail items

**************************************************************************************************************/

USE msdb

GO

/*1.*/ EXECUTE sysmail_help_status_sp

/*2.*/ EXECUTE sysmail_stop_sp

/*3.*/ EXECUTE sysmail_start_sp

/*4.*/ SELECT * FROM dbo.sysmail_mailitems WITH (NOLOCK)

/*5.*/ SELECT * FROM dbo.sysmail_unsentitems WITH (NOLOCK)

/*6.*/ SELECT * FROM dbo.sysmail_sentitems WITH (NOLOCK)

/*7.*/ SELECT * FROM dbo.sysmail_faileditems WITH (NOLOCK)
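
Building on script 7, here is a hedged sketch that joins the failed items to the Database Mail event log so you can see the error text for anything that failed in the last 24 hours; adjust the window to suit.

/*Hedged sketch: failed mail from the last 24 hours with the error text from the event log*/
SELECT f.mailitem_id,
       f.subject,
       f.send_request_date,
       l.description AS error_text
FROM dbo.sysmail_faileditems AS f
LEFT JOIN dbo.sysmail_event_log AS l
    ON l.mailitem_id = f.mailitem_id
WHERE f.send_request_date > DATEADD(DAY, -1, GETDATE())
ORDER BY f.send_request_date DESC;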

T-SQL Tuesday #41 Presenting and Loving It

This month's T-SQL Tuesday party is being hosted by Bob Pusateri (a.k.a. SQL BOB) (B | T). Bob's topic for this month is intriguing to me, as I wrestled with it for some time when first starting out: presenting!

For me personally, I really started to think about presenting after returning from my first PASS Summit (2011). Sure, I've given speeches in college and talked openly in discussions, but something drew me to speaking on a topic I enjoy working with every day ~ SQL.

  • First Presentation – my first presentation went surprisingly well. I had been in front of large crowds on many occasions so talking wasn’t my initial problem; my initial problem was knowing that the attendees came to get info and to learn, and I needed to be sure I was prepared to give that to them.
  • Investment – I started to look at speaking as investing time in others. I cannot begin to tell you how beneficial and fortunate I have been to study, learn, and tap into some of the best minds in the SQL industry. They have taken the time to invest in me to further my knowledge and career; I should do the same and help others along the way if I can. You see that is what is so great about the SQL Community; the community is there and always helps.

I know there are other people out there who are hesitant to take the plunge into speaking. I have personally experienced some great benefits from it. For those who are hesitant, these might be worth considering.

  • Gain knowledge – when you are presenting on a topic, it drives you to gain more knowledge about it: knowing the ins and outs and working through test results and scenarios.
  • Networking – meeting new faces in the community; getting involved with peers
  • SQL Saturdays – attending new SQL Saturdays in different locations
  • Giving back – as I already stated giving back to the community that has helped me in so many ways over the years

Speaking is like anything else you set your mind to: practice and you will get better. If you haven't taken the plunge yet, I encourage you to try it once and see how you like it; you might find something out about yourself.

Want to host a party?

So you want to know if you are eligible to host a T-SQL Tuesday party? Glad you asked; you can contact Adam Machanic (B | T), the founder of T-SQL Tuesday.

Why is Count(*) Taking So Long

Phone rings... I answer... a DBA from a third-party vendor has supplied someone with two scripts. The first script is a simple insert with a where clause:

insert into [databasename].[dbo].[tablename]
( column1,column2, column3, column4 )

select column1, column2, column3, column4
from [databasename].[dbo].[tablename]

where column1<[integeramount]
and column1>[integeramount]

First question I ask: how much data is loading into the table? Answer: millions of records; the vendor gave us a script to see if the number is increasing. What is the script; do you have it? Sure... script below:

select COUNT(*) from tablename

I was reminded of something I came across several years ago about this very scenario, so I figured why not put it to the test. I will try to explain to the best of my ability why the second query was taking an hour to run.

The first problem I see right off hand is that the COUNT(*) statement has to do a full scan; it has to touch every row to calculate the result. Take this statement and run it against a table with several million rows and a ton of reads on it, and you have a recipe for sitting, watching, and waiting for the result set to return.

How Do I Get Around This?

It’s not that difficult, and here is a nice trick that provides a quick solution. I’m a huge fan of DMVs, and it just so happens you can use one to return row counts for all tables in a database, or for specific tables:

SELECT o.name,
ddps.row_count
FROM sys.indexes AS i
INNER JOIN sys.objects AS o ON i.OBJECT_ID = o.OBJECT_ID
INNER JOIN sys.dm_db_partition_stats AS ddps ON i.OBJECT_ID = ddps.OBJECT_ID
AND i.index_id = ddps.index_id
WHERE i.index_id < 2
AND o.is_ms_shipped = 0
--AND o.name = 'tablename' /*UNCOMMENT AND PLUG IN TABLE NAME FOR SPECIFIC TABLE INFO*/
ORDER BY o.NAME

The result will give you each table name with its row count.

Don’t be alarmed by using the system objects. Unlike some alternatives, this row count does not depend on statistics being updated, so the count is accurate. On this 132 million row table the result set returns immediately.

Next time you get stuck waiting on a COUNT(*) statement to run, think about using a DMV; for a listing, check out the categorized documentation Microsoft provides.

It is always nice to have some tricks up your sleeve; especially when dealing with outside vendors.

PowerShell Tactics

Last month's T-SQL Tuesday party dealt with PowerShell. I debated whether or not to post a topic, but in the end I didn't feel I should post something I had never really tried before. So I spent a few weeks going through some basic PowerShell scripting, and what I ended up with was a way to check my databases from a list of servers contained in a text file, along with retrieving failed job history based on the same server listing.

I was impressed enough that I started to use some of this on my on-call days, where I have automated reports dumped out to Excel waiting for me when I get in for review.

While there are many variations of this out there, I researched them and put some of my own twists in when creating the scripts. Today I only plan to show one, which uses the SMO assembly to retrieve basic database information.

The Script

First I want to open Excel:

#Create a new Excel object using COM
$Excel = New-Object -ComObject Excel.Application
$Excel.visible = $True
$Excel = $Excel.Workbooks.Add()
$Sheet = $Excel.Worksheets.Item(1)

Now that excel is open I can begin to retrieve the list of databases from the text file and create all the header information:

#Counter variable for rows
$intRow = 1

#Read thru the contents of the SQL_Servers.txt file
foreach ($instance in get-content "C:\Server\Servers.txt")
{

#Create column headers
$Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
$Sheet.Cells.Item($intRow,2) = $instance
$Sheet.Cells.Item($intRow,1).Font.Bold = $True
$Sheet.Cells.Item($intRow,2).Font.Bold = $True

$intRow++

$Sheet.Cells.Item($intRow,1) = "DATABASE NAME"
$Sheet.Cells.Item($intRow,2) = "OWNER"
$Sheet.Cells.Item($intRow,3) = "AUTOSHRINK"
$Sheet.Cells.Item($intRow,4) = "RECOVERY MODEL"
$Sheet.Cells.Item($intRow,5) = "LAST DATABASE BACKUP"
$Sheet.Cells.Item($intRow,6) = "SIZE (MB)"
$Sheet.Cells.Item($intRow,7) = "SPACE AVAILABLE (MB)"

#Format the column headers
for ($col = 1; $col -le 7; $col++)
{
$Sheet.Cells.Item($intRow,$col).Font.Bold = $True
$Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
$Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
}

$intRow++

Now the fun part; time to get all the information and watch your excel file go:

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null

# Create an SMO connection to the instance
$s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance

$dbs = $s.Databases

#$dbs | SELECT Name, Owner, AutoShrink, RecoveryModel, Last Database Backup, Size, SpaceAvailable

#Formatting using Excel

ForEach ($db in $dbs)
{

#Divide the value of SpaceAvailable by 1KB
$dbSpaceAvailable = $db.SpaceAvailable/1KB

#Format the results to a number with three decimal places
$dbSpaceAvailable = "{0:N3}" -f $dbSpaceAvailable

$Sheet.Cells.Item($intRow, 1) = $db.Name
$Sheet.Cells.Item($intRow, 2) = $db.Owner

#Change the background color of the Cell depending on the AutoShrink property value
if ($db.AutoShrink -eq "True")
{
$fgColor = 3
}
else
{
$fgColor = 0
}

$Sheet.Cells.Item($intRow, 3) = $db.AutoShrink
$Sheet.Cells.item($intRow, 3).Interior.ColorIndex = $fgColor
if ($db.RecoveryModel -eq "1")
{
$Sheet.Cells.Item($intRow, 4) = "FULL"
}
elseif ($db.RecoveryModel -eq "3")
{
$Sheet.Cells.Item($intRow, 4) = "SIMPLE"
}
elseif ($db.RecoveryModel -eq "2")
{
$Sheet.Cells.Item($intRow, 4) = "BULK-LOGGED"
}
if ($db.LastBackupDate -eq [datetime]::MinValue)
{
$Sheet.Cells.Item($intRow, 5) = "Never"
}
else
{
$Sheet.Cells.Item($intRow, 5) = $db.LastBackupDate
}
$Sheet.Cells.Item($intRow, 6) = "{0:N3}" -f $db.Size

#Change the background color of the Cell depending on the SpaceAvailable property value
if (($db.SpaceAvailable / 1KB) -lt 1.00)
{
$fgColor = 3
}
else
{
$fgColor = 0
}

$Sheet.Cells.Item($intRow, 7) = $dbSpaceAvailable
$Sheet.Cells.item($intRow, 7).Interior.ColorIndex = $fgColor

$intRow ++

}

$intRow ++

}

$Sheet.UsedRange.EntireColumn.AutoFit()
cls

Conclusion:

I added and removed various objects in the information I was pulling back. There are a lot of great posts out there that show you what the objects are, and many books that provide the syntax to accomplish all sorts of things with PowerShell.

I was not happy to miss the party last month, but having gone through the exercise of diving in and doing some things with PowerShell, I am glad I did. I look forward to doing more involved work with it going forward ~ I've only begun to scratch the surface!

Select * Syndrome

Something that I have seen lately over and over again, and even ran into this morning, is a practice that I would say is a pretty bad habit in SQL... the dreaded Select * syndrome.

This method is heavily used for ad hoc querying, and I've seen it used in some troubleshooting scenarios, but I don't have room for it in a production environment embedded in functions, procedures, or views.

To me it is a wasteful tactic for bringing back what is needed; it can produce unwanted scans or lookups when in some cases all that is needed is index tuning. I'm a big fan of bringing back only what you need instead of a tremendous amount of data. One can also make an argument about all the unnecessary overhead it produces.

I cannot begin to tell you how many times I have deployed something only to find out the schema has changed and a select * in a view left in place years ago, from coding done long past, is the culprit.

For example, one that I have seen within the past couple of months is a view:

Select *

From table1

Union All

Select *

From table2

This was being used quite frequently and is just asking for trouble and poor performance. There will always be excuses as to why it wasn't done differently, but in the end you will have to go back in and clean it up, so it is best to think the process through at the beginning instead of at the end.
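
Here is a minimal sketch of the safer shape, using hypothetical column names; swap in whatever the view actually needs to expose.

/*Hedged sketch: name the columns so a schema change surfaces at deploy time, not at runtime*/
Select column1, column2, column3
From table1

Union All

Select column1, column2, column3
From table2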

Can I ever use it?

Sure... I've seen it used in If Exists clauses many times over, and from my research and what I know, SQL does not expand the column list there in the execution plan; if you leverage SQL the correct way, it is more than powerful enough to handle what you need.
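
A typical case, with a hypothetical table, where the * is harmless because EXISTS only tests for the presence of a row:

/*Hedged sketch: EXISTS checks whether any row matches; no columns are actually returned*/
IF EXISTS (SELECT * FROM dbo.SomeTable WHERE SomeColumn = 1)
    PRINT 'Row found';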

Tools to fight the good fight…

My source control is TFS, and I like the newest version, as you can set up controls so that if a Select * is found it will break the build in dev, forcing it to be resolved.

If you haven't already, download the free version of SQL Plan Explorer provided by SQL Sentry. Execute the queries both ways, with the select * and with designated columns pulled back, and review the execution plans; you may be surprised at the outcome. If you are old school, that is fine too: analyze it in SSMS and see what you find.

Dashboard Time

I was fortunate enough to attend the PASS 2011 Summit in Seattle. If you do not know what I am speaking of when I say PASS, I encourage you to check it out; PASS stands for Professional Association for SQL Server. The yearly event speaks for itself, and I could dedicate a whole blog post to just that, but no; I'm going to speak of something I picked up while at the conference.

SQL Server MVP – Deep Dives Vol 2

This book has such a plethora of valuable information and golden nuggets that I figured I'd implement something from it on my own that I can use every day. There are countless good authors in this book.

The Dashboard

I'm on a team that runs a full range of SQL Servers from 2000 to 2012, on both physical machines and VMs, and chapter 12 stood out to me the other day, so I decided to try it out. I've built reports and metrics in the Utility Database (an idea spawned after attending a session by Chris Shaw (B|T)), but I started thinking about building a dashboard off that information.

Pawel Potasinski (B|T) wrote a chapter in this book called “Build your own SQL Server 2008 performance dashboard”; as I read through the chapter, ideas started to spin in my head, and before I knew it I was giving it a try.

I combined some of his ideas with the metrics I pull back using Glenn Berry's (B|T) Diagnostic Queries and built a standard dashboard for myself that gets generated every morning when I walk in the door. In it I include some of the basics such as CPU, page life expectancy (PLE), and % log used. Pawel uses DMVs and SQLCLR to get the performance counters; I've started to incorporate some Extended Events results in there as well.
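
As one hedged example of the kind of counter pull involved, page life expectancy can be read straight from a DMV; this is a generic sketch, not the exact query from the chapter or from my dashboard.

/*Hedged sketch: current page life expectancy from the buffer manager counters*/
SELECT [object_name],
       counter_name,
       cntr_value AS page_life_expectancy_seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
    AND [object_name] LIKE '%Buffer Manager%';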

Some additional items I'll be incorporating in the near future are further drill-downs into the details of the counters themselves and sharing the report with my team as a custom report. Once I have everything completed, my plan is to make another post with the screenshots, code, etc.

In the end I would say I was not fully taking advantage of what SQL Server has to offer... are you? I've enjoyed digging further into Reporting Services and what I can leverage from it in administering the databases I'm responsible for. Take a look at what your processes are; if something isn't automated, ask how you can better leverage your time and whether it can be automated.