Category Archives: SQL Server

Headache + Pain <> Red Gate's SQL Search

Recently, I started diving into my utility belt. There are tools I've had at my disposal for some time, some I have tried and not gone back to, and others I use every day. There's definite potential in exploring these utilities over the next couple of months, and one I touched on in the last few weeks that I want to expound on a bit more today is Red Gate's SQL Search utility.

I've become accustomed to this utility as my quick-hit, Sherlock Holmes style of investigating all sorts of issues that a common data professional may encounter throughout the work day.

The Utility

SQL Search is a free SSMS add-in that pretty much lets you scan across all databases for a plethora of information about all objects. How do you access this utility?

Once downloaded and installed, you will find it on your SSMS toolbar as indicated below:

SQLSearch1

By clicking the SQL Search icon, or by using the hot key Ctrl+Alt+D, you will open the SQL Search tab in your current session, shown below:

SQLSearch2

The Search Begins

Once you have this open I’d like to point out a few items.

  1. In the last image you will notice that in my Object Explorer I have my (local) instance selected and highlighted. If I had multiple instances in my Object Explorer, SQL Search would open the tab on whichever instance was highlighted.
  2. I can now begin my search. I have multiple options at my disposal for what to check, and I can set limitations, or filters if you will, by selecting what type of object I'd like to filter on, as noted below:
  3. The last two options are self-explanatory as to which database you are searching, but I do want to point out the last option, where I have (local) in my drop-down. Currently I do not have a second instance fired up in my Object Explorer; if I did, you could click this drop-down and hop over to the other instance. This feature offers nice flexibility when you have to work with multiple instances at one time.

SQLSearch3

By Jove Watson We’ve Found It

Now here is where the magic begins. For example, I know I have a table called Release, but I can't remember where I saw it; or maybe you've left a meeting with a list of schema names for a database whose lay of the land you've never seen. Well, here you go: if I type in the term Release, set it for all objects, and scan all databases, my results are below:

SQLSearch4

…but it gets better. If you click the link that says "Select object in Object Explorer", it will take you directly to the object's location inside the database for review.

SQLSearch5

Fancy but now what?

SQL's version of Scotland Yard says this is all well and good, but can it do more? Let's step back and think about this for a moment. The whole goal, with any business I hope, is to make processes and procedures more efficient, and this utility makes a strong case for the data professional to add it to their arsenal. Here are some real examples of what I use this tool for that haven't been mentioned thus far:

  • Code reviews happen off and on. Have you ever run across the infamous /* TO DO */ comment? Heck, I've put those comments in my own stored procedures, noting I need to finish something at a later time. I can quickly scan a database and find these gaps in seconds.
  • SELECT *; don't fool yourself. While these are frowned upon, they are still used. This tool has helped me pinpoint such code in seconds, which then prompts my conversation with the author of the code or update.
  • I've used this utility as my central hub for branching out when I only know parts of object names. It gives me quick insight into the schematics of the database and what dependencies objects have prior to making any changes.
  • Efficiency – this one word carries so much weight. I wish I had this utility when I was first starting out; for me this product is definitely a game changer, and did I mention it is free? Why not take advantage of it? Go check it out now; you know you need it in your data professional tool belt.
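Under the hood, a search like this boils down to querying SQL Server's system metadata. A rough hand-rolled sketch of the idea (my own illustrative query, not how Red Gate implements it, and the search terms are just examples):

```sql
-- Find any module (stored procedure, view, function, trigger) in the
-- current database whose definition contains a given search term.
SELECT
    OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
    OBJECT_NAME(m.object_id)        AS ObjectName,
    o.type_desc                     AS ObjectType
FROM sys.sql_modules AS m
JOIN sys.objects     AS o ON o.object_id = m.object_id
WHERE m.definition LIKE '%TO DO%';   -- swap in 'SELECT *' or any other term
```

SQL Search does far more than this (all databases, all object types, Object Explorer integration), but the query shows why those /* TO DO */ hunts take only seconds.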

Are You Using SQL Search Already?

If you are already utilizing this utility, then drop Red Gate a line and let them know your thoughts on the subject. Red Gate always enjoys getting feedback, and I have never seen a more thorough company at taking users' use cases and everyday occurrences and figuring out how to make their products better.

Contact Red Gate

More utilities from Red Gate

If you like what you've seen thus far, then you haven't seen anything yet. This will become a mini-series I'm proud to work on with some other individuals who are part of the Red Gate family. To see their twist, please check out:

Julie – SQL Tools Review: SQL Search
Mickey – On a SQL Quest Using SQL Search by Red Gate
Jeffrey – How SQL Search saves you time

Entering the SQL Matrix

I know why you are here and I know what you've been doing. You hardly sleep, and night after night you sit by the computer; you are looking for what they call ~ The SQL Knowledge. I know, because I too have looked for the knowledge, and when that knowledge found me it told me I was not really looking for it; I was simply looking for an answer. It's the question that drives SQL Data Professionals, and it is the question that brought you here……

The SQL Matrix is comprised of many components, and with that come SQL Data Professionals from all walks of life, all over the world. It has been amazing to see components of people's skill sets come together to provide different viewpoints on various topics.

Well, I'm about to welcome everyone to the Desert of the Real, where real SQL Data Professionals collaborate with one another on various topics related to SQL. In what you will come to know as MCJJ (Mickey Stuewe, Chris Yates, Jeffrey Verheul, and Julie Koesmarno), you will find four distinct Data Professionals who seek to gain more knowledge inside the SQL Matrix and provide thoughts and real-life scenarios in a series of blog posts.

With that said I now must tell you; you have two choices:

  1. You can take the blue pill and go back to sitting in front of your computer all day long, stuck in the mundane rut that happens to all of us.
  2. You can take the red pill and join MCJJ in this collaboration effort, and hopefully pick up some new and interesting SQL tips along the way.

The option is yours; what will you decide?

Tomorrow will mark our first collaboration; I'm happy to announce that it will be on Red Gate's utility, SQL Search.

Check out my peers' reviews tomorrow, and also check out the others' collaboration release posts.

Stay Tuned

Please visit the following links to see the unique views of my collaborators.

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

T-SQL Tuesday #051: Place Your Bets

This month's T-SQL Tuesday block party comes from the renowned Jason Brimhall (blog|twitter). You see, each month a SQL Community member hosts this block party, and this month's theme is "Place Your Bets".

So you want to gamble? Come on up to the SQL table and place your bets. Any Data Professional is welcome, junior level up to senior; all money is acceptable at the SQL table.

Okay, I'm in; what are we betting on today? Well, you are in luck, my friend: today's bet is on backups. Sounds simple enough, doesn't it? Sure, that sounds like fun, I'm all in… but wait, what about backups?

You're lucky you asked; otherwise you'd be called a sucker and I'd just have taken your all-in for the house, "Little Joe".

The Scenario

It was a dark grey morning… oh wait, that's a different story. Let's say you have a plethora of databases that are all business critical, and you have automated tasks that back up these databases. If something fails during the backup process, a failure notification is sent out, letting the data professional know their process has failed so they can take a look and fix it. All is well, right? Most would say yes, some would say no, and then there are some, the gamblers, who say who the heck cares. You have the backup process in place ~ let's roll.

The Gamble

I bet on that scenario early in my career. I went all in with the house and, you know what, it didn't pan out too well for me. Why, you ask? Gambling on whether or not my backups were solid and good opened my eyes to something I knew but didn't really take into consideration in the beginning stages of my career. I had a critical database being backed up… the phone rings. The proud DBA picks up the phone… yes, we have an issue and we need to look at our backup for date x. Sure thing, I've got it right here; I'll restore it and we'll take a look.

I go to restore, and the backup is corrupt. Initially I'm thinking, well, that isn't good. It was then that I had to go back and tell the business that the backup was corrupt and I would need to go a day before or after to get the information. But wait, Mr. Gambler, what about transaction log backups, did you have those? Um, nope; the business deemed them not necessary and didn't want to pay for the space needed for the growth.

Conclusion

Even after taking precautions with my backups, I still feel the strong need to ensure testing of the backups is being done, whether through an automated process, spot checking, etc. Taking a backup is great, but can that backup be restored? Are the settings on the backup set properly? If you can't answer some or all of these questions, then take time today to do some checking.
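As a minimal starting point for that spot checking, T-SQL itself offers a quick sanity check (a sketch only; the file path is a placeholder):

```sql
-- Confirm the backup set is complete and readable. This catches corrupt or
-- incomplete files, though only a real test restore proves restorability.
RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\MyCriticalDb.bak'
WITH CHECKSUM;

-- Inspect what the backup actually contains before you promise a restore.
RESTORE HEADERONLY
FROM DISK = N'D:\Backups\MyCriticalDb.bak';
```

VERIFYONLY with CHECKSUM is cheap enough to automate after every backup; periodic full test restores to a scratch server are the real proof.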

Each shop is different and will have its own hurdles to climb. With that said, are you all in? Do you want to take that gamble and bet against the house? The business looks at backups as a safety net, or in some cases really doesn't care as long as you have them. To the Data Professional they are much more.

I’ve always been taught to work hard and hone your skill set; for me backups fall right into that line of thinking. Always keep improving, learn from your mistakes. From mistakes comes growth and don’t be afraid to fail. Many successes have come from failures or setbacks.

What is T-SQL Tuesday

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements below. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

SQL Prompt What Is It Good For Absolutely…

Everything. If you've never had the opportunity to use Red Gate's SQL Prompt utility, then you are missing out. Recently I went without it for a day due to some circumstances I created for myself. Of course those actions were not intentional, but it got me thinking: if I were stranded on a desert SQL island, what would be the one utility I would want most in my survivor tool belt?

I posed this question over at SQL Brit's forum (if you haven't been over there yet, check it out; there is some good stuff going on) and got some really good responses back.

Back to the subject at hand: SQL Prompt, and why I like it so much. I am a heavy Red Gate user; I have their bundle and enjoy using all their utilities, as they make my job a lot easier. One of those utilities is SQL Prompt.

What Is SQL Prompt?

SQL Prompt is an add-in for SSMS and Visual Studio, and according to Red Gate, has been designed to strip away the repetition of coding.

Features

There are many features in SQL Prompt, but I won't cover all of them. I want to focus on just a few that have helped me tremendously in fighting the good DBA fight.

Summarize Script option – as you can see below, the script in the query window for my test database, Release, has 3 INSERT statements. If I were working with a very large query, or perhaps going through the highlights of a stored procedure, I can get a quick summary of what I'm dealing with:

RedGate1

If you click on one of the INSERT scripts, you will see it highlighted in your query window for quick review:

RedGate2

Just how do you get to this Summarize Script option? If you look at your toolbar inside SSMS, you will see the SQL Prompt menu available. From there it is as simple as selecting the Summarize Script feature:

RedGate3

Snippets – these are nice shortcuts that let you get to code quicker, and you can save your own snippets for frequently used code. For example, some of the snippets that are quick to learn are:

  • SSF, which expands to SELECT * FROM
  • DF, which expands to DELETE FROM
  • CT, which expands to CREATE TABLE
  • II, which expands to INSERT INTO

RedGate4
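For reference, those expansions come out roughly like this once you fill in the blanks (the table and column names here are made up):

```sql
-- SSF -> SELECT * FROM
SELECT * FROM dbo.Release;

-- DF  -> DELETE FROM
DELETE FROM dbo.Release WHERE ReleaseId = 42;

-- CT  -> CREATE TABLE
CREATE TABLE dbo.Release (ReleaseId INT PRIMARY KEY, Name VARCHAR(50));

-- II  -> INSERT INTO
INSERT INTO dbo.Release (ReleaseId, Name) VALUES (1, 'v1.0');
```

Four keystrokes or fewer per statement skeleton adds up quickly over a day of writing T-SQL.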

The list and possibilities could go on, but the mere fact of having this at my fingertips has cut down on a lot of the coding "repetition" that occurs on a daily basis.

Conclusion

What I've shared are just two of the features SQL Prompt has to offer, and we haven't even begun to scratch the surface. The more I think on it, the more I believe I will be doing an in-depth overview of my bundle; I suspect I have features at my disposal that I am not taking full advantage of.

Just this utility alone has formatting shortcuts, navigation shortcuts (Summarize Script is Ctrl+B, Ctrl+S), code-completion shortcuts, refactoring, and much more.

Check out SQL Prompt for more information on Red Gate's website.

What’s your go to utility? If you were stranded on a SQL Desert island what would you want to have with you that you could not survive without?

Utilizing PowerShell With 2012 SSIS Deploys

This venture was first inspired by watching and reading Steve Jones' (Blog|Twitter) PowerShell challenge. I always knew that PowerShell was a powerful tool and one I needed to jump into at some point in my career. I dabbled with it some last year and did some small things, but it became more evident heading into this year that I needed to get more accustomed to all the intricacies and the usefulness I could gain from it as a data professional.

One major pain point in the automation portion of our deploys was deploying .ispac files in SSIS 2012. Granted, the actual deploy is not cumbersome; in my opinion, Microsoft has done a decent job this go-around with the deployment wizard. However, a good developer friend of mine decided he wanted to start out on the PowerShell venture as well, and has since taught me a few things about deployments and how to utilize PowerShell. Who knew that developers and DBAs could actually work on something, and an idea, together; just another step in bridging that gap.

Keep in mind that the code below uses dummy environment variables and parameters for obvious reasons. This is just to provide an example, and for one to tweak it in hopes it may help another data professional at a future time.

The Code

The first line is the most important for this venture. It sets the parameter that determines which action the script will take.

param([Int32]$step = 1)
  1. If the value is 1, the process will create the folder, deploy the package, set up the environment variables and parameters, and reference the parameters (do everything).
  2. If the value is 2, the process will just re-deploy the package.
  3. If the value is 3, the process will change the environment variables/parameters.
  4. If the value is 4, the process will re-deploy the package and change the environment variables.
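With that parameter in place, the script (saved here as Deploy-Ssis.ps1, a file name I've made up for illustration) can be driven from the command line, for example:

```powershell
# Full first-time deploy: folder, project, environment, references
.\Deploy-Ssis.ps1 -step 1

# Subsequent code-only releases: just re-deploy the .ispac
.\Deploy-Ssis.ps1 -step 2
```

This makes the same script usable both for initial setup and for routine releases, which is where the real time savings show up.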

After setting the variable for which action is being taken, the next few steps set up the variables, load the Integration Services assembly, store the assembly namespace to avoid typing it every time, create the connection to the deployment server, and create the Integration Services object.

# Variables
$ProjectFilePath = "This is the location of where the ispac is located. For me it is getting the latest in TFS and then found on my C:\"
$ProjectName = "Insert the project name"
$FolderName = "Folder that the package will reside"
$EnvironmentName = "The environment name of the package"
$RPTServerName = "Server being deployed to"

# Load the IntegrationServices Assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null;

# Store the IntegrationServices Assembly namespace to avoid typing it every time
$ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"

Write-Host "Connecting to server ..."

# Create a connection to the server
$sqlConnectionString = "Data Source=$RPTServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString

# Create the Integration Services object
$integrationServices = New-Object $ISNamespace".IntegrationServices" $sqlConnection
$catalog = $integrationServices.Catalogs["SSISDB"]

Create the folder, deploy the package, set up the environment variable and parameter, reference parameters (do everything)

if ($step -eq 1)
{
    if ($catalog.Folders[$FolderName]) { Write-Host "SSIS folder $FolderName already exists" }
    else
    {
        Write-Host "Creating folder $FolderName ..."
        $folder = New-Object $ISNamespace".CatalogFolder" ($catalog, $FolderName, "Folder description")
        $folder.Create()

        Write-Host "Deploying " $ProjectName " project ..."
        # Read the project file, and deploy it to the folder
        [byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
        $folder.DeployProject($ProjectName, $projectFile)

        Write-Host "Creating environment ..."
        $environment = New-Object $ISNamespace".EnvironmentInfo" ($folder, $EnvironmentName, "Description")
        $environment.Create()

        Write-Host "Adding server variables ..."
        # Add variables to our environment
        # Constructor args: variable name, type, default value, sensitivity, description
        $environment.Variables.Add("Env1", [System.TypeCode]::Int32, 0, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env2", [System.TypeCode]::Int32, 0, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env3", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env4", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env5", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env6", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env7", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env8", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("DatabaseEnv1", [System.TypeCode]::String, "DatabaseName1", $false, "Database1")
        $environment.Variables.Add("DatabaseEnv2", [System.TypeCode]::String, "DatabaseName2", $false, "Database2")
        $environment.Variables.Add("DatabaseEnv3", [System.TypeCode]::String, "DatabaseName3", $false, "Database3")
        $environment.Variables.Add("Reporting_Deployed_Server", [System.TypeCode]::String, $RPTServerName, $false, "Reporting_Deployed_Server")
        $environment.Alter()

        Write-Host "Adding environment reference to project ..."

        # Make the project refer to this environment
        $project = $folder.Projects[$ProjectName]
        $project.References.Add($EnvironmentName)
        $project.Alter()

        Write-Host "Adding reference to variables ..."

        $project.Parameters["Parameter1"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter1")
        $project.Parameters["Parameter2"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter2")
        $project.Parameters["Parameter3"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter3")
        $project.Parameters["Parameter4"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter4")
        $project.Parameters["Parameter5"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter5")
        $project.Parameters["Parameter6"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter6")
        $project.Parameters["Parameter7"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter7")
        $project.Parameters["Parameter8"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter8")
        $project.Parameters["DatabaseParameter1"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "DatabaseParameter1")
        $project.Parameters["DatabaseParameter2"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "DatabaseParameter2")
        $project.Parameters["DatabaseParameter3"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "DatabaseParameter3")
        $project.Parameters["Reporting_Deployed_Server"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Reporting_Deployed_Server")
        $project.Alter()
    }
    Write-Host "All done."
}

Redeploying the package

elseif ($step -eq 2) # If you want to just re-deploy the package
{
    if (-not $catalog.Folders[$FolderName]) { Write-Host "SSIS folder $FolderName doesn't exist. SSIS folder $FolderName must exist before the package can be deployed" }
    else
    {
        Write-Host "Deploying " $ProjectName " project ..."

        $folder = $catalog.Folders[$FolderName]
        $project = $folder.Projects[$ProjectName]
        # Read the project file, and deploy it to the folder
        [byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
        $folder.DeployProject($ProjectName, $projectFile)

        Write-Host $project.Name "was deployed with"
        Write-Host "Description: " $project.Description
        Write-Host "ProjectID: " $project.ProjectID
        Write-Host "All done."
    }
}

Changing the environment variable/parameter

elseif ($step -eq 3) # If you want to change the environment variables/parameters
{
    if (-not $catalog.Folders[$FolderName]) { Write-Host "SSIS folder $FolderName must exist before you can add/modify parameters and environment variables" }
    else
    {
        $catalog = $integrationServices.Catalogs["SSISDB"]
        $folder = $catalog.Folders[$FolderName]
        $project = $folder.Projects[$ProjectName]

        # Drop the environment if it already exists
        Write-Host "Dropping environment ..."
        if ($folder.Environments[$EnvironmentName]) { $folder.Environments[$EnvironmentName].Drop() }

        # Create the environment
        Write-Host "Creating environment ..."
        $environment = New-Object $ISNamespace".EnvironmentInfo" ($folder, $EnvironmentName, "Description")
        $environment.Create()

        Write-Host "Adding server variables ..."
        # Add variables to our environment
        # Constructor args: variable name, type, default value, sensitivity, description
        $environment.Variables.Add("Env1", [System.TypeCode]::Int32, 0, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env2", [System.TypeCode]::Int32, 0, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env3", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env4", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env5", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env6", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env7", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("Env8", [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
        $environment.Variables.Add("DatabaseEnv1", [System.TypeCode]::String, "Reporting_TRS_IN", $false, "DatabaseEnv1")
        $environment.Variables.Add("DatabaseEnv2", [System.TypeCode]::String, "Reporting_TRSODS", $false, "DatabaseEnv2")
        $environment.Variables.Add("DatabaseEnv3", [System.TypeCode]::String, "ReportingAdmin", $false, "DatabaseEnv3")
        $environment.Variables.Add("Reporting_Deployed_Server", [System.TypeCode]::String, $RPTServerName, $false, "Reporting_Deployed_Server")
        $environment.Alter()

        Write-Host "Adding environment reference to project ..."

        <# Make the project refer to this environment
        $project = $folder.Projects[$ProjectName]
        $project.References.Add($EnvironmentName)
        $project.Alter()
        #>

        Write-Host "Adding reference to variables ..."

        $project.Parameters["Parameter1"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter1")
        $project.Parameters["Parameter2"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter2")
        $project.Parameters["Parameter3"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter3")
        $project.Parameters["Parameter4"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter4")
        $project.Parameters["Parameter5"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter5")
        $project.Parameters["Parameter6"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter6")
        $project.Parameters["Parameter7"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter7")
        $project.Parameters["Parameter8"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Parameter8")
        $project.Parameters["DatabaseParam1"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "DatabaseParam1")
        $project.Parameters["DatabaseParam2"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "DatabaseParam2")
        $project.Parameters["DatabaseParam3"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "DatabaseParam3")
        $project.Parameters["Reporting_Deployed_Server"].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, "Reporting_Deployed_Server")
        $project.Alter()

        Write-Host "All done."
    }
}

Redeploy the package and change the environment variable/parameter

 elseif ($step -eq 4) # If you want to Re-deploy the package and change the environment variable/parameter
 {
if (-not $catalog.Folders[$FolderName]) {Write-Host "SSIS folder $FolderName must exist before add/modify parameters and environment variables or deploy package "}
                else {

            $catalog = $integrationServices.Catalogs["SSISDB"]
            $folder = $catalog.Folders[$FolderName]
            $project = $folder.Projects[$ProjectName]

            if ($folder.Environments[$EnvironmentName]) { $folder.Environments[$EnvironmentName].Drop() }

             Write-Host "Deploying " $ProjectName " project ..."
             # Read the project file, and deploy it to the folder
             [byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
             $folder.DeployProject($ProjectName, $projectFile)

            Write-Host "Creating environment ..."
            $environment = New-Object $ISNamespace".EnvironmentInfo" ($folder, $EnvironmentName, "Description")
            $environment.Create()

            Write-Host "Adding server variables ..."
            # Adding variable to our environment
            # Constructor args: variable name, type, default value, sensitivity, description
            $environment.Variables.Add(“Env1”, [System.TypeCode]::Int32, 0, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env2”, [System.TypeCode]::Int32, 0, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env3”, [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env4”, [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env5”, [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env6”, [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env7”, [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
            $environment.Variables.Add(“Env8”, [System.TypeCode]::Int32, 1, $false, "1=run 0=do not run")
            $environment.Variables.Add(“DatabaseEnv1”, [System.TypeCode]::String, "Reporting_TRS_IN", $false, "DatabaseEnv1")
            $environment.Variables.Add(“DatabaseEnv2”, [System.TypeCode]::String, "Reporting_TRSODS", $false, "DatabaseEnv2")
            $environment.Variables.Add(“DatabaseEnv3”, [System.TypeCode]::String, "ReportingAdmin", $false, "DatabaseEnv3")
            $environment.Variables.Add(“Reporting_Deployed_Server”, [System.TypeCode]::String, $RPTServerName, $false, "Reporting_Deployed_Server")
            $environment.Alter()

            Write-Host "Adding reference to variables ..."
            # Bind each project parameter to the environment variable of the same name
            $parameterNames = @("Parameter1", "Parameter2", "Parameter3", "Parameter4",
                                "Parameter5", "Parameter6", "Parameter7", "Parameter8",
                                "DatabaseParam1", "DatabaseParam2", "DatabaseParam3",
                                "Reporting_Deployed_Server")
            foreach ($name in $parameterNames) {
                $project.Parameters[$name].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, $name)
            }
            $project.Alter()
    }
    Write-Host "All done."
}
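As a sanity check after a deployment like the one above, you can query the SSISDB catalog to confirm the project parameters were actually bound to environment variables. This is just a sketch; 'YourProjectName' is a placeholder for your deployed project name:

```sql
-- Parameters with value_type = 'R' are referenced (bound to an environment variable);
-- 'V' means a literal value was set instead
SELECT object_name, parameter_name, value_type, referenced_variable_name
FROM SSISDB.catalog.object_parameters
WHERE project_id = (SELECT project_id
                    FROM SSISDB.catalog.projects
                    WHERE name = 'YourProjectName');
```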

Conclusion

With any code you find on the internet, it's imperative that you test it on a local machine or in a dev sandbox before even thinking of running it in a QA, UAT, or production environment. This script has cut my deployment times down significantly and has helped improve efficiency. I always like a challenge, and combing over this project, testing, and re-testing proved fruitful in the end. For me it was worth it; I hope others can get something out of it as well.

A big thanks to Steven Robinson, the developer mentioned on this project; Steven is one of the hardest working developers I've been around, and the communication between the two units played a key role in achieving success in the end.

Change Sometimes Is A Good Thing

The past several weeks I have started to take inventory. Every data professional should do this, whether on a bi-monthly, quarterly, or yearly basis. On the technical side, what comes to mind is backups, security checks, what can be automated, what worked for you this year and what didn't, etc.

With that said, I came full circle back to this blog, The SQL Corner. This adventure started in 2011, and I have shared the story with you many times over regarding the fire being lit after returning from PASS Summit (by the way, if you have never been to one, check into it; it can be career changing, hint hint). Lately I started to look at this blog in a different light; something has been missing and I couldn't quite put my finger on it.

One of my favorite posts I wrote in 2013 was The Fab Five, about people I look up to in the industry. When going back and revisiting that post, it dawned on me: sure, each one has their own style of writing, and no one person is the same, but they are all tied together by one common thread: content.

2014

This year, 2014, has much promise. That statement in and of itself can be said each and every year. The hitch, or catch, is what I will make of it, or what you will make of it. I've become more focused on the SQL community, on being a SQL DBA day to day, and on simply being a data professional than ever before. While 2013 was a great year, I want 2014 to be an exceptional one.

With that said, The SQL Corner will be getting a face lift here and there. The focus, or theme, for this year is content. I want to focus on the message at hand and relay it to readers to the best of my ability, all the while making it easy to get an answer to an underlying question one might not have known the answer to, or better yet, provoke thought on an issue so one can derive a solution.

Learning

I look back at 2013 and realize that perfection is just a word in our industry. Not one person is perfect. We all make mistakes, daily, however continuous learning is key. While content will be at the forefront I will also be sharing some learning aspects that I am going through as well.

A few things that have already come to mind are noted below:

Steve Jones came up with his PowerShell series, which has challenged me to dive a bit more into PowerShell; I'll be sharing some of my thoughts on that.

Grant Fritchey has spoken multiple times on Azure, which I am going to dive into this year.

My own tool belt; I get asked often by people from all over what I have in my arsenal to conquer issues within SQL.

Many more exciting topics are forthcoming.

Structuring

This year will bring much more structure to the blog. John Sansom brought to light the calendar idea for blogging, which I plan to follow. A lot of times I pull from my OneNote pool of ideas, but I think structuring my posts will help me streamline a bit more, along with providing the thought-out content I desire to put forth for everyone.

**You can find this calendar idea and much more over at SQL Brit's forum; great content there. Why not register today and get your learn on?

Conclusion

2014 is game time. I'd like to take The SQL Corner to a new level, and for that matter my SQL presence in the community. I challenge not just myself but you as well: what kind of year will you make it? Will you stay stagnant, going with the status quo, or will the fire be lit so that you start challenging yourself to step outside the comfort zone and box?

Let's grow this community that is already a great one; heck, let's blow the doors off and make it exceptional.

Thank you all for a great year; I look forward to this one.

If you could give a DBA just one piece of advice, what would it be?


This post is part of the SQL Community Project #DBAJumpStart by John Sansom.

“If you could give a DBA just one piece of advice, what would it be?”

John asked 20 successful and experienced SQL Server professionals this exact question. I share my own thoughts with you below and you can find all our answers together inside DBA JumpStart, a unique collection of inspiring content just for SQL Server DBAs. Be sure to get your free copy of DBA JumpStart.

Two Essential Pieces to Your Success Puzzle

“If you could give an aspiring DBA just one piece of advice what would it be?”

This is a question that has stood the test of time, one asked by many aspiring DBAs on their journey. Some might say you need to specialize and be good at one technical aspect such as replication, ETL processing, database mirroring, disaster recovery, or log shipping. All the technical aspects a DBA must learn and endure along the journey are prevalent and a must for success, but over my thirteen-year career to date, I can see taking a new DBA and, if they are adaptable, teaching them along the way.

For me, there are two very important pieces to the puzzle that many miss, few obtain, and those who do obtain them are humble in their practice.

Hunger

Most people will look at the title of this section and think, "What in the world is this guy talking about, hunger?" No, I am not speaking of food in the literal sense, but rather: do you have the hunger and the drive to learn?

I've been around long enough to know and understand the persona of the all-knowing DBA; do they exist? Sure they do. But I've noticed something different about the more prevalent ones: they are the ones who have a hunger, a desire to better their skill set day in and day out.

There will be many days on the journey when the DBA will experience frustration, as the DBA wears many hats along the way; however, each challenge that presents itself can be viewed as an opportunity to learn and find a resolution. Write your own learning destiny; success isn't handed to you. It is something you have to want and strive for.

Be Exceptional

You may or may not have heard the question, "How do I become an exceptional DBA?"

Being exceptional is going above and beyond the desired task at hand. This can have a broad meaning: how are your communication skills with others, such as the business, development, release management, and QA teams? When you are faced with an issue, do you follow it through to the end, or do you wait for someone else to help? Is there something else noteworthy in your field you can look to in order to gauge where you are professionally?

Being exceptional is, at times, doing things that no one else wants to do. For example, at the places I have worked in my career, one of the first things I tell my boss is, "I'd like to have the things that no one else wants to do." Guess what: so far, a lot of times that has been documentation on the system and processes, which uncovers a plethora of information about the environment.

I have been asked by many people, how do I get there? How do I achieve being exceptional? Each individual is different, and the circumstances surrounding that individual are different.

For me, I grew up playing sports and was heavily involved with team consensus at an early age and into college, and that has carried over into my career: I've always tried to surround myself with people who have been through the trials before; having a good team. Today, like so many others, I look up to people I consider the top of our industry: the Brent Ozars, Paul Randals, Glenn Berrys, John Sansoms, Chris Shaws, John Sterretts, and Adam Machanics. If you have never gotten involved with the SQL community itself, start now. The men and women in our industry share knowledge like no other community, and a lot of the time it is free. Let me ask you this: have you ever been to a technical blog by any of them, checked out SQL Server forums such as SQLServerCentral, or logged onto Twitter and followed these people? The wealth of knowledge you can pick up is phenomenal, and some of the free tools they offer so you don't have to re-invent the wheel, such as sp_whoisactive, sp_Blitz, or Glenn Berry's diagnostic queries, are outstanding!

Whatever the situation, issue, or challenge may be, take a step back and be exceptional. Provide leadership through service, and at times put away your ego and pride and just listen; you will be surprised at what you can pick up by just being a sponge.

Make a Commitment to Yourself

I ask this question of you today: what is your hunger level, and after taking inventory, are you striving to be an exceptional DBA, or are you satisfied with just showing up?

Be hungry, be motivated, and be exceptional. I will not sit here and expect you to believe that every day of DBA work will be a rose garden, but what I can guarantee is that each person has the ability to make a difference and an impact in their respective environment. It only takes one; will you be that one?

Fab Five – The SQL

The History

The original Fab Five dates back to 1991, when five high school kids found their way to the University of Michigan to play college basketball. These guys were the best of the best coming out of high school; four of them were ranked in the top ten, while the fifth was ranked in the top one hundred. At the time, getting four guys ranked that high to come to your college was unprecedented in college basketball. They went on to transcend college basketball, both in persona and in style.

The Positions

Each of the five guys that made up the Fab Five played a specific position. For those of you who don't follow basketball, I'll lay it out for you real fast:

  1. Point Guard – this position is often viewed as the play maker; they run the team's offense and ensure people get to where they need to go. One main aspect of a point guard's job is to understand the coach's game plan and implement it. The best way I can describe this position is as a coach on the floor.
  2. Shooting Guard – this position scores points, is athletic, and is reliable in close games. They are often the team's primary scoring option.
  3. Small Forward – this position is often considered one of the most versatile on the court; this player can score, can get to the foul line at will, and is a key component of any team.
  4. Power Forward – this position carries many responsibilities, including rebounding; they typically have a good mid-range jump shot, and they don't mind getting in the mix with the opposing team and doing a lot of the grunt work.
  5. Center – this position normally goes to the tallest player, one with a great deal of strength. When you find a center who is athletic and tall, they are an unparalleled asset to any team. Again, this position does a lot of grunt work and gets the job done.

Present Day

The term "Fab Five" has stuck with me throughout my life. I have categorized many things on various levels as my Fab Five, and my SQL journey also falls into this category. Many times throughout the year I get asked who or what has helped me along my data professional/DBA journey and who has made an impact on me throughout my career. With that said, I have worked out who my Fab Five are.

Fab Five – The SQL

Now that we have arrived at what this blog post is about: I have had my own Fab Five, from a SQL standpoint, who have helped me in my career to date. Please keep in mind that others have helped along the way; these guys have just made a pretty huge impact on me, and since I am asked frequently about this topic, I thought I'd share them.

  1. Point Guard / Chris Shaw (B|T) – it was 2011 and I attended my first PASS Summit (for those of you who have never attended this conference, I suggest you do). I was able to sit in on a session Chris presented on utility databases, and I was able to relate to and resonate with his ideas, incorporating them into some of my daily activities. Up to that point I hadn't really been involved in the SQL community, although I had been around SQL for years. I ran into Chris outside the conference while sessions were on break, and he was about the most down-to-earth guy I had met there; here this guy is an MVP, like others who were there, and he is having just as much fun as the rest of us. Long story short, after the PASS Summit I started to reflect more on what I had learned, and since I had never ventured out much into the community, I reached out to Chris to see if he would guide me on some things, which he did without hesitation. Over the course of the past two years Chris has been an integral part of my DBA growth, and I'm proud to call him my friend. I had to have this guy as my point guard, as he's the general and has orchestrated a lot of my ongoing efforts to further my career.
  2. Shooting Guard / Brent Ozar (B|T) – playing ball all my life, I wanted to surround myself with good players. I always wanted to play at a higher level; I figured if I wanted to be the best, I needed to beat the best and play with the best. That was my basketball mentality, and part of that competitiveness has carried over into my career. I sat back and jotted down who I thought, in my own view, were key players in the SQL industry, and Brent was one I jotted down. I wanted to learn all the techniques I could from him and his crew, and after PASS Summit 2011 I started really diving into what Brent and his group had to offer. I started off with his weekly Tuesday lunchtime video sessions. These were quick hits where I could eat my lunch at my desk and learn some new and cool things I hadn't really gotten to dive into in the past. Along with the videos, I have become accustomed to sp_Blitz and sp_BlitzIndex, which his group (Jes, Kendra, Jeremiah, and Doug) provides. One thing I like about Brent's group is they are just flat-out real; what you see is what you get. If you haven't been on a Tuesday video session, check it out – they are just real people like you and I, having a good time and teaching some really cool SQL stuff to data professionals like me along the way. Brent gets my shooting guard position because, like a shooting guard, he can score in buckets and in a variety of ways, always providing a plethora of information that is both helpful and precise. A lot of my current techniques are attributed to him.
  3. Small Forward / John Sansom (B|T) – I was actually reviewing a great post by John on automation and found my way onto his forum. I had been looking to get involved with the community in different ways and had posted on various boards, and when I stumbled upon his forum I liked the idea he was presenting. As a few days passed I kept coming back and finally decided to dive in head first. Am I glad I did; in a short period of time John has flat-out boosted my drive to do what I can to help the community. Like Chris Shaw, John has been an instrumental influence on my day-to-day activities as a DBA/data professional and has encouraged me to not only look for the solution but also think outside the box. He gets my small forward position because, again, of his versatility, dependability, and flat-out getting after it day in and day out. I would put John up against anyone in the community, and his integrity to his craft speaks volumes.
  4. Power Forward / Steve Jones (B|T) – if you have dived into the community at all, then you know about SQLServerCentral, and possibly Red Gate. I am blessed enough to work with the Red Gate SQL tool belt daily; because of that, I got to know Steve's work and the Voice of the DBA podcasts/posts he produces on a daily basis. If you want to see a work ethic, look at Steve. Seeing guys like this drives and inspires me to take what I do to the next level. One thing I've noticed Steve do is always try something new; for example, this month is his PowerShell month. Steve gets my power forward position because he always provides me with consistency. One thing he has taught me is not to be afraid to try something new while testing your theories. Back in my playing days, my training schedule was waking up at 3 a.m. to run sprints in the street, then shooting 500 jumpers before class, followed by practice in the afternoon. Steve epitomizes that work ethic in the SQL community; the daily topics from the Voice of the DBA pertain to real, everyday issues everyone can relate to. Seeing a professional like this just go about his job and continuously knock out speaking event after speaking event, post after post, engagement after engagement is just flat-out admirable. Again, like all the Fab Five, he has taught me that hard work pays off: get in and learn your skill set.
  5. Center / John Sterrett (B|T) – PASS Summit 2011 was a really good conference, I reckon, because I came to know John from that same conference; he did a session on PBM/CMS that rocked. At the time it was something we were going to implement in the shop I'm at, and seeing it up front in person was perfect. After getting back from the conference, I noticed John put out on his blog that he was taking people under his wing and to let him know if anyone was interested. I reached out, and sure enough John has helped me along my journey ever since, just like the previous four have. If you look at what a center does, they just flat-out get after it: defense, offense, rebounding; they get fouled non-stop but keep getting up and hustling. John has shown me that hard work does in fact pay off. From his blog posts all the way through his presentations, the information he provides is spot on.

To my Fab Five: I thank you; in today's world this is not said enough. The time that you guys put in, day in and day out, does make a difference. So when you think your efforts are in vain, know that those efforts have helped get me where I am today.

Crunch Time (4th quarter / Second Half)

When you are down to the last two minutes and the game is on the line, what will you fall back on? For me, the obvious answer is my training. A friend of mine once told me, "Somewhere, somebody is practicing getting better while you are not. What will it take?" For me, being a DBA is not just a job; it is a career that I thoroughly enjoy. The outlook I had in basketball is the same outlook I bring to my work ethic today: I want to continue to practice, work hard, get better, and hone my skill set. These guys have afforded me multiple resources to learn from.

I have my own saying: it only takes one. It took one at PASS Summit 2011 (Chris Shaw) to get me involved in the community and shape how I conduct my DBA duties. It took one in Brent Ozar, who decided, you know what, I want to help my community and offer videos so others may learn what I already know, and provide helpful tools that set a standard in the industry. It took one in John Sansom to start a forum that bridged the States to London for a guy who was looking for some identity in how to help others. It took one in Steve Jones, who takes time out of his schedule to answer the emails I send, and spends countless hours being the Voice of the DBA, providing insight into SQL technology. And it takes one in John Sterrett, who always answers my attempts to further my knowledge and learn from him, and who introduced PBM/CMS to me.

Conclusion

I said all that to say this: the recurring theme is that it "takes one". These guys did it, and I hope one day I'm that one for someone else. You see, the community is full of men and women like my Fab Five ~ who is your Fab Five, and will you be that one?

Get involved, bring something to the table, and as the renowned Thomas Larock (B|T) says ~ make today your someday!

I’ll leave you with this one encouragement that I’ve shared here before and on other Forums:

Believe while others doubt

Plan while others are playing

Study while others are sleeping

Decide while others are delaying

Prepare while others are daydreaming

Begin while others are procrastinating

Work while others are wishing

Save while others are wasting

Listen while others are talking

Smile while others are frowning

Commend while others are criticizing

Persist while others are quitting

What Exactly Is This Sysadmin You Speak Of?

Sysadmin; seems important, doesn't it?

What exactly is this sysadmin you speak of?

SQL Server comes with certain fixed server roles out of the box to help manage permissions. This doesn't mean you should stop with just these roles; how locked down you need to be often depends on the place or environment you are in, although I'm a very big advocate of ensuring your SQL Servers have the proper and adequate security in place.

While there are nine fixed server roles out of the gate, I only want to speak on one, and that is the sysadmin role.
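If you want to see the full list of fixed server roles on an instance for yourself, the built-in sp_helpsrvrole procedure will enumerate them, and on newer versions the catalog views expose the same information; a quick sketch:

```sql
-- List all fixed server roles and their descriptions
EXEC sp_helpsrvrole;

-- Or, on SQL Server 2012 and later, via the catalog views
SELECT name
FROM sys.server_principals
WHERE type = 'R' AND is_fixed_role = 1;
```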

By simple definition, the sysadmin role's members can do anything on the SQL Server itself. Now that you know what members of this role can do, let me ask you something else: on your servers, do you know who the members of this role are? If the answer is no, then I suggest you take a moment and dive into your servers to find out, as you potentially have a door wide open that needs to be shut.

This particular role can do anything it wants to the SQL Server. Let me repeat that: this particular role can do ANYTHING it wants to the SQL Server. This role needs to be carefully controlled and monitored, for it can bypass security checks; it is by far the most powerful role, so get familiar with it ~ quickly.

Is it a daunting task to discover this information? No, it is not, and I'll show you the quick way I use to discover who the members of this role are.

Microsoft has provided additional views, functions, and commands to help the everyday data professional. Some of these options are well known, others are not, and some people just haven't run across a need for them before. Just like the fixed server roles, there are about ten or eleven of these that I can think of off hand that are good to know, but again I only want to focus on one: sp_helpsrvrolemember.

This system stored procedure is a gem, as it will provide information on a specific fixed role that you want to inquire about; if the parameter is left NULL, it will return information on all fixed roles. Let's take a look...

First, the syntax... you ready for it? It is quite lengthy:

sp_helpsrvrolemember

That's it. I execute this in the master database; the result set is quite simple:

Results

What the result set provides is a quick and easy look at who has sysadmin privileges on the SQL Server; as you can see, one that would raise an eyebrow immediately is the Test User login. Why would this user need sysadmin privileges?
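Worth noting: sp_helpsrvrolemember is on Microsoft's deprecated features list, so the same question can be answered with the catalog views, which also make it easy to filter to a single role. A minimal sketch:

```sql
-- Who is in the sysadmin fixed server role, via catalog views
SELECT r.name AS ServerRole, m.name AS MemberName, m.type_desc AS MemberType
FROM sys.server_role_members AS srm
JOIN sys.server_principals AS r ON srm.role_principal_id = r.principal_id
JOIN sys.server_principals AS m ON srm.member_principal_id = m.principal_id
WHERE r.name = 'sysadmin';

-- Quick check: does the current login hold sysadmin? (1 = yes, 0 = no)
SELECT IS_SRVROLEMEMBER('sysadmin');
```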

Conclusion

For a more in depth look at the fixed server roles and working with them you can visit the Microsoft link below:

http://technet.microsoft.com/en-us/library/ms188659.aspx

If you are new to SQL server or if you just haven’t ever thought about seeing just who holds the keys to the SQL kingdom it would behoove you to check it out. Utilize what has been already provided and start taking control of your environment. In the end it is our responsibility and duty to ensure security of our data is constantly being met.

There are many, many different twists and turns to explore; this is just the tip of the iceberg of what has been provided. I was recently asked how to obtain such information quickly; well, this is it. Short, sweet, and to the point.

On machines I've had to hop onto or take over, I've often found this is one aspect that has always interested me. When I ask the question, "Do you know who is sysadmin on your servers?", what would your answer be?

Okay, it's game time: be a play maker and change the status quo. Take over your environments and ensure proper standards are in place and best practices are being met.

Removing Backup Media By Database

Have you ever inherited a server and found out that your msdb database is rather large? Or maybe you have never set up any maintenance on msdb with regards to retaining backup history? Okay, okay, perhaps you don't fall into those categories, but maybe you have policies in place and you take a database offline for whatever reason, and your policies start to fail for backups not taken in the last 24 hours. Maybe you fall into one of these categories or maybe you fall into a different one; whatever the case may be, there are options for removing the backup history cleanly and methodically, without going all rogue style in SQL.

Microsoft has provided a couple of system stored procedures within the msdb database for removing backup history, but I will only focus on one: sp_delete_database_backuphistory.

This little gem, if used properly, will remove the backup history for a particular database. This came in very useful recently, when taking a database offline to be decommissioned was tripping the policy checks in place, particularly the one I mentioned above where the policy checks whether a database has been backed up within the last 24 hours.

The Code

/*This code deletes backup history for a specific database*/
use msdb
go
exec sp_delete_database_backuphistory @database_name = 'yourdatabasename'
go
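If the goal is instead to trim history across all databases by age, msdb also ships the companion procedure sp_delete_backuphistory, which deletes all rows older than a given date; the six-month cutoff below is just an example value:

```sql
/*Delete backup and restore history older than a cutoff date, across all databases*/
use msdb
go
declare @cutoff datetime = dateadd(month, -6, getdate()); -- keep the last six months
exec sp_delete_backuphistory @oldest_date = @cutoff;
go
```

This one touches the same tables listed below, so the same testing caveats apply.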

The Tables Affected

  • backupfile
  • backupfilegroup
  • backupmediafamily
  • backupmediaset
  • backupset
  • restorefile
  • restorefilegroup
  • restorehistory

Conclusion

There are tools available at one's disposal to help in situations with SQL. Microsoft has provided a number of system stored procedures, but it would not be prudent to just start running scripts or executing procedures without first knowing what they are going to do to your systems. Never run anything in production, or any environment for that matter, without first testing it in your local sandbox.

Add this script to your repository; you never know when you will need it.