Category Archives: SQLServerPedia Syndication

24 Hours of PASS

SQL Training… as a SQL DBA, I’m always up for any training to learn new techniques, new features, or new processes. Throw in the word FREE, and how can you not be excited about what PASS has to offer?

PASS has recently released its lineup for the free 24 hours of SQL training, with some stellar presenters and some great topics. To check out the sessions and register for any of the online classes you want, you can click below:

24 hours of PASS SQL Training

Enjoy!

Linked Server Set-Up

At times I take for granted some of the features that SQL Server has to offer. One of the features that some fellow colleagues asked me about is setting up a linked server. First, what is a linked server? A linked server basically allows the SQL Server Database Engine to execute commands against OLE DB data sources outside the instance of SQL Server. There are some advantages to having a linked server, a few of which are:

  1. The ability to access data from outside of SQL Server
  2. The ability to issue distributed queries, updates, commands, and transactions on data sources
  3. The ability to address diverse data sources

CONFIGURING A LINKED SERVER

One of the ways that you can set up a linked server is through SSMS, by performing the following tasks:

  • Connect to a server in SSMS
  • Expand the Server Objects
  • Right click Linked Servers
  • Select New Linked Server…

The first box allows you to select a specific server type.

The security section will allow you to define the logins.

Under Server Options, ensure the Data Access value is set to True.

From there you can test the connection and ensure the setup was completed correctly.
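If you prefer a script over the GUI, the same setup can be sketched in T-SQL using the system stored procedures; the server name and login below are hypothetical:

```sql
-- Add a linked server pointing at another SQL Server instance
EXEC sp_addlinkedserver
    @server = N'REMOTESQL' ,        -- hypothetical server name
    @srvproduct = N'SQL Server'

-- Map all local logins to a specific remote SQL login
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'REMOTESQL' ,
    @useself = N'FALSE' ,
    @locallogin = NULL ,
    @rmtuser = N'linkedUser' ,      -- hypothetical remote login
    @rmtpassword = N'StrongPassword!'

-- Ensure the data access option is set to true
EXEC sp_serveroption N'REMOTESQL', N'data access', N'true'

-- Test the connection (same check the SSMS Test Connection button runs)
EXEC sp_testlinkedserver N'REMOTESQL'
```

Once it tests clean, you can issue distributed queries against it with four-part names, e.g. SELECT * FROM REMOTESQL.SomeDB.dbo.SomeTable.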

For further research, you can review some of the documentation in the links below from Microsoft (some are from prior versions of SQL Server, but the gist is still the same):

Linked Server Database Engine

Creation of Linked Server

Orphaned SQL Instances

It has been some time since I have run into this issue; however, yesterday brought back some old memories of my early years as a developer and then DBA – orphaned instances. Currently, I run 3 SQL instances on my machine. I kept SQL Server 2008 R2 installed as well as SQL Server 2012, as I have some needs for both that I won’t necessarily go into in this post.

I had some instance name issues that I had to work through when I installed 2012 on my machine, and I knew I was going to have to go back and clean them up at a later time. Well, that time came yesterday. As I uninstalled 2008 R2, I must have performed a misstep and in essence ended up with two orphaned instances on my box. Most people would not be thrilled with this scenario; however, I was pretty stoked, as I knew that I could use this to brush up on how to get rid of them.

I wanted to have a fresh install of 2012 only on my machine so I uninstalled 2012 and then it was time to deal with my orphaned instances.

Before I share this, I want to reiterate that this is how “I” handled this situation, and it requires manual intervention. I re-tested this on another box as well, just to provide a second test. I would not recommend doing anything in a prod environment until you test it out, no matter what blog or article you get your information from. One, it helps you to understand what you are doing, and two, it provides confidence that you can perform the tasks.

Okay so below are my steps I took to manually remove my orphaned instances:

  1. Click on the Start menu, choose Run and type in Regedit (nervous yet?)
  2. Go to HKEY_LOCAL_MACHINE\Software\Microsoft\Microsoft SQL Server
  3. Remove all sub-directories and the instance name folder
  4. Delete the folder from %Program Files%\Microsoft SQL Server\%name%

Once completed, I like to do a fresh reboot; then, when you perform your installation, the old orphaned instances are no longer available.

I did not share my instance names as they may vary across the board but this is a quick way to manually clean up orphaned instances. I do implore you to use extreme caution when dealing with Regedit. If you do not know what you are doing you can do some major damage to your box.

DBA Deployments

There are many aspects a DBA must handle on a daily, monthly, and yearly basis. One that has been sticking out to me in recent weeks is deployment methodology and the various approaches that other people might have in place.

For example:

My current process utilizes TFS and RedGate tools. Below is a representation of what a deployment of mine might look like.

  • Code is checked in to source control by a developer
  • A release consisting of bugs and tasks is created within TFS by a release manager
  • Once the release is put into place in TFS, a comparison within TFS is completed on the branch or by a label to see what was deployed out to our first landing zone (a QA environment) from the last deployment
  • The latest code is pulled from Red Gate SQL Source Control
  • Using Red Gate SQL Compare, differences are compared and pushed based on what was scheduled on the release ticket

Now, this is just a very high-level view without all the intricate workings, which are not defined in this post. I am always looking for ways to improve processes and be as productive as possible. I am curious as to what others might be using, and about the different methodologies and processes that others in the community use to deploy to their respective environments.

Feel free to drop me a line or comment.

Job History Is Missing

I ran into an interesting issue this morning after upgrading to SQL 2012 that I did not know existed. I had a job fail for various reasons and was alerted, like normal, of the failure. Upon going to the SQL Agent to view the history of the job, I found nothing in the history – which I thought was odd. I decided to hop over to the sysjobhistory table, and sure enough my history was all there as expected. So what is the correlation? Why was my job history in the table but not being shown in View History on the Agent?
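For reference, pulling the history straight from msdb looks something like the sketch below; the job name is hypothetical:

```sql
-- Job history straight from msdb, bypassing the Log File Viewer
SELECT j.name ,
    h.run_date ,
    h.run_time ,
    h.run_status ,      -- 1 = succeeded, 0 = failed
    h.message
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobs j ON h.job_id = j.job_id
WHERE j.name = N'MyFailedJob'   -- hypothetical job name
ORDER BY h.instance_id DESC
```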

I noticed a message on the bottom left, where the status and progress sections are in the properties section of the Log File Viewer for the job – the message read “The given key was not present in the dictionary”. Upon doing some research, it so happens that a known bug was discovered with special characters in job names. My job, for whatever reason, had a “,” in its name. In digging some more, I could find only the term “special characters” being thrown about, but I could not find a specific listing of which special characters… was it one, two, three, or all of them?

Once I removed the “,” from the name and re-ran the job, upon completion the history itself was re-populated in the Agent with no issues.

I decided to test this even further; I put the comma back in the job name and then scripted the job out to see if it would error. To my surprise, it did not error out. I would have expected it to error on script creation due to the comma; I guess the bug runs a bit deeper than I initially thought.

Microsoft has responded that this will be resolved in the next SP release of 2012. I figured if anyone else runs across this, the information might help.

Which Publication Are My Tables In

Ever been in a deployment situation where the deployment script breaks due to a table being replicated? I have – many times. Hopefully this tidbit of information will help you in your future deployments.

When I receive a scheduled, or even a non-scheduled, deployment, I like to prep for it so I don’t have any surprises at the actual time of deployment. I use Red Gate’s SQL Compare tool for my deployments; whatever your mechanism is, the methodology is similar.

Let’s say Table A is being replicated from the publisher down to a reporting environment, and Table A has a modification to some part of its schema. If Table A is being replicated, you will receive an error on deployment stating that the table is being “published” for replication.

What I like to do is review the schema beforehand. I have a pretty good working knowledge of what I am replicating over, but I also deal with 17 publications that house various numbers of tables, and I might not be able to recall which tables are in which publications.

In order to figure this out quickly, I use the following script. I do not have a USE statement at the beginning; however, I do run this in the specific database that I am deploying to:

SELECT sps.name
FROM syspublications sps WITH ( NOLOCK )
JOIN sysarticles sas WITH ( NOLOCK ) ON sps.pubid = sas.pubid
WHERE sas.name = 'tablename' --change to your table name

Running the above script will allow you to find which publication the table is being replicated in. From there you can script out the drop and create statements for the publication and continue on with your deployment.
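For the drop-and-create piece, the replication stored procedures can take the article out of the publication and add it back afterward. A minimal sketch for a transactional publication, run in the published database, with hypothetical publication and table names:

```sql
-- Drop any subscriptions on the article, then drop the article itself
EXEC sp_dropsubscription
    @publication = N'MyPublication' ,   -- hypothetical publication
    @article = N'tablename' ,
    @subscriber = N'all'

EXEC sp_droparticle
    @publication = N'MyPublication' ,
    @article = N'tablename'

-- ...deploy the schema change, then add the article back
EXEC sp_addarticle
    @publication = N'MyPublication' ,
    @article = N'tablename' ,
    @source_object = N'tablename'
```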

Recompile Option – Yes, No, Maybe?

This seems to be a hot topic right now among my peers here locally and even abroad ~ to recompile or not to recompile, that is the question at hand.

I’ve been associated with companies in the past where utilizing this option was almost a standard; then I’ve been with companies where it was never used at all. What is my personal preference? I don’t use the option all of the time, but I do have some use for it in my daily checks of the systems I monitor.

My two real-world examples, which I ran into not long ago, are simple ones, but I thought I would share…

  • I kept having issues with a stored proc, and I knew that what I was troubleshooting was off kilter; what I was seeing in some of the results wasn’t adding up in my head. Diving into the query, of course, I saw the RECOMPILE option, which told me that anything in my plan cache I should just throw out the window. I love using my DMVs and whatnot; however, a recompile each time a proc is used puts a damper on my assessments at times. There is a trade-off when using the recompile option to improve performance; just something to keep in mind.
  • One aspect I do use the recompile option for is my diagnostic queries of my environments. Prime example: I’m a fan of Glenn Berry’s diagnostic scripts, which he’s put together, along with some of my own customized ones. When running these, I personally don’t care how they are stored in the plan cache.
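As a sketch of that second bullet, a recompile hint on an ad hoc diagnostic query might look like this (the query itself is just an illustration, not one of Glenn’s actual scripts):

```sql
-- Top cached statements by total worker time; the RECOMPILE hint
-- keeps this one-off diagnostic query from lingering in the plan cache
SELECT TOP ( 10 )
    qs.total_worker_time ,
    qs.execution_count ,
    st.[text]
FROM sys.dm_exec_query_stats qs WITH ( NOLOCK )
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.total_worker_time DESC
OPTION ( RECOMPILE )
```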

I recently had the opportunity to sit in on a good session about this by Kendra Little over at Brent Ozar PLF (a good group of guys, and if you haven’t ever checked them out, I implore you to do so; they have some great items they discuss and free weekly training). Kendra goes through some real-world examples in her demo, and it would behoove you to give it a look.

I’d love to hear some of your experiences with the option and the pros and cons you may have found. If you have time, drop me a line.

Static Data

Ever had to wipe out tables only to put the data back into them – tables such as lookup tables, or tables that never really change except maybe on a yearly basis?

To me, static data scripts are integral in the sense that they keep you from re-inventing the wheel over and over again. In thinking of this topic, I realized that I had some gaps myself I could shore up, so I got some quick scripts created so that when the time comes I have them at my disposal.

A basic template to go by can be something like the one below:

/***************************************
*** Static data management script ***
***************************************/

PRINT 'Updating static data table [dbo].[tabl1]'

-- Turn off affected rows being returned
SET NOCOUNT ON

-- Change this to 1 to delete missing records in the target
-- WARNING: Setting this to 1 can cause damage to your database
-- and cause failed deployment if there are any rows referencing
-- a record which has been deleted.
DECLARE @DeleteMissingRecords BIT
SET @DeleteMissingRecords = 0

-- 1: Define table variable
DECLARE @tblTempTable TABLE (
    [Id] INT,
    [Info1] BIT
)

-- 2: Populate the table variable with data
INSERT INTO @tblTempTable ([Id], [Info1])
VALUES (1, 1)

-- 3: Delete any missing records from the target
IF @DeleteMissingRecords = 1
BEGIN
    DELETE FROM [dbo].[tabl1]
    FROM [dbo].[tabl1] LiveTable
    LEFT JOIN @tblTempTable tmp ON LiveTable.[Id] = tmp.[Id]
    WHERE tmp.[Id] IS NULL
END

-- 4: Update any modified values with the values from the table variable
UPDATE LiveTable SET
    LiveTable.[Info1] = tmp.[Info1]
FROM [dbo].[tabl1] LiveTable
INNER JOIN @tblTempTable tmp ON LiveTable.[Id] = tmp.[Id]

-- 5: Insert any new items into the table from the table variable
INSERT INTO [dbo].[tabl1] ([Id], [Info1])
SELECT tmp.[Id], tmp.[Info1]
FROM @tblTempTable tmp
LEFT JOIN [dbo].[tabl1] tbl ON tbl.[Id] = tmp.[Id]
WHERE tbl.[Id] IS NULL

PRINT 'Finished updating static data table [dbo].[tabl1]'

GO

I know in times past this type of methodology has saved me some hours here and there, and if you have data that you know will not change, especially if you push from one environment to another, it makes life a little easier.


Sessions For PASS 2012 Released

I was excited to see that the sessions have been released for this year’s PASS Summit 2012.

If you’ve been involved in the SQL community for any length of time, then hopefully PASS has come up here or there. I can say from personally attending in the past that the knowledge gained and the networking you will do are invaluable.

Some of my peers have inquired about the pros and why they should attend such a Summit. Again, I can only say that you get out of it what you put into it. Go into the Summit with the mindset of learning and gaining some knowledge from the best speakers the SQL community offers ~ well, you’ll walk away amazed.

I think PASS does a great job of detailing what they are about, the whys, and who should attend. You can check all that out here.

Everything you will need to know about the Summit, from lodging to pricing, etc., is available on their site. If you are attending ~ GREAT; if you are on the borderline, then I implore you to go; and if you don’t think you are going, I ask you to take a hard look at it in hopes of changing your mind.

Let’s make this PASS Summit the best yet. Hope to see you there!

Granting View Definition to All Procedures

The other day I came across something where I needed to grant view definition rights to a specific user for one single DB. Now, there are several different ways to achieve this, but I decided to dabble in just writing some code to take care of it for me. Others might have something very similar, or different mechanisms altogether, but I used the following code:

The basic gist is as follows:

  1. Create a temp table – you can use a # table; just drop it when done with it
  2. Declare variables
  3. Insert data into the temp table – we only want to grab the procedures and nothing starting with dt_
  4. Grab each of the procedures found in the DB and dynamically build the grant permission into a SQL string
  5. Execute the SQL string
  6. Restart the counter and reset the name variable, just for good measure
  7. Select from the temp table to view what was just granted view definition

The code to do this:

DECLARE @PermTable TABLE
    (
      PK_ID INT IDENTITY(1, 1)
                NOT NULL
                PRIMARY KEY CLUSTERED ,
      [name] SYSNAME
    )

--declare variables
DECLARE @name SYSNAME ,
    @RowCount INT ,
    @RecordCount INT ,
    @strSQL NVARCHAR(2000)

INSERT INTO @PermTable
        ( [name] )
        SELECT ROUTINE_SCHEMA + '.' + ROUTINE_NAME
        FROM INFORMATION_SCHEMA.ROUTINES
        WHERE ROUTINE_TYPE = 'PROCEDURE'
            AND ROUTINE_NAME NOT LIKE 'dt_%'

-- counters for while
SET @RecordCount = ( SELECT COUNT(*)
                     FROM @PermTable
                   )

SET @RowCount = 1
WHILE ( @RowCount < @RecordCount + 1 )
BEGIN
    SELECT @name = [name]
    FROM @PermTable
    WHERE PK_ID = @RowCount

    SET @strSQL = N'GRANT VIEW DEFINITION ON '
        + RTRIM(CAST(@name AS NVARCHAR(128))) + N' TO [User]'

    --Execute the SQL
    EXEC ( @strSQL )

    --Restart the counter
    SET @RowCount = @RowCount + 1

    --Never know, so why not reset
    SET @name = NULL
END

SELECT *
FROM @PermTable
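As an aside, if the goal is view definition on every object in the database rather than just the procedures, a single database-scoped grant covers it (the user name here is hypothetical):

```sql
-- Run in the target database; covers all current and future objects
GRANT VIEW DEFINITION TO [User]
```

The loop above is handy when you want the grant limited to procedures only, or want a record of exactly what was granted.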

Hope this helps; cheers.