SQL Saturday #122 – Pre cons

Just a reminder of the pre-cons available this July 20th, 2012. Kevin Kline will present a session on Leadership Skills for IT Professionals, Bill Pearson will present Practical Self Service BI With PowerPivot, and Dave Fackler will present SSIS – Live it, Love it, Learn it.

Cost is $100 per pre-con session.

The main event will be held July 21st, 2012, at the University of Louisville School of Business:

University of Louisville

2301 S 3rd St

Louisville, KY 40208

You won't want to miss this event. A lot of good industry speakers will be at your disposal; visit the main site to register today.

Are you a fit DBA?

Over my career I have seen many articles on what it takes to be a DBA. I always enjoy reading other viewpoints on the topic; so much so that I've wanted to share my own mentality on how I attack being a DBA. Every person is different and unique in their own way; however, some of the principles, I believe, can remain the same.

How does a DBA stay on top of his/her game?

Those who know me personally know that I enjoy playing the game of basketball. I grew up with it, played up through college, and once college was over I still dabble in it for fun. I can remember putting in countless hours in the gym shooting thousands of jump shots, running hundreds of laps, waking up at 4 in the morning to work out, then going to class, then back to practice after classes. The reason I trained so hard was to be the best that I could be and squeeze the most out of my potential.

The other day as I was driving down the road, I realized my career as a DBA really is no different. In order to reach my goals, stay on top of my game, and enhance my skill set, I have to condition myself in the realm of DBA expertise. How do I become a better DBA? To me, I need to take that conditioning mentality and turn it into a SQL conditioning mentality. As I stated before, each person is unique, so your goals might be different than mine, but the underlying base is similar in many ways.

TRAINING YOURSELF

In any training program you have to continually work at it. My jump shot didn't just happen overnight; I couldn't even begin to tell you how many hours I put in at the gym before it became second nature to me. It was no longer a matter of thinking through the mechanics ("Step, Bend, Follow-Thru"); it became automatic. So why would my SQL training be any different? I took five skill areas (my magic number; yours may be different) and looked at ways I could improve each of them. Then guess what: training begins! Train yourself and keep training. Just because you train for a day here or a day there doesn't mean you will improve. You have to keep at it and train regularly, just like you would when training for a race, lifting weights, or, in my case, putting a ball through a hoop.

EAT HEALTHY

Part of my training consisted of eating healthy, and eating certain foods before big games to maximize my energy. The same goes for my SQL skill sets. You will see on this blog various bloggers whom I look up to. To name a few of them:

The list could go on, but I try to gain as much knowledge as I can from these guys I look up to and be like a sponge. They provide a plethora of information, and you know what: I found that all of them are as eager to help as I am to learn. The SQL Community is abundant with tips and resolutions from such sites as:

To grow in this industry, take advantage of what is at your disposal. Chances are the issues or questions you have will be answered by someone, or some corner of the SQL Community. That is why I enjoy the SQL Community so much; it is one big family. Eat knowledge regularly; grow your knowledge just like an infant eats and grows as he or she gets bigger.

Maximize Your Abilities ~ Fourth Quarter!

In high school (fourth quarter) and in college (second half) I had this thing I would do mentally. I'd come out, slap both hands on my chest, and then smack the floor. By this time I was dog tired; I had pushed myself as hard as I could the whole game, and then I had to find the strength (this is where conditioning comes in) to make it through that quarter or half and stay focused when I was tired.

Have you ever pushed your limits with SQL? If not, then how are you growing? I challenged myself, and I will challenge you: push your limits and get out of your comfort zone. Learn something you didn't know before; add some skill sets to your arsenal. You can't expect to go out and knock out every inch of transactional replication if you've never seen it. Think of it in terms of taking small baby steps, but keep moving forward.

HELP OTHERS

I could chalk this up to getting older, but the longer I'm in the SQL industry the more I enjoy helping others achieve their goals along the way. Take advantage of the many forums out there; share your knowledge; who knows, maybe one day you can help others as they have helped me. I think a lot of times it is easy for those of us in the industry to take for granted the ones at the forefront, such as the Grant Fritcheys, the Paul Randals, the Glenn Berrys, the Chris Shaws, the Brent Ozars, and the list could go on and on. These guys get it and have worked hard so that people like you and I could also get it.

As I close this thought: I recently read a post from Paul Randal. I can't find it now, but in essence it spoke of a request he received for advice that didn't even include a please or a thank you. I implore you to not only challenge yourself but also remember that a lot of the time these guys help us for free. A simple please and thank you can go a long way. Take the time to thank them for what they do. You may have others that you follow, or mentors; have you thanked them lately for their advice or for being a sounding board as you maneuver through your career? If not, think about it.

How many .MDF Files did you say?

How many File Groups?

It isn't abnormal in my line of business to run across a vendor who sets up a SQL database, trains the users on the application, then departs off into the wild blue yonder, leaving the DBs in the hands of the company's on-site DBA team. Not long ago this very scenario came about. Once the vendor had departed I was curious what they had done; and since I'm now responsible for the maintenance, issues, etc., I figured it would behoove me to check it out.

I found a 3 GB database, which is cool; checking out the file groups, I discovered over 300 data files had been created. My initial gut instinct was to ask why, but as I got further into my research I could see where they were going with this. The vendor utilized the sliding window approach for two reasons, per my asking them: one, they had one architect and that is what he used; and two, that is how their app interacts with the product on the back end. Given their answer, I figured a more prudent search was merited for my own purposes. I was intrigued.

In researching, I came up with a few reasons that multiple data files would be utilized:

  1. Performance increases
  2. Parallelism
  3. Rolling off extinct data

Now, I'm sure there are more than just three gains that could be brought up; however, in my case these were the ones that stood out.

I can see that placing tables and indexes onto separate file groups allows for better performance from a complex-query standpoint, in that SQL Server can utilize parallel I/O operations across the files. I can also see that if the app is rolling off obsolete data, partitioning across file groups can be utilized pretty heavily.
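As an illustration, here is a minimal sketch of putting a table on its own filegroup; all of the names here (SalesDB, FG_Archive, dbo.SalesHistory, the file path) are hypothetical, not taken from the vendor's setup:

```sql
-- Add a new filegroup and a secondary data file to it (names/paths are examples only)
ALTER DATABASE SalesDB ADD FILEGROUP FG_Archive;

ALTER DATABASE SalesDB
ADD FILE (
    NAME = N'SalesDB_Archive_01',
    FILENAME = N'D:\SQLData\SalesDB_Archive_01.ndf',
    SIZE = 256MB,
    FILEGROWTH = 64MB
) TO FILEGROUP FG_Archive;

-- Build the table's clustered index on the new filegroup
-- so its I/O is separated from the primary filegroup
CREATE CLUSTERED INDEX IX_SalesHistory_Date
    ON dbo.SalesHistory (SaleDate)
    ON FG_Archive;
```

With the files on separate physical drives, reads against dbo.SalesHistory and the rest of the database can proceed in parallel.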

All in all, I found that there are some perks to having multiple file groups. I'm not totally sold on having over 300 data files for a 3 GB database; however, I do see some valid points on when to use this type of methodology against a very large database with complex queries hitting it.

I think it goes without question that thorough testing should be completed when using multiple file groups. Just because it works well in one environment does not mean it will work as well in another, and I also want to STRONGLY recommend that you don't decide to make this jump straight in a prod environment. I would hope that is a given, but I always like to reiterate: some formal testing, in my opinion, should occur first.

Utilizing SQL CMD Variables in Transactional Replication

In some situations it may be necessary to use SQLCMD variables while executing scripts; this happens in a replication process I have. To start off, I must state that to use SQLCMD variables you have to have SQLCMD mode actually turned on in SSMS. Enabling it is pretty simple and is covered on MSDN at the following location.

http://msdn.microsoft.com/en-us/library/aa833182.aspx

If you start to use SQLCMD variables while this setting is turned off, then a syntax error will occur when attempting to check the syntax or run the script.

In my situation I have several environments that I can deploy to; each environment has its own set of SQLCMD variables, such as:

-- local
:setvar Pub_Server "[(local)]"
:setvar Sub_Server "[(local)\Reporting]"
:setvar PostSnapshotScript "index file location"

I have four more settings like this one for my other environments that are commented out. I can easily flip the switch at any time.

From the statement above, I am setting my SQLCMD variables for my publisher (local server) and my subscriber (local\Reporting server), utilizing some index files I have at a location on my C:\ drive. As I move through my environments, these settings will of course change.

Once I have my environment variables in place, I utilize three more variables to set the publication name, the database the replication is occurring in, and the subscriber's database:

:setvar Pub_Name "Name of Publication"
:setvar Repl_TRS "Database Name"
:setvar SubscriberDB "Subscribers Database (reporting side)"

I will not go into the specifics of the internals of the script that deploys the replication; that in and of itself will need to be a different post, and I plan on dissecting the internals in the near future. However, to utilize these variables throughout the script, you reference them using the following methodology:

EXEC sp_addarticle @publication = N'$(Pub_Name)', and so on...

I found this approach very fast and simple.
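Putting the pieces together, a minimal sketch of what the top of such a script might look like; the names are placeholders, the sp_addarticle call is abbreviated (a real call takes more parameters), and SQLCMD mode must be enabled:

```sql
-- Requires SQLCMD mode (Query > SQLCMD Mode in SSMS)
-- All server/database/publication/table names below are placeholders
:setvar Pub_Server "(local)"
:setvar Pub_Name "MyPublication"
:setvar Repl_TRS "MyDatabase"

:connect $(Pub_Server)
USE [$(Repl_TRS)];
GO

-- Note the N'...' quoting: $(Pub_Name) is substituted as raw text
-- before the batch is executed, so it must land inside a string literal
EXEC sp_addarticle
    @publication   = N'$(Pub_Name)',
    @article       = N'MyTable',
    @source_object = N'MyTable';
GO
```

Flipping environments then becomes a matter of commenting one :setvar block and uncommenting another.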

SQL Saturday #122

SQL Saturday #122 is fast approaching! Don’t forget it is set for July 21, 2012 in Louisville, KY. The event is being held at the University of Louisville School of Business. The formal address is:

University of Louisville School of Business

2301 S 3rd St

Louisville, KY 40208

As of the 24th they still had seating available. As the event approaches, more information will be available regarding the speakers, tracks, and session titles. For more information check out SQLSaturday.com, or for specific inquiries email the SQL Saturday team at sqlsaturday122@sqlsaturday.com.

To take a quote directly from the SQLSaturday site:

“PASS SQLSaturday’s are free 1-day training events for SQL Server professionals that focus on local speakers, providing a variety of high-quality technical sessions, and making it all happen through the efforts of volunteers. Whether you’re attending a SQLSaturday or thinking about hosting your own, we think you’ll find it’s a great way to spend a Saturday – or any day.”

Take advantage of these free training sessions; come learn from some of the top-notch speakers in the area. Hope to see you there.

For MapQuest directions you may go here

Automating Startup of Replication Agents

In my business world we utilize transactional replication to copy data from our OLTP system to another box for reporting purposes. For a little background, I will quote directly from MSDN on what transactional replication is (http://msdn.microsoft.com/en-us/library/ms151198.aspx):

Replication At a Glance

“Replication is a set of technologies for copying and distributing data and database objects from one database to another and then synchronizing between databases to maintain consistency. Using replication, you can distribute data to different locations and to remote or mobile users over local and wide area networks, dial-up connections, wireless connections, and the Internet.

Transactional replication is typically used in server-to-server scenarios that require high throughput, including: improving scalability and availability; data warehousing and reporting; integrating data from multiple sites; integrating heterogeneous data; and offloading batch processing. Merge replication is primarily designed for mobile applications or distributed server applications that have possible data conflicts. Common scenarios include: exchanging data with mobile users; consumer point of sale (POS) applications; and integration of data from multiple sites. Snapshot replication is used to provide the initial data set for transactional and merge replication; it can also be used when complete refreshes of data are appropriate. With these three types of replication, SQL Server provides a powerful and flexible system for synchronizing data across your enterprise.”

Now that replication has been explained, I can dive further into the steps I've taken to automate my own. Within replication there are publications that reside on the publisher's side (where the data is being copied from) and subscriptions on the receiving side (where the data is being copied to). In my current setup I have 18 publications. Since this post is regarding automation, I will not go into how to manually start the agent via the Replication Monitor.

I created 18 SQL files that correspond to my 18 publications. Within my SQL files I have the following:

USE [DBName]
GO

DECLARE @publication sysname;
SET @publication = N'PublicationName';

EXEC sp_startpublication_snapshot @publication = @publication;
GO

This allows me to have 18 separate startup files. Now, could I have one file instead of 18? Sure I could; however, in some instances I only want to run maybe one or two, so to keep this automated I have separated mine out. To execute these I have multiple options. I can use the Redgate SQL Multi Script utility that I have at my disposal (which I have done in the past); however, I have also created an SSIS package. The SSIS package calls a .BAT file with the following contents:

for %%G in (*.sql) do sqlcmd -S (local) -E -i "%%G"

Within my SSIS package I point this command at the directory where I store all my publication scripts; the command then executes all SQL files (*.sql) against my local environment.

Once the process completes all my agents are started and the snapshot generation is underway.
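As an aside, if you ever did want one script to start them all, here is a possible sketch. It assumes it runs in the published database, where replication maintains the dbo.syspublications table:

```sql
USE [DBName];
GO

-- Start the snapshot agent for every publication defined in this database
DECLARE @publication sysname;

DECLARE pub_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM dbo.syspublications;

OPEN pub_cursor;
FETCH NEXT FROM pub_cursor INTO @publication;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_startpublication_snapshot @publication = @publication;
    FETCH NEXT FROM pub_cursor INTO @publication;
END

CLOSE pub_cursor;
DEALLOCATE pub_cursor;
```

I still prefer the separate files for the flexibility of running just one or two, but this trades that flexibility for a single entry point.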

Database Backups ~ Good or Bad?

I recently responded to a post on SQL Server Central regarding upgrading SQL Server 2005 to SQL Server 2008 R2. The suggestions laid out before my post were spot on; it was the one that came after my response, agreeing that backups were important, that caught my eye. There is an age-old saying I've always heard: "You're only as good as your last backup." The response I pondered this weekend was from a user who said, "Backups are worthless; restores are priceless." Being the analytical person that I am, I decided to dive into this statement for myself. I can only conclude that in order to do the priceless restore you first must have the worthless backup to restore from. So, for me personally, I cannot justify calling backups worthless.

I hope DR plans never have to be used, but we live in a world that has errors, so using one could be inevitable. No one knows when a disaster will strike; heck, we just experienced one of the worst tornadoes around my area since 1974. Stuff happens and the world doesn't stop turning, so as a DBA I had better be making sure that our systems have solid backups.

Backups should be pretty obvious: having a backup is essential in order to recover (to even be able to do the restore) from most disasters. And backups aren't just needed for disasters; think about upgrades gone bad, system issues, or even table corruption.

I know I probably shouldn't have read too much into the statement, but to me it is a pretty big ordeal to hear that backups are worthless. I'd be interested to hear from others what their thoughts are on backups and strategies; do you think they are worth it?

I won't dive into the different methodologies of backing up databases; that would need to be an entirely different post. I personally like a good backup strategy, and for any DR plan in place you need to have one. I also like the idea of routinely verifying backups by restoring them to a DBA sandbox, so to speak.
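A rough sketch of what that routine verification might look like; the database name, file paths, and logical file names are all examples, not from any particular system (check the real logical names with RESTORE FILELISTONLY):

```sql
-- Take the backup with a checksum so page corruption is caught at backup time
BACKUP DATABASE SalesDB
    TO DISK = N'D:\Backups\SalesDB_Full.bak'
    WITH CHECKSUM, INIT;

-- Cheap sanity check: confirms the backup media is readable and complete
RESTORE VERIFYONLY
    FROM DISK = N'D:\Backups\SalesDB_Full.bak'
    WITH CHECKSUM;

-- The real test: restore it to the sandbox under a different name
-- (logical file names here are assumptions; verify with RESTORE FILELISTONLY)
RESTORE DATABASE SalesDB_Sandbox
    FROM DISK = N'D:\Backups\SalesDB_Full.bak'
    WITH MOVE N'SalesDB'     TO N'D:\Sandbox\SalesDB_Sandbox.mdf',
         MOVE N'SalesDB_log' TO N'D:\Sandbox\SalesDB_Sandbox_log.ldf',
         RECOVERY;
```

VERIFYONLY proves the file is intact; only the actual restore proves the backup is usable.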

In any sense have a good one guys, thanks for listening to my ramblings on the matter!

Are Developers' Data Scripts Standard?

In all my years of experience with SQL, and having been a developer before transitioning to become a DBA, I will answer for myself personally: NO. Each developer has their own niche or style they use when writing code. I've seen some pretty wild and insane ways people develop. I worked with one guy several years ago who would name his counters after TV shows, so every now and then you might have a skipper or an S.S. Minnow thrown in. I hope and trust that development teams today have set better standards than the ones we had years ago. However, even with standards, I think as DBAs we need to communicate and bridge the gap: it just makes sense that when you receive a data script or migration script, it is in an expected format that everyone adheres and agrees to, with no gotchas.

Recently the standard below was presented. Given the business needs, the script allows you to specify the servers it is permitted to run on. This is particularly handy since people may accidentally run a script on an environment it shouldn't be run on. It is also helpful when dealing with a CMS server.

USE [DBName]

IF @@SERVERNAME IN ('ServerName', 'ServerName') /*Enter Valid Server Name/Instance*/
  BEGIN
    BEGIN TRY
      BEGIN TRANSACTION

      /*Enter Code*/

      COMMIT TRANSACTION
      PRINT 'Successfully committed!'
    END TRY
    BEGIN CATCH
      IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION
      PRINT 'An error has occurred.  Rolled back!'
      PRINT ERROR_MESSAGE()
    END CATCH
  END
ELSE
  BEGIN
    PRINT 'PLEASE EXECUTE ON ServerName ONLY!' /*change server name*/
  END

Whatever your choice of standard is, work toward getting things streamlined and laid out so there are not variations of the same kind of step. When a standard is up front and in your face, it is easier to follow and maintain.

Work Encouragement a DBA Perspective

Motivation... what is it? Webster's Dictionary defines it as a motivating force, stimulus, or influence. I'm often asked, "What is your motivation?" or "Why should I be motivated?" Each person is different; I cannot tell you why you should or shouldn't be motivated. Work-wise, I think we should take pride in what we do. At the end of the day I do not send something out with my name on it that doesn't meet my standards (which I set pretty high for myself). What do you think motivates the guy in the picture? For me as a DBA, I'm motivated to help our team keep every server up and running with the best security, the least amount of downtime, always accessible, fine-tuning processes like the exercise of rinse, wash, and repeat, and the list can go on and on.

I have a sheet of paper on my office wall; on the paper I have the following:

  • Believe while others doubt
  • Plan while others are playing
  • Study while others are sleeping
  • Decide while others are delaying
  • Prepare while others are daydreaming
  • Begin while others are procrastinating
  • Work while others are wishing
  • Save while others are wasting
  • Listen while others are talking
  • Smile while others are frowning
  • Commend while others are criticizing
  • Persist while others are quitting

So I ask this question; what motivates you? Do you take pride in your work?

Removing Read Only Attribute From Files Using SSIS

Recently I built an SSIS package that searches my TFS structure after getting the latest version. I had many files I needed to bring over to one location, so I built out a simple SSIS package to do it. I failed to realize that when I bring files down from TFS to my local environment they are protected as read-only. This caused me a problem, as I was not able to clear out my repository directory; you will inevitably get a "permission denied" error.

To avoid manual intervention every time I perform this action, I tacked a Script Task onto my package. The basic flow of the package is as follows:

  • Check if my landing zone (repository) is available. If it is, use the existing directory; if it is not, create it.
  • If there are any files in the directory prior to bringing over my new files, remove them from the directory.
  • Grab 16 various files from different locations and drop them into the landing zone.
  • Turn off the read-only attribute of the files brought over.
  • Rinse and repeat as needed.

It is in the fourth step where I turn off the read-only attribute. The Script Task in my package is shown below:

Read Only Attribute (Script Task)

By double-clicking the Script Task you will be taken to the Script Task Editor screen. Simply click on "Edit Script":

Edit Script Task

After clicking the "Edit Script" button you will be taken to the SSIS script editor. The code below provides a simple way of removing the read-only attribute. In my beta version I have left the directory value hard-coded, but eventually I would like to pass it in as a parameter so I won't have to go into the code to change it. Also note that you can swap the false for true and turn the read-only attribute back on.

public void Main()
{
    // Directory is hard-coded for now; ideally this would come in as a package variable
    System.IO.DirectoryInfo directory = new System.IO.DirectoryInfo(@"C:\INRepository");

    foreach (System.IO.FileInfo fi in directory.GetFiles())
    {
        fi.IsReadOnly = false; // swap to true to re-apply read-only
    }
}