Worst Day As A DBA


I remember the day pretty vividly. It was summer, and I was as green as green can be coming into the technology field. Walking through the doors at the start of my career, I was ready to tackle the world. The new-hire training sessions had already been completed and it was game time. I didn’t know what a SQL High Five meant at the time, but I would have given one to everybody; that is how I was feeling walking through those doors.

The introductions began and I got the usual “new guy,” “fresh meat,” and “greenhorn” labels that most people hang on newbies. As we turned a corner I noticed something that looked a bit off to me: a pen cap stuck in the keyboard at one of the developers’ desks. Hmm, that’s odd; they didn’t teach anything about pen caps in keyboards in college. The desk’s owner was not there at the time, but we did find him in the break room, where he was introduced to me as one of the developers.

Time passed, and with the introductions complete I went back to my desk in my little cubicle but couldn’t help thinking back to that pen cap. The curiosity was too great, so off I went, back over to the desk with the pen cap; mind you, this is about an hour later.

“Hey man, I’ve got to ask you a question. What’s up with that pen cap?”

“Oh, yeah, I was building an import process and I forgot and left a message box in there. I started to load the files in, and instead of stopping it I figured this was a quick way to get through it.”

Hmm, interesting tactic, but the red flags and sirens started to go off in my head. Being the new kid on the street, and the youngster, I went to one of my peers and started poking around a bit. Explaining what I saw, I was amazed to learn that this had in fact happened before.

“Before.” Little did I know those six letters would start to build the foundation of my DBA career. What, wait a minute… “before,” you say? Yeah, ole Billy (no, not his real name) over there has done that “before.” Nice. So I go back to my desk again and sit down, take out a pen and paper, and start writing down my questions.

1. Where is Billy loading this data?

2. What kind of data is Billy loading?

3. What kind of access does Billy have?

4. Does my boss know this?

5. What method is being used to import the data?

6. Who is the business owner?

Now realize I hadn’t even turned my computer on yet to get the lay of the land. Off I went with my questions.

“Um, hey Billy, just out of curiosity, where are you loading that data?” “Prod,” he replies.

My heart sank; I saw the writing on the wall and where this was going. “What kind of data are you using?” “Client data for our system.” Back to my desk I go; I sit down and flip on the computer. I started researching and digging, and sure enough my fears were now a reality.

Dev Ops had gone rogue and had direct access to prod. Remember the pen cap? Well, after realizing that the import was loading more data than the file contained, we discovered the app had no stopping mechanism and no duplicate checks. In the end we were loading a 100,000-record file 8 times!
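Looking back, even a simple existence check would have made rerunning that file harmless. Here is a minimal T-SQL sketch of the kind of duplicate guard the import was missing; the table and column names are hypothetical, purely for illustration:

```sql
-- Illustrative only: insert rows from staging that are not already
-- in the target, so loading the same file twice adds nothing new.
INSERT INTO dbo.ClientData (ClientID, RecordDate, Amount)
SELECT  s.ClientID, s.RecordDate, s.Amount
FROM    dbo.ClientData_Staging AS s
WHERE   NOT EXISTS (
            SELECT 1
            FROM   dbo.ClientData AS t
            WHERE  t.ClientID   = s.ClientID
              AND  t.RecordDate = s.RecordDate
        );
```

Another common approach is a load-tracking table keyed on the file name, so the process refuses to import a file it has already seen.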

A light bulb went off in my head as I turned to a colleague. “Hey, where is the last backup?” “It’s on the X drive, but it won’t do you any good.” “Why is that?” I ask. “Yeah, it’s a week old. We run them manually before we leave for the day.”

In the end, that worst day kicked off my worst week, but looking back I believe it also laid the groundwork for a solid foundation. How or why is that, you ask?

1. Security – I’m a huge proponent of it and probably rightfully so after enduring the major cleanup that ensued.

2. Documentation – none was found anywhere; we can all do a better job of this, me included.

3. Don’t be afraid to speak up; if something is off to you question it. Research it. Dig in and figure it out.

4. Just because something is done one way for years doesn’t mean it is the right way. Evolve and become more efficient. Do you think having a pen cap on an enter key to load data is efficient?

5. Whether you are a newbie or a seasoned vet, review your systems on a routine, scheduled basis.

6. Backups – are you taking them? If so, are you in turn testing and validating them?
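On that last point, even without third-party tooling, native T-SQL gives you a starting point. A hedged sketch (the database name and path are placeholders); keep in mind VERIFYONLY only proves the backup media is readable, and a real test means actually restoring somewhere:

```sql
-- Take a full backup with a checksum so page-level corruption
-- is caught at backup time rather than at restore time.
BACKUP DATABASE [YourDatabase]
TO DISK = N'X:\Backups\YourDatabase_Full.bak'
WITH CHECKSUM, INIT;

-- Confirm the backup is readable and its checksums are intact.
-- This does NOT prove the database restores cleanly;
-- schedule real test restores to a scratch server as well.
RESTORE VERIFYONLY
FROM DISK = N'X:\Backups\YourDatabase_Full.bak'
WITH CHECKSUM;
```

And of course, schedule the backups through the Agent rather than running them by hand before you leave for the day.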

Some of the things I know now that I didn’t then include handy utilities such as Red Gate’s SQL Backup Pro, which could have benefited me; take a look at their arsenal for the data professional, a wide range of products that will let you streamline your processes and tasks.

I look back on my time there and how far we brought it. We righted the ship, but it was no easy task and not for the faint of heart. It taught me to chip away at the wrongs and turn them into rights. I speak a lot about being a game changer. That means if you see something amiss, go after it. Make it right.

While I have had a few “worst” days since then, I’ve learned one thing about being a Data Professional ~ it comes with a price tag of great responsibility. Don’t abuse it.



PASS Summit – What Does It Mean To Me?

PASS Summit: what does it mean to me? So listen, I’m not perfect. I will never claim to be, and you will never hear me utter those words. I make mistakes every day, but I try to learn from those mistakes as much as possible.

I was asked by several people yesterday, via email and word-of-mouth conversations, what PASS Summit meant to me and how our community is doing. During the work day I was not able to keep up with all the happenings since the session selections came out, but I did catch quite a bit on my feed. Since I was approached, and being just one voice in this big game, I thought it prudent to share with others what PASS Summit has meant to me.

For me personally, PASS Summit changed my career. I rolled into town (Seattle) back in 2011 not sure what to expect. Brand new to this scene; I didn’t know anyone from anybody. I can still remember to this day walking into the convention center thinking to myself, “What in the world did I get myself into?”

Each session I went to seemed to give me something I could take back and incorporate into my job. I was able to meet and interact with fellow colleagues in technology from all over the world. It exposed me to another part of what we call “The SQL Family” I had not known before.

I can recall purchasing the Deep Dives book and introducing myself to all the MVP contributors, and eating breakfast and lunch with 5,000 people before going off to learn and enhance my skill set. It lit a fire in me for my career that I hadn’t had before. Some of the techniques I learned then are still part of my everyday work now. So, as you can see, PASS Summit has meant a great deal to me and to where I am today.

Fast forward to today: what spawned these questions stems from the session-selection discussions. I will not dive into processes or procedures, as I am not privy to the background and inner workings of session selection. That’s not my goal, nor do I want it to be with this post. I see many points, some valid and some not in my opinion; however, I do believe that is part of being a SQL Family / community. We can share our thoughts, opinions, and concerns, and review processes, policies, and procedures. That is the basis and foundation of how we grow and improve. At the end of the day we are all in this together.

I look forward to attending this year and learning from a great group of speakers. Heck, all the volunteers, speakers, and attendees take time out of their family lives, work schedules, and the like to attend. It is definitely a unique environment and one that I hope can continue to grow and overcome hurdles. Who knows; I hope in the very near future one of my sessions is selected, and maybe then I too can share what I’ve learned along my journey.

Looking back, I can honestly say the conference in its entirety changed my career, my outlook, and my drive. I don’t have all the answers, but I will continue to give it all I’ve got day in and day out. As a past attendee, I thank all the speakers (both old and new) and the volunteers who make this happen.

What about you? What do you think about PASS Summit? What are some of your opinions on the selection process? Can we improve, and if so, how?


SQL Saturday 286 Roundup

I’ve got to admit, this SQL Saturday was a power-packed lineup with some really talented speakers. Due to other speaking obligations I was unable to submit a session or attend the full day, but the first half of the day did not disappoint. The venue at Indiana Wesleyan was fantastic, and my hat’s off to all the speakers and volunteers.

I was able to attend two and a half sessions (yes, you read that right, as I wasn’t able to stay for the entirety of the last session). I didn’t realize it then, but looking back I was glad to see that the sessions I did attend were given by my fellow Friends of Red Gate colleagues.

Session 1: The day started off early with a session on Changing Your Habits To Improve Performance of Your T-SQL by Mickey Stuewe ( B | T ). It was a smooth session that captured the audience with both intelligence and structure, with several good points on how to improve SQL queries, formatting, and cursors. It was even nice to see an attendee ask what formatting tool was being used; the reply was SQL Prompt.

Not long into it, one of my fab five walked into the room: Mr. Steve Jones ( B | T ) himself. It was an honor to meet him and have him give a session here in Louisville, even though I could not be in attendance for it. He is one of the most down-to-earth people I’ve met in the SQL community thus far.

Session 2: Next up on the list was a speaker I’ve been wanting to hear for a while ~ Wayne Sheffield ( B | T ). His session was titled Table Vars & Temp Tables – WHAT YOU NEED TO KNOW! This session was not for the faint of heart, as we dove into a fair amount of technical detail surrounding myths and the like. It was a very interactive session that provided some great insight into the internals of the structures found in tempdb. For those who have not heard Wayne speak on this topic, I urge you to download the session and go through it at your leisure; there is a treasure trove of goodness to be found.

Session 3: Last up on my list, prior to my departure for my own speaking engagement, was none other than the Scary DBA himself, Grant Fritchey ( B | T ). His reputation precedes him, but I found it awesome that when I walked in he was doing burpees (a form of CrossFit training). His session, Best Practices for Database Deployment, was one I wanted to stay for. Grant is one of the coolest speakers I’ve been able to hear in a while; not sure if it is because I can relate to the DBA portion or what, but the first half was spot on regarding the structure of DBAs, developers, and releases, and finding the middle ground. At the end of the day we all want to end up at the same goal, and working together, not against each other, is key.

I hated to miss Steve Jones, Ed Watson ( B | T ), and Jason Brimhall’s ( B | T ) afternoon sessions, but I’m glad they can be downloaded here. (Please check the schedule tab; the downloads will be found there.)

I’m glad I was able to meet a lot of new faces and interact with some fresh talent coming up. The community is alive and well, guys; groups like the one experienced today in Louisville show the eagerness of people wanting to learn, taking time out of their own days. I’m thankful for the families who support us in our SQL endeavors. I can’t wait to submit some sessions next year.

None of today would have been possible if it weren’t for Malathi Mahadevan ( B | T ). My hat’s off to you for your continued effort year in and year out.

Until the next time……

Alerts – Who Knew?

This week I am back at it with my SQLCoOp friends, sharing something new that we’ve learned since our last post. You ready? Great, pull up a chair and let’s see where this takes us, shall we?

Wait a second… do you hear that… no, listen closer… ALERT ALERT ALERT! Sirens seem to be going off all around, and somehow we found ourselves in all-hands-on-deck mode. Have you ever been there before? Chances are, if you have worked in technology at all, this has happened to you at some point in your career. If it hasn’t, then eventually it will; trust me.

How do you handle such scenarios when they do come up? Reporting alerts off your systems or applications can be very useful, and while there are many ways to accomplish this, I was introduced to an SSIS method I had to deploy this week. It made me dive in a little deeper and look into the methodology behind it. If we break it down, it can be understood by analyzing 4 steps within the package.

Main Package:


The main package consists of a database call, a Foreach Loop container, a set-variables task, and then a web service call (via a WSDL file) utilizing an expression. The intent of this post is to show how you can make these calls happen, not to go into depth on what is located within the procedures or the WSDL file; that shall be something for another day. Better yet, this should set a foundation in place for additional research you can do… thus the learning-something-new part.

Getting the Exceptions

The Get Exceptions alert utilizes a simple Execute SQL Task; once you create this task you will notice the screen shot below. To complete the setup, simply fill in the name and description. The next two properties are the most important of them all: the connection and the SQL statement. The connection is simply the database you are connecting to; the SQL statement executes a stored procedure that gathers all the alerts, by application, that need to be emailed out.
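I won’t reproduce the real procedure here, but to make the step concrete, here is a hypothetical sketch of the kind of stored procedure the Execute SQL Task might call; the procedure name, table, and flag column are all illustrative assumptions, not the actual application code:

```sql
-- Illustrative only: returns one row per alert still waiting to be emailed.
-- The columns mirror the package variables set later in the Foreach loop.
CREATE PROCEDURE dbo.GetExceptionAlerts
AS
BEGIN
    SET NOCOUNT ON;

    SELECT  a.Alert,
            a.AlertInfo,
            a.AlertTypeID,
            a.Result,
            a.SourceProcess,
            a.SourceSystem
    FROM    dbo.ApplicationAlerts AS a
    WHERE   a.Emailed = 0;   -- only alerts not yet sent
END;
```

The full result set returned by this call is what the Foreach Loop container iterates over in the next step.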


For Each Container

Remember the Execute SQL Task we set up in the first step? This next step takes that result set, loops through the iterations, and gets a collection of the alerts to be sent. Part of the homework is setting up the return result set in the Execute SQL Task. Once that is complete, set the result set in the ADO object source variable noted below; this will allow multiple iterations to flow through. In this case, the enumeration mode should be set to rows in the first table.


Set Variables

One of the key components is the set of variables you will utilize. These may differ depending on how in-depth the alerting needs to be, but for simplicity’s sake I will list out what the package I ran across was utilizing. Notice that we have seven variables that will be set.

  • Alert – the alert name
  • Alert Info – what is so fascinating about this alert
  • Alert Type ID – correlates back to the type of alert that was set off
  • Result – the result of the alert
  • Result Data Set – the data that caused the alert and in this case not used
  • Source Process – what process triggered the alert
  • Source System – what system did this come from


Registering the Alerts

Registering the alert and sending the email notification is the last piece of the puzzle. This is done by utilizing the Web Service Task within SSIS. As you can see, I have a WSDL file located on my local drive in a directory called Windows Services. As I mentioned before, I will not go into detail on the contents of the WSDL file; simply know that it triggers the email notification. The Result parameter will be fired off to a group of individuals to review on a scheduled basis. That parameter value is supplied in the output section of the Web Service Task editor.



There are several ways to arrive at the same goal, and I found this one to be unique and a road not traveled often. The package can be set up to be executed by SQL Server Agent on a time-based interval. Whatever the case may be, and whatever method you choose, the important part is to keep learning.

I tell you what; check out what my other colleagues have to say about something they learned recently:

On a SQL Collaboration Quest

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. Along the way, they will also include other SQL professionals to join in the collaboration.