Okay, okay. Maybe DTS left a bad taste in my mouth, or maybe it was because I’ve developed over 256 DTS packages and I still twitch at night. Or maybe it was that I used the first edition of SSIS and, while it was better than DTS, I still wasn’t a fan. Then one of my old colleagues touched base with me and told me that I needed to give the new SSIS version a shot; he was curious what I would think. I reluctantly held him at bay for a while, but then an opportunity presented itself one day last week: a current colleague approached me with an issue where he had some .txt files containing SQL code that he was running manually.
The first thing that popped into my head was giving SSIS another shot. I initially said no way, not going there, but figured, ah, what the heck – let’s give it a try. The task at hand was quite simple, and by no means did I overcomplicate anything; I just wanted to take the files and load them to help my buddy out. It took me all of 8 minutes to whip up a quick package. Below is a representation of how I accomplished it; I will try to elaborate step by step.
I’m assuming that one already knows how to open SSIS in Studio…
Step 1: Connection Managers – you will need three of them
- Connection Manager (new database connection) – holds the server name and the database on that server
- SMTP Connection Manager – this pertains to the email functionality and the SMTP server
- File Connection Manager – an empty connection for an existing file that will be used in a later step
Step 2: From the toolbox I selected the Foreach Loop Container; this allows me to loop through all of the files I’m looking for in my specified directory. Double-click the container (or go to its properties) and you will be presented with the window below. For my testing purposes I created a test folder called Test1 on my local drive. The files I need to loop through are .txt files; in my example I am going to load all .txt files within the directory.
Step 3: From the toolbox drag the Execute SQL Task into the Foreach Loop Container.
Double-click the Execute SQL Task to go to its properties. The connection needs to be the database connection from the Connection Manager that was set up in Step 1. Next, the FileConnection needs to be the File Connection Manager that was also established in Step 1.
Step 4: I always set up an SMTP Connection Manager so I can utilize the Send Mail Task from the toolbox. This allows me to notify the appropriate parties on failure or success. Simply drag two Send Mail Tasks from the toolbox and, in the properties section, add your SMTP server and how you want to connect. Click on the Foreach Loop Container and drag a precedence arrow to each Send Mail Task; to change one to fire on failure, right-click the arrow and select Failure.
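The two Send Mail Tasks boil down to composing a success message and a failure message. As a rough Python sketch of that (the addresses are placeholders I made up; actually sending would go through smtplib.SMTP against the server in the SMTP Connection Manager):

```python
from email.message import EmailMessage

def build_notification(succeeded, package_name, recipients):
    """Compose the mail one of the two Send Mail Tasks would deliver;
    one task fires on success, the other on failure."""
    status = "succeeded" if succeeded else "FAILED"
    msg = EmailMessage()
    msg["Subject"] = f"SSIS package {package_name} {status}"
    msg["From"] = "ssis@example.com"   # placeholder sender address
    msg["To"] = ", ".join(recipients)
    msg.set_content(f"The file load package {status} on this run.")
    # to send: smtplib.SMTP("your-smtp-server").send_message(msg)
    return msg
```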
Pretty simple, in a nutshell. I felt the tools were very easy to use, and while SSIS can do far more powerful things, this solution gave me a way to loop through the files in the designated directory, load them without issues, and get notified of the outcome.
Not covered here, but something I will touch on, is the fact that I liked the .ispac files the deployment utility created. I found them much easier to work with than using the manifest, etc. All in all I was quite pleased.
People I have spoken with over a period of time know that I have dubbed this “The Book“. There are several reasons why I am stoked to get my hands on a copy (it should be in transit to me as I type!), but one of the main ones is that I get to check out what four of the authors, whom I follow daily from a DBA standpoint, have to say.
I am certain that all of the authors deserve kudos for their efforts in getting their respective pieces completed, but personally I wanted to give a big shout-out to Grant Fritchey, Gail Shaw, Jason Strate, and Chris Shaw. These four people have helped me tremendously in my career thus far through their efforts in the community and their willingness to share their knowledge.
I’m looking forward to seeing what the book has to offer. I suggest you give it a shot if you haven’t thought about buying it yet.
You can find the book on Amazon here.
What is an MVP?
For me, growing up in the realm of sports through high school and college, an MVP is a most valuable player: an honor bestowed on someone that distinguishes them as being recognized by their peers in their area or field.
What is a SQL MVP?
This carries over from my statement above on what an MVP is. I have friends who are SQL MVPs and some who aren’t. Microsoft’s SQL MVP program recognizes individuals who make exceptional contributions to technical communities, sharing their passion, knowledge, and expertise.
Am I a SQL MVP?
No, I am not currently a SQL MVP, and this is where my thoughts and this blog really come to life; it is the purpose of the post. As I stated before, I have several friends who are SQL MVPs and a lot who aren’t. One who is not approached me the other day by phone, and I could tell something was bothering them. After some inquiring I discovered that the person was clearly upset that they did not have an MVP title next to their name, so much so that they disclosed they were going to stop writing, stop being involved in the SQL community, and so on.
I have mad respect for all of the current SQL MVPs who are available to the community and the efforts they put forth day in and day out; they are examples to me of what hard work and diligence can achieve in this profession, and I hope one day I can become one. But I also want to share a different point of view with my fellow SQL Server professionals. The SQL community is just that: a community of individual professionals that provides a knowledge base like no other. I implore those who, like my friend, were basically about to throw in the towel to keep working hard.
I was once told by my coach, “Attitude – what you or I feel or think about something or somebody.” What’s your attitude today? Are you making a difference? Are you helping your co-workers? Are you continually learning to make yourself better? Do you want to be a game changer?
Somewhere, somebody will always be practicing, learning, and fine-tuning their skills – what will you be doing? Let’s get in the game, stay in the game, and while we’re at it we might as well have some fun with it. All the other stuff will fall into place in due time; give 110% every time out.
I enjoy getting viewpoints from others in the community. The other day I sat down and started to ponder some of the articles I’ve read that have either helped me along in my career or sparked me to think about how I would apply what I read to an issue I was having.
Below are five articles written by others in the community that have helped me over time. They are in no particular order, and I’ll do my best to give a brief synopsis of each:
Chris Shaw – Utility Databases – this topic is geared toward having a utility database at your disposal as a DBA. The link is for Chris’ demo. I first heard this topic at PASS 2011, and it made so much sense to me to keep a database at your disposal as a repository to report off of and trend from. The idea sparked other ideas, which have led to a full-blown monitoring solution for me.
Adam Machanic – sp_WhoIsActive – I am going to take the blurb directly from Adam’s site: “Who is Active? is a comprehensive server activity stored procedure based on the SQL Server 2005 and 2008 dynamic management views (DMVs). Think of it as sp_who2 on a hefty dose of anabolic steroids.” ~ Adam Machanic. If you don’t have this tool, I highly recommend checking it out.
Glenn Berry – Diagnostic Queries – the link will take you to his diagnostic queries for the different SQL versions. These have helped me tremendously throughout my time as a DBA.
Brent Ozar – sp_Blitz – I enjoy reading from Brent’s outfit; if you have never been to his blog you should check it out. Jeremiah Peschka, Kendra Little, and Jes Borland blog about some great stuff there as well. When I switched jobs I used Brent’s free utility to help me learn what the servers were and what I was dealing with. For me this was a lifesaver, and I give Brent and his team credit on this one.
John Sansom – How To Become A DBA – in this awesome article I especially liked John’s “How to become an outstanding DBA” section.
I have many more articles that I will share periodically, but these are just some to start with that I think have helped me; I hope you can get as much out of them as I have. I’m tempted to just go ahead and start doing something like this monthly or weekly – I haven’t decided yet.
Have a good one!
This month’s host for T-SQL Tuesday is Nick Haslam (b | t), and the question he poses is based on a movie called Soylent Green. I have not seen the movie; the question he asks, however, is: what is the most horrifying thing you’ve seen in SQL Server?
WHAT IS THE MOST HORRIFYING THING YOU’VE SEEN IN SQL SERVER?
Thinking back through the past, I’ve had some serious, funny, and downright “what the…?” type of things happen. I’m sure all of us have.
One that sticks out to me is from early in my career, when I was fortunate enough to work for a mid-size company earning millions of dollars in revenue. Back then I had my own homegrown checklist to go through, so after getting acclimated to everything I started going down my checks.
I came to my backup check… I started searching… looking… looking… still looking. Come to find out, a previous employee had removed all of the backup jobs, which led me to the next question: were there daily checks for certain things – oh, I don’t know, verification that backups were occurring? Unfortunately, the person who had been handed the daily DBA tasks was not the person who should have been in a position of such importance.
I will not say how long the company went without any backups, but needless to say, that issue, along with several others, was resolved!
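A simple recurring backup-age check would have caught this much sooner. Here is a rough Python sketch of that kind of check, with made-up data; in a real environment the last-backup dates would come from the server's backup history (for SQL Server, msdb) rather than a hard-coded dictionary.

```python
from datetime import datetime, timedelta

def databases_missing_backups(last_backups, max_age_hours=24, now=None):
    """Return the databases whose most recent backup is older than the
    threshold, or that have no recorded backup at all."""
    now = now or datetime.now()
    cutoff = now - timedelta(hours=max_age_hours)
    return sorted(db for db, last in last_backups.items()
                  if last is None or last < cutoff)
```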
For information on hosting or getting involved with T-SQL Tuesday you can contact Adam Machanic.
Last year, in 2011, I had the opportunity to attend this conference. I had never been to Seattle, and my peers who had attended before me said that I would be amazed at what I was getting myself into.
I’m not sure how to explain it other than by telling you about my experience.
Upon arriving at the conference I was amazed to see that I was there with roughly three to five thousand of my fellow community brothers and sisters. I am not by any means an introvert, but I can see how people would be awestruck when walking into the convention center.
I quickly found that networking, sharing issues, sharing solutions, and gaining knowledge were at the forefront. Each day PASS puts on a luncheon; now I tell you one thing… food… SQL… community… you can’t really top that.
Last year I had the opportunity to meet some of the most respected SQL MVPs in the industry, such as Chris Shaw, Jason Strate, Brad McGehee, Grant Fritchey, and Steve Jones.
FIRST TIME ATTENDING
First and foremost, let me congratulate you on attending. Secondly, don’t be alarmed or get too overwhelmed. PASS does a great job with orientation and getting attendees acclimated to the week ahead.
For an insight into the workshops and orientation you can go directly to the PASS website for reference.
GET IN THE GAME
This will be the most beneficial SQL-related conference you could attend. If you are looking for something that could change your career or outlook, I would seriously consider attending; where else will you get this kind of access to the SQL community?
Registration is simple; for exact information you can visit the PASS site directly here.
Recently I did a post on Disaster Recovery; well, I just went through a simulated test, which I believe is a good thing. It provided a plethora of information on gaps that we need to shore up.
One of the things that came to mind during this simulated test is that most companies I’ve worked with in the past have an annual Disaster Recovery test. Then my mind wandered some more, and I thought about not only the frequency but also the idea of a totally unplanned Disaster Recovery test.
I’m very curious what others think about this topic and what they are doing. Things that I would like to hear about are:
- Frequency – how often do you DR test?
- Environment – is your DR environment the same as your prod environment in terms of hardware, sizing, etc., or do you have just enough to get by?
- Planned or unplanned – do you have a totally planned simulation, or do only a select few individuals know, with the DR triggered on a certain date?
- What is your favorite method for bringing data over to the DR site?
- Do you have checklists in place?
These are just a few of the topics that ran through my head over the weekend. If you get time, drop me a line and let me know your thoughts, and please feel free to expound on any topics related to DR.