Tag Archives: TSQL2sDay

T-SQL Tuesday #72 Invitation – Data Modeling Gone Wrong

T-SQL Tuesday is here again. I’ve had good intentions the past few times this event has come around, and I even have drafts still waiting to be queued up that I will eventually turn into regular blog posts, but I decided to just make time this month and jump back into the monthly party.

This month Mickey Stuewe (b|t) is hosting and has asked for some data modeling practices that should be avoided, and how to fix them if they occur.

What is Data Modeling?

Data modeling is generally described as the first step of database design, as you move from a conceptual model, to a logical model, to the actual physical schema.

While that definition sounds simplistic, it leads to a bigger conclusion: data modeling is a very important part of database design at every level.

What to Avoid?

As a data professional and as a senior manager I’ve seen widespread pitfalls across various business units when it comes to design architecture. The list you are about to read covers some of the methods and items I’ve discovered on my journey and while conducting gap-analysis exercises; each one carries a chain reaction, and each can doom a design from the get-go.

  1. Audience – the audience and/or participants should be defined up front. I differ with many here, and that’s okay. To me, identifying the business stakeholders, subject matter experts, technical groups, and BAs is an integral piece of the puzzle. Too many times I have seen the engine pull out of the gates with a design, only to find out that the design and documentation do not even meet the criteria and standards of the shop.
  2. Detailed Project – how many times have you received documentation only to find there was not enough meat to get the project off the ground? As data professionals we do think outside the box; however, it is imperative to be clear and concise up front. When my team is given projects that involve database design and creation, I implore business units to provide as much agreed-upon detail up front as possible. It streamlines the work and makes for better efficiency.
  3. Understandability – with detail comes the ability to articulate understandably. All too often items get lost in translation, which causes additional work on the back end of the database. This can mean unfortunate schema changes, large amounts of affected data, and so on.
  4. Business Continuity – ask yourself a question in the design phase: is what you are building, and will present to the business, efficient? Will the business be able to decipher what is being presented back to them, and if not, why not?
  5. Downstream Analytics – how does the business want to see this data in the form of analytics or reporting? Most modern systems are going to either be queried by, or push data to, ETL processes that populate warehouses or other semantic structures. Avoid complex table relationships that can only be interpreted by the code that stores the data. Make sure you define all your data domains so that the BI professionals are not scratching their heads trying to interpret what a status of ‘8’ means; a small sketch of that idea follows this list. (Credit to a colleague at my shop, Tom Taylor, who brought up this valid point.)
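
To make the data-domain point concrete, here is a minimal T-SQL sketch. The OrderStatus and Orders tables are purely hypothetical; the idea is simply that every coded value gets a name somewhere the BI team can join to.

    -- Hypothetical lookup table: give every status code a name and a home
    CREATE TABLE dbo.OrderStatus
    (
        OrderStatusId tinyint     NOT NULL CONSTRAINT PK_OrderStatus PRIMARY KEY,
        StatusName    varchar(50) NOT NULL CONSTRAINT UQ_OrderStatus_Name UNIQUE
    );

    INSERT INTO dbo.OrderStatus (OrderStatusId, StatusName)
    VALUES (1, 'Open'), (4, 'Shipped'), (8, 'Cancelled');

    -- Reporting and ETL can now join to the lookup (dbo.Orders is assumed here)
    -- instead of decoding magic numbers buried in application code
    SELECT o.OrderId, s.StatusName
    FROM dbo.Orders AS o
    JOIN dbo.OrderStatus AS s ON s.OrderStatusId = o.OrderStatusId;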

Items To Look For

Some key and general practices to look at and decide on are:

  1. Primary Keys – yes, they are your friend – add them (a quick check for tables missing them is sketched after this list).
  2. Look at all audit data and what needs to be audited
  3. Clustered/Non Clustered indexes – have you read through your execution plan?
  4. Has the scope of the data model been met?
  5. Are tables normalized properly?
  6. One Data Modeling Tool – it’s easier if the team is looking at one utility together; if different tools are spread across many team members, the views can end up skewed.
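
On the primary key point, here is a quick sketch, meant only as a starting place, of how you might list user tables in the current database that have no primary key defined:

    SELECT s.name AS SchemaName, t.name AS TableName
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    WHERE NOT EXISTS (SELECT 1
                      FROM sys.key_constraints AS kc
                      WHERE kc.parent_object_id = t.object_id
                        AND kc.type = 'PK')   -- no primary key constraint on the table
    ORDER BY s.name, t.name;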

Conclusion

Data modeling, in and of itself, is a key component for any business. What often falls by the wayside is the legwork that should be done up front. You have to lay a proper foundation in order to be successful with any design, and take all the right people into consideration in order to make the best strategic decisions moving forward.

Hopefully the next time you go down this path you have some questions to ask yourself along with some solutions to those problems.

What is T-SQL Tuesday?

Adam Machanic (b|t) started the T-SQL Tuesday blog party in December of 2009. Each month an invitation is sent out on the first Tuesday of the month, inviting bloggers to participate in a common topic. On the second Tuesday of the month all the bloggers post their contributions to the event for everyone to read. The host sums up all the participants’ entries at the end of the week. If you are interested in hosting and are an active blogger, then reach out to Adam and let him know of your interest.

T-SQL Tuesday #60 Something New Learned


So, here it is. I put the challenge out to discuss something new learned. Last week I was fortunate enough to attend the PASS Summit in Seattle. While this post will not be my summary of that trip (that will be another post), I did have several takeaways. I sat in some stellar sessions with some renowned speakers.

However, one recurring theme kept coming to my mind – the people. Listen, I’ve been through a lot over my 15 years with SQL and my 3 years actively involved in the community, and this past week affirmed something for me: LISTEN to the people.

I place strong value on the sessions I attended; along with that, I have to note that face time, one-on-one time with real people in my industry, is about the best form of learning I could ever hope to obtain.

With that learning come issues related to both SQL and non-SQL topics. I had so many positive conversations on leadership alone that they sparked a new kind of fire within me, one that had not been burning as brightly as my technical fire.

Guys, listen. I could write 10 blog posts on how buffer size could help with backups, the need to have Always On implemented, or how to tune indexes all day long. The people, better yet the community, is where I believe the real learning lies. Out of 5,000 people last week I ended up meeting a guy who works two blocks from me, and we got to discuss the community and what it means to us.

Have you challenged yourself lately? I mean, have you really challenged yourself lately in learning something? I don’t care if you are just starting out or the most seasoned vet around; the ability to learn happens every day, and I’m learning that is what separates the exceptional data professional from the data professional.

You see, the exceptional data professional hangs around the community zone at PASS Summit to help others in the community with issues they may have. The exceptional data professional sits down next to you when you are the new kid on the block, encourages you to make the most of your career, and tells you about some of the pitfalls he or she hit so that you can avoid them. And the exceptional data professional takes you under his or her wing when you ask for help or assistance.

You don’t have to travel all the way to Seattle to learn; no, you have learning opportunities all around you, from SQL Saturdays to Virtual Chapters on the web. But it starts with you. That’s right, you have to be willing to take that first step: get involved and start learning.

I can tell you from experience and the roller coaster ride I’ve been on for the past three years that you will not regret it. Strive for excellence and provide that leadership through service that the community seeks. Yeah, I may be a tad passionate about what I do; you’ll find that kind of trait with others in the community.

So, I’ve challenged myself……..will you?

Passwords – A T-SQL Tuesday Topic

Being engrossed with the daily tasks at hand, I completely missed that this month’s T-SQL Tuesday is being brought to you by Sebastian Meine (Blog | Twitter).

Although the time frame has passed, I would be remiss if I didn’t continue on my journey of joining in these block parties; with that said, I’m going to write what I “would” have contributed. This month’s topic is intriguing in that it can cover a wide array of discussion – Passwords.

When I think of passwords I think of etiquette. I cannot tell you how many times I have been on calls, in meetings, or on email threads – the list of password-related scenarios could go on – where users just don’t think about or take into consideration the impact of their actions. To me those last four words are the key: “impact of their (our) actions.”

Password Etiquette

  • Conference Calls – how many times have you been on a production call with numerous individuals and heard someone say, “Okay, here is the user name and password”? If you have, then you are not the only one. Credentials should be kept out of the hands of unnecessary individuals.
  • Plain-text passwords in tables – check into hashing or encrypting those; protect yourself before you realize breaches have occurred and you are left holding the bag (a small sketch follows this list).
  • Email – transmitting password information via email is something I’m not a big fan of. This relates back to the conference call point: who all is on the email? Are you sending it to project managers and the like? Probably not the best choice to make.
  • Backups sent offsite – do you have any backups going offsite? Are any pertinent credentials contained in the database, and if so, are your backups being encrypted before you ship them off?
  • Length – Look at the length of the passwords you are creating; how strong is the password you are making?
  • Sharing – don’t do it; simple enough.
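
On the plain-text password point, here is a minimal sketch of storing a salted hash instead of the password itself. The table and names are purely hypothetical, and this only illustrates the principle; a purpose-built mechanism (hashing in the application with a slow algorithm, or not keeping credentials in the database at all) is generally the better answer.

    -- Hypothetical table, for illustration only
    CREATE TABLE dbo.AppUser
    (
        UserId       int identity(1,1) NOT NULL CONSTRAINT PK_AppUser PRIMARY KEY,
        UserName     nvarchar(128)     NOT NULL,
        PasswordSalt uniqueidentifier  NOT NULL,
        PasswordHash varbinary(64)     NOT NULL   -- SHA2_512 output is 64 bytes
    );

    DECLARE @Password nvarchar(128) = N'ChangeMe!Please',
            @Salt     uniqueidentifier = NEWID();

    -- Store the salt and the salted hash; the plain-text value never lands in a table
    INSERT INTO dbo.AppUser (UserName, PasswordSalt, PasswordHash)
    VALUES (N'jdoe',
            @Salt,
            HASHBYTES('SHA2_512', CONVERT(nvarchar(36), @Salt) + @Password));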

All of the above reflects what I deem good etiquette, and it barely scratches the surface. You have to take many other factors into consideration, one of them being a password policy. Small, big, medium – whatever kind of shop you are in, define what best practice looks like for your shop and then adhere to it. A good reference can be found in the TechNet best practices documentation.
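
On the policy side, SQL Server can lean on the Windows password policy for SQL authentication logins. A small sketch, where the login name and password are just placeholders:

    CREATE LOGIN ReportReader
    WITH PASSWORD = 'Str0ng&L0ng!Passphrase',
         CHECK_POLICY = ON,       -- enforce the Windows password complexity/lockout policy
         CHECK_EXPIRATION = ON;   -- enforce password expiration as well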

Lastly, if you feel as though a password has been compromised be proactive and take the necessary steps to change it. Don’t wait for something to happen; you be the game changer.

Get your defense model in place and let the good times roll.

T-SQL Tuesday #051: Place Your Bets

This month’s T-SQL Tuesday block party is coming from the renowned Jason Brimhall (blog|twitter). You see, each month a SQL community member hosts this block party, and this month’s theme is “Place Your Bets”.

So you want to gamble? Come on up to the SQL table and place your bets. Any data professional is welcome – junior level up to senior level, all money is accepted at the SQL table.

Okay, I’m in; what are we betting on today? Well, you are in luck, my friend: today’s bet is on backups. Sounds simple enough, doesn’t it? Sure, that sounds like fun, I’m all in. Well, wait, what about backups?

You’re lucky you asked; otherwise you’d be called a sucker and I would just have taken that all-in for the house, “Little Joe”.

The Scenario

It was a dark grey morning… oh wait, that’s a different story. Let’s say you have a plethora of databases that are all business critical, and you have automated tasks that back up these databases. If something goes wrong during the backup process and the process fails, then a failure notification is sent out, notifying the data professional that their process has failed so they can go take a look and fix it. All is well, right? Most would say yes, some would say no, and then there are some, the gamblers, who say who the heck cares. You have the backup process in place ~ let’s roll.

The Gamble

I bet on that scenario early in my career. I went all in with the house and, you know what, that didn’t pan out too well for me. Why, you ask? Well, gambling on whether or not my backups were solid and good opened my eyes to something that I knew but didn’t really take into consideration in the beginning stages of my career. I had a critical database being backed up… the phone rings. The proud DBA picks up the phone… yes, we have an issue and we need to look at our backup for date X. Sure thing, I’ve got it right here; I’ll restore it and we’ll take a look at it.

I go to restore it and the backup is corrupt; initially I’m thinking, well, that isn’t good. It was then that I had to go back and tell the business the backup was corrupt and I would need to go a day earlier or later to get the information. But wait, Mr. Gambler, what about T-logs, did you have those? Um, nope; the business deemed them unnecessary and didn’t want to pay for the space and growth they would need.

Conclusion

Even after taking precautions with my backups, I still feel the strong need to ensure the backups are being tested, whether through an automated process, spot checking, or something else. Taking a backup is great, but can that backup be restored? Are the settings on the backup set properly? If you can’t answer some or all of these questions, then take time today to do some checking.
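
Here is a small sketch of the kind of spot check I mean; the database name and path are placeholders, and keep in mind that RESTORE VERIFYONLY is a sanity check, not a substitute for periodically doing a real test restore.

    -- Back up with checksums so page-level corruption is caught at backup time
    BACKUP DATABASE [SalesDB]
    TO DISK = N'X:\Backups\SalesDB_Full.bak'
    WITH CHECKSUM, INIT;

    -- Confirms the backup set is complete and readable (and re-validates the
    -- checksums) without actually restoring the data
    RESTORE VERIFYONLY
    FROM DISK = N'X:\Backups\SalesDB_Full.bak'
    WITH CHECKSUM;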

Each shop is different and will have its own hurdles to climb. With that said, are you all in? Do you want to take that gamble and bet against the house? The business looks at backups as a safety net, or in some cases really doesn’t care as long as you have them. To the data professional they are much more.

I’ve always been taught to work hard and hone my skill set; for me, backups fall right into that line of thinking. Always keep improving and learn from your mistakes. From mistakes comes growth, so don’t be afraid to fail. Many successes have come from failures or setbacks.

What is T-SQL Tuesday

T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements below. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

T-SQL Tuesday #050: Automation, how much of it is the same?

There’s not a better way to start off the year than with a good ole fashioned T-SQL Tuesday block party. Just what is this block party I am speaking of… well, I’m glad you asked.

What is T-SQL Tuesday?
T-SQL Tuesday is a monthly blog party hosted by a different blogger each month. This blog party was started by Adam Machanic (blog|twitter). You can take part by posting your own participating post that fits the topic of the month and follows the requirements below. Additionally, if you are interested in hosting a future T-SQL Tuesday, contact Adam Machanic on his blog.

Who is hosting this month?

SqlChow is hosting this month’s theme on automation; to find out more about him you can visit his blog (blog|twitter).

Lights, Cameras, AUTOMATE

For a data professional, automation is key and can oftentimes be taken for granted. Most professionals take automation to mean tasks such as the following:

  • Daily Backups
  • Index Maintenance
  • Job Failure Notifications
  • Job Success Notifications
  • Routine Maintenance
  • T-Log Shipping
  • Disk Alerts
  • Space Alerts
  • New Servers
  • Extended blocking
  • Deployment Automation – thanks RedGate (Twitter)

…and the list could go on for quite some time.
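
Several of those items, job failure notifications for instance, boil down to a query against msdb. Here is a rough sketch, a starting point rather than a finished monitor, that lists jobs whose outcome was a failure within the last day:

    SELECT j.name     AS JobName,
           h.run_date AS RunDate,   -- int, yyyymmdd
           h.run_time AS RunTime,   -- int, hhmmss
           h.message  AS Outcome
    FROM msdb.dbo.sysjobs AS j
    JOIN msdb.dbo.sysjobhistory AS h
      ON h.job_id = j.job_id
    WHERE h.step_id = 0             -- step 0 holds the overall job outcome
      AND h.run_status = 0          -- 0 = failed
      AND h.run_date >= CONVERT(int, CONVERT(char(8), DATEADD(DAY, -1, GETDATE()), 112))
    ORDER BY h.run_date DESC, h.run_time DESC;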

One specific item that I have found helpful is CMS/PBM (a Central Management Server combined with Policy-Based Management). I like this combination because you can verify and evaluate all your servers from one central location. Some evaluations to think about with CMS/PBM are (the first one is sketched just after the list):

  • Last Successful Backup Date
  • Database Page Verification
  • Database Auto Shrink
  • Database Auto Close
  • Data and Log File Location
  • Backup and Data File Location
  • Blocked Process Threshold
  • All SQL Agent jobs succeeded in the last 24 hrs (you and I both know that notifications don’t always get set up)
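
As an example of the first check in that list, here is a rough sketch that flags databases whose most recent full backup is missing or more than 24 hours old; adjust the threshold to whatever your own recovery objectives demand.

    SELECT d.name AS DatabaseName,
           MAX(b.backup_finish_date) AS LastFullBackup
    FROM sys.databases AS d
    LEFT JOIN msdb.dbo.backupset AS b
           ON b.database_name = d.name
          AND b.type = 'D'            -- 'D' = full database backup
    WHERE d.name <> 'tempdb'          -- tempdb is never backed up
    GROUP BY d.name
    HAVING MAX(b.backup_finish_date) IS NULL
        OR MAX(b.backup_finish_date) < DATEADD(HOUR, -24, GETDATE());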

I am not going to go into the specifics of creating an .rdl; however, once you have all the policies in place, an .rdl file can be created and a report emailed to you directly at the start of the business day, simple as that.

If you are interested in CMS/PBM, check out the information John Sterrett (blog|twitter) has published here.

Also, another great article on automation has been published over at John Sansom’s (blog|twitter).

One last piece of advice that I will give myself along with this post is to do more extensive research on the use of PowerShell. It seems pretty powerful and easy to use up to this point, and it would appear that a lot of tasks can be automated through it.

Conclusion

In today’s data professional world, if you are not automating tasks that will make you more efficient, then why not start today? There are so many ways to automate tasks, but with that said comes great responsibility. Before you set out blindly, get some thoughts and goals down on paper, then start researching and exploring. Many data professionals before us have blazed these same trails; sure, you might find something along the way that someone else has worked on and then take it to another place, but always remember to give credit where credit is due.

To automate or not to automate, that is the question.

T-SQL Tuesday #45–Follow the Yellow Brick Road

Is it me, or does it seem like we just had a T-SQL Tuesday blog party? These days are just flying by! This month Mickey Stuewe (b|t) is hosting, and she has brought up a fabulous question about auditing with the topic Follow the Yellow Brick Road. I have been in shops on both sides of the fence, from very limited auditing to mandated regulatory auditing of their systems – so what’s my preference? Glad you asked; pull up a chair as we travel down the yellow brick road from Munchkinland while trying to avoid the wicked witch and her ape-like minions.

What to Audit?

As we start out on this journey, there are multiple questions one should ask about what needs to be audited. Let’s face it, there are many wicked witches out there who just flat out want to get at our data, and the longer I’m in the business, the more attacks I’ve seen come from within rather than from the outside. Here are a few ideas on things to audit:

  • Tables with sensitive data such as SSN’s, personal information, transaction data
  • Review of your QA, UAT, and Prod environments (some shops like to audit all 3, some like to audit only Prod) – don’t be the one who audits none!
  • Check-ins of code into your repository
  • Inserts
  • Updates
  • Growth trends
  • Security Access
  • File Shares

These are just a few ideas, but don’t limit yourself to auditing only who inserts or updates data in your systems. As you fly through the house like Dorothy did in the SQL tornado, remember to think outside the box some and audit other areas such as growth and baseline trends. The SQL environment is always evolving; use the necessary tools to keep one step ahead of the storm.

How To Audit

All right, so I’ve identified what I want to audit. It seems like a daunting task and I have no clue where to begin; that’s where the Tin Man comes into play. If you are a data professional, then you care about the data enough to even be reading this post – which means you have a heart.

There are many different avenues you can take to audit your systems; a few of those are listed below, with a bare-bones trigger sketch after the list:

  • Triggers within SQL
  • 3rd Party Auditing Tools
  • CDC
  • Home grown utilities
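
For the trigger route, here is a bare-bones sketch; the table and column names are purely hypothetical, and the idea is simply to capture who changed a sensitive value, and when.

    -- Audit table: who changed the credit limit, from what, to what, and when
    CREATE TABLE dbo.CustomerAudit
    (
        AuditId        int identity(1,1) NOT NULL CONSTRAINT PK_CustomerAudit PRIMARY KEY,
        CustomerId     int               NOT NULL,
        OldCreditLimit money             NULL,
        NewCreditLimit money             NULL,
        ChangedBy      sysname           NOT NULL CONSTRAINT DF_CustomerAudit_By DEFAULT (SUSER_SNAME()),
        ChangedAt      datetime2(0)      NOT NULL CONSTRAINT DF_CustomerAudit_At DEFAULT (SYSDATETIME())
    );
    GO

    -- Fires after updates on the (assumed) dbo.Customer table
    CREATE TRIGGER dbo.trCustomer_Audit
    ON dbo.Customer
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;

        IF UPDATE(CreditLimit)
            INSERT INTO dbo.CustomerAudit (CustomerId, OldCreditLimit, NewCreditLimit)
            SELECT d.CustomerId, d.CreditLimit, i.CreditLimit
            FROM deleted AS d
            JOIN inserted AS i ON i.CustomerId = d.CustomerId;
    END;
    GO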

Whatever avenue you decide to take, just take the stance of doing something. Doing nothing is not something I would advise; knowing who is changing your data, schema, files, etc. is an important part of any data professional’s life. And who knows, if you have a mighty Wizard of Oz such as an auditing or compliance department, they might just be thankful for your efforts.

Summary

When you are going through your own SQL journey on the yellow brick road, remember that auditing can be viewed as an asset. Look into what you are doing currently: can you improve upon it? Are you doing anything at all? Is your data being protected? Even better, do you know whether your data is being protected?

Take the time to put some safeguards in place; in the end you will be glad you did.

T-SQL Tuesday

Wait a minute, wait just a minute. What is this T-SQL Tuesday you speak of? The mighty Wizard of Oz, Adam Machanic (b|t), started this party in 2009. Basically, each month an invitation is sent out on the first Tuesday describing a topic for that month; on the second Tuesday bloggers put together a post regarding the topic and send it back to the person who is hosting. If you have a blog and are interested, contact the mighty Wizard of Oz and he’ll get you on the schedule.

T-SQL Tuesday #44 The Second Chance

What a perfect way to start back on the blog after taking a month off for vacation and family time – T-SQL Tuesday. If you are here reading this blog, chances are you already know what T-SQL Tuesday is; if you don’t, then first off I want to take a moment to fill you in. Adam Machanic (T|B) came up with this block party, held on the second Tuesday of each month. It is a party where a host can present a topic to the community and then the community can blog about the topic ~ fantastic idea! How can you host? Well, I’m glad you asked… you can contact Adam via his blog and let him know you are interested; he has all the information you need to know and what the requirements are from there.

In saying that, this month’s party is hosted by Bradley Ball (T|B) and is labeled T-SQL Tuesday #44 The Second Chance.

I’ve been asked this question from time to time at conferences or by groups I have spoken to. I keep going back to a certain point in my life that I remember very vividly, and I will try to do it justice in this blog post.

PROGRESS NOT PERFECTION

Being a developer for many years and then transitioning over to DBA for many more, I have had my share of mistakes… guess what? We all have, and no one is perfect. Mistakes will occur; it is inevitable. How you learn from those mistakes is integral to your professional growth. Do you pick yourself up and accept needed constructive criticism from a co-worker? Or maybe no one knows about the mistake that you made – in that case, do you own up to it or do you try to sweep it under the rug? It’s all right to make mistakes ~ how do you learn from them?

The Blunder

I remember coming right out of college and accepting my first position. It was a shop where a friend of mine from high school worked; he had already been on the job developing and working with SQL for some time, and I was as green as the grass that grows in the summer time. One of the duties I was given was building import processes for files to load into the system. Back then there wasn’t SSIS; we used a language called Progress, which is similar to VB (Visual Basic). Now that I’ve dated myself a bit, I’ll share the mishap. I built an import process, it was tested, and it moved on to production. Without going into too much detail about the broken processes in place, a break in the code would display a message box when certain criteria were hit. Needless to say, in a 500,000-record file that scenario was hit often. What was the solution? Instead of asking for assistance, and to avoid hitting the enter key 500,000 times on the message box, another developer decided it would be a grand idea to stick a pen cap in place to hold the enter key down. Now, mind you, a 20-year-old might not be a seasoned vet, but even then this smelled fishy. The import graciously finished, but since the pen cap was doing such a nice job it just went on ahead and loaded the data again on a second run. After realizing what had happened, it felt prudent to rope the ole boss in. Ah yes, that was a fun conversation – one might ask, you mean you loaded duplicates into the system; didn’t the import take that into consideration? I’ll plead the fifth. I learned how to clean up a lot of data at an early point in my career!

SUMMARY

Everyone will make mistakes; how you learn from them, and how you go about turning a negative into a positive, rests solely on how you handle the situation. You will have choices, and times will come when mistakes occur ~ Progress Not Perfection.

T-SQL Tuesday #41 Presenting and Loving It

This month’s T-SQL Tuesday party is being hosted by Bob Pusateri (a.k.a. SQL BOB) (B | T). Bob’s topic for this month is intriguing to me, as I wrestled with it for some time when first starting out – presenting!

I think for me personally, I really started to think about presenting after returning from my first PASS Summit (2011). Sure, I’ve given speeches in college and talked openly in discussions, but something drew me to speaking on a topic that I enjoy working with every day ~ SQL.

  • First Presentation – my first presentation went surprisingly well. I had been in front of large crowds on many occasions so talking wasn’t my initial problem; my initial problem was knowing that the attendees came to get info and to learn, and I needed to be sure I was prepared to give that to them.
  • Investment – I started to look at speaking as investing time in others. I cannot begin to tell you how beneficial and fortunate I have been to study, learn, and tap into some of the best minds in the SQL industry. They have taken the time to invest in me to further my knowledge and career; I should do the same and help others along the way if I can. You see that is what is so great about the SQL Community; the community is there and always helps.

I know there are other people out there that are hesitant to take the plunge in speaking. I have personally experienced some great benefits to speaking. For those that are hesitant these might be worth considering.

  • Gain knowledge – when you are presenting on a topic it drives you to gain more knowledge about the topic. Knowing the ins and outs and going through test results and scenarios.
  • Networking – meeting new faces in the community; getting involved with peers
  • SQL Saturdays – attending new SQL Saturdays in different locations
  • Giving back – as I already stated giving back to the community that has helped me in so many ways over the years

Speaking is like anything else you set your mind to: practice and you will get better. If you haven’t taken the plunge yet, I encourage you to try it once and see how you like it; you might find something out about yourself.

Want to host a party?

So you want to know if you are eligible to host a T-SQL Tuesday party? Glad you asked, you can contact Adam Machanic (B | T) who is the founder of T-SQL Tuesday.

T-SQL Tuesday #040: File and Filegroup Wisdom


It’s that time again for the T-SQL Tuesday party! This party was created by none other than Adam Machanic (Twitter). If you are interested in hosting a party at some point this year give him a shout; you need to have participated in two T-SQL Tuesdays along the way and also maintain your own blog for at least 6 months.

Now that we’ve covered what the party is all about, let’s get into this month’s theme, which is centered around filegroups and is hosted by Jen McCown / MidnightDBA.

My focus today is geared toward indexes on filegroups and what they can do for your index strategy. I’m a big fan of having strategies when tackling issues, problems, or, believe it or not, even the beginning of a project. Placing indexes on filegroups carefully can improve query performance (at the same time I want to note that indexes can also hurt performance in some situations, so thorough testing needs to be taken into consideration).

Going back to my 2008 R2 studies, if memory serves me correctly, indexes are stored in the same filegroup as their table by default, and a non-partitioned clustered index and the associated table always reside in the same filegroup. However, you can do one of three things (a quick sketch of the second option follows the list):

  1. You can partition both clustered and non clustered indexes to span multiple filegroups
  2. Create non clustered indexes on a filegroup
  3. Move a table from one filegroup to another
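
Here is a sketch of option 2; the database, file path, table, and index names are all placeholders. The idea is to add a filegroup backed by a file on another drive and then build the non clustered index on it.

    -- Add a new filegroup and back it with a file, ideally on a separate physical drive
    ALTER DATABASE SalesDB ADD FILEGROUP IndexFG;
    GO

    ALTER DATABASE SalesDB
    ADD FILE
    (
        NAME = N'SalesDB_IndexFG_01',
        FILENAME = N'I:\SQLData\SalesDB_IndexFG_01.ndf',
        SIZE = 512MB,
        FILEGROWTH = 256MB
    )
    TO FILEGROUP IndexFG;
    GO

    -- The index pages now live in the new filegroup, separate from the table's data
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
    ON dbo.Orders (CustomerId)
    ON IndexFG;
    GO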

You can achieve performance gains by creating non clustered indexes on a different filegroup if the filegroups are using different physical drives. The data and index information can be read in parallel by multiple disk heads when the physical drives are on their own controllers.

If you cannot foresee the access patterns that will transpire, or when they will happen, a better decision might be to spread your tables and indexes across all filegroups. This would help ensure all disks are being used and accessed, because the data and indexes would be spread evenly across them.

To bring this all back together, you can think of a filegroup in its simplest form. Every database that you create has at least a data file and a log file, and every database has a primary filegroup. The primary filegroup contains the primary data file (.mdf) and any secondary files (.ndf) that are associated with it, and a single filegroup can contain multiple data files.

In the end I have seen significant gains from indexes being placed on specific filegroups, but as I stated before, it is good to test all of this out. Set up some scenarios on your test server and start running test cases to prove different theories and ideologies. One thing to remember as well is that not every case is the same; ensure that the decisions you are making are good for what you are working on, and never take a suggestion and drop it straight into a production environment. Prove the statement to be true or false no matter who it comes from.

Well, that’s a wrap for today’s party. Until next month…….

T-SQL Tuesday #38 – Standing Firm

It seems as though months are coming and going at an alarming rate, but nonetheless I can say I am enjoying the ride. Today’s T-SQL Tuesday party is hosted by Jason Brimhall (B | T): T-SQL Tuesday #38 – Standing Firm.

Jason has provided a three-word topic built around the words resolve, resolute, and resolution… given these three words, one must come up with a story based on one of them.

I believe I have chosen mine and have taken a somewhat different spin on it and how I applied it to T-SQL and my DBA career.

RESOLVE

The word has various meanings, but for the expression I would like to use, I will take it literally as a noun: “a determination or decision; a fixed purpose”.

Many of us who are DBAs (and I am proud to be one) have all taken various routes, various avenues if you will, along the way to get where we are. Growing up I had a passion and desire to play the game of basketball. I would train very hard; my days would consist of waking up at 4:00 a.m. before school, going to the gym where I would meet my assistant coach, shooting 700 jumpers and 100 free throws, showering and going to school, practicing after school with the team, and then repeating the same process daily. My summers consisted of going to various camps across the United States, which coaches from all over would attend, in the hope of one day earning a scholarship. I am 6 feet tall and not overly fast, and I was always told that I was too small and too slow to play in college. The day finally came for me to graduate, and guess what ~ I made it to college to play the game I had worked so hard at for so long.

After my playing career was over, now what? I started out in physical therapy, but a friend of mine showed me a program he had developed. Once I saw the inner workings of what he was doing I was intrigued, and I went down the path of becoming a developer, which led me into the DBA world. You know what? Along those same lines I got the same rhetoric from people: “You can’t do that; you don’t know enough.” “You can’t do that; stick with sports, it’s too long of a road for you.” Resolve: a determination or decision; a fixed purpose. You see, I had a fixed purpose when playing basketball. I knew what I wanted and where I wanted to go. The same can be said of becoming a DBA: if you have a desire to learn something new in T-SQL, or to track down a long-running query and improve it, show some resolve – have a fixed purpose. Don’t ever let anyone stand in front of your dreams to become something you want to be. It takes hard work, dedication, and drive to become not just a good DBA but an exceptional DBA.

In the end you will build strong character and life values. What I learned on the court has carried over to my work habits. I love being part of a team, but I also enjoy it when the game is on the line: when SQL tasks are at hand, it is the fourth quarter, and systems are down, I want to be the one to take the last shot.

I said all that to say this: if you are faced with “potholes” or “adversity” along your journey from a technical standpoint, show some resolve. The community as a whole is here to help, and don’t lose sight of your endpoint; granted, that endpoint may change many times, but remain purpose-driven and adapt when need be.

Hosting a T-SQLTuesday Party

If you are interested in hosting a T-SQL Tuesday Party contact the founder Adam Machanic (B | T). It is a great way to get involved in the community and provide an avenue to share your ideas.