Archive for the ‘SQLServerPedia Syndication’ Category

SQL Saturday Louisville Recap

9 Aug 2017

And like the wind, another SQL Saturday Louisville has come and gone. This past weekend seemed to be a huge success, but it didn’t come without some takeaways, and that is okay. I think every time we put on an event like this we are always looking for ways to make things better the next year. So, enough yapping. What are some of the highlights?

VENDORS

I’m privileged to work alongside one of the other co-organizers, John Morehouse (b|t), when it comes to vendors. We were very thankful this year to have the following sponsors on board with us:

It was awesome to see each and every one of these vendors at the event. Most of them have been with us in prior years, and the attendees seem to enjoy speaking to them about their products. We truly appreciate the support they have shown us over the years and look forward to many more events ahead with them.


SPEAKERS

Once again we had a very talented pool of speakers come into town. I won’t take the time to list them all here, but do go over to the SQL Saturday Louisville website if you are interested. As a speaker, it always amazes me that they come from all over to these events to give their time and hone their craft. If you ever attend one of these events I encourage you to do a few things:

  • Say thank you – believe me, it goes a long way.
  • Give serious session feedback; we look for ways we can make our presentations better.

A huge thank you to all the speakers who came out to our event; we had some great times together and look forward to seeing each of you somewhere down the line. I think everyone did a phenomenal job in their sessions, and I’ve heard nothing but positive feedback from attendees, even after the event.

PRE-CONS

This year we had Grant Fritchey (b|t) and Josh Luedeman (t) in the house. Both had packed sessions; I know Grant’s sold out, and Josh wasn’t far off based on the numbers I was looking at. A full day of training from a couple of the best in our SQL community made for some great times. Observing and listening to attendees during the breaks, I again heard nothing but positive things. A huge thank you to both of them for taking the time to spend a day with us before the event to share their knowledge with the attendees.

SQL SATURDAY CREW

I can’t say enough about the volunteers who help out with this event. It is no easy task to put one of these things on, and the hours leading up to the event are countless. The “behind the scenes” action is huge. A few things that stand out to me are the character, selfless acts, time given, and pride everyone takes in trying to make this the best event we’ve ever had each year. That doesn’t mean times are always easy, but having a cohesive unit with a single-minded focus to knock out tasks and obstacles as they arise is pretty awesome. In any case, I couldn’t be more proud to serve alongside these individuals.

This year we lost one of our very own SQL crew members, Dave Ingram. He was one of the early volunteers who gave time to help make the event what it is today. In my three years as co-organizer, I knew him for two of those. It was evident his passion was helping the local community base here in Louisville. I was honored to be able to say a few words at the end of the day. It was touching to have the opportunity to meet his daughter, whom we did not know would be there; we were able to take a moment and recognize her and her dad with a round of applause. It’s because of men and women like Dave, who have forged the way, that others like me can pay it forward. Thank you, Dave!

Dave

RIP Dave Ingram. SQL Saturday Louisville volunteer (7 years), SQL Cruise and PASS Summit alumnus. We miss you.

 

NEW SPEAKERS

It was awesome to see first-time SQL Saturday speaker Kat Edrington (t) presenting a session for our attendees this year. This is what it is about for us. We need to continue to cultivate and bring in the new leaders of tomorrow. Hats off to Kat on a job well done!

 

ATTENDEES

I’m thankful for the many conversations I had with attendees: questions about products they know I am associated with, conversations on local tech news, and questions about where things are at the venue. We at SQL Saturday Louisville strive to make it the best experience we possibly can; that doesn’t mean we always get it right, but we will go down swinging trying. I look forward to many more conversations ahead.

At the end of the day, about half of the attendees raised their hands when asked if this was their first SQL Saturday.

MARK’S FEED STORE

Okay, so if you are from around here then you know about Mark’s Feed Store. The barbecue they have is simply amazing, and they were our caterers for this event. Once again they were spot on and provided some great food for all of us at the venue. If you are ever in town, go check them out.

TRUE PROFESSIONALS

As I said before, there will always be something that comes up at an event. This time around we had a sound system issue in one of the rooms. Rie Irish (b|t) handled things without any issue, and we took a field trip to a new room. These types of issues bug the heck out of us hosting, but at times they are out of our hands. I appreciate the flexibility from Rie, along with the patience of the attendees; putting a speaker behind in their session is not what we want to do.

Later, as I was walking down the hall checking on things, I heard a huge humming noise. As I entered the room I saw Lori Edwards (b|t) in her session and Andy Mallon (b|t) providing assistance with the problem. After the humming subsided, the bulb in the projector decided it was time to go. Once again we found ourselves taking a field trip across the hall; hats off to Lori for her ability to adjust, and the attendees were very accepting.

Why do I bring these up, you ask? A few reasons, but mainly that no matter how much you plan and orchestrate, issues will arise. It’s important to address them as quickly and politely as possible and move on. BTW, if you aren’t following the people above in this section, please do so…stellar data professionals.

THE WHY?

At the event, I was asked: why do you do this? Why do you help? Why do you speak? I keep saying the same thing, but it holds true. In 2011, when I attended my first PASS Summit, it changed my life and career. You don’t have to wait until you can go to Seattle, Washington, though; you can attend these local events all over the globe to learn, network, and test the waters. I know there are other Chris Yateses out there who, like me, want to get plugged in but don’t know how. Wanting to make a difference locally, and yes globally too, I will always try to be me and help others along the way. Appreciate, encourage, and value everyone ~ we got this.

THAT’S A WRAP

Another Saturday has come and gone. I hope everyone, from the speakers to the attendees, had a great time. Next year will be our 10th year, which is a special milestone. I look forward to what the journey holds and hope to see many of you there.

PAC Community Ambassador – SQL Sentry

Last week Aaron Bertrand (b|t) published a post regarding five new PAC Community Ambassadors for SentryOne. I am privileged and honored to be a part of this journey with some stellar data professionals:

  • Andy Mallon (b|t)
  • John Morehouse (b|t)
  • Derik Hammer (b|t)
  • Mike Walsh (b|t)

This venture is a new community program that SentryOne is starting this summer, which gives us more avenues to get out into the community, stay connected, and continue to be involved in the programs that SentryOne has to offer.

Knowing each of the other four individuals, I can say without a doubt that the mindset is focused on helping others. How do I know this, you may ask? Because each of these data professionals has helped me over the years, and I know their drive and motivation to help others succeed.

Thanks, SentryOne, for the honor to continue to serve others; I look forward to meeting even more faces as we travel around, collaborate, and impact the community!

T-SQL Tuesday #92, Lessons Learned The Hard Way

Wow, it’s hard for me to believe it has been a little bit since the last T-SQL Tuesday blog party. This month Raul Gonzalez (b|t) has chosen the topic of lessons one has learned the hard way. Before we get into the story, however, let’s take a look at the who, what, when, and why of T-SQL Tuesday.

What Is T-SQL Tuesday?

T-SQL Tuesday was started by Adam Machanic (b|t) and is a monthly blog party. It occurs on the second Tuesday of each month, when a designated host picks a topic and fellow community bloggers publish a piece. It has been a very useful tool in my opinion, and I’m looking forward to doing many more of these. It has been one avenue for others to share their experiences while learning something new along the way.

If you are interested in hosting a T-SQL Tuesday on your blog then reach out to Adam.

Lessons Learned The Hard Way

A lot has transpired over a sixteen-year career thus far. Many lessons have been learned along the way; some were more difficult than others. I think it is important to note that not all lessons learned by a data professional have to be of a technical nature. Let me see if I can split up some of the technical vs. non-technical lessons I’ve learned over the years.

Technical
  • Unit testing – who knew this would be so important, right? As a developer starting out and then becoming a DBA, I have an appreciation for making sure things test out as they should; rigorous testing. Earlier in my career, I thought that’s what we have QA for, right?
  • Backups – yeah, I’ve been burned early on by backups not being in place as they should have been. You want a dose of reality real fast? That’s a good way to get one (see the sketch just after this list).
  • Blinders on – becoming so focused that you only take into account a certain area of the picture, when in essence what is being changed can affect a multitude of things.
  • Knowing vs. doing – putting comments in code such as “this is probably not the best way to do things” is not the attitude to have when fixing the problem – been there, done that.
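
Since I brought up backups, here is a minimal sketch of the kind of sanity check I wish I had been running back then. The 24-hour threshold is purely my own assumption; tune it to your recovery point objective, and as always, test it somewhere other than production.

-- Minimal sketch: flag databases whose most recent FULL backup is missing
-- or older than 24 hours (an assumed threshold; adjust to your RPO).
SELECT d.name AS database_name,
       MAX(b.backup_finish_date) AS last_full_backup
FROM   sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
       ON b.database_name = d.name
      AND b.type = 'D'  -- D = full database backup
WHERE  d.name <> 'tempdb'
GROUP BY d.name
HAVING MAX(b.backup_finish_date) IS NULL
    OR MAX(b.backup_finish_date) < DATEADD(HOUR, -24, GETDATE());
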
Non-Technical
  • Listening/heeding advice – this is key and something I did not learn until later in my career. It’s not a skill set I came out of the gate with; having a mentality that you are always right is not the best approach to take.
  • SME (Subject Matter Expert) – I enjoy helping people; it’s part of who I am. This is both a good and bad trait to have at times. If you are not careful, you can find yourself overextending into areas where you think you know something but you don’t. Over the past several years I’ve learned that it’s okay to help people even if that means pointing them to someone else who can. But be as sure as I’m typing this: I’ll always be willing to help and will never apologize for that.
  • Conflict management – over the years I’ve seen many data professionals and worked with various people. All of these experiences have equipped me over time to become a better professional in dealing with conflict, which is never easy. A lot of lessons were learned along the way on this one.
Failure

I want to bring this topic up in a section all by itself. Having a sports background for most of my life, and then morphing into an avid runner, I’ve had “failure is not an option” instilled in me from a very early age. This saying is okay, but by the same token, one cannot be afraid of failure. Some of the best lessons I’ve learned, both professionally and otherwise, have come as a result of something I’ve tried and failed at. The key is not staying knocked down; look at it in this light: if you aren’t failing now and then, you aren’t trying and pushing the envelope.

In Summary

This is a great topic this month. Don’t be ashamed or afraid of your journey and past failures or lessons learned. These are the things that mold and shape us into the people we are to become in the future. May we continue to push the envelope both in technology and beyond, impacting and coaching others along the way. Always remember you started somewhere; remember how that felt? Pay it forward.

Built My Presentation, Now What?

Over the course of several years, I have given many technical and non-technical presentations. It is fun for me to put a new slide deck together, but it also requires a lot of hard work and can be time-consuming. I’ve had a few mistakes, to say the least, over the years where that one typo slips through or something doesn’t go according to plan ~ guess what? It happens.

I compare preparing a presentation to testing something: you go over it again and again, just like you would test a backup process or verify indexes are actually working. I can’t remember who in the SQL community always mentions having a checklist handy. I know I’ve read that somewhere before, but the cobwebs are thick right now, so please forgive me if I don’t remember. Through the years, I’ve managed to build my own checklist for presentations. It is the nuts and bolts of what works for me; that doesn’t necessarily mean it will work for you.

In light of some past conversations I’ve had, I figured I’d share it with you all, and maybe someone out there will benefit from it.

Presentation Checklist (a.k.a. Project Double Check Yourself)

What is the purpose – fully understand the purpose of the presentation. By that I mean, what outcome are you seeking?

  • To inform
  • To convince
  • To generate insight and discussion
  • To drive action

Know your audience

  • Do you know who your audience is? Have you provided adequate context to make it easier for them to understand?
  • Are there any personal motivations that you need to be aware of?
  • Is the audience familiar with the topic? Have you included adequate detail and background information?
  • Is the presentation tailored to fit the audience’s communication style?

Know the message

  • If applicable, do you know the problem or issue you are trying to address?
  • Do you have three to five key teaching points you want to deliver? If so, have you tied those teaching points logically and clearly to the original problem?
  • Have you clearly linked your teaching points to key data or trends along with explaining how the analysis supports, confirms, or denies beliefs about the problem and/or possible solutions?
  • Have you limited the data to what matters most?
  • Have you clearly established relevance? (Why would your audience care? Have you clearly highlighted how this aligns with the target audience?)
  • Have you clearly established urgency? (Why would the audience act now; why is it critical?)

Structure

  • Is the presentation clearly marked with markers and signposts? Is it easy to follow?
  • Is there an agenda that clearly identifies the different elements and how they fit together? Is the key point up front?
  • Are there additional details about internal or external sources that were consulted for the included information? Give credit where credit is due.

Narrative

  • Does the presentation include insights that will be most influential to the audience? Is the scripting memorable and powerful?
  • Does the presentation identify key assumptions?
  • Does the presentation articulate immediate actions that you believe the audience should take?

Graphics

  • Do you know the purpose of each graphic? Is it tied to a teaching point in the message?
  • Do the graphics present information in a logical, visually appealing manner? Are there other ways of interpreting the graphic other than your intention?
  • Is the page balanced?

Formatting

  • Does the presentation have a standardized look and feel (same headings, colors, fonts)?
  • Are page elements consistent (background, title, body text)?
  • Are colors used judiciously (to emphasize, highlight, and organize)?

Conclusion

Checklists; they are everywhere. They don’t necessarily have to be for technical activities; heck, we use checklists for grocery items. They are a part of our daily lives, so when you get that presentation built and you are ready to give it at your shop, on the job, at a conference, or for a client, take a few minutes and review a checklist. Make sure you have your house in order and that everything makes sense.

Remember, you get out what you put into something. Continue to work hard and hone the speaking and presentation talents that lie within. Like I said, these are some of the things that have helped me over the years; that doesn’t mean they are for everyone. The flip side to that: you may have some of your own to share. I encourage you to do so.

 

QA, Utility Databases, and Job Executions

Sometimes we, as data professionals, have to think outside the box. I know, crazy idea, right? Each shop and situation is different; in most cases there will always be several different ways you can arrive at a solid solution.

 

This post has a few intentions behind it:

  • This is not a “take this solution; it’s the only way” post.
  • It aims to generate some thought and additional methods to reach a goal.
  • It is not intended for a production environment.

Good, now that we have those few things out of the way, let’s get to the meat of the topic. A situation arises where you want to give teams a bit more control to execute jobs without giving them full access to the SQL Agent. In that case, a good utility database may come in handy.

Example of an issue: a QA team needs to kick off jobs to test in a specific environment. Keeping in mind that each shop is different, security levels at varying shops will differ as well. There are a few choices that may come to mind with this issue:

  • Fire off an email to the DBA team and wait for them to kick the job off.
  • Fire off an email to someone with access and wait for them to kick the job off.
  • Wait for the predefined schedule on the job agent and let it kick the job off.

Another method would be to utilize a utility database. You can give it whatever name meets your criteria; in this case we will just call it TestingJobs. Let’s look at the overall picture below and how this all fits together:

Things you’ll need

  • Utility database
  • Three stored procedures
  • Table
  • Agent Job

Step 1: Create the TestingJobs database (I won’t go into specifics here on proper setup; assume this is already created).
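
For completeness, the bare minimum would look something like the sketch below; file placement, sizing, and security are left to your shop’s standards.

USE [master];
GO
-- Bare-bones sketch only; apply your own standards for files, sizing, and ownership.
CREATE DATABASE [TestingJobs];
GO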

Step 2: Create a table called ControlJobs inside the TestingJobs database.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO

SET QUOTED_IDENTIFIER ON;
GO

SET ANSI_PADDING ON;
GO

CREATE TABLE [dbo].[ControlJobs]
(
    [JobControlID]  [INT] IDENTITY(1, 1) NOT NULL,
    [JobName]       [VARCHAR](500) NOT NULL,
    [RunStatus]     [BIT] NOT NULL DEFAULT ((0)),  -- 1 = pending execution
    [LastRanBy]     [VARCHAR](50) NOT NULL,
    [LastRanByApp]  [VARCHAR](150) NULL,
    [Date_Modified] [DATETIME] NOT NULL DEFAULT (GETDATE()),
    [Active]        [BIT] NOT NULL DEFAULT ((1))
) ON [PRIMARY];
GO

SET ANSI_PADDING OFF;
GO

Step 3: Stored procedure creation for table insertion (note the parameter @JobName).

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO

CREATE PROCEDURE [dbo].[test_StartJobs] ( @JobName VARCHAR(500) )
AS
BEGIN

    /************************************************************************
    Inserts the record needed to kick off agent jobs; rows with RunStatus = 1
    are picked up by the RunPendingJobs procedure.
    ************************************************************************/

    INSERT INTO [TestingJobs].[dbo].[ControlJobs]
            ( [JobName],
              [RunStatus],
              [LastRanBy],
              [LastRanByApp],
              [Date_Modified],
              [Active]
            )
    VALUES  ( @JobName,
              1,
              '',  -- could use SUSER_SNAME() here to capture the caller
              '',  -- could use APP_NAME() here to capture the calling application
              GETDATE(),
              1
            );

END;

Step 4: Set up the stored procedure that will run the pending jobs.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO

CREATE PROCEDURE [dbo].[RunPendingJobs]
AS
SET NOCOUNT ON;

DECLARE @JobName VARCHAR(500),
        @RC INT;

-- Walk the pending rows one at a time so each job's run status can be checked.
DECLARE cur_RunJobs CURSOR LOCAL FAST_FORWARD
FOR
    SELECT  JobName
    FROM    dbo.ControlJobs
    WHERE   RunStatus = 1
    ORDER BY JobName;

OPEN cur_RunJobs;

FETCH NEXT FROM cur_RunJobs INTO @JobName;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'Checking to see if job is currently running.';

    EXEC @RC = dbo.GetCurrentRunStatus @job_name = @JobName;

    IF @RC = 0
        EXEC msdb.dbo.sp_start_job @JobName;
    ELSE
        PRINT @JobName + ' is currently running.';

    -- Flag the row as handled whether the job was started or already running.
    UPDATE  dbo.ControlJobs
    SET     RunStatus = 0,
            Date_Modified = GETDATE()
    WHERE   JobName = @JobName;

    FETCH NEXT FROM cur_RunJobs INTO @JobName;
END;

CLOSE cur_RunJobs;
DEALLOCATE cur_RunJobs;

Step 5: Set up the stored procedure that checks whether a job is already running.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO

CREATE PROCEDURE [dbo].[GetCurrentRunStatus] ( @job_name sysname )
AS
SET NOCOUNT ON;

/* Execution status values for the job:
   0 - Returns only those jobs that are not idle or suspended.
   1 - Executing.
   2 - Waiting for thread.
   3 - Between retries.
   4 - Idle.
   5 - Suspended.
   7 - Performing completion actions. */

DECLARE @job_id UNIQUEIDENTIFIER,
        @is_sysadmin INT,
        @job_owner sysname,
        @Status INT;

SELECT  @job_id = job_id
FROM    msdb..sysjobs_view
WHERE   [name] = @job_name;

SELECT  @is_sysadmin = ISNULL(IS_SRVROLEMEMBER(N'sysadmin'), 0);
SELECT  @job_owner = SUSER_SNAME();

-- Shape matches the result set of xp_sqlagent_enum_jobs.
CREATE TABLE #xp_results
(
    job_id UNIQUEIDENTIFIER NOT NULL,
    last_run_date INT NOT NULL,
    last_run_time INT NOT NULL,
    next_run_date INT NOT NULL,
    next_run_time INT NOT NULL,
    next_run_schedule_id INT NOT NULL,
    requested_to_run INT NOT NULL,  -- BOOL
    request_source INT NOT NULL,
    request_source_id sysname COLLATE DATABASE_DEFAULT NULL,
    running INT NOT NULL,  -- BOOL
    current_step INT NOT NULL,
    current_retry_attempt INT NOT NULL,
    job_state INT NOT NULL
);

INSERT INTO #xp_results
EXECUTE master.dbo.xp_sqlagent_enum_jobs @is_sysadmin, @job_owner, @job_id;

SELECT  @Status = running
FROM    #xp_results;

-- Clean up the temp table before returning the status.
DROP TABLE #xp_results;

SET NOCOUNT OFF;
RETURN @Status;

Step 6: Job creation.

Create a SQL Agent job that will call the RunPendingJobs procedure in the TestingJobs database. You can set the schedule to every three minutes for this test; a sketch follows.
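
Here is a minimal sketch of what that job could look like, built with msdb’s job stored procedures. The job, step, and schedule names are placeholders of my own, and the three-minute cadence is just for this test.

USE [msdb];
GO

-- Create the job (the name is a placeholder).
EXEC dbo.sp_add_job
    @job_name = N'Run Pending TestingJobs';

-- Single T-SQL step that calls the utility procedure.
EXEC dbo.sp_add_jobstep
    @job_name = N'Run Pending TestingJobs',
    @step_name = N'Execute RunPendingJobs',
    @subsystem = N'TSQL',
    @command = N'EXEC TestingJobs.dbo.RunPendingJobs;',
    @database_name = N'TestingJobs';

-- Recurring schedule: daily, repeating every 3 minutes.
EXEC dbo.sp_add_schedule
    @schedule_name = N'Every3Minutes',
    @freq_type = 4,             -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,      -- units of minutes
    @freq_subday_interval = 3;

EXEC dbo.sp_attach_schedule
    @job_name = N'Run Pending TestingJobs',
    @schedule_name = N'Every3Minutes';

-- Target the local server.
EXEC dbo.sp_add_jobserver
    @job_name = N'Run Pending TestingJobs';
GO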

The Benefit

Now think of a QA team member sitting at their desk running multiple tasks. This does take some coordinated effort in getting the job names, but now that the basics are set up, the team member can execute test_StartJobs, which will place the necessary information into the control jobs table. Of course, the proper security would need to be set up in order for the user to be added (think AD groups). By utilizing the above method, the team can be self-sufficient in a non-prod environment, streamlining some of the inefficiencies that have plagued the groups in the past.
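
Queueing a job is then a one-liner for the team member. The job name and AD group below are hypothetical stand-ins for your own:

USE [TestingJobs];
GO
-- Grant the QA team's (hypothetical) AD group the right to queue jobs.
GRANT EXECUTE ON [dbo].[test_StartJobs] TO [DOMAIN\QA_Team];
GO

-- Queue a (hypothetical) job; the scheduled RunPendingJobs picks it up within minutes.
EXEC [dbo].[test_StartJobs] @JobName = N'QA - Refresh Test Data';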

Summary

A few things to note here:

  • Don’t ever take code off the internet without testing it. This is just a thought-provoking post, and some things in it depend on your own setup and testing.
  • I realize there are multiple ways to accomplish this. This is just an avenue to explore and test with some thinking outside the box.
  • Don’t limit yourself to “I can’t” or “This will not fly at my shop”; challenge yourself to become innovative and think of ways to tackle problems.

T-SQL Tuesday #89 Invitation – The times they are a-changing

This month’s topic, from Koen Verbeeck (B|T), is based around times that are a-changing. To break it down somewhat, it was inspired by a blog post that Kendra Little (B|T) put together, Will the Cloud Eat My DBA Job? Koen wants to know what kind of impact the ever-changing world of technology has had on you and how you plan to deal with the future of data management/analysis.

To The Cloud

I think one of the topics where I’ve seen a gradual change is the cloud. Being in the financial sector, cloud talk is not always welcome; it is a myth to some. I am glad that a while back I heeded some of Grant Fritchey’s (B|T) advice when he said you had better start learning cloud techniques sooner rather than later. The cloud discussion is not always a welcomed one, but it is one that needs to be had to keep up with innovative technologies.

Finding the best solution for you and your respective area is key; the cloud does allow for flexibility and control. The main issues I see most shops running into are security around a cloud model, along with costs as data sizes grow.

With the proper planning and oversight, the cloud is a viable option that should not scare away data professionals.

Third Party Utilities

I think over time some of the third-party tools have become game changers in a lot of shops. I see vendors such as SQL Sentry and Red Gate that have evolved over time to help streamline data management and monitoring and to provide better efficiencies around them. Staying on the cutting edge keeps the users of these third-party tools wanting more. Tying all of this into automation of daily tasks, and not just on-premises monitoring but cloud monitoring, has been a huge plus for the community.

The Way Business Interprets Data

Data is what drives us; it is what a lot of decisions, and the direction of companies and shops, are derived from. The data, and how we look at it, is ever-changing. Take, for example, Power BI. The methods we used years ago have morphed into a greater approach to delivery for businesses. I never thought I would be watching a professional sports game and see them pull out their tablets and review live data between innings or sets of downs.

PowerShell

No, this isn’t a tribute to Mike Fal (B|T) or Drew Furgiuele (B|T), but I do appreciate their nudge in getting me to utilize PowerShell for some of my everyday tasks. This could fall under the third-party utilities section above, but I thought it beneficial to state that in some cases it has been a game changer. Stumbling upon the DBATools website has been a blessing; I love getting to work with technology that I may not have utilized as much in the past.

Do You Feel Endangered?

No, and neither should you. I should qualify that: don’t be afraid of change, for it allows us to learn new technologies and grow on our journey. There will always be opportunities to learn; each day you should strive to learn something new that you didn’t know before. A wise SQL community member once told me, “The data will always be here; will you?”

T-SQL Tuesday

For those who are not aware, T-SQL Tuesday is a blog party started by Adam Machanic (B|T). Each month a community blogger hosts, picks a topic, and then gathers up all the blogs that participated in the event and provides a recap. It’s a great method for sharing knowledge and an avenue to give back to the community. If you are an avid blogger and would like to be a host, do reach out to Adam via the methods provided.

Don’t Duck On Responsibilities

10 Apr 2017

Being a data professional, you assume a certain amount of responsibility. It often requires having the right attitude and an action plan for finding solutions to the problems at hand. Too many times we attack the symptoms causing the issue but overlook the root cause. The quick Band-Aid fixes are found many times over, whereas our job should be identifying the real issues that lie beneath the symptoms. Now, don’t get me wrong – I understand at times you have to stop the bleeding. In the end, though, one should uncover the root cause and make the permanent fix.

Prioritize the issue at hand

Chances are you, dear reader, encounter many problems throughout the day. Never try to solve all the problems at one time; instead, make them line up for you one by one. It might seem odd, but make them stand in a single-file line and tackle them one at a time until you’ve knocked them all out. You may not like what you find when uncovering the root cause issues, but that is part of the process. Be careful in this uncovering and be cognizant that what you find may or may not be the root of all the problems.

Take time and define the problem

In its simplest form, take time out and ask yourself this question: “What is the problem?” Sounds easy enough, doesn’t it? You’d be amazed by the many accounts of knee-jerk reactions data professionals make all over the world. You may be thinking to yourself that there has to be more to it than that. Think about it in four easy steps:

  • Ask the right questions – if you only have a vague idea of the situation, then don’t ask general questions. Do not speculate; instead, ask process-related questions about trends or timing: what transpired over the course of the week that may have led to this issue?
  • Talk to the right people – you will inevitably face people who have the all-knowing and all-correct way that things should be done. Heed caution, as you may find resistance to change and blind spots in these individuals. Creativity is, at times, essential to any problem-solving skill.
  • Get the “set in stone” facts – once the facts are all laid out and defined, you may find that the decision is pretty concise and the action that should be taken is clear.
  • Be involved – don’t just let the first three steps define you; get involved in the process of being the solution.

Questions to ask yourself regarding the problem

  • Is this a real problem?
  • Is it urgent?
  • Is the true nature of the problem known?
  • Is it specific?
  • Are all parties who are competent to discuss the issue involved?

Build a repository

Once you’ve come to a conclusion and provided a solution to the issue – document it. I know I just lost several readers there. Believe it or not, documentation will save your bacon at some point. Maybe not next week or next month, but at some point down the line it will. Some things to consider are:

  • Were we able to identify the real cause to the problem?
  • Did we make the right decision?
  • Has the problem been resolved by the fix?
  • Have any key people accepted the solution?

I am reminded of a saying I once ran across:

Policies are many, Principles are few, Policies will change, Principles never do

Summary

Each day we encounter issues and problems. Don’t let them define you; rather, you define the issue. Oftentimes we overlook the root cause; remember to go through your process, policies, and standards in rectifying the problems at hand. It is better to tackle problems when they are known than to sweep them under the rug for the next data professional to come along, who is then faced with fixing them.

Hopefully this short post will provoke you to think about the issues you deal with on a daily basis and how best to tackle them.
