Category Archives: SQL Server

SQL Summer Vacation–SentryOne

We are having an extra Louisville SQL Server and Power BI User Group meeting this month because the SQL Summer Vacation tour is coming to town. SentryOne’s Kevin Kline (B|T) will be rolling in for a fun-filled two-hour event on Wednesday the 25th. This is a fun event that Kevin and his family travel around for every year, and for our SQL community it is a great chance to sit in on multiple sessions and learn from a Microsoft SQL Server MVP.

Seats are filling up fast, and we should have a packed house over at Homecare Homebase, which has graciously opened its doors to host this event. John Morehouse (B|T) and I will both be in attendance, and as PAC Ambassadors for SentryOne we would love to talk to you and answer any questions you may have for us.

I look forward to seeing you all there; it is going to be a great and fun time. Head on over to the user group site here and check out the available seating.

PAC Community Ambassador – SQL Sentry

Last week Aaron Bertrand (b|t) published a post announcing five new PAC Community Ambassadors for SentryOne. I am privileged and honored to be a part of this journey with some stellar data professionals:

  • Andy Mallon (b|t)
  • John Morehouse (b|t)
  • Derik Hammer (b|t)
  • Mike Walsh (b|t)

This venture is a new community program that SentryOne is starting this summer which allows us more avenues to get out into the community, stay connected, and continue to be involved in the programs that SentryOne has to offer.

Knowing each of the other four individuals, I can say without a doubt that the mindset is focused on helping others. How do I know this, you may ask? Because each of these data professionals has helped me over the years, and I know their drive and motivation to help others succeed.

Thanks, SentryOne, for the honor to continue to serve others; I look forward to meeting even more faces as we travel around, collaborate, and impact the community!

Built My Presentation, Now What?

Over the course of several years, I have given many technical and non-technical presentations. It is fun for me to put a new slide deck together, but it also requires a lot of hard work and can be time-consuming. I’ve made a few mistakes, to say the least, over the years where that one typo slips through or something doesn’t go according to plan ~ guess what? It happens.

I compare rehearsing a presentation to testing something: you go over it again and again, just like you would test a backup process or verify that indexes are actually working. For me the same concept applies. I can’t remember who in the SQL community always mentions having a checklist handy; I know I’ve read it somewhere before, but the cobwebs are thick right now, so please forgive me if I don’t remember. Through the years, I’ve managed to build my own checklist for presentations. It is the nuts and bolts of what works for me; it doesn’t necessarily mean it will, in turn, work for you.

In light of some past conversations I’ve had, I figured I’d share it with you all; maybe someone out there will benefit from it.

Presentation Checklist (a.k.a. Project Double Check Yourself)

What is the purpose – fully understand the purpose of the presentation. By that I mean, what outcome are you seeking?

  • To inform
  • To convince
  • To generate insight and discussion
  • To drive action

Know your audience

  • Do you know who your audience is? Have you provided adequate context to make it easier for them to understand?
  • Are there any personal motivations that you need to be aware of?
  • Is the audience familiar with the topic? Have you included adequate detail and background information?
  • Is the presentation tailored to fit the audience’s communication style?

Know the message

  • If applicable, do you know the problem or issue you are trying to address?
  • Do you have three to five key teaching points you want to deliver? If so, have you tied those teaching points logically and clearly to the original problem?
  • Have you clearly linked your teaching points to key data or trends along with explaining how the analysis supports, confirms, or denies beliefs about the problem and/or possible solutions?
  • Have you limited the data to what matters most?
  • Have you clearly established relevance? (Why would your audience care? Have you clearly highlighted how this aligns with the target audience?)
  • Have you clearly established urgency? (Why should the audience act now; why is it critical?)

Structure

  • Is the presentation clearly marked with signposts? Is it easy to follow?
  • Is there an agenda that clearly identifies the different elements and how they fit together? Is the key point up front?
  • Are there additional details about internal or external sources that were consulted for the included information? Give credit where credit is due.

Narrative

  • Does the presentation include insights that will be most influential to the audience? Is the scripting memorable and powerful?
  • Does the presentation identify key assumptions?
  • Does the presentation articulate immediate actions that you believe the audience should take?

Graphics

  • Do you know the purpose of each graphic? Is it tied to a teaching point in the message?
  • Do the graphics present information in a logical, visually appealing manner? Are there other ways of interpreting the graphic other than your intention?
  • Is the page balanced?

Formatting

  • Does the presentation have a standardized look and feel (same headings, colors, fonts)?
  • Are page elements consistent (background, title, body text)?
  • Are colors used judiciously (to emphasize, highlight, and organize)?

Conclusion

Checklists; they are everywhere. They don’t necessarily have to be for technical activities; heck, we use checklists for grocery items. They are a part of our daily lives, so when you get that presentation built and you are ready to give it at your shop, on the job, at a conference, or for a client, take a few minutes and review a checklist. Make sure you have your house in order and that everything makes sense.

Remember, you get out what you put into something. Continue to work hard and hone the speaking and presentation talents that lie within. Like I said, these are some of the things that have helped me over the years; that doesn’t mean they are for everyone. On the flip side, you may have some of your own to share. I encourage you to do so.

 

QA, Utility Databases, and Job Executions

Sometimes we, as data professionals, have to think outside the box. I know, crazy idea, right? Each shop and situation is different; in most cases there will be several different ways you can arrive at a solid solution.

 

This post has a few intentions behind it:

  • It is not a “take this solution; it’s the only way” post.
  • It is meant to generate some thought and additional methods for reaching a goal.
  • It is not intended for a production environment.

Good, now that we have those few things out of the way, let’s get to the meat of the topic. A situation arises where you want to give teams a bit more control to execute jobs without giving them full access to the SQL Server Agent. In that case, a good utility database may come in handy.

Example of an issue: a QA team needs to kick off jobs to test in a specific environment. Keeping in mind that each shop is different, this also means that security levels at varying shops will be different. There are a few choices that may come to mind for this issue:

  • Fire off an email to the DBA team and wait for them to kick the job off.
  • Fire off an email to someone with access and wait for them to kick the job off.
  • Wait for the predefined schedule on the job agent and let it kick the job off.

Another method would be to utilize a utility database. You can give it whatever name meets your criteria; in this case we will just call it TestingJobs. Let’s look at the overall picture below and how this all fits together:

Things you’ll need

  • Utility database
  • Two Stored Procedures
  • Table
  • Agent Job

Step 1: Create the TestingJobs database (I won’t go into specifics here on proper setup; assume this is already created).
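
If you do need a bare-bones starting point, a minimal sketch might look like this; the recovery model choice and any file placement are assumptions you should adjust for your environment:

USE [master];
GO

CREATE DATABASE [TestingJobs];
GO

-- Simple recovery is a reasonable assumption for a non-production utility database
ALTER DATABASE [TestingJobs] SET RECOVERY SIMPLE;
GO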

Step 2: Create a table called ControlJobs inside the TestingJobs database.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO

SET QUOTED_IDENTIFIER ON;
GO

SET ANSI_PADDING ON;
GO

CREATE TABLE [dbo].[ControlJobs]
(
    [JobControlID]  [INT] IDENTITY(1, 1) NOT NULL,
    [JobName]       [VARCHAR](500) NOT NULL,
    [RunStatus]     [BIT] NOT NULL DEFAULT ((0)),
    [LastRanBy]     [VARCHAR](50) NOT NULL,
    [LastRanByApp]  [VARCHAR](150) NULL,
    [Date_Modified] [DATETIME] NOT NULL DEFAULT (GETDATE()),
    [Active]        [BIT] NOT NULL DEFAULT ((1))
) ON [PRIMARY];
GO

SET ANSI_PADDING OFF;
GO

Step 3: Stored procedure creation for table insertion (note the parameter @JobName).

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO

CREATE PROCEDURE [dbo].[test_StartJobs] ( @JobName VARCHAR(500) )
AS
BEGIN

    /************************************************************************
    This procedure inserts the record needed to kick off agent jobs.
    ************************************************************************/

    INSERT INTO [TestingJobs].[dbo].[ControlJobs]
            ( [JobName],
              [RunStatus],
              [LastRanBy],
              [LastRanByApp],
              [Date_Modified],
              [Active]
            )
    VALUES  ( @JobName,
              1,
              SUSER_SNAME(), -- capture who queued the job
              APP_NAME(),    -- capture the calling application
              GETDATE(),
              1
            );

END;
GO
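
A quick smoke test of the procedure might look like the following; the job name here is hypothetical, so substitute one that actually exists on your instance:

-- Queue a job to be picked up on the next pass (hypothetical job name)
EXEC [TestingJobs].[dbo].[test_StartJobs] @JobName = 'QA_RefreshTestData';

-- Confirm the row landed with RunStatus = 1
SELECT * FROM [TestingJobs].[dbo].[ControlJobs] WHERE RunStatus = 1;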

Step 4: Set up the stored procedure that will run the pending jobs.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO

CREATE PROCEDURE [dbo].[RunPendingJobs]
AS
SET NOCOUNT ON;

DECLARE @JobName VARCHAR(500),
        @RC INT;

DECLARE cur_RunJobs CURSOR
FOR
    SELECT  JobName
    FROM    dbo.ControlJobs
    WHERE   RunStatus = 1
    ORDER BY JobName;

OPEN cur_RunJobs;

FETCH NEXT FROM cur_RunJobs INTO @JobName;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'Checking to see if job is currently running.';

    -- Returns 0 when the job is not currently running
    EXEC @RC = dbo.GetCurrentRunStatus @job_name = @JobName;

    IF @RC = 0
        EXEC msdb.dbo.sp_start_job @job_name = @JobName;
    ELSE
        PRINT @JobName + ' is currently running.';

    -- Mark the request as handled either way
    UPDATE  dbo.ControlJobs
    SET     RunStatus = 0,
            Date_Modified = GETDATE()
    WHERE   JobName = @JobName;

    FETCH NEXT FROM cur_RunJobs INTO @JobName;

END;

CLOSE cur_RunJobs;
DEALLOCATE cur_RunJobs;
GO

Step 5: Set up the stored procedure that checks whether a job is already running.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO

CREATE PROCEDURE [dbo].[GetCurrentRunStatus] ( @job_name sysname )
AS
SET NOCOUNT ON;

    /* Execution status values for the job:
       Value  Description
       0      Returns only those jobs that are not idle or suspended
       1      Executing
       2      Waiting for thread
       3      Between retries
       4      Idle
       5      Suspended
       7      Performing completion actions */

DECLARE @job_id UNIQUEIDENTIFIER,
        @is_sysadmin INT,
        @job_owner sysname,
        @Status INT;

SELECT  @job_id = job_id
FROM    msdb..sysjobs_view
WHERE   [name] = @job_name;

SELECT  @is_sysadmin = ISNULL(IS_SRVROLEMEMBER(N'sysadmin'), 0);
SELECT  @job_owner = SUSER_SNAME();

CREATE TABLE #xp_results
(
    job_id UNIQUEIDENTIFIER NOT NULL,
    last_run_date INT NOT NULL,
    last_run_time INT NOT NULL,
    next_run_date INT NOT NULL,
    next_run_time INT NOT NULL,
    next_run_schedule_id INT NOT NULL,
    requested_to_run INT NOT NULL, -- BOOL
    request_source INT NOT NULL,
    request_source_id sysname COLLATE DATABASE_DEFAULT NULL,
    running INT NOT NULL, -- BOOL
    current_step INT NOT NULL,
    current_retry_attempt INT NOT NULL,
    job_state INT NOT NULL
);

INSERT INTO #xp_results
    EXECUTE master.dbo.xp_sqlagent_enum_jobs @is_sysadmin, @job_owner, @job_id;

SELECT  @Status = running
FROM    #xp_results;

-- Clean up the temp table before returning
DROP TABLE #xp_results;

SET NOCOUNT OFF;

-- Treat "job not found" (NULL) as not running
RETURN ISNULL(@Status, 0);
GO
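
As a side note, xp_sqlagent_enum_jobs is undocumented. If you would rather stay on documented objects, a rough alternative (an assumption on my part, not what the procedure above uses) is to check msdb’s catalog views for an execution that has started but not stopped in the current agent session:

-- Returns a row while the named job is running (documented msdb views)
DECLARE @job_name sysname = N'QA_RefreshTestData'; -- hypothetical job name

SELECT  ja.job_id
FROM    msdb.dbo.sysjobactivity AS ja
        INNER JOIN msdb.dbo.sysjobs AS j ON ja.job_id = j.job_id
WHERE   j.[name] = @job_name
        AND ja.start_execution_date IS NOT NULL
        AND ja.stop_execution_date IS NULL
        AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions);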

Step 6: Job Creation

Create a SQL agent job that will call the RunPendingJobs procedure in the database. You can set its schedule to three minutes for this test.
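
The job definition itself is left up to you; a minimal sketch using the msdb procedures might look like the following (the job and schedule names are assumptions):

USE [msdb];
GO

EXEC dbo.sp_add_job @job_name = N'TestingJobs - RunPendingJobs';

EXEC dbo.sp_add_jobstep @job_name = N'TestingJobs - RunPendingJobs',
    @step_name = N'Execute RunPendingJobs',
    @subsystem = N'TSQL',
    @database_name = N'TestingJobs',
    @command = N'EXEC dbo.RunPendingJobs;';

-- Every 3 minutes, all day (freq_type 4 = daily; freq_subday_type 4 = minutes)
EXEC dbo.sp_add_jobschedule @job_name = N'TestingJobs - RunPendingJobs',
    @name = N'Every 3 Minutes',
    @freq_type = 4,
    @freq_interval = 1,
    @freq_subday_type = 4,
    @freq_subday_interval = 3;

EXEC dbo.sp_add_jobserver @job_name = N'TestingJobs - RunPendingJobs',
    @server_name = N'(LOCAL)';
GO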

The Benefit

Now think of a QA team member sitting at their desk running multiple tasks. This does take some coordinated effort in getting the job names, but now that the basics are set up, the team member can execute test_StartJobs, which places the necessary information into the ControlJobs table. Of course, the proper security would need to be set up in order for the user to be added (think AD groups; a sketch follows below). By utilizing the above method, the team can be self-sufficient in a non-production environment, streamlining some of the inefficiencies that have plagued these groups in the past.
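
For the security piece, one hedged approach is to grant an AD group execute rights on just the insert procedure; the group name below is hypothetical:

USE [TestingJobs];
GO

-- DOMAIN\QATesters is a hypothetical AD group; this assumes the Windows
-- login already exists at the server level. Grant only what is needed.
CREATE USER [DOMAIN\QATesters] FOR LOGIN [DOMAIN\QATesters];
GRANT EXECUTE ON OBJECT::dbo.test_StartJobs TO [DOMAIN\QATesters];
GO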

Summary

A few things to note here:

  • Don’t ever take code off the internet without testing it. This is just a thought-provoking post, and some things in it depend on you to set up and test yourself.
  • I realize there are multiple ways to accomplish this. This is just an avenue to explore and test with some thinking outside the box.
  • Don’t limit yourself to “I can’t” or “This will not fly at my shop”; challenge yourself to become innovative and think of ways to tackle problems.

SQL Doc by RedGate

I was recently on a call where a technical unit indicated they did not receive any form of documentation for the vendor database that was created. Seeing that I fall into the database profession, it piqued my interest. I began to ask the individual requesting this documentation a few questions; these questions are important because you have to determine whether there is a need for what was running through my mind. Sure enough, the technical team just needed some guidance on the overall structure and on what they were dealing with in terms of tables, procedures, and so on. This group was trying to review and write a process around information they were not privy to.

My mind went straight to the SQL Doc utility that RedGate has available. It’s really a simple utility to use, and it can often save the day in cases like the one above. Check out the steps below to see how easily the utility lets you document a database on the fly:

Step 1: When you open the application, you will be prompted to enter a server location, followed by how you’d like to connect to it (Windows or SQL authentication). In this case we’ll just hook up to a local instance I have on hand.


Step 2: Once connected you’ll have some default settings. There will be a cover page option along with the databases that you want to document.


Step 3: Looking at the project you’ll begin to review some of the following information:

  • Database Selection
  • Server Properties
  • Server Settings
  • Advanced Server Settings
  • Sections that are included in the report

For this specific test I’m just going to take a look at tempdb.


Under Object Types you are able to drill in and get as granular as you like. For example, drilling into a table in tempdb, you can enter a description of what each field is used for in the far-right column, under Description.


Step 4: If you have to save this documentation out for any meetings or other purposes, you can create a cover page with logo information and a description. Simply click on the cover page option in the left menu and fill in the details.


Step 5: After all the choices are made, click the Generate Documentation button on the menu and you will be prompted for your output options.


Give the location and file a name and BOOM; you’re done.

Summary

You may find yourself in a situation where you need a quick hit for documentation purposes. If you are an avid RedGate user and enjoy their SQL Doc product, or maybe you had this product and didn’t even know what it was for, you can benefit greatly from documenting multiple databases in a matter of minutes. This post shows what type of utility SQL Doc is and what it can actually be used for in a real-life circumstance. In the end it was the right product at the right time for a technical team in need. Well done, RedGate, well done.

T-SQL Tuesday #89 Invitation – The times they are a-changing

This month’s topic, from Koen Verbeeck (B|T), is based around times that are a-changing. It was inspired by a blog post that Kendra Little (B|T) put together, Will the Cloud Eat My DBA Job?. Koen wants to know what kind of impact the ever-changing world of technology has had on you and how you plan to deal with the future of data management and analysis.

To The Cloud

I think one of the topics where I’ve seen a gradual change is the conversation around the cloud. Working in the financial district, I find cloud talk is not always welcome; it is a myth to some. I am glad that a while back I heeded some of Grant Fritchey’s (B|T) advice: he said you had better start learning cloud techniques sooner rather than later. The cloud discussion is not always a welcome one, but it is one that needs to be had to keep up with innovative technologies.

When it comes to finding the best solution for you and your respective area, the cloud does allow for flexibility and control. The main issues I see most shops running into are security concerns around a cloud model, along with costs as data sizes grow.

With the proper planning and oversight, the cloud is a viable option that should not scare away data professionals.

Third Party Utilities

I think over time some of the third-party tools have become game changers in a lot of shops. Vendors such as SQL Sentry and Red Gate have evolved over time to help streamline data management and monitoring and to provide better efficiencies around them. Staying on the cutting edge keeps the users of these tools engaged and wanting more. Tying all of this into the automation of daily tasks, and not just on-premises monitoring but cloud monitoring as well, has been a huge plus for the community.

The Way Business Interprets Data

Data is what drives us; it is what a lot of decisions, and the direction of companies and shops, are derived from. Both the data and how we look at it are ever-changing. Take Power BI, for example. The methods we used years ago have morphed into a far better approach to delivery for businesses. I never thought I would be watching a professional sports game and see them pull out their tablets and review live data between innings or sets of downs.

PowerShell

No, this isn’t a tribute to Mike Fal (B|T) or Drew Furgiuele (B|T), but I do appreciate their nudge in getting me to utilize PowerShell for some of my everyday work. This could fall under the third-party utilities section above, but I thought it worth stating that in some cases it has been a game changer. Stumbling upon the DBATools website has been a real blessing; I love getting to work with technology that I may not have utilized as much in the past.

Do You Feel Endangered?

No, and neither should you. I should qualify that: don’t be afraid of change, for it allows us to learn new technologies and grow on our journey. There will always be opportunities to learn; each day you should strive to learn something new that you didn’t know before. A wise SQL community member once told me, “The data will always be here; will you?”

T-SQL Tuesday

For those who are not aware, T-SQL Tuesday is a blog party started by Adam Machanic (B|T). Each month a community member hosts: they pick a topic, other bloggers write posts on it, and the host then gathers all the participating blogs and provides a recap. It’s a great way to share knowledge and an avenue to give back to the community. If you are an avid blogger and would like to be a host, do reach out to Adam via the methods provided.

Don’t Duck On Responsibilities

As a data professional, you assume a certain amount of responsibility. It often requires having the right attitude and an action plan for finding solutions to the problems at hand. Too many times we attack the symptoms but overlook the root cause. Quick Band-Aid fixes are made many times over, whereas our job should be identifying the real issues that lie beneath the symptoms. Now, don’t get me wrong – I understand at times you have to stop the bleeding. In the end, though, one should uncover the root cause and make the permanent fix.

Prioritize the issue at hand

Chances are you, dear reader, encounter many problems throughout the day. Never try to solve all the problems at one time; instead, make them line up for you one by one. It might seem odd, but make them stand in a single-file line and tackle them one at a time until you’ve knocked them all out. You may not like what you find when uncovering root causes, but that is part of the process. Be cognizant that what you find may or may not be the root of all the problems.

Take time and define the problem

In its simplest form, take time out and ask yourself this question – “What is the problem?” Sounds easy enough, doesn’t it? You’d be amazed by the many knee-jerk reactions data professionals make all over the world. You may be thinking to yourself that there has to be more to it than that. Think about it in four easy steps:

  • Ask the right questions – if you only have a vague idea of the situation, don’t ask general questions. Do not speculate; instead, ask process-related questions, things relating to trends or timing. What transpired over the course of the week that may have led to this issue?
  • Talk to the right people – you will inevitably face people who have the all-knowing and only correct way that things should be done. Exercise caution with these individuals, as you may find resistance to change and blind spots. Creativity is, at times, essential to problem-solving.
  • Get the “set in stone” facts – once the facts are all laid out and defined, you may find that the decision on what action should be taken is pretty concise and clear.
  • Be involved – don’t just let the first three steps define you; get involved in the process of being the solution.

Questions to ask yourself regarding the problem

  • Is this a real problem?
  • Is it urgent?
  • Is the true nature of the problem known?
  • Is it specific?
  • Are all parties who are competent to discuss the issue involved?

Build a repository

Once you’ve come to a conclusion and provided a solution to the issue – document it. I know I just lost several readers there. Believe it or not, documentation will save your bacon at some point. Maybe not next week or next month, but at some point down the line it will. Some things to consider:

  • Were we able to identify the real cause to the problem?
  • Did we make the right decision?
  • Has the problem been resolved by the fix?
  • Have any key people accepted the solution?

I am reminded of a saying I once ran across:

Policies are many, Principles are few, Policies will change, Principles never do

Summary

Each day we encounter issues and problems. Don’t let them define you; rather, you define the issue. Oftentimes we overlook the root cause; remember to go through your process, policies, and standards when rectifying the problems at hand. It is better to tackle problems when they are known than to sweep them under the rug for the next data professional, who will then be faced with fixing them.

Hopefully this short post will provoke you to think about the issues you deal with on a daily basis and how best to tackle them.