T-SQL Tuesday #92, Lessons Learned The Hard Way

Wow, it's hard for me to believe it has been a little while since the last T-SQL Tuesday blog party. This month Raul Gonzalez (b|t) has chosen the topic of lessons one has learned the hard way. Before we get into the story, however, let's take a look at the who, what, when, and why of T-SQL Tuesday.

What Is T-SQL Tuesday?

T-SQL Tuesday was started by Adam Machanic (b|t) and is a monthly blog party. It occurs on the second Tuesday of each month, when a designated host picks a topic and fellow community bloggers publish a piece on it. In my opinion it has been a very useful exercise, and I'm looking forward to doing many more of these. It has been one avenue for bloggers to share their experiences while learning something new along the way.

If you are interested in hosting a T-SQL Tuesday on your blog then reach out to Adam.

Lessons Learned The Hard Way

A lot has transpired over a sixteen-year career thus far. Many lessons have been learned along the way; some were more difficult than others. It is also important to note that not every lesson a data professional learns has to be technical in nature. Let me see if I can split up some of the technical vs. non-technical lessons I've learned over the years.

Technical
  • Unit testing – who knew this would be so important, right? Having started out as a developer and then become a DBA, I have an appreciation for making sure things test out as they should; rigorous testing matters. Earlier in my career, I thought that's what we have QA for, right?
  • Backups – yeah, I've been burned early on by not having backups in place as they should have been. You want a dose of reality real fast? That's a good way to get one.
  • Blinders On – becoming so focused that you only take into account one part of the picture, when in essence what is being changed can affect a multitude of things.
  • Knowing vs. Doing – putting comments in code such as "this is probably not the best way to do things" is not the attitude to have when fixing a problem – been there, done that.
Non-Technical
  • Listening/Heeding Advice – this is key and something I did not learn until later in my career. It's not a skill set that I came out of the gate with; having a mentality that you are always right is not the best approach to take.
  • SME (Subject Matter Expert) – I enjoy helping people; it’s part of who I am. This is both a good and bad trait to have at times. If you are not careful you can find yourself overextending into areas where you think you know something but you don’t. Over the past several years I’ve learned that it’s okay to help people even if it is pointing them in the right direction to someone else. But be as sure as I’m typing this, I’ll always be willing to help and will never apologize for that.
  • Conflict Management – over the years I've worked alongside many data professionals and a wide variety of people. All of these experiences have equipped me over time to become better at dealing with conflict, which is never easy. A lot of lessons were learned along the way on this one.
Failure

I want to bring this topic up in a section all by itself. Having had a sports background for most of my life, and then morphing into an avid runner, I've had "failure is not an option" instilled in me from a very early age. That saying is fine, but by the same token, one cannot be afraid of failure. Some of the best lessons I've learned, both professionally and personally, have come as a result of something I tried and failed at. The key is not staying knocked down; look at it this way: if you aren't failing, you probably aren't trying and pushing the envelope.

In Summary

This is a great topic this month. Don't be ashamed or afraid of your journey and past failures or lessons learned. These are the things that mold and shape us into the people we are to become. May we continue to push the envelope both in technology and beyond, impacting and coaching others along the way. Always remember you started somewhere; remember how that felt? Pay it forward.

Built My Presentation, Now What?

Over the course of several years, I have given many technical and non-technical presentations. It is fun for me to put a new slide deck together, but it also requires a lot of hard work and can be time-consuming. I've had a few mistakes over the years, to say the least, where that one typo slips through or something doesn't go according to plan ~ guess what? It happens.

I compare preparing a presentation to testing something: you go over it again and again, just like you would test a backup process or verify that indexes are actually working. The same concept applies for me. I can't remember who in the SQL Community always mentions having a checklist handy; I know I've read it somewhere before, but the cobwebs are thick right now, so please forgive me if I don't remember. Through the years, I've managed to build my own checklist for presentations. It is the nuts and bolts of what works for me; it doesn't necessarily mean it will work for you.

In light of some past conversations I've had, I figured I'd share it with you all; maybe someone out there will benefit from it.

Presentation Checklist (a.k.a. Project Double Check Yourself)

What is the purpose – fully understand the purpose of the presentation. By that I mean, what outcome are you seeking?

  • To inform
  • To convince
  • To generate insight and discussion
  • To drive action

Know your audience

  • Do you know who your audience is? Have you provided adequate context to make it easier for them to understand?
  • Are there any personal motivations that you need to be aware of?
  • Is the audience familiar with the topic? Have you included adequate detail and background information?
  • Is the presentation tailored to fit the audience's communication style?

Know the message

  • If applicable, do you know the problem or issue you are trying to address?
  • Do you have three to five key teaching points you want to deliver? If so, have you tied those teaching points logically and clearly to the original problem?
  • Have you clearly linked your teaching points to key data or trends along with explaining how the analysis supports, confirms, or denies beliefs about the problem and/or possible solutions?
  • Have you limited the data to what matters most?
  • Have you clearly established relevance? (Why would your audience care? Have you clearly highlighted how this aligns with the target audience?)
  • Have you clearly established urgency? (Why would the audience act now; why is it critical?)

Structure

  • Is the presentation clearly marked with markers and signposts? Is it easy to follow?
  • Is there an agenda that clearly identifies the different elements and how they fit together? Is the key point up front?
  • Are there additional details about internal or external sources that were consulted for the included information? Give credit where credit is due.

Narrative

  • Does the presentation include insights that will be most influential to the audience? Is the scripting memorable and powerful?
  • Does the presentation identify key assumptions?
  • Does the presentation articulate immediate actions that you believe the audience should take?

Graphics

  • Do you know the purpose of each graphic? Is it tied to a teaching point in the message?
  • Do the graphics present information in a logical, visually appealing manner? Are there ways of interpreting the graphic other than the one you intended?
  • Is the page balanced?

Formatting

  • Does the presentation have a standardized look and feel (same headings, colors, fonts)?
  • Are page elements consistent (background, title, body text)?
  • Are colors used judiciously (to emphasize, highlight, and organize)?

Conclusion

Checklists: they are everywhere. They don't necessarily have to be for technical activities; heck, we use checklists for groceries. They are a part of our daily lives, so when you get that presentation built and you are ready to give it at your shop, on the job, at a conference, or for a client, take a few minutes and review a checklist. Make sure you have your house in order and that everything makes sense.

Remember, you get out what you put into something. Continue to work hard and hone the speaking and presentation talents that lie within you. Like I said, these are some of the things that have helped me over the years; it doesn't mean they are for everyone. On the flip side, you may have some tips of your own to share. I encourage you to do so.

 

QA, Utility Databases, and Job Executions

Sometimes we, as data professionals, have to think outside the box. I know, crazy idea, right? Each shop and situation is different; in most cases there will be several different ways you can arrive at a solid solution.

 

This post has a few intentions behind it:

  • This is not a "take this solution; it's the only way" post.
  • Generate some thought and additional methods to reach a goal.
  • This is not intended for a production environment.

Good, now that we have those few things out of the way, let's get to the meat of the topic. A situation arises where you want to give teams a bit more control to execute jobs without giving them full access to SQL Server Agent. In that case, a good utility database may come in handy.

Example of an issue: a QA team needs to kick off jobs to test something in a specific environment. Keeping in mind that each shop is different, security levels at various shops will differ as well. There are a few choices that may come to mind for this issue:

  • Fire off an email to the DBA team and wait for them to kick the job off.
  • Fire off an email to someone with access and wait for them to kick the job off.
  • Wait for the job's predefined schedule in SQL Server Agent and let it kick the job off.

Another method would be to utilize a utility database. You can give it whatever name meets your criteria; in this case we will just call it TestingJobs. Let's look at the overall picture below and how this all fits together:

Things you’ll need

  • Utility database
  • Two Stored Procedures
  • Table
  • Agent Job

Step 1: Create the TestingJobs database (I won't go into specifics here on proper setup; assume this is already created).
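
For reference, a minimal version of that step might look like the sketch below. It simply takes the instance defaults, which is generally fine for a throwaway, non-production utility database like this one.

USE [master];
GO

-- Create the utility database with default settings (non-production use only).
CREATE DATABASE [TestingJobs];
GO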

Step 2: Create a table called ControlJobs inside the TestingJobs database.

USE [TestingJobs]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[ControlJobs](
    [JobControlID] [INT] IDENTITY(1,1) NOT NULL,
    [JobName] [VARCHAR](500) NOT NULL,
    [RunStatus] [BIT] NOT NULL DEFAULT ((0)),
    [LastRanBy] [VARCHAR](50) NOT NULL,
    [LastRanByApp] [VARCHAR](150) NULL,
    [Date_Modified] [DATETIME] NOT NULL DEFAULT (GETDATE()),
    [Active] [BIT] NOT NULL DEFAULT ((1))
) ON [PRIMARY]

GO

SET ANSI_PADDING OFF
GO

Step 3: Stored procedure creation for table insertion (note the parameter @JobName).

USE [TestingJobs]
GO

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[test_StartJobs] ( @JobName VARCHAR(500) ) -- sized to match ControlJobs.JobName
AS
BEGIN

    /************************************************************************
    This script will insert the record needed to kick off agent jobs.
    ************************************************************************/

    INSERT INTO [TestingJobs].[dbo].[ControlJobs]
            ( [JobName] ,
              [RunStatus] ,
              [LastRanBy] ,
              [LastRanByApp] ,
              [Date_Modified] ,
              [Active]
            )
    VALUES  ( @JobName ,
              1 ,  -- flag the job as pending execution
              '' , -- LastRanBy (could also capture SUSER_SNAME() here)
              '' , -- LastRanByApp (could also capture APP_NAME() here)
              GETDATE() ,
              1
            );

END;

Step 4: Set up the stored procedure that will run the pending jobs.

USE [TestingJobs]

GO

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[RunPendingJobs]
AS
SET NOCOUNT ON;

DECLARE @JobName VARCHAR(500) ,
        @JobStatus INT ,
        @RC INT;

DECLARE cur_RunJobs CURSOR
FOR
    SELECT  JobName
    FROM    dbo.ControlJobs
    WHERE   RunStatus = 1
    ORDER BY JobName;

OPEN cur_RunJobs;

FETCH NEXT FROM cur_RunJobs INTO @JobName;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'Checking to see if job is currently running.';

    -- Returns 1 if the job is already running, 0 if it is not.
    EXEC @RC = dbo.GetCurrentRunStatus @job_name = @JobName;

    IF @RC = 0
        EXEC msdb.dbo.sp_start_job @JobName;
    ELSE
        PRINT @JobName + ' is currently running.';

    -- Clear the pending flag so the job is not requested again next cycle.
    UPDATE  ControlJobs
    SET     RunStatus = 0 ,
            Date_Modified = GETDATE()
    WHERE   JobName = @JobName;

    FETCH NEXT FROM cur_RunJobs INTO @JobName;
END;

CLOSE cur_RunJobs;
DEALLOCATE cur_RunJobs;

Step 5: Set up the stored procedure to check whether a job is already running.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO
CREATE PROCEDURE [dbo].[GetCurrentRunStatus] ( @job_name sysname )
AS
SET NOCOUNT ON;

    /* job_state is the execution status for the job:
       Value  Description
       0      Returns only those jobs that are not idle or suspended.
       1      Executing.
       2      Waiting for thread.
       3      Between retries.
       4      Idle.
       5      Suspended.
       7      Performing completion actions. */

DECLARE @job_id UNIQUEIDENTIFIER ,
        @is_sysadmin INT ,
        @job_owner sysname ,
        @Status INT;

SELECT  @job_id = job_id
FROM    msdb..sysjobs_view
WHERE   [name] = @job_name;

SELECT  @is_sysadmin = ISNULL(IS_SRVROLEMEMBER(N'sysadmin'), 0);
SELECT  @job_owner = SUSER_SNAME();

-- Working table shaped to match the result set of xp_sqlagent_enum_jobs.
CREATE TABLE #xp_results
(
    job_id UNIQUEIDENTIFIER NOT NULL ,
    last_run_date INT NOT NULL ,
    last_run_time INT NOT NULL ,
    next_run_date INT NOT NULL ,
    next_run_time INT NOT NULL ,
    next_run_schedule_id INT NOT NULL ,
    requested_to_run INT NOT NULL , -- BOOL
    request_source INT NOT NULL ,
    request_source_id sysname COLLATE DATABASE_DEFAULT NULL ,
    running INT NOT NULL , -- BOOL
    current_step INT NOT NULL ,
    current_retry_attempt INT NOT NULL ,
    job_state INT NOT NULL
);

INSERT  INTO #xp_results
        EXECUTE master.dbo.xp_sqlagent_enum_jobs @is_sysadmin, @job_owner, @job_id;

SELECT  @Status = running
FROM    #xp_results;

DROP TABLE #xp_results;

SET NOCOUNT OFF;

-- Return 1 if the job is running, 0 if it is not.
RETURN @Status;

Step 6: Job Creation

Create a SQL Agent job that calls the RunPendingJobs procedure in the TestingJobs database. You can set its schedule to every three minutes for this test.
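
A minimal sketch of that agent job, created through the msdb system procedures, might look something like the following. The job, step, and schedule names are placeholders; adjust them, the job owner, and the target server to fit your environment.

USE [msdb];
GO

-- Create the job that polls the ControlJobs table.
EXEC dbo.sp_add_job
     @job_name = N'TestingJobs - Run Pending Jobs';

-- Single T-SQL step that calls the utility procedure.
EXEC dbo.sp_add_jobstep
     @job_name = N'TestingJobs - Run Pending Jobs' ,
     @step_name = N'Run pending jobs' ,
     @subsystem = N'TSQL' ,
     @database_name = N'TestingJobs' ,
     @command = N'EXEC dbo.RunPendingJobs;';

-- Run every three minutes, all day, every day.
EXEC dbo.sp_add_jobschedule
     @job_name = N'TestingJobs - Run Pending Jobs' ,
     @name = N'Every three minutes' ,
     @freq_type = 4 ,        -- daily
     @freq_interval = 1 ,
     @freq_subday_type = 4 , -- minutes
     @freq_subday_interval = 3;

-- Target the local instance.
EXEC dbo.sp_add_jobserver
     @job_name = N'TestingJobs - Run Pending Jobs' ,
     @server_name = N'(LOCAL)';
GO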

The Benefit

Now think of a QA team member sitting at their desk running multiple tasks. This does take some coordinated effort in getting the job names, but once the basics are set up, the team member can execute test_StartJobs, which places the necessary information into the control jobs table. Of course, the proper security would need to be set up in order for the user to be granted access (think AD groups). By utilizing the above method, the team can be self-sufficient in a non-production environment, streamlining some of the inefficiencies that have plagued the group in the past.
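
For example, a QA team member with execute rights on the procedure could queue a job like this (the job name here is just a placeholder; use the actual name of an agent job in your environment):

USE [TestingJobs];
GO

-- Queue the job; the agent job running RunPendingJobs will pick it up
-- on its next cycle (within roughly three minutes on the test schedule).
EXEC dbo.test_StartJobs @JobName = N'Refresh QA Reporting Data';
GO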

Summary

A few things to note here:

  • Don't ever take code off the internet without testing it. This is just a thought-provoking post, and some things in it depend on you setting them up and testing them yourself.
  • I realize there are multiple ways to accomplish this. This is just an avenue to explore and test with some thinking outside the box.
  • Don’t limit yourself to “I can’t” or “This will not fly at my shop”; challenge yourself to become innovative and think of ways to tackle problems.

Can a Data Professional Be Organized?

"One of the advantages of being disorderly is that one is constantly making exciting discoveries." While this quote is true, by the time the exciting discovery is made it is often too late to act on the opportunity, and so it is missed. And if you are a leader at your respective shop, being disorderly can leave the perception that you are somewhat out of control.

Becoming a bit more organized in your day will allow you to keep your priorities clear in your mind and can help you orchestrate complex events with a masterful touch. You can transition smoothly from one project to the next without wasted motion. People will begin to believe the promises you make because you follow through on them. When you enter a meeting you are prepared for it, and showing that you know the topic pays off.

It might sound funny, but I run into a lot of data professionals, some organized and some not. I get it; it happens to everyone. But if you are interested in taking steps to become more organized, below are some thoughts that may help you on your journey.

Set Your Priorities

Sounds easy enough, right? Yet there are two things that are difficult to get people to do. The first is to do things in order of importance, and the second is to keep doing things in order of importance. Try listing out all your major responsibilities according to importance and the time needed to accomplish them. This will become the gauge that helps keep you on track and moving forward. Perhaps start with a monthly checklist.

Place Priorities In Your Calendar

Place this list in a prominent area such as your calendar. You could also share it with a trusted resource for accountability's sake, depending on the nature of the item.

Allow Time For The Unexpected

We all know that things will come up; that is inevitable. Based on your role as a data professional, build additional time into the priorities that need to be accomplished.

Do Projects One At A Time

A feeling of being overwhelmed is the result of too many projects that are clamoring for your attention. If this is something that happens to you then maybe try some of the following:

  • Itemize all that needs to be completed.
  • Prioritize things in order of importance.
  • Organize each project in a way that suits you, such as a dedicated folder.
  • Emphasize only one project at a time.

Work According To Your Temperament

If you are a morning person, then schedule time in the morning to be most effective; if you are a late starter, then do the opposite. Whichever holds true, be sure not to let the weaknesses of your temperament excuse you from doing what you know you need to do to work most effectively.

Use Your Driving Or Travel Time for Light Work And Growth

I was given some great advice a long time ago: whether you ride the subway or drive a car, use this time to reflect on your thoughts. I have several friends, for instance, who, while on the subway, knock out tasks or read a book that continues their growth as data professionals. No, I'm not saying never turn on the radio for a jam session, but you may find some useful time on the drive into the office.

Develop Systems That Work For You

Whether you utilize your phone, your computer, a calendar, or simply write tasks down, all of these systems are there to help you do things better and quicker. By improving them, you can decrease your expenses and increase your results. Don't fight the systems; instead, improve upon them. Remember, you are the CEO of your journey.

Always Plan For Those Minutes Between Meetings

I find myself in meetings constantly. That can be both good and bad. Hours can be saved by making the best use of the minutes between meetings. I try to keep handy a list of things that can be done anywhere in a very short amount of time, such as:

  • Email reply.
  • Call to make.
  • Thank you note to jot down.

Focus On Results, Not The Activity

Doing things right versus doing the right things? Focus on doing the right things and on what is truly important. Welcome responsibility and be responsible for who you are. It is rare to find a person who will be responsible, who will follow through correctly and finish the job. An old boss of mine once told me the following:

“I am only one, but still I am one. I cannot do everything, but I can do something; and because I cannot do everything I will not refuse to do the something that I can do”

Summary

Whatever the case may be, take steps to improve yourself along your journey and look for ways to improve the efficiency of your day-to-day processes. The items mentioned here are just meant to provoke your thought process, not to be a standard for you. If you're struggling, maybe try some of them out and see how they work for you. Don't expect more from others than you expect from yourself. Get after it and let's get it done.


SQL Doc by RedGate

I was recently on a call where a technical unit indicated they had not received any form of documentation for a vendor database that had been created. Seeing that I fall into the database profession, it sparked my fancy. I began to ask a few questions of the individual who was asking for this documentation; these questions are important because you have to determine whether there is a need for what was running through my mind. Sure enough, the technical team just needed some guidance on the overall structure and what they were dealing with in terms of tables, procedures, and so on. This group was trying to review and write a process around information they were not privy to.

My mind went straight to the SQL Doc utility that RedGate has available. It's really a simple utility to use, and it can often save the day in cases like the one above. Check out the steps below to see how easily the utility allows you to document a database on the fly:

Step 1: When you open the application you will be prompted to enter a server location, followed by how you'd like to connect to it (Windows or SQL authentication). In this case we'll just hook up to a local instance I have on hand.

Step 2: Once connected, you'll see some default settings: a cover page option along with the databases that you want to document.

Step 3: Looking at the project you’ll begin to review some of the following information:

  • Database Selection
  • Server Properties
  • Server Settings
  • Advanced Server Settings
  • Sections that are included in the report

For this specific test I'm just going to take a look at TempDB.

Under Object Types you are able to drill in and get as granular as you need. For a table in TempDB, for example, you can also enter a description of what each field is used for in the far right-hand column under Description.

Step 4: If you have to save this documentation out for any meetings or other purposes, you can create a cover page along with any logo information and a description. Simply click on the cover page option in the left menu and fill in the details.

Step 5: After all the choices are made, click the Generate Documentation button on the menu and you will be prompted for the output format and location.

Give it a location and a file name and BOOM; you're done.

Summary

You may find yourself in a situation where you need a quick hit for documentation purposes. If you are an avid RedGate user and enjoy using their SQL Doc product, or maybe you had this product and didn't even know what it was for, then you can benefit greatly from documenting multiple databases in a matter of minutes. This post shows what type of utility SQL Doc is and what it can actually be used for in a real-life circumstance. In the end it was the right product at the right time for a technical team in need. Well done, RedGate, well done.

T-SQL Tuesday #89 Invitation – The times they are a-changing

This month's topic by Koen Verbeeck (B|T) is based around times that are a-changing. To break it down somewhat, it was inspired by a blog post that Kendra Little (B|T) put together, Will the Cloud Eat My DBA Job. Koen wants to know, with the ever-changing world of technology, what kind of impact it has had on you and how you plan to deal with the future of data management/analysis.

To The Cloud

I think one of the areas where I've seen a gradual change is the conversation around the cloud. Being in the financial district, cloud talk is not always welcome; it is a myth to some. I am glad that a while back I heeded some of Grant Fritchey's (B|T) advice when he said you had better start learning cloud techniques sooner rather than later. The cloud discussion is not always a welcomed one, but it is one that needs to be had to keep up with innovative technologies.

While you work out the best solution for you and your respective area, the cloud does allow for flexibility and control. The main issues I see most shops running into are security concerns around a cloud model, along with costs as data sizes grow.

With the proper planning and oversight, the cloud is a viable option that should not scare away data professionals.

Third Party Utilities

I think over time some of the third-party tools have become game changers in a lot of shops. I see vendors such as SQL Sentry and Red Gate that have evolved over time to help streamline and provide better efficiencies around data management and monitoring. Staying cutting edge keeps the users of these third-party tools on edge and wanting more. Tying all of this into the automation of daily tasks, and not just on-premises monitoring but cloud monitoring as well, has been a huge plus for the community.

The Way Business Interprets Data

Data is what drives us; it is what a lot of decisions are derived from, and it sets the direction of companies and shops. The data, and how we look at it, is ever-changing. Take Power BI, for example. The methods we used years ago have morphed into a far better approach to delivering data to businesses. I never thought I would be watching a professional sports game and see people pull out their tablets and review live data between innings or sets of downs.

PowerShell

No, this isn't a tribute to Mike Fal (B|T) or Drew Furgiuele (B|T), but I do appreciate their nudge in getting me to utilize PowerShell for some of my everyday work. This could fall under the third-party utilities section above, but I thought it beneficial to state that in some cases it has been a game changer. Stumbling upon the DBATools website has been a blessing in disguise; I love getting to work with technology that I may not have utilized as much in the past.

Do You Feel Endangered?

No, and neither should you. Let me rephrase that: don't be afraid of change, for it allows us to learn new technologies and grow on our journey. There will always be opportunities to learn; each day you should strive to learn something new that you didn't know before. A wise SQL community member once told me, "The data will always be here; will you?"

T-SQL Tuesday

For those that are not aware, T-SQL Tuesday is a blog party started by Adam Machanic (B|T). Each month a community member hosts: they pick a topic, gather all the blogs that participated in the event, and provide a recap. It's a great way to share knowledge and an avenue to give back to the community. If you are an avid blogger and would like to be a host, then do reach out to Adam via the methods provided.

Don’t Duck On Responsibilities


Being a data professional, you assume a certain amount of responsibility. It often requires having the right attitude and an action plan in place for finding solutions to the problems at hand. Too many times we attack the symptoms causing the issue but overlook the root cause. The quick Band-Aid fixes are found many times over, whereas our job should be identifying the real issues that lie beneath the symptoms. Now, don't get me wrong – I understand that at times you have to stop the bleeding. In the end, though, one should uncover the root cause and make the permanent fix.

Prioritize the issue at hand

Chances are you, dear reader, encounter many problems throughout the day. Never try to solve all the problems at one time; instead make them line up for you one by one. It might seem odd, but make them stand in a single-file line and tackle them one at a time until you've knocked them all out. You may not like what you find when uncovering the root causes, but that is part of the process. Be careful in this uncovering and be cognizant that what you find may or may not be the root of all the problems.

Take time and define the problem

In its simplest form, take time out and ask yourself this question: "What is the problem?" Sounds easy enough, doesn't it? Yet you'd be amazed by the many knee-jerk reactions data professionals make all over the world. You may be thinking to yourself that there has to be more to it than that. Think about it in four easy steps:

  • Ask the right questions – if you only have a vague idea of the situation, then don't ask general questions. Do not speculate; instead ask process-related questions, things relating to trends or timing. What transpired over the course of the week that may have led to this issue?
  • Talk to the right people – you will face people who inevitably have the all-knowing and always-correct way that things should be done. Use caution with them, as you may find resistance to change and blind spots in these individuals. Creativity is, at times, essential to any problem-solving effort.
  • Get the "set in stone" facts – once the facts are all laid out and defined, you may find that the decision on what action should be taken is pretty concise and clear.
  • Be involved – don’t just let the first three steps define you; get involved in the process of being the solution.

Questions to ask yourself regarding the problem

  • Is this a real problem?
  • Is it urgent?
  • Is the true nature of the problem known?
  • Is it specific?
  • Are all parties who are competent to discuss the issue involved?

Build a repository

Once you've come to a conclusion and provided a solution to the issue – document it. I know I just lost several readers there. Believe it or not, documentation will save your bacon at some point. Maybe not next week or next month, but at some point down the line it will. Some things to consider are:

  • Were we able to identify the real cause to the problem?
  • Did we make the right decision?
  • Has the problem been resolved by the fix?
  • Have any key people accepted the solution?

I am reminded of a saying I once ran across:

Policies are many, principles are few; policies will change, principles never do.

Summary

Each day we encounter issues and problems. Don't let them define you; rather, you define the issue. Too often we overlook the root cause; remember to go through your process, policies, and standards in rectifying the problems at hand. It is better to tackle problems when they are known than to sweep them under the rug for the next data professional to come along and be faced with fixing them.

Hopefully this short post will provoke you to think about the issues you deal with on a daily basis and how best to tackle them.
