Sessions For PASS 2012 Released

I was excited to see that the sessions have been released for this year’s PASS 2012 Summit.

If you’ve been involved in the SQL community for any length of time, then hopefully PASS has come up here or there. I can say from personally attending in the past that the knowledge gained and the networking you will do are invaluable.

Some of my peers have asked what the pros are and why they should attend such a Summit. Again, I can only say that you get out of it what you put into it. Go into the Summit with the mindset of learning and gaining knowledge from the best speakers the SQL community offers ~ well, you’ll walk away amazed.

I think PASS does a great job of detailing what they are about, why you should go, and who should attend. You can check all that out here.

Everything you will need to know about the Summit, from lodging to pricing, is available on their site. If you are attending ~ GREAT; if you are on the borderline, then I implore you to go; and if you don’t think you are going, I ask you to take a hard look at it in hopes of changing your mind.

Let’s make this PASS Summit the best yet. Hope to see you there!

Granting View Definition to All Procedures

The other day I came across a situation where I needed to grant view definition rights to a specific user for one single DB. Now, there are several different ways to achieve this, but I decided to dabble in writing some code to take care of it for me. Others might have something very similar, or different mechanisms for doing this, but here is the approach I used.

The basic gist is as follows:

  1. Create a temp table – you can use a # table instead; just drop it when done with it
  2. Declare variables
  3. Insert data into the temp table – we only want the procedures, and nothing with a dt_ prefix
  4. Loop over the procedures found in the DB and dynamically build the grant statement into a SQL string
  5. Execute the SQL string
  6. Increment the counter and reset the name variable just for good measure
  7. Select from the temp table to view what was just granted view definition

The code for doing this:

DECLARE @PermTable TABLE
    (
      PK_ID INT IDENTITY(1, 1) NOT NULL PRIMARY KEY CLUSTERED ,
      [name] SYSNAME
    )

-- declare variables
DECLARE @name SYSNAME ,
    @RowCount INT ,
    @RecordCount INT ,
    @strSQL NVARCHAR(2000)

-- grab the procedures, skipping the old dt_ system procedures
INSERT INTO @PermTable
        ( [name] )
SELECT  ROUTINE_SCHEMA + '.' + ROUTINE_NAME
FROM    INFORMATION_SCHEMA.ROUTINES
WHERE   ROUTINE_TYPE = 'PROCEDURE'
        AND ROUTINE_NAME NOT LIKE 'dt_%'

-- counters for the while loop
SET @RecordCount = ( SELECT COUNT(*)
                     FROM   @PermTable
                   )
SET @RowCount = 1

WHILE ( @RowCount < @RecordCount + 1 )
    BEGIN
        SELECT  @name = [name]
        FROM    @PermTable
        WHERE   PK_ID = @RowCount

        SET @strSQL = N'GRANT VIEW DEFINITION ON '
            + RTRIM(CAST(@name AS NVARCHAR(128))) + N' TO [User]'

        -- execute the SQL
        EXEC(@strSQL)

        -- increment the counter
        SET @RowCount = @RowCount + 1

        -- never know, so why not reset
        SET @name = NULL
    END

SELECT *
FROM @PermTable
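As an aside, if the goal is simply to let a user see the definitions of every object in the database (not just the stored procedures), SQL Server also supports granting VIEW DEFINITION at the database scope in a single statement. The [User] name below is a placeholder, just as in the script above:

```sql
-- Covers every object in the current database, no loop required
GRANT VIEW DEFINITION TO [User]
```

The loop approach is still handy when you want object-level granularity, or a record of exactly which procedures were touched.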

Hope this helps; cheers.

 

“Make It Right”

Over the years I have grown accustomed to watching a show called Holmes Inspection. I enjoy working on projects at the house ~ am I good at it? Eh, some things yes, some things no, but nevertheless I do enjoy it.

The premise of the show is that Mike Holmes goes into homes in Canada where contractors have come in and basically not done the job they should have done, or where families have bought homes and the inspector hadn’t caught obvious issues to report back to the family. His biggest quote, and what he is known for, is “Make It Right”; he is also a big believer in “Do it right the first time and you won’t have to come back to it”. He has built his show and business around three things:

  1. Quality
  2. Integrity
  3. Trust

I’ve started to sit and dwell on these things and how they apply to the DBA work that I enjoy doing.

Quality – without a doubt, the quality of a DBA’s work is very important. In some instances it can make or break a company, from security down to the backup of the tiniest of databases. If a DBA is not on top of his or her game daily, potentially catastrophic events can occur. This in no way means that the DBA will be perfect; we hold ourselves to very high standards, but at the end of the day we are human and mistakes will be made. When those mistakes do occur, how do we handle them?

Integrity – one definition found in the dictionary says this about integrity: “possession of firm principles: the quality of possessing and steadfastly adhering to high moral principles or professional standards.” I can’t think of a better synopsis of a DBA.

Trust – as a DBA, do people trust you? To me this is an obvious question: if you are a DBA, you have all the ins and outs of the database where all the data is stored. A whole topic could be delved into on trust alone.

I’ve done things in the past, and have viewed other DBAs’ work done before me at jobs, where I sit back in amazement and think, “Why in the world is it done this way?” or “What was I thinking?”

I find it very prudent to “Do It Right” the first time, even if it means taking a bit longer to accomplish the task and ensuring all aspects (if applicable) have been thought out. Along those same lines, if I come upon something that I know isn’t right, then “Make It Right” ~ quality, integrity, and trust.

 

T-SQL Tuesday #31 – Logging

Looking at my schedule, I realized after discussing with fellow DBAs that I was missing out on something ~ T-SQL Tuesday. I think this is a fantastic idea; to host a T-SQL Tuesday you can contact Adam Machanic.

Today is T-SQL Tuesday #31, hosted by Aaron Nelson, on the topic of Logging.

Logging

In general, logging is an essential part of my everyday activities as a DBA. It helps me in many ways, from tracking changes deployed by fellow DBAs and myself to bringing back statistical information that lets us determine growth.

I recently came across two real-world examples where logging has helped me.

  1. A DBA friend was in the middle of a deployment; after several minutes the deployment was labeled complete and everyone went on their merry way. A few days later some catastrophic events unfolded, and everyone wanted to know why. Because we were logging the who, what, and when, along with the commands issued, we quickly traced back the change and what was changed by the user.
  2. I’m a huge fan of the Utility Database idea; I have been ever since I sat in on it at PASS (Chris Shaw presented it). After getting back to the office it was my mission to get this idea implemented. Currently, we are building out an enterprise edition that encapsulates a plethora of information: statistics, heartbeats, file sizes, space on drives, and the list goes on. Logging this information will help guide planning for sizing and space.
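As a rough sketch of that second example, a nightly job could capture file sizes from sys.master_files into a utility table for growth trending. The UtilityDB database and FileSizeHistory table names here are purely illustrative:

```sql
-- Hypothetical nightly capture of data/log file sizes for growth trending
INSERT INTO UtilityDB.dbo.FileSizeHistory
        ( CaptureDate, DatabaseName, FileName, SizeMB )
SELECT  GETDATE() ,
        DB_NAME(mf.database_id) ,
        mf.name ,
        mf.size * 8 / 1024   -- sys.master_files.size is in 8 KB pages
FROM    sys.master_files mf
```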

I’ve only touched on a couple of topics regarding logging; there is so much more out there.

Roles, Schemas, and Drops Oh My!

I recently ran into an issue where a development manager was creating a new role in SQL for his data warehouse initiative and cube building. He granted the role ownership of a newly created schema, then changed his mind about a few items.

I explained to him that I had run into this in the past, and provided the following so that he could transfer the owned schema to another user or role and then drop the role he created.

If you try to drop the role while the schema is owned by it, SQL will produce the error, “The database principal owns a schema in the database, and cannot be dropped”.

In this particular instance we only had one schema tied to the role, but if a user or role owned many, you can run the query below to determine which schemas are tied to the user:

SELECT s.name
FROM sys.schemas s
WHERE s.principal_id = USER_ID('testname')

The next query will transfer a schema you want to move over to dbo, for example, so you can remove the role and then re-assign it out as needed:

ALTER AUTHORIZATION ON SCHEMA::SchemaName TO dbo
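Once ownership has been transferred, the role can then be dropped without hitting the earlier error; the role name below is the same illustrative one used in the query above:

```sql
-- Succeeds now that the role no longer owns any schemas
DROP ROLE [testname]
```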

As always I recommend testing locally or in a test environment prior to doing any type of prod related activity.

Database Auditing ~ Who Did What?

Real-life scenario ~ I’ve seen companies I’ve come into that do not have their production environment on lockdown, and everyone who is anyone has access to the environment.

I like the idea of having a table with the following columns:

  • AuditID – unique identifier
  • AuditDate – date the event occurred
  • LoginName – who performed the event
  • ObjectName – what object was affected
  • Command – the command that was executed
  • EventType – what kind of event occurred
  • ServerName – the server the event occurred on
  • DatabaseName – which database the event occurred on
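A table definition along those lines might look like the following sketch; the dbo.DatabaseAudit name and the column types are just one reasonable choice, not a prescribed layout:

```sql
CREATE TABLE dbo.DatabaseAudit
    (
      AuditID INT IDENTITY(1, 1) NOT NULL PRIMARY KEY ,
      AuditDate DATETIME NOT NULL DEFAULT GETDATE() ,
      LoginName VARCHAR(255) NULL ,
      ObjectName VARCHAR(255) NULL ,
      Command NVARCHAR(MAX) NULL ,
      EventType VARCHAR(100) NULL ,
      ServerName VARCHAR(255) NULL ,
      DatabaseName VARCHAR(255) NULL
    )
```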

Once the table is created you can create a database trigger that will dump data into your table so you can track who is doing what in the database at any given time. I’ve used this practice at previous job sites and was happy to see this being utilized at my current employer.

The trigger will look something like this:

CREATE TRIGGER [tr_Trigger Name]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON

    DECLARE @EventData XML
    SET @EventData = EVENTDATA()

    BEGIN TRY
        INSERT INTO [DATABASE NAME].dbo.[TABLE NAME]
                (
                  AuditDate ,
                  LoginName ,
                  ObjectName ,
                  EventType ,
                  ServerName ,
                  DatabaseName ,
                  Command
                )
        SELECT  GETDATE() ,
                t.c.value('(LoginName)[1]', 'VARCHAR(255)') ,
                t.c.value('(ObjectName)[1]', 'VARCHAR(255)') ,
                t.c.value('(EventType)[1]', 'VARCHAR(100)') ,
                t.c.value('(ServerName)[1]', 'VARCHAR(255)') ,
                t.c.value('(DatabaseName)[1]', 'VARCHAR(255)') ,
                t.c.value('(TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)')
        FROM    @EventData.nodes('/EVENT_INSTANCE') t ( c )
        -- Note: you can add a WHERE clause to exclude statistics events if you so desire
    END TRY
    BEGIN CATCH
        EXECUTE [DATABASE NAME].dbo.[SP TO CAPTURE ERRORS]
    END CATCH
END

I’ve found this technique useful for running reports after deployments on what was deployed, for going back for future reference on what was done, for sorting out issues where team members didn’t know what someone had been doing so it could be corrected, and just for the overall satisfaction of knowing what is occurring on the database itself.

If you’re not sure what is occurring on your databases, then maybe this is something you can try to help build a base for capturing events, from drops and creates to grants, etc.

 

Where Did The Time Go?

A lot has been happening over the course of the last month. It was very nice to take a much-needed vacation with the family, and upon coming back, our team has had the opportunity to upgrade several of our SQL Servers to 2012 (which I’m thoroughly excited about).

While I was out, SQL MVP Chris Shaw had the opportunity to host T-SQL Tuesday #30, where he discussed ethics and the DBA. I felt it prudent to blog on this even though I missed the timeline, as it seems to be a pretty important topic that will always be around, with a lot of opinions.

After reading SQL MVP Chris Shaw’s post, I realized I have always had a preconceived notion of what I deem ethical in my day-to-day activities as a DBA. I have my own standards that I set high for myself and strive to maintain, but have I ever given much thought to a set of authoritative rules for a DBA governing body? It has crossed my mind, but like others I wonder how in the world one would police such a thing, and who would be responsible for it.

The institution I work for could very well have a totally different set of standards than the institution you work for, so how would that play out? No, to me the situation is so diverse that I don’t see a single set of ethical standards being derived for DBAs in general, but I do think every DBA should hold themselves to a very high set of standards even if the company where you work does not have any in place. DBAs in general are responsible for so much, and are such an integral part of a company, that a lack of standards can, I truly believe, cause severe damage.

After reviewing the posts by so many others, I found it interesting to see everyone’s points of view. DBAs are a brotherhood; we are a close-knit community. We have high standards for ourselves, which is the way it should be. Take pride in what we do and give it 110%.

To review the T-SQL Tuesday #30 wrap-up on Chris Shaw’s site, you can go here.

Drop and Create Databases ~ SQLCMD Style

Sometimes there are instances when I need to drop 6 or 7 databases on my local instance. Currently I run 3 instances locally; two are for OLTP duties and one is for reporting. One method I like for dropping the databases is based on just a couple of SQL files and a .bat file. First, I make sure that replication is turned off on my instances. If replication is not turned off, the process below will error out when trying to drop the databases.

SQL FILE CREATION FOR DROPPING AND CREATING THE DATABASES

In my case I need to drop multiple databases so my SQL file resembles:

DROP DATABASE db1
DROP DATABASE db2
DROP DATABASE db3
DROP DATABASE db4
DROP DATABASE db5
DROP DATABASE db6
DROP DATABASE db7
DROP DATABASE db8

Once complete, I create a generic directory on my C:\ drive and save this file as DropDatabases.sql. The create database script is the same script; just replace the drops with creates. Keep in mind that on the CREATE DATABASE statement you can add more default settings to the script, but I will not go into those here. You can also tinker with this script and add some safeguards, such as checking to see whether the database exists before dropping or creating it. This is just an idea; it’s up to you how you use it.
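If you do add that existence check, one minimal way to sketch it (using db1 from the script above) is:

```sql
-- Only drop the database if it actually exists on this instance
IF DB_ID(N'db1') IS NOT NULL
    DROP DATABASE db1
```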

CREATING THE .BAT FILE(s)

I use the sqlcmd utility in my .bat files to make calls to the server instance and a specific file. Simply open up a notepad editor and type the following command(s):

sqlcmd -S (local) -i"C:\CI\DropDatabases.sql"

sqlcmd -S (local) -i"C:\CI\CreateDatabases.sql"

Then save the files as DropDatabases.bat and CreateDatabases.bat.

The two statements call either the DropDatabases.sql file or the CreateDatabases.sql file.

—————————————————-

I believe the SQLCMD utility is a pretty useful tool that I will eventually incorporate into my continuous integration initiatives. Those of you who know me, or have followed my blog for a long period of time, know that I am a Red Gate fan. Because of this I have many useful tools at my disposal, and I am currently going to set up a form of CI following some of their practices, which can be found here.

MSDN provides a thorough listing of SQLCMD syntax and variables, which you can view here.

I enjoy trying different ways of doing things that help with daily performance initiatives. Hope this helps someone else in the future.

To Document or Not To Document ~ That is the question

Recently, I approached a person in a business unit and asked if they had some standard documentation on one of their processes. The reply I received, in general terms, was, “No, we do not have any standard documentation, but we do continue to build on our processes.”

Knowing how I am, I began to read into it, and I couldn’t get past that statement. Then it dawned on me: of course everyone is different, and we all have different mindsets. I feel my DBA processes must be documented, but others will not necessarily share that same viewpoint – especially in other business units.

I’ve always heard that documenting is an age-old battle; most people I know don’t like doing it, but I believe it is a necessity. I’m not the best at it, but having standards in place, whether within a team or as a solo act, is beneficial for the next person who comes in. One of my old mentors several years ago told me that if he walked out to lunch and something happened to him, he would want me to be able to pick up a process, even one I didn’t know, look at his documentation, and be comfortable completing the task. Somewhere along the way that stuck with me. Some of the standard documentation we have put in place for the DBA team that I’m on includes:

  • Backup Procedures
  • Job Retention
  • Security
  • Maintenance
  • Server Installations
  • Code Promotions
  • Object Naming Conventions
  • Job Notifications

The list could go on, but those are some high-level ones. Documentation made it onto SQL Server Pedia’s “10 Things Every DBA Should Do” – the article came from John Sansom, who really has a great blog; I suggest you check him out.

One of the things I like to do when starting new employment is to look at the documentation; in some cases it doesn’t really exist. Where it does, I use it to gain knowledge of the systems and what they do. Documentation… “To Document or Not To Document ~ That is the question.” To me it is a no-brainer.

Drop me a line and let me know whether you think documentation is important, and share some standards that you may have in place.

Someone Forget to Logoff that Remote Desktop Session?

 


The phone rings… alerts start going off… the coffee hasn’t been made yet… it’s 2:13 a.m., and the issue requires you to remote into the server to perform some task, only to find out that the server has already exceeded the maximum number of users. Has that happened to you before? It has to me; either other people are on the box looking at something else, or someone forgot to log off when they were done using the box.

When this occurs I like to use the QWINSTA/RWINSTA commands from the command line to help me identify who is on the box and then graciously log them off. I know there are other methods to complete the same task; however, I tend to roll with this one as it is ingrained in my head.

To display information about terminal sessions, you can type the following command at the cmd prompt:

  • qwinsta /server:[servername]

This command will give you the session name, username, ID, state, type, and device.

This information is good; however, how can you get someone off the server? That can be accomplished with a similar command. However, I DO RECOMMEND that you check with the individual prior to just booting them out, unless it is an extreme emergency:

  • rwinsta /server:[servername] [sessionid]

You need to make sure that you capture the session ID shown in the qwinsta output in order to log them off.

I know Windows administrators go the Terminal Services Manager route, but if you’re like me and enjoy some good command-line tools, this can be another option for you; especially when you are woken up at night and the other people logged in could be fast asleep!