
DallasDBAs.com

SQL Server Database Consulting


T-SQL Tuesday: Your Fantasy SQL Feature

September 10, 2019 by SQLDork

T-SQL Tuesday is a monthly blog party for the SQL Server community. It is the brainchild of Adam Machanic (b|t), and this month’s edition is hosted by Kevin Chant (b|t), who has asked us to describe “Your Fantasy SQL Feature”. (This is also the first time I’ve written a blog post for T-SQL Tuesday.)

The Problem: As a junior DBA, part of my job involves checking job failures for 2-3 clients each day, which we have an in-house script for. Every once in a while, a job actually fails! So, like any DBA who knows you can go to the Job Activity Monitor under SQL Server Agent to view details on job history, I do exactly that, only to find it’s a job that ran an SSIS package and gave me this very informative error message:

That’s super helpful, thanks!

So now I have to:

  • Check the step properties
  • Look at the path for the SSIS package
  • Drill down into the Integration Services catalog
  • clickity, clickity
  • Wait which folder was it in?
  • Go check the path again
  • clickity clickity
  • There are 12 different packages with very similar names:

  • Resize the window or scroll over to see which one I care about
  • Finally, open the All Executions Report to get a very helpful error message:

Gee darnit, I’m out of coffee.

The Solution: Just put a button in the failure message/job history that does all that for you.  Or a hyperlink. Whatever.

Kevin Hill (b|t) has made a script that reads the SSISDB tables, but it often takes a while to run with only the existing Microsoft indexes, especially if you are keeping crazy amounts of history…like the MS default of 365 days, on a busy system.
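If you want to skip the clickity-clickity entirely, here is a minimal sketch of the idea (it is not Kevin’s actual script, and the time window is just illustrative): query SSISDB’s catalog views directly for recent package errors.

--A sketch: recent SSIS package errors straight from SSISDB
SELECT TOP (50)
	e.execution_id,
	e.folder_name,
	e.project_name,
	e.package_name,
	em.message_time,
	em.message
FROM SSISDB.catalog.executions AS e
JOIN SSISDB.catalog.event_messages AS em
	ON em.operation_id = e.execution_id
WHERE em.event_name = 'OnError'                             -- error events only
  AND e.start_time > DATEADD(DAY, -1, SYSDATETIMEOFFSET())  -- last 24 hours, adjust to taste
ORDER BY em.message_time DESC;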

/*Please work*/,

SQLDork

Filed Under: SSIS, TSQL2sday

T-SQL Tuesday #118: Fantasy SQL Feature

September 10, 2019 by Kevin3NF

T-SQL Tuesday is a monthly blog party for the SQL Server community. It is the brainchild of Adam Machanic (b|t) and this month’s edition is hosted by Kevin Chant (b|t), SQL dude and fellow cyclist!  Kevin wants to know what our “Fantasy SQL Feature” is.

I asked my LinkedIn connections a very similar question in my “Question of the Week” there.  Hit the link to see a lot of different responses.

One of the recurring themes, and possibly the most common, was:

Load Balanced Writes.  Unless I’ve completely missed something, there is nothing in SQL Server natively that will allow you to write to different copies of the database (leaving out Merge and Bi-Directional Transactional Replication, because they suck and don’t scale).

Don’t get me wrong…I love read-only replicas in an Availability Group, replication for reporting, or even delayed Log Shipping.  They are wonderful for taking the read traffic off an OLTP box, but if you are dying under heavy load in a very optimized setup…oh well.

Bonus Fantasy: Give me a button in the Log Shipping Setup GUI (Database Properties>>Log Shipping) that just says “Re-initialize”.  Sometimes LS just falls apart and it’s easier to tear it down and start over.  Even easier is to just back up the db, restore over the Secondary db and go.  But, for the small business using LS for poor-man’s DR and no DBA on staff…give me a button.  Ideally that button calls sp_ReInit_Log_Shipping @DBName = ‘MyDB’.  But someone will have to write that first.  I am not that someone.  This may already be in the DBATools.io set of toys, but I haven’t looked, and again…small customers need a button in a GUI.

Thanks for reading!

Kevin3NF


Filed Under: HADR, Performance Tuning, TSQL2sday

T-SQL Tuesday: Fantasy SQL Feature

September 10, 2019 by SQLandMTB

T-SQL Tuesday is a monthly blog party for the SQL Server community. It is the brainchild of Adam Machanic (b|t). This month’s edition is hosted by Kevin Chant (b|t), who has invited us to share our fantasy SQL feature.

From Jeff, The Ironic DBA (Dallas DBAs Apprentice):

I haven’t been poking around in production servers very long, and so far my main responsibility is customer server reviews. I’ve blogged a couple of times about tweaking existing scripts to massage the date output from the Error Logs and Job History for readability.

One of the obstacles I’ve identified working with SQL Server is the sheer depth of the product. There’s so much to learn it’s hard to get up to speed quickly. With Microsoft’s recent advances in AI, I see a potential area where they could add a nice little feature inside SSMS.

Take a look at this video from Patrick LeBlanc (b|t) (one-half of Guy in a Cube) where he demonstrates a Power BI feature called “Column from Examples”—specifically where he plays around with datetime formatting at about 2:15.

Essentially, Power BI uses Power Query to let you provide examples for column formatting and then uses the underlying Power Query AI to figure out what you want. Then, it writes the appropriate M language formula for you.

A similar implementation within SSMS would be a pretty cool time saver. SSMS already writes T-SQL scripts for you anytime you use a wizard, so having it write T-SQL commands that output a desired style of formatting isn’t too far outside the realm of possibility.
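Purely as an illustration (this is my guess at the kind of statement such a feature might generate, not anything that exists in SSMS today), imagine typing an example like “2019-09-10 14:30” and getting back something along these lines:

--The sort of formatting T-SQL a "from examples" feature might generate
SELECT
	CONVERT(varchar(16), GETDATE(), 120) AS [Style 120, trimmed],	-- 2019-09-10 14:30
	FORMAT(GETDATE(), 'yyyy-MM-dd HH:mm') AS [FORMAT version];	-- same result via a .NET-style pattern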

What are some other ways you could see Power Query implemented within SSMS?

Filed Under: SSMS, TSQL2sday

The Ironic DBA—Failure Is Always an Option

September 5, 2019 by SQLandMTB

Welcome back to The Ironic DBA Files, a series where the newbie DBA on staff at Dallas DBAs chronicles the ups and downs of fast-tracking a new career as a DBA.

In the last episode I showed how I tweaked some in-house scripts to provide more user-friendly output. This time around I want to revisit those scripts and give my fellow beginner DBAs some insight on some very basic troubleshooting.

Failure is Always An Option

I’m a big Mythbusters fan, and was saddened when the show eventually went off the air. There’s so much I learned about how the world around me works by watching the antics of Adam Savage, Jamie Hyneman, and the rest of their crew. I still follow both of them on Twitter (links above), and also regularly watch Adam Savage’s Tested on YouTube.

Several pithy phrases were said over the many seasons of Mythbusters episodes, including gems like, “I reject your reality and substitute my own,” and “Jamie likes big boom.” My favorite line from the show is “Failure is always an option.”

My family has been rewatching some of the show’s episodes, and in one of their final shows Adam mentions how they wouldn’t have been able to accomplish that particular episode’s goals without their previous years of experience. If you were to go and watch every episode, you’d quickly realize that the Mythbusters failed more often than they succeeded. It’s through repeated trial and error that they learned the most beneficial lessons.

Learning about SQL Server’s capabilities has been a similar sort of journey. It’s still early days for me, but I’m sure that most senior DBAs out there will tell you that the knowledge they’ve gained over the years has been full of “that didn’t work” moments. For fun, check out this video from Bert Wagner (b|t) about SQL Fails.

No matter how much I try to remember all of this, I’m still the sort of person who gets that flip-flopping stomach feeling when I mess up or can’t figure something out right away. It’s in moments like these that I have to take a breath and remind myself that I’m still learning. SQL Server is a very complex piece of software—so complex that I doubt there is any one person who knows EVERYTHING about it, not even the people who’ve worked to develop it over the years.

Failure is always an option. As a SQL Server DBA, you’ll soon learn that your client’s servers will fail—no matter how good you are at your job. How will you identify those failures? Here’s one way.

There’s Your Problem

SQL Server has a bunch of built-in tools and resources that help identify failures and errors. Having said that, there’s a learning curve involved that’s sort of like baking a loaf of bread. The components are all there at your fingertips, but you need someone to show you how to use them properly.

One of the scripts I run daily, the Read Errorlog script, has a bit of code that looks like this:

--Dump all the things into the table
	insert into #Errorlog
	EXEC sys.xp_readerrorlog 
	0 -- Current ERRORLOG
	,1 -- SQL ERRORLOG (not Agent)

What’s relevant here is understanding WHAT is being read when this script is run. I’m not all that concerned today with showing how we massage the output, just where the information is coming from.

The line EXEC sys.xp_readerrorlog is executing a widely-known but undocumented Extended Stored Procedure. This is why you see “xp” in the scriptlet. If a regular Stored Procedure were being executed you’d see “sp” instead.

NOTE: You’ll see the following message at the top of the MS Docs related to Extended Stored Procedures: “This feature will be removed in a future version of Microsoft SQL Server. Do not use this feature in new development work, and modify applications that currently use this feature as soon as possible. Use CLR Integration instead.” We will probably need to rewrite our in-house scripts some time in the future to stay current.

Error logs are not stored in the database, but rather in text files on the host server. So, this Extended Stored Procedure reaches outside of SQL Server’s databases to read the error log text files from the server’s file system.

What is the procedure reading? xp_readerrorlog is pulling information from the files you can find in the Object Explorer under Management–>SQL Server Logs (highlighted in green). The Extended Stored Procedure helps make our lives as DBAs just a little bit more efficient by pulling the relevant information from the text files for us rather than forcing us to view each individual log file and scroll through hundreds of lines of results.

What’s also important to note here is what our version of the Read Errorlog script is NOT reading. Notice the Error Logs folder highlighted in red in the Object Explorer. You can find it under SQL Server Agent–>Error Logs. We don’t care about those error logs for this particular task.

xp_readerrorlog accepts several parameters. The two we use most often are the Log Number and Log Type parameters.

The Log Number parameter we pass is “0”, which tells the procedure to read the current log. The Log Type parameter we pass is “1”, which tells it to read from the SQL Server Logs (green) and NOT from the SQL Server Agent Error Logs (red).

Beyond this, we are then able to use our script to tell SQL Server what data we’d like returned from the logs, rather than having it output every single line. For instance, if we’re specifically looking for deadlocks, our SELECT statement can be written to only look for LogText like ‘%deadlock encountered%’.
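Putting those pieces together, here is a minimal sketch of the whole pattern; the temp table definition and column names are illustrative rather than our exact in-house script:

--Create a table to hold the log output
CREATE TABLE #Errorlog (
	LogDate datetime,
	ProcessInfo nvarchar(100),
	LogText nvarchar(max)
);

--Dump all the things into the table
INSERT INTO #Errorlog
EXEC sys.xp_readerrorlog
	0	-- Current ERRORLOG
	,1;	-- SQL ERRORLOG (not Agent)

--Only return what we care about
SELECT LogDate, LogText
FROM #Errorlog
WHERE LogText LIKE '%deadlock encountered%'
ORDER BY LogDate DESC;

DROP TABLE #Errorlog;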

That’s all for this week. Join me next time for the next episode in The Ironic DBA Files.

Follow me on Twitter at @SQLandMTB, and if you’re into mountain bikes come over and check out my site NTX Trails.

The Ironic DBA Files

        • Prequel: The Ironic DBA—Starting a New and Unexpected Career
        • Episode 1: You Back That Up?
        • Episode 2: Attack of the Corruption
        • Episode 3: Revenge of the Index
        • Episode 4: A New Primary Key
        • Episode 5: The Maintenance Plan Strikes Back
        • Episode 6: Return of the TSQL
        • Episode 7: The Backup Awakens
        • Episode 8: The Last Rebuild
        • Episode 9: Rise of the Clients
        • Review One: A SQL Story
        • It’s Hip to Be Square
        • Rock Around the Clock


Filed Under: Apprentice, Beginner, Career, EntryLevel

The Ironic DBA—Rock Around the Clock

August 27, 2019 by SQLandMTB

Welcome back to The Ironic DBA Files, a series where the newbie DBA on staff at Dallas DBAs chronicles the ups and downs of fast-tracking a new career as a DBA.

Last week I shared why you shouldn’t completely hate [square brackets], and this week I’m going to build on that theme a little bit more by showing you some minor tweaks to some scripts we use here at Dallas DBAs on a daily basis.

I Love it When Something Unplanned Comes Together

If you’ve been following my weekly Ironic DBA posts, you know that I’m new to this gig and have been learning things as rapidly as I can. It’s a little like being thrown into a pool to learn how to swim—though not the deep end. I’ve chronicled what I’ve learned each week and attempted to share it with the world. A funny thing about this week’s post requires a little backstory.

Kevin and I have known each other for years, and our families have gotten together weekly to play games for the last several years. Last time we were sitting around the table together, we were talking about my posts and I jokingly said something like, “If I don’t have a topic for a blog post in any given week, you need to ask me what I’m doing with my time.”

Then I proceeded to struggle to come up with a topic for this week. Oh, the irony.

Archimedes takes a bath and learns a thing.

What’s really cool is that I obviously did come up with something…unless the rest of this post is simply a ramble. I’ve long believed (as a former school teacher) that two of the best methods for learning are immersion and repetition. That’s been my approach to any of my self-guided SQL Server studies, and it’s paid off so far. Earlier this week I had an Archimedes-type Eureka! moment when various threads I’d been pulling on all came together.

One of the stepping stones I’ve been using in my studies is Kevin’s post “Top 10 SQL Server functions every DBA should know by heart.” I’ve revisited that post several times but don’t always have many relevant opportunities to put those functions into practice on my VM. Either way, I’ve been using the repetition method to remind myself that these functions exist. The relevant function for today is Getdate().

Also relevant this week is this excellent post from Ken Fisher (b|t) about the built-in agent_datetime() function in SQL Server. I first read about it because Kevin found it and tweeted about it. We ended up using it in the script edits you’ll see below.

Generic SQL Server Output Sucks

As an “artsy” type person, some of the ways SQL Server displays information pain me. I get that we’re working with data and it doesn’t always have to be beautiful, but can we at least get something a bit more reader-friendly? The answer is usually, “Yes, but you’ll have to work for it.”

Now that I’ve been doing daily server reviews in Production for a while, I’ve gotten pretty familiar with three scripts that I run against servers every day. Those three scripts are Job History, Read Errorlogs, and Last Backups (generic titles). Let’s look at Job History first since it has some of the most interesting edits applied. Here’s the original script:

Select j.name, jh.step_name, run_status, run_date, run_time, run_duration, [server], [message]
From 
	[msdb].[dbo].[sysjobhistory] jh
	join [msdb].[dbo].sysjobs j 
		on jh.job_id = j.job_id
Where 1=1
and run_status not in (1,2,4)
and run_date > 20190701 
and [step_name] <> '(Job outcome)'
Order by run_date  desc, run_time desc

--Select MIN(run_date) from msdb..sysjobhistory

Which renders the following output:

Meh, the run_date and run_time output is underwhelming and hard to read quickly. Imagine checking 100 servers or more and needing to quickly read the time/date stamps.

As I was working on my own edits, Kevin sent me a snippet of code changes—based on Ken Fisher’s blog post mentioned above—in order to get a much nicer output:

Select 
	j.name as [Job Name], 
	jh.step_name as [Job Step Name], 
	run_status as [Run Status], 
	msdb.dbo.agent_datetime(run_date,run_time) as [Job Run Time], 
	[message] as [Message]
From [msdb].[dbo].[sysjobhistory] jh
	join [msdb].[dbo].sysjobs j 
		on jh.job_id = j.job_id
Where 1=1 
	and run_status not in (1,2,4)
	and run_date >= 20190701
	and jh.step_name <> '(Job outcome)'
Order by 
	j.name,
	msdb.dbo.agent_datetime(run_date,run_time) Desc

Which gives us the following output:

You’ll also notice that I added some more of my own square bracket magic to make the column headers more readable. It’s the little details that sometimes make a big difference.

I did the same sort of thing to our Read Errorlog script, but added my own line to change the datetime stamp here as well. The original script had this line of code:

And Logdate > getdate() -3

Which rendered this result:

With Kevin’s guidance, I changed the line using the Convert() function and received the subsequent output:

convert(nvarchar(30), LogDate, 20) as [Error Date & Time], --convert datetime to readable format
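For context, here’s roughly where that converted column lives in the SELECT; this is a sketch with illustrative column names, not our exact script:

--A sketch of the edited Read Errorlog SELECT
SELECT
	convert(nvarchar(30), LogDate, 20) as [Error Date & Time],	--readable datetime
	LogText as [Message]
FROM #Errorlog
WHERE LogDate > getdate() - 3	--last 3 days of entries
ORDER BY LogDate DESC;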

Finally, we check for the latest backups each morning by using a script that…you guessed it…checks for the latest backups. The procedure goes something like this:

1. Run Last Backups script and get results (see screenshot).

2. Copy results with headers and paste into an Excel spreadsheet.
3. Sort and filter results in spreadsheet to make reading of latest backup timestamps easier on the eyes and fall in sequential order.
4. Report findings to client.

That’s all well and good, but since I studied the GROUP BY and ORDER BY clauses in T-SQL recently I thought, “Why are we going through the extra copy/paste spreadsheet sort/filter steps? Why not simply rewrite the script to sort the results for us?”

So that’s what I did. I simply edited the last line of the script (as well as doing some more square bracket magic):

Order by RecoveryMode, [Status], [LastFullDate], [LastLogDate], db.[Database]

And here’s what the output looks like now:

Nice! Sorted and filtered, reader-friendly last backups results.
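For fellow beginners, here’s a rough sketch of the overall shape of a “last backups” query that ends with that Order by line. It is not our exact in-house script, and the RecoveryMode and Status logic is simplified for illustration:

--A simplified sketch of a last-backups check
SELECT
	d.name AS [Database],
	d.recovery_model_desc AS [RecoveryMode],
	MAX(CASE WHEN b.type = 'D' THEN b.backup_finish_date END) AS [LastFullDate],
	MAX(CASE WHEN b.type = 'L' THEN b.backup_finish_date END) AS [LastLogDate],
	CASE WHEN MAX(CASE WHEN b.type = 'D' THEN b.backup_finish_date END) >= DATEADD(DAY, -7, GETDATE())
		THEN 'OK' ELSE 'Check me' END AS [Status]
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
	ON b.database_name = d.name
WHERE d.name <> 'tempdb'	--tempdb is never backed up
GROUP BY d.name, d.recovery_model_desc
ORDER BY [RecoveryMode], [Status], [LastFullDate], [LastLogDate], [Database];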

Now, in the end, does any of this improve our client’s server efficiency? No, but what it does do is let ME be more efficient for our clients. There’s no reason to spend extra time unnecessarily.

Have I written super-complex code? Nope. Have I contributed something to the SQL community that a Senior DBA couldn’t have written in 2 minutes? Nope, but that’s not the point. The point is I learned from the experience of editing existing scripts and now have some slightly sharper tools in my toolbox. 🙂

That’s all for this week. Join me next time for the next episode in The Ironic DBA Files.

Follow me on Twitter at @SQLandMTB, and if you’re into mountain bikes come over and check out my site NTX Trails.

The Ironic DBA Files

      • Prequel: The Ironic DBA—Starting a New and Unexpected Career
      • Episode 1: You Back That Up?
      • Episode 2: Attack of the Corruption
      • Episode 3: Revenge of the Index
      • Episode 4: A New Primary Key
      • Episode 5: The Maintenance Plan Strikes Back
      • Episode 6: Return of the TSQL
      • Episode 7: The Backup Awakens
      • Episode 8: The Last Rebuild
      • Episode 9: Rise of the Clients
      • Review One: A SQL Story
      • It’s Hip to Be Square


Filed Under: Apprentice, Beginner, Career, EntryLevel

The Ironic DBA—It’s Hip to Be Square

August 20, 2019 by SQLandMTB

Welcome back to The Ironic DBA Files, a series where the newbie DBA on staff at Dallas DBAs chronicles the ups and downs of fast-tracking a new career as a DBA.

Last week I reviewed how far I’d come and some of what I’d learned so far on this journey. You [hopefully] noticed the titles of the previous episodes were all Star Wars-esque, and the episodes were numbered. All of those posts were a generalized recap of what had happened two or three weeks previously. Going forward, my contributions will be more timely—as in what I’ve learned or experienced within the last week or so—and hopefully continue to add value to my fellow apprentices in the #sqlfamily.

 

Brackets Not Feeling the Love?

There seems to be some generalized hating on the use of square brackets in TSQL code among both new and seasoned DBAs. What are square brackets? I’m sure if you’ve spent almost any time in the SSMS query editor you’ve seen something like this:

[StackOverflow2010].[dbo].[Users]

This commonly happens when you use SSMS to create your scripts for you, or you drag and drop a database or table into the query editor window. The example above is a demonstration of a three-part naming convention for the Users table in the StackOverflow2010 database. What this shows is [NameOfDatabase].[Schema].[TableName]. The periods, or dots, separate the three parts of the name.

I get why some folks don’t like the brackets as they tend to muddle your script’s readability. To see a great example of this in action, check out Michael J. Swart’s post Remove SQL Junk (Brackets and Other Clutter). In most cases, you don’t need square brackets in your scripts, but there are instances where you will need them. That’s mostly beyond the scope of this post.

Cool side note: One of the tidbits I learned during this investigation into square brackets is that a database can have tables with identical names as long as each table’s schema is different. That’s something for you apprentice and accidental DBAs to be aware of for the future.
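A quick illustration, using made-up schema and table names in a scratch database:

--Two tables named Users can coexist because their schemas differ
CREATE SCHEMA App;
GO
CREATE SCHEMA Reporting;
GO
CREATE TABLE App.Users (UserID int);
CREATE TABLE Reporting.Users (UserID int);
GO
SELECT * FROM App.Users;		--two completely different tables,
SELECT * FROM Reporting.Users;	--same name, different schema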

An Example of Nifty Bracket Usage

I mentioned last week how I have been working through the Stairway to T-SQL course at SQL Server Central. While playing with the GROUP BY clause scripts, I became annoyed at the inelegance of the table header output. Here’s an example:

USE tempdb;
GO
SELECT StoreName 
     ,SUM(TotalSalesAmount) AS StoreSalesAmount
FROM dbo.SalesTransaction  
GROUP BY StoreName;

The script above renders this result:

I decided to test the waters and “fix” this by using square brackets for the column header names:

USE tempdb;
GO
SELECT StoreName AS [Store Name]
     ,SUM(TotalSalesAmount) AS [Store Sales Amount]
FROM dbo.SalesTransaction  
GROUP BY StoreName;

Which rendered this:

Ahh, satisfying.

It’s a super simple addition, and one that does not detract all that much from the script’s readability. What it does do, in my opinion, is make the output more reader-friendly. Essentially, consider using square brackets when you want spaces or special characters in your column headers. If you want to go crazy, you can take using square brackets to rename columns a whole lot deeper.

As a side note, you may have to use square brackets if you have a table or column name that uses a reserved T-SQL keyword. For instance, if someone were wicked enough to name a table or column “Index”, then you would have to use square brackets around the name in order to create scripts that function properly.
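For example (purely illustrative, please don’t actually do this):

--"Index" is a reserved keyword, so the square brackets are required for this to parse
CREATE TABLE dbo.[Index] ([Index] int);

SELECT [Index]
FROM dbo.[Index];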

But no one would ever be that evil, would they?

That’s all for this week. Join me next time for the next episode in The Ironic DBA Files.

Follow me on Twitter at @SQLandMTB, and if you’re into mountain bikes come over and check out my site NTX Trails.

The Ironic DBA Files

      • Prequel: The Ironic DBA—Starting a New and Unexpected Career
      • Episode 1: You Back That Up?
      • Episode 2: Attack of the Corruption
      • Episode 3: Revenge of the Index
      • Episode 4: A New Primary Key
      • Episode 5: The Maintenance Plan Strikes Back
      • Episode 6: Return of the TSQL
      • Episode 7: The Backup Awakens
      • Episode 8: The Last Rebuild
      • Episode 9: Rise of the Clients
      • Review One: A SQL Story


Filed Under: Accidental DBA, Beginner, EntryLevel
