A quick video for the Accidental DBAs in the crowd:
Thanks for watching!
Happy New Year!
They say focus on the positive and make your goals public, so that’s exactly what I’m going to do here!
Dallas DBAs had a great 2018 in every measurable category. No formal goals were laid out, other than “don’t close the doors” and “help people”.
The goals for 2019:
The things we put in place this year have worked well, so I may be fudging the number too conservatively for 2019. Check back next year…we may actually hit $14 net profit!
Thanks for reading!
Short post today…
My client makes extensive use of SSIS and deploys the packages to the Integration Services Catalog (ISC), and runs them via hundreds of jobs.
When one of those jobs fails, I have to go get the details.
Job History doesn’t have it.
So my process was:
Now, repeat for each job step that failed.
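That manual drill-down maps roughly onto the public catalog views. A sketch, assuming the standard SSISDB catalog views (the `@execution_id` value is a made-up placeholder you would take from the first result set):

```sql
-- Find recent failed executions in the SSISDB catalog (status 4 = failed)
SELECT e.execution_id, e.folder_name, e.project_name, e.package_name, e.end_time
FROM SSISDB.catalog.executions AS e
WHERE e.status = 4
ORDER BY e.end_time DESC;

-- Then, for one execution_id, pull its error messages (message_type 120 = error)
DECLARE @execution_id bigint = 12345;  -- hypothetical; plug in a value from above
SELECT em.message_time, em.message
FROM SSISDB.catalog.event_messages AS em
WHERE em.operation_id = @execution_id
  AND em.message_type = 120;
```

That works for one failure at a time; the script below rolls it all up in a single pass.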
The SSISDB database has all of this in its tables (the internal schema) and/or the catalog views.
It took a while, but I created this to run around those tables and views to get ALL of the errors for the last day:
/*
Created by Kevin Hill, Dallas DBAs LLC, 12/28/2018
Inspired by work from Jules Behrens
Index tested by Peter Schott

This queries multiple SSISDB tables to return a clear path from top to bottom
related to errors in an Integration Services Catalog-based SSIS package.
It has not yet been tied back to job execution, nor is it set to email info out.

Use this as a backup to your normal job failure checks to tie it all together
instead of spending a full cup of coffee clicking and drilling into the cumbersome
All Executions report.

Free to use and modify; please leave this header as a courtesy.
*/
Select fold.[name] as Folder_Name
,proj.[name] as Project_Name
,pack.[name] as Package_Name
,ops.[message_time] as Error_Time
,ops.[message] as Error_Message -- the error text itself
--,mess.[execution_path] -- this is a pretty long path if you are pasting into an email or Excel
From [internal].[projects] proj
Join [internal].[packages] pack on proj.project_id = pack.project_id
Join [internal].[folders] fold on fold.folder_id = proj.folder_id
Join [internal].[executions] execs on
execs.folder_name = fold.[name] and
execs.project_name = proj.[name] and
execs.package_name = pack.[name]
Join [internal].[operation_messages] ops on execs.execution_id = ops.operation_id
join [internal].[event_messages] mess on ops.[operation_id] = mess.[operation_id]
and mess.event_message_id = ops.operation_message_id
and mess.package_name = pack.[name]
and ops.message_type = 120 -- errors only
--and mess.message_type in (120,130) -- errors and warnings
and ops.message_time > getdate() - 1 -- adjust as necessary
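The header notes the script isn’t set to email info out yet. One possible way to do that is Database Mail’s sp_send_dbmail, sketched here under the assumptions that Database Mail is already configured and that the profile name and recipient are placeholders, not real values:

```sql
-- Hypothetical sketch: email the daily SSISDB error list via Database Mail.
-- 'DBA_Alerts' and the recipient address are made-up placeholders.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DBA_Alerts',
    @recipients   = 'dba-team@example.com',
    @subject      = 'SSISDB errors - last 24 hours',
    @query        = N'SET NOCOUNT ON; /* paste the full error query here */',
    @execute_query_database = N'SSISDB',
    @attach_query_result_as_file = 1;
```

Dropped into an Agent job that runs each morning, this would land the error list in your inbox before the first cup of coffee.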
If you have a very large SSISDB due to activity or long retention, please consider this index:
CREATE NONCLUSTERED INDEX [NC_OpsID_MessageType_MessgeTime] ON [internal].[operation_messages]
([operation_id], [message_type], [message_time]); -- key columns inferred from the index name
If this is useful for you, please rate it on the TechNet Gallery where I posted it.
If you have a very active server, this can take a while, as the indexes on the underlying tables are designed for the cascading-delete cleanup process, not for ad hoc querying. I’m working on a set of good indexes that will likely be unsupported by Microsoft, but useful.
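How big SSISDB gets is governed by the catalog’s retention properties. A quick check of the standard catalog.catalog_properties view shows how much operational history the nightly cleanup job is keeping:

```sql
-- How many days of operational history does this SSISDB keep?
SELECT property_name, property_value
FROM SSISDB.catalog.catalog_properties
WHERE property_name IN (N'RETENTION_WINDOW', N'OPERATION_CLEANUP_ENABLED');
```

If RETENTION_WINDOW is large (the default is 365 days), shrinking it can do as much for query speed here as any new index.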
Adjust as you see fit. Please leave the header if you use it.
Thanks for reading!
While this post is for everyone, it is specifically written for SQL Saturday organizers…
I am filling out my 2019 schedule of target dates/locations of SQL Saturdays I want to attend/present (US only…I don’t even have a passport).
I am offering to do my DBA Fundamentals pre-con and one or two Saturday sessions (Security/DR) with a twist:
You keep my part of the pre-con proceeds for your group. I’ll cover my own travel, meals, etc. if you provide the space. This should allow you to increase your funding for the rest of the event, buy some pizza for your local group, or even discount the pre-con price should you choose to do so.
If you only have 1 or 2 pre-con spots and cannot secure an additional room at your venue, please pick from the other submissions 🙂
* All of these are tentative, some dates not granted yet
So, dear organizer…if you think this offer will be a benefit to your SQL Saturday, please get in touch and let me know what your planned date is so I can be sure to submit. Kevin@DallasDBAs.com
Thanks for reading!
If you are going to PASS Summit in Seattle and want someone to sit with, check out my schedule (primary and backup sessions) – very much subject to change:
Wednesday, November 7, 2018
10:15 AM – 11:30 AM Room: 2AB
dbatools Powershell Library – The Complete Introduction (Yes, my very first session is Powershell, lol)
10:15 AM – 11:30 AM Room: 612
Uncovering Duplicate, Redundant, & Missing Indexes
1:30 PM – 2:45 PM Room: 606
Improving Availability in SQL Server and Azure SQL Database
1:30 PM – 4:00 PM Room: 6E
SQL Server Migrations Done the Right Way
4:45 PM – 6:00 PM Room: 6A
SSMS & T-SQL Tricks: Working Smarter, Not Harder
Thursday, November 8, 2018
10:45 AM – 12:00 PM Room: 6B
Containers, Pods, and Databases- The Future of Infrastructure
1:30 PM – 4:00 PM Room: 606
Query Tuning Internals for the Advanced SQL Developer
1:30 PM – 2:45 PM Room: TCC Tahoma 2
Top 5 Power BI Custom Visuals
1:30 PM – 2:45 PM Room: 6C
Top Tips for Deploying AGs and FCIs On Premises or In the Cloud
3:15 PM – 4:30 PM Room: 618
An Introduction to Partitioning
4:45 PM – 6:00 PM Room: 604
Data Science 101 for the SQL DBA
4:45 PM – 6:00 PM Room: 6C
Validate Your SQL Server Estate with Ease Using dbachecks
Friday, November 9, 2018
8:00 AM – 9:15 AM Room: 6C
T-SQL Habits and Practices That Can Kill Performance
9:30 AM – 12:00 PM Room: 6E
Migrating to SQL Server 2017
11:00 AM – 12:15 PM Room: 3AB
BI Power Hour
2:00 PM – 3:15 PM Room: 3AB
Lightning Talks: DBA
2:00 PM – 3:15 PM Room: 618
What’s New With Adaptive Query Processing?
3:30 PM – 4:45 PM Room: TCC Yakima 1
A DBA’s Guide to the Proper Handling of Corruption
3:30 PM – 4:45 PM Room: TCC Tahoma 4
SQL Server Performance Monitoring Made Easy
PLEASE feel free to come introduce yourself like we are old friends. My brain is defective in remembering names and where we met, so you may have to remind me if in fact we have met before 🙂 Fair warning…I may ask you to take a selfie with me. I suck at selfies so I may actually get a shot of your shoes, or the ceiling 🙂
Thanks for reading!
Oh snap…one of your two DBAs just gave two weeks’ notice.
You have 2 weeks before your other DBA starts feeling the pressure of doing 80 hours of work in a 40-hour week.
Did you see this coming? Do you have a favorite recruiting firm on your vendor list? Do you know a guy that knows a guy?
Assuming that the budget exists to pay for a replacement, you are very likely 4 weeks out from actually getting a replacement in the door:
In reality, recruiters these days are handling 30+ openings and dozens of applicants per opening. That’s a lot on their plate.
You have Stuff To Do™ as well, which limits your time to properly review resumes and get through the interview process. Hopefully your process isn’t 3 phone calls and 4 face-to-face interviews on different days…
Great…so we’ve defined the problem. What is the solution?
Put a DBA in your pocket.
You likely already have a lawyer and accountant in there. Put a few critical technical people in there with them.
Set up a retainer agreement with an independent DBA to fill in when:
This doesn’t even have to be an actual independent contractor…there are many qualified DBAs, SysAdmins, storage people, etc. working full-time jobs who can check on your systems on weekends and evenings.
If your systems are mission critical, and you are running skinny in certain areas…protect yourself by having someone you already know in your pocket, ready to call when the unexpected happens.
For my fellow DBAs…would you like to take a vacation without your laptop? Or go to a training class? Send this link to your manager if you are stretched so tight you can’t breathe!
Dallas DBAs provides Remote DBA services at ‘X’ hours a week. Contact us if this resonated with you.
Thanks for reading!