
DallasDBAs.com

Explaining SQL Server in plain English


T-SQL Tuesday: Your Fantasy SQL Feature

September 10, 2019 by SQLDork

T-SQL Tuesday is a monthly blog party for the SQL Server community. It is the brainchild of Adam Machanic (b|t) and this month’s edition is hosted by Kevin Chant (b|t) who has asked us to describe “Your Fantasy SQL Feature“. (This is also the first time I’ve made a blog post for T-SQL Tuesday.)

The Problem: As a junior DBA, part of my job involves checking job failures for 2-3 clients each day, for which we have an in-house script. Every once in a while, a job actually fails! Like any DBA who knows you can view job history details in the Job Activity Monitor under the SQL Agent, I do exactly that, only to find it's a job that ran an SSIS package and gave me this very informative error message:

That’s super helpful, thanks!

So now I have to:

  • Check the step properties
  • Look at the path for the SSIS package
  • Drill down into the Integration Services Catalog
  • clickity, clickity
  • Wait, which folder was it in?
  • Go check the path again
  • clickity clickity
  • There are 12 different packages with very similar names:

  • Resize the window or scroll over to see which one I care about
  • Finally, open the All Executions Report to get a very helpful error message:

Gee darnit, I'm out of coffee.

The Solution: Just put a button in the failure message/job history that does all that for you.  Or a hyperlink. Whatever.
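
In the meantime, you can skip most of the clicking by pulling the failed steps and their package paths straight out of msdb. A rough sketch, assuming the packages run as SQL Agent SSIS job steps (the date filter and column picks are just illustrative):

```sql
-- Sketch: yesterday's failed SSIS job steps, with the step command text
-- that contains the \SSISDB\Folder\Project\Package.dtsx path.
USE msdb;
GO

SELECT
	 j.[name]		AS job_name
	,js.step_name
	,jh.run_date
	,jh.run_time
	,js.command		-- holds the /ISSERVER package path
FROM dbo.sysjobs j
	JOIN dbo.sysjobsteps js ON js.job_id = j.job_id
	JOIN dbo.sysjobhistory jh ON jh.job_id = j.job_id
		AND jh.step_id = js.step_id
WHERE jh.run_status = 0		-- 0 = Failed
	AND js.subsystem = 'SSIS'
	AND jh.run_date >= CONVERT(int, CONVERT(char(8), GETDATE() - 1, 112));
```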

Kevin Hill (b|t) has made a script that reads the SSISDB tables, but it often takes a while to run with only the existing Microsoft indexes, especially if you are keeping crazy amounts of history…like the MS default of 365 days, on a busy system.
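
As a quick sanity check on how much history your SSISDB is hoarding, the public catalog.catalog_properties view exposes the retention settings:

```sql
-- How many days of history is SSISDB keeping? The default is 365.
USE SSISDB;
GO

SELECT property_name, property_value
FROM catalog.catalog_properties
WHERE property_name IN ('RETENTION_WINDOW', 'OPERATION_CLEANUP_ENABLED');
```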

/*Please work*/,

SQLDork

Filed Under: SSIS, TSQL2sday

Integration Services Catalog package errors

December 31, 2018 by Kevin3NF

Short post today…

My client makes extensive use of SSIS and deploys the packages to the Integration Services Catalog (ISC), and runs them via hundreds of jobs.

When one of the jobs fails, I have to go get the details.

Job History doesn’t have it.

So my process was:

  • Get the package path out of the job step
  • Expand the ISC
  • Expand SSISDB
  • Expand the Folder (get this from the job step)
  • Expand the Projects Node
  • Expand the correct Project (get this from the job step)
  • Expand Packages
  • Right-click the relevant package (get this from the job step)
  • Go to Reports, Standard Reports, All Executions
  • Filter for Failed Executions
  • Cross your fingers that you can get useful info from the result.

Now, repeat for each job step that failed.

The SSISDB database has all of this in its tables (internal schema) and/or the catalog views.

It took a while, but I created this script to run through those tables and views and get ALL of the errors for the last day:

/*

Created by Kevin Hill, Dallas DBAs LLC, 12/28/2018
Inspired by work from Jules Behrens
Index tested by Peter Schott

This queries multiple SSISDB tables to return a clear path from Top to bottom 
related to errors in an Integration Services Catalog based SSIS package.
It has not yet been tied back to job execution, nor is it set to email info out.

Use this as a backup to your normal job failure checks to tie it all together
instead of spending a full cup of coffee clicking and drilling into the cumbersome 
All Executions report.

Free to use and modify, please leave this header as a courtesy.

*/

Use SSISDB
go

Select Distinct
	fold.[name] as Folder_name
	,proj.[name] as Project_Name
	,pack.[name] as Package_Name
	,ops.[message_time]
	,mess.[message_source_name]
	,ops.[message]
	--,mess.[execution_path]		-- this is a pretty long path if you are pasting into an email or Excel
From [internal].[projects] proj
	Join [internal].[packages] pack on proj.project_id = pack.project_id
	Join [internal].[folders] fold on fold.folder_id = proj.folder_id
	Join [internal].[executions] execs on 
		execs.folder_name = fold.[name] and
		execs.project_name = proj.[name] and 
		execs.package_name = pack.[name]
	Join [internal].[operation_messages] ops on execs.execution_id = ops.operation_id
	join [internal].[event_messages] mess on ops.[operation_id] = mess.[operation_id]
		and mess.event_message_id = ops.operation_message_id
		and	mess.package_name = pack.[name]
Where 1=1
	and ops.message_type = 120		  -- errors only
	--and mess.message_type in (120,130)  -- errors and warnings
	and ops.message_time > getdate() - 1 -- adjust as necessary

/*
If you have a very large SSISDB due to activity or long retention, please consider this index:

CREATE NONCLUSTERED INDEX [NC_OpsID_MessageType_MessgeTime] ON [internal].[operation_messages]
(
	[operation_id] ASC,
	[message_time] ASC,
	[message_type] ASC
)
INCLUDE ([message])

*/

If this is useful for you, please rate it on the TechNet Gallery where I posted it.

If you have a very active server, this can take a while, as the indexes on the underlying tables are designed for the cascading-delete cleanup process.  I'm working on a set of good indexes that will likely be unsupported by Microsoft, but useful.
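
If you prefer to stay on the documented surface area, a stripped-down variant of the same idea against the public catalog views (fewer joins, since catalog.executions already carries the folder, project, and package names) might look like this:

```sql
-- Sketch: same idea using the supported catalog views instead of internal tables.
USE SSISDB;
GO

SELECT
	 e.folder_name
	,e.project_name
	,e.package_name
	,em.message_time
	,em.message_source_name
	,em.[message]
FROM catalog.executions e
	JOIN catalog.event_messages em ON em.operation_id = e.execution_id
WHERE em.message_type = 120		-- errors only
	AND em.message_time > GETDATE() - 1;
```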

Adjust as you see fit.  Please leave the header if you use it.

Thanks for reading!

Kevin3NF


Filed Under: SQL, SSIS, TSQL

SSIS Memory errors

August 30, 2017 by Kevin3NF

New one today… memory errors from SSIS package execution.

Update: Check out THIS article from my friend Tim on SSIS memory

Background: I'm a database engine dude. I'm well versed in how the engine uses memory, buffer pool, mem-to-leave, etc. I teach these things to others around me all the time. I rarely use SSIS other than to install it and verify jobs run successfully. Until yesterday I had never created a functional package outside of the Import/Export Wizard.

My team supports an in-house, proprietary application written for us by a Very Big Vendor (VBV).  We have this application and supporting apps installed in Dev, Test, Stage, and Prod.

There are multiple SSIS packages (37, specifically), each with dozens of tasks inside them. Many data flow tasks, lots of transformations, etc. They are called by .NET applications that run as services on dedicated SSIS servers.

SO….

A very bright (and I mean that sincerely) analyst on my team brings me a piece of paper printed out with errors all over it from SSIS. I got the .txt copy, and it looks like this:

Version 12.0.5000.0 for 32-bit
Copyright (C) Microsoft Corporation. All rights reserved.
Started: 7:48:25 PM
Error: 2017-08-25 19:48:35.17
Code: 0x8007000E
Source: dft_DataFlow1 SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
Error: 2017-08-25 19:48:35.19
Code: 0x8007000E
Source: dft_DataFlow1 SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
Error: 2017-08-25 19:48:35.19
Code: 0xC0047012
Source: dft_DataFlow1
Description: A buffer failed while allocating 10485760 bytes.
End Error
Error: 2017-08-25 19:48:35.20
Code: 0xC0047011
Source: dft_DataFlow1
Description: The system reports 2 percent memory load. There are 1,099,441,074,176 bytes of physical memory with 1074776596480 bytes free. There are 4294836224 bytes of virtual memory with 65310720 bytes free. The paging file has 1153128165376 bytes with 1124943544320 bytes free.
End Error
Error: 2017-08-25 19:48:35.20
Code: 0xC02020C4
Source: dft_DataFlow1 olesrc_Package1Stage_Package1 [259]
Description: The attempt to add a row to the Data Flow task buffer failed with error code 0x8007000E.
End Error
Error: 2017-08-25 19:48:35.22
Code: 0xC0047038
Source: dft_DataFlow1 SSIS.Pipeline
Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on olesrc_Package1Stage_Package1 returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
End Error
Error: 2017-08-25 19:48:37.58
Code: 0x8007000E
Source: dft_HOS SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 7:48:25 PM
Finished: 7:48:44 PM
Elapsed: 19.047 seconds

This came from our UAT/Test environment AND a client specific setup that had all components in one server.

Notice these sections above (highlighted in red in the original post):

  • Not enough storage – was this physical memory, virtual memory, or disk?
  • Buffer failed to allocate (10MB) – why?
  • 2% memory load – yes, this box has 1 TB of RAM…plenty for what we are doing, you would think.
  • Attempt to add a row – well, yeah…if SSIS uses buffers and cannot allocate one, there are either none, or they are full. Makes logical sense to me.

So now what?  Off to Google of course…because nobody here is an SSIS admin.

My first generic query was to get a general understanding of how SSIS uses memory, which led me here:
SSIS Out of Memory/Buffer Manager errors

Nice high-level overview, with some key elements, DefaultBufferSize and DefaultBufferMaxRows, as well as a link to a very detailed (and old) SSIS performance tuning article. See the sections on Buffer Usage and Measuring Performance (scroll to SSIS Performance Counters).

This gave me a very broad understanding that while SSIS uses buffers in a similar way to the DB Engine, they are not part of a configurable dedicated memory space.

The short version is this: SSIS looks at the estimated row size of the incoming data, then does a bunch of math to figure out how many rows it can fit into a buffer.   Too many or too few and it does a little magic behind the scenes to get the best performance it can.  Remember – these are estimates.  If you have wildly different datatypes and row sizes, it may guess wrong, or less than optimally.
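
A back-of-the-napkin version of that math, with a made-up 1,200-byte estimated row size (the 10MB buffer size and 10,000-row cap are the documented defaults):

```sql
-- Rough sketch of how SSIS sizes a buffer: rows per buffer is capped by
-- both DefaultBufferMaxRows and how many estimated rows fit in DefaultBufferSize.
DECLARE @DefaultBufferSize    int = 10485760	-- 10MB default
       ,@DefaultBufferMaxRows int = 10000	-- default row cap
       ,@EstimatedRowSize     int = 1200;	-- hypothetical row size in bytes

SELECT
	 @DefaultBufferSize / @EstimatedRowSize		AS rows_that_fit
	,CASE WHEN @DefaultBufferSize / @EstimatedRowSize < @DefaultBufferMaxRows
	      THEN @DefaultBufferSize / @EstimatedRowSize
	      ELSE @DefaultBufferMaxRows
	 END						AS rows_per_buffer;
```

With a wider row (say 50KB), the size cap wins and you get far fewer rows per buffer, which is why wildly different row sizes make the estimates go wrong.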

This made me wonder:

  • how many buffers were needed
  • how to measure this
  • is something else fighting for the memory space that SSIS wanted for new buffers
  • Is there possible memory fragmentation (is that a thing!?!?)

So I did something I’ve never done in my career: I opened Visual Studio and created a new SSIS solution, containing one package that imported a .txt file into SQL Server.  Straight import, no transformations.  I just needed something I could run and have Perfmon tracking.

The fun part about making a large enough .txt file is taking the contents and copying and pasting them over and over.  I wound up with a file of 19 million records….Notepad actually saved all 3GB of it, but wouldn’t open it again.  Neither would Notepad++.  But SSIS was fine with it.

Here’s the very advanced package I developed:

Note that the Data Flow Task is highlighted, and the DefaultBufferMaxRows and DefaultBufferSize parameters are at their default values.

Once I ran, debugged and got this to complete, I set up a user defined Data Collector Set in Perfmon and started it:

I messed around with a lot of counters until I settled on these.   All I really wanted to see was how many 10MB buffers I was using in my simple test, and verify I could accurately measure this in the problem environment.

I got this result:

Note the black line, the Buffers in Use counter…in my case it started at 8. This capture is from when I had the package reading the same 19 million row .txt file into the same table simultaneously; it used 8 each time.

I was unable to reproduce the errors on my machine, no matter how funky I got with the package, since I am running 16GB of RAM, and you cannot configure a cap at the SSIS or package level (such as the SQL Server MaxMemory setting).

But I was confident that I could go to the UAT team and set this monitoring up to at least tell them what was happening, so we could then work to track down why.

At this point I was 8 hours in and still Googling things, and I found this thread on DBA Stack Exchange.

Crud.  Really?

Did you miss it too?  Go look at the very first line in the error text I posted.

Version 12.0.5000.0 for 32-bit.

Dagnabbit. 32-bit applications are limited to 2GB of memory by default. With some switchcraft (aka the /3GB switch) you can bump it.

When setting up the environment the person doing it pointed the SSIS packages to the C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn directory instead of C:\Program Files\Microsoft SQL Server\120\DTS\Binn.

We changed this config parameter in the application setup and everything worked perfectly.

SSIS filled up the 2GB with buffers (big file, lots of columns) and fell apart.  Simple as that.

Hope this helps someone…please let me know if it helped you by commenting!

Update: If you read this far, also check out THIS article from my friend Tim on SSIS memory.

Thanks for reading!

Kevin3NF

My Pluralsight course for new SQL Server DBAs



Filed Under: SSIS, TIL
