SSIS Memory errors

New one today… memory errors from SSIS package execution.

Update: Check out THIS article from my friend Tim on SSIS memory

Background: I’m a database engine dude. I’m well versed in how the engine uses memory: buffer pool, mem-to-leave, etc. I teach these things to others around me all the time. I rarely use SSIS, other than installing it and verifying jobs run successfully. Until yesterday I had never created a functional package outside of the Import/Export Wizard.

My team supports an in-house, proprietary application written for us by a Very Big Vendor (VBV).  We have this application and supporting apps installed in Dev, Test, Stage, and Prod.

There are multiple SSIS packages (37, specifically), each with dozens of tasks inside: many data flows, lots of transformations, etc. They are called by .NET applications that run as services on dedicated SSIS servers.


A very bright (and I mean that sincerely) analyst on my team brings me a printout covered in SSIS errors. I got the .txt copy, and it looks like this:

Version 12.0.5000.0 for 32-bit
Copyright (C) Microsoft Corporation. All rights reserved.
Started: 7:48:25 PM
Error: 2017-08-25 19:48:35.17
Code: 0x8007000E
Source: dft_DataFlow1 SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
Error: 2017-08-25 19:48:35.19
Code: 0x8007000E
Source: dft_DataFlow1 SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
Error: 2017-08-25 19:48:35.19
Code: 0xC0047012
Source: dft_DataFlow1
Description: A buffer failed while allocating 10485760 bytes.
End Error
Error: 2017-08-25 19:48:35.20
Code: 0xC0047011
Source: dft_DataFlow1
Description: The system reports 2 percent memory load. There are 1,099,441,074,176 bytes of physical memory with 1074776596480 bytes free. There are 4294836224 bytes of virtual memory with 65310720 bytes free. The paging file has 1153128165376 bytes with 1124943544320 bytes free.
End Error
Error: 2017-08-25 19:48:35.20
Code: 0xC02020C4
Source: dft_DataFlow1 olesrc_Package1Stage_Package1 [259] Description: The attempt to add a row to the Data Flow task buffer failed with error code 0x8007000E.
End Error
Error: 2017-08-25 19:48:35.22
Code: 0xC0047038
Source: dft_DataFlow1 SSIS.Pipeline
Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on olesrc_Package1Stage_Package1 returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
End Error
Error: 2017-08-25 19:48:37.58
Code: 0x8007000E
Source: dft_HOS SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 7:48:25 PM
Finished: 7:48:44 PM
Elapsed: 19.047 seconds

This came from our UAT/Test environment AND a client-specific setup that had all components on one server.

Notice the key parts of the log above:

  • Not enough storage – was this physical memory, virtual memory, or disk?
  • Buffer failed to allocate (10 MB) – why?
  • 2 percent memory load – yes, this box has 1 TB of RAM… plenty for what we are doing, you would think.
  • Attempt to add a row – well, yeah… if SSIS uses buffers and cannot allocate one, they are either all gone or all full. Makes logical sense to me.

So now what?  Off to Google of course…because nobody here is an SSIS admin.

My first generic query was to get a general understanding of how SSIS uses memory, which led me here:
SSIS Out of Memory/Buffer Manager errors

Nice high-level overview, with some key elements: DefaultBufferSize and DefaultBufferMaxRows, as well as a link to a very detailed (and old) SSIS performance tuning article. See the sections on Buffer Usage and Measuring Performance (scroll to SSIS Performance Counters).

This gave me a very broad understanding that while SSIS uses buffers in a similar way to the DB Engine, they are not part of a configurable dedicated memory space.

The short version is this: SSIS looks at the estimated row size of the incoming data, then does a bunch of math to figure out how many rows it can fit into a buffer.   Too many or too few and it does a little magic behind the scenes to get the best performance it can.  Remember – these are estimates.  If you have wildly different datatypes and row sizes, it may guess wrong, or less than optimally.
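As a back-of-envelope illustration, here is my own sketch of that sizing estimate (simplified, not the engine’s actual algorithm): each buffer is capped at DefaultBufferSize bytes AND DefaultBufferMaxRows rows, and whichever limit the estimated row width hits first wins.

```python
# Simplified sketch of the SSIS buffer-sizing estimate (not the engine's exact logic).
# Defaults for SSIS 2014: DefaultBufferSize = 10 MB, DefaultBufferMaxRows = 10,000.

def rows_per_buffer(estimated_row_bytes,
                    default_buffer_size=10 * 1024 * 1024,
                    default_buffer_max_rows=10_000):
    """Rows SSIS would try to fit in one buffer for a given estimated row width."""
    rows_by_size = default_buffer_size // estimated_row_bytes
    return min(rows_by_size, default_buffer_max_rows)

# Narrow 500-byte rows: the 10,000-row cap wins (only ~4.8 MB of the buffer used).
print(rows_per_buffer(500))    # 10000
# Wide 4,000-byte rows: the 10 MB size cap wins.
print(rows_per_buffer(4000))   # 2621
```

This is why wildly different row widths matter: the estimate drives how full (or empty) each 10 MB buffer actually ends up.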

This made me wonder:

  • How many buffers were needed?
  • How could I measure this?
  • Was something else fighting for the memory space SSIS wanted for new buffers?
  • Was there memory fragmentation (is that a thing!?!?)?

So I did something I’ve never done in my career: I opened Visual Studio and created a new SSIS solution, containing one package that imported a .txt file into SQL Server.  Straight import, no transformations.  I just needed something I could run and have Perfmon tracking.

The fun part about making a large enough .txt file is copying and pasting the contents over and over. I wound up with a file of 19 million records… Notepad actually saved all 3 GB of it, but wouldn’t open it again. Neither would Notepad++. But SSIS was fine with it.
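If you’d rather skip the copy/paste marathon, a tiny script can build the same kind of file (a sketch; the file name and row contents here are made up):

```python
# Hypothetical sketch: build a multi-million-row test file by repeating a seed
# row, writing in chunks so memory use stays flat regardless of file size.

def write_big_file(path, seed_row, target_rows, chunk_rows=100_000):
    with open(path, "w") as f:
        remaining = target_rows
        while remaining > 0:
            n = min(chunk_rows, remaining)
            f.write(seed_row * n)   # n copies of the seed row
            remaining -= n

# e.g. write_big_file("big_test.txt", "1,Some test data,2017-08-25\n", 19_000_000)
```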

Here’s the very advanced package I developed:

Note that the Data Flow Task is highlighted, and the DefaultBufferMaxRows and DefaultBufferSize parameters are at their default values.

Once I ran, debugged, and got this to complete, I set up a user-defined Data Collector Set in Perfmon and started it:

I messed around with a lot of counters until I settled on these.   All I really wanted to see was how many 10MB buffers I was using in my simple test, and verify I could accurately measure this in the problem environment.

I got this result:

Note the black line, the Buffers in Use counter… in my case it started at 8. This capture is from when I had the package reading the same 19-million-row .txt file into the same table simultaneously. It used 8 buffers each time.
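If you prefer the command line to the Perfmon GUI, Windows’ typeperf can sample the same pipeline counters. Here’s a sketch that builds the command from Python; the object name “SQLServer:SSIS Pipeline 12.0” and the counter names below are my assumptions for SSIS 2014, so confirm them in Perfmon on your box first.

```python
# Sketch: sample SSIS pipeline buffer counters with Windows' typeperf.
# The counter paths below are assumptions for the SSIS 2014 performance
# object -- verify the exact names in Perfmon before relying on them.
import subprocess
import sys

def build_typeperf_cmd(counters, interval_sec=1, samples=60):
    """Build a typeperf command line: -si = sample interval, -sc = sample count."""
    return ["typeperf", *counters, "-si", str(interval_sec), "-sc", str(samples)]

COUNTERS = [
    r"\SQLServer:SSIS Pipeline 12.0\Buffers in use",
    r"\SQLServer:SSIS Pipeline 12.0\Buffer memory",
    r"\SQLServer:SSIS Pipeline 12.0\Buffers spooled",
]

if __name__ == "__main__" and sys.platform == "win32":
    subprocess.run(build_typeperf_cmd(COUNTERS))
```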

I was unable to reproduce the errors on my machine, no matter how funky I got with the package, since I am running 16 GB of RAM and you cannot configure a memory cap at the SSIS or package level (the way you can with the SQL Server Max Server Memory setting).

But I was confident that I could go to the UAT team and set this monitoring up to at least tell them what was happening, so we could then work to track down why.

At this point I was 8 hours in and still Googling things, and I found this thread on DBA Stack Exchange.

Crud.  Really?

Did you miss it too?  Go look at the very first line in the error text I posted.

Version 12.0.5000.0 for 32-bit.

Dagnabbit.  32-bit applications are limited to 2 GB of addressable memory by default.  With some switchcraft (aka the /3GB switch) you can bump it.

When setting up the environment, the person doing it pointed the SSIS packages to the 32-bit C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn directory instead of the 64-bit C:\Program Files\Microsoft SQL Server\120\DTS\Binn.
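The install path itself tells you which DTExec you’ve got: on 64-bit Windows, 32-bit binaries live under “Program Files (x86)”. A trivial sketch of that sanity check (the paths are the standard SQL Server 2014 locations from above):

```python
# Trivial sketch: classify a DTExec path as 32- or 64-bit by its install folder.
# On 64-bit Windows, 32-bit binaries live under "Program Files (x86)".

def dtexec_bitness(path):
    return "32-bit" if "program files (x86)" in path.lower() else "64-bit"

print(dtexec_bitness(r"C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn"))  # 32-bit
print(dtexec_bitness(r"C:\Program Files\Microsoft SQL Server\120\DTS\Binn"))        # 64-bit
```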

We changed this config parameter in the application setup and everything worked perfectly.

SSIS filled up the 2GB with buffers (big file, lots of columns) and fell apart.  Simple as that.
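The arithmetic is brutal. Even ignoring everything else that shares the address space (the exe, DLLs, thread stacks), a 32-bit process can’t hold many 10 MB buffers:

```python
# Back-of-envelope: how many 10 MB SSIS buffers fit in a 32-bit process?
buffer_bytes = 10 * 1024 * 1024           # the 10,485,760-byte allocation from the error log
address_space = 2 * 1024 * 1024 * 1024    # 2 GB default user address space for 32-bit
print(address_space // buffer_bytes)      # 204 buffers at the absolute best --
                                          # in practice the ceiling is much lower
```

Meanwhile the box had a terabyte of RAM sitting idle, which is exactly what the “2 percent memory load” line in the error was telling us.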

Hope this helps someone…please let me know if it helped you by commenting!

Update: If you read this far, also check out THIS article from my friend Tim on SSIS memory.

Thanks for reading!


My Pluralsight course for new SQL Server DBAs
