TIL: Restore-DbaDatabase

April 29, 2020 by SQLDork

(This post comes from one of our Junior DBAs – @SQLDork)

If you haven’t read it already, I recently wrote a blog post going over Backup-DbaDatabase; you can read that post here.

Or here. It’s your choice which link you want to click to go there.

Today’s command: Restore-DbaDatabase

Same setup as last time: backups, the ISE, SQL Server instances, all that good stuff.

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016

This kinda works, but it asks you to specify a bunch of paths when you run it. Best to specify those in the parameters ahead of time, I’d imagine.

 

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup' `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Log'

This seems to work, but it won’t actually do anything because the databases already exist.
The warning message suggests using -WithReplace to overwrite the existing databases, but I think there’s another option…

 

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -DatabaseName 'test_restored' `
-Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\test_202004241133.bak' `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Log'

Moved some parameters around for readability’s sake.
The warning message implies we need to specify names for the .mdf and .ldf files, which makes sense.

 

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -DatabaseName 'test_restored' `
-Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\test_202004241133.bak' `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Log' `
-DestinationFilePrefix 'restored_'

Success! The database name is “test_restored”, and the file names are “restored_test.mdf” and “restored_test.ldf”.
Let’s dork around with the parameters a bit to get the names to be consistent:

 

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 `
-Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\test_202004241133.bak' `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationFilePrefix 'restored_' -RestoredDatabaseNamePrefix 'restored_'

Moved the log files into the DATA folder, because apparently the Log folder is for different stuff. (Ed: yep…that’s for error logs, crash dumps, default trace files, etc.)
No longer specifies the database name via -DatabaseName.
Instead uses -RestoredDatabaseNamePrefix to put ‘restored_’ at the beginning of the name.
This would be useful if we were restoring multiple databases at once.

NOTE: There is a -DestinationFileSuffix, but there is no -RestoredDatabaseNameSuffix parameter. I’ve asked Twitter for clarification on this.

There doesn’t seem to be a parameter for renaming the logical names of the
files in Restore-DbaDatabase either, but there is one in Rename-DbaDatabase.

 

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 `
-Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\test_202004241133.bak', 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\square[]brackets_202004241133.bak' `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationFilePrefix 'restored_' -RestoredDatabaseNamePrefix 'restored_'

This restores multiple databases at once. There’s probably a better way to do this without having to specify the obnoxiously long path both times (see the sketch below).

Yes, I have a database named square[]brackets.
Yes, I have gone crazy.
No, you shouldn’t make a database with square brackets in the name.
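
For what it’s worth, here’s one way to avoid typing that obnoxiously long path twice. A sketch (untested): stash the folder in a variable and build the file paths with Join-Path:

$backupDir = 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup'
$backupFiles = 'test_202004241133.bak', 'square[]brackets_202004241133.bak' |
    ForEach-Object { Join-Path $backupDir $_ }

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Path $backupFiles `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationFilePrefix 'restored_' -RestoredDatabaseNamePrefix 'restored_'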

 

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 `
-Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\test_202004241133.bak' `
-DestinationDataDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationLogDirectory 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA' `
-DestinationFilePrefix 'restored_' -RestoredDatabaseNamePrefix 'restored_' -OutputScriptOnly

Using -OutputScriptOnly dumps the restore’s T-SQL script to the console instead of running it, which you can edit or run to your heart’s content.

 

Summary

Parameters I found to be useful, some of which I didn’t test
because I’d have to insert rows into the database and I’m lazy™:

-SqlInstance: Required; tells PowerShell which instance to restore to.
Can use commas to point to multiple instances, like if you wanted a fresh copy of prod data on your 7 dev boxi.

-Path: Tells PowerShell where the backups to restore are; enclose it in single quotes if there’s a space. (Looking at you, Program Files)
Can point to a folder to restore all the backups in that folder, or to a .bak file to restore just that file.
Use commas to specify multiple paths.

-DestinationDataDirectory: Where to put the physical data files; enclose in single quotes if there’s a space.
-DestinationLogDirectory: Where to put the physical log files; enclose in single quotes if there’s a space.

NOTE: All paths are read by the SQL Server itself, so specify them as the instance given in -SqlInstance sees them.
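
For example, a backup on a network share works too (hypothetical share name), as long as the SQL Server service account can read it:

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Path '\\FILESERVER\SQLBackups\test_202004241133.bak'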

-DestinationFilePrefix: Adds a string to the beginning of the names of the restored files.
-RestoredDatabaseNamePrefix: Adds a string to the beginning of the restored database name.

-DestinationFileSuffix: Adds a string to the end of the names of the restored files.
-RestoredDatabaseNameSuffix: Does not exist. I wish it did, but it doesn’t.

-OutputScriptOnly: Paranoid? This will give you the script for the thing you just did, just in case.

-NoRecovery: Restores with NORECOVERY, for when you have more files to restore.
-Recover: Does a WITH RECOVERY, nothing fancy.

-RestoreTime: Point-in-time recovery; stop restoring just before the bad thing™ happened. (See the sketch after this list.)

-WithReplace: Overwrites the existing database if there is one with the same name.

-MaintenanceSolutionBackup: Tells dbatools that you’re using Ola Hallengren’s (b) maintenance solution.
Makes things faster, since dbatools then knows what to look for rather than checking every file in the folder.
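
Putting a few of those together, here’s the promised point-in-time sketch (untested; the timestamp is a made-up example):

Restore-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 `
-Path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup' `
-DatabaseName 'test' -WithReplace `
-RestoreTime (Get-Date '2020-04-24 11:30:00')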

That’s all for today. If you want to bombard me (b|t) with questions I don’t know how to answer, you can do so on Twitter!

Filed Under: PowerShell, Restore, TIL Tagged With: DBATools.io, syndicated

TIL: Backup-DbaDatabase

April 27, 2020 by SQLDork

(This post comes from one of our Junior DBAs – @SQLDork)

I’m learning how to use dbatools.

Today’s command: Backup-DbaDatabase

I’m using the 64-bit PowerShell ISE (running as administrator), running PowerShell version 5.1 on Windows 10, with a local SQL 2016 install.

Here’s the code, each snippet adding parameters and (mostly) building off the one before it, with a bunch of 100% hilarious comments.*

*Comments may or may not be 100% hilarious, on account of hilarity not being a measurable substance.**
**Probably.

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test

This one works, yaaaay! All it does is take a full backup of one database. We can do better, probably.

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type Diff

This takes a differential backup, but calling your diffs .bak is bad™.

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type Differential -FilePath test2.dff

-FilePath lets us set the extension, but in doing so, we lose timestamps. I wonder if there’s a way to fix that… (Spoiler alert: there totally is)

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type Differential -FilePath test2.dff -CompressBackup

Also figured out how to compress backups, but it doesn’t work if you try to take a compressed diff after an uncompressed full.

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type full -FilePath test3.bak -CompressBackup
Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type diff -FilePath test3.dff -CompressBackup

This one worked, because science!™ Also, backup chains: there’s probably a blog post around here somewhere explaining those; you should go find it and send it to all your friends!

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type full -FilePath test3.bak -CompressBackup -initialize

Using -Initialize makes it overwrite the existing backup file; nothing too exciting.

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016

Bare minimum required to run without errors; takes full backups of every system and user database on the box.
Except for tempdb for some reason, idk why*
* I totally know why

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type diff `
-FilePath test3_timestamp.dff -CompressBackup -ReplaceInName

Using -ReplaceInName swaps the “timestamp” token in the file name for the real thing, so we get timestamps with reasonable file extensions.
Welcome to people-not-yelling-at-you-because-they-can’t-tell-if-the-important-backup-file-is-a-full-or-a-log town!
Population: Us

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test, model

Use commas to take backups of multiple databases at once.
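
You can also pipe databases in from Get-DbaDatabase instead of listing them. A sketch (untested) that backs up every database whose name starts with “test”:

Get-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 |
    Where-Object Name -like 'test*' |
    Backup-DbaDatabase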

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -confirm

Anything that’s a checkbox in the Command Add-On doesn’t need any fancy values; it just works™.
That is, if you’re using the PowerShell ISE. If you’re using Notepad, stop it.
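
Its sibling -WhatIf is handy too; anything that supports -Confirm also supports it, and it shows what would happen without actually doing it:

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -WhatIf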

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test -Type full `
-FilePath test3_timestamp.bak -ReplaceInName -buildpath -createfolder `
-path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup'

-Path tells the files where to go.
-FilePath sets the name/extension of the file.
-BuildPath tells PowerShell to create the path if it doesn’t exist.
-CreateFolder sends each database’s files into nice little subfolders for the humans to look at and go “Hey look, subfolders!”.

And then they go drink coffee and complain about the database being “slow”.

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Type full `
-FilePath dbname_timestamp.bak -ReplaceInName -ExcludeDatabase test -buildpath `
-path 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\Backup\donuts'

-ExcludeDatabase has a very non-intuitive and confusing function: it excludes databases from being backed up.
Useful for the really big ones you only want to back up once a week or something.

 

Backup-DbaDatabase -SqlInstance PRECISION-M7520\SQL2016 -Database test `
-FilePath dbname_timestamp.bak -ReplaceInName -FileCount 2

This makes multiple striped backup files, appending numbers automatically, like so:
[filename]-1-of-2.bak
[filename]-2-of-2.bak
[filename]-donuts-of-2.bak
Actually that last one was a joke, but that’s all I got for today, and probably the next month or so.

 

Follow me (b|t) on Twitter, where I also probably won’t say anything for the next month or so.

 

Filed Under: backup, PowerShell, TIL Tagged With: DBATools.io, syndicated

TIL: Security Stuff™

March 12, 2018 by SQLDork

Testing Grant and Deny permissions in SQL Server.

Here’s a rough breakdown of the steps we followed:

  • Create a Windows group, and create 2 users in the group (we’ll call them test1 and test2).
  • Make a new database called SecurityTest.
  • Add the windows group from earlier as a SQL LOGIN.
  • Add a USER for the login in the SecurityTest database, with no permissions (yet).
  • Create a table and put some data™ in it (We’ll call it SuperSecretTable™).
  • Grant select permission to the USER (Group).
  • Verify that users test1 and test2 can see the data™ (you’ll have to log on to the machine as these users).
  • Test2 takes a screenshot of the SuperSecretTable™ and posts it on Twitter, gets 3 likes.
  • I log in as myself and specifically DENY test2 rights to view the table, because he cannot be trusted with the data in the SuperSecretTable™ (or crayons).
  • Test2 logs in again, and finds himself unable to even see the SuperSecretTable™, much less the data™ in it, even though he’s in a group that has been GRANTed SELECT permissions.

In doing this, we’ve proved that DENY takes precedence over GRANT, because we’re cool like that.
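
For reference, a minimal T-SQL sketch of the interesting part (the machine, group, and table names are stand-ins for whatever you created):

USE SecurityTest;

-- The Windows group's database user already has SELECT:
GRANT SELECT ON dbo.SuperSecretTable TO [MYMACHINE\TestGroup];

-- DENY needs a principal for test2 himself, not just his group:
CREATE LOGIN [MYMACHINE\test2] FROM WINDOWS;
CREATE USER [MYMACHINE\test2] FOR LOGIN [MYMACHINE\test2];

-- The user-level DENY trumps the SELECT inherited from the group:
DENY SELECT ON dbo.SuperSecretTable TO [MYMACHINE\test2];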

/*Please work*/,

The Apprentice

Filed Under: Apprentice, Security, TIL

SSIS Memory errors

August 30, 2017 by Kevin3NF

New one today….memory errors from SSIS package execution.

Update: Check out THIS article from my friend Tim on SSIS memory

Background: I’m a Database engine dude.   I’m well versed in how the engine uses memory, buffer pool, mem-to-leave, etc. I teach these things to others around me all the time.   I rarely use SSIS other than install and verify jobs run successfully.   Until yesterday I had never created a functional package outside of the Import/Export Wizard.

My team supports an in-house, proprietary application written for us by a Very Big Vendor (VBV).  We have this application and supporting apps installed in Dev, Test, Stage, and Prod.

There are multiple SSIS packages (37, specifically), each with dozens of tasks inside: many data flow tasks, lots of transformations, etc. They are called by .Net applications that run as services on dedicated SSIS servers.

SO….

A very bright (and I mean that sincerely) analyst on my team brings me a piece of paper printed out with errors all over it from SSIS.   I got the .txt copy and it looks like this:

Version 12.0.5000.0 for 32-bit
Copyright (C) Microsoft Corporation. All rights reserved.
Started: 7:48:25 PM
Error: 2017-08-25 19:48:35.17
Code: 0x8007000E
Source: dft_DataFlow1 SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
Error: 2017-08-25 19:48:35.19
Code: 0x8007000E
Source: dft_DataFlow1 SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
Error: 2017-08-25 19:48:35.19
Code: 0xC0047012
Source: dft_DataFlow1
Description: A buffer failed while allocating 10485760 bytes.
End Error
Error: 2017-08-25 19:48:35.20
Code: 0xC0047011
Source: dft_DataFlow1
Description: The system reports 2 percent memory load. There are 1,099,441,074,176 bytes of physical memory with 1074776596480 bytes free. There are 4294836224 bytes of virtual memory with 65310720 bytes free. The paging file has 1153128165376 bytes with 1124943544320 bytes free.
End Error
Error: 2017-08-25 19:48:35.20
Code: 0xC02020C4
Source: dft_DataFlow1 olesrc_Package1Stage_Package1 [259] Description: The attempt to add a row to the Data Flow task buffer failed with error code 0x8007000E.
End Error
Error: 2017-08-25 19:48:35.22
Code: 0xC0047038
Source: dft_DataFlow1 SSIS.Pipeline
Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on olesrc_Package1Stage_Package1 returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
End Error
Error: 2017-08-25 19:48:37.58
Code: 0x8007000E
Source: dft_HOS SSIS.Pipeline
Description: Not enough storage is available to complete this operation.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 7:48:25 PM
Finished: 7:48:44 PM
Elapsed: 19.047 seconds

This came from our UAT/Test environment AND a client specific setup that had all components in one server.

Notice these key lines above:

  • Not enough storage – was this physical memory, virtual memory or disk?
  • Buffer failed to allocate (10MB) – Why?
  • 2% memory load – yes, this box has 1 TB of RAM…plenty for what we are doing you would think.
  • Attempt to add a row – well, yeah…if SSIS uses buffers and cannot allocate one, there are either none available, or they are full. Makes logical sense to me.

So now what?  Off to Google of course…because nobody here is an SSIS admin.

My first generic query was to get a general understanding of how SSIS uses memory, which led me here:
SSIS Out of Memory/Buffer Manager errors

Nice high-level overview, with some key elements (DefaultBufferSize and DefaultBufferMaxRows), as well as a link to a very detailed (and old) SSIS performance tuning article. See the sections on Buffer Usage and Measuring Performance (scroll to SSIS Performance Counters).

This gave me a very broad understanding that while SSIS uses buffers in a similar way to the DB Engine, they are not part of a configurable dedicated memory space.

The short version is this: SSIS looks at the estimated row size of the incoming data, then does a bunch of math to figure out how many rows it can fit into a buffer.   Too many or too few and it does a little magic behind the scenes to get the best performance it can.  Remember – these are estimates.  If you have wildly different datatypes and row sizes, it may guess wrong, or less than optimally.
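
Roughly (my simplification, not the engine’s exact logic), the math looks like this:

# Back-of-the-envelope SSIS buffer math; the row size is a made-up example
$DefaultBufferSize    = 10485760   # bytes; the 10MB from the errors above
$DefaultBufferMaxRows = 10000      # the other default
$EstimatedRowSize     = 500        # bytes per row, assumed for illustration

# Rows per buffer: whatever fits in the buffer, capped at DefaultBufferMaxRows
[Math]::Min($DefaultBufferMaxRows, [Math]::Floor($DefaultBufferSize / $EstimatedRowSize))
# returns 10000 here; the cap wins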

This made me wonder:

  • how many buffers were needed
  • how to measure this
  • is something else fighting for the memory space that SSIS wanted for new buffers
  • Is there possible memory fragmentation (is that a thing!?!?)

So I did something I’ve never done in my career: I opened Visual Studio and created a new SSIS solution, containing one package that imported a .txt file into SQL Server.  Straight import, no transformations.  I just needed something I could run and have Perfmon tracking.

The fun part about making a large enough .txt file is taking the contents and copying and pasting them over and over.  I wound up with a file of 19 million records….Notepad actually saved all 3GB of it, but wouldn’t open it again.  Neither would Notepad++.  But SSIS was fine with it.

Here’s the very advanced package I developed:

Note that the Data Flow Task is highlighted, and the DefaultBufferMaxRows and DefaultBufferSize parameters are at their default values.

Once I ran, debugged and got this to complete, I set up a user defined Data Collector Set in Perfmon and started it:

I messed around with a lot of counters until I settled on these.   All I really wanted to see was how many 10MB buffers I was using in my simple test, and verify I could accurately measure this in the problem environment.

I got this result:

Note the black line (the Buffers in Use counter)…in my case it started at 8. This capture is from when I had the package reading the same 19 million row .txt file into the same table simultaneously. It used 8 each time.

I was unable to reproduce the errors on my machine, no matter how funky I got with the package, since I am running 16GB of RAM, and you cannot configure a cap at the SSIS or package level (such as the SQL Server MaxMemory setting).

But I was confident that I could go to the UAT team and set this monitoring up to at least tell them what was happening, so we could then work to track down why.

At this point I was 8 hours in and still Googling things, and I found this thread on DBA Stack Exchange.

Crud.  Really?

Did you miss it too?  Go look at the very first line in the error text I posted.

Version 12.0.5000.0 for 32-bit.

Dagnabbit. 32-bit applications are limited to 2GB of memory by default. With some switchcraft (aka the /3GB switch) you can bump it.

When setting up the environment, the person doing it pointed the SSIS packages to the C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn directory (the 32-bit DTExec) instead of C:\Program Files\Microsoft SQL Server\120\DTS\Binn (the 64-bit one).

We changed this config parameter in the application setup and everything worked perfectly.

SSIS filled up the 2GB with buffers (big file, lots of columns) and fell apart.  Simple as that.
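
In other words, make sure whatever runs the package calls the 64-bit DTExec. A sketch, with a made-up package path:

# Hypothetical example; point at the 64-bit Binn directory explicitly
& 'C:\Program Files\Microsoft SQL Server\120\DTS\Binn\DTExec.exe' /F 'D:\Packages\Package1.dtsx'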

Hope this helps someone…please let me know if it helped you by commenting!

Update: If you read this far, also check out THIS article from my friend Tim on SSIS memory.

Thanks for reading!

Kevin3NF

My Pluralsight course for new SQL Server DBAs



Filed Under: SSIS, TIL

TIL: Joining across databases with different collations

May 16, 2017 by Kevin3NF

Quick post today…

Yesterday I was approached by a developer with something that had never occurred to me to try in 18 years of being a DBA.

“Can I write a query across two databases with different collations?”

My first thought took me back to SQL 7, when you had to reinstall SQL Server to change collation settings (IIRC).   My second and immediate thought was “Quit being so old and look it up, Kevin.”

So I bounced over to The Google, then to an answer on StackExchange that I can’t find right now (of course; I did upvote it yesterday, though), and found what I needed.

I have two test databases on my local SQL 2014 Dev instance, both with empty Orders and OrderDetails tables in them. Both were the default collation, so I changed one of them to case-sensitive (CS) instead of case-insensitive (CI):

ALTER DATABASE [test2] COLLATE SQL_Latin1_General_CP1_CS_AS

I entered a test order in each DB manually, and in the “Note” column I changed the case of the word Five. Why Five? No idea…it’s just text, and that’s what my developer (Jared) was having to join on.

Select * from Test.dbo.orderdetails where note = 'five'
Select * from test2.dbo.OrderDetails where Note = 'FIve'

Results:


OrderDetailID OrderID ItemQty Note
------------- ----------- ----------- ---------
1 1 5 Five

(1 row(s) affected)

OrderDetailID OrderID ItemQty Note
------------- ----------- ----------- ---------
1 2 5 FIve

(1 row(s) affected)

Next up, write the Select statement the normal way to duplicate the error Jared was getting:

--fails
Select *
from Test.dbo.orderdetails A join test2.dbo.OrderDetails B
 on A.[note] = B.[Note]

Result:

Msg 468, Level 16, State 9, Line 11
Cannot resolve the collation conflict between “SQL_Latin1_General_CP1_CS_AS” and “SQL_Latin1_General_CP1_CI_AS” in the equal to operation.

 

So…now to bring it in the correct way (and there is probably more than one correct syntax for this…I didn’t look):


--works
Select *
From Test.dbo.orderdetails A
Join test2.dbo.OrderDetails B
on A.[note] = 
   Cast(B.[Note] as varchar(50)) COLLATE SQL_Latin1_General_CP1_CI_AS

Result:


OrderDetailID OrderID ItemQty Note OrderDetailID OrderID ItemQty Note
--------------------------------------------------------------------- 
1 1 5 Five 1 2 5 FIve

(1 row(s) affected)

 

All I did here was tell the query to treat the Test2 Note field as Case Insensitive for this one query.  Nothing changed in the source data itself, or the database settings/DDL.
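
There’s at least one other syntax I know of (untested against this exact setup): COLLATE DATABASE_DEFAULT coerces the column to whatever the current database’s collation is, so you don’t hard-code the collation name:

--also works, run from the Test database
Select *
From Test.dbo.orderdetails A
Join test2.dbo.OrderDetails B
 on A.[note] = B.[Note] COLLATE DATABASE_DEFAULT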

It took me five times longer to write this up than to test it, since I already had some generic test databases ready to go.   Same ones I use for testing my answers on MSDN, SSC and DBA Stack questions.

If this helped you, please share, like, re-post, etc. as you see fit.   Or a thumbs up in a comment if you like.

Thanks for reading!

Kevin3NF

(stalk me on Twitter)

Filed Under: SQL, TIL, TSQL

TIL: Starting a recurring SQL trace

December 8, 2016 by Kevin3NF

Today I learned…

How to set up a recurring Server-side SQL trace that runs every hour for 10 minutes.

Issues:

  • 6 people in the room are staring at me, waiting for this last-second request to be done at the end of an 11-hour day (3 of them from the VBV – Very Big Vendor)
  • Trace file names must be different, or you get errors
  • Trace file names cannot end with a number
  • I can’t tell time when I am hungry and tired

I know I need to script out the trace to T-SQL (which I had already done), and put this in a ‘start trace’ job on an hourly schedule. I have set these up before to run ad hoc, but not recurring.
I also need code to stop the trace 10 minutes later that my boss understands…so we are going for fast and functional here, not pretty.

In the SQL script Profiler generates, the @tracefile parameter supplied to sp_trace_create is specified in the call:

exec @rc = sp_trace_create @TraceID output, 2, 'D:\MyTraceFiles\Trace', @maxfilesize, NULL

If you need different names for each execution, you have to get creative. I created a variable to feed it that included a getdate():

declare @tracefile nvarchar(245)
set @tracefile = 'D:\MyTraceFiles\Trace_'+Cast (getdate() as varchar(20))+'_Trace'

This worked and gave this result in testing:

D:\MyTraceFiles\Trace_Dec 8 2016 7:08AM_Trace

But…running the full trace script generated files in the directory with this name:

Trace_Dec 8 2016 7

It took me some time playing with variable sizes and such before I asked one of the VBV guys to look at it. We figured out the ‘:’ in the time stamp was an issue, so I included a REPLACE:

set @tracefile = 'D:\MyTraceFiles\Trace_'+replace(Cast (getdate() as varchar(20)),':','')+'_Trace'

Which gave this result:

D:\MyTraceFiles\Trace_Dec 8 2016 711AM_Trace

More importantly, I had a functioning trace script I could schedule:

-- Create a Queue
declare @rc int
declare @TraceID int
declare @maxfilesize bigint
declare @tracefile nvarchar(245)

set @maxfilesize = 4096
set @tracefile = 'D:\MyTraceFiles\Trace_'+replace(Cast (getdate() as varchar(20)),':','')+'_Trace'

--select @tracefile

exec @rc = sp_trace_create @TraceID output, 2, @tracefile, @maxfilesize, NULL
if (@rc != 0) goto error

-- Client side File and Table cannot be scripted

-- Set the events
declare @on bit
set @on = 1
exec sp_trace_setevent @TraceID, 10, 1, @on
exec sp_trace_setevent @TraceID, 10, 9, @on
exec sp_trace_setevent @TraceID, 10, 2, @on
.
.
.
exec sp_trace_setevent @TraceID, 13, 64, @on
exec sp_trace_setevent @TraceID, 13, 66, @on

-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint

exec sp_trace_setfilter @TraceID, 35, 0, 6, N'Database1'
exec sp_trace_setfilter @TraceID, 35, 1, 6, N'Database2'
--above, the '1' in parameter 3 changes this to an OR instead of an AND,
-- so where DatabaseName is Database1 OR Database2. Also new info for
-- me today

-- Set the trace status to start
exec sp_trace_setstatus @TraceID, 1

-- display trace id for future references
select TraceID=@TraceID
goto finish

error:
select ErrorCode=@rc

finish:
go

Scheduling was simple…drop the above into a ‘Start trace’ job and enable it to run every hour on the hour.

Create a second job to run every hour at 10 minutes past the hour, with the following:

exec sp_trace_setstatus 2, 0
exec sp_trace_setstatus 2, 2

On this system, my trace will always be ID 2: trace 1 is the default trace, and nobody else here runs traces.
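
If you can’t guarantee the trace ID, you could look it up from sys.traces instead of hard-coding it. A sketch, untested:

-- Stop and delete whichever trace is writing to our folder
declare @id int
select @id = id from sys.traces where path like 'D:\MyTraceFiles\%'
exec sp_trace_setstatus @id, 0 -- stop the trace
exec sp_trace_setstatus @id, 2 -- close it and delete its definition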

Summary:

I already knew I don’t do well at the end of a long day or in a hurry, so I took my time while the others waited, rather than put a bad solution in to run overnight. There are much more elegant ways to do this, but this worked for me and I learned it on the fly.

Thanks for reading!

Kevin3NF

Filed Under: Performance Tuning, TIL Tagged With: job, performance, SQL Trace, TIL
