Schema Compare multiple databases at once…

I recently had the need to schema compare a “Gold” or “Master” copy of our database to the 300 client copies of that database that exist in our Production environment.  I’m not alone…many fellow DBAs have had the same need, and Google searches confirm it.  In my case, this is for an upcoming application upgrade that will need post-upgrade comparison/verification.

There are third-party tools that do SQL compares…my particular favorite is aptly named SQL Compare, from Red Gate.  I’ve been using it off and on for 10 years.  I don’t know if it can be set up to hit more than one database at a time, and the other issue is that I don’t have a copy here.

Microsoft’s SQL Server Data Tools will also do this within Visual Studio…still one database at a time.  I forget where, but someone pointed me to the fact that SSDT uses SQLPackage.exe under the hood to do the work.  I figured that if I could run it at a command line, I could script it out for all of the databases.  I’m not much of a DOS scripting guy, so everything that follows is just my hack version…but it works, and not just on my machine!

I got most of this from Stack Overflow user Mike Hyde’s response here.
I had a follow-up “gotcha” question on SO here.


The process:

  • Create a Test master database with 2 tables and 1 view
  • Create 5 copies as Test1, Test2, etc. (If it works for 5, it works for 300)
  • Add a column to a table in Test4, and a view to Test5
  • Use SQLPackage to Extract a .dacpac file from the “Gold” database (a .dacpac is basically a schema-only backup)
  • Dynamically create a script that generates the SQLPackage call
  • Paste that into a CMD window to compare the Gold db to the Client dbs
  • View the results, looking for exceptions

Create a Test master database:

--Create Gold copy  
 CREATE DATABASE [Test]  
  ON PRIMARY  
 ( NAME = N'Test',  
      FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\Test_Gold.mdf',  
      SIZE = 4096KB , FILEGROWTH = 1024KB )  
      LOG ON ( NAME = N'Test_log',  
      FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\Test_Gold_log.ldf',  
      SIZE = 1024KB , FILEGROWTH = 100MB)  
 GO  
 --Create some Objects  
 USE [Test]  
 GO  
 CREATE TABLE [dbo].[Orders]  
      (  
      [OrderID] [int] NOT NULL,  
      [OrderDate] [datetime] NULL  
      )  
      ON [PRIMARY]  
 GO  
 ALTER TABLE [dbo].[Orders]  
      ADD CONSTRAINT [DF_Orders_OrderDate]   
      DEFAULT (getdate()) FOR [OrderDate]  
 GO  
 CREATE TABLE [dbo].[OrderDetails]  
      (  
      [OrderDetailID] [int] NOT NULL,  
      [OrderID] [int] NOT NULL,  
      [ItemQty] [int] NOT NULL  
      )  
      ON [PRIMARY]  
 GO  
 Create View vOrders AS  
 SELECT  
      dbo.Orders.OrderID,  
      dbo.Orders.OrderDate,  
      dbo.OrderDetails.OrderDetailID,  
      dbo.OrderDetails.ItemQty  
 FROM    
      dbo.OrderDetails  
      INNER JOIN  
      dbo.Orders ON dbo.OrderDetails.OrderID = dbo.Orders.OrderID  

 

Create 5 copies as Test1, Test2, etc.:

--Create Gold copy  
 Backup Database test  
 To Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 --restore x 5  
 Restore Database Test1  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test1.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test1_log.ldf'  
      ,Recovery  
 Restore Database Test2  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test2.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test2_log.ldf'  
      ,Recovery  
 Restore Database Test3  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test3.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test3_log.ldf'  
      ,Recovery  
 Restore Database Test4  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test4.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test4_log.ldf'  
      ,Recovery  
 Restore Database Test5  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test5.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test5_log.ldf'  
      ,Recovery  

 

Add a column to a table in Test4, and a view to Test5:

Use Test4;  
 Go  
 Alter Table dbo.Orders  
      Add CustomerName varchar(500)  
 Use Test5;  
 go  
 Create View VOrderDetails AS  
 Select * from dbo.OrderDetails  

So at this point we have a gold copy (Test), three that we know match it (Test1/2/3), and two that have drifted.  Get used to the term drift…third-party vendors are using it, and later we will create scripts to fix ‘drift’.

Finally it gets interesting!

Use SQLPackage to Extract a .dacpac file from the “Gold” database:

We will use SQLPackage.exe to create a .dacpac file.  Search your box for the file…likely it will be buried in Visual Studio’s folders or SQL Server’s.  Once I found it, I added that folder to my PATH variable so I didn’t have to CD to it in the CMD window each time.

My default install has SQLPackage here:
C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin

If you cannot edit your PATH due to company restrictions, use this at the beginning of your script:
SET PATH=%PATH%;C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin
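
To confirm that CMD can actually find the tool after the PATH change, a quick check (this uses the built-in where command, nothing SQLPackage-specific):

REM Prints the full path to sqlpackage.exe if the PATH edit worked
where sqlpackage.exe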

Now, SQLPackage needs three basic parameters (oversimplified, I know):

  1. Action
  2. Source
  3. Target

On my system, this is the command I will use:

sqlpackage.exe /a:Extract /scs:Server=US1213113W1\SQL2014;Database=Test; /tf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac

Make sure this is all one line.  CMD will freak out on the CR/LF and tabs.  Also, change the server name in /scs and the path in /tf.  /a is the action…we are Extract-ing the schema into a .dacpac file named in the /tf parameter.
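
If keeping it on one line gets unwieldy, CMD’s caret (^) line-continuation character is another option.  A sketch of the same command split for readability (same server and path assumptions as above):

sqlpackage.exe /a:Extract ^
 /scs:Server=US1213113W1\SQL2014;Database=Test; ^
 /tf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac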

Paste this into a CMD window, hit Enter, and you should get this back:

Connecting to database 'Test' on server 'MyLaptop\SQL2014'.
Extracting schema
Extracting schema from database
Resolving references in schema model
Successfully extracted database and saved it to file 'C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac'.

If it fails, the error messages are pretty specific about servername, database, etc.

Dynamically create a script that generates the SQLPackage call:

Now on to the part that gets you to multiple databases “at once.”  Technically, this is not all at once…there are multiple commands being run in order, not in parallel.  But, you can still check Twitter or get coffee while they run instead of sitting in “Point and Click Heck”.

For comparing to the live databases, I used:

Set NoCount On  
 -- Extract the gold copy from Test  
 Select   
      'sqlpackage.exe /a:Extract /scs:Server=US1213113W1\SQL2014;Database='+[name]+'; /tf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\'+[name]+'.dacpac'  
 from sys.databases  
 where [name] like 'Test'  
 --Create a compare script with test.dacpac as the source  
 --and all databases starting with Test as the targets  
 Select   
      'sqlpackage.exe /a:Script /sf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:US1213113W1\SQL2014 /tdn:'+[name]+' /op:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Deltas\'+[name]+'.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True'  
 from sys.databases  
 Where 1=1  
   and [Name] like 'test%'  
   and [Name] <> 'Test'  

 

Note the parameters in the Compare section:

    /a – the action to be performed.  Script creates a .sql file with the changes to be made at the target
    /sf – source file to compare from…the “gold copy”
    /tsn – target server name
    /tdn – target database name
    /op – output path for the .sql file
    /p – Property setting to drop tables, views and other objects that should not be there
    /p – Property setting to drop indexes that should not be there (different from the other objects)

There are a ton of different ‘/p’ settings and options.  If you don’t include the first one I did (DropObjectsNotInSource), you may get back an empty .sql file even when you know there are differences.  This was the topic of my follow-up question on Stack Overflow…Schema Compare in VS showed the new objects, but the command line didn’t.

Go to the SQLPackage.exe link for all of the specifics.  This is highly configurable to your needs.
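
As one example of those other ‘/p’ options: if permissions and role memberships legitimately differ per client database, properties along these lines keep them out of the generated script.  This is just a sketch…verify the property names against the SQLPackage documentation for your version before relying on them:

sqlpackage.exe /a:Script /sf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:US1213113W1\SQL2014 /tdn:Test1 /op:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Deltas\Test1.sql /p:DropObjectsNotInSource=True /p:IgnorePermissions=True /p:IgnoreRoleMembership=True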
Your output should resemble:
Extract the Gold .dacpac:  
 sqlpackage.exe /a:Extract /scs:Server=MyLaptop\SQL2014;Database=Test; /tf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac  

 Compare to the target databases:  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test1 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test1.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test2 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test2.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test3 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test3.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test4 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test4.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test5 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test5.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
   

Copy and paste the /a:Extract line into a CMD window (don’t forget the PATH variable mentioned earlier) if you have not already done this part.  Run it by pressing Enter.

Copy all of the /a:Script lines (plus a carriage return at the very end) into a CMD window, and they will start running immediately.  Without the final CR/LF, the last one will just sit there waiting for you to hit Enter.
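
If pasting a few hundred lines into a window feels fragile, another option is to save the generated /a:Script lines into a batch file and run that instead.  A minimal sketch, assuming you saved them as RunCompares.cmd (a name I made up) in the DACPACS folder:

REM Make sure sqlpackage.exe is reachable for this CMD session
SET PATH=%PATH%;C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin
REM Run every generated sqlpackage /a:Script command, one after another
CALL C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\RunCompares.cmd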

Any yellow text you see is warning you about possible data loss on dropping Tables and Indexes.

*** The object [Test] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
*** The object [Test_log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.

 

*** The column [dbo].[Orders].[CustomerName] is being dropped, data loss could occur.

The first two are the data and log logical file names, and the third is warning about data loss from dropping the CustomerName column that was added to Test4.  The dependent view also gets a metadata refresh after that column is dropped…a nice feature to have.

View the results, looking for exceptions:


At this point, you should have a .sql file for each of your target/client databases, assuming no connection issues.  Ideally, they are all in the same place, which will help with the last step (actually running the scripts is your task…I’m just taking you through generating them).

The way I choose to analyze mine is not to look at each one, but rather to look for exceptions in the size of the files.  There is some text in each file that is exactly the same.  Any additional text is T-SQL code that drops, creates or alters something.

Sample output .sql file, with the DROP statements in red:

/*
Deployment script for Test5
This code was generated by a tool.
Changes to this file may cause incorrect behavior and will be lost if
the code is regenerated.
*/
GO
SET ANSI_NULLS, ANSI_PADDING, ANSI_WARNINGS, ARITHABORT, CONCAT_NULL_YIELDS_NULL, QUOTED_IDENTIFIER ON;
SET NUMERIC_ROUNDABORT OFF;
GO
:setvar DatabaseName "Test5"
:setvar DefaultFilePrefix "Test5"
:setvar DefaultDataPath "c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\"
:setvar DefaultLogPath "c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\"
GO
:on error exit
GO
/*
Detect SQLCMD mode and disable script execution if SQLCMD mode is not supported.
To re-enable the script after enabling SQLCMD mode, execute the following:
SET NOEXEC OFF;
*/
:setvar __IsSqlCmdEnabled "True"
GO
IF N'$(__IsSqlCmdEnabled)' NOT LIKE N'True'
    BEGIN
        PRINT N'SQLCMD mode must be enabled to successfully execute this script.';
        SET NOEXEC ON;
    END
GO
IF EXISTS (SELECT 1
           FROM   [master].[dbo].[sysdatabases]
           WHERE  [name] = N'$(DatabaseName)')
    BEGIN
        ALTER DATABASE [$(DatabaseName)]
            SET ENABLE_BROKER
            WITH ROLLBACK IMMEDIATE;
    END
GO
USE [$(DatabaseName)];
GO
PRINT N'Dropping [dbo].[VOrderDetails]...';
GO
DROP VIEW [dbo].[VOrderDetails];
GO
PRINT N'Update complete.';

My technique for comparing file sizes is to open a CMD window by Shift+right-clicking the folder they are in and choosing ‘Open command window here’, then running ‘DIR’, which lists the files with their sizes in bytes (Windows Explorer’s default is KB).  There are no doubt a bunch of ways to do this, so pick whatever you prefer.  My results:

(Image lost in migration from Blogger)
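
If you would rather have CMD do the sorting, DIR can order the files by size so the exceptions float to the top.  A quick sketch against the Deltas folder used above (/O:-S lists largest first):

REM Switch to the folder holding the generated .sql files, then list them largest-first
cd /d C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas
dir /o:-s *.sql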

Note that the three we didn’t change are all 1,330 bytes.  Test4 is larger, as is Test5.  They are different from each other due to the exact T-SQL in them to perform the drops necessary to bring them back to gold.  These are my exceptions.

In my environment, it is appropriate for me to stop here and start going through the exceptions manually.  For you, it may be appropriate to automatically execute the scripts.  Look at the SQLPackage link from Microsoft for the /Action:Publish parameter.

WARNING:  Publishing a .dacpac file that has any data in it overwrites the existing data in the target. This may be fine for a ‘State’ table, but not at all for Orders.
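
For reference, a Publish call has the same shape as the Script action.  This is only a sketch to show the form…the /p options (especially BlockOnPossibleDataLoss) are ones to research carefully before pointing this at anything real:

REM Sketch: deploy the gold schema directly to one drifted database
sqlpackage.exe /a:Publish /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test4 /p:DropObjectsNotInSource=True /p:BlockOnPossibleDataLoss=True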

Note:  If you are comfortable with PowerShell, please go look at this article from Chris Sommer, which he wrote in response to me asking the Compare Multiple Databases question.  It covers the same basics I just did, but without the looping to get all the DBs at once.   He did the script and blog in about 2 hours…

Note2:  All of my testing and scripting was done on a laptop, running Windows 7, SQL 2014 Dev edition, Visual Studio and SSDT 2015.

Note3:  Special thanks to whoever wrote this Code Formatter for Blogspot posts!

All of this works in my environment, but it’s possible I missed a step in the writing.  Please help me QA this post.  If something is not clear or I made some egregious error, please, please, please leave a comment.  If it works and helps you, please let me know that as well!

Thanks for reading,

Kevin3NF
The OnPurpose DBA
