
DallasDBAs.com

SQL Server Database Consulting


Migrate multiple SQL databases to Azure SQLDB

August 1, 2017 by Kevin3NF 1 Comment

I ran across a forum post asking about moving multiple databases to Azure, so of course I Googled the asker’s question and found:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-migrate-your-sql-server-database

Since I might be doing a similar thing in the near future, I decided to see if I could push 4-5 small test databases to Azure SQLDB at the same time (SQL on Azure VM test may come later).  If this happens with my client it will be hundreds of databases, with the same schema but wildly varying sizes.

The “Plan”

  • Create 5 sample databases
  • Create .bacpacs on local drive
  • Create Azure Logical Server if not already done (see step 4)
  • Create a Notepad file of SQLPackage.exe commands, one for each .bacpac
  • Run with “start” in CMD to fire off separate windows
  • Wait.
  • Enjoy.

I took the easy way out and created my 5 databases by creating 1 test database, backing it up and then restoring to Test2 – 5:

USE [master]
RESTORE DATABASE [AzureSQLDB_Test2] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test2.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test2_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test3] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test3.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test3_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test4] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test4.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test4_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test5] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test5.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test5_log.ldf'
GO
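The four RESTORE scripts above differ only in the target database name, so they can be generated rather than copy-pasted. A quick sketch (the paths match my test instance and are assumptions for your environment):

```python
# Generate the RESTORE statements for Test2..Test5 from the Test1 backup.
# DATA and BAK paths below are from my test instance -- adjust for yours.
DATA = r"C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA"
BAK = (r"C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL"
       r"\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak")

def restore_statement(n: int) -> str:
    """Build one RESTORE ... WITH MOVE statement for database number n."""
    db = f"AzureSQLDB_Test{n}"
    return (
        f"RESTORE DATABASE [{db}]\n"
        f"FROM DISK = N'{BAK}'\n"
        f"WITH MOVE N'AzureSQLDB_Test1' TO N'{DATA}\\{db}.mdf',\n"
        f"     MOVE N'AzureSQLDB_Test1_log' TO N'{DATA}\\{db}_log.ldf'\n"
        f"GO"
    )

# Print all four statements, ready to paste into SSMS.
print("\n\n".join(restore_statement(n) for n in range(2, 6)))
```

Paste the output into an SSMS query window against master and run it.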


Create .bacpac files…basically just zip files of schema and data that Azure SQLDB can read.   You cannot do a traditional RESTORE DATABASE to an Azure logical server.   Right-click the database>>Tasks>>Export Data-Tier Application.  Then work through the GUI:

Azure SQLDB bacpac import

Browse for the location to place the .bacpac.   Remember this, as you will need it shortly.

Azure SQLDB bacpac import

Click Finish to create the file.

At this point, you need to make sure you have an Azure account and an active subscription.  You also need a Storage Account.  The Azure SQLDB lives on an Azure logical server that has its own settings, firewall access rules, etc.  I am not going into those details as they are well documented in multiple places, and I know that your Google-Fu is strong!

SQLPackage.exe is a command line tool that is used to work with data-tier applications.   I’ve used it to extract schemas, do multiple schema compares at once, deploy changes to live databases, etc.   Read the first paragraph of the link…

I created a list of commands, one for each database, that I need to run to move my individual .bacpac files into Azure SQLDBs:

start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test1;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test1.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test2;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test2.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test3;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test3.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test4;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test4.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test5;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test5.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic

Yes, that is a bunch of text for just 5 parameters.  Breaking it down:

  • /a:import — the import action, to import the schema and data into Azure SQLDB
  • /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test1;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" — the target connection string: server, database, user ID and password to connect with and create
  • /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test1.bacpac" — the source file that gets imported.  Use double quotes if your path has spaces in it
  • /p:DatabaseEdition=Basic — Basic, Standard, or Premium (cost and performance from low to high)
  • /p:DatabaseServiceObjective=Basic — lots of options here if you choose Standard or Premium above.  Basic is just Basic.
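Since the five commands differ only in the database number, they can also be generated. A sketch, assuming the server name, output folder, and credentials are placeholders you will replace:

```python
# Build one 'start SqlPackage.exe' import command per .bacpac.
# SERVER and FOLDER are placeholders from my test run -- use your own.
SERVER = "myTestServer.database.windows.net"
FOLDER = r"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages"

def import_command(n: int, user: str, password: str) -> str:
    """One SqlPackage import command line for database number n."""
    tcs = (f"Data Source={SERVER};Initial Catalog=test{n};"
           f"User Id={user};Password={password}")
    sf = f"{FOLDER}\\AzureSQLDB_Test{n}.bacpac"
    return (f'start SqlPackage.exe /a:import /tcs:"{tcs}" /sf:"{sf}" '
            f"/p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic")

# Print all five lines, ready to paste into a CMD window.
for n in range(1, 6):
    print(import_command(n, "MyUser", "MyPassword"))
```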

Run these commands to import the .bacpacs:

Note that in my script I have ‘start’ at the beginning.  When I paste all 5 lines into a CMD window, each one fires off a new window that then runs the SqlPackage.exe command.  All 5 imports then run simultaneously.  Without the start, they will run one after the other.  Also, make sure you get the carriage return after the final command (line 6 in my code), or the 5th database will just sit there in the CMD window waiting for you to press Enter.
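The same fire-them-all-at-once trick can be scripted. A minimal sketch of launching several commands in parallel and waiting for all of them; the commands below are harmless placeholders standing in for the SqlPackage lines:

```python
# Launch several commands in parallel and wait for all of them --
# the scripted equivalent of pasting 'start ...' lines into CMD.
# Substitute your real SqlPackage command lines for the placeholders.
import subprocess
import sys

commands = [
    [sys.executable, "-c", "print('import 1 done')"],
    [sys.executable, "-c", "print('import 2 done')"],
]

procs = [subprocess.Popen(cmd) for cmd in commands]  # all start immediately
exit_codes = [p.wait() for p in procs]               # then block on each
print(exit_codes)
```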

My screen as they were running:

Azure SQLPackage Import SQL Server

My SSMS after connecting to my Azure logical Server:

SSMS Azure SQL

Note that Table_1 is there just to verify that the one database I actually changed came across properly.

That’s it.  They are there, ready for me to do whatever it is I plan to do with them.

Some things to consider:

  • Bandwidth:  Your internet connection will determine how fast this works (throughput and stability)
  • Local or Azure storage: You can copy the .bacpacs to your storage account and run from there.   Adjust the /sf parameter accordingly
  • Database size: There are limits to Azure SQLDB on size, performance, etc.   Do your research.
  • Database performance tier: the more you want, the more you pay.  Again, do your research.
  • Yes, I’m sure there are a number of other ways to do this, including PowerShell options.  This is the route I chose due to its simple, GUI-driven nature.

That’s all I have for today…comments appreciated!

Thanks for reading.

Kevin3NF

Filed Under: Azure, backup, Deployment, SQL

Determine an Index Size in SSMS

July 28, 2017 by Kevin3NF Leave a Comment

Quick post today…I needed to find a quick way to get the size of a specific index without using T-SQL.

Tables are easy…right-click>>Properties>>Storage.   Indexes don’t have a storage option.   You can get the sum of all indexes on a table from the “Disk Usage by Top Tables” built-in report, but not one by one.

To do it in the SSMS GUI, expand the Database>>Tables>>Indexes:

SQL Index Table Size

 

Right-Click the index in question, select “Properties”, then “Fragmentation.”:

index fragmentation

As we know, pages are 8 KB, so now it’s just a simple math problem:

46,236 pages x 8 KB / 1024 = 361 MB (rounded for conversation’s sake):

SQl Server Index size calculator
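The arithmetic above is simple enough to wrap in a one-liner if you do it often:

```python
# Convert an index's page count to megabytes:
# pages are 8 KB each, and 1024 KB make one MB.
def pages_to_mb(pages: int) -> float:
    return pages * 8 / 1024

print(round(pages_to_mb(46236)))  # the index above: ~361 MB
```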

If you need more than one, but not all…you’re going to need to Google up some T-SQL to run.  I found a variety of ways to do it, and didn’t like any of them.

Thanks for reading!

Kevin3NF

Follow Dallas DBAs on LinkedIn

 

Filed Under: Accidental DBA, Beginner, Index, SSMS

Greatest Moment of my Twitter Life

July 10, 2017 by Kevin3NF Leave a Comment

If you don’t know who Brent Ozar (b|t) is, this will mean nothing to you 🙂

I can retire now 😉

Kevin3NF

 

Filed Under: SQL

Thoughts from a New Speaker

June 19, 2017 by Kevin3NF Leave a Comment

Sunday afternoon…enjoying the “Post SQL Saturday” calm…

SQL Saturday Houston was yesterday, which in and of itself is not that unusual for me.   I’ve been to many technical conferences over the last 18 years…4 SQL Saturdays, 2 PASS Summits, SQL Server launch events, TechNet stuff back in the late 90’s and more user group meetings than I can count.

This particular event was the first time I ever went as a speaker.   It’s technically the first time I’ve ever spoken to a technical audience.   12 years ago I did a demo of some Red Gate tools, but that was a short, ad-hoc thing.  I’ve taught a few training classes in the workplace and am currently teaching a 5-week fundamentals class.  Small, interactive and very different from yesterday.

Wisdom for first-time speakers can be found all over: blogs, Twitter, and there is even a channel for it in the SQL Community Slack. Almost all of it says to start small, like at work or at the local user group.  Practice and rehearse until your dog and kids know your topic inside and out.  Never type in a demo, etc.

I prepared a slide deck and pretty much winged it…which is what I do best.  If I rehearse things I get nervous and wind up second-guessing myself.  I skipped the local user group part and went straight for SQL Saturday.  I applied to speak at Houston (DR/Backups) and Chattanooga (Security) hoping to get a nibble on one.  Much to my surprise, I was accepted to both.  Apparently the desire for beginner/fundamental topics is significant.  The attention some of my fundamentals blog posts get tends to support this.

My session went really well (in my opinion), with the only technical issue being a projector that overheated and had to cool for a few minutes. As a joke I broke out my backup laptop claiming to have been half-prepared.

I was a bit nervous the day before, but when I got to the event site and into the Speakers/Sponsor/Volunteer room, all that went away.  I almost had a last minute switch of session times with Bob Ward, as he was running a little late.  But, he got there with a few minutes to spare.

One of the best prep things I did was go to Brent Ozar’s (b|t) performance tuning pre-conference on Friday.  I went as much to watch presentation style as for the knowledge being shared.  Brent is a great presenter.  I also really enjoy anything Pinal Dave (b|t) has to say, mostly due to the excitement and passion he delivers it with.  Pinal could make a snail race exciting.

The San Jacinto Junior College facility was fantastic with clean rooms, stable WiFi, etc.  Everything there looks new.  The local SQL Saturday team had all the bases covered.  Allen (t), Devon (t), Sarah and the rest made for a fantastic day.

I’m looking forward to Chattanooga next weekend, and who knows where beyond that.   If you are interested in presenting, don’t let anything stop you from responding to a call for speakers.  Applying is half the battle 🙂

Thanks for reading,

Kevin3NF


Filed Under: Career, Speaking, SQLSaturday

SQL Saturday Houston 2017

June 13, 2017 by Kevin3NF Leave a Comment

I will be presenting my session “Backups for non-DBAs….the Why, not the How” at SQL Saturday Houston on June 17, 11am.

This is mostly targeted at non-DBAs such as owners, managers, CxOs, developers and sysadmins…but all are more than welcome.

We will be discussing various scenarios to consider as part of an overall DR/Business Continuity plan, as well as sharing a few horror stories from the field…both ancient and current.

The session abstract:

Join me and get some new perspective on what your DBA is doing, and why!

Targeted at the non-DBA such as BI/DW folks, Application Developers, Managers, and System Admins, this session will go over the reasons we back up our databases, our systems, etc. Considerations such as Single points of failure, High Availability/Disaster recovery, Business Continuity and others will be discussed in this interactive conversation. It will be conversation heavy, with supporting slides to download, and one Database backup demo at the end if time permits.

Everyone involved in a technical role needs to at least know that the things they have created will be recoverable in the event of a disaster, or even just an “oops” moment. The CIO/CTO should know how long critical systems will be down when bad things happen. Backups are everyone’s responsibility…whether asking the right questions or implementing the process.

I look forward to seeing you there!

Kevin3NF

 

Filed Under: backup, Career, HADR, Restore, SQLSaturday

Database Delete vs. Detach vs. Offline

May 31, 2017 by Kevin3NF Leave a Comment

If you haven’t been asked to delete a database in your DBA career…it’s just a matter of time:

“We don’t need database XYZ anymore…delete it.” — Some well meaning manager

Stop right now, before blindly following an order like that.

First…there are some questions to ask:

  • What environment is this? (Prod, Dev, etc.)
  • Are you sure we don’t need it?
  • Has anyone verified that?
  • When was it last used?
  • Did something replace it?
  • Is it backed up?

As a DBA, any time anyone asks you to delete anything, from data to an entire database, you need to question it.  Not doing so can be a CLM/RGE combo!  (Career Limiting Move/Resumé Generating Event)

The request may be perfectly valid, but in my experience most people that ask for a database to be deleted use the term to mean “I don’t want to see that database in SSMS anymore” or “I don’t want it on Server A, put it on Server B”, with very little regard for impact to other teams or applications.

There are three main ways to take a database “out of service”:

  • Take it Offline
  • Detach it
  • Delete it

Each has its own considerations and each is the proper choice in different circumstances.  I am going to walk you through each one, from least risky to most, in that order.

Offline:

Takes the database offline, but it remains visible in SSMS.  It cannot be accessed by applications, backups, etc., and can be brought back online relatively quickly.

My list of databases before starting this post…we’ll be messing with the Rename2222 database (nothing in it…came from a rename script I was writing/testing):

Right-click the database in question, Tasks, Offline:

 

Dialog Box you get after clicking “Take Offline”:

Note 1: You do not get the below dialog in SSMS 2014 (and possibly earlier)…this was done using SSMS 2016.
Note 2:  If you click the Script button, it will take the database offline in SSMS2016, instead of generating a script.  I’m told this is resolved in SSMS 2017.

 

Results in SSMS after Offline Operation:

Upside to the Offline method: the database is still in place and visible in SSMS, its record in the master database is still there, and it’s relatively simple.

Downside to Offline: it may cause SQL monitoring tools to throw alerts and create tickets, and since the files are not in use by SQL Server, they can be deleted out from under you.

Detach:

Removes the database from SSMS, deletes its meta information in the master database, but leaves the physical files intact on the drive.

The detach menu:

The detach dialog:

Choose “drop connections” here…if not, the detach may fail.  Or, leave it unchecked as a test to see if anyone is using it.

After Detaching:

Note that the MDF/LDF files are still right where they were.  SQL Server no longer has a handle on them:

You can re-attach fairly quickly should the need arise (i.e. user comes screaming at you 5 minutes after the detach):

Right-click “Databases”, select Attach

 

Click ‘Add’ and browse for the .MDF file…SSMS should find and include the .LDF file on its own.

 

Click ‘OK’ to attach

Upside to Detach method: Files are still in place, re-attach is fairly straightforward, Monitoring tools will generally not error. Can save your job by not having deleted a database someone else needed.

Downside to Detach: files not manually deleted take up disk space…set a reminder to delete them after some time period.

Delete:

Drops the database, its metadata in master, AND its associated physical files…you cannot reverse a Delete (DROP DATABASE).

This is one of those “Stop right here” moments.  Before you Delete a database, make very sure you are following protocols.   Make sure you have a recent FULL backup:

 

If not, run one.  The simplest possible command to backup a database:

-- backs up to the default location

BACKUP DATABASE Rename2222 
TO DISK = 'Rename2222.bak'

Better:

-- backs up to a specified location 
-- with the date included in the file name:

BACKUP DATABASE Rename2222 
TO DISK = 'c:\MyBackups\Rename2222_20170531.bak' 
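If you run dated backups like this often, a tiny helper can build the file name for you. A sketch; the folder and database name are illustrative:

```python
# Build a dated backup file path like c:\MyBackups\Rename2222_20170531.bak.
# The default folder is illustrative -- pass your own backup location.
from datetime import date

def dated_backup_path(db: str, folder: str = r"c:\MyBackups") -> str:
    return f"{folder}\\{db}_{date.today():%Y%m%d}.bak"

print(dated_backup_path("Rename2222"))
```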

Once you have a backup, you also should get a ticket, change control, or at the very minimum an email from the requester with approval for the delete.  Depends on your process.   Especially important if this is a production environment.   This can save your job.  If you delete a database that a system depends on and cause an outage…whoever told you to do it may claim they were not involved.

To delete is way too simple:

Deleting backup/restore history is up to you.  Drop all connections to force everyone out.  
If anyone is connected, should you be deleting?

Remember…once you Delete a database…it’s gone.   The ONLY ‘undo’ is to restore from backup.   You can call Microsoft, but they will tell you the same thing.  Worst case, you can spend tens of thousands of dollars on a data recovery service, assuming you can take the server down and nothing overwrote the space the files were in on the drives.

EDIT: If you take a database into the “Offline” state and then delete it, the physical files will still be on the drive. I’m not sure if this is accurate in ALL SQL Server versions, so detach first.

Wrap up:

Almost everywhere I’ve worked, Detach with a 2-4 week follow-up to delete the files was the best choice.   But beware that some systems may only be used for quarterly or annual reporting.  In an ideal situation, you have a complete inventory of every database on every server from production down to dev and know exactly why each one is there, as well as who its stakeholders are.  If you don’t have this at the OS, SQL, and database level, start building it now.

Thanks for reading!

Kevin3NF

My Pluralsight course for new DBAs

Follow @Dallas_DBAs

Filed Under: Accidental DBA, backup, Beginner, EntryLevel

