
DallasDBAs.com

SQL Server Database Consulting



SQL Server Point In Time Restore

October 26, 2017 by Kevin3NF

A 20-minute video of me setting up and demonstrating a point-in-time database restore in SQL Server.

If you’ve ever been on a SQL DBA interview, more than likely you’ve been asked about this. I was recently, and I’ve been a DBA for 18 years…it never goes away. The reason is that many DBAs know the concept, but have never actually needed to do one.

It’s a basics video, so please don’t go looking for Paul Randal-level internals here 🙂
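If you’d rather read than watch, the demo boils down to a restore sequence like this. Note: the database name, file paths, and STOPAT time below are placeholders, not the ones from the video, and this only works in the Full recovery model with an unbroken log backup chain:

```sql
USE [master];

-- 1. Restore the most recent FULL backup, leaving the database in a restoring state:
RESTORE DATABASE [MyDB]
FROM DISK = N'C:\Backups\MyDB_Full.bak'
WITH NORECOVERY, REPLACE;

-- 2. Restore log backups in order; STOPAT on the final one rolls forward
--    only to the chosen moment, and RECOVERY brings the database online:
RESTORE LOG [MyDB]
FROM DISK = N'C:\Backups\MyDB_Log_1.trn'
WITH NORECOVERY;

RESTORE LOG [MyDB]
FROM DISK = N'C:\Backups\MyDB_Log_2.trn'
WITH STOPAT = N'2017-10-26T14:30:00', RECOVERY;
```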

Thanks for watching!

Kevin3NF

 

Filed Under: Accidental DBA, backup, Beginner, Restore

SQL Saturday Minnesota 2017

August 31, 2017 by Kevin3NF

Now that the official schedule has been published, I am pleased to announce that I will be doing a pre-con and a regular session at SQL Saturday Minnesota on October 6-7, 2017.

The Friday pre-conference will be “DBA Fundamentals for the Junior, Accidental and non-DBAs.”  This will be a wide-ranging, day-long discussion of the fundamental pieces of SQL Server administration that benefit both new DBAs and those who have to work with or fill in for them.  Please see the link for details, and feel free to contact me with any questions.  There is a $110 (plus service fees) charge, which is split between me and the local SQL user group organizing the event, after expenses.

The Saturday regular session will be “Backups for new/non-DBAs…the Why, not the How,” where we will cover disaster recovery scenarios and the part SQL Server plays in the recovery.  We won’t spend much time specifically defining backup types, etc.  Currently this is scheduled for the 1-2pm time slot, but that may change.

Edit: my Saturday session is now 10:15am – 11:15am 🙂

I’m very excited to be coming to the MSP area!  I even have a friend who lives there whom I get to see for the first time in three years…plus all of the new friends I will make at the events!

Thanks,

Kevin3NF

Filed Under: Accidental DBA, backup, Beginner, EntryLevel, HADR, Install, Security, SQLSaturday

Migrate multiple SQL databases to Azure SQLDB

August 1, 2017 by Kevin3NF

I ran across a forum post asking about moving multiple databases to Azure, so of course I Googled the asker’s question and found:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-migrate-your-sql-server-database

Since I might be doing a similar thing in the near future, I decided to see if I could push 4-5 small test databases to Azure SQLDB at the same time (SQL on Azure VM test may come later).  If this happens with my client it will be hundreds of databases, with the same schema but wildly varying sizes.

The “Plan”

  • Create 5 sample databases
  • Create .bacpacs on local drive
  • Create Azure Logical Server if not already done (see step 4)
  • Create a Notepad of SQLPackage.exe commands,  one for each .bacpac
  • Run with “start” in CMD to fire off separate windows
  • Wait.
  • Enjoy.

I took the easy way out and created my 5 databases by creating 1 test database, backing it up and then restoring to Test2 – 5:

USE [master]
RESTORE DATABASE [AzureSQLDB_Test2] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test2.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test2_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test3] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test3.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test3_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test4] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test4.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test4_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test5] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test5.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test5_log.ldf'
GO
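Hand-editing four copies of that script gets old fast, and it won’t scale to the hundreds-of-databases case, so you could generate the statements instead. A sketch, using the same source backup and paths as above (review the PRINT output before running it, or swap PRINT for EXEC sp_executesql):

```sql
-- Generate the Test2-Test5 restores instead of hand-editing four copies.
DECLARE @i int = 2, @sql nvarchar(max) = N'';
WHILE @i <= 5
BEGIN
    SET @sql = @sql + N'
RESTORE DATABASE [AzureSQLDB_Test' + CAST(@i AS nvarchar(10)) + N']
FROM DISK = N''C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak''
WITH MOVE N''AzureSQLDB_Test1'' TO N''C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test' + CAST(@i AS nvarchar(10)) + N'.mdf'',
     MOVE N''AzureSQLDB_Test1_log'' TO N''C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test' + CAST(@i AS nvarchar(10)) + N'_log.ldf'';
';
    SET @i += 1;
END

PRINT @sql; -- review, then run the generated batch
```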


Create .bacpac files…basically just zip files of schema and data that Azure SQLDB can read.   You cannot do a traditional RESTORE DATABASE to an Azure logical server.   Right-click the database>>Tasks>>Export Data-Tier Application.  Then work through the GUI:

Azure SQLDB bacpac import

Browse for the location to place the .bacpac.   Remember this, as you will need it shortly.

Azure SQLDB bacpac import

Click Finish to create the file.

At this point, you need to make sure you have an Azure account and an active subscription.  You also need a storage account.  The Azure SQLDB lives on an Azure logical server that has its own settings, firewall access rules, etc.  I am not going into those details as they are well documented in multiple places, and I know that your Google-Fu is strong!

SQLPackage.exe is a command line tool that is used to work with data-tier applications.   I’ve used it to extract schemas, do multiple schema compares at once, deploy changes to live databases, etc.   Read the first paragraph of the link…

I created a list of commands, one for each database, that I need to run to move my individual .bacpac files into Azure SQLDBs:

start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test1;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test1.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test2;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test2.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test3;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test3.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test4;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test4.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test5;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test5.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic

Yes, that is a bunch of text for just 5 parameters.  Breaking it down:

  • /a:import
  • — import action to import the data and schema into Azure SQLDB
  • /tcs:”Data Source=myTestServer.database.windows.net;Initial Catalog=test1;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL”
  • — Server, Database, UserID and Password to connect to and create
  • /sf:”C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test1.bacpac”
  • — Source File that gets imported.  Use double quotes if your location has spaces in the path
  • /p:DatabaseEdition=Basic
  • — Basic, Standard, Premium (cost and performance from low to high)
  • /p:DatabaseServiceObjective=Basic
  • — Lots of options here if you chose Standard or Premium above.  Basic is just Basic.

Run these commands to import the .bacpacs:

Note that in my script I have ‘start’ at the beginning of each line.  When I paste all 5 lines into a CMD window, each one fires off a new window that then runs the SqlPackage.exe command, so all 5 .bacpacs are imported simultaneously.  Without that start, they will run one after the other.  Also, make sure you get the carriage return after the final command (line 6 in my code), or the 5th database will just sit there in the CMD window waiting for you to press Enter.

My screen as they were running:

Azure SQLPackage Import SQL Server

My SSMS after connecting to my Azure logical Server:

SSMS Azure SQL

Note the Table_1 is there just to verify that the one database I actually changed came across properly.

That’s it.  They are there, ready for me to do whatever it is I plan to do with them.

Some things to consider:

  • Bandwidth:  Your internet connection will determine how fast this works (throughput and stability)
  • Local or Azure storage: You can copy the .bacpacs to your storage account and run from there.   Adjust the /sf parameter accordingly
  • Database size: There are limits to Azure SQLDB on size, performance, etc.   Do your research.
  • Database performance tier: the more you want, the more you pay.  Again, do your research.
  • Yes, I’m sure there are a number of other ways to do this, including PowerShell options.  This is the approach I went with due to the GUI and its simple nature.

That’s all I have for today…comments appreciated!

Thanks for reading.

Kevin3NF

Filed Under: Azure, backup, Deployment, SQL

SQL Saturday Houston 2017

June 13, 2017 by Kevin3NF

I will be presenting my session “Backups for non-DBAs….the Why, not the How” at SQL Saturday Houston on June 17, 11am.

This is mostly targeted at non-DBAs such as owners, managers, CxOs, developers and sysadmins…but all are more than welcome.

We will be discussing various scenarios to consider as part of an overall DR/Business Continuity plan, as well as sharing a few horror stories from the field…both ancient and current.

The session abstract:

Join me and get some new perspective on what your DBA is doing, and why!

Targeted at the non-DBA such as BI/DW folks, Application Developers, Managers, and System Admins, this session will go over the reasons we back up our databases, our systems, etc. Considerations such as Single points of failure, High Availability/Disaster recovery, Business Continuity and others will be discussed in this interactive conversation. It will be conversation heavy, with supporting slides to download, and one Database backup demo at the end if time permits.

Everyone involved in a technical role needs to at least know that the things they have created will be recoverable in the event of a disaster, or even just an “oops” moment. The CIO/CTO should know how long critical systems will be down when bad things happen. Backups are everyone’s responsibility…whether asking the right questions or implementing the process.

I look forward to seeing you there!

Kevin3NF

 

Filed Under: backup, Career, HADR, Restore, SQLSaturday

Why is my SQL Log File Huge?

March 8, 2017 by Kevin3NF


 

HUGE Log files and how to troubleshoot:

The single most common question I have encountered in 18+ years of working with SQL Server:

Why is the .LDF file filling up my 500GB drive?  I only have 100MB of data!!?!?!?  Why am I getting error 9002?

For new or non-DBAs, this is a very frustrating situation without a logical reason (or so it seems).  It is also very common for it to be accompanied by applications that won’t work, alerts firing for drive space issues, etc.

If you like video, I recorded my response to this question and discuss the two most common remedies.  If you don’t like video, scroll down for text:

 

There are a number of reasons a log file can fill to extreme sizes.  The most common one by far is that the database is in full recovery model, and Transaction Log backups are not happening fast enough, or not happening at all.  Next to that, it could be that you had a massive transaction happen such as a huge data import, rebuild all indexes, etc.  These are logged and stay there until the .ldf file is backed up (or checkpointed if you are in Simple Recovery).

Step 1: Verify recovery model

Right-click the database, go to properties, click the Options tab.   You will see Full, Bulk-Logged or Simple.   If you are in Full, you have the option of backing up the log…which is the best possible situation.
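If you’d rather script the check than click through the GUI, the same information is in sys.databases:

```sql
-- Recovery model for every database on the instance:
SELECT [name], recovery_model_desc
FROM sys.databases
ORDER BY [name];
```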

SQL Server Database Options

Step 2: Verify if the log is full or “empty”

Verify whether the log file is actually full or not.  If you are backing up and the file still grew to a ridiculous size…it may have just been a one-time thing, and you can deal with that easily.  Right-click the database, go to Reports, Standard Reports, Disk Usage.  This will give you 2 pie charts.  Left is the data file, right is the log.  If the log shows almost or completely full AND the huge size, you need to back up.  If the log file is huge and mostly empty, you simply need to shrink it to an acceptable size.
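The script equivalent of the pie charts: DBCC SQLPERF(LOGSPACE) returns the log size and percent used for every database on the instance in one shot:

```sql
DBCC SQLPERF(LOGSPACE);
-- Log Size (MB) huge + Log Space Used (%) low  -> shrink the file (Step 3)
-- Log Size (MB) huge + Log Space Used (%) high -> back up the log (Step 4)
```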

SQL Server Disk Usage

Step 3: Shrink the file (if empty)

Right-click the database>>Tasks>>Shrink>>Files

Choose ‘Log’ from the File Type drop-down.  Hopefully there is only one log file.  If not, pick the big one.  Under Shrink Action, choose the ‘Reorganize pages before releasing space’ option and an appropriate size, even though log file shrinks don’t actually reorganize anything.  Pick a size in MB and click OK.  0 is not a good choice here.
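The T-SQL version of that shrink, if you prefer a script.  The database and logical file names here are placeholders; look yours up first:

```sql
USE [MyDatabase];

-- Find the logical name of the log file:
SELECT name
FROM sys.database_files
WHERE type_desc = 'LOG';

-- Shrink it to a target size in MB (again, not 0):
DBCC SHRINKFILE (N'MyDatabase_log', 1024);
```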

SQL Server Shrink File

Step 4: Backup

I’m not going to go into a ton of detail here…Right-click the database>>Tasks>>Backup.  Change the backup type to Transaction Log and work through the rest of the steps.
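Scripted, that log backup looks something like this (the path and database name are placeholders):

```sql
-- Back up the transaction log; this is what frees space inside the .ldf:
BACKUP LOG [MyDatabase]
TO DISK = N'C:\Backups\MyDatabase_log.trn'
WITH COMPRESSION, STATS = 10;
```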

If the Log Backup works, but the space is not freed (refresh the usage report), you have a different issue that these steps will not help with. Check out the “Wrapping Up” section at the bottom of this post.

If you don’t have enough room on any local, attached or network drive to create a log backup, even with compression, keep reading:

Step 5: Flip the Recovery Model (if log backup is not possible)

Warning:  Doing this WILL cause you to lose point-in-time recoverability, but if you cannot backup the log, you are pretty much already there anyway.

Right-click the database>>Properties>>Options

Change the recovery model to Simple and click OK

SQL Server Recovery Model

Wait a few seconds and then go refresh the Disk Usage report.  The log file will be the same size, but should be almost empty:

SQL Server Disk Usage

Step 6: Shrink the Log file

See step 3 above…

Step 7: Flip the recovery back to Full

See step 1…
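For reference, Steps 5 and 7 scripted (database name is a placeholder):

```sql
-- Step 5: flip to Simple; the next checkpoint empties the log:
ALTER DATABASE [MyDatabase] SET RECOVERY SIMPLE;

-- ...shrink the file (Step 6), then flip back:
ALTER DATABASE [MyDatabase] SET RECOVERY FULL;

-- Take a full backup now; log backups can't resume until you do.
```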

Step 8: Set up recurring log backups

If you don’t know how to do this, go to Management, Maintenance Plans, right-click Maintenance Plan>>Maintenance Plan Wizard and go from there.   This is well documented elsewhere.

Wrapping Up:

Hopefully this resolved your issue, but there are definitely other reasons for it to happen aside from a simple failure to back up.  Most notably: a very large transaction in a database that is participating in SQL Replication as a publisher.

If the above methods do not work, run these two statements and go post the results in the MSDN SQL Server forums, along with a description of the issue and what you have already tried (hopefully all of the above):

SELECT [name], recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE [name] = 'MyDatabase' --change this

DBCC OPENTRAN --results will be in the messages section

I love comments on my post, but if you need quick help go to the forums first, or maybe even a call to Microsoft Support if the “quick hits” don’t get you the resolution you need.  If this helped, please comment and share the link…

Thanks for reading!

Kevin3NF

My Pluralsight course for new SQL Server DBAs


Filed Under: Accidental DBA, backup, Beginner, EntryLevel

TSQL2SDay: Backup Basics

December 13, 2016 by Kevin3NF

t-sqltuesday

T-SQL Tuesday is a blog party started by Adam Machanic (b/t) just over five years ago. The first Tuesday of each month a blogger hosts the party and suggests a topic. Then anyone who is interested blogs on that topic. It can be a lot of fun and quite a challenge to blog on a topic you didn’t pick.

This month’s topic is presented by my friend Kenneth Fisher (b/t), who has challenged us to go #EntryLevel and talk about backups.  So I did…in a video just posted to YouTube:

You’ll want to go full screen to see what’s going on.  This video is targeted at non-DBAs, but give it a watch anyway…I can take the heckling!

Filed Under: Accidental DBA, backup, TSQL2sday

