
DallasDBAs.com

SQL Server Database Consulting


Category: backup

SQL Server Backups are Not a High-Availability Solution

October 17, 2023 by Kevin3NF

Please continue doing your backups!

Backups are Disaster Recovery, yes…but not HA.

Some will argue with this (in the comments most likely), but I broadly define “High Availability” as a system that can recover in seconds or minutes at most. Sometimes that is automatic, sometimes manual.

Backups might be quick to restore IF they are small enough and the right people are in place (not at lunch or otherwise out). But automated restores to prod just aren’t a thing.

SQL Server has this cute little marketing term called “Always On” which is nonsense. Always? Really? 9 Nines?

Always On covers both Failover Cluster Instances and Availability Groups. There are significant differences between the two. Both depend on the O/S Cluster…and they diverge a LOT from there. They are both HA.

Log Shipping (ancient tech) is great for DR and hitting your RPO number, but failover to a log shipped copy is manual.

Replication is not and never will be an HA or DR solution. Some things cannot be replicated, so they are lost if the publication database goes poof.

There are of course things outside of SQL Server that can help you hit your RPO/RTO goals. Feel free to share them.

What are you using for your HA solution?

Thanks for reading!

Kevin3NF


Filed Under: backup, HADR, SQL Tagged With: syndicated

Test Restore Your SQL Server Databases

July 25, 2023 by Kevin3NF

Pain Point: Something bad happened and you need to restore a SQL Server database.

Pain Point you didn’t know you had: The backup files are all corrupt due to a problem with the storage subsystem.

Solution: Do test restores with one line of code from DBATools.io

Set-DbatoolsInsecureConnection -SessionOnly 

Test-DbaLastBackup -SqlInstance Precision7770-1

You will need a modern version of PowerShell and the DBATools module. Also a destination for the test restores to land.

This video is not sponsored by Sean or Jen from MinionWare; they are just really nice people selling a SQL Server maintenance solution for large enterprises 🙂

Thanks for reading!

Kevin3NF


Filed Under: backup, Dallas DBAs, Restore, SQL Tagged With: DBATools.io, syndicated

Do Full Backups Break Log Shipping?

October 15, 2021 by Kevin3NF

TLDR: Nope.

[Image: a tugboat pushing logs down a river. Image by jakeforlove from Pixabay]

Keep on doing your full backups.

Make sure that any databases you Log Ship are NOT also doing log backups in your SQL Maintenance Plans, Ola Jobs, etc.

The backup chain will not be broken by running a full backup, and you do not need to use COPY_ONLY.
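If you want to see this for yourself on a test system, msdb records every backup it takes. Here is a hedged sketch of a query against msdb.dbo.backupset (the database name is a placeholder) that shows the log chain LSNs continuing right past a full backup:

-- List backups for the log shipped database in order
-- type: D = full, I = differential, L = log
-- Each log backup's first_lsn should match the previous log backup's last_lsn,
-- even when a full backup shows up between them
SELECT database_name,
       backup_start_date,
       type,
       first_lsn,
       last_lsn
FROM msdb.dbo.backupset
WHERE database_name = 'MyDatabase'   -- placeholder name
ORDER BY backup_start_date;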

Proof:

Thanks for reading!

Kevin3NF


Filed Under: backup, SQL Tagged With: syndicated, video

T-SQL Tuesday 120: What Were You Thinking?

November 12, 2019 by Kevin3NF

T-SQL Tuesday is a monthly blog party for the SQL Server community. It is the brainchild of Adam Machanic (b|t) and this month’s edition is hosted by Wayne Sheffield (b|t) who has asked us to write about “What Were You Thinking?” – things we have seen that left us scratching our collective noggins.

The list is long, so I’ll just go with something I saw yesterday.  And in September. Both created by a full-time DBA at a client.

BACKUP LOG MyDatabase
TO DISK = 'E:\SQLBackups\MyDatabase.trn'
WITH INIT

Yep.  It runs hourly.  And initializes the .trn file each time. Goodbye Point in Time recovery.

Ick.
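For the record, here is a hedged sketch of how that hourly job could be written instead, so each run lands in its own timestamped file and point-in-time recovery survives (the path and naming convention are just examples):

-- Build a unique file name per run instead of re-initializing one file
DECLARE @file nvarchar(260) =
    N'E:\SQLBackups\MyDatabase_'
    + CONVERT(nvarchar(8), GETDATE(), 112)                               -- yyyymmdd
    + N'_' + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), ':', '')      -- hhmmss
    + N'.trn';

BACKUP LOG MyDatabase
TO DISK = @file;   -- no INIT, nothing gets overwritten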

Bonus points: Duplicate indexes, which I will cover in a future post.

Thanks for reading!

Kevin3NF


Filed Under: backup, TSQL2sday, Uncategorized

SQL Server backups to Azure Blob storage

January 16, 2018 by Kevin3NF

My Pluralsight course for new SQL Server DBAs

 

This is an intro-level post, specifically written for the new and accidental DBAs who have been told to direct their SQL Server backups to Azure storage…but without any additional information.  It does not include discussions of backup types, recovery models, etc.  Those are all great topics that are very well documented elsewhere.  It is also useful for veteran DBAs (like me) who suddenly have to sort out cloud technology.

Enough of that…

If you like to watch videos to learn new technology, I offer this.  Otherwise scroll on down for text and screenshots, with code samples:

 

Basic terminology:

  • Azure – Microsoft’s Cloud offering. We aren’t talking about AWS or Google today.
  • Subscription – an account to create and use Azure resources
  • Resource Group – a logical container for grouping like-purpose resources
  • Resource – anything in Azure you can create, such as SQL Databases, Virtual Machines, Storage accounts, etc.
  • Storage account – loosely similar to a fileshare…created to hold a specific type of item
  • Container – a sub-group within a storage account – roughly similar to a folder on a fileshare
  • Access key – a code you need in order to access a Storage account
  • URL – the location specific to the container you have created
  • SQL Credential – created at the SQL instance – uses the access key to allow backups to go into containers. The credential is specified in a backup or restore statement.

What we are going to do:

  • Create a storage account of Kind “Storage”
  • Create a container – “Blob Service”
  • Create a SQL Server credential
  • Create a new SQL Database on my 2016 instance
  • Backup the database to the new container
  • Restore that backup to a new database name
  • All of this in T-SQL, because the SSMS GUI is cumbersome

Assumptions:

  • You already have an Azure account/subscription
  • You already have a test SQL instance, SQL Server 2012 SP1 CU2 or higher (the minimum for backup to URL)
  • You are an amazing and wonderful person
  • You know the basics of SQL Server backups and recovery models
  • You are going to follow me on Twitter
  • You are interested in being even more awesome at your job than you already are

Start with the Azure part:

Log into your subscription and click Storage accounts on the left side, then click +Add:

Select the properties for your new storage account.

Account Kind: Storage or StorageV2, not Blob Storage

Performance: Standard vs. Premium is spinning disks vs. SSD…Standard for testing or infrequently accessed files.

Replication: research and choose wisely

Resource group: where will this storage account go?  Just a logical container

Location: probably a good idea to keep it fairly close to the SQL Servers you are backing up for performance reasons…but you may want to send it farther away…your call.

Click Create and wait for the notification to let you know the account exists.

Go back to the storage accounts blade and refresh to see your new account.  Click it.

Verify the settings you chose are there (highlights below):

Click on ‘Access Keys’ to see the two auto-generated Keys and Connection strings for accessing this storage account.  Do not give these out freely…only to those with a need for them.  If they get out, you can regenerate either one or both pairs.   You will need one of the 2 keys when we create a SQL Server Credential later on.

Click ‘Containers’ under Blob Service

Click + Container and give it a name.  Names can only be lowercase letters, numbers and hyphens.  If you enter an illegal name it will not let you continue.  I use the SQL instance name here, just as I would for a regular folder on a fileshare.  Also, choose Private for the Public Access Level.  Click OK to create the container.

Enter the name here:

The deployment of the container should be very quick.  Click on its name to open it:

Note the “no blobs found” in the container.  After a successful backup, you will see it here.

Click on ‘Container Properties’ to get the URL for this specific container…this will be used in Backup and Restore statements.  Click the button next to the URL to copy it.  For now just remember where this is or copy it to Notepad, Query window etc.  When we start to build our T-SQL statements, we will need both the Access key from earlier and the URL.

At this point, you have an Azure Subscription, with a Standard Storage Account, that has a Blob Container in it.  For now, that is all we will do in the portal, but leave your browser open for copying the Key and URL, as well as refreshing to see results of the Backup command.

Now for the SQL Server part:

Open SQL Server Management Studio.   I am using 16.x despite 17.4 being out as of this writing, mostly because I upgrade slowly and don’t like the icons of the 17.x releases.  I am using SQL Server 2016 Developer Edition on a Windows 10 Pro Dell Precision laptop.

Create a SQL Server Credential

Note that I have no Credentials at this time:

Go to your portal, and copy one of the Access Keys associated with your new Storage Account.


USE master
GO
CREATE CREDENTIAL SQLBackups --give this a meaningful name
--storage account name:
WITH IDENTITY='sqlbackups12345',
--storage account key from portal:
SECRET = 'pvv99UFQvuLadBEb7ClZhRsf9zE8/OA9B9E2ZV2kuoDXu7hy0YA5OTgr89tEAqZygH+3ckJQzk8a4+mpmjN7Lg=='
GO

Run this in a Query Window, using the Storage Account name and Key from your subscription.

Refresh the Credentials in SSMS and verify the new one was created:
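If you would rather check from a query window than Object Explorer, a minimal sketch of the same check is to query sys.credentials:

-- The IDENTITY you supplied (the storage account name) shows up as credential_identity
SELECT name, credential_identity, create_date
FROM sys.credentials;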

If you don’t already have a database you want to back up (you should be doing this on a test system…), create one:


-- Create a test database
-- minimal options for demo...don't create
-- your databases like this

Create Database Test

 

Now, the whole point of all this…create a Backup Database command:

The ‘TO URL’ replaces the ‘TO DISK’ you are used to.   It includes the URL from the portal for the container as well as the name of the file you are creating.   Also, the WITH CREDENTIAL is new:


--back it up to Azure
--get URL from portal, add database name-date to the end of the URL

BACKUP DATABASE Test
TO URL = N'https://sqlbackups12345.blob.core.windows.net/kbh-precision-2016/Test_20180114_1038am.bak'
WITH credential = 'SQLBackups';
GO

-- go see the file in the portal

If your backup completed successfully, go to the Container in the Portal and refresh:

To RESTORE the database from the Blob backup you just created, use the same URL and Credential:


-- Restore the DB to a new DB:
--use the same URL as above
-- WITH Moves to new file names

RESTORE DATABASE Test_restored --new database name
FROM URL = 'https://sqlbackups12345.blob.core.windows.net/kbh-precision-2016/Test_20180114_1038am.bak'
WITH CREDENTIAL = 'SQLBackups',
Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA\Test_Restored.mdf',
Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA\Test_Restored.ldf'
;
GO

And refresh the SSMS Database list:

That’s it…the basics and minimums.  You can of course add other normal Options, such as STATS and COMPRESSION.
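For example, here is a hedged variation of the earlier backup statement with both of those options added (same placeholder storage account and credential as above, new file name):

BACKUP DATABASE Test
TO URL = N'https://sqlbackups12345.blob.core.windows.net/kbh-precision-2016/Test_20180114_1100am.bak'
WITH CREDENTIAL = 'SQLBackups',
COMPRESSION,    -- smaller blob, less data sent over the wire
STATS = 10;     -- progress message every 10 percent
GO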

Edit…when I wrote this, I was not aware of a decent way to clear out files beyond a certain number of days.

Messing around with a PowerShell script I found, I got this:


#Script to delete backup files

# Set variables
$container = "yourcontainername"
$StorageAccountName = "YourStorageAccountName"
$StorageAccountKey = "YourStorageAccountAccessKey"
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$retentiondate = Get-Date

# read the list of files from the container
$filelist = Get-AzureStorageBlob -Container $container -Context $context #-Blob *trn

foreach ($file in $filelist | Where-Object { $_.LastModified.DateTime -lt $retentiondate.ToUniversalTime().AddDays(-7) })
{
    $removefile = $file.Name
    if ($removefile -ne $null)
    {
        Write-Host "Removing file $removefile"
        Remove-AzureStorageBlob -Blob $removefile -Container $container -Context $context
    }
}

Works in a VERY limited test on my system.  You can change the retention in the AddDays call.  I had to add .ToUniversalTime() to mine because PowerShell was reading the blob timestamps as UTC and not deleting based on US Central time.

If you have any other questions, feel free to ask in the comments.  If you get specific errors in the BACKUP or RESTORE commands, heading off to Google first is your fastest choice.

Edit 2: Don’t use the VERIFY option if you are backing up to Azure directly…read this for why that can be an expensive option.

Thanks for reading!

Kevin3NF

My Pluralsight course for new SQL Server DBAs


Filed Under: Azure, backup, Beginner, EntryLevel

Installing the ‘Ola Scripts’…quick and easy database maintenance

December 12, 2017 by Kevin3NF

I was recently in a conversation about the best way to set up maintenance (backups, integrity checks, indexes and stats) for a group of SQL Servers, with minimal hassle and easy deployment to new servers.

The factors that came into play on this were:

  • Supportability
  • Cost
  • Ease of use

We discussed the following options (I was not talking to a DBA, but to a SQL Developer):

  • SQL Server Maintenance Plans
  • Ola Hallengren’s scripts
  • 3rd party products from Red Gate, Minion, etc.
  • Custom scripts

For this customer, in this environment, I decided to recommend Ola’s scripts.  The primary drivers were ease of installation and the amazing free support from the hundreds (thousands?) of DBAs that know and love them.  Myself included.
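Once they are installed, running maintenance mostly comes down to calling Ola's stored procedures from Agent jobs. A hedged sketch of a typical full backup call (the parameter values are just examples; ola.hallengren.com documents the full list):

-- Full backups of all user databases, compressed, checksummed,
-- verified, and cleaned up after 7 days (168 hours)
EXECUTE dbo.DatabaseBackup
@Databases = 'USER_DATABASES',
@Directory = 'E:\SQLBackups',
@BackupType = 'FULL',
@Compress = 'Y',
@CheckSum = 'Y',
@Verify = 'Y',
@CleanupTime = 168;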

But I still have to prove my point to this client no matter what I recommend…so I made this video.

Enjoy:

Skip to 5:00 if you already have SQL Server installed…that first bit is just to show this on a clean instance 🙂

If you have any questions, feel free to comment on the video, or if you need specific help hit up #sqlhelp on Twitter.

Thanks for reading and watching!

Kevin3NF

Filed Under: backup, Configuration, Deployment

