
DallasDBAs.com

SQL Server Database Consulting



SQL Server in the Cloud – Are DBAs Still Needed?

June 18, 2025 by Kevin3NF

Things your cloud vendor may not tell you

 

Here’s a common theme I hear from small IT teams:

“Our SQL Server is in the cloud now. We don’t need a DBA.”

Not quite.

When you’re running SQL Server on cloud virtual machines like Azure SQL VM or AWS EC2, you’re still responsible for most of the same challenges you faced on-prem. The difference? Now you’re paying cloud rates for the same headaches, though you do shed the hardware and some of the other infrastructure hassles.

Let’s bust some of the top myths.

Cloud VM Myths That Can Hurt Your SQL Server

Myth 1: The Cloud Auto-Scales SQL Server for Me

  • Cloud VMs don’t automatically adjust CPU, memory, or storage. If you don’t right-size your SQL Server, you may be paying too much or running into performance bottlenecks.
  • Even with the right mix of vCPUs and memory, you may be hitting IOPS/throughput throttling on the disks, at either the VM level or the disk level.
  • The VM size needs to be in the right family as well. For database workloads, look for “Memory Optimized” options, especially for reporting workloads.
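A quick sanity check on right-sizing is to compare the VM you are paying for against what SQL Server actually sees. A minimal sketch (works on SQL Server 2012 and up):

```sql
-- Compare these numbers to the VM size you are paying for
SELECT
    cpu_count,                              -- logical CPUs visible to SQL Server
    physical_memory_kb / 1024 AS memory_mb, -- memory visible to SQL Server
    max_workers_count                       -- worker thread ceiling
FROM sys.dm_os_sys_info;
```

If SQL Server sees far more CPU than it ever uses, you may be a size (or a family) too big.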

 

Myth 2: Cloud Patching Means I’m Always Safe

  • Azure and AWS offer patching options for the O/S, but they don’t know your business hours or your workloads. A patch could restart your SQL Server at the worst possible time.
  • Azure (and AWS?) will not auto-patch your SQL Server unless you turn it on via the SQL Server IaaS Agent extension. Is this really a good idea for your system? Are patches automatically supported by your software vendors?
    • The IaaS extension will only apply “Important or Critical” patches.
  • As always, test all patches before applying to production.
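Not sure what patch level you are actually running? The instance will tell you (ProductUpdateLevel returns NULL on older, unpatched builds):

```sql
-- Version, service pack, and cumulative update level
SELECT
    SERVERPROPERTY('ProductVersion')     AS ProductVersion,     -- e.g. 13.0.5026.0
    SERVERPROPERTY('ProductLevel')       AS ProductLevel,       -- RTM or SPn
    SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel; -- CUn, or NULL
```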

 

Myth 3: Backups are Handled Automatically in the Cloud

  • Cloud VMs give you the option to back up SQL Server. They don’t configure backups or validate them for your RTO/RPO needs.
  • A DBA ensures backups run properly, are tested, and can meet your recovery goals.
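Trust, but verify. One quick way to see whether anything is actually backing up your databases (msdb records native SQL Server backups, including most third-party tools that call the native API):

```sql
-- Most recent full backup per database
SELECT
    d.name AS database_name,
    MAX(b.backup_finish_date) AS last_full_backup  -- NULL = never backed up
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name
      AND b.type = 'D'           -- D = full database backup
WHERE d.name <> 'tempdb'
GROUP BY d.name
ORDER BY last_full_backup;
```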

 

Myth 4: The Cloud Handles SQL Performance

  • The cloud gives you CPU, memory, and storage, but it doesn’t tune your indexes, optimize queries, or troubleshoot blocking.
    • Some level of auto-tuning exists in the PaaS offering (Azure SQL DB), but I’ve seen it go hilariously wrong.
  • A DBA does that. Without them, you may see slow performance over time, higher costs, and user/customer frustration.
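If users are complaining right now, a first-five-minutes check is to look for live blocking:

```sql
-- Sessions that are currently blocked, and who is blocking them
SELECT
    r.session_id,
    r.blocking_session_id,
    r.wait_type,
    r.wait_time AS wait_time_ms,
    t.text      AS running_sql
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.blocking_session_id <> 0;
```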

 

Myth 5: SQL Server Licensing is Simpler in the Cloud

  • Simple? Maybe. Cheaper? Probably not, even if you get it right.
  • Cloud billing for SQL Server licensing can be complex, and mistakes add up fast.
  • Overpaying for cores “just in case” is a silly way to spend. Start smaller and add resources, better disks, etc. as you determine your actual workload.
  • This is one of the key reasons the Memory Optimized VMs exist…VMs are priced based more on vCPU count than anything else. Get that core count down by having a DBA tune your system periodically.

 

The Bottom Line:

SQL Server in the cloud can be a game changer. Wandering into the vendor’s portal or dashboard and blindly picking “something” is not the best approach. Get input from independent cloud experts and have a DBA review things from time to time.




 

SQL TidBits:

Check your SQL Server uptime:

SELECT 
   sqlserver_start_time 
FROM 
   sys.dm_os_sys_info;
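Or, if you prefer the answer as a number of days:

```sql
SELECT
   sqlserver_start_time,
   DATEDIFF(DAY, sqlserver_start_time, SYSDATETIME()) AS days_up
FROM
   sys.dm_os_sys_info;
```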

Thanks for reading!

–Kevin

Filed Under: AWS, Azure, Cloud, SQL Tagged With: syndicated

SQL Server backups to Azure Blob storage

January 16, 2018 by Kevin3NF

My Pluralsight course for new SQL Server DBAs

 

This is an intro-level post, written specifically for the new and accidental DBAs who have been told to direct their SQL Server backups to Azure storage…but without any additional information.  It does not include discussions of backup types, recovery models, etc.  Those are all great topics that are very well documented elsewhere.  It is also useful for veteran DBAs (like me) who suddenly have to sort out cloud technology.

Enough of that…

If you like to watch videos to learn new technology, I offer this.  Otherwise scroll on down for text and screenshots, with code samples:

 

Basic terminology:

  • Azure – Microsoft’s Cloud offering. We aren’t talking about AWS or Google today.
  • Subscription – an account to create and use Azure resources
  • Resource Group – a logical container for grouping like-purpose resources
  • Resource – anything in Azure you can create, such as SQL Databases, Virtual Machines, Storage accounts, etc.
  • Storage account – loosely similar to a Fileshare…created to hold a specific type of item
  • Container – a sub-group within a storage account – roughly similar to a folder on a fileshare
  • Access key – a code you need in order to access a Storage account
  • URL – the location specific to the container you have created
  • SQL Credential – created at the SQL instance – uses the access key to allow backups to go into containers. The credential is specified in a backup or restore statement.

What we are going to do:

  • Create a storage account of Kind “Storage”
  • Create a container – “Blob Service”
  • Create a SQL Server credential
  • Create a new SQL Database on my 2016 instance
  • Backup the database to the new container
  • Restore that backup to a new database name
  • All of this in T-SQL, because the SSMS GUI is cumbersome

Assumptions:

  • You already have an Azure account/subscription
  • You already have a test SQL instance, SQL 2012, SP1, CU2 or higher.
  • You are an amazing and wonderful person
  • You know the basics of SQL Server backups and recovery models
  • You are going to follow me on Twitter
  • You are interested in being even more awesome at your job than you already are

Start with the Azure part:

Log into your subscription and click Storage accounts on the left side, then click +Add:

Select the properties for your new storage account.

Account Kind: Storage or StorageV2, not Blob Storage

Performance: Standard vs. Premium is spinning disks vs. SSD…Standard for testing or infrequently accessed files.

Replication: research and choose wisely

Resource group: where will this storage account go?  Just a logical container

Location: probably a good idea to keep it fairly close to the SQL Servers you are backing up for performance reasons…but you may want to send it farther away…your call.

Click Create and wait for the notification to let you know the account exists.

Go back to the storage accounts blade and refresh to see your new account.  Click it.

Verify the settings you chose are there (highlights below):

Click on ‘Access Keys’ to see the two auto-generated Keys and Connection strings for accessing this storage account.  Do not give these out freely…only to those with a need for them.  If they get out, you can regenerate either one or both pairs.   You will need one of the 2 keys when we create a SQL Server Credential later on.

Click ‘Containers’ under Blob Service

Click + Container and give it a name.  Names can contain only lowercase letters, numbers, and hyphens.  If you enter an illegal name it will not let you continue.  I use the SQL instance name here, just as I would for a regular folder on a fileshare.  Also, choose Private for the Public Access Level.  Click OK to create the container.

Enter the name here:

The deployment of the container should be very quick.  Click on its name to open it:

Note the “no blobs found” in the container.  After a successful backup, you will see it here.

Click on ‘Container Properties’ to get the URL for this specific container…this will be used in Backup and Restore statements.  Click the button next to the URL to copy it.  For now just remember where this is or copy it to Notepad, Query window etc.  When we start to build our T-SQL statements, we will need both the Access key from earlier and the URL.

At this point, you have an Azure Subscription, with a Standard Storage Account, that has a Blob Container in it.  For now, that is all we will do in the portal, but leave your browser open for copying the Key and URL, as well as refreshing to see results of the Backup command.

Now for the SQL Server part:

Open SQL Server Management Studio.   I am using 16.x despite 17.4 being out as of this writing, mostly because I upgrade slowly and don’t like the icons of the 17.x releases.  I am using SQL Server 2016 Developer Edition on a Windows 10 Pro Dell Precision laptop.

Create a SQL Server Credential

Note that I have no Credentials at this time:

Go to your portal, and copy one of the Access Keys associated with your new Storage Account.


USE master
GO
CREATE CREDENTIAL SQLBackups --give this a meaningful name
--storage account name:
WITH IDENTITY='sqlbackups12345',
--storage account key from portal:
SECRET = 'pvv99UFQvuLadBEb7ClZhRsf9zE8/OA9B9E2ZV2kuoDXu7hy0YA5OTgr89tEAqZygH+3ckJQzk8a4+mpmjN7Lg=='
GO

Run this in a Query Window, using the Storage Account name and Key from your subscription.

Refresh the Credentials in SSMS and verify the new one was created:

If you don’t already have a database you want to back up (you should be doing this on a test system…), create one:


-- Create a test database
-- minimal options for demo...don't create
-- your databases like this

Create Database Test

 

Now, the whole point of all this…create a Backup Database command:

The ‘TO URL’ replaces the ‘TO DISK’ you are used to.   It includes the URL from the portal for the container as well as the name of the file you are creating.   Also, the WITH CREDENTIAL is new:


--back it up to Azure
--get URL from portal, add database name-date to the end of the URL

BACKUP DATABASE Test
TO URL = N'https://sqlbackups12345.blob.core.windows.net/kbh-precision-2016/Test_20180114_1038am.bak'
WITH credential = 'SQLBackups';
GO

-- go see the file in the portal

If your Backup created successfully, go to the Container in the Portal and refresh:

To RESTORE the database from the Blob backup you just created, use the same URL and Credential:


-- Restore the DB to a new DB:
--use the same URL as above
-- WITH Moves to new file names

RESTORE DATABASE Test_restored --new database name
FROM URL = 'https://sqlbackups12345.blob.core.windows.net/kbh-precision-2016/Test_20180114_1038am.bak'
WITH CREDENTIAL = 'SQLBackups',
Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA\Test_Restored.mdf',
Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL13.SQL2016\MSSQL\DATA\Test_Restored.ldf'
;
GO

And refresh the SSMS Database list:

That’s it…the basics and minimums.  You can of course add other normal Options, such as STATS and COMPRESSION.
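For example, the same backup with both of those options added (same demo URL and credential as above; the file name is just a later variation):

```sql
BACKUP DATABASE Test
TO URL = N'https://sqlbackups12345.blob.core.windows.net/kbh-precision-2016/Test_20180114_1100am.bak'
WITH CREDENTIAL = 'SQLBackups',
     COMPRESSION,  -- smaller file, less time on the wire
     STATS = 10;   -- progress message every 10 percent
GO
```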

Edit…when I wrote this, I was not aware of a decent way to clear out files beyond a certain number of days.

Messing around with a PowerShell script I found, I got this:


#Script to delete backup files

# Set variables
$container = "yourcontainername"
$StorageAccountName = "YourStorageAccountName"
$StorageAccountKey = "YourStorageAccountAccessKey"
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$retentiondate = Get-Date

# Read the list of files from the container
$filelist = Get-AzureStorageBlob -Container $container -Context $context #-Blob *trn

# Delete anything last modified more than 7 days ago (blob timestamps are UTC)
foreach ($file in $filelist | Where-Object { $_.LastModified.DateTime -lt ($retentiondate.ToUniversalTime().AddDays(-7)) })
{
    $removefile = $file.Name
    if ($null -ne $removefile)
    {
        Write-Host "Removing file $removefile"
        Remove-AzureStorageBlob -Blob $removefile -Container $container -Context $context
    }
}

Works on a VERY limited test on my system.  You can change the retention in the AddDays.  I had to add .ToUniversalTime to mine as PowerShell was reading the blobs as UTC/zulu and not deleting based on US Central.

If you have any other questions, feel free to ask in the comments.  If you get specific errors in the BACKUP or RESTORE commands, heading off to Google first is your fastest choice.

Edit 2: Don’t use the VERIFY option if you are backing up to Azure directly…read this on why that can be an expensive option

Thanks for reading!

Kevin3NF

My Pluralsight course for new SQL Server DBAs


Filed Under: Azure, backup, Beginner, EntryLevel

Are IT Certifications worth it?

September 6, 2017 by Kevin3NF

I’ve been in IT for 2 decades.   I got my MCSE in 1999.   Via a BootCamp training class that Microsoft paid my way into.

I was the walking definition of a “paper MCSE” – all knowledge and no experience.   At the time the MCSE was Windows NT 4.0, IIS 4 and some other elective, along with some networking tests.  I really don’t remember, other than TCP/IP which I had to take, fail, study and take again.

When I contracted at Microsoft (the second time) back in 2005, there were 20+ contractors all there getting ramped up for the SQL 2005 release.   Technically they were ramping up for SQL 7 and 2000, as the Full Timers were doing 2005 support. Most of them had the MCDBA certification…many hanging in their cubes.  Not one of them could admin their way out of a paper bag.   Nice people, but no experience.

I decided at that point that I would not be pursuing any more certifications, as these guys were making it look bad and watering the value down.   Certification tests are expensive, when you consider study time, prep materials, practice tests and exam fees.  I decided I would just learn the SQL Server things I needed to know and that were in wide use (skipped right on by Notification Services!).  This has served me well for the 10+ years since that contract at MS ended.

Fast forward to last month when I decided to quit being an old dog and go learn some new tricks.  Specifically Azure tricks…infrastructure and SQL, in parallel (MAXDOP 2).

I’m working through that.  I’m not ready to even think about the first test yet, because I’m taking my time and actually LEARNING the material, whether it’s on the test or not.  Shocking, eh? 😀  (Pluralsight is amazing for this, BTW).

The thing that triggered this blog came from a LinkedIn post this morning that caught the ire of Adam Machanic (w|t), as well as others:

Hello Folks,

Can someone please send me MCSA 70-764 and 70-765 exam dumps in PDF format.

My Email ID is: redacted@whatever.com

Thanks in advance..

Most of the responses were along the lines of “why not just study the material, you cheater?”

The original poster claimed this:

You are right . in my past I worked on old SQL version as of now working with new versions and everything is automation over here …hence just for reference I need them …

I call shenanigans.

OP is going for the “memorize, test, apply” method of certifications.   Maybe not apply for a new job. Maybe just to keep one.  Who knows.   But still shenanigans….and in .PDF format if you please.

As you might expect, this spilled over to Twitter, as many things do.   It was met there with a healthy dose of venom for those that shortcut the process, as well as places that issue fake certifications (I didn’t know that was a thing!).

For fun, I contacted my friend Gregory Knight at GTN Technical Staffing and Consulting (my first independent customer, way back in 2001) for some insider info on the other side of the recruiting and client pipelines.

My Question:

“Hey…curious how much recruiters and clients actually value Microsoft certifications these days, if at all…other than for resume filtering…thoughts?”

Gregory sent this wide internally at GTN and got these responses back:

I do. It tells me the following:

  • How much extra work a human has put into his/her career.
  • Suggests drive, love of the tech, going the extra mile
  • Low-level cert only or many unfinished certs (CCIE with no lab) and/or never finished college suggests lack of follow through.
  • Certs all over the place, e.g. MuleSoft, CCNA, MSCA suggests lack of career focus.
  • Date of completion matters. It tells me what the person was doing at a particular point of time.
  • E.g. an infrastructure/network candidate passing himself off as a network expert but has lots of Microsoft certs tells me the person likely leans more server side

It really depends on how much our individual client values it. MS Certs tend to hold value and do demonstrate a general mastery of the technology. With that said, clients that do not require it tend not to be impressed or care enough about it to make a difference in choosing to select, interview, or hire a particular candidate.

I view certifications, any certifications, like a bachelor’s degree. It does not open the door for a candidate, but it can prevent the door from being slammed shut in their face…


Real world experience > certificates any time unless it’s Cisco.


Production > Lab


So…this settles nothing of course as far as whether you should pursue an IT cert or not.   If you do it the right way, you’ll learn things and have something to show for it.  If you shortcut the process, you’ll be found out when the time comes and others are depending on you to know what you are doing.

I’d love to hear your thoughts in the comments section.  Any comments bashing people (candidates, recruiters or clients) will not get approved.   Thoughtful, considerate discussion?  Yes please 🙂

Thanks for reading!

Kevin3NF

Filed Under: Azure, Career, SQL, Training

Migrate multiple SQL databases to Azure SQLDB

August 1, 2017 by Kevin3NF

I ran across a forum post asking about moving multiple databases to Azure, so of course I Googled the asker’s question and found:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-migrate-your-sql-server-database

Since I might be doing a similar thing in the near future, I decided to see if I could push 4-5 small test databases to Azure SQLDB at the same time (SQL on Azure VM test may come later).  If this happens with my client it will be hundreds of databases, with the same schema but wildly varying sizes.

The “Plan”

  • Create 5 sample databases
  • Create .bacpacs on local drive
  • Create Azure Logical Server if not already done (see step 4)
  • Create a Notepad of SQLPackage.exe commands,  one for each .bacpac
  • Run with “start” in CMD to fire off separate windows
  • Wait.
  • Enjoy.

I took the easy way out and created my 5 databases by creating 1 test database, backing it up and then restoring to Test2 – 5:

USE [master]
RESTORE DATABASE [AzureSQLDB_Test2] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test2.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test2_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test3] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test3.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test3_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test4] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test4.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test4_log.ldf'
GO

USE [master]
RESTORE DATABASE [AzureSQLDB_Test5] 
FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Backup\AzureSQLDB_Test1\AzureSQLDB_Test1.bak' 
WITH MOVE N'AzureSQLDB_Test1' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test5.mdf',  
	 MOVE N'AzureSQLDB_Test1_log' TO N'C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA\AzureSQLDB_Test5_log.ldf'
GO


Create .bacpac files…basically just zip files of schema and data that Azure SQLDB can read.   You cannot do a traditional RESTORE DATABASE to an Azure logical server.   Right-click the database>>Tasks>>Export Data-Tier Application.  Then work through the GUI:

Azure SQLDB bacpac import

Browse for the location to place the .bacpac.   Remember this, as you will need it shortly.

Azure SQLDB bacpac import

Click Finish to create the file.

At this point, you need to make sure you have an Azure account and active Subscription.  You also need a Storage Account.   The Azure SQLDB lives on an Azure logical server that has its own settings, firewall access rules, etc.   I am not going into those details as they are well documented in multiple places, and I know that your Google-Fu is strong!

SQLPackage.exe is a command line tool that is used to work with data-tier applications.   I’ve used it to extract schemas, do multiple schema compares at once, deploy changes to live databases, etc.   Read the first paragraph of the link…

I created a list of commands, one for each database, that I need to run to move my individual .bacpac files into Azure SQLDBs:

start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test1;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test1.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test2;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test2.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test3;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test3.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test4;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test4.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic
start SqlPackage.exe /a:import /tcs:"Data Source=myTestServer.database.windows.net;Initial Catalog=test5;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL" /sf:"C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test5.bacpac" /p:DatabaseEdition=Basic /p:DatabaseServiceObjective=Basic

Yes, that is a bunch of text for just 5 parameters.  Breaking it down:

  • /a:import
  • — import action to import the data and schema into Azure SQLDB
  • /tcs:”Data Source=myTestServer.database.windows.net;Initial Catalog=test1;User Id=YouCanCallMeAl;Password=LikeImmaLetYouSeeMyPassword_LOL”
  • — Server, Database, UserID and Password to connect to and create
  • /sf:”C:\Users\Kevin3NF\Documents\SQL Server Management Studio\DAC Packages\AzureSQLDB_Test1.bacpac”
  • — Source File that gets imported.  Use double quotes if your location has spaces in the path
  • /p:DatabaseEdition=Basic
  • — Basic, Standard, Premium (cost and performance from low to high)
  • /p:DatabaseServiceObjective=Basic
  • — Lots of options here if you chose Standard or Premium above.  Basic is just Basic.

Run these commands to import the .bacpacs:

Note that in my script I have ‘Start’ at the beginning.  What that does when I paste all 5 lines into a CMD window is fire off a new window that then runs the SqlPackage.exe command.   All 5 .bacpac imports then run simultaneously.   Without that Start, they will run one after the other.  Also, make sure you get the carriage return after the final command (line 6 in my code), or the 5th database will just sit there in the CMD window waiting for you to press Enter.

My screen as they were running:

Azure SQLPackage Import SQL Server

My SSMS after connecting to my Azure logical Server:

SSMS Azure SQL

Note the Table_1 is there just to verify that the one database I actually changed came across properly.

That’s it.  They are there, ready for me to do whatever it is I plan to do with them.

Some things to consider:

  • Bandwidth:  Your internet connection will determine how fast this works (throughput and stability)
  • Local or Azure storage: You can copy the .bacpacs to your storage account and run from there.   Adjust the /sf parameter accordingly
  • Database size: There are limits to Azure SQLDB on size, performance, etc.   Do your research.
  • Database performance tier: the more you want, the more you pay.  Again, do your research.
  • Yes, I’m sure there are a number of other ways to do this, including Powershell options.   This is what I went with, due to the GUI and the simple nature of it.

That’s all I have for today…comments appreciated!

Thanks for reading.

Kevin3NF

Filed Under: Azure, backup, Deployment, SQL

SQL Saturday Dallas (563)

September 27, 2016 by Kevin3NF

I’m 3 days out from my second SQL Saturday event as an attendee and my head is still swimming!

The most memorable things from the sessions I attended…not an exhaustive list, but highlights:

Agile Leadership Skills and Building High Performing Teams – George Govantes:

I went to this one, primarily because it was the first thing in the morning and because I am a team lead for all admins here (3 including me).  Lots of great quotes and reading/viewing references.  This was not about Agile software development, rather more about the difference between a leader and a manager.  (See every third post on LinkedIn for pithy graphics from recruiters on the same).

Takeaways:

  • Monday morning is the most popular day to have a heart attack
  • 70% of people hate their jobs.  I am not one of them.
  • “Your team is a reflection of what you will tolerate.”  Applies to the good and the bad.
  • “Nobody has time to do it right, but everyone has time to do it over.”

Data Masking and Always Encrypted – Sean Werick, PragmaticWorks

What admin doesn’t want to know about new ways to secure data and keep the wrong eyes from seeing it?  Sean did a masterful job of presenting the concepts…not just from a technical perspective, but personality as well.   Engaging, entertaining and he very clearly knew the tech part.  Any time someone threw an “off PowerPoint” question at him, he nailed it.

Takeaways:

  • Data masking
    • No Architecture difference…it’s all in the app, raw data is not changed
    • Complements TDE and other features
    • Role based
    • Sysadmins can ALWAYS see the data
    • Supported by Azure
    • Default Masks for rapid deployment
    • Not a replacement for row-level security and/or encryption
  • Always Encrypted:
    • Data is encrypted at rest, on the fly and in use
    • Deterministic and Randomized – pick the right one for how the data is used (Grouping, joining, etc.)
    • .Net 4.6 required
    • increases storage requirements
    • Encrypt first, then mask.

Deploy SQL Server with Microsoft Azure Virtual Machine – Bala Shankar

I’ve played with Azure SQL DB and SQL VM.  I even blogged about it for TSQL2sday.  But, having never deployed or supported a production Azure anything, how can I not go listen to a MS guy talk about it?  Bala is clearly a brilliant speaker and IT person.  If you go to one of his presentations, bring a seat belt and safety helmet…it will be fast and you don’t want to miss anything!

Takeaways:

  • VMs built from the Azure marketplace bring the licensing costs for Windows and SQL with them.  Bring your own license is an option.
  • Server families:  A&D (V1?) level for Dev/Test, G-series for heavy loads (G for Godzilla). My notes say DS v2 for SQL, but not finding that term in Azure portal at the moment…more research required
  • Premium storage is SSD, with SLA for performance and availability
  • Standard storage is spinning rust, SLA for availability only
  • Connect through Public IP or Virtual network/sql port 1433
  • You can now have your VM Auto-patch and Auto backup the databases (30 day retention, SQL 2014 only). VM>>SQL Server Configuration>>Automated backup
  • So much more…but my pen melted…

Lunch – Fajitas, chips, ice cream later, coffee all day.  Cool.

Hi, my name is Powershell, let’s be friends! Intro to PoSH – Amy Herold

I’ve tried to learn PowerShell off and on a few times, but inevitably I get distracted and forget it all.   I went to this session to hopefully get inspired to pick it back up.  The session was fine and Amy is a very entertaining speaker who clearly knows her stuff.  But…I’m still struggling.  I’ll probably have to hire someone to teach me one on one someday, lol!

Takeaways:

  • Amy is a mean DBA (and proud of it)
  • The Shell and the ISE have their own profiles
  • You can do all sorts of stuff with PoSH.
  • Demos break for everyone, no matter how good you are

Let’s Break Some SQL Server Availability Groups! – John Harp

John is another one of those great speakers that clearly knows his material.   Setting up clusters, AGs, etc. on a laptop for a presentation is no small amount of work.

This was a high-speed, high-energy presentation.   I put the pen down and just watched, trying to pick up a few nuggets as we went along.  John had some things break of course, since we were trying to break some AGs. The ones that broke outside of the plan were fixed/resolved quickly.

Random Acts of Senseless Databasing – Kris Hokanson

I went to this one due to its time slot at the end of the day, my brain being at capacity, and just to hear some crazy things people do.  This blog used to be dedicated to that topic!  Former name – “You want fries with that?”

Single biggest takeaway was to look into sp_executesql instead of Exec() for stored procs.   It may not be perfect for every situation, but it may also save your bacon.   Kris referenced the SQLSkills post on it here.

Funniest part of this was the actual live crickets in the room being the only sound at one point after “Any questions on this?” {crickets}…

Overall:

I had a great time!  The event was well organized and well run.  The presentations started on time and the rooms were easy to find.  A wide variety of topics and well-known speakers make for a must-attend event.  SQLSaturdayKC was happening the same day, and MSIgnite was firing up as well.   It’s a testament to the passion of the SQL Server Community that these did not dilute the speaker pool!

Kudos to the leaders and volunteers for a top-notch event that wore me out 🙂

Kevin3NF

Filed Under: Azure, Encryption, SQL, SQLSaturday

Azure: SQL VM vs SQL Database

September 13, 2016 by Kevin3NF

t-sqltuesday

“The Cloud” edition

 

Azure SQL Database vs. Azure SQL Virtual Machine

 

In keeping with my “this blog is for the beginner DBAs” style, I’m going to keep this short and simple.   I will also confess that I have only played with the cloud, and only Microsoft Azure, so you won’t see anything about AWS or any other cloud provider.

Some basic terminology:

  • Cloud: No such thing. It is just your stuff on someone else's machines that they maintain for you.
  • Virtual Machine (VM): A virtual server running on physical servers…yours, or someone else's.
  • Azure: Fancy name for Microsoft's cloud. As a noun or an adjective it means "blue". Or a small butterfly.
  • Azure SQL Database: Just a database in Azure on some storage.
  • Azure Virtual Machine: A VM on Microsoft's Azure servers; you do not have to maintain the underlying physical infrastructure.

When the boss comes to you and wants to look at "moving SQL to the cloud", make sure you are comfortable with all of the above terms so you are on the same page, and can tell when he is misusing some buzzwords.

Let’s assume a very basic scenario:  You have one database that your company website uses to sell its widgets.   It needs to be up pretty much 24×7, and your building has occasional internet connectivity issues that are out of your control.  Enough so that you are losing sales and customer confidence.  Your traffic is low, and your business is small so every sale is important.

You basically have three choices:

  • Move to a colocation provider and manage it all yourself
  • Move to a managed hosting solution such as Rackspace
  • Move to the cloud

For this conversation, let's assume the first two choices are ruled out due to cost. For Azure pricing, there is a handy calculator here.

Virtual machines are basically priced based on resource quantity (CPU/RAM/storage) combined with the number of hours per month the VM is turned on (on/off is via the Azure portal, not shutdown from the VM desktop). You can get an 8-core, 56GB RAM, 605GB storage VM for $1.32/hr., or $982.08/month.

Azure SQL Databases are priced based on DTUs × the number of hours per month. A single database with 250 DTUs and 500GB of storage runs $1.25/hr., or $930/mo.

Note: Pricing can and does change, so this is based on prices in the calculator as of 9/10/2016.  Also, you can change service tiers as necessary at any time.  Ramp up for seasonality, down for off-season, etc.
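For reference, the monthly figures above are just the hourly rate times a full month of uptime (31 days × 24 hours = 744 hours), which you can sanity-check yourself:

```sql
-- Hourly rate x hours in a 31-day month (31 * 24 = 744)
SELECT 1.32 * 744 AS VMMonthlyUSD,     -- 982.08
       1.25 * 744 AS SqlDbMonthlyUSD;  -- 930.00
```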

Pros and cons:

Virtual Machine:

  • Pretty much the same as a virtual or physical machine in your own network.
  • You worry about the O/S in the VM; Microsoft takes care of the underlying infrastructure
  • Spin one up in minutes with SQL Server already installed
  • SSMS, database management, etc. are all the same as your local (on-premises) servers. Use the SSMS GUI as you always have, or T-SQL
  • Backups are your responsibility and under your control (can be automated via the portal)
  • You patch the O/S and SQL as you see fit (automation is available through the portal)


SSMS and Services from my Azure SQL 2016 VM on Windows 2012 R2
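Since backups on a VM are your responsibility, they look exactly like they do on-premises. A minimal sketch, backing up straight to Azure blob storage (the database name, storage account, and container URL below are placeholders, and it assumes a SAS credential for the container already exists):

```sql
-- Assumes SQL Server 2016+ and an existing SAS credential named
-- for the container URL; all names below are placeholders.
BACKUP DATABASE WidgetSales
TO URL = N'https://mystorageacct.blob.core.windows.net/backups/WidgetSales.bak'
WITH COMPRESSION, CHECKSUM;
```

You could just as easily back up to an attached data disk; the point is that nothing runs unless you schedule it.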

Azure SQL database:

  • Connectivity is through connection strings for SSMS and other applications. Strings are available through the Azure portal
  • Patching is automated under the covers by Microsoft…existing connections may be reset when patches happen. You have no say in this.
  • Backups are completely automated
  • Can be created in just minutes (assuming storage accounts, etc. are already configured)
  • Some commands are only available through T-SQL
  • Geo-replication for DR in a few clicks (the DR server needs to be created, of course)
  • TDE in two clicks


SQLDB from SSMS on my laptop, with Properties and Create User template…
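Once you are connected, you can verify some of these bullets from T-SQL; for example, checking that those two TDE clicks actually took effect (the database name is a placeholder):

```sql
-- is_encrypted: 1 = TDE enabled for the database, 0 = not encrypted
SELECT name, is_encrypted
FROM sys.databases
WHERE name = N'WidgetSales';
```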

This is not a comprehensive list of every feature available, nor is it intended to be.  Microsoft is adding more and more to the Azure SQL products all the time, so no list will really be able to keep up.

Rather, the goal here is to give the junior DBA a reference point for the major differences between SQL Server in an Azure VM vs. an Azure SQL DB.

The VM is most likely the more familiar choice for those new to cloud computing, but each company will have to decide which of the two is optimal based on its workloads and requirements.

Hope this helps clarify some things for you!  Please ask questions in the comments and I will answer what I can, as long as you remember the disclaimer at the top 🙂

Kevin3NF

The OnPurpose DBA

 

Filed Under: Azure, SQL, TSQL2sday
