
DallasDBAs.com

SQL Server Database Consulting


Yet another SSPI post…

September 1, 2016 by Kevin3NF

There are hundreds of posts floating around about SSPI failures when trying to authenticate to SQL Server, so I thought I would add another one!

Why?

Because this one was new to me…and 10 years ago I was on the security team at Microsoft support that dealt with SSPI errors.

We have a corporate policy here that service account passwords expire after a certain time frame, and yesterday morning was the designated window to change them and update the SQL Server services.

Being cautious, we started with just one.

  • Changed the password.
  • Tried and failed to log on to the server with that account (which should have worked).
  • Tried again.  Still failed.
  • Tried with a different account with a good password and identical permissions…success.
  • Tried the changed one…failed.
  • Fine…set it back to where it was – old password.

This was all for just one account…we never even re-set it in SQL Server Configuration Manager.

Then in testing, we found that any SQL Servers using that account were still up, but you could only connect by using RDP to the server, then SSMS.  SSMS from the laptop failed with:

“The target principal name is incorrect. Cannot generate SSPI context”

Apparently, the account was either locked out from our failed logon attempts, or had been disabled in Active Directory due to its age.  They do that sometimes.  Most likely, the issue was a lockout.
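One quick way to check for a lockout from a CMD prompt (the account name here is a placeholder for your own):

 net user MySQLServiceAccount /domain

The “Account active” line comes back as “Locked” when the account is locked out.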

We restarted the SQL Server (O/S restart) and that resolved it once the AD team unlocked the account.

My assumption is that the lockout either blocked Kerberos authentication because the SPN was no longer valid, or the SPN itself got corrupted.  It was still there, just not working.  I verified its existence by running SetSPN -L with the account name.
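For anyone who has not used it, that check looks like this (substitute your own domain and account name):

 setspn -L MyDomain\MySQLServiceAccount

For SQL Server you are looking for MSSQLSvc entries with the correct host name and port.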

A pretty smart guy on Twitter in the #sqlhelp conversation I started seems to think it was simple DC connectivity.  I’m not convinced, and we’ll never know for sure.

SQL Server did successfully register a new SPN on restart.

Hope this experience helps you troubleshoot something 🙂

Kevin3NF

Filed Under: Uncategorized

Schema Compare multiple databases at once…

August 22, 2016 by Kevin3NF

I recently had the need to schema compare a “Gold” or “Master” copy of our database against the 300 client copies of that database that exist in our Production environment.  I’m not alone…many fellow DBAs have had the same need; Google searches for this confirm it.  This is for an upcoming application upgrade that will need post-upgrade comparison/verification.

There are 3rd party tools that do SQL Compares…my particular favorite is aptly named SQL Compare from Red Gate.  I’ve been using it off and on for 10 years.   I don’t know if it can be set up to hit more than one database at a time.  The other issue is that I don’t have a copy here.

Microsoft’s SQL Server Data Tools will also do this within Visual Studio.   Still one database at a time.  I forget where, but someone pointed me to the fact that SSDT uses SQLPackage.exe under the hood to do the work.  I figure if I can run it at a command line I can script out all of the databases.  I’m not much of a DOS scripting guy, so everything that follows is just my hack version…but it works, and not just on my machine!

I got most of this from StackOverflow user Mike Hyde’s response here
I had a follow-up “gotcha” question on SO here

(Edit:  When I migrated this from Blogger to here, a lot of my paths lost all of the backslashes (\)…)

The process:

  • Create a Test master database with 2 tables and 1 view
  • Create 5 copies as Test1, Test2, etc. (If it works for 5, it works for 300)
  • Add a column to a table in Test4, and a view to Test5
  • Use SQLPackage to Extract a .dacpac file from the “Gold” database (a .dacpac is basically a schema-only backup)
  • Dynamically create a script that generates the SQLPackage call
  • Paste that into a CMD window to compare the Gold db to the Client dbs
  • View the results, looking for exceptions

Create a Test master database:

--Create Gold copy  
 CREATE DATABASE [Test]  
  ON PRIMARY  
 ( NAME = N'Test',  
      FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\Test_Gold.mdf',  
      SIZE = 4096KB , FILEGROWTH = 1024KB )  
      LOG ON ( NAME = N'Test_log',  
      FILENAME = N'c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\Test_Gold_log.ldf',  
      SIZE = 1024KB , FILEGROWTH = 100MB)  
 GO  
 --Create some Objects  
 USE [Test]  
 GO  
 CREATE TABLE [dbo].[Orders]  
      (  
      [OrderID] [int] NOT NULL,  
      [OrderDate] [datetime] NULL  
      )  
      ON [PRIMARY]  
 GO  
 ALTER TABLE [dbo].[Orders]  
      ADD CONSTRAINT [DF_Orders_OrderDate]   
      DEFAULT (getdate()) FOR [OrderDate]  
 GO  
 CREATE TABLE [dbo].[OrderDetails]  
      (  
      [OrderDetailID] [int] NOT NULL,  
      [OrderID] [int] NOT NULL,  
      [ItemQty] [int] NOT NULL  
      )  
      ON [PRIMARY]  
 GO  
 Create View vOrders AS  
 SELECT  
      dbo.Orders.OrderID,  
      dbo.Orders.OrderDate,  
      dbo.OrderDetails.OrderDetailID,  
      dbo.OrderDetails.ItemQty  
 FROM    
      dbo.OrderDetails  
      INNER JOIN  
      dbo.Orders ON dbo.OrderDetails.OrderID = dbo.Orders.OrderID  

 

Create 5 copies as Test1, Test2, etc.:

--Back it up  
 Backup Database test  
 To Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 --restore x 5  
 Restore Database Test1  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test1.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test1_log.ldf'  
      ,Recovery  
 Restore Database Test2  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test2.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test2_log.ldf'  
      ,Recovery  
 Restore Database Test3  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test3.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test3_log.ldf'  
      ,Recovery  
 Restore Database Test4  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test4.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test4_log.ldf'  
      ,Recovery  
 Restore Database Test5  
 From Disk = 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\backup\test.bak'  
 With Move 'test' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test5.mdf'  
      , Move 'test_log' to 'C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\test5_log.ldf'  
      ,Recovery  

 

Add a column to a table in Test4, and a view to Test5:

Use Test4;  
 Go  
 Alter Table dbo.Orders  
      Add CustomerName varchar(500)  
 Use Test5;  
 go  
 Create View VOrderDetails AS  
 Select * from dbo.OrderDetails  

So at this point we have a gold copy (Test), three that we know match it (Test1/2/3) and two that have drifted.  Get used to the term “drift”…3rd party vendors are using it, and later we will create scripts to fix drift.

Finally it gets interesting!

Use SQLPackage to Extract a .dacpac file from the “Gold” database:

We will use SQLPackage.exe to create a .dacpac file.  Search your box for the file…likely it will be buried in Visual Studio’s folders or SQL Server’s.  Once I found it, I added that folder to my PATH variable so I didn’t have to CD to it in the CMD window each time.

My default install has SQLPackage here:
C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin

If you cannot edit your PATH due to company restrictions, use this at the beginning of your script:
SET PATH=%PATH%;C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin

Now, SQLPackage needs three basic parameters (over simplified, I know):

  1. Action
  2. Source
  3. Target

On my system, this is the command I will use:

sqlpackage.exe /a:Extract /scs:Server=US1213113W1\SQL2014;Database=Test; /tf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac

Make sure this is all one line.  CMD will freak out on the CR/LF and tabs.  Also, change the server name in /scs and the path in /tf.  /a is the action…we are Extract-ing the schema into a .dacpac file named in the /tf parameter.
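If you would rather keep it readable, CMD’s caret lets you split one command across lines…the same Extract, just wrapped:

 sqlpackage.exe /a:Extract ^
  /scs:Server=US1213113W1\SQL2014;Database=Test; ^
  /tf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac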

Paste this into a CMD window, hit Enter and you should get this back:

Connecting to database 'Test' on server 'MyLaptop\SQL2014'.
Extracting schema
Extracting schema from database
Resolving references in schema model

Successfully extracted database and saved it to file 'C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac'.

If it fails, the error messages are pretty specific about servername, database, etc.

Dynamically create a script that generates the SQLPackage call:

Now on to the part that gets you to multiple databases “at once.”  Technically, this is not all at once…there are multiple commands being run in order, not in parallel.  But, you can still check Twitter or get coffee while they run instead of sitting in “Point and Click Heck”.

For comparing to the live databases, I used:

Set NoCount On  
 -- Extract the gold copy from Test  
 Select   
      'sqlpackage.exe /a:Extract /scs:Server=US1213113W1\SQL2014;Database='+[name]+'; /tf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\'+[name]+'.dacpac'  
 from sys.databases  
 where [name] like 'Test'  
 --Create a compare script with test.dacpac as the source  
 --and all databases starting with Test as the targets  
 Select   
      'sqlpackage.exe /a:Script /sf:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:US1213113W1\SQL2014 /tdn:'+[name]+' /op:C:\Users\hillke4\Documents\SQLScripts\DACPACS\Deltas\'+[name]+'.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True'  
 from sys.databases  
 Where 1=1  
   and [Name] like 'test%'  
   and [Name] <> 'Test'  

 

Note the parameters in the Compare section:

    /a – the action to be performed.  Script creates a .sql file with the changes to be made at the target
    /sf – source file to compare from…the “gold copy”
    /tsn – target server name
    /tdn – target database name
    /op – output path for the .sql file
    /p – property setting to drop tables, views and other objects that should not be there
    /p – property setting to drop indexes that should not be there (handled separately from other objects)

There are a ton of different ‘/p’ settings and options.  If you don’t include the first one I used (DropObjectsNotInSource), you may get back an empty .sql file even when you know there are differences.  This was the topic of my follow-up question on Stack Overflow.  Schema Compare in VS showed the new objects, CMD didn’t.

Go to the SQLPackage.exe link for all of the specifics.  It is highly configurable to your needs.

Your output should resemble:
Extract the Gold .dacpac:  
 sqlpackage.exe /a:Extract /scs:Server=MyLaptop\SQL2014;Database=Test; /tf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac  
  
 Compare to the target databases:  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test1 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test1.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test2 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test2.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test3 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test3.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test4 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test4.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  
 sqlpackage.exe /a:Script /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test5 /op:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Deltas\Test5.sql /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True  

Copy and paste the /a:Extract line into a CMD window (don’t forget the PATH variable mentioned earlier) if you have not already done this part.  Run it by pressing Enter.

Copy all of the /a:Script lines (plus a carriage return at the very end) into a CMD window, and they will start running immediately.  Without the final CR;LF at the end, the last one will just sit there waiting for you to hit enter.

Any yellow text you see is warning you about possible data loss on dropping Tables and Indexes.

*** The object [Test] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
*** The object [Test_log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
*** The column [dbo].[Orders].[CustomerName] is being dropped, data loss could occur.

The first two refer to the data and log logical file names, and the third warns about data loss from dropping the CustomerName column that was added to Test4.  The dependent view also gets a metadata refresh after that column is dropped…a nice feature to have.

View the results, looking for exceptions:


At this point, you should have a .sql file for each of your target/client databases, assuming no connection issues.  Ideally, they are all in the same place, which will help with the last step (actually running the scripts is your task…I’m just taking you through generating them).

The way I choose to analyze mine is not to look at each one, but rather to look for exceptions in the size of the file.  There is some text in each file that is exactly the same.  Any additional text is T-SQL code that drops, creates or alters something.

Sample output .sql file, with the DROP statements in red:

/*
Deployment script for Test5
This code was generated by a tool.
Changes to this file may cause incorrect behavior and will be lost if
the code is regenerated.
*/
GO
SET ANSI_NULLS, ANSI_PADDING, ANSI_WARNINGS, ARITHABORT, CONCAT_NULL_YIELDS_NULL, QUOTED_IDENTIFIER ON;
SET NUMERIC_ROUNDABORT OFF;
GO
:setvar DatabaseName "Test5"
:setvar DefaultFilePrefix "Test5"
:setvar DefaultDataPath "c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\"
:setvar DefaultLogPath "c:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\"
GO
:on error exit
GO
/*
Detect SQLCMD mode and disable script execution if SQLCMD mode is not supported.
To re-enable the script after enabling SQLCMD mode, execute the following:
SET NOEXEC OFF;
*/
:setvar __IsSqlCmdEnabled "True"
GO
IF N'$(__IsSqlCmdEnabled)' NOT LIKE N'True'
    BEGIN
        PRINT N'SQLCMD mode must be enabled to successfully execute this script.';
        SET NOEXEC ON;
    END
GO
IF EXISTS (SELECT 1
           FROM   [master].[dbo].[sysdatabases]
           WHERE  [name] = N'$(DatabaseName)')
    BEGIN
        ALTER DATABASE [$(DatabaseName)]
            SET ENABLE_BROKER
            WITH ROLLBACK IMMEDIATE;
    END
GO
USE [$(DatabaseName)];
GO
PRINT N'Dropping [dbo].[VOrderDetails]...';
GO
DROP VIEW [dbo].[VOrderDetails];
GO
PRINT N'Update complete.';

My technique for comparing file sizes is to open a CMD window by using Shift+Right-Click on the folder they are in and choosing ‘Open command window here’, then running ‘DIR’, which lists the files with their sizes in bytes (the Windows Explorer default is KB).  There are no doubt a bunch of ways to do this, so pick whatever you prefer.  My results:

(Image lost in migration from Blogger)

Note that the three we didn’t change are all 1,330 bytes.  Test4 is larger, as is Test5.  They are different from each other due to the exact T-SQL in them to perform the drops necessary to bring them back to gold.  These are my exceptions.
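If you have hundreds of output files, you can let DIR do the sorting so the exceptions float to the top…run this from the Deltas folder to list largest-first:

 dir /O-S *.sql

Anything bigger than the common baseline size (1,330 bytes in my test) is a database that drifted.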

In my environment, it is appropriate for me to stop here and start going through the exceptions manually.  For you, it may be appropriate to automatically execute the scripts.  Look at the SQLPackage link from Microsoft for the /Action:Publish parameter.
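If you go that route, my best guess at the command, mirroring the Script lines above (I have not run this here, so treat it as a sketch…/a:Publish deploys the .dacpac directly to the target):

 sqlpackage.exe /a:Publish /sf:C:\Users\Kevin3NF\Documents\SQLScripts\DACPACS\Test.dacpac /tsn:MyLaptop\SQL2014 /tdn:Test5 /p:DropObjectsNotInSource=True /p:DropIndexesNotInSource=True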

WARNING:  Publishing a .dacpac file that has any data in it overwrites the existing data in the target. This may be fine for a ‘State’ table, but not at all for Orders.

Note:  If you are comfortable with PowerShell, please go look at this article from Chris Sommer, which he wrote in response to me asking the Compare Multiple Databases question.  It covers the same basics I just did, but without the looping to get all the DBs at once.   He did the script and blog in about 2 hours…

Note2:  All of my testing and scripting was done on a laptop, running Windows 7, SQL 2014 Dev edition, Visual Studio and SSDT 2015.

Note3:  Special thanks to whoever wrote this Code Formatter for Blogspot posts!

All of this works in my environment, but it’s possible I missed a step in the writing.  Please help me QA this post.  If something is not clear or I made some egregious error, please please please make a Comment.  If it works and helps you, please let me know that as well!

Thanks for reading,

Kevin3NF
The OnPurpose DBA

Filed Under: Uncategorized

Min and Max Server Memory in English

August 15, 2016 by Kevin3NF

This one is for the new DBAs…

There is a lot of confusion on memory settings in SQL Server.  Specifically, the Min and Max settings found in the Properties of an instance:

(Screenshot lost in migration)

There are dozens, if not hundreds, of blog posts that will go into beautiful detail about all of the intricacies of SQL Server memory…after all, SQL lives there!  Thus, it is very important to have a basic understanding early in your SQL DBA career…especially if you are an accidental DBA.

In the screenshot above, I have set the Min and Max on my laptop to 2GB and 14GB.  It’s a 16GB laptop, so I left 2GB for the O/S…pretty basic stuff.
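If you prefer T-SQL to the GUI, the same settings can be applied with sp_configure…a minimal sketch using my laptop’s numbers (both values are in MB):

 EXEC sp_configure 'show advanced options', 1;
 RECONFIGURE;
 EXEC sp_configure 'min server memory (MB)', 2048;   -- 2GB floor
 EXEC sp_configure 'max server memory (MB)', 14336;  -- 14GB ceiling
 RECONFIGURE;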

Max Server Memory is fairly obvious…when SQL Server reaches this point, it stops allocating and starts paging to disk as necessary.

This is pretty much the same as setting a throttle control on a car.  Or a restrictor plate in NASCAR.  Without it you can run the car engine all the way to the max if you want, but eventually the other systems are going to fail (Cooling, transmission, brakes, etc.).  Same thing if you don’t leave memory for the O/S.

Min Server Memory seems to attract the most bad information.  SQL Server does NOT automatically grab memory up to the Min setting when it starts.  However, once it gets there, it doesn’t give back.

Back to the car analogy…if you start up and head off down the road at 20 mph, you are above the default (0), but not at the max (100+, depending on the car and the tires…).  If you set the cruise control, you can accelerate up and down above 20, but you won’t go below that unless you hit the brakes.

So that’s it…by default, SQL installs at idle and full speed ahead.  It’s your job to turn on the cruise control and not redline the engine until it blows.

There are holes in the analogy if you dig deep enough, but this should give you a real-world base to work from as you grow in your skills.

Kevin3NF

Filed Under: Beginner, Configuration, Install, Performance

Login Failed, pt 2

August 14, 2016 by Kevin3NF

In my last post I hoped to convince you to pay attention to all of the various “Login failed for user…” messages that you see in your SQL Server ERRORLOGs.  ALL of them.

Yes, some you can ignore based on the environment or the person.   Jim the web guy on a Dev box is just not that much of a security threat (unless you let him touch Prod, but that’s a different post).

Some of you have one or two servers, and reviewing ERRORLOGs is no big deal to do manually.  More of you have tens and tens of them.   Some of you have thousands (I’m looking at you in Managed Hosting environments such as Verizon, Rackspace, etc. where customers pay you to do this).

By now, hopefully you are aware that you can issue queries against all servers or groups of servers at once.   If not, a very quick how-to in SSMS:

  1. View > Registered Servers (Ctrl+Alt+G)
  2. Expand Database Engine
  3. Right-Click Local Server Groups
  4. Set up groups as you see fit (mine are Prod and non-prod)
  5. Register servers in the appropriate group



To query a group, right-click the group (or Local Server Groups for all of them), select New Query and off you go.  The results will append the server name as the first column.

Note that my servers are listed in order, but the results are not.   Results come in the order SQL Server was able to connect and receive the results back:


Side note…running Select @@version is a great way to ensure all of the SQL Servers are up and running, especially after a Patch weekend, or even just first thing in the morning.

Now for the stuff you actually wanted to read/learn…how to read the ERRORLOGS all at once:

We are going to dump the results of sys.xp_readerrorlog into a temp table, and then query just like any other table:

Create Table #Errorlog
(Logdate datetime,
 ProcessInfo varchar(50),
 LogText varchar(5000))

--Dump all the things into the table
insert into #Errorlog
EXEC sys.xp_readerrorlog
 0 -- Current ERRORLOG
,1 -- SQL ERRORLOG (not Agent)

--Query just like you would anything else:
Select *
from #Errorlog
Where 1=1
and (LogText like '%Error%'
or LogText like '%Fail%')
And Logdate > getdate() -3

--Clean up your mess, you weren't raised in a barn!
Drop Table #Errorlog

My results:



If I’m doing this on a Monday, I set the date to look back over the weekend…otherwise 1 or 2 days.  But whatever works best for you.  There is nothing magic in here.  For more details on xp_readerrorlog, click the link.  No point in me re-writing a perfectly good explanation.

Hopefully this will help you pay more attention to what’s going on in your ERRORLOGs, whether for Login Failed, or even just to find all backups for a certain DB.  Just change the Where clause to fit your needs.
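For example, to pull backup entries for one database out of the log, swap the filter (MyDB is a placeholder for your database name…backups show up as “Database backed up.” messages):

 Select *
 from #Errorlog
 Where 1=1
 and LogText like '%backed up%'
 and LogText like '%MyDB%'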

If I saved you any time at all, I’ve done my job.  Let me know if you actually use this!

Fries are awesome, but I’m trying to ShrinkFile my gut…

Kevin3NF


Filed Under: Uncategorized

Security fail.

August 11, 2016 by Kevin3NF

This…just no:

(Image lost in migration from Blogger)

Filed Under: Security

Relax…it’s ok…

August 11, 2016 by Kevin3NF

Sometimes it’s good to sit back, listen, nod, and hear what is being said before speaking.  Actually, that is almost always the best idea.

Case in point:

I am the team lead here (small team of 3…SQL, Windows and storage admins…we overlap).

I came back from lunch yesterday and one of my guys very passionately launched into “We need to have a meeting!”, “The developers want too many deployments!”, “We need change management!”, etc.

All of his points were true.  This is a small team with very few procedures and practices, and our job is to get a handle on this.  We are also at the end of the development process for v1.0 of an internal application…which is being demonstrated today.  Not the best time to suddenly change things.

So I listened while he made his case, agreed with most of what he said, and asked some questions when he was done:

1.  What problem are you trying to solve by forcing change windows today that don’t exist?
2.  How many “deployments” are we being asked to do each day?  (A deployment here could simply be ALTERing a stored proc, and the target is a Pre-Production database)
3.  Should we be focusing on the other issues we have already identified?  Where does this rank in the list?  (Backups, security, perf, etc. all rank higher and are more actionable)

What it boiled down to is that we don’t really have a problem…he just got hit with three requests in a short time frame, due to the upcoming demo to the executive staff.

We get maybe 2 requests a day from the Devs, and have 3 people capable of deploying them.  At this time, on this project…all a forced window will do is alienate 10 of the 15 team members.  Yes, it is a good idea, but let’s phase it in for better acceptance, when the team is not under the gun.  Production release is only a month away…

Sometimes it’s best to relax, look at the bigger picture and make the best decision for the team.

Imma buy this guy lunch, with fries 🙂
Kevin3NF

Filed Under: Uncategorized
