Why do I include Plan Guides in my Server Assessment

When I complete a server assessment there are a number of areas I look at.  These areas include security, performance and configuration, and most of the specific items fall into one of these categories.  In my opinion, a complete server assessment should not only return the state of the server but should also return any configuration settings that may have an impact on performance.  Plan Guides fall into this area: they can have a negative impact on performance just as quickly as they can have a positive one.

In my over 20 years of working with SQL Server I have never seen a plan guide actually used in production.  This doesn’t mean they are not being used, just that I haven’t seen it.

Before getting into why I look for Plan Guides, let’s take a few moments to go over exactly what a plan guide is.

Below is the definition from Microsoft:

“Plan guides let you optimize the performance of queries when you cannot or do not want to directly change the text of the actual query in SQL Server 2019 (15.x). Plan guides influence the optimization of queries by attaching query hints or a fixed query plan to them.”

Basically, Plan Guides allow you to utilize optimizer hints without modifying the source code.  For example, a Plan Guide can force a particular physical join type: if the optimizer chooses a Hash Match join and you feel the query might perform better with a Nested Loops join, a Plan Guide will allow that to happen without changing any code.

While this really isn’t a post about how to create and use Plan Guides, I think it is important to have a basic understanding of them. This is a link to Microsoft that will provide more detail.

When using the Object Explorer, Plan Guides can be found under Programmability in each database.

When I execute the query below in the AdventureWorks2014 database, the actual execution plan will utilize a Merge Join operator.  This is without a plan guide telling the optimizer to do anything different.  Of course, as you can see, an optimizer hint is also not used in the query.

SELECT *
FROM Sales.Customer AS c
INNER JOIN Sales.Customer AS ca
ON c.CustomerID = ca.CustomerID
WHERE ca.TerritoryID = 5


When a Plan Guide is created, you will be asked to provide the desired hint.  In the case below, the hint is to use a Nested Loops join whenever the above query is sent to the server.

Once the Plan Guide exists, the execution plan will include the Nested Loops join.  This is because the query matches the one defined in the Plan Guide.
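
While this isn’t a tutorial on creating Plan Guides, here is a minimal sketch of what creating that guide could look like with sp_create_plan_guide.  The guide name is my own choice for illustration, and the @stmt text has to match the submitted batch exactly (more on that below).

-- Sketch only: guide name is an assumption, statement text must match the batch exactly
EXEC sp_create_plan_guide
    @name = N'PG_Customer_LoopJoin',
    @stmt = N'SELECT *
FROM Sales.Customer AS c
INNER JOIN Sales.Customer AS ca
ON c.CustomerID = ca.CustomerID
WHERE ca.TerritoryID = 5',
    @type = N'SQL',
    @module_or_batch = NULL,
    @params = NULL,
    @hints = N'OPTION (LOOP JOIN)';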

Something to keep in mind: Plan Guides are both case sensitive and space sensitive.  If the plan guide includes the query below:

SELECT *
FROM Sales.Customer AS c
INNER JOIN Sales.Customer AS ca
ON c.CustomerID = ca.CustomerID
WHERE ca.TerritoryID = 5

And the following query is executed:

select *
FROM Sales.Customer AS c
INNER JOIN Sales.Customer AS ca
ON c.CustomerID = ca.CustomerID
WHERE ca.TerritoryID = 5

Notice that the SELECT keyword is not capitalized.  This alone will cause the plan guide to be ignored.  The plan guide will also be ignored if there is a trailing space at the end of one of the lines.

Plan guides can also be enabled or disabled.  This can be done by simply right-clicking on the plan guide in Management Studio and choosing “Enable” or “Disable”.
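
If you prefer T-SQL, sp_control_plan_guide does the same thing.  The guide name below is the one assumed in the earlier sketch.

-- Disable and re-enable a plan guide by name (name is a placeholder)
EXEC sp_control_plan_guide @operation = N'DISABLE', @name = N'PG_Customer_LoopJoin';
EXEC sp_control_plan_guide @operation = N'ENABLE',  @name = N'PG_Customer_LoopJoin';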

When to use Plan Guides

While I haven’t used Plan Guides, I can see where they have the potential to improve the performance of a particular query in certain situations.  Many vendors don’t allow you to change the code of their stored procedures to add a query hint, and doing so may not be the best idea in the first place.

So why do I look for Plan Guides during a server assessment?

During the server assessment, I look for these more for awareness than anything else.  If a particular query performs poorly, it could be because of a plan guide.  If any do exist, it is important to determine whether they are helping or contributing to the poor performance.  As mentioned earlier, if a Plan Guide does exist, you will also need to determine whether it is enabled or disabled.

The code below will check every database and return a row for each plan guide found.  In addition, it will return the query and whether or not it is enabled.

        CREATE TABLE #PGInfo
          ( DBName            NVARCHAR(128)
          , GuideName         NVARCHAR(128)
          , CreateDate        DATETIME
          , ModifiedDate      DATETIME
          , IsDisabled        BIT
          , QueryStatement    NVARCHAR(MAX)
          , Type              NVARCHAR(60));

        EXECUTE master.sys.sp_MSforeachdb 'USE [?];
          INSERT INTO #PGInfo
          SELECT DB_NAME()
            , name
            , create_date
            , modify_date
            , is_disabled
            , query_text
            , scope_type_desc
          FROM sys.plan_guides;';

        SELECT *
        FROM #PGInfo;

        IF @@ROWCOUNT = 0
        BEGIN
          SELECT 'No Plan Guides were found in any databases.';
        END

        DROP TABLE #PGInfo;


As stated earlier, looking for Plan Guides is more about awareness than anything else.  If there is a performance issue with a particular query, knowing that these exist is important.

Thanks for visiting my blog!


ADS: Simple Data Scripter

Moving data in a table from a production server to a lower environment, such as a development server, can be done a number of different ways.  These include backup/restore, using SSIS to move the data, the Import/Export Wizard, and scripting the data in SQL Server Management Studio.  Azure Data Studio by itself doesn’t have a way to script the data in a table, and it does not have the Import/Export Wizard.  This is where the Simple Data Scripter extension comes in handy.

Like most extensions for Azure Data Studio, this extension is also in a preview state.  As of the date of this post, it is at version 0.1.1 and is published by Sean Price.  The documentation for this extension can be found here, Simple Data Scripter.  The documentation identifies a few limitations:

      • Not all data types are supported, specifically the binary datatypes like text, binary and varbinary
      • Has only been tested with SQL Server
      • Does not work on a view

Some extensions add options under Preferences; this extension does not.  Given that, there is limited ability to configure it, which is OK, as there really isn’t much need for configuration anyway.

Just as a review, let’s take a look at how we can script data in SQL Server Management Studio.  Right-click on the database and go to Tasks…Generate Scripts…  A wizard will pop up, and after a few clicks of Next you will arrive at the screen below.  The box highlights the “Types of data to script” option, which has three choices: Data Only, Schema and Data, and Schema Only.

When right-clicking on a database in Azure Data Studio you will see limited options, none of which are related to scripting data.  This is also true when right-clicking on a table in the database.  Here are the options we see by right-clicking on a table; as before, none of them are related to scripting data.

In order to script data in Azure Data Studio, we will need to install the Simple Data Scripter extension.  This is a pretty straightforward and simple install.  You will need to download the .vsix file; you can get to the GitHub repository to download it from the extension’s documentation in ADS.  You can then install it under the File menu.  While you will not be prompted to restart ADS after installing it, I usually like to restart anyway.  In this case, the extension doesn’t move to the installed extensions list until after a restart.

Now that the extension is installed, you will see a new option when you right-click on a table.  This points to another potential limitation of the extension: there isn’t an option to script all the data in the database.  Well, at least I haven’t found it yet.  But then again, I am not sure a single place to script all the data in the database is the best way to go.

When clicked, the following screen will appear at the top of Azure Data Studio.  Initially, our first thought might be, what a bad query.  But it does make sense; we are creating a script of all the data in the table.

If you click anywhere else, ADS will cancel the query, the above screen will disappear, and you will see this pop up in the lower right corner.

If you press Enter, Azure Data Studio will execute the query and generate the script.  You will not see a result set, but you will see the script.  It will look similar to the example below.  Notice the CREATE TABLE statement at the top; it is commented out.  The table used for the example is named RentalCar, while the script works with a table named #tempRentalCar.
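
To give a rough idea of the shape of the output, here is a sketch of the kind of script the extension generated for me.  The column names and values are placeholders for illustration, not the extension’s exact output.

-- CREATE TABLE #tempRentalCar ([RentalCarID] int, [Make] varchar(50), [Model] varchar(50))
INSERT INTO #tempRentalCar ([RentalCarID], [Make], [Model])
VALUES (1, 'Ford', 'Focus')
     , (2, 'Toyota', 'Camry');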

When the script is finished, the Tasks window will open at the bottom of the screen.  The top two entries are successful executions, while the third failed; the reason for the failure will be covered shortly.

You may be wondering if the query can be modified.  The answer is yes; however, not every type of statement will work.  Adding a WHERE clause works just fine.  However, as the query became more complex, it failed more frequently.  I attempted to run a query with a join in it and it failed; that is the failure in the image above.  The error stated “Query has no results to return”, yet the query works just fine when run in ADS or SQL Server Management Studio.

Here are some of the statements I have tried.

      • Query with a join – failed
      • Query with a sub-query – failed
      • Query with a WHERE clause and just one table – succeeded
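
For reference, here is roughly what those attempts looked like; the table and column names are placeholders, not the actual objects I tested against.

-- Succeeded: a single table with a WHERE clause
SELECT * FROM dbo.RentalCar WHERE Make = 'Ford';

-- Failed with "Query has no results to return": a query with a join
SELECT rc.*
FROM dbo.RentalCar AS rc
INNER JOIN dbo.RentalAgreement AS ra
    ON ra.RentalCarID = rc.RentalCarID;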

Measured against those expectations, this extension works very well, and I can definitely see a use for it.

Thanks for visiting my blog!!


Auto Delete Jobs????

Have you ever looked at something in SQL Server and wondered why it is there?  That is what I think when I see this option in the SQL Server Agent job properties.  I cannot come up with any good reason why you would want a job to delete itself upon completion.  I even did a Google search and really didn’t find a good reason.  However, if you know of a great reason to enable this, I would love to hear about it.

If you are not familiar with this option, you can find it under the Notifications tab of the job properties.

Just like all the notifications, you have several options.

      1. When the job succeeds
      2. When the job fails
      3. When the job completes

When the job deletes itself, it also takes its job history with it.  Basically, there will be no evidence that the job ever existed.  This is pretty straightforward, so you might be wondering why I bring it up.  Well, I see this setting as a potential danger, especially if you have a disgruntled DBA.  While being a DBA is a great job, there are, from time to time, things that can have a negative impact on how we feel about our jobs.  Luckily the SQL Server community is filled with really great people.
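
Behind the scenes this option maps to the delete_level setting of the job, so it can also be turned on (or off) with sp_update_job.  The job name below is a placeholder.

-- 0 = never, 1 = delete on success, 2 = delete on failure, 3 = delete on completion
EXEC msdb.dbo.sp_update_job
    @job_name = N'YourJobName'
  , @delete_level = 3;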

We have a job that checks all of our jobs, looking for any that have this set.  You can use the query below to get the necessary information; the delete_level column in sysjobs provides what we need.

SELECT name
  , delete_level
  , CASE delete_level
      WHEN 0 THEN 'Not Set'
      WHEN 1 THEN 'Delete on Success'
      WHEN 2 THEN 'Delete on Failure'
      WHEN 3 THEN 'Delete on Completion'
    END AS [Delete Level Setting]
FROM msdb.dbo.sysjobs

So why do I think this is potentially dangerous?  Imagine the disruptive code that could be written, and then imagine that same code in a job scheduled to execute six months later.  Now go one step further: all evidence of that job running is now gone.  I really don’t want to give too much information, but I can think of several things that could be done that could cost a company a ton of money.  This is why we check for this.

I also include this check on my server assessment scripts.

As I said earlier, if anyone can give me a good reason to have this set, I am willing to listen and will update this post accordingly.  Giving proper credit of course.

My suggestion is to look for this!

Thanks for visiting my blog!!!!

How to View Agent Jobs in Azure Data Studio

As a DBA, creating and managing SQL Server Agent jobs is an essential part of our day-to-day activity.  We have always done this via SQL Server Management Studio or Transact-SQL.  Both of these are still great options for managing jobs, but what about Azure Data Studio?  Unfortunately, the management of agent jobs is not part of the native functionality of Azure Data Studio.  If you look at the screenshot below you will see that the SQL Server Agent is not available.  So does this mean that Azure Data Studio isn’t the proper tool for managing agent jobs?  The answer is a classic DBA answer: it depends.  If you do not install an extension, you will not be able to manage jobs.

Before going any further, let’s take a few minutes to go over what an extension is.  As the name might imply, it extends something.  In this case, Azure Data Studio has its functionality extended to do something it normally cannot do, such as managing SQL Server Agent jobs.  Since Azure Data Studio requires an extension to manage jobs, which extension needs to be added?  The answer is the Admin Pack for SQL Server extension.  There is a SQL Server Agent extension as well; it is part of the Admin Pack for SQL Server.  While this extension does more than just job management, we will focus mostly on how to manage Agent jobs using Azure Data Studio.

Here is a list of some of the functionality of the Admin Pack for SQL Server extension.

    •   Agent Job management
    •   Import file wizard
    •   Deploy and extract dacpacs
    •   Deploy and extract bacpacs
    •   Browse Extended Events.
    •   Other functionality as well

As stated earlier, we will need to install the Admin Pack for SQL Server extension.  So where do we get it?  Since it was created by Microsoft, you will be able to get it right from inside Azure Data Studio.

Before installing the extension, we should probably check to see if it is already installed or if it has been disabled.  If you click the icon in the green box below, the extension Marketplace will open.  If you see the Admin Pack for SQL Server under the Recommended block, it is not installed.  If you see it under Disabled, the extension has been disabled.

A list of disabled extensions can be found in the following location.  Notice that the extension has been greyed out.

Another way to determine if the extension has been installed is to see if SQL Agent appears as an option when you are managing at the server level.  As in the image below, the Agent will be present when managing the server.  If you do not see SQL Agent, it is either not installed or has been disabled.  If it is not installed, you can go to the Marketplace and install it from there.

Now that we have confirmed the Agent extension is not only installed but also enabled, we are ready to manage the jobs on the server.  Let’s do a quick inventory of what a typical DBA would do with jobs.  All of these can be completed with Azure Data Studio; however, there are a few minor limitations that will be discussed later in this post.

      • Manage job history
      • Manage operators
      • Manage alerts
      • Manage proxies

To get to the Agent job information in Azure Data Studio, you will need to right click on the server and go to “Manage”.  As seen below.

Once you do, you will not only see the SQL Agent but also other Extensions you may have installed in Azure Data Studio.

In the above image, you will see that not only do I have the Admin Pack for SQL Server extension installed, but I also have the extension for sp_whoisactive and a few others.  All of which I would recommend that you install.

When you are finally in the proper place to manage the jobs, you will see the same items you see when using SQL Server Management Studio.  You will of course see the job name, last run, enabled status, category, and a few others.  These are the same items we have been working with for years in SSMS.  In the green box in the image below, you can click each item to be taken to the management area for jobs, alerts, operators, and proxies.

As you can see above, you can sort by any column you would like, and it also displays information about the failure if the last execution was a failure.  In addition to sorting, when you click the down arrow in the column header, you will also have an option to filter the column, similar to Excel.  The red box above highlights something that is pretty cool: each bar represents one of the last five job executions.  If you click on one of the bars, you will be taken to the details of that execution.  The image below is an example of what you will see.

When looking at the jobs you will see the button used to create a new job.  When creating a new job, all the same information is requested, so it is pretty easy to figure out.

There are a few differences, though, when creating a new job in Azure Data Studio, starting with the job step type.  When working in Azure Data Studio you don’t have the three BI-stack-related types.  The image below shows the job step types for SQL Server Management Studio on the left and Azure Data Studio on the right.

Another difference is the creation of new job schedules.  This can be done using SSMS but cannot be done with Azure Data Studio.  You can use existing shared schedules; you just can’t create new ones.
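
If you hit that limitation, one workaround is to create the schedule in T-SQL and then attach it to the job.  A minimal sketch, with placeholder names, might look like this.

-- Create a shared schedule (daily at 2 AM) and attach it to an existing job
EXEC msdb.dbo.sp_add_schedule
    @schedule_name = N'NightlyAt2AM'
  , @freq_type = 4              -- daily
  , @freq_interval = 1
  , @active_start_time = 020000;

EXEC msdb.dbo.sp_attach_schedule
    @job_name = N'YourJobName'
  , @schedule_name = N'NightlyAt2AM';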

Moving away from jobs, alerts can also be created and modified utilizing Azure Data Studio.  All the same information can be entered; however, there is one difference.  When creating an alert you are asked to choose an alert type.  When using SSMS you will have three options:

      • SQL Server Event alert
      • SQL Server Performance condition alert
      • WMI event alert

When using Azure Data Studio, you only have one option, SQL Server Event alert.

While there are a few limitations, Azure Data Studio is still a very good tool.  The few limitations in Azure Data Studio for Agent job management can easily be overcome simply by using SQL Server Management Studio.

Thanks for visiting my blog!!

Azure Data Studio – Explain Button

One of the first things that jumped out at me in Azure Data Studio was the “Explain” button.  After you make your connection, you will see the Explain button in the upper right corner of the query window.

At first I was thinking maybe it was a shortcut to help.  Nope, it is actually an easy way to get the estimated execution plan.  The key word is estimated.  In SQL Server Management Studio we have long been able to easily get both the estimated and the actual execution plan.

Just as a reminder, the actual execution plan requires the query to actually run, while the estimated plan uses statistics to generate the plan.  Generally, if the actual and estimated plans are captured very close to each other, you are more than likely not going to see a difference between the two.

Now let’s talk about the Explain button.  It will return two things, the XML of the execution plan and a graphical representation of the execution plan.

You will notice in the above image, the Results tab.  Just as in SSMS, this is where you can find the results of the query.  In the image above I simply clicked the Run button and the results were returned.

When you click the Explain button, the look of what is returned will be a bit different.  You will still see the Results and Messages items, but there are now two additional items, Query Plan and Top Operators.

The Query Plan tab is the item selected above, revealing the estimated execution plan.  There are two ways to confirm that it is indeed the estimated plan.  The first is that under the Results tab you will not see the results of the query; rather, you will see the XML of the plan, no data.  Remember, SQL Server does not execute the query when returning the estimated execution plan.
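
If you want the same thing from plain T-SQL (whether or not this is exactly what the button does behind the scenes), SET SHOWPLAN_XML returns the estimated plan as XML without executing the query.

-- SET SHOWPLAN_XML must be the only statement in its batch, hence the GO separators
SET SHOWPLAN_XML ON;
GO
SELECT * FROM Production.Product;
GO
SET SHOWPLAN_XML OFF;
GO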

Another way to show that the plan is an estimated plan rather than the actual plan is to look at the properties of some of the operators.  Notice in the image below there are a number of Estimated properties but no Actual ones.  If this were the Actual plan, you would see both the estimated number of rows and the actual number of rows.

Another feature that I like is the Top Operators item.  This does exactly what you expect it to do: return the top operators.  If you have used SentryOne Plan Explorer, this is not new to you.  But for those who have not used Plan Explorer, this will be a welcome feature.  And if you haven’t used SentryOne Plan Explorer… you SHOULD.  It is a great tool and can be downloaded for free.  To make it easy for you, here is a link to download it, Click Here.

What if I want the actual execution plan and not the estimated plan?  There is not a button for this, but there is a shortcut: just hit Ctrl+M.  This will execute the query and return the actual plan.  As you can see below, we have both the actual number of rows and the estimated number of rows.

When running this I did find something interesting.  If I have two queries such as below in SSMS and I capture the actual execution plan, I will see both plans.

SELECT * FROM Production.Product
SELECT * FROM Production.ProductCategory
Both plans returned in SSMS.

This is not how it works in Azure Data Studio.  While the results for both queries are returned, as well as the XML for the plans, the graphical plan is returned only for the first query.

As you can see, both statements have the XML of the plan.  However, only one graphical plan is returned, for the first query in the batch.

Another item I noticed that is different in Azure Data Studio is the percent of batch in the plan.  In the image below from SSMS you can clearly see which of the two queries was the more costly.  In the other image, for Azure Data Studio, notice there is no percent for each statement in the batch.  I really liked using this in SSMS.

I think overall, Azure Data Studio appears to be a pretty cool tool.  Thank you for stopping to visit my blog!!

Azure Data Studio Introduction

I recently downloaded and installed Azure Data Studio, and so far it looks pretty good.  As I learn more about it, I will blog about the features and what is different from SQL Server Management Studio.  Azure Data Studio is a free offering from Microsoft, and the install is very simple.

Once installed, you will find the menu item in a slightly different location.  You will not find it under any of the SQL Server related groupings.  As you can see below, there is a new menu grouping, “Azure Data Studio”.  This is where you can find the menu item to launch the application.

When you first open Azure Data Studio you obviously do not have a connection to any data source, so at this point you will be asked to establish a connection.

You are asked to provide much of the same information you would in SSMS.  This includes the server name, the type of authentication, as well as the database.  You can also provide a Server Group if you like; this provides nothing more than a logical grouping of servers for easier management.

Once you connect you will see the dashboard, similar to below.  Here you are able to confirm the connection information and see a list of all the objects in the database.

You will also see that there are widgets.  The default widgets perform tasks such as database backup, database restore, new query, and new notebook.  The first three of these have been around for years; what might be new to some is the notebook, which we will get into later in this post.  This dashboard also includes information about the connection, including the compatibility level of the database.

If you click the server name in the breadcrumbs across the top, as below, you will see a few widgets that apply to the server level.

As in the screenshot below, you can see the backup status of all the databases as well as their current size.

If you click the upper left corner of the backup status, the current state of all the databases will be returned.

One of the biggest reasons we might be using Azure Data Studio is to read data.  This is very similar to SQL Server Management Studio: click the “New Query” item on the dashboard or New Query under the File menu.  When you do, it is similar to SSMS.  There are tabs across the top, and you can also change databases.  Both of these are the same as SSMS, although the database list is in a slightly different location.  You will also see the connect/disconnect button and change connection; these do exactly what you would expect them to do.

Results and Messages are still there, just as in SSMS.  Now let’s take a quick look at a few of the things that are different than SSMS.  The most obvious difference is the “Explain” button.  This will simply return the estimated execution plan of the query you are looking to run.

When you click the “Explain” button you will see what is above: the query plan and messages, as well as the Top Operators.  The image below is what you will see on the Top Operators tab.

You might think that the Results tab would have results in it.  The answer is yes and no: it does return results, just not data.  You will find the XML of the execution plan, and if you click it, a new tab will open with the XML in it.

Another item that is different is on the far right of the results tab, as below.

The top four buttons are for saving the data.  In order, they are for CSV, Excel, JSON, and XML.  The bottom two are for visualizing the data using charts.

Now we will move to the upper right corner.  The icon in the box below is for connections.

This is where we can add new connections.  In the upper right corner you will see three buttons.  The one farthest to the left is the new connection button.  The middle button is for creating server groups, which is what you see below in the red bar.


When adding a server group you are asked only to provide a group name, description and a color.  When you create a new connection, you will be able to pick a server group to add the new server to.

All the items covered above are also in SSMS in some form or another.  What is really cool about Azure Data Studio is something that cannot be found in SSMS: SQL Notebooks.  So far I am a big fan of this new feature.  It basically allows you to store notes and executable T-SQL code in the same document.

To add a new Notebook, you can find the Add Notebook item under the file menu.  There are two types of items that can be added to a notebook, code and text.  To add either one, you can click the buttons in the upper left or the links in the middle.

When Text is clicked you will have the opportunity to enter notes.  Below is an image of what you will see.  It looks like there are duplicate lines, but in reality there are not; once you click away from the note, the top one will disappear, leaving only the note.

If you click Code, you will then be able to enter T-SQL code just as you would in SSMS.

Notice the icon on the left side that looks like a play button.  Well, that is exactly what it does: it runs the query.

In the image above, you can see the text as well as the results of the query.  Once you are done with the results, you can clear them by clicking the ellipsis in the upper right corner of the results.  The bottom option is Clear Output.

Once you have everything in the Notebook you can save it.  When you save the Notebook, the file extension is .ipynb.

In the coming weeks I will be doing more posts on Azure Data Studio.  Although so far I really like it, there are a few things that I have not been able to find yet, like Actual Execution plans.