ADS: Simple Data Scripter

Moving data in a table from a production server to a lower environment, such as a development server, can be done a number of different ways.  These include backup\restore, using SSIS to move the data, the import\export wizard and scripting the data in SQL Server Management Studio.  Azure Data Studio by itself doesn't have a way to script the data in a table and it does not have the import\export wizard.  This is where the Simple Data Scripter extension comes in handy.

Like most extensions for Azure Data Studio, this extension is also in a preview state.  As of the date of this post, it is currently at version 0.1.1 and is published by Sean Price. The documentation for this extension can be found here, Simple Data Scripter. In the documentation there are a few limitations identified:

      • Not all data types are supported, specifically the binary datatypes like text, binary and varbinary
      • Has only been tested with SQL Server
      • Does not work on a view

Some extensions add options under Preferences; this extension does not.  That means there is little you can configure, which is OK, since there isn't much need for configuration anyway.

Just as a review, let's take a look at how we can script the data in SQL Server Management Studio.  Start by right clicking on the database and going to Tasks…Generate Scripts…  When you do, a wizard will pop up, and after a few clicks on Next you will arrive at the screen below.  The box highlights the "Types of data to script" option, which has three choices: Data Only, Schema and Data, and Schema Only.

When right clicking on a database in Azure Data Studio you will see limited options, none of which are related to scripting data.  This is also true when right clicking on a table in the database.  Here are the options we see by right clicking on a table; as before, none of them are related to scripting data.

In order to script the data in Azure Data Studio we will need to install the Simple Data Scripter extension.  This is a pretty straightforward and simple install.  You will need to download the .vsix file; you can get to the GitHub repository from the documentation for the extension in ADS.  You can then install the file under the File menu.  While you will not be prompted to restart ADS when you install it, I usually like to restart anyway.  In this case, the extension doesn't move to the installed extensions list until after a restart.

Now that the extension is installed, you will see a new option when you right click on a table.  This points to another potential limitation of the extension: there isn't an option to script all the data in the database.  Well, at least I haven't found it yet.  Then again, I am not sure that having a single place to script all the data in the database is the best way to go.

When clicked, the following screen will appear at the top of Azure Data Studio.  At first glance our reaction might be, what a bad query.  But it does make sense; we are creating a script of all the data in the table.

If you click anywhere else, ADS will cancel the query, the screen above will disappear, and you will see this pop up in the lower right corner.

If you press Enter, Azure Data Studio will execute the query and generate the script. You will not see a result set, but you will see the script; it will look similar to the one below.  Notice the CREATE TABLE statement at the top is commented out.  The table used for the example is named RentalCar, and the generated script works with a temporary table named #tempRentalCar.
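
To give a rough idea of the shape of what the extension generates, here is a small illustration; this is not the extension's exact output, and the columns of the RentalCar table are hypothetical.

-- CREATE TABLE #tempRentalCar ( RentalCarID int, Make varchar(50), Model varchar(50) )
INSERT INTO #tempRentalCar (RentalCarID, Make, Model) VALUES (1, 'Ford', 'Mustang');
INSERT INTO #tempRentalCar (RentalCarID, Make, Model) VALUES (2, 'Chevrolet', 'Malibu');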

When the script is finished, the Task window will open up at the bottom of the screen.  The top two entries are successful executions, while the third failed; the reason for the failure will be covered shortly.

You may be wondering if the query can be modified.  The answer is yes; however, not every type of statement will work.  Adding a WHERE clause works just fine.  However, as the query became more complex it failed more frequently.  I attempted to run a query with a join in it and it failed; that is the failure in the image above.  The error stated "Query has no results to return", even though the same query works just fine when run in ADS or SQL Server Management Studio.

Here are some of the statements I have tried; a rough sketch of these attempts follows the list.

      • Query with a join – failed
      • Query with a sub-query – failed
      • Query with a WHERE clause and just one table – succeeded
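
To make the list above concrete, here is a rough sketch of the kinds of modifications I tried against the generated scripting query.  The RentalCar table is the one from the example; the CustomerID column and the RentalAgreement table are hypothetical, included only to show the shape of each attempt.

-- Query with a WHERE clause and just one table: succeeded
SELECT * FROM dbo.RentalCar
WHERE CustomerID = 42;

-- Query with a join: failed with "Query has no results to return"
SELECT rc.*
FROM dbo.RentalCar AS rc
INNER JOIN dbo.RentalAgreement AS ra
    ON ra.RentalCarID = rc.RentalCarID;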

Given my expectations, this extension works very well and I can definitely see a use for it.

Thanks for visiting my blog!!

 

Auto Delete Jobs????

Have you ever looked at something in SQL Server and wondered why it is there?  That is what I think when I see this option in the SQL Server Agent job properties.  I cannot come up with any good reason why you would want a job to delete itself upon completion.  I even did a Google search and really didn't find a good reason.  However, if you know of a great reason to enable this, I would love to hear about it.

If you are not familiar with this option, you can find it under the Notifications tab of the job properties.

Just like all the notifications, you have several options.

      1. When the job succeeds
      2. When the job fails
      3. When the job completes

When the job deletes itself, it also takes the job history with it.  Basically, there will be no evidence that the job ever existed.  This is pretty straightforward, so you might be wondering why bring it up.  Well, I see this setting as a potential danger, especially if you have a disgruntled DBA.  While being a DBA is a great job, there are, from time to time, things that can have a negative impact on how we feel about our job.  Luckily the SQL Server community is filled with really great people.

We have a job that checks all of our jobs looking for any that have this set.  You can use the query below to get the necessary information.  There is a column, delete_level, in msdb.dbo.sysjobs that gives us what we need.

SELECT    name
  , delete_level
  , CASE delete_level
    WHEN 0 THEN 'Not Set'
    WHEN 1 THEN 'Delete on Success'
    WHEN 2 THEN 'Delete on Failure'
    WHEN 3 THEN 'Delete on Completion'
    END AS 'Delete Level Setting'
FROM msdb.dbo.sysjobs
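
For the job we use as a check, a filtered version of the query above is all that is needed.  This is a minimal sketch that only returns jobs where the setting has been changed from the default.

SELECT    name
  , CASE delete_level
    WHEN 1 THEN 'Delete on Success'
    WHEN 2 THEN 'Delete on Failure'
    WHEN 3 THEN 'Delete on Completion'
    END AS 'Delete Level Setting'
FROM msdb.dbo.sysjobs
WHERE delete_level <> 0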

So why do I think this is potentially dangerous?  Imagine the disruptive code that could be written, and then imagine that same code in a job that is scheduled to execute 6 months later.  Now go one step further: all evidence of that job running is now gone.  I really don't want to give too much information, but I can think of several things that could be done that could cost the company a ton of money.  This is why we check for this.

I also include this check on my server assessment scripts.

As I said earlier, if anyone can give me a good reason to have this set, I am willing to listen and will update this post accordingly.  Giving proper credit of course.

My suggestion is to look for this!

Thanks for visiting my blog!!!!

How to View Agent Jobs in Azure Data Studio

As a DBA, creating and managing SQL Server Agent jobs is an essential part of our day-to-day activity.  We have always done this via SQL Server Management Studio or Transact-SQL.  Both of these are still great options for managing jobs, but what about Azure Data Studio?  Unfortunately, the management of Agent jobs is not part of the native functionality of Azure Data Studio.  If you look at the screenshot below you will see that the SQL Server Agent is not available.  So does this mean that Azure Data Studio isn't the proper tool for the management of Agent jobs?  The answer is a classic DBA answer: it depends.  If you do not install an extension, you will not be able to manage jobs.
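
As a point of reference, the Transact-SQL route works the same no matter which tool you are in.  Here is a minimal sketch using the standard msdb objects; it simply lists the jobs and their enabled status.

-- List all Agent jobs and whether they are enabled
SELECT name
  , enabled
  , date_created
FROM msdb.dbo.sysjobs
ORDER BY name

-- Or use the system procedure for job details and last run status
EXEC msdb.dbo.sp_help_job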

Before going any further, let's take a few minutes to go over what an extension is.  As the name might imply, it extends something.  In this case, Azure Data Studio has its functionality extended to do something it normally cannot do.  An example of this would be the ability to manage SQL Server Agent jobs.  Since Azure Data Studio requires an extension to manage jobs, what extension needs to be added?  The answer is the Admin Pack for SQL Server extension.  There is a SQL Server Agent extension as well; it is part of the Admin Pack for SQL Server.  While this extension does more than just job management, we will focus mostly on how to manage Agent jobs using Azure Data Studio.

Here is a list of some of the functionality of the Admin Pack for SQL Server extension.

    •   Agent Job management
    •   Import file wizard
    •   Deploy and extract dacpacs
    •   Deploy and extract bacpacs
    •   Browse Extended Events.
    •   Other functionality as well

As stated earlier, we will need to install the Admin Pack for SQL Server extension.  So where do we get it?  Since it was created by Microsoft, you will be able to get it right from inside Azure Data Studio.

Before installing the extension, we should probably check to see if it is installed already or if it has been disabled. If you click the icon in the green box below, the extension Marketplace will open up.  If you see the Admin Pack for SQL Server under the Recommended block, it is not installed.  If you see it under Disabled, the extension has been disabled.

A list of disabled extensions can be found in the following location.  Notice that the extension has been greyed out.

Another way to determine if the extension has been installed is to see if the SQL Agent appears as an option when you are managing the server.  As in the image below, the Agent will be present when managing the server.  If you do not see SQL Agent, it means it is either not installed or has been disabled.  If it is not installed, you can go to the Marketplace and install it from there.

Now that we have confirmed the Agent extension is not only installed but also enabled, we are ready to manage the jobs on the server.  Let's do a quick inventory of what a typical DBA would do with jobs.  All of these can be completed with Azure Data Studio; however, there are just a few minor limitations that will be discussed later in this post.

      • Manage job history
      • Manage operators
      • Manage alerts
      • Manage proxies

To get to the Agent job information in Azure Data Studio, you will need to right click on the server and go to "Manage", as seen below.

Once you do, you will not only see the SQL Agent but also other Extensions you may have installed in Azure Data Studio.

In the above image, you will see that not only do I have the Admin Pack for SQL Server extension installed, but I also have the extension for sp_whoisactive and a few others.  All of which I would recommend that you install.

When you are finally in the proper place to manage the jobs, you will see the same items you would see when using SQL Server Management Studio.  You will of course see the job name, last run, enabled status, category and a few others.  These are the same items we have been working with for years in SSMS.  In the green box in the image below you will be able to click each item, and you will be taken to the management area for jobs, alerts, operators and proxies.

As you can see above, you can sort by any column you would like, and it also displays information about the failure if the last execution was a failure.  In addition to sorting, when you click the down arrow in the column header, you will also have an option to filter the column, similar to Excel.  In the red box above is something that is pretty cool.  Each bar represents one of the last five job executions.  If you click on one of the bars, you will be taken to the details of that execution.  The image below is an example of what you will see.

When looking at the jobs you will see the button we can use to create a new job. When creating a new job, all the same information is requested so it is pretty easy to figure out.

There are a few differences, though, when creating a new job in Azure Data Studio, starting with the job step type.  When working in Azure Data Studio you don't have the three BI stack related types.  The image below shows the job step types for SQL Server Management Studio on the left and Azure Data Studio on the right.

Another difference is the creation of new job schedules.  This can be done using SSMS, but cannot be done with Azure Data Studio.  You can use existing shared schedules, you just can't create new ones.

Moving away from jobs, alerts can also be created and modified utilizing Azure Data Studio.  All the same information can be entered; however, there is one difference.  When creating an alert you are asked to choose an alert type.  When using SSMS you will have three options:

      • SQL Server Event alert
      • SQL Server Performance condition alert
      • WMI event alert

When using Azure Data Studio, you only have one option, SQL Server Event alert.

While there are a few limitations, Azure Data Studio is still a very good tool.  The few limitations in Azure Data Studio for Agent job management can easily be overcome simply by using SQL Server Management Studio.

Thanks for visiting my blog!!

Azure Data Studio – Explain Button

One of the first things that jumped out at me in Azure Data Studio was the "Explain" button.  After you make your connection, you will see the Explain button in the upper right hand corner of the query window.

At first I was thinking maybe it was a shortcut to help.  Nope, it is actually an easy way to get the estimated execution plan.  The key word is estimated.  In SQL Server Management Studio we have long been able to easily get both the estimated and actual execution plans.

Just as a reminder, the actual execution plan requires the query to actually run, while the estimated plan uses statistics to generate the plan.  Generally, if the actual and estimated plans are captured very close to each other, you are more than likely not going to see a difference between the two.

Now let’s talk about the Explain button.  It will return two things, the XML of the execution plan and a graphical representation of the execution plan.

You will notice in the above image, the Results tab.  Just as in SSMS, this is where you can find the results of the query.  In the image above I simply clicked the Run button and the results were returned.

When you click the Explain button, the look of what is returned will be a bit different.  You will still see the Results and Messages items, but there are now two additional items, Query Plan and Top Operators.

The Query Plan item will be selected, revealing the estimated execution plan.  There are two ways to confirm that it is indeed the estimated plan.  The first is that under the Results tab you will not see the results of the query.  Rather, you will see the XML of the plan, no data.  Remember, SQL Server does not execute the query when returning the estimated execution plan.

Another way to tell that the plan is an estimated plan rather than an actual plan is to look at the properties of some of the operators.  Notice in the image below there are a number of Estimated properties but no Actual ones.  If this were the actual plan, you would see both the estimated number of rows and the actual number of rows.

Another feature that I like is the Top Operators item.  This does exactly what you expect it to do: return the top operators.  If you have used SentryOne Plan Explorer this is not new to you.  But for those who have not used Plan Explorer, this will be a welcome feature.  And if you haven't used SentryOne Plan Explorer…you SHOULD.  It is a great tool and can be downloaded for free.  To make it easy for you, here is a link to download it: Click Here.

What if I want the actual execution plan and not the estimated plan?  There is not a button for this, but there is a shortcut: just hit Ctrl+M.  This will execute the query and return the actual plan.  As you can see below, we have both the actual number of rows and the estimated number of rows.

When running this I did find something interesting.  If I have two queries such as below in SSMS and I capture the actual execution plan, I will see both plans.

SELECT * FROM Production.Product
SELECT * FROM Production.ProductCategory
Both plans are returned in SSMS.

This is not how it works in Azure Data Studio.  While the results for both queries are returned, as well as the XML for each plan, the graphical plan is returned only for the first query.

As you can see, both statements have the XML of the plan.  However, only one graphical plan is returned, for the first query in the batch.

Another item that I noticed is different in Azure Data Studio is the percent of batch in the plan.  In the image below from SSMS you can clearly see which of the two queries was the most costly.  In the other image, for Azure Data Studio, notice there is no percent for each statement in the batch.  I really liked using this in SSMS.
I think overall, Azure Data Studio appears to be a pretty cool tool.  Thank you for stopping to visit my blog!!

Azure Data Studio Introduction

I recently downloaded and installed Azure Data Studio and so far it looks pretty good.  As I learn more about it, I will blog about the features and what is different from SQL Server Management Studio.  Azure Data Studio is a free offering by Microsoft and the install is very simple.

Once installed, you will find the menu item in a slightly different location.  You will not find it under any of the SQL Server related groupings.  As you can see below, there is a new menu grouping, "Azure Data Studio".  Under this grouping is where you can find the menu item to launch the application.

Once you open Azure Data Studio, obviously you do not have a connection to any data source.  At this point you will be asked to establish a connection.

You are asked to provide much of the same information you would in SSMS.  This includes the server name, the type of authentication, as well as the database.  You can also provide a Server Group if you like.  This provides nothing more than a logical grouping of servers for easier management.

Once you connect, you will then see the dashboard, similar to below.  Here you are able to confirm the connection information and see a list of all the objects in the database.

Along with the list of all the objects in the database, you will also see that there are widgets.  The default widgets perform such tasks as database backup, database restore, new query and new notebook.  The first three of these have been around for years; what might be new to some is the notebook.  We will get into that later in this post.  This dashboard also includes information about the connection, including the compatibility level of the database.

If you click the server name in the breadcrumbs across the top, as below, you will see a few widgets that apply to the server level.

As in the screenshot below, you can see the backup status of all the databases and their current size.

If you click the upper left corner of the backup status, the current state of all the databases will be returned.

One of the biggest reasons why we might be using Azure Data Studio is to read data.  This is very similar to SQL Server Management Studio.  Click the "New Query" item on the dashboard or New Query under the File menu.  When you do, it is similar to SSMS: there are tabs across the top and you can also change databases, although the database list is in a slightly different location.  You will also see the connect\disconnect button and the change connection button.  These do exactly what you would expect them to do.

Results and Messages are still there, just as in SSMS.  Now let's take a quick look at a few of the things that are different from SSMS.  The most obvious difference is the "Explain" button.  This will simply return the estimated execution plan of the query you are looking to run.

When you click the "Explain" button, you will see what is above: the query plan and messages, as well as the Top Operators.  The image below is what you will see on the Top Operators tab.

You might think that the results tab would have results in it.  The answer is yes and no.  It does return results, just not data; you will find the XML of the execution plan.  If you click it, a new tab will open with the XML in it.

Another item that is different is on the far right of the results tab, as below.

The top four buttons are for saving the data.  In order, they are for CSV, Excel, JSON and XML.  The bottom two are for visualizing the data using charts.

Now we will move to the upper right corner.  The icon in the box below is for connections.

This is where we can add new connections.  In the upper right corner you will see three buttons.  The farthest on the left is the new connections button.  The middle button is for creating server groups.  This is what you see below in the red bar.

 

When adding a server group you are asked only to provide a group name, description and a color.  When you create a new connection, you will be able to pick a server group to add the new server to.

All the items covered above are also in SSMS in some form or another.  What is really cool about Azure Data Studio is something that cannot be found in SSMS.  That is SQL Notebook.  So far I am a big fan of this new feature.  It basically allows you to store notes and executable T-SQL code in the same document.

To add a new Notebook, you can find the Add Notebook item under the file menu.  There are two types of items that can be added to a notebook, code and text.  To add either one, you can click the buttons in the upper left or the links in the middle.

When Text is clicked, you will then have the opportunity to enter notes.  Below is an image of what you will see.  It looks like there are duplicate lines, but in reality there are not.  Once you click away from the note, the top one will disappear, leaving only the note.

If you click Code, you will then be able to enter T-SQL code just as you would in SSMS.

Notice the icon on the left side that looks like a play button.  Well, that is exactly what it does: it runs the query.

In the image above, you can see the text as well as the results of the query.  Once you are done with the results, you can clear them by clicking the ellipsis in the upper right corner of the results.  The bottom option is Clear Output.

Once you have everything in the notebook you can save it.  The notebook file extension is .ipynb.

In the coming weeks I will be doing more posts on Azure Data Studio.  Although I really like it so far, there are a few things that I have not been able to find yet, like actual execution plans.

How to capture an Execution Plan

Since in the coming weeks I will be presenting a session on Execution plans at a few SQL Saturday events, I thought I would do a post on how to capture execution plans.  Execution plans are a critical tool when attempting to troubleshoot a performance issue.  SQL Server Management Studio is one of the common tools we use to get the plan, if we have the query. You can also use Extended Events and Query Store.

Using SSMS

In SSMS there are two options to capture the execution plan: the actual and the estimated plan. The biggest difference between the two is that the actual plan needs to execute the query, while the estimated plan uses statistics to create the plan and does not execute the query.

To get the Estimated Execution plan, you can use the Query menu. There you will find the Display Estimated Execution Plan menu item.

There are two ways to get the actual execution plan.  One is on the tool bar and the other is on the Query menu.

When looking to capture the estimated execution plan, as soon as you click on the menu item, the plan will be returned.  If you are looking for the actual execution plan, SQL Server will execute the query and when it is finished the plan will be returned.  For simple queries the estimated execution plan will be returned very fast, however for more complex queries it might take some time.
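
If you prefer T-SQL over the menu items, the same two plans can be requested with session-level SET options.  This is a minimal sketch; the table is just the AdventureWorks Production.Product table used elsewhere on this blog, so substitute your own query.

-- Estimated plan: the query is NOT executed, only the plan XML is returned
SET SHOWPLAN_XML ON;
GO
SELECT * FROM Production.Product;
GO
SET SHOWPLAN_XML OFF;
GO

-- Actual plan: the query runs and the plan XML is returned along with the results
SET STATISTICS XML ON;
GO
SELECT * FROM Production.Product;
GO
SET STATISTICS XML OFF;
GO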

When in SSMS, how can I tell if the plan is actual or estimated? There are a few ways. The easiest way is to check the number of tabs.  As you can see below there are two tabs, one for Messages and another for Execution Plan.  There is a missing tab, Results.

Estimated Execution Plan:  Two tabs

Actual Execution Plan: Three tabs

Another way to determine if the plan is actual or estimated will require us to dig a little deeper.  We will need to look at the properties of a few of the operators.

When you float the cursor over the Index Scan you will see this popup.  The items in the green boxes are for the actual execution, while the items in the red boxes are for the estimated execution.  Although it includes the estimated values, and you might therefore think this is the estimated plan, it is not.

When both the estimated and actual number of rows can be found in the popup, this means that it is an actual execution plan.  If it were an estimated execution plan, you would not see the actual items, as you can see in the image below.

If the estimated and actual plans are captured very close to each other, you will not see much difference.  However, if there is a large period of time between them, there could be a number of differences.

This will work with both an on-prem instance of SQL Server as well as an Azure SQL Database.

Extended Events

Using SSMS works great if we have the query; however, it is a manual process.  What if you want to capture the plans after hours or automatically?  This is where Extended Events comes into play.  Just a word of warning: when capturing execution plans with Extended Events, you could actually cause a performance problem, so be careful.

This is what Microsoft has placed in the description of the events related to Execution Plans. I think it is a pretty clear warning about using Extended Events to capture Execution plans.

“Using this event can have a significant performance overhead so it should only be used when troubleshooting or monitoring specific problems for brief periods of time.”

When working with an On-Prem instance of SQL Server, Extended Events can be found under “Management”.  This is a server level setting, although you can limit the activity captured to a single database.

The location is slightly different for an Azure SQL Database.  Extended Events can be found under the database.

You can right click on the “Sessions” folder under Extended Events and then click “New Session”.

When creating the session there are three events we will want to consider. These events can be found in both an on-prem instance as well as Azure SQL Database.

query_pre_execution_showplan – this event is the equivalent of an estimated execution plan

query_post_execution_showplan – this event can be used to capture the actual Execution plan.

query_post_compilation_showplan – Occurs after a SQL statement is compiled.

To create an Extended Event session you can either use SQL Server Management Studio, by right clicking "Sessions" under the Extended Events item, or use the CREATE EVENT SESSION statement.

If using the SSMS graphical tool, right click on Sessions and then click "New Session"; you will see something similar to this.

Once you give the session a name, then you are ready to move on to picking the proper events.  Remember, there are three event options: query_pre_execution_showplan, query_post_execution_showplan and query_post_compilation_showplan.

If you enter "showplan" into the Event Library search box, you will see four events: the three mentioned earlier and a fourth related to Query Store.  We will discuss that one later.

Move the events over by clicking the “>” button and then you are ready to add the target.  Once you do, save the session, start it and you will now be collecting execution plans.

You can also use the CREATE EVENT SESSION statement. The statement below collects all three events, which, again, is NOT recommended.  Notice that this statement also has a filter for the AdventureWorks2014 database.  If you do decide to attempt what even Microsoft says should be used with caution, try to reduce the activity by limiting the collection to a single database.

CREATE EVENT SESSION [ExecutionPlan] ON SERVER
ADD EVENT sqlserver.query_post_compilation_showplan,
ADD EVENT sqlserver.query_post_execution_showplan,
ADD EVENT sqlserver.query_pre_execution_showplan(SET collect_database_name=(1)
    ACTION(sqlserver.sql_text)
    WHERE ([database_name]=N'AdventureWorks2014'))
ADD TARGET package0.event_file(SET filename=N'ExecutionPlan'),
ADD TARGET package0.ring_buffer
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=ON)
GO

Now that we are collecting the plans all we have to do is right click on the target and then click view data.  You will see something similar to below.  Notice the Query Plan tab highlighted.

If you right click on the plan you will see this context menu.  You can now save the plan, view the XML and even open the SentryOne Plan Explorer, if you have it installed.  If you don’t have it installed…..go get it.  It is a great tool.

 

Extended Events are a great tool to collect information about what is going on in your server.  As stated several times, if you are looking to Extended Events to collect your execution plans, do so very cautiously.  Also, limit the time the session is running and limit the activity to only the database where you are trying to solve a performance problem.

Query Store

The last option we will cover is Query Store.  This is a great tool!!!  Query Store came out with SQL Server 2016.  What I like about Query Store is that it allows me to collect the needed execution plans without the overhead of Extended Events.

To enable Query Store, simply go to the properties of the database and select Query Store.  The Operation Mode property is Off by default; change this to either Read Only or Read Write.  Now Query Store is enabled.

Notice that there are two Operation Mode properties: one is Actual and the other is Requested.  If you change Requested to something different from Actual, once you click OK the requested setting will become the actual setting.
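
If you would rather enable Query Store with T-SQL instead of the properties dialog, a minimal sketch looks like this, assuming the AdventureWorks2014 database used elsewhere in this post.

-- Turn on Query Store and allow it to capture new plans
ALTER DATABASE [AdventureWorks2014]
SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

-- Confirm the actual and requested states (run this while connected to the database)
SELECT actual_state_desc, desired_state_desc
FROM sys.database_query_store_options;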

Per Microsoft, here are a number of scenarios of why you might want to utilize Query Store.

Once you enable it and are collecting data, there are a number of reports that you can use to view the plans. Under that database, in this case, AdventureWorks2014, you will now see an option for Query Store.  When you expand it, you will see the reports that are available to you.

When you right click on one of the reports you will see something similar to this. By clicking on the bar in the chart on the upper left, the other two panels will update.  As you can see, the execution plan is in the bottom panel.

Once you have the plan visible, it works just like all the other methods of collecting the plan.  You can view the properties of each operator as well as save the plan for analysis later.

If you utilize query store, there is an event in Extended Events you might want to consider.  This event is query_store_generate_showplan_failure and will fire when the Query Store has an error creating the plan.

Here is the description from the event:

“Fired when Query Store failed to store a query plan because the showplan generation failed”

This post is about how to collect execution plans, including with Query Store.  In this post, I am not going to get in depth about all the options of Query Store.  There is some really great information out there, including some great videos.  Erin Stellato has some great content on Query Store in a number of locations, including SQLSkills.com, SQLPerformance.com and yes, even YouTube.

Query Store on SQLPerformance

There are a number of options for collecting execution plans.  My preference would be to utilize Query Store, mostly because it captures the plans automatically, has really good reporting and has very low overhead.

Thanks for visiting my blog!!!

Extended Events – Reading Multiple Session Files

Extended Events are a very useful tool when trying to troubleshoot performance issues. When setting up an Extended Event session there are multiple options for the storage of the data.  One of the most common targets for storage is the event_file target.  This type of storage can be used for both an on-prem instance of SQL Server as well as an Azure SQL database.

This post will go over how to read from both locations, on-prem and an Azure storage account.  They are similar, but different enough to cover here.  The biggest difference, as you will see, is setting everything up properly so you will be able to read multiple files with one T-SQL statement.

When setting up an event_file target there are a few options that can be set.  Of course the most logical is the path to the location where the files will be placed.  You will also have the opportunity to set the size of the files and the maximum number of files, as well as whether or not to allow file rollover.

Since you might have more than one file, you may want to read from multiple files with the same query; that is the focus of this blog post.  Of course, in order to read from a file, the file name is important, even when reading from multiple files.  If you look at the image below, you will see that there is a large number attached to the end of the file name.  Just a side note: I like to name the file the same as the Extended Event session name.  This makes it easier to match the files to the session.

According to Microsoft the number at the end of the file name is calculated using the following method:

“The integer value is calculated as the number of milliseconds between January 1, 1601 and the date and time the file is created.”

Reading On-Prem Files

When reading from an event_file target, the fn_xe_file_target_read_file function is used.  The first parameter of this function is the file path, which includes the file name.

Using the two files in the above image as an example, if I want to read from a single file, I would include the full path including the name of the file.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(N'C:\temp\ExecutionTImeOut_0_131920765279670000.xel', null, null, null);

Notice that in order to read from the file, I included the full name of the file.  However, there are two files due to the rollover.  What do I need to do to read from both files in one query?

In order to do this, there is one simple change that needs to be made.  Notice in the code below, the file name has been modified to use the “*” following the name.  The number at the end of the file is no longer needed.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(N'C:\temp\ExecutionTImeOut*.xel', null, null, null);

This will now return the data from all the XEL files in the location specified that start with “ExecutionTimeOut”.

Reading from Azure Storage Account

This works great for an on-prem instance of SQL Server; however, there are some changes that must be made when querying the file targets of an Azure SQL database.

Before we get into the change that needs to be made, let’s go over a few prerequisites on using Extended Events for an Azure SQL Database and using an Azure storage account.  This blog will not get very deep into a few of the prerequisites, however they do need to be mentioned.

The first prerequisite is having an Azure Storage account.  When you create one, you will be given a few options for the type of storage.  This can be done in your Azure portal.  You will need to decide on a few things as well, including which subscription you will use for the storage account and what region it is in.

To create the storage account, simply click the “Storage Account” menu item in the Azure portal.  You will need to fill in all the requested information.

Now that you have provided all the requested information, you can now utilize the storage.  When picking a target type in the Extended Events “wizard”, my preference is the event_file target type.

In order to read the Extended Event files, you will also need to get the Endpoint for the storage account.  As you can see above, you will need to go to the properties of the storage account.  When you do, you will then have to capture the Endpoint for the type of storage you chose.  In this case, the option would be between a file storage account or a Blob storage account.  Notice that the two Endpoints have only one small difference: right before the word "core" you will see either File or Blob.  You just need to copy the path to the clipboard.  You will need this path when reading the Extended Event files.

The next item you will need is a Shared Access Signature for the storage account. This can be obtained a few different ways.  One is to use the Azure Storage Explorer and the other is to use the Azure portal.

To obtain this signature using the Azure portal, navigate to the storage account and then click "Shared access signature".  The screen on the right will then be populated with the settings of the signature.  One important setting you want to be aware of is the End option under the "Start and expiry date\time" section.  Make sure you pick the appropriate duration.  When this date\time has passed, you will no longer be able to read the files in the storage account.  If you try to read an Extended Event file after the End date has passed, you will not receive an error; you will just see no records returned.

Towards the bottom of the above screen, after you scroll down a bit, you will see a button that looks like the blue button at the top of the image below.  When you click that, the SAS token will appear.  This is what we will need to read the Extended Event files.  To capture it, just click the blue button to the right of the box under SAS token.  In the blue box below, you will see two dates; these are the start and end dates for the token.  When the end date has passed, the token will no longer work and your queries to read from the files will not return any data.  In addition, you will also not get an error.

We are almost done with the prerequisites. Once you have configured the Extended Event session to use a storage account and the session is running, you are ready to read the files.  There is a lot of really good information out there on how to create an Extended Event session.

The last requirement is to create a database scoped credential in your Azure SQL Database.  To do this, you will need the SAS token mentioned above.  However, notice that there is a "?" at the front of it; you will need to remove this "?" prior to creating the credential.

Although not a requirement, I would also highly recommend that you download the Azure Storage Explorer.  It is a very nice tool that can make our lives much easier when working with storage accounts.

To create the credential, you will use SQL Server Management Studio.  Below you will see the code used to create the credential.  The first thing to remember is that you must run this code when connected to your Azure SQL Database.  Notice the path right after the CREATE line.  This is the path to your storage account and must start with HTTPS.  When you read this code, you will find the "SECRET"; this is where you place the SAS token you captured earlier.  Please remember to take the "?" out before executing the code.

CREATE DATABASE SCOPED CREDENTIAL [https://MyStorageAccountName.blob.core.windows.net/eefiles]
  -- this name must match the container path, start with https and must not contain a forward slash at the end; in this case, the container is eefiles
WITH IDENTITY = 'SHARED ACCESS SIGNATURE'
  -- this is a mandatory string and should not be changed
 , SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-07-01T09:53:56Z&st=2019-07-01T01:53:56Z&spr=tldodr43465kadfuJRsCYUmi1OmYywUjw%3D'
   -- this is the shared access signature key that you obtained from the portal; this shared access signature has been changed for security reasons
GO

You can query the sys.database_credentials view to confirm the credential was actually created.  Make sure you are connected to your Azure SQL Database.

SELECT * FROM sys.database_credentials

Finally we have all the prerequisites done, now we can move on to actually reading the files.

As you can see in the image below, the file names follow the same pattern as an on-prem instance of SQL Server.

Now that we are ready to read the files, one change you will need to make is to put in the path to your storage account.  The code below will read from just the one file hard-coded in the path.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(
    'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles_0_132044728780700000.xel',
                null, null, null
            );

However, we need to read from all the files.  When reading multiple files with an on-prem instance of SQL Server, we simply removed the number at the end of the file name and replaced it with an "*".  Logic would say that should work when reading from an Azure Storage account, but it doesn't.  You will not get an error either; it will just not return any rows.

This will not work, but it will not generate an error either, just no rows.

'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles*.xel',

The only change we need to make to the above code is to remove the numbers at the end of the file name.  Make sure to keep the front part of the name and remove the file extension as well.

'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles',

Now you can read from all the files with one query.  However, you need to make sure you are not pulling too much data; the query could time out, as it has for me a few times.

Using SSMS Menu items

Another way to read from multiple target files is inside SSMS.  However, it only works for an on-prem instance of SQL Server.  Under the File menu, Open, you will see a fly-out menu with the "Merge Extended Event Files…" item.

When you click it, you will have the option to navigate to the location of the files you wish to merge and display the data.  You will see a form like the one below.

You will then click the Add button and select the desired files.  You can use Ctrl+click to select more than one file.  Once you do, the files will be listed and you just click OK for the data to be displayed in SSMS.  Remember, this does not work for an Azure SQL Database.

The files don’t have to be from the same session to merge them, although many times they will be.  As you can see below there are files from two different sessions, the deadlock session and the ExecutionTimeout session.

Once the data is merged and displayed in SSMS, you can group the data by event. This can be done by right clicking on the column in the top half of the results and going to “Group by this column”.  The results will then look similar to the image below.  Notice that the top four rows are from the Deadlock session, while the bottom row, the Attention event, is from the QueryTimeout session.

 

Additional Code to Read the XML Data

While this post is about how to read multiple files, I also want to include the code to parse through the resulting XML from the above code.  If you look, the above code simply takes the data and converts it to XML, while the code below parses the XML into a more readable format.  You will need to make a few changes: use the path to your storage account and update the columns you want to extract from the XML.

If you look at the code below, there are three steps.  Well, actually only two; the middle step is optional.

--Extract the XML
SELECT event_data = convert(xml, event_data)
INTO #eeTable
 FROM sys.fn_xe_file_target_read_file(
                -- Fill in the Storage Account name and the associated Container name.
                'https://Mystorageaccount.blob.core.windows.net/eefiles/efiles',
                null, null, null
            );

--SELECT statement just to confirm there are rows, this is not needed
SELECT * FROM #eeTable

--Parse the XML and put it into the temporary table
SELECT
  ts  = event_data.value(N'(event/@timestamp)[1]', N'datetime'),
  [sql]  = event_data.value(N'(event/action[@name="sql_text"]/value)[1]', N'nvarchar(max)'),
  duration  = event_data.value(N'(event/data[@name="duration"]/value)[1]', N'nvarchar(max)'),
  spid  = event_data.value(N'(event/action[@name="session_id"]/value)[1]', N'int'),
  user_nm = event_data.value(N'(event/action[@name="username"]/value)[1]', N'nvarchar(max)'),
  dbname = event_data.value(N'(event/action[@name="database_name"]/value)[1]', N'nvarchar(max)'),
  explan  = event_data.value(N'(event/data[@name="showplan_xml"]/value)[1]', N'nvarchar(max)')
FROM #eeTable
ORDER BY ts

DROP TABLE #eeTable

 

You can get the column names from the Extended Events GUI.  When you are looking at the Actions tab, you would use 'event/action'; when you want to pull a column from the "Event Fields" tab, you would use 'event/data'.  Something to remember: if you misspell a column name, you will not get an error when you run the query.  The column will just be NULL for every row.

Here is an example of where to get the Action column names.  Notice that they can be found on the Global Fields (Actions) tab.

Below is an example of the "Data" fields.  Notice that the "Event Fields" tab is the active tab.

Now you can read from multiple Extended Event files with a single statement.  This will allow you to take the results, dump the data into a temporary table and then use T-SQL to find the data you are looking for.

Thank you for visiting my blog!!!!

Extended Events File Target – File Rollover and Max File Size settings

Extended Events has a number of options for the storage of the events.  These include ring_buffer, pair_matching, event_file and several others.  As expected, each of these has different options for configuration.  This post is about the event_file target.  As a side note, while Extended Event sessions usually have only one target, it is possible to have more than one.  The system_health session does have two, event_file and ring_buffer.

Below you will find a screenshot of the options for the event_file target.  These are the same type of options we have had in Profiler for many years.

File name just identifies what the file name will start with.  As you can see in the image below, SQL Server adds a large number to the file name.  This number is calculated as described below.

“The integer value is calculated as the number of milliseconds between January 1, 1601 and the date and time the file is created.”

The other options are pretty straightforward.  The maximum file size has two units available, MB and GB.
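
For reference, here is a minimal sketch of how these same options look when adding an event_file target with T-SQL instead of the GUI; the session name, event and file name are placeholders.

-- Hypothetical session showing the event_file target options discussed above
CREATE EVENT SESSION [DemoFileTarget] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
ADD TARGET package0.event_file
(
    SET filename = N'DemoFileTarget',  -- file name prefix; SQL Server appends the large number
        max_file_size = (50),          -- stored in MB
        max_rollover_files = (5)       -- set to 0 to turn off file rollover
)
GO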

If I only have a few sessions, I could easily go and look at each one.  However, what if I have a large number of sessions and I want to review these settings for all of them?  It would be nice to be able to pull this information out of the server using T-SQL.

When using Extended Events, there are a number of DMVs that can help us gather information about the sessions that exist on the server. If you do a Google search on "DMV Extended Events" you will get a number of links that point to one of the views in the list below.  Notice that they all start with "dm_xe_".  These are very useful views; however, the data we are looking for cannot be found in any of them.

To get this information we need to look in a different set of views.  These views start with “server_event” for an on-prem instance and “database_event” for an Azure SQL database.  Below is a list of this set of views.

To get the information we are looking for we need to use two of these, server_event_session_fields and server_event_sessions.  We will need to join these two views on the event_session_id column in both views.  If you look at the image below, you will notice that SQL Server stores the values as a row in the view rather than a column for each option.  Even though the GUI offers us two options for the maximum file size, MB and GB, the actual value is stored in MB.

For the maximum file size and the maximum rollover files, you will not see them in these views unless they are changed from the default. If you look at the image below there is one of the options missing, maximum rollover files.  This is because it is set to the default value of 5.

Just a little note about the max_rollover_files option.  This has a default of 5, as stated earlier; however, if you see that it has a value of 0, then "Enable File Rollover" is set to false, as in the image below.  If you are wondering why all the options are not active, this is because the target has already been created.

To get the defaults we need to run the query below.

SELECT    name
  , object_name
  , type_name
  , column_value
  , description
FROM sys.dm_xe_object_columns
WHERE object_name = 'event_file'

When you run the above query, your result set will look like what is below.  Notice the Max File Size and Max Number of files.

To get the data we are looking for we need to use the query below.

SELECT
 s.name,
 f.name,
 f.value
FROM
 sys.server_event_session_fields AS f
INNER JOIN
 sys.server_event_sessions AS s
ON
 f.event_session_id = s.event_session_id
WHERE f.name IN ('filename', 'max_file_size', 'max_rollover_files')
ORDER BY s.name

I have taken the above code and used PIVOT to hopefully make the results more readable.  This code also takes into account the default values, which, as you may remember, are in a different location.

 

Here is the complete code for an on-prem instance of SQL Server, including the PIVOT.

DECLARE @cols  AS NVARCHAR(MAX)
DECLARE @query AS NVARCHAR(MAX);

CREATE TABLE #EEOptions
  ( OptionName       sql_variant,
    Filename         sql_variant,
    MaxFileSize      sql_variant,
    MaxRolloverFiles sql_variant)

SET @cols = STUFF((SELECT DISTINCT ',' + QUOTENAME(name)
      FROM sys.server_event_session_fields
      WHERE name IN ('filename', 'max_file_size', 'max_rollover_files')
      FOR XML PATH(''), TYPE
      ).value('.', 'NVARCHAR(MAX)')
    , 1, 1, '')

SET @query = 'INSERT #EEOptions SELECT SessionName AS ''OptionName'', ' + @cols + ' FROM
            (
            SELECT
 s.name AS ''SessionName'',
 f.name AS ''OptionName2'',
 f.value
FROM
 sys.server_event_session_fields AS f
INNER JOIN
 sys.server_event_sessions AS s
ON
 f.event_session_id = s.event_session_id
WHERE f.name IN (''filename'', ''max_file_size'', ''max_rollover_files'')
           ) x
            PIVOT
            (
                  MAX(value)
                  FOR OptionName2 IN (' + @cols + ')
            ) p '

EXECUTE(@query)

SELECT OptionName
  , Filename
  , ISNULL(MaxFileSize, 1024) AS 'MaxFileSize'
  , ISNULL(MaxRolloverFiles, 5) AS 'MaxRolloverFiles'
FROM #EEOptions

DROP TABLE #EEOptions

This code will not work for an Azure SQL Database.  I will hopefully complete a post on how to do this in Azure.  One thing I can mention is that the properties are slightly different: the maximum file size is still an option, but maximum rollover files is not.

Thanks for taking the time to read my blog.  It is always appreciated.

Pending Restart for Authentication Type Change

In a previous post, I talked about how to determine if your SQL Server needs a restart (click here to read it).  In that post I talked about how to use the sys.configurations view to determine if a restart is needed.  This just looks at the running vs configured values of the configuration settings.  After talking about this with a co-worker, he had a very interesting question.

What about the server settings that are not in the configuration view?

Some server level settings, when changed require a reboot to take effect.  For example, if you change the Authentication type, a restart of the SQL Server service is needed for the change to take effect.

You will see this pop up box when you change a server property and a restart is needed.

In order to determine if a restart is needed, we need to look at two places: the registry and a server property.  There are a number of server properties.  Because this post is about the Authentication type, the property we are looking for is "IsIntegratedSecurityOnly".
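
As a quick standalone check, you can read the property on its own; a value of 1 means Windows Authentication only, while 0 means Mixed Mode.

SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS 'IsIntegratedSecurityOnly'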

When you change the value of the Authentication type in SQL Server Management Studio, it actually only changes it in the registry.  For the Server Property to change and match the registry, a restart of the SQL Server service is needed.

The code below can be used to complete the comparison; it looks at both places.

USE MASTER
GO
DECLARE @authmodeServerProp TINYINT = 0
DECLARE @Restart            VARCHAR(100)

SELECT @authmodeServerProp = CASE SERVERPROPERTY('IsIntegratedSecurityOnly')
            WHEN 0 THEN 2  --Set to Mixed, Windows and SQL authentication
            WHEN 1 THEN 1  --Set to Windows Only
        END

DECLARE @AuthMode INT = NULL

EXEC xp_instance_regread N'HKEY_LOCAL_MACHINE',
                         N'Software\Microsoft\MSSQLServer\MSSQLServer',
                         N'LoginMode', @AuthMode OUTPUT,
                         N'no_output'

IF @authmodeServerProp = @AuthMode
 BEGIN
  SELECT @Restart = 'No pending restart of the service due to ' +
       'a change in the Authentication Mode.'
 END
ELSE
  BEGIN
   SELECT @Restart = 'The Authentication Mode has changed, look into ' +
                 'restarting the service'
  END

SELECT    CASE @authmodeServerProp
      WHEN 1 THEN 'Windows Only'
      WHEN 2 THEN 'Windows and SQL Authentication'
      ELSE 'Something went wrong' END AS 'ServerProperty'
    , CASE @AuthMode
      WHEN 1 THEN 'Windows Only Authentication'
      WHEN 2 THEN 'Mixed Authentication'
      ELSE 'There was a problem' END AS 'Registry value'
    , @Restart AS 'RestartNeeded?'

 

The code above simply looks at the registry and the server property.  If they do not match, then a restart of the service is needed.  One thing to notice: the server property works with values of 0 and 1, while the registry works with values of 1 and 2.  That is why the code maps the server property's 0 and 1 to a 2 or a 1 to match the registry.

When you run the above code, you will have a data set similar to the one below.  The first two columns return the current value of each of the locations, while the last column tells us if a restart is needed.

This will be added to my “How to complete a Server Assessment” SQL Saturday presentation.

Thanks for stopping by my blog.

Is there a pending reboot on my SQL Server?

As a DBA we make every effort to avoid bringing systems offline; however, there are times when that is exactly what needs to happen.  From a Windows and SQL Server perspective there are a number of changes that require a reboot.  These range from the application of Windows updates to the changing of Windows policies and changes to SQL Server settings.  Over the years there have been a number of times where I have connected to a server and found a message stating that there is a pending reboot.  When changes are made, of course we want the changes to take effect as soon as possible.  In some cases a change will not take effect until a reboot happens.  In order to determine if there is a pending reboot, we need to look at a number of places, including both Windows and SQL Server.

Starting with Windows, there are some changes that require a reboot, including the application of Windows updates.  Checking for a pending reboot in Windows can be done using one of the newest friends of a DBA, PowerShell.  The information that needs to be queried is in the registry.

This is the popup that I have seen a number of times showing that there is a pending update.  What makes this the less than ideal option is that I need to log on to the server.

Brian Wilhite has a very nice PowerShell module that you can use to check for pending reboots in Windows.  It can be found on GitHub here. It can also be found on PowerShellGallery.com here. You can also download the ps1 file here.

On August 20, 2018 he stated on Microsoft's Script Center web site that he is no longer making updates to this on the Script Center site; you will need to get it from GitHub instead (link is above).  According to the Script Center site, here are the versions of the OS it has been tested on.  I have tested it on Windows 10 and it appears to work just fine.

The script below will install the module and complete the check.

# Install
Install-Module -Name PendingReboot

# Run
Test-PendingReboot -Detailed -ComputerName localhost

In order to run this script, you must run the PowerShell application as Administrator.  If you do not, you will see this error.

Below are the results of running the PendingReboot PowerShell script.  By adding the -Detailed switch you will see more information about what might have caused the need for the restart.  In addition, you can add the -ComputerName switch to check a remote computer.

What I like about this is that it allows me to test a remote computer.  Just as a precautionary measure, make sure to do thorough research and exercise the appropriate caution prior to running any code obtained from the internet on production servers.

Of course, knowing if there is a pending reboot in Windows is important; however, there are a number of SQL Server settings that will also cause a need for a restart of the SQL Server service in order to take effect.  The above PowerShell script will not detect this.  There are a few ways using SQL Server Management Studio that can tell us if the service needs to be restarted.
The first method is to look at the settings of the SQL Server.  At the bottom you will see "Configured Value" and "Running Value".  If you look at the images below, you can see that the Default index fill factor is different: the configured value is 0, while the running value is 85.  This is a sign that the SQL Server service needs to be restarted.

The drawback of this type of check is that I need to manually go to each setting option and do a compare.  Luckily, Microsoft has added a view for us to use to programmatically check this.  The view is sys.configurations.  There are a number of columns that are important for what we are looking for.  Below I have listed the columns that are important for the purpose of this post.  Of course there are more columns than these.

is_dynamic is the column that tells us whether a restart is needed when the value of the setting is changed.  If this is true, a restart of the service is not needed when the value is changed.

is_advanced identifies if it is an advanced option

value_in_use is the column that tells us the running value

value identifies the configured value

When looking at this, if value_in_use and value are different, then a service restart is needed.  In this case, the fill factor (%) is different.  The running value in use is 85% while the configured value is 0.

 

This is the code used to get the above result set.

SELECT    name
  , is_dynamic
  , is_advanced
  , value_in_use
  , value
FROM sys.configurations
WHERE is_dynamic = 0

In order to determine if a restart is needed, we need to add more to the WHERE clause.  We just need to do a simple comparison of the value_in_use and the value columns.  Of course, we are also only looking at the settings where is_dynamic is set to FALSE or 0.  If the query below returns any rows, a restart of the service is needed in order for the changes to take effect.

SELECT    name
  , is_dynamic
  , is_advanced
  , value_in_use
  , value
FROM sys.configurations
WHERE is_dynamic = 0
  AND value <> value_in_use

IF @@ROWCOUNT >= 1
 BEGIN
  SELECT 'Restart is needed' AS 'Restart Needed?'
 END
ELSE
 BEGIN
  SELECT 'NO restart is needed' AS 'Restart Needed?'
 END

Hopefully you will find this information useful.
Thank you for visiting my blog.