Azure Data Studio Connections and SSMS Registered Servers

Azure Data Studio and SQL Server Management Studio share a great deal of functionality, especially if you add the functionality of the ADS extensions.  While they are both great tools, very little information is passed from one tool to the other. For example, when you make a connection to a SQL Server in either tool, it will query the sys.databases catalog view to get a list of databases.  You will see the same list of databases in both tools.  This looks like something the two tools share, but in reality they do not.
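Each tool issues its own metadata query when it connects; nothing is shared between them.  Conceptually the query is something along these lines (a sketch, not the exact statement either tool sends):

SELECT name
FROM sys.databases
ORDER BY name;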

Now, let’s take a look at something that does get passed from Azure Data Studio to SSMS.  To see it, you will first need to open the Registered Servers window in SSMS, which can be found under the View menu.


Registered servers in SQL Server Management Studio can be very useful, especially if you need to run the same block of code on multiple servers.  This window does not exist in Azure Data Studio.  When viewing the Registered Servers window in SQL Server Management Studio, you will see at least two groups, Local Server Groups and Azure Data Studio.

Notice in the above image that there are several folders under both Local Server Groups and Azure Data Studio.  The items under Local Server Groups were created using SSMS.  If you right click on Local Server Groups, a context menu will pop up and you can then add another group or register a server.

However, if you right click on Azure Data Studio, a context menu will NOT appear.  If you want to add a group here, you will need to do so using Azure Data Studio instead.

To do this in Azure Data Studio, you will need to create a new Server Group.  When in the Connections panel, use the button highlighted below to create a new Server Group.

You do not need to restart Azure Data Studio to see the new group in the Connections pane.  However, you will need to restart SQL Server Management Studio to see the new group in the Registered Servers window.
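Behind the scenes, Azure Data Studio stores these groups in its settings.json file under the datasource.connectionGroups setting.  The entry looks roughly like the sketch below; the group name here is made up, and the id, color and description values will differ on your install:

"datasource.connectionGroups": [
    {
        "name": "Production Servers",
        "id": "00000000-0000-0000-0000-000000000000",
        "color": "#515151",
        "description": ""
    }
]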

Thanks for visiting my blog!!!

ADS: New Database

Regardless of which tool we are using, SQL Server Management Studio or Azure Data Studio, the need to create new databases is always present. Using Transact-SQL is an option in both tools.  What is not an option in both tools is to right click and go to “New Database”.  This has been in SSMS for many years; however, it is NOT present in Azure Data Studio.

Below you can see what the menu options are when you right click on databases in SSMS.  The New Database option is right at the top.


Now let’s move on to Azure Data Studio.  When you right click on a server you will have a number of options, including new query, refresh and disconnect.  What you do not see is “New Database”.

As with a great deal of functionality in Azure Data Studio, the ability to right click and create a new database comes with the installation of an extension.  In this case the extension was created by Kevin Cunnane.  It is pretty simple to install and costs the same as ADS: nothing.  This extension is currently not in “Preview” mode and is at version 1.0.0.

To install you will need to click the install button when looking at the extension documentation in Azure Data Studio.

Once you do, you will be taken to the location to download the .VSIX file.  You will need to download it and save it to a location where it won’t be deleted or overwritten. To install, simply click “Install Extension from .VSIX package” on the File menu.

Once you do, you will see in the lower right corner that the extension has been installed.  A restart of Azure Data Studio is not required.  Now when you right click on a server, what do you see?  Well…you will see that nothing changed.  You will need to right click on Databases under the connection instead.

Once you click “New Database” you will not see the same dialog box you would see in SSMS.  What you will see is something like the image below.  In the box you will enter the name of the new database.

Then hit Enter and, assuming you have the proper rights, ADS will create the database.  You will not be able to enter any of the settings we often change to accommodate the specific needs of the database or system, things like file location, recovery model, file growth rate and many more.  The database will be created with the server defaults for all of these settings, so you will need to use TSQL or SSMS to adjust them afterwards.
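For example, here is a minimal sketch of adjusting a couple of those settings afterwards with TSQL.  It assumes the database was named MyNewDatabase in the dialog and that the default logical log file name (MyNewDatabase_log) applies; adjust both to match your instance.

-- Change the recovery model from the server default
ALTER DATABASE [MyNewDatabase] SET RECOVERY SIMPLE;

-- Change the log file growth increment
ALTER DATABASE [MyNewDatabase]
MODIFY FILE (NAME = N'MyNewDatabase_log', FILEGROWTH = 256MB);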

Overall this is a good extension, although it depends heavily on default database settings.  I still prefer to use TSQL when creating a database; however, this is a nice, simple method as well.

Here is a list of a number of available extensions, click here.  This list does not include every extension that is available, but it is still a nice list.

Thanks for visiting my blog!!

ADS: Settings Editor

When working with the settings in Azure Data Studio, you have a few options for how you view them.  The two options available at this point are the “user interface” and “JSON”.  While it is usually easier to modify the settings using the UI, you might prefer to use the JSON instead.  I feel that anyone who uses Azure Data Studio on a regular basis should learn the basics of JSON.

Here are my reasons why I think we should learn the basics of JSON:

      • Some settings must be changed in JSON
      • One of the export options for a result set is JSON

The first reason is the focus of this post.  Many settings can easily be modified using the user interface; however, not all can.

The settings can be found under the File menu, or you can use the keyboard shortcut “Ctrl + ,”.  Under the File menu you will find Preferences.  It is in the pop-out menu that the Settings menu item can be found.

Once in Settings you will quickly see that there are many settings.  Most of these you will not have a need to change, but it is good to know what can be changed. The user interface is quite easy to navigate, and you can even search for specific key words at the top.

As you scroll through the various settings you might notice that not all the options can be changed in the UI.  Some must be changed using the JSON file.

The settings.json file can be found in this location:

C:\Users\<<UserProfileName>>\AppData\Roaming\azuredatastudio\User

Notice that “azuredatastudio” does not have any spaces.  When you navigate to this location “C:\Users\<<UserProfileName>>\AppData\Roaming” you may see two folders, both named Azure Data Studio.  One will have spaces, the other will not.  The settings.json file is found in the folder without spaces in the name.


Now that we know the location of the settings.json file, let’s return to the editor.  As mentioned earlier, the settings can be changed in two places, the editor UI and the JSON file.

This is what you will see if the setting can be modified in the UI.  Depending on the setting, you may see a dropdown box, a text box or a check box.

If you see something like what is in the green box below, the setting must be modified using the settings.json file.

When using the UI, you have both options.  However, there is a setting that will allow you to use only the settings.json file.  This setting can be found in the Workbench group.

The dropdown box will have two options, ui and json.  When set to json, the settings.json file will open when you attempt to look at the settings.  Keep in mind, when you change this to json, you will not get a warning and it will be saved automatically by default.

With it set to json, as expected the settings.json file will open instead of the user interface.

You might be asking, how can I change it back?  This is quite simple.  Look for the “workbench.settings.editor” setting and change it back to “ui”.  Remember there are only two options, “ui” and “json”.  Make sure to keep the quotes.  As you can see in the above image, it is on line 209 in my instance, although it will probably be different in your instance of ADS.
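For reference, the entry in settings.json looks something like the line below; the surrounding settings, and the line it lands on, will differ per install:

"workbench.settings.editor": "ui"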

When in the JSON file, if you hover your mouse pointer over one of the settings, a pop-up will appear that may provide a bit of information about the setting, something like what is in the red box below.

While I prefer to use the UI, it is nice to have this option if I ever want to change.

If you are not using Azure Data Studio on a regular basis, my suggestion is to take a look at it.  The price is right (it is free) and it has many great features.

Thanks for visiting my blog!!


ADS: Notebook Text Formatting

Azure Data Studio has a great feature in the form of Notebooks.  A notebook is a document that allows for executable code such as TSQL or Python as well as notes for documentation.  Notebooks can be utilized to help organize things such as DR runbooks and troubleshooting documents.  What I like best about notebooks is that everything needed can be located in a single file, including data.

When working in a Notebook you have two types of cells, text and code.  The focus of this post is how to format the text cells.  The text itself is easy; it can say anything you would like it to say.  When we work with text in Word, there is a formatting toolbar we can use to make the text look the way we want.  The text cells do not have this toolbar.

You might be asking, without the format toolbar, does that mean we can’t format the text?  The answer is no….we can still format the text, we just need to do it slightly differently.  Rather than using a toolbar, we use characters; text cells are formatted with Markdown syntax.

While not all formatting options are available in Notebooks, a few commonly used options are.  These are listed below, along with a few others we won’t cover in this post.

        • Font Size
        • Font Weight
        • Bullet lists
        • Italics

Here is an example of what your text might look like with formatting.  Notice that there are different size fonts, bold fonts, an email address, a URL and a small section highlight (blue vertical bar).  We will go over each of these in this post.

First of all, in order to get to the location that allows you to format the text, you simply need to find the text you want to format and double click on it.  When you do, the box to format the text will appear just above it, similar to below.

The chart below has an example of each of the font options.  The sample code is in parentheses.


Font Changes in a Notebook

        • No formatting
        • One # (# Font Changes)
        • Two # (## Font Changes)
        • Three # (### Font Changes)
        • Four #, adds bold (#### Font Changes)
        • Five #, adds bold (##### Font Changes)
        • Six # – makes the font tiny and bold (###### Font Changes)
        • Seven # – no impact at all, the # signs are printed literally (####### Font Changes)

Italics

The above chart shows how to change the size and weight of the text.  Well, what if I want the text to be in italics?  To do this we will use the *, well actually a pair of them.  One is placed at the start of the text you want italicized and the other at the end.

Like this:   *Font Changes*

This is what it will look like in a Notebook

If you want a larger font you can combine characters.

Using both the # and *, we can get a larger font as well as italics.

This is what it would look like:  # *Font Changes*

The image above shows a much larger font and italics.

URL

If you want to include a URL to a web site, you can easily do this without the use of any special characters such as a # or *.  When you do, the URL appears just as you type it.  For example, if I want to include a link to my blog I could type www.davebland.com, and that is exactly how it will appear in the text, www.davebland.com.

However, what if I want different text to appear, let’s say I want “Dave Bland’s blog” to appear in the text rather than www.davebland.com?  This is also pretty easy.  You will need to utilize a pair of square brackets, [], and a pair of parentheses, ().

The square brackets will hold the text I want to appear, while the URL will be placed between the ().

Like this:

[Dave Bland’s Blog](https://www.davebland.com)
Here is an example of both.


Bold

We already talked about how you can use the # to get a bold font.  But what if I want a word in the middle of the text to be bold and nothing around it?  This is where the * comes in again.  We just put two * before and two after the text we want bold.

Like this:

# I want these words to be bold: **Font Changes**
This is what it will look like.  Notice the # at the start; remember, this makes the font larger.


Block of Text

Azure Data Studio also offers the ability to block off sections of text, similar to the text in the green box below.  Notice the blue vertical bar and the tinted background.

This is done by placing a > at the start of the line.

This is what it will look like:

You can add more > at the front if you like; this will create blocks within a block.

This is what it will look like:

>>### “Those responsible for sacking the people who have just been sacked, have been sacked.” – What movie is this from?


And this is the result:

Bullet Lists

The final format item I would like to cover is a bullet list.  Like all the other format options, this is done with a special character, in this case the plus sign, +.

This is what it will look like:

+ First Bullet
+ Second Bullet
This is the result:

Put it all together

If you use a number of these in one text box, this is what it might look like.
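Something along these lines, an illustrative sketch combining the options covered above (the wording is made up for the example):

# *Runbook Overview*

This notebook documents the **server restart** process.

> ### Review each step before running it

+ First step
+ Second step

[Dave Bland's Blog](https://www.davebland.com)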

And this is the result:

There you have it….formatting text in an Azure Data Studio notebook without the use of a format toolbar.

Thanks for visiting my blog!!!


Azure Data Studio – Explain Button

One of the first things that jumped out at me in Azure Data Studio was the “Explain” button.  After you make your connection, you will see the Explain button in the upper right hand corner of the query window.

At first I was thinking maybe it was a shortcut to help.  Nope, it is actually an easy way to get the estimated execution plan.  The key word is estimated.  In SQL Server Management Studio we have long been able to easily get both the estimated and the actual execution plan.

Just as a reminder, the actual execution plan requires the query to actually run, while the estimated plan uses statistics to generate the plan.  Generally, if the actual and estimated plans are captured very close to each other, you are more than likely not going to see a difference between the two.
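An estimated plan is the same thing you can request yourself in TSQL with SET SHOWPLAN_XML, which must be the only statement in its batch.  A minimal sketch (the SELECT is illustrative):

SET SHOWPLAN_XML ON;
GO
SELECT * FROM Production.Product;
GO
SET SHOWPLAN_XML OFF;
GO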

Now let’s talk about the Explain button.  It will return two things, the XML of the execution plan and a graphical representation of the execution plan.

You will notice in the above image, the Results tab.  Just as in SSMS, this is where you can find the results of the query.  In the image above I simply clicked the Run button and the results were returned.

When you click the Explain button, the look of what is returned will be a bit different.  You will still see the Results and Messages items, but there are now two additional items, Query Plan and Top Operators.

The Query Plan tab will be the item that is selected, revealing the estimated execution plan.  There are two ways to confirm that it is indeed the estimated plan.  The first is that under the Results tab you will not see the results of the query.  Rather, you will see the XML of the plan, no data.  Remember, SQL Server does not execute the query when returning the estimated execution plan.

Another area to look at to show that the plan is an estimated plan rather than actual is the properties of some of the operators.  Notice in the image below there are a number of Estimated properties but no Actual.  If this were the actual plan, you would see both estimated number of rows and actual number of rows.

Another feature that I like is the Top Operators item.  This does exactly what you expect it to do: return the top operators.  If you have used SentryOne Plan Explorer this is not new to you.  But for those who have not used Plan Explorer, this will be a welcome feature.  And if you haven’t used SentryOne Plan Explorer…..you SHOULD. It is a great tool and can be downloaded for free. To make it easy for you…here is a link to download it, Click Here.

What if I want the actual execution plan and not the estimated plan?  There is not a button for this, but there is a shortcut: just hit Ctrl + M.  This will execute the query and return the actual plan.  As you can see below, we have both actual number of rows and estimated number of rows.
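The TSQL equivalent here is SET STATISTICS XML, which executes the query and returns the actual plan XML alongside the results (again, the SELECT is illustrative):

SET STATISTICS XML ON;
GO
SELECT * FROM Production.Product;
GO
SET STATISTICS XML OFF;
GO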

When running this I did find something interesting.  If I have two queries such as below in SSMS and I capture the actual execution plan, I will see both plans.

SELECT * FROM Production.Product
SELECT * FROM Production.ProductCategory
Both plans are returned in SSMS.

This is not how it works in Azure Data Studio.  While the results for both queries are returned, as well as the XML for both plans, the graphical plan is returned only for the first query.

As you can see, both statements have the XML of the plan.  However, only one graphical plan is returned, for the first query in the batch.

Another item that I noticed is different in Azure Data Studio is the percent of batch in the plan.  In the image below from SSMS you can clearly see which of the two queries was the most costly.  In the other image, from Azure Data Studio, notice there is no percent for each statement in the batch.  I really liked using this in SSMS.
I think overall, Azure Data Studio appears to be a pretty cool tool.  Thank you for stopping to visit my blog!!

Extended Events – Reading Multiple Session Files

Extended Events are a very useful tool when trying to troubleshoot performance issues. When setting up an Extended Event session there are multiple options for the storage of the data.  One of the most common targets for storage is the event_file target.  This type of storage can be used for both an on-prem instance of SQL Server and an Azure SQL database.

This post will go over how to read from both locations, on-prem and an Azure storage account.  They are similar, but different enough to cover here. The biggest difference, as you will see, is setting everything up properly so you will be able to read multiple files with one TSQL statement.

When setting up an event_file target there are a few options that can be set.  Of course the most obvious is the path to the location where the files will be placed.  You will also have the opportunity to set the size of the files and the maximum number of files, as well as whether or not to allow file rollover.
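Here is a minimal sketch of how those options appear in a session definition; the session name, event, path and sizes are illustrative:

CREATE EVENT SESSION [ExecutionTimeOut] ON SERVER
ADD EVENT sqlserver.attention
ADD TARGET package0.event_file (
    SET filename = N'C:\temp\ExecutionTimeOut.xel',
        max_file_size = (10),       -- megabytes per file
        max_rollover_files = (5)    -- number of files to retain
);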

Since you might have more than one file, you may want to read from multiple files with the same query; that is the focus of this blog post.  Of course, in order to read from a file, the file name is important, even when reading from multiple files.  If you look at the image below, you will see that there is a large number attached to the end of each file name.  Just a side note: I like to name the file the same as the Extended Event session.  This makes it easier to match the files to the session.

According to Microsoft the number at the end of the file name is calculated using the following method:

“The integer value is calculated as the number of milliseconds between January 1, 1601 and the date and time the file is created.”

Reading On-Prem Files

When reading from an event_file target, the fn_xe_file_target_read_file function is used.  The first parameter of this function is the file path, which includes the file name.

Using the two files in the above image as an example, if I want to read from a single file, I would include the full path including the name of the file.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(N'C:\temp\ExecutionTimeOut_0_131920765279670000.xel', null, null, null);

Notice that in order to read from the file, I included the full name of the file.  However, there are two files due to the rollover.  What do I need to do to read from both files in one query?

In order to do this, there is one simple change that needs to be made.  Notice in the code below, the file name has been modified to use the “*” following the name.  The number at the end of the file is no longer needed.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(N'C:\temp\ExecutionTimeOut*.xel', null, null, null);

This will now return the data from all the XEL files in the location specified that start with “ExecutionTimeOut”.

Reading from Azure Storage Account

This works great for an on-prem instance of SQL Server; however, there are some changes that must be made when querying the file targets of an Azure SQL database.

Before we get into the change that needs to be made, let’s go over a few prerequisites on using Extended Events for an Azure SQL Database and using an Azure storage account.  This blog will not get very deep into a few of the prerequisites, however they do need to be mentioned.

The first prerequisite is having an Azure storage account, which can be created in your Azure portal.  You will be given a few options for the type of storage.  You will need to decide on a few things as well, including what subscription you will use for the storage account and what region it is in.

To create the storage account, simply click the “Storage Account” menu item in the Azure portal.  You will need to fill in all the requested information.

Now that you have provided all the requested information, you can now utilize the storage.  When picking a target type in the Extended Events “wizard”, my preference is the event_file target type.

In order to read the Extended Event files, you will also need the Endpoint for the storage account.  As you can see above, you will need to go to the properties of the storage account and capture the Endpoint for the type of storage you chose; in this case, the option is between File storage and Blob storage. Notice that the two Endpoints have only one small difference: right before the word “core”, you will see either file or blob.  You just need to copy the path to the clipboard.  You will need this path when reading the Extended Event files.

The next item you will need is a Shared Access Signature for the storage account. This can be obtained a few different ways.  One is to use the Azure Storage Explorer and the other is to use the Azure portal.

To obtain this signature using the Azure portal, navigate to the storage account and then click “Shared access signature”.  The screen on the right will then be populated with the settings of the signature.  One important setting to be aware of is the End option under the “Start and expiry date\time” section.  Make sure you pick an appropriate duration; once this date\time has passed, you will no longer be able to read the files in the storage account. When you try to read an Extended Event file after the End date has passed, you will not receive an error, you will just see no records returned.

Towards the bottom of the above screen, after you scroll down a bit, you will see a button that looks like the blue button at the top of the image below.  When you click it, the SAS token will appear.  This is what we will need to read the Extended Event files.  To capture it, just click the blue button to the right of the box under SAS token.  In the blue box below, you will see two dates; these are the start and end dates for the token.  When the end date has passed, the token will no longer work and your queries to read from the files will not return any data.  In addition, you will not get an error.

We are almost done with the prerequisites. Once you have configured the Extended Event session to use a storage account and the session is running, you are ready to read the files.  There is a lot of really good information out there on how to create an Extended Event session.

The last requirement is to create a database scoped credential in your Azure SQL Database.  To do this, you will need the SAS token mentioned above.  However, notice that there is a “?” at the front of it.  You will need to remove this “?” prior to creating the credential.

Although not a requirement, I would also highly recommend that you download the Azure Storage Explorer.  It is a very nice tool that can make our lives much easier when working with storage accounts.

To create the credential, you will use SQL Server Management Studio.  Below you will see the code used to create the credential.  The first thing to remember is that you must run this code while connected to your Azure SQL Database.  Notice the path right after the CREATE line.  This is the path to your storage account and must start with HTTPS.  When you read this code, you will find the “SECRET”; this is where you place the SAS token you captured earlier.  Please remember to take the “?” out before executing the code.

CREATE DATABASE SCOPED CREDENTIAL [https://MyStorageAccountName.blob.core.windows.net/eefiles]
  -- this name must match the container path, start with https and must not contain a forward slash at the end; in this case, the container is eefiles
WITH IDENTITY = 'SHARED ACCESS SIGNATURE'
  -- this is a mandatory string and should not be changed
 , SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-07-01T09:53:56Z&st=2019-07-01T01:53:56Z&spr=tldodr43465kadfuJRsCYUmi1OmYywUjw%3D'
   -- this is the shared access signature key obtained from the portal; it has been changed here for security reasons
GO

You can query the sys.database_credentials view to confirm the credential was actually created.  Make sure you are connected to your Azure SQL Database.

SELECT * FROM sys.database_credentials

Finally we have all the prerequisites done, now we can move on to actually reading the files.

As you can see in the image below, the file names follow the same pattern as an on-prem instance of SQL Server.

Now that we are ready to read the files, the one change you will need to make is to put in the path to your storage account.  The code below reads just from the file hard coded in the path.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(
    'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles_0_132044728780700000.xel',
    null, null, null
);

However, we need to read from all the files.  When reading multiple files on an on-prem instance of SQL Server, we simply removed the number at the end of the file name and replaced it with an “*”.  Logic would say that should work when reading from an Azure storage account, but it doesn’t.  You will not get an error either; it will just not return any rows.

The path below will not generate an error, but it will not return any rows either.

'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles*.xel',

The only change we need to make to the above code is to remove the numbers at the end of the file name.  Make sure to keep the front part of the name, and remove the file extension as well.

'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles',

Now you can read from all the files with one query.  However, you need to make sure you are not pulling too much data; the query could time out, as it has for me a few times.
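One simple way to keep an exploratory read small is to cap the number of rows returned (a sketch; the TOP value is arbitrary):

SELECT TOP (1000) event_data = CONVERT(xml, event_data)
FROM sys.fn_xe_file_target_read_file(
    'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles',
    null, null, null
);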

Using SSMS Menu items

Another way to read from multiple target files is inside SSMS; however, it only works for an on-prem instance of SQL Server.  On the File menu, under Open, you will see a pop-out menu with the “Merge Extended Event Files…” item.

When you click, you will have the option to navigate to the location of the files you wish to merge and display the data.  You will see the form like the one below.

You will then click the Add button and select the desired files.  You can use Ctrl+click to select more than one file.  Once you do, the files will be listed and you just click OK for the data to be displayed in SSMS.  Remember, this does not work for an Azure SQL Database.

The files don’t have to be from the same session to merge them, although many times they will be.  As you can see below there are files from two different sessions, the deadlock session and the ExecutionTimeout session.

Once the data is merged and displayed in SSMS, you can group the data by event. This can be done by right clicking on the column in the top half of the results and going to “Group by this column”.  The results will then look similar to the image below.  Notice that the top four rows are from the Deadlock session, while the bottom row, the Attention event, is from the ExecutionTimeout session.


Additional Code to Read the XML Data

While this post is about how to read multiple files, I also want to include the code to parse through the resulting XML from the above code.  If you look, the above code simply takes the data and converts it to XML, while the code below parses the XML into a more readable format. You will need to make a few changes: use the path to your storage account, and update the columns you want to extract from the XML.

If you look at the code below, there are three steps.  Well, actually only two; the middle step is optional.

--Extract the XML
SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(
    -- Fill in the storage account name and the associated container name.
    'https://Mystorageaccount.blob.core.windows.net/eefiles/efiles',
    null, null, null
);

--SELECT statement just to confirm there are rows; this is not needed
SELECT * FROM #eeTable

--Parse the XML from the temporary table into a more readable format
SELECT
  ts       = event_data.value(N'(event/@timestamp)[1]', N'datetime'),
  [sql]    = event_data.value(N'(event/action[@name="sql_text"]/value)[1]', N'nvarchar(max)'),
  duration = event_data.value(N'(event/data[@name="duration"]/value)[1]', N'nvarchar(max)'),
  spid     = event_data.value(N'(event/action[@name="session_id"]/value)[1]', N'int'),
  user_nm  = event_data.value(N'(event/action[@name="username"]/value)[1]', N'nvarchar(max)'),
  dbname   = event_data.value(N'(event/action[@name="database_name"]/value)[1]', N'nvarchar(max)'),
  explan   = event_data.value(N'(event/data[@name="showplan_xml"]/value)[1]', N'nvarchar(max)')
FROM #eeTable
ORDER BY ts

DROP TABLE #eeTable


You can get the column names from the Extended Events GUI.  For an action, you would use ‘event/action’; for a column from the “Event Fields” tab, you would use ‘event/data’.  Something to remember: if you misspell a column name, you will not get an error when you run the query.  The column will just be NULL for every row.

Here is an example of where to get the Action column names. Notice that they can be found on the Global Fields (Actions) tab.

Below is an example of the “Data” fields.  Notice that the “Event Fields” tab is the active tab.

Now you can read from multiple Extended Event files with a single statement.  This will allow you to take the results, dump the data into a temporary table and then use T-SQL to find the data you are looking for.

Thank you for visiting my blog!!!!

Azure Budgets

Finances are always a source of contention when they are not going well.  In many homes, finances can be a stress point for the family.  However, if you had a budget and were notified when you got closer to the limit, I think life might be a lot easier.  Fortunately, Microsoft has added the ability to create and use budgets in your Azure environment.  Although almost everything I blog about is related to SQL Server, budgets are not just for SQL Server Azure services; they cover your entire Azure subscription.

This blog isn’t going to be about how Microsoft bills Azure customers or what type of activity can increase your bill.  Nothing in life is free; if you follow the simple rule that everything we do in Azure has a cost, although sometimes a very small cost, it might be a bit easier to manage the costs.  One thing to remember is that a budget is scoped to a subscription and its related resources.

It just takes a few clicks to get to the budget part of the portal.  First, click “Cost Management + Billing” in the menu on the left side of the portal.

At this point, there are several different paths you can take to get to the next step. For the purposes of this post, we will take just one path.  Select Overview and then click the subscription you wish to set the budget on.

The last step to get to the Budget area is to click the “Budgets” item. Now you are ready to start the process of setting up the budgets.

When you are looking to set up a budget, there are few questions you should have an answer to, prior to creating it.

        • What will be the maximum for the budget?
        • What will happen when I reach the limit?
        • Do I want incremental warnings? If so, who should receive the alerts?
          • Maybe an email at 80% of budget
          • Text message at 100% of budget
        • When will the budget start and when will it end?
        • How often will the budget be reset?
          • Monthly
          • Quarterly
          • Yearly

Once you have an idea of what the answers to the above questions might be, you are ready to proceed with creating the budget.

To demo the creation of a budget, I will use these answers to the above questions.

        • What will be the maximum for the budget? – $5,000
        • What will happen when I reach the limit? – Text message and email
        • Do I want incremental warnings? If so, who should receive the alerts?
          • Maybe an email at 80% of budget – yes, email at 80%
          • Text message at 100% of budget – yes
        • When will the budget start and when will it end? – It will start when the budget is configured and will expire on June 6, 2021.
        • How often will the budget be reset? – Quarterly

Now to implement the budget in Azure.

What will be the maximum for the budget? – $5,000

This first thing that needs to be created is the budget itself.  Once you clicked the budget menu item as mentioned earlier, you then will need to click the “Add” link.  By clicking this, you will then be taken to the form that will allow you to enter all the necessary information for the budget.

The blue arrow above points to the subscription that will define the scope of the budget.  Remember, the budget is set for the subscription and covers all the costs associated with that subscription.

Once you click the “Add” link you will then see something similar to the image below.


Of course the name is important; I prefer to give the budget a name that is somewhat descriptive.

Then you must enter an amount.  In order to do this properly, you will also need to identify when the amount will be reset to zero.

Based on the earlier answers, the amount will be $5,000.  Once you have the amount entered as you see below, now you need to determine how often you would like it to reset.

As you can see in the image above, there is a “Reset” option.  This is where you will determine the reset interval. You have multiple options for this setting.

          • Billing Month
          • Billing Quarter
          • Billing year
          • Monthly
          • Quarterly
          • Annually

When you pick an option, the next set of options will change based on the option you picked. The order of the options is pretty straightforward: starting from the left, Year, Month and Day. Notice below that all three are inactive; that is due to the reset option you picked, in this case “Billing Month”.

Below are the Reset options and the impact of the Year\Month\Day options.  These images were created on the 8th of the month.  The day changed based on that.

          • Billing Month
          • Billing Quarter – notice the year and month are available to change
          • Billing year
          • Monthly – the day changed to the first, unlike the above images that set the day to the day before.
          • Quarterly
          • Annually

Now that we have set a starting date, we also need to set the Expiration Date.  This is the date you no longer want the budget to be in effect.

The next configuration setting we must address is what level of the budget will trigger an alert.  For example, our budget for the subscription is $5,000 and we want to get an email at 80% utilization, or $4,000.  The budget amount, black box below, is a calculation of the amount, red box, and the percent of budget, yellow box.  In this case, 80% of $5,000 is $4,000.

Action Group

Now that we have set up the basics, we now need to determine what will happen when the Alert Condition is met.  Action Groups will help us out here.

When you are setting up the alert, you will have to click the drop down box under the Action Group header.  If no Action Groups exist, you will only see None.  However, in this case there is an action group, so we see that as well.

Creating an Action Group

An Action Group is an essential component for setting up a response to a threshold being met. If you need to create an Action Group, you will need to click “Manage Action Groups” in the upper left, just under the “Create Budget” header.


If you are trying to add an Action Group to an existing budget you can do so by clicking “Monitor” on the menu list on the left side of the Azure portal.

Regardless of how you get to the form to add a new Action Group, you will then be asked to complete a few basic settings.  These include the group name, a short name, the subscription and the resource group you would like to add it to.

You will need to give the Action a name, hopefully something descriptive.

Once you do, it is time to pick the Action type.  To see what types are available, click the drop down under the Action Type header.

While there are a number of options, we will focus on the email\sms options.  If you look back at the requirements outlined earlier, you might remember that we wanted an email at 80% and a text message at 100%.

Types

Email – This is obviously an email address you would like the alert to be sent to. Per Microsoft, there is a limit of no more than 100 emails in an hour.

SMS – This is a text message, and you will need to provide a valid number for it to be sent to.  While the limit for emails was 100 per hour, the limit for SMS is no more than 1 SMS every 5 minutes, or 12 an hour.

Push – This will send a notification to the Azure app.  For more information about the app, please go here

Voice – This will place a voice call to the number you have indicated. There is a limit with this type as well: no more than 1 voice call every 5 minutes.

Here is the link to the limit page in the Microsoft documentation.

Filling in the information that is needed for the alert is pretty simple.  All you need to do is fill out the type you would like to utilize. For example, if you would like the action to only send an email, then you would enter an email address just under the word “Email”.

You can easily see the location for the options that we are considering.  These options include, Email, Voice, SMS and Azure App push notification.

In order to meet the requirement for notification, we will need two Actions.  One to send the email at 80% and one to send an SMS alert at 100%.  They will look something like the image below.

Summary form and Adding to a Dashboard

Once you have everything set, you can review the budget configuration.  If you look at the breadcrumbs across the top, the path to this summary is easily identified.

Home – Cost Management + Billing – Overview – Subscription name – click on budget


When looking at the summary form, you will also see the progress chart similar to the one below.

If you click the pin in the upper right corner of the graph, you will be able to pin the graph to a dashboard.

Below is similar to what you might see once the budget is added to the dashboard.  I find it very useful to have the budget easily seen.  Placing it on a dashboard turns what could be a difficult task, managing the budget, into something easy to keep an eye on, and gives us a nice tool to review current status.


Creating and utilizing a budget is essential to managing the overall cost of your Azure environment.  Hopefully, this will help you take a small step in the right direction.

Thanks for visiting my blog!!

Azure SQL Database Connections work in SSMS but not with Azure Data Factory Connection

Over the past month or so, I have attended a number of sessions on Azure Data Factory.  Both presenters were amazing and I learned a ton.  However, as I usually do, I took the information and went off on my own to play with this new tool I hear a lot of people talking about. During both sessions I was able to connect to my Azure SQL Database with SQL Server Management Studio, but not with the connection in Azure Data Factory. While I really didn’t get much of a chance to ask the presenter about my problem during the session, I was finally able to afterwards, and he gave me the answer I was looking for.

Before getting to the solution, let’s go over what the problem actually was.  I have an Azure SQL Database that I have connected to many times via SSMS.  However, during both sessions I attempted to create a connection in ADF and the test of the connection failed every time.

So what did I check? Just to confirm, I again connected via SSMS as I had done many times in the past.  Again, no issues. Now I was convinced that the issue was not the database or the username\password. The next thing I thought of was the firewall rules.  I asked myself: could there be an issue with the firewall and the IP addresses?  Again, all appeared to be working just fine.  Then I decided to do one more check using a third technology and connected to the database via the Azure SQL Editor, which requires a username and password to connect.  Again, all appeared to be fine.

At this point, I knew more about what was not causing the problem than what was.  Finally I had a moment with one of the presenters and he was able to give me the cause and the solution. As is usually the case, it was a single setting causing this, with a very simple fix: the Azure SQL DB “Allow access to Azure services” setting.

To get to the setting, navigate to the settings of the Azure SQL DB and click on Overview, the orange box below.  Then you will see the “Set server firewall” option; click that.

Once you do, you will see something like the image below.  Simply move the On\Off slider to On for the “Allow access to Azure services” setting, then click the Save button.
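If you prefer to confirm the setting from TSQL, it surfaces as a server-level firewall rule.  A quick check, run while connected to the master database of the logical server (the rule name below is the one Azure uses for this setting):

SELECT name, start_ip_address, end_ip_address
FROM sys.firewall_rules;
-- When "Allow access to Azure services" is on, a rule named
-- AllowAllWindowsAzureIps appears with start and end address 0.0.0.0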

Once you do that, you should be able to connect to your Azure SQL Database using the Azure Data Factory connection.  This made perfect sense: SSMS is not an Azure service, while Azure Data Factory is. Also keep in mind there are many reasons that could create a similar situation; this was simply the cause of the issue I was seeing, so be prepared to continue troubleshooting if this does not work for you.

Thanks for visiting my blog!!!

Create a New Dashboard in Azure Portal

Dashboards can be used in many different products and can be very helpful for many reasons.  Azure has a default dashboard, but it allows you to create your own dashboards to easily get the information that is important to you.

If you look at the screenshot of an Azure portal dashboard, you can see I get quite a bit of information quickly.  In this case I can see three reports on activity for my subscription, a list of my resources and the current cost.

To get to the dashboard you need to click the “Dashboard” menu item on the portal menu on the left side of the screen.  Now that we are at the “Main” dashboard we can either leave it or edit it.

When you look at the dashboard the first time, you will see the default dashboard. An example of this is in the image below.

Create A New Dashboard

If the default dashboard does not meet my needs, Azure provides an option to create your own. To create a new dashboard, simply click “New Dashboard” on the menu that appeared at the top of the screen.

Once you click on “New Dashboard” you will see an empty dashboard as below.

The first item you should consider changing is the title.  Of course this should reflect what the dashboard is to display.

Add items to a Dashboard

Now we are at the point we need to add the tiles that will display the information you are looking for. On the left side you will see the “Tile Gallery”.  These are some of the items you can add to your dashboard.

To add the item you can either drag\drop the item to the dashboard or you can click the “Add” item to the right of the tile title.

When you click the “Add” item, Azure will place the tile in the next available location, to the right of the last tile in the top row.  In the image below, I added four items and Azure kept adding them to the right, with the exception of “User Sign-ins”.  Because the clock and the “User Sign-ins” tiles are not as tall as the others, Azure placed “User Sign-ins” under the clock.  With most tiles, however, they will be placed to the right of the last tile on the top row.

Another method of adding tiles is to drag and drop the tiles from the Tile Gallery.  When you use this method, Azure will place a shadow outline on the screen that will show you where the tile will be placed, similar to the image below.

Once I have added an item, I can change the size if the default size is not what I am looking for.  When you hover your cursor over the upper right corner of the tile, you will see a garbage can and a box with three dots.  To change the size, click the box with three dots and a size context menu will appear.  You can then pick the size you would like for the tile.

Now that we have added a few tiles and have everything we need, we can click “Done Customizing” at the top of the screen.  When you do, Azure will take you to your dashboard.

Change Dashboards

Now that we have added a dashboard, we should have two, the one we created and the default dashboard.  To change what dashboard you are viewing, you click the drop down and pick the dashboard you would like to view from the menu.


Edit an Existing Dashboard

If you have an existing dashboard you would like to edit, simply navigate to the dashboard and click “Edit” menu item on the menu across the top.

Now you can add tiles, remove tiles, resize tiles and change the order and placement of the tiles.

Add Metrics Reports

In a previous blog post, I talked about how to create a metrics report, and in that same post I mentioned how to add a metrics report to a dashboard.  Here is the link to that post.


Thanks for visiting my blog!!!