Extended Events – Reading Multiple Session Files

Extended Events are a very useful tool when trying to troubleshoot performance issues. When setting up an Extended Event session there are multiple options for the storage of the data.  One of the most common targets for storage is the event_file target.  This type of storage can be used for both an on-prem instance of SQL Server as well as an Azure SQL database.

This post will go over how to read from both locations, on-prem and an Azure storage account.  They are similar, but different enough to cover here. The biggest difference, as you will see, is setting everything up properly so you will be able to read multiple files with one T-SQL statement.

When setting up an event_file target there are a few options that can be set.  The most obvious is the path to the location where the files will be placed.  You will also have the opportunity to set the size of the files, the maximum number of files, and whether or not to allow file rollover.
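Below is a minimal sketch of what those options look like in T-SQL; the session name, event, path, and sizes are just examples chosen to match the naming used in this post (for an Azure SQL database the session would be created ON DATABASE with an https URL as the file name instead of a local path).

CREATE EVENT SESSION [ExecutionTimeOut] ON SERVER
ADD EVENT sqlserver.attention
ADD TARGET package0.event_file
(
    SET filename = N'C:\temp\ExecutionTimeOut.xel',  -- path and base file name
        max_file_size = (10),                        -- size of each file, in MB
        max_rollover_files = (5)                     -- maximum number of files kept when rolling over
);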

Since you might have more than one file, you may want to read from multiple files with the same query; that is the focus of this blog post.  Of course, in order to read from a file, the file name is important, even when reading from multiple files.  If you look at the image below, you will see that there is a large number attached to the end of the file name.  Just a side note, I like to name the file the same as the Extended Event session name.  This makes it easier to match the files to the session.

According to Microsoft, the number at the end of the file name is calculated using the following method:

“The integer value is calculated as the number of milliseconds between January 1, 1601 and the date and time the file is created.”

Reading On-Prem Files

When reading from an event_file target, the fn_xe_file_target_read_file function is used.  The first parameter of this function is the file path, which includes the file name.

Using the two files in the above image as an example, if I want to read from a single file, I would include the full path including the name of the file.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(N'C:\temp\ExecutionTImeOut_0_131920765279670000.xel', null, null, null);

Notice that in order to read from the file, I included the full name of the file.  However, there are two files due to the rollover.  What do I need to do to read from both files in one query?

In order to do this, there is one simple change that needs to be made.  Notice in the code below, the file name has been modified to use the “*” wildcard following the name.  The number at the end of the file name is no longer needed.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(N'C:\temp\ExecutionTImeOut*.xel', null, null, null);

This will now return the data from all the XEL files in the location specified that start with “ExecutionTimeOut”.

Reading from Azure Storage Account

This works great for an on-prem instance of SQL Server; however, there are some changes that must be made when querying the file targets of an Azure SQL database.

Before we get into the change that needs to be made, let’s go over a few prerequisites for using Extended Events with an Azure SQL Database and an Azure storage account.  This blog will not go very deep into a few of the prerequisites; however, they do need to be mentioned.

The first prerequisite is having an Azure Storage account, which can be created in your Azure portal.  When you do this, you will be given a few options for the type of storage.  You will need to decide on a few things as well, including what subscription you will use for the storage account and what region it is in.

To create the storage account, simply click the “Storage Account” menu item in the Azure portal.  You will need to fill in all the requested information.

Once you have provided all the requested information, you can utilize the storage.  When picking a target type in the Extended Events “wizard”, my preference is the event_file target type.

In order to read the Extended Event files, you will also need to get the Endpoint for the storage account.  As you can see above, you will need to go to the properties of the storage account and capture the Endpoint for the type of storage account you chose.  In this case, the option would be between a file storage account and Blob storage.  Notice that the two Endpoints have only one small difference: right before the word “core”, you will see either File or Blob.  You just need to copy the path to the clipboard.  You will need this path when reading the Extended Event files.

The next item you will need is a Shared Access Signature for the storage account. This can be obtained in a few different ways.  One is to use the Azure Storage Explorer and the other is to use the Azure portal.

To obtain this signature using the Azure portal, navigate to the storage account and then click “Shared access signature”.  The screen on the right will then be populated with the settings of the signature.  One important setting to be aware of is the End option under the “Start and expiry date/time” section.  Make sure you pick the appropriate duration: when this date/time has passed, you will no longer be able to read the files in the storage account. When you try to read an Extended Event file after the End date has passed, you will not receive an error; you will just see no records returned.

Towards the bottom of the above screen, after you scroll down a bit, you will see a button that looks like the blue button at the top of the image below.  When you click that, the SAS token will appear.  This is what we will need to read the Extended Event files.  To capture this, just click the blue button to the right of the box under SAS token.  In the blue box below, you will see two dates; these are the start and end dates for the token.  When the end date has passed, the token will no longer work and your queries to read from the files will not return any data.  In addition, you will also not get an error.

We are almost done with the prerequisites. Once you have configured the Extended Event session to use a Storage Account and the session is running, you are ready to read the files.  There is a lot of really good information out there on how to create an Extended Event session.

The last requirement is to create a database scoped credential in your Azure SQL Database.  To do this, you will need the SAS token mentioned above.  However, notice that there is a “?” at the front of it.  You will need to remove this “?” prior to creating the credential.

Although not a requirement, I would also highly recommend that you download the Azure Storage Explorer.  It is a very nice tool that can make our lives much easier when working with storage accounts.

To create the credential, you will use SQL Server Management Studio.  Below you will see the code used to create the credential.  The first thing to remember is that you must run this code while connected to your Azure SQL Database.  Notice the path right after the CREATE line.  This is the path to your storage account and must start with HTTPS.  When you read this code, you will find “SECRET”; this is where you place the SAS token you captured earlier.  Please remember to take the “?” out before executing the code.

CREATE DATABASE SCOPED CREDENTIAL [https://MyStorageAccountName.blob.core.windows.net/eefiles]
  -- this name must match the container path, start with https, and must not end with a forward slash; in this case, the container is eefiles
WITH IDENTITY = 'SHARED ACCESS SIGNATURE'
  -- this is a mandatory string and should not be changed
 , SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-07-01T09:53:56Z&st=2019-07-01T01:53:56Z&spr=tldodr43465kadfuJRsCYUmi1OmYywUjw%3D'
   -- this is the shared access signature key obtained from the portal; it has been changed here for security reasons
GO

You can query the sys.database_credentials view to confirm the credential was actually created.  Make sure you are connected to your Azure SQL Database.

SELECT * FROM sys.database_credentials
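One more note on the credential: as mentioned above, when the SAS token eventually expires, the queries against the files will quietly return no rows.  If that happens, one option is to drop the credential and recreate it with a fresh token.  A small sketch, using the same example container URL as above:

DROP DATABASE SCOPED CREDENTIAL [https://MyStorageAccountName.blob.core.windows.net/eefiles];
GO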

Finally, we have all the prerequisites done; now we can move on to actually reading the files.

As you can see in the image below, the file names follow the same pattern as an on-prem instance of SQL Server.

Now that we are ready to read the files, one change you will need to make is to put in the path to your storage account.  The code below will read just from the file hard-coded in the path.

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(
    'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles_0_132044728780700000.xel',
    null, null, null
);

However, we need to read from all the files.  When reading multiple files with an on-prem instance of SQL Server, we simply removed the number at the end of the file name and replaced it with an “*”.  Logic would say that should work when reading from an Azure Storage account, but it doesn’t.  You will not get an error either; it will just not return any rows.

The path below will run, but it will not generate an error either; it just returns no rows.

'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles*.xel',

The change that actually works is to remove the numbers at the end of the file name, along with the wildcard and the file extension, keeping only the front part of the name.

'https://MyStorageAccountName.blob.core.windows.net/eefiles/efiles',

Now you can read from all the files with one query.  However, you need to make sure you are not pulling too much data; the query could time out, as it has for me a few times.

Using SSMS Menu items

Another way to read from multiple target files is inside SSMS.  However, it only works for an on-prem instance of SQL Server.  On the File menu, under Open, you will see a fly-out menu with the “Merge Extended Event Files…” item.

When you click it, you will have the option to navigate to the location of the files you wish to merge and display the data.  You will see a form like the one below.

You will then click the Add button and select the desired files.  You can use Ctrl+click to select more than one file.  Once you do, the files will be listed and you just click OK for the data to be displayed in SSMS.  Remember, this does not work for an Azure SQL Database.

The files don’t have to be from the same session to merge them, although many times they will be.  As you can see below there are files from two different sessions, the deadlock session and the ExecutionTimeout session.

Once the data is merged and displayed in SSMS, you can group the data by event. This can be done by right clicking on the column in the top half of the results and going to “Group by this column”.  The results will then look similar to the image below.  Notice that the top four rows are from the Deadlock session, while the bottom row, the Attention event, is from the QueryTimeout session.

 

Additional Code to Read the XML Data

While this post is about how to read multiple files, I also want to include the code to parse through the resulting XML from the above code.  The above code simply takes the data and converts it to XML, while the code below parses the XML into a more readable format. You will need to make a few changes: use the path to your own storage account and update the columns you want to extract from the XML.

If you look at the code below, there are three steps.  Well, actually only two; the middle step is optional.

--Extract the XML

SELECT event_data = CONVERT(xml, event_data)
INTO #eeTable
FROM sys.fn_xe_file_target_read_file(
                -- Fill in the Storage Account name and the associated Container name.
                'https://Mystorageaccount.blob.core.windows.net/eefiles/efiles',
                null, null, null
            );

--SELECT statement just to confirm there are rows, this is not needed
SELECT * FROM #eeTable;

--Parse the XML from the temporary table
SELECT
  ts       = event_data.value(N'(event/@timestamp)[1]', N'datetime'),
  [sql]    = event_data.value(N'(event/action[@name="sql_text"]/value)[1]', N'nvarchar(max)'),
  duration = event_data.value(N'(event/data[@name="duration"]/value)[1]', N'nvarchar(max)'),
  spid     = event_data.value(N'(event/action[@name="session_id"]/value)[1]', N'int'),
  user_nm  = event_data.value(N'(event/action[@name="username"]/value)[1]', N'nvarchar(max)'),
  dbname   = event_data.value(N'(event/action[@name="database_name"]/value)[1]', N'nvarchar(max)'),
  explan   = event_data.value(N'(event/data[@name="showplan_xml"]/value)[1]', N'nvarchar(max)')
FROM #eeTable
ORDER BY ts;

DROP TABLE #eeTable;

 

You can get the column names from the Extended Events GUI.  For columns listed on the Actions tab, you use ‘event/action’ in the XQuery path; for columns listed on the “Event Fields” tab, you use ‘event/data’.  Something to remember: if you misspell a column name, you will not get an error when you run the query.  The column will just be NULL for every row.

Here is an example of where to get the Action column names. Notice that they can be found on the Global Fields (Actions) tab.

Below is an example of the “Data” fields.  Notice that the “Event Fields” tab is the active tab.

Now you can read from multiple Extended Event files with a single statement.  This will allow you to take the results, dump the data into a temporary table and then use T-SQL to find the data you are looking for.

Thank you for visiting my blog!!!!

Azure Budgets

Finances are always a source of contention when they are not going well.  In many homes, finances can be a stress point for the family.  However, if you had a budget and were notified as you got closer to the limit, I think life might be a lot easier.  Fortunately, Microsoft has added the ability to create and use budgets in your Azure environment.  Although almost everything I blog about is related to SQL Server, budgets are not just for SQL Server Azure services; they apply to your entire Azure subscription.

This blog isn’t going to be about how Microsoft bills Azure customers or what type of activity can increase your bill.  Nothing in life is free; if you follow the simple rule that everything we do in Azure has a cost, although sometimes a very small cost, it might be a bit easier to manage the costs.  One thing to remember is that a budget is scoped to a subscription and its related resources.

It just takes a few clicks to get to the budget part of the portal.  The first click is on “Cost Management + Billing” in the menu on the left side of the portal.

At this point, there are several different paths you can take to get to the next step. For the purposes of this post, we will take just one path.  Select Overview and then click the Subscription you wish to set the budget on.

The last step to get to the Budget area is to click the “Budgets” item. Now you are ready to start the process of setting up the budgets.

When you are looking to set up a budget, there are a few questions you should have an answer to prior to creating it.

        • What will be the maximum for the budget?
        • What will happen when I reach the limit?
        • Do I want incremental warnings? If so, who should receive the alerts?
          • Maybe an email at 80% of budget
          • Text message at 100% of budget
        • When will the budget start and when will it end?
        • How often will the budget be reset?
          • Monthly
          • Quarterly
          • Yearly

Once you have an idea of what the answers to the above questions might be, you are ready to proceed with creating the budget.

To demo the creation of a budget, I will use these answers to the above questions.

        • What will be the maximum for the budget? – $5,000
        • What will happen when I reach the limit? – Text message and email
        • Do I want incremental warnings? If so, who should receive the alerts?
          • Maybe an email at 80% of budget – yes, email at 80%
          • Text message at 100% of budget – yes
        • When will the budget start and when will it end? – When the budget is configured and will expire on June 6, 2021.
        • How often will the budget be reset? – Quarterly

Now to implement the budget in Azure.

What will be the maximum for the budget? – $5,000

The first thing that needs to be created is the budget itself.  Once you have clicked the Budgets menu item as mentioned earlier, you will need to click the “Add” link.  By clicking this, you will be taken to the form that will allow you to enter all the necessary information for the budget.

The blue arrow above points to the subscription that will define the scope of the budget.  Remember, the budget is set for the subscription and covers all the costs associated with that subscription.

Once you click the “Add” link you will then see something similar to the image below.

 

Of course the name is important; I prefer to give the budget a name that is somewhat descriptive.

Then you must enter an amount.  In order to do this properly, you will also need to identify when the amount will be reset to zero.

Based on the earlier answers, the amount will be $5,000.  Once you have the amount entered as you see below, now you need to determine how often you would like it to reset.

As you can see in the image above, there is a “Reset” option.  This is where you will determine the reset interval. You have multiple options for this setting.

          • Billing Month
          • Billing Quarter
          • Billing year
          • Monthly
          • Quarterly
          • Annually

When you pick an option, the next set of options will change based on the option you picked. The order of the options is pretty straightforward, starting from the left: Year, Month, and Day. Notice below that all three are not active; that is due to the reset option you picked, in this case “Billing Month”.

Below are the Reset options and the impact of the Year\Month\Day options.  These images were created on the 8th of the month.  The day changed based on that.

          • Billing Month
          • Billing Quarter – notice the year and month are available to change
          • Billing year
          • Monthly – the day changed to the first, unlike the above images that set the day to the day before.
          • Quarterly
          • Annually

Now that we have set a starting date, we also need to set the Expiration Date.  This is the date after which you no longer want the budget to be in effect.

The next configuration setting we must address is what level of the budget will trigger an alert.  For example, our budget for the subscription is $5,000 and we want to get an email at 80% utilization, or $4,000.  The budget amount, black box below, is a calculation of the amount, red box, and the percent of budget, yellow box.  In this case, 80% of $5,000 is $4,000.

Action Group

Now that we have set up the basics, we now need to determine what will happen when the Alert Condition is met.  Action Groups will help us out here.

When you are setting up the alert, you will have to click the drop-down box under the Action Group header.  If no Action Groups exist, you will only see None. However, in this case, there is an action group, so we see that as well.

Creating an Action Group

An Action Group is an essential component for setting up a response when a threshold is met. If you need to create an Action Group, you will need to click “Manage Action Groups” in the upper left, just under the “Create Budget” header.

 

If you are trying to add an Action Group to an existing budget you can do so by clicking “Monitor” on the menu list on the left side of the Azure portal.

Regardless of how you get to the form to add a new Action Group, you will be asked to complete a few basic settings.  These include the group name, a short name, the subscription, and the resource group you would like to add it to.

You will need to give the Action a name.  Hopefully something descriptive.

Once you do, it is time to pick the Action type.  To see what types are available, click the drop-down under the Action Type header.

While there are a number of options, we will focus on the email/SMS options.  If you look back at the requirements outlined earlier, you might remember that we wanted an email at 80% and a text message at 100%.

Types

Email – This is obviously an email address you would like the alert to be sent to. Per Microsoft, there is a limit of no more than 100 emails in an hour.

SMS – This is a text message, and you will need to provide a valid number for it to be sent to.  While the limit for emails is 100 per hour, the limit for SMS is no more than 1 SMS every 5 minutes, or 12 an hour.

Push – This will send a notification to the Azure app.  For more information about the app, please go here

Voice – This will place a voice call to the number you have indicated. There is a limit with this type as well: no more than 1 voice call every 5 minutes.

Here is the link to the limit page in the Microsoft documentation.

Filling in the information that is needed for the alert is pretty simple.  All you need to do is fill out the type you would like to utilize. For example, if you would like the action to only send an email, then you would enter an email address just under the word “Email”.

You can easily see the location for the options that we are considering.  These options include, Email, Voice, SMS and Azure App push notification.

In order to meet the requirement for notification, we will need two Actions.  One to send the email at 80% and one to send an SMS alert at 100%.  They will look something like the image below.

Summary form and Adding to a Dashboard

Once you have everything set, you can review the budget configuration.  If you look at the breadcrumbs across the top, the path to this summary is easily identified.

Home > Cost Management + Billing > Overview > Subscription name > click on budget

 

When looking at the summary form, you will also see the progress chart similar to the one below.

If you click the Pin in the upper right corner of the graph, you will be able to pin the graph to a dashboard.

Below is similar to what you might see once the budget is added to the dashboard.  I find it very useful to have the budget easily seen.  Placing it on a dashboard makes what could be a difficult task, managing the budget, a bit easier, and gives us a nice tool to quickly review current status.

 

 

Creating and utilizing a budget is essential to managing the overall cost of your Azure environment.  Hopefully, this will help you take a small step in the right direction.

Thanks for visiting my blog!!

Azure SQL Database Connections work in SSMS but not with Azure Data Factory Connection

Over the past month or so, I have attended a number of sessions on Azure Data Factories.  Both presenters were amazing and I learned a ton.  However, as I usually do, I took the information and went off on my own to play with this new tool I hear a lot of people talking about. During both sessions I was able to connect to my Azure SQL Database with SQL Server Management Studio, but not with the connection in the Azure Data Factory. While I really didn’t get much of a chance to ask the presenter about my problem, I was finally able to.  He was able to give me the answer I was looking for.

Before getting to the solution, let’s go over what the problem actually was.  I have an Azure SQL Database that I have connected to many times via SSMS.  However, during both sessions I attempted to create a connection in ADF and the test of the connection failed every time.

So what did I check? Just to confirm, I again connected via SSMS as I had done many times in the past.  Again, no issues. Now I was convinced that the issue was not the database or the username\password. The next thing I thought of was the firewall rules.  I asked myself the following question: Could there be an issue with the firewall and the IP addresses?  Again, all appeared to be working just fine.  Then I decided to do one more check using a third technology, I connected to the database via the Azure SQL Editor, which requires a username and password to connect.  Again, all appeared to be fine.

At this point, I knew more about what was not causing the problem than what was causing it.  Finally, I had a moment with one of the presenters and he was able to give me the cause and the solution. As is usually the case, it was a single setting causing this, with a very simple fix.  That setting was an Azure SQL DB setting, specifically “Allow Access to Azure Services”.

To get to the setting, navigate to the settings of the Azure SQL DB and click on Overview, the orange box below.  Then you will see the “Set server firewall” option; click that.

Once you do, you will see something like the image below.  Simply move the On/Off slider to On for the “Allow access to Azure services” setting.  Then click the Save button.

Once you do that, you should be able to connect to your Azure SQL Database using the Azure Data Factory connection.  This made perfect sense: SSMS is not an Azure service, while Azure Data Factory is. Also keep in mind there are many reasons that could create a similar situation; this was just the cause of the issue I was seeing.  So be prepared to continue troubleshooting if this does not work.
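As a side note, if you prefer T-SQL over the portal, the same switch can be flipped by creating the special server-level firewall rule shown below from the master database of the Azure SQL server.  This is just a sketch; the rule name is the one the portal uses for this setting, but verify it against the current documentation before relying on it.

-- Run while connected to the master database of the Azure SQL server.
-- The 0.0.0.0 - 0.0.0.0 range is what the "Allow access to Azure services"
-- switch creates behind the scenes.
EXECUTE sp_set_firewall_rule
    @name = N'AllowAllWindowsAzureIps',
    @start_ip_address = '0.0.0.0',
    @end_ip_address = '0.0.0.0';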

Thanks for visiting my blog!!!

Create a New Dashboard in Azure Portal

Dashboards can be used in many different products and can be very helpful for many reasons.  Azure has a default dashboard, but it allows you to create your own dashboards to easily get the information that is important to you.

If you look at the screenshot of an Azure portal dashboard, you can see I get quite a bit of information quickly.  In this case I can see three reports on activity for my subscription, a list of my resources and the current cost.

To get to the dashboard you need to click the “Dashboard” menu item on the portal menu on the left side of the screen.  Now that we are at the “Main” dashboard we can either leave it or edit it.

When you look at the dashboard the first time, you will see the default dashboard. An example of this is in the image below.

Create A New Dashboard

If the default dashboard does not meet your needs, Azure provides an option to create your own. To create a new dashboard, simply click “New Dashboard” on the menu that appears at the top of the screen.

Once you click on “New Dashboard” you will see an empty dashboard as below.

The first item you should consider changing is the title.  Of course this should reflect what the dashboard is to display.

Add items to a Dashboard

Now we are at the point we need to add the tiles that will display the information you are looking for. On the left side you will see the “Tile Gallery”.  These are some of the items you can add to your dashboard.

To add the item you can either drag\drop the item to the dashboard or you can click the “Add” item to the right of the tile title.

When you click the “Add” item, Azure will place the tile in the next available location, to the right of the last tile in the top row.  In the image below, I added four items and Azure kept adding them to the right, with the exception of the “User Sign-ins” tile.  Because the clock and the “User Sign-ins” tiles are not as tall as the others, Azure placed the “User Sign-ins” tile under the clock.  However, most tiles will be placed to the right of the last tile on the top row.

Another method of adding tiles is to drag and drop the tiles from the Tile Gallery.  When you use this method, Azure will place a shadow outline on the screen that will show you where the tile will be placed, similar to the image below.

Once I have added the item I can change the size if the default size is not what I am looking for.  In the upper right corner of the tile you will see a garbage can and a box with three dots when you float your cursor over this area.  To change the size, click the box with three dots and a size context menu will appear.  You can then pick the size you would like for the tile.

Now that we have added a few tiles and have everything we need, we can click “Done Customizing” at the top of the screen.  When you do, Azure will take you to your dashboard.

Change Dashboards

Now that we have added a dashboard, we should have two, the one we created and the default dashboard.  To change what dashboard you are viewing, you click the drop down and pick the dashboard you would like to view from the menu.

 

Edit an Existing Dashboard

If you have an existing dashboard you would like to edit, simply navigate to the dashboard and click “Edit” menu item on the menu across the top.

Now you can add tiles, remove tiles, resize tiles, and change the order and placement of the tiles.

Add Metrics Reports

In a previous blog post, I talked about how to create a metrics report.  In this same blog I mention how to add a metrics report to a dashboard.  Here is the link to that post.

 

Thanks for visiting my blog!!!


Azure SQL DB and @@version

Many of us are used to using @@version to get the version of SQL Server we are using.  This also works when you are utilizing an Azure SQL database, although the results are slightly different. However, before getting to @@version, I would like to go over a few other methods to obtain version information.

How to Get Version

There are many reasons why it is important to know the version of a SQL Server.  The reasons include everything from licensing to upgrades.  In this section, I will cover a few of the common methods used to get the version of SQL Server.

With an on-prem instance of SQL Server one method that can be used to get the version is to right click on the server in SQL Server Management Studio.  Once you do, you can click on the properties menu item and the window below will appear.  As you can see, in the boxes you are able to see both the version and edition of the SQL Server.

However, when you attempt to do this in an instance of Azure SQL database, the properties menu item is not available.  Below is what you will see when you right click on the instance of an Azure SQL database in SQL Server Management Studio.

Another method to get the version of SQL Server is to utilize the SERVERPROPERTY function.  The SERVERPROPERTY function can be used to view server level properties, in this case the product version.

SELECT SERVERPROPERTY('ProductVersion');

When you execute the above statement you will then see the version. It will look like the number below.  The number represents the version, in this case, SQL Server 2017.

14.0.1000.169

As mentioned above, the ServerProperty function can be used to return a number of bits of information about the server.  Here is a link for more detailed information, click here.
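Since SERVERPROPERTY can return quite a few other values, here is a quick sketch that pulls a handful of the more common ones in a single query; the property names are standard, but what comes back will of course depend on the instance you run it against.

SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,   -- RTM, SP1, and so on
       SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('EngineEdition')  AS EngineEdition;  -- 5 indicates Azure SQL Database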

The final method I would like to talk about is using @@version.  This is a global variable in SQL Server that will return not only the version of SQL Server, but also the OS version.

When reading from this global variable, you include it in a SELECT statement as illustrated below.

SELECT @@Version

The results will vary from an on-prem instance of SQL Server to an Azure SQL Database.

This is what you will see for an on-prem instance.

Here is the complete text of the results.

“Microsoft SQL Server 2017 (RTM) – 14.0.1000.169 (X64) Aug 22 2017 17:04:49 Copyright (C) 2017 Microsoft Corporation Developer Edition (64-bit) on Windows 10 Home 10.0 <X64> (Build 17134: ) “

As you can see it returns some good information.  Including the version of SQL Server, the service pack status, 32-bit or 64-bit and the OS version.

However, the results change quite a bit when this is run against an Azure SQL database.

The complete text is just as you see it above.  Notice the lack of OS version.

What is next?

Now that we have covered a number of ways to get the version information, let’s take a look at some of the differences from an on-prem instance to an Azure SQL database.

When looking at the results, it is easy to identify an Azure instance; within the first few words, it clearly states that it is an Azure instance.

If you look at the results below you will see that I used both @@version and ServerProperty to get the information I am looking for.  You will also notice that I included a simple query using the sys.databases object. This will return the compatibility level for each database, which as you will see is going to be important.

Looking above you will see that it is an Azure instance, has a product version of 12.0.2000.8 and both databases have a compatibility level of 140. The compatibility level of 140 is SQL Server 2017.

What….12.0?  Isn’t that SQL Server 2014?  If you said yes, you are correct. Is it really possible that Azure SQL Database is SQL Server 2014? The answer is no.  That number is different than for an on-prem instance of SQL Server.  According to what I have read, version 12.0 is the most current version. Given that both the Azure instance and SQL Server 2014 have a product version of 12.0, it comes down to the compatibility level of the databases in Azure.

As you can see, both databases have a compatibility level of 140, which is SQL Server 2017, not 2014. Since I didn’t change this when I created the databases, this tells me that the instance of Azure SQL Database was on SQL Server 2017 when I created them.  As with an on-prem instance of SQL Server, the compatibility level of an Azure SQL database can be changed.  When I created a new Azure SQL database, the compatibility level defaulted to 140, which is SQL Server 2017.

To change the compatibility level of a database you have to run an Alter Database command.

ALTER DATABASE Advent
SET COMPATIBILITY_LEVEL = 160

When I ran the above code, I got an error.

Msg 15048, Level 16, State 1, Line 1
Valid values of the database compatibility level are 100, 110, 120, 130, 140 or 150.

Notice it states that 150 is an option, while 160 is not.  Compatibility level 150 is SQL Server 2019, the latest version.  The available options go back quite a bit, to level 100, which is SQL Server 2008.

Since the compatibility level of the database is key when investigating versions in Azure SQL Database, below you will find the code to retrieve the compatibility levels of all databases.

SELECT name, compatibility_level FROM sys.databases;

All the above code works on both an Azure SQL database and an on-prem SQL Server.

Notice that the Advent database has a compatibility level of 150, SQL Server 2019.  When investigating version information, it is important to review the compatibility level of the database you are looking at as well as the server version.

Thanks for stopping by, I hope you learned something.

Azure Metrics Reporting

A few days ago, I did a post on DTUs of an Azure SQL Database.  In the post, I mentioned a report that you can use to see how many DTUs your subscription has used.  This is an important number to know and measure over time so you don’t get slapped with an unexpectedly high bill or have performance issues because you have exceeded your DTU limit.

Azure has quite a bit of reporting that can be used to get the current state of your Azure subscription. Although many of the metrics are targeted more at the subscription rather than database performance counters, there are a few DB-related ones, such as Deadlocks. Below is a small list of the items that can be viewed; of course, there are more than what is included in the list:

      • Azure SQL Database
        • Data space allocated
        • Data space used
        • Deadlocks
        • In-Memory OLTP usage
        • DTU used
          • What is a DTU?  It stands for Database Transaction Unit. Here is a nice article by Andy Mallon that does a nice job explaining what it is. Click Here
      • Storage Accounts
        • Used Capacity
        • Egress
        • Ingress
        • Availability

How do I Get to the reporting tool?

Getting to the reporting tool is just a matter of completing a few clicks. Of course, as with most products, there are several ways to get to a specific place.

On the left side of the Azure Portal, click “All Resources”.  You could also click on Resource Groups if you like.

Just a little side note on how to add a resource. As expected, there are many resources in the Azure world.  Here are just a few: Azure SQL Database, Azure VM, storage accounts, resource groups, Cosmos DB, as well as many other items.  Not all resources are from Microsoft; there are a number of third-party resources made available in Azure.  After you click the “Add” button, you will be taken to a page that looks something similar to what is below.

Above is a screenshot of some of the available resources.  The list on the left is just the list of resource categories.  On the right are some of the resources in the chosen category.  You can either pick a category or click the “See All” link towards the top.

Once you click the “All Resources” link you will be taken to something that looks like this.

The above image lists a number of important items.  Obviously the resource name is included, along with the Resource Group that resource is part of.  You can also see the resource type and the location.  The location is really important, especially when it comes to cost.  Since this blog post is about metrics reporting, it is important to note that not all resource types have this type of reporting.  Above you will see that I have three resources: a storage account, a SQL Server, and an Azure SQL database.  Of these three, the SQL Server is the only one that doesn’t have this metrics reporting functionality.

Once you click the resource you would like to get the reports for, you will be taken to a page similar to this.

By choosing a resource on the left, the panel on the right will update with the information for that resource.  In the above situation, I chose the storage account.  Notice the two green boxes; both of these will take you to a reporting tool. You want to click the one on the top. The bottom one, Metrics (Classic), is going to be retired on June 30, 2019, according to Microsoft.

Creating the Chart

Now you simply click on the Metrics link and you will be taken to a page that looks similar to what is below.

Now confirm that you are on the proper resource by checking the resource identified in the green box.  If you are not on the proper resource, simply click the resource that is there and you will have the option to change to the proper resource.

Staying in the same box as the resource, you will see a few other items.

Metric Namespace:  This is the group of metrics that you are looking for.  The items in the drop-down box change depending on the type of resource selected. Since I am working with a storage account, you see the account option.  If you don’t see this for some reason, click the “Add Metric” link that can be found just under the chart title.

Metric: This is the counter that we are looking to review.

Aggregation:  This is what it says it is.  There are usually three options: Avg, Min, and Max.  However, some metrics have Avg as the only choice.

Once you have chosen all the desired options, you will see this.

As you can see, this chart contains more than one metric.  This is easy to add, just click the “Add Metric” link in the upper left hand corner. When you do, you will be given the same options as before.  The additional metrics do not have to be from the same resource, as you can see.  The metric on the left is from the storage account, while the one on the right is from the Azure SQL Database.

Although this example is a line chart, you can change the chart type easily.  Just click the arrow to the right of where it says “Line chart” in the upper right corner.  Just like in Excel, there are a number of options for chart types, although not as many; only 5 options here.  Just like in any other reporting tool, not all charts work well with all data sets.

Also in the upper right corner you can change the time range of the chart.

If you click the time, the options window for the chart will open up. As you can see there are a number of options, including the time range, the time granularity (time interval), and the time zone (GMT or local time).

The options for Time granularity are in intervals that would be expected.  The options are listed below.

      • 1 minute
      • 5 minutes
      • 15 minutes
      • 30 minutes
      • 1 hour
      • 6 hours
      • 12 hours
      • 1 day
      • 1 week
      • 1 month

You also have another option, Automatic.  The interval for automatic is dependent on the Time Range chosen. Below are the time ranges with the automatic interval.

      • Last 30 min – 1 minute
      • Last hour – 1 minute
      • Last 4 hours – 1 minute
      • Last 12 hours – 5 minutes
      • Last 24 hours – 5 minutes
      • Last 48 hours – 15 minutes
      • Last 3 days – 15 minutes
      • Last 7 days – 15 minutes
      • Last 30 days – 30 minutes

Another area that could be very helpful is in the lower left corner.  There will be one number for each metric that has been added to the chart.

In the green box above there are a number of data points.  It contains the resource name and the color of the line in the chart.  It also contains the name of the metric and the aggregation used, as well as the value.  In this case the number represents the aggregation listed for the date range of the chart.  If you float your cursor over the chart, a vertical line like the one below will appear.  You can move it to any time on the chart, and the numbers in the lower left corner will no longer represent an aggregate, but the values at that point in time on the chart.

Now I have created my chart, but the numbers on the Y axis don’t meet my needs.  This can easily be changed.  In the upper right hand corner you will see three dots.  If you click them, a menu opens up and you can then click “Chart settings”.

Once you click the menu item, the settings window opens.  Here you can change only the Y values.  Remember, the X values are determined by the date range mentioned earlier.

In addition to the Y values, you can also change the chart title and chart type.  Notice that the unit of measure is bytes, but many of the metrics are in MB.  If you change the time granularity to something other than Automatic, the warning below goes away.

Notice the warning:

“Setting a min or max value is not recommended when the time granularity is set to Automatic”

Pin to Dashboard

Once you are in the Azure portal, you can have one or more dashboards.  Once you have your chart looking the way you want it to look, you can then pin it to your dashboard.  Once you do, it will appear as below.  In this case I have two charts pinned.

 

To pin your chart to the current dashboard, click the “Pin to dashboard” button in the upper right corner.

When you do, you will have an option to pin to the current dashboard.  However, when it is pinned, your dashboard may not look the way you like.  If this is the case, you can click the “Edit” option on the dashboard.

Once in edit mode, you can move the charts as you see fit.

One nice feature of pinned charts is that the vertical line that appears when you float your cursor over the chart still works.  There is one additional feature that is very nice if you have more than one chart.

 

When you have more than one chart, the line appears on all charts that are pinned to your dashboard.

Hopefully this will help you with creating charts in the Azure portal.  Thanks for stopping by my blog and I hope you learned something.


Azure SQL Database and DTU

As you know, when you are using an Azure SQL Database you can use SQL Server Management Studio on your computer to execute queries.  You also have an option to use the Azure Query Editor. Here is my post on the query editor.

As I was writing the post I wondered more and more about whether or not there might be a difference in how data usage is billed.  My thought was that since I am going from an Azure SQL database source to another Azure tool, Query Editor, could this potentially reduce my costs.

There are a number of Azure resources that have Egress as a metric that can be used to measure the amount of data that is being pulled from the Azure data center.  These resources include the Azure Storage Account. However, an Azure SQL Database does not need to use a storage account.

Before we move on, let’s define what egress is.

Here are a few definitions that I found online.

“Azure egress charges apply only when data crosses from one Azure region to one or more other Azure regions. Any traffic that originates and terminates within a single Azure region is not subject to egress charges. It does not matter that traffic moves from one subscription to another. If the data remains within the same region, there are no egress charges.”  – help.bittitan.com

“….data going out of Azure data centers” – Microsoft

I think the most important point the two definitions are trying to make is that egress is data going out of a data center, including if it is going from a data center in one region to a data center in another region.  According to Microsoft, if there is a data pull and the database and the querying tool are both in the same region, you will not be billed.  However, you will need to confirm this.

I am not planning on getting into the pricing models for Azure in this post, however you can get that from Microsoft.  https://azure.microsoft.com

Although some resources have Egress to measure data, an Azure SQL Database does not.  It does, however, have another option: DTU.  DTU stands for Database Transaction Unit, and it is a unit of measure that is used, in part, to calculate your bill and to help you choose the proper pricing tier for your database.

A DTU is a “blended measure of CPU, memory, and data I/O and transaction log I/O”, per Microsoft.  The DTU is a measurement for the Azure SQL database only, not for your entire Azure environment or any other database.  An important thing to keep in mind is that when your workload exceeds the DTU maximum for your pricing tier, Azure will throttle your workload.  My subscription currently has a max DTU of 10; as long as my workload stays below that, I should be fine.  Because of the potential of being throttled, it is important to monitor your workload.  This is a nice article on the DTU by Andy Mallon.

If you are not sure what your workload might be in Azure, there are a number of DTU calculators available that can help.  Here is a link to one, DTU Calculator.

As for how many DTUs you can have, that will depend on the tier that you have chosen.  Below is a sample of a chart from Microsoft that outlines the limits of two of the tiers.  Notice that there are a number of options for Max DTUs. The source of the chart is here. As you can see, the Max DTUs go from 5 to 100.  They go even higher if you are using the Premium tier.

Per Microsoft, you have up to the Max DTU for your tier before you will be charged.

The DTU price for a single database includes a certain amount of storage at no additional cost. —Microsoft

Of course this might change at any moment, so I have to make a disclaimer.  Always consult your MS rep or Licensing expert to get the exact pricing and rules.

As I stated earlier, it is important to monitor your workload to see if you are in a position in which you might be throttled.  Azure has a pretty good set of monitoring reports that you can create.  You can use the metrics reports to accomplish this.  Here is a link to my post on how to create a metrics report.

So what should you monitor?  Well, since we are talking about an Azure SQL database, you have three options for measuring DTUs.


DTU Limit – this is what you think it might be, the limit for your tier.

DTU Percentage – this is just that, the percentage of the DTU max that is being used.

DTU Used – this is the current number of DTUs being used by your workload for the database.
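If you would rather check this with T-SQL than with the metrics charts, the sketch below approximates the DTU percentage from sys.dm_db_resource_stats (covered again later in this post) by taking the highest of the CPU, data I/O, and log write percentages for each interval.  Treat it as an approximation of how the DTU metric is generally described, not the official calculation.

SELECT end_time,
       dtu_limit,
       (SELECT MAX(v)
        FROM (VALUES (avg_cpu_percent),
                     (avg_data_io_percent),
                     (avg_log_write_percent)) AS t(v)) AS approx_dtu_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;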

This post is less about DTU calculation and pricing tiers and more about does it make a difference if I use SSMS or the Azure Query Editor on the use of DTUs.

From the tests that I have completed, the answer is no.  Using Azure Query Editor showed little or no improvement in the use of DTUs.

Before we get into the results of the test, let’s cover a few basics of what I did.  I used the AdventureworksLT sample database that you can get from the Azure portal when creating a new sample database.

This is the query I ran using both tools.

SELECT TOP 100000 *
FROM saleslt.salesorderdetail a
CROSS JOIN saleslt.salesorderdetail b

In the chart below the green boxes represent the test using SQL Server Management Studio, while the blue boxes are the executions in the Azure Query Editor.  Notice that there is very little difference in the DTU usage.

Although there really isn’t much difference in DTU consumption, I did notice something I really didn’t expect.

When I ran the query in SSMS it usually took less than 3 seconds to run.  However, when I ran the same query using the Query Editor, it usually took 45 to 60 seconds.  That is a huge difference.  Also note that in the above chart, the Max DTU never reached 10, which is my limit.  Because of this, I know that I was not throttled by Azure.

You might be wondering what the red box is.  This is the test when I ran this query.

SELECT *
FROM saleslt.salesorderdetail a
CROSS JOIN saleslt.salesorderdetail b

The only difference is that it doesn’t have the TOP keyword.  What is interesting is that when I ran this query in SSMS, it took about 45 seconds.  However, when I ran it in the Query Editor, it was pushing 5 minutes before I cancelled it.

How can I get information about the Max DTU for the Azure SQL Database?  There are a few DMVs that you can use.  The first one is sys.dm_db_resource_stats.  This can be used to get the Max DTU, as well as a number of other bits of useful information.

If you run the query below in your Azure SQL Database you can get this information.  The column you want to look at to get the Max DTU limit is dtu_limit.  Here is a link to get more information on this DMV.

SELECT * FROM sys.dm_db_resource_stats;

However, if you try to run it in the Master database, you will get this error:

Msg 262, Level 14, State 1, Line 1
VIEW DATABASE STATE permission denied in database ‘master’.
Msg 297, Level 16, State 1, Line 1
The user does not have permission to perform this action.

You can also use the sys.resource_stats DMV to get the DTU limit.  This DMV has the same column name as above, dtu_limit, but it must be run in the master database.  Here is a link to get more information on this DMV.

SELECT * FROM sys.resource_stats

If you try to run it in another database, you will get this error.

Msg 208, Level 16, State 1, Line 3
Invalid object name ‘sys.resource_stats’

How do I get to the metrics report? This is pretty straightforward.  Navigate to your Azure SQL Database in the portal.

 

As you can see, in the menu on the side you will see the Metrics item.  If you click it, you will be taken to an Azure reporting tool.

You will then create your report.  This is my post on the Metrics Report.

Thank you for stopping by and I hope you learned something.

Azure SQL DB Query Editor

We have all been using SQL Server Management Studio to query and manipulate data, even for an Azure SQL database.  There is also an option to do this same thing built into the SQL Azure database interface in the Azure portal.  Although there have been a number of posts related to this topic dating back a few years, this feature is still marked as “preview” in the Azure portal.

How to get to the Query Editor

Once you are in the SQL Database list in the portal, find the database you would like to work with and click it.

When you do, you will be taken to the administration page for the database.  On the left side you will see the “Query editor” link.  You will also notice that this is a feature that is still considered to be in preview.

When I did a Google search on the Query editor in the Azure portal, I found posts from as far back as 2017.  Despite that, I still wanted to complete a blog post on it.  Once you click it, you will be taken to a login page similar to below.

Notice on the right side there is the ability to utilize the Active Directory single sign on functionality, which I do not have.  On the left you will enter the appropriate credentials to login to the database.

Once at the Query Editor

Once you are in the Query editor, you will notice many of the same features as SSMS.

These features include the following

      • Tabs for new queries
      • Open query
      • Results pane
      • Messages pane
      • A condensed object explorer
      • as well as many other features of SSMS

Below you will find a screenshot of the Query editor interface.

Menu Bar

Starting with the menu bar across the top, you will find the expected buttons.  In addition to the Open query, New query, and Save query buttons, there is also an Edit Data (Preview) button.  This does what you would expect: it allows you to edit the data.  Just like the Query editor, this is also a preview feature. There is also a Login button if you wish to use a different user.

Object Explorer

 

The Object explorer is very similar to that in SSMS; however, there are some glaring differences, mostly that fewer object types are displayed in Query editor.  As you can see, you can browse tables, views, and stored procedures.  As for the tables, you can see the column names and data types as well. For stored procedures, you are able to see the parameters, just as the image below indicates.

In SSMS a user can right click on a stored procedure and execute it.  In the Azure query editor, when you right click on an object you get nothing.  The context menu does not appear, therefore not allowing you to execute the procedure.  Of course you can still use T-SQL to execute a stored procedure.
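For example, to run a procedure you would just execute it from the query window; the procedure name and parameter below are made up for illustration and would need to exist in your own database.

-- Hypothetical procedure and parameter; substitute one from your own database.
EXEC SalesLT.uspGetCustomerOrders @CustomerID = 1;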

Query Window

In many aspects the query window is very similar to that of SSMS.  There are tabs across the top, the T-SQL pane, a results pane and a messages pane.

Starting with the T-SQL pane, there is color coding of the SQL; however, it is slightly different from that of SSMS.  The first big difference is that there is no option to change the size or color of the font.

If you execute two SELECT statements like the ones below, the results in the Azure Query editor are much different from what you get with the same two queries in SSMS.

SELECT TOP 10 p.name
, p.productnumber
FROM saleslt.product p

SELECT TOP 10 *
FROM saleslt.address

Here are the results when you run the code in SSMS.  Notice that there are two data sets.

Now look what happens when I run the same queries in the Query editor: I only get one data set vs. the two when executed in SSMS.

In SSMS, if you right click on a table you will see an option to Edit X number of rows.  That option is very similar in Azure Query Editor. The way this is done in the portal is to click on a table and then click the “Edit Data (Preview)” button just to the right of the Login button.

This will, in turn, return the data in a grid that allows for editing.  You can just double click a cell in the result set, and the cell will change to an editable version.  The edit menu has just 5 buttons on it.

They are:

        • Create New Row
        • Save
        • Refresh
        • Discard
        • Delete row

When you click the new row button, the new row will appear in a different location than if you attempted to enter a new record using the grid results in SSMS.  The big difference here is that the new row will appear at the top of the grid.

One option that is important to me is the ability to change the font size.  I have NOT found out how to do this in Azure Query editor.

When you execute a statement and an error is returned, in SSMS the error is red by default; in the Query editor it is black.

Once you have your query as you like it, all you have to do is click the Run button.

Limitations

There are a number of limitations in the Query editor at this time.  Since it is in preview, my hope is that Microsoft will address some of these deficiencies as the Query editor moves closer to having the preview label removed.

      • Does not allow for the viewing of object definitions
      • Limited objects in object explorer
      • Tabs in the query window do not show the SPID
      • If more than one SELECT statement is executed in a batch, you will only see the results of the last one
      • Cannot right click on a stored procedure to execute it
      • No options such as displaying the Actual Execution Plan
      • Cannot drag and drop object names into the query

Despite all of the limitations, I still like it.  Once you get used to the limitations it really isn’t that difficult to use.  Since the name of the tool is Query Editor, it does make sense that it doesn’t have all the same functionality as SQL Server Management Studio. However, it would be nice to have some of the items that are currently not included.

Thanks for stopping by and I hope you learned something small.

How to Move an Existing Database to Azure SQL DB

When moving the enterprise to a cloud based solution such as Azure SQL Database, we will need to find a way to migrate existing databases to the cloud.  I already completed a post on how to create an Azure SQL database.  This post will focus more on how to migrate an existing on-prem database to an Azure SQL Database.  There are several options you can use: the “Deploy Database to Microsoft Azure Database…” option in SQL Server Management Studio, the Data Migration Assistant tool, or the Azure Database Migration Service.  In this post I will cover all three options.

Deploy Database to Microsoft Azure Database…

After opening SSMS, navigate to the database you wish to move to Azure SQL Database and right click on it.  When you do, you will see this:

After clicking the first next in the wizard,  you will then be asked to provide some detailed information about what you are attempting to migrate to Azure SQL Database.

This information includes the following:

  • Server Connection – this is the Azure location you are attempting to migrate to
  • New Database Name – this will default to the name of the database you right clicked on, but can be changed in Azure SQL Database if you have a need to change it
  • Azure SQL Database Settings – these will not be active until you identify a Server Connection
      • Edition of SQL Server
      • Maximum Database size
      • Service Objective
  • Temporary File Name – This is the name and location of the .bacpac file.

What is a .bacpac file?  Since you may not have worked with this before, I thought I would take a few minutes to go over exactly what a .bacpac file is. It is considered a Data Tier Application that can be used for deploying databases from one location to another.  This file contains the schema and object definitions as well as the data.  There is also a .dacpac file type.  The only difference between the two is that a .bacpac also contains the data, while a .dacpac contains only the schema.

This is what the screen will look like.

When you click “Next”, you will have an opportunity to review all the settings before clicking “Finish”.  When you do, you will see the progress of the migration, step by step.

As you can see above, there are a number of steps.  You can also see that there is a step named “Processing Table”.  This step will show each table as it is processed and migrated to Azure SQL Database.  Once the migration is completed, you will see the database in both locations: the original SQL Server instance and Azure SQL Database.
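If you want to double check that the database really does exist on the Azure side, one simple option (assuming you are connected to the master database of the Azure SQL logical server) is to query sys.databases:

SELECT name, create_date
FROM sys.databases
ORDER BY create_date DESC;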

Data Migration Assistant

This is a relatively easy tool to use.  It really has two purposes: one is to assess an on-prem database to identify what challenges you might encounter when migrating it to Azure, and the other is to actually complete the migration.

This is a tool that must be downloaded and installed before use.  To download it, go here: Download DMA.  Once downloaded and installed you will find it listed as “Microsoft Data Migration Assistant” in the Start menu, not under the SQL Server items.

Below you will see the opening screen of the DMA.  As you can see there are two options, complete an assessment or do a migration.  By clicking the “+” on the left side you will have the option to choose what you would like to do.  Notice that there are also options to restart an assessment or open an assessment.  These are exactly what you might think they are.  When you complete an assessment you will be able to save the results, or save the configuration of the assessment so you can restart it later if you like.

When you click the “+” sign you will then see a few options that you can choose from.  Here you will be able to choose to complete an assessment or a migration.


Both require a project name, a source server type and a target server type.

You will have several options for a target as you can see in the image below.  Since this blog post is about the migration of a database, we will focus on that target.

Once you pick your source and target, you will then be able to click the “Create” button.  Once you do, you will see this.  You will see three options, although only two are available at this time.  The two options that are available are Check Database Compatibility and Check Feature Parity.

Once you click “Next” you will then be asked to identify a source.  Since this post is about migrating an on-prem SQL Server database to Azure SQL Database, the source will be a local instance of SQL Server.  You will have to provide an authentication method and then the database(s) you would like to complete the assessment on.  Once you have provided all the information you can click the “Start Assessment” button.

Below is an example of the results of the assessment.  There were a few findings that were expected, especially that SQL Server Agent jobs are not supported in Azure SQL Database.
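Since Agent jobs will not come across, it can be handy to inventory them on the source instance so you know what will need to be rebuilt later (for example with Elastic Jobs or Azure Automation).  A simple query like the one below, run against the on-prem instance, will list them:

SELECT name, enabled, date_created
FROM msdb.dbo.sysjobs
ORDER BY name;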

Fortunately Microsoft has also included links to each of the topics that will allow us to gather more information easily.

Make sure you look at both groups of items, the SQL Server feature parity issues and the Compatibility issues.  In this case no compatibility issues were found.

Now that we have completed an assessment and addressed all the blocking issues, we can use the same tool to perform the migration.  As expected you will also have to provide a project name, a source and a destination type.  This will also include the connection information for the target.

The migration scope has three options.

      • Schema and data
      • Schema only
      • Data only

You just need to decide which is best for what you are attempting to accomplish.  In this case, we will pick Schema and data.

Once you provide all the information, you will be able to click the “Create” button. As part of the migration, there are six steps.  These are listed in the image below.

Selecting a source is the obvious starting point for a migration.  Once you do, you will then have an opportunity to pick which database you would like to migrate.

If this is a new database in Azure SQL Database, make sure to click the “Create a new Azure SQL Database…” link.

When you do, make sure you click the Connect button so the list of databases refreshes.

The next step will allow you to identify which objects you would like to migrate.

In this case there are still a few issues that will need to be addressed, most of them due to encryption, so I will not migrate those objects.  The next step is to generate the scripts.  Once the scripts are generated, you can save or copy them.  Note below that the DMA tool will also identify items with assessment issues.
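To give you an idea of what the generated script contains, it is ordinary T-SQL DDL along these lines.  This is a trimmed, illustrative example only, not actual DMA output:

CREATE TABLE dbo.Customer
(
    CustomerID int NOT NULL PRIMARY KEY,
    FirstName  nvarchar(50) NOT NULL,
    LastName   nvarchar(50) NOT NULL
);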

At this point the database has been created in Azure SQL Database and all the objects have been created.  However, we have not reached the point of migrating the data yet.

In order to migrate that data, all we have to do is click the “Migrate Data” button.

We will now have the option to decide which tables we would like to have the data migrated for as well.  As you can see below, the tables are listed.  The list includes the number of rows and whether or not each table is ready to move.

Once you have confirmed the information, click the “Start Data Migration” button.  What I like about this part of the tool is that it tells me the progress and status of each table’s data migration.

Now we are done. Not only have the objects been migrated, but the data as well.
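One simple way to spot check the result is to compare row counts between the source database and the new Azure SQL database.  Running something like the following on both sides should return matching numbers:

SELECT t.name AS table_name,
       SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.partitions AS p
    ON p.object_id = t.object_id
    AND p.index_id IN (0, 1)  -- heap or clustered index rows only
GROUP BY t.name
ORDER BY t.name;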

Both of these migration options take place on your computer, one with SSMS and the other with the downloadable Data Migration Assistant.  In my opinion, they are both pretty easy to work with.  The third option is done in the Azure portal.  There is a service called the Azure Database Migration Service.  There are a number of required steps to utilize this service.

Azure Database Migration Service

As the name implies, this is a service in the Azure Portal.  When looking at all the services, the Azure Database Migration Service can be found either by doing a search or by looking in the Databases group.

Once you click the Azure Database Migration Service, you will see what is below.  However, if you already have a service of this type running, you will not have to go through this step again.

For this method, you will need to provide a service name, a location (region) as well as a Virtual Network.  Yes, you will need to create a virtual network. This will include the creation of a virtual NIC.

If you already have the service created you can go to the All Resources list in the portal and click that service. Once you do, you will see this.

The next step is to click the “New Migration Project” button.  At this point you will pick a data source and a target.

The options for Type of Activity are:

    • Schema Only migration
    • Offline data Migration – this is to be used for source databases that will NOT be changed during the migration.
    • Online data migration – For systems that will be modified during the migration
    • Create project only

If you choose the Online data migration type, Azure will provide you with a list of rules that must be followed in order for the migration to be successful.

Also, if you choose Online data migration, it will require a higher SKU level.  You must have Premium, which I currently do not have.  I received this warning when I tried to use Online data migration.

Given this, I changed the type of activity to Offline data migration.  You will have to complete several steps to be successful in the migration. These steps are fairly basic and expected: identify a source, identify a target, map to the target databases, configure the migration settings and review a summary.

There are a number of network and security steps that must be completed prior to utilizing this service.  For a good description of what is required, click here.

As expected, the source and the target require the usual information. What is a bit different is the third step, mapping target databases.  Below is a screenshot of the mapping step from the Microsoft site mentioned above. As you can see, you simply map each source database that you would like to migrate to a destination database on your Azure SQL Database server.

The settings step is simply telling the service which tables you would like to migrate.  The image below is also from Microsoft’s site.

Finally, once you have provided all the required information, you can begin the migration.  Once you do, Azure will provide a progress status that will look like this, also from Microsoft’s site.

It is always good to have options, and in this case we have three.  Two you can run from your computer and one must be done in the Azure portal.  Personally, I like the option of running the migration from an on-prem computer.  Those options appear to be less involved and I don’t need to involve other teams, such as the network or security teams.

When migrating databases to Azure SQL Database, it is important to complete an assessment on the database first using the Data Migration Assistant.  This will allow you to identify what your obstacles might be.  You can then address them before starting the migration.

Thanks for stopping by my blog and I hope you learned something.