ADS: Settings Editor

When working with the settings in Azure Data Studio, you have a couple of options for how you view them.  The two options available at this point are the "user interface" and "JSON".  While it is usually easier to modify the settings using the UI, you might prefer to edit the JSON instead.  I feel that anyone who uses Azure Data Studio on a regular basis should learn the basics of JSON.

Here are my reasons for learning the basics of JSON:

      • Some settings must be changed in JSON
      • One of the export options for a result set is JSON

The first reason is the focus of this post.  Many settings can easily be modified using the user interface; however, not all of them can.

The settings can be found under the File menu, or you can use the keyboard shortcut "Ctrl + ,".  Under the File menu you will find Preferences, and it is in the pop-out menu that the Settings menu item can be found.

Once in Settings you will quickly see that there are a great many of them.  Most of these you will never need to change, but it is good to know what can be changed.  The user interface is quite easy to navigate, and you can even search for specific keywords at the top.

As you scroll through the various settings, you might notice that not all of them can be changed in the UI.  Some must be changed using the JSON file.

The settings.json file can be found in this location.

C:\Users\<<UserProfileName>>\AppData\Roaming\azuredatastudio\User

Notice that "azuredatastudio" does not have any spaces.  When you navigate to "C:\Users\<<UserProfileName>>\AppData\Roaming" you may see two similarly named folders: one with spaces in the name (Azure Data Studio) and one without (azuredatastudio).  The settings.json file is found in the folder without spaces in the name.

 

Now that we know the location of the settings.json file, let’s return to the editor.  As mentioned earlier, the settings can be changed in two places, the editor UI and the JSON file.

This is what you will see if the setting can be modified in the UI.  Depending on the setting, you may see a dropdown box, a text box or a check box.

If you see something like what is in the green box below, the setting must be modified using the settings.json file.

When using the UI, you have both options.  However, there is a setting that will allow you to use only the settings.json file.  This setting can be found in the Workbench group.

The dropdown box has two options, ui and json.  When it is set to json, the settings.json file will open when you attempt to look at the settings.  Keep in mind that when you change this to json, you will not get a warning, and the change will be saved automatically by default.

With it set to json, the settings.json file will open instead of the user interface, as expected.

You might be asking, how can I change it back?  This is quite simple.  Look for the "workbench.settings.editor" setting and change it back to "ui".  Remember there are only two options, "ui" and "json".  Make sure to keep the quotes.  As you can see in the above image, it is on line 209 in my instance, although it will probably be different for your instance of ADS.
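
For reference, this is roughly what the entry looks like inside settings.json; your file will contain many other settings, and the line it sits on will differ.

{
    // "ui" opens the settings editor; "json" opens settings.json directly
    "workbench.settings.editor": "ui"
}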

When in the JSON file, if you hover your mouse pointer over one of the settings, a pop-up will appear that may provide a bit of information about the setting.  Something like what is in the red box below.

While I prefer to use the UI, it is nice to have this option if I ever want to change.

If you are not using Azure Data Studio on a regular basis, my suggestion is to take a look at it.  The price is right – it is free – and it has many great features.

Thanks for visiting my blog!!

ADS: Number on the Explorer Icon

When you open Azure Data Studio you will see a toolbar on the left side of the screen.  On this bar you will see a number of icons, similar to the image below.  From time to time you might see a number on the lower right of an icon.

In the example, there are two icons that have numbers over them.  The bottom one is related to extensions, and it is telling you there are updates available for extensions you have installed.  In this case there are two.

The top number is the focus of this post.  The icon is for the Explorer. When you click on the icon you will see a screen pop up. At the top of this screen you will see “Open Editors”. Similar to below.

These are files that you have opened, made a change to and have not saved.  As you open files and make changes, this number will increment.  It will also increment if you open a new query window and make a change to it.

Although Azure Data Studio can persist your unsaved documents, it is still a best practice to save your work as you go.

Thanks for visiting my blog!!!

SYS.DM_EXEC_SESSIONS and System Processes

Being able to gather information about current sessions has for many years been an important part of a DBA’s responsibilities when troubleshooting a performance issue.  The type of information that we need includes the SPID, host, database name, login name and application being used.  While all this is important, at times system processes need to be identified or even excluded from the results.

Over the years, DBAs have developed their own techniques for gathering information about current processes.  These have included sp_who, sp_who2 and a query pulling data from the sys.sysprocesses view.  SYS.DM_EXEC_SESSIONS is a DMV that came along later, with SQL Server 2005.  I have read many articles and posts over the years that have said any SPID with an ID of 50 or less should be considered system related.  While this is true, there are some processes that will have an ID greater than 50 and also be a system process.  SYS.DM_EXEC_SESSIONS allows us to easily identify system processes.

Sp_who and sp_who2 are stored procedures I have used many times over the years.  I usually use sp_who2 rather than sp_who because it returns more information.  If you look at the image below, you will see two result sets; the top one is from sp_who and the bottom is from sp_who2.  Those additional columns are why I prefer sp_who2.
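
If you want to compare them yourself, both procedures can be run with no parameters, and both also accept a single SPID or 'active' if you want to narrow the output:

EXEC sp_who;
EXEC sp_who2;
EXEC sp_who2 'active';   -- only sessions that are not sitting idle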

Sys.sysprocesses is also a great resource.  However, it does not include a column that will allow us to determine whether a process is a system process or not.  Below are the columns in sys.sysprocesses.

SYS.DM_EXEC_SESSIONS is a view that was introduced with SQL Server 2005.  This view returns much, if not all, of the same information as the options above; however, it also returns a great deal of additional information.  One of the columns returned is "is_user_process".  This is a bit column: if it is set to 0 the session is a system process, and if it is set to 1 it is a user process.
SELECT     session_id
        , login_time 
        , is_user_process
        , login_name
        , host_name
FROM SYS.DM_EXEC_SESSIONS
If you run the above query, you will see a result set similar to what is below.  Is_user_process is the third column.  The rows in the red box have a session_id of 50 or more.  The items in the green box are user processes, as identified by the is_user_process column.
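
If you only want to see user sessions, you can filter on that column.  A minimal sketch:

SELECT     session_id
        , login_time
        , login_name
        , host_name
FROM SYS.DM_EXEC_SESSIONS
WHERE is_user_process = 1;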

This is a great view to use.  Something to keep in mind: it is a view, and one of the objects it pulls from is sys.sysprocesses.  Here is a link to Microsoft's documentation on this view.

Thanks for visiting my blog!!!

ADS: Simple Data Scripter

Moving the data in a table from a production server to a lower environment, such as a development server, can be done a number of different ways.  These include backup\restore, using SSIS to move the data, the import\export wizard and scripting the data in SQL Server Management Studio.  Azure Data Studio by itself doesn't have a way to script the data in a table, and it does not have the import\export wizard.  This is where the Simple Data Scripter extension comes in handy.

Like most extensions for Azure Data Studio, this extension is also in a preview state.  As of the date of this post, it is at version 0.1.1 and is published by Sean Price.  The documentation for this extension can be found here, Simple Data Scripter.  In the documentation there are a few limitations identified:

      • Not all data types are supported, specifically the binary datatypes like text, binary and varbinary
      • Has only been tested with SQL Server
      • Does not work on a view

Some extensions will add options under Preferences; this extension does not add any.  Given that, there is limited ability to configure it, which is OK – there is not really much need for any configuration.

Just as a review, let's take a look at how we can script the data in SQL Server Management Studio.  Right click on the database and go to Tasks…Generate Scripts…  When you do, a wizard will pop up, and after a few clicks of Next you will arrive at the screen below.  The box highlights the "Types of data to script" option.  There are three choices: Data Only, Schema and Data, and Schema Only.

When right clicking on a database in Azure Data Studio you will see limited options, none of which are related to scripting data.  This is also true when right clicking on a table in the database.  Here are the options we see when right clicking on a table; as before, none of them are related to scripting data.

In order to script the data in Azure Data Studio we will need to install the Simple Data Scripter extension.  This is a pretty straightforward and simple install.  You will need to download the .vsix file; you can get to the GitHub repository for the download from the extension's documentation in ADS.  You can then install it under the File menu.  While you will not be prompted to restart ADS when you install it, I usually like to restart ADS anyway.  In this case, the extension doesn't move to the list of installed extensions until after a restart.

Now that the extension is installed, you will see a new option when you right click on a table.  This points to another potential limitation of the extension: there isn't an option to script all the data in the database.  Well, at least I haven't found it yet.  But then again, I am not sure that having a single place to script all the data in the database is the best way to go.

When clicked, the following screen will appear at the top of Azure Data Studio.  Initially our first thought might be, what a bad query.  But it does make sense – we are creating a script of all the data in the table.
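
In other words, the pre-populated query presumably just selects everything from the table you right clicked on, something along these lines (the RentalCar table is the one used in this example):

SELECT * FROM dbo.RentalCar;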

If you click anywhere else, ADS will cancel the query, the screen above will disappear and you will see this pop-up in the lower right corner.

If you press Enter, Azure Data Studio will execute the query and generate the script.  You will not see a result set, but you will see the script.  It will look similar to below.  Notice the CREATE TABLE statement at the top; it is commented out.  The table used for the example is named RentalCar, while the script works with a table named #tempRentalCar.
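
To give a rough idea of the shape of the output – this is only an illustration, not the extension's exact script, and the column names are made up – it is a commented-out CREATE TABLE followed by INSERT statements against the temporary table:

-- CREATE TABLE #tempRentalCar ( RentalCarId INT, Make VARCHAR(50), Model VARCHAR(50) )
INSERT INTO #tempRentalCar (RentalCarId, Make, Model) VALUES (1, 'Ford', 'Focus');
INSERT INTO #tempRentalCar (RentalCarId, Make, Model) VALUES (2, 'Toyota', 'Camry');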

When the script is finished, the Tasks window will open up at the bottom of the screen.  The top two entries are successful executions, while the third failed; the reason for the failure will be covered shortly.

You may be wondering if the query can be modified.  The answer is yes; however, not every type of statement will work.  If you want to add a WHERE clause, that works just fine (see the sketch after the list below).  However, as the queries became more complex they failed more frequently.  I attempted to run a query with a join in it and it failed – that is the failure in the image above.  The error stated "Query has no results to return", even though the query runs just fine in ADS or SQL Server Management Studio.

Here are some of the statements I have tried.

      • Query with a join – failed
      • Query with a sub-query – failed
      • Query with a WHERE clause and just one table – succeeded
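
As an example, a simple single-table query with a WHERE clause, along the lines of the following, scripted without a problem (the column name is only an illustration):

SELECT * FROM dbo.RentalCar WHERE Make = 'Ford';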

When compared against those expectations, this extension works very well, and I can definitely see a use for it.

Thanks for visiting my blog!!

ADS: Notebook Text Formatting

Azure Data Studio has a great feature in the form of Notebooks.  A notebook is a document that allows for executable code such as TSQL or Python as well as notes for documentation.  Notebooks can be utilized to help organize things such as DR runbooks and troubleshooting documents.  What I like best about notebooks is that everything needed can be located in a single file, including data.

When working in a Notebook you have two types of cells: text and code.  The focus of this post is how to format the text cells.  Of course, text goes into these cells, and the text can say anything you would like it to say.  When we work with text in Word, there is a formatting toolbar that we can use to make the text look the way we want.  The text cells do not have this toolbar.

You might be asking, without the format toolbar, does that mean we can't format the text?  The answer is no…we can still format the text, we just need to do it slightly differently.  Rather than use a toolbar, we use characters – the text cells accept Markdown.

While not all formatting options are available in Notebooks, a few commonly used ones are.  These are listed below; there are also a few others we won't cover in this post.

        • Font Size
        • Font Weight
        • Bullet lists
        • Italics

Here is an example of what your text might look like with formatting.  Notice that there are different font sizes, bold fonts, an email address, a URL and a small highlighted section (the blue vertical bar).  We will go over each of these in this post.

First of all, in order to format the text, you simply need to find the text cell you want to format and double click on it.  When you do, the box where you can edit the text will appear just above it, similar to below.

The chart below has an example of each of the font options.  The sample code is in parentheses.

 

Font Changes in a Notebook

      • No formatting
      • One # (# Font Changes)
      • Two # (## Font Changes)
      • Three # (### Font Changes)
      • Four #, adds bold (#### Font Changes)
      • Five #, adds bold (##### Font Changes)
      • Six # – makes the font tiny and bold (###### Font Changes)
      • Seven # – no impact at all, prints the # signs (####### Font Changes)

Italics

The above chart shows how to change the size and weight of the text.  Well, what if I want the text to be in italics?  To do this we will use the *, or actually a pair of them.  One is placed at the start of the text you want in italics and the other is placed at the end.

Like this:   *Font Changes*

This is what it will look like in a Notebook

If you want a larger font you can combine characters.

Using both the # and *, we can get a larger font as well as italics.

This is what it would look like:  # *Font Changes*

The image above shows a much larger font and italics.

URL

If you want to include a URL to a web site, you can easily do this without the use of any special characters, such as a # or *.  When you do, the URL appears just as you type it.  For example, if I want to include a link to my blog I could type www.davebland.com, and that is exactly how it will appear in the text: www.davebland.com.

However, what if I want different text to appear?  Let's say I want "Dave Bland's blog" to appear in the text rather than www.davebland.com.  This is also pretty easy.  You will need to utilize square brackets, [], and a pair of parentheses, ().

The square brackets will hold the text I want to appear, while the URL will be placed between the ().

Like this:

[Dave Bland’s Blog](http://www.davebland.com)
Here is an example of both.

 

Bold

We already talked about how you can use the # to get a bold font.  But what if I want a word in the middle of the text to be bold and nothing around it?  This is where the * comes in again.  We just put two * before and two after the text we want bold.

Like this:

# I want these words to be bold: **Font Changes**
This is what it will look like.  Notice the # at the start, remember this makes the font larger.

 

Block of Text

Azure Data Studio also offers the ability to block off sections of text, similar to the text in the green box  below.  Notice the blue vertical bar and the tinted background.

This is done by placing a > at the start of the line.

This is what it will look like:

You can add more > characters at the front if you like; this will create blocks within a block.

This is what it will look like:

>>### “Those responsible for sacking the people who have just been sacked, have been sacked.” – What movie is this from?

 

And this is the result:

Bullet Lists

The final format item I would like to cover is a bullet list.  Like all the other format options, this is accomplished by the use of a special character.  In this case it is the plus sign, +.

This is what it will look like:

+ First Bullet
+ Second Bullet
This is the result:

Put it all together

If you use a number of these in one text box, this is what it might look like.
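
As a rough illustration – the wording here is made up, but the characters are the ones covered above – the Markdown in the cell might be something like this:

# Sample Runbook
## *Contacts*
[Dave Bland's Blog](http://www.davebland.com)
> **Note:** complete these steps in order.
+ First step
+ Second step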

And this is the result:

There you have it…formatting text in an Azure Data Studio Notebook without the use of a format toolbar.

Thanks for visiting my blog!!!

Auto Delete Jobs????

Have you ever looked at something in SQL Server and wondered why it is there?  That is what I think when I see this option in the SQL Server Agent job properties.  I cannot come up with any good reason why you would want a job to delete itself upon completion.  I even did a Google search and really didn't find a good reason.  However, if you know of a great reason to enable this, I would love to hear about it.

If you are not familiar with this option, you can find it under the Notifications tab of the job properties.

Just like all the notifications, you have several options.

      1. When the job succeeds
      2. When the job fails
      3. When the job completes

When the job deletes itself, it also takes the job history with it.  Basically, there will be no evidence that the job ever existed.  This is pretty straightforward, so you might be wondering why bring it up.  Well, I see this setting as a potential danger, especially if you have a disgruntled DBA.  While being a DBA is a great job, there are, from time to time, things that can have a negative impact on how we feel about our job.  Luckily the SQL Server community is filled with really great people.

We have a job that checks all of our jobs, looking for any that have this set.  You can use the query below to get the necessary information.  There is a column, delete_level, in msdb.dbo.sysjobs that will give us the information we need.

SELECT    name
        , delete_level
        , CASE delete_level
            WHEN 0 THEN 'Not Set'
            WHEN 1 THEN 'Delete on Success'
            WHEN 2 THEN 'Delete on Failure'
            WHEN 3 THEN 'Delete on Completion'
          END AS 'Delete Level Setting'
FROM msdb.dbo.sysjobs;
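
If you find a job with this enabled and decide to turn it off, something along these lines should do it (sp_update_job lives in msdb; the job name below is only an example, and a delete_level of 0 means the job is never deleted):

EXEC msdb.dbo.sp_update_job
      @job_name = N'Nightly Index Maintenance'  -- example job name
    , @delete_level = 0;                        -- 0 = never delete the job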

So why do I think this is potentially dangerous?  Imagine the disruptive code that could be written, and then imagine that same code in a job that is scheduled to execute 6 months later.  Now go one step further: all evidence of that job running is now gone.  I really don't want to give too much information, but I can think of several things that could be done that could cost the company a ton of money.  This is why we check for this.

I also include this check on my server assessment scripts.

As I said earlier, if anyone can give me a good reason to have this set, I am willing to listen and will update this post accordingly.  Giving proper credit of course.

My suggestion is to look for this!

Thanks for visiting my blog!!!!

Azure Data Studio – Schema Compare Extension

From time to time we are asked to help identify the differences between two databases.  For example, the differences between a database in QA and the same database in production.  Over the years I have become accustomed to using a very nice tool by Red-Gate, SQL Compare.  With the release of Azure Data Studio and the SQL Server Schema Compare extension, I now have another option.  As with many of the extensions, this one is also in preview.

In order to install the extension, after opening the Extensions panel in Azure Data Studio, you will want to search for Schema Compare, similar to the image below.  Notice the white star in the upper left corner; this means that the extension is recommended by Azure Data Studio.

Installation is very easy, once you have found the Schema Compare extension, click the Install button.  This is a pretty quick install and does not require a restart of Azure Data Studio.

Once installed you can launch the Schema Compare extension in one of two ways.  The first way is to right click on one of the databases you wish to compare and go to Schema Compare in the context menu.

The second method is to use the Command Palette.  You can reach this from the View menu.

The difference between the two is that if you use the Command Palette, both the source and target locations will be empty.  If you right click on a database to compare, that database will be the source and the target will be empty.

Using the Command Palette – notice both the source and destination are blank.

Right clicking on a database – the source is already populated with the database you right clicked on.

Setting up the Source and Destination

When setting up the source and destination, you have a choice to make: will you compare a database, or will you use a DACPAC to complete the compare?  You don't have to compare two databases; you can compare a database and a DACPAC if you have a need to do so.  When you click the ellipsis to the right of either the source or destination, you will be taken to the screen that lets you set up the comparison.

What I like is that I don't have to do a database-to-database compare or a DACPAC-to-DACPAC compare; Schema Compare allows for a database-to-DACPAC compare as well.  Once set up, your screen will look something like what you see below.

Since I am comparing two DACPACs, a file path is shown; however, if either the source or target is a database, it will look similar to below.  The format is ServerName.DatabaseName.  In this case, since I am using a named instance, it is slightly different: ServerName\InstanceName.DatabaseName.

Menu Options

Once you have set up the source and target, the menu bar will appear across the top.

Let’s take a moment to go over each of these.

Compare and Stop – these are pretty simple and are used to start the compare or stop it.

Generate Script – Once the compare is done, we can use this to generate the script we can use to make the changes.  Notice in the image above this is not active. It will become active after you have completed the compare AND you are comparing two databases.  You will not be able to use this option when comparing two DACPACs.

Apply – Will apply the discovered changes to the target

Options – There are a number of options that include rules to follow, like “Block on Possible Data Loss” and what to include in the compare.  “Ignore Permissions” is an example.

Switch Direction – This will take the source and make it the target…and of course take the target and make it the source.  When you click this, the arrow between the two will stay pointing to the right; it is the items being compared that move.

Open .scmp\Save .scmp – Schema Compare allows you to save the configuration so you can run the compare again at a later date if you like.  If you save it after the compare has completed, this will NOT save the results of that compare.  These files are XML based and will look similar to below.  The format is pretty simple; the XML contains the version, source, target and the current state of all the options.

 

Changing Target or Source

Once the target and source are set up, you can easily change one or both of them.  Simply click the ellipsis again, make the changes you want and click “OK”.

When you do make a change and click "OK", the following will appear in the lower right corner of Azure Data Studio.  It is simply asking if you want to run the compare now.  If you click No, you will still be able to run the compare by clicking "Compare" when you are ready.

Generate Script

This does exactly what you would expect it to: it generates the script.  Once the compare is complete, as in the image below, the Generate Script option becomes active.  Just a reminder: if both the source and the target are DACPACs, generating the script is not an option.

Prior to generating the script, you will want to review what was found, specifically the Include and Action columns, to make sure those are indeed the changes you would like to make.  If there is a difference you do not want to be part of the script, make sure to clear the check box in the Include column.  Once Azure Data Studio has finished generating the script, a new tab will open up.  This tab will not be connected to a data source; if you want to run the script, you will need to connect to the desired data source.

Notice in the bottom left corner of the above image you see "Problems" with the number 7 to the right of it.  Initially you might think that Azure Data Studio created bad code; that is not the case.  When you click on "Problems" you will see a list of the problems, as well as the line number and the location of the first character of each problem.  If you click on one of the problems, you will be taken to that location.

You still might be thinking ADS created bad code.  Now I will show that the code is just fine.  If you look at the code and scroll down just a bit you will see this.

"Detect SQLCMD mode and disable script execution if SQLCMD mode is not supported."

 

The code that is generated requires SQLCMD mode.  Once SQLCMD mode is enabled, the problems disappear.
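
For context, the top of a generated deployment script typically contains SQLCMD-style directives along these lines (the database name here is just a placeholder); these directives are what the editor flags as problems until SQLCMD mode is turned on:

:setvar DatabaseName "MyDatabase"
:on error exit
GO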

Now you are able to run the script when you are ready.  Of course, the script can be saved as a .sql file if you want to run it later.

I think this is a very useful warning that can be found near the top of the generated script.

“This code was generated by a tool.
Changes to this file may cause incorrect behavior and will be lost if
the code is regenerated.”
Just something to think about before making any changes.

Options

The final item I would like to cover is the Options button.  Of course this is exactly what you might think it is, a location to set options.

When you do click this, a panel will open up on the right side of Azure Data Studio.  At the top of this panel you will see two options.

The first option is "General Options".  This is where you can identify what you actually want to capture in the compare.  The second option identifies the object types you want to be compared, for example tables, views or stored procedures.

Under the General Options there is one in particular I would like to point out: Verify Deployment.  With this checked, the deployment will not be allowed to proceed if the validation detects a problem.  The image below is the actual description you will see.

As time goes on we will use this extension more and more, and we will then have a better understanding of some of the additional options we might want to include.  You should really check this extension out!

Thanks for visiting my blog!!

ADS – Did I Save My Changes?

After working on a query for a long time, we want to make sure that we save the changes we have made.  I have lost hours of work over the years because I didn’t save the changes.  Azure Data Studio has a few features that can help prevent this from happening.

The first feature is that ADS can "save" your changes if ADS is restarted.  This can be turned off in Preferences.  I completed a blog post on this feature.  Here is the link.  This feature allows Azure Data Studio to remember the changes you made and reopen the file when you restart ADS.

In SQL Server Management Studio an asterisk is placed on the tab the code is on.  Once you save the file, the asterisk will disappear.

Azure Data Studio is similar but instead of an asterisk, ADS uses a dot.

Once the file has been saved, the dot changes to an X.

What is interesting is that if the file has not been saved when ADS is closed, the unsaved state will be retained.  The next time you open ADS, the file will be opened, the changes will still be there and the file will be marked as not saved.

This is a short and simple blog post.  Thanks for visiting my blog!!!

ADS – Outdated Extensions

As you add more and more extensions to Azure Data Studio, keeping them up to date could present a challenge.  Of course we will want to install all the extensions that we think might be helpful to us.  Luckily, Azure Data Studio has a nice way to let us know when an extension is out of date.

If you look at the Extensions icon in the left vertical bar, you will see a number over it in a blue circle.  That number represents the number of extensions that have updates available.  In the image above, you can see that there is one extension that is out of date.

Now we need to find that particular extension.  You can do this by scrolling through your active extensions and looking for the green box, as in the above image.  This will take you to the proper location for the update.  In the case above, I needed to update the Redgate SQL Search extension.

This is an update to the VSIX file.  Once you download it, you will then need to click on the "Install Extension from VSIX Package" item under the File menu.  Prior to installing the new version, I would suggest that you review exactly what the new version brings.

As with the initial install of the extension, it is possible that you may need to restart Azure Data Studio to complete the installation.  In this case, there was a nice reminder to do so.

Thanks for visiting my blog!!!

ADS – Query History Extension

I just downloaded the latest release of Azure Data Studio.  A new extension came with it that I really like, Query History.  According to the documentation, the Query History extension does this: “Adds a Query History panel for viewing and running past executed queries”.

The installation of this extension is pretty simple and does not require a restart of Azure Data Studio.  However, if you disable or uninstall the extension, you will need to restart ADS.

If you look at the View menu, you will notice that there is not a menu item for “Query History”.

This will change once the extension is installed.  You will find the Query History item just under Output.

Once you install the extension, you will begin to see query history.  If you run a query, you will see the history as shown below.  If you do not see this at the bottom of the screen, go to the menu item mentioned above.

When you run a query, it takes a very short period of time for it to appear in the history.  In the above image, I ran three queries, two of which were the same query.  This window is not part of the query window; since it is a separate window, it will collect queries from all tabs that are open.  Each row in the history returns three things: the query, the server and database used, and the timestamp of when the query was run.  The server can be seen in the green box below.  Sometimes a fourth item is included in the list, the icon at the far left of each row.

Now that we have history, let's take a moment to see what we can do with it.  First of all, if you hover your cursor over one of the queries, you will see a pop-up.  This pop-up will display the query.

You will see a lot more options if you right click on one of the queries.

All the options are pretty straightforward and really don't need a great deal of explanation.  The bottom option, Pause Query History Capture, I can see being useful if you are running a query that you don't want to be part of the history, or are simply running too many queries at the moment.

Since these options are pretty straight forward, let’s take a look at a few questions.

What happens to the history when Azure Data Studio is closed?

The history is no longer available

What if I execute a batch, will each query in the batch be a separate line?

All the queries for the batch will be on the same line.  The query in the green box is actually two SELECT statements.
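
For example, running a two-statement batch such as the following should show up as a single row in the history:

SELECT TOP (5) name FROM sys.databases;
SELECT TOP (5) name FROM sys.tables;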

To the left of the query, I see a green check mark.  What other icons might I see?

There is currently only one other icon you might see. The red X when there is an error.

When there is an error, you will see a small number next to the "Problems" tab.  When you click on this tab, you can see details on what the error was.

What’s the difference between “Run Query” and “Open Query”?

Open Query will do just that: open the query so you can review the TSQL.  Run Query will simply run the query, and you will see the results if there are any.

Overall, I really like this extension and can see the practical use of it.  Just remember, this extension does not store the queries beyond the Azure Data Studio session.

Thanks for visiting my blog!!!