
Adding a new Tab to the Server Performance Section #349

Open
EsQueEl-Fella opened this issue Sep 29, 2022 · 3 comments
Labels
DBA Dash Agent Issue affecting the agent DBA Dash GUI system client (web, mobile, CLI, 3rd party)

Comments

@EsQueEl-Fella

This is a feature request.

The image shows the data in "Slow Queries" Tab.

I would first suggest renaming the tab to something like "All Queries", as the title "Slow Queries" is a bit misleading. Some of the queries are definitely slow, but not all of them. At this stage we are classifying all the queries by time, and not all of the shown results are slow. From another point of view, if these reported queries are the slow ones, then where can the user find the "fast" ones? :D (Now that I think about it, this could also be an interesting feature: "Fast Queries", which you could implement in the future.)

Then I would suggest a new tab, "Most Expensive Queries", which would provide a more in-depth analysis of what the real pain is for the SQL engine. As the image shows, the number of running queries on a server (in this case an empty one) can increase very quickly, and it is also not that easy to identify the one that may be causing the pain once we are above two-digit numbers. I am intentionally showing the executed query in the image to make the point: it is a WMI notification alert which I created for a certain purpose on the server, and it is monitoring the state of the server.

Below you can find two samples of "Most Expensive Queries", each one very useful. I would suggest that a combination of the data from both could be added to a new tab where marked in the image.

most_expensive_queries

SQL SERVER – Find Most Expensive Queries Using DMV
https://blog.sqlauthority.com/2010/05/14/sql-server-find-most-expensive-queries-using-dmv/

SQL DMV: Most Expensive Queries, Missing Indexes
https://knowledge-base.havit.eu/2017/10/03/sql-dmv-most-expensive-queries-missing-indexes/

@DavidWiseman
Collaborator

I'm not 100% sold on the name Slow Queries either. It's also not all queries - by default the collection captures RPC/Batch completed events that took longer than 1 second. Changing it to RPC/Batch Completed would be more accurate, but it's a slightly longer name for the tab.

DBA Dash doesn't capture statement-level metrics, which is a limitation. I would recommend using Query Store to complement what is collected by DBA Dash.
I have thought about collecting data from sys.dm_exec_query_stats, or possibly pulling data from Query Store, to include performance tracking at the statement/plan level. Ideally we want to capture this over time, so it gets more complicated. The tricky part is getting this to work well for servers that have a heavy ad hoc workload while keeping the volume of data collected reasonable. I'll probably include this at some point - but probably not soon.
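For reference, a one-off snapshot from sys.dm_exec_query_stats looks roughly like the queries in the linked articles - a minimal sketch (the column choices and the CPU-based ordering are illustrative, not what DBA Dash would necessarily collect):

```sql
-- Sketch: top 10 statements by total CPU time since they were cached.
-- Note: dm_exec_query_stats only reflects plans currently in cache.
SELECT TOP 10
    qs.execution_count,
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,
    qs.total_elapsed_time,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```

The "over time" problem mentioned above is that these counters are cumulative per cached plan, so tracking change per interval means diffing snapshots and handling plans that get evicted between collections.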

@DavidWiseman DavidWiseman added DBA Dash Agent Issue affecting the agent DBA Dash GUI system client (web, mobile, CLI, 3rd party) labels Sep 29, 2022
@EsQueEl-Fella
Author

Just a short comment on QS: not every DB has QS active. According to MS, Query Store will only be enabled by default from SQL Server 2022 onwards. I would definitely go with the DMV method, as one can always extract some data for sure.
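For what it's worth, checking and enabling Query Store per database is a one-liner each (the database name below is a placeholder):

```sql
-- Check whether Query Store is active in the current database
SELECT actual_state_desc FROM sys.database_query_store_options;

-- Enable it where it is off
ALTER DATABASE [YourDatabase] SET QUERY_STORE = ON;
```

This doesn't change the point above: on estates with many databases and versions prior to SQL Server 2022, you can't assume it is on.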

As for the concern regarding the workload and the volume of the collected data: basically it would be enough to keep track of the top 10 MEQs. I don't see any danger of overloading DBA Dash or putting extra load on a busy server. DMVs are heavily tuned, and we don't need to collect this data every minute. I would say collecting every hour would do the job; I would even go further and collect once every day. The point here is to identify the painful queries, not to save a second on a normal query which by design may take a minute or so to run.

@DavidWiseman
Collaborator

There are challenges as you start to filter the data. For example, if you collect the top 10 queries, those will change over time. It's more of a problem if you want to track the data every minute/hour etc. - working out the diffs for the current period without collecting everything is tricky.
Just returning the top 10 from what is in the DMVs since each query was cached is easier. sp_BlitzCache does a good job of this. Given it's open source with an MIT license, it might be possible to incorporate it within DBA Dash - something I might consider at some point.
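As a sketch of the "easier" option, a typical ad hoc sp_BlitzCache call (parameter names as documented in the First Responder Kit; shown here only to illustrate the shape of the output a tab could reuse):

```sql
-- Top 10 cached queries ranked by CPU; other sort orders include
-- 'reads', 'duration' and 'executions'
EXEC sp_BlitzCache @SortOrder = 'cpu', @Top = 10;
```

This is still a point-in-time view of the plan cache rather than tracking over time, which is the part that makes a built-in collection harder.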
