Question

I know it is very easy to export a single query plan. My question is, when I create a query which returns multiple plans, such as looking at the most costly queries on the server, is there a way to bulk export those plans for later analysis or do I need to export them one at a time? Simply saving the query results lets me keep the XML, but that is still a one-by-one review, and I am missing something in the process that would let me view the plans later.

I am 99% certain there is something simple I am missing. Could you help me find that?

Solution

Your problem sounded like something I've been thinking about for a while now, so PowerShell to the rescue. You can modify the query for the pain point you want to examine. The only thing actually saved is the XML for the execution plan. Each plan is saved to its own file, named by database and number, under a dated directory, under the serverInstance directory. Hope this works for you!

EDIT: I reworked the script a little to add a few parameters and some descriptors. This version should be a little more user-friendly. The comment block at the top will provide some help context when you use Get-Help. I am new to PoSh scripting, so be kind if something looks newbish.

    <#  
        .SYNOPSIS
           TopTen.ps1 - returns the top ten most expensive execution plans based on the criteria you provide 
        .DESCRIPTION
           Execution plans are saved to the location of your choosing as .sqlplan files
        .PARAMETER sqlServer
           SQL Server and Instance you want to query
        .PARAMETER filePath
            Defaulted to "C:\ExpensiveQueries\"; set to the location you want your files saved
        .PARAMETER painPoint
            Defaulted to "TotalCPU"; set to the pain point you want to float to the top

            Options are:
           --TotalReads
           --TotalCPU
           --TotalDuration
           --execution_count 
        .EXAMPLE
           c:\Scripts\TopTen.ps1 -sqlServer "Server\Instance"

           Queries against Server\Instance, saving files in the default location, looking specifically for High CPU plans
        .EXAMPLE
           c:\Scripts\TopTen.ps1 -sqlServer "Server\Instance" -filePath "D:\ServerAnalysis\ExpensivePlans" -painPoint "TotalDuration"

           Queries against Server\Instance, saving files in D:\ServerAnalysis\ExpensivePlans, looking specifically for long running plans
        .NOTES
           Script provided as is with no guarantees.  Use at your own risk and modify/share as desired
        #>
    param
    (
        [Parameter(Mandatory=$true, 
                    ValueFromPipeline=$true,
                    ValueFromPipelineByPropertyName=$true, 
                    ValueFromRemainingArguments=$false, 
                    Position=1,
                    ParameterSetName='Parameter Set 1')]
        [string]$sqlServer,


        [Parameter(Mandatory=$false, 
                    ValueFromPipeline=$true,
                    ValueFromPipelineByPropertyName=$true, 
                    ValueFromRemainingArguments=$false, 
                    Position=2,
                    ParameterSetName='Parameter Set 1')]
        [string]$filePath="C:\ExpensiveQueries\",

        [Parameter(Mandatory=$false, 
                    ValueFromPipeline=$true,
                    ValueFromPipelineByPropertyName=$true, 
                    ValueFromRemainingArguments=$false, 
                    Position=3,
                    ParameterSetName='Parameter Set 1')]
        [string]$painPoint="TotalCPU"
    )

    #=============================#

    $sqlcmd="SELECT TOP 10
           DB_NAME(st.dbid) as DBName,
           total_worker_time / execution_count AS AvgCPU,
           total_worker_time AS TotalCPU,
           total_elapsed_time / execution_count AS AvgDuration,
           total_elapsed_time AS TotalDuration,
           total_logical_reads / execution_count AS AvgReads,
           total_logical_reads AS TotalReads,
           --qs.plan_handle
           query_plan
    FROM 
           sys.dm_exec_query_stats AS qs
    CROSS APPLY 
           sys.dm_exec_sql_text(qs.sql_handle) AS st
    CROSS APPLY 
           sys.dm_exec_query_plan(qs.plan_handle) AS qp
    WHERE query_plan IS NOT NULL
    ORDER BY --Choose the one you want to use:
          $painPoint
           --TotalReads
           --TotalCPU
           --TotalDuration
           --execution_count 
           DESC
    OPTION (RECOMPILE);"

    $iAmHere=Get-Location

    # Make sure the output path ends in exactly one backslash
    $filePath=($filePath+"\").Replace("\\","\")

    # -MaxCharLength keeps the plan XML from being truncated at the default of 4000 characters
    $sqlResult=Invoke-SqlCmd -MaxCharLength 999999 -Query $sqlcmd -ServerInstance $sqlServer

    If($?)
    {
        $timestamp = Get-Date -Format s | foreach {$_ -replace ":", ""} | foreach {$_ -replace "-",""}

        $filePath+=($sqlServer.replace("\","_"))+"\"+$timestamp

        If (!(Test-Path $filePath))
        {
            New-Item -Path $filePath -ItemType "Directory"
        }

        $queryNum=0
        foreach ($d in $sqlResult)
        {
            $queryNum+=1
            $fileName=$filePath+"\"+$d.DBName+"_"+$queryNum+".sqlplan"

            # Plan XML for the current row
            $fileValue=$d.query_plan

            New-Item -Path $fileName -ItemType File -Value $fileValue
        }

        Set-Location $iAmHere
    }
    Else
    {    
        Throw $error[0].Exception
    }

OTHER TIPS

I know it is very easy to export a single query plan. My question is, when I create a query which returns multiple plans, such as looking at the most costly queries on the server, is there a way to bulk export those plans for later analysis or do I need to export them one at a time?

I would suggest you create a utility database, e.g. dba_repository (or whatever name you'd like the database to have), and store the information in that utility database along with the query plan XMLs. Over time, it would serve as a warehouse for performance tuning and benchmarking.

Create physical tables and store the relevant information in them. This way you can back up the database along with all the info you gathered.
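
For example, here is a minimal sketch of what that could look like. The database name dba_repository comes from above, but the table name dbo.CapturedQueryPlans and its columns are just placeholders; adjust them to whatever you actually want to keep:

    USE dba_repository;  -- hypothetical utility database name
    GO
    CREATE TABLE dbo.CapturedQueryPlans
    (
        CaptureDate    DATETIME2(0) NOT NULL DEFAULT SYSDATETIME(),
        DBName         SYSNAME      NULL,
        TotalCPU       BIGINT       NULL,
        TotalDuration  BIGINT       NULL,
        TotalReads     BIGINT       NULL,
        ExecutionCount BIGINT       NULL,
        QueryPlan      XML          NULL
    );
    GO
    -- Capture the current top offenders using the same DMVs as the TOP 10 query above
    INSERT INTO dbo.CapturedQueryPlans
            (DBName, TotalCPU, TotalDuration, TotalReads, ExecutionCount, QueryPlan)
    SELECT TOP (10)
            DB_NAME(st.dbid),
            qs.total_worker_time,
            qs.total_elapsed_time,
            qs.total_logical_reads,
            qs.execution_count,
            qp.query_plan
    FROM    sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
    WHERE   qp.query_plan IS NOT NULL
    ORDER BY qs.total_worker_time DESC;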

IMHO, it would get ugly if you go down the route of exporting plan XMLs into folders.

Basically, you need to explore the plan cache on your server instance; that will allow you to tune your workload and give you the information below (a small example query follows the list):

  • Plans with Missing Indexes
  • Plans with Warnings
  • Plans with Implicit Conversions
  • Plans with Index Scans
  • Plans with Lookups
  • Finding index usage
  • Plans with Parameterization
  • Cost of Parallel Plans
  • Cost of Parallel Plans with detail per Operator
  • Parallel plans where Avg. Worker Time > Avg. Elapsed Time
  • Parallel plans where Avg. Worker Time > Avg. Elapsed Time with detail per Operator
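
As a small illustration of the kind of plan-cache mining those scripts do (this is only a minimal sketch, not one of the referenced scripts), you can run XQuery against the cached plan XML, e.g. to find plans with missing indexes:

    -- Minimal sketch: cached plans that contain a MissingIndexes element
    WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
    SELECT TOP (20)
           DB_NAME(st.dbid) AS DBName,
           cp.usecounts,
           qp.query_plan
    FROM   sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle)  AS st
    WHERE  qp.query_plan.exist('//MissingIndexes') = 1
    ORDER BY cp.usecounts DESC;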

There is a wealth of information and scripts in the blog posts written by Pedro Azevedo Lopes - Exploring the plan cache – Part 1 & Part 2.

Also, Jonathan Kehayias has written a lot about exploring the plan cache.

A gentle note: don't do knee-jerk performance tuning, e.g. don't just blindly create missing indexes when you find them. Instead, follow a more methodical approach by baselining your server instance and then taking it from there.

Since you already have a query to get you the super-most-costliest queries on the server, why not just have it save the plans from each row, as you get the results? Not possible, you say? Sure it is :-) Just create a SQLCLR scalar function, such as the following, and add it to your query:

using System;
using System.Data.SqlTypes;
using System.IO;
using System.Text;
using Microsoft.SqlServer.Server;

public class FileUtils   // class name must match the EXTERNAL NAME binding below
{
    [Microsoft.SqlServer.Server.SqlFunction(IsDeterministic = false, IsPrecise = true)]
    public static SqlString SaveXmlToFile([SqlFacet(MaxSize = 4000)] SqlString FilePath,
        SqlXml XmlData)
    {
        try
        {
            // Write (or overwrite) the file as UTF-16 text
            File.WriteAllText(FilePath.Value, XmlData.Value, Encoding.Unicode);
        }
        catch (Exception __Exception)
        {
            return __Exception.Message;
        }

        return String.Empty;
    }
}

And there is no need to do any NULL checking of the input parameters via .IsNull() since I am using the RETURNS NULL ON NULL INPUT option:

CREATE FUNCTION [dbo].[SaveXmlToFile](@FilePath NVARCHAR(4000), @XmlData XML)
RETURNS NVARCHAR(4000)
WITH EXECUTE AS CALLER,
     RETURNS NULL ON NULL INPUT
AS EXTERNAL NAME [SomeAssemblyName].[FileUtils].[SaveXmlToFile];

Then you can use it like this:

SELECT *, Test.dbo.SaveXmlToFile(N'C:\TEMP\XmlQueryPlans\'
        + CONVERT(NVARCHAR(50), qstat.query_plan_hash, 2)
        + N'_'
        + CONVERT(NVARCHAR(10), qstat.statement_start_offset)
        + N'-'
        + CASE qstat.statement_end_offset
            WHEN -1 THEN N'InfinityAndBeyond'
            ELSE CONVERT(NVARCHAR(10), qstat.statement_end_offset)
        END
        + N'.sqlplan', qplan.query_plan)
FROM    sys.dm_exec_query_stats qstat
OUTER APPLY sys.dm_exec_query_plan(qstat.plan_handle) qplan;

Since this is a function that can work in any query, it can be automated, unlike an SSMS plugin.

And it is more flexible than embedding a query into a program that will save the plans (though I will say that the PowerShell suggestion from Steve's answer is the next best option).

And, it isn't specific to saving query plans. It can save any XML field or variable, so it can be used in several other places as well :-).
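
For example (a toy sketch, assuming the function lives in the same Test database as above and that C:\TEMP exists on the SQL Server machine), you could dump any XML variable to disk:

    DECLARE @SomeXml XML = N'<Settings><Item name="a" value="1" /></Settings>';
    -- Returns an empty string on success, or the exception message on failure
    SELECT Test.dbo.SaveXmlToFile(N'C:\TEMP\SomeSettings.xml', @SomeXml) AS [Result];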

A few easy steps to get the above SQLCLR function working (and pretty much any Assembly you create that needs EXTERNAL_ACCESS or UNSAFE):

  1. The assembly needs to be signed. In Visual Studio, go to Project Properties -> SQLCLR tab -> Signing... button.

  2. "CLR Integration" needs to be enabled:

    EXEC sp_configure 'clr enabled', 1;
    RECONFIGURE;
    
  3. Create an Asymmetric Key in [master] from the DLL:

    USE [master];
    CREATE ASYMMETRIC KEY [KeyName]
    FROM EXECUTABLE FILE = 'Path\to\SomeAssemblyName.dll';
    
  4. Create a Login in [master] from the Asymmetric Key:

    CREATE LOGIN [SomeLoginName]
    FROM ASYMMETRIC KEY [KeyName];
    
  5. Grant the Key-based Login the appropriate permission:

    GRANT EXTERNAL ACCESS ASSEMBLY TO [SomeLoginName];
    

Please notice how none of those steps involved turning the database's TRUSTWORTHY property to ON!
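
If you want to see that for yourself, here is a quick check (assuming the assembly was created in the Test database used in the examples above):

    -- 0 = TRUSTWORTHY is OFF, which is where it should stay
    SELECT name, is_trustworthy_on
    FROM   sys.databases
    WHERE  name = N'Test';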


Now, if you are interested in profiling a particular query or set of queries (to compare against each other), then check out my answer on a related question on StackOverflow:

Programatically read SQL Server's query plan suggested indexes for a specific execution of SQL?


UPDATE:

@Kin's answer reminded me that there is additional information, beyond the plan itself, that would likely be very useful to save with the plan. At the very least there are all of the additional fields in sys.dm_exec_query_stats, and there is also a sys.dm_exec_plan_attributes DMV that is quite interesting.
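
If you have not looked at that DMV before, here is a quick peek at what it returns for a single cached plan (just an illustrative query, not part of the export itself):

    -- One row per attribute (set_options, dbid, objectid, etc.) for one cached plan
    SELECT pa.attribute, pa.value, pa.is_cache_key
    FROM   (SELECT TOP (1) plan_handle FROM sys.dm_exec_query_stats) AS qs
    CROSS APPLY sys.dm_exec_plan_attributes(qs.plan_handle) AS pa;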

If you want to store all of this data in a utility database as Kin is suggesting, then it should be fairly straightforward to get the one row per plan of query stats, and the multiple rows per plan of plan attributes, into one or more tables. But if you still want to export the plans, is it possible to also capture this information and keep it with the query plan so you don't have to look at multiple files? The answer: yup. We can graft the fields from sys.dm_exec_query_stats (on the same row as the current plan) and all of the rows from sys.dm_exec_plan_attributes for the current plan into the Query Plan XML such that it:

  • is still a single file, and
  • opens up just the same in SSMS and SQL Sentry Plan Explorer

SELECT *, Test.dbo.SaveXmlToFile(N'C:\TEMP\XmlQueryPlans\'
        + CONVERT(NVARCHAR(50), qstat.query_plan_hash, 2)
        + N'_'
        + CONVERT(NVARCHAR(10), qstat.statement_start_offset)
        + N'-'
        + CASE qstat.statement_end_offset
            WHEN -1 THEN N'InfinityAndBeyond'
            ELSE CONVERT(NVARCHAR(10), qstat.statement_end_offset)
        END
        + N'.sqlplan', STUFF(
            tplan.query_plan,
            CHARINDEX(N'<BatchSequence>', tplan.query_plan),
            0,
            (
                SELECT stat.data + attrs.data
                FROM (
                        SELECT qstat.*
                        FOR XML RAW('QueryStats'), BINARY BASE64
                    ) stat(data)
                CROSS JOIN (
                        SELECT * FROM sys.dm_exec_plan_attributes(qstat.plan_handle)
                        FOR XML RAW('Attr'), ROOT('PlanAttributes'), BINARY BASE64
                    ) attrs(data)
                )
            )
         ) AS [SavingXmlToFile]
FROM    sys.dm_exec_query_stats qstat
OUTER APPLY sys.dm_exec_text_query_plan(qstat.plan_handle, 0, -1) tplan;

The resulting XML looks like:

<ShowPlanXML xmlns="http://schemas.microsoft.com/sqlserver/2004/07/showplan" ...>
  <QueryStats 
        sql_handle="AgAAABFDQzHOhP02ljxkCQcP9kCIEz5aAAAAAAAAAAAAAAAAAAAAAAAAAAA=" 
        statement_start_offset="0" statement_end_offset="-1" plan_generation_num="1" 
        plan_handle="BgABABFDQzGQSbH4AQAAAAEAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=" ... />
  <PlanAttributes>
    <Attr attribute="set_options" value="251" is_cache_key="1" />
    <Attr attribute="objectid" value="826491665" is_cache_key="1" />
    <Attr attribute="dbid" value="1" is_cache_key="1" />
    <Attr attribute="dbid_execute" value="0" is_cache_key="1" />
    ...
  </PlanAttributes>
  <BatchSequence>
    <Batch>
    ...

Grafting new elements into an existing structure would be easier with XML DML and the .modify() function, but that only works in a SET against an XML variable or an UPDATE against an XML column, so it isn't an option inline in the SELECT above.
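
For comparison, this toy sketch shows what the XML DML form looks like against an XML variable; it works fine on its own, but it cannot be embedded in the SELECT that writes the files:

    -- XML DML: insert a node as the first child of the root element
    DECLARE @plan XML = N'<ShowPlanXML><BatchSequence /></ShowPlanXML>';
    SET @plan.modify('insert <QueryStats note="example" /> as first into (/ShowPlanXML)[1]');
    SELECT @plan AS [ModifiedXml];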

Licensed under: CC-BY-SA with attribution
Not affiliated with dba.stackexchange