Need to reduce CPU cost on a query that runs quickly but gets called often

Looking to see how I can reduce the cost of the query. I'm also confused about why the cost of tbl_Fare_Detail is so high.

Plan.pesession (2.6 MB)
By Me 1 · asked Feb 21 at 08:46 PM

2 answers

So I think a few things are in order:

  1. Estimates are off for several operations, like an estimate of 1 row against an actual of 7K+. To solve this I would consider using an appropriately-indexed #temp table instead of a table variable (a sketch follows this list).

  2. I'll repeat my advice to try out my dynamic SQL model for optional parameters - this query is potentially going to be very vulnerable to suboptimal operations based on a plan being chosen for different parameters. Dynamic SQL isn't the most maintainable thing, but if you combine it with Optimize for Ad Hoc Workloads you will get plans that are much more appropriate for any given set of provided parameters (for parameter sniffing issues, you can also add OPTION (RECOMPILE) within the dynamic SQL). A rough sketch follows this list.

  3. I would consider not having that big ugly set of substring comparisons - why are there seven independent pieces of information encoded into a single string? This alone destroys any chance of a seek on FCL. A possible stopgap is sketched after this list.

  4. What is the purpose of OPTION (FORCE ORDER) in this plan? What specific benefit does it provide in this case? Have you tried without it?

  5. Have you considered read committed snapshot instead of littering your query with READUNCOMMITTED hints? That may or may not improve performance, depending on current tempdb load, but it increases your odds of having correct results. The setting change is shown after this list.
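For point 1, a minimal sketch of the #temp table swap; the column, parameter, and filter below are invented for illustration. Unlike a table variable, the keyed #temp table carries real statistics, so the optimizer won't guess 1 row:

    -- Keyed #temp table instead of DECLARE @t TABLE (...); names are hypothetical.
    CREATE TABLE #FareKeys
    (
        FareKey int NOT NULL PRIMARY KEY CLUSTERED
    );

    DECLARE @FareType int = 1;   -- stand-in for the proc's actual parameter

    INSERT #FareKeys (FareKey)
    SELECT FareKey
    FROM dbo.tbl_Fare_Detail
    WHERE FareType = @FareType;  -- hypothetical filter

    -- Downstream joins against #FareKeys now see realistic row estimates.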
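For point 2, a rough sketch of conditional dynamic SQL for optional parameters; @FareType, @StartDate, and the column names are invented stand-ins for the procedure's real parameters:

    DECLARE @FareType int = NULL, @StartDate date = NULL;  -- stand-ins for proc parameters

    DECLARE @sql nvarchar(max) = N'SELECT FareKey FROM dbo.tbl_Fare_Detail WHERE 1 = 1';

    IF @FareType IS NOT NULL
        SET @sql += N' AND FareType = @FareType';

    IF @StartDate IS NOT NULL
        SET @sql += N' AND FareDate >= @StartDate';

    -- Each combination of supplied parameters compiles its own, better-fitting plan;
    -- append N' OPTION (RECOMPILE)' to @sql if parameter sniffing becomes an issue.
    EXEC sys.sp_executesql @sql,
        N'@FareType int, @StartDate date',
        @FareType = @FareType,
        @StartDate = @StartDate;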
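For point 3, if the string can't be normalized right away, one stopgap is a persisted computed column per encoded piece, which can then be indexed. This assumes FCL is the encoded column on tbl_Fare_Detail; the piece name, position, and length are invented:

    ALTER TABLE dbo.tbl_Fare_Detail
        ADD FareClass AS SUBSTRING(FCL, 1, 2) PERSISTED;  -- hypothetical position/length

    -- An index on the computed column restores the chance of a seek:
    CREATE INDEX IX_tbl_Fare_Detail_FareClass
        ON dbo.tbl_Fare_Detail (FareClass);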
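For point 5, read committed snapshot is a database-level switch (the database name below is a placeholder); once it's on, the READUNCOMMITTED hints can come out of the query:

    -- Run during a quiet period; WITH ROLLBACK IMMEDIATE kills open transactions.
    ALTER DATABASE YourDatabase
        SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;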

By Aaron Bertrand ♦ 1.7k · answered Feb 22 at 08:44 PM

Thanks for the awesome answer and your time, much appreciated.

  1. I am using the declared tables to try to prevent a recompile, because of how often this sp gets called. I have tried using a temp table without much of a change, but I will look into it further.

  2. I want to try to prevent the recompile, but I will experiment with this as well.

  3. I have started to change the structure so that I am not using the function.

  4. I added this to help with the declared table and to force the order of the joins; it helped deal with the estimate of 1 row from the declared table (a placeholder sketch of the hint appears after this list).

  5. I will look into this. I am not very worried about the dirty reads here vs. the locking; I have code further down the process to make sure the data is in check. I found the payoff was worth the chance of a small change in between.
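For reference on point 4, the hint in question looks like the sketch below (the join and table names are placeholders); with the better estimates a #temp table provides, it may no longer be necessary:

    SELECT d.FareKey
    FROM #FareKeys AS f               -- small, well-estimated input listed first
    JOIN dbo.tbl_Fare_Detail AS d
        ON d.FareKey = f.FareKey
    OPTION (FORCE ORDER);             -- joins happen in the order written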

Thanks again for all your help

By Me 1 · answered Feb 23 at 03:28 AM
Comment by Aaron Bertrand ♦ · Feb 23 at 02:45 PM

If you were getting recompiles simply by having a #temp table, there is probably a reason (like you are mixing DML and DDL). See this post and this post by Paul White. I have a hard time believing the performance with a table variable is better, recompile or not; I also don't know how necessary the temporary object is, and whether performance would be better without either. Also, recompiles are not necessarily bad - often the cost of compilation is well worth the performance difference of a more appropriate plan, given runtime parameters or updated stats. In the case of a #temp table, an intentional recompile is often worthwhile if the number of rows in the #temp table will vary substantially from run to run (this article might also be worth a read). None of this is always true, but I honestly only worry about recompiles when they are actually demonstrating a real performance problem.
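A sketch of the DDL-before-DML shape being described, with hypothetical names throughout; all of the #temp table DDL happens up front (the key is declared with the table rather than added after the insert), so no DDL follows DML:

    CREATE TABLE #Fares
    (
        FareKey int NOT NULL PRIMARY KEY CLUSTERED  -- declared up front, not added later
    );

    DECLARE @StartDate date = '20170101';  -- stand-in for a runtime parameter

    INSERT #Fares (FareKey)
    SELECT FareKey
    FROM dbo.tbl_Fare_Detail
    WHERE FareDate >= @StartDate;          -- hypothetical filter

    -- Where row counts swing a lot between runs, an intentional recompile on the
    -- consuming statement can beat reusing a plan sized for the wrong row count:
    SELECT f.FareKey
    FROM #Fares AS f
    OPTION (RECOMPILE);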
