Help to optimise stored procedure
I have inherited a complicated (to me) stored procedure. It returns products by category (id), by brand (id), or via a search string, and also checks for any valid attributes (which are dynamic; for example, a material could be metal, so an ID for metal would be passed in).
It works in terms of results, but as the data grows it gets noticeably slower.
Have attached the plan. Wonder if anyone can help point me in the right direction.
Please let me know if other information would be helpful.
- The "generic search" with optional arguments is extremely vulnerable to bad parameter sniffing. And it forces the optimizer to cater for lots of crazy combinations that may or may not ever actually happen.
In many cases adding OPTION (RECOMPILE) at the end of the query helps a lot, but at the expense of increased recompiles (something you should monitor for if you decide to try this). Not sure if it will help in your case, but worth a shot.
Another alternative is to create several separate stored procedures for each possible combination (or for the most common/most important combinations plus one for the rest), and have the primary entry point stored procedure just do a large IF ... ELSE IF ... ELSE to invoke the correct version.
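As a minimal sketch of where the hint goes (the parameter and column names here are illustrative, not taken from your actual procedure):

```sql
-- Hypothetical sketch: the main query of the procedure with a
-- statement-level recompile hint. With RECOMPILE, the optimizer builds
-- a fresh plan per execution using the actual parameter values, so the
-- NULL branches of each optional filter can be optimized away.
SELECT  p.ProductID, p.Name, p.Price
FROM    dbo.ProductTbl AS p
WHERE   (@CategoryID IS NULL OR p.CategoryID = @CategoryID)
  AND   (@BrandID    IS NULL OR p.BrandID    = @BrandID)
OPTION (RECOMPILE);
```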
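The dispatcher idea could look roughly like this (a hedged sketch; the procedure and parameter names are invented for illustration):

```sql
-- Hypothetical sketch of the dispatcher pattern. Each inner procedure
-- contains a query written, and separately optimized, for exactly one
-- combination of search arguments.
CREATE PROCEDURE dbo.SearchProducts
    @CategoryID int = NULL,
    @BrandID    int = NULL,
    @SearchText nvarchar(200) = NULL
AS
BEGIN
    IF @CategoryID IS NOT NULL AND @BrandID IS NULL AND @SearchText IS NULL
        EXEC dbo.SearchProducts_ByCategory @CategoryID;
    ELSE IF @BrandID IS NOT NULL AND @CategoryID IS NULL AND @SearchText IS NULL
        EXEC dbo.SearchProducts_ByBrand @BrandID;
    ELSE
        EXEC dbo.SearchProducts_Generic @CategoryID, @BrandID, @SearchText;
END;
```

Because each inner procedure gets its own cached plan, a parameter value sniffed for one combination can no longer pollute the plan used for another.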
- The price logic is also hard to optimize because you filter on DIFFERENT prices which renders indexes useless. And one of the prices comes from another table so you cannot even use a derived column.
Something you CAN consider (depending on how often data changes) is to create an indexed view that joins ProductTbl and the aggregated data from ProductVariants and adds a column that holds either Price or SalePrice or VariantMinPrice depending on the HasVariants and ShowSalesPrice columns.
That being said, the price logic does not appear to be a problem judging from the plan you posted so don't worry about this one too much. (Though you may find that there are other plans used for the stored procedure as well).
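To make the indexed-view idea concrete, here is a hedged sketch. Note one catch: indexed views require SCHEMABINDING and do not allow MIN/MAX, so the per-product minimum variant price cannot be aggregated inside the view itself; in this sketch I assume a hypothetical ProductVariantSummary table that already stores it (all names other than ProductTbl are my guesses):

```sql
-- Hedged sketch; every table and column name beyond ProductTbl is an
-- assumption. The CASE collapses the three possible prices into one
-- indexable EffectivePrice column.
CREATE VIEW dbo.vProductEffectivePrice
WITH SCHEMABINDING
AS
SELECT  p.ProductID,
        CASE WHEN p.HasVariants = 1    THEN s.VariantMinPrice
             WHEN p.ShowSalesPrice = 1 THEN p.SalePrice
             ELSE p.Price
        END AS EffectivePrice
FROM    dbo.ProductTbl AS p
JOIN    dbo.ProductVariantSummary AS s
  ON    s.ProductID = p.ProductID;
GO
-- Materialize the view; price filters can now seek on this index.
CREATE UNIQUE CLUSTERED INDEX ix_vProductEffectivePrice
    ON dbo.vProductEffectivePrice (ProductID);
```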
- And it will probably not surprise you that the string search with a combination of full-text search and LIKE with leading wildcards is not helping either…
- The combination of ROW_NUMBER and filter on RowNum = 1 suggests that you are getting duplicates that you need to get rid of. I suggest that you try to find the root cause for this and then change the query. That saves (1) the effort of creating the duplicates; and (2) the effort of removing them.
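For reference, if the duplicates come from a join that matches multiple rows per product (I am guessing at the cause and at all names here), replacing the join with an EXISTS test returns each product once and makes both the ROW_NUMBER and the RowNum = 1 filter unnecessary:

```sql
-- Illustrative sketch only: an EXISTS (semi join) cannot multiply rows,
-- so no deduplication step is needed afterwards.
SELECT  p.ProductID, p.Name
FROM    dbo.ProductTbl AS p
WHERE   EXISTS (SELECT *
                FROM   @AttrValsArray AS a
                WHERE  a.AttrValID = p.AttrValID);  -- hypothetical join column
```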
That being said, there definitely are options to improve this.
- The WHERE clause has a test on variables only, so that nothing is returned if not at least one of the search arguments is passed in. Why not move that to an IF and not execute the query at all in such a case?
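Something along these lines (parameter names are illustrative):

```sql
-- Sketch: bail out early instead of letting the query evaluate a
-- variables-only predicate against every row.
IF @CategoryID IS NULL AND @BrandID IS NULL AND @SearchText IS NULL
   AND NOT EXISTS (SELECT * FROM @AttrValsArray)
BEGIN
    RETURN;   -- nothing to search on; skip the expensive query entirely
END;
```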
- If you are going to try the OPTION (RECOMPILE) idea, then I recommend testing the existence of rows in @AttrValsArray and @BrandsArray before starting the query and capturing that in a variable. A statement-level recompile can eliminate entire branches based on a variable value. (Something that cannot be done in an execution plan that might have to be reused.)
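A sketch of that combination (column names are assumptions):

```sql
-- Capture the emptiness tests in variables up front...
DECLARE @HasAttrFilter bit = 0, @HasBrandFilter bit = 0;
IF EXISTS (SELECT * FROM @AttrValsArray) SET @HasAttrFilter = 1;
IF EXISTS (SELECT * FROM @BrandsArray)   SET @HasBrandFilter = 1;

-- ...then the statement-level recompile sees the actual values and can
-- prune the dead branches of the WHERE clause entirely.
SELECT  p.ProductID
FROM    dbo.ProductTbl AS p
WHERE   (@HasBrandFilter = 0
         OR p.BrandID IN (SELECT b.BrandID FROM @BrandsArray AS b))
OPTION (RECOMPILE);
```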
- Based on the execution plan I see that user-defined function Fn_GetSubCategoryIds is not coded as an inline table-valued function. Either rewrite it to be inline (preferred, because it will benefit every query that uses it), or, second choice, execute it separately and capture the results in a temporary table or table variable.
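An inline rewrite could look roughly like this; I am guessing at the underlying table and column names, but the key point is that a recursive CTE works fine inside an inline table-valued function, so the optimizer can expand it into the calling query:

```sql
-- Hedged sketch of Fn_GetSubCategoryIds as an inline TVF.
-- dbo.Categories and ParentCategoryID are assumed names.
CREATE FUNCTION dbo.Fn_GetSubCategoryIds (@CategoryID int)
RETURNS TABLE
AS
RETURN
(
    WITH Subs AS
    (
        SELECT c.CategoryID                  -- anchor: the category itself
        FROM   dbo.Categories AS c
        WHERE  c.CategoryID = @CategoryID
        UNION ALL
        SELECT c.CategoryID                  -- recurse into child categories
        FROM   dbo.Categories AS c
        JOIN   Subs AS s
          ON   c.ParentCategoryID = s.CategoryID
    )
    SELECT CategoryID FROM Subs
);
```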
That's all I have time for right now.
I will go away and go through your suggestions, I have some reading and understanding to do for some of these concepts. Your time in looking over this and feeding back to me is very much appreciated.