Query took about 7 hours to run; any tips for making this faster?
By wherrera 16 asked Oct 08, 2013 at 05:08 PM
Well, you could consider a filtered index on the Weekly Activity table that isolates those three Record_Type values. You might also consider an indexed view that satisfies the joins that are present, or otherwise pre-aggregating, or adding partitioning along the leading grouping column (week number). It may be possible to use EXISTS for some of those joins, which allows short-circuiting in some cases; I can't tell whether any of them aren't actually needed for output. Your statistics are also way off - even if you have auto-update statistics enabled, for the biggest table it's quite possible that there is not enough churn to trigger an update. If that's the case, you may want to consider trace flag 2371 (see this search). Note that this won't magically fix all of the issues, but it will at least keep cardinality estimates more realistic.
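To make the first two suggestions concrete, here is a rough sketch. Table, column, and Record_Type values are all assumptions based on the question, not your actual schema, so adjust accordingly:

```sql
-- Filtered index that isolates the three Record_Type values the query touches.
-- INCLUDE whichever columns the query actually selects so the index covers it:
CREATE NONCLUSTERED INDEX IX_WeeklyActivity_RecordType
    ON dbo.WeeklyActivity (WeekNumber, Record_Type)
    INCLUDE (ActivityValue)           -- hypothetical output column
    WHERE Record_Type IN (1, 2, 3);   -- the three values your query filters on

-- If a joined table only filters rows and contributes no output columns,
-- EXISTS lets the optimizer stop probing after the first match:
SELECT wa.WeekNumber, COUNT(*)
FROM dbo.WeeklyActivity AS wa
WHERE wa.Record_Type IN (1, 2, 3)
  AND EXISTS
  (
      SELECT 1
      FROM dbo.SomeLookup AS l          -- hypothetical joined table
      WHERE l.LookupID = wa.LookupID
  )
GROUP BY wa.WeekNumber;

-- And if stale statistics turn out to be the problem, the trace flag can be
-- enabled globally (test first; this changes auto-update thresholds instance-wide):
DBCC TRACEON (2371, -1);
```

The filtered index only helps if the optimizer can match the query's predicate to the index's WHERE clause, so keep the IN list in the query literally consistent with the one in the index definition.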
Really, this query is slow because you currently have a clustered index scan that must fully traverse all 1.3 billion rows in the Weekly Activity table (over 52 GB worth of data, requiring a 5 GB memory grant to process!), and then hash match those rows against three different table outputs. The key to improving performance here is more strategic indexing and/or partitioning (I can't tell what other indexes currently exist), keeping in mind, of course, how those indexes will affect the rest of the workload. Partitioning won't really help eliminate I/O in this case, since it seems your report needs to present data for all of time, but it might help distribute that I/O over multiple threads. If this is a VM, you should strongly consider allocating more cores (and potentially more memory, but I can't tell if you're brushing up against any limits there - I also can't tell what else is pressuring, or being pressured, at the time).
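If you do go the partitioning route, the shape would be something like the following. All names and boundary values here are illustrative assumptions; you would pick boundaries that match your real week-number distribution:

```sql
-- Hypothetical partitioning along the grouping column (week number).
-- One boundary per "year" of weeks, purely as an example:
CREATE PARTITION FUNCTION pf_WeekNumber (int)
    AS RANGE RIGHT FOR VALUES (53, 105, 157);

CREATE PARTITION SCHEME ps_WeekNumber
    AS PARTITION pf_WeekNumber ALL TO ([PRIMARY]);

-- Rebuild the existing clustered index onto the partition scheme
-- (this rewrites all 1.3 billion rows, so plan a maintenance window):
CREATE CLUSTERED INDEX CIX_WeeklyActivity
    ON dbo.WeeklyActivity (WeekNumber)
    WITH (DROP_EXISTING = ON)
    ON ps_WeekNumber (WeekNumber);
```

Again, since the report aggregates over all time, this doesn't reduce total reads; the benefit is that the scan can be parallelized per partition, and maintenance operations (rebuilds, statistics updates) can target individual partitions.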
By Aaron Bertrand ♦ 1.7k answered Oct 11, 2013 at 08:22 PM