How to improve the performance of a SQL query even after adding indexes?
The query has a number of unusual features. For one thing, a large number of the joins are evaluated in a sub-tree that is cross joined to the single row from the Profiles table. This may be intentional, and valid according to your business rules, but it certainly looks odd. The section of the query plan I am referring to is this:
Another odd feature of this cross join is that the initial row count (cardinality) estimate for the Index Seek on the BatchActionHistory table is very wrong: the estimate is 1 row, while the actual number of rows is 9,824.
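As a quick check (a sketch only; the dbo schema is an assumption, and sys.dm_db_stats_properties requires SQL Server 2008 R2 SP2 / 2012 or later), you can see when the statistics on that table were last updated and how many modifications have accumulated since then:

```sql
-- Sketch: how fresh are the statistics on BatchActionHistory?
SELECT
    s.name                  AS stats_name,
    sp.last_updated,
    sp.rows,
    sp.rows_sampled,
    sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.BatchActionHistory');
```

An old last_updated date or a large modification_counter relative to rows would be consistent with the poor estimate.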
This is not the only poor estimate in the plan by any means, but it is striking given that the predicate is simply BatchActionID = 6. This seems to suggest that statistics on this table are very out of date – you should check that auto-update of statistics is enabled for these tables (and for the database) and manually update the statistics as well. After this, a simple SELECT COUNT(*) FROM BatchActionHistory WHERE BatchActionID = 6 should give an estimate reasonably close to the actual 9,824 rows.
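Something along these lines would cover both steps (a sketch, not a definitive script; the dbo schema and the FULLSCAN sampling choice are assumptions, adjust them to your environment):

```sql
-- Is automatic statistics updating enabled for this database?
SELECT name, is_auto_update_stats_on, is_auto_update_stats_async_on
FROM sys.databases
WHERE name = DB_NAME();

-- Manually refresh the statistics on the table (FULLSCAN gives the most accurate histogram)
UPDATE STATISTICS dbo.BatchActionHistory WITH FULLSCAN;

-- With up-to-date statistics, the estimated row count for this predicate
-- should now be reasonably close to the actual 9,824 rows
SELECT COUNT(*)
FROM dbo.BatchActionHistory
WHERE BatchActionID = 6;
```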
Updating your statistics for all the tables involved in the query (and checking the auto-update settings) will be a good start, but it may not be enough to produce a good execution plan. Please update the statistics and re-run the query, attaching the new results to your original question so we can see what to look at next.
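If many tables are involved, one way to refresh everything at once is sp_updatestats, which updates statistics on every user table that has had modifications; a per-table UPDATE STATISTICS ... WITH FULLSCAN gives better histograms but takes longer. This is a sketch under those assumptions (the dbo schema and the exact list of tables are placeholders for the tables in your query):

```sql
-- Update statistics on all user tables in the current database that have pending modifications
EXEC sys.sp_updatestats;

-- Or target the tables from the query individually with a full scan
UPDATE STATISTICS dbo.Profiles WITH FULLSCAN;
UPDATE STATISTICS dbo.BatchActionHistory WITH FULLSCAN;
-- ...repeat for the remaining tables in the join
```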
The rest of the joins in the plan seem to follow a logical sequence, and they do at least have correlated parameters (they are not cross joins!), but ensuring the basic statistics are up to date may help the optimizer with those as well. We will see.
I have also attached the new execution plan; someone added some indexes after I initially posted this question, hence the improvements.