Metric Filtering: Report Filter or Report Limit?

The most common way of filtering a Report in MicroStrategy is via the Report Filter.  Anything placed into this filter appears in the SQL statement that is sent to the database.  But what exactly happens to the SQL when you place a Metric in the Report Filter?  How does that compare to a Report Limit, which also filters on Metric values?  Which is more efficient?  Today we'll explore.

Since a Metric's aggregation depends on the level of the Report Template, MicroStrategy resolves a Metric in the Report Filter as a Set Qualification.  That means it runs a multipass query to first determine the set of Attribute Elements to filter on in the WHERE clause.  This results in multiple hits to the fact table, which can hurt performance.

By comparison, a Report Limit is essentially a HAVING clause.  It simply tacks a HAVING condition onto the end of the query that's already there, clean and simple.
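To see the difference in query shape (not speed), here's a minimal sketch using SQLite from Python. The table and column names echo the examples below, but the schema and data are made up for illustration, and the date dimension is omitted for brevity. Both shapes return the same rows; the Set Qualification version just hits the fact table twice.

```python
# Illustrative sketch: the two SQL shapes MicroStrategy can generate for a
# metric-value filter. Schema and data are hypothetical, for demo only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE FactTable (AttributeKey INTEGER, Fact REAL)")
cur.execute("CREATE TABLE DimAttribute (AttributeKey INTEGER, Attribute TEXT)")
cur.executemany("INSERT INTO DimAttribute VALUES (?, ?)",
                [(1, "A"), (2, "B"), (3, "C")])
cur.executemany("INSERT INTO FactTable VALUES (?, ?)",
                [(1, 600.0), (1, 500.0), (2, 300.0), (3, 2000.0)])

# Report Filter shape (Set Qualification): an extra pass computes the
# qualifying set of Attribute elements, then joins it back to the fact table.
set_qualification = cur.execute("""
    SELECT a12.Attribute, SUM(a11.Fact) AS Metric
    FROM FactTable a11
    JOIN DimAttribute a12 ON a11.AttributeKey = a12.AttributeKey
    JOIN (SELECT a12.Attribute AS Attribute
          FROM FactTable a11
          JOIN DimAttribute a12 ON a11.AttributeKey = a12.AttributeKey
          GROUP BY a12.Attribute
          HAVING SUM(a11.Fact) > 1000.0) pa13
      ON a12.Attribute = pa13.Attribute
    GROUP BY a12.Attribute
    ORDER BY a12.Attribute
""").fetchall()

# Report Limit shape: one pass, with HAVING appended to the existing query.
report_limit = cur.execute("""
    SELECT a12.Attribute, SUM(a11.Fact) AS Metric
    FROM FactTable a11
    JOIN DimAttribute a12 ON a11.AttributeKey = a12.AttributeKey
    GROUP BY a12.Attribute
    HAVING SUM(a11.Fact) > 1000.0
    ORDER BY a12.Attribute
""").fetchall()

print(set_qualification)  # [('A', 1100.0), ('C', 2000.0)]
print(report_limit)       # [('A', 1100.0), ('C', 2000.0)]
```

Same result set either way; the difference is how many times the fact table is scanned to get there.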

Here’s an example from my system:

No Filter – 32sec – 452 rows

select a12.Attribute,
sum(a11.Fact)  Metric
from FactTable a11
join DimAttribute a12
 on (a11.AttributeKey = a12.AttributeKey)
join DimDate a13
 on (a11.DateKey = a13.DateKey)
where a13.YearID in (2011)
group by a12.Attribute

Report Filter – 3min 17sec – 242 rows

select a12.Attribute ,
sum(a11.Fact)  Metric
from FactTable a11
join DimAttribute a12
 on (a11.AttributeKey = a12.AttributeKey)
join (select a12.Attribute  Attribute
from FactTable a11
join DimAttribute a12
 on (a11.AttributeKey = a12.AttributeKey)
join DimDate a13
 on (a11.DateKey = a13.DateKey)
where a13.YearID in (2011)
group by a12.Attribute
having sum(a11.Fact) > 1000.0
) pa13
 on (a12.Attribute = pa13.Attribute)
join DimDate a14
 on (a11.DateKey = a14.DateKey)
where a14.YearID in (2011)
group by a12.Attribute

Report Limit – 34sec – 242 rows

select a12.Attribute,
sum(a11.Fact)  Metric
from FactTable a11
join DimAttribute a12
 on (a11.AttributeKey = a12.AttributeKey)
join DimDate a13
 on (a11.DateKey = a13.DateKey)
where a13.YearID in (2011)
group by a12.Attribute
having sum(a11.Fact) > 1000.0

In this example, there is a pretty significant difference in performance: the Report Limit returns the same 242 rows nearly 6x faster than the Report Filter.  Report Limits aren't always the fastest, but in my experience they usually are for the reports I run.  At the very least, they should be considered and tested for your report and environment.

Report Limits aren't just for breaking ranking ties!


9 Responses

  1. rajesh b says:

    What is meant by Project Builder? What is its purpose?

  2. Chirag says:

    Hey Bryan,

    Is there any specific scenario you can point to in which a Report Limit is a better choice than the Report Filter? As far as I can see, it is just an option provided by MSTR and not used much (at least I haven't used it anywhere to date 🙂 ). Your take on this?

    Thanks much!

    • Bryan says:

      This blog post has an example where using Report Limit runs 6x faster than using Report Filter. You use Report Filter more commonly because you filter on Attributes more commonly, but if you filter on Metric values, then Report Limit is almost always faster, which is why the option is there.

    • Chirag says:

      Thanks Bryan,

      I tried to measure the performance difference between the Report Filter and the Report Limit in different scenarios. My investigation suggests that the relative performance of the two depends on the size of the fact table.

      If the fact table is very large (say 1 million rows), it may be better to use the Report Filter. Essentially, the Metric Qualification first finds the set of keys that satisfy the metric condition, then joins those keys to the fact table in a second pass. On the other hand, a Report Limit relies on the HAVING clause: it retrieves all the records first and then applies the HAVING condition in the same SQL pass. Pulling all the rows from a huge fact table would degrade performance adversely.

      So all in all, it would be advisable to use a Metric Qualification as a Report Filter when the fact table is huge.

      Your views on this are highly desired and awaited.


    • Bryan says:

      Performance is never a rule, otherwise there wouldn't be an option 😉 Environmental factors will always create unique circumstances, the most important of which are the database system, tuning, and table size and structure. (I did use the word "almost"!)

      In the example I ran in this post, the table I was querying had several hundred million rows. The best thing you can do is what you just did, which is to run it multiple ways and pick the way that works best for you!

    • Chirag says:

      Thanks Bryan 🙂

      Appreciate your views..

  3. Thank you very much Bryan. The above approach fixed a big performance problem we were facing in our project.

  4. Ananth says:

    Hi Bryan,

    How does the Report Limit work in the case of Intelligent Cubes?
    We are currently facing a performance issue in our cubes, which display the "Top 50 Products" for each Store. The cube currently uses a Report Filter and has about 10 million rows.

    When we use a Report Limit instead, the cube itself becomes huge (twice the size of the filtered cube, with 129 million rows), although a report created on this cube still displays only 10 million rows. Even so, there is a considerable reduction in execution time.

  5. anand says:

    Thanks Bryan for explaining the difference. Do you know of a way to use a metric qualifier in a relationship filter? When I use a metric qualifier in a relationship filter, the SQL generated is similar to the second version (Report Filter). The extra join seems unnecessary to me.