Billions and Billions of Reports (à la Carl Sagan)

I recently came across a white paper on the "five styles of BI" and thought it would be an interesting read. As it turns out, it was more interesting than I expected. In this paper, the vendor (to protect the innocent, we’ll just call them MacroTactics) made a claim about the performance capacity of its solution: 72,000 reports per hour. Let’s see, 72,000 reports per hour... that would be 576,000 reports in an 8-hour day... and 149,760,000 reports per year. Wow. Who’s reading that stuff?
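(For the curious, that back-of-envelope math assumes an 8-hour day and, by my reading, roughly 260 working days a year: 72,000 reports/hour × 8 hours/day = 576,000 reports/day, and 576,000 reports/day × 260 days/year = 149,760,000 reports/year.)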

Now, I fully buy into the fact that applications dealing with lots and lots of data need to be hugely scalable, but what I don’t buy is that this is in any way a measure anyone can use to figure out whether a particular BI solution is right for them. I can just imagine the requirements spec for that solution: "15.1.182.f - Solution must be capable of creating 70,000 reports per hour. Alternatively, solution will be able to generate 140,000,000 reports per year." 140 million reports! Incredible. (Now, what did I do with my mini-me?)

Seriously, here’s the thing. More reports is rarely the answer. We already have plenty of data and plenty of reports. What buyers and users really want is fewer reports and more information that helps them get their jobs done better and faster.

We’d encourage business intelligence vendors to think of themselves more as data storytellers than as data factories churning out generic report widgets… even if they can do it at incredibly high speeds. From that perspective, you wouldn’t want to hear Steven Spielberg bragging about his ability to pump out a dozen movies a year, or J.K. Rowling trumpeting her ability to write 1,000 pages a year (hmm, wait a sec).