A raging debate in business analytics

Ok, "raging" is too strong a word. But there is a growing debate about how to transform companies into data-driven, analytics-led organizations. This debate is worth cherishing: strong opinions are novel for a community that is more comfortable relying on facts than philosophy.

I wanted to summarize and provide impressions of the discussion because I think it sheds light on many of the challenges facing organizations in this area. We can all agree on the importance of analytics to drive smarter decisions; there is less agreement on how to implement analytics.

The discussion was initiated by Tom Davenport’s article "Competing on Analytics," published in the Harvard Business Review on January 1, 2006 (spin-off article here). It received enough notice that there was a follow-on conference. Tom interviewed 32 companies that were relatively advanced in using analytics (defined as statistical and predictive analytics). The research was "carried out independently" but sponsored by SAS and Intel.

The first reaction of many involved in this industry was to appreciate the attention. A few bloggers/writers were happy to summarize his work:

Then came the trouble: a few practitioners of business analytics looked this gift horse in the mouth and questioned Davenport’s well-intentioned but ultimately misguided assumptions about what it takes to be an "analytics competitor." In particular, there was a sense that he had lost touch (or perhaps never been in touch) with the realities of implementing analytics capabilities in a complex organization.

Neil Raden helps frame the essence of the debate:

There are two schools of thought when it comes to the value of BI in general. One is that it is best used by “quantitative" types and other analytical business people, who can spot trends and analyze patterns to assist in the big decisions and set and direct strategy. The other position is that BI is at its best when helping a broad range of people and processes at an operational level, marginally improving performance, repeatedly and often.

Here are some of the primary arguments provided by the competing schools of thought:

Centralized analytics. The Davenport camp of analytics focuses on centralization of resources and data, top-down decisions, and breadth of analytical capabilities.

* Top level commitment and vision. Davenport says you know you are competing on analytics when "your senior executive team not only recognizes the importance of analytics capabilities but also makes their development and maintenance a primary focus."

* Centralized analytical capabilities ensure cross-organizational (and therefore balanced) analytical conclusions. Jim Novo forcefully (if a bit angrily) argues:

"if a silo wants to keep an analytical “lead" in it’s own little box to do the navel-gazing, silo-focused analysis that impacts it’s own little box, then that’s OK. Just know that this analysis, while meaningful to the little box, cannot be used or trusted anywhere else in the company and so is of very little value in a macro way."

Furthermore, this centralization implies a team of quant experts who are responsible for analytics organization-wide.

* Required data centralization, standardization, control, and integration. Davenport argues that "the difficulty is primarily in ensuring data quality, integrating and reconciling it across different systems, and deciding what subsets of data to make easily available in data warehouses."

* Omnipresence. A curious portion of the requirements for "analytics competing" relies on quantity-related phrases like: "copious data", "seizing every opportunity to generate information", hiring "a lot of people with the very best analytical skills", "employ analytics in almost every function and department", and "building your capabilities for several years."

Decentralized. These people, Juice included, sense that building analytics capabilities is more about picking the high impact opportunities, scaling with proven value, and working through the organizational challenges that data-driven decisions can create.

* Good analytics is agile and local

[Centralized design] is another naïve assumption, because many organizations are not only decentralized—they’re dysfunctional. Separate units within organizations often need autonomy because they are just so different from the rest of the organization. In addition, as an organization becomes more “agile," which is a definite trend, decision-making, even for the big decisions, will become more decentralized. Imagine how difficult it will be to buy or sell pieces of a company if the “brain," the centralized analytical capability, stays with the parent and there is no local expertise?

Davenport admits that some of his not-yet-analytics-competitors face an environment with "very high levels of functional or business-unit autonomy, making it difficult to mount a cohesive approach to analytics across the enterprise." Well, that structure likely makes sense for many reasons -- and changing it for the sake of analytics is letting the tail wag the dog.

* Focus analytics on the places that matter. Davenport pooh-poohs those companies whose "efforts have been primarily local—that is, limited to particular functions or units, such as marketing." However, if you are targeting the right areas of your company -- the areas that make a difference in your competitive environment -- then targeted analytics are just what you need. Analytics should be built around the key leverage points of the organization. Breadth of analytics implies both lack of focus and wasted resources.

We made the point:

Analytics is hard. Analytics takes resources. It takes effort for an organization to create and assimilate learnings from analytics...UPS focuses their analytics on knowing where packages are, Marriott focuses on revenue management. If you try to do everything, you won’t do anything well.

* Simplicity. Analytics doesn’t have to be complex. In fact, analytics is often better when it is simple and accessible, so that audiences at many levels can absorb and integrate the meaning into their decisions. Raden puts it another way:

"When it comes to quantitative modeling in business, there is a recurrent paradox—the more complex the model, the less faith people put in it. People take advice from people like themselves."

* Culture matters most. The biggest challenge is building a culture that embraces (and even demands) data to support decisions. This seems to be ignored by the Davenport crowd. For some executives, there is a visceral reaction to tools that appear to displace their years of hard-won expertise. For those of us who have been working on the ground helping companies move in this direction, this cultural challenge is the one we encounter most often.

I’d be remiss to leave out another school of thought: the relativists. They recognize that it all depends on the unique situation of the organization and that there are important and valid points on all sides. These people (like Nishith from Open Source Analytics) would rather find the common ground. They acknowledge the role of centralized analytics (Raden: "Centralized data mining/predictive modeling groups are capable of discovering valuable insights that can then be encapsulated into reusable algorithms, scores, or rules") but recognize it isn’t practical or realistic for most businesses. But if we listen to them, the best debate in this business goes away.