A Capability Model for Business Analytics: Part I – Dimensions of Capability

Analytics has quickly become the hottest topic in BI. Every business uses analytics but with widely varied degrees of adoption and success. The work of creating analytics is shifting rapidly from IT into business units. From Excel jockeys in business departments doing formula-based analytics to the data-savvy business users of self-service analytics tools – Domo, Qlik, Spotfire, and Tableau to name a few – it is clear that analytics has gone mainstream.

As analytics migrates from BI-centric to departmental, distributed, and individual, the drive to self-service analytics continues to accelerate. Yet demand outstrips supply for qualified data scientists, and the skills of self-service analysts vary widely. Most organizations have pockets of analytics – individuals and groups performing analytics work unaware of others who do similar work. Redundant work, waste and rework, conflicting answers to similar questions, uneven data and analysis quality, and gaps in data security and governance are all causes for concern.

We certainly don’t want to inhibit growth of analytics. We want to encourage it, but make a shift from organic growth to managed growth. Management goes hand-in-hand with measurement, so an analytics capability model is a good place to begin. Ask a business “what is your analytic capability?” and it is likely that they will be unable to answer the question – perhaps even unable to understand the question.

The Capability Maturity Model

To describe or quantify analytic capability in any meaningful way, we need to begin with a capability model. The SEI Capability Maturity Model from Carnegie Mellon University provides a good foundation because it emphasizes capability first. Other models present themselves as maturity models, but without the strong capabilities focus that is needed here.

The CMM, in fact, makes a distinction between capability levels and maturity levels as shown in Figure 1. Note that while the names for capability and maturity levels are similar at some levels, there are subtle but significant differences. Maturity levels are defined and measured as an aggregate for all processes in an enterprise. Capability levels are defined and measured for particular and targeted processes. Understanding process capability is important for process improvement, process integration and evolution to new collaborative processes.

Figure 1. Capability Maturity Model

The capability levels are defined as:

Level 0 – Incomplete: An incomplete process is one that is not performed or is only partially performed. Specific process goals are not consistently met and no enterprise goals exist for the process.

Level 1 – Performed: A performed process is one that consistently meets the specific goals of the process area. It supports and enables the work needed to provide the services of the process area. Although an improvement over Level 0, performed processes are at risk because they operate without a strong connection to enterprise goals.

Level 2 – Managed: A managed process satisfies Level 1 criteria and has the basic infrastructure needed to support the process. It has enterprise goals as well as process area goals. The process is consciously planned and executed, employs skilled people, has adequate resources, and involves key stakeholders. A managed process is monitored, controlled, and reviewed.

Level 3 – Defined: A defined process satisfies Level 2 criteria and has the necessary degree of rigor in standards, process descriptions, and procedures to be learnable, repeatable, easily audited, consistent in results, and capable of producing identical results given identical circumstances.

Level 4 – Quantitatively Managed: A quantitatively managed process satisfies all Level 3 criteria, and is controlled using statistical and other quantitative techniques. Measurable targets of quality and performance are established and used to manage the process. Quality and performance are measured and managed throughout the life of the process.

Level 5 – Optimizing: An optimizing process meets all Level 4 criteria and is continuously improved through analyzing and understanding the causes of variation in the process. Statistical Process Control (SPC) methods are used to achieve both incremental and innovative improvements.
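The key property of these levels is that they are cumulative and ordered: a process at any level satisfies all criteria of the levels beneath it. As a minimal illustration (the enum and the `meets` helper are my own sketch, not part of the CMM itself), the ordering can be expressed directly:

```python
from enum import IntEnum

class CapabilityLevel(IntEnum):
    """The six CMM capability levels; a higher value subsumes all lower criteria."""
    INCOMPLETE = 0
    PERFORMED = 1
    MANAGED = 2
    DEFINED = 3
    QUANTITATIVELY_MANAGED = 4
    OPTIMIZING = 5

def meets(level: CapabilityLevel, required: CapabilityLevel) -> bool:
    """A process assessed at `level` satisfies every criterion up to `required`."""
    return level >= required

# A Defined (Level 3) process also satisfies Managed (Level 2) criteria...
assert meets(CapabilityLevel.DEFINED, CapabilityLevel.MANAGED)
# ...but not the quantitative criteria of Level 4.
assert not meets(CapabilityLevel.DEFINED, CapabilityLevel.QUANTITATIVELY_MANAGED)
```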

The Analytic Capability Model

The CMM described above is a good starting place, but not sufficiently specific to be applied to analytic processes. To build an analytic capability model we need to add the dimensions that are specific to analytics: analytic roles and analytic uses. Figure 2 adds roles to the capability model using the two primary roles that occur in analytic practice: using analytics and creating analytics. It is important to realize that the same individual may perform both roles – as an analytic consumer and an analytic developer. This is especially true for self-service analytics.

Figure 2. Analytics Roles and Capabilities

At the intersection of each role with each level, it now becomes practical to identify specific characteristics that are indicative of that intersection. As you consider the descriptions of the capability levels above, read the word “process” in all of its occurrences as “analytic process” – the processes by which we create and use analytics such as data preparation, model building, model testing and tuning, etc.

Using analytics at Level 0 (incomplete), for example, might involve drawing conclusions and making decisions based on a partially finished spreadsheet that may never be completed. Similarly, creating analytics at Level 0 might entail building that unfinished spreadsheet that was perhaps “good enough” at the time. At the opposite extreme, Level 5 analytic usage could be application of a customer churn predictive model to guide decision-making in marketing, communications, and call center operations. And analytic creation at Level 5 would include the work of a data scientist or skilled predictive modeler to build and tune the customer churn predictive model.

Adding a third dimension – analytic purpose – produces the capability model shown in Figure 3. The purpose dimension looks at the kinds of analyses that are performed. These include descriptive analytics to explore data content and meaning, discovery analytics to find interesting patterns and anomalies, diagnostic analytics to seek understanding of cause and effect, predictive analytics to quantify behavioral probabilities, and prescriptive analytics to recommend or automate optimal decisions.
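Taken together, the three dimensions form a grid: six capability levels, two roles, and five purposes. A quick sketch makes the shape of the model concrete (the dimension names come from the article; representing the cells as a dictionary awaiting criteria is simply my illustration):

```python
from itertools import product

levels = ["Incomplete", "Performed", "Managed", "Defined",
          "Quantitatively Managed", "Optimizing"]        # capability dimension
roles = ["Using analytics", "Creating analytics"]        # role dimension
purposes = ["Descriptive", "Discovery", "Diagnostic",
            "Predictive", "Prescriptive"]                # purpose dimension

# Each cell of the model is one (level, role, purpose) combination,
# to be filled in later with detailed capability criteria.
cells = {(lv, r, p): None for lv, r, p in product(levels, roles, purposes)}
print(len(cells))  # 6 levels x 2 roles x 5 purposes = 60 cells
```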

In Conclusion (or until the next article)

This completes a top-level view of a capability model for business analytics. The model has 60 cells, each of which needs to have detailed criteria that express the indicators and norms by which analytic capability can be measured. Adding those criteria includes a look at analytic architecture and infrastructure, encompassing topics such as business capabilities, technological capabilities, organizational infrastructure, data infrastructure, system integration, and more. But that’s a topic for the second article in the series.

Using the model is also a topic for a future article. It is not my intent to suggest that we should all aspire to Level 5 for all of our analytics, nor that all analytic processes need to achieve the same capability level. The purpose is to know the level of capability that you desire – to set a goal – and then to pursue that level with tools to measure progress. Achieving that purpose requires classification, structure, and indicators – in short, a capability model. Using the model requires data gathering, analysis, and quantification, which are topics for the final article of this series.

Dave Wells

Dave Wells is an advisory consultant, educator, and industry analyst dedicated to building meaningful connections throughout the path from data to business value. He works at the intersection of information...