A Capability Model for Business Analytics: Part III – Using the Capability Assessment

Read - A Capability Model for Business Analytics: Part I – Dimensions of Capability

Read - A Capability Model for Business Analytics: Part II – Assessing Analytic Capabilities

The first article of this series presents a capability model for business analytics, and the second article describes a process and provides a tool for analytic capability assessment. Capability assessment is interesting, as indicated by the sustained popularity of Carnegie Mellon's SEI Capability Maturity Model. Interesting, however, is not enough. As anyone involved in analytics knows, measurement without action is pointless. This third and final article of the series takes the next step – from an interesting assessment to informative and actionable measures of analytic capability.

A Quick Review of Analytic Capability Assessment

The model that is the foundation for assessment is based upon the SEI model from Carnegie Mellon, with six levels of process capability ranging from incomplete to optimized. Applying the model to analytics looks at two kinds of analytic processes – those that create analytics and those that use analytics.

Capability assessment uses the model as a structure for data collection. Ten data points – creating and using for each of the five types of analytics – make a tool that is fast and light, yet effective for subjective self-assessment. Selecting the best-fit response from a set of descriptive phrases determines the response value for each data point. An example of assessment results is shown in Figure 1.

Figure 1. Analytics Capability Assessment
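To make the mechanics concrete, here is a minimal sketch in Python of how the scoring works. The individual values are hypothetical – chosen only to be consistent with the 2.2 aggregate discussed below – and the actual assessment is done in the spreadsheet tool, not in code.

    # A minimal scoring sketch. Each value is the 0-5 level implied by the
    # best-fit response phrase chosen for that data point. These particular
    # values are hypothetical, picked to match the 2.2 aggregate in Figure 1.
    capabilities = {
        ("descriptive",  "create"): 4, ("descriptive",  "use"): 4,
        ("discovery",    "create"): 2, ("discovery",    "use"): 1,
        ("diagnostic",   "create"): 1, ("diagnostic",   "use"): 2,
        ("predictive",   "create"): 3, ("predictive",   "use"): 2,
        ("prescriptive", "create"): 1, ("prescriptive", "use"): 2,
    }

    # The aggregate score is the simple average of the ten data points.
    aggregate = sum(capabilities.values()) / len(capabilities)
    print(f"Aggregate capability score: {aggregate:.1f}")  # 2.2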

It is important to recognize that this is a subjective self-assessment. As is true for all self-assessment processes, the quality and accuracy of the assessment are strongly influenced by the participants in the process and by the care and consideration given to each response. It is essential to get the right people together and have the right conversations to arrive at a well-reasoned consensus response. For a detailed description of the assessment process, see the second article of the series.

Assessment with Meaning

Every analyst knows that measures alone can't tell a story. To derive meaning from measurement requires comparative context. Consider, for example, that you know a vehicle is traveling at a speed of 60 miles per hour. It is impossible to distinguish good news from bad news given that single measure. Is the vehicle an automobile on the highway? Is it an automobile in a school zone? A runaway bicycle on a steep slope? Or a jet airliner at an altitude of 35,000 feet? The point is that you must know what speed is desirable or appropriate – a target value – before you can determine whether 60 mph indicates a need for action.

At first glance, the assessment shown in Figure 1 appears to be a case of really bad news: a score of 2.2 on a scale of zero to five – that's well below fifty percent and most certainly a failing grade! The news, however, may not be as bad as it seems. There may, in fact, be good news in this assessment. Two fundamental errors are made in the leap to the conclusion that 2.2 is a “failing” score.

The first interpretive error occurs by using the top of the zero-to-five scale as the basis for comparison – by assuming that 5 is the target value. Not every organization needs to have level 5 analytic capabilities. Thus the comparative basis should not be what is possible – the top of the scale – but what is needed.

The second error occurs by looking only at the aggregate score. Not every organization needs to have top-of-scale analytic capabilities, and very few need to be at level 5 for every category of assessment. When examined on a row-by-row basis, the assessment shows a substantial score of 4.0 for descriptive analytics. With an evolving portfolio applied for decision support, this organization's descriptive analytics processes are quantitatively managed and well positioned on the capability scale.

Quantitatively managed certainly seems like good news. But is it really? Once again, it is difficult to know because the basis for comparison is an external scale with no direct connection to the needs of the organization. In the case of descriptive analytics, it is entirely possible that the organization needs to go beyond quantitatively managed, striving for optimized because strategic and operational uses represent missed opportunities. In this case there is a gap between capabilities and needs, indicating that capabilities should be strengthened. The opposite is also possible: when capabilities exceed needs (a negative gap), you may be incurring analytics cost from which no value is derived.
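The interpretation rule in the preceding paragraph can be stated compactly. A small sketch, using the sign convention gap = needed level minus current level (the wording of the verdicts is mine):

    # Interpreting a single capability gap (gap = needed level - current level).
    def interpret_gap(gap: int) -> str:
        if gap > 0:
            return "needs exceed capabilities - strengthen this capability"
        if gap < 0:
            return "capabilities exceed needs - possible cost without value"
        return "capabilities match needs"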

Another cell-by-cell look highlights some apparent bad news. The low 1.0 scores for using discovery analytics, creating diagnostic analytics, and creating prescriptive analytics all look particularly weak.

All of these observations are interesting, but they lack the context needed to be informative – a scenario similar to the vehicle traveling at 60 mph – because we've only quantified capabilities and know nothing about needs.

Figure 2. Analytics Needs Assessment

To find real meaning in this assessment we must know what is needed. A complete analytic capability assessment must collect data about both current capabilities and needed capabilities, and must provide the basis to perform capability gap analysis. The spreadsheet tool for assessment includes a sheet for needs assessment (see Figure 2) as well as gap analysis functions. To complete the needs assessment, follow the same guidelines and definitions from the second article that are used for the capabilities assessment. Gap analysis is generated without further input once both the capabilities and needs assessments are completed.
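Conceptually, the gap analysis is nothing more than a cell-by-cell subtraction of current level from needed level. A minimal sketch, reusing the hypothetical capabilities dictionary from the earlier sketch; the needs values here are likewise hypothetical, chosen to be consistent with the gaps discussed below:

    # Hypothetical needs assessment, scored on the same 0-5 scale as the
    # capabilities assessment (values are illustrative, not from Figure 2).
    needs = {
        ("descriptive",  "create"): 3, ("descriptive",  "use"): 5,
        ("discovery",    "create"): 3, ("discovery",    "use"): 4,
        ("diagnostic",   "create"): 3, ("diagnostic",   "use"): 3,
        ("predictive",   "create"): 4, ("predictive",   "use"): 2,
        ("prescriptive", "create"): 5, ("prescriptive", "use"): 5,
    }

    # Gap analysis: positive means needs exceed capabilities,
    # negative means capabilities exceed needs.
    gaps = {key: needs[key] - capabilities[key] for key in capabilities}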

Assessment with Purpose

With all of the assessment functions – current capabilities, needed capabilities, and gap analysis – we now have the essential elements to measure and evaluate with purpose. The purposes for which this tool is designed are:

  • To confidently know your current level of analytic capabilities.
  • To develop a well-reasoned consensus view of your need for analytic capabilities.
  • To understand the gap between analytic capabilities and needs.
  • To gain insights that help you make plans to close the gap.

The assessment data shown in Figures 1 and 2 – responses and scoring of both capabilities and needs – is the input data to gap analysis. But the data isn't easily analyzed because it is not organized for comparison. The “gap analysis” tab of the spreadsheet reorganizes the data to support visual comparison of capabilities and needs. Figure 3 shows a tabular view that helps to compare the numbers.

Figure 3. Gap Analysis Table

This table takes attention away from the aggregate scores by not displaying them. Too much attention to aggregate scores introduces bias when interpreting the results. The table is organized to suggest row-by-row and column-by-column analysis. Nothing new is found by examining the first two sets of columns; they simply restate the capabilities and needs numbers shown in the earlier tables. The interesting data appears in the third set of columns, where gaps are quantified for each row. Examining the table, we can quickly observe that:

  • Across all five types of analytics, the common theme is that needs exceed capabilities.
  • In one instance – creating descriptive analytics – capabilities exceed needs and the gap is negative.
  • In two instances – creating and using descriptive analytics, and using predictive analytics – the gap is zero.
  • For predictive analytics, zero gap means that the needs are being met.
  • In the case of descriptive analytics, the zero aggregate of creating and using is the result of a +1 gap for using analytics and a -1 gap for creating analytics.
  • The largest gaps are in the area of prescriptive analytics, where needs far exceed current capabilities.

The logical conclusion from these observations is that some action should be taken to close the gap between needs and capabilities. Obviously the answer is to increase capabilities, not to decrease needs. The hard questions for planning are “Which capabilities to increase?” and “Where to begin closing the gap?” To answer these questions, it helps to visualize the gap. The gap analysis charts shown in Figure 4 help with visualization.

Figure 4. Gap Analysis Graphs
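The Figure 4 charts come from the spreadsheet tool, but the same visual comparison is easy to reproduce outside of it. Here is one possible rendering of the hypothetical gap numbers used throughout this article, sketched with matplotlib (my choice of library, not part of the tool):

    import matplotlib.pyplot as plt

    # Hypothetical gaps from the earlier sketch (needed level - current level).
    types = ["descriptive", "discovery", "diagnostic", "predictive", "prescriptive"]
    create_gaps = [-1, 1, 2, 1, 4]
    use_gaps = [1, 3, 1, 0, 3]

    # Grouped bars: one pair (create, use) per type of analytics.
    x = range(len(types))
    width = 0.35
    plt.bar([i - width / 2 for i in x], create_gaps, width, label="create")
    plt.bar([i + width / 2 for i in x], use_gaps, width, label="use")
    plt.axhline(0, color="black", linewidth=0.8)  # zero line: capabilities match needs
    plt.xticks(list(x), types, rotation=20)
    plt.ylabel("gap (needed level - current level)")
    plt.legend()
    plt.show()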

The visual perspective brings new observations, adding to the understanding gained by looking at the gap analysis table. Now we see that:

  • Prescriptive analytics, with a disturbingly wide capabilities-to-needs gap, is a high priority.
  • Although the gap for descriptive analytics is not especially wide – in fact, the combined creating and using gap is zero – the disparity between creating and using indicates an area of concern. Creation and use processes for descriptive analytics seem to be out of sync, with underutilized creation capabilities and unmet usage needs.
  • Discovery and diagnostic analytics both need some attention. Discovery attention should focus primarily on using analytics. Diagnostic action should focus mostly on creating analytics.

From Assessment to Action

Once the assessment has provided insight, the work to be done is that of setting priorities, making decisions, and making plans. Many variables beyond the assessment data, of course, go into that process. But we can follow the example a bit further using some assumptions. Let's assume that:

  • Analysis areas with the greatest need are given the highest priority.
  • Areas with the widest gaps are to be addressed first and most aggressively.

Working with these assumptions, a plan to increase analytic capability should have as its highest priority improved capability to create prescriptive analytics. Second-priority improvements include the ability to use descriptive and discovery analytics, and to create diagnostic analytics. The near-term plan, then, has four distinct goals (the sketch after the list shows how they follow from the gap data):

  1. Improved capability to create prescriptive analytics.
  2. Improved capability to use descriptive analytics, with the expectation of narrowing both the creating and using gaps.
  3. Increased ability to use discovery analytics.
  4. Increased ability to create diagnostic analytics.
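Under the two stated assumptions, this ordering can be reproduced mechanically from the assessment data. A sketch using the hypothetical needs and gaps from the earlier examples – remembering that real planning weighs many variables beyond these two rules:

    # Hypothetical (need, gap) pairs carried over from the earlier sketches.
    assessment = {
        ("descriptive",  "create"): (3, -1), ("descriptive",  "use"): (5, 1),
        ("discovery",    "create"): (3, 1),  ("discovery",    "use"): (4, 3),
        ("diagnostic",   "create"): (3, 2),  ("diagnostic",   "use"): (3, 1),
        ("predictive",   "create"): (4, 1),  ("predictive",   "use"): (2, 0),
        ("prescriptive", "create"): (5, 4),  ("prescriptive", "use"): (5, 3),
    }

    # Rank unmet needs: greatest need first, widest gap as the tie-breaker.
    candidates = [k for k, (need, gap) in assessment.items() if gap > 0]
    priorities = sorted(candidates, key=lambda k: assessment[k], reverse=True)

    for rank, (analytics_type, process) in enumerate(priorities, start=1):
        need, gap = assessment[(analytics_type, process)]
        print(f"{rank}. {process} {analytics_type} analytics (need {need}, gap {gap})")

With these illustrative numbers, the top-ranked items roughly reproduce the four goals above, with creating and using prescriptive analytics grouped as a single goal and the small-gap items deferred.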

The goals describe what needs to be accomplished. And they are measurable goals – the assessment process has already defined the measures and set the targets. The next stages of planning address how to achieve the goals. Here you’ll explore available resources that likely include training, technology, collaboration, and consulting. Making the right choices is very specific to your organization. Using analytic capability assessment, you can be sure that you are making informed choices.

Dave Wells

Dave Wells is an advisory consultant, educator, and industry analyst dedicated to building meaningful connections throughout the path from data to business value. He works at the intersection of information...
