A Capability Model for Business Analytics: Part II – Assessing Analytic Capabilities

Read - A Capability Model for Business Analytics: Part I – Dimensions of Capability

The first article of this series presented the capability model for business analytics that is illustrated in Figure 1. The intent is to provide organizations with a means to quantify analytic capability – a fundamental requirement for managing growth. The model views analytic capability in three dimensions – degree of analytics discipline, analytic roles, and analytic purpose.

Figure 1. Analytics Purpose, Roles and Capabilities


Together these dimensions produce a model with 60 cells. Now let’s take the model from theory to practical application. The goal is to quantify an organization’s analytic capabilities with enough indicators to provide measures upon which decisions can be made and actions taken – to form the basis of an analytic capability assessment and growth plan. Ideally the assessment should be fast and light – easy to perform, yet informative and actionable in its results.

The Capability Assessment Process

To assess, we must first collect data, and the matrix model provides the structure for data collection. Fortunately, it isn’t necessary to examine all 60 cells of the model to assess analytic capability. The capability levels are, in fact, a product of the assessment, not variables for data collection. The roles dimension and the purpose dimension intersect to create ten data points, as illustrated in Figure 2.

Figure 2. Data Points for Capability Assessment
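
To make the structure concrete, here is a minimal sketch in Python (my illustration, not part of the assessment spreadsheet) that enumerates the ten role-and-purpose combinations:

```python
# Enumerate the ten data points formed by crossing the two analytic roles
# with the five analytic purposes. Illustration only.
ROLES = ["Creating Analytics", "Using Analytics"]
PURPOSES = ["Descriptive", "Diagnostic", "Discovery", "Predictive", "Prescriptive"]

data_points = [(purpose, role) for purpose in PURPOSES for role in ROLES]

print(len(data_points))  # 10 data points to collect
for purpose, role in data_points:
    print(f"{purpose} / {role}")
```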

A total of ten data points meets the “fast and light” criteria, but a small number of data points means that each must be carefully considered to avoid the extreme bias that a single value might introduce. Ideally, the assessment process involves a group of stakeholders who represent a broad cross-section of the business analytics community – those who create and use analytics throughout the organization, ranging from data scientists working with advanced analytics technologies to business managers using spreadsheets and self-service tools.

The purpose of the group is to arrive at consensus responses for each of the data points. Consensus, of course, begins with discussion. A simple spreadsheet provides the structure to drive discussion and constrains the set of responses such that they are quantifiable. Figure 3 illustrates the assessment spreadsheet.

Figure 3. Analytics Capability Assessment Spreadsheet

The yellow shaded areas are the spaces in which responses are collected – one column of responses for creating analytics and one for using analytics. The dropdown lists limit the set of responses in each column. Behind the scenes, the spreadsheet converts text responses to numerical values and derives capability assessment scores in the columns on the right. We’ll look at the scoring portion later in this article. But first, we’ll take a closer look at data collection.

As previously stated, each response must be accurate for the assessment to have value. Responses must be carefully considered and represent consensus that is achieved through discussion. Accurate responses, then, require two support structures – defined terms and discussion guidelines.

Definitions of Terms

Choosing the correct response for each cell in the spreadsheet requires that you understand the terms that describe the cell and the terms used to define the set of allowed responses. There are three sets of terms for which definition is needed:

  • The members of the roles dimension – analytics creators and analytics users.
  • The members of the purpose dimension – descriptive, discovery, diagnostic, predictive, and prescriptive analytics.
  • The allowable sets of responses – for creating analytics: evolving portfolio, integrated and reused, repeatable processes, development projects, pockets of competency, and rare or none; for using analytics: strategy and planning, operational analytics, decision support, performance management, measurement and reporting, and rare or none.

THE ROLES

Analytic Creators. The people who collect measurement data, define metrics, and build the processes to derive and deliver those metrics to analytic users.

Analytic Users. The people who use measures and metrics to gain insight into business events and circumstances, to develop foresight into future business performance, and to inform and enhance decision-making processes.

THE PURPOSE

Descriptive Analytics provides information about past actions, events, and outcomes: What happened? How much? When did it happen? The purpose is to provide quantitative descriptions of systems, processes, activities, events, etc. Descriptive analytics is even applied to data with the goal of quantitatively describing the data itself, providing knowledge needed to use the data for other types of analytics.

Diagnostic Analytics also looks at past events, but from a different perspective than descriptive analytics. Diagnostics focus on cause and effect: Why do things happen? Why do outcome measures go up or down? What are the influences on business outcomes? Why do things happen at specific points in time or on particular time cycles?

Discovery Analytics uncovers interesting, previously unknown facts, trends, and patterns. Think of discovery as learning analytics – seeking to find new and useful knowledge. Bridging from data to business, discovery analytics may pursue data insights, business insights, or both.

Predictive Analytics uses data to forecast probabilities of future events and conditions. The purpose is to answer questions about what is likely to happen, and forecasts are typically related to expected behaviors of individuals (customers, employees, machines, parts, etc.). Prediction may focus on specific individuals or may categorize individuals by probability – Which customers are likely to churn? Who has a high probability of responding to a cross-sell offer? Who is likely to miss scheduled loan payments? Predictions are valuable in a variety of planning and decision-making processes.

Prescriptive Analytics recommends and/or automates decisions with the goal of optimizing future outcomes. Prescriptions as recommendations help decision makers to choose among alternative responses to a set of circumstances. When prescription is used to automate decision-making, the analytic model determines a single best response to a set of conditions.

RESPONSES – CREATING ANALYTICS 

Evolving Portfolio.  Analytics development is based upon a systematically managed collection of analytic capabilities and systems that is aligned with business processes and information needs, and that continuously adapts to business change.

Integrated and Reused. Analytic development processes have sufficient discipline and governance to ensure that measures and metrics are consistently defined, non-conflicting, non-redundant, and reused across business functions and analytic applications.

Repeatable Processes. Analytic development processes have a methodological foundation. They are defined, documented, repeatable, and teachable processes that are used consistently across the community of analytics creators.

Development Projects. Analytics development activities are formalized as projects with all of the key project management elements of planning, execution, monitoring, control, and closure.

Pockets of Competency. Some business units and individuals have the qualities described for development projects, repeatable processes, and integration and reuse. The skills of these groups are local, not coordinated across groups, and not generally available to the broader organization.

Rare or None. Creating and using analytics is not a common practice among business management or information management practitioners.

RESPONSES – USING ANALYTICS

Strategy and Planning. Analytics are viewed and applied as valuable strategy management tools that have important roles both in setting and achieving business goals. Analytics help to shape strategy, inform strategic planning with understanding and insight, and monitor the degree to which strategic goals are being achieved.

Operational Analytics. Analytics enable and support applications, services, and technologies to monitor, analyze, and manage performance of daily business operations. Business effectiveness and efficiency are improved through analytics that are focused on specific operational activities and workflows.

Decision Support. Analytics are applied to inform decision-making and problem-solving processes, with particular attention to problems of uncertainty, ambiguity, and elusive problem definition. Iterative analysis, where each answer brings new questions, is a common occurrence.

Performance Management. Analytics are used to measure and monitor business results as compared to pre-defined tactical and operational goals. Directional trends and variance between goals and actual results are used to identify areas where management attention is needed.

Measurement and Reporting. Business outcomes are quantified as metrics and performance indicators, and are routinely published to a community of interested business stakeholders.

Rare or None. Creating and using analytics is not a common practice among business management or information management practitioners.
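
Taken together, these response choices form two ordinal scales – one for creating analytics and one for using analytics. As a purely hypothetical illustration of how the spreadsheet might convert text responses to numerical values (the actual values behind the scoring are not published in this article), a sketch could look like this:

```python
# Hypothetical ordinal values for the response choices. The actual numbers
# used by the assessment spreadsheet are not given in the article; these
# placeholders assume the choices are listed from most to least mature.
CREATING_SCALE = {
    "Evolving Portfolio": 5,
    "Integrated and Reused": 4,
    "Repeatable Processes": 3,
    "Development Projects": 2,
    "Pockets of Competency": 1,
    "Rare or None": 0,
}

USING_SCALE = {
    "Strategy and Planning": 5,
    "Operational Analytics": 4,
    "Decision Support": 3,
    "Performance Management": 2,
    "Measurement and Reporting": 1,
    "Rare or None": 0,
}

def score_response(role: str, response: str) -> int:
    """Convert a text response for one cell into its numeric value."""
    scale = CREATING_SCALE if role == "Creating Analytics" else USING_SCALE
    return scale[response]
```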

Discussion Guidelines

Developing a consensus set of responses for each cell of the spreadsheet depends on communication and discussion. In addition to common definitions, it is useful to have a set of topics to guide a complete, multi-faceted discussion. I suggest the following topics as a good discussion agenda:

  • Business Expertise – Do you have the right business subject experts with the knowledge that is needed to develop and apply analytics?
  • Modeling Skills – Do you have enough people with strong statistical background who know how to build and deploy analytic models?
  • Analysis Skills – Do you have the analytical capabilities to separate facts from assumptions, draw conclusions from data analysis, and find the business meaning in the data?
  • Data Sources – Do you have access to the right data, of the right quality, to be able to perform analytics effectively?
  • Data Expertise – Do you have the right data subject experts with the knowledge that is needed to develop and apply analytics?
  • Data Management – Do you have the necessary data infrastructure including architecture, technology, and governance practices to build and sustain analytics?
  • Data Quality – Do you have the capabilities to assess data quality and to improve quality when needed? Are you able to determine the level of quality needed for various analytics projects?
  • Data Preparation – Do you have the knowledge, skills, and technologies to select, sample, transform, cleanse, integrate and blend data from various sources to prepare it for analytics?
  • Data Visualization – Do you have the knowledge, skills, and technologies to turn data into understandable and meaningful visual communications?
  • Big Data Technology – Do you have the technology (and the skills to use that technology) to get analytic advantage from the variety of data sources – internal and external, structured and unstructured – that are available in a world of digital life and digital economies?
  • Self-Service Tools – Do you have the tools and technologies to provide self-service analytics and self-service data preparation?
  • Data Science Tools – Do you have the data preparation, analytic modeling, and data visualization tools that are needed for data mining, predictive analytics, and other types of advanced analytics? Do you have the technologies to deploy the models and for business people to access and use them?
  • Analytic Culture – Is your organization culture one that encourages quantitative business management and in which adoption of analytics can be assumed?
  • Decision Management – Does your organization have known and planned decision-making processes? Or do you have more of an intuitive, ad hoc, and seat-of-the-pants decision-making style?
  • Business Adoption – Does your organization embrace new concepts, ideas, and technologies and adopt them readily? Or is it better characterized as a “late adopter?”

Combine each topic with roles and uses as illustrated in Figure 4 to develop a rich and robust agenda for discussion.  

Figure 4. Discussion Guidelines
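
As a simple illustration of the combinatorial structure (my sketch, assuming Figure 4 crosses each topic with the roles and uses), the agenda could be generated like this:

```python
# Cross the discussion topics with the roles and uses to generate discussion
# prompts, in the spirit of Figure 4. The topic list is abridged and the
# wording of the prompts is illustrative only.
TOPICS = ["Business Expertise", "Modeling Skills", "Data Quality"]  # abridged
ROLES = ["Creating Analytics", "Using Analytics"]
USES = ["Strategy and Planning", "Operational Analytics", "Decision Support",
        "Performance Management", "Measurement and Reporting"]

agenda = [
    f"{topic}: how does it affect {role.lower()} for {use.lower()}?"
    for topic in TOPICS
    for role in ROLES
    for use in USES
]

print(len(agenda))  # topics x roles x uses discussion prompts
```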

Assessment Results

Analytic capability scores are calculated when all of the responses have been considered and entered into the spreadsheet. Capability scores are derived for each analytic role and for each analytic use. Scores are aggregated for each dimension and in total for the entire assessment, as shown in Figure 5.

Figure 5. Analytics Capability Assessment Scores
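
For a rough sense of how such a roll-up might be computed, the sketch below assumes simple averages of the cell values – the article does not specify the actual aggregation method or weighting used in Figure 5:

```python
# Hypothetical roll-up of cell scores into role, use, and overall scores.
# The aggregation used by the actual spreadsheet is not specified in the
# article; simple averages are assumed here for illustration.
from statistics import mean

# responses[(purpose, role)] holds the numeric value for one cell (0-5 assumed)
responses = {
    ("Descriptive", "Creating Analytics"): 3,
    ("Descriptive", "Using Analytics"): 2,
    ("Diagnostic", "Creating Analytics"): 2,
    ("Diagnostic", "Using Analytics"): 1,
    # ... remaining data points omitted for brevity ...
}

def role_score(role: str) -> float:
    """Aggregate score for one analytic role across all purposes."""
    return mean(v for (p, r), v in responses.items() if r == role)

def use_score(purpose: str) -> float:
    """Aggregate score for one analytic use (purpose) across both roles."""
    return mean(v for (p, r), v in responses.items() if p == purpose)

def overall_score() -> float:
    """Aggregate score for the entire assessment."""
    return mean(responses.values())

print(role_score("Creating Analytics"), use_score("Descriptive"), overall_score())
```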

Seeing the scores is interesting, but the numbers are only indicators. Some interpretation is needed to make the transition from interesting to informative. And some planning is needed to make the shift from informative to actionable. Understanding and applying the capability assessment is the subject of the next and final article in this series.

Read - A Capability Model for Business Analytics: Part III – Using the Capability Assessment

Dave Wells

Dave Wells is an advisory consultant, educator, and industry analyst dedicated to building meaningful connections throughout the path from data to business value. He works at the intersection of information...
