
The Data Literacy Imperative - Part III: Data Literacy Assessment

Key stakeholders in a collaborative setup for data literacy

Read - The Data Literacy Imperative - Part I: Building a Data Literacy Program 

Read - The Data Literacy Imperative - Part II: The Data Literacy Body of Knowledge

In the first article of this series, I described a process for building the program, with assessment and planning as core activities. In the second article I described the Data Literacy Body of Knowledge (DLBOK) covering the broad scope of knowledge that is needed for a fully data literate organization to be skilled at managing, organizing, consolidating, governing, preparing, analyzing, and deriving value from data. Cultivating data literacy, both individually and collectively for the organization, is the predominant goal of a data literacy program. And growing literacy doesn’t happen without assessment and planning—assessment of literacy levels both individually and organizationally, and planning to fill knowledge gaps identified through assessment. It has often been said that you can’t manage what you don’t measure. This concept applies to growing a culture of data literacy as much as any other management effort. 

Assessment Basics

As described in the earlier article, a comprehensive Data Literacy Body of Knowledge (DLBOK) is the foundation for assessment, gap analysis, and development of learning plans. The DLBOK identifies topical areas for data literacy, ideally in a multi-level structure such as the partial example shown in figure 1.

Figure 1. Partial Example of 3 Levels of DLBOK

Assessment tests individual knowledge at the lowest level of the topic hierarchy, then rolls the results up to higher levels. An assessment might show, for example, that an individual has:

  • Above average literacy for Data Analysis

    • Moderate literacy for Data Analysis Techniques

      • High literacy for Descriptive Statistics

      • Low literacy for Inferential Statistics

      • Moderate literacy for Time Series Analysis

    • High literacy for Data Visualization

      • High literacy for Visualization Functions

      • High literacy for Reading Data Visualizations

      • Above average literacy for Creating Data Visualizations

In the example above, I have intentionally used subjective and comparative language—moderate, above average, high, low, etc.—to illustrate the concept. In practice, literacy assessment is quantitative, with a score calculated for each bottom-level topic and then rolled up to higher-level topics. The eLearningCurve DLBOK will be supported with a corresponding (and free) assessment tool based on these concepts. 
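The roll-up described above can be sketched in a few lines of Python. This is a minimal illustration only: the topic paths, the 100-point scale, and the simple-average roll-up rule are my assumptions for the sketch, not the method used by any particular assessment tool.

```python
# Illustrative roll-up of bottom-level DLBOK topic scores to higher levels.
# Assumes a 100-point scale and a simple average at each level (both are
# assumptions for this sketch).
from collections import defaultdict

# Bottom-level scores keyed by full topic path (highest level first).
scores = {
    ("Data Analysis", "Data Analysis Techniques", "Descriptive Statistics"): 88,
    ("Data Analysis", "Data Analysis Techniques", "Inferential Statistics"): 42,
    ("Data Analysis", "Data Analysis Techniques", "Time Series Analysis"): 65,
    ("Data Analysis", "Data Visualization", "Visualization Functions"): 90,
    ("Data Analysis", "Data Visualization", "Reading Data Visualizations"): 92,
    ("Data Analysis", "Data Visualization", "Creating Data Visualizations"): 80,
}

def roll_up(scores):
    """Average scores upward through the hierarchy, one level at a time."""
    rolled = dict(scores)
    level = scores
    while level:
        parents = defaultdict(list)
        for path, score in level.items():
            if len(path) > 1:
                parents[path[:-1]].append(score)
        level = {p: sum(v) / len(v) for p, v in parents.items()}
        rolled.update(level)
    return rolled

rolled = roll_up(scores)
print(rolled[("Data Analysis", "Data Analysis Techniques")])  # 65.0
print(rolled[("Data Analysis",)])
```

A weighted average (weighting topics by importance to a role) would work the same way; only the aggregation rule in `roll_up` changes.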

The Assessment Process

Assessing the data literacy of individuals is valuable and informative for people who want to advance their data skills and grow their careers. But individual assessment is only the beginning. It is the foundation upon which organizational assessment is built, and organizational assessment is an essential process when building a culture of data literacy. Literacy assessment with business impact is performed at three levels—by individual, by role, and by group. (See figure 2.)

Figure 2. Data Literacy Assessment Process

Preparing for Assessment

Preparation is a necessary first step to determine whose data literacy will be assessed, both individually and collectively by roles and groups. When assessment is driven as a corporate initiative, it makes sense to begin by identifying roles and groups. Typical roles include business executives, business managers, internal auditors, data analysts, business analysts, data engineers, data scientists, etc. Group identification is likely to be based on organization structure, thus hierarchical and perhaps ranging from major business units such as Financial Management to sub-units and teams. Once roles and groups are identified, then it is practical to identify the individuals in those roles and groups. Sometimes, however, data literacy is more of a grassroots effort than a corporate initiative. In those cases, identify the individuals first, then associate each person with the appropriate roles and groups.

Preparation also includes getting ready to store and manage the assessment data that is needed to measure, monitor, and manage a data literacy program. 

Setting the Targets

As with any measurement process, measures aren’t meaningful without a basis for comparison. Begin by knowing the measurement basis of your assessment method and tool. Does it score literacy on a 10-point scale, a 100-point scale, or some other basis? Then consider the DLBOK that frames the assessment. Not all topics in the DLBOK demand the same level of literacy from all individuals, roles, and groups. Data Scientist roles, for example, should certainly have high literacy in statistical analysis, with perhaps lower expectations for database management. For each role and group you’ll want to set targets by topic. 

To make the process manageable, don’t set targets at too low a level. Referring back to the earlier example of Data Analysis, it is practical to set targets at the second level—Data Analysis Techniques and Data Visualization. With the measurement basis known, you can set target levels for each individual—perhaps depending partially on their roles—or ask them to set their own targets. You can also establish targets for each role and group to be assessed. Of course, when first getting started, target setting is a lot of guesswork. Don’t hesitate to set some initial targets, conduct that first assessment, then adjust the targets based on what you learn from assessment.

Individual Assessment

To conduct individual assessment, each person is tested by responding to questions that are based on the DLBOK. (See the column on the left side of figure 2.) Test scores are detailed by DLBOK topics at the same level as is used to set targets. Comparing actual scores to targets identifies both strengths and gaps. Gaps exist where the actual score falls below the target level. Use gap assessment to create a learning plan that will build new knowledge. (I’ll discuss learning resources in the next and final article of this series.) Complete the learning plan, then reassess.
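The score-versus-target comparison is straightforward to express in code. As a sketch, with illustrative topic names and numbers (not data from any real assessment):

```python
# Individual gap analysis: compare assessed scores to targets at the
# target-setting level. A gap is a shortfall against target; anything at
# or above target is a strength. All names and values are illustrative.
scores  = {"Data Analysis Techniques": 65, "Data Visualization": 87}
targets = {"Data Analysis Techniques": 75, "Data Visualization": 70}

gaps      = {t: targets[t] - s for t, s in scores.items() if s < targets[t]}
strengths = {t: s - targets[t] for t, s in scores.items() if s >= targets[t]}

print(gaps)       # {'Data Analysis Techniques': 10}
print(strengths)  # {'Data Visualization': 17}
```

The gap values (here, 10 points in Data Analysis Techniques) are what a learning plan is built to close; the strengths feed the opportunity discussion below.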

Don’t overlook strengths that are identified. When assessment shows high literacy level or performance well above the target level, view that strength as an opportunity. In what ways might it offer growth and career opportunities? How can you share that knowledge and skill to grow overall literacy and data capabilities for your team or organization?

Organizational Assessment

Organizational data literacy has two dimensions—literacy by roles and literacy by groups. (See the two columns on the right side of figure 2.) For both dimensions, measures are derived by aggregating individual literacy scores of the people in those roles and groups. Similar to the process for individuals, literacy measures are compared with targets to identify strengths and gaps. Gaps are the areas where measures fall short of targets and a literacy growth plan is needed. That plan may include individual learning plans, group learning, implementing a data coaching program, and other methods. (More on this in the next article in this series.) Strengths should be acknowledged and recognized as group accomplishments, and may indicate opportunities for highly skilled groups to coach or mentor as part of other groups’ growth plans.
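Aggregating individual scores into role-level and group-level measures can be sketched as follows. The averaging rule, the role and group names, and the scores are all assumptions for illustration:

```python
# Derive role-level and group-level literacy measures by averaging the
# scores of the individuals in each role or group. Illustrative data only.
from collections import defaultdict
from statistics import mean

individuals = [
    {"name": "Ana",   "role": "Data Analyst",     "group": "Finance", "score": 82},
    {"name": "Ben",   "role": "Data Analyst",     "group": "Finance", "score": 68},
    {"name": "Carla", "role": "Business Analyst", "group": "Finance", "score": 74},
]

def aggregate(people, key):
    """Average individual scores within each distinct value of `key`."""
    buckets = defaultdict(list)
    for p in people:
        buckets[p[key]].append(p["score"])
    return {k: mean(v) for k, v in buckets.items()}

by_role  = aggregate(individuals, "role")   # Data Analyst averages to 75
by_group = aggregate(individuals, "group")
```

The same gap logic used for individuals then applies: compare `by_role` and `by_group` measures against role and group targets.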

Ongoing Assessment

Growing data literacy is not an event but a journey. Assessment isn’t a one-time activity; it is an ongoing process of measurement as part of continuous growth and improvement. Collect and store assessment measures—ideally as a star schema dimensioned by roles, groups, and dates—to support analysis and monitoring of trends. Management, measurement, and monitoring work together to inform leadership when creating a data literacy culture. As is typical with continuous improvement processes, you may choose to adjust targets and “raise the bar” as the culture of data literacy matures. 
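One possible star-schema layout for assessment measures can be sketched with SQLite. The table and column names here are my assumptions, not a prescribed design; the point is a fact table of scores surrounded by person (carrying role and group), topic, and date dimensions:

```python
# Sketch of a star schema for assessment measures, using SQLite.
# Schema and data are illustrative assumptions.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_person (person_id INTEGER PRIMARY KEY, name TEXT, role TEXT, grp TEXT);
CREATE TABLE dim_topic  (topic_id  INTEGER PRIMARY KEY, topic TEXT, parent_topic TEXT);
CREATE TABLE dim_date   (date_id   INTEGER PRIMARY KEY, assessed_on TEXT);
CREATE TABLE fact_assessment (
    person_id INTEGER REFERENCES dim_person(person_id),
    topic_id  INTEGER REFERENCES dim_topic(topic_id),
    date_id   INTEGER REFERENCES dim_date(date_id),
    score REAL, target REAL
);
""")
con.execute("INSERT INTO dim_person VALUES (1, 'Ana', 'Data Analyst', 'Finance')")
con.execute("INSERT INTO dim_topic  VALUES (1, 'Data Visualization', 'Data Analysis')")
con.execute("INSERT INTO dim_date   VALUES (1, '2024-03-01')")
con.execute("INSERT INTO fact_assessment VALUES (1, 1, 1, 87, 70)")

# Trend-style query: average score and average gap by role and topic.
row = con.execute("""
    SELECT p.role, t.topic, AVG(f.score), AVG(f.target - f.score)
    FROM fact_assessment f
    JOIN dim_person p ON p.person_id = f.person_id
    JOIN dim_topic  t ON t.topic_id  = f.topic_id
    GROUP BY p.role, t.topic
""").fetchone()
print(row)  # ('Data Analyst', 'Data Visualization', 87.0, -17.0)
```

With the date dimension added to the GROUP BY, the same query supports monitoring literacy trends between assessment cycles.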

Beyond Assessment

Data literacy assessment identifies gaps but does not actively close the gaps. Use the assessment to plan for learning and growth. That planning needs to identify and implement a variety of learning and growth resources. I’ve alluded to a few of those resources—training, coaching, mentoring—in this article. I’ll dig deeper in the fourth and final article of the series.

Read - The Data Literacy Imperative - Part IV: Developing Data Literacy

Dave Wells

Dave Wells is an advisory consultant, educator, and industry analyst dedicated to building meaningful connections throughout the path from data to business value. He works at the intersection of information...
