
Finding Value in Analytics, Part III: Next-Gen


The journey of this blog series is finding the value generated by analytics within enterprise IT systems. Why should companies and governments invest resources in analytics as part of their IT infrastructure? Are good data management and insightful visual tools enough? What is the value of using analytics?

In the first blog, we set the stage for this journey by introducing the economic value principle:


Information has economic value only if we are enabled to make actionable choices that yield higher expected payoffs than if we made choices in the absence of that information.


Simply put, what do we do with what we know? The implication is that, to have value, analytics must enable the organization to execute improved actions that, in aggregate, change its behavior and improve its expected payoffs. Hence, the focus shifts to how analytics transforms data into actions, not just insights.

In the second blog, we explored our seven-decade legacy of using various IT technologies to support Business Intelligence (BI). Each decade had a unique value generator, and we noted six trends flowing from one decade to the next: Cumulative, Pervasiveness, Disruption, Actionable, Generalizing, and Effectiveness.

This third blog looks ahead: what will be the value generators for the 2020-decade, given these six trends? One aspect is certain. Analytics will play a key role in generating value within IT/BI systems in the 2020-decade. But how?

Cumulative: Care for Your Current IT Infrastructure 

As highlighted in an earlier article, DataOps is already an essential factor in enabling the development and deployment of analytical systems. It is possible for a company to spawn successful stovepipe analytic applications based on manually created training datasets. It is more difficult to use the same analytics as part of production systems, which require mature data integration and governance, plus close coordination among groups with diverse skills.

The greatest difficulty will be for companies to integrate data science expertise into an enterprise strategy and culture. Changing strategy is easy; changing culture is not. For instance, the earlier article also argued for enlarging the role of the business/data analyst: not as a “citizen data scientist”, but as a bridge-builder working in close coordination and collaboration with a strong data science team.

More fundamental to this cultural change is organization-wide literacy in analytic concepts (describing versus generalizing thinking, crafted versus learning logic, training/testing datasets, bias/variance tradeoffs, learning-curve behaviors, supervised/unsupervised/reinforcement/unknown architectures). Everyone must speak the same analytic-centric language in the 2020-decade.

Disruption: Identify Obsolete Components 

In the prior blog, we illustrated that prior technology is not destroyed but blends into the next, along with some disruption. Several decades ago, major hardware vendors spent considerable effort on elaborate migration strategies for their customers, resulting in new blue boxes rolled onto your raised-floor data center every 2-3 years, amid smiles from the vendors. It was a smooth (albeit expensive) hardware evolution over those decades.

In the 2020-decade, hardware will be floating in the cloud, drifting and evolving quietly into its daily next-gen. As past decades have also illustrated, software is now the dominant factor. Further, the analytics of this decade has shown that learning logic, rather than crafted logic, has become a vital aspect of software. Hence, your obsolete technology components in the 2020-decade are more likely to be analytic models that you carefully trained yesterday but that are no longer applicable today for conducting your business.

The key factor is the degree of change in your business. Should you conduct business today the same way you conducted it yesterday? Should you interact with your customers the same way? Should you change your prices with the same decision rules? Should you restock your stores the same way? And so forth. If the answer is YES, then your company will be okay with the analytics of the 2010-decade. If NO, then how should you minimize the constant disruption of tracking your ever-changing business situation?

First, IT management must constantly be aware of how each critical part (especially analytics) of their enterprise system will be blending into their next-gen version. Monitor and plan to alleviate disruptions in your technology evolution.

Second, realize that new methods and tools for managing the learning logic of your operational analytic models are rapidly evolving. The emerging label is “AnalyticOps”, in the same sense as DevOps, DataOps, and the like. Pay attention to the best practices and tools for AnalyticOps. The goals are to manage model versions based on their training regimen, detect drift in prediction accuracy against live data streams, determine optimal retraining procedures (e.g., champion/challenger contests) to maintain performance, and measure realized performance continuously (e.g., A/B testing). It’s a new ballgame, like adding a fourth base to the playfield!
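One piece of that AnalyticOps goal, detecting drift in prediction accuracy against live data, can be sketched in a few lines. This is a minimal illustration, not a production tool; the class name, window size, and thresholds are hypothetical choices:

```python
from collections import deque

class DriftMonitor:
    """Tracks rolling accuracy of a deployed model on live, labeled
    outcomes and flags when performance degrades enough to retrain."""

    def __init__(self, window=500, baseline_accuracy=0.90, tolerance=0.05):
        self.window = deque(maxlen=window)   # recent hit/miss results
        self.baseline = baseline_accuracy    # accuracy at deployment time
        self.tolerance = tolerance           # allowed degradation

    def record(self, predicted, actual):
        """Record one live prediction once its true outcome is known."""
        self.window.append(predicted == actual)

    @property
    def accuracy(self):
        return sum(self.window) / len(self.window) if self.window else None

    def drift_detected(self):
        """True when rolling accuracy falls below baseline - tolerance."""
        acc = self.accuracy
        return acc is not None and acc < self.baseline - self.tolerance
```

In practice the drift signal would trigger a champion/challenger retraining contest rather than an immediate swap, so a freshly trained challenger must beat the incumbent on live data before promotion.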

Pervasiveness: Analytics Will Be Embedded Everywhere 

Over past decades, we have seen that BI has touched a wider scope of persons, from executives only in the 1970s to every customer placing an order today. In the last decade, cell phones (which are seldom used as a phone) have been a factor in extending this pervasiveness, which will be extended further with reliable chatbot services and IoT sensors in the future.

The 2020-decade brings a shift from persons to systems. Analytics will become more pervasive as enterprise systems embed smart analytic modules throughout. For instance, consider a standard module that counts and characterizes transaction flow in terms of dollar amounts, type of goods, location, and other factors. In the future, this module will likely be replaced with one that learns from each transaction and characterizes it into clusters of business activity. Another example is a module that uses a variation of neural style transfer for image/video identification of defective parts on an assembly line.
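The transaction-clustering module described above can be sketched as a simple online k-means updater: each transaction nudges the nearest cluster centroid, so the clusters track evolving business activity. This is an illustrative toy; the seed centroids and the numeric feature encoding (e.g., amount, goods code, location code) are assumptions, and a real module would also learn the number of clusters:

```python
import math

class OnlineTransactionClusterer:
    """Minimal online k-means sketch: each incoming transaction
    (a numeric feature vector) updates the nearest centroid."""

    def __init__(self, centroids):
        # Seed centroids are assumed; in practice they would be learned.
        self.centroids = [list(c) for c in centroids]
        self.counts = [1] * len(centroids)

    def _nearest(self, x):
        dists = [math.dist(x, c) for c in self.centroids]
        return dists.index(min(dists))

    def assign(self, x):
        """Assign a transaction to a cluster and nudge that centroid."""
        k = self._nearest(x)
        self.counts[k] += 1
        lr = 1.0 / self.counts[k]            # decaying learning rate
        self.centroids[k] = [c + lr * (xi - c)
                             for c, xi in zip(self.centroids[k], x)]
        return k
```

The design point is that the module's logic (the centroids) changes with every transaction, rather than being crafted once and deployed.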

Actionable: Manage Your Data To-Action Value Chain

Analytics is just a means to an end, starting with raw data. The end is action, as emphasized in the economic value principle above.

If a company is serious about analytics, then it must also be serious about managing its data-to-action value chain as an end-to-end system. This implies that your company should trace each piece of raw data to an action resulting from a choice based on that data. Also, this implies that your company should trace each action back to the data transformed into choices for that action. This is analogous to manufacturing companies managing Bill-Of-Materials assemblies with traceability forward or backward. This may be the next-gen of Enterprise Resource Planning (ERP) systems.
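The forward/backward traceability described above can be sketched as a pair of indexes, much like a bill-of-materials "where-used" lookup in manufacturing. The class and identifier names here are hypothetical:

```python
from collections import defaultdict

class ValueChainTrace:
    """Minimal sketch of a data-to-action value chain: each action
    records which raw data items informed it, and each data item
    records which actions it fed, so tracing works both directions."""

    def __init__(self):
        self.data_to_actions = defaultdict(set)
        self.action_to_data = defaultdict(set)

    def link(self, data_id, action_id):
        """Record that a piece of raw data informed an action."""
        self.data_to_actions[data_id].add(action_id)
        self.action_to_data[action_id].add(data_id)

    def trace_forward(self, data_id):
        """Which actions did this piece of raw data influence?"""
        return self.data_to_actions[data_id]

    def trace_backward(self, action_id):
        """Which raw data informed this action?"""
        return self.action_to_data[action_id]
```

A real system would interpose the intermediate steps (features, model version, choice) between data and action, but the two-way index is the essential structure.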

Generalizing: Make This Your Core Competency

We have become proficient at describing known data, along with evolving the BI practices and BI tools to do so. However, we as humans are inept at generalizing beyond known data, even when using the latest BI systems.

This deficiency, however, is the strength of using analytics, especially when scaled to enterprise systems. The good news is that newer analytic technologies, such as neural networks, have evolved rapidly and dramatically over the past five years. The bad news is that these newer analytic technologies are unevenly distributed, especially from an organizational cultural perspective. With this rapid evolution, all data scientists struggle to maintain competency with best practices.

In the 2020-decade, as analytics mature and are integrated into systems, there are two implications of this trend:

  • Gambling skills will be essential. Future business environments are increasingly uncertain and unpredictable. Certainty about any business situation will be a luxury. Every choice and its actions become a probabilistic and statistical tradeoff. The handling of false negatives and false positives will become a strategic decision for top executives. It may be that successful professional gamblers will become the role models for the data scientists of the future.
  • Searching for the signal in raw data will be critical. It is possible that the signal data is novel and unknown to official corporate data, such as the data warehouse or even the data lake. Internet-of-Things (IoT) sensors are likely to contain the signals required to drive analytic systems. Move over, data warehouses and data lakes. Data hurricanes are coming!
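The false-negative/false-positive tradeoff in the first point can be made concrete: once executives assign business costs to each error type, choosing a model's decision threshold becomes an expected-cost minimization rather than a technical default. A minimal sketch, with hypothetical function names and a simple grid search:

```python
def expected_cost(threshold, scored_outcomes, cost_fp, cost_fn):
    """Average misclassification cost at a given decision threshold.
    scored_outcomes: list of (model_score, actual_positive) pairs."""
    cost = 0.0
    for score, is_positive in scored_outcomes:
        predicted_positive = score >= threshold
        if predicted_positive and not is_positive:
            cost += cost_fp       # false positive
        elif not predicted_positive and is_positive:
            cost += cost_fn       # false negative
    return cost / len(scored_outcomes)

def best_threshold(scored_outcomes, cost_fp, cost_fn, grid=None):
    """Pick the grid threshold that minimizes expected cost."""
    grid = grid or [i / 100 for i in range(101)]
    return min(grid,
               key=lambda t: expected_cost(t, scored_outcomes,
                                           cost_fp, cost_fn))
```

Note that the threshold moves when the cost ratio moves: if a missed fraud (false negative) costs ten times a wrongly blocked sale (false positive), the optimal threshold drops, and that ratio is exactly the strategic decision the executives own.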

Effectiveness: Customize Actions for Each Situation

Over the decades, we have seen a shift from efficiency to effectiveness in the use of IT/BI in enterprise systems. As Peter Drucker stated, “Efficiency is doing things right; effectiveness is doing the right things.” [01]

Leveraging this quote, the 2020-decade will shift to intelligent effectiveness, which is customizing each task for each transaction to match the needs of the specific business situation.

Achieving intelligent effectiveness at scale will require managing analytic models that continuously learn and improve their performance with every transaction. Systems must track each zigzag of the business world by constantly evolving the logic within those models. The learning analytic system that was installed last week is quite different in behavior this week. How will these analytic systems be managed?
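The idea of logic that evolves with every transaction can be sketched as an online model that takes one gradient step per observation, so last week's model and this week's model genuinely differ in behavior. This is an illustrative toy under assumed names, not a recommended architecture:

```python
class OnlineLinearModel:
    """Minimal sketch of per-transaction learning logic: a linear
    model updated by one stochastic-gradient step per observation."""

    def __init__(self, n_features, lr=0.05):
        self.weights = [0.0] * n_features
        self.bias = 0.0
        self.lr = lr

    def predict(self, x):
        return sum(w * xi for w, xi in zip(self.weights, x)) + self.bias

    def update(self, x, target):
        """One SGD step on squared error for a single transaction."""
        error = self.predict(x) - target
        self.weights = [w - self.lr * error * xi
                        for w, xi in zip(self.weights, x)]
        self.bias -= self.lr * error
        return error
```

This is precisely why such systems need the AnalyticOps monitoring discussed earlier: a model that changes with every transaction cannot be validated once and forgotten.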

Take-Aways: What Should You Do...

As a summary of the above, here are suggested Take-Aways to consider:

  • Enlarge the role and skills of the business/data analyst as key bridge-builder for collaboration with the data science team.
  • Pay attention to the unique aspects of AnalyticOps for managing model versions, detecting drift in prediction accuracy, determining optimal retraining, and measuring realized performance. When exposed to live data, when do the analytic models go stupid?
  • Adopt next-gen unsupervised learning to characterize evolving clusters of business activity based on their latent space.
  • Be serious about analytics by being serious about managing the data-to-action value chain resulting from the analytics. Trace data to action, and trace action back to data.
  • Understand where in the raw data is the signal that drives analytic performance. Is that critical signal data part of your data warehouse or data lake? If not, how will you find it?

Summary

In ten years, how will we characterize the IT evolution of the 2020-decade, compared to these prior seven decades? Here is my best guess, as the next row on the legacy table. Please share your thoughts.

We have a rich legacy in our IT/BI evolution over the past seven decades. The 2020-decade will continue to challenge corporate IT as analytics evolves, matures and drives the business value generated by enterprise systems. We must maintain open minds and foster agile thinking to be successful and to guide its usage properly to benefit everyone fairly.

Read Finding Value in Analytics, Part IV: Action Distance


References

[01] The quote is from Drucker’s book The Effective Executive: The Definitive Guide to Getting the Right Things Done. A concise summary of the 12 lessons from this book seems applicable to managing future analytic systems. Also, a concise elaboration of this efficiency-effectiveness distinction was captured in his May 1963 Harvard Business Review article “Managing for Business Effectiveness.”

Richard Hackathorn

Richard Hackathorn, Ph.D., of Bolder Technology, Inc. is a well-known industry analyst, technology innovator, and international lecturer in business intelligence and data analytics. He is currently focusing on the managerial...
