Commodity AI and the Next Best Experiment
Earlier in October, I spent a week in Miami (before the hurricanes) at IBM’s Analytics University. Here are five cool ideas that you may enjoy thinking about:
1. There is No AI Without IA - Rob Thomas, IBM’s General Manager of Analytics, introduced me to this great phrase in his keynote. IA, in this case, stands for “Information Architecture”. The phrase encapsulates the most important lesson for those building and evolving their data science and AI departments: “you can’t build great AI unless your data is in good working order”.
2. Commodity AI - Mike Gualtieri of Forrester argued that today’s most popular machine learning algorithms are rapidly becoming a commodity. YES! This is an important insight for all analytics leaders trying to make their models have a business impact. XGBoost may be the current darling algorithm, boosting and bagging keep improving decision trees, and convolutional layers make deep learning work, but the bottom line is that the near-term breakthroughs in AI/ML/DS are going to come from the mundane supporting pieces. The breakthroughs will come from excellence in things like model management, data integration, and ROI optimization. All the better if these functions are delivered as part of an overall data science / AI platform.
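Just how commoditized the algorithms have become is easy to demonstrate: a competitive gradient-boosted model takes a handful of lines in any mainstream library. Here is a minimal sketch using scikit-learn (chosen purely for illustration; the synthetic dataset stands in for real business data):

```python
# Commodity ML: a boosted-tree classifier in a few lines.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real business dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "hard part" -- the algorithm itself -- is now an off-the-shelf call.
model = GradientBoostingClassifier(random_state=0)
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```

The differentiation, as Gualtieri argued, is no longer in these five lines but in everything around them: versioning and monitoring this model, feeding it clean data, and tying its scores to ROI.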
3. AI’s Next Best Experiment - Several talks mentioned that IBM Watson is being used in the life sciences as part of the experimental process. It is now easier than ever to run new tests in the drug discovery process, but the space of possible experiments is endless and each one has a cost. Being able to intelligently suggest the next most informative experiment is therefore critical to success, and this is just where Watson is being put to work. This is a great application area for AI but, as I mentioned in my last post, it also marks a critical threshold being crossed. Up to this point, AIs have been learning from us. Now, with multitudes of sensors and actuators in the real world, AIs will (in a very real sense for the first time ever) be able to experiment on their environment and learn in a way similar to a human. This change is going to produce the most remarkable breakthroughs in AI in the coming years, with far more impact than the celebrated swap of the sigmoid function for ReLU in the hidden layers of deep learning networks.
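One simple version of “suggest the next most informative experiment” is uncertainty sampling from active learning: run the experiment whose outcome the current model is least sure about. The sketch below is illustrative only; the compound names and probabilities are invented, and real systems (Watson included) use far richer criteria:

```python
def next_best_experiment(candidates):
    """Pick the candidate whose predicted outcome is least certain --
    the experiment expected to teach the model the most."""
    # For a binary outcome, uncertainty peaks at p = 0.5.
    return max(candidates, key=lambda c: -abs(c["p_success"] - 0.5))

candidates = [
    {"compound": "A", "p_success": 0.95},  # model already confident
    {"compound": "B", "p_success": 0.48},  # model unsure -> informative
    {"compound": "C", "p_success": 0.10},  # confidently predicted failure
]
print(next_best_experiment(candidates)["compound"])  # prints B
```

Testing compound B teaches the model more than confirming what it already believes about A or C, which is the whole point of closing the loop between model and experiment.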
4. Google’s AI Blind Spot: Enterprise Data - While Google and Facebook dominate the collection and use of personal information, they have little insight into what goes on within large corporations. At some point, this will hamper their ability to provide services and applications that optimize the use of models for their enterprise customers. For example, Google could predict which customer wants a particular product, but not whether selling it to them is optimal for the company, unless it also has the enterprise data for inventory, production, and distribution costs.
5. Is AI Suggestive, Prescriptive or Optimized? - I recently had a discussion with Wayne Eckerson about whether the predictive models produced by machine learning are really prescriptive, or whether they are merely suggestive and require human-crafted rules to put the model scores into action. I argued that models are used all the time without human interpretation; for example, models routinely approve or deny credit or recommend the next best product in a prescriptive/automated way. While that is true, Wayne also has a point. Turning a risk score into an action may take much more than a simple cutoff rule; something more sophisticated is often needed. This is where business optimization kicks in, and this is where folks like IBM and a few others (such as FICO and SAS) have significant leads over other vendors. Virginie Grandhaye at IBM described it this way in her talk on decision optimization: “Machine learning and predictive models tell you what will happen next. Optimization tools tell you what to do about it.” This marriage of predictive models and classical optimization algorithms (e.g. simplex) is going to integrate the power of machine learning with the real-world constraints of business. The result will be a multiplicative effect on the value of either technique by itself.
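A toy sketch of that marriage: take model outputs (here, invented per-unit profit predictions for three products) and feed them into a linear program with real-world constraints. I use SciPy’s `linprog` with its HiGHS solver purely for illustration; the capacity and production limits are hypothetical:

```python
# Predicted scores + classical optimization = a prescriptive decision.
from scipy.optimize import linprog

predicted_profit = [3.0, 5.0, 4.0]   # stand-in for ML model output
c = [-p for p in predicted_profit]   # linprog minimizes, so negate

A_ub = [[1, 1, 1]]                   # each unit consumes shared capacity
b_ub = [100]                         # 100 units of capacity available
bounds = [(0, 60)] * 3               # per-product production limits

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
plan = [round(x) for x in result.x]
print(plan, "profit:", -result.fun)  # prints [0, 60, 40] profit: 460.0
```

The model alone would say “product B is most profitable”; the optimizer says how much of B, C, and A to actually make given the constraints. That is the multiplicative effect in miniature.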