All You Need is Tableau? The Relentless March of a Data Analytics Platform
Last week, I joined 17,000+ die-hard Tableau customers and partners in New Orleans for the company’s annual data love-fest. A millennial-themed event, the conference gave data analysts and their managers a chance to escape their departmental cubbyholes and share in a vibrant, caring, and sharing community that fosters learning and goodwill around the world. It’s an inspiring and invigorating event. It made me feel young again!
From a product perspective, Tableau is relentlessly focused on its customers. It is determined to follow them wherever they want to go. They set the roadmap and Tableau follows. Increasingly, customers want a seamless, end-to-end analytical experience that accelerates time to insight with minimal training. This desire is moving Tableau into new product segments, turning it into a bona fide enterprise platform supported by a fast-growing, third-party ecosystem that innovates around the edges of Tableau’s ever-expanding footprint.
The Expanding Footprint
Today. Tableau made its mark providing best-of-breed visual analytics. Now, it’s systematically tackling the obstacles that prevent people from doing visual analytics in the first place. On the back-end, that means adding a new in-memory database that eliminates most performance issues, along with data preparation capabilities that create a full self-service authoring environment. On the front-end, Tableau is adding natural language queries to simplify ad hoc access for casual users, artificial intelligence to surface insights automatically, and a scripting engine to run R and Python models. Culturally, Tableau continues to do a wonderful job cultivating a community of users, both online and in-person, who use meetings, gamification, and training to expand skills and data literacy.
Full-Fledged Data Prep. At Tableau Conference 2018, Tableau made two announcements, each with far-reaching implications. The first was the release of Tableau Prep Conductor, a separately priced add-on to Tableau Server and Tableau Online that turns Tableau Prep from a personal productivity tool into an enterprise data integration platform, enabling administrators to manage, schedule, and monitor data flows. It is likely that Tableau Prep will soon output to third-party databases, not just Tableau. In other words, companies might soon buy Tableau Prep to manage all their data pipelines, not just those that end in a Tableau workbook. Watch out, Alteryx!
Data Modeling. The other key announcement is that Tableau unveiled new data modeling features that close the gap with rivals such as Microsoft Power BI and Qlik. Rather than consume a flat file of data (think Excel or a TDE), Tableau can now recognize dimensional schemas (e.g., stars and snowflakes). In addition, designers can create relationships in a data set using a visual drag-and-drop environment, so they don’t have to build complex workarounds to handle multi-level data (i.e., data at different levels of granularity). We suspect these features are the first step toward a more full-featured Tableau data modeling environment.
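To see why multi-level granularity needs real modeling support rather than a flat join, consider a toy example. The tables and numbers below are invented for illustration (this is not Tableau’s engine), using SQLite to show the classic fan-out problem that such workarounds exist to avoid:

```python
# Illustration (invented schema, not Tableau internals): joining daily
# sales to monthly targets on month fans out the single target row once
# per daily sales row, so a naive SUM double-counts the target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (day TEXT, month TEXT, amount REAL);
CREATE TABLE targets (month TEXT, target REAL);
INSERT INTO sales VALUES ('2018-10-01','2018-10',100),('2018-10-02','2018-10',150);
INSERT INTO targets VALUES ('2018-10', 500);
""")

# Flat join: target repeats for every matching sales row.
naive = conn.execute(
    "SELECT SUM(t.target) FROM sales s JOIN targets t ON s.month = t.month"
).fetchone()[0]
print(naive)  # 1000.0 -- inflated; the real monthly target is 500

# Correct handling aggregates each table at its own grain first,
# which is roughly what relationship-aware modeling does for you.
correct = conn.execute("""
    SELECT sm.total, t.target
    FROM (SELECT month, SUM(amount) AS total FROM sales GROUP BY month) sm
    JOIN targets t ON sm.month = t.month
""").fetchone()
print(correct)  # (250.0, 500.0)
```

The point is that when two tables live at different grains, the join itself has to be granularity-aware; a drag-and-drop relationship model spares designers from hand-writing that pre-aggregation.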
Organizations might be able to build an entire data analytics environment in Tableau.
The significance of these announcements is that someday organizations might be able to build an entire data analytics environment in Tableau. No need for a database, ETL tool, or design tool. Tableau could be a one-stop shop for all data AND analytics needs. Now, some might think that is far-fetched. But many companies already use Tableau’s key rivals, Qlik and Microsoft Power BI, in this capacity. Typically, these are departmental implementations or deployments at small- or medium-sized companies. (Ironically, I have a client, a $2 billion consumer packaged goods company, that currently runs its entire BI/DW environment on a 1990s BI tool. Full disclosure: we are rebuilding this environment now, and, no, we aren’t using a BI tool to do it.)
Given this trajectory, where will Tableau go from here? How will it expand its “platform” in the next few years? Well, there are two obvious holes that Tableau needs to fill, not because it seeks market hegemony, but because customers will demand it.
The first hole is the need for a data catalog. (See “The Self-Service Triumvirate: The New Data Analyst Workbench” to understand the role of a data catalog for visual analysis.) I met one Tableau customer at the New Orleans event who used Tableau itself to build a catalog of its data sets and reports. I’m sure this customer is the tip of the iceberg. Of course, Tableau already has data catalog partners, such as Collibra, Unifi, Informatica, Alation, and Waterline, and theoretically, every enterprise needs only one data catalog. But you could say the same for data preparation, and that didn’t stop Tableau from adding Tableau Prep. A platform relentlessly consumes its partners’ capabilities. Just ask Microsoft or Salesforce.
A platform relentlessly consumes its partners’ capabilities. Just ask Microsoft or Salesforce.
Hyper API. The second hole is the lack of external access to Tableau’s new high-performance, in-memory database, called Hyper. Many third-party tools, such as Alteryx, can output to Hyper, but no tool except Tableau can access data already in Hyper. Given the scalability of Hyper and Tableau’s new modeling capabilities, Hyper will soon contain valuable data that exists nowhere else in the organization. Naturally, people and applications will want direct access to that data without having to use Tableau’s front-end as an intermediary. Once Tableau becomes an enterprise analytics platform, it will need to publish an API to Hyper for the outside world. Then, Tableau will become both an enterprise data AND analytics platform.
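If Tableau did publish such an API, client applications could query Hyper the way they query any SQL engine. A minimal sketch of that pattern, with Python’s built-in sqlite3 standing in for Hyper (which, as of this writing, has no public read API) and an invented table and schema:

```python
# Stand-in sketch: sqlite3 plays the role of the embedded analytic store,
# since Hyper exposes no public read API today. The "extract" table and
# its columns are invented. The point is the pattern: applications query
# the store directly, with no BI front-end acting as an intermediary.
import sqlite3

store = sqlite3.connect(":memory:")
store.execute("CREATE TABLE extract (region TEXT, sales REAL)")
store.executemany("INSERT INTO extract VALUES (?, ?)",
                  [("East", 120.0), ("West", 80.0), ("East", 30.0)])

# Any downstream app (a scoring job, a finance system, another BI tool)
# could issue SQL like this against the shared store.
rows = store.execute(
    "SELECT region, SUM(sales) FROM extract GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 150.0), ('West', 80.0)]
```

This is exactly the kind of programmatic, front-end-free access that would turn Hyper from a private cache into shared enterprise infrastructure.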
So, in the future, maybe all you really need is Tableau?
What do you think? I’d love to hear!