Community Forum Webinars
February 01, 2023
The data mesh methodology calls for distributing development to data domains. Data and analytics generalists in these units rely on a shared self-service data platform and data governance standards to publish data products for local and enterprise consumption. To keep those domains aligned, enterprise data teams need to provide a self-service platform that abstracts the complexity of the underlying infrastructure, making it possible for citizen developers to build data pipelines and data-driven applications without deep technical knowledge or writing code.
Data architecture as a service (DAaaS) is a new concept based on traditional best practices. DAaaS embeds enterprise standards and know-how into point-and-click development tools geared to domain developers, giving them the best of both worlds: the freedom and agility to build their own data-driven solutions while adhering to enterprise standards and best practices. As a result, data domains build solutions quickly and efficiently without creating data silos that undermine enterprise data consistency. The webinar will examine the convergence of data mesh and DAaaS and demonstrate DAaaS-enabled tools in action.
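To make the idea concrete, here is a minimal sketch, assuming a hypothetical DAaaS-style template rather than any specific product, of how a platform team might bake enterprise defaults (naming conventions, quality checks, PII masking) into a pipeline definition so that domain developers only supply the domain-specific pieces. All class and field names are illustrative.

```python
# Hypothetical sketch of a DAaaS-style pipeline template: the platform team
# defines enterprise defaults once; domain developers only declare what is
# specific to their data product. Names and fields are illustrative.
from dataclasses import dataclass, field


@dataclass
class EnterpriseDefaults:
    """Standards baked into the platform, not chosen by each domain."""
    naming_prefix: str = "dp_"                                    # enterprise naming convention
    quality_checks: tuple = ("not_null_keys", "row_count_drift")  # default quality checks
    mask_pii: bool = True                                         # PII masking on by default


@dataclass
class DataProduct:
    """What a domain developer actually fills in."""
    domain: str
    name: str
    source_table: str
    defaults: EnterpriseDefaults = field(default_factory=EnterpriseDefaults)

    def deploy_plan(self) -> dict:
        # The platform merges domain inputs with enterprise standards,
        # so every published product carries the same governance controls.
        return {
            "target": f"{self.defaults.naming_prefix}{self.domain}_{self.name}",
            "source": self.source_table,
            "quality_checks": list(self.defaults.quality_checks),
            "pii_masking": self.defaults.mask_pii,
        }


if __name__ == "__main__":
    product = DataProduct(domain="sales", name="orders", source_table="raw.orders")
    print(product.deploy_plan())
```

The point of the pattern is that governance travels with the template: a domain team never opts into standards, it would have to go out of its way to opt out.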
You Will Learn:
• How DAaaS helps fulfill the promise of data mesh
• Best practices to create a self-service data mesh platform that abstracts complexity
• How DAaaS embeds standard rules and best practices into self-service tools
• Why data mesh requires data architecture as a service (DAaaS)
Sponsor: Coalesce
Community Forum Webinars
December 15, 2022
Data sharing is a hallmark of modern, data-driven organizations. Companies need data from both internal and external sources to fuel data science projects, manage real-time supply chains, and enrich customer data, among other things. But finding, consuming, and sharing high-quality data and integrating it with analytical systems is challenging. Fortunately, new technology makes this process far easier. It promises to simplify enriching existing data with third-party data and to transform traditional data consumers into data suppliers who monetize their aggregated, anonymized data assets.
This 3-hour virtual event explores new data sharing and monetization models and the underlying technologies that enable them. It is geared to business and data leaders who want to establish internal or external data marketplaces or start monetizing their data assets on third-party exchanges. The free event consists of a keynote by Eckerson consultants, a panel discussion among top data exchange providers, breakout sessions, and vendor rooms where attendees can meet with sponsors, view product videos, and access related content.
Panelists: Wayne Eckerson, Kevin Petrie, Eric Wrobel, Ludovic Codeluppi, Ian Gilbert, Ian Stahl, Didier Navez
Event Sponsors: Dawex - Data Exchange Technology, Informatica, Revelate, Narrative I/O, Crux
Media Sponsors: Solutions Review, BARC, RTInsights
Community Forum Webinars
November 29, 2022
Data marketplaces form the centerpiece of the new data economy. They let companies buy, sell, or exchange data products in a frictionless manner. Data suppliers use marketplaces to package, publish, and distribute data products, while data buyers use them to search, browse, acquire, and ingest those products.
There are many types of data marketplaces. Some are public, open to any buyer and seller. Others center on a single sizable data supplier or data buyer. Still others are internal, fostering the exchange of data products among disparate business units. Modern data marketplaces provide numerous value-added services that go well beyond static product lists and file transfer links. They actively vet and recommend data products; they check data quality and schema changes; they enrich customer data sets with multi-vendor data; and they host collaborative workspaces for buyers and sellers to validate, filter, and augment data products.
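As a simple illustration of one such value-added service, the sketch below, which is hypothetical and not tied to any marketplace product, compares the schema of a new data product version against the previous one and flags added, removed, or retyped columns. The column names and types are illustrative.

```python
# Hypothetical sketch of a marketplace-style schema-change check: compare the
# column definitions of a new data product version against the prior version
# and report differences a buyer would want to know about.
def diff_schemas(previous: dict, current: dict) -> dict:
    """Schemas are {column_name: data_type} mappings."""
    added = {c: t for c, t in current.items() if c not in previous}
    removed = {c: t for c, t in previous.items() if c not in current}
    retyped = {
        c: (previous[c], current[c])
        for c in previous.keys() & current.keys()
        if previous[c] != current[c]
    }
    return {"added": added, "removed": removed, "retyped": retyped}


if __name__ == "__main__":
    v1 = {"customer_id": "string", "spend": "float", "region": "string"}
    v2 = {"customer_id": "string", "spend": "decimal", "country": "string"}
    print(diff_schemas(v1, v2))
    # {'added': {'country': 'string'}, 'removed': {'region': 'string'},
    #  'retyped': {'spend': ('float', 'decimal')}}
```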
Sponsor: Revelate
Community Forum Webinars
November 02, 2022
Data catalogs stand at the nexus of exploding data supply and demand. Multiplying users, applications, and devices demand multi-structured data from a growing supply of sources, including databases, data warehouses, and data lakes. Enterprises that catalog their data assets effectively can help reconcile this supply and demand while governing consumption. Those that don’t will become less productive and increase compliance risk.
This webinar will explore the rising role of the data catalog in data governance programs. Kevin Petrie, VP of Research at Eckerson Group, and Lewis Wynne-Jones, VP of Product at ThinkDataWorks, will define why and how leading enterprises implement catalogs as part of their governance programs.
Community Forum Webinars
October 05, 2022
Data Access Management (DAM) is the process of defining and enforcing policies that control access to application data throughout the enterprise. Until recently, DAM required a trade-off between data access and data protection, but that zero-sum game does not work in today's complex global data landscape.
The modern approach to DAM provides both greater data access and better data protection through centrally managed and universally enforced data access policies. Data access management solutions dynamically evaluate every data request against applicable access policies at runtime to determine what data the requester can see. They enable data governance teams to centrally manage data access policies and equip data system administrators with no-code/low-code tools to configure automated enforcement.
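As a rough illustration of the runtime evaluation described above, the sketch below, a hypothetical example rather than any vendor's implementation, checks a data request against a small set of access policies and returns only the columns the requester may see, masking sensitive ones. The roles, columns, and policy format are all assumptions for illustration.

```python
# Hypothetical sketch of runtime data-access evaluation: each request is
# checked against centrally defined policies that say which roles may see
# which columns, and whether a column is masked rather than shown.
POLICIES = [
    {"column": "email",       "allowed_roles": {"support"},                  "mask_for_others": True},
    {"column": "salary",      "allowed_roles": {"hr"},                       "mask_for_others": False},
    {"column": "customer_id", "allowed_roles": {"support", "hr", "analyst"}, "mask_for_others": False},
]


def evaluate_request(role: str, requested_columns: list, row: dict) -> dict:
    """Return the row restricted to what this role is allowed to see."""
    policy_by_column = {p["column"]: p for p in POLICIES}
    visible = {}
    for column in requested_columns:
        policy = policy_by_column.get(column)
        if policy is None:
            continue  # no policy: the column is not exposed at all in this sketch
        if role in policy["allowed_roles"]:
            visible[column] = row[column]
        elif policy["mask_for_others"]:
            visible[column] = "***"       # masked rather than denied
        # otherwise the column is silently dropped from the result
    return visible


if __name__ == "__main__":
    row = {"customer_id": "C-1001", "email": "ana@example.com", "salary": 98000}
    print(evaluate_request("analyst", ["customer_id", "email", "salary"], row))
    # {'customer_id': 'C-1001', 'email': '***'}
```

In a production DAM solution the policies would live in a central store and be enforced at query time by the data platform itself; the sketch only shows the evaluation step.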
You will learn:
• The main pillars of Data Access Management
• The evolution of DAM
• How DAM solutions work
• Strategies to improve implementation
Sponsor: InfoVia