Maximizing Your Current Data Architecture

Sometimes the next big thing holds so much promise that everyone embraces it, only to later realize that promise is difficult to fulfill. Take data lakes, for instance. Intended to solve the shortcomings of the data warehouse, they did not turn out to be the enterprise panacea that many had once hoped. Sure, you have more data in one place, but finding, using and creating value from it remains a challenge. 

With surging demand for data from every part of the business, enterprises are seeking new ways to quickly turn that data into business value. But that doesn’t mean you have to forfeit the technology you’ve invested in. There’s a simpler way to harness real value. But before we look ahead, let’s take a step back.

How Did We Get Here?

Modern data architecture emerged in the 1980s when the data warehouse became the de facto information repository for companies. It could store structured data in one place for business users’ needs. Data was organized and available to extract for business intelligence (BI), primarily reporting and basic data analysis. 

But as the proliferation of applications and devices caused the volume and types of enterprise data to grow, the limitations of the data warehouse became evident. Data warehouses were well suited only for structured data and required significant effort to maintain a consistent schema that would make that data usable. Plus, they demanded specialized skills, limiting the ability to extract value to a select few people.

Enter data lakes in 2010. Envisioned as a solution to right the wrongs of the data warehouse, the data lake was expected to be a more agile, flexible data architecture. The data lake would collect and store massive amounts of unstructured, structured and even streaming data and make it available for advanced analytics, machine learning or other uses. Because it didn't rely on a common schema, more data became accessible to more business users, who were promised the ability to use their own preferred tools to discover and analyze this vast set of data.

Though great in theory, data lakes and warehouses shared a fatal flaw: focusing on aggregating data and centralizing its control, rather than making it accessible and usable for data consumers. The dream to manage, mine and monetize the data lake simply did not match its reality. 

“The main challenge is not creating a data lake, but taking advantage of the opportunities it presents.” — Brian Stein, PwC

At a Data Fork in the Road

The proliferation of the data lake architecture has led to a whole new set of technologies and tools, such as data prep, data integration and data catalogs, that aim to help organizations get to value faster. And despite all of the options and investments, we are seeing a wave of new architectures emerge to succeed the data lake, including the data lakehouse (aka hybrid data lake-warehouse) and data hub. 

If your business needs aren't being met by your current data lake architecture, you may be tempted to move on. But be aware that each of these new models represents an incremental iteration of the same core paradigm, one based on aggregating data and centralizing control, and each requires a heavy investment of time and resources to get there.

What if Technology Isn’t the Problem?

Before pivoting away from your current data lake architecture and technology stack, let’s consider a new approach to getting value from data. Let’s not start with aggregating data but instead start with democratizing it — with empowering data consumers to find, understand and consume it quickly and easily.

The key to unlocking the full value of your data sits not with your technology, but with your strategy. A value-oriented strategy adopts the following principles:

  • Curate data products: In a data lake, data is merely…data. Instead, shift focus to deliver high-quality data products that curate data from your data lake, data warehouse or any other source and package it to be easily found, understood and consumed. 
  • Embrace self service: Allow anyone in your organization to get the data they need — no advanced degrees needed. Data products help data consumers to search, preview, analyze or model, and export data without queues, support requests or IT go-betweens.
  • Decentralize data ownership: Empower domain experts as data product managers responsible for the quality, performance and change management of data products. 
  • Facilitate collaboration: Eliminate the middleman in your data value chain. Instead, bring together data domain experts (who know the data best) and business stakeholders (who know use cases best) for close collaboration that builds trust and fast-tracks outcomes.
  • Double down on value creation: Instead of a constant stream of “one-off” data pulls, enable data product managers to build and share new, blended or refined data products that scale value across the organization and accelerate business outcomes.
  • Automate processes and data pipelines: Maximize efficiency with built-in automation to create, publish and deliver data products, while simultaneously ensuring consumers always have the most current data. 
  • Expand access while maintaining control: Ensure policies, governance, compliance and security remain intact, even while democratizing data access.  

Embrace, Don’t Replace

Wouldn’t it be nice to provide more value to the business while also taking advantage of the years and millions invested in your current data technologies? An enterprise data exchange can help you do that. It expands your data toolkit with a set of capabilities focused on empowering the data consumer and accelerating value. Just as importantly, it operates as an extension of your current data technology stack, extending its value without requiring an architecture overhaul.

Maybe the next big thing is…improving upon what you already have. With an enterprise data exchange, you can deliver repeatable, scalable data-driven outcomes from your data lake (and/or warehouse) without sacrificing the financial investment and resources you’ve already put in. Find out how.

Source: Brian Stein and Alan Morrison, “The enterprise data lake: Better integration and deeper analytics,” PwC, 2014.