Logical Data Warehousing and the Supply Chain

Logical data warehousing enables organizations to extract the maximum value from their enterprise data, and it may be the missing link to helping supply-chain and manufacturing companies generate the insights needed to drive data-driven decision making.


With the pandemic now fully in our rearview mirror, data continues to grow in volume and complexity even faster than it did before COVID. As a result, Chief Procurement Officers (CPOs) and other supply-chain professionals are finding that they need effective, end-to-end data management capabilities that can provide seamless, timely views of data across many different types of sources. In fact, a Deloitte survey from 2018 found that more than 3 out of 5 CPOs (65%) had limited or no visibility beyond tier-1 suppliers. A year later, Deloitte found that CPOs were citing data quality as a major impediment to procurement.

As organizations continue to grapple with data management issues within their supply chains and seek out the right solution for their enterprise, a strong alternative has been percolating in the background for years: the logical data warehouse (LDW), an agile foundation for transforming and delivering data. Gartner’s Mark Beyer first proposed the term as early as 2008, framing it as the next evolution of the data warehouse, one that “focuses on the logic of information and not the mechanics.” Since that time, logical data warehouses have been successfully used by thousands of companies, and they have grown in sophistication and reliability. Unlike traditional data warehouses, logical data warehouses enable real-time data views across multiple disparate systems, including cloud-based repositories and streaming data sources.

Logical Data Warehousing and Today’s Supply Chain

Recently, Gartner outlined how to leverage logical data warehousing for the supply chain. The firm mapped out an architecture in which the logical data warehouse plays a central role alongside the other key components of a data infrastructure: operational data stores, data warehouses, data marts, and data lakes. Together, these components address the specific needs of different users, such as business analysts, data engineers, and data scientists, who engage in a range of analytics spanning operational intelligence, business reporting and intelligence, advanced analytics, and data science.

Gartner recommends that supply chain leaders structure their approach to analytics and intelligence around Gartner’s Data and Analytics Infrastructure Model (DAIM), a four-quadrant model that covers the majority of use cases for data and analytics, according to the following two dimensions:

  1. Data, Known vs. Unknown: “Known” data is structured and has clear business value, whereas “unknown” data is unstructured and has not yet demonstrated its business value.
  2. Business Questions, Known vs. Unknown: Known questions are those routinely asked in regular reporting activities, whereas unknown questions are those that arise in the moment, in response to changing business or market conditions.

Gartner’s report shows how the five key components of a data infrastructure (logical data warehouses, operational intelligence components, data warehouses, data lakes and data science components) map onto the DAIM, as well as which roles and skills tend to apply in each case.

Two observations are clear. First, aside from the logical data warehouse, the four other infrastructure components, with their associated roles and skills, map to the four categories of DAIM data and analytics use cases roughly as follows:

  1. Foundational, Core Use Cases (i.e., when the data and business questions are both known): Primarily performed by casual users and analysts leveraging operational intelligence components and data warehouses.
  2. Expanding, Understanding, and Investigating Use Cases (i.e., when the data is known but the business questions are unknown): Primarily performed by analysts and data scientists leveraging data warehouses and data lakes.
  3. Innovation and Exploration Use Cases (i.e., when the data and business questions are both unknown): Primarily performed by data engineers and data scientists leveraging data lakes and data science components.
  4. Establishing-Value Use Cases (i.e., when the data is unknown but the business questions are known): Primarily performed by data engineers leveraging data science components.
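For readers who think in code, the DAIM mapping above can be expressed as a simple lookup keyed on the two dimensions. This is an illustrative sketch of the taxonomy as described, not a Gartner artifact:

```python
# Illustrative sketch: the four DAIM quadrants, keyed on whether the
# data and the business questions are known, with the roles and
# components each quadrant primarily involves (per the list above).
DAIM = {
    (True, True): {
        "category": "Foundational, core use cases",
        "roles": ["casual users", "analysts"],
        "components": ["operational intelligence components", "data warehouses"],
    },
    (True, False): {
        "category": "Expanding, understanding, and investigating use cases",
        "roles": ["analysts", "data scientists"],
        "components": ["data warehouses", "data lakes"],
    },
    (False, False): {
        "category": "Innovation and exploration use cases",
        "roles": ["data engineers", "data scientists"],
        "components": ["data lakes", "data science components"],
    },
    (False, True): {
        "category": "Establishing-value use cases",
        "roles": ["data engineers"],
        "components": ["data science components"],
    },
}

def classify(data_known: bool, question_known: bool) -> dict:
    """Return the DAIM quadrant for a given combination of dimensions."""
    return DAIM[(data_known, question_known)]
```

For example, `classify(True, False)` returns the "expanding" quadrant, pointing an analyst toward data warehouses and data lakes.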

Second, the logical data warehouse embraces all four categories in the DAIM. This is because it can be implemented on top of a company’s existing infrastructure, including operational intelligence components, data warehouses, data lakes, and data science components, enabling seamless, real-time access to all of the different types of data stored across those components.

Supply Chain Data Management Powered by Data Virtualization

Data virtualization makes modern data management possible because its logical approach to data integration and management provides real-time views across disparate data sources without first physically replicating the data into a consolidated repository. Data virtualization acts as an abstraction and semantic layer on top of all of the underlying data sources that make up the logical data warehouse, including on-premises and cloud sources, structured and unstructured sources, static and streaming sources, and legacy and modern sources.
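The core idea, combining disparate sources at query time rather than copying data into one repository, can be sketched in a few lines. This is a toy illustration, not a real virtualization engine: an in-memory SQLite table stands in for a warehouse, and a plain Python list stands in for a streaming feed; all names are invented for the example.

```python
import sqlite3

# Stand-in "warehouse": an in-memory SQLite table of historical orders.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders (part TEXT, qty INTEGER)")
warehouse.executemany("INSERT INTO orders VALUES (?, ?)",
                      [("bolt", 100), ("gear", 40)])

# Stand-in "streaming source": events not yet loaded into any warehouse.
stream_feed = [{"part": "bolt", "qty": 25}]

def logical_view(part: str) -> int:
    """Answer a query by combining both sources at request time,
    without replicating either source's data into the other."""
    (stored,) = warehouse.execute(
        "SELECT COALESCE(SUM(qty), 0) FROM orders WHERE part = ?", (part,)
    ).fetchone()
    streamed = sum(e["qty"] for e in stream_feed if e["part"] == part)
    return stored + streamed

print(logical_view("bolt"))  # 125: warehouse rows plus streamed events
```

A production data virtualization layer does this across many heterogeneous systems with query optimization, caching, and a shared semantic model, but the principle is the same: the consumer sees one logical view, and the data stays where it lives.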

Many companies are leveraging data virtualization to establish logical data warehouses that solve their supply chain problems. Hastings Deering Pty Ltd (a Sime Darby Industrial company) is one such example. As of 2022, Hastings Deering was two years into a five-year digital transformation, which encompassed robotic automation, digitization of paper-based forms, digital application development, and data and analytics, to support employee and customer experience improvements. Leveraging data virtualization, Hastings Deering has fast-tracked its analytical capabilities, expanded its self-service analytics capabilities, and streamlined data delivery for its Parts stream. Challenges with respect to data continued to surface with ongoing changes in economic and market conditions; nevertheless, Hastings Deering took the first step in getting the basics right by adopting a logical data warehouse and the right methodologies. The logical data warehouse provided the architecture Hastings Deering needed to deliver reporting, intelligence, data sharing, and digital programs through its new data marketplace, as well as the platform needed to establish ongoing data literacy and data governance programs.

Next-Generation Supply-Chain Analytics

Logical data warehouses, supported by data virtualization, are critical to the supply chain because they enable organizations to gain real-time visibility and powerful analytical capabilities across a myriad of use cases. As the above example showed, through the company’s new technology and methodology initiatives, Hastings Deering has been able to accelerate the acquisition of important data sets, create more value, and further enhance forecasting capabilities to meet business needs. As Gartner suggests, logical data warehousing enables organizations to extract the maximum value from their enterprise data, and it may be the missing link to helping supply-chain and manufacturing companies generate the insights needed to drive data-driven decision making.
