3 Ways to Make Your Supply Chain Data More Actionable

Your data is only as good as the outcomes it produces. If it’s stored in silos and processed in bootstrapped legacy systems, it can’t be operationalized. A more responsive supply chain depends on more responsive data. Here's how to make that happen.


*This content is brought to you in partnership with Dun & Bradstreet (D&B)*

As a supply chain executive, you don’t get to where you are by making decisions based solely on instinct, intuition or even experience. While those elements come into play (and certainly more as you become a veteran in this field, as I have), they aren’t the primary determinant of success. That title is held by data and data alone. Every initiative I’ve witnessed in my 30 years of supply chain consulting for major organizations has relied on data. Where data-backed decision-making goes, profit grows.  

If you’re looking to take a defensive position against recessionary headwinds, moving beyond data collection and maintenance is necessary. Making data actionable is the goal. I’ve built data warehouses and business intelligence systems for senior executives at a major home goods store, a popular clothing chain and other retail and consumer product powerhouses.  

In every case, they faced a variation of the same challenge: a complicated supply chain and a need to report data efficiently enough to be effective. And in every case, the pain points were similar. Today, as distinguished solution architect at the international data and research firm Dun & Bradstreet (also known as D&B), I’m no longer the owner of those pain points but the in-house expert helping companies resolve them. As a consultant, I can see the patterns that emerge and quickly identify and resolve operational issues. Here’s what I’ve learned.  

Before You Get Started

You can’t maximize data if you don’t first conduct a value chain analysis and answer: Which suppliers are the critical link? A lot of people underestimate their risks upstream. For instance, the manufacturer of a washing machine may not look into the suppliers of a specific screw manufacturer, but if that screw doesn’t exist, they can’t produce the washing machine. Taking the analysis to that next tier is something most companies don’t do but should.  
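The next-tier analysis described above amounts to walking a multi-level bill of materials and flagging any component with only one known supplier. Here is a minimal sketch of that idea; every product, component, and supplier name is hypothetical, and a real analysis would draw this dependency data from your ERP or a data provider rather than a hand-built dictionary.

```python
# Hypothetical sketch: walk a multi-tier bill of materials (BOM) to find
# components that depend on a single supplier anywhere in the chain.
from collections import deque

# Each component maps to its direct sub-components and its known suppliers.
bom = {
    "washing_machine": {"parts": ["drum", "motor", "screw_a4"], "suppliers": ["OEM Co"]},
    "drum":            {"parts": [], "suppliers": ["Drumworks", "SteelForm"]},
    "motor":           {"parts": ["screw_a4"], "suppliers": ["MotorMax"]},
    "screw_a4":        {"parts": [], "suppliers": ["TinyScrew Ltd"]},  # single source
}

def single_source_risks(root):
    """Return every component under `root` with only one known supplier."""
    risks, seen, queue = [], set(), deque([root])
    while queue:
        part = queue.popleft()
        if part in seen:
            continue                      # a part can appear in several assemblies
        seen.add(part)
        node = bom[part]
        if part != root and len(node["suppliers"]) == 1:
            risks.append(part)
        queue.extend(node["parts"])
    return risks

print(single_source_risks("washing_machine"))  # → ['motor', 'screw_a4']
```

Even this toy traversal surfaces the article’s washing-machine example: the screw two tiers down is a single point of failure that a top-level supplier review would never see.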

Pain Point #1: Usability

Solution: Focus your attention on the most useful, timely data to meet your goals. 

Once we decide which suppliers are critical, we need to decide which of the data they provide is useful. To be successful, I look not at what data is available to me but at the objective: What are we trying to optimize? Too often, supply chain managers are trying to optimize against shelf space. But shouldn’t it be about the availability of products to the customer? One-third of the people you’re trying to sell to are actively looking for the product you’re offering, and if that product isn’t available, they can’t make the purchase. Think outside the shelf. Present them with more options and you have more opportunities to make the sale. 

The pandemic opened eyes to upstream supply chain risks that affect product availability. But I don’t know that many companies incorporated that data into their ongoing analysis. Shortages persist because of that lack of visibility. And that lack of visibility is a real concern today on another front, too. 

The supply chain has been changing rapidly since the pandemic. One of the most significant changes is the increasing frequency of mergers and acquisitions (M&A). When two companies merge or one acquires another, they suddenly become associated with each other's suppliers, partners, and customers. This can be a challenge, especially if the two companies have different values or standards. 

I experienced this firsthand when I worked at a large retail consumer packaged goods (CPG) company. We only did business with companies that used ethical sourcing in the creation of their products. Every quarter, we required our suppliers to complete attestation forms and undergo inspections to verify that they continued to support our requirement. 

When we acquired a new company, it was important to ensure that our data and theirs aligned. We had to make sure that both companies were using the same standards and that we were both committed to ethical sourcing. This became even more critical as we acquired more companies and purchased from more vendors.

Pain Point #2: Accessibility 

Solution: Determine how much of your data you’re willing to share and bring all the players to the table to open up accessibility.  

Data sharing doesn’t just happen in M&A. There was a big movement in the early 2010s, after James Dixon coined the term “data lake,” to move from data warehouses to data lakes. This was the start of data democratization. Companies built data lakes and published data to groups that could access it in near real time.  

To make data more useful, we need to remove the gatekeepers while still considering data privacy. So which data can you publish? And how do you protect the data that shouldn’t be distributed widely? That’s a challenge, especially for larger companies trying to build an enterprise data lake, and one you solve from your own perspective. For instance, I worked at an American insurance company that regarded its data as a corporate asset to be shared widely throughout the organization for proper analysis. Other business units in the same company, namely our European organization, were different: they couldn’t share some aspects of their information because of General Data Protection Regulation (GDPR) requirements. 

It's normal for regions to have different rules that lead to a hierarchy of access and a discussion around governance. Organizations must decide what’s a business decision versus a legal decision and manage the interplay between those two forces. Nowadays, you must involve legal upfront to understand the many overlapping data privacy rules. In the same conversation, you need your data security team to understand whether there are additional parameters that you must put in place and if you're dealing with legacy data that may not have been addressed before. And so now you have data security and legal, neither of whom may be aware of the data risks the other is dealing with.  

Today, data democratization is tempered. You don’t necessarily get rid of the gatekeeper; you change who the gatekeeper is. That causes conflict. You need to broaden your discussions and bring more stakeholders to the table so those discussions trickle down through the organization to every business unit and every department. Getting deeper penetration into organizations is going to drive more adoption of your data. The more people you involve, the longer it takes, but it does enrich the process—and subsequently the data analysis.  

Pain Point #3: Reliability  

Solution: Look for ways to enhance your current data and glean insights with a worldview that is both reliable and timely. 

I personally think that current, accurate data is best for tactical decisions, and historical data of the same granularity is best for strategic analysis. What you consider reliable entirely depends on the objectives you set in your analysis criteria.  

The data we provide at D&B is aggregated from many disparate sources around the globe. If you’re connected through our application programming interfaces (APIs), you get current, comprehensive information delivered straight into your database. For instance, D&B data includes monthly and quarterly reported financials factored into the Supplier Evaluation Risk Rating, a score of how likely it is that a supplier will seek relief from creditors or cease operations in the next 12 months. That’s data you want now. 
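Once risk scores like these land in your database, acting on them can be as simple as a threshold check that routes the worst scores to a sourcing review. The sketch below illustrates that step only; the field names, sample scores, and threshold are hypothetical and are not D&B’s actual schema or rating scale.

```python
# Illustrative sketch: flag suppliers whose risk score crosses a threshold,
# as you might after an API feed loads ratings into your own database.
# Field names, scores, and the threshold are hypothetical examples.

suppliers = [
    {"name": "Acme Fasteners", "risk_score": 3},
    {"name": "Globex Metals",  "risk_score": 8},
    {"name": "Initech Motors", "risk_score": 6},
]

def flag_high_risk(records, threshold=7):
    """Return suppliers at or above the threshold, worst first,
    so they can be routed for a sourcing review."""
    flagged = [r for r in records if r["risk_score"] >= threshold]
    return sorted(flagged, key=lambda r: r["risk_score"], reverse=True)

for s in flag_high_risk(suppliers):
    print(f"{s['name']}: score {s['risk_score']} - review sourcing options")
```

The point is less the code than the cadence: because the underlying financials refresh monthly and quarterly, a recurring job like this turns a static risk report into an ongoing early-warning signal.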

The reason I came to work at D&B is that it is one of the oldest information and research organizations in the world, and we have a reliable process for looking at company data that is more holistic than other sources. We’re not a static data organization; we’re constantly looking for ways to improve the information we provide to our customers. We build metrics based on what we see in the data and deliver proactive intelligence. For example, during the pandemic, D&B provided the COVID-19 Impact Index, which I, as a customer, could overlay on my own data to see risks I might have missed. That level of insight made me more effective at assessing our risk exposure.  

Data reliability increases when you have a significant cross-section of companies worldwide contributing to the measurements of trends. This helps identify macroeconomic and microeconomic changes in regions of the world and specific industries. It’s exciting to be on this side of the fence now helping to report and analyze that data today, because this is the information that will help companies mitigate the risks of a downturn. As we potentially approach a recession with socioeconomic, geopolitical, and environmental risks, we have never seen a greater need for more reliable data. 

What’s Next? 

The naysayers in your organization will point out how hard it is to answer questions around what’s useful and what’s timely when it comes to data. They’ll say the subjectiveness and ambiguity make it impossible to know. These are the people who may never start a project. I like to focus on the positives and start to gradually incorporate the metrics that are useful, accessible, and reliable. That's one of the things I’ve learned: Start somewhere. These naysayers will become the raving fans of your data project once they see it work. You've taken them from negative to positive, and now they can’t live without it. That’s how you define actionable data – when everyone is better for having it. 

To learn more about Dun & Bradstreet’s data solutions for supply chain leaders, visit https://www.dnb.com/supply.

Lynn Overall is a Distinguished Architect at Dun & Bradstreet. He has extensive experience working for major organizations like Home Depot, Aeropostale, Freddie Mac and McKesson to solve their data challenges. With degrees from the University of Louisville, George Mason University and Heriot-Watt University, he is also a lifelong learner certified in all major databases and platforms—with more to come.