The network is responsible for validating the transmission of data at two levels: the structure of the transaction and the content of the transaction. Structural validation deals with how well the message adheres to an agreed-upon standard or format; for example, an EDI 856 advance shipment notification must contain all required fields and be a properly structured X12 VICS 856 ASN. The system should also have the capability to collect non-structured data, such as forecasts from the sales team or operational capacities from manufacturing and distribution.
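A structural check like the one described can be sketched in a few lines. The segment list and separators below are a simplified illustration, not the full VICS 856 hierarchy, which a production validator would enforce in detail:

```python
# Minimal sketch of structural validation for an X12 856 ASN.
# REQUIRED_SEGMENTS is a simplified subset chosen for illustration.
REQUIRED_SEGMENTS = ["ST", "BSN", "HL", "SE"]

def validate_structure(raw_edi: str, element_sep: str = "*",
                       segment_term: str = "~") -> list[str]:
    """Return a list of structural errors (empty list = structurally valid)."""
    segments = [s for s in raw_edi.strip().split(segment_term) if s]
    seen = {seg.split(element_sep)[0] for seg in segments}
    return [f"missing required segment: {tag}"
            for tag in REQUIRED_SEGMENTS if tag not in seen]

sample = "ST*856*0001~BSN*00*SHIP123*20240101*1200~HL*1**S~SE*4*0001~"
print(validate_structure(sample))  # → []
```

A message that fails this check never reaches the content-validation stage.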
Validating transaction content is more complicated and deals with its business value.
This is the most critical step in the network, because it is here that data quality rules are enforced in a central location before data moves into business applications. Data quality rules are applied to:
* Validate codes
* Enforce required fields (required for business content, not the standard being used)
* Normalize time zones and currencies
* Validate transaction dependencies and event sequences.
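The four rule types above can be sketched as a single content-validation pass. The code table, field names, and open-PO dependency check are illustrative assumptions, not a specific product's rule set:

```python
from datetime import datetime, timezone

VALID_UOM_CODES = {"EA", "CA", "PL"}  # illustrative reference code table

def apply_content_rules(txn: dict, open_pos: set) -> list[str]:
    """Apply central data quality rules to one transaction.
    Returns a list of violations (empty = content is valid)."""
    errors = []
    # Validate codes against a reference table
    if txn.get("uom") not in VALID_UOM_CODES:
        errors.append(f"invalid unit-of-measure code: {txn.get('uom')}")
    # Enforce fields required for business content, not by the EDI standard
    for field in ("po_number", "ship_date"):
        if not txn.get(field):
            errors.append(f"missing required field: {field}")
    # Normalize timestamps to a single time zone (UTC)
    if isinstance(txn.get("ship_date"), datetime):
        txn["ship_date"] = txn["ship_date"].astimezone(timezone.utc)
    # Validate transaction dependencies (e.g., an ASN must cite an open PO)
    if txn.get("po_number") and txn["po_number"] not in open_pos:
        errors.append(f"no open PO found: {txn['po_number']}")
    return errors
```

Running every transaction through one function like this is what keeps the rules central rather than scattered across downstream applications.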
Moreover, any rules engine that validates content must also be able to alert a human who can investigate data integrity questions. If a transaction is critical to downstream systems, the workflow needs to drive a path-to-resolution process that alerts someone to correct the data errors and reprocess the transaction. Data owners and data stewards must be identified and made responsible for monitoring and executing the processes that ensure data quality.
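A minimal sketch of that path-to-resolution routing might look as follows; the steward registry and task fields are hypothetical placeholders:

```python
# Hypothetical routing table: data stewards registered per data domain.
STEWARDS = {"item_master": "alice@example.com", "orders": "bob@example.com"}

def route_for_resolution(txn_id: str, domain: str, errors: list[str]) -> dict:
    """Create a resolution task so a human can investigate,
    correct the data errors, and reprocess the transaction."""
    owner = STEWARDS.get(domain, "data-quality-team@example.com")
    return {"txn": txn_id, "assigned_to": owner,
            "errors": errors, "status": "open"}
```

The task would then be tracked until the transaction is corrected and successfully reprocessed.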
It is also recommended that key performance indicators and scorecards be developed for all trading partners, so the quality of the information they send can be measured and tracked. This has proven to be one of the easiest ways to improve data quality and should be part of any partner compliance program.
For example, a large retailer was able to drive its data quality metrics with trading partners to 98 percent. It did so by clearly defining standard operating procedures for suppliers, using scorecards to measure effectiveness and applying incentives for data quality. Incentives included penalties for non-compliance, but more importantly, also featured rewards like early payment for meeting data quality targets. The program was put in place by a joint effort from Sourcing, Logistics, IT and Inventory Management.
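A data quality metric like the retailer's 98 percent can be computed directly from validation outcomes. This is a minimal sketch, assuming each transaction is reduced to a pass/fail result per trading partner:

```python
from collections import defaultdict

def scorecard(results) -> dict:
    """results: iterable of (partner, passed) validation outcomes.
    Returns each partner's data-quality rate as a percentage."""
    totals, passed = defaultdict(int), defaultdict(int)
    for partner, ok in results:
        totals[partner] += 1
        passed[partner] += ok
    return {p: round(100 * passed[p] / totals[p], 1) for p in totals}

print(scorecard([("Acme", True), ("Acme", True), ("Acme", False)]))
# → {'Acme': 66.7}
```

Tracking this rate per partner over time is what makes incentives and penalties enforceable.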
The event manager is the next part of the application architecture. It is designed to increase confidence in supply chain planning predictions by continuously monitoring certain planned events, and eventually comparing the plan to the actual event. The event manager captures all event messages as they occur and processes them against a planned set of events. It performs several important functions, including:
* Event management
* Path to resolution
* Alert management
* Collaboration and integration.
Key supply chain events are typically planned (either manually or with a decision support system). With increased visibility, these events can be monitored closely through updated event messages, which make use of the timeliest information. An event message can represent a revised prediction of a planned event, or the actual execution of an event. By continually capturing event messages, the event manager allows supply chain partners and key stakeholders to gain visibility into an event's actual status. This increases their ability to manage the pipeline effectively, because decisions are made on timely information, not just the plan. Visibility not only exposes an issue, but pinpoints its cause, enabling a more focused path to resolution.
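The plan-versus-actual comparison can be sketched as follows; the message fields (`sent`, `date`) are illustrative assumptions:

```python
from datetime import date

def event_status(planned_date, messages):
    """Overlay event messages (revised predictions or actuals) on a
    planned event. Returns the timeliest known date and whether the
    event is running late against the plan."""
    current = planned_date
    for msg in sorted(messages, key=lambda m: m["sent"]):
        current = msg["date"]  # the latest message carries the timeliest info
    return current, current > planned_date

planned = date(2024, 3, 1)
msgs = [{"sent": date(2024, 2, 10), "date": date(2024, 3, 5)}]
print(event_status(planned, msgs))  # → (datetime.date(2024, 3, 5), True)
```

With no messages received, the plan itself remains the best available information.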
By using event messages, along with defined business rules (e.g., a revised date falls after a planned date), alerts can be generated and sent to appropriate parties, so actionable items can be identified and dealt with quickly. Alerts can appear in a variety of forms, such as an e-mail, text message or simply a color coding on a user interface. For example, a buyer may want to be notified via e-mail when an ASN arrives featuring a stock-keeping unit (SKU) quantity that differs significantly from what was ordered.
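The ASN-quantity example above can be expressed as a simple business rule; the 10 percent tolerance is an illustrative threshold, not a prescribed value:

```python
def check_asn_quantity(ordered: int, shipped: int,
                       tolerance: float = 0.10) -> bool:
    """Flag an ASN line whose shipped quantity deviates from the
    ordered quantity by more than `tolerance` (illustrative threshold)."""
    if ordered == 0:
        return shipped != 0
    return abs(shipped - ordered) / ordered > tolerance

# e.g., notify the buyer when the deviation exceeds the threshold
if check_asn_quantity(ordered=100, shipped=80):
    print("ALERT: ASN quantity deviates significantly from PO")
```

The same rule could just as easily drive an e-mail, a text message, or a color code on a user interface.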
The architecture of the event manager enables integration with internal systems and maps business processes across the organization's external boundaries. The event manager masks external processes from internal processes and systems, easing the transition to a collaborative environment. Complex supply chain processes can be mapped to include design collaboration, request for quote/request for proposal (RFQ/RFP), purchase order management and demand signaling, based on point-of-sale data. By linking these processes, a total supply chain picture can be created and monitored from customer to supplier to service provider and back to customer. Visibility across the total supply chain tightens integration between individual systems by giving each access to the actual status of events on which it depends.