Process Optimization — Fine-tuning the Manufacturing Enterprise

An in-depth look at how one company combined vision with execution to gain insight into and control over its critical-path processes

Twenty-first century technologies have catapulted many global companies and their management teams into the center of an alternate universe: a world where IT reigns supreme and acronyms are meaningful. Why? Because, in this world, new methodologies and standards can make the difference between a relatively smooth-running, flexible infrastructure and a maelstrom of competing, proprietary systems that can only be modified by vendors at an annual cost running into the millions.

Why now? Because across industries, global markets are pushing customer segmentation, competition and pricing wars to the boiling point, straining infrastructures and pressuring IT staff to squeeze more productivity and agility out of their systems. What many management teams don't understand is that their current operational infrastructures may have reached critical mass. Most of these so-called heterogeneous environments consist of patchwork layers of new and legacy applications and systems, often joined by hard-wired, point-to-point integrations designed to perform yesterday's tasks with an ever-changing array of co-conspirators.

Financial services, healthcare and telecommunications all share this operational challenge. Regulatory and competitive pressures have forced some to re-engineer their IT operations to survive. Others, specifically communications companies, have been dragged into the fray due to a market demand for sales and service of bundled products such as video, voice and data. Manufacturing, however, has focused almost exclusively on powerful design and collaboration tools to stay ahead of escalating product complexity and global competition — further complicating the interoperability challenge. While this approach is certainly understandable, and even good business practice, it may be short-sighted.

Concurrent engineering and product data management (PDM) systems have delivered major improvements in information access, collaboration and project management; however, operational silos persist due to the maze of data, systems and processes required to bring complex products to market. These silos present barriers to centralized control and management of human and IT assets and, as a result, to enterprise initiatives critical to improving the organization's ability to respond to economic pressures, market opportunities or infrastructure challenges. So how do you evaluate and "tweak" processes that involve multiple departments and suppliers scattered around the globe and manage thousands of parts and functions? Most companies don't even track or catalogue their most critical processes...except perhaps at departmental or key-function levels.

Until fairly recently, this level of interoperability was just a pipe dream or, if mandated, required expensive and disruptive integration projects that often solved one problem while proliferating the hard-coding that caused the problem in the first place. These methodologies have impeded growth and strained relationships between customers and vendors for years, leading to accusations of "vendor lock-in" and worse. All that is beginning to change.

Advances in XML standards, application programming interface (API) design, Web services, business process management (BPM) tools and service-oriented architecture (SOA) frameworks are converging to create opportunities for complex organizations to gain control over the processes that determine market success or failure. While process optimization requires focus, resources and commitment, it offers original equipment manufacturers (OEMs) a path to managing their business operations according to their business plan, not the reverse. Reaching this point, however, requires both top-down and grass-roots commitment to thinking in terms of "business flows," not just functions. After all, this is not just about vision, it's about survival.

Process Optimization In Practice

One company that knows how to combine vision with execution is well on its way to gaining insight into and control over its critical-path processes, but, like its market success, it isn't happening overnight. For more than 75 years, this Canadian company has led the market in the design and manufacture of state-of-the-art aircraft engines. However, global demand and breakthrough technology have increased the complexity of engine designs and the supply chains needed to build them. To stay ahead of the innovation curve, manufacturer-suppliers must produce more sophisticated engines in less time while seamlessly managing development partners spanning cultures and time zones.

As early as 2002, this organization began exploring new methods and product lifecycle management (PLM) technologies for streamlining engine development, with the goal of drastically reducing time to market. Today, the company is on track to become the first in aerospace industry history to develop engines using digital technology throughout the entire design and manufacturing lifecycle. This program represents one of the major components of the company's digital initiative, the goal of which is to enable all stakeholders to work on engines concurrently in a virtual development environment.

For this industry leader, driving innovation in power, fuel efficiency and cost of maintenance provides a critical competitive advantage, so in order to make major improvements while preserving time to innovate — and to market — critical-path processes had to be identified and evaluated. While the overarching goal of the digital program was to reduce development time and costs, to realize dramatic gains the company needed to target and virtually eliminate "non-quality events" across the entire organization — from design and production engineering, to the field, customer support and service.

A larger vision soon evolved out of the product development initiative: To make this manufacturing organization more agile and efficient, it needed to transform itself into an "e-culture" with a shared commitment to automating and optimizing enterprise processes. However, to achieve this transformation would require far more than a compelling vision and sound strategy. Like most large manufacturers, this is a complex, distributed organization employing thousands of people, producing tens of thousands of engines for customers in almost 200 countries. But it is also evolving into a global development "engine" with a new PLM program that provides leading-edge tools and process management technologies designed to support and extend the company's quality process initiative gradually — and with minimal business disruption.

As with most business transformations, there was one individual in particular who understood the potential of the PLM technologies and knew enough about quality processes across the enterprise to act as both guide and project leader. This unique individual was a "quality guru" with more than 20 years of experience in disciplines ranging from advanced design and analysis, to quality assurance testing, through service center management. She currently heads up the company's E-PM (Enterprise Process Management) quality process management program, a critical component of both the digital engine and e-enterprise improvement initiatives. The digital quality process strategy will leverage ENOVIA V5 E-PM software tools from Dassault Systèmes to design, manage and optimize enterprise business processes impacting quality, maintenance costs and, ultimately, customer satisfaction.

While enterprise implementation of E-PM is central to this initiative, the path that led to this realization began with a much simpler problem. Almost eight years earlier, the quality guru had employed a customized ENOVIA application called Product Manager (PM) to track and report on non-quality events. But tracking alone couldn't explain why they occurred or prevent them from occurring again. Using only the PM tools and "sweat equity," the "quality team" invested three months researching and analyzing the root causes and processes driving engineering change orders (ECOs). Based on the results of this project and her deep knowledge of quality issues and systems, she began to see applications for Product Manager beyond ECR/ECO reporting. She theorized that any event or process that could be captured and analyzed could also be evaluated for improvement and, if appropriate, standardized for continuous quality assurance. It made perfect sense to use the same business rules, flows and methods to manage both products and processes, since the two were inextricably linked.
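
To make that idea concrete, imagine each non-quality event captured as a structured record rather than a note in a spreadsheet: root causes can then be aggregated and ranked to surface the processes most worth standardizing. The short Python sketch below is purely illustrative of that kind of analysis; the field names, categories and figures are hypothetical assumptions, not the actual data model of the company's Product Manager application.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class NonQualityEvent:
    """One captured non-quality event (illustrative fields only)."""
    event_id: str
    process: str        # e.g. "ECO" for an engineering change order
    department: str
    root_cause: str     # assigned after analysis
    rework_days: float  # schedule impact attributed to the event


def rank_improvement_candidates(
    events: list[NonQualityEvent],
) -> list[tuple[tuple[str, str], float]]:
    """Rank (process, root cause) pairs by the total rework time they caused."""
    impact: Counter = Counter()
    for e in events:
        impact[(e.process, e.root_cause)] += e.rework_days
    return impact.most_common()


# Hypothetical example: three logged events point at the same weak spot.
events = [
    NonQualityEvent("E-001", "ECO", "design", "incomplete part data", 4.0),
    NonQualityEvent("E-002", "ECO", "production engineering", "incomplete part data", 2.5),
    NonQualityEvent("E-003", "service", "field support", "wrong replacement part", 1.0),
]
for (process, cause), days in rank_improvement_candidates(events):
    print(f"{process}: {cause} -> {days} days of rework")
```

Even a toy aggregation like this shows why capturing events systematically matters: once the records exist, the same data that reports a problem can also point to the process that should be standardized.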

For example, many development tasks and processes were linear, involving multiple functional silos as the product definition evolved. If problems occurred or errors were uncovered, the change request process and rework could add days onto the product lifecycle. Locating problems early on could deliver significant returns if events and triggers could be tracked and requests for action (RFAs) quickly processed. The only way to find out if the legacy application could do the job was to design a program that supported QA work in progress and could serve as a pilot for an enterprise process optimization initiative. However, the challenge was to convince diverse business units and managers, who spoke different "process languages" and used different systems and methods, that standardizing processes was a good idea. The necessary leap of faith was considerable, and though it took some time to build support and move forward, our process champion prevailed. In her words: "We wanted to create a single process that was really a way of thinking and working with software tools and with each other so quality was always the outcome."

The scope of this quality process management project was ambitious. Working closely with the technology provider, the quality team identified 12 key processes spanning the entire product lifecycle and began working with department managers and critical-path quality and governance groups to roll out the plan across seven worldwide locations. Active participation by the pilot group was essential, not only to improve the targeted processes, but to help identify and log all non-quality events into the system as part of their daily work routine.

A typical non-quality event might address service problems related to part replacement. If the system doesn't provide enough detailed information and direction, a request to replace an incorrect part may simply be fulfilled with exactly the same wrong part. When this problem is identified and submitted to the digital quality process system, an RFA is created and, if validated, resources are assigned. Using advanced workflow and data management tools, humans and systems work together to immediately contain the problem, then perform rigorous root cause analysis and build the documentation. Based on the results, corrective actions, including new business rules and process templates, are submitted for review and approval, and an implementation schedule is activated with appropriate workflows and a feedback loop to ensure the problem is handled correctly.
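
Described schematically, the flow above amounts to a small state machine: an event is logged, an RFA is raised and validated, the problem is contained, analyzed and corrected, and a feedback step confirms the fix before the request is closed. The Python sketch below illustrates that lifecycle as a rough outline; the state names, transitions and identifiers are hypothetical and do not represent the actual ENOVIA E-PM workflow engine or its API.

```python
from enum import Enum, auto


class RFAState(Enum):
    LOGGED = auto()       # non-quality event entered into the system
    VALIDATED = auto()    # reviewed; resources assigned
    CONTAINED = auto()    # immediate containment of the problem
    ANALYZED = auto()     # root cause analysis documented
    APPROVED = auto()     # corrective actions and process templates approved
    IMPLEMENTED = auto()  # workflows activated on an implementation schedule
    CLOSED = auto()       # feedback loop confirms the problem is resolved


# Allowed transitions; anything else is rejected.
TRANSITIONS = {
    RFAState.LOGGED: {RFAState.VALIDATED},
    RFAState.VALIDATED: {RFAState.CONTAINED},
    RFAState.CONTAINED: {RFAState.ANALYZED},
    RFAState.ANALYZED: {RFAState.APPROVED},
    RFAState.APPROVED: {RFAState.IMPLEMENTED},
    RFAState.IMPLEMENTED: {RFAState.CLOSED},
}


class RequestForAction:
    """A request for action (RFA) moving through a corrective-action lifecycle."""

    def __init__(self, rfa_id: str, description: str):
        self.rfa_id = rfa_id
        self.description = description
        self.state = RFAState.LOGGED

    def advance(self, new_state: RFAState) -> None:
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(
                f"Cannot move {self.rfa_id} from {self.state.name} to {new_state.name}"
            )
        self.state = new_state


# Hypothetical example: the wrong-replacement-part problem from the text.
rfa = RequestForAction("RFA-4217", "Replacement request fulfilled with same incorrect part")
for step in (RFAState.VALIDATED, RFAState.CONTAINED, RFAState.ANALYZED,
             RFAState.APPROVED, RFAState.IMPLEMENTED, RFAState.CLOSED):
    rfa.advance(step)
print(rfa.rfa_id, rfa.state.name)
```

Making the allowed transitions explicit is what turns the feedback loop from a good intention into an enforceable rule: a request cannot be closed until containment, analysis and approval have actually taken place.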

Since the inception of the program, requests for corrective actions have averaged around 5,000 at any one time, with over 75 percent now processed in minutes rather than days. As the system evolves and participation increases, the resources required to manage the requests continue to plummet. Standardization has reduced risk and streamlined processes, making most operations easier to modify and adapt to changing conditions. The piles of paper have disappeared, accelerating throughput and reducing errors. Stakeholders from every region and function now participate enthusiastically or are requesting training and support for their departments. In fact, the chief executive officer reviews all RFAs every Thursday, so it appears that the company's process champion will have no problem maintaining momentum for quality process improvement going forward.

As they prepare for the next phase of implementation, the quality teams plan to extend access to the Web, enhance querying, monitoring and reporting capabilities, and integrate the system with external applications — including those of global business partners, suppliers and customers. While the success of this project was due, in part, to an individual with deep tribal knowledge of the organization and its processes, it would still be in the pilot phase if not for a sound tactical and communications strategy, advanced tools, and rigorous documentation and oversight.

Finally, this ambitious program confirmed that without support and participation from the executive team on down, optimization projects can never get off the ground, nor will continuous process improvement become a permanent part of the corporate culture. Because enterprise processes can control the outcome of thousands of "transactions" between humans and systems every day, they truly are the revenue engine of any large organization…and this engine is taking off.

About the Author: Lance Murphy, Product Marketing Manager of Dassault Systèmes' ENOVIA brand, has more than 10 years of experience in product management and marketing. He has expertise in the product lifecycle management, enterprise resource planning and customer relationship management domains, including design-for-manufacturing best practices. He can be reached at [email protected].