Cutting Through the "On Demand" Hype

With ongoing economic uncertainty putting a premium on business agility, technology companies are offering up visions for new computing and business process models. Here's a primer on the new models and their potential implications for the supply and demand chain.

[From Supply & Demand Chain Executive, December 2003/January 2004] Grid computing. Utility computing. Organic IT. e-Business on demand. Adaptive enterprise. Seems like with every passing week some new catchphrase pops up to seize the attention and imagination of the information technology (IT) marketplace.

It would be easy, particularly for those outside the IT function, to write off these buzz phrases as so much marketing noise. But the concepts behind all the sloganeering are forming the foundation for new supply and demand chain models that some analysts believe could give early adopters a significant competitive advantage in the years ahead.

A Matter of Definitions

Of course, given the proliferation of these catchphrases, it's hardly surprising that not even all IT specialists have a clear understanding of what the terms mean. For example, a study by Westport, Conn.-based consultancy Saugatuck Technology found that just 19 percent of IT managers in a survey understood what "grid computing" meant. And while 48 percent of the managers in the survey were familiar with the phrase "utility computing," just 2 percent said they had a firm understanding of this model.

With that in mind, let's start with some definitions.

In its recent report "How to Differentiate Grid Computing and Utility Computing," technology research firm Clabby Analytics described grid computing as "a network architecture that finds and exploits computing resources and storage." A grid, Clabby writes, is all about sharing unused computing power within a set pool of available resources. Using this model, companies can potentially ensure that their computing resources are better utilized, and they can attack large computational problems without necessarily having to invest in new, high-power computers. This is because a grid system can break one huge number-crunching assignment into smaller sub-assignments that low-power desktop machines solve and send back to a central computer for aggregation.
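The divide-and-aggregate pattern Clabby describes can be illustrated in a few lines of Python. This is not any actual grid middleware, just a minimal sketch in which local worker threads stand in for the spare desktop machines on a grid; all function names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """One 'grid node' solving its sub-assignment independently."""
    return sum(x * x for x in chunk)

def grid_style_sum_of_squares(data, workers=4):
    """Split one big number-crunching job into sub-assignments,
    farm them out, then aggregate the partial results centrally."""
    chunk_size = max(1, len(data) // workers)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # Worker threads stand in for low-power machines scattered across the grid.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partial_results = pool.map(process_chunk, chunks)
    # The central computer aggregates what the nodes send back.
    return sum(partial_results)
```

Real grid schedulers add node discovery, fault tolerance and data transfer on top of this basic scatter-gather shape, but the division of one large assignment into independently solvable pieces is the core idea.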

In contrast to a grid, Clabby continues, "utility computing" refers not to a specific IT architecture but rather to an approach to computing that "is predicated on the idea that computing power and resources should be available like electricity," hence the name "utility." The idea is for a computer, or a company, to acquire computing power or storage space on an "as-needed" and "pay-as-you-go" basis, just as a consumer would "acquire" water on those same bases by turning on the tap. The advantages for a company adopting this model are fairly straightforward: utility computing shifts the requirement to invest in capital equipment from the user company to the service provider, and the user company pays only for the computing resources it actually uses. Fixed costs become variable costs, and the user company avoids the expense of maintaining and upgrading the equipment.
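The pay-as-you-go billing logic is simple enough to sketch. The class and rate below are purely illustrative assumptions, not any vendor's actual metering API; the point is that the bill tracks consumption, not capacity:

```python
class UtilityMeter:
    """Minimal pay-as-you-go meter: the customer owes only for
    capacity actually consumed, like water through a tap."""

    def __init__(self, rate_per_cpu_hour):
        self.rate = rate_per_cpu_hour
        self.cpu_hours = 0.0

    def record(self, cpu_hours):
        # Each burst of usage is metered as it happens.
        self.cpu_hours += cpu_hours

    def bill(self):
        # Variable cost: nothing used, nothing owed.
        return self.cpu_hours * self.rate
```

Contrast this with a traditional outsourcing contract, where the fee is fixed regardless of how much of the contracted capacity the customer actually draws on.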

Moving beyond grid and utility computing, we encounter a host of trademarks and catchphrases that analyst firms and technology providers have created to describe their own visions for the future of IT. These include Forrester Research's Organic IT and what has become known as on-demand computing models, including IBM's e-Business on Demand, HP's Adaptive Enterprise and others. But while these models may draw on the elements of grid and utility computing, they go beyond a focus purely on IT to incorporate aspects of business transformation and process re-engineering.

From Organic to On Demand

Forrester Research debuted its vision for the future of IT in its April 2002 report "Organic IT." In the report, primary author Frank Gillett defines Organic IT as "computing infrastructure built on cheap, redundant components that automatically shares and manages enterprise computing resources (software, processors, storage and networks) across all applications within a datacenter." Rather than running siloed applications on isolated servers, the Organic IT model calls for shared computing resources and applications that can be linked across business functions and reconfigured using technologies like Web services.

Ignoring (for our purposes) the technical implications of this model, the business impact, according to Forrester, would be to free companies from "rigid technologies" that prevent them from quickly adapting their processes, allowing enterprises to more quickly compensate for, or take advantage of, changing business conditions. Explains Navi Radjou, a principal analyst with Forrester: "Organic IT means that process owners can focus on designing, deploying and optimizing their processes, and look to applications that fit into their processes rather than trying to make their processes fit to the alphabet soup of different applications available, such as SCM (supply chain management), WMS (warehouse management systems), ERP (enterprise resource planning) and so on. Organic IT holds the promise of the software adapting to the process rather than the other way around."

In using Organic IT to break the chain binding processes to specific applications, enterprises would have the capacity, Radjou argues in his June 2003 report "Helping Supply Chain Cope with Demand," to better orchestrate sell- and buy-side activities into what he calls "composite processes," or "technology-enabled, cross-organizational process flows designed to share and act on changes in demand." In other words, companies adopting the Organic IT model would become more agile and better able to detect and respond to demand signals.

Meanwhile, late last year IBM rolled out its own grid and utility computing-based vision called, as anyone who has been exposed to IBM's marketing machine over the past year knows, "e-Business on Demand." IBM is casting on demand as a way to enable enterprises to break down functional barriers and to integrate business processes both internally and with customers and suppliers in a way that lets an entire supply chain respond more rapidly to changes in demand or the competitive landscape.

In practice, the on demand vision provides a platform for IBM, and other solution providers pursuing similar visions, to offer services in three different markets, as Forrester Research Principal Analyst Ted Schadler describes in his November 13, 2003, report "IBM's On Demand Strategy Goes Beyond IT." Those markets, Schadler writes, comprise business transformation projects (which could lead to business process outsourcing deals), business process integration projects (both within the four walls of a single enterprise and between the enterprise and its trading partners), and standards-based infrastructure projects (which could lead to data center automation or IT outsourcing deals).

Outsourcing of Sorts

The business process outsourcing aspect of on demand could be of particular interest to executives looking to hand over some part of their supply and demand chain to an external service provider. Interestingly, IBM already undertook an initiative of this nature internally in January 2002, creating an Integrated Supply Chain (ISC) group headed by Robert Moffat, the company's senior vice president for supply chain. Under Moffat, this 19,000-employee group consolidated some 30 different supply chains, supporting about 50 divisions within IBM, into a half-dozen supply chains, all but one of which Moffat operates on behalf of the company's divisions, essentially acting as an outside service provider. IBM has reported first-year cost savings from the initiative of $5.6 billion, and Moffat has said that the company also took its inventories to the lowest levels in history, drove on-time deliveries to the highest levels and took nine days out of the cash collection cycle. Those results came both from the consolidation of the supply chains and the company's ability to use the ISC as a channel for disseminating best practices from one group to others within IBM, according to Moffat, who spoke about the ISC at AMR Research's spring executive conference in Scottsdale, Ariz., this past June.

William Schaefer, vice president for procurement services with IBM Global Services, argues that just as IBM's internal divisions achieved significant savings by handing control over their supply chains to the ISC group, so, too, could other companies drive both greater savings and responsiveness by turning over responsibility for their procurement functions to a service provider like IBM on an on-demand, pay-as-you-go basis. "The beauty of on demand is that it's the perfect theme for a procurement person," Schaefer says. "It's saying, 'I pay for what I need, just what I need and when I need it.'" That pay-for-what-you-use model distinguishes on demand from a traditional outsourcing arrangement, under which a company might pay a flat fee for a fixed set and quantity of services.

Scott Lundstrom, senior vice president and chief technology officer at AMR Research, points out another way in which on demand is not just "business process outsourcing to IBM" by another name. "When you start to look at what the difference is between on demand and outsourcing, certainly there is an implied ongoing improvement that IBM wants its customers to see," Lundstrom says. "That [improvement] comes in two ways: one is IBM's ability to support quick transformation of business process, and the other is their ability to find ongoing efficiencies in a business process. Those two elements ultimately are going to be pretty important in this market, and they're not really a focus of what we might traditionally think of as outsourcing companies."

Lundstrom and other analysts seem to agree that the range of technology and services encompassed by "e-Business on Demand" distinguishes IBM's vision from those of other IT providers that are proffering their own version of on demand. The analyst community most frequently cites HP, promoting "Adaptive Enterprise," and Sun, with its cryptically named "N1" strategy, as the other major contenders in this market, although both these providers would rely more on external partners to deliver certain software components of their visions, as well as business transformation and process integration services, according to the analysts. Separately, Oracle is going after the market for grid software, and some analysts see Microsoft, Computer Associates and Veritas competing on the software side, and Dell competing on the hardware side, of utility computing. EDS has scored client wins with its utility computing offering, too.

The Adoption Curve

For all the chatter in vendor and analyst circles about the potential of grid and utility computing, and for all the advertising muscle that IBM and others have put behind their on demand messages, broad adoption of these models has been modest to date. Interest in grid computing, for example, has centered primarily on users of scientific, engineering and mathematical applications. Case in point: United Technologies' aircraft engine division Pratt & Whitney is using a grid-enabled solution called LSF from Markham, Ontario-based Platform Computing to accelerate its turnaround time for computer-aided simulations, reducing engineering time and cutting development costs.

By contrast, broader business solutions have yet to be rewritten to take advantage of the efficiency gains that grid models promise, and companies have yet to tackle the political issues that could arise when one business unit seeks to lower its own costs by tapping into the computing power "owned" by another division within the company. In addition, solution providers and enterprises will have to tackle the issue of priority: providers must incorporate the ability to assign precedence to certain users or groups of users when they compete for computing power across a grid, and enterprises will have to address the same issue on the political level within their organizations. Boston-based IT research firm Summit Strategies, in its October 2003 report "Grid Goes to Market," predicts that it will be another three to five years before grid computing emerges as "a core element of many enterprise IT architectures."
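The precedence mechanism providers would need amounts to a priority queue over competing jobs. The sketch below is a bare illustration of that idea, not any vendor's scheduler; the group names and jobs are hypothetical:

```python
import heapq
import itertools

class GridScheduler:
    """Minimal sketch of precedence-based dispatch: when business units
    compete for grid capacity, higher-priority groups go first."""

    def __init__(self, group_priority):
        # Lower number = higher precedence, e.g. {"engineering": 0, "finance": 1}.
        self.group_priority = group_priority
        self.counter = itertools.count()  # FIFO tie-breaker within a group
        self.queue = []

    def submit(self, group, job):
        # Jobs are ordered first by their group's precedence, then by arrival.
        heapq.heappush(self.queue, (self.group_priority[group], next(self.counter), job))

    def next_job(self):
        # The highest-precedence waiting job claims the freed-up compute.
        return heapq.heappop(self.queue)[2]
```

The technical half of the problem is mechanical; the political half, deciding which division's number goes in that priority table, is the part enterprises will have to settle for themselves.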

Summit Strategies sees a somewhat shorter timeline for the broader adoption of utility computing models. In its August 2003 report "Under New Management: Five-step Strategy for Enabling Utility Computing Operations," the firm predicts that "enterprise management architectures will be transformed during the next two years as ... management strategies for utility computing take hold." The driver? "The promises of dramatic cost savings and business agility benefits from utility-based architectures are too compelling to ignore," Summit writes. By contrast, Andy Efstathiou and Jamie Gruener, analysts with Boston-based technology consultancy Yankee Group, writing in their August 2003 report "Utility Computing in Next-Gen IT Architectures," foresee "deafening" hype around utility computing over the next year but expect real-world products incorporating utility computing to arrive in volume only in another 18 to 24 months.

One possible barrier to broad adoption of utility computing could be enterprises' reluctance to give over control of critical data or processes to external service providers. However, solution providers like IBM (with customers like Deutsche Bank) and EDS (with customers like Coors Brewing) already have been addressing these issues in their utility-based IT outsourcing deals. And the success of Web-hosted solutions for customer relationship management and of Arena Solutions (for product lifecycle management) seems to indicate that enterprises are increasingly comfortable with outside services handling their data and processes.

As for the on demand models that IBM, HP and Sun are promoting, AMR's Lundstrom sees building awareness but slow adoption at this point. "While the vision seems pretty sound, there's a lot of maturity that has to happen," the analyst says. Specifically, he believes that commercial application vendors will have to move further along in their adoption of Web services and UDDI, the Universal Description, Discovery and Integration standard for identifying goods and services online. Not that early adopters aren't already moving toward on demand. Schaefer notes, for example, that IBM has had a long-standing on-demand relationship with United Technologies for procurement services. In general, Lundstrom views the on-demand offerings that are available now as being geared toward the higher end of the market. Large enterprises may consider this type of model, for instance, in order to leverage the global resources of a provider such as IBM to manage very large, very complex projects across geographies, or to facilitate a significant ongoing change within a business process, such as during a merger or acquisition.

In addition, Forrester's Radjou sees potential application of an on-demand model for addressing major, immediate technological challenges, such as Wal-Mart's requirement that its top suppliers adopt radio frequency identification (RFID) over the next year. By handing over the technical aspects of RFID to an outside service provider, a company would be able to ensure compliance with a minimal investment while also allowing the company's supply chain leaders to focus on more strategic aspects of RFID, such as how they could leverage the new visibility that RFID will provide into their goods in motion.