Boris Evelson, principal analyst at Forrester, agrees. In a July 2008 Forrester report titled “The Forrester Wave: Enterprise Business Intelligence Platforms, Q3 2008,” Evelson writes: “With most products and services being highly commoditized, more and more businesses are competing on analytics. Getting better insight from information based on richer data sets, more complex models, or even making the same decisions as everyone else but before everyone else makes them — this is how most advanced enterprises compete in today’s world. Business intelligence tools and technologies form the major components of the foundation that supports and enables such competitive differentiation.”
More Data for Better Intelligence
As an example of how he sees BI solutions developing, Owens points to the business intelligence solution that he and ProcureStaff have spent the last five years developing to help their clients measure results against the market at large and, through expert statistical analysis, make strategic planning decisions. Owens and his team have developed a data warehouse (DW) and spent the last five years populating it with anonymous program data from the dozens of Global 1000 companies that use ProcureStaff’s services procurement solution, a managed services program (MSP) that includes their Vendor Management System (VMS) application.
Aggregating the data from every program in the DW along with external market data from various industry sources allows Owens’ team to provide each client with a more accurate understanding of how their individual program performance stacks up against the broader market. This aggregation, Owens says, is the key to enabling companies to view their metaphorical swim times against those of other companies in the market. The question for a procurement officer is no longer, “How much less did we pay this quarter for IT project workers than last quarter?” The much more pertinent question enabled by the warehousing and analysis of aggregate data becomes, “How close is the rate we pay for IT project workers to true market rates?” The difference could add up to millions of dollars saved each year.
Owens suggests several reasons why more services procurement providers have yet to commit to developing the kind of BI offerings that ProcureStaff provides. For most, it comes down to expertise. Building a data warehouse is itself a difficult undertaking, and populating it with cleansed data is extraordinarily time consuming and perhaps cost-prohibitive for some companies. Normalizing the data collected from dozens of programs across numerous clients in disparate industries is a daunting task.
“It’s not as easy as simply compiling the data in a warehouse and then slicing it to provide reports,” says Owens. “The data must be cleansed and classified for analytical purposes, and that process requires expertise in human capital.” He explains that it’s not as simple as mapping what one organization refers to as “Programmer Level I” to another organization’s “Application Developer Level I.” Job categories and titles used by different companies are incomplete at best and often over-generalized or inaccurate.
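The classification problem Owens describes can be sketched in a few lines. The taxonomy, client identifiers, and helper function below are purely illustrative assumptions, not ProcureStaff's actual schema; the point is that cross-client comparison requires an explicit mapping to a canonical category, with unmapped titles routed to a human analyst rather than loaded as-is:

```python
# Hypothetical sketch: mapping disparate client job titles onto a single
# canonical taxonomy so cross-client comparisons are apples-to-apples.
# The taxonomy entries and client ids below are illustrative assumptions.
CANONICAL_TAXONOMY = {
    # (client-specific title, client id) -> canonical category
    ("Programmer Level I", "client_a"): "Software Developer - Junior",
    ("Application Developer Level I", "client_b"): "Software Developer - Junior",
    ("Sr. Java Engineer", "client_b"): "Software Developer - Senior",
}

def classify(title: str, client: str) -> str:
    """Return the canonical job category, or flag the title for review."""
    try:
        return CANONICAL_TAXONOMY[(title.strip(), client)]
    except KeyError:
        # Unmapped titles go to an analyst rather than into the warehouse.
        return "UNCLASSIFIED - needs analyst review"

# Two different client-specific titles resolve to the same category,
# which is what makes a market-rate comparison meaningful.
print(classify("Programmer Level I", "client_a"))
print(classify("Application Developer Level I", "client_b"))
```

A naive string match between "Programmer Level I" and "Application Developer Level I" would never connect the two; the mapping encodes the human-capital expertise Owens refers to.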
Furthermore, the individual business practices of each company vary, and these too must be accounted for to avoid incongruent comparisons and flawed analyses. Supplier performance analyses will be skewed unless they are based on statistically validated metrics and unless the situations being measured truly reflect the supplier's actual performance. For example, simply dividing the number of candidate submittals by the number of orders a supplier receives does not yield a valid metric. Measuring a supplier based on contractors provided through an approved pass-through arrangement (i.e., corp-to-corp or payrolled contractors) will yield invalid results that do not reflect true supplier performance. Deep domain expertise in services supply chain practices, including an understanding of how contractors are sourced and how orders are conducted, is as important to executing BI as expertise in statistics and data management.
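The pass-through problem can be made concrete with a small sketch. The field names and record layout below are illustrative assumptions, but they show the difference between a naive ratio and one restricted to orders the supplier actually sourced:

```python
# Hypothetical sketch: a supplier fill-ratio metric that excludes
# pass-through placements (corp-to-corp or payrolled contractors),
# which would otherwise inflate apparent supplier performance.
# Field names ("hired", "pass_through") are illustrative assumptions.

def valid_fill_ratio(orders):
    """Hires per order, counting only orders the supplier actually sourced."""
    sourced = [o for o in orders if not o["pass_through"]]
    if not sourced:
        return None  # No sourced orders: the metric is undefined, not zero.
    hires = sum(1 for o in sourced if o["hired"])
    return hires / len(sourced)

orders = [
    {"hired": True,  "pass_through": False},
    {"hired": True,  "pass_through": True},   # payrolled: excluded
    {"hired": False, "pass_through": False},
]
print(valid_fill_ratio(orders))  # 0.5, not the naive 2/3
```

The naive calculation credits the supplier with two hires out of three orders; restricting the denominator and numerator to sourced orders yields the defensible 0.5.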
Owens’ BI process automates the extraction and cleansing of transactional data from the VMS and loads the results into the warehouse. Thus, the data enter the warehouse according to consistently verifiable protocols, regardless of how each individual client organization titles a position or sources a candidate. Once this process is complete, Owens and his team watch for “red flags,” the anomalous outliers that trigger the deeper analyses yielding a potent BI product.
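The red-flag screen Owens describes could take many forms; a minimal sketch, assuming a simple z-score test on bill rates (the threshold and sample data are illustrative, not ProcureStaff's method), looks like this:

```python
# Hypothetical sketch: screening loaded bill rates for anomalous outliers,
# the kind of "red flag" that would trigger a deeper analysis.
# The z-score approach and threshold are illustrative assumptions.
import statistics

def red_flags(rates, threshold=3.0):
    """Return rates more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(rates)
    stdev = statistics.pstdev(rates)
    if stdev == 0:
        return []  # All rates identical: nothing to flag.
    return [r for r in rates if abs(r - mean) / stdev > threshold]

rates = [85.0, 90.0, 88.0, 92.0, 87.0, 310.0]  # one suspicious entry
print(red_flags(rates, threshold=2.0))  # [310.0]
```

An analyst would then investigate whether the flagged rate reflects a data-entry error, a misclassified position, or a genuine market anomaly worth reporting to the client.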