by Walter Ratcliff
In October 2009, a large pharmaceutical firm contacted me with an urgent training request. They were installing an automated equipment calibration system as part of their Enterprise Resource Planning (ERP) software. The training vendor with which they had originally contracted had botched the job. One month before go-live, they wanted to change horses.
In pharmaceuticals, everything that touches patient safety must be qualified. That means it must be rigorously tested so that it achieves defined tolerances consistently. Manufacturing equipment also must meet tolerances. Tolerances and actuals are routinely monitored so that equipment is repaired before it goes out of range. This activity is called calibration. For this company, calibration was recorded on paper, which was error-prone, and the data were not retained in a way that enabled predictive maintenance.
Similarly, the people who operate and calibrate the equipment have to be qualified. Unfortunately, "qualified" means they have passed a paper-and-pencil test, once.
Given the water under the bridge and the client’s anxiety level, we decided not to rock the boat…at first.
The original training vendor had taken a standard, demonstration-based approach. Five role-specific courses demonstrated step-by-step procedures. Our intervention was to clarify the business process, identify what operators and their managers started with and needed to end with, define the quality levels expected for the system to work, and clarify the different "use scenarios." With the help of experienced engineers, we created a training system that simulated these use scenarios. The goal was to balance instruction with practice at roughly 40/60 and to refocus on the real world instead of just the happy path.
Go-live came. We rolled out training. Level 1 and 2 evaluations were very positive. No surprises.
Big Data and the Missing Link
Emergencies often present opportunities. In this case, the opportunity was money left in the training budget. Not a lot, but enough to ask the sponsor what it would be worth to know how well calibration engineers were actually using the system, and, if they were not using it properly, why not. It is hard to say "no" to this kind of offer.
The company had purchased a reporting package from a “business intelligence” vendor and organized various datasets against which reports could be run. It is not unusual that only a few reporting specialists know anything about these datasets. Similarly, operational managers know the high level business workflows. But translating the business workflows into the data transformations that reflect progress at each step in the automated workflow is often beyond all players. Fortunately, this is the domain in which performance consultants play.
Knowing the workflow allowed me to build a model of the data we would need. The initial set of numbers came from queries using existing datasets.
The diagram below illustrates the pipeline of calibration from closed calibration records (that is, finished work) to fully online records (that is, the target state that is usable for predictive maintenance).
In December 2009, a total of 583 calibrations were completed. Of those, 91% were done in-house by people we trained (rather than contractors who use paper records). Forty-five percent of the 583 calibrations had a model data template (MDT) available. MDTs had to be created and assigned to a piece of equipment for records to be fully automated. But only 25% had MDTs assigned. And for various other reasons, only 17% of the 583 calibrations were fully automated.
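The December 2009 pipeline figures above can be sketched as a simple funnel calculation. This is an illustrative sketch only: the stage names and the data structure are my assumptions; the totals and percentages are the article's.

```python
# Illustrative funnel from the December 2009 figures.
# Stage names and the list structure are assumptions for illustration.

TOTAL_CALIBRATIONS = 583

# Fraction of the 583 completed calibrations reaching each pipeline stage.
PIPELINE_STAGES = [
    ("Completed in-house by trained staff", 0.91),
    ("Model data template (MDT) available", 0.45),
    ("MDT assigned to equipment",           0.25),
    ("Fully automated record",              0.17),
]

def pipeline_counts(total, stages):
    """Convert stage fractions into record counts, rounded to whole records."""
    return [(name, round(total * frac)) for name, frac in stages]

for name, count in pipeline_counts(TOTAL_CALIBRATIONS, PIPELINE_STAGES):
    print(f"{name}: {count} of {TOTAL_CALIBRATIONS}")
```

Laying the stages out this way makes the bottleneck visible at a glance: the sharp drop from MDTs available to MDTs assigned is where most of the automation value was being lost.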
So engineers were fully qualified, the system was up and running, but much of the system investment had been left on the table. This should not be a surprise to anyone involved in ERP implementations.
The challenge now was twofold: (1) Make a business case for this additional expense to create “implementation analytics” and (2) collect the data and display it in a way that increased accountability for implementation.
Fortunately, the project had a published business case. It lent itself well to a simple equation:
Annual Opportunity Cost = (1 − Online Records / Total In-House) × Cost per Record × Number of Records Annually
= (1 − 0.17) × $54.27 × 21,000 ≈ $945,900
That is, annually there was $946K of projected value that was not being realized. This figure did not include the value of preventive maintenance, only automation. Was this a problem worth doing something about?
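The opportunity-cost arithmetic can be packaged as a small function so the sponsor can re-run it as adoption improves. The function name and interface are my own; the input figures are the article's.

```python
def annual_opportunity_cost(online_fraction, cost_per_record, records_per_year):
    """Projected annual value not realized because records are not fully online.

    online_fraction: share of in-house calibrations that are fully automated.
    """
    return (1 - online_fraction) * cost_per_record * records_per_year

# Using the article's figures: 17% fully automated, $54.27 per record,
# 21,000 records per year. Result is close to the article's $946K estimate.
cost = annual_opportunity_cost(0.17, 54.27, 21_000)
print(f"${cost:,.0f}")
```

One useful property of framing it this way: the same function answers "what is each percentage point of adoption worth?" by varying `online_fraction`, which is exactly the lever the business case gives managers.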
At the same time, we mocked up a dashboard and defined data feeds, all using the “business intelligence” tool. The dashboard needed to focus down to the unit manager level and provide both summative and formative information. Our prototype was constructed using queries, so it did not allow drill-down or real-time display. But the mock-up did contain real data for one site.
In the mock-up below, Current Adoption Indicators provide diagnostic information about which key parts of the workflow process were being followed, how accurately, and how completely. The graphs show these measures over time.
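The adoption indicators behind such a dashboard reduce to per-period aggregation over calibration records. A minimal sketch, assuming a hypothetical record layout (month, unit, fully-automated flag); the real feeds came from the client's business-intelligence datasets, whose schema the article does not describe.

```python
from collections import defaultdict

# Hypothetical records: (month, unit, fully_automated). Values invented
# purely to illustrate the aggregation; they are not the client's data.
records = [
    ("2009-12", "Unit A", True),  ("2009-12", "Unit A", False),
    ("2009-12", "Unit B", False), ("2010-01", "Unit A", True),
    ("2010-01", "Unit B", True),  ("2010-01", "Unit B", False),
]

def adoption_by_month(records):
    """Fraction of records fully automated per month: a formative indicator."""
    totals = defaultdict(int)
    online = defaultdict(int)
    for month, _unit, automated in records:
        totals[month] += 1
        online[month] += int(automated)
    return {m: online[m] / totals[m] for m in sorted(totals)}

for month, rate in adoption_by_month(records).items():
    print(f"{month}: {rate:.0%} fully automated")
```

Grouping by unit instead of month gives the summative, unit-manager-level view the mock-up called for; the time series gives the formative one.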
Good News, Bad News
The initial pipeline findings caused the data template work to be re-prioritized so as to eliminate that bottleneck. Unfortunately, the leadership team decided not to create a window of accountability using the dashboard approach because the project was already over-budget.
The lesson learned is to build implementation analytics into the system implementation budget. It requires a performance consultant and a business analyst. The approach has four steps:
To paraphrase Harold Stolovitch's adage Telling Ain't Training, "Implementation Ain't Performance." Implementation analytics (IA) maps business workflow to the datasets that are ubiquitous in big data. It provides sponsors with real-time insight into the progress of implementation and a window of accountability through the management chain. According to McKinsey (2011), using big data to improve the management of companies is an area of untapped opportunity. Let's go for it!
McKinsey Global Institute. (2011, October). “Are you ready for the era of ‘big data’?”
About the Author
Walter Ratcliff has been an independent consultant for more than 30 years in the areas of human performance technology, instructional design, project management, organization development, and facilitation. He has also served as a training manager, organization effectiveness consultant, instructor, and change agent within diverse industries. You may reach him at firstname.lastname@example.org