My first project when I joined Intel in 1983 was to oversee the design, development, implementation, and evaluation of a supervisory training program. It wasn’t Intel’s first supervisory training program, and it wasn’t its last. It was just a new take on a training topic everyone knew was needed. I was working on my Master’s in Educational Technology at the time and studiously trying to apply what I was learning. The standard for evaluation at the time was participant surveys only, but we pushed for in-class testing and measures to determine what was actually getting applied on the job. We ended up compromising on the higher levels of evaluation (settling for self-evaluations in class and on the job), but we were proud that we had begun educating people on the value of evaluation.

A few years later, I was leading an inventory management training project in the factories. This program’s target audience was also primarily supervisors. The difference was that this program was intended to solve a specific business problem—factory inventories were swelling, and this had direct and indirect costs that hurt Intel’s bottom line.

My task was to take a white paper on low-inventory management that had been circulated but not followed, and turn it into a course. Again, I had responsibility for design, development, implementation, and evaluation. But it was a very different experience. First of all, no one told the factories they had to implement my course. I presented it to the manufacturing managers’ council and waited for them to call me—or not. Eventually, one of the manufacturing managers did call because he knew he had a serious inventory problem and he was looking for any help he could get. He was serious enough about it to insist on personally teaching the course to all his managers and supervisors.

I had one-on-one meetings with the manufacturing manager just prior to his teaching each of the modules. It was in these meetings that I asked what he hoped to get out of the training. He said his WIP (wafers in process) turns were too low, and he wouldn’t be bothering with the course if he didn’t expect the training to help raise them. WIP turns are inventory turns: output divided by inventory over a given period. Higher WIP turns usually equate to lower inventory, higher output, and higher quality. His request came before I had put together an evaluation plan for the course, so I was pleased to be handed this nice, hard, end-result measure.
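To make the arithmetic concrete, here is a minimal sketch of the WIP-turns calculation; the wafer counts are hypothetical, chosen only for illustration.

```python
# WIP turns = output / inventory, both measured over the same period.
def wip_turns(output_wafers: int, avg_inventory_wafers: int) -> float:
    """Inventory turns for a period: output divided by average inventory."""
    return output_wafers / avg_inventory_wafers

# Hypothetical monthly figures: holding output steady while inventory shrinks.
before = wip_turns(8_000, 8_000)  # 1.0 turns
after = wip_turns(8_000, 5_714)   # ~1.4 turns
print(f"WIP turns rose {after / before - 1:.0%}")  # ~40% improvement
```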

One of the objectives of the course was to change the way each supervisor set his or her area output goals each shift. The idea was that instead of each area producing as much output as it could on each shift, schedules should be used to balance the output across all the areas. As taught in the course, the best indicator of whether WIP turns would rise was the extent to which each area stuck to its balanced output goals. This information was measured and reported automatically (by the shop floor tracking system) as often as people wanted to see it. We decided we didn’t need to test the participants on their ability to calculate the balanced goals and other key skills, because each day we’d find out whether the goals were being calculated and followed. As always, we surveyed the participants about their satisfaction with the course.

Within days of teaching the first module, we knew there was an impact on performance because the vast majority of supervisors were operating to the balanced goals. By the end of the first couple of weeks, inventory levels were much more balanced across areas, and overall inventory was starting to subside (for the first time in years in this factory). The course was implemented over six weeks, and within six months, WIP turns had risen from about 1.0 to about 1.4 (a 40% improvement in output relative to inventory). The manufacturing manager was very happy, and other factories started signing up for the training.

I’ve documented these results and more detail about the program elsewhere (Esque & Patterson, 1998). The point here is to show how this experience changed my perspective on performance measurement. Quite by accident, my focus had shifted from evaluating my training program to helping identify measures (both end results and indicators) that the client could use to determine if performance was improving or not. Over the next 15 years, I expanded my focus from training to broader performance improvement. But what I had learned about performance measurement continues to apply. I no longer think of performance measurement as evaluation. I think of it as a way to make sure the client links any improvement efforts to business results. In fact, my approach these days is to help the client get ongoing performance measures in place before any improvement efforts ever begin. Performance measurement, as I’m describing it here, is in itself a powerful form of performance improvement.

What started me reminiscing about how my perspective has changed is some recent dialogue about the GOT RESULTS? campaign. For more than five years, I, along with Carl Binder and others, have been using this campaign to encourage performance measurement in our profession. But we are still clarifying, to others and to ourselves, exactly what we mean to encourage. To some, the ultimate form of performance measurement is the demonstration of ROI (proving training and other interventions are worth more than they cost); to others, performance measurement applies to the lab as well as the organization and includes all aspects of treatment evaluation; and there are a variety of other perspectives. To me, performance measurement has become something people in organizations (the clients for us practitioners) need to be doing to manage and make decisions on an ongoing basis. I believe helping clients do this well is our best vehicle for getting clients focused on results (ISPI’s #1 Standard of Performance Technology).

The dialogue about performance measurement has been and continues to be a learning experience for me. I’d encourage others to give this some thought as well. What does it mean to get performance results? Should the results from the in-class and on-the-job evaluations on my first program be considered performance results (or evaluation results)? Why can’t all performance improvement interventions be linked to organizational results, such as WIP turns in my second example? These are some of the questions that the GOT RESULTS? campaign is designed to get more people thinking and talking about.

Timm J. Esque is an independent performance consultant specializing in helping clients set up and sustain performance systems. He co-edited Getting Results (ISPI/HRD Press, 1998) and co-founded the campaign now called GOT RESULTS? Timm may be reached at tjesque@yahoo.com.




by Carol Haig, CPT and Roger Addison, EdD, CPT


While we were attending ISPI’s Annual Conference in Tampa, we sat down with two colleagues from The Netherlands, Jolanda Botke, jbotke@cinop.nl, and Joep Straathof, joep.straathof@zuidema.nl, to explore the particular performance improvement challenges and future opportunities they identify in their work in Europe.

Jolanda’s work at CINOP focuses on providing performance improvement, learning interventions, and evaluation for small-to-medium entrepreneurial organizations. Joep is director of De Zuidema Groep, where he works with both government and private organizations to provide performance-based support from front-end through implementation.

Critical Challenges and Opportunities
Jolanda and Joep identified three critical challenges and opportunities they face:

  • Demonstrating the relationship between leadership and performance improvement
  • Focusing on performance-based evaluation and return on investment (ROI)
  • Establishing the return on value for organizations

Reasons for these Challenges and Opportunities
Both the economy and ROI for stakeholders are factors in the relationship between leadership and performance improvement. Jolanda and Joep discussed the European Union, and the many changes it has brought to the workplace, as an important influence on how organizational leaders view performance improvement. Workers from Eastern European Union countries, for example, are expanding the European workforce. As they add their labor to the economy, the results produced are of great interest as are the ways of measuring them. As the workplace becomes more complex, organizational leaders are held increasingly accountable, and all performance improvement interventions are carefully scrutinized.

When it comes to performance-based evaluation/ROI, Jolanda and Joep report that clients first want to know the cost of a performance improvement project and what the specific performance indicators will be. They ask if the intervention was a success, and how it was measured. A related challenge, which we have discussed previously in this space, stems from a lack of organizational linkage among functions such as marketing, sales, and human resources. Today, separate evaluative information is required for each. Helping these entities link together, moving from silos to a horizontal structure, will pave the way for more comprehensive evaluation information. Joep shares an example from a company on the border between Germany and The Netherlands where language skills training was provided to workers so they could understand Dutch instructions on the job. The project was considered a great success because pre- and post-tests showed language skill improvement; but no one ever asked if the quality and quantity of the work improved.

Stakeholders express interest and concern about ROI, particularly as management roles are increasingly specialized by function. Like stakeholders everywhere, Europeans are concerned about corporate financial scandals, increased competition, and the influx of multinational companies. With their newfound skepticism and erosion of trust, stakeholders are increasingly concerned with organizational values and image. As their interest in values grows, so does their interest in return on value.

If we define return on value as company loyalty (a predictor of employee retention), then organizations that strive to create an environment where people can do their best and be more productive will see a return on value. In such organizations, workers will likely stay with the company, become more self-motivated and self-reliant, and support an entrepreneurial spirit. Jolanda and Joep believe that HPT practices can make a difference if organizational leaders create an environment where employees can be successful. And this return on value can begin with employees who are encouraged to improve work processes to produce better results.

How Organizations Will Be Different
It is likely that many organizations will restructure for greater efficiency. Related internal culture changes may take place as operating philosophies, work processes, and procedures are changed to align with new structures. This will open the door for an increased focus on performance improvement activities, their impact on business results, and the responsibilities of organizational leaders.

Implications for CINOP and Zuidema Groep
All change brings with it the opportunity for innovation, for discovering new ways to work and to measure results. Organizational leaders will be changing their approaches as their companies re-align and restructure. Successful change is a product of successful leadership, and performance improvement principles and practices will play an increasingly important part in organizational evolution.

If you have any predictions about the future of HPT that you feel would be of interest to the PerformanceXpress readership, please contact Carol Haig, CPT, at carolhaig@earthlink.net or Roger Addison, EdD, CPT, at roger@ispi.org.




All too often, instructional technology winds up getting used for things we could be doing as well with paper. Before you create your next online textbook, stop and ask yourself if that’s really a worthwhile use of your time and your client’s money. Is the purpose of the training to improve problem solving? If so, you should probably be building some kind of simulation or game into your software. True believers will tell you that simulations and games are the one type of technology-based training that clearly justifies the investment.

OK, I admit that simulations and games are more complicated to build than online textbooks. And, they are less familiar to designers than classical tutorial computer-based training (CBT) (the design of which is also rapidly becoming a lost art—but that’s a topic for another day). But, if you hang around anyone under the age of 25 or so, you know that games are among the most common and motivating of computer applications. The good news is that most of the software structures used for games can also be used for instructional simulation. Technology training of all kinds has long used simulation—often recreations of systems such as aircraft, manufacturing plants, networks, and the like. And, there are lots of non-training uses of simulation and game technology, for R&D, and for entertainment.

Of course, not all games are instructional simulations, and not all simulations are useful instructionally. So, what makes a simulation or game instructional? I argue that these are the key characteristics of instructional simulations and games:

Real-World Context
Real-world context, presented in a way that allows the learner to make the connection to his or her work environment. Ask yourself, “what’s the story” of the problem-solving environment you’re creating? However, don’t fall into the trap of believing that you always need high physical fidelity. There are times when you need to reproduce the physical environment in detail, but that’s what drives up the cost of simulation, so use high physical fidelity only when you need it—lots of times you don’t. Don’t build SimCity when all you needed was Monopoly.

Clearly Defined Decision-making Skills
An instructional simulation or game is designed to produce a specific, defined learning outcome. And, the skill is defined by the real-world task you are training for—not by the fantasy environment of the game. That’s what differentiates instructional simulations and games from other kinds.

High Cognitive Fidelity
High cognitive fidelity with the real-world task. This means that the learner’s decision-making task must use real-world information, in a realistic context, to make realistic decisions. Unfortunately, many instructional games and simulations spend lots of effort on creating their simulation or game space, but the underlying cognitive task is simple recall, algorithmic thinking—or just twitches. The complex cognitive task of problem solving gets lost entirely.

Note also that it’s often a good idea to control the level of difficulty of the simulation by using a design strategy called scaffolding. This may cause you to violate the principle of high physical fidelity, by doing things such as making internal processes visible, or by expanding or compressing time or space. This is another reason why good instructional simulations and games may actually appear unrealistic—and it is simply not true that superficial realism is an indicator of the instructional effectiveness of the simulation or game.

Cognitively Relevant Feedback
Because you are teaching decision-making, you need to provide feedback to the learner not only on the outcomes of the simulation (Did you win? Did you blow up the system?), but also on the decision-making processes the learner used to arrive at that outcome. Doing so requires that you have a strategy for capturing and measuring the thought processes of the learner—this is a very tricky, state-of-the-art problem in instructional design and psychometrics. Many simulations and games—including most of those used for engineering design and entertainment—simply ignore cognitive feedback as irrelevant to their purpose. Unfortunately, when we design instructional simulations we don’t have that luxury. Even if you decide to delegate feedback to humans, through collaborative learning and cognitive coaching, you still need to design it, and provide relevant information for the humans to use.

During my session at ISPI’s 2004 Performance-Based ISD Conference, we’ll explore these issues and talk about some of the challenges of designing an instructionally effective simulation or game. In the meantime, if you need to get oriented, see if you can bribe a local teenager to give you a little joystick time on games and simulations such as Zork, SimCity, Flight Simulator, or any of the online group games.

Rob Foshay, PhD, CPT is Vice President of Instructional Design and Quality Assurance for PLATO Learning, Inc. He is a former ISPI Board member and recipient of the Distinguished Service and Honorary Life Member awards. He may be reached at rfoshay@plato.com.


To learn more, attend Rob’s Masters Series presentation on October 1, 2004, during ISPI’s Performance-based ISD Conference. Register today!



The International Society for Performance Improvement is presently conducting an online survey regarding members’ employment profiles, compensation, and work satisfaction. All ISPI members were sent an email from ISPI’s Executive Director, Richard Battaglia, a couple of weeks ago introducing the survey and providing its URL.

The results of this survey will be very useful in helping the Society to understand the professional practices of our members. The online survey contains 20 questions and takes less than 10 minutes to complete. The results will be shared with all ISPI members through our publications.

If you have not already done so, please complete the survey today; your participation is critical, and you will receive a reminder via email. To those of you who have already participated, your input is valued.





This month I took a sentence about the ISPI advantage from the Society’s website and converted it into a puzzle. Drag and drop the pieces back together to complete the puzzle and discover an important message about human performance technology!







Recently, we extracted descriptive data from the membership database of the International Society for Performance Improvement. These data provide a profile of the members of ISPI.1 This delineation provides details about the demographics and professional interests of our colleagues and insight into the makeup of our membership and our organization. Each of us will have a better understanding of how we as individuals are similar to and different from our ISPI colleagues.

Gender and Geography
In terms of gender, there are more female members than male members of ISPI. We estimate that the ISPI membership is 54% female and 46% male.2

ISPI is an international society. However, a majority of members (88.3%) work in the United States. Canada has the next largest number (4.0%). The remainder of the members (6.4%) work in 46 different countries around the world.3 Table 1 shows the percentage of ISPI members working in four regions of the United States: West, Midwest, South, and Northeast.4

Table 1. Members by Regions in the U.S.

Region        Percentage of Members
West                   23.2
Midwest                30.2
South                  33.1
Northeast              13.5

The largest number of ISPI members work in the South, with the largest concentration in Florida, followed by Texas, and Virginia. The next largest number of ISPI members work in the Midwest region, with the state of Illinois leading with 300 members, and Ohio with 159. In terms of individual states, California has the highest number of members, followed by Illinois, Florida, Texas, and Virginia. Figure 1 shows the breakdown of ISPI members in terms of the five top member states in the U.S.

Figure 1. Top Five ISPI Member States

ISPI Members’ Work Settings & Responsibilities
Figure 2 shows the work settings of ISPI members. The largest number of ISPI members (N = 790) work in the field of consulting and contracting, followed by other (N = 366), financial services (N = 323), and industrial manufacturing (N = 285).5

Figure 2. Work Settings of ISPI Members

Figure 3 below shows the breakout of members’ significant work responsibilities. The largest number of members who completed this section of the membership form have responsibilities as training directors/managers/coordinators (N = 765).6 This is followed by consultant (N = 467) and developer of training/non-training solutions (N = 385). At the lower end of significant work responsibility are training evaluators (N = 15), researchers (N = 24), and training deliverers (N = 39).

Figure 3. Significant Work Responsibility of ISPI Members

Interests of ISPI Members
A large number of ISPI members who completed this section of the membership form have interests in more than one topical area of performance technology (N = 1,974).7 This suggests that many of ISPI’s performance technologists are multi-dimensional in their approach to the field. An analysis of the data indicated that the largest number of members have an interest in the topics of performance/problem solving (N = 1,285) and training design and development (N = 1,054). See Figure 4 for the top five interest areas of ISPI members.

Figure 4. Performance Technology Related Interests of ISPI Members

We hope that these data are of use and interest to the ISPI membership. It is our plan to continue the analyses in the future and to observe patterns over time. We would like to encourage ALL ISPI members to be diligent when completing their membership forms. More complete data will yield an even more complete profile.

Notes
1 The ISPI membership database is interactive and continually changes. The data in this report are static. They were acquired during the afternoon of July 12, 2004, and represent the 3,898 members at that specific time. This number does not include individual members who belong to one of ISPI’s 63 local chapters around the world.

2 The ISPI membership database does not have a field for gender. The gender percentages were estimated by content-analyzing the first-name field. We identified 1,956 individuals with names that are commonly used as female names, 1,673 with names commonly used as male names, and 269 names that were non-distinguishable. The non-distinguishable names include one or two initials, or names such as Jo, Pat, Robin, Terry, and so on. Given that these are estimations, there is some error in the designation of gender.

3 There were 52 members (1.3%) that did not clearly indicate the country they were working in. The 46 countries (besides the U.S.A. and Canada) include: Argentina, Australia, Austria, Belgium, Bermuda, Brazil, Chile, Czech Republic, Ecuador, France, Germany, Ghana, Greece, Guatemala, India, Indonesia, Ireland, Israel, Italy, Japan, Korea, Kuwait, Malaysia, Mexico, Netherlands, New Zealand, Nigeria, Norway, Philippines, Portugal, Qatar, Saudi Arabia, Singapore, Slovak Republic, South Africa, Spain, Sweden, Switzerland, Taiwan, Trinidad, Turkey, United Arab Emirates, United Kingdom, Vatican City, Vietnam, and West Indies.

4 For the regional breakouts, the states are: West = AK, AZ, CA, CO, HI, ID, MT, NM, NV, OR, UT, WA, WY; Midwest = IA, IL, IN, KS, MI, MN, MO, ND, NE, OH, SD, WI; South = AL, AR, DE, FL, GA, KY, LA, MD, MS, NC, OK, SC, TN, TX, VA, WV; Northeast = CT, MA, ME, NH, NJ, NY, PA, RI, VT.

5 There were 1,353 members that did not indicate their work setting on the membership form.

6 There were 1,372 members that did not indicate their significant work responsibility on the membership form.

7 In indicating topical interests, 1,974 (50.6%) members chose two or more areas of interest, 406 (10.5%) members chose one area of interest, and 1,518 (38.9%) members did not complete this section of the membership form.




There are so many variations in the meaning and usage of business terms that it’s often difficult to know what a writer or speaker means when using a particular word or phrase. “Performance” is currently being promoted by a number of groups as a replacement for the term “process.” In fact, our July BPTrends newsletter focuses on the various uses of the term “performance.” In this article, we focus on an organization that has been using the term “performance” for decades—the International Society for Performance Improvement (ISPI).

Most who know of ISPI associate it with the improvement of human performance. ISPI has its roots in the 1960s, when it was founded to promote the application of principles derived from behavioral psychology to training and education. Apocryphally, the behavioral psychologist B.F. Skinner attended some of his children’s grade school classes and was appalled by the way they were being taught. Skinner wrote a series of articles calling for a revolution that would apply behavioral principles to education, and ISPI was founded to promote those principles. Over the course of time, ISPI devoted less time to the concerns of general education (which is well organized to reject efforts at improvement) and more to concerns related to business training and education. At the same time, ISPI gradually expanded its toolkit to include cognitive and managerial techniques well beyond the narrow behaviorism that stimulated its creation. Today, ISPI is a professional organization with members worldwide.

The constants, over the years, have been the empirical method that emphasizes results and a systems view of organizations. ISPI members have conducted extensive research relating to effective education and training practices. More important, they have emphasized the value of gathering data to determine the effect of changes made to organizational systems. This has led to a gradual accumulation of good practices and development of a methodology for improving human performance that incorporates environmental changes, changes in training and education, changes in motivation, and changes in management. This body of knowledge is often referred to as human performance technology (HPT). The HPT methodology is described in a white paper on the BPTrends site, What is Human Performance Technology. Similarly, it is described in an April issue of the BPTrends newsletter, Analyzing Activities. ISPI also offers a program that recognizes HPT professionals and certifies them as Certified Performance Technologists.

A number of ISPI members have had an impact on the business process management field. Probably the best known is Geary Rummler. Rummler started at the University of Michigan’s School of Business and was an early member of ISPI. In the late 1960s and early 1970s, when I worked for him, he and his partner, Tom Gilbert, were managing a company called Praxis and were primarily focused on helping business organizations analyze and improve human performance. In the 1980s and 1990s, Rummler joined with Alan Brache to found a company, Rummler-Brache, and to co-author a book, Improving Performance: How to Manage the White Space on the Organization Chart (Jossey-Bass, 1990), which describes Rummler’s systematic approach to analyzing and improving business processes.

In 1993, Hammer, Champy, and Davenport published their books and began the Business Process Reengineering (BPR) craze that dominated business thinking in the mid-1990s. The BPR authors emphasized the reasons for process change, but didn’t provide a systematic approach for achieving it. Thus, Rummler-Brache suddenly found itself overwhelmed with clients seeking a practical methodology for the analysis and redesign of business processes. To be fair, Rummler has never advocated radical process redesign, but has, instead, emphasized a systematic, gradual, targeted approach to process design. Rummler’s emphasis on process redesign is nicely balanced against his emphasis on understanding the organization as a whole and identifying specific processes that will benefit from improvement. Rummler often emphasizes that changes in the way managers manage processes are more effective than changes in the way employees actually perform their work. ISPI has just published a new book by Rummler, Serious Performance Consulting According to Rummler.

While Rummler is probably ISPI’s best-known member among those concerned with process improvement, ISPI has many other members who have achieved recognition for helping organizations change human performance. I recently had a discussion with Roger Addison, ISPI’s Director of Human Performance Technology, about work ISPI has been doing to combine ISPI technology with the Six Sigma methodology to provide more powerful human performance analysis tools for Six Sigma practitioners. Clearly, there are some exciting possibilities in this area. There’s even a group within ISPI, known as the “Tucson Seven,” who meet regularly to discuss BPM issues. One of them, Donald Tosti, this year’s president of ISPI, was instrumental in choosing the theme for the 43rd International Performance Improvement Conference & Exposition: Process, Practice, & Productivity. This conference will be held in Vancouver, British Columbia, Canada, on April 10-15, 2005.

Those involved in business process change within organizations need to draw on and integrate a wide variety of approaches and technologies, ranging from strategy change systems and process analysis tools, to ABC, BPMS, a wide variety of software automation systems, Six Sigma, and job design. ISPI represents a well-developed source of theory and practice designed to help improve human performance within organizations. It’s a rare process improvement project that doesn’t require changes in management and the jobs performed, or that wouldn’t benefit from better feedback or an improved incentive system. ISPI is a resource that business process change practitioners ought to be familiar with.

NOTE: This article appeared in and is copyrighted by Business Process Trends, www.bptrends.com, and is used with permission.






Unconventional. Inventive. These words describe the best of our profession, and the best of what we find here at I-Spy. Together, we explore how the Internet can be a powerful tool to improve our work.

Quick recap: Every month, three sites, one theme. While far from comprehensive, these sites will, I hope, spark readers to look further and expand their views of HPT. Please keep in mind that any listing is for informational purposes only and does not indicate an endorsement by either the International Society for Performance Improvement or me.

These are the general categories I use for the sites featured:

  1. E-Klatch: Links to professional associations, research, and resources that can help refine and expand our views of HPT through connections with other professionals and current trends
  2. HPT@work: Links to job listings, career development, volunteer opportunities, and other resources for applying your individual skills
  3. I-Candy: Links to sites that are thought provoking, enjoyable, and refreshing to help manage the stresses and identify new ideas for HPT

The theme for this month’s column is Invention Conventions. August is a time to celebrate National Inventors Month, Admit You’re Happy Month, Psychic Week (but you knew that already) and, of course, August 28—both Crackers Over the Keyboard Day and Race Your Mouse Around the Icons Day. For more August celebrations, race your mouse to the Brownie Locks and the 3 Bears August holiday listing. In the United States, August is nestled between two of our presidential nominating conventions (Democrats: July 26-29 and Republicans: August 30-September 2). This combination inspires us to see how inventions and conventions improve our personal, organizational, and societal performance. And don’t forget—still one month to prepare your session proposal for the 2005 ISPI Conference! So, join us as we vent our innovations online, with “SHRM”s, squeaks, and spare hats. Bring your boss.

E-Klatch
How about a human resources invention convention? The Northeast Human Resources Association offers its 2004 HR Invention Convention, “Ahead of the Curve,” October 20-22, 2004, in Boston, MA. The NEHRA site has many features of value to discerningly inventive PTers, including job listings, articles and resources (including a Training and Development E-Survey), and conferences and events. The association is currently seeking convention and other speakers. NEHRA, with approximately 4,000 members and affiliated with the Society for Human Resource Management, “is a regional membership organization for human resources professionals.”

HPT@work
Can you invent a better mousetrap? How about a better, more fun mouse? Scamper on over to Squeakland.org to learn more about Squeak, an open source authoring tool to improve classroom learning. Squeak is “a media authoring tool”—software that you download to your computer and then use to create your own media or share and play with others. It is free to download. The site features guidelines for instructional designers using this software to create “eToys” and other tools for curriculum enhancement. For conventional instructional design researchers, there is plenty of theory behind it, too: “Alan Kay, guiding light of Squeak, has based this educational work with Squeak on a number of different sources. His Scientific American article details his theories, based in part on Jerome Bruner’s groundbreaking ideas of constructivist learning as well as Seymour Papert’s important work in using computers to find new ways to reach children with powerful ideas of math and science.” Any PTers up for the challenge of developing tools using Squeak? If so, share them with I-Spy, and we’ll share with our readers!

I-Candy
At the end of the day, do invention conventions improve our workplace? Well, at least they keep us thinking in new ways, picturing new solutions. For some ingenious approaches to system analysis and design, pay a visit to this gallery of Rube Goldberg inventions. “Through his wacky cartoons which depict the most elaborate and ridiculous devices to accomplish the most mundane tasks, Rube Goldberg’s ‘Inventions’ have become synonymous with any maximum effort to achieve minimal results.” Unsettlingly familiar to many performance technologists. The gallery and other fun stuff are brought to you by The Official Rube Goldberg website. If you know an unconventional inventor in college or high school, encourage them to enter the Rube Goldberg National Machine Contest. This year’s challenge: Select, mark, and cast an election ballot in 20 or more steps. Otherwise, grab your spare hat and Keep the Boss from Knowing You’re Late.

Did I mention our intention for an extension of reader retention? In other words, hope to see you in September!

When he is not Internet trawling for ISPI, Todd Packer can be found improving business, non-profit, and individual performance through research, training, and innovation coaching as Principal Consultant of Todd Packer and Associates based in Shaker Heights, Ohio. He may be reached at tp@toddpacker.com.





In an effort to reduce overstock in the ISPI bookstore, several books and CD-ROMs are on sale. This is a great opportunity for you to purchase some exceptional books at bargain prices. All book and CD-ROM sales are final and non-returnable. For a complete listing of sale items, click here.





With a title like this you’re probably thinking I’ve finally gone over the edge to total geekdom. But please bear with me. The distinction between units of analysis and units of measurement can help us do a better job measuring results. Confusing the two often yields measures of questionable value for accountants, engineers, or most business managers.

Let’s clarify an assumption from the outset: measurement is intended to assign quantitative values to outcomes that allow direct comparison before and after interventions, and between different types or instances of interventions. This is the perspective of performance engineering.

The Unit of Analysis
A unit of analysis is the “thing” we decide to increase, decrease, or otherwise improve. It is also what we measure, but the decision about how comes later. For example, to improve the performance of programmers, do we focus on lines of code, applets, or similar chunks of software they produce; or help them set and achieve short-term production goals, whatever those might be? Or both?

To improve service quality, do we try to influence specific positive and negative behaviors or words that customer service representatives (CSRs) use; focus on the consistency with which they follow processes and algorithms; or aim to reduce mistaken statements about products and services? These are all possible targets, but we must decide which to pinpoint when we begin the effort.

To improve sales performance, should we focus on how much product is sold, on revenues produced, on sales calls where customers ask for a next step, or perhaps on accepted proposals?

As we all know, pinpointing what we’re trying to improve is one of the most important steps at the start of any improvement effort. It occurs in an analysis where we define our objectives. Specifying the unit(s) of analysis allows us to design specific means to improve performance defined by those units.

The Unit (and Dimension) of Measurement
Once we decide what to improve, we must select the dimension(s) and units of measurement for quantification. What we measure (the unit of analysis) will constrain the range of dimensions we can select (e.g., we can’t measure customer satisfaction in kilograms), but it is still a separate decision.

In some cases the choice is obvious. Any financial goal is likely measured in currency units (e.g., dollars). Most industrial production can be measured in standard units—ounces, gallons, kilowatt-hours, and so on. We select medical/physiological units (e.g., temperature, blood pressure) if we’re working on health outcomes. Many accomplishments we seek to improve have measurement dimensions and standard units derived from scientific or engineering disciplines. Almost any unit of analysis that can be counted, such as people who do X, responses to inquiries, or successful and unsuccessful proposals, can be assigned the standard measurement dimension of “countability” (Johnston & Pennypacker, 1980).

But what about customer satisfaction, service quality, or degree of knowledge achieved by newly hired employees? What are the units for evaluating outcomes in these areas? This is where we get into trouble, in two ways.

First, we confuse units of analysis with units of measurement. This leads to non-standard quantities such as numerically labeled ratings on a category scale. While such numbers provide a convenient way to label points on a range of positive-to-negative feelings, they’re not standard units of either analysis or measurement that can be added, subtracted, multiplied, or divided. The common error is to act as though “customer satisfaction” IS the rating on the 5-point scale, and to treat the rating labels as units of measurement. The correct alternative is to define “people who rate service as X” as the unit of analysis, and then use countability as the dimension, i.e., to count those people. This yields standard quantities (counts) of defined units (people who say X) that can be calculated and compared arithmetically.
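A minimal sketch of that correct alternative, using made-up survey responses: the unit of analysis is “people who rated service as X,” and the measurement dimension is countability.

```python
from collections import Counter

# Hypothetical responses on a 5-point category scale.
ratings = [5, 4, 4, 5, 3, 5, 2, 4, 5, 5]

# Count people per rating label: standard quantities of a defined unit.
counts = Counter(ratings)
for label in sorted(counts, reverse=True):
    print(f"{counts[label]} people rated service as {label}")

# The common error: doing arithmetic on the labels themselves.
mean_label = sum(ratings) / len(ratings)  # 4.2 -- not a standard unit
```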

The second error is to select units of analysis (e.g., good and below-standard widgets from an assembly line), count each type of unit, then calculate percent of good widgets—a “dimensionless quantity” in which the counts are cancelled out—as our measure (Johnston & Pennypacker, 1980, pp. 139 ff). The problem is that by dropping the standard dimensional information (the actual counts) we produce numbers that cannot be directly compared. Is 95% good widgets from a batch of 100 produced per day the same as, better than, or worse than 95% good widgets from 1,000 produced per day? If we ignore the actual counts, we can’t know for sure. 95% of 1,000 is a lot bigger output than 95% of 100, and removing the actual counts obscures that difference. The same goes for percent-correct measures on tests with different numbers of items and different test durations.
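A short sketch of the widget example: the two days report the identical percentage, but the counts that percentage discards are very different.

```python
# Day A: 95 good widgets out of 100; Day B: 950 good out of 1,000.
good_a, total_a = 95, 100
good_b, total_b = 950, 1_000

print(good_a / total_a == good_b / total_b)  # True: both days are "95% good"
print(good_b - good_a)                       # 855 more good widgets on day B
```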

The Bottom Line
The message is that we should plan measurement in two steps: 1) select the unit of analysis (what you will measure), and 2) select the dimension and unit of measurement, remembering to use standard units and not reduce them to dimensionless quantities (percentages or ratios). This approach will yield the most useful information for decisions by accountants, executives, and even performance improvement professionals!

Reference
Johnston, J.M., & Pennypacker, H.S. (1980). Strategies and tactics of human behavioral research. Hillsdale, NJ: Lawrence Erlbaum Associates.

Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance, behavior, and the measurement of results. His easy-to-remember email address is CarlBinder@aol.com and you can read other articles by him at www.Binder-Riha.com/publications.htm. See past issues of this column by clicking on the “Back Issues” link at the bottom of the blue navigation bar to the left.




It is time once again for you, the ISPI membership, to determine the future direction of ISPI by nominating those members who you feel have the qualifications, experience, and vision to lead our Society. Up for nomination this year are the President-elect and two Board members. They will join the President, three continuing Board members, and the non-voting Executive Director who make up the eight-member Board.

The duties of the Board are to manage the affairs of the Society and determine the strategic direction and policy of the Society.

Brief Job Descriptions
President-elect
The President-elect assumes the presidency of ISPI for a one-year term at the conclusion of his or her one-year term as President-elect. The President-elect’s efforts are directed to assuming the Presidency, and assignments are designed to prepare for that transition. The President-elect serves to provide continuity of programs, goals, objectives, and strategic direction in keeping with policy established by the Board of Directors.

Director
Each Director on the Board serves a two-year term and is a leader in motivating support for established policy. He or she helps develop new policy and obtain support for ISPI’s programs. A Director should provide an objective point of view in open discussion on issues affecting the membership and profession. He or she should thoroughly analyze each problem considered, vote responsibly, and then support those actions adopted by majority vote. Each member of the Board is considered a spokesperson for ISPI and should represent integrity, dedication, and loyalty to established policy.

The deadline for nominations is August 27, 2004. If you would like to nominate a member, please send the following information to nomination@ispi.org:

  • The candidate’s name and contact information.
  • The position for which the candidate is being nominated.
  • Your name and contact information.
  • A 250-word statement on the candidate’s qualifications.

If you are interested in additional information on the nominations process, or the complete job descriptions and qualifications required, click here.






This past semester I taught a graduate course about performance technology for 20 on-campus students and 24 distance students. One assignment, dubbed PT Makeover, asked graduate students to choose a past project that would benefit from some serious PT “magic.” Here is another in a series of articles taken from their assignments.

The lesson in this two-part story is all about analysis and matched solutions. It is a “detailed overview” fable that Charlotte Donaldson presents, one with a bite. Who among us hasn’t tried to please by giving our clients just what they asked for and then wound up not serving anyone at all? That is what happened at Catch’em and Squeez’em Bank.


It All Begins
Once upon a time the Catch’em and Squeez’em Bank, located in a southern kingdom, converted to a new teller system. King Rufus had signed an expensive contract with a software vendor to train bank trainers for the new system. The software vendor came onsite and used its boilerplate training package to train all four branch managers, as requested. That group of branch managers, led by bank training director Shilly Shally, then quickly assembled their own training programs, based on bank policies and procedures. Shilly’s marching orders were “go train your branches.” For months, each branch manager implemented training at his or her branch, readying tellers for the big day of conversion. Shilly, busy with King Rufus (planning the summer offsite management meeting), assumed all was going well and didn’t follow up.

D-Day Arrives!
The software vendor’s training director, Willy Nilly, led his training and support staff onsite for the big conversion. As usual, staff would spend the first week in branches, “helping out.” About 10:15 that Monday morning, the avalanche of phone calls to the war room—from the north and south branches—commenced. “Willy, all H-E-double hockey sticks is breaking loose at the north branch! Customer serfs are lined up down the block!” “What’s wrong?” Willy cried. “The tellers must not have been trained well—or, they’ve forgotten everything they learned! Customers are going crazy because they can’t get their transactions completed!” “Oh, dear,” Willy moaned. “Stay put, I’ll send help.” Call after call came pouring in from the north and south branches. Kingdom police had to control traffic at the north branch. One customer was asked to stop the abusive language or to leave the kingdom. TV reporters were onsite at the south branch, in response to the ruckus. Willy was frantic—it had never been this bad before! What to do? He called his support staff at both the east and west branches. Oddly—very oddly—things were running smoothly, with only an occasional delay for their customers. “What happened? What’s going on?” Willy moaned.

Just then, King Rufus called. “Willy, your training program for our branch managers was a complete failure—this mess at the branches is your fault! I want your best trainer in the boardroom at 7:00 tomorrow morning until noon, to re-train the branch managers!” “Yes, sir, we’ll be there,” squeaked Willy. Willy called in his star trainer, Starr Trainer, and told him he had to re-teach the branch managers tomorrow morning. “Re-teach them what?!?” Starr asked in amazement. “That’s a three-day course we taught—what do I leave out?” “Just do whatever it takes to make them happy—give them a “detailed overview” of the system—if you don’t, they might sue us!” Sadly recognizing that “detailed overview” was an oxymoron, Starr began to ponder his plan of action—and his future.

The Re-training Occurs
The next morning, Starr did his best. He wore his new Jacque Pennéy suit, his killer cologne, and the purple silk tie that made the girls back home swoon. He brought doughnuts. He greeted everyone warmly as they entered the boardroom. Starr started promptly at 7:00 am and gave the best explanation of the system and its capabilities that had ever been delivered in the history of his software company. He interacted and engaged, called them all by name, and used the flip chart with panache—he felt sure he was high from the fumes of the markers! He smiled. He dazzled. His jokes were great. Just before noon, Starr asked for any final questions as he distributed the evaluation sheets. There were no questions, and the branch managers left the room, seemingly in an upbeat mood. The evaluations were great; they loved his session.

Late that afternoon, King Rufus called Willy and said the re-training had been a complete failure, too. The branches were still a mess. Willy, unsure what to do, sat in the war room and pondered where he had gone wrong, and what the heck Starr had done to mess the situation up.

Reflection on the Effort—As It Could Have Been
This story is an example of both clients and training professionals behaving as if training were the only intervention for effective performance. Robinson and Robinson (1996) contend those in a performance consultant role must focus on identifying and addressing gaps, performance drivers, and barriers, as opposed to a more limited view of addressing only learning needs. Performance consulting is what was needed from the start.

Human performance technology (HPT)—another name for performance consulting—is a systematic and systemic effort to identify opportunities and barriers to both individual and organizational performance, according to the International Society for Performance Improvement. While anyone could panic in a very bad situation like this one, Willy could have pursued the following dialogue when the King called about the hideous problems in the north and south branches:

“Your Royal Highness, I’ve heard about the north and south branches, and I sincerely regret this situation. As your partner, I know we can work toward a successful outcome. I also know our need is immediate and urgent. I’d like to conference in our four branch managers to determine the current environment, begin to determine gaps, and work toward a resolution. May I place you on hold, while I conference them in now?”

After the four north, south, east, and west branch managers were conferenced in, Willy might have used this line of questioning.... Return with us next month to learn how this “cliff hanger” ends!

References
Robinson, D.G. & Robinson, J.C. (1996). Performance consulting: Moving beyond training. San Francisco: Berrett-Koehler.

HPT: Principles of Human Performance Technology. Retrieved April 10, 2004, from www.ispi.org.

Charlotte Donaldson has spent her entire career involved in all phases of training and performance improvement for adults in the workplace. Formerly with EDS and Bank of America, Charlotte is currently a Learning Manager at Booz Allen Hamilton, Center for Performance Excellence. She completes her master’s in Distance Education from the University of Maryland University College in December 2004. Charlotte may be reached at donaldson_charlotte@bah.com.





Bonnie Shellnut, PhD, CPT, and Susan McClure, CPT, represent Minneapolis-based Carlson Marketing Group in a shared role as Advocate representatives to the International Society for Performance Improvement. Bonnie and Susan are part of Carlson Marketing Group’s Learning and Performance Solutions Group, a world-class team of instructional development and training experts. The group is composed of two business teams—one focused on automotive, and the other on non-automotive learning and training solutions. Bonnie is a Senior Training Manager and Performance Improvement Consultant with the automotive group in Detroit. Susan, based in Atlanta, is the Director of Consulting within the Minneapolis-based non-automotive group.

The team uses human performance technology principles to design targeted solutions to improve clients’ organizations, processes, and performance. Carlson Marketing Group’s “On the Lane” program for Ford Motor Company was honored with ISPI’s Award of Excellence for Outstanding Human Performance Intervention in 2002. “On the Lane” used a holistic approach to implement effective, consumer-focused point-of-sale processes with training, incentives, performance measures, and performance support. In addition to in-dealership training, the automotive team provides events, seminars, interactive distance learning, web courses and certification testing, and reward and recognition programs. The non-automotive team creates custom training and learning solutions in the areas of business knowledge, leadership and management development, sales and service, and personal effectiveness.

Carlson Marketing Group, a Relationship Marketing company, helps global Fortune 1000 clients solve their critical business issues and increase ROI by designing marketing strategies that build better relationships with the audiences that clients depend on for their success: employees, channel partners, and consumers.

The company provides turnkey services to support almost every marketing initiative including: sales meetings and product launches, meetings consolidation, website creation, e-learning systems, call-center management, award redemption and fulfillment, CRM campaign management, creative print and broadcast communications, partnership marketing alliances, corporate anniversary galas, and Olympics-related marketing campaigns and events.

Ranked by Advertising Age magazine as the largest marketing services agency in the United States, Carlson Marketing Group has international capabilities that span 26 countries. Offering well-designed Relationship Marketing strategies, the company helps its clients optimize their marketing spend through the design and delivery of custom solutions.

Carlson Marketing Group is one of the major operating groups of Carlson Companies, which has been recognized by Working Mother magazine as one of the “100 best companies for working mothers” in 2001, 2002, and 2003. In addition, Fortune magazine named Carlson Companies as one of “the 100 Best Companies to Work For” in 2002. For more information on Carlson Marketing Group, visit www.carlsonmarketing.com.




Call for Proposals
Amazing! Where did the summer go? School will be starting soon—or already has in some locations. It’s time to get organized for fall. And chief among the “Things to Do” is to make sure you have submitted a proposal for ISPI’s 2005 Annual International Performance Improvement Conference and Exposition, April 10-15 in Vancouver, British Columbia, Canada. The deadline is August 31, 2004. Click here to download a PDF of the submission guidelines. Receive a $50 speaker discount by submitting your conference registration along with your proposal application. This discount is extended only to the lead presenter. In the event your proposal is not accepted and you are unable to attend the conference, you will receive a full refund upon request.

Not sure how to compile a winning proposal? Click here for examples of a sample proposal application template, a successful session proposal, and a sample handout and performance tool.

Volunteer Opportunities: Calling All Students!
Are you interested in attending ISPI’s 43rd Annual Performance Improvement Conference and Exposition but unable to afford the registration fee? If you are willing to attend pre-assigned sessions or workshops, are open to monitoring sessions you may not have selected on your own, and are able to distribute and collect evaluation forms and assist ISPI presenters, send your name, complete mailing address, phone, fax, and email address to: conference@ispi.org.

ISPI will significantly reduce the conference registration fee for all conference volunteers. Volunteers will be responsible for their own travel, hotel, and other costs associated with attending the conference. Volunteers are assigned on a first-come, first-served basis. ISPI will contact you regarding your assignment in November.






ISPI is pleased to announce that the New Mexico Chapter will be inducted into the Chapter Hall of Fame at ISPI’s 43rd Annual International Performance Improvement Conference and Exposition in Vancouver, British Columbia, Canada, April 10-15, 2005. The New Mexico Chapter reached eligibility when it received the Chapter of Merit award at the 2004 Annual Conference in Tampa, Florida. Chartered chapters must be recognized for three consecutive years in at least two of the three categories for chapters in the Awards of Excellence program to be named to the Hall of Fame. ISPI’s Chapter of Merit Award is presented to chapters that fulfill rigorous standards of excellence; the Outstanding Chapter Communication Award is awarded to chapters whose communication vehicles support their objectives and strategic plan; and the Outstanding Educational Program recognizes chapters that spread the performance improvement message to others in academic and non-academic settings.

The New Mexico Chapter is the first chapter to be named to the Chapter Hall of Fame since 2002, when the Michigan Chapter was inducted. Other chapters named to the Hall of Fame in the past 10 years include the Montreal Chapter, the Great Valley Chapter, and the Columbia Northwest Chapter. To find out more about the New Mexico ISPI Chapter, visit their website at http://www.nmispi.org.

The ISPI Awards of Excellence program is designed to showcase people, products, innovations, and organizations that represent excellence in the field of instructional and human performance technology. The 2004-2005 Awards of Excellence Committee, chaired by Nancy Green, CPT, of Integ Inc., and co-chaired by Joseph Monaco, CPT, of Monaco Group Inc., is accepting submissions for the 2005 Awards of Excellence program through October 15, 2004. Recipients will be recognized in Vancouver on April 15, 2005.

ASTD Dissertation Award
The ASTD Dissertation Award is given each year to foster and disseminate research in the practice of workplace learning and performance. This year’s award will be presented to the person who has submitted the best doctoral dissertation for which a degree was granted between September 21, 2003, and September 20, 2004. The topic must focus on an issue of relevance to the practice of workplace learning and performance. Illustrative areas of concentration include: training and development, performance improvement/analysis, career development, organization development/learning, work design, and human resource planning.

All research methodologies will be considered on an equal basis, including, for example, field, laboratory, quantitative, and qualitative investigations. Candidates must be recommended and sponsored by their committee chair. All materials must be submitted in English, in Word format, by email. Submission requirements correspond to those of the Academy of Human Resource Development’s (AHRD) Dissertation of the Year program, which requires applicants to follow the full manuscript conference proposal submission guidelines.

The award winner will receive a $500 cash prize, a commemorative plaque presented at the awards ceremony during the 2005 ASTD International Conference and Exposition, and a designated place on the conference program to present the research (with conference registration fee paid). Submissions must be sent via email by September 20, 2004 to Jean Leslie at lesliej@leaders.ccl.org.

Performance Marketplace
Performance Marketplace is a convenient way to exchange information of interest to the performance improvement community. Take a few moments each month to scan the listings for important new events, publications, services, and employment opportunities. To post information for our readers, contact ISPI Director of Marketing, Keith Pew, at keithp@ispi.org or 301.587.8570.


Books and Reports
This second edition of Fundamentals of Performance Technology contains two new appendices that describe the ISPI-developed Standards of Performance Technology and map the content of Fundamentals of Performance Technology and Performance Improvement Interventions to those Standards.

Serious Performance Consulting According to Rummler uses an extensive case study to illustrate what a serious performance consulting engagement looks like and what a serious performance consultant does. Do you have what it takes to be an SPC?

Training Ain’t Performance is a whimsical, entertaining, and solidly written book that addresses human performance. From beginning to end, readers are guided toward an understanding of human performance improvement and how to use it for real organizational value.

Conferences, Seminars, and Workshops
Add performance and pizzazz to your training. Whether it’s a 45-minute presentation or a week-long workshop, Thiagi can make your training come alive with interactive experiential activities. Nobody does instructional design faster, cheaper, and better than Thiagi.

Darryl L. Sink & Associates, Inc. is offering these workshops for Fall 2004: Designing Instruction for Web-Based Training, Atlanta, September 14-16; The Instructional Developer Workshop, Los Angeles, September 21-23; The Criterion Referenced Testing Workshop, Oak Brook, IL, October 5-6. Visit www.dsink.com to register!

Job and Career Resources
ISPI Online CareerSite is your source for performance improvement employment. Search listings and manage your resume and job applications online.

Magazines, Newsletters, and Journals
The International Journal of Coaching in Organizations (IJCO) is a professional journal, published quarterly to provide reflection and critical analysis of coaching in organizations. The journal offers research and experiential learning from experienced practitioners representing various coaching schools and methodologies.

Performance Improvement Quarterly, co-published by ISPI and FSU, is a peer-reviewed journal created to stimulate professional discussion in the field and to advance the discipline of Human Performance Technology through literature reviews, experimental studies with a scholarly base, and case studies. Subscribe today!

Resource Directories
ISPI Online Buyers Guide offers resources for your performance improvement, training, instructional design, and organizational development initiatives.

ISPI Membership
Are you working to improve workplace performance? Then ISPI membership is your key to professional development through education, certification, networking, and professional affinity programs.

If you are already a member, we thank you for your support. If you have been considering membership or are about to renew, there is no better time to join ISPI. To apply for membership or renew, visit www.ispi.org, or simply click here.

Call for Articles
ISPI is looking for Human Performance Technology (HPT) articles (approximately 500 words and not previously published) for PerformanceXpress that bridge the gap from research to practice (please note that no product or service promotion is permitted). Below are a few examples of article formats that can be used:

  • Short “I wish I had thought of that” Articles
  • Practical Application Articles
  • The Application of HPT
  • Success Stories

In addition to the article, please include a short bio (2-3 lines) and a contact email address. All submissions should be sent to april@ispi.org. Each article will be reviewed by one of ISPI’s on-staff HPT experts, and the author will be contacted if it is accepted for publication. Any further questions may also be directed to april@ispi.org.

Go to the printer-friendly version of this issue.


Feel free to forward ISPI’s PerformanceXpress newsletter to colleagues or anyone else who may benefit from the information. If you are reading someone else’s PerformanceXpress, send your complete contact information to april@ispi.org, and you will be added to the PerformanceXpress email list.

PerformanceXpress is an ISPI member benefit designed to build community, stimulate discussion, and keep you informed of the Society’s activities and events. This newsletter is published monthly and will be emailed to you at the beginning of each month.

If you have any questions or comments, please contact April Davis, ISPI’s Senior Director of Publications, at april@ispi.org.

ISPI
1400 Spring Street, Suite 260
Silver Spring, MD 20910 USA
Phone: 1.301.587.8570
Fax: 1.301.587.8573
info@ispi.org

http://www.ispi.org