by Patricia Pulliam Phillips


Return on investment (ROI) in training and performance improvement programs has never attracted more interest than it does today. Go to any conference, pick up any book on training, performance improvement, e-learning, or human resources, or check the promotional materials for various programs, and you will see ROI used in some form or fashion.

Occasionally the acronym ROI is defined in terms unknown to management, such as return on information or return on individuals. Familiar financial acronyms such as ROE (return on equity) and ROA (return on assets) are likewise co-opted as return on expectations and return on anticipation. Again, these terms are unfamiliar to management, confusing decision makers when they compare the impact of investments across programs, processes, and organizational functions.

At other times ROI is accurately defined as return on investment; look closely, however, and the actual description often represents only the cost-reduction benefits of a program or process and fails to compare those benefits to the cost of implementing it.

So, what is ROI?
ROI is a concept used in accounting and finance for centuries to show the relationship between profit and invested capital. It became widespread in industry for making decisions about operating performance in the early 1960s (Horngren, 1982), and is used today to compare opportunities inside and outside the organization.

The financial equation for ROI is earnings (net income) divided by investment. Phillips (1983) introduced the concept to the field of training and performance improvement, defining the equation as:

ROI (%) = (Net Program Benefits / Program Costs) × 100

Put simply, the ROI ratio (expressed as a percentage) is derived by converting program benefits to monetary value, subtracting the program costs from the monetary benefits, dividing the difference (the “net” program benefits) by the program costs, and multiplying the result by 100.

This calculation provides a cost-benefit comparison in terms management understands and can easily compare to the impact of other investments.
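The calculation described above can be sketched in a few lines of code. The dollar figures here are purely illustrative, not drawn from any program in this article:

```python
def roi_percent(benefits, costs):
    """ROI (%) = (net program benefits / program costs) x 100."""
    net_benefits = benefits - costs          # subtract costs from monetized benefits
    return net_benefits / costs * 100        # express the ratio as a percentage

# Hypothetical program: $750,000 in monetized benefits against
# $500,000 in fully loaded program costs.
print(roi_percent(750_000, 500_000))  # -> 50.0
```

A 50% ROI tells management that, after the program paid for itself, every dollar invested returned an additional fifty cents.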

Is ROI enough to show the value of training and performance improvement programs?
The ROI ratio represents all the major elements of financial success of a program. But alone, it is an imperfect measure—other measures of performance provide input into program success. Figure 1 provides a summary of the various measures to be included in evaluating training and performance improvement program success, including ROI (Phillips, 2002; Phillips, 1997; Kirkpatrick, 1994).




  • Level 1: Reaction, Satisfaction, and Planned Action
  • Level 2: Learning
  • Level 3: Application
  • Identify Barriers and Enablers
  • Isolate the Effects of the Program
  • Level 4: Impact/Benefits
  • Convert Benefits to Monetary Value
  • Tabulate Program Costs
  • Identify Intangible Benefits
  • Compare Monetary Benefits to Program Costs
  • Level 5: ROI
  • Intangible Benefits

Figure 1. Summary of Measures Included in Developing ROI.

Should the ROI be calculated for all programs?
No. It is neither necessary nor feasible. A number of criteria should be considered in determining whether ROI is appropriate. A simple way to decide which programs should be evaluated at the ROI level is to list all the programs being considered, compare them to the criteria, and assign points to each criterion for each program. The programs with the highest scores are those to be considered for ROI. Table 1 provides a simple tool to help determine which programs should be evaluated at the ROI level.

Criteria                               Program #1   Program #2   Program #3   Program #4
1. Life Cycle of the Program           ____         ____         ____         ____
2. Linkage to Organization Strategy    ____         ____         ____         ____
3. Program Costs                       ____         ____         ____         ____
4. Audience Size                       ____         ____         ____         ____
5. Visibility                          ____         ____         ____         ____
6. Management Interest                 ____         ____         ____         ____

Rating Scale

1. Life Cycle 5 = Long life cycle; 1 = Very short life cycle
2. Linkage to Strategy 5 = Closely related to organization strategy; 1 = Not directly linked to organization strategy
3. Program Costs 5 = Very expensive; 1 = Very inexpensive
4. Audience Size 5 = Very large audience; 1 = Very small audience
5. Visibility 5 = High visibility; 1 = Low visibility
6. Management Interest 5 = High level of interest in evaluation; 1 = Low level of interest in evaluation

Table 1. Selection Criteria for ROI (Source: Phillips & Burkett, 2001).
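A sketch of how the Table 1 tool might be applied. The scores below are entirely hypothetical, chosen only to show the mechanics of rating each program 1-5 per criterion and ranking by total:

```python
# Six criteria from Table 1, rated 1-5 per the rating scale above.
criteria = ["Life Cycle", "Linkage to Strategy", "Program Costs",
            "Audience Size", "Visibility", "Management Interest"]

# Hypothetical ratings for four candidate programs.
scores = {
    "Program #1": [5, 4, 5, 4, 5, 5],
    "Program #2": [2, 3, 1, 2, 2, 3],
    "Program #3": [4, 5, 3, 5, 4, 4],
    "Program #4": [1, 2, 2, 1, 3, 2],
}

# Total each program's points; the highest totals are the strongest
# candidates for evaluation at the ROI level.
totals = {name: sum(vals) for name, vals in scores.items()}
for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {total}")
```

With these invented ratings, Program #1 (large, expensive, visible, strategically linked) rises to the top of the ROI shortlist.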

Should all organizations pursue ROI?
Again, the answer is no. Pursuing a comprehensive evaluation process may not be right for every organization. ROI is intended for organizations that need to show the linkage between performance improvement programs and business strategy, and for organizations in which accountability is an issue.

However, it is never too early to start thinking about ROI. A comprehensive evaluation process, including ROI, takes time to understand, implement, and integrate into an organization. So if there is any chance an organization will ever need to show the value its training and performance improvement programs bring, ROI should be considered.

Are you a candidate for ROI? Take the following self-assessment.

Is Your Organization a Candidate for ROI Implementation?
Read each question and check off the most appropriate level of agreement.
1 = Disagree; 5 = Total Agreement

1. My organization is considered a large organization with a wide variety of training and performance improvement programs.          
2. We have a large training and performance improvement budget that reflects the interest of senior management.          
3. Our organization has a culture of measurement and is focused on establishing a variety of measures including training and performance improvement.          
4. My organization is undergoing significant change.          
5. There is pressure from senior management to measure results of our training and performance improvement programs.          
6. My training and performance improvement function currently has a very low investment in measurement and evaluation.          
7. My organization has experienced more than one program disaster in the past.          
8. My organization has a new training and performance improvement leader.          
9. My team would like to be the leaders in training and performance improvement processes.          
10. The image of our training and performance improvement function is less than satisfactory.          
11. My clients are demanding that our training and performance improvement processes show bottom-line results.          
12. My training and performance improvement function competes with other functions within our organization for resources.          
13. There is increased focus on linking training and performance improvement processes to the strategic direction of the organization.          
14. My training and performance improvement function is a key player in change initiatives currently taking place in my organization.          
15. Our overall training and performance improvement budget is growing and we are required to prove the bottom-line value of our processes.          

Scoring
If you scored:

15-30 You are not yet a candidate for ROI.

31-45 You are not a strong candidate for ROI; however, it is time to start pursuing some type of measurement process.

46-60 You are a candidate for building skills to implement the ROI process. At this point there is no real pressure to show the ROI, making it an ideal opportunity to refine the process within the organization.

61-75 You should already be implementing a comprehensive measurement and evaluation process, including ROI.

Source: Phillips, 2002.
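The scoring key above is simple enough to automate. A minimal sketch, using a hypothetical set of fifteen responses:

```python
def roi_readiness(responses):
    """Total fifteen 1-5 responses and map the total to the scoring bands."""
    if len(responses) != 15 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected fifteen responses, each from 1 to 5")
    total = sum(responses)
    if total <= 30:
        band = "not yet a candidate for ROI"
    elif total <= 45:
        band = "not a strong candidate; start pursuing measurement"
    elif total <= 60:
        band = "candidate for building ROI skills"
    else:
        band = "should already be implementing ROI"
    return total, band

# Hypothetical self-assessment:
print(roi_readiness([4, 3, 5, 4, 4, 2, 3, 3, 5, 2, 4, 3, 4, 4, 5]))
```

This invented response set totals 55, placing the organization in the 46-60 band: ready to build ROI skills before the pressure to show results arrives.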

So what’s the bottom line?
The bottom line is that ROI is here to stay. It has a long history of showing the value of investments and is now being used by organizations around the globe to do the same for training and performance improvement programs. Used as it was originally intended and in combination with other measures of program success, ROI can be a powerful tool for training and performance improvement functions.

References
Horngren, C.T. (1982). Cost accounting. Englewood Cliffs, NJ: Prentice Hall.

Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler Publishers.

Phillips, J.J. (1983). Handbook of training evaluation and measurement methods. Houston, TX: Gulf Publishing.

Phillips, J.J. (1997). Return on investment in training and performance improvement programs. Boston, MA: Butterworth-Heinemann.

Phillips, P.P. (2002). The bottomline on ROI. Atlanta, GA & Silver Spring, MD: Center for Effective Performance & International Society for Performance Improvement.

Phillips, P.P., & Burkett, H. (2001). Managing evaluation shortcuts. Infoline. Alexandria, VA: ASTD.

Patti Phillips is CEO of The Chelsea Group, an international consulting and research organization focused on accountability issues within organizations. Patti is author of The Bottomline on ROI co-published by the Center for Effective Performance and ISPI (2002), and co-author of The Human Resources Scorecard: Measuring the Return on Investment published by Butterworth-Heinemann (2001). She may be reached at thechelseagroup@aol.com.

 




by Jeanne Farrington, ISPI Director


It’s my first day in the office
since the International Society for Performance Improvement’s Annual Conference in Dallas. There’s a big stack of handouts and other conference stuff sitting on my desk, which I’ll file where I can find it again. Inevitably, I’ll want to send something to a colleague, provide a reference, or figure out how to do some new thing I heard about at the conference. Every year, without fail, there are useful gems in the pile of papers I bring home.

My first conference was in 1988 in Washington, DC. I remember being nervous about my presentation. Being new to the field and finding myself in the company of so many experienced attendees, I was afraid I wouldn’t have much to add—still, I had something to share. At the presenters’ reception I met the author of a textbook we used at San Jose State. He came to my presentation and gave me some helpful pointers afterward. Plus, he threw in some career queries for good measure: Where did I see myself in five years? In 10 years? And there it was, mentoring and guidance, which started from a question or two and then continued at conferences and through correspondence for many years after that.

Years later I noticed I was the one asking questions about someone’s career or about being more involved with ISPI. What do you want to accomplish with your work? Would you write an article about that? How about assisting us with this project? And people said they were encouraged by these questions. “Oh, you think I can write? I always wanted to write.”

This year was ISPI’s 40th conference. People attended from 22 different countries—service men and women in uniform, academics, business folks, training professionals, performance technologists, newcomers, and old timers. Everywhere I went I saw people introducing themselves and each other, talking about ideas and practical matters, having friendly arguments about the best way to structure a session, or sharing tips about finding work in this less than auspicious economy.

Over the years surface things change about the conference. But the most important things remain the same: it’s a great place to network with other people in our field, and it’s a great place to work toward keeping current. Human Performance Technology is such a big field that we can always broaden or deepen our knowledge about something we know a lot about, and we can always begin to learn something new. Learning from each other at our conferences is a great way to do both of these things.

  



by Carol Haig & Roger Addison


In this TrendSpotters interview, we talked with Dick Clark of the Rossier School of Education at the University of Southern California. Dick is the 2002 recipient of ISPI’s Thomas F. Gilbert Distinguished Professional Achievement award. He may be reached at clark@usc.edu. Based on his research, Dick identifies three trends to watch:

Significant Trends
The high failure rate of such interventions as Total Quality Management (TQM) and Downsizing has generated new Organizational Change Strategies. The growth of complex technologies is fueling the need for new Knowledge Capture Models. And, there is new evidence of the power of Motivation to increase performance.

Impact of These Trends
According to Standard & Poor’s, organizational life expectancy has diminished from 65 years in 1920 to 10 years in 2001. Other research shows that in a two-year period, TQM produced no results for 67% of companies surveyed, while 40% of the Fortune 300 reported TQM as a “total failure” and another 40% reported “inadequate results.” Downsized organizations had a 9% sales increase and a 4.7% shareholder value increase after three years. In contrast, comparable non-downsized organizations reported a 26% increase in sales and 34% in shareholder value in the same period.

With the lifecycle of organizations now so short, effective change strategies to ensure survival are a necessity. Successful experimental organizational change strategies, according to Mourier and Smith, Conquering Organizational Change (CEP, 2001), share these markers:

  • Change must be gradual and systemic rather than rapid and episodic
  • The commitment to change strategies must permeate an organization from top management to the team and individual level
  • Support and encouragement must be given to organizational mavericks who have positive ideas

Such change strategies must also match the organization’s environment in:

  • Knowledge intensity
  • Speed of change
  • Complexity of change

Knowledge capture has historically relied upon Subject Matter Experts (SMEs) and exemplary performers to extract the “right” information for training. Research tells us that as much as 50% of SME-provided information is missing or inaccurate, but new Cognitive Task Analysis methods can eliminate these errors. Our tried-and-true ISD model supports training development for simple knowledge but may lack the features and flexibility to support the needs of knowledge-intense industries requiring specialized training systems for complex knowledge.

The U.S. Joint Chiefs of Staff and Hewlett-Packard are experimenting with new systems designed to capture accurate and complex knowledge. Their key characteristics are:

  • Separate the design of routine and non-routine applications and integrate them in instruction, and capture knowledge for training using cognitive task analysis
  • Organize on-the-job, monitored practice over time
  • A new rule: the practice required to learn and transfer complex knowledge is NOT equal to the sum of its parts so application exercises must include all knowledge learned

Current research shows that motivation is a necessary condition for performance, no matter how much knowledge a worker possesses. That is, even the most skilled and knowledgeable worker needs appropriate motivation to produce optimum results.

Because complaints about motivational systems center on fairness, communication, and clarity, new motivational systems must:

  • Help workers create personal value for their work tasks
  • Coach people so that they believe they can succeed at work goals
  • Demonstrate the organization’s alignment with tools and resources plus efficient and effective work processes to do the job
  • Create a positive, optimistic mood

Implications of These Trends for Practitioners
Faced with a short organizational lifespan, the savvy HPT practitioner will seek out new organizational change processes that match the success profile and look for opportunities to use them to improve performance, rather than waiting for a problem to present itself. Change must become an essential part of the culture in all organizations.

Resources
The Center for Effective Performance will publish Turning Research into Results, Dick’s book with Fred Estes, in late spring.

HPT practitioners interested in new training design models for complex knowledge should read a new article by Van Merrienboer, Clark, and de Croock (2002) titled “Blueprints for Complex Learning: The 4C/ID Model” in Educational Technology Research and Development, vol. 50, no. 2.

Practitioners interested in exploring new ISD models and knowledge management strategies can attend the International Society for Performance Improvement’s 2002 Fall Conference: Performance-Based Instructional Systems Design to discover what their colleagues and leaders in the field have been learning.

And, to learn more about the latest motivation research, readers can purchase the results of a research partnership between the SITE Foundation and ISPI. See the “Incentives, Motivation, and Workplace Performance: Research and Best Practice” article in this issue of PerformanceXpress for more details.

If you have any suggestions of trends driving performance in today’s business environment that you feel would be of interest to the PerformanceXpress readership, please contact Carol Haig at carolhaig@earthlink.net or Roger Addison at roger@ispi.org.

  



Last month, the International Society for Performance Improvement (ISPI) launched its Certification program and new website at the 40th Annual ISPI Conference & Expo in Dallas, Texas.

ISPI developed the Certification program in response to a growing desire among employers and clients to have standards and criteria to help them distinguish practitioners who have proven they can produce results through a systematic process. In addition, practitioners have asked for a credential that would help them assess their ability, better focus their professional development efforts, and recognize their capability. The result is a program through which individuals can apply to receive the designation of Certified Performance Technologist (CPT).

From April 30, 2002 to May 1, 2003, ISPI is offering professionals with six years of experience in the field of performance improvement an opportunity to apply under a special “grandparenting” provision. To see if you qualify for this exemption or for more information, please visit www.certifiedpt.org or e-mail certification@ispi.org.

 

  


by Carl Binder


A reader wrote to ask, “How can ‘credit’ be reliably attributed to various HPT interventions, when there are so many possible causal factors?” Assuming that we gather good measures of the behaviors, accomplishments, and/or business results that we are attempting to improve, how can we decide if our intervention actually produced the desired results?

At the recent ISPI Annual Conference & Expo in Dallas, Randall Finfrock and I addressed this question in a session called “Practical Performance Evaluation Without Statistics.” Let me give you a high-level summary and then suggest some additional resources.

First, we need to measure (i.e., count) what we hope to change on a regular basis over time. To assess learning we might count correct and incorrect behaviors per minute for brief samples each day over weeks. For job outputs we might count per hour, per day, or per week on a weekly or monthly calendar basis. For business results, we might monitor on a count-per-week or per-month basis. Using such “time series” measures, we can create graphs that display levels, trends, and variability (bounce) over time.
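The level, trend, and variability of such a time series can be computed directly from the counts. A minimal sketch with invented daily counts (the measures here, mean for level, least-squares slope for trend, and range for bounce, are common choices, not prescribed by the article):

```python
# Hypothetical daily counts of correct responses per minute over two weeks.
counts = [12, 14, 13, 15, 16, 15, 17, 18, 17, 19, 20, 19, 21, 22]

n = len(counts)
level = sum(counts) / n                      # average level of performance

# Trend: ordinary least-squares slope of count against day number.
xs = range(n)
x_mean = sum(xs) / n
slope = (sum((x - x_mean) * (y - level) for x, y in zip(xs, counts))
         / sum((x - x_mean) ** 2 for x in xs))

bounce = max(counts) - min(counts)           # variability ("bounce")

print(f"level={level:.1f}, trend={slope:.2f}/day, bounce={bounce}")
```

Plotting the raw counts alongside these summaries gives the graph of levels, trends, and variability that the non-statistical designs below rely on.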

There are several designs that apply to this type of non-statistical evaluation. We presented six types in Dallas.

The first is simply to begin measuring and see if the desired changes occur over time. While this cannot tell for certain whether or not the intervention caused observed results, it does tell whether desired results are being achieved or not. For many managers, this is sufficient—as long as the numbers are moving in the right direction.

A second approach compares two simultaneous conditions, for example productivity of two comparable groups receiving different coaching methods. If we can repeat (or replicate) a difference between two conditions several times, then we can be pretty sure that the different approaches caused the different results.

Additional designs start with a “baseline”—a period of time during which we gather and graph measures to determine the level, trend, and variability of performance before the intervention. After a baseline, we can introduce the intervention and see whether it causes a change in level, trend, and/or variability. If we can repeat a result several times, we can be fairly confident that the intervention is causing the difference in results.

Occasionally we use a “reversal design” in which we implement an intervention after a baseline period, and then reverse to the baseline condition after a period of the intervention. If the performance improvement returns to the level or trend observed in baseline, we can conclude that it was our intervention that caused the change in performance. The trouble with reversal designs is two-fold: First, many phenomena simply do not reverse, for example whenever something is learned during the intervention it is unlikely to be unlearned in the reversal. Second, if we achieve a desired result, most managers are not eager to go back to the way things were.

A type of design called “multiple baseline” is often more practical than a reversal. It applies whenever we can do a pilot test or a staged rollout over people/groups, locations, or specific types of outcomes. We first gather baseline data for a number of different people/groups, locations, or types of outcomes. We then intervene in one of them but not in the others. We wait for a while to see if there is an effect, and then intervene in the second situation, watch for an effect, intervene in the third, and so on. If there is a result each time we introduce the intervention, but not until, we can be quite certain that our intervention is, indeed, the cause of improvement.
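The multiple-baseline logic can be illustrated with a small sketch. The weekly counts and rollout weeks below are hypothetical, invented only to show the staggered comparison:

```python
# Hypothetical weekly output counts for three groups in a staged rollout:
# each tuple is (baseline weeks, weeks after the intervention starts).
# The intervention begins at week 5 for A, week 9 for B, week 13 for C.
data = {
    "A": ([20, 21, 19, 20],
          [26, 27, 28, 27, 28, 29, 28, 29, 30, 29, 30, 31]),
    "B": ([22, 21, 22, 21, 22, 21, 22, 21],
          [27, 28, 27, 28, 29, 28, 29, 30]),
    "C": ([19, 20, 19, 20, 19, 20, 19, 20, 19, 20, 19, 20],
          [25, 26, 25, 26]),
}

for group, (baseline, after) in data.items():
    base_mean = sum(baseline) / len(baseline)
    after_mean = sum(after) / len(after)
    print(f"Group {group}: baseline {base_mean:.1f} -> intervention {after_mean:.1f}")
```

If each group's counts jump only when its own intervention begins, and never before, the staggered pattern itself rules out most competing explanations.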

For a practical summary of these evaluation designs, see chapter 13 of Aubrey Daniels’ classic book on performance management, Bringing Out the Best in People. You can also download the slide set from our recent ISPI presentation.

Next month we’ll begin to present examples of measures that you can use to determine the effects of different types of performance improvement interventions.

Reference
Binder, C., & Finfrock, R. (2002). Practical performance improvement evaluation without statistics. Presented at the ISPI Conference, Dallas, Texas, April 23. Handouts available on the ISPI Conference CD-ROM or at www.Binder-Riha.com/publications.htm.

Daniels, A.C. (1994). Bringing out the best in people. New York: McGraw-Hill, Inc. Chapter 13, pp. 106-113.

Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance, and behavior to deliver valuable results. His easy-to-remember e-mail address is CarlBinder@aol.com and his company’s website is www.Binder-Riha.com.


 


by Christine Marsh

Words into Action
“The measure of people’s intent is based not only on what they say, but also more importantly on what they do.”

Concept to Reality
The idea of forming ISPI Europe was first expressed at the ISPI Conference in San Francisco in 2001. From that initial concept, action was taken by a group of dedicated people that has resulted in the launch of ISPI EMEA (Europe, Middle East, and Africa) at the recent ISPI Conference in Dallas, 2002.

The Way Forward—Global Fluency
How will the principles of HPT be understood and accepted across so many different countries and cultures? The aim is to build a bridge to span the different cultures and languages whilst respecting the local environment.

Culture: Beliefs and values, processes and ways of living and working together or put simply “The way we do things around here!”

Language: Another factor we should not underestimate. Our choice and use of specific words, especially HPT “speak,” must not be a barrier to open and honest communication.

Conference Committee
In order to organize the first official ISPI EMEA event to be held in the Netherlands in June 2002, a Conference Committee was formed which includes: Monique Mueller, Switzerland, Andreas Kuehn, Germany, Christian Voelkl, Germany, François Lamotte, France, and Jan-Peter Kastelein, The Netherlands.

To pay tribute to this dedicated group of people, we offered each member the opportunity to express personal reasons for their commitment to turning this concept into a reality. Bear in mind the subtle changes that can occur in the process of translation.

Monique Mueller

Spanish     Nuestro grupo se ha encontrado gracias al interés profesional en HPT, la ilusión por mejorar la vida laboral y el deseo de dar a conocer HPT y ISPI a través de Europa, Oriente Medio y África.

Nuestro trabajo en común para crear ISPI EMEA también ha tenido como resultado nuevas amistades con las que nos reímos, nos divertimos, y cuidamos del buen comer.

English Translation     A shared belief in the professional value of HPT, a mission for improving people’s work life, and the desire to spread the knowledge about HPT and ISPI across Europe, the Middle East and Africa brought us together.

I love the shared sense of humor, the fun, and the friendships that have developed from our common effort to create ISPI EMEA.

Andreas Kuehn

German     Ich finde ISPI EMEA deshalb so wichtig, weil wir mit einer solchen Struktur überhaupt erst in der Lage sind, internationale Projekte zu managen, unseren Kunden zu zeigen, dass die Integrationsarbeit, die dort oft noch zu leisten ist, bei uns bereits vollzogen ist, wir uns zusammengerauft haben und voneinander lernen.

English Translation     For me ISPI EMEA is important because only such a kind of structure enables us to manage international projects. We can show our clients that we’ve already made much progress in the integrational work, which they still often have to accomplish, that we found a way to work together and to learn from each other.

François Lamotte

French      J’ai la conviction que créer un lieu actif d'échanges sur notre pratique est un Objectif Fondamental dont le sens est de grande valeur. Nous nous intéressons à ce qui donne du sens au travail et aux liens que les Hommes tissent entre eux dans l'entreprise.

L’expérience acquise dans ce cadre montre que nous y apprenons chaque jour à ne plus faire des barrières de tous ordres, techniques, culturelles, politiques, des obstacles, mais bien à les vivre comme le stimulant essentiel d'un enrichissement mutuel dans la diversité.

English Translation     I am convinced that creating a space for actively exchanging our practices is a fundamental goal with great value in itself. We are interested in what gives sense to our work and the bonds people weave amongst them within their businesses.

The experience acquired in this setting shows that we learn daily not to turn barriers of any kind (technical, cultural, political) into obstacles, but rather to experience them as an essential stimulus for mutual enrichment in diversity.

Christian Voelkl

English      One of the basic tenets in our profession is that we are able to put ourselves in the shoes of our clients, peers, or colleagues to see the world through their eyes and from their perspectives. While living and working in foreign countries for several years, I enjoyed the opportunity to develop a much deeper understanding of this principle and it helped me foster an appreciation for what’s really important in our field: the value of human beings. And no matter how big the differences between individuals might seem at times, we shall always remind ourselves to rather focus on the many things that we all have in common. Our daily struggles to come closer to each other is what’s really at heart for me when I think about Global Fluency.

Jan-Peter Kastelein

Dutch      Het is geweldig om deel uit te maken van een team dat zich met zo veel passie inzet om de afstand tussen ISPI leden te overbruggen.

English Translation   It is great to be part of a team that is so passionate about bridging the distance between ISPI members who work and live in different areas of the world.

Additional Contributions to be Acknowledged
The added dimension of extending ISPI Europe to ISPI EMEA (Europe, Middle East, and Africa) was a direct result of the contribution of Michelle Katz, Israel and Belia Nel, South Africa.

For more information on the ISPI Europe, Middle East, Africa (EMEA) Conference and Assembly Meeting in The Netherlands, June 20-22, 2002 visit www.ispi.org.



Join your colleagues for Performance-Based Instructional Systems Design, September 26-28, 2002, Chicago, IL, a conference devoted to the latest models, methods and tools for the design of learning. You will take away valuable hands-on solutions to your most critical challenges in Instructional Systems Design, and will return to your employer or clients with the tools needed to improve performance and deliver success.

Dr. Allison Rossett delivers the keynote address, entitled The Sweet Spot: Where ISD, Performance and E-Learning Come Together. Is e-Learning the answer to training needs? Can it possibly fulfill its great promise? Not, Rossett believes, without performance technology, and not without ISD. Rossett, Professor of Educational Technology at San Diego State University, is editor of the ASTD E-Learning Handbook: Best Practices, Strategies and Case Studies for an Emerging Field (2002).

Our Masters Series presenters are Brenda Sugrue, PhD and Darryl Sink, EdD.

Dr. Sugrue will describe and illustrate a variety of strategies for increasing learning and performance improvement outcomes from e-Learning. These include strategies for online learning by doing, integrated performance support, and building communities of practice. Sugrue co-edited the ASTD book Performance Interventions: Selecting, Implementing, and Evaluating the Results (1999). She was profiled by Training magazine in 2001 as one of the training field’s “movers and shakers.”

Dr. Sink suggests viable alternatives or modifications to the “traditional” ISD process that make it more efficient, flexible, and appropriate for performance-based, results driven environments. Sink will highlight positive changes in the practice of instructional design and development by exploring how “master” instructional designers approach the ISD process. Sink is a human performance improvement consultant specializing in training and development solutions and has twice received the ISPI Award of Excellence for Outstanding Instructional Product of the Year.

Performance-Based Instructional Systems Design will also feature some 30 concurrent sessions by Carl Binder, Marilyn Gilbert, Ken Silber, Rob Foshay, Ruth Colvin Clark, Lynn Kearny, Margo Murray, Judith Hale, William Coscarelli, Sharon Shrock, Peter R. Hybert, and others, as well as a special conversation with Don Tosti and Geary Rummler.

The conference is preceded by full-day workshops offered on September 25 by Dr. Tom Welsh; Ruth Colvin Clark, EdD; and Lynn Kearny and Kenneth H. Silber, PhD.

Take away new knowledge and insights, a plethora of useful performance tools, and valuable new contacts with experts and peers. The conference is limited to 250 participants, so make your plans early. Contact ISPI for a conference brochure at 1.301.587.8570, or visit www.ispi.org/isd.

  



Triad designs custom learning solutions for major corporations around the world to help them get business results from learning. For more than 13 years, Triad has been committed to providing the most advanced, customized learning strategies, methods, and tools, all for the express purpose of enhancing human and organizational performance.

Since Triad’s founding, our consultants have been actively involved as committee chairs and presenters at local chapters and international conferences. We were among the architects of ISPI’s HPT Institute. In addition, our work has been recognized several times by our peers at ISPI, with numerous local awards and three international Awards of Excellence to our credit, including Outstanding Instructional Communications (1998), Outstanding Human Performance Intervention (1996), and Outstanding Instructional Product or Intervention (1991).

Triad employs an approach known as High Impact Learning Systems®, or HILS®, a thoroughly researched and repeatedly proven planning and measurement process that enables companies to align their learning and performance-support initiatives with specific business goals. The HILS® approach was developed by Dr. Robert O. Brinkerhoff, a renowned authority in the learning and development field and a Triad principal consultant. Our principal consultants, including Rob Brinkerhoff and Anne Apking, have contributed extensively to this field with numerous books, articles, and presentations.

We are very proud of our long association with ISPI and are honored to have the opportunity to increase our level of commitment to ISPI by upgrading our sponsorship from Patron to Advocate. Anne Apking will serve as Triad’s Advocate representative.

Triad’s corporate headquarters is in Farmington Hills, MI. For more information, visit us on the web at http://www.triadperform.com.


 



by Pat McLagan


Change is a focal point of attention today for two major reasons:

Faster Pace of Change. The pace of change has accelerated. What took years, decades, or even centuries in the past now happens within minutes, days, or weeks. Here are just a few examples:

  • In 1920, a firm listed on the Standard & Poor’s 90 was likely to be there for 65 years. In 2001, a company listed on the Standard & Poor’s 500 (S&P 500) is expected to remain there only 10 years.
  • The average useful life of a consumer product in dynamic sectors such as consumer electronics is as little as six months today. Nokia plans to introduce one new phone model per month for the foreseeable future.
  • The number of jobs a worker can expect to have before age 40 is now 12.

Broader Impacts. Since the 1980s, organizational changes have shifted from tactical changes (installing a new machine, adding a division, opening a new office) to more systemic ones. Systemic changes have far-reaching effects and fundamentally alter people’s roles and skill sets, and they are happening globally. The largest global study to date of organizational changes in European, Japanese, US, and UK companies (Pettigrew, Massini, & Numagami, 2000) found the following incidence of systemic changes in the mid-1990s:

  • Implementation of information systems: 82%
  • Creating horizontal links within the firm (sharing services and information): 74%
  • New human resources (HR) practices geared to flexibility: 65%
  • Outsourcing: 65%
  • Alliances: 65%
  • Decentralizing operational decisions: 62%
  • Taking out or adding layers of management: 50%
  • Adopting project structures: 42%
  • Decentralizing strategic decisions: 41%

In addition, more than 43 million US jobs have been lost to downsizing since 1979 (U.S. Department of Labor). Some 20,000 alliances were formed worldwide in 1996-98, double the number formed in the early 1990s (Harbison & Pekar, 1998). More than 8,600 alliances were formed worldwide in 1999 and another 10,200 in 2000. E-retail is trending up while shopping mall visits are trending down (an average of 3.1 US visits per month in 1998, down from 3.7 in 1989). Business purchasing online is also growing significantly. Worldwide mergers and acquisitions since 1995 are valued at $12 trillion, $6 trillion of which occurred in the US in 61,484 transactions.

All of these changes had ripple effects not only in the organizations directly involved but also in firms doing business with them, as well as their customers. This is systemic change!

Finally, there is strong evidence that the pace of change in day-to-day work is accelerating, as workers routinely change work procedures and processes, innovate and improve products and services, and find better ways to work together (Weldon, 2000).

References
Harbison & Pekar. (1998). Institutionalizing alliance skills: Secrets of repeatable success. strategy+business.

Pettigrew, A., Massini, S., & Numagami, T. (2000). Innovative forms of organizing in Europe and Japan. European Management Journal.

Weldon, E. (2000). The development of product and process improvement in work groups. Group & Organization Management.

Pat McLagan is the CEO of McLagan International, Inc., a 30-year-old change research, training, and consulting company. ISPI carries the RITEstuff reports developed by her organization for purchase in our online bookstore.

 



“My mind is made up, don’t confuse me with the facts!”

Occasionally, we as performance consultants encounter managers who resist data and really do not want to change what they are doing or how they do it. Luckily, more often our clients not only listen to our findings, but also demand to see the data. So when we have the opportunity to provide credible data on topics as controversial as incentives and motivation, we need to fully examine and embrace it!

This is why the recent study sponsored by the International Society for Performance Improvement (ISPI) and funded by the SITE Foundation is so important to every performance consultant. The study, entitled Incentives, Motivation & Workplace Performance: Research and Best Practice, was conducted by this year’s ISPI President’s Citation awardees:

  • Harold Stolovitch
  • Richard Clark
  • Steven Condly

The purpose of the study was to cut through the conflicts and controversies that have existed regarding the use of incentives to improve performance. The central questions guiding the study were:

  • Do incentives increase work performance?
  • What kinds of incentive systems are most effective?
  • What organizational conditions indicate a need for an incentive system?
  • What model best expresses the events that occur during the selection and implementation of successful incentive programs?

These are exactly the types of questions managers ask us when we recommend incentives as part of a performance intervention. Wouldn’t it be great to have answers supported by data from a landmark study? This study gives you that and much more. It also introduces a new diagnostic and prescriptive model, the Performance Improvement by Incentives Model, to guide the design and implementation of incentive plans. The model provides step-by-step procedures that allow decisionmakers to troubleshoot and improve incentive systems.

The need for solid research on the use of incentives has never been more important. With a soft economy and incentive plans being questioned, performance consultants are expected to provide direction and insight in this challenging time. So why wait? Order your copy of the study’s report, available from the ISPI Bookstore for only $35 for ISPI members and $50 for non-members. Call today, 1.301.587.8570.

 




by Judith Hale, ISPI Past-president


Jim Russell, a long-time member of the International Society for Performance Improvement (ISPI), is retiring from Purdue University on June 30, 2002. Jim served on ISPI’s Board of Directors from 1978-1980 and has been a frequent contributor to the Performance Improvement journal. He is widely known for the textbook Instructional Media and Technologies for Learning (Heinich, Molenda, Russell, and Smaldino), published by Merrill-Prentice Hall and now in its 7th edition. Jim has been a professor of Curriculum and Instruction at Purdue University for 32 years, teaching courses on Media Utilization, Instructional Design, Instructional Delivery Systems, and Principles of Adult Education. He received his department’s Outstanding Teacher Award in 1993 and the Purdue School of Education’s Best Teacher Award in 1996.

Jim consults with faculty and graduate teaching assistants on instructional improvement. He conducts workshops for middle and secondary school teachers on ways to use media to improve students’ achievement and attitudes in math and science. For the past six years, Jim has been a Visiting Professor of Instructional Systems at Florida State University during the spring semesters. He also works with the Learning Systems Institute at Florida State, which co-publishes Performance Improvement Quarterly with ISPI.

On a more personal note, it was Jim who invited me to come to Purdue for my doctoral studies. The Purdue program is built around competencies, one of the factors that convinced me to give Purdue a try. I found him to be fair and willing to share his depth of understanding of how to help people learn. Please join me in wishing Jim and his wife Nancy a wonderful next chapter of their life. If you have a story to tell or want to give a tribute in his honor to the School of Education, you can send it to Jennifer Matson, his daughter, at 1539 N. Park Ridge Way, Indianapolis, IN 46229 or at lancer@iquest.net.

  


Robert Mills Gagne, who was a leader in the fields of educational psychology and instructional design, died Sunday, April 28, 2002, in Signal Mountain, TN.

Gagne was born in 1916 in North Andover, MA. After receiving his AB from Yale University in 1937, he went to Brown University to earn a PhD in experimental psychology in 1940. He spent much of his 50-year career in academic positions at Connecticut College for Women (1940); Princeton University (1958 to 1962); University of California at Berkeley (1966 to 1969); and Florida State University (1969 to 1985). From 1962 to 1966, he was Director of Research at the American Institutes for Research in Pittsburgh, PA.

Gagne spent a good portion of his career working on military training problems. During World War II, he served as an Aviation Psychologist, developing tests for classification of aircrew. From 1950 to 1958, Gagne was Technical Director for Lackland and Lowry Air Force Laboratories, where he conducted numerous studies of human learning and performance. At the end of his career (1990-91), he worked on instructional design models for military training at Armstrong Air Force Base in San Antonio, TX.

During his career, Gagne never wavered from his idealistic vision of psychology. He believed the science of psychology should be used to relieve the burdens of human life. His research and writing focused on how principles of human learning, established through scientific research, could be applied in education and training. He wrote five editions of the seminal book The Conditions of Learning, as well as numerous other books on principles of learning and instructional design.

In recognition of his contributions, Gagne received many honors including the Phi Delta Kappa Award for Distinguished Educational Research from the American Educational Research Association, the Distinguished Scientific Award for Applications of Psychology from the American Psychological Association, and the Distinguished Professional Achievement Award from the International Society for Performance Improvement in 1993.

Note: This article first appeared in the Tallahassee Democrat (FL) on Wednesday, May 1, 2002. It has been modified slightly for fit and style.

 

  


The International Society for Performance Improvement (ISPI) is pleased to announce the schedule for the 2002 Research Grant Program. Proposals are due June 3, 2002, and awards will be announced September 3, 2002. ISPI is interested in awarding grants for research related to performance technology. Such research may include, but is not limited to, investigations that contribute to the understanding, discovery, application, and/or validation of performance technology principles, theoretical underpinnings, and/or practices. ISPI anticipates multiple awards, ranging from $2,000 to $9,000. Further information about the Research Grant Program and the format for submitting a research proposal is available at www.ispi.org.

 
 



ISPI is looking for Human Performance Technology (HPT) articles (approximately 500 words and not previously published) for PerformanceXpress that bridge the gap from research to practice (no product or service promotion, please). Below are a few examples of the article formats that can be used:

  • Short “I wish I had thought of that” Articles
  • Practical Application Articles
  • The Application of HPT
  • Success Stories

In addition to the article, please include a short bio (2-3 lines) and a contact email address. All submissions should be sent to april@ispi.org. Each article will be reviewed by one of ISPI’s on-staff HPT experts, and the author will be contacted if it is accepted for publication. If you have any further questions, please contact april@ispi.org.

 

 



Feel free to forward ISPI’s PerformanceXpress newsletter to your colleagues or anyone you think may benefit from the information. If you are reading someone else’s PerformanceXpress, send your complete contact information to april@ispi.org, and you will be added to the PerformanceXpress emailing list.

PerformanceXpress (formerly News & Notes and Quick Read) is an ISPI member benefit designed to build community, stimulate discussion, and keep you informed of the Society’s activities and events. This newsletter is published monthly and will be emailed to you at the beginning of each month.

If you have any questions or comments, please contact April Davis, ISPI’s Senior Director of Publications, at april@ispi.org.

 

ISPI
1400 Spring Street, Suite 260
Silver Spring, MD 20910 USA
Phone: 1.301.587.8570
Fax: 1.301.587.8573
info@ispi.org

http://www.ispi.org