The traditional measurement approach, the Kirkpatrick Model, provides a balanced picture of the learning experience by measuring satisfaction (Level I), learning (Level II), behavior change (Level III), and results (Level IV). In addition to these four levels, others, such as Dr. Jack Phillips, promote a fifth level of training measurement—return on investment (ROI).

Whether your approach to measurement has four or five levels, the first challenge is that the construct only tells you what has happened. It has little predictive value unless you can replicate the circumstances that existed in the past. The second challenge is that, no matter how hard you try, the numbers are really qualitative data. Why? Because a subjective judgment must be made about the impact training had relative to all the other variables that affect performance. So, no matter how hard you try to make it quantitative, the numbers are at best a qualitative directional indicator of the success of the training.
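The fifth-level calculation itself is simple arithmetic: net program benefits divided by program costs, expressed as a percentage. A minimal sketch of that calculation follows; the function name and all dollar figures are hypothetical, and the inputs are themselves the kind of subjective estimates described above.

```python
def training_roi(benefits, costs):
    """Phillips-style ROI: net program benefits as a percentage of costs.

    Both inputs rest on the qualitative judgment discussed above (how much
    of the result to attribute to training), so treat the output as a
    directional indicator, not a precise measurement.
    """
    return (benefits - costs) / costs * 100

# Hypothetical figures: $150,000 in estimated benefits, $100,000 in costs
print(f"ROI: {training_roi(150_000, 100_000):.0f}%")  # ROI: 50%
```

The formula's precision is exactly why it can mislead: a crisp "50%" conceals the attribution judgment baked into the benefits estimate.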

Building a Business Case
Business people don’t make decisions about future investment based solely on an analysis of past performance. They develop projections based on incomplete information about current market conditions and opportunities, then make the best judgment they can as to what the future will be like. Invariably the business case is full of assumptions that, if reasonable and justifiable, will be accepted by senior management. Training professionals need to take the same approach when selling training to senior management.

Building a business case for training has six steps (DEFINE):

  1. Determine Required Results
  2. Evaluate Drivers of Performance
  3. Find Link between People and Performance
  4. Identify Costs and Benefits
  5. Negotiate Assumptions
  6. Examine Performance

A description of each step is provided below:

Determine Required Results. What bottom-line measures of performance is senior management most concerned about? We know that profitability will be the primary concern, but there are many actions that can be taken to change the profitability of an organization. Some potential approaches to improving profits are increasing volume, raising prices, gaining market share, decreasing turnover, and reducing supply costs.

Evaluate Drivers of Performance. As much focus as there is on results, businesses don’t actually manage them. They are simply the “scorecard” for the way the business manages a series of processes and activities. Training must understand the processes it can influence and which of them will have a positive impact on external customers and the required downstream results.

Find Link between People and Performance. Customer expectations identified above are met through the service delivered by your people through your business processes. Training needs to understand the people and process issues that, if improved, would increase the organization’s ability to meet customer expectations.

Identify Costs and Benefits. If training is required for success of the initiative, the next question is, will the benefits of the training outweigh the costs? Typical training costs may include materials development (research, design, license fees), training facilitation (instructor fees), participant costs (travel, salary during training time), and administration costs (printing of materials, location costs). It is important to consider participants’ salary costs in the training cost calculation because senior management will be thinking about the cost (opportunity lost) associated with taking staff off the job.
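The cost side of that comparison is a straightforward tally of the categories just listed, plus the opportunity cost of participants' time. A minimal sketch is below; all line items and figures are illustrative assumptions, not numbers from the article.

```python
def total_training_cost(development, facilitation, travel,
                        administration, participants, hours,
                        hourly_salary):
    """Sum direct training costs plus participants' salary cost.

    The salary term is the opportunity cost senior management will have
    in mind: the cost of taking staff off the job during training.
    """
    direct_costs = development + facilitation + travel + administration
    opportunity_cost = participants * hours * hourly_salary
    return direct_costs + opportunity_cost

# Hypothetical program: 20 participants, 16 hours of training, $40/hour salary
cost = total_training_cost(development=25_000, facilitation=10_000,
                           travel=8_000, administration=2_000,
                           participants=20, hours=16, hourly_salary=40)
print(f"Total cost: ${cost:,.0f}")  # Total cost: $57,800
```

Note how the opportunity cost ($12,800 here) is a material share of the total; leaving it out is the quickest way to lose credibility with a financially minded decision maker.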

Negotiate Assumptions. The hypothetical business case above is based on a number of assumptions that cannot be proven true or false until the training is complete. The key is to make reasonable assumptions, document them, and discuss them with the decision maker. Do the assumptions make sense? Is the return worth the investment of time? The benefits of having this conversation with numbers are twofold:

  • You will improve the alignment of the training implementation to the metrics that are important to management.
  • Talking to management about the role of training and potential benefits and costs allows management to “buy” the training and buy in to the expected results. After all, nobody likes being sold.

Examine Performance. Once the training is complete, it is helpful to examine how business results changed as a result of the implementation. How did training perform relative to the assumptions made prior to the implementation? The Level I-IV (or V) framework can be helpful at this step to gain ideas as to how training can improve both the quality and efficiency of the offering.

Final Thoughts
The four (or five) levels of measurement are useful efficiency measures for training. They enable you to figure out what went well, and where you need to make improvements. They are also helpful when justifying the annual budget or getting commitment to the next wave of training after a successful pilot. However, they are not the business metrics that senior management focuses on.

Training professionals need to talk about training in business terms in order to be invited to the planning table. You need to focus your conversation on the impact training can have on the measures that management uses to run the business. Even if it is one or two degrees of separation from the bottom line, it will allow you to have a business conversation with senior managers that will help align the training implementation to the things that matter.

Greg Robinson is CEO and President of Carpe Diem Consultants, Inc. Founded in 1993, CDCI specializes in helping organizations improve customer and employee loyalty through web-based measurement systems, consulting, process enhancement, and training. Greg has worked with organizations in North America, Europe, and Asia to help them improve results by linking training to business metrics. He may be reached at grobinson@carpediemconsultants.com.

 


Best known to most of us as the senior editor of Training Magazine, Ron Zemke regularly seeks out information on new and emerging trends in performance improvement. As president of Performance Research Associates, a consulting firm specializing in customer relations and customer service, Ron sees firsthand how business and world events shape the way we work. The author or co-author of 40 business books, he also writes a bi-monthly column on customer service for the American City Business Journal newspapers. Ron may be reached at zemke@aol.com. He looked into the future with us and shared his predictions for the next two to three years.

Top Three Predictions
First, Ron believes that we will continue to struggle with our HPT identity. This is really a fight for HPT visibility. As practitioners, we have a history of getting results with proven tools. However, as each new “cure du jour” arrives, be it Total Quality Management, Zero Defects, LEAN, or another hot ticket, we learn its language, and then make our HPT contributions under its marquee, just “below the line of visibility” and never as the main act.

Second, we will learn to be more inclusive of practitioners in related fields. The proprietary nature of professional jargon has long separated HPTers from our colleagues in organization development, human factors, the Six Sigma arena, and other performance-focused specialties. The Tower of Babble we inhabit has caused these disciplines to dismiss each other because our language differences separate us. This will all change as we learn to speak the language of our clients and find we perform many of the same tasks in every discipline—we just take different paths to get the results. Results should be our primary language.

Third, since ISPI’s 1999 Think Tank, where HPT luminaries discussed the value of proven methods in successful performance improvement interventions, a number of our most visible colleagues have been writing and speaking on the subject. After pondering their ideas, Ron predicts we will rediscover the importance of proven methods in changing behavior. Distracted by computers and technology, we have let a number of the core methods from our past languish backstage. We will re-focus on results and bring back what works, not just what is easily adaptable to computer technology.

Why These Predictions
The challenge Ron puts before us with these predictions is to respond with healthy skepticism to the advent of the latest silver bullet that will fix all of our performance-based ills. It doesn’t exist. We would be better off looking instead at how we can artfully meld the old and the new to help our clients get results. From such work, a more accessible and memorable language to describe HPT might evolve. Rather than shrouding our good work behind the curtain of the latest-and-greatest, this new way of describing what we do could put us in the spotlight, for a change.

And, once we are on this path, we might stop along the way to find out what our colleagues in related disciplines are doing, and how we could partner with them for our clients’ benefit. Ron believes we will start learning to be more inclusive the minute we shut off our internal critic. His writing responsibilities have shown him the value of sharing a new or unfamiliar discipline by becoming an advocate for it as he explains it to his readers. This same thinking will serve HPTers well as we search for what we have in common with, for example, those human factors folks.

How Organizations Will Be Different
Actually, Ron speculates that organizations may not be at all different because of these predictions. Their staffs may be less Balkanized, forming, instead, an inclusive set of cross-disciplines that share tools and approaches among each other and clients. As the silos come down and we look for what we have in common—rather than our exclusive differences—we can help our organizations improve results with efficient, effective ways of working across our areas of specialization.

Implications for the Work of Performance Research Associates
In his own practice, Ron finds that he must rapidly deconstruct new approaches to performance improvement and pull proven techniques from HPT’s rich history of successes to ensure results. Executive coaching, from an HPT point of view, for example, can draw on experiences with precision teaching, behavior modification, goal setting, and other old chestnuts from the back of the HPT trunk. We need not acquiesce to the psychoanalytic view of coaching to accept the need for it or its ends.

HPTers can benefit from partnering with colleagues in related disciplines to find commonalities and use them to help our clients. After all, as recent TrendSpotters Dana Robinson and Paul Harmon have shown us, the specific job of the performance improvement professional is to align related disciplines to help clients get results.

If you have any predictions about the future of HPT that you feel would be of interest to the PerformanceXpress readership, please contact Carol Haig, CPT, at carolhaig@earthlink.net or Roger Addison, CPT, at roger@ispi.org.

 


  
  




If the word performance in performance support means business performance through human performance, then economic conditions are significant drivers of how we develop, implement, and maintain performance-centered systems. The current economic climate is one of classical business sense, where profitability and low expenses are the norm. Business performance through human performance means rapidly constructing, deploying, and maintaining the right processes and the right content to enable the largest competitive advantage through technology and human capital at the smallest possible expense. Indeed, it is the best time to revisit Electronic Performance Support Systems (EPSS Revisited, ISPI, 2003).

Because of emerging attention to business intelligence, performance support tools have emerged that address the new economic paradigm by focusing on business process analysis and benchmarking, rapid capture of business processes, rapid generation of business knowledge, rapid deployment in performance-centered environments, and rapid maintenance and upgrading. The performance support model that has emerged can be characterized as getting the right process right—quickly and continuously.

To further emphasize the trend, consider the new “normal” that forms the backdrop of business strategy and tactics (e.g., SARS, Enron, 9/11, and the Health Insurance Portability and Accountability Act of 1996). Normal now means embracing unplanned events that affect business and managing them with more than just tacit knowledge and too-little, too-late responses. What has emerged for EPSS is a focus on rapidly deployable and easily maintainable solutions rather than on software features and functions. Here are three fundamental kinds of solutions that address the performance needs of organizations in today’s economy and meet the rapid-deployment, easy-maintenance mandate:

 

1. External/extrinsic capture-format-integrate-deploy solutions

  • Design: Rapidly capture processes and job-specific operational knowledge in a meaningful form
  • Integrate: Leverage existing electronic assets
  • Deploy: Share operational knowledge via the Intranet/Internet

Examples include ProCarta and Talsico.


2. External/extrinsic portal-centric solutions that facilitate rapid assembly and deployment of existing knowledge and learning, including collaboration assets for teams

  • Based on the simple premise that smart, connected people perform at higher levels
  • Focuses on enhancing group performance via the WWW
  • Integrates learning, sharing, and collaboration to drive performance
  • Empowers group leaders and members to contribute
  • Emphasizes simplicity, speed, and ease of use

Such solutions focus less on workflow capture and more on process implementation by supporting collaboration, knowledge sharing, knowledge searching, and learning. The business imperatives and realities that they embrace as solutions include globalization of brands and sales teams, new product innovations, short product cycles, increased competition, continuously changing customer and consumer behavior, new distribution plans, information velocity (i.e., communicating the rapidly changing nature of products and competition), and the general diffusion of innovation.

Examples include Auxilium, Hummingbird EIP, and Plumtree Corporate Portal.


3. Intrinsic/extrinsic model-driven solutions based on adaptive workflows
  • Automatically capture existing business processes and model new business processes
  • Analyze, improve, and integrate business processes
  • Automatically generate knowledge assets from capture and process resources
  • Integrate knowledge assets into existing workflows and systems; integrate processes across disparate systems
  • Rapidly deploy and maintain knowledge assets
  • Measure and refine performance and solutions

These solutions provide a means to rapidly secure corporate process assets by automatically capturing processes and cataloging them so they can be checked into and out of a process repository. Captured processes can be employed for further asset creation, refinement, or improvement across an enterprise. Such systems reduce time to competency and skill requirements, eliminating some of the need for experts, consultants, and manual, ad hoc processes. These systems intrinsically fuse knowledge into, and simplify the interfaces of, existing applications in a fashion that reduces the expense and effort needed to make changes. Business users can directly improve processes, auto-create knowledge objects, and fuse knowledge into processes. Generally, these solutions focus on continuously reducing process complexity and improving process performance through human performance. They give business users a unified process view while embedding knowledge (business policies, procedures, and business rules) into work tasks.

Examples include Epiplex and Knoa.


In summary, if the word performance in performance support means business (or organizational) performance through human performance, then the success of EPSS is influenced as much by the business climate as by human factors and the means to deliver knowledge. The best solutions, tools, and techniques today are those that give organizations the means to quickly and efficiently meet business goals and the needs of the real people who have to get the work done, and to do so in the climate of the “new normal.”

Gary J. Dickelman is the President and CEO of EPSScentral LLC, where he promotes all things performance centered, including the unique and powerful performance support tools that he developed with Epiance, Inc. He specializes in applying knowledge management, human factors engineering, learning technologies, information technology, and business process engineering to creating systems that humans can actually use. Gary is a member of ISPI, ASTD, and the Association for Computing Machinery and serves on the faculties of George Mason University’s Graduate School of Education and Boise State University’s Department of Instructional & Performance Technology in the College of Engineering. He may be reached at gdickelman@epsscentral.com.

 


With the overwhelmingly positive reaction to ISPI’s Annual International Performance Improvement Conference & Exposition, it’s no surprise that half of our 2003 attendees in Boston were repeat participants. And, one-quarter had attended five or more ISPI conferences!

In 2004, we would like to introduce, and reintroduce, more of your colleagues to our 42nd Annual International Performance Improvement Conference & Exposition, April 18-23 in Tampa, Florida. This is why we are offering the Bring-a-Colleague rate once again. When you register for the full conference at the member or delegate rate, you may also register a colleague for only $350—provided your colleague has not attended an ISPI Annual Conference in the past three years (2001-2003).

When you register, think of a colleague at your organization, a client organization, your ISPI or ASTD chapter, or an acquaintance in the field who has not experienced a recent ISPI conference. Offer that person an opportunity to save hundreds of dollars while benefiting from the premier educational event in workplace performance improvement. Your thoughtfulness will build trust, partnership, and appreciation.

If you have not attended an ISPI Annual Conference in the past three years, you will want to register with a colleague. Find someone you know who plans to attend, register together, and one of you will register for only $350. The deadline is February 13, 2004. Click here to register today!

 


  




How can it be November already? Wasn’t it just 2001? How can we be so close to 2004? The months and years go by whether we notice them or not. It’s pretty easy to get caught up in the things we do every day and then wonder where the time went.

A few years ago, Bob Mager presented a job aid during the 99-Seconds session at the ISPI Annual Conference that had money management, retirement, and leaving an inheritance as its content. It was a great technical example of a good job aid. It was also a good list of things to do—save money, write a will, and protect assets for your heirs. I keep a copy with my list of things to do. I am working on it! Before I know it, the next 40 or 50 years will have flown by, and it may be important that I have taken care of these things by then.

In the midst of doing our best to learn how to help people and organizations perform at their best, we must also take care to leave a legacy that matches our intentions. We can leave our physical property to our heirs, but what do we do with our less tangible property?

  • Do you have an idea that might have some merit?
  • Did you do something that worked?
  • Have you created a process, model, or job aid that could help others?

If the answer to any of these questions is “yes,” do you think it’s important to develop and share these in some form that will help others to use them? If you say “yes” again, then when do you want to do this important sharing?

ISPI can help you to make an intellectual contribution to others. In an organization where networking and keeping current are the two main motivators for joining and for staying involved, most of us are looking for new and old theories and practices that we can put to practical, measurable use. And, ISPI provides many forums in which to share them.

So, to borrow from Bob Mager’s job aid—if you don’t have an idea or practice to share, work harder! If you’re not ready to share now, think about when you will be. Make a plan to contribute an idea, model, or process that works to your colleagues in ISPI. Time goes by before you know it. So, if you have an idea to share, make sure to let us know. We’d love to hear from you.

 


  


An individual who receives the Certified Performance Technologist designation is certified for a period of three years, after which he/she will need to apply for Re-certification. As part of the Re-certification process, you will have to identify what you have done to continue your professional development and/or list the contributions you have made to the field of performance technology during the three-year certification period. You may want to get a jumpstart by downloading Part Two of the Re-certification Application, and during the next three years, record the training, education, and conferences that you attend, as well as any contributions that you make to the field of performance technology. For a complete description of the Re-certification program and to download the forms, click here.

In addition, logos are now available for CPTs that can be used on business cards, stationery, e-mail messages, and your website. Click here to find the logos and usage guidelines. Should you be a provider of training programs, workshops, or professional conferences with subjects that relate to one or more of the 10 Standards of Performance Technology, you will also find logos that can be used to identify your programs as sources for CPT Re-certification points.

 


  


The Exhibit Hall at ISPI’s 42nd Annual International Performance Improvement Conference & Exposition, April 20-22 in Tampa, Florida, is where you will find the Conference Registration desk, the ISPI Bookstore, the ISPI Awards of Excellence Display, GOT RESULTS?, the ISPI Job Fair, and exhibitors from many companies.

Networking events such as the Opening Reception and Coffee Breaks after both General Sessions will take place in the Exhibit Hall. And, you’ll want to stop by the ISPI Bookstore and participate in special author book signings scheduled throughout the conference.

While networking in the Exhibit Hall, don’t forget to meet with the leading suppliers of workplace performance improvement products, programs, and services. ISPI exhibitors are people you will want to get to know. Visit with them, pick their brains, sample their offerings, meet an important new partner, or resolve a costly problem. Exhibit booths are available in increments of 10’ x 10’ and come with two booth personnel passes. To become an exhibitor, click here to download the Exhibitor’s Prospectus, or contact Keith Pew, ISPI’s Director of Sales and Marketing at keithp@ispi.org.

 


  




Through our “I-Spy” column, we hope to offer some useful leisure-time reading for our readership through relevant, interesting, and useful websites for performance technologists. Each month, we take readers to off-the-beaten-path sites that help them find similar thinkers, resources, work, new ideas, and sometimes just plain old fun.

Quick recap: Every month, three sites, one theme. While far from comprehensive, hopefully these sites will spark readers to look further and expand views about human performance technology (HPT). Please keep in mind that any listing is for informational purposes only and does not indicate an endorsement either by the International Society for Performance Improvement or me.

These are the general categories I use for the sites featured:

  1. E-Klatch: Links to professional associations, research, and resources that can help refine and expand our views of HPT through connections with other professionals and current trends
  2. HPT@work: Links to job listings, career development, volunteer opportunities, and other resources for applying your individual skills

The theme for this month’s column is Big Bucks. For some of us, it’s time to prepare for annual reviews and possible increases in income. As we examine opportunities for our professional and organizational performance improvement, can we also seek more financial security and freedom? These websites may trigger some ideas for measuring your success with “big bucks,” whether your metrics include performance or income.

E-Klatch
Want big healthy bucks? If you are curious about opportunities to link financial management and healthcare performance, pay a visit to the Healthcare Financial Management Association (HFMA). The HFMA is the nation’s leading personal membership organization for healthcare financial management professionals. An extensive job bank and publications are part of this organization’s site. A sample publication of note to ISPIers is a brief article titled Performance Metrics are Key to Tight Revenue Cycle Management.

HPT@work
Want big salary bucks? Helpful reader Roger Chevalier, PhD, CPT, alerted us to some research on industry salaries through Training Magazine. Click on the “reports & analysis” button to access some sample pages of different reports, including the 21st Annual Salary Survey. Complete articles and additional information are available to subscribers of this site. For additional resources on salary guides, I-Spy suggests another visit to the Riley Guide (from the May 2003 column), this time to their links under Salary Guides & Guidance.

Until next time, best of luck with your “big bucks.” See you in the December issue of PerformanceXpress!

When he is not Internet trawling for ISPI, Todd Packer can be found improving business, non-profit, and individual performance through research, training, and innovation coaching as Principal Consultant of Todd Packer and Associates based in Cleveland, Ohio. He may be reached at tp@toddpacker.com.

 


  



Much of ISPI’s focus has been to evangelize Human Performance Technology (HPT) to its members, instructional designers, trainers, and others working in the arena of human resource development and organization development. Part of that evangelizing has been to create models that explain HPT to senior managers, not how to do it. What many of us have failed to do is to transfer either our commitment to HPT or the ability to engage in it to first line managers and supervisors. A lot of the training that has been done, even for practitioners, has focused on the process—assess, analyze, design, develop, implement, and evaluate—not on how to focus on results, collaborate, or work through real constraints and conflicting pressures. Unfortunately, recent survey data show that less money is being spent to train first line managers than in the past, and my suspicion is that what little is being spent is not about HPT but about complying with HR policies such as diversity, sexual harassment, and the like.

Senior management expects first line managers to know how to effectively direct the efforts of others while also modeling behaviors that are inclusive and facilitate the involvement of subordinates and co-workers. First line managers are even expected to coach and give people feedback—probably the most onerous activity ever expected of anyone. Yet first line managers rarely get the level of direction they need to be effective, much less feedback or coaching in how to motivate or guide others. Organizations usually lack the performance management systems that effectively specify or align the expected work deliverables, the intended results, and performance measures. They even fail to understand the importance of consistent direction, efficient information and communication systems, and adequate resources.

The irony is that many of the tools for directing, coaching, facilitating, evaluating, giving feedback, and motivating already exist, and where they are absent, we have the ability to create them. Project plans and meeting management guidelines are just two well-worn tools that can be redeployed to help managers give more timely and constructive feedback. They can even be used to help clarify expectations. Newer tools like governance structures can facilitate long-term change in behaviors. Competency statements, when we add sufficient detail to describe appropriate, inappropriate, and exemplary behaviors in real work settings, are excellent tools for helping managers become more astute and critical observers of human activity. Work protocols, an old concept applied in a new way, are especially helpful in providing guidance on how to work across functions and job classifications. They can be used to model and encourage diplomacy and collaboration.

The time has come to dust off some old or create new management tools that help first line managers focus on results, collaborate within and across their functions, question the value of what they do, and recognize how to deal with adverse work conditions to improve human performance. We risk keeping HPT within the purview of a few if we continue to exclude the people who are most in need of the tools to think and perform differently.

Judith Hale, PhD, CPT is a long-time member of ISPI and ASTD. She is the author of Performance-Based Management: What Every Manager Should Do to Get Results. The book comes with more than 24 tools to help first line managers and supervisors apply the principles of HPT. Judith has been a consultant to management for over 27 years. She specializes in measurement, evaluation, certification, and performance improvement. Judith was awarded a BA from Ohio State University, a MA from Miami University, and a PhD from Purdue University. Judith may be reached at Haleassoci@aol.com or www.Haleassociates.com.

 


  


For the first time, the International Society for Performance Improvement (ISPI) will hold its annual Board election electronically, and members will vote online for candidates to the 2004-2006 Board of Directors. There are four open positions: one for President-elect and three for Directors. The winners will join the President, two continuing Board members, and the non-voting Executive Director who make up the eight-member Board. Look for more information on the candidate slate in the December issue of PerformanceXpress.

Since your link to the “voting booth” will be sent via email, it is very important that ISPI has your most current email address on file. To review your record, visit www.ispi.org and click on My ISPI to login. Or, you may call us at 301.587.8570. Every vote counts.

 


  



A Popular Research Fraud
A website, http://www.work-learning.com/chigraph.htm, that has been making the email rounds lately exposes an example of “bogus research” that has probably influenced thousands if not millions of training and educational design decisions—possibly for as long as 60 years! The messages that most of our colleagues have been taking from this appalling case of “lying with fake data” are that we should check our sources and be more careful in evaluating research. We should be sure we know what we are talking about when we use other people’s purported data and conclusions to make decisions about our own instructional or performance improvement programs and designs.

These points are certainly well taken, and the example is truly outrageous. But on examining my own lack of reaction to the exposé, I realized that one of the principles of measurement that I try to promote makes this and similar claims or counter-claims about research results at least partly irrelevant. Like most of my colleagues, I’ve been aware for much of my career of the often-cited “finding” that people remember more based on what they do than what they merely hear or read, and so on. I have always thought it was an interesting point, and aligned with much of my own experience. But, I would never have made and then left unexamined an instructional design decision based only on that general rule. I think that is where the perspective advanced in this column might differ from that of many of our colleagues. 

Upon Whose Data Should We Rely for Decisions?
As I have mentioned at least once in this column (in the article Why Do We Measure? And How?) and elsewhere (in Measurement: A Few Important Ideas), there are at least three reasons for measurement in training or performance improvement: validation, accountability, and decision-making. When we measure to validate a procedure or conclusion, we derive a general rule that suggests a direction or principle we might apply to a particular training or performance improvement problem. When we measure for accountability, we collect data to manage risk and hold ourselves or others accountable. But only when we measure to make decisions can we say for certain what is actually working now and what is not. The whole idea of measuring performance—at least if we take a natural science, engineering, or accounting point of view—is to be able to decide, in real time, what to do. Should we continue what we are doing with the group or individual, or do we need to make a change?

Research Provides Direction Not Certainty
If we use research findings—whether bogus or legitimate—to help guide our design and development of programs, then we probably have a good chance of creating effective programs. However, if we assume that these programs will then be optimally effective in our situation, with our population, and in our subject matter without actually measuring their effects on performance frequently and objectively, we deserve what we get. The theory said that the Titanic could never sink. But the actual, real-time data said otherwise!

Measure Your Own Programs—To Be Sure!
So, let me encourage you to rethink your approach to and confidence in data. Even the most thoroughly validated principles and theories, demonstrated in environments and with populations similar to yours, do not guarantee that your program based on those principles or theories will be optimally effective. You have to measure as you go, to confirm or disconfirm your own hunches based on long experience or reported results about relevant design issues or parameters. Otherwise, you risk continuing to implement programs that are ineffective for your population, or at least for some number of individuals for whose performance success you are, at least in part, responsible.

References
Binder, C. (2001). Measurement: A few important ideas. Performance Improvement, 40(3), 20-28.

Binder, C. (2003, July). Why do we measure? And how? PerformanceXpress.

Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance, and behavior to deliver measurable results. He may be reached at CarlBinder@aol.com. For additional articles, visit http://www.binder-riha.com/publications.htm.

This year’s ISPI European Conference in Paris-Chantilly was as great for what happened as for what didn’t happen. Francois Lamotte d’Incamps immediately displayed his appreciation for controlled anarchy by opening the conference with an invitation to the audience to participate and challenge the keynote speaker. As change agents, we are sometimes called on to impinge upon hostile and resistant environments. Hence, it was a rare treat for me to be able to pursue a line of questioning by asking not three but four questions in a row (and here goes the “for what didn’t happen”) without anyone giving me dirty looks or telling me to “be quiet and stick with the program!” Such sportsmanship effectively established a climate of lively interactions in true French style for the remainder of the conference.

In regard to the conference’s theme—Sustaining Performance—it struck me that although the word “ethics” was never mentioned, everything said about the best practices that contribute to sustainable performance articulated the definition of ethical behavior in organizations and society at large. Even more impressive was the fact that every individual conversation I had with HPT professionals from many different parts of the world resonated with ethical practices. To me, the fact that some of these people began their HPT activities from scratch some 20 years ago is a testament that ethical behavior and sustainable performance can go hand in hand. Such commitment was further demonstrated by the fact that a number of past ISPI Board Members actively participated in the conference by displaying original ways of integrating existing HP models and/or readily applicable extensions thereof. Furthermore, the spontaneous coaching I received from colleagues from around the world was most inspiring.

I feel that the sincere support and collaboration I witnessed toward the mission of making HPT come to life at every level provided a glimpse of what the future of organizations may hold. Without being a bleeding heart humanist and while keeping an eye on the fundamental bottom line, I do think that the sum of such individual actions around a mission of sustainable performance may be the beginning of an effective effort to address Armand Braun’s urging that:

“le vrai défi du développement durable réside dans notre capacité à réaliser un apprentissage collectif de la gestion de l’hyper complexité”

Translated, this means: The true challenge of sustainable development lies in our capacity to collectively learn to manage hyper-complexity.

In the June 2003 issue of PerformanceXpress, the Conference Chair, Monique Mueller, and the Program Manager, Christian Voelkl, suggested that HPT is a balance between the scientific approach and an artist’s approach. I think the ISPI 2002-2003 European Board met the goal of practicing what they preach, having implemented each of these elements in their program.

With Andreas Kuehn, the new President of ISPI Europe, and a dynamic Board of six people from four European countries, I expect the next ISPI Europe Conference—UK 2004—will be just as extraordinary!

Why should we be interested in measuring? Leading organizations know that the difference between success and failure is often a strong performance measurement system. And, how you measure can be as important as what you measure.

When successfully implemented, measures will:

  • link strategy and tactics,
  • help assess performance against a baseline,
  • provide feedback that guides change,
  • supply a basis for a business case, and
  • focus the enterprise on what is important (desired behaviors and outcomes).

Measures and metrics allow organizations to understand operational performance, which can be tracked over time, relative to external benchmarks (e.g., industry average or top performers) as well as internal ones. By designing a tool that focuses on key performance drivers, an organization can better manage internal processes as well as identify external practices that can be adapted within the improving organization. By looking beyond the numbers at the qualitative drivers, management can also reveal the factors that most influence favorable performance.
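To make the idea of tracking a measure over time against external benchmarks concrete, here is a minimal Python sketch. All metric names and figures below are hypothetical, invented for illustration only; they are not drawn from PowerMARQ or any real benchmark database.

```python
# Internal measurements for one key performance driver, tracked by quarter.
cycle_time_days = {"Q1": 14.0, "Q2": 12.5, "Q3": 11.0}

# External reference points: industry average and top performers.
benchmarks = {"industry_average": 10.0, "top_performer": 7.0}

def gap_report(history, benchmarks):
    """Compare the latest internal measure against each external benchmark.

    Returns the latest period and, per benchmark, the gap still to close
    (positive means performance is worse than the benchmark).
    """
    latest_period, latest_value = sorted(history.items())[-1]
    gaps = {name: round(latest_value - target, 2)
            for name, target in benchmarks.items()}
    return latest_period, gaps

period, gaps = gap_report(cycle_time_days, benchmarks)
print(period, gaps)  # Q3 {'industry_average': 1.0, 'top_performer': 4.0}
```

The point of the sketch is the comparison itself: the same internal history can be read against internal trend (Q1 to Q3 improvement) and against external targets, which is how a gap analysis turns raw numbers into a decision about where to focus improvement effort.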

One easy way organizations can access measures and metrics for process improvement is through the American Productivity and Quality Center’s (APQC) performance measurement database, PowerMARQ. This database provides more than 200 commonly used measures and individual benchmarks to assess the effectiveness of core operational functions. Using PowerMARQ, organizations can benchmark their performance against best-in-class organizations, establish performance targets, set budgets, identify key performance drivers, and assess improvement efforts. The tool allows organizations to avoid the time-consuming process of collecting measures and metrics from various sources, and instead gives managers key information for jumpstarting improvement efforts.

Successful Implementation
Measures can and do drive performance excellence—but success depends on how carefully organizations define their framework and metrics, and how they implement and refine actionable metrics. The biggest challenges organizations typically face are those relating to definitions, data collection, and accountability. Common obstacles include:

  • correctly identifying relevant key performance measures that tie to strategic objectives,
  • lacking common definitions,
  • collecting accurate performance data,
  • accepting final measures as tools in the effective management of departments, and
  • linking key performance measurements to compensation.

Successful implementation also depends on how well an organization manages the change process. Resistance to performance measurement (i.e., “You can’t measure what I do; It’s not fair because I don’t have total control over the outcome or the impact; We don’t have the data; We don’t have the time.”) must be anticipated and managed. As with any organizational change, securing buy-in and trust for successful implementation of actionable metrics begins with thoughtful planning to determine tactics and strategies that foster and solidify stakeholder support and identify specific actions needed to elevate confidence. Credibility can also be gained and increased through continuous assessment: ensuring the measures demonstrate results; are limited to the vital few; respond to multiple priorities; link to responsible programs; are used in decision making; are not too costly; and, are sufficiently complete, accurate, and consistent.

Taking Action
Measures are a critical component to driving performance improvement. Not only do measures give an organization a baseline of current performance, but they also allow management to track improvement and stay focused on strategically aligned activities. Organizations can expedite the implementation phase by avoiding common pitfalls and taking advantage of powerful tools that are currently available. With so much attention on the bottom line, organizations can no longer question whether or not to use measures.

ISPI and APQC
ISPI is excited to announce a new partnership with APQC that will give you access to the PowerMARQ program, which can assist you in identifying important gap analysis metrics as well as provide you with industry averages for those metrics. This free offer expires January 31, 2004. ISPI members who submit surveys will gain unlimited access to the PowerMARQ database for one year, allowing ongoing benchmarking. To access APQC’s PowerMARQ database, visit www.apqc.org/powermarq, select option two on the login page, and proceed through the simple registration process. Once you have registered, you will be able to complete any of the PowerMARQ surveys and access the valuable data.


Performance Marketplace is a convenient way to exchange information of interest to the performance improvement community. Take a few moments each month to scan the listings for important new events, publications, services, and employment opportunities. To post information for our readers, contact ISPI Director of Marketing, Keith Pew, at keithp@ispi.org or 301.587.8570.


Books and Reports
High Impact Learning by Robert O. Brinkerhoff and Anne M. Apking provides the conceptual framework for the HILS® approach and is complete with integrated tools and methods that training practitioners can use to help their organizations achieve increased business results from learning investments.

ISD Revisited is a select collection of 56 articles from ISPI’s Performance Improvement journal focused on ISD as practiced in the 21st Century. This compendium, with an introduction by Allison Rossett, provides a fresh perspective on ISD, presenting current thinking and best practices.

Conferences, Seminars, and Workshops
Darryl L. Sink & Associates, Inc. is offering the following workshops: Designing Instruction for Web-Based Training, Washington, DC, November 3-5; Instructional Developer Workshop, San Francisco, December 2-4; Criterion-Referenced Testing Workshop, Chicago, November 10-11. Visit www.dsink.com for details and to register!

Register today for Thiagi’s online course “How To Design An Effective Training Game In 10 Minutes” $25. Great content, exciting activities, online games, and personal feedback. The next online session is November 15-30. For more details, visit www.thiagi.com.

Consulting Services
So you want to be a CPT? If you have the experience, but don’t have the time, ProofPoint Systems has your solution. You provide the information, and ProofPoint does the rest. Not sure what’s involved? Call 650.559.9029, or email: info@proofpoint.net to get started.

Job and Career Resources
ISPI Online CareerSite is your source for performance improvement employment. Search listings and manage your resume and job applications online.

Magazines, Newsletters, and Journals
Chief Learning Officer Magazine. Let CLO deliver the experts to you through Chief Learning Officer magazine, www.CLOmedia.com, and the Chief Learning Officer Executive Briefings electronic newsletter. Subscriptions are free to qualified professionals residing in the United States.

Resource Directories
ISPI Online Buyers Guide offers resources for your performance improvement, training, instructional design and organizational development initiatives.

Training Services
The Power to Get Results. Martin Training Associates provides workshops, services, and products that focus on developing hard and soft skills in project management. Our methodology is universally applicable to any project and project team type. Visit www.Martintraining.net for details.

ISPI is looking for Human Performance Technology (HPT) articles (approximately 500 words and not previously published) for PerformanceXpress that bridge the gap from research to practice (please note that no product or service promotion is permitted). Below are a few examples of the article formats that can be used:

  • Short “I wish I had thought of that” Articles
  • Practical Application Articles
  • The Application of HPT
  • Success Stories

In addition to the article, please include a short bio (2-3 lines) and a contact email address. All submissions should be sent to april@ispi.org. Each article will be reviewed by one of ISPI’s on-staff HPT experts, and the author will be contacted if it is accepted for publication. If you have any further questions, please contact april@ispi.org.


Feel free to forward ISPI’s PerformanceXpress newsletter to your colleagues or anyone you think may benefit from the information. If you are reading someone else’s PerformanceXpress, send your complete contact information to april@ispi.org, and you will be added to the PerformanceXpress emailing list.

PerformanceXpress is an ISPI member benefit designed to build community, stimulate discussion, and keep you informed of the Society’s activities and events. This newsletter is published monthly and will be emailed to you at the beginning of each month.

If you have any questions or comments, please contact April Davis, ISPI’s Senior Director of Publications, at april@ispi.org.

ISPI
1400 Spring Street, Suite 260
Silver Spring, MD 20910 USA
Phone: 1.301.587.8570
Fax: 1.301.587.8573
info@ispi.org

http://www.ispi.org