We all know that what we used to call the training field, or instruction, is quickly changing. But what is it changing to? A recent article titled “The Future of the Profession Formerly Known as Training” raised more questions than it answered (Galagan, 2003). I intend to weigh in with at least some partial answers.

First, training was, and still is, a way to support learning. Training is not the only way to learn, and it often is not the best way. Second, the need for learning is not going away; it is increasing. Every time an individual or group encounters something new, there is an opportunity, and often a requirement, to learn. This can be a new job, a new task, a new business strategy, a new customer, a new product, a new competitor behavior, and so on. Third, we can take a systematic, performance-based approach to managing learning, not just training, in our organizations.

Organizations and individuals must learn faster and more effectively to adapt to the rapid pace of change. The learning capability of an organization will make a strategic difference in its performance in the face of change. A highly developed learning capability will confer a competitive advantage to the organization.

The challenge for our organizations in the new millennium is to evolve from managing training to managing learning. This is not a trivial shift. Training alone is too slow and too inflexible for much of today’s work environment. Some of the non-training learning strategies being employed today include:

  • Coaching (by managers, professional coaches, and coworkers)
  • Knowledge management
  • Informal learning strategies
  • Holistic workplace learning strategies
  • Communication as a learning strategy
  • “Action” learning
  • Electronic performance support systems
  • Performance-based qualification systems

Most of the organizations I am familiar with today do not systematically deploy these strategies along with training to meet the learning needs of the organization and its individual performers. They still focus heavily on training solutions because that is what they are trained for and organized to provide. In many cases, their internal customers are not ready to sign up for non-training learning solutions.

Managing a balanced and integrated array of learning strategies requires workplace managers and leaders to play a stronger role. Managers have been conditioned over the years to let the training department do it, and the training departments have responded. One of the biggest challenges of the transition from managing training to managing learning is gearing up the workplace organization to shoulder a major share of the responsibility. Managers do not traditionally see this as their role, and they are faced with a seemingly ever-increasing workload due to downsizing and pervasive change. For them to play their role in managing learning, they will have to learn how to do it and believe it will actually relieve the work pressure on themselves and their high performers. I believe this is one of the best areas in which the new breed of performance consultants can cut their teeth and demonstrate their value.

So, how can an organization plan to transition from managing training to managing learning? Below is a quick overview of a proven approach.

  • Engage an executive sponsor and an executive steering team, and provide them with a project plan and business case couched in the strategic interest of the business.
  • Follow a systematic process using a team of key stakeholders:
    Analyze the learning implications of business strategies, goals, and challenges.
    Analyze the population dynamics, demographics, and characteristics of the learner populations.
    Assess the capabilities of the existing learning systems.
    Design a new learning systems architecture that meets the business needs, including integrated learning strategies and a supporting infrastructure.
    Develop a multi-year implementation plan and a business case to support it.
  • Keep the executive steering team as a Learning Systems Governance Board to oversee implementation and to continue the link to the business drivers.
  • Rebuild the training staff to become a learning and development staff.

The new millennium will be more about learning systems than training systems. It will build on what has been learned while managing training systems. New strategies, roles, and skills will come into play. The people who are managing performance in the workplace will also find themselves managing learning with methods, content, tools, and consulting support from the Learning Departments.

Galagan, P. (2003, December). The future of the profession formerly known as training. Training and Development, 27-38.

Ray Svenson, president of Ray Svenson Consulting, Inc., is a recognized leader in Business-Driven Learning and Development Strategy for major corporations. His book, The Training and Development Strategic Plan Workbook, won the ISPI award for Outstanding Instructional Communication in 1994 and is a standard desk reference for learning leaders. Ray may be reached at raysvenson@qtm.net.


Would you like to advertise in this space? Contact marketing@ispi.org




by Carol Haig, CPT and Roger Addison, CPT

On December 2, 2003, in Oakland, California, Claude Lineberry passed away from post-operative complications. Like many ISPI’ers, we were deeply saddened by this news. We were scheduled to interview Claude for this column and were looking forward to talking with him. He was pleased to be invited to share his perspectives and in that spirit we have decided that this month’s column belongs to him.

A Brief Bio
Claude was co-founder and a Senior Partner of the Vector Group, Inc., an international consulting firm dedicated to helping organizations successfully achieve change. He was a frequent ISPI speaker with an international reputation for success in improving organizational effectiveness. Claude was president of ISPI (then NSPI) in 1981-82 and was named Member for Life in 1997.

Claude’s most recent book, Achieving Post-Merger Success: A Stakeholder’s Guide to Cultural Due Diligence, Assessment, and Integration, written with his business partner and close friend, Bob Carleton, is due out from Jossey-Bass/Pfeiffer this month. In addition, he co-authored Turning Kids On and Off, with Joe Harless, and Job Aids, with Donald Bullock, both now out of print.

Members Share Their Thoughts
When news of Claude’s passing was received in the ISPI community, many of you shared your recollections of Butch, as he was known to close friends. We are pleased to include your words here as we remember a vibrant presence in the world of human performance improvement.

  Needless to say, I was devastated by Butch’s death. He was my oldest friend—we were frat brothers at the U of Alabama in the early 60s, fellow doctoral students at Catholic U, business partners, co-authors, and best buddies throughout it all. Butch was the wittiest person I’ve ever known, wrote like a dream, and smart as a whip. He did not suffer fools, but [was] an awesome friend when you needed one. He lived life large, if not long. To say I’ll miss him is an understatement of the first order. —Joe Harless
  The last time Claude and I shared a private moment, we talked about the novels we were writing. I never heard whether he finished his, but his enthusiasm for his story line motivated me to double my own efforts toward completion of Killer in Our Midst (2003). Like most other conversations with Claude, his words—as well as his presence—sent me away with something of value. We miss him. —Bob Mager
  His work on corporate due diligence (with Bob Carleton, of course) is a great example of how HPT can be generalized beyond individual performance to organizational value added. —Roger Kaufman

  I met Claude Lineberry at the same time as I met the other pioneers of performance technology, when I was a neophyte to ISPI (then NSPI) over 20 years ago. I didn’t know how much he meant to me until I heard that he was no longer with us… I remember a man whose grand stature could also describe his personage. One whose largesse allowed him to spend a few moments talking with me when I was new, whose grand wisdom spewed forth in a grandly entertaining way, whose barreling voice resonated with warmth and great humor, and who contributed to the profession in a grand way.

  It was Claude’s grand application of performance technology that was special to me…Claude was one of the first in the Society to converse with me about organizational consulting. He helped legitimize its place and my place in the Society. I will remember Claude as a grand master of organizational effectiveness, and a grand guy. —Esther Powers

  Butch and I are of an age and since receiving the dismal news of his demise from Dale Brethower he has been often on my mind. I know of no good reason this should be. We have never worked a project together; we are separated by half a continent and linked only by alphabet organizations. Yet, there is no denying it; images of Claude have slipped through my mind a hundred times since his death. I see him cruising top down in a Corvette through the suburbs of Detroit; walking, arms linked on wobbly legs in the wee hours through Chicago’s Loop; sitting poolside in Miami sipping beer, and generally being the irascible, irreverent soul of ISPI. I will miss his wit, his wisdom, and his deniable warmth. I owe him a drink, and if the God of frame-writing allows booze in purgatory we shall sip it there together. — Frank Wydra
  [Claude] has been a “performance technologist’s performance technologist” since before HPT was cool. He has been an ISPI “member’s member” since the “PI” stood for “Programmed Instruction.” But he…contributed to the advancement of HPT as well. Over the years, as a big fan, I have come to trust in [Claude’s] ability to see past the trendy surface issues of the time, and focus on the basics, in a serious—but not humorless—way…The chapter he co-authored on Culture Change for the Handbook still ranks for me as one of the most concise, and least confused, treatments of the subject I have yet seen. —Rob Foshay, from the presentation of the Member for Life Award
  Butch and I only recently figured out how long we had known each other—we met at the 1972 conference, my first and his fourth or fifth. Over the years, we gravitated from friendly competitors, through associates, to business partners, developing a deep, trusting friendship along the way. Close friend, advisor, running buddy, and fellow curmudgeon plus the best business partner and editor a person could ever have. Others have described Butch well, but let me add a favorite quote he used when talking about life and his philosophy—it sums him up well. “When it comes to the banquet of life, let no one say I settled for the appetizers.” I could certainly never accuse him of that—for which he will be sorely missed. —Bob Carleton

Readers may know that Claude was one of ISPI’s most outspoken critics and supporters. ISPI was his other family, and as a responsible family member, Claude spoke his mind. By raising controversial questions and sharing his views, he made ISPI think rigorously about important issues and enriched the outcomes, regardless of the ultimate decisions.

TrendSpotters joins our contributors and ISPI members in remembering Butch for all he was in ISPI and in life.



This article is the first of two parts about Six Sigma and a comparison of Six Sigma and HPT. As more attention is placed on interventions, we need to learn more about options so that we can effectively select, promote, and support Six Sigma implementation. Six Sigma has saved companies millions of dollars, is supported by upper management, and has been credited with saving organizations.

Organizations arrive at a situation where products and services are not meeting customer needs, where they are losing prestige and respect in the marketplace, where they cannot create and maintain quality, and where the culture does not energize employees to improve. Both Six Sigma and Human Performance Technology (HPT) are comprehensive approaches that are based on systematic and systemic processes, focus on outcomes and results, and require collaboration. Many of their practices are similar, but each approach has a different origin. As a result, there are nuances that make huge differences. Six Sigma, driven more by hard data, reduced variation, and quality of the customer experience, enjoys more consensus around terminology and process. HPT is more inclusive, encouraging broader data gathering and evaluation and more unique, holistic interventions (solutions).

Six Sigma Defined
Six Sigma, a quality-centric approach to continuous improvement, is a disciplined change management process that uses data to analyze and measure deviation and to systematically eliminate the variance. “Six Sigma is both a technique and a philosophy based on the desire to eliminate waste and improve performance as far as is technically possible” (Business, 2002, p. 572). It focuses on quality of product and service, based on customer needs and expectations, and aims for “zero defects” (commonly targeted at 3.4 defects per million opportunities, or 99.9997% defect free). Sigma is the symbol for standard deviation. “It is estimated that most companies are at the two to three sigma performance level, which means that for every million customer contacts there are 308,000 to 66,800 defects per million” (Eckes, 2001, p. 1).
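These defect rates follow directly from the normal distribution. A minimal Python sketch (not from the article) converts a sigma level to defects per million opportunities (DPMO), assuming the conventional 1.5-sigma long-term shift used in Six Sigma practice:

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a given sigma level.

    Applies the conventional 1.5-sigma long-term shift, so a
    "six sigma" process corresponds to the 4.5-sigma tail area
    of the standard normal distribution.
    """
    z = sigma_level - shift
    # Upper-tail probability of the standard normal, via erfc.
    tail = 0.5 * math.erfc(z / math.sqrt(2))
    return tail * 1_000_000

for level in (2, 3, 6):
    print(f"{level} sigma: {dpmo(level):,.1f} DPMO")
```

The computed values reproduce the figures quoted above: roughly 308,500 defects per million at two sigma, about 66,800 at three sigma, and 3.4 at six sigma.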

Six Sigma takes discipline, long-term senior executive commitment and willingness to drive the initiative, extensive training, new terminology borrowed from the martial arts, and a shared vision. Six Sigma originated at Motorola, building on statistical tools and techniques developed by Joseph Juran; teams and culture are critical aspects for success. Improvements are based on team projects and need to be infused into the culture. Quality, performance, productivity, and competitive advantage improve while costs and waste are reduced. Six Sigma is now practiced by mid- and small-sized suppliers, healthcare organizations, government agencies, and others.

Six Sigma Roles
The most fascinating and significant aspects of Six Sigma are its organizational roles. A member of the senior executive C-team is the quality leader and champion, while the senior executive team drives the initiative and serves as role models on a long-term basis. Six Sigma is not an initiative for everyone but top management: executives must “walk the talk” and drive the vision and the execution. Thorough knowledge and understanding of the Six Sigma details are critical for employees in all job categories. Using the terminology of karate, master black belts—who achieve the highest mastery of tools, techniques, and concepts—serve as internal coaches, providing tutorials, facilitating meetings, and advising the champion. Black belts are full-time and green belts are part-time, having more tactical responsibilities. They lead several yellow-belt action teams (Davis, 2003, p. 20), using outstanding project management skills applied to projects usually lasting six months or less. Teams consist of well-trained members competent in the DMAIC (Define, Measure, Analyze, Improve, Control) process for problem solving or DFSS (Design for Six Sigma), which is comparable to Harless’ New Performance mindset. Commitment and belief in Six Sigma are critical for the success of any project or organizational effort.

Six Sigma is a well-defined strategy with an impressive history of substantial results in costs saved, deficits reduced, variation decreased, and customer satisfaction and loyalty improvements. It can be an effective intervention for HPT.

The second part of this article, which addresses the similarities between HPT and Six Sigma, will appear in the March 2004 issue of PerformanceXpress.

Business: The ultimate resource. (2002). Cambridge, MA: Perseus Publishing.

Davis, A.G. (2003, November). Six Sigma for small companies. Quality, 42(11).

Eckes, G. (2001). Making Six Sigma last: Managing the balance between cultural and technical change. New York: John Wiley & Sons.

Darlene Van Tiem, PhD, CPT, is an associate professor and coordinator of the Performance Improvement and Instructional Design graduate program at University of Michigan–Dearborn. Prior to academic life, Darlene was the Training Director at Ameritech (now SBC) yellow pages business unit for four states. In addition, she was the curriculum manager of corporate-sponsored technical training for General Motors North America through General Physics. Darlene was lead author for two ISPI award-winning books: Fundamentals of Performance Technology and Performance Improvement Interventions. She may be reached at dvt@umd.umich.edu.







In the December 2003 issue of PerformanceXpress, Neil Rackham, author of the best-selling book SPIN® Selling, spoke with ISPI President Guy Wallace about his take on Human Performance Technology. This month Neil shares his thoughts on two-way partnering and gives readers a peek into his keynote presentation at ISPI’s 42nd Annual International Performance Improvement Conference & Exposition.

Can you share with us your thoughts about two-way partnering in terms of “suppliers-with-customer” and “suppliers-with-suppliers”?

Many of the fundamentals of partnering are the same for each. Partnership is critical to people in performance improvement and to their customers. Too many small companies are too narrow in their improvement specialties. They have one piece of the jigsaw puzzle, but their customers need broader answers. As suppliers, they must partner with other suppliers to create a combined new value that individual players can’t offer. To do that they might need to partner with other performance improvement suppliers to create the total solution—the complete jigsaw.

We need to ask ourselves what is our strategy to survive against larger improvement firms that can offer a broader array of improvement solutions? What are the competencies that our performance improvement organization doesn’t have? Who can we partner with to meet the needs of the customer?

The crucial thing is not simply to add their offerings to yours, but to fundamentally create new value by redesigning how you both do business. As Andre Boisvert, who has set up many partnerships, says, “Don’t set up a partnership if all you are doing is trading four quarters for a dollar.”

If I am a performance improvement company and you are a software development company, we should be changing my improvement processes and your software development processes and tools to create something more than what we offered before. More in terms of “newer, better, faster, and cheaper” than what we would offer if the customer simply brought us together in a project, and we continued to do what we would typically do.

Partnering creates a bigger pie. Sit with your potential partners and determine what you would do, and how you would do it if you were one company, rather than two companies working side-by-side.

What will we learn that’s new during your keynote presentation in Tampa?

My keynote will focus on “A Life Misspent in Performance Improvement.” Seriously, I intend to share many of the conclusions I have come to from both working for 87 of the Fortune 500 and from my research over the past 40 years…conclusions from research that will help practitioners. We will cover a lot of ground.

Any final thoughts to share with our PerformanceXpress readers?

Yes. I think we need to be more sophisticated about measurement. We live in a metrics-driven world. We should be the ones to drive this. We need to help our customers determine how we will best measure an improvement effort, not as an afterthought, but as an integral part of its planning.

Lord Kelvin said, “If you can’t measure it, if you can’t express it in a quantitative manner, then your knowledge is of a meager and insignificant kind.” We need to ensure that this does not apply to us and our efforts at performance improvement.

For more information on Neil’s presentation in Tampa, click here.



One of the four basic principles of performance technology is that we strive to “make sure that what we do is of value.” That is, our efforts must meet a real and significant client need, cost-effectively.

The definition of needs analysis that people tend to use is narrower than that, however. Needs analysis is often defined as “an examination of the gap between the desired and current states” with the purpose of identifying factors that would allow us to close the gap.

The difficulty is that we do not always know if closing the gap addresses any real need of the organization. It may be that closing the gap between desired and current states is not of sufficient value to even justify the cost of the analysis.

Front-end analysis is a powerful tool—and we need to make sure we are not wasting it on gaps or “problems” that are not truly worth addressing. We are good at specifying the desired result, determining what factors are influencing the gap between desired and actual, and developing interventions to reconcile the gap. But are we always sure that all this effort creates significant value that warrants the cost? Did we meet a real need?

To paraphrase a saying by the old sage Mel Brooks, “Send us a sign. Are we doing the Lord’s work or just building sand castles?”

Recently, I worked with a client on an issue defined as “we need to make sure that our people follow our dress code.” We could define the gap, do an analysis, and design an intervention that would ensure that people dress to code. But what was the real need? Was the value of meeting the dress code of sufficient benefit to the organization for us even to begin the analysis? There may in fact be a real need here, and if we understood it, it might help both in the analysis and the design of the intervention.

Yes, I know the difficulty. If you are internal, how can you turn down a request from a senior manager, and if you are external, how can you afford to give up a piece of business? One way to deal with this issue is to perform a three-phase analysis:

  1. Clarify the issue: In this phase we would determine the real organizational need and clarify the value it might have to the organization if it were resolved. We would also determine if it is significant enough to warrant further analysis. This doesn’t have to be confrontational; simply ask requestors how they would define the benefits—to the organization and others—of addressing the issue. That need/benefit definition can then guide planning for further analysis, as well as design and execution of any intervention that follows.
  2. Specify the requirements: What would success look like? What results are required to satisfy the need? What would cause the requestor and/or the organization to say, “Yes! That’s the way things need to be.”
  3. Conduct an analysis: Here is where we find the current state, define the gap, and look for factors which could impact our gap resolution. A better term for this activity might be “systems analysis” rather than “needs analysis.” This would emphasize another HPT principle—“Always take a systems approach.”

Those three phases might be labeled Needs Analysis, Goal (or Outcome) Analysis, and Systems Analysis. That or a similar set of labels will help us emphasize that “closing gaps” is not necessarily meeting a need. It is simply a means of addressing an issue that the organization finds of value.

Roger Kaufman often says, “If this intervention is the answer, what was the question?” We should always start with a real need that’s more than just a statement of a “gap”—and then, once we understand the expected value, we can determine the best way to achieve it.


Brr. It’s that time of year, at least in the northern parts of the United States, when all you want to do is curl up with a hot cup of cocoa in front of a warm computer monitor. So let’s get those hands moving across the glowing Internet! Each month, we take readers to off-the-beaten-path sites that help them find similar thinkers, resources, work, new ideas, and sometimes just plain old fun.

Quick recap: Every month, three sites, one theme. While far from comprehensive, hopefully these sites will spark readers to look further and expand views about human performance technology (HPT). Please keep in mind that any listing is for informational purposes only and does not indicate an endorsement either by the International Society for Performance Improvement or me.

These are the general categories I use for the sites featured:

  1. E-Klatch: Links to professional associations, research, and resources that can help refine and expand our views of HPT through connections with other professionals and current trends
  2. HPT@work: Links to job listings, career development, volunteer opportunities, and other resources for applying your individual skills
  3. I-Candy: Links to sites that are thought provoking, enjoyable, and refreshing to help manage the stresses and identify new ideas for HPT

The theme for this month’s column is The Last Straw. Improving performance is stressful work. It is critically important for performance technologists to understand the stresses we face as we strive to create better workplaces around the world. Sometimes we hear people complain about “the last straw”—as in the final piece of straw that broke the camel’s back. If we can improve a situation, by opening up the HPT toolbox to build a lighter straw and/or a stronger camel (OK, so the metaphor is starting to get away from me), we can demonstrate our benefit to individuals at all levels of an organization through reduced stress and enhanced performance. Here are some resources that can help. Hang onto your last soda straws, too.

If you’re feeling stressed from work and life, you’re not alone. And thanks to the Internet, there are resources you can easily access. For a comprehensive set of stress management links, including connections to articles and associations, visit the United Kingdom-based Higher Education and Research Opportunities (HERO) links on stress. Sample links include: Advice for travelers, jet stress, how to stay healthy while flying, and Thirteen Tips for More Effective Personal Time Management. While many resources are geared toward students in the UK, others can benefit from the variety of helpful tools, including some Career links. This website was developed as an independent and impartial free source of information for students and other learners and for advisers and others working in education, training, and communities.

If the last straw pushes you to look outside of where you’re at, to explore the world, then pay a visit to Northwestern University’s Foreign Government Job Information site. You can learn how to establish a business in the Bahamas, apply for a position with the Singapore Civil Service as a Ministry of Manpower Management Executive (you’ll get to “evolve the School of Lifelong Learning”), or get away from it all to hang out with the ptarmigan (Lagopus mutus hyperboreus), the only non-migratory bird species in Svalbard (in the Kingdom of Norway), “which remains on the islands year-round.” This amazing list of international resources links visitors to banks, ministries, government offices, budgets, and statistics for nations across the planet, from the (Islamic Transitional State of) Afghanistan to Zambia. Hey, perhaps this would be a good resource for organizing more international chapters of ISPI—in Guernsey, Kyrgyzstan, Niue, and elsewhere.

Well, if you want to play with the last straw, trundle on over to the Sodaplay site designed by London-based Soda Creative Ltd. Here you can build your very own mobile soda straw digital creature! These two-dimensional robots move, climb, and float as you alter their design and aspects of their environment. For inspiration or relaxation, you can visit the many creations in the SodaZoo. You can have fun as you try to build your own with the SodaConstructor. Let them try to climb the walls, so you don’t have to.

Until March, “e-” well.

When he is not Internet trawling for ISPI, Todd Packer can be found improving business, non-profit, and individual performance through research, training, and innovation coaching as Principal Consultant of Todd Packer and Associates based in Cleveland, Ohio. He may be reached at toddpacker@usa.net.



by Michael Peters, CPT

ISPI’s Annual International Performance Improvement Conference & Exposition has always been about professional learning and growth; expanding the breadth and depth of what we know, how we do what we do, and the impact we have on work, education, and the international community. And this year’s roster of pre-conference workshops delivers on both the breadth and depth commitments—breadth of topics and depth of both presenters and the insight they deliver.

The breadth of the 2004 Conference Workshop topics ranges from understanding our impact on business to theories, techniques, and tools for delivering better performance solutions, better instruction and e-learning, and better evaluation.

If you’re keen on getting a handle on the business aspects of HPT, sign up for Robert O. Brinkerhoff’s “Connecting HPI to Business Goals and Metrics” (1 day) or Mike Kicidis’s “Turning Intangible Assets into Tangible Concepts” simulation workshop (1 day).

For a performance focus on HPT, you can choose workshops that deliver on the process, specific performance programs, or performance methods. For process, there’s Roger Kaufman’s “Needs Assessment: What it is…How to get it done” (1 day). Jack Wolf describes a successful performance program in his half-day “Shift to Performance at HBO.” If you’re looking to amp up your performance skills and techniques, you should check out Kimberly Morrill and Mark Munley’s “Profiling Your Client’s Business” (1 day), Paul Staples’ “Performance Mapping Program” (1 day), or Daniel Raymond’s 2-day workshop, “Building Better Job Aids.”

There’s plenty to choose from if you want to focus on our training roots. Peggy Durbin will teach you about “Learning Object Design” in her 2-day workshop, while Carl Binder will help you in “Building Fluent Performance” (1 day). Kenneth Silber’s 1-day workshop will deliver a cognitive approach to problem-solving training. Partnership is the theme in James Tamm’s “Building Collaborative Partnerships” (1 day) and Thiagi’s 1-day workshop “Partnering for High Performance in Teams—A Playful Approach.” Thiagi continues his partnership motif with an offering on “Partnering with Participants: Facilitating Human Performance,” while Mel Silberman will give you “20 Ways to Become a Consummate Team Facilitator” (1 day).

And if your interests extend beyond HPT’s training roots to e-learning, we’ve got plenty of quality to offer there, as well. Ruth Clark delivers a 1-day workshop on “E-Learning and the Science of Instruction—Applied,” while Saul Carliner has a 2-day session on “Advanced Design for E-Learning.” In addition, Marie Jasinski will help you realize the full potential of e-learning in her 1-day workshop on “Web-based Role Play and Simulation.”

And just to make sure we don’t forget what counts—Bill Lee, with his 1-day workshop, “Evaluation” and Jack Phillips, with his 2-day workshop, “Measuring ROI,” reinforce the importance of achieving and demonstrating results with everything we do.

Lastly, back by popular demand, Bill Coscarelli, Sharon Shrock, and Patricia Eyres will conduct an encore workshop titled “Constructing Level Two Evaluation & Certification Systems: Technical and Legal Guidelines” (1 day).

The conference workshops come in all sizes (half-, one-, and two-day), on Monday, April 19 and Tuesday, April 20. So, whether you're looking to gain a depth of knowledge from an established expert or a rising star, sign up for one of these exciting learning experiences. Click here for complete workshop descriptions, and register today!


In this column I’ve occasionally suggested
that we don't need statistical designs to do performance measurement and evaluation. I've mentioned that with ongoing counts of behavior, accomplishments, or business results, and standard graphic analysis methods, we can easily see when our interventions or decisions produce changes in trends or levels. I've shared evaluation designs that allow us to determine whether interventions caused those changes, without statistics. But I've never really come out and said it directly: Don't use statistical methods.

Well, now I’ve said it.

Why, you might ask, would I argue against statistics? In a field (HPT) that considers itself to be research-based, where some of us passionately describe and advocate (if not always practice) measurement of results, why would I discourage people from using statistical methods? Let me list two of the reasons.

Statistics Emphasize Significance, Not Value
Anyone who has read research reports in education, psychology, or related fields has surely seen published reports of the effects of interventions on learning, memory, or other outcomes linked to improvement of human performance: studies showing highly "significant" results that are practically worthless because the effects are so small.

It’s important to recognize that statistical “significance” means only that a result is not likely attributable to chance, that the difference between a baseline or control condition and the effect of the intervention was probably a “real” effect. But statistical significance does not in any way imply that the result has practical or economic significance or value.

Instead of statistical significance, performance technologists should look for value or practical significance resulting from our work. This has to do with the size of the effect, the reliability with which we can produce it across individuals or groups, and the impact it has on some type of important human endeavor. Look for the big results, and don’t sweat the small stuff. Small “significant” effects are a distraction, not a desired outcome, for the HPT practitioner.
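To make the point concrete, here is a small numeric sketch (all numbers hypothetical, not drawn from any study cited in this column) showing how a trivial improvement becomes "highly significant" once the sample is large enough:

```python
import math

# Hypothetical example: two large groups, before vs. after an intervention.
# The improvement is tiny (0.3 units on a mean of 100), but the sample is huge.
n = 10_000           # participants per group
mean_before = 100.0
mean_after = 100.3   # a 0.3% improvement
sd = 5.0             # standard deviation in both groups

# Two-sample test statistic for the difference in means
se = math.sqrt(sd**2 / n + sd**2 / n)   # standard error of the difference
t = (mean_after - mean_before) / se
print(f"t = {t:.2f}")                   # about 4.24: "significant" at p < 0.0001

# Practical value tells a different story: the standardized effect size
cohens_d = (mean_after - mean_before) / sd
print(f"effect size = {cohens_d:.2f}")  # 0.06: a trivially small effect
```

A client looking at the raw numbers would see a 0.3% improvement and shrug; the significance test alone would suggest a triumph. That gap is exactly the distraction described above.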

Statistics Often Intimidate and Obfuscate
Our three-year effort to obtain performance results cases for the GOT RESULTS? page on ISPI's website has shown that statistics can be an obstacle to practical performance measurement rather than a facilitator of it. Many of our colleagues—some of whom suffered through statistics courses in grad school, and others who are intimidated by not having done so—feel that they can't submit case studies unless they include some kind of statistical test. This is balderdash.

Colleagues, this is NOT graduate school! This is the "world of work," as Tom Gilbert used to call it, and what we are looking for are large, valuable improvements in behavior, accomplishments, or business results. We need them to be so big and so valuable that our clients stand up and cheer when they see them, without having to look at significance tests. Feeling intimidated because you can't put together a statistical evaluation design wastes a huge amount of time and worry. Not only that, but if we do manage to include statistical tests (and terminology) in our reports to clients and colleagues, the fact that most people don't really understand them far outweighs any credibility we might gain by their inclusion.

Please don’t let statistics get in the way of producing or communicating results. Comments, counter-arguments? Please email me, and I’ll include some of your comments and my responses in upcoming columns.

A New Book on Standard Charting Methods
I’ve received numerous requests over the last few months for information about the standard charting methods described and illustrated in some of my columns. A new book, Handbook of the Standard Celeration Chart, Second Edition, which thoroughly reviews the rationale and methods associated with the standard chart, is now available from The Cambridge Society for Behavioral Studies.

Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance and behavior to deliver measured results. He may be reached at CarlBinder@aol.com. For additional articles, visit www.binder-riha.com/publications.htm. See past columns in this series by clicking on the “Back Issues” link at the bottom of the blue navigation bar on the left.




The deadline is approaching,
but it’s not too late to register with a colleague or client to attend ISPI’s 42nd Annual International Performance Improvement Conference & Exposition, April 18-23 in Tampa, Florida. When you register for the full conference at the member or delegate rate, you may also register a colleague for only $350—provided your colleague has not attended an ISPI Annual Conference in the past three years (2001-2003).

When you register, think of a colleague at your organization, a client organization, your ISPI or ASTD chapter, or an acquaintance in the field who has not experienced a recent ISPI conference. Offer that person an opportunity to save hundreds of dollars while benefiting from the premier educational event in workplace performance improvement.

If you have not attended an ISPI Annual Conference in the past three years, you will want to register with a colleague. Find someone you know who plans to attend, register together, and one of you will register for only $350. The deadline is February 13, 2004. Click here to register!




Maximizing the performance
of your human capital confers a real advantage when leveraged. Training, properly applied, can be a catalyst for maximizing human performance. One way to leverage it is through effective measurement.

The purpose of measurement is to establish a process whereby you can estimate the change in human performance, isolate the portion attributable to a driver of performance such as training, and adjust the result for conservatism.

Estimation is a process commonly used in business today. Salespeople estimate their future sales, and accountants estimate the cost of a warranty or claim expected in the future. So, too, can training personnel ask participants, supervisors, experts, and others to estimate the impact a training program will have on job performance. Participant estimation, as it is commonly called, does not estimate the performance change related solely to training; it asks participants to estimate job performance changes in general, including, among other factors, training.

For example, someone who attends sales training might estimate an increase in job performance, but that increase could be related to other factors, such as a competitor going out of business. So estimates of performance change need to take many factors into account, including process changes, people changes, marketplace changes, technology changes, and interventions such as training.

When estimating the increase, the person(s) doing the estimate should think carefully about all the factors mentioned. They may want to review historical and forecast data to factor these reasonably into the overall performance change. In addition, they may want to look at business results such as quality increases, sales increases, cycle-time decreases, cost decreases, and risk decreases (the end outputs of human performance change) before versus after training, and compare them against a control group that did not receive the training.

Logically, the training department is keenly interested in the effect training had on the performance improvement. So the next step is to isolate the estimated increase in performance to training alone. In this part of the process, the person(s) doing the estimates should estimate how much the training has influenced, or will influence, job performance relative to the other factors, and assign a value to it. If the salesperson felt that training was the strongest factor behind the change, or would be the driving force behind future change, it would receive a correspondingly higher value.

Finally, because estimation and isolation are at times subjective, one must adjust the results of the estimate. Again, this is commonly done in other facets of business. Using most-likely, optimistic, and pessimistic scenarios adjusts estimates for estimator bias and flaws in assumptions. You'll often see sales forecasts reported in this manner.

In training, adjustment is made for two reasons. The first is conservatism: it is critical to be conservative in your assumptions to build integrity into your metrics. The second is bias: estimates can be inflated. In fact, studies done by organizations such as the Tennessee Valley Authority (TVA), and separate studies by KnowledgeAdvisors, suggest that respondents tend to overestimate by about 35%. To this end, when computing a human performance change, one might reduce the inputs by 35%, or a similar confidence rate, as the adjustment factor for conservatism and bias.

Taken together, the principles of estimation, isolation, and adjustment form a powerful, systematic, replicable, and comparable model for tabulating human performance change.
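The three steps reduce to simple arithmetic. As a sketch (the function name and all input figures below are hypothetical illustrations, not part of any published model): take the estimated overall performance change, multiply by the share attributed to training, then discount by the 35% adjustment factor cited above.

```python
def adjusted_training_impact(estimated_change_pct: float,
                             isolation_to_training: float,
                             adjustment_factor: float = 0.35) -> float:
    """Estimate the performance change attributable to training.

    estimated_change_pct   overall estimated job-performance change (e.g. 20 for 20%)
    isolation_to_training  share of that change attributed to training (0 to 1)
    adjustment_factor      discount for conservatism and estimator bias
                           (0.35 per the TVA/KnowledgeAdvisors figure cited above)
    """
    isolated = estimated_change_pct * isolation_to_training   # isolation step
    return isolated * (1.0 - adjustment_factor)               # adjustment step

# A salesperson estimates a 20% performance improvement and attributes
# 60% of it to the sales training program.
impact = adjusted_training_impact(20.0, 0.60)
print(f"Conservative training impact: {impact:.1f}%")  # 20 x 0.60 x 0.65 = 7.8%
```

The conservative figure (7.8%) is what would be reported to the client, rather than the raw 20% estimate.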

A reader may say, "This is not credible data; it is not statistically accurate, and it is too subjective." My response would be that the world of human performance measurement is far from objective and accurate. The goal is to have roughly reasonable indicators of performance change without expending considerable human, financial, or physical resources to get them.

Recognize that attempts to go from roughly reasonable to highly accurate require tremendous outlays of resources. In addition, research suggests you should not bother. A study published in the May 2003 Harvard Business Review found that senior managers make decisions on instinct and other factors, not on the highly accurate, highly costly data provided by highly paid number crunchers. They use such data as one of many inputs, preferring timely rough estimates to precise metrics that arrive too late to factor into decision-making.

The first step in human performance measurement is to understand that in a world of doing more with less, every element of the organization involved with performance improvement needs to think seriously about how to measure the impact its initiatives have on human performance.

Jeffrey A. Berk is Vice President of Products and Strategy for KnowledgeAdvisors, a corporate learning business intelligence firm that helps organizations gain the knowledge to improve human performance, better educate their workforces, and reduce costs across the enterprise. Its proprietary measurement technologies and benchmarking expertise help companies more successfully measure human performance change due to training. Jeffrey may be reached at jberk@knowledgeadvisors.com.



The Presidential Initiative Task Force is currently winding up Phase 4 of Stage 1. For a quick background on this effort, see my article in the January 2004 issue of PerformanceXpress. For additional information and all of the past articles published on this effort, readers can click here.

Recent Steps
On January 5, 2004, the Presidential Initiative Task Force conducted a conference call to review their draft report page-by-page. This report was then updated and sent to the Board of Directors for review at their January meeting. The results of that review will be published next month.

The Task Force’s work outputs include the following:

  • HPT Definition and Criteria
  • A New HPT Framework
    •  Performance Systems Engineering Process
    •  Performance Analysis/Design Framework
    •  Technology and Research Domains
  • A Recommended Governance Structure

Our only real open issue is the naming and numbering of "the" HPT technologies. As expected, this is the most controversial and difficult of our tasks. I believe that is because each of us has developed, over the years, our own mental models and rationales for them, and now they have begun to collide in our group process. Part of the difficulty of moving on is also deciding which potential functionalities this new framework for HPT should serve best. In my view, the new framework (the Domains) should help us clarify the various technologies of HPT and help us organize all of our "stuff," including our members.

We can then better ensure that the content of our publications and forums (conferences and institutes) is appropriately balanced in terms of the processes of HPT, the intervention sets that address the variables of performance, and the underlying science. We should be able to count the articles and pages devoted to content from each technology.

Another functionality test I use: in a networking activity, would I ask everyone in the room to go to the "table" with the banner overhead for "their" main Domain (as many will have more than one)? Since learning and networking are the two big reasons for ISPI participation, we could do something like this at a future spring conference to kick off the gathering of 1,200–1,500 attendees, where about 60% or more are brand new each year. It is hard to network in such a large group. If our 6–9 Domains minimized gaps and overlaps, then each attendee could find a "homeroom" in the ISPI school of HPT.

Next Steps
The specifics of the Presidential Initiative Task Force outputs will, we hope, be published to the entire Society after Board review and approval. But we're not done at that point; there is plenty of work yet to do. The Board will be asked to begin laying the groundwork for the Task Force(s) that will need to be put in place (some soon, some later) to carry forward the work begun and to engage a greater number of Society members in the process. I'll report back to you soon.



The American Productivity and Quality Center (APQC) and its researchers have identified best practices and discovered effective methods of improvement for more than 25 years. During that time, compelling stories have surfaced of model organizations that take aggressive, intelligent steps to improve their operations. APQC has had the opportunity to witness the evolution of successful initiatives at those organizations. The Xerox Profile: Best Practices in Organizational Improvement examines how an organization began its improvement efforts, how its focus evolved, and what challenges it has faced. The profile offers an excellent benchmark for comparing your own organization's improvement efforts.

The information in The Xerox Profile is from 12 consortium benchmarking studies conducted over the last decade, as well as articles APQC has published. The bulk of this report details information APQC study teams captured during site visits to Xerox. And, because Xerox is continuously making efforts to improve, it is important to note that the initiatives in this profile are still evolving. As Xerox itself asserts, “The ability to learn faster than your competitors may be the only sustainable advantage.”

APQC has found that progressive organizations, like Xerox, know that an unfaltering focus on continuous improvement is the key to achieving and sustaining success. Xerox is a global leader in technology innovation, with $1 billion spent annually on research and development. Consequently, a key element of Xerox improvement initiatives is innovation. Themes of research and development, knowledge management, new product development, and response to customer needs are evidenced throughout this publication.

It is interesting to review what Xerox reveals about certain activities, such as measures and communication, in the context of different initiatives. Far from being isolated organizational improvement efforts, these initiatives progress in a symbiotic manner so that Xerox can achieve both process and performance excellence. The company has a history of first implementing an initiative internally, perfecting the process, and then approaching its customers to help them in their implementation efforts. Xerox is an excellent example of an organization that systematically improves and then capitalizes on those improvements to gain strength in the marketplace.

The Xerox Profile: Best Practices in Organizational Improvement details Xerox's history and developments in performance improvement. For example, Xerox's internal communications team uses a variety of techniques for measuring communications at both the corporate and business unit levels. Its performance appraisal process is used to measure the communications function's effectiveness. Additionally, Xerox's knowledge management team measures its success through the annual employee survey, which covers the 30 elements of Xerox's management model. The assessment effort is voluntary.

ISPI members will benefit from insight into:

  • how Xerox measures its customer solutions representatives;
  • the company’s performance enhancement planning process; and
  • Xerox’s evaluation of itself based on eight critical success factors.

For more information on Xerox or to purchase The Xerox Profile, please visit APQC’s online bookstore at www.apqc.org/pubs.



ISPI’s 2005 Annual Conference Program Committee is looking for volunteers to serve as track chairs and evaluators for the conference proposal review process.  Volunteers are needed to review proposals in the following areas of concentration: 

  • Analysis
  • Interventions
  • Metrics, Measurement, Process, and Practice
  • Using HPT to Manage Business Results
  • The Future of HPT
  • The Business of HPT

The majority of the work performed by evaluators will occur between August 1 and November 1, 2004.  During this time, track chairs and evaluators will work remotely with ISPI staff and make recommendations regarding the proposals submitted for presentation at ISPI’s 2005 Annual International Performance Improvement Conference and Exposition in Vancouver, British Columbia. 

Each of the six review teams will consist of approximately 10 members: a track chair, a deputy track chair, and evaluators.  Evaluators must be ISPI members in good standing. 

Additional Requirements for Track Chairs

  • Track chairs must plan to attend ISPI’s 2004 Annual Conference in Tampa, Florida, including the Track Chair and Evaluator Committee Orientation
  • Track chairs and evaluators cannot be planned presenters at ISPI’s 2005 Annual Conference in Vancouver

If you are interested in serving as track chair or evaluator, please email conference@ispi.org. In the subject area of your email, write 2005 Evaluator/Track Chair Volunteer.  Your email should include your detailed contact information, and your first and second choice of  “tracks.”




Participate in groundbreaking research by sharing your experiences concerning how your instructional design preparation matched up with the ID position you eventually acquired! A brief, 15-minute, online survey asks you to identify your career environment (for example, higher education, business and industry, K-12 education, and more), whether you were prepared specifically to practice design in that environment, and if so, how you were prepared. Results of this survey will identify programs that do a particularly good job of preparing instructional designers for specific career environments. To share your experiences, please access the survey by clicking here (no identifying information will be collected as a result of your participation).

This study, available online until February 29, 2004, is being conducted by the Center for Instructional Technology Solutions in Industry and Education (CITSIE) at Virginia Tech University. If you have questions about the study, please contact Miriam Larson at milarso1@vt.edu.



Performance Marketplace is a convenient way  to exchange information of interest to the performance improvement community. Take a few moments each month to scan the listings for important new events, publications, services, and employment opportunities. To post information for our readers, contact ISPI Director of Marketing, Keith Pew at keithp@ispi.org or 301.587.8570.

Books and Reports
High Impact Learning by Robert O. Brinkerhoff and Anne M. Apking provides the conceptual framework for the HIL approach and is complete with integrated tools and methods that training practitioners can use to help their organizations achieve increased business results from learning investments. Winner of the 2004 ISPI Award of Excellence for Outstanding Instructional Communication.

Conferences, Seminars, and Workshops
Darryl L. Sink & Associates, Inc. is offering these workshops in 2004: The Criterion Referenced Testing Workshop, Atlanta, GA, March 22-23; Designing Instruction for Web-Based Training, Chicago, March 15-17; and The Instructional Developer Workshop, Chicago, March 1-3. Visit www.dsink.com for details and to register!

Faster, Cheaper, Better. Let Thiagi and his team design your web-based training and live e-learning sessions. True interactivity is in the mind—not in the mouse. Exciting activities require and reward higher-level thinking and application. For more details, visit www.thiagi.com.

2004 Measuring & Benchmarking Training Conference, February 23-25, 2004, Las Vegas. Learn proven methods to OPTIMIZE the ROI of your training program. Attend unique and informative workshops presented by training leaders such as Intel, Home Depot, Pfizer and United Airlines. 10% Discount for ISPI Members!



Continuing Education
California State University, Fullerton is accepting applications for its online Master’s Degree Program in Instructional Technology and Design. For a career in a dynamic and growing profession, this program is a perfect addition to anyone’s resume. For more information, please visit: www.msidt.fullerton.edu/apply.htm.

Job and Career Resources
ISPI Online CareerSite is your source for performance improvement employment. Search listings and manage your resume and job applications online.

Magazines, Newsletters, and Journals
Performance Improvement Quarterly, co-published by ISPI and FSU, is a peer-reviewed journal created to stimulate professional discussion in the field and to advance the discipline of Human Performance Technology through literature reviews, experimental studies with a scholarly base, and case studies. Subscribe today!

Resource Directories
ISPI Online Buyers Guide offers resources for your performance improvement, training, instructional design and organizational development initiatives.


Are you working to improve workplace performance? Then, ISPI membership is your key to professional development through education, certification, networking, and professional affinity programs.

If you are already a member, we thank you for your support. If you have been considering membership or are about to renew, there is no better time to join ISPI. To apply for membership or renew, visit www.ispi.org, or simply click here.



ISPI is looking for Human Performance Technology (HPT) articles (approximately 500 words and not previously published) for PerformanceXpress that bridge the gap from research to practice (no product or service promotion is permitted). Below are a few examples of the article formats that can be used:

  • Short “I wish I had thought of that” Articles
  • Practical Application Articles
  • The Application of HPT
  • Success Stories

In addition to the article, please include a short bio (2-3 lines) and a contact email address. All submissions should be sent to april@ispi.org. Each article will be reviewed by one of ISPI’s on-staff HPT experts, and the author will be contacted if it is accepted for publication. If you have any further questions, please contact april@ispi.org.



Go to printer-friendly version of this issue.

Feel free to forward ISPI’s PerformanceXpress newsletter to your colleagues or anyone you think may benefit from the information. If you are reading someone else’s PerformanceXpress, send your complete contact information to april@ispi.org, and you will be added to the PerformanceXpress emailing list.

PerformanceXpress is an ISPI member benefit designed to build community, stimulate discussion, and keep you informed of the Society’s activities and events. This newsletter is published monthly and will be emailed to you at the beginning of each month.

If you have any questions or comments, please contact April Davis, ISPI’s Senior Director of Publications, at april@ispi.org.

1400 Spring Street, Suite 260
Silver Spring, MD 20910 USA
Phone: 1.301.587.8570
Fax: 1.301.587.8573