by Karyn Patterson

These days, it seems that as we develop more innovative methods for delivering training, our decisions on which ones to use get harder, not easier. And everyone is talking about blended solutions. So how do you find the right method or blend?

The most important thing for us to consider in answering the question is the level of knowledge or skill that we expect from the learner after training. We look at three general levels:

  • Awareness: This means that learners should know specific facts and concepts and how they could be applied in the real world. For example, with training on your organization’s sexual harassment policy, you would only expect the learner to be aware of the policy and the consequences that would result from specific behavior.
  • Understanding: Here, learners will have a limited ability to perform, using cases that are simpler than what they will encounter in the real world. They might be able to approximate the performance required of new users, but this cannot be guaranteed. You might teach to this level of performance for a new computer system that learners can gradually master while practicing on the job.
  • Skill: At this level, learners will have the ability to perform competently in the real world, under a variety of conditions, and meet the standards for performance. You would take this approach when training on new surgical procedures, for example. You do not want to take any chances here.

Note that the skill level of performance can rarely be achieved through training alone. More often, there needs to be a component of on-the-job training with mentoring and coaching support.

The DOPLER Decision Matrix℠
Okay, now back to our original question: How do you choose among the plethora of delivery options? We like using a simple matrix (called the DOPLER Decision Matrix℠) that plots the expected performance against a continuum of delivery options ranging from self-study print (with no real-time, contextual feedback) to on-the-job coaching (with complete real-time, contextual feedback). The range of acceptability is a zone where the delivery method matches the level of knowledge and skill required. Anything outside the zone is either inefficient (you will get the performance you expect, but you will spend more money and time than you should) or ineffective (the learner won’t be able to achieve the performance you expect).

For example, if it is determined that sales support professionals need an understanding level of knowledge in the topic of order entry, begin at the left-hand side where the word understanding appears and follow a horizontal line across until it reaches the left and right diagonal lines of the range of acceptability band. From each point where the horizontal line crosses a diagonal, follow a vertical line down to the delivery methods continuum; the methods falling between those two points are the most efficient, effective methods of delivery—in this case, workbook, web/CD, or virtual classroom.

If you know that an awareness level of knowledge is required for a particular topic area, the matrix would indicate that anything from audio to workbook would be appropriate. If the learners are going to be in town for a meeting, however, and you choose to transfer the knowledge in a classroom setting rather than producing a workbook, the objective will still be met, but the matrix will indicate that this is an inefficient delivery method. Inefficiency indicates only that human resources or more expensive technology was used where a simpler method would have sufficed, given the depth of knowledge required.

Falling into the yellow is in no way an indictment of the method employed. In fact, you may decide to pilot a new technology-enabled course in a classroom setting to test the exercises prior to investing in actual development.

While falling into the yellow can be justified, falling into the red is a problem. The red zone indicates that the learner will not be able to achieve the objective, as determined by the depth of knowledge required, using that delivery method.

For example, if you want to teach a child to ride a bike, a complex skill requiring physical and cognitive ability, the decision matrix would indicate that only classroom and coaching delivery methods would be appropriate. No one would expect that showing a video or logging onto “” would be sufficient to ensure the child has the competence and confidence necessary to ride a bike.
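Pulling the examples above together, here is a minimal sketch in Python of how a matrix-style lookup could be encoded. This is not part of the DOPLER materials; the ordering of the delivery-method continuum and the band boundaries are illustrative assumptions drawn from the examples in this article.

# A hypothetical sketch, not the published DOPLER matrix: the method ordering
# and band boundaries below are assumptions based on the examples in this article.
DELIVERY_METHODS = [  # ordered from least to most real-time, contextual feedback
    "self-study print", "audio", "video", "workbook", "web/CD",
    "virtual classroom", "classroom", "on-the-job coaching",
]

ACCEPTABLE_BAND = {  # (lowest, highest) acceptable method for each performance level
    "awareness":     ("audio", "workbook"),
    "understanding": ("workbook", "virtual classroom"),
    "skill":         ("classroom", "on-the-job coaching"),
}

def classify(level, method):
    """Return 'acceptable', 'inefficient' (yellow), or 'ineffective' (red)."""
    low, high = (DELIVERY_METHODS.index(m) for m in ACCEPTABLE_BAND[level])
    idx = DELIVERY_METHODS.index(method)
    if low <= idx <= high:
        return "acceptable"   # green zone: matches the required depth of knowledge
    # More feedback than needed wastes time and money; less feedback than needed fails.
    return "inefficient" if idx > high else "ineffective"

print(classify("awareness", "classroom"))  # inefficient: the objective is met, at extra cost
print(classify("skill", "video"))          # ineffective: the learner cannot reach the objective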

DOPLER℠ and the Blended Solution
Let’s stay with this example for a minute. We know that showing a video is ineffective in teaching a child to ride a bike, but let’s say you have identified that “Rules of Biker Safety” is a critical topic. Using a video to demonstrate the rules of safety is entirely appropriate. And logging onto “” to review facts, concepts, and rules may also be an efficient, effective delivery method for this content.

The point is, we must clearly define performance requirements, identify enabling content, provide practice, and use the most efficient delivery method to provide the required level of feedback to ensure mastery of the topic. As long as all of the modules fall into the green range of acceptability band, we will have engineered an effective, efficient blended solution.

Real-World Expectations
The underlying principle behind this model is this: In order for people to perform in the real world, they must be given opportunities to practice and receive feedback on that practice. The more the conditions of practice mirror the conditions of the real world, and the more we provide individualized, performance-specific feedback, the more likely learners are to actually transfer skills learned in training to their job.

Karyn Patterson is a senior performance consultant at Triad, a consulting firm providing custom learning and performance support solutions based in Farmington Hills, Michigan. She specializes in leading project teams for large-scale training and human performance improvement initiatives. One of these initiatives, a year-long process design and documentation project for General Motors, received the Premier Award from the Michigan chapter of the International Society for Performance Improvement. Karyn may be reached at





by Carol Haig, CPT and Roger Addison, CPT

As we move into 2003, TrendSpotters has changed its focus from identifying current trends to peeking into the future. We are asking our guests to predict what HPTers and the organizations we serve will encounter in the next few years. To start us off, we watched Judy Hale, CPT, of Hale Associates and immediate past president of ISPI, as she peered into her crystal ball. Judy, who has clients and contacts across a broad spectrum of businesses, synthesizes her experiences and observations into three closely related predictions. She may be reached at

Three Key Predictions
Within the next two to three years, Judy expects to see an increase in the outsourcing of entire functions, such as learning and development, to large firms that are highly successful but may lack direct expertise in such functions. This practice is predicated on the success of Human Resources in outsourcing functions like payroll, compensation, recruitment, and outplacement. Learning and development are likely to be next. Large firms such as IBM, Accenture, and Intrepid, as well as major advertising agencies, are going after the training and development end of our business.

Many business services can be competently delivered at much lower cost when provided from an offshore location. Examples include the production of books, periodicals, and media, as well as computer programming. Soon, we will see an increase in offshore contracting to develop training projects and programs. This will present new challenges because the work requires English-language expertise and cultural knowledge.

We will also see an increase in competition among providers of HPT-related products and services.

  • Reports on the millions of dollars spent annually on training have piqued the interest of the large firms described above.
  • Downsizing and business failures will continue to put independent practitioners on the street, competing for consulting business.
  • It is a buyer’s market, with fees falling, unemployment rising, and customers demanding fee concessions and demonstrated proof of success before they purchase services.
  • The larger vendors in the human performance improvement arena will cut their costs by contracting offshore, thus shaking up their current contractor and vendor base.

What Will Drive These Predictions
Large companies such as Microsoft and Weyerhaeuser are creating demand for outsourced functions because they have client relationships at the highest levels in major organizations and can leverage them to offer additional services. For example, a nationally known advertising company that has run successful ad campaigns for a client firm can say to the CEO, “Look, we have done a superb job with all your advertising. Give us your training function, and we will handle that for you as well. And you know the quality we produce.” The CEO may well reason, “If it looks good, it must be good,” and contract for the outsourcing.

All the well-publicized funds allocated to training and education will still be spent; the conglomerates that provide outsourcing will go offshore to hire out the design, development, and production at lower cost, and the results likely will be of far lower quality than those produced formerly.

How Organizations Will Be Different
How HPT-related work is accomplished inside organizations will also be different. Already, purchasing departments are becoming more involved, increasing cost controls and purchasing HPT services through the classical commodity-buying model rather than through client relationships.

If the learning and development function remains in place, decision-making power may be usurped in a new organizational structure, changing how projects are managed and outside services are used. If the function is outsourced, employees may be transferred to the new service provider, reporting through a structure that contains limited knowledge of human performance improvement.

Implications for HPT Work
Generally, organizations will still improve human performance. When they contract out, it will continue to be for smaller projects, affecting external vendors and independent consultants. Project timeframes will compress further, forcing HPT practitioners to think like minimalists and ask themselves: Given that we cannot afford to do this project “right,” what can we provide that will be just enough to get the job done?

As a discipline, HPT will have to develop faster, leaner, meaner processes for our work, moving far beyond what we accomplish today with rapid prototyping. We will have to be quick without sacrificing effectiveness.

A major challenge we will face is compensating for the lack of intimate organizational knowledge that outsourcing and offshore contracting will produce. As we know, many projects require consideration of the finer elements of the internal culture and are most efficiently executed by inside staff. We will have to develop tools to manage this lack of intimacy so that projects and programs built outside will be successful and produce results.

Over the years, our HPT models and rules have served us well because of the predictive nature of our work. Now we are challenged to take a minimalist view and revisit those models and rules, updating them to let us produce results smarter, faster, and cheaper. By first working collegially to improve our tools and our technology, we could then be effective with all types of project teams, helping them improve their processes and results.

A new role we might develop, to aid our clients and ourselves in working differently, would interface between the outsourcing company and the client to provide stewardship services. Like the old User Representative role in systems organizations of yore, the Steward would ask:

  • What is the work to be done?
  • What is the best way to do it?
  • Who is the best choice to do it?

Ultimately, we in HPT will have to continue to demonstrate that performance improvement makes a difference to organizational results, customer satisfaction, and productivity.

If you have any predictions about the future of HPT that you feel would be of interest to the PerformanceXpress readership, please contact Carol Haig, CPT, at or Roger Addison, CPT, at


by Diane Gayeski, PhD

Would you like to show increased impact and return on investment for your training and performance improvement projects? I am going to show you how to turn $100,000 of cost savings into more than $1 million in organizational value! Now that I have got you hooked, let me tell you why we need to do this.

Although we have made some strides in developing tools to manage and calculate the ROI (return on investment) of projects, in my opinion this is not sufficient. There are several reasons for this:

  • Real money. Let’s say you created a project that saved or made an organization a few hundred thousand dollars—you would probably be very proud of yourself! But for most large organizations, this is lunch money. We need to find a way to work with seven-figure returns on investment to get the attention and credibility we desire.
  • Ensuring future results. Even though we use replicable methods, it is still impossible to guarantee that future interventions will have similar results, and it is generally very difficult to isolate the variables that caused business results. With current methods of evaluation, we are only as good as our last project. We need to show that we contribute to the present and the future.
  • Empowered employees. With today’s information technologies and flattened management structures, employees at all levels are able to do a lot of what HR, training, and employee communications professionals used to do for them. So, we need to show that we have another role—that of “highway engineer” instead of “chauffeur.”
  • Too much change to manage alone. Most of my clients tell me that they cannot keep up with the organizational changes that imply the need for constant news updates, modifications in training and documentation, and coaching and feedback on performance. Therefore, we need to create infrastructures that “grow themselves” and are mostly controlled by performers themselves.

So, what is the answer? We need to move from a focus on managing and assessing interventions (like training courses, compensation programs, or performance aids) to managing, assessing, and developing performance infrastructures. These infrastructures are the permanent systems that are owned by an organization and provide the platform for ongoing interventions and performance improvement. What is an infrastructure asset? It can include things like a course management system, a feedback system that allows employees to collect data and manage their own performance, or a set of policies and tools for crisis communications. These systems can also be referred to as “intangible assets,” the factors that represent most of the valuation of modern corporations and drive future earnings.

Here is an oversimplified example of how to represent the ongoing shareholder value of a performance improvement system that creates $100,000 in savings each year. After tax, that equals about $64,000. Multiply that by the current S&P 500 price/earnings ratio, and this yields more than a million and a half dollars of increased organizational valuation!

Worksheet for Simplified Method of Determining Increase in Shareholder Value Based on Permanent Expense Reduction

  Pre-tax income (savings)*                        $100,000
  Tax rate                                            36.0%
  After-tax income                                   $64,000
  Applicable P/E ratio (S&P 500 average, 12/02)         24.1
  Indicated increase in shareholder value          $1,542,400

* Assumes permanent savings after all applicable adjustments for overhead and other fixed expenses.
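As a rough check on the worksheet, here is a minimal Python sketch of the same arithmetic; the function name and default values are illustrative assumptions, not part of Gayeski’s materials.

def shareholder_value(pretax_savings, tax_rate=0.36, pe_ratio=24.1):
    """Translate a permanent annual expense reduction into indicated shareholder value."""
    after_tax_income = pretax_savings * (1 - tax_rate)  # $100,000 -> $64,000
    return after_tax_income * pe_ratio                  # $64,000 x 24.1 -> $1,542,400

print(f"${shareholder_value(100_000):,.0f}")            # prints $1,542,400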

My short message to you is to reposition your own statement of the value you bring to organizations: the knowledge and ability to shape human performance by creating the performance infrastructures that create ongoing value.

Diane Gayeski, PhD, is Professor of Organizational Communication, Learning & Design at Ithaca College and CEO of Gayeski Analytics. She helps clients assess and improve their current performance infrastructures and provides workshops on new learning technologies and performance management concepts such as those presented in this short article. Diane may be reached at





  To learn more about communication and technology, attend Diane’s one-day workshop “Beyond Level 4: Measuring, Developing, and Managing Intangible Assets” on Friday, April 11, 2003 at ISPI’s Annual Conference.

ISPI’s “games guy” Sivasailam “Thiagi” Thiagarajan, CPT, has recently been designing and playing RAMEs (Replayable Asynchronous Multiplayer Experiences). In plain English, RAMEs are web-based games that collect ideas from virtual focus groups.

Let’s Play a RAME
This month you are invited to participate in a RAME called Improving PX (PerformanceXpress). The object of the RAME is to collect your ideas on how to improve ISPI’s online newsletter. The game has three rounds:

Round 1. Contribute an idea for improving the value of PerformanceXpress to its readers.

Round 2. Review the set of ideas from other players and select the top two. (Other players will be reviewing your idea and comparing it with other ideas.)

Round 3. Review the best ideas selected by different groups and select the top two “best-of-the-best” ideas.

It will take about 5 to 15 minutes to complete each round.

Why You Should Participate
Since ISPI is a heavily member-driven organization, your input is vital to the success of the Society and the publication. Feedback is essential to improving our products and services.

Here is what’s in it for you:

  • You will enjoy playing and scoring points.
  • You will enjoy contributing a valuable idea or feedback for improvement of ISPI’s monthly publication.
  • You will learn a process for effective and enjoyable data collection from online focus groups.

Ready for the First Round?
Please visit this website to register as a player:

It will only take you 15 seconds (unless your name is a long one like Sivasailam Thiagarajan). The registration deadline is Wednesday, February 12. You will get simple instructions for participating in the first round after the registration is complete.


by Jeanne Farrington, ISPI Director

Geary Rummler stopped by to talk with the ISPI Board of Directors during our January meeting. He talked about ISPI as a place where people come to share ideas with each other.

Let’s think about this for a minute: Is ISPI a place where learning and performance professionals come together to learn, to share, to teach, and to grow professionally? Our members say, “Yes!” In a survey of the membership in 2000, the primary reason people gave for staying with the Society was to keep current. They wanted to learn new things and to keep up with developments in the field.

“Who has a new idea? Who can explain this other aspect of the field to me?” Most of us have questions like this. Hopefully, all of us could easily write a long list of things we would like to know more about.

Lifelong Learning
In my 15 years with ISPI, I have noticed that the people who seem to know the most never sit back and bask in the light of all they have learned. Instead, they never stop learning. Even if they have advanced degrees and 30 years of experience and seem to know just about everything…they never, never stop reading, discussing, writing, sharing, and learning. This constant pursuit, of course, is how they became so knowledgeable in the first place.

I Don’t Know
For most of us, the more we learn, the more we realize there is to learn. Primary ignorance—stuff we know we don’t know—is relatively easy to address. Either the information is available, or it is not. The good news about primary ignorance is that we are conscious of it—we know what we don’t know. For example, I don’t really know what a quark is, I can never remember whether “occurred” has one “r” or two, and I wish I knew more about how the brain works. I can learn more about or compensate for not knowing these things if I wish.

I Don’t Care
Sometimes we consciously decide not to study certain things. For example, I don’t understand electricity very well, but I am grateful that the lights go on when I flip the switch and that the electrons are organizing themselves to display these words on my computer screen. (I don’t actually know that’s what happens, but I am happy that other people do.) This is still primary ignorance. I know I don’t know these things. But in this case I have decided not to do anything about it.

Then there is the much more insidious secondary ignorance. This is the stuff we don’t know that we don’t know. We only discover specific areas of secondary ignorance in hindsight. For example, there was a time when I had no idea that there was a field called instructional design.

Sometimes we suddenly become aware of something we didn’t know about at all. We have an epiphany, or we are blindsided, or there is something we can see only now. This can be boring, or delightful, or shocking, depending on the circumstances.

(Sorry, but) all of us possess vast oceans of secondary ignorance. Perhaps this is the thing that people who are purposefully learning all the time keep in mind. There is always one more layer of things to discover, even when we know a lot about a given subject.

“What am I missing here?” is a great question for surfacing things we do not see. “Is there another point of view? What more is there to learn about this? Is there a completely new way of looking at things that I haven’t seen yet?”

Respect for the Unknown
ISPI is full of bright people. There is always someone who can provide new information, perspective, choices, processes, or technologies. Each of us chooses the things we invest effort in learning about—it’s just such a good thing when we are exposed to those things in a safe environment—before we display our secondary ignorance about something that everybody else seems to know already!

As a matter of habit, we should all be asking, much of the time, “What don’t I see here?”

I don’t expect to learn much about quarks or electricity at ISPI. But I do expect to learn about those things that a self-respecting performance improvement professional ought to know. Keeping current means, at the very least, filling in the holes in our awareness of what’s important in our field. This, and more, is what we can help each other do.



by Carl Binder

If you have followed this column for the last few months, you know that I have encouraged readers to look for examples of graphs that distort results, or at least “show them to their best advantage” in ways that make it difficult to easily understand the data or to compare one graph with another. (If you haven’t been following, you can catch up by clicking the back issues link toward the bottom of the blue navigation strip at the left.) Anyone who looks within their own organization, at a few business magazines, or in just about any other place where people show data to persuade or inform, will see cases of “stretch-to-fill” graphs where no matter how large or small the result might be, it fills the entire page. This is only one of many ways in which how we graph performance results can influence our understanding of those results, and ultimately the decisions we make. In this issue I’ll illustrate how the graphic format you choose for monitoring results over time can help or hinder your decisions.

Previous columns have distinguished between three types of effects you might see in results over time: changes in level, trend, or bounce (variability). It is important to tell the difference between these three types of effects because they have big implications for deciding which performance interventions worked or did not work, what to do next, and how to plan for the future.

This is actually a huge topic, worthy of entire articles or books. In fact, Judith Hale, an ISPI past president, responded to my last column by suggesting a 1995 book by G.T. Henry that you might enjoy. For the sake of this discussion, let’s restrict ourselves to a single example of results data graphed in two different ways.

The first graph below uses a conventional stretch-to-fill format to present the dollars per month generated by a small contracting firm during the first years of its life. Notice that the scale up the left is a traditional “add” scale, where a given amount (e.g., 10,000) is depicted by a given distance. On this graph we can see where the firm changed its selling method. But it is hard to tell whether or how the change influenced the level, trend, or bounce in month-to-month revenues. As with all “add” graphs, the visual picture of variability changes as you go up the scale. For example, a range of x 10 looks a lot smaller toward the bottom of the scale (e.g., from 1,000 to 10,000) than it does toward the top (e.g., from 10,000 to 100,000, which would go off the scale). If we were to draw a line through the data to estimate the trend or overall growth in revenues, we’d actually do a better job with a curve than with a straight line. This fact makes it difficult to project trends into the future, at least without a complex statistical model. (In fact, the traditional term “learning curve” appears to be related to the “add” graphs used in the study of learning and performance over the years. They should have used “multiply” graphs!)

The second graph presents the same data using Lindsley’s (1999) standard multiply format in which a given distance up the left represents a given ratio or multiplicative factor rather than an absolute amount. So, for example, multiplying the revenues x 10 from 1,000 to 10,000 on this graph would cause the value to go up the same graphic distance as if revenues grew from 10,000 to 100,000 (also a multiplication of x 10). This feature has huge advantages if we’re trying to make sense of the data. On the “multiply” graph we can see more clearly that the change in selling method did not affect the overall trend or growth in revenues. But it had a big effect on the bounce or variability, reducing it from over x 50 (i.e., where it ranged from around $700 per month to around $35,000 per month) to about x 3 (i.e., from around $10,000 to around $30,000).
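For readers who want to experiment, here is a minimal matplotlib sketch, using made-up revenue numbers rather than the contracting firm’s actual data, that plots one series on an “add” (linear) scale and on a “multiply” (logarithmic) scale. On the log axis, equal vertical distances represent equal ratios, so bounce can be compared fairly anywhere on the chart.

# Illustrative sketch only: hypothetical monthly revenue data, plotted on an
# "add" (linear) scale and a "multiply" (logarithmic) scale for comparison.
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(1, 37)
trend = 2_000 * 1.08 ** months                    # steady multiplicative growth
noise = rng.lognormal(sigma=1.2, size=18)         # high bounce before the change
noise = np.concatenate([noise, rng.lognormal(sigma=0.3, size=18)])  # lower bounce after
revenue = trend * noise

fig, (ax_add, ax_multiply) = plt.subplots(1, 2, figsize=(10, 4))
for ax, title in ((ax_add, '"Add" (linear) scale'), (ax_multiply, '"Multiply" (log) scale')):
    ax.plot(months, revenue, marker="o")
    ax.axvline(18, linestyle="--")                # month the selling method changed
    ax.set_title(title)
    ax.set_xlabel("Month")
    ax.set_ylabel("Revenue ($/month)")
ax_multiply.set_yscale("log")                     # equal distances represent equal ratios
plt.tight_layout()
plt.show()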

If we were to make a decision based on the second graph, we’d stay with the new selling method and be happy both that the overall trend continues and that the bounce has been reduced, because we can now better predict cash flow within a defined range. We might continue to change our processes in an effort to further reduce bounce, and perhaps look more carefully at our change in selling method to understand how and why it had that effect.

Next month we will continue to discuss factors related to graphing performance results over time. If you’d like to learn more about the graphing methods illustrated in this article, be sure to shoot me an email and/or check out Lindsley’s 1999 chapter.

Henry, G.T. (1995). Graphing data: Techniques for display and analysis. Applied Social Research Methods Series, Vol. 36. Sage Publications.

Lindsley, O.R. (1999). From training evaluation to performance tracking. In H.D. Stolovitch & E.J. Keeps (Eds.), Handbook of human performance technology (2nd ed., pp. 210-236). San Francisco: Jossey-Bass/Pfeiffer.

Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance, and behavior to deliver valuable results. His easy-to-remember e-mail address is and his company’s website is



by Todd Packer

We continue to start our search engines (even in the chill of a Cleveland winter) here at “I-Spy,” a feature of PerformanceXpress that highlights relevant, interesting, and useful websites for performance technologists. Each month, we take readers to off-the-beaten-path sites that help them find similar thinkers, resources, work, new ideas, and sometimes just plain old fun.

Quick recap: every month, three sites, one theme. While far from comprehensive, these sites will hopefully spark readers to look further and expand their views of human performance technology (HPT). Please keep in mind that any listing is for informational purposes only and does not indicate an endorsement by either the International Society for Performance Improvement or me.

These are the general categories I use for the sites featured:

  1. E-Klatch: Links to professional associations, research, and resources that can help refine and expand our views of HPT through connections with other professionals and current trends
  2. HPT@work: Links to job listings, career development, volunteer opportunities, and other resources for applying your individual skills
  3. I-Candy: Links to sites that are thought-provoking, enjoyable, and refreshing, to help manage stress and identify new ideas for HPT

The theme for this month’s column is Respect. We seek it for ourselves, our communities, and our profession, yet respect is an elusive phenomenon, difficult to quantify yet of critical importance for effective management. Whether it takes the form of honor, deference, or special attention, our organizations and our nations struggle with treating people with respect, particularly historically disadvantaged groups. In the U.S., race relations continue to challenge our ability to respect the views, experiences, and opinions of diverse groups. As February is recognized as National African American History Month, I-Spy explores some connections between performance improvement, diversity, and respect. Fasten your e-belts, and please remain seated during our travel from the U.S. to Africa.

For an intriguing glimpse into current research on the intersection of community, technology, and performance measurement, visit The Institute for African-American E-Culture (IAAEC), a network of researchers and professionals from diverse universities. Several of the research summaries listed can contribute to developing HPT as we strive to respect culture while improving performance, particularly Culture-Specific Human Computer Interaction (CS-HCI: the study of the design, implementation, and evaluation of human-computer interactions targeted toward a specific cultural demographic) and Effectiveness of Culture Specific Pedagogies and Development of Culturally Responsive Performance Assessments. IAAEC focuses on building unusual collaborations that support creation and ownership of IT in a community context. For example, IAAEC brings together computer scientists, community activists, and students to create a supportive technology development environment for whatever purposes they desire.

To find ways to respect your expertise and connect to diverse online communities in need, a visit to Service Leader presents a variety of opportunities for individuals and businesses to have a positive impact. Service Leader contains “volunteer management and community engagement online resources [and is] a project of the RGK Center for Philanthropy and Community Service at the LBJ School of Public Affairs at The University of Texas at Austin.” With a focus on schools and “virtual volunteering,” this site offers a variety of materials, tips, and opportunities for PTs interested in volunteering from the keyboard as well as in the community.

Our digital safari now brings us to the beautiful and informative site of The Project for Information Access and Connectivity (PIAC). From powerful images of refugees (including the haunting “I dreamt for the war to finish and it has”) to sound clips of traditional music (check out “Instruments of Mozambique” under “Music”) to a feasibility study on an index for theses and dissertations completed at African universities (Addis Ababa University had 42 in Curriculum and Instruction), this site is a comprehensive, respectful, and multisensory tour of life and performance in Africa. For us global cybergeeks, they also offer a great introduction to the Internet. PIAC, which was established in 1997 by the Ford Foundation and the Rockefeller Foundation, collaborated with African grantees and program officers of both foundations on using technology to: enhance communications and the ability to work with colleagues and like-minded organizations in Africa and overseas; improve access to information; and improve the dissemination of African information. Of note to readers, per the ISPI website, Local Chapters section, there are two chapters in formation with a connection to Africa—one in South Africa and one for Europe/Middle East/Africa.

When he is not Internet trawling for ISPI, Todd Packer can be found improving business, non-profit, and individual performance through research, training, and innovation coaching as Principal Consultant of Todd Packer and Associates based in Cleveland, Ohio. He may be reached at


Announcing a newer, faster, cheaper, fail-safe way for improving human performance!
Okay, so it looks promising, as did many of the so-called miracle solutions of the past. But will it deliver on its promises? Join a brisk walk through a number of miracle cures that promised human performance results. Find out which ones worked and which ones have gone by the wayside to become just another fad.

Join ISPI President Jim Hill, Master of Ceremonies Harold Stolovitch, and a dozen presenters at the Opening Session of ISPI’s Annual International Performance Improvement Conference and Exposition on Saturday, April 12, from 5:00 to 6:30, as they take a humorous look at several of the so-called miracle solutions for improving human performance. Each presentation will last three minutes. After these brief presentations, two speakers will help us draw some conclusions for our work in the future. Don’t miss out on the early bird rate. Register before February 10, 2003!


Can that be possible? That’s certainly our goal. Every year, prior to the Annual International Performance Improvement Conference and Exposition, ISPI offers in-depth workshops that are meant to broaden your knowledge base in a specific topic in workplace performance improvement.

ISPI is not in the ivory tower-building business. We select topics based on their business relevance and their potential for solid return on investment to organizations like yours. ISPI workshops are limited in size, ensuring that you will receive individual attention from expert presenters who include Carl Binder, Dale Brethower, Robert O. Brinkerhoff, Bill Coscarelli and Sharon Shrock, Danny Langdon and Kathleen Whiteside, Allison Rossett, Thiagi, and others.

When you make your plans to join your colleagues in Boston, Massachusetts, for the Conference and Exposition, make sure to participate in one of our 20 valuable, pre-conference workshops. Our usual one-day and two-day workshops will be supplemented this year by new, half-day workshops. Workshop dates are April 11-12, 2003. Conference dates are April 12-15, 2003. Click here for complete workshop descriptions and register today!



The best training and performance jobs are located on the International Society for Performance Improvement’s CareerSite. Post your resume for free, and find your next career. Tired of sifting through hundreds of generic ads searching for specific jobs? ISPI provides candidates with reliable employment opportunities in the performance improvement industry. Premier companies consider ISPI their best source for talent. Job seekers–post your resume for free!

  • Complete control over confidentiality of your information
  • Customized Job Search Agents working 24 hours/day
  • Store up to 3 unique profiles—FREE!
  • Post your resume for prospective employers
  • Application tracking features



by Irving H. Buchen

I wear two human resource (HR) hats: one academic, the other consultative. Often, the two are fused effectively in separate environments. My graduate course in HR Management, for example, is peppered with stories of consulting challenges and experiences. When I invite my students to add to or detract from these tales, the result is often richer and even more generic. I often benefit from the give and take by finding my knowledge base expanded and enhanced. But no one takes me to task for using anecdotal research. In fact, course evaluations usually single it out as a strength.

Consulting assignments frequently involve citing research sources. This is especially important when flawed or hurried proposals are being recommended and hailed as innovative and fail-safe. Although the letdown is inevitable, most clients welcome being rescued from harmful and wasteful trial and error. Indeed, in the process I have inadvertently programmed a few to always ask, “Well, what does the literature show?” However, when a project is completed, the results are achieved, and a report is given, no one pejoratively regards the findings as inferior to those found in academic sources.

To be sure, client affirmation is pragmatic, not scholarly. Crossing over creates the problem and begets the nagging questions: At what point, if ever, can consulting experiences be regarded as possessing a validity that is more than impressionistic? And, if written up for publication together with academic citations, can they be accepted, if not as equal, at least as passable? In short, is it possible ever to join academic and anecdotal research, and if so, how?

Some Fallible Approaches
If I were more impressed by my own answers, I would not be genuinely soliciting input. In any case, here is a list of my current unsatisfactory actions, statements, and strategies:

  • Admit early on that the work is based in part on anecdotal research, but brace yourself for an editorial tongue-lashing.
  • Couch the anecdotal process in the form of a semi-scientific experiment. Indicate that it has been tried often and in diverse circumstances and the results have remained essentially the same. The risk here is the inevitable lecture on the true application of the scientific method.
  • Timidly suggest that these anecdotal conclusions are being only tentatively advanced in the absence of academic work to the contrary, and that time will tell whether they have relevance. The likely response will be that the conclusions, as well as the article, are premature and, further, that anecdotal fruits are more perishable than academic ones.
  • Finally, boldly stride forth and claim that all existing research is irrelevant or blind to the new problem being considered; that necessity may have to be the mother of invention (or rather, scholarly intervention); and that the new findings should enjoy the status of research. Such a direct frontal attack usually elicits sustained silence, maintained over such an extended period of time that the entire matter is forgotten.

Responses Sought
Suggestions, strategies, and solutions are earnestly sought and welcome from the following:

  • academics or consultants (those who wear both hats equally, occasionally, or partially)
  • editors and readers of manuscripts to journals or publishers
  • clients to whom the entire matter is a tempest in a teapot
  • current students in HR majors

Because I have no way of anticipating responses, if any, or how creative they might be, my fallback position is to found and edit a new Journal of Anecdotal Research.

Dr. Irving H. Buchen is currently a member of the faculty of the online doctoral business program of Capella University and also teaches communications at Florida Gulf Coast University. He serves as Senior Research Associate to COMWELL, Consultants to Management, and to HR Partners. He also heads up his own firm, Future Optimum Performance Systems. He is the author of four books (soon to be five) and more than 100 articles. He may be reached at


This is your last chance to submit a Proposal for Speakers for the International Society for Performance Improvement’s 2003 Conference on Performance-Based Instructional Systems Design, September 18-20, in Chicago, IL. Speaker submissions must be received at the ISPI headquarters no later than February 10, 2003. Click here for additional information or to download the RFP.


The International Society for Performance Improvement (ISPI) would like to congratulate the professionals listed below, who took advantage of the exemptions available during the grandparenting period and received the designation of Certified Performance Technologist (CPT) last month. Click here for a full list of CPTs. Visit, and apply today to receive your designation.

  • Beverly Thompson, Missouri, USA
  • Mark Moore, Washington, USA
  • Sarah Halsey, Washington, USA
  • Ernest Thor, Washington, USA
  • Guy Baltzelle, Washington, USA
  • Anya Wofford, Washington, USA
  • Thomas Sehmel, Washington, USA
  • Dennis Costello, Missouri, USA
  • Iris Ware, Michigan, USA
  • Janice Conway, Michigan, USA
  • Kenneth Burgdorf, Illinois, USA
  • Diane Rentfrow, California, USA
  • Susan Oliva, Florida, USA
  • Sheila Scanlon Wilkins, California, USA
  • Belia Nel, South Africa
  • Michael Hughes, Georgia, USA
  • Michael McCrary, Hawaii, USA
  • Linda Powell, Idaho, USA



Performance Marketplace is a convenient way to exchange information of interest to the performance improvement community. Take a few moments each month to scan the listings for important new events, publications, services, and employment opportunities. Find additional resources for your training and performance improvement initiatives at the ISPI Online Buyers Guide and find the latest training and performance jobs at the ISPI Online Job Bank. If you would like to post information for our readers, contact ISPI Assistant Director of Marketing, Mickey Cuzzucoli at or 301.587.8570.

Assessment Tools
Do you see a forest or a tree? Organizations need influential leaders to guide them. How can you find out how effective you are as a leader? Click the link to find out how to get a FREE assessment tool.

Books and Reports
High Impact Learning by Robert O. Brinkerhoff and Anne M. Apking provides the conceptual framework for the HIL approach and is complete with integrated tools and methods that training practitioners can use to help their organizations achieve increased business results from learning investments.

Magazines, Newsletters, and Journals
Chief Learning Officer Magazine – Let CLO deliver the experts to you through Chief Learning Officer magazine,, and the Chief Learning Officer Executive Briefings electronic newsletter. Subscriptions are free to qualified professionals residing in the United States.

Training Services
The Power to Get Results – Martin Training Associates provides workshops, services, and products that focus on developing hard and soft skills in project management. Our methodology is universally applicable to any project and project team type. Visit for details.



Conferences, Seminars, and Workshops
Let Thiagi bring his top three workshops to you: Interactive Strategies for Improving Performance (games, etc.), Train the Trainer (and the Facilitator), and Rapid Instructional Design (powerful alternatives to ISD). No bait and switch. All workshops designed and delivered by Thiagi himself.

Principles and Practices of Performance Improvement and Making the Transition to Performance Improvement, Boston, MA, April 10-12, 2003. Hands-on, practical, three-day Institutes designed for immediate return on investment.

41st Annual International Performance Improvement Conference and Exposition: Lessons in Leadership, Boston, MA, April 10-15, 2003. The most important annual event for workplace performance improvement professionals.

Websites of Interest
This site is a leading online resource providing HR professionals with daily news, articles, expert insights, discussion groups, and more. ICG (Intellectual Capital Group), a division of the site, provides cutting-edge research reports called RedBooks™, identifying and analyzing HR trends and technologies.



ISPI is looking for Human Performance Technology (HPT) articles
(approximately 500 words and not previously published) for PerformanceXpress that bridge the gap from research to practice. Please note that no product or service promotion is permitted. Below are a few examples of the article formats that can be used:

  • Short “I wish I had thought of that” Articles
  • Practical Application Articles
  • The Application of HPT
  • Success Stories

In addition to the article, please include a short bio (2-3 lines) and a contact email address. All submissions should be sent to Each article will be reviewed by one of ISPI’s on-staff HPT experts, and the author will be contacted if it is accepted for publication. If you have any further questions, please contact




Feel free to forward ISPI’s PerformanceXpress newsletter to your colleagues or anyone you think may benefit from the information. If you are reading someone else’s PerformanceXpress, send your complete contact information to, and you will be added to the PerformanceXpress emailing list.

PerformanceXpress is an ISPI member benefit designed to build community, stimulate discussion, and keep you informed of the Society’s activities and events. This newsletter is published monthly and will be emailed to you at the beginning of each month.

If you have any questions or comments, please contact April Davis, ISPI’s Senior Director of Publications, at

1400 Spring Street, Suite 260
Silver Spring, MD 20910 USA
Phone: 1.301.587.8570
Fax: 1.301.587.8573