First, training was, and still is, a way to support learning. Training is not the only way to learn, and it often is not the best way. Second, the need for learning is not going away; it is increasing. Every time an individual or group encounters something new, there is an opportunity, and often a requirement, to learn. This can be a new job, a new task, a new business strategy, a new customer, a new product, a new competitor behavior, and so on. Third, we can take a systematic, performance-based approach to managing learning, not just training, in our organizations.
Organizations and individuals must learn faster and more effectively to adapt to the rapid pace of change. The learning capability of an organization will make a strategic difference in its performance in the face of change. A highly developed learning capability will confer a competitive advantage to the organization.
The challenge for our organizations in the new millennium is to evolve from managing training to managing learning. This is not a trivial shift. Training alone is too slow and too inflexible for much of today’s work environment. A variety of non-training learning strategies are being employed today.
Most of the organizations I am familiar with today do not systematically deploy these strategies along with training to meet the learning needs of the organization and its individual performers. They still focus heavily on training solutions because that is what they are trained for and organized to provide. In many cases, their internal customers are not ready to sign up for non-training learning solutions.
Managing a balanced and integrated array of learning strategies requires a stronger role to be played by the managers and leaders in the workplace. Managers have been conditioned over the years to let the training department do it, and the training departments have responded. One of the biggest challenges of the transition from managing training to managing learning is gearing up the workplace organization to shoulder a major share of the responsibility. Workplace managers do not traditionally see this as their role, and they are faced with a seemingly ever-increasing workload due to downsizing and pervasive change. To play their part in managing learning, they will have to learn how to do it and believe it will actually relieve the work pressure on themselves and their high performers. I believe this is one of the best areas in which the new breed of performance consultants can cut their teeth and demonstrate their value.
So, how can an organization plan to transition from managing training to managing learning? Below is a quick overview of a proven approach.
The new millennium will be more about learning systems than training systems. It will build on what has been learned while managing training systems. New strategies, roles, and skills will come into play. The people who are managing performance in the workplace will also find themselves managing learning with methods, content, tools, and consulting support from the Learning Departments.
A Brief Bio
Claude’s most recent book, Achieving Post-Merger Success: A Stakeholder’s Guide to Cultural Due Diligence, Assessment, and Integration, written with his business partner and close friend, Bob Carleton, is due out from Jossey-Bass/Pfeiffer this month. In addition, he co-authored Turning Kids On and Off with Joe Harless, and Job Aids with Donald Bullock, both now out of print.
Share Their Thoughts
Readers may know that Claude was one of ISPI’s most outspoken critics and supporters. ISPI was his other family, and as a responsible family member, Claude spoke his mind. By raising controversial questions and sharing his views, he made ISPI think rigorously about important issues and enriched the outcomes, regardless of the ultimate decisions.
Organizations often arrive at a situation where products and services are not meeting customer needs, prestige and respect in the marketplace are eroding, quality cannot be created or maintained, and the culture does not energize employees to improve. Both Six Sigma and Human Performance Technology (HPT) are comprehensive approaches based on systematic and systemic processes, focused on outcomes and results, and dependent on collaboration. Many of their practices are similar, but each approach has a different origin, and the resulting nuances make huge differences. Six Sigma, driven more by hard data, reduced variation, and the quality of the customer experience, enjoys more consensus around terminology and process. HPT is more inclusive, encouraging broader data gathering and evaluation and more varied, holistic interventions (solutions).
Six Sigma Defined
In the December 2003 issue of PerformanceXpress, Neil Rackham, author of the best-selling book SPIN® Selling, spoke with ISPI President Guy Wallace about his take on Human Performance Technology. This month Neil shares his thoughts on two-way partnering and gives readers a peek into his keynote presentation at ISPI’s 42nd Annual International Performance Improvement Conference & Exposition.
Can you share with us your thoughts about two-way partnering in terms of “suppliers-with-customer” and “suppliers-with-suppliers”?
Many of the fundamentals of partnering are the same for each. Partnership is critical for people in performance improvement who serve their customers. Too many small companies are too narrow in their improvement specialties. They have one piece of the jigsaw puzzle, but their customers need broader answers. As suppliers, they must partner with other suppliers to create a combined new value that individual players can’t offer. To do that, they may need to partner with other performance improvement suppliers to create the total solution, the complete jigsaw.
We need to ask ourselves: What is our strategy to survive against larger improvement firms that can offer a broader array of improvement solutions? What competencies does our performance improvement organization lack? With whom can we partner to meet the needs of the customer?
The crucial thing is not to simply add their offerings to yours, but to fundamentally create new value by redesigning how you both do business. As Andre Boisvert, who has set up many partnerships, says, “Don’t set up a partnership if all you are doing is trading four quarters for a dollar.”
If I am a performance improvement company and you are a software development company, we should be changing my improvement processes and your software development processes and tools to create something more than what we offered before: more in terms of “newer, better, faster, and cheaper” than what we would offer if the customer simply brought us together on a project and we each continued to do what we typically do.
Partnering creates a bigger pie. Sit with your potential partners and determine what you would do, and how you would do it, if you were one company rather than two companies working side by side.
What will we learn that’s new during your keynote presentation in Tampa?
My keynote will focus on “A Life Misspent in Performance Improvement.” Seriously, I intend to share many of the conclusions I have come to from both working for 87 of the Fortune 500 and from my research over the past 40 years…conclusions from research that will help practitioners. We will cover a lot of ground.
Any final thoughts to share with our PerformanceXpress readers?
Yes. I think we need to be more sophisticated about measurement. We live in a metrics-driven world, and we should be the ones to drive this. We need to help our customers determine how we will best measure an improvement effort, not as an afterthought, but as an integral part of its planning.
Lord Kelvin said, “If you can’t measure it, if you can’t express it in a quantitative manner, then your knowledge is of a meager and insignificant kind.” We need to ensure that this does not apply to us and our efforts at performance improvement.
For more information on Neil’s presentation in Tampa, click here.
One of the four basic principles of performance technology is that we strive to “make sure that what we do is of value.” That is, our efforts must meet a real and significant client need, cost-effectively.
The definition of needs analysis that people tend to use is narrower than that, however. Needs analysis is often defined as “an examination of the gap between the desired and current states” with the purpose of identifying factors that would allow us to close the gap.
The difficulty is that we do not always know if closing the gap addresses any real need of the organization. It may be that closing the gap between desired and current states is not of sufficient value to even justify the cost of the analysis.
Front-end analysis is a powerful tool—and we need to make sure we are not wasting it on gaps or “problems” that are not truly worth addressing. We are good at specifying the desired result, determining what factors are influencing the gap between desired and actual, and developing interventions to reconcile the gap. But are we always sure that all this effort creates significant value that warrants the cost? Did we meet a real need?
To paraphrase a saying by the old sage Mel Brooks, “Send us a sign. Are we doing the Lord’s work or just building sand castles?”
Recently, I worked with a client on an issue defined as “we need to make sure that our people follow our dress code.” We could define the gap, do an analysis, and design an intervention that would ensure that people dress to code. But what was the real need? Was the value of meeting the dress code of sufficient benefit to the organization for us even to begin the analysis? There may in fact be a real need here, and if we understood it, it might help both in the analysis and the design of the intervention.
Yes, I know the difficulty. If you are internal, how can you turn down a request from a senior manager, and if you are external, how can you afford to give up a piece of business? One way to deal with this issue is to perform a three-phase analysis:
Those three phases might be labeled Needs Analysis, Goal (or Outcome) Analysis, and Systems Analysis. That or a similar set of labels will help us emphasize that “closing gaps” is not necessarily meeting a need. It is simply a means of addressing an issue that the organization finds of value.
Roger Kaufman often says, “If this intervention is the answer, what was the question?” We should always start with a real need that’s more than just a statement of a “gap”—and then, once we understand the expected value, we can determine the best way to achieve it.
Brr. It’s that time of year, at least in the northern parts of the United States, when all you want to do is curl up with a hot cup of cocoa in front of a warm computer monitor. So let’s get those hands moving across the glowing Internet! Each month, we take readers to off-the-beaten-path sites that help them find similar thinkers, resources, work, new ideas, and sometimes just plain old fun.
Quick recap: Every month, three sites, one theme. While far from comprehensive, hopefully these sites will spark readers to look further and expand views about human performance technology (HPT). Please keep in mind that any listing is for informational purposes only and does not indicate an endorsement either by the International Society for Performance Improvement or me.
These are the general categories I use for the sites featured:
The theme for this month’s column is The Last Straw. Improving performance is stressful work. It is critically important for performance technologists to understand the stresses we face as we strive to create better workplaces around the world. Sometimes we hear people complain about “the last straw”—as in the final piece of straw that broke the camel’s back. If we can improve a situation, by opening up the HPT toolbox to build a lighter straw and/or a stronger camel (OK, so the metaphor is starting to get away from me), we can demonstrate our benefit to individuals at all levels of an organization through reduced stress and enhanced performance. Here are some resources that can help. Hang onto your last soda straws, too.
Until March, “e-” well.
When he is not Internet trawling for ISPI, Todd Packer can be found improving business, non-profit, and individual performance through research, training, and innovation coaching as Principal Consultant of Todd Packer and Associates based in Cleveland, Ohio. He may be reached at firstname.lastname@example.org.
by Michael Peters, CPT
ISPI’s Annual International Performance Improvement Conference & Exposition has always been about professional learning and growth: expanding the breadth and depth of what we know, how we do what we do, and the impact we have on work, education, and the international community. And this year’s roster of pre-conference workshops delivers on both commitments—breadth of topics and depth of both presenters and the insight they deliver.
The breadth of the 2004 Conference Workshop topics ranges from understanding our impact on business to theories, techniques, and tools for delivering better performance solutions, better instruction and e-learning, and better evaluation.
If you’re keen on getting a handle on the business aspects of HPT, sign up for Robert O. Brinkerhoff’s “Connecting HPI to Business Goals and Metrics” (1 day) or Mike Kicidis’s simulation workshop, “Turning Intangible Assets into Tangible Concepts” (1 day).
For a performance focus on HPT, you can choose workshops that deliver on the process, specific performance programs, or performance methods. For process, there’s Roger Kaufman’s “Needs Assessment: What it is…How to get it done” (1 day). Jack Wolf describes a successful performance program in his half-day “Shift to Performance at HBO.” If you’re looking to amp up your performance skills and techniques, check out Kimberly Morrill and Mark Munley’s “Profiling Your Client’s Business” (1 day), Paul Staples’s “Performance Mapping Program” (1 day), or Daniel Raymond’s 2-day workshop, “Building Better Job Aids.”
There’s plenty to choose from if you want to focus on our training roots. Peggy Durbin will teach you about “Learning Object Design” in her 2-day workshop, while Carl Binder will help you in “Building Fluent Performance” (1 day). Kenneth Silber’s 1-day workshop will deliver a cognitive approach to problem-solving training. Partnership is the theme in James Tamm’s “Building Collaborative Partnerships” (1 day) and Thiagi’s 1-day workshop “Partnering for High Performance in Teams—A Playful Approach.” Thiagi continues his partnership motif with an offering on “Partnering with Participants: Facilitating Human Performance,” while Mel Silberman will give you “20 Ways to Become a Consummate Team Facilitator” (1 day).
And if your interests extend from HPT’s training roots to e-learning, we’ve got plenty of quality to offer there as well. Ruth Clark delivers a 1-day workshop on “E-Learning and the Science of Instruction—Applied,” while Saul Carliner has a 2-day session on “Advanced Design for E-Learning.” In addition, Marie Jasinski will help you realize the full potential of e-learning in her 1-day workshop on “Web-based Role Play and Simulation.”
And just to make sure we don’t forget what counts—Bill Lee, with his 1-day workshop, “Evaluation” and Jack Phillips, with his 2-day workshop, “Measuring ROI,” reinforce the importance of achieving and demonstrating results with everything we do.
Lastly, back by popular demand, Bill Coscarelli, Sharon Shrock, and Patricia Eyres will conduct an encore workshop titled “Constructing Level Two Evaluation & Certification Systems: Technical and Legal Guidelines” (1 day).
The conference workshops come in all sizes (half-, one-, and two-day) on Monday, April 19 and Tuesday, April 20. So, if you’re looking to gain a depth of knowledge from an established expert or a rising star, sign up for one of these exciting learning experiences. Click here for complete workshop descriptions, and register today!
In this column I’ve occasionally suggested that we don’t need statistical designs to do performance measurement and evaluation. I’ve mentioned that with ongoing counts of behavior, accomplishments, or business results and standard graphic analysis methods we can easily see when our interventions or decisions produce changes in trends or levels. I’ve shared evaluation designs that allow us to determine the causes of interventions without statistics. But I’ve never really come out directly and said this: Don’t use statistical methods.
Well, now I’ve said it.
Why, you might ask, would I argue against statistics? In a field (HPT) that considers itself to be research-based, where some of us passionately describe and advocate (if not always practice) measurement of results, why would I discourage people from using statistical methods? Let me list two of the reasons.
Statistics Emphasize Significance, Not Value
Anyone who has read research reports in education, psychology, or related fields has certainly seen published studies of the effects of interventions on learning, memory, or other outcomes linked to the improvement of human performance. Many of these studies show highly “significant” results that are practically worthless because the effects are so small.
It’s important to recognize that statistical “significance” means only that a result is not likely attributable to chance, that the difference between a baseline or control condition and the effect of the intervention was probably a “real” effect. But statistical significance does not in any way imply that the result has practical or economic significance or value.
Instead of statistical significance, performance technologists should look for value or practical significance resulting from our work. This has to do with the size of the effect, the reliability with which we can produce it across individuals or groups, and the impact it has on some type of important human endeavor. Look for the big results, and don’t sweat the small stuff. Small “significant” effects are a distraction, not a desired outcome, for the HPT practitioner.
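To make the distinction concrete, here is a toy calculation. All of the summary numbers (means, standard deviation, and sample sizes) are invented for illustration; the point is that with large samples a difference can come out highly “significant” while the effect size shows it is practically negligible.

```python
import math

# Hypothetical summary statistics: two large groups with a tiny mean difference.
mean_a, mean_b = 100.0, 100.3   # e.g., average test scores
sd, n = 5.0, 10_000             # common standard deviation, per-group sample size

# Two-sample z statistic and two-sided p-value (normal approximation).
z = (mean_b - mean_a) / (sd * math.sqrt(2 / n))
p = math.erfc(abs(z) / math.sqrt(2))

# Cohen's d: the effect size in standard-deviation units.
d = (mean_b - mean_a) / sd

print(f"p = {p:.2e}")   # far below 0.05, so "highly significant"
print(f"d = {d:.2f}")   # about 0.06, a negligible effect in practical terms
```

A client looking at these numbers should care about the 0.06 standard deviations of actual improvement, not the impressive-looking p-value.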
Statistics Often Intimidate and Obfuscate
Our three-year effort to obtain performance results cases for the GOT RESULTS? page on ISPI’s website has shown that statistics can be an obstacle to practical performance measurement rather than a facilitator of it. Many of our colleagues—some of whom suffered through statistics courses in grad school and others who are intimidated by not having done so—feel that they can’t really submit case studies unless they include some kind of statistical test. This is balderdash.
Colleagues, this is NOT graduate school! This is the “world of work,” as Tom Gilbert used to call it, and what we are looking for are large, valuable improvements in behavior, accomplishments, or business results. We need them to be so big and so valuable that our clients stand up and cheer when they see them, without having to look at significance tests. Feeling intimidated because you can’t put together a statistical evaluation design wastes a huge amount of time and worry. Not only that: if we do manage to include statistical tests (and terminology) in our reports to clients and colleagues, the fact that most people don’t really understand them far outweighs any credibility we might gain by their inclusion.
Please don’t let statistics get in the way of producing or communicating results. Comments, counter-arguments? Please email me, and I’ll include some of your comments and my responses in upcoming columns.
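As a footnote to the graphic, non-statistical approach described at the top of this column, here is a minimal sketch of comparing the level of an ongoing count before and after an intervention. The weekly counts are invented for illustration; any count of behavior, accomplishments, or business results would do.

```python
# Minimal sketch: compare the typical level of an ongoing count
# before and after an intervention. The weekly counts are invented.
from statistics import median

baseline = [12, 14, 13, 15, 12, 14]      # weekly error counts before the intervention
intervention = [7, 6, 8, 5, 7, 6]        # weekly error counts after

before, after = median(baseline), median(intervention)
print(f"Median level: {before} -> {after} (change of {after - before})")
# A level shift this large is visible on a simple run chart;
# no significance test is needed to see it or to act on it.
```

Plot the two phases on one chart and the shift is obvious to any client, which is exactly the point.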
A New Book on Standard Charting Methods
I’ve received numerous requests over the last few months for information about the standard charting methods described and illustrated in some of my columns. A new book, Handbook of the Standard Celeration Chart, Second Edition, which thoroughly reviews the rationale and methods associated with the standard chart, is now available from The Cambridge Society for Behavioral Studies.
Dr. Carl Binder is a Senior Partner at Binder Riha Associates, a consulting firm that helps clients improve processes, performance and behavior to deliver measured results. He may be reached at CarlBinder@aol.com. For additional articles, visit www.binder-riha.com/publications.htm. See past columns in this series by clicking on the “Back Issues” link at the bottom of the blue navigation bar on the left.
The deadline is approaching, but it’s not too late to register with a colleague or client to attend ISPI’s 42nd Annual International Performance Improvement Conference & Exposition, April 18-23 in Tampa, Florida. When you register for the full conference at the member or delegate rate, you may also register a colleague for only $350—provided your colleague has not attended an ISPI Annual Conference in the past three years (2001-2003).
When you register, think of a colleague at your organization, a client organization, your ISPI or ASTD chapter, or an acquaintance in the field who has not experienced a recent ISPI conference. Offer that person an opportunity to save hundreds of dollars while benefiting from the premier educational event in workplace performance improvement.
If you have not attended an ISPI Annual Conference in the past three years, you will want to register with a colleague. Find someone you know who plans to attend, register together, and one of you will register for only $350. The deadline is February 13, 2004. Click here to register!
Human capital is an advantage only if it is leveraged, and training, properly applied, can be a catalyst for maximizing human performance. One way to do this is through effective measurement.
The purpose of measurement is to establish a process whereby you can estimate the change in human performance, isolate the portion attributable to a specific driver such as training, and adjust the result for conservatism.
Estimation is a process commonly used in business today. Salespeople estimate their future sales, and accounting people estimate the cost of a warranty or claim expected in the future. So, too, can training personnel ask participants, supervisors, experts, and others to estimate the impact a training program will have on job performance. Participant estimation, as it is commonly called, does not estimate the performance change related solely to training; it asks participants to estimate job performance changes in general, driven by training among other factors.
For example, if one attends sales training, one might estimate an increase in job performance, but that increase could be related to other factors, such as a competitor going out of business. So estimates of performance change need to take many factors into account, including process changes, people changes, marketplace changes, technology changes, and interventions such as training.
When estimating the increase, the person(s) doing the estimate should think carefully about all the factors mentioned. They may want to review historic and forecast data to factor reasonably into the overall performance change. In addition, they may want to look at business results such as quality increases, sales increases, cycle-time decreases, cost decreases, and risk decreases (the end outputs of human performance change), comparing before versus after training and against a control group that did not receive the training.
Logically, the training department is keenly interested in the effect training had on the performance improvement. So, the next step is to isolate the estimated increase in performance to training alone. In this part of the process, the person(s) doing the estimates should estimate how much the training has influenced, or will influence, job performance relative to the other factors and assign a value to it. If the salesperson felt that training was the strongest factor behind the change, or will be the driving force behind future change, it would receive a correspondingly higher value.
Finally, because estimation and isolation can be subjective, one must adjust the results. Again, this is commonly done in other facets of business: using most-likely, optimistic, and pessimistic scenarios adjusts estimates for estimator bias and flaws in assumptions. You’ll often see sales forecasts reported in this manner.
In training, adjustment is made for two reasons. The first is conservatism: stating that one’s assumptions are conservative builds integrity into your metrics. The second is bias: estimates can be inflated. In fact, studies done by organizations such as the Tennessee Valley Authority (TVA), and separate studies by KnowledgeAdvisors, suggest that respondents tend to overestimate by roughly 35%. To this end, when computing a human performance change, one might reduce the inputs by 35%, or a similar confidence rate, as the adjustment factor for conservatism and bias.
Taken together, the principles of estimation, isolation, and adjustment form a systematic, replicable, and comparable model of human performance change.
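As an illustration only, the three steps above might be sketched as follows. The survey figures are hypothetical placeholders, and the 35% adjustment is simply the overestimation figure cited above, not a prescribed constant.

```python
# Illustrative sketch of the estimation -> isolation -> adjustment model.
# All input figures are hypothetical; substitute your own survey data.

def adjusted_training_impact(estimated_change_pct, training_share_pct,
                             adjustment_pct=35.0):
    """Estimate the performance change attributable to training.

    estimated_change_pct: participant's estimate of overall job performance change (%)
    training_share_pct:   share of that change the participant attributes to training (%)
    adjustment_pct:       conservative reduction for estimator bias (the ~35%
                          overestimation figure from the TVA/KnowledgeAdvisors studies)
    """
    isolated = estimated_change_pct * training_share_pct / 100.0   # isolation step
    return isolated * (1 - adjustment_pct / 100.0)                 # adjustment step

# A salesperson reports a 20% performance increase and attributes 50% of it to training.
impact = adjusted_training_impact(20.0, 50.0)
print(f"Adjusted training impact: {impact:.1f}%")  # 20% * 50% = 10%, reduced by 35% -> 6.5%
```

Because every estimate passes through the same three steps, results are replicable and comparable across programs, which is the strength the model claims.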
A reader may say, “This is not credible data; it is not statistically accurate, and it is too subjective.” My response is that the world of human performance measurement is far from objective and accurate. The goal is to have roughly reasonable indicators of performance change without expending considerable human, financial, or physical resources.
Recognize that attempts to go from roughly reasonable to highly accurate require tremendous outlays of resources. Moreover, research suggests such precision is unnecessary. A study published in the May 2003 Harvard Business Review found that senior managers make decisions largely on instinct, not on the highly accurate, highly costly data provided by highly paid number crunchers. They use such data as one of many inputs and prefer timely rough estimates to precise metrics that arrive too late to factor into decision-making.
The first step in human performance measurement is to understand that, in a world of doing more with less, all elements of the organization involved with performance improvement need to think seriously about how to measure the impact their initiatives have on human performance.
Jeffrey A. Berk is Vice President of Products and Strategy for KnowledgeAdvisors, a corporate learning business intelligence firm that helps organizations gain the knowledge to improve human performance, better educate their workforces, and reduce costs across the enterprise. Its proprietary measurement technologies and benchmarking expertise help companies more successfully measure human performance change due to training. Jeffrey may be reached at email@example.com.
The Presidential Initiative Task Force is currently winding up Phase 4 of Stage 1. For a quick background on this effort, see my article in the January 2004 issue of PerformanceXpress. For additional information and all of the past articles published on this effort, readers can click here.
On January 5, 2004, the Presidential Initiative Task Force conducted a conference call to review their draft report page-by-page. This report was then updated and sent to the Board of Directors for review at their January meeting. The results of that review will be published next month.
The Task Force’s work outputs include the following:
Our only real issue is the naming and numbering of “the” HPT technologies. As expected, this is the most controversial and difficult of our tasks. I believe it is because each of us has developed our own mental models over the years, along with the rationale for them, and now they have begun to collide in our group process. Part of the difficulty of moving on is also a matter of deciding which potential functionalities this new framework for HPT should serve best. In my view, the new framework (the Domains) should help us clarify the various technologies of HPT and help us organize all of our stuff, including our members.
We can then better ensure that the content of our publications and forums (conferences and institutes) is appropriately balanced in terms of the processes of HPT, the intervention sets that address the variables of performance, and the underlying science. We should be able to count the articles and pages devoted to content from each technology.
Another functionality test that I use is this: in a networking activity, would I ask everyone in the room to go to the “table” with the banner overhead for “their” main Domain (as many will have more than one)? As learning and networking are the two big reasons for ISPI participation, we could do something like this at a future spring conference to kick off the gathering of 1,200–1,500 attendees, where about 60% or more are brand new each year. It is hard to network in such a large group. If our 6–9 Domains minimized gaps and overlaps, then everyone could find their “homeroom” in the ISPI school of HPT.
The specifics of the Presidential Initiative Task Force outputs will, we hope, be published to the entire Society after Board review and approval. But we’re not done at that point; there is plenty of work yet to do. The Board will be asked to begin laying the groundwork for the Task Force(s) that will need to be put in place (some soon and some later) to carry forward the work begun and to engage a greater number of Society members in the process. I’ll report back to you soon.
The American Productivity and Quality Center (APQC) and its researchers have identified best practices and discovered effective methods of improvement for more than 25 years. During that time, compelling stories have surfaced of model organizations that take aggressive, intelligent steps to improve their operations. APQC has had the opportunity to witness the evolution of successful initiatives at those organizations. The Xerox Profile: Best Practices in Organizational Improvement examines how an organization began its improvement efforts, how its focus evolved, and what challenges it has faced. This is an excellent way to compare your own organization’s improvement efforts.
The information in The Xerox Profile is from 12 consortium benchmarking studies conducted over the last decade, as well as articles APQC has published. The bulk of this report details information APQC study teams captured during site visits to Xerox. And, because Xerox is continuously making efforts to improve, it is important to note that the initiatives in this profile are still evolving. As Xerox itself asserts, “The ability to learn faster than your competitors may be the only sustainable advantage.”
APQC has found that progressive organizations, like Xerox, know that an unfaltering focus on continuous improvement is the key to achieving and sustaining success. Xerox is a global leader in technology innovation, with $1 billion spent annually on research and development. Consequently, a key element of Xerox’s improvement initiatives is innovation. Themes of research and development, knowledge management, new product development, and response to customer needs are evidenced throughout this publication.
It is interesting to review what Xerox reveals about certain activities, such as measures and communication, in the context of different initiatives. Far from being isolated organizational improvement efforts, these initiatives progress in a symbiotic manner so that Xerox can achieve both process and performance excellence. The company has a history of first implementing an initiative internally, perfecting the process, and then approaching its customers to help them in their implementation efforts. Xerox is an excellent example of an organization that systematically improves and then capitalizes on those improvements to gain strength in the marketplace.
The Xerox Profile: Best Practices in Organizational Improvement details Xerox’s history and developments around performance improvement. For example, Xerox’s internal communications team uses a variety of techniques to measure communications at both the corporate and business unit levels. Its performance appraisal process is used to measure the communications function’s effectiveness. Additionally, Xerox’s knowledge management team measures its success through the annual employee survey, which measures the 30 elements of Xerox’s management model. The assessment effort is voluntary.
ISPI members will benefit from insight into:
For more information on Xerox or to purchase The Xerox Profile, please visit APQC’s online bookstore at www.apqc.org/pubs.
ISPI’s 2005 Annual Conference Program Committee is looking for volunteers to serve as track chairs and evaluators for the conference proposal review process. Volunteers are needed to review proposals in the following areas of concentration:
The majority of the work performed by evaluators will occur between August 1 and November 1, 2004. During this time, track chairs and evaluators will work remotely with ISPI staff and make recommendations regarding the proposals submitted for presentation at ISPI’s 2005 Annual International Performance Improvement Conference and Exposition in Vancouver, British Columbia.
Each of the six review teams will consist of approximately 10 members: a track chair, a deputy track chair, and evaluators. Evaluators must be ISPI members in good standing.
Additional Requirement for Track Chairs
If you are interested in serving as track chair or evaluator, please email firstname.lastname@example.org. In the subject area of your email, write 2005 Evaluator/Track Chair Volunteer. Your email should include your detailed contact information, and your first and second choice of “tracks.”
Participate in groundbreaking research by sharing your experiences concerning how your instructional design preparation matched up with the ID position you eventually acquired! A brief, 15-minute online survey asks you to identify your career environment (for example, higher education, business and industry, K-12 education, and more), whether you were prepared specifically to practice design in that environment, and if so, how you were prepared. Results of this survey will identify programs that do a particularly good job of preparing instructional designers for specific career environments. To share your experiences, please access the survey by clicking here (no identifying information will be collected as a result of your participation).
This study, available online until February 29, 2004, is being conducted by the Center for Instructional Technology Solutions in Industry and Education (CITSIE) at Virginia Tech. If you have questions about the study, please contact Miriam Larson at email@example.com.
Performance Marketplace is a convenient way to exchange information of interest to the performance improvement community. Take a few moments each month to scan the listings for important new events, publications, services, and employment opportunities. To post information for our readers, contact ISPI Director of Marketing, Keith Pew at firstname.lastname@example.org or 301.587.8570.
Books and Reports
Conferences, Seminars, and Workshops
Faster, Cheaper, Better. Let Thiagi and his team design your web-based training and live e-learning sessions. True interactivity is in the mind—not in the mouse. Exciting activities require and reward higher-level thinking and application. For more details, visit www.thiagi.com.
2004 Measuring & Benchmarking Training Conference, February 23-25, 2004, Las Vegas. Learn proven methods to OPTIMIZE the ROI of your training program. Attend unique and informative workshops presented by training leaders such as Intel, Home Depot, Pfizer, and United Airlines. 10% Discount for ISPI Members!
Job and Career Resources
Magazines, Newsletters, and Journals
Are you working to improve workplace performance? Then, ISPI membership is your key to professional development through education, certification, networking, and professional affinity programs.
If you are already a member, we thank you for your support. If you have been considering membership or are about to renew, there is no better time to join ISPI. To apply for membership or renew, visit www.ispi.org, or simply click here.
ISPI is looking for Human Performance Technology (HPT) articles (approximately 500 words and not previously published) for PerformanceXpress that bridge the gap from research to practice (no product or service promotion, please). Below are a few examples of the article formats that can be used:
In addition to the article, please include a short bio (2-3 lines) and a contact email address. All submissions should be sent to email@example.com. Each article will be reviewed by one of ISPI’s on-staff HPT experts, and the author will be contacted if it is accepted for publication. If you have any further questions, please contact firstname.lastname@example.org.
Feel free to forward ISPI’s PerformanceXpress newsletter to your colleagues or anyone you think may benefit from the information. If you are reading someone else’s PerformanceXpress, send your complete contact information to email@example.com, and you will be added to the PerformanceXpress emailing list.
PerformanceXpress is an ISPI member benefit designed to build community, stimulate discussion, and keep you informed of the Society’s activities and events. This newsletter is published monthly and will be emailed to you at the beginning of each month.
If you have any questions or comments, please contact April Davis, ISPI’s Senior Director of Publications, at firstname.lastname@example.org.