An Evaluation of Presentation Skills Instruction
by Joe Wessel, Deb Bowden, and Bryan Horveath
Tales from the Field, a monthly column, consists of reports of evidence-based performance improvement practice and advice, presented by graduate students, alumni, and faculty of Boise State University’s Instructional and Performance Technology department.
Pepin Distributing Company (PDC) is the Tampa Bay area’s largest total beverage distributor. Incorporated in 1963, the company sells and markets a diverse portfolio of beverage products, including beer, bottled water, carbonated and energy drinks, liquor, and wine. PDC’s account managers (AMs) provide the direct interface with customers and are the primary drivers of product sales. AMs actively manage more than 2,000 retail accounts, meeting regularly with retailers to offer choices based on retailers’ needs and desires. AM success demands knowledge of PDC’s entire product portfolio and the ability to professionally present the features, benefits, and supporting sales data for all PDC products to influence retailers’ buying decisions.
In February 2011, a regional training manager (RTM) conducted a previously designed training program, “Professional Presentation Skills,” for all AMs at the request of PDC’s top management. The program’s main goal was to provide AMs with the presentation skills needed to more effectively drive sales. To determine the effectiveness of the “Professional Presentation Skills” program, in the fall of 2011, a team of three graduate students at Boise State University partnered with the client organization and conducted a summative evaluation with the following three questions:
- Presentation Design: Was the program designed in a way that the trainees could apply what they learned?
- Job Relevance: Was the program content relevant to the AM’s job?
- Sales Improvement: Has the application of the program content led to increased sales?
Program Logic Model
To illustrate the cause-and-effect relationships by which the program being evaluated met certain needs or produced certain effects, the evaluation team collaborated with the client organization to develop a program logic model (Table 1). This model depicts the resources (inputs) and activities that went into the program's development, the outputs, the short-term and long-term outcomes, and the impacts intended as a result of those resources and activities.
Table 1. Professional Presentation Skills Program Logic Model
Inputs:
- Budget for facilitator, facility, copies, and incidentals
- Presentation skills subject matter expert (SME)
- Instructional designer to develop the presentation skills program
- Skilled facilitator to deliver the training program
- Local sales data
- Participation of AMs

Activities:
- Develop a detailed project plan
- Design the presentation skills training program
- Design the assessment strategy
- Design the field observation checklist
- Produce and distribute the materials
- Plan for the next generation of the program
- Analyze local market data to prepare for each sales call

Outputs:
- AMs have increased awareness of each retailer's communication style
- AMs prepare pre-call execution plans (including the call objective and appropriate sales collateral)
- AMs demonstrate their ability to apply (number and quality) the four key program elements on each sales call
- AMs are able to highlight the "point of difference" (maximizing profit margins) during sales presentations
- AMs execute sales presentations in a professional manner

Outcomes, short-term (1 to 3 months):
- AMs show their ability to adapt to each retailer's communication style
- Increase in higher-profit-margin brand presentations

Outcomes, long-term (3 months to 1 year):
- More confident sales force
- Increase in the number of exclusive contract requests
- Sales of high-profit-margin brands become consistently higher than the previous year's sales
- Ongoing learning and skill development through best-practice sharing at sales meetings

Impacts:
- Recognition of PDC sales departments as the region's best (sales and service) in the beverage industry
- New beverage manufacturers interested in partnership request PDC as their exclusive distributor
At the client's request, this was a goal-based evaluation, assessing whether providing AMs with presentation skills instruction had produced the intended, desirable outcomes. The evaluation team used Scriven's (2007) Key Evaluation Checklist to guide the selection and review of both process and outcome dimensions and the weighting of their importance: 1. Presentation Design (Somewhat Important), 2. Job Relevance (Important), and 3. Sales Improvement (Very Important). The team used archival sales data, AM observation checklists, interviews, and web-based surveys to obtain the data needed to evaluate the three dimensions of the program, and it incorporated Kirkpatrick's (1979) four-level evaluation model and Brinkerhoff's (2006) Success Case Method (SCM) into the design of the dimensions and data collection methods. Based on the evidence in the data, the team assessed the quality of each dimension on a 4-point scale: Poor, Marginal, Good, and Excellent.
The process evaluation for the "Professional Presentation Skills" program focused on one dimension of merit, presentation design, and its evaluation question: Was the program designed in a way that the trainees could apply what they learned? Executive stakeholders weighted the presentation design dimension as Somewhat Important. The evaluation team reviewed the curriculum design of the workshop PowerPoint slides and the results of a participant reaction survey (Kirkpatrick Level 1) to arrive at a presentation design quality rating of Good.
The outcome evaluation for the "Professional Presentation Skills" program focused on two dimensions of merit: job relevance, a short-term outcome, and sales improvement, a long-term outcome. For the job relevance evaluation question (Was the program content relevant to the AM's job?), executive stakeholders weighted the dimension as Important. The evaluation team reviewed the results of a participant reaction survey (Kirkpatrick Level 1), data from recent AM Observation Forms (Kirkpatrick Level 3), and transcripts from SCM interviews with high- and low-performing AMs to arrive at a job relevance quality rating of Good.
The second outcome evaluation dimension of merit for the "Professional Presentation Skills" program was sales improvement. For this dimension's evaluation question (Has the application of the program content led to increased sales?), executive stakeholders weighted the dimension as Very Important. The evaluation team reviewed extant sales data on current sales trends and market conditions as well as year-over-year sales data. Overall, PDC-specific sales data showed negative year-over-year growth for each of the past three years; however, average AM sales since the "Professional Presentation Skills" training were down only 0.2%, compared to declines of 4.4% and 1.8% in the two previous years, respectively. The evaluation team also examined individual AM sales data pre- and posttraining and applied these data to the rubric (Table 2). A total of 23 AMs had posttraining sales of 100% or higher (Excellent) compared to the same period one year earlier; 22 AMs had posttraining sales of 95% to 99.9% (Good); and only one AM had sales of less than 95% of the prior year's, yielding a rating of Marginal. This distribution supports a sales improvement rating of Excellent. However, because the impact of the "Professional Presentation Skills" training cannot be separated from other variables affecting sales performance, the evaluators settled on a sales improvement rating of Good so as not to overstate confidence in the relationship between the training and year-over-year sales results.
Table 2. Sales Data Rubric
| Below 90% | 90%–94.9% | 95%–99.9% | 100% or Higher |
|---|---|---|---|
| Poor | Marginal | Good | Excellent |
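The rubric's thresholds can be sketched as a simple classifier. This is an illustrative sketch only: the thresholds come from Table 2, but the individual AM figures below are hypothetical, since the article reports only the resulting distribution (23 Excellent, 22 Good, 1 Marginal).

```python
from collections import Counter

def rate_sales(pct_of_prior_year):
    """Map posttraining sales, as a percentage of the same period in the
    prior year, to the Table 2 rubric rating."""
    if pct_of_prior_year >= 100.0:
        return "Excellent"
    if pct_of_prior_year >= 95.0:
        return "Good"
    if pct_of_prior_year >= 90.0:
        return "Marginal"
    return "Poor"

# Hypothetical AM figures, one per rubric band actually observed
sample = [103.2, 98.7, 91.4]
distribution = Counter(rate_sales(p) for p in sample)
print(distribution)  # each of Excellent, Good, Marginal appears once
```

Applied to the full roster of AM sales percentages, tallying the ratings this way reproduces the distribution the evaluation team reported.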
Given the Good rating on each of the three dimensions (presentation design, job relevance, and sales improvement), the evaluation team determined that the overall quality of the "Professional Presentation Skills" training was Good. A summary of the dimensions, their ratings, and their weightings is presented in Table 3.
Table 3. Professional Presentation Skills Program Dimensions, Rating, and Weighting
| Dimension | Rating | Weighting |
|---|---|---|
| Presentation Design | Good | Somewhat Important |
| Job Relevance | Good | Important |
| Sales Improvement | Good | Very Important |

Overall Quality: Good
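One way to formalize the roll-up from dimension ratings to an overall quality rating is a weighted average on the team's 4-point scale. The article does not describe a numeric aggregation, so the scale values and weights below are assumptions for illustration only; because all three dimensions were rated Good, the result is Good regardless of the weights chosen.

```python
# Hypothetical roll-up of dimension ratings to an overall quality score.
# Scale values and importance weights are illustrative assumptions.
RATING_SCALE = {"Poor": 1, "Marginal": 2, "Good": 3, "Excellent": 4}
WEIGHTS = {"Somewhat Important": 1, "Important": 2, "Very Important": 3}

dimensions = [
    ("Presentation Design", "Good", "Somewhat Important"),
    ("Job Relevance", "Good", "Important"),
    ("Sales Improvement", "Good", "Very Important"),
]

total_weight = sum(WEIGHTS[w] for _, _, w in dimensions)
weighted = sum(RATING_SCALE[r] * WEIGHTS[w] for _, r, w in dimensions) / total_weight
# Every dimension scores 3 (Good), so the weighted average is 3.0
print(weighted)  # 3.0 -> Good
```

The weighting matters only when dimension ratings diverge; here it simply confirms the overall rating of Good.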
To help meet PDC's organizational goals, the evaluation team recommended taking action in several areas to keep training content job relevant and to contribute to incremental sales increases. This evaluation considered the overall design and delivery of the "Professional Presentation Skills" training program as well as AM reactions, interviews, and observations. The evaluation revealed both strengths and opportunities for improvement, as noted in Tables 4 and 5.
Table 4. Strengths of the Professional Presentation Skills Program
| Strength | Importance | Reasoning |
|---|---|---|
| Clear learning objectives | Low | Learning objectives must be clear so AMs understand the goals of the training |
| Easy-to-use materials | Low | Materials must be easy to use to facilitate AM understanding and subsequent on-the-job application |
| Relevant examples | Medium | AMs must be able to apply the training content to their jobs |
| Perceived as important to improved sales performance by most AMs | High | The organization's primary goal for the "Professional Presentation Skills" program is to improve sales performance |
Table 5. Opportunities for Improvement of the Professional Presentation Skills Program
| Opportunity for Improvement | Importance | Reasoning |
|---|---|---|
| Increased opportunity to participate and discuss during training | Low | PDC management must know that AMs have learned the material |
| Increased opportunity to practice during and after training | Medium | Learners must have additional practice opportunities to implement and improve upon the skills taught in the training |
| Address perception issues with AMs who believe presentation skills are not important to AM success | High | The organization's primary goal for the "Professional Presentation Skills" program is to improve sales performance. Perception is the AMs' reality. |
References
Brinkerhoff, R. O. (2006). Telling training's story: Evaluation made simple, credible, and effective. San Francisco, CA: Berrett-Koehler.
Kirkpatrick, D. L. (1979). Techniques for evaluating training programs. Training & Development, 33(6), 78.
Scriven, M. (2007). Key evaluation checklist. Retrieved from http://www.wmich.edu/evalctr/archive_checklists/kec_feb07.pdf
About the Authors
Joe Wessel serves as training & development coordinator for Pepin Distributing Company in Tampa, FL. He anticipates completion of a master’s degree in Instructional and Performance Technology in 2013 and may be reached at firstname.lastname@example.org.
Deb Bowden is a graduate student in the Instructional and Performance Technology program at Boise State University and works as a director of global sales and delivery effectiveness for FranklinCovey. She will complete her master’s program in June 2012 and may be reached at email@example.com.
Bryan Horveath is the practice leader of a managed markets training & development division for The Access Group in Berkeley Heights, NJ. He is a graduate student at Boise State University and anticipates completing his master’s degree in Instructional and Performance Technology in 2014. He may be reached at firstname.lastname@example.org.