Tales from the Field
Rounding the Bases: A Story of Full-Cycle Instructional Design for The Sports Bar
By Christin Lundberg, Scott Bowes, Travis Struchen, and Diane Weir
Tales from the Field, a monthly column, consists of reports of evidence-based performance improvement practice and advice, presented by graduate students, alumni, and faculty of Boise State University’s Instructional and Performance Technology department.
The Sports Bar (pseudonym) is a publicly traded, international company with over 800 restaurant locations. It specializes in chicken wings, beer, and televising sporting events. The Sports Bar has approximately 20,000 employees and is headquartered in the Midwest region of the United States.
The Sports Bar is projecting over 350% growth throughout the world in the next eight years. To meet this aggressive goal, The Sports Bar plans to increase the number of positions at its Midwestern headquarters to provide adequate support for those in the field. Given this aggressive business goal and timeline, The Sports Bar wants to be sure it hires the right personnel who will allow the company to achieve success. The Talent Management Services group at The Sports Bar firmly believes that the use of behavioral interviewing techniques will allow the organization to find the right people. Behavioral interviewing is based on the premise that past job behavior predicts future job performance. This type of interviewing relies on specific past examples presented by the candidate. Merritt (as cited in Ullah, 2010) indicates that research has shown behavioral-based interviewing techniques are eight times more likely than other types of interviews, such as experience-based or stress interviews, to predict future job performance.
In August 2011, The Sports Bar’s Director of Talent Acquisition (client) approached The Sports Bar’s Instructional Design team regarding an ongoing issue at headquarters. Based on first-hand observations, the director was concerned about the ability of hiring managers to conduct effective behavioral-based interviews with job candidates and subsequently make hiring decisions based on data gathered from those interviews.
Our team, composed of four Boise State University Instructional Design graduate students, set out to investigate the performance gap.
Based on the initial request, we knew the client believed that a lack of knowledge and skills was causing the identified performance gap. However, as instructional and performance technology practitioners, we knew the importance of investigating and triangulating this claim before determining whether training would close the performance gap. We used the Bronco Instructional Design (BID) model, shown in Figure 1, as the framework for this project. Similar to other A2DDIE models of instructional design (e.g., Guerra, 2003; Villachica, Marker, & Taylor, 2010), BID includes a performance and cause analysis that precedes other analytical activities associated with ADDIE models. One benefit of the BID model is that it is "front-loaded": it places a strong emphasis on the analysis and design stages, which makes the subsequent steps of designing the instruction and developing the materials much easier and helps avoid potentially costly modifications in later phases.
Figure 1. Illustration of the Bronco Instructional Design Model.
We began investigating the perceived performance gap by conducting a performance analysis. A group interview with the Talent Acquisition team revealed that, of all hiring that occurred during the first nine months of 2011, hiring managers made behavioral-based hiring decisions approximately 14% of the time, while the desired performance was 95% of the time. A performance gap was evident based on these data, but more data were required to determine the significance and cause(s) of the gap.
Next, we conducted an organizational analysis to connect the performance gap to one or more organizational goals. If we could connect the gap to organizational goals, we would be able to explain why the gap was worth closing. We gathered information regarding both organizational and departmental goals and found two business goals, in particular, that were directly tied to the concept of behavioral interviewing.
- To acquire and retain differentiating talent.
- To grow to a chain of over 3,000 restaurants by 2020.
Having identified a performance gap worth closing, we conducted our cause analysis. We began by interviewing the Talent Acquisition team, who had directly observed the lack of behavioral-based interviewing. During the interviews, the Talent Acquisition team members told us they had directly witnessed hiring managers not using behavioral interview questions during interviews, not probing for additional information from candidates, and not taking appropriate notes during the interview, which are all critical components of successful behavioral interviews.
To supplement and triangulate these data, we conducted a survey of hiring managers and reviewed extant termination reports. Survey data indicated the hiring managers did, indeed, lack the knowledge and skills necessary to conduct behavioral-based interviews, which would ultimately lead to behavioral-based hiring decisions. The survey revealed that only 54% of respondents knew the difference between a behavioral-based interview and an experience-based interview, and only 23% of respondents were confident in their abilities to conduct an effective behavioral-based interview.
The data gathered from termination reports, combined with projected costs of poor hiring decisions and the resulting employee turnover (based on industry averages), revealed that during the first nine months of 2011, the lack of behavioral-based hiring decisions had cost the company approximately $220,000 when factoring in recruiting activities, the time of those involved in the hiring process, and lost productivity due to open positions. Hiring projections for 2012 indicated that if The Sports Bar did not address the lack of behavioral interviewing skills among hiring managers, the company stood to lose approximately $330,000 in the upcoming year. This analysis helped the client see even more clearly the need to close the performance gap.
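The structure of a turnover-cost estimate like this one can be sketched in a few lines of Python. The article reports only the totals (roughly $220,000 for the first nine months of 2011 and a projected $330,000 for 2012), not the underlying inputs, so every figure below is a hypothetical placeholder chosen purely to illustrate the three cost categories the team named: recruiting activities, time spent by those involved in hiring, and lost productivity from open positions.

```python
def turnover_cost(poor_hires: int,
                  recruiting_cost: float,
                  hiring_time_cost: float,
                  vacancy_cost: float) -> float:
    """Estimate the total cost attributable to hires who later terminate.

    All per-hire figures are illustrative placeholders, not values
    reported by the organization described in the article.
    """
    cost_per_poor_hire = recruiting_cost + hiring_time_cost + vacancy_cost
    return poor_hires * cost_per_poor_hire


# Hypothetical example: 20 terminations at about $11,000 each lands at
# $220,000, the same order of magnitude the analysis reported for
# January through September 2011.
estimate = turnover_cost(poor_hires=20,
                         recruiting_cost=4_000,
                         hiring_time_cost=3_000,
                         vacancy_cost=4_000)
print(f"${estimate:,.0f}")  # prints $220,000
```

The value of even a rough model like this is that it converts a qualitative concern ("hiring managers aren't interviewing well") into a dollar figure the client can weigh against the cost of the intervention.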
Based on our cause analysis, we determined there was more than one cause of the performance gap: a lack of knowledge and skill, a lack of guidance, and a lack of clarity around processes. Our team knew instruction would be an appropriate solution to close the knowledge and skill gap; therefore, we agreed to focus on that aspect of the performance problem. Additionally, we agreed to address the lack of guidance by creating a job aid in the form of an interview notes guide that hiring managers could use while conducting interviews. The client, in turn, agreed to address the lack of clarity around interviewing processes at headquarters.
After narrowing our focus to the knowledge and skill gap, our next step was to conduct a learner analysis via an electronic survey, which would allow us to make instructional design decisions based on our target audience's prior knowledge, motivation, availability, language, and attitudes toward the company.
As we conducted a literature review to investigate behavioral interviewing, we realized that a procedural task analysis would be the most appropriate method for capturing the components of a behavioral interview, because a behavioral interview can be broken down into steps and sub-steps. To conduct the task analysis, we held several in-person meetings with two members of the Talent Acquisition team (subject matter experts) to discuss the details of behavioral interviewing. Through these meetings, we were able to document the process of conducting a behavioral interview, including processes for decision making during an interview as well as important cautionary notes and tips for the interviewer. We used decision tables within the procedural task analysis document to capture the different decision-making steps that occur throughout an interview.
Once we had a complete task analysis that adequately represented exemplary performance, we reviewed it closely to determine exactly which tasks were critical to behavioral interviewing and required training because of lack of knowledge or skills, as opposed to a simple job aid. We followed the guidelines set forth by Harless (1986) to determine when a job aid would be appropriate. We identified two critical tasks that required training and would be the focus of our instruction.
We began our design efforts by creating instructional objectives, using Mager’s three-part method (as cited in Chyung, 2008), specifying on-the-job behaviors, criteria, and conditions for each of the two critical tasks, which focused on asking appropriate follow-up questions and taking appropriate notes during the interview. These objectives served as the basis for all subsequent instructional design activities.
Our next step was to create an authentic performance assessment that mirrored the on-the-job tasks hiring managers would need to perform in a real behavioral interview with a candidate. We developed a two-part performance assessment that was checklist-based and was aligned with our two identified critical objectives. The first part of the assessment was a process assessment because the hiring managers would be evaluated on a task that did not produce a tangible outcome; therefore, the evaluator would need to watch the hiring manager perform the task. The second part of the assessment was a product assessment, consisting of the hiring managers’ notes, which was appropriate for this task because it produced a tangible product (i.e., the completed interview guide) that could be evaluated after completion of the task.
Using the learner analysis and performance assessment, our next step was to create an instructional plan that would allow the hiring managers to achieve the previously identified objectives. We used Merrill’s (2002) first principles of instruction to design our instructional plan, and made sure to address all four of the phases, as well as each of the associated corollaries, as the corollaries act as quality assurance checks for each of the phases (see Table 1). We presented the completed instructional plan to the client for approval before moving on to the next step in our instructional design process, which was to begin development of the instructional materials.
Table 1. Merrill’s First Principles of Instruction and Corollaries.
The final step of the instructional design process consisted of developing an instructor’s guide, a two-part learner’s guide (interaction and resource), and a PowerPoint presentation. The interaction portion of the learner’s guide contained all of the tools necessary to complete activities such as role-play interviews; the idea behind the interaction guide was that the learner could discard the materials after the training. The resource guide, on the other hand, contained important points from the instruction, the job aid we created, and space for notes; the learner could keep it for future reference back on the job. The intent was that we would use these materials for the training program pilot and later modify them based on feedback gathered from multiple audiences (i.e., facilitator and learners).
The behavioral interview training we created was just one of seven modules related to hiring that The Sports Bar planned to launch at its headquarters in 2012. Because of this, it was determined that The Sports Bar would not be able to pilot this module individually. Therefore, we created a formative evaluation packet that The Sports Bar could use to guide the reaction (level 1) and learning (level 2) evaluation of the behavioral interviewing module when the time was right for the organization. We also suggested that the client could use the authentic level 2 assessment to collect data indicating skill transfer to the workplace (level 3 data).
The evaluation packet we created provided detailed instructions, checklists, forms, and tables to make it easy to gather and analyze data from the pilot. In addition, we recommended The Sports Bar use the basic framework of the formative evaluation packet for the evaluation of the other six modules The Sports Bar was creating on its own.
In May 2012, The Sports Bar conducted its pilot of the behavioral interviewing module, and of the larger hiring training course it developed, using the formative evaluation packet our team created. The formative evaluation revealed that the area requiring the most improvement before rolling the training out to headquarters was the role-plays. In an effort to make the mock interviews as realistic as possible, the ID team had included what proved to be too many details in the original materials. The pilot revealed that the materials required streamlining and a simpler approach, so the learners could focus on learning new behavioral interviewing skills rather than on trying to understand the instructions for the activities. These and a few other minor adjustments were made and reviewed with the pilot participants to ensure the improvements met the needs of the learners. The headquarters expected to conduct its first training session around the time of this manuscript submission (June 2012).
References
Chyung, S. Y. (2008). Foundations of instructional and performance technology. Amherst, MA: HRD Press, Inc.
Guerra, I. J. (2003). Key competencies required of performance improvement professionals. Performance Improvement Quarterly, 16(1), 55-72. doi: 10.1111/j.1937-8327.2003.tb00272.x
Harless, J. H. (1986). Guiding performance with job aids. In M. Smith (Ed.), Introduction to performance technology (Vol. 1, pp. 106-124). Washington, DC: National Society for Performance and Instruction.
Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43-59.
Ullah, M. (2010). A systematic approach of conducting employee selection interview. International Journal of Business and Management, 5(6), 106-112.
Villachica, S. W., Marker, A., & Taylor, K. (2010). But what do they really expect? Employer perceptions of the skills of entry-level instructional designers. Performance Improvement Quarterly, 22(4), 33-51. doi: 10.1002/piq.20067
About the Authors
Christin Lundberg is a Senior Instructional Designer at the company for which this project was completed. She completed her master’s degree in Instructional & Performance Technology (IPT) in May 2012 from Boise State University. Christin may be reached at firstname.lastname@example.org.
George (Scott) Bowes is an Assistant Manager of Training with the Metropolitan Council’s Environmental Services Division within the Performance Systems Business Unit, where he manages training programs for technical employees. He plans to complete his certification in Workplace Instructional Design in the spring of 2013 and his master’s degree in Instructional & Performance Technology (IPT) in the spring of 2014. Scott may be reached at email@example.com.
Travis Struchen is an elementary teacher currently directing a bilingual program in Ho Chi Minh City, Vietnam. He will complete a Master of Science degree in Instructional and Performance Technology (IPT) and a Workplace E-Learning and Performance Support Graduate Certificate from Boise State University in July 2012. Travis may be reached at firstname.lastname@example.org.
Diane Weir is a managing consultant with IBM Software Services for Collaboration, where she designs transformative, enterprise-level learning initiatives. She is planning to complete her master’s degree in Instructional & Performance Technology (IPT) from Boise State University in the fall of 2012. Diane may be reached at email@example.com.