By James Corbett, Karen Gerome, and Iris Nunn

Tales from the Field, a monthly column, consists of reports of evidence-based performance improvement practice and advice, presented by graduate students, alumni, and faculty of Boise State University’s Organizational Performance and Workplace Learning department.

Military retirees, as part of the greater military family, remain capable of making lifelong contributions to the organization they once served. To ensure their continued readiness and support, retirees must be kept apprised of changes to the programs, services, and policies that affect the greater retirement community. In pursuit of this goal, in the spring of 2014, the National Military Retiree Administration (NMRA, a pseudonym) asked a team of graduate students at Boise State University to conduct an evaluation of the administration’s newest program, the NMRA’s Retiree Services Program. The purpose of the Retiree Services Program is to provide referral services regarding relevant benefit information to military retirees, annuitants, survivors, and disabled veterans and their families.

When fully implemented, there will be 15 desks across the country, all servicing different areas of responsibility and all manned by volunteers (generally military retirees) and supervised by a volunteer director. An active duty military member, detailed from one of the five branches of the armed forces, will act as the retiree service coordinator at each location to provide oversight and support to the volunteer director and his or her staff.

The volunteers must accurately respond to requests for information and determine the correct points of contact for retirees and their families with questions about a variety of military benefits, services, and resources.
Further, the volunteers will perform outreach activities connecting the local federal retiree communities within their areas of responsibility with the local Department of Defense service retiree activity offices, the larger military community, and other governmental agencies and military coalition members that provide assistance to retirees.

Program Evaluation Question, Type, and Background
The Retiree Services Program manager, the client for this evaluation case study, requested a formative, client-based evaluation to determine whether the volunteers received the appropriate resources and training needed to effectively man the desks and respond to calls from the retirement community. The overall goal of this evaluation was to investigate the effectiveness of the program’s implementation process to help the program quickly get up to speed and provide a valuable service for the greater national military retirement community. The evaluation was intended to offer guidance to the upstream stakeholders in hopes of improving the program for immediate recipients.

The primary evaluation question the evaluation team set out to answer was:

How well did the implementation process prepare the volunteers to provide services for the greater retirement community, and how can we improve the program’s performance moving forward?

At the time, the program was beginning the third and final phase of the implementation process. This meant that most of the desks had been identified and provided with a location and, with the exception of varying volunteer levels, the resources to operate. In addition, each retiree services desk was at a different readiness level, and only four of the 15 were actually operating and servicing their local retirement communities. This formative evaluation focused on the four desks that were operational at the time of the analysis.

Evaluation Dimensions
The evaluation team identified four dimensions for the evaluation and the weights associated with each dimension through several interviews with the client and a review of organizational documents for the program, such as the National Military Retiree Administration’s Retiree Services Program Instruction and the Retiree Services Program Training Guide & Resource Kit. Two dimensions (i.e., program management and program implementation) related to the program’s processes, and two dimensions (i.e., volunteer motivation and customer satisfaction) related to the program’s outcomes. The process dimensions were determined to be the most important to the program’s initial success while the program was still in its infancy. The overall categories, the dimensions of merit, the weighting determination for each dimension, and the related evaluation questions are shown in Table 1.

Table 1: NMRA Program Evaluation–Dimensions, Weighting, Questions

Process Dimensions
  • Program Management (Extremely Important): How well did the solicitation process obtain an appropriate level of Retiree Services Program volunteers?
  • Program Implementation (Extremely Important): How well did the implementation process (i.e., the training and resources provided) prepare volunteers to operate the retiree services desks in accordance with relevant standard operating procedures to effectively:

      1. inform the local retiree community of the available service and respond to their needs
      2. develop retiree resources (i.e., retiree event calendars, local retirement community referral services libraries, local retiree services contact sheets)
      3. establish collaborative retirement networks with external agencies to forge valuable professional relationships

Outcome Dimensions
  • Volunteer Motivation (Somewhat Important): How well did the implementation process contribute to volunteers valuing the service they are providing to the retirement community?
  • Customer Satisfaction (Very Important): How well do local retirees feel that their information needs are being met by the retiree services desks?

Methodology and Results

By investigating the four critical dimensions at the four operational desks, the evaluation team was able to determine how prepared the volunteers were to operate the retirement service desks and how to improve the program moving forward.
The team used the survey findings as the primary data source and the archival information and interviews as secondary data sources (see Table 2). Based on the evidence, the team determined the overall quality of the program to be “Good” on a 3-point scale of “Needs Improvement,” “Good,” and “Excellent.”

Table 2: Primary and Secondary Data Sources Used in Evaluation

Primary Data Source–Surveys:

  • Active Duty Coordinators (3)
  • Volunteer Directors (3)
  • Volunteers (4)
Secondary Data Source–Archival Data:

  • Volunteer numbers–4 desks
  • SOP–1 desk
  • Quarterly Program Progress Report–Comprises data from all 4 desks
  • Volunteer Solicitation Strategies–Not available as archival data, but obtained through interviews and surveys
  • Customer Satisfaction Results–Amplifying information collected from survey and interview results (Note: Desks were not using customer satisfaction reports at the time of this study.)
  • Desk Stats–Not available as archival data; however, stats were approximated through interviews and surveys and varied greatly between the 4 operational desks (i.e., volunteer rates, call volume, retirees within areas of operation)
  • Volunteer Turnover and Absenteeism Rates–Not available as archival data; however, collected via survey and interviews (no significant findings)
Secondary Data Source–Interviews:

  • Active Duty Coordinators (1)
  • Volunteer Directors (2)

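The article does not specify the algebra behind the overall “Good” determination, only that four dimensions carried importance weightings and that quality was judged on a 3-point scale. As a minimal sketch of how such a rollup might work, the snippet below maps the importance labels and rating labels from Table 1 to illustrative numbers (these numeric mappings are our assumption, not the evaluation team’s method) and takes a weighted average:

```python
# Illustrative sketch of a weighted-dimension rollup. The numeric values
# assigned to the importance labels and the 3-point rating scale are
# assumptions for demonstration; the article does not give its exact method.

WEIGHTS = {
    "Extremely Important": 4,
    "Very Important": 3,
    "Somewhat Important": 2,
}
RATINGS = {"Needs Improvement": 1, "Good": 2, "Excellent": 3}
LABELS = {1: "Needs Improvement", 2: "Good", 3: "Excellent"}

def overall_rating(dimension_ratings):
    """Weighted average of per-dimension ratings, rounded to the nearest
    point on the 3-point scale (note: Python's round() uses banker's
    rounding on exact .5 ties)."""
    total = sum(WEIGHTS[imp] * RATINGS[r] for imp, r in dimension_ratings)
    weight_sum = sum(WEIGHTS[imp] for imp, _ in dimension_ratings)
    return LABELS[round(total / weight_sum)]

# Hypothetical per-dimension ratings for the four dimensions in Table 1
ratings = [
    ("Extremely Important", "Good"),   # Program Management
    ("Extremely Important", "Good"),   # Program Implementation
    ("Somewhat Important", "Good"),    # Volunteer Motivation
    ("Very Important", "Good"),        # Customer Satisfaction
]
print(overall_rating(ratings))  # → "Good"
```

A scheme like this makes the weighting explicit: a “Needs Improvement” on an extremely important process dimension pulls the overall rating down more than the same score on a somewhat important outcome dimension.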
All stakeholders involved in the evaluation indicated that there is a strong need for the service, which is a clear strength of the NMRA’s Retiree Services Program. All four desks were operational and attending, although in limited capacities, to the needs of the retiree communities within their areas of responsibility. Through emails and phone calls, the desks received continuous confirmation of the need for the service from the greater retiree community. Local retirees who were aware of the service provided feedback that their information, referral, and advocacy needs were being met by the retiree services desks through the volunteers answering calls and emails, outreach efforts, and local newsletters.

Further, an appropriate level of Retiree Services Program volunteers had been solicited for the operational desks at the time of this report. The implementation process (i.e., the training and resources provided) prepared the volunteers to operate the retiree services desks effectively and in accordance with relevant standard operating procedures, and it contributed to volunteers and staff valuing the service they were providing to the retirement community. The use of existing relationships and networks and the support of sponsoring base commanders were also noted as strengths of the program. In summary, the strengths of the program are as follows:

  • The program responded to an identified need, with continuous confirmation of the need for the service from the greater retiree community.
  • Each of the four desks was operational and attending to the needs of the retiree communities within its area of responsibility.
  • An appropriate level of Retiree Services Program volunteers had been solicited for the operational desks at the time of the report.
  • The implementation process (i.e., the training and resources provided) prepared the volunteers to operate the retiree services desks effectively and in accordance with relevant standard operating procedures.
  • The implementation process contributed to volunteers and staff valuing the service they were providing to the retirement community.
  • Local retirees aware of the service felt that their information, referral, and advocacy needs were being met by the retiree services desks.
  • The sponsoring base commanders provided strong support.
  • Existing relationships and networks were useful to get the program up and running.

Program Weaknesses and Recommendations for Improvement

The evaluation team identified recommendations and methods to improve the program moving forward. Areas for improvement identified during the evaluation included ensuring the appropriate employment of volunteers and identifying the retiree communities associated with each desk’s area of responsibility. Further, as indicated by the lack of available archival data, it was clear that a majority of the desks had not established methods to collect metrics to measure their progress.

Although different degrees of importance weighting were used to consider the four dimensions during the evaluation process, many of the most valuable recommendations did not align clearly with just one of the four dimensions. Because of this, and because of the formative nature of the evaluation, the recommendations were not prioritized on an importance weighting scale. Instead, they were presented without priority so that the client could consider each one individually based on factors such as expected results, feasibility, costs, ability to accomplish desired outcomes, acceptability, and other key factors relating to implementation. More specifically, the team recommended:

  • Clarification of the program’s strategic objectives and the manner in which each desk should operate to achieve them
  • Creation of a community of practice to allow better collaboration among the 15 retiree service desks
  • Establishment of clear boundaries of areas of responsibility so the desks can clearly identify the retirees they serve
  • Identification of a clear and simplified method for the desks to identify the retiree population they serve
  • Clear task direction for volunteers
  • Modification and simplification of the volunteer qualification process
  • Alternative methods for volunteers to serve the desk, given limited access to computer networks and remote access opportunities
  • Clarification of the resources available to each desk
  • Establishment of a performance measurement and management system to allow for continuous monitoring and improvement of the desks’ services

Although the recommendations are only listed briefly in this article, the evaluation team explained each one in detail in its full report, which it presented to the client for review, consideration, and future implementation.

Evaluation Limitations and Lessons Learned

Although based on a holistic assessment of the four critical dimensions, the evaluation determinations rested primarily on limited survey data, supported by the available secondary data in the form of archival information and interviews. Only after collecting and analyzing the survey data did the team conclude that interview data, gathered primarily through goal-free evaluation techniques, should have been used alongside the surveys as a primary method for determining how well the desks were performing in each dimension. For example, it was during interviews that the need for a clear and simplified method of identifying the retiree population in each area of responsibility was discovered.

Due to limited survey responses from volunteers, zero responses from retirees, and consistent answers of “neither agree nor disagree” from many respondents on questions relating to volunteer rates and the creation of specific retiree resources, the survey information did not provide a clear delineation among “needs improvement,” “good,” and “excellent,” as desired for a process and outcome evaluation. Further, the four desks considered during the evaluation had not yet established a method to collect metrics, such as customer satisfaction reports, retiree call and visit volume, outreach activities, confirmation of the creation of specific retiree products and resources, and so forth. This was to be expected given the relatively short period the desks had been operational; however, it also detracted from the evaluation’s significance.

About the Authors
James Corbett is a Lieutenant Commander in the U.S. Coast Guard. He completed the organizational performance and workplace learning master’s degree program through Boise State University in August 2014. His most recent assignment was that of a congressional liaison for the FBI. He has also served on three Coast Guard ships in various leadership capacities. He can be reached at JTCorbett30@gmail.com.


Karen Gerome is an instructional designer and e-learning specialist at Liberty Mutual Insurance where her current focus is on talent management practices and helping to design a comprehensive and consistent onboarding and assimilation program. She plans to complete her master’s degree in organizational performance and workplace learning in May 2015, and may be reached at karengerome@u.boisestate.edu.


Iris Nunn has held leadership positions in education, training, and instructional design. She is currently an independent consultant in Houston, Texas, and is working on her graduate degree in organizational performance and workplace learning at Boise State University. She will complete her master’s degree in May 2015, and can be reached at inunn7@comcast.net.