Survey Results

2019-03-24T19:20:46+00:00

This report details the results of the most recent QATC quarterly survey on critical quality assurance and training topics.  Over 50 contact center professionals representing a wide variety of industries provided insight regarding training tools and processes.

Number of Agent Seats

The largest group of participants (34%) comes from call center operations with more than 500 seats. The balance is widely dispersed across ranges from under 50 agents up to 500 seats, providing good representation of centers of all sizes.

New Hire Training Medium

Respondents were asked what percentage of their new hire training is conducted in each listed medium (responses sum to 100%). Nearly three-quarters of training (74%) is conducted in the classroom. Another 12% is self-paced e-learning or computer-based training, while reading and written work comprises 6%. Only 4% is delivered via web seminars with live instructors. For many, live face-to-face training with an instructor is preferred over pre-recorded delivery. However, with multiple locations to support, some centers are moving to web seminars with live instructors and other self-paced recorded media.

Ongoing Training Medium

When asked about ongoing training media (versus new hire training), classroom delivery dropped from 74% to 45% and self-paced e-learning jumped from 12% to 29%. Web seminars doubled from 4% to 8%. Once agents are on the job, it appears more acceptable to deliver training in media other than live classroom instruction, although nearly half of training for tenured agents is still conducted that way. For content that is informative in nature, non-classroom instruction appears more accepted, but for behavior-based training, the classroom is still the most popular.

Tools Used in Classroom Training

Respondents were asked to identify the tools used in classroom training and could choose as many as applied.  PowerPoint (or similar software), sample actual calls, and computer/software like that used by agents on the job were the most common answers.  Next were presentations by leaders and other departments, followed by a white board, self-paced e-learning, and limited real-time customer call handling within the training environment.  However, nearly all the tools mentioned were used by some centers. Given the need to reach all students through audio, visual, and kinesthetic delivery, using a wide variety of media can deliver the most consistent results across the full student population.

Web Seminar Tools

Survey participants were asked which tools they use for web seminar deliveries, with the option to pick all that apply. The most common answer was seminars conducted entirely by in-house staff, while seminars involving at least some external resources were chosen by only a few. Two-way audio (with non-interactive visuals) was chosen slightly more often than one-way audio. Fewer than 10 respondents indicated the use of an on-site facilitator in addition to the web-based presenter. This can be an effective addition to a remotely delivered presentation, ensuring that all participants are engaged and that local discussion and questions are handled effectively.

Self-Paced e-Learning Tools

Respondents were asked to identify the tools used for self-paced e-learning deliveries and could choose all options that applied. Quizzes within the training modules were the most common choice, with an in-house Learning Management System (LMS) close behind. Only a few indicated use of an external LMS. Computer simulations and interactive games are used by several respondents. Simulations can be effective in teaching software tools and processes, and games can make dry content more engaging.

Test for Knowledge or Skill

Almost two-thirds (64%) of respondents indicated that a test for knowledge or skill is included at the end of each training program, while only one respondent indicated that testing is not included. The remainder use tests some of the time. It is important to ensure that students have achieved some level of mastery of the class content so that next steps can be determined. For example, does the student need to repeat the class or receive extra coaching on the job, or has the content been mastered well enough to be used effectively? Some centers test only for knowledge or skill, not for the ability to apply it on the job and deliver improved results.

Medium Used for Testing

For those respondents who do utilize testing, the next question asked what medium they use for those tests. Computer-based objective tests were the most common answer, with manually scored paper-based objective tests second. Simulations and live call handling assessments tied for third. Only a few use paper-based objective tests that are scored electronically. Automating the testing process can make it less burdensome for trainers and may make tracking the results easier. For example, analysis of the questions missed most often can be very helpful in adjusting training delivery to ensure the best results on the job.

Results of Failed Test

Respondents were asked what happens when an agent fails the test. Approximately one-third (34%) indicated that retraining would occur, while another third (32%) transfer responsibility for additional training to the supervisor on the job. Ten percent of respondents indicated that if the agent fails, termination is the next step, and fifteen percent indicated that they do not test. While testing can provide feedback to the trainer on areas of challenge, it also helps the receiving supervisor know where the agent will need extra mentoring or support. Where tight restrictions (such as state licensing) apply, termination may be the only reasonable outcome. Some centers find that small tests after each module of training can be more helpful than one big test over all the content at the end.

Conclusion

This survey provides insight into the tools and processes used in contact center training. While some centers are utilizing computer-based tools and web seminars, the majority rely on live classroom delivery for new hires and, to a lesser extent, for ongoing training. Testing is applied by most centers to determine whether the student has gained the knowledge or skills needed to be effective on the job, and is nearly evenly split between computer-based and paper-based media. Converting to an automated testing process can relieve some of the trainer's burden and provide an easier way to analyze results to determine where training needs to be adjusted and what support an agent will need to be effective on the job.