QATC Survey Results

This article details the results of the most recent QATC quarterly survey on critical quality assurance and training topics. More than 60 contact center professionals representing a wide variety of operations provided insight regarding the performance metrics and activities for QA analysts.

Participant Profile

The largest number of participants come from contact center operations with between 51 and 200 agents, but the balance is widely dispersed across all ranges, providing a broad spectrum of contact center sizes. The financial, healthcare, and utility industries have the largest representation, but participants come from a wide variety of industries.

Average Handle Time

Respondents were asked what the average handle time (AHT) is in their operation. Approximately one-third each reported that their calls are 3 to 5 minutes or 5 to 7 minutes. The remaining responses range from under 1 minute to over 10 minutes.

Analyst Time to Score a Call

Respondents were asked how long the analyst is given to score a call in relation to the AHT. Forty-three percent of respondents indicated that the analyst has more than 2 times the AHT to complete the scoring. Another 43% have between 1.5 and 2 times the AHT. Listening to the call and thinking about each element of the scoring can be time-consuming.

Analyst Time to Score a Call chart

Calls per Day Goal

Respondents were asked what the calls-scored-per-day goal is for QA analysts. Just over one-third expect 22 calls per day, while another 31% expect 19. The goal is driven in part by the AHT but also depends on the other duties the QAs are expected to perform.
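The relationship between AHT, the scoring-time multiplier, and time reserved for other duties can be illustrated with a rough capacity calculation. The numbers below are assumed for illustration only and are not survey data:

```python
# Rough daily scoring capacity sketch (all inputs are assumed example
# values, not figures reported in the survey).

aht_minutes = 5.0             # assumed average handle time
scoring_multiplier = 2.0      # time to score = 2x AHT (one common survey response)
workday_minutes = 8 * 60      # 8-hour workday
other_duties_minutes = 240    # assumed time for calibration, reports, coaching, etc.

time_per_score = aht_minutes * scoring_multiplier        # 10 minutes per call
available_minutes = workday_minutes - other_duties_minutes
calls_per_day = available_minutes // time_per_score

print(int(calls_per_day))  # 24
```

With these assumptions the capacity lands near the 19-to-22-call goals reported above; longer AHT or heavier non-scoring workloads would push the achievable goal lower.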

Calls per Day Goal chart

Responsibility for Calibration

Survey participants were asked how much time per week a QA is expected to be involved with calibration activities. Seventy-five percent indicated that calibration time is up to 2 hours per week. The 10% that selected “other” may not have QAs involved in the calibration process at all.


Responsibility for Calibration chart

Handling Reports

Respondents were asked how much time the QA spends developing and/or distributing reports. Forty-three percent indicated it takes under 1 hour per week, while another 43% spend 1 to 2 hours per week on these activities. Again, the choice of "other" may indicate no reporting responsibilities.

Handling Reports chart

Time Coaching Agents

Respondents were asked how much time QAs spend coaching agents. Twenty-nine percent reported that their QAs spend less than 1 hour per week on this activity, while 24% spend 1 to 2 hours. However, another 24% chose "other," suggesting that their QAs have no responsibility for coaching agents.

Time Coaching Agents chart

Training Time

Respondents were asked how much QA time is spent on training agents. Forty percent indicated that their QAs spend less than 1 hour per week on this activity while another 28% chose “other.” Eleven percent reported QAs spend up to 3 hours per week in training-related work.

Training Time chart

Other Activities

Respondents were asked how much time the QAs spend on activities other than those requested specifically in the survey. Thirty-six percent of respondents reported that their QAs spend over 3 hours per week on these tasks, while another 49% reported between 1 and 3 hours per week. Only 3% chose "other," suggesting no other activities.

Other Activities chart

Managing Disputes

Respondents were asked if they track disputes of scores by agents associated with each QA and 69% indicated that they do. In terms of how many disputes are filed, 90% indicated that it is less than 10% of the total calls scored.

Track Number of Disputes chart

Percentage Disputed on Average chart

Disputes Considered QA Errors

Respondents were asked what percent of disputes are resolved as an error by the QA. Seventy percent reported that less than 5% are considered errors. However, 14% reported that over 10% are resolved as QA errors.
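Because the dispute figures are percentages of percentages, it can help to see how they compound into an overall error rate. The numbers below are assumed for illustration and do not come from the survey:

```python
# Illustrative arithmetic (assumed example values): translating the
# dispute rate and the dispute-resolution rate into an overall share
# of scored calls that end up overturned as QA errors.

calls_scored = 1000           # assumed calls scored in a period
dispute_rate = 0.10           # assume 10% of scored calls are disputed
error_resolution_rate = 0.05  # assume 5% of disputes are resolved as QA errors

disputes = calls_scored * dispute_rate          # 100 disputed scores
qa_errors = disputes * error_resolution_rate    # 5 scores overturned
overall_error_rate = qa_errors / calls_scored   # 0.5% of all scored calls

print(int(disputes), int(qa_errors), overall_error_rate)  # 100 5 0.005
```

Even at the survey's upper-end dispute rate, only a small fraction of all scored calls is ultimately overturned under these assumptions.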

Disputes Considered QA Errors chart

Summary

This survey provides some insight into the goals and metrics used to analyze the performance of the QA department. There is a wide range of activities and tasks that can be assigned to the analysts in addition to the basic contact scoring processes. Balancing these activities with the AHT typically drives the total number of scores that a QA can complete. Ensuring a clearly defined set of standards for agents to follow can reduce errors by agents as well as provide more accurate and meaningful QA scoring reports.