QATC Survey Results
This article details the results of the most recent QATC quarterly survey on critical quality assurance and training topics. Contact center professionals representing a wide variety of operations provided insight regarding the processes and expectations for quality assurance operations.
Number of Agents
The largest number of participants comes from contact center operations with between 51 and 100 agents, though the balance is widely dispersed across all ranges, providing a broad spectrum of contact center sizes. The financial, healthcare, and utility industries have the largest representation, but participants come from a wide variety of industries.
Average Handle Time for Calls
Respondents were asked to provide the average handle time (AHT) for calls in their centers. Approximately one-third (35%) responded with 5 to 7 minutes, followed closely by 24% for 3 to 5 minutes and 23% for 7 to 10 minutes. Twelve percent of the respondents indicated an AHT of 1 to 3 minutes, while only 6% reported over 10 minutes.
Scores Per Day Expectation
Respondents were asked how many calls per day a QA analyst is expected to score. Eight to 16 scores are expected by 35% of the respondents, with 32% reporting an expectation of under 8 calls per day. Twelve percent each reported 17 to 24 and 25 to 32 calls per day, and only 9% expect more than 32 calls per day to be scored. Call length is a key driver of how many evaluations can be completed in a day, along with the other duties the analyst may need to perform.
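The relationship between call length and daily scoring capacity can be sketched with simple arithmetic. The per-call documentation overhead and the hours reserved for scoring below are illustrative assumptions, not survey results:

```python
def scores_per_day(aht_minutes, scoring_overhead_minutes, review_hours_per_day):
    """Estimate how many calls one analyst can score in a day.

    Each review takes roughly the call's handle time (listening to the
    recording) plus the time needed to document the score.
    """
    minutes_per_review = aht_minutes + scoring_overhead_minutes
    return int(review_hours_per_day * 60 // minutes_per_review)

# Example: a 6-minute AHT, an assumed 9 minutes to document each score,
# and 4 hours of the day reserved for scoring (with the rest spent on
# calibration, reporting, and coaching) yields 16 reviews per day.
print(scores_per_day(6, 9, 4))  # 16
```

Under these assumed figures, a mid-range AHT lands in the 8-to-16 band that 35% of respondents reported, which illustrates why longer calls push expectations toward the lower bands.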
Time Spent in Calibration
Respondents were asked how much time in an average week a QA analyst participates in calibration activities. Over half (53%) indicated that they spend less than 1 hour on such activities (which may indicate that some do not participate in calibration at all). Approximately one-third (35%) reported between 1 and 2 hours in calibration, with 9% reporting 2 to 3 hours and 3% spending more than 3 hours per week.
Time Spent on Reporting
Survey participants were asked how much time the QA analysts spend preparing and distributing reports. The largest percentage of respondents (43%) indicated that they spend under 1 hour on this activity, while one-third spend 1 to 2 hours on reports. However, 21% indicated that more than 3 hours per week is dedicated to this activity.
Time Spent Coaching Agents
Respondents were asked how much time the QA analyst spends coaching agents. Over three-quarters answered “under 4 hours,” which may include those who have no responsibility for coaching. However, 18% indicated that their analysts spend 4 to 8 hours on coaching, and another 3% reported 8 to 16 hours per week on this effort. Since development of agents is the primary purpose of quality assurance activities, coaching is an essential part of the process. Agents should know what they are doing well and be encouraged, and they should also know where they can make changes to improve the outcomes of their work. Whether this coaching is done by supervisors, QA analysts, or others, it needs to be a focus of the whole process.
Time Spent Training Agents
Respondents were asked to indicate how much time the analysts spend training agents. Over eighty percent (82%) indicated that it is under 4 hours, and this likely includes those who have no responsibility for training. Of the 18% that reported training as a QA activity, half indicated that the analyst spends 4 to 8 hours per week on this effort. As centers grow, roles that are often combined in smaller operations tend to be separated into dedicated positions. Thus, an analyst in a smaller center may initially handle QA scoring, coaching, and even training; in a larger center, these activities require more time and become separate roles.
Time Spent on Other Activities
Respondents were asked how much time the QA analyst spends on activities other than scoring, calibration, reporting, coaching, and training. The largest percentage (44%) indicated that 4 to 8 hours is used for these activities, while 39% indicated that it is under 4 hours per week. Interestingly, 9% indicated that over 16 hours per week is dedicated to these other activities.
QA Score Disputes
Respondents were asked if they track disputes for each QA analyst. Fifty-nine percent indicated that they do track disputes by analyst, while the remaining 41% indicated that they do not. When asked what percentage of scores are disputed, the overwhelming majority indicated that less than 10% are disputed. Of these disputes, 85% of respondents report that less than 5% are attributed to an error by the QA analyst, while 12% indicate an analyst error rate of 5 to 10%.
This survey provides insight into the roles and responsibilities of QA analysts. Given the variety of agent team sizes among the respondents’ centers, the answers show the mix of roles a QA analyst might have. Some likely only do the scoring, while others may handle calibration, reporting, coaching, and/or training, and some may have other duties as well.
However the process is distributed across the contact center team, the QA role is an essential one in providing high levels of customer satisfaction and employee development. Quality is the key to maximizing first contact resolution, which is a top priority for many customers. Doing a job well and having opportunities for career development are also important to employee satisfaction and retention. It is not all about the score; a focus on process improvement is what will provide the greatest return on the investment and effort.
We hope you will complete the next survey, which will be available online soon.