Survey Results
This article details the results of the most recent QATC quarterly survey on the role of the QA analyst. Contact center professionals representing a wide variety of operations provided insight into how quality analyst performance is measured.
Participant Profile
The largest group of participants comes from contact center operations with between 51 and 200 agents, but the balance is widely dispersed across all ranges, providing a broad spectrum of contact center sizes. The financial, healthcare, and utilities industries have the largest representation, though participants come from a wide variety of industries.
Average Handle Time
Respondents were asked for their average handle time (AHT). Approximately one-third indicated an AHT of 5-7 minutes, while almost another third indicated 7-10 minutes. Sixteen percent each chose 3-5 minutes and over 10 minutes. AHT is a critical component of the QA analyst’s workload, driving the number of contacts that can reasonably be analyzed in a day.
Time to Score a Call
Respondents were asked how long the analyst is given to score a call. Nearly half are given more than 2 times the AHT, with another 25% given 2 times the AHT. Eleven percent are given the same length of time as the AHT, with the remaining 17% given 1.5 times the AHT. The QA process requires time to listen to the entire call, identify areas of excellence and those needing improvement, apply a score to each element and/or overall, and make notes for the agent and coach. To be most effective, the process needs to ensure that the input from the analyst is helpful to the coaching process and not just a numerical score.
Goal for Calls Evaluated
Survey participants were asked how many calls an analyst is expected to complete per day. Forty-three percent of respondents indicated they expect between 8 and 16 calls scored per day. Approximately one-quarter of the responses indicated either fewer than 8 per day or between 17 and 24 per day. A few indicated an expectation of 25 to 32 calls per day. This goal is largely a function of the AHT and the time allotted to complete the scoring, but it is also influenced by the other activities and tasks that analysts are expected to accomplish. Some of these activities are explored below.
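As a rough illustration (using hypothetical figures rather than survey data), the daily goal follows directly from the AHT, the review-time multiplier, and the hours actually available for scoring:

```python
# Back-of-the-envelope daily scoring capacity for a QA analyst.
# All figures are hypothetical, chosen to fall within the survey's common ranges.

aht_minutes = 6          # assumed AHT (within the common 5-7 minute range)
review_multiplier = 2.0  # assumed time allotted per call, as a multiple of AHT
scoring_hours = 3        # assumed hours per day left for scoring after other duties

minutes_per_call = aht_minutes * review_multiplier          # 12 minutes per call
calls_per_day = int(scoring_hours * 60 // minutes_per_call) # 180 // 12 = 15

print(f"Estimated goal: about {calls_per_day} calls per day")
```

Under these assumptions the estimate lands squarely in the 8-16 range that most respondents reported.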
Calibration Time
Respondents were asked how much time their analysts spend in calibration processes, where scores from several analysts (and sometimes others such as supervisors or agents) are compared to ensure fair and accurate scoring. Approximately one-third spend under 1 hour or 1-2 hours per week. Another 16% indicated a requirement for 2-3 hours per week or some amount of time other than the survey choices (perhaps none). Calibration is important to ensure that, no matter who scores a call, the results are similar and consistent. This consistency is key to meaningful coaching and to agent acceptance of the process.
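One simple way such a comparison might be automated is sketched below; the score format and tolerance are assumptions for illustration, not survey findings.

```python
# Hypothetical calibration check: several analysts score the same call,
# and a large spread between the highest and lowest score flags a
# calibration gap worth discussing.

calibration_scores = {   # analyst -> score for one shared call (assumed data)
    "Analyst A": 88,
    "Analyst B": 91,
    "Analyst C": 79,
}

MAX_SPREAD = 5           # assumed tolerance, in score points

spread = max(calibration_scores.values()) - min(calibration_scores.values())
if spread > MAX_SPREAD:
    print(f"Spread of {spread} points exceeds {MAX_SPREAD}: review scoring criteria")
else:
    print(f"Spread of {spread} points is within tolerance")
```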
Time Preparing Reports
The respondents were asked how much time an analyst spends preparing and distributing reports each week. Nearly half indicated that this activity takes less than an hour per week. Just over 20% each indicated it takes 1-2 hours or 2-3 hours. Eight percent indicated “other,” which may mean no time is spent on this activity. Reports may be tallies of the analysts’ activities, reports to supervisors/coaches, or other statistics. They might also flag a broader training need or a problem in another department of the company.
Time Spent Coaching
Respondents were asked how much time their analysts spend on coaching agents. Nearly half indicated under 4 hours per week, with 24% spending between 4 and 8 hours. Twenty-one percent indicated “other,” which may include those who do no coaching. In some centers, supervisors or lead agents do all the coaching of the agents; in others, QA analysts may do some or all of it. It is important that the QA process include coaching to help agents improve or to reward excellence; scoring alone serves little purpose without it.
Time Spent on Training Agents
Respondents were asked how much time their analysts spend on training agents. Nearly half indicated less than 4 hours per week, with another 30% choosing “other,” which may include those who do no training. Approximately 20% indicated that a significant amount of time (4 to over 16 hours per week) is spent on training. These may be smaller centers where the staff take on multiple roles.
Time Spent on Other Activities
Survey respondents were asked how much time their analysts spend on activities other than those listed above. Approximately half indicated less than 4 hours per week, while about one-third spend 4 to 8 hours. Balancing the goal for the number of contacts analyzed per week against all the other activities that analysts are expected to handle can be a tough job.
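The arithmetic behind that balancing act can be made concrete; the hours below are hypothetical, loosely based on the ranges respondents reported.

```python
# Hypothetical weekly workload for a QA analyst: does the scoring goal
# fit alongside everything else? All figures are illustrative.

calls_per_day_goal = 16        # top of the most common survey range
minutes_per_call = 8 * 2.0     # assumed AHT of 8 minutes x 2.0 multiplier
days_per_week = 5

scoring_hours = calls_per_day_goal * minutes_per_call * days_per_week / 60

other_hours = {                # assumed weekly hours for non-scoring duties
    "calibration": 1.5,
    "reports": 1.0,
    "coaching": 6.0,
    "training": 4.0,
    "other": 6.0,
}

total = scoring_hours + sum(other_hours.values())
print(f"Scoring: {scoring_hours:.1f} h + other duties: "
      f"{sum(other_hours.values()):.1f} h = {total:.1f} h of a 40-hour week")
# -> Scoring: 21.3 h + other duties: 18.5 h = 39.8 h of a 40-hour week
```

Under these assumptions the week is essentially full, which is why the scoring goal usually has to flex when other duties grow.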
Tracking Disputes
Respondents were asked if they track disputes that agents file against analyst scores. Nearly two-thirds track these disputes, while the rest do not. It can be revealing to identify trends in dispute filings, both by the agents who file them and by the analysts whose work they are questioning.
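A minimal sketch of how those trends might be tallied from a dispute log follows; the record format is an assumption for illustration.

```python
# Hypothetical dispute log: tally disputes by the agent who filed them
# and by the analyst whose score is being questioned.
from collections import Counter

disputes = [                  # (agent, analyst) per filed dispute (assumed data)
    ("Agent 1", "Analyst A"),
    ("Agent 2", "Analyst A"),
    ("Agent 1", "Analyst B"),
    ("Agent 3", "Analyst A"),
]

by_agent = Counter(agent for agent, _ in disputes)
by_analyst = Counter(analyst for _, analyst in disputes)

print("Disputes filed per agent:   ", dict(by_agent))
print("Disputes per analyst scored:", dict(by_analyst))
```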
Percent of Disputes
Respondents were asked to provide the percent of total calls scored that are disputed per QA analyst on average. Nearly all (94%) indicated that it is under 10%. This success is often attributed to a good calibration process, a meaningful quality definition document, and good training for agents and supervisors on the process.
Disputes Considered QA Errors
Survey respondents were asked what percentage of disputes, on average, are considered errors on the part of the QA analyst. Eighty-five percent indicated that it is under 5%, with the remaining 15% of respondents reporting that 5-15% of disputes are analyst errors. Errors by the analyst may stem from a lack of training, rushing to complete tasks, or misinterpretation of the requirements. Each cause needs to be addressed appropriately to reduce both errors and disputes and to give agents more faith in the usefulness of the process.
Summary
This survey provides some insight into the activities and metrics applied to the QA analyst’s role. While some analysts appear to be dedicated entirely to scoring contacts, others are involved in calibration, coaching, and even training. Where analysts are alert not only to individual agent results but also to indicators of a systemic training problem, or even a company problem outside of the center, the organization as a whole benefits more from the QA process.
We hope you will complete the next survey, which will be available online soon.