Monitoring the quality of a support center is a crucial part of engineering Customer Experience (CX) effectively and deliberately.
There are several ways to run a QA program: call or chat monitoring and/or recording, ticket review and/or monitoring, coaching for quality, and many other measures, all aimed at creating great customer experiences and providing consistent, effective support.
To uncover the state of quality assurance (QA) in technical support, HDI conducted a survey of 330 support organizations. Here are the key findings:
- More than half of respondents are doing QA work and have formal processes in place, while almost one-third have no formal processes. About one-sixth (16%) aren’t performing QA.
- The top three purposes for quality review and monitoring are customer satisfaction (86%), training and/or coaching (77%), and performance review (67%).
- More than half (57%) of support organizations record technical support calls, and 26% review all recorded calls. 54% review randomly selected recorded calls.
- Almost three-quarters (72%) of organizations that do QA review both open and closed tickets in quality reviews.
- Almost three-quarters (72%) of organizations review randomly selected tickets, while only 11% review all tickets.
- Only 13% of organizations have a dedicated quality manager or similar position; in most organizations that do quality monitoring, it’s done by the support center manager or a team lead.
The research confirmed that almost half of organizations are not running a formal QA program. Many respondents indicated that they would like to have a dedicated QA team or manager, but either cannot afford it or don't know where to start.
Let's not forget that Millennials and Generation Z have high expectations of their employers and are motivated by challenges, as well as by opportunities to learn and grow. Organizations that strive to retain the best talent should stay creative in delivering feedback, coaching, and training their staff. And I'm not referring to the annual reviews many still exclusively rely on.
QA gets done consistently when an outsourced partner is involved. There can be variations of business-critical, customer-critical, or non-critical QA scores, as well as CX QA scores, depending on the methodology applied.
The challenge then lies in the fact that the sampling pool is limited, requests are randomly selected by the partner, and the introduction of a scorecard drives almost robotic responses within the frontline… the frontline that is already removed from the rest of the business. If you are outsourcing your support to a third-party provider, or even if you are building a program internally, keep an eye on the following elements:
QA Attributes: attributes should be tailored to each function and role. Furthermore, they should be fully aligned with your brand promise and culture. Don't overcomplicate it: the attributes should be simple, precise, and easy to remember and understand, even without the guide.
QA Score Card: was it built around customer experience or employee performance? It should focus less on individual employees and more on whether your customers are fully taken care of through every channel. I like to look at the QA program as an internal CSAT tool – when you cross-reference it with external CSAT and NPS scores, you get a much better view into CX.
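To make the cross-referencing idea concrete, here is a minimal sketch of comparing internal QA scores against external CSAT by channel. The field names, the 0–100 scales, and the 10-point divergence threshold are all illustrative assumptions, not a standard; adapt them to your own scorecard.

```python
# Sketch: flag channels where the internal QA view and the external
# CSAT view of quality disagree. All field names and thresholds are
# hypothetical examples.
from collections import defaultdict

def compare_qa_to_csat(sessions, divergence_threshold=10.0):
    """Average QA and CSAT per channel (both assumed on a 0-100 scale)
    and flag channels where the two views of quality diverge."""
    totals = defaultdict(lambda: {"qa": 0.0, "csat": 0.0, "n": 0})
    for s in sessions:
        t = totals[s["channel"]]
        t["qa"] += s["qa_score"]
        t["csat"] += s["csat_score"]
        t["n"] += 1

    report = {}
    for channel, t in totals.items():
        avg_qa = t["qa"] / t["n"]
        avg_csat = t["csat"] / t["n"]
        report[channel] = {
            "avg_qa": round(avg_qa, 1),
            "avg_csat": round(avg_csat, 1),
            # A large gap suggests the scorecard measures something
            # other than what customers actually experience.
            "diverges": abs(avg_qa - avg_csat) > divergence_threshold,
        }
    return report

sessions = [
    {"channel": "phone", "qa_score": 92, "csat_score": 70},
    {"channel": "phone", "qa_score": 88, "csat_score": 74},
    {"channel": "chat",  "qa_score": 85, "csat_score": 83},
]
print(compare_qa_to_csat(sessions))
```

A flagged channel is exactly the signal described above: the scorecard says the team is doing great while customers disagree (or vice versa), which is a prompt to revisit the attributes, not the agents.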
Scripts: try to minimize the use of scripts, or at the very least ask your partner to develop a wide variety and allow their staff to improvise.
Coaching: how soon does feedback get delivered to employees? In what format is coaching done, and how is progress tracked afterwards?
Customers: how are sessions scoring below 85% handled to make sure the clients were fully accommodated? Who notifies the account management team? There needs to be a solid process in place for closing the loop with customers.
Sampling Pool: this is a tough one, as sessions get randomly selected and in most companies only 5% of total interactions are evaluated. Ask to sample based on criteria that are most relevant to your business and the problems you are facing: for example, only escalated cases, cases with more than X staff involved, or cases where customer lifespan is greater than X; if you have LTV per client, even better.
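The criteria-based selection above can be sketched as follows. The case fields, thresholds, and the LTV ranking are illustrative assumptions; substitute whatever your ticketing system actually exposes.

```python
# Sketch: prefer business-relevant cases over a purely random draw.
# All field names and thresholds are hypothetical examples.
import random

def sample_for_qa(cases, max_samples=5, min_staff=3,
                  min_lifespan_days=365, seed=None):
    """Pick cases that match business-relevant criteria (escalated,
    many staff involved, long-tenured customers); fall back to random
    sampling only if the relevant pool is too small."""
    relevant = [
        c for c in cases
        if c.get("escalated")
        or c.get("staff_involved", 0) >= min_staff
        or c.get("customer_lifespan_days", 0) >= min_lifespan_days
    ]
    # If you track LTV per client, rank the relevant pool by it.
    relevant.sort(key=lambda c: c.get("ltv", 0), reverse=True)
    if len(relevant) >= max_samples:
        return relevant[:max_samples]
    rng = random.Random(seed)
    rest = [c for c in cases if c not in relevant]
    return relevant + rng.sample(rest, min(max_samples - len(relevant), len(rest)))

cases = [
    {"id": 1, "escalated": True, "ltv": 500},
    {"id": 2, "staff_involved": 4, "ltv": 900},
    {"id": 3, "customer_lifespan_days": 400, "ltv": 100},
    {"id": 4},
    {"id": 5},
]
print([c["id"] for c in sample_for_qa(cases, max_samples=3)])
```

Even with this approach, keep some purely random sampling in the mix so routine interactions still get reviewed; the point is to stop the 5% draw from missing the cases that matter most.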
In my experience developing and managing QA programs, they have a huge impact not only on customer retention but also on employee satisfaction and engagement. When done right, I've seen support engineers excited to receive their very first QA scores and eagerly awaiting the results each week thereafter. A sense of ownership and pride gets built through QA. If you haven't already, make it happen for your clients and employees.
Please don’t hesitate to share your experiences and thoughts on QA. Would love to hear from you!