These CX stats suggest companies are lying to themselves about the quality of human-machine handoffs

Research from [24]7.ai suggests many firms are happier with the automation they've introduced than their customers are

I’ve yet to see a bumper sticker that says, “How’s my CX?” but even if someone creates one, you know it would be a rhetorical question. And that’s probably the problem with a lot of CX strategies.

For all the surveys and other feedback mechanisms organizations use, the majority I’ve seen (or taken) focus largely on generalized questions about satisfaction. They might ask whether you found a contact centre agent helpful, or whether their website was easy to navigate, but they don’t usually get into the nitty-gritty of whether you found their expensive automation tools valuable.

In a study released this week, however, San Jose, Calif.-based [24]7.ai tried to capture the disconnect between what companies believe and what consumers actually feel. (Disclosure: I have performed content marketing services for [24]7.ai in the past, but they are not currently one of my clients.)

The CX Reality Check: Research, Revelations And The Route Forward showed that while 89 per cent of companies believe their automated systems understand customer intent, only 56 per cent of customers feel the same.


Perhaps as a result, nearly one third of consumers surveyed have stopped doing business with a company as a direct result of poor customer support.

As with any such research, there was an obvious objective of marketing the company’s own products and services, in this case asynchronous messaging. That said, I felt there were some good pieces of advice later in the report, particularly this one:

“It’s vital for a solution to provide the interweaving of human agents and conversational AI during customer interactions. When reviewing a solution, consider how easy it is for human agents to call on chatbots, and how quickly they can identify customer intent when reviewing chatbot transcripts.”

I’d go even further and suggest that such technologies be given the equivalent of fire drills, perhaps in a demo or during a trial period, to determine the level of “teamwork” humans and AI can achieve.

Though the report didn’t delve into it, that kind of testing may not be happening in part because the companies investing in these tools are susceptible to a self-serving or optimism bias. In other words, when you go to the effort of making a business case for automation, you’re rooting for it to work.

And to be fair, there may be CX leaders who look at the [24]7.ai report and conclude that the 71 per cent of customers who haven’t ditched a company over poor support is a pretty good majority. Thirty-two per cent also said COVID-19 hasn’t affected the way they interact with companies.

In the long term, though, there’s little question many brands will have to decide the degree of automation they’re prepared to introduce to their customer journeys and the level of trade-off they’re willing to accept. One of the best ways to make that call is by being more explicit in asking customers about those AI-human handoffs, as well as being more overt about why such technologies are being used and what the brand hopes they will achieve.

We all know companies want to cut costs and make people more productive. That’s corporate intent. If technology doesn’t get better at sussing out customer intent, the next reality check might be even more difficult to face.
