Microsoft CX exec says purely automated chatbots didn’t work. Here’s how he got virtual and human agents to work as a team


When Gabriele ‘G’ Masili approached the customer support team at Microsoft about extending the use of virtual assistants, he wasn’t necessarily surprised when he got a negative reaction. It was the reason behind the negativity that caught him off-guard.

Speaking during a session at the Strategy Institute’s CX Transformation Summit last week, the software giant’s VP and CTO for Customer Experience and Success said he had assumed the support team was worried the technology was a threat to their jobs. That wasn’t it.

“They were seeing that the virtual agent was not doing a good enough job and they did not want the virtual agent to ruin the reputation of the support organization,” Masili said, noting that Microsoft’s initial use of virtual agents was a completely automated chatbot. “We found very quickly that that didn’t work. In fact, it failed miserably.”

Gabriele ‘G’ Masili, Microsoft

Masili said he explained to the support team that he believed that, with their help, the company could develop an approach that would take the best of human expertise and augment it with automation to deliver a better experience.

While many chatbot vendors have made similar promises, Masili showed numbers that proved Microsoft’s strategy has borne fruit.

Since its fiscal year ended in June, for instance, Microsoft has seen 22 per cent year-over-year growth in the use of virtual support agent services. The majority of the 83 million impressions involved are based on chat, with some voice-based usage as well.

Perhaps more importantly, Masili’s team has tracked 521 million digital resolutions — in other words, situations where a customer has explicitly told Microsoft that their problem was solved either through a virtual agent, an online article or other form of support content.

“It’s an important number because it’s roughly ten times bigger than the amount of times we had agents solving customer problems last year,” he said. “That’s not always been the case. If I go back three or four years, the proportion was much more even.”

Transparency + expertise + ‘actions’

This is all part of an initiative that began four years ago, when Masili’s team set out to accomplish multiple objectives at once. These included accelerating the value customers get from the Microsoft products and services they’ve already bought, as well as modernizing its support services.

Of course, Microsoft has the benefit of an army of software engineers to create technology to solve its problems, and Masili said much of what they created was developed on its own Azure cloud services. It has also since brought its support technologies to market as Dynamics 365 for Customer Service, but Masili said there were a number of strategic elements that had little to do with technology.

The first was asking the public for patience and help. A few years ago, when use of its virtual agents was in the single digits, the chatbot would tell customers that it was still “learning” and ask whether they would continue using it in order to help it improve. However, it also offered to direct them to a live person instead.

According to Masili, more than 80 per cent of customers decided to go through with helping improve the virtual agent application by continuing to use it.

“Being transparent was the right thing to do,” he said. “You’ve seen a lot of those bots pretending to be a person.”

Next, Microsoft recognized that virtual agents frustrate customers when self-service amounts to making the customer do the work.

A customer might say in a chat that they need to change their address, for example, and a virtual agent would offer a link where they would have to go into their profile, edit it, and take other steps.

Instead, Microsoft began to develop what Masili called “actions.” These were a set of dialogues linked to logic within the virtual agent that allowed it to directly make changes on a customer’s behalf.

In the change of address example, the virtual agent might say, “I see you want to change your address” in the chat, and then ask for a brief pause while the customer’s profile is updated. Actions have led to a 20 per cent boost in problem resolution rate, Masili said. They are also based on the expertise of the firm’s human agents, who can use low-code or, in some cases, no-code templates to create or update the automated dialogues.
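Microsoft builds these actions in its own support tooling, with low-code and no-code templates for agents; the sketch below is only a rough Python illustration of the general pattern described here: a recognized intent mapped to a handler that makes the change directly rather than sending the customer a link. The intent name, handler function, and profile store are all hypothetical.

```python
# Hypothetical sketch of an "action": the virtual agent maps a recognized
# intent to a handler that performs the change itself, instead of sending
# the customer a link to a self-service page. All names are illustrative.

PROFILES = {"customer-123": {"address": "1 Old Street"}}

def update_address(customer_id: str, new_address: str) -> str:
    """Backend change made on the customer's behalf."""
    PROFILES[customer_id]["address"] = new_address
    return f"Done. Your address is now '{new_address}'."

# Action registry: intent name -> (confirmation prompt, handler)
ACTIONS = {
    "change_address": ("I see you want to change your address. One moment...",
                       update_address),
}

def handle_intent(intent: str, customer_id: str, **slots) -> str:
    prompt, handler = ACTIONS[intent]
    print(prompt)  # the "brief pause" message shown in the chat
    return handler(customer_id, **slots)

if __name__ == "__main__":
    print(handle_intent("change_address", "customer-123",
                        new_address="42 New Avenue"))
```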

Ongoing CX work in progress

This isn’t a once-and-done activity, either. Masili said Microsoft noted that over time, some of the support topics on which the virtual agents’ actions were based became stale, leading to fewer resolutions.

“We found that the customers were changing the way they were approaching different issues as time goes by,” he said. “Early adopters have different ways to express their problems, and have different problems. Their vocabulary could change.”


It’s not just human agents that help the virtual agent, Masili added. It works the other way too: a virtual agent can identify a customer’s problem before handing an issue over. Agents can see chat history as well as support articles that a customer has accessed, for instance. When a customer clicks on a support link provided by a virtual agent, a notification is also sent to the support team’s backend indicating whether the link was helpful.
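The session didn’t cover Microsoft’s internal schema, but the general shape of this feedback loop might look like the following sketch: a link-click event posted back to the support backend, plus the context a human agent would see on takeover. Every field name and value here is an assumption for illustration only.

```python
# Illustrative sketch (not Microsoft's actual schema): a notification the
# virtual agent might send when a customer clicks a suggested article, plus
# the context handed to a human agent when the conversation escalates.

import json
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class LinkClickEvent:
    session_id: str
    article_url: str
    was_helpful: bool          # customer's explicit feedback on the link

@dataclass
class AgentHandoff:
    session_id: str
    identified_issue: str      # what the virtual agent inferred before handover
    chat_history: List[str] = field(default_factory=list)
    articles_viewed: List[str] = field(default_factory=list)

def notify_support_backend(event: LinkClickEvent) -> str:
    """Serialize the event as it might be posted to a (hypothetical) endpoint."""
    return json.dumps(asdict(event))

if __name__ == "__main__":
    event = LinkClickEvent("sess-42", "https://support.example.com/kb/123", True)
    print(notify_support_backend(event))

    handoff = AgentHandoff(
        session_id="sess-42",
        identified_issue="Windows update fails to install",
        chat_history=["Customer: my update keeps failing",
                      "Bot: let's check your update history"],
        articles_viewed=["https://support.example.com/kb/123"],
    )
    print(json.dumps(asdict(handoff), indent=2))
```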

“I would say that this has definitely helped our job,” Joe Lopez, a Windows support engineer, said in a video shown during the session. “Instead of asking, ‘What can I do for you today?’ we can now get a much more clear picture of the issue before we start troubleshooting. And we can dive into the issue a lot quicker.”

In other areas, virtual agents have brought Microsoft the same benefits that have long been promised to other companies. Masili gave the example of Xbox gaming customers who seek a refund on a game. Traditionally, this process involved calling the company’s contact centre, giving an agent their order number, having agents validate it and then process the refund. This could take 22 minutes by phone and an average of 18 minutes via automated chat.

“That’s a lot of time to get a few dollars back for a wrong purchase,” Masili pointed out. His team introduced further improvements, such as a webform that lets customers input information without waiting for anybody. Submitting the form generates a ticket that can then be processed behind the scenes. These moves have cut the time for those sorts of refunds by half, to less than nine minutes.

“It takes me longer to walk through this process (in this session) than it now does for a virtual agent to do it,” he said.
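The flow Masili describes, where a form submission generates a ticket that is then handled behind the scenes, could be sketched roughly as follows. The order-validation set, statuses and function names are illustrative assumptions, not Microsoft’s actual system.

```python
# Minimal sketch of an assumed refund flow: a webform submission creates a
# ticket that is validated and processed asynchronously, so the customer
# never waits on a live chat or phone call.

import uuid
from dataclasses import dataclass

@dataclass
class RefundTicket:
    ticket_id: str
    order_number: str
    reason: str
    status: str = "received"

VALID_ORDERS = {"ORD-1001"}   # stand-in for an order-validation service

def submit_refund_form(order_number: str, reason: str) -> RefundTicket:
    """Form submission immediately generates a ticket."""
    return RefundTicket(str(uuid.uuid4()), order_number, reason)

def process_ticket(ticket: RefundTicket) -> RefundTicket:
    """Back-office step: validate the order and issue the refund."""
    ticket.status = "refunded" if ticket.order_number in VALID_ORDERS else "rejected"
    return ticket

if __name__ == "__main__":
    ticket = submit_refund_form("ORD-1001", "purchased the wrong game")
    print(process_ticket(ticket))
```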

To catch this kind of drift, Microsoft has introduced a proprietary measurement system that tracks the performance of its support topics and flags those that are degrading over time, Masili said.
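Microsoft’s measurement system is proprietary, but the underlying idea, tracking each support topic’s resolution rate over time and flagging drops, can be sketched as follows. The threshold and sample data are invented for illustration.

```python
# Hedged sketch of the general idea: track each support topic's
# digital-resolution rate per period and flag topics whose latest rate has
# fallen noticeably below the earlier average. Data and threshold are made up.

from typing import Dict, List

def flag_degrading_topics(history: Dict[str, List[float]],
                          drop_threshold: float = 0.10) -> List[str]:
    """Return topics whose latest resolution rate fell by more than
    `drop_threshold` versus the average of the earlier periods."""
    flagged = []
    for topic, rates in history.items():
        if len(rates) < 2:
            continue
        baseline = sum(rates[:-1]) / len(rates[:-1])
        if baseline - rates[-1] > drop_threshold:
            flagged.append(topic)
    return flagged

if __name__ == "__main__":
    monthly_resolution_rates = {
        "change-address": [0.82, 0.81, 0.80],   # stable
        "xbox-refund":    [0.75, 0.74, 0.58],   # degrading -> flagged
    }
    print(flag_degrading_topics(monthly_resolution_rates))
```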

Over the past three years or so, Masili said, Microsoft has seen CSAT for these kinds of interactions improve from 4.1 to 4.7 on a five-star scale, which he attributed to shorter wait times and greater convenience.

“We recognized that Microsoft’s mission was the best mission we could have (for our team),” Masili said, referring to the company’s promise to “empower every person and every organization on the planet to achieve more” as its overarching purpose. “We’ve taken that as the North Star for support.”
