This content is sponsored by Verizon.
When COVID hit, 40 million people were out of work and the government had to respond. But many federal agencies were unprepared for the workload that stimulus efforts were about to demand. One agency in particular did not have the staff to handle the incoming calls. It therefore turned to a technical solution to increase its ability to meet the needs of the public during this crisis.
This agency turned to Verizon. It needed to dramatically increase its ability to respond to incoming calls from the public, and it needed to do so almost immediately. Verizon responded with natural language interactive voice response (IVR), driven by speech recognition.
“Natural language IVR lets you have conversations, so you can ask the question, where’s my check? When will I receive my check? Why haven’t I received my check? You can ask a barrage of questions. The agency already had web pages with questions and answers, but people like to talk to someone,” said Stephen Sopko, associate director at Verizon. “The AI would actually go to the webpage, depending on the question being asked, and then it would analyze it and say, ‘Here’s what we think is the best answer.’”
And the AI learns, through a tuning process that teaches it new ways of asking the same question. Not only does it learn that “Where is my stimulus check?” and “When will my stimulus check be sent?” most likely have the same answer, it also accounts for accents and language barriers. It does this by assigning confidence scores to its answers and then checking whether it answered correctly. If not, it flags the question for review and prompts the caller to ask the question again, checking for discrepancies.
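The confidence-score-and-flag loop described above can be sketched roughly as follows. This is a minimal illustration, not Verizon's or Nuance's actual implementation: the FAQ entries, the word-overlap similarity function, and the 0.6 threshold are all assumptions standing in for a real speech-recognition and NLU pipeline.

```python
# Hypothetical sketch of a confidence-scored FAQ matcher. All names,
# data, and thresholds are illustrative, not the production system.

REVIEW_THRESHOLD = 0.6  # assumed cutoff; lower-confidence answers get flagged

# Toy knowledge base: differently phrased questions map to one shared answer,
# mimicking how the tuning process links new phrasings to known intents.
FAQ = {
    "where is my stimulus check": "Checks are mailed in batches; check your status online.",
    "when will my stimulus check be sent": "Checks are mailed in batches; check your status online.",
}

def score(question: str, canonical: str) -> float:
    """Crude word-overlap similarity (a stand-in for a real NLU model)."""
    q, c = set(question.lower().split()), set(canonical.lower().split())
    return len(q & c) / max(len(q | c), 1)

def answer(question: str):
    """Return (answer, confidence), or (None, confidence) if flagged for review."""
    best_q = max(FAQ, key=lambda canonical: score(question, canonical))
    conf = score(question, best_q)
    if conf < REVIEW_THRESHOLD:
        # Low confidence: flag for human review; the IVR would re-prompt the caller.
        return None, conf
    return FAQ[best_q], conf
```

A recognized phrasing returns the shared answer with high confidence, while an unrelated question scores low and is flagged rather than answered, which is the re-prompt path the tuning process feeds on.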
The tuning also helps the AI become more natural in its conversation flow. It can ask follow-up questions to narrow down the answer. For example, if stimulus checks were mailed on different dates based on the taxpayer’s last name, it could ask for the caller’s last initial to provide the correct answer. And it all happens in a free-flowing conversation, not a series of rigidly defined input options.
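That follow-up flow, using the article's own last-initial example, might look something like the sketch below. The mailing schedule, prompts, and function names are hypothetical, invented purely to illustrate the pattern of detecting an intent and then asking one clarifying question.

```python
# Illustrative sketch of an intent-plus-follow-up dialog turn.
# The schedule and wording are made up, not the agency's real data.

MAIL_SCHEDULE = {  # assumed mapping: last-initial range -> mailing date
    ("A", "M"): "the week of April 13",
    ("N", "Z"): "the week of April 20",
}

def when_is_my_check(last_initial: str) -> str:
    """Answer 'when will I get my check?' once narrowed by last initial."""
    for (start, end), date in MAIL_SCHEDULE.items():
        if start <= last_initial.upper() <= end:
            return f"Checks for last names {start}-{end} go out {date}."
    return "I couldn't match that initial; let me transfer you to an agent."

def dialog(question: str, ask_caller) -> str:
    """One free-flowing turn: detect the intent, then ask a follow-up to narrow it."""
    if "when" in question.lower() and "check" in question.lower():
        initial = ask_caller("What is the first letter of your last name?")
        return when_is_my_check(initial)
    return "Let me look that up for you."
```

Here `ask_caller` stands in for the speech interface: the system only gathers the extra detail it needs for that specific question, rather than walking every caller through a fixed menu.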
And Verizon was able to do all of this extremely quickly. Normally, Sopko said, a buying cycle for a product like this takes six to nine months.
“We knew it wasn’t just for this agency. It was for the people of the United States. People weren’t going to be able to eat, they weren’t going to be able to pay their bills. We had to rise above that. For Verizon, that got everyone on deck. And I know people say that, but we meant it; we did it and we executed it,” Sopko said. “Our solution partner Nuance partnered with us and our Verizon team, and they just did monumental things to get this up and running. We got it up and live within two months. And millions and millions of calls came in that the agency simply wouldn’t have been able to answer.”
And they did it with a 70% containment rate, meaning 70% of calls ended with the caller receiving answers they were happy with. Sopko said that’s a pretty high success rate; government agencies are generally happy with 30-40% containment.
One thing that helped was that during the process, Microsoft bought Verizon’s partner, Nuance. This aligned Verizon with several key Microsoft resources, such as implementing the IVR in Azure and using Microsoft’s QnA Maker product. It also allowed Verizon to rely on Microsoft’s FedRAMP authorization through the Joint Authorization Board process, which cut the certification time frame to around nine months. Sopko said Verizon’s natural language IVR will be FedRAMP High certified within the next two months.
That’s important, because Sopko said the agency has more work for Verizon and its natural language IVR.
“They were so impressed with what we did and the adoption process – we made sure we had people who would help with the adoption of these technologies, because getting constituents to use it is a big thing – that next year it’s going to go up and be live for standard agency operations,” Sopko said. “On a good day, during the agency’s peak season, it’s able to handle about 30% of its calls. So the hope is that adoption will follow, that the tuning process will kick in, and that we’ll start to see a similar increase in the percentage of calls answered in natural language.”