Jonathan Rosenberg, PhD is chief technology officer and head of AI at Five9 of San Ramon, CA.
Tell me about yourself and the company.
My job is to direct our overall technology strategy, run our AI engineering teams, and figure out how to apply AI technologies to the evolution of contact center technology, which is a super interesting and exciting space. I've been in the IP communications industry for close to 30 years. I was previously the chief technology officer for Webex, and for Skype before that.
How has technology changed the way that inbound telephone calls are managed?
It's having a huge impact, and that impact is going to get even bigger over time. Probably the biggest thing is that we have been able to dramatically improve self-service, so callers no longer have to wait on hold to speak to a live agent for tasks they would prefer to handle themselves anyway.
There are tons of examples in the healthcare space, things like appointment scheduling and rescheduling, looking up information on prior authorizations, answering questions about hospital visits, benefits coordination, cost estimates, and insurance verification checks. The list goes on and on. All of these things are relatively easy to support with self-service. We can get the call to the right person, understand what the caller wants, collect information, and direct them to the right place. We've all experienced way too much transferring. You get to the wrong person, then you transfer somewhere else and you have to repeat everything all over again.
But it isn't just inbound. There's outbound as well, communication from healthcare providers and payers to their patients and customers.
Are technologists surprised that many people still prefer making a phone call?
The death of voice in the contact center, which has been predicted for 20 years, is absolutely, positively false. We are seeing increased usage of voice communication. It's not that people aren't doing other things. They are, but the total amount of interaction that's happening between customers and the brands and companies they work with has increased as these additional channels, like chat, SMS, and the website, have come along.
But at the end of the day, there's a lot of stuff where people still want to call and talk. It's faster. It's interactive. You feel greater levels of trust, especially when you talk to a live person. And in many cases, you want to speak to a real person. We don't think that everything will go to self-service. Those conversations are best done with voice, especially in high-emotion, high-stress situations like we see in healthcare. Nothing beats a voice of empathy.
What is the role of AI-powered virtual agents?
It has two sides. The side people understand less about is what we call agent assist. In agent assist, AI is involved, but it isn't replacing the human agent, it is augmenting them. It provides lots of benefits that help the agent do a better job and provide a greater experience for the customer. That includes things like coaching the agent to make sure that they complete their checklist of required tasks for the patient or customer, and putting information at the agent's fingertips so they don't have to put the customer on hold and go look something up. The information can just be put right in front of them.
Another really good thing that we have seen is that our agent assist application provides a live, real-time transcript to the agent. If they didn’t quite hear what someone said, they can just glance back at the live transcript.
That is one half of agent assist. The other half is self-service use cases, where a customer or patient calls in and they are talking or chatting with a bot to provide these self-service kinds of use cases that I’ve described. Both of these are powerful in the contact center.
Are health systems assembling all of their interactions with patients – mailings, outbound calls, fundraising, billing – so that anyone who interacts with them has the full story?
One of the hardest problems to solve in the industry is breaking down these silos. It is becoming more important than ever for companies that are providing customer service to do that.
One of the interesting drivers that will accelerate the collection of all this information together is this emergence of generative AI and large language models. These things are incredibly powerful and highly beneficial to a lot of the use cases that we are talking about. They need contextual data to work. They need to know about the caller and the customer to provide the experience they want.
The more of this information that you can collect and feed to that generative AI model, the better job it can do to provide superhuman experiences, something that you couldn’t even hope to get from a human alone. We are excited about that, and we think it will drive a lot of the collection of all this data together.
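The idea of collecting contextual data and feeding it to a generative model can be illustrated with a small sketch. Everything here, the function name, the profile fields, and the prompt layout, is a hypothetical assumption for illustration, not Five9's actual design:

```python
# Illustrative only: assemble cross-channel context about a caller into a
# single prompt for a generative model. Field names are invented.

def build_context_prompt(caller_profile: dict, interactions: list, question: str) -> str:
    """Concatenate what is known about the caller with their current question."""
    history = "\n".join(
        f"- [{item['channel']}] {item['summary']}" for item in interactions
    )
    return (
        f"Caller: {caller_profile['name']} (customer since {caller_profile['since']})\n"
        f"Recent interactions:\n{history}\n"
        f"Current question: {question}\n"
        "Answer using the context above."
    )

prompt = build_context_prompt(
    {"name": "Pat", "since": "2019"},
    [
        {"channel": "sms", "summary": "asked about a billing statement"},
        {"channel": "call", "summary": "rescheduled an appointment"},
    ],
    "What is the status of my prior authorization?",
)
```

The point of the sketch is simply that the more channels feed into `interactions`, the richer the context the model receives.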
Healthcare is powered by fax machines, photocopies, and clipboard forms, yet consumers interact everywhere else with chatbots, smart search, and voice processing. Will healthcare embrace these technologies?
I hope so. Customers in general are seeing advances in technology and how they receive service in other industries, and that sets levels of expectation.
For the benefit of healthcare, a lot of the rest of the world is becoming more sensitized to the types of issues that have traditionally prevented healthcare from doing this, or made it difficult, related to sensitivity of data and the protection of personal information. With all of this, especially with this genre of AI technology, those questions are top of mind now for buyers of these products and services. They were top of mind for healthcare before, and now they are top of mind for everybody else. That will drive increased attention to solving those problems so that we can deliver solutions that are broadly adoptable in the healthcare industry. That's the optimist in me talking, hoping to see this technology penetrate quickly across healthcare.
Health systems have grown by acquisition into regional or even national organizations that have a large scale and an increased technology capability, but that may create more bureaucracy. How will this affect their use of technology?
In some sense, centralization helps a lot of these things. It consolidates the buying power and the deployment of these technologies. You can deliver it out to your regional hospitals and practices so that everyone gets it quickly. Instead of every single doctor’s office or hospital having to do this on their own, you get to do it just once. That could be helpful in accelerating adoption of this technology.
How does a software company incorporate AI into its product when the technology changes every day?
At Five9, we have adopted a strategy that we call engine-agnostic. That means that we have built our software platform so that we can consume third-party AI engines, as we call them. These engines are the raw ingredient that do the processing.
For example, the thing that takes an audio file and spits out a transcript. Or the thing that takes a sentence of text and extracts the intent, what it was that the customer said. Or you send it a paragraph and it pulls out the medication name, the doctor name, and the dosage from that paragraph.
These are like raw engines. We have designed our system to allow those to be pluggable, so that we can adapt and evolve as these technologies improve. They are improving at a lightning, breathtaking pace, and that has definitely made it challenging. In fact, we have already switched our underlying LLM engines a few times for different use cases. We have only been able to do that because of this engine-agnostic strategy.
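An engine-agnostic design like the one described above can be sketched as code against a small interface, with concrete engines swapped in behind it. The class and vendor names below are invented stand-ins, not Five9's actual components:

```python
# Sketch of an "engine-agnostic" architecture: platform code depends only
# on an abstract interface, so third-party engines are pluggable.
from abc import ABC, abstractmethod

class TranscriptionEngine(ABC):
    """Interface the platform codes against."""
    @abstractmethod
    def transcribe(self, audio: bytes) -> str: ...

class VendorAEngine(TranscriptionEngine):
    # A real adapter would call the third-party vendor's API here.
    def transcribe(self, audio: bytes) -> str:
        return "transcript from vendor A"

class VendorBEngine(TranscriptionEngine):
    def transcribe(self, audio: bytes) -> str:
        return "transcript from vendor B"

def handle_call(audio: bytes, engine: TranscriptionEngine) -> str:
    # Platform logic never mentions a concrete vendor, so switching
    # underlying engines requires no changes here.
    return engine.transcribe(audio)
```

Swapping `VendorAEngine` for `VendorBEngine` is then a configuration change rather than a rewrite, which is what makes it possible to switch underlying LLM engines repeatedly.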
How will the market respond to companies that add the simplest AI wrapper to their product with few actual benefits, just to be able to use the marketing buzzword?
Especially in the contact center, tons of little startups said, “We can finally build the chatbot that the world has wanted.” Then they throw a UI wrapper in front of ChatGPT and call it a day.
That’s not what the market wants. The market, especially in the contact center space, wants a platform that spans all of the interaction modes – voice, chat, email, SMS, and social channels. They want powerful reporting and analytics. They also want to make sure that humans and AI are integrated together so that calls and chats can bounce between them, and all of that just works.
That's hard. You realize that the value proposition is delivering all of that, and then plugging in the AI to make it better. The platform plays are the likely winners in the contact center technology space, and we are one of them.
AI allows companies to create closed models of their internal documentation and processes to help a frontline person who is in the middle of a call or chat session. Is that creating new possibilities?
This is another thing that has taken a generational leap forward. Prior to large language models and generative AI, we would have to train a custom model on the different intents, as we call them: the different use cases and topics that might come up in the conversation and need to be detected so that the right information could be shown to the agent. That all had to be done by looking at existing conversations, collecting that data, and training and fine-tuning the model.
Generative AI has dramatically changed the way that we can think about that. We used to need to enumerate every single use case, everything that might be said and asked, and then handcraft the model to detect it and handcraft the response that should be shown to the agent. Now with gen AI, we can just ingest all of this knowledge, the processes and documentation, the same way a human agent would go read those materials. Then we can give written direction to the agent assist functionality on what we want it to do, and it can start to provide this knowledge and guidance.
We are seeing great initial results with this, and this is what we are building towards. It will be transformational in this space in the coming years. It is incredibly powerful and amazing technology.
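The flow described above, ingesting documentation and surfacing guidance matched to the live conversation, can be sketched in miniature. A real system would use embeddings and an LLM; this toy version uses word overlap, and every name in it is an assumption for illustration:

```python
# Toy sketch of agent assist over ingested documentation: pick the knowledge
# snippet that best matches the live transcript and surface it to the agent.

def retrieve_guidance(transcript: str, knowledge: dict) -> str:
    """Return the snippet whose wording overlaps most with the transcript."""
    words = set(transcript.lower().split())
    best_topic = max(
        knowledge,
        key=lambda topic: len(words & set(knowledge[topic].lower().split())),
    )
    return f"Suggested guidance: {knowledge[best_topic]}"

# Invented knowledge base standing in for ingested process documentation.
knowledge = {
    "refunds": "Refunds are issued within 5 business days of approval.",
    "scheduling": "Appointments can be rescheduled up to 24 hours in advance.",
}

tip = retrieve_guidance("can the appointment be rescheduled", knowledge)
```

In production, the matching step would be an LLM grounded in the ingested documents, but the shape of the flow, transcript in, relevant guidance out, is the same.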
What does the company’s strategy look like over the next three or four years?
With generative AI, three to four years is really far away [laughs]. But it's that thing, generative AI. It's going to be incredibly amazing in delivering superhuman experiences and the kind of customer experience that the contact center market has always dreamed of delivering, but that in many cases fell short of expectations. Now we have the tool to deliver the kind of experiences that we have really wanted to deliver.
I talk about the end of call hold. I can't wait for the day when you can call the contact center wanting to speak to someone live, and they never need to put you on hold ever again, because anything that they need, anything that they have to say, anything that they need to do, is instantly at their fingertips. That will be a great day, and we are working towards that day in the next two to four years.