
Redefining Enterprise Efficiency with Kore AI’s Advances

Gopi Polavarapu of Kore AI

What if you could revolutionize customer service and employee productivity with cutting-edge AI technology? In this episode of Customerland, we sit down with Gopi Polavarapu, the Chief Solutions Officer at Kore AI, to unravel the transformative journey of a company that’s at the forefront of conversational AI. From its inception in 2014, founded by Raj Koneru with a vision of low-code, no-code tools for businesses, Kore AI has skyrocketed to become a leader in generative AI technology. We explore their innovative approaches, including their patented NLU algorithm and the game-changing integration of large language models like OpenAI’s GPT-3.5.

Gopi sheds light on how Kore AI has strategically secured substantial funding and evolved its AI platforms over several iterations, driving significant improvements in customer and employee interactions across various industries. This episode dives into the complexities of modern customer service, discussing integration with human agents and the challenges of managing multiple communication channels. Through a hypothetical example with New Balance Shoes, we break down the setup, costs, and time required for large enterprises to fully adopt advanced AI solutions, highlighting the tangible benefits and efficiencies these technologies bring to the table.

Attracting Gen Z to the workforce is a pressing challenge for many enterprises, and Gopi shares how Kore AI’s EVA (Employee Virtual Assistant) is tailored to meet their expectations. Leveraging knowledge management and backend system integration, EVA transforms how tasks are handled, from HR queries to sales data retrieval, offering a seamless and efficient work environment. Listen in as we celebrate Kore AI’s impressive achievements, including reaching $100 million in ARR, and discuss their unwavering commitment to maintaining leadership in the competitive field of conversational AI.



Read the full transcript below

Mike Giambattista 


Welcome back to another episode of Customerland. Today I have the honor of speaking with Gopi Polavarapu, who is Chief Solutions Officer at Kore AI. Whether you follow the AI sector, the food service technology sector or, frankly, so many other sectors, you'll see Kore's name pop up in the news because of some of their recent successes, so I'm really excited to have this conversation with you, Gopi. Thank you for joining me. Thanks for having me. So I'm wondering if you can, just to set context for the rest of the conversation, tell us a little bit in your own words about what Kore AI is about and what you do, kind of the solutions you're providing out in the marketplace. 

Gopi Polavarapu 

So let me introduce Kore and then I'll switch to my introduction. Kore is about a 10-year-old company founded by Raj Koneru, who is a serial entrepreneur. This is his sixth company. The last company that he sold was Kony Solutions, which built low-code, no-code mobile app building solutions. So when he started the journey in 2014, his vision was to create a conversational AI company with low-code, no-code tools for a business user to build solutions on top, because the business users are the ones that are close to the customer, while the tech teams are completely isolated and did not know how to handle the customer interaction. That's why Kore AI was founded, with this low-code, no-code tool for the business users to start building conversations and automating those things, back in 2014. 

At that time, the market was not evolved. Raj spearheaded this space and developed the market with key customers like a large bank that has been using our product for the last seven, eight years. They've been really helpful in incubating the solution from day one, and now they're able to handle 300 million sessions a year, automating pretty much every other transaction that they get and seeing an 80% containment rate on their conversational banking use cases. So that's how the journey started. And then ChatGPT hit everyone with the surprise of AI and how to use it at scale. Customers' expectations had been set by chatbots 1.0, which were not a success: they were all menu-driven, people didn't like them and CSAT scores went down. 

But when ChatGPT came up, people started thinking, holy shit, I can do these things, I can leverage this technology again. And how do I use it with generative AI? So that's when Kore AI adopted generative AI technology and added LLMs to our core functionality of how we build products, and that enabled us to do a lot more cool things with conversational AI. Generative AI is helping conversational AI drive better conversations with customers and better conversations with employees within the four walls. That is how we use the technology to grow the company. And now, 10 years later, we are at $100 million in ARR as a company. We've raised close to $300 million in investment so far, and we've continued to grow 100% year on year for the last three years. So it's a great time to be in this space, and Kore is the place to be when it comes to conversational and generative AI and solving AI problems at work. 

Mike Giambattista 

It sure sounds like it. 

Gopi Polavarapu 

That's about Kore. To give you my role at Kore, I am the Chief Solutions Officer. What we do is productize AI for industries, for verticals and for horizontal solutions like IT automation and HR automation, or use cases that are prevalent in retail, like order status, WISMO (where is my order) and returns, things that are common and repetitive across the industry. We're trying to find those things, pre-build these AI solutions and productize them, so that we're taking a solution up to 80 to 90% and then it can be customized to the customer's needs and enterprise use case. That is what our team does, and these are the solutions we're building, whether it's the food industry you talked about or the retail industry you talked about. 

Mike Giambattista 

Interesting. If I caught this correctly, you were saying that Kore introduced LLMs into the mix recently, or were the dates confusing me? Because, if I understood correctly, it's a fairly recent introduction, which would have been hugely transformative in a short period of time. 

Gopi Polavarapu 

So the way we did it is we had this three-engine NLU that we developed. It's a patented technology where we were leveraging fundamental meaning, machine learning and a knowledge graph in order to identify the intent of what the user was saying. That is how we were doing it. We had a lot of transformer models being built from a machine learning perspective, so we were using these machine learning transformer models early on. We were also trying OpenAI with their older versions, the Ada models, and we were playing with those. But GPT-3.5 literally pulled the rug out from under everyone: it's really able to understand what the user is saying and able to articulate and generate content. So we had been working with them, but all of a sudden the tech showed tremendous results, and that is why. Because we had been working with machine learning as one of the three NLU engines, we were already there working on those models, and all of a sudden we saw the models perform extremely well. 

Earlier, we used to see that ranking and resolving could happen with fundamental meaning, or sometimes with the knowledge graph, depending on the training you gave. But with generative AI, machine learning can win almost every day, unless there are custom words in use, like a signature card, or items on a food menu that the generative AI was not trained on. Otherwise it's machine learning winning, because all we did was change that machine learning model from a transformer into an LLM. That is what we did from an intent identification perspective. But when it comes to language generation, generative AI still sometimes hallucinates a lot. So there are underlying problems that it brings, like the example of Air Canada, or the example of the Chevy bot that said the competition's product is better, because there were not proper guardrails there. 

So from an intent recognition perspective, we started using few-shot and zero-shot models. Then, when the response comes back, we use generative AI where it can have natural conversations, as in the case of food ordering, which is more complex: you cannot really define it programmatically because there are multiple permutations and combinations. That is where we use generative AI. But we always keep control over response phrasing when it comes to the guardrails. That allows us to adopt this technology faster, because of the platform capabilities we built over the years using machine learning as one of the three NLU engines. That's what made us so fast in adopting the technology and enabling it, and we have customers now: pretty much 50% of our net new customers are deploying LLM-based solutions using generative AI. 
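To make the pattern Gopi describes concrete, here is a minimal sketch of few-shot intent detection with a guardrailed response step. The intent labels, prompt and `call_llm` helper are hypothetical illustrations, not Kore AI's actual implementation.

```python
# Minimal sketch: LLM-based intent detection constrained by guardrails.
# Labels, prompt and call_llm() are hypothetical, for illustration only.

ALLOWED_INTENTS = {"order_status", "returns", "store_hours", "escalate_to_agent"}

FEW_SHOT_PROMPT = """Classify the user's message into one of these intents:
order_status, returns, store_hours, escalate_to_agent.

User: Where is my package?             Intent: order_status
User: I want to send these shoes back. Intent: returns
User: {message}                        Intent:"""

def call_llm(prompt: str) -> str:
    """Placeholder for whatever LLM endpoint the platform actually calls."""
    raise NotImplementedError

def detect_intent(message: str) -> str:
    raw = call_llm(FEW_SHOT_PROMPT.format(message=message)).strip().lower()
    # Guardrail 1: never accept an intent outside the allowed set.
    return raw if raw in ALLOWED_INTENTS else "escalate_to_agent"

def respond(message: str) -> str:
    # Guardrail 2: the final wording comes from approved templates, so the
    # bot cannot invent off-brand answers the way an unguarded LLM might.
    templates = {
        "order_status": "Let me look up your order. Could you share the order number?",
        "returns": "I can help with that return. Which item are you sending back?",
        "store_hours": "Our stores are open 9am to 9pm local time.",
        "escalate_to_agent": "Let me connect you with a live agent.",
    }
    return templates[detect_intent(message)]
```

In a fuller system, a generative rephrasing step could sit between the template and the user, with its output checked against the same guardrails before anything is sent.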

Mike Giambattista 

Are they doing it in a low-code, no-code basis? Is that still the way this platform works? 

Gopi Polavarapu 

That is correct. 

So we enable low-code, no-code in two aspects. 

One is building the conversations and the journey of a customer from a CX perspective, but there is also another low-code, no-code capability that we are providing. 

We have launched a new platform called Gale that enables you to take an open-source model or a commercial model, fine-tune it with your data based on all the interactions you had with customers in the past, from a contact center perspective or a pre-sales perspective, so you're teaching the models on a local basis and then building applications on top of it to deliver the value of generative AI. We strongly believe low-code, no-code is critical, because you have to take these capabilities to the customer service people so that they can ingest all this data in a low-code, no-code way, and the tech teams can focus on how to do AI ops on the models and manage the models, rather than trying to go and do that work themselves. That is what we're doing really well: enabling the tech teams with the tech stack while at the same time enabling the business users with what they need so that they can accomplish their task. 
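As a rough illustration of the fine-tuning workflow Gopi outlines, the sketch below turns past contact-center transcripts into prompt/completion pairs for supervised fine-tuning. The file name, field names and format are assumptions made for the example, not Gale's actual ingestion format.

```python
# Sketch: build a fine-tuning dataset from historical contact-center chats.
# Field names and the JSONL format are illustrative assumptions.
import json

def transcript_to_examples(transcript: list[dict]) -> list[dict]:
    """Pair each customer message with the agent reply that followed it."""
    examples = []
    for prev, curr in zip(transcript, transcript[1:]):
        if prev["speaker"] == "customer" and curr["speaker"] == "agent":
            examples.append({"prompt": prev["text"], "completion": curr["text"]})
    return examples

transcripts = [
    [
        {"speaker": "customer", "text": "My order hasn't arrived yet."},
        {"speaker": "agent", "text": "Sorry about that. Let me check the tracking for you."},
    ],
]

with open("fine_tune_data.jsonl", "w") as f:
    for chat in transcripts:
        for example in transcript_to_examples(chat):
            f.write(json.dumps(example) + "\n")
```

The resulting file is the kind of input an open-source model could be tuned on locally, keeping the data inside the enterprise.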

Mike Giambattista 

It’s an interesting transformation from the way so many IT departments and customer service operations have conceived of and used the tools they’ve been given over the years. 

With your platform, you're putting so much more control in the hands of the actual customer-facing people, or just call them the end users of the tech. It has to have meant pretty significant changes on the inside of those companies as well, in order to figure out, you know, how best to integrate and then deploy this. It seems to me like you've been, what's a kind way of saying this, you've been responsible for upsetting a lot of apple carts with the way that you're doing business, and getting companies to change in a way that really captures the utility and leverages these tools well is always a challenge, it seems. I'd love to hear from your perspective whether you think that's a true assessment or not, and if it is, how you overcome that. Is it just because the value proposition is so good and so big that companies are forced to just kind of wholesale say, we'll figure out what we need to do, or is there some better way of approaching it? 

Gopi Polavarapu 

I mean, a lot of our customers have accepted this as a welcome thing, because a lot of the times, you know, the people that are on the tech side probably do not know what their brand means. Most of the marketing teams, most of the customer support teams know what their brand means and what it means for customers. How do you treat the customers differently? So a lot of these subtle things are very important when you develop a conversation design on how you want to treat a customer, what the flows are. So that is critical. So can you still hear me? There’s a lot of background noise. 

Mike Giambattista 

Yeah, it's a little echoey. Okay, yeah, that's better. Thanks. 

Gopi Polavarapu 

Awesome. So that's what we're seeing, where every company wants to create differentiation for themselves when it comes to customer support and marketing on their website. So their business users are much more eager to be part of the development of the product. For example, with the low-code, no-code tools, we have enabled the customer service department to be part of the product development. If you look at our platform, it's almost like script writing. You go and say, okay, here is what the customer says, here is what the bot says. 

They can start writing that in plain English, and then from there the developers work in the backend. Take "where is my order" as an example. That is a very simple thing, and you may want to have your own way of communicating with the customer, but on the backend there is an API to be talked to. So there is a partnership between the customer service department and engineering now, where customer service is defining the flows but engineering is looking at which API calls to connect to in order to accomplish the use case. It's not like we're isolating those teams completely; we're enabling both players to get the best from both of those teams, so it's a joint collaboration that happens from day one, because what matters when you call customer support is what you hear. That is what matters, right, and that is what people perceive. 

When I talk to a bot, is it human enough for me to have a conversation with it, or should I just say, get me an agent? Because most people are not used to talking to a bot. I don't want to talk to the bot personally; I want to get to a human as quickly as I can. But that is not the case anymore. Even OpenAI just showed their coolest demos. We are seeing human emotion now becoming part of the TTS, the text-to-speech engines, to the point where you really don't know whether you're talking to a human or a bot. And all of these things come together as you include people from the customer service side, who want to bring their brand, their values and their culture into those communications. That is why this has actually helped us quite well when it comes to deploying faster and quicker for customers, in their own way. 
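A minimal sketch of the division of labor Gopi describes: the customer service team authors the flow in plain language, and engineering wires the data-dependent step to a backend call. The flow structure, endpoint URL and response fields are hypothetical.

```python
# Sketch: CX-authored flow plus an engineering-owned backend call.
import requests  # any HTTP client would do

ORDER_API = "https://example.internal/api/orders/{order_id}"  # hypothetical endpoint

# Authored by the customer service team: what the bot says, what it collects.
FLOW = [
    {"bot": "Hi! I can help with your order. What's your order number?", "collect": "order_id"},
    {"bot": "Thanks, let me check that for you.", "action": "lookup_order"},
]

# Wired up by engineering: the API call behind the "lookup_order" step.
def lookup_order(order_id: str) -> str:
    resp = requests.get(ORDER_API.format(order_id=order_id), timeout=5)
    resp.raise_for_status()
    status = resp.json().get("status", "unknown")
    return f"Your order {order_id} is currently: {status}."
```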

Mike Giambattista 

I mean, just as you're saying this, I can see, of course, the efficiencies for the end user and the way customers are addressed in those kinds of situations, but also the obvious efficiencies between the development team and the CX team, the CS team. Those kinds of collaborations are rare, if they've ever been possible at all. So I think what you've enabled there is a pretty big deal, which might even be the main value proposition: look what we can now do by enabling the end users to have a voice in guiding the development of the product. That's a giant shift. Just curious, because the platform can be deployed in so many different ways, in so many different situations and verticals, and you understand the nuances of each one of them. 

Is Kore AI's business model to be the kind of number-one brand provider of this service package? Or are other companies coming to Kore AI and saying, let us embed your technology into our platforms so that we can compete on this level? Because it seems to me, just based on how you've described your approach to AI and LLMs and growing them in sensible ways, a lot of companies in the AI space can't really devote the level of attention to it that you seem to. I mean, you just secured $150 million in funding. That allows you to do a lot of stuff that smaller entities couldn't do. 

Gopi Polavarapu 

That is true. I mean, we are on the 11th version of our platform, right? We didn't just start now, like most AI companies are starting right now. There have been so many AI companies in the last three years since OpenAI came up, but we've been in the market; we're on the 11th version of our platform. It is robust now, and people are using this capability not only to build bots: our solution capability includes contact center as a service, it includes agent AI. So there is something across the whole journey, right? 

So we've made these investments over the years; as we saw a problem, we started solving it. It's kind of peeling the onion step by step, but at the end of the day the goal is how to provide the solution that's needed from the moment a customer reaches out to you, starting from a channel perspective, in how they want to talk to you. Because these days, you know, a lot of the younger Gen Zs want to talk to you on Snapchat, they want to talk to you on TikTok, right? There's a whole different demographic that's come up into the workplace now. So it goes from having those channels and how the message comes to us, all the way to an agent who actually has to do multiple things. In the case of an airline, you want to change a plane, the agent has to go through a lot of steps. You rarely buy tickets from airlines, but you go to them for support, because you're typically buying through OTAs, the Expedias of the world. When the request comes to them, they need to do a lot of back-end work to figure out who the travel agent is for you and what class of ticket you bought. There's a lot of work the agent has to go through. So we've solved the problem from the way consumers want to talk to the brand all the way through to how the customer service people are actually providing support to them. This entire journey has been thought through, and we've solved it for multiple verticals and multiple functions, like an IT help desk, right? And you asked, can people resell our platform? We have managed service providers who actually use our product and have created an offering from it. 

Hey, I'm giving you a blend of AI with humans, so that the AI and humans are working together in an augmented way rather than replacing the human completely. Because you still need a human. In the case of a restaurant, I'm sure you have seen Wendy's AI drive-throughs. It doesn't matter whether you put AI in; if I'm a cash-paying customer, you still need a human to collect the cash at the end, right? So it's not really eliminating the human. What it is doing is giving you more time to do other things and freeing up bandwidth. It's about being productive. It's a tool right now. 

So, similarly, contact center people now are looking for these tools. They don't need to go and look at 10 different screens; our agent AI is there next to them as a co-pilot, helping them with everything the customer is saying. Say the customer is pissed off: you see a sentiment metric at the top, and if they're really pissed off, the AI will automatically tell you what to say to pacify the customer. These are the things we are adding to really transform customer experiences from an external use case perspective. And from an internal use case perspective, it's the IT help desk, or the HR help desk, or the recruiter that's looking to hire somebody. So we're solving these internal and external use cases using the same tech stack for both CX and EX. 
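As a small illustration of the co-pilot behavior described above, the sketch below scores each customer message for sentiment and surfaces a de-escalation hint when it turns negative. The keyword scorer and threshold are stand-ins for a real sentiment model.

```python
# Sketch: flag negative sentiment and suggest a de-escalation approach.

def score_sentiment(message: str) -> float:
    """Toy stand-in for a sentiment model; returns roughly -1.0 (angry) to 1.0 (happy)."""
    angry_words = {"ridiculous", "unacceptable", "angry", "worst"}
    hits = sum(word in message.lower() for word in angry_words)
    return max(-1.0, 0.5 - hits)

def agent_assist(message: str) -> dict:
    score = score_sentiment(message)
    hint = None
    if score < 0.0:
        hint = ("Acknowledge the frustration, apologize once, and offer a "
                "concrete next step with a timeframe.")
    return {"sentiment": score, "suggested_reply_hint": hint}

print(agent_assist("This is ridiculous, the worst service I've had."))
```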

Mike Giambattista 

So let's say you're a CX or EX leader in your company. You're always looking for efficiencies, operational and fiscal, and you come across Kore.ai. It's a really, really interesting solution. What does it take for somebody like that to kind of pass the considerations test? They're clearly going to be looking at budget, you know, what is the cost of the platform, but also the cost of integration and the time to create the proprietary LLM that'll be dealing with all this stuff. I mean, there's got to be some sort of ingestion, learning, training period, right? So, you know, let's say you're, and I'm picking this out of thin air, I have no idea whether they are a client of yours or not, let's say you're New Balance, the tennis shoe company here in the States. It's pretty popular; they've been around for, I think, almost a century. If you're considering something like that as an enterprise retail brand, what are the considerations? And then, realistically, how long is it going to take them to really fully integrate this into their systems, into their way of working? 

Gopi Polavarapu 

Yeah, so it all depends on their current setup. Let's take a retail example. We are working with a brand that sells a lot of watches, and they're on Shopify as their commerce platform, right? As part of the pre-built solutions, we are already pre-integrating with Shopify, and we're already pre-integrating LLMs that know everything about the product. We have what we call a search AI platform that allows you to connect to various PIM databases that hold all the product marketing documentation. So we ingest all these documents from whatever the source is: it could be a Word document, a spreadsheet, any form of document, or any structured data coming through a commerce platform. Once we ingest all of this data, all of a sudden we have the data available. It's a mix of RAG and an LLM, because the problem with an LLM is that it hallucinates a lot; it depends on how much data you have. 

So you don't want to put the LLM ahead of its time. You would start with RAG initially, where you're always providing a source of truth. You always have evidence of where you got the information, until you see the model has evolved enough that you can start building your own LLM. So it's a journey. I don't expect this to be like, day one, you enable it and it just runs. No, it's a journey for everyone. You have to have a roadmap that's pragmatic enough for you to start getting immediate value. That is why we have these pre-built solutions like Retail Assist, Food Assist and Travel Assist, these Assist solutions where the time to value is less than a month. You can onboard and start seeing the value, right? 
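To ground the "RAG first, always with a source of truth" point, here is a minimal retrieval sketch that returns the matching passage together with the document it came from, so every answer carries evidence. The `embed` function and the two documents are toy placeholders, not any particular product's search AI.

```python
# Sketch: retrieve supporting passages and keep their source alongside them.
import math

DOCS = {
    "returns-policy.docx": "Items can be returned within 30 days with a receipt.",
    "watch-spec-sheet.xlsx": "The Model X watch is water resistant to 50 meters.",
}

def embed(text: str) -> list[float]:
    """Toy bag-of-letters vector; a real system would call an embedding model."""
    return [text.lower().count(c) for c in "abcdefghijklmnopqrstuvwxyz"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 1) -> list[tuple[str, str]]:
    q = embed(question)
    ranked = sorted(DOCS.items(), key=lambda kv: cosine(q, embed(kv[1])), reverse=True)
    return ranked[:k]

question = "Is the watch water resistant?"
evidence = retrieve(question)
# The answer shown to the user would be generated only from `evidence`
# and displayed together with the source file it came from.
print(question, evidence)
```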

So we have seen customers deploy it quickly, and it is pre-trained with the language that's used in retail, like people wanting to return something or having a question about a product. In retail, it's all about knowing more about the product: I have my concerns, I need somebody to answer my question. Just like when you go to a retail store and tap somebody on the shoulder to ask a question, digitally we're creating the same experience with virtual avatars, chatbots or virtual assistants on their website. You can ask any question and it'll help you. 

It can look at your CDP, your customer data platform, and identify what kind of shoes you have bought in the past. Are you a size 10 or a size 11? Depending on that, we can ask, are you buying it for yourself? Then all of a sudden I'm able to reduce returns, so that they don't accidentally order the wrong size, right? So, hey, if you're buying New Balance, your Nike is typically about a size 10, but for New Balance it could be a 9.5, because the fit can be a bit different, right? 

So we can start providing those personalization alerts, so that the outcome you're looking at is how to reduce returns, which is the bigger problem for any e-commerce company, rather than looking at AI for its own sake. It all depends on the use case and where they are in the journey, but we have seen customers take anywhere from less than a week to see value, to almost three months when it's completely custom software and completely custom APIs. So within three months you'll start seeing the value of the pre-built solutions we have developed for that. 

Mike Giambattista

Remarkable, truly remarkable. I don't want to skip over that, because that's an extraordinarily fast timeline. So I'm aware of your Retail Assist product, I'm aware of your Travel Assist product, and I know you've got several others. What other verticals are you focusing on at the moment? 

Gopi Polavarapu

So we started with banking. Essentially, Bank Assist is our biggest offering. We kind of followed the IT spend money trail on AI and who's using AI the most, so financial services is where we started focusing. After that we have healthcare, which is our second biggest vertical. Then we started expanding into retail and consumer. For us, anything shopping is retail: that includes e-commerce, retail, CPG and the D2C strategy of manufacturers, because every manufacturer is trying to have their own direct-to-consumer play these days, so we're enabling them on the e-commerce strategy. That is the new vertical. Then travel and hospitality is the next vertical that we have people assigned to, and that's how our sales teams are divided into these kinds of pockets. And we just recently announced a new vertical called Hitech, which is primarily around digital natives looking at adopting this technology for their own internal and external use cases. 

Mike Giambattista 

Interesting. What would be a typical client for that vertical, if you will? 

Gopi Polavarapu 

Take a data center company like Equinix as an example; they would be a good fit. They're digitally native, and they're looking at this as tech to sell tech. 

Mike Giambattista 

Gotcha. Okay, wow, that's really interesting. You know, side note here: frankly, I think we could go on for a couple of hours, but I don't want to blow apart your schedule.

Gopi Polavarapu 

I think the bigger thing that I'm interested in talking about, for every enterprise customer, is the one thing everybody should be worried about: how do you get next-gen talent into your workforce? Gen Z is coming into the workforce, right? People born in 2000 are now 24 years old and joining the workforce. All these people have essentially been born with a smartphone in their hand, at least since they were 7 or 8 years old. 

They've been using "OK Google" and Siri, and they're used to this technology of asking a question and having something do the job for them, versus the millennials, Gen X and baby boomers, who are a different workforce. So in order to attract this talent, you need to enable the tools that are needed in the workforce for doing the job, like, hey, I need a ChatGPT for my enterprise. Obviously, enterprises are blocking generative AI because they're worried about data privacy and related concerns. So this is a time where I see every enterprise having to adopt some kind of enterprise solution, within their four walls, data-protected and trusted, and provide these tools, which is where Kore AI is heavily investing. 

With our EX strategy, we've developed a solution called EVA, Employee Virtual Assistant, and that is more about knowledge management of anything your enterprise has. Do I have an NDA with this company? Write me the contract amendment for this particular customer, all internal details. It talks to your backend systems and APIs, and you're conversing with it naturally, whether you type into it or talk to it. It's doing work for you, and it's also helping you with your HR-related stuff, from benefits to, I want to hire somebody, or I want to replace somebody. It talks to all the APIs in the backend within your enterprise and gives you answers. That could be sales data. 

What were our sales in California last month? Or it could be, do I have an NDA with this company? Or, I need to extend a contract: what was the contract term for that company, and do I have an exit clause? Any question you have that involves enterprise data today is an email, and you follow up on that stuff. 

Or you try to schedule a meeting or find information, or you may have an IT problem you want to get resolved, or you may have an HR issue where you need to enroll in your benefits or change your benefit plans. These days you don't need to talk to a person, and Gen Z people don't like to talk; they don't want to talk to a person, right? They want to find out and figure it out themselves. So this is a perfect mix where enterprises have to provide these kinds of tools that run within their four walls, and Kore has actually come up with that solution with EVA. We have made investments over the years in HR Assist, IT Assist and Recruit Assist, and we're combining all of these things as automation tools, plus a help desk that comes with it, because you still need to talk to people at some level, because there is something you want to find out, or your hardware is locked and you need to talk to someone. 
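As a rough sketch of the routing idea behind EVA as described above, the example below sends an employee's question to a stubbed backend handler for sales, contracts or IT. The keyword routes and handler names are hypothetical; a real assistant would use intent detection rather than keyword matching.

```python
# Sketch: route an employee question to the right backend system stub.

def sales_report(question: str) -> str:
    return "Stub: would query the sales API for the requested region and month."

def contracts_lookup(question: str) -> str:
    return "Stub: would search the contract repository for NDAs and terms."

def it_ticket(question: str) -> str:
    return "Stub: would open a ticket in the IT service management system."

ROUTES = [
    ({"sales", "revenue"}, sales_report),
    ({"nda", "contract", "clause"}, contracts_lookup),
    ({"laptop", "password", "locked"}, it_ticket),
]

def answer(question: str) -> str:
    words = set(question.lower().strip("?").split())
    for keywords, handler in ROUTES:
        if words & keywords:
            return handler(question)
    return "I couldn't find that; let me connect you to the help desk."

print(answer("What were our sales in California last month?"))
```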

So we've enabled all of these capabilities, similar to a CX experience where a customer calls or chats with you for help, you try to let them self-serve, and if that doesn't resolve it, they talk to a support center. Similarly, on the employee side, we're doing the same thing and adopting the same tech, but here you're supercharging employees with all the information the business has for their needs and just answering their question, rather than exposing the entire enterprise data set to them. One thing I didn't mention earlier is that we are a two-time market leader in the Gartner Magic Quadrant for conversational AI, and Forrester just announced their latest conversational AI report and we're the number one leader among the leaders, literally number one in the industry when it comes to the analyst reports, and we're working with the other industry analysts as well. So we expect to be a leader in this space throughout. 

You can see who's playing in that space from the reports. We'll share the reports with you; you can see who's who in the zoo. But in general, Kore is definitely leading the pack from a differentiation perspective, a vision perspective and an execution perspective. We're now on pretty much every continent with customer deployments. 
