
Is your brand ready for generative AI? Here’s how CX leaders can prepare

Sarah Fox
AI Content Specialist
AI & Automation | 9 min read

By now you’ve probably heard of ChatGPT (we’d be worried if you haven’t). The next-level, AI-powered chatbot is taking the world by storm, and for good reason.

ChatGPT is built on a large language model (LLM) trained on vast amounts of data. Using this data, it enables generative AI capable of creating novel and surprisingly creative content in response to custom queries. It’s a game-changing breakthrough that will transform not only the way people work, but also the future of customer support interactions.

As business leaders start wrapping their heads around these impressive capabilities, many are still wondering how LLMs can impact customer service departments. Will reps become obsolete? Will we need to create new roles? And how will it impact the enterprise software landscape?

Like previous iterations of AI, generative AI will take over some customer support tasks — but it will also create exciting new opportunities. The role of human feedback, both from customers and from within the customer service organization, is even more critical when AI acts independently. Customer service automation roles, such as bot manager and conversational designer, will shift their efforts away from creating content to editing, fact-checking, and acting as brand gatekeepers. 

Humans will be responsible for identifying and correcting mistakes and improving performance over time — because even the most confident AI can still get things wrong. The key challenge here is alignment. Brands must optimize LLMs for their intended use in order to give customers accurate information delivered in the brand voice.

Rather than diminishing the role of customer service professionals, generative AI will actually amplify the importance of human input. Pretty cool, right? Let’s explore how.

The large language model advantage

Chatbots built on LLMs can both understand customer intent and create new, tailored content in the form of text, images, and video based on the extremely large datasets they’re trained on. They’re able to pick up patterns and make stunningly accurate predictions — but they’re only a foundational layer. To give customers the kind of fully personalized brand experience they now expect, companies will need to build an additional application layer on top of the LLM.
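
To make the “application layer” idea more concrete, here’s a minimal sketch of what such a wrapper could look like, assuming an OpenAI-style chat completions endpoint. The brand rules, model name, and knowledge snippets are illustrative placeholders, not a reference implementation.

```python
# A minimal sketch of an "application layer" wrapped around an LLM API.
# Assumes an OpenAI-style chat completions endpoint; the brand rules, model
# name, and knowledge snippets are illustrative placeholders.
import os
import requests

BRAND_VOICE = (
    "You are the support assistant for Acme Apparel. "
    "Be friendly and concise, and answer only from the provided context. "
    "If the answer isn't in the context, offer to connect the customer with an agent."
)

def answer_customer(question: str, knowledge_snippets: list[str]) -> str:
    """Wrap a raw LLM call with brand voice and curated knowledge."""
    context = "\n".join(knowledge_snippets)
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-3.5-turbo",  # placeholder model name
            "messages": [
                {"role": "system", "content": BRAND_VOICE},
                {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
            "temperature": 0.3,  # keep answers close to the supplied context
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```

The LLM supplies the language ability; the brand voice, curated knowledge, and escalation path wrapped around it are the application layer your team owns.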

We recently brought together industry experts Yaniv Markovski, Head of Support Engineering and Community at OpenAI, the company behind ChatGPT, and Ada’s own VP of Machine Learning, Yochai Konig, along with our CEO Mike Murchison, to hear their thoughts on how CX leaders can prepare for this next wave of digital innovation. Here are some of their top tips.

The ChatGPT opportunity

Explore the implications and benefits of bringing generative AI to customer service.

Watch the webinar

Test drive generative AI to see how it will impact your CX

The best way to gain a better understanding of generative AI is to experience it for yourself. Run a few interactions through ChatGPT, or try using it in place of Google. Join the OpenAI and Midjourney Discord servers to connect with others who are also new to the technology. Experiment with your top business use cases to see how it might impact the brand experience.

Keep in mind that deploying generative AI isn’t an all-or-nothing decision. Companies are already using it to different extents and for different purposes, according to Yaniv. It probably won’t make sense to auto-generate all of your content. It’s up to each organization to weigh the risks against the opportunities and decide in what contexts, if any, it’s acceptable for customers to occasionally get incorrect information.

LLMs have other limitations that brands should be aware of. Like any technology trained on content from the Internet, they can reflect human biases, so safeguarding against malicious content is key to protecting your brand reputation. LLMs also only reflect knowledge of the world as of their most recent training, so if a model isn’t continually updated, it will lag behind new developments in content, culture, and context.

So let’s say you’ve decided to add a generative AI layer to your customer service department. You’ll need to ensure content and documentation are consistent and accurate across the board to maintain a single source of truth. You’ll want to anticipate the ways negative bias could slip in and establish effective guardrails to keep harmful content away from your customers. By carefully planning the customer experience you want to deliver, you can mitigate potential safety and accuracy issues.
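
What a guardrail looks like in practice will vary by brand, but even a simple post-generation check illustrates the idea. The blocked topics, length cap, and fallback copy below are hypothetical; a production system would combine policy checks, moderation tooling, and human review.

```python
# Illustrative guardrail: screen a generated reply before it reaches the customer.
# The blocked topics, length cap, and fallback copy are placeholders, not a
# complete safety system.

BLOCKED_TOPICS = ("legal advice", "medical advice", "guaranteed refund")  # example policy terms
FALLBACK = "I want to make sure you get an accurate answer, so let me connect you with a teammate."

def apply_guardrails(generated_reply: str) -> str:
    """Return the reply if it passes simple checks; otherwise escalate to a human."""
    text = generated_reply.lower()
    if any(topic in text for topic in BLOCKED_TOPICS):
        return FALLBACK
    if len(generated_reply) > 1200:  # unusually long replies often signal the model drifting off-script
        return FALLBACK
    return generated_reply
```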

Key steps to get started: Define paths to resolution, design conversations, and structure your team accordingly

At a high level, generative AI expands the role of customer support teams in three core areas: content generation guidance and quality, conversational design, and integration specialization. Let’s break these down.

Once your chatbot sits at the front line of your CX and takes on the primary role of responding to customers, your service organization will need a strong quality assurance program to manage bot-generated content. This empowers support teams to take the more mundane and repetitive tasks off their plates and put more thought into what kind of content they want to create and why.

Conversational AI specialists will continue to structure the chatbot’s dialogue and flows, including defining tone and voice, but with the added advantage of faster trend and optimization insights. According to Yochai, their focus will include deciding the formality level and length of responses, and setting parameters for speaking to different demographics. For example, an ecommerce clothing company that sells to a younger crowd would probably want to use more casual language with its customers than a FinTech company that caters to retirees.
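
In practice, these design decisions often end up captured as configuration that gets folded into the bot’s prompts. The field names and example profiles below are hypothetical, but they show how formality, length, and audience can be made explicit and testable.

```python
# Hypothetical sketch: conversational-design decisions captured as configuration
# that gets folded into the bot's prompt. Field names and profiles are illustrative.
from dataclasses import dataclass

@dataclass
class VoiceProfile:
    formality: str      # "casual" or "formal"
    max_sentences: int  # cap on response length
    audience: str       # who the brand is speaking to
    emoji_ok: bool

GEN_Z_RETAIL = VoiceProfile("casual", 3, "young online shoppers", True)
FINTECH_RETIREES = VoiceProfile("formal", 5, "retirement-age investors", False)

def style_instructions(profile: VoiceProfile) -> str:
    """Turn a voice profile into instructions prepended to every bot response."""
    emoji_rule = "Emoji are welcome." if profile.emoji_ok else "Do not use emoji."
    return (
        f"Write in a {profile.formality} tone for {profile.audience}. "
        f"Keep replies to at most {profile.max_sentences} sentences. {emoji_rule}"
    )
```

Keeping these choices in configuration makes them easy to review, A/B test, and adjust without rebuilding conversation flows.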

Engineers can use newly freed work hours to build better APIs, enhancing the chatbot by adding customization layers. They’ll also need to develop continuous feedback loops to enable humans to update the system with what they’re learning from customers.
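
A feedback loop can start very simply: log every rated answer alongside any human correction so bot managers have a review queue to work from. The schema and storage below are assumptions for illustration, not a prescribed pipeline.

```python
# Sketch of a human-in-the-loop feedback record, assuming rated answers are logged
# for review. Storage, schema, and the review workflow itself are brand-specific.
import json
import time

def log_feedback(conversation_id: str, bot_answer: str, rating: str, agent_note: str = "") -> None:
    """Append customer or agent feedback so bot managers can review, correct, and re-prompt."""
    record = {
        "timestamp": time.time(),
        "conversation_id": conversation_id,
        "bot_answer": bot_answer,
        "rating": rating,          # e.g. "thumbs_up" / "thumbs_down"
        "agent_note": agent_note,  # human correction or added context
    }
    with open("bot_feedback.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")
```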

What makes building a conversational application on LLMs so powerful is that it allows customer service teams to move away from prescriptive chatbot models — questions tagged and mapped to a static set of answers. Instead, the customer support organization can adopt a more dynamic, generative approach. Remember, to deliver optimal outcomes, both your team and your strategy should be structured around your most relevant use cases.
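
The shift is easier to see side by side. In the sketch below, the prescriptive bot can only return a canned answer for a recognized intent, while the generative bot retrieves relevant content and hands it to the answer_customer wrapper sketched earlier; the intents, canned answers, and knowledge snippets are made up for illustration.

```python
# Miniature contrast between the two approaches. The intents, canned answers, and
# knowledge snippets are made up; answer_customer() is the LLM wrapper sketched earlier.

CANNED_ANSWERS = {
    "order_status": "You can track your order from the link in your confirmation email.",
    "returns": "We accept returns within 30 days of delivery.",
}

KNOWLEDGE_BASE = [
    "Orders ship within 2 business days; a tracking link is emailed at dispatch.",
    "Returns are accepted within 30 days of delivery for unworn items with tags.",
]

def prescriptive_reply(intent: str) -> str:
    """Old model: a tagged intent maps to one static answer."""
    return CANNED_ANSWERS.get(intent, "Sorry, I didn't understand that.")

def generative_reply(question: str) -> str:
    """New model: retrieve relevant content, then let the LLM compose the answer."""
    matches = [doc for doc in KNOWLEDGE_BASE
               if any(word in doc.lower() for word in question.lower().split())]
    return answer_customer(question, matches)  # wrapper from the earlier sketch
```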

Build or buy: Which is right for your brand?

This is a major consideration for tech leaders: the unprecedented amount of data required to train an LLM makes the venture both time-consuming and costly. In 2023, we’ll see more organizations using LLM APIs to enhance customer interactions. If you choose to go down this path, consider the challenges and limitations of building your own conversational application on top of an LLM.

That said, each company’s customer base is unique, as are their needs, so customized solutions add value. Some businesses are equipped with the teams and tools to build the application layer over the LLM. But Yaniv urges caution with this approach. “Building is not the right thing for everybody,” he says. As an investment, AI requires expert attention for the long term. Consider whether you have everything you need to deliver on this strategy.

Another option is to choose a platform that’s already deploying conversational AI built on existing LLMs. Some companies act as an intermediary between LLMs and customers by designing branded applications. As generative AI use becomes more commonplace, this middle layer is going to be at the forefront of future CX innovations.

“Generative AI changes the fundamental relationship that most brands have with their customers,” says Ada’s CEO, Mike Murchison. “Instead of customers bending to the whims of the brand, the brand is meeting the customer where they are.”

Explore your options now to get ahead

As the generative AI landscape becomes more complex, brands need a deep understanding of both their customer base and the growing LLM industry to make the right longer-term decisions for their users, brand, and business. The sooner you venture into this new territory, the sooner you can gain a competitive advantage.

The generative AI toolkit for customer service leaders

Evolve your team, strategy, and tech stack for an AI-first future.

Get the toolkit