AWS Contact Center
How contact center leaders can prepare for generative AI
The widespread interest in generative artificial intelligence (AI) has created a renewed focus on the power of AI to solve business challenges, especially in customer service. Generative AI is a type of AI that can create new content and ideas, including conversations, stories, images, videos, and music. According to McKinsey, customer experience (CX) is one of the top use cases for generative AI. They estimate a productivity improvement worth 30 to 45 percent of current function costs from applying generative AI to customer care functions.
In this first part of a three-part blog series, we take a look at what generative AI is, how it is changing the CX landscape, and the business outcomes it can help deliver. In parts two and three, we will dive deeper into the key considerations for success with generative AI and best practices for using Large Language Models (LLMs) in CX environments.
Generative AI is helping reinvent customer experiences
Generative AI is powered by very large machine learning models that are pre-trained on vast amounts of data at scale. These pre-trained models are commonly referred to as Foundation Models (FMs); Anthropic’s Claude 2, available on Amazon Bedrock, is one example. LLMs are a subset of FMs focused on understanding and generating human-like text.
LLMs bring exciting potential to CX use cases because they can help contact centers manage and process large amounts of data and deliver real-time CX enhancements. For example, an LLM’s ability to produce human-like text is an area of immediate opportunity for CX: it can be used to generate conversation summaries and to provide real-time agent assistance.
LLMs will also improve the natural language processing of voice bot and chatbot conversations, which is fundamental to the success of automation in contact centers. This is because LLMs can leverage vast amounts of metadata for context (such as call history, previous conversation transcripts, and prior transactions) within an active conversation. Additionally, LLMs can more readily parse complex, non-linear sentence structures to accurately determine contact intent.
Consider a caller interacting with a natural language understanding (NLU) travel bot who tells the bot they want to travel to the “capital of Norway.” A traditional NLU bot would not have the context to resolve that request properly, because NLU bots are trained on a much smaller, more narrowly scoped dataset than an LLM. With an LLM integration, however, the bot would “understand” that the customer wants to travel to Oslo. This is a simple example of an LLM using its vast pre-trained knowledge to resolve the customer’s intent.
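To make this concrete, here is a minimal sketch of how a bot integration might ask an LLM to resolve a free-form request into a destination city. It assumes access to Anthropic’s Claude 2 through the Amazon Bedrock runtime API; the prompt wording and the resolve_destination helper are illustrative assumptions, not part of any Amazon Connect or Amazon Lex feature.

```python
import json
import boto3

# Amazon Bedrock runtime client (assumes Bedrock model access is enabled in this account and Region)
bedrock = boto3.client("bedrock-runtime")

def resolve_destination(utterance: str) -> str:
    """Illustrative helper: ask the LLM to map a free-form travel request to a city name."""
    prompt = (
        "\n\nHuman: A traveler said: \"" + utterance + "\".\n"
        "Reply with only the name of the destination city they mean.\n\nAssistant:"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",       # Claude 2 on Amazon Bedrock
        contentType="application/json",
        accept="application/json",
        body=json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": 20,
            "temperature": 0,                # keep the answer deterministic
        }),
    )
    return json.loads(response["body"].read())["completion"].strip()

print(resolve_destination("I want to fly to the capital of Norway"))  # expected: Oslo
```

In a production bot you would validate the LLM’s answer against your supported destinations before acting on it.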
There may be use cases where you want responses constrained to a particular, smaller dataset to get a more predictable outcome than open-ended LLM-generated responses. We will dig deeper into those use cases and trade-offs in the subsequent posts in this series.
Outline real CX business outcomes you’re looking to drive with generative AI
We believe improving customer experiences is one of the top use cases for generative AI because of the opportunity to further improve assistance for agents, insights for managers, and self-service experiences for customers.
As you consider how generative AI can be valuable in your contact center, it’s important to start with the business outcomes you’re trying to achieve. This will help you home in on specific use cases for generative AI in your contact center. The following three use cases provide immediate business value by increasing agent efficiency, processing data more accurately, and helping customers get answers to more complex queries.
Generative AI has the potential to help further reduce handle times and increase first call resolution by improving agent efficiency and accuracy when responding to customer issues. Generative AI can enhance agent-assist capabilities by generating real-time suggested responses and actions, summarized from company knowledge content, to help agents solve customer problems quickly and accurately. For example, when a customer calls to inquire about their auto insurance claim, an LLM can leverage information about the customer’s claim and policy, repair shop details from the insurer’s website, and policy documents from internal repositories to provide the agent with a comprehensive response and next actions to help resolve the customer’s issue.
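As a rough sketch of that pattern, the snippet below assembles claim, policy, and knowledge-content excerpts you have already retrieved from your own systems into a single prompt and asks an LLM for a suggested agent response. The suggest_agent_response helper, the input names, and the prompt wording are all illustrative assumptions, again using Claude 2 on Amazon Bedrock.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def suggest_agent_response(claim_details: str, policy_excerpt: str, kb_snippets: str) -> str:
    """Illustrative agent-assist helper: grounds the LLM in data retrieved from your own systems."""
    prompt = (
        "\n\nHuman: You are assisting a contact center agent on an auto insurance claim call.\n"
        f"Claim details:\n{claim_details}\n\n"
        f"Policy excerpt:\n{policy_excerpt}\n\n"
        f"Knowledge content:\n{kb_snippets}\n\n"
        "Draft a short suggested response for the agent and list the next actions.\n\nAssistant:"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        contentType="application/json",
        body=json.dumps({"prompt": prompt, "max_tokens_to_sample": 400, "temperature": 0.2}),
    )
    return json.loads(response["body"].read())["completion"].strip()
```

The key design point is that the model only reasons over content you supply, which keeps suggestions grounded in your claim data, website details, and policy documents.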
Generative AI can also enhance real-time and post-contact analytics and quality assurance by analyzing all contacts, instead of just a small sample, making it faster for managers to identify insights and confirm that agents are adhering to policies. Generative AI can concisely summarize conversations, reducing the time agents and managers spend taking or reviewing notes and sharing context when transferring contacts. For example, generative AI can condense a long conversation about a cable subscription inquiry to: “the customer cancelled their cable subscription after rejecting a $10 rebate offered by the agent.” LLMs can also give managers further insight into how agent performance is driving business outcomes, and then recommend actions, such as coaching points and agent training, to further improve performance.
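A minimal summarization sketch, assuming a contact transcript you have already exported from your contact center and the same Claude 2 model on Amazon Bedrock, could look like the following; the summarize_contact helper and prompt wording are illustrative.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def summarize_contact(transcript: str) -> str:
    """Illustrative helper: condense a contact transcript into a one- or two-sentence summary."""
    prompt = (
        "\n\nHuman: Summarize this contact center conversation in one or two sentences, "
        "noting the customer's issue, any offers made, and the outcome:\n\n"
        f"{transcript}\n\nAssistant:"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        contentType="application/json",
        body=json.dumps({"prompt": prompt, "max_tokens_to_sample": 150, "temperature": 0.2}),
    )
    return json.loads(response["body"].read())["completion"].strip()

# Example: summarize_contact(cable_subscription_transcript) might return something like
# "The customer cancelled their cable subscription after rejecting a $10 rebate offered by the agent."
```

Because the same call can run after every contact rather than on a sample, analyzing 100 percent of contacts becomes practical.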
Furthermore, you may be looking to optimize self-service experiences to improve call deflection rates and reduce the cost of building automated self-service flows. Generative AI can help here too, by making it easier to understand the complex nuances of a customer’s intent. It can also deliver LLM-powered recommendations for improving contact center configurations, making it easier to design, build, and update self-service experiences.
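One common pattern for this, similar in spirit to the QnABot solution linked at the end of this post, is to let the bot handle the intents it was built for and route unmatched utterances to an LLM. The sketch below shows a hedged example of an Amazon Lex V2 fulfillment Lambda that does this; the ask_llm helper stands in for an LLM call such as the Bedrock examples above, and the fallback handling is an assumption about your bot’s configuration, not a built-in Amazon Connect feature.

```python
def ask_llm(question: str) -> str:
    """Placeholder for an LLM call, for example via Amazon Bedrock as in the earlier sketches."""
    return "..."  # illustrative only

def lambda_handler(event, context):
    """Amazon Lex V2 fulfillment hook: answer with an LLM when no intent was matched."""
    intent = event["sessionState"]["intent"]
    if intent["name"] == "FallbackIntent":  # default name Lex V2 gives the built-in AMAZON.FallbackIntent
        answer = ask_llm(event.get("inputTranscript", ""))
    else:
        answer = "Handled intent: " + intent["name"]  # your normal fulfillment logic goes here

    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent["name"], "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": answer}],
    }
```

Constraining the LLM step to answer only from your approved knowledge content (as the QnABot solution does with Amazon Kendra) keeps the self-service experience predictable.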
Explore what’s possible with generative AI and Amazon Connect
When we launched Amazon Connect in 2017, we built it from the ground up with integrated AI capabilities that help customers like National Australia Bank, Traeger, Accor, and Just Energy realize outcomes like reduced handle times and improved customer satisfaction. We see an opportunity to enhance Amazon Connect’s existing built-in AI capabilities with generative AI to deliver additional business value.
This demo video provides a look at how generative AI can be used for three contact center use cases: agent assistance, manager assistance, and customer self-service.
We’re excited to collaborate with you on the applications of generative AI that will drive real business outcomes for your contact center as you deliver efficient, effective service to your customers.
To dive deeper and learn more about generative AI, check out these additional resources:
- Realize the business value of generative AI in your customer service organization with additional AWS AI-powered solutions
- How Technology Leaders Can Prepare for Generative AI
- How Your Organization Can Prepare for Generative AI
- An introduction to generative AI with Swami Sivasubramanian
- For more information on how you can experiment with the generative AI-powered customer self-service solution shown in the demo video above, see Deploy self-service question answering with the QnABot on AWS solution powered by Amazon Lex with Amazon Kendra and large language models
About the authors:
Mike Wallace leads the Americas Solution Architecture Practice for Customer Experience at AWS.
Andrea Caldwell is a Product Marketing Manager for Amazon Connect at AWS.