Thu, November 23, 2023
In November 2022, ChatGPT – powered by GPT-3.5 – was unleashed on the world. It took just two months to reach 100 million monthly users, making it the fastest-growing tech launch in history by most measures. Since then other Generative AI chatbots have launched, along with the more capable GPT-4.
Anyone who has used one of these Large Language Model (LLM) bots immediately notices two things: Firstly, they understand almost anything you can type or say to them. You never hear that annoying “I’m sorry, can you repeat that” error message you get with Siri or Alexa when you stray away from straightforward commands.
And secondly, these chatbots respond appropriately, in context, and in perfectly natural language (in fact, in any language). OK, they sometimes ‘hallucinate’ and make things up, but don’t all toddlers? It doesn’t take away from the fact that LLM technology is an impressive leap forward and a genuine game changer.
As the capabilities of these models continue to improve and their responses become more human-like and accurate, how are businesses and other organisations going to employ them to help with one of their biggest cost centres – the provision of customer service?
ChatGPT’s ability to understand queries expressed in natural language, grasp the context around them and, importantly, retain that context and understanding throughout an interaction is what makes it so good at conversing with customers.
It’s not just Chat GPT of course. Bard from Google and Claude from Anthropic are just two of the other leading LLM-based chatbots with similar capabilities. Many companies are building specialist chatbots and other services on top of these technologies, targeting specific sectors or use cases. You can even build your own specialist bot simply and quickly (more on that later).
Unlike traditional chatbots with their ability to only understand the limited phrases and scenarios that are programmed into them, LLM bots’ human-like language skills pave the way for more meaningful and productive conversations. From addressing common FAQs to managing complex customer issues, their abilities and their versatility are changing the face of customer service.
LLM chatbots’ conversational abilities enable them to handle high volumes of routine and even moderately complex queries. This frees up human agents to focus on relationship-building with customers and managing the most complex issues.
With a technology this powerful, it becomes more than a trend. When the potential gains include more efficient, personalised and 24/7 customer support, unlocking immense improvements in Customer Experience (CX) as well as massive cost savings, its adoption will become a necessity.
LLMs like ChatGPT, Claude and Bard display a deep comprehension of linguistic nuances, turns of phrase and cultural references. But this is not the main thing that makes them feel almost human to interact with.
The complex neural networks on which LLMs are built are trained on vast datasets. This allows LLM chatbots to develop a keen grasp of interpersonal dynamics, which imbues them with emotional intelligence – or at least the appearance of it, which is enough.
So, in addition to understanding natural language and even slang, and following and retaining the context of a conversation, they also, to all intents and purposes, understand emotions. Importantly, they can respond appropriately to the type of language a customer uses and its emotional content.
By capturing the richness and fluidity of human-to-human interactions, LLMs enable human-level and human-like conversations to occur between bots and customers. That’s not something we could ever say before ChatGPT launched.
For customers, this eliminates friction and enhances satisfaction while probably giving them an answer faster than a human could. No longer do customers have to adapt to robotic speech to make themselves understood.
As these models continue evolving in accuracy and capability, their role in driving seamless, trust-based customer experiences will only grow.
The human-like capabilities of LLMs like ChatGPT and Claude stem from their intricate and lengthy training processes, which are powered by data. Lots of data. By ingesting unfathomable volumes of text, images and even audio-visual content, these AIs grasp the complexity and nuance of language and of human interaction.
Specifically, these models have undergone extensive pre-training on diverse datasets - from casual web content to academic papers. This exposure to varied lexical styles and topics is key to enabling their conversational versatility.
The careful curation and filtering of training data has also been instrumental. By prioritising accurate, high-quality sources that represent different demographics and viewpoints, the models develop a balanced viewpoint of the world and accumulate knowledge, enabling them to provide responses that are contextually relevant and appropriate.
For businesses, an exciting opportunity lies in developing customised LLMs that are aligned to their unique requirements. Tailored training on internal data assets such as customer queries, product catalogues and user manuals can prime models with niche sectorial or company-specific knowledge. You can truly build your own oracle.
While you still must safeguard against AI hallucinations – where the LLM simply makes something up rather than referencing material it was trained on – LLM chatbots are largely capable of boosting the accuracy, relevancy, and speed of responses to customers. That is a game changer for customer service.
The integration of LLMs into various customer service and CX channels marks a significant shift in how businesses interact with customers. AI and automation are, of course, already deeply embedded in the contact centre and other customer-facing channels. LLM chatbots have three broad use cases when it comes to CX:
1. Automate Customer Interactions
In customer-facing roles, ChatGPT serves as a powerful automation tool to improve efficiency and quality of support. A primary application is responding to common customer queries, freeing agents up to handle complex issues and boosting satisfaction.
The types of applications that can be used to interact directly with customers include live chat or interactive IVR. In these cases, the chatbot answers the call or live chat and prompts the customer to describe their problem or query. Unlike traditional chatbots which often get the customer’s reason for making contact wrong, LLM bots get it right more often than not because of their language capabilities and understanding of context. They can therefore match the customer’s statement or question to a pre-defined customer journey, transactional process, or workflow more accurately.
As you can be more confident in the LLM bot’s ability to diagnose the correct problem, you can even implement back office automation, or technology like Robotic Process Automation (RPA), to automate many of the steps needed to solve the customer’s problem once it has been identified.
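As a minimal illustration of that routing step (all intent names and workflow handlers here are hypothetical), the pattern is simple: the LLM classifies the customer’s free-text query into a known intent, and dispatch code hands it to the matching workflow or RPA process, falling back to a human agent when the intent is unrecognised.

```python
# Hypothetical sketch: route an LLM-classified intent to a pre-defined workflow.
# In a real system the intent label would come from the LLM's classification;
# here it is passed in directly to keep the example self-contained.

def start_refund_workflow(query: str) -> str:
    return "refund workflow started"

def start_delivery_workflow(query: str) -> str:
    return "delivery tracking workflow started"

def hand_off_to_agent(query: str) -> str:
    return "handed off to a human agent"

WORKFLOWS = {
    "refund_request": start_refund_workflow,
    "delivery_status": start_delivery_workflow,
}

def route(intent: str, query: str) -> str:
    # Unknown intents fall back to a human agent rather than guessing.
    return WORKFLOWS.get(intent, hand_off_to_agent)(query)

print(route("refund_request", "I want my money back"))  # refund workflow started
print(route("billing_dispute", "My invoice is wrong"))  # handed off to a human agent
```

The fallback is the important design choice: the more confident you are in the LLM’s diagnosis, the more intents you can safely automate, but anything it cannot classify should still reach a person.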
Research from Contact Babel shows that customers generally prefer to solve their own problems using self-service methods like FAQs or chatbots, so if you can enable them to do this more quickly and consistently it will only be a boost for your CX.
2. Act as an Agent Co-pilot
No matter how good Generative AI gets, it is unlikely to ever replace human agents fully in the contact centre or other customer-facing environments. Indeed, as AI takes over more and more routine and simple interactions, those left over for human agents to deal with are likely to be more complex, more urgent, more emotionally taxing for customers, or – and this is crucial – more commercially critical and valuable for the company.
These interactions are likely to involve deep problem-solving and may require agents to undertake lengthy investigations or search for key pieces of information among millions of data points.
ChatGPT, and other LLMs, can help here by acting as a sort of co-pilot for the human staff tasked with handling these interactions and fixing these sorts of problems for customers – for example by drafting responses, summarising long case histories, or surfacing relevant knowledge-base articles during a call.
3. Uncover Strategic Insights
Beyond customer interactions, ChatGPT and other LLM models – because they are based on machine learning and neural networks – can offer valuable insights by processing and analysing huge volumes of customer data to reveal trends, preferences, and potential problems.
This can tell you, quite quickly, which areas of your business need better coverage and which issues are causing dissatisfaction. Proactively addressing these pain points boosts customer retention and guides future product enhancements. It might be as simple as rewriting your FAQs on your website, all the way up to redefining your CX strategy.
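To make the aggregation step concrete: assuming each customer interaction has already been tagged with an issue label by an LLM (the labels and data below are entirely made up), a few lines of code can surface the most common pain points.

```python
from collections import Counter

# Hypothetical issue tags, assumed to have been assigned by an LLM
# classifying each customer interaction transcript.
tagged_interactions = [
    "late_delivery", "billing_error", "late_delivery",
    "password_reset", "late_delivery", "billing_error",
]

# Count occurrences of each tag and report the two most frequent issues.
top_issues = Counter(tagged_interactions).most_common(2)
print(top_issues)  # [('late_delivery', 3), ('billing_error', 2)]
```

The LLM does the hard part – turning messy free text into consistent labels – after which the trend analysis itself is straightforward.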
Insights can also shape targeted communications and marketing campaigns. By understanding customer sentiments deeply, businesses can deliver offerings and experiences exceeding expectations. They can also intervene with customers before problems happen, thus boosting rather than eroding customer satisfaction.
Now that we know what Generative AI technology can potentially do to improve your contact centre and CX, we need to look at how to practically implement it in your business. Multiple approaches exist for frictionless adoption. It all depends on your technical infrastructure, what you want to achieve and, of course, your budget.
1. CCaaS Integration
In the past year, most contact centre technology vendors have begun integrating ChatGPT or similar technologies into their cloud platform offerings. These include customer-facing chat or IVR bots as well as agent-assist technologies such as auto-complete.
If you have an existing Contact Centre as a Service (CCaaS) stack, as your vendor rolls out embedded support for Generative AI you can almost instantly begin to incorporate it in your different channels – following a little training on your company data. This enables next-gen self-service and co-pilot assistance without completely overhauling your tech stack or bolting ad-hoc solutions onto it.
2. API Integration
A more DIY solution is also possible using the APIs (application programming interfaces) that ChatGPT and other LLMs make available. These allow anyone to hook a Generative AI chatbot up to multiple systems and take advantage of its automation capabilities.
A publicly available integration tool like Zapier and a £40-a-month subscription to ChatGPT with GPT-4 may be all you need to build Generative AI capabilities into your existing systems. Workflows can be set up across platforms to trigger ChatGPT to provide information, run reports or do whatever you need it to do. Such flexibility creates scope for efficiency gains across diverse tech environments.
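As a rough sketch of what an API integration involves (the system prompt and company name are illustrative, and error handling is omitted), the core of a call to a chat-style LLM API is a JSON payload like the one built below. You would POST it, with your API key in an Authorization header, to the provider’s chat completions endpoint.

```python
import json

def build_chat_request(system_prompt: str, user_message: str,
                       model: str = "gpt-4") -> dict:
    """Build a request body in the chat-completions style used by LLM APIs."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        # A low temperature keeps customer-facing answers consistent.
        "temperature": 0.2,
    }

payload = build_chat_request(
    "You are a helpful support assistant for Acme Ltd.",  # hypothetical company
    "Where is my order?",
)
print(json.dumps(payload, indent=2))
```

Tools like Zapier wrap this request-building step in a visual interface, but the underlying exchange is the same: a structured prompt goes in, and the model’s reply comes back for your workflow to act on.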
3. Build Your Own Virtual Assistant
OpenAI recently launched custom GPTs, which allow anyone to build their own virtual assistant (VA) like Siri or Alexa but with the language capabilities of an LLM bot. Your custom GPT would be trained exclusively on your company data - ingesting past conversations along with product manuals, policy docs and FAQs.
Once the bot is armed with this knowledge and an understanding of how your business operates, you can deploy it as a standalone application to handle customer queries. Because it has been trained on company materials, it can be customised to speak with your brand voice.
Implementing ChatGPT, or any LLM bot, in your contact centre is not quite as simple as training it, setting it loose, and forgetting about it. You need to set guidelines for it to follow and decide which types of customer interactions you want it to handle, and which you want it to hand off to other systems or to human agents.
Similarly, if you’re just giving your staff access to ChatGPT or a custom GPT to help them respond to emails or chats, you also need to provide your people with guidelines on acceptable use for consistency and quality control.
Configuring ChatGPT’s tone of voice and style of response – formal, casual, or playful – is important to ensure it accurately reflects the personality of your brand so you can build customer rapport. Staff must know what prompts are good for creating the correct style of response.
ChatGPT exchanges must also adhere to the legal, ethical and compliance constraints that apply to all other types of interactions with your customers. From privacy practices to accuracy, particularly in regulated industries, compliance builds critical customer trust and protects your business from reputational risks. It’s a good idea to give your ethics committee an AI remit to oversee your use of the technology, and to ensure your QA team is checking your chatbot’s responses to customers.
It’s far too early to know whether Generative AI chatbots really are the future of CX or not, and exactly what mix of automation versus human interaction customers want or need. While many have a strong preference for solving their own problems via self-service, other demographics prefer the connection they can only get from talking with a fellow human being.
What is clear is that automation is a rare win-win in that it enables a business to both reduce costs while improving the speed, accuracy, and efficiency of the customer experience. For these reasons many businesses and contact centres will embrace it – at least until the next major technology breakthrough comes along.