ChatGPT and Generative AI for Magical CX
Here are excerpts of a candid discussion of how ChatGPT and Generative AI are transforming the enterprise landscape. Our recent webinar explores the pragmatics, risks, and rewards of ChatGPT and Generative AI as panelists discuss specific use cases that are pioneering today’s most salient technology breakthrough. Without question, Generative AI and ChatGPT are changing workflows and transforming the customer and employee experience. Find out how these tools propel self-service, deliver dramatic ROI, reduce support costs, gratify users, and overcome the challenges that accompany dramatic change.
The webinar is moderated by Murali Nemani, CMO of Aisera. The session features insights from guest speakers Scott Owen, VP of Support–Customer Identity Cloud at Okta; Aneel Jael, former SVP of Customer Success at McAfee and prior SVP of Support and Support Account Management at ServiceNow; Jason “JE” English, Partner & Principal Analyst at Intellyx; and Muddu Sudhakar, CEO of Aisera. The panelists dive deep and share war stories about how Generative AI and ChatGPT fit into the enterprise technology stack and address compliance, security, privacy, data integrity, and more.
Why are these issues so vital today?
ChatGPT (Generative Pre-trained Transformer) was made free and publicly available on November 30, 2022, as a proof of concept. It took a mere five days to gain a million users, exploding into phenomenal public focus.
At the macro level, Intellyx’s JE speaks of an “AI fog” in the software market resulting from that quick release, which gave the public their first interaction with LLMs—even if many LLMs are only as useful, accurate, and helpful as the data they are trained on.
What productive value is AI delivering today?
Despite dazzling potential, ChatGPT and Generative AI bring controversies and raise concerns over cybersecurity, legal liability, and inaccurate AI-driven decisions, among others. Nevertheless, Deloitte reports that 82 percent of early AI adopters cite positive ROI. IDC analysts project global AI spending to surpass $300 billion by 2026. Goldman Sachs predicts that up to two-thirds of occupations could be ‘partially replaced’ by AI (referring mainly to the routine toil that consumes employee time servicing repetitive customer tasks).
In fact, McKinsey finds that Generative AI has the potential to automate work activities that currently absorb 60 to 70 percent of employees’ time. That doesn’t mean replacing jobs per se, but rather changing the nature of the work while adding as much as 3.3 percentage points annually to global GDP.
JE notes that executive leadership sees a lot of upside as AI hits an inflection point. Machine learning and development tools are finally within reach, as are the infrastructure and bandwidth needed to pull off development. ChatGPT and Generative AI interfaces are now reaching into almost every type of enterprise software space, whether for BPM, workflows, analytics and so forth.
Lookahead planning and code Copilot capabilities within developer tools make for an ideal programming partner and mentor. Regarding integration and recognition, there are now good models for connecting APIs and integrating real-world information into training and execution. Audio and visual resources are also maturing—starting with documents and reaching into connected personal experiences and the Internet of Things (IoT).
IT Service Management and Operations teams use Generative AI for customer and employee incident reporting, metrics, documentation, and compliance, with the support of major vendors like ServiceNow and Microsoft. Generative AI can also power a variety of solutions on the back end.
Nevertheless, before embarking, enterprises should think ahead about intent and brand impact once they move beyond the experimental stages. Cost savings and productivity gains should not be the primary reason for adoption; enhancing the customer and employee experience should be. Misusing AI can put systems and reputation at risk. A magical customer experience can create customers for life, but failure to deliver can lose their trust.
ChatGPT and Generative AI: Delivering the Magical Customer Experience
Murali framed the intersection between the product, the technology, and the customer service needs in terms of two boardroom priorities: 1) Adopt Generative AI as strategic to the mission of every enterprise; and 2) Lead with Customer Experience as a competitive differentiator.
As those ideas coalesce, they grab the attention of global enterprises. But paradoxically, investing more in customer satisfaction can raise associated costs, while efforts to optimize on cost can erode the very customer satisfaction you’re seeking.
Customer frustration has been a major driver of the effort to get issues and requests resolved efficiently. The lack of personalized interaction with digital systems or voice agents slows down responses and culminates in negative brand-level engagements.
But now, Generative AI has the magical ability to make natural language the new UI for the Customer Experience. Customers can address classic questions around billing, subscriptions and other issues that may be repetitive but are nevertheless very specific and personalized to the user. Order management, tracking, finding status, making changes, and so forth are common but high-value interactions.
Not surprisingly, returns and refunds spark sizeable responses as well. The benefit of making these processes natural, intuitive, and actionable can hardly be quantified. Smooth account access and user controls not only yield quick answers but also help create a self-serve, frictionless enterprise in which users can sign up for new services.
However, achieving such modernization is not about using Generative AI and ChatGPT broadly in open-ended ways. Enterprises must ensure trust, privacy, confidentiality and security around user and customer data. Any solution must adhere to those extremely important elements.
Personalization and context are all about understanding your customer, their background, and the context of their questions. That’s why the “hallucinations” issue is a serious Achilles’ heel, undermining the accuracy of the answers provided. Also important are reinforcement and collective learning: how do these systems learn? Are they manual? Do they require a lot of care and feeding?
When it comes to workflows and integrations, it’s no longer enough to converse; the system must actually drive outcomes. Such resolutions depend heavily on the enterprise back end: ticketing systems, knowledge articles, and so forth.
Then there’s deployment flexibility, which is vital for the public sector, private data clouds, data centers, and so forth. A solution must address all of these areas, and it’s up to practitioners to ensure that it does.
AI Copilot for Customer Service
Muddu and the Aisera team have been executing this broad vision through AI Copilot, an intelligent Virtual Assistant that goes beyond conversation to enable workflows. It delivers the power to pose a request and receive fulfillment easily and accurately. This is the core technology behind AiseraGPT.
Aisera’s particular expertise is building its own domain- and industry-specific LLMs on top of foundation models. The way we speak in IT differs from the language of HR, compliance, or marketing and sales. The ontology and taxonomy of healthcare differ from those of finance or banking. So the solution must understand those respective nuances and then layer them atop customer-specific language. AiseraGPT’s transformational architecture unites these layered models and AI workflows. We deliver all these domain-specific elements out of the box with our Universal Bot, enabling clients to jump-start their engagements.
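To make the layering idea concrete, here is a minimal sketch, in Python, of how industry-level and customer-level vocabulary might be composed into the context a general-purpose model sees. The glossaries, the build_prompt() helper, and the prompt-composition approach are illustrative assumptions for this article, not Aisera’s actual architecture.

```python
# Minimal sketch: layering domain- and customer-specific language onto a
# general-purpose model via prompt composition. Glossaries and helper names
# are hypothetical examples, not Aisera's implementation.

DOMAIN_GLOSSARY = {          # industry-level ontology (e.g., IT service desk)
    "P1": "highest-severity incident requiring immediate response",
    "MFA": "multi-factor authentication",
}

CUSTOMER_GLOSSARY = {        # customer-specific terms layered on top
    "Atlas": "the customer's internal billing portal",
}

def build_prompt(user_question: str) -> str:
    """Compose a prompt that grounds a general model in layered vocabularies."""
    glossary_lines = [
        f"- {term}: {meaning}"
        for term, meaning in {**DOMAIN_GLOSSARY, **CUSTOMER_GLOSSARY}.items()
    ]
    return (
        "You are an IT support assistant. Use these definitions:\n"
        + "\n".join(glossary_lines)
        + f"\n\nQuestion: {user_question}\nAnswer:"
    )

print(build_prompt("Why can't I log in to Atlas after resetting MFA?"))
```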
AiseraGPT addresses digital channels
Omnichannel access for users is paramount; the value of having a conversational engine, plus a workflow resolution engine that connects to the back-end environment (knowledge base, ticketing systems, application systems and so on) can’t be overstated.
Aisera has made this process as intuitive and quick as possible for customers deploying in the customer experience domain, with over 400 prebuilt integrations and more than 1,200 AI workflows ready to use. That eases and speeds the delivery of these magical experiences.
Aisera translates all those capabilities into a stack: a system of digital or voice engagement with multiple entry points. Add your own conversational engine plus your AI workflows; Aisera customers can also easily integrate with their system of record, such as Salesforce or Zendesk.
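As a concrete illustration of connecting a conversational workflow to a system of record, here is a minimal Python sketch that escalates an unresolved conversation into a ticket using Zendesk’s standard ticket-creation endpoint. The subdomain, credentials, and surrounding workflow are hypothetical; this is not Aisera’s integration code.

```python
# Minimal sketch of a workflow step that escalates an unresolved conversation
# into a system of record. Uses Zendesk's standard ticket-creation endpoint
# (POST /api/v2/tickets.json); subdomain and credentials are hypothetical.
import requests

ZENDESK_SUBDOMAIN = "example"          # hypothetical
AGENT_EMAIL = "bot@example.com"        # hypothetical
API_TOKEN = "REDACTED"                 # hypothetical

def escalate_to_ticket(subject: str, transcript: str) -> int:
    """Create a Zendesk ticket from an unresolved conversation; return its ID."""
    resp = requests.post(
        f"https://{ZENDESK_SUBDOMAIN}.zendesk.com/api/v2/tickets.json",
        json={"ticket": {"subject": subject, "comment": {"body": transcript}}},
        auth=(f"{AGENT_EMAIL}/token", API_TOKEN),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["ticket"]["id"]
```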
Aneel on the challenges of building out to large support organizations
During the webinar, Aneel shared his deep experience solving the core problems that accompany internal and external Customer Service in a transformational ChatGPT environment. Choosing the right use cases can achieve between 65 and 80 percent deflection and yield substantial savings. That’s not to mention improving the key metric of customer satisfaction, as people want to make the best use of their time and move on quickly.
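As a rough illustration of what that deflection range can mean financially, here is a back-of-the-envelope sketch. The 65 to 80 percent deflection figures come from the discussion above; the ticket volume and cost-per-ticket values are hypothetical assumptions for illustration only.

```python
# Back-of-the-envelope sketch of what ticket deflection can be worth.
# The deflection rates come from the 65-80 percent range cited above;
# volume and cost figures are hypothetical.
monthly_tickets = 10_000        # hypothetical volume
cost_per_agent_ticket = 15.00   # hypothetical fully loaded cost (USD)

for deflection_rate in (0.65, 0.80):
    deflected = monthly_tickets * deflection_rate
    annual_savings = deflected * cost_per_agent_ticket * 12
    print(f"{deflection_rate:.0%} deflection -> ${annual_savings:,.0f} saved per year")
```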
The unique promise of ChatGPT, says Aneel, lies in its ability to search beyond your known sources, summarize, and acquire knowledge faster. For any system that is SaaS or subscription-based, you can now provide your customers with an excellent guided experience without the need for human intervention. With a universal solution and interface, it doesn’t matter whether your question is related to legal, HR, or elsewhere: you have a single resource for all these knowledge-specific issues.
Muddu addresses the challenge of reaching out securely to knowledge sources, not only within your organization but within third-party systems. Make sure that privacy controls are taken care of, and you can then improve your security posture dramatically. ChatGPT can enable security control while delivering the right content and auditing to avoid potential content leakage.
Okta’s three-year transformational journey
Scott Owen recalls that in late 2021, Okta launched a three-year project to give developer support engineers as much online help as possible. This meant reinventing processes internally, driving more content externally, and addressing support cases that grew year over year: a monolithic task. Nevertheless, they dropped from over 150 case categories down to 27!
Questions from developers included change requests, subscription billing, and feature changes, many of which could be answered with Knowledge Base Articles (KBAs). Pushing out that content created a challenge: they were concerned about not pleasing developer customers.
Okta saw an opportunity for auto-response, giving instant solutions to customers who could not find them or had no time to seek them. Meanwhile, 20 to 30 percent of engineer capacity was being absorbed by ordinary, everyday tasks.
Driving a great adoption experience internally and externally
Okta needed an agile technology partner, and that was Aisera, which ensured that engineers could locate what they needed quickly. Aisera provided the fuel to push content out to Okta’s developer website, lifting content use from under 10 percent to nearly 25 percent. Okta mobilized the knowledge base in its systems and made it actionable for developers. Time to resolution dropped internally, and logged-in views rose 400 percent!
Yet another challenge facing Okta was that technologists often lack the skills to write consumable content. Leveraging technology to create knowledge drafts conserves a significant amount of their capacity and boosts return on investment.
Scott and Aneel represent two sides of the Generative AI coin: one is the external value to customers themselves and what it means to deliver superior CX; the other is the ability to ensure the effectiveness and productivity of systems serving internal users.
Creating our own custom LLMs
Muddu explained how Aisera builds its own LLMs on a foundational model grounded in a customer’s product knowledge. With natural language, Aisera can support whatever interface users are looking for. A customer doesn’t need to teach the system manually; it can address whatever domain, product, and industry apply to that customer.
Scott notes that, like any implementation, it takes time to understand it, get the acronyms right, and fine-tune the technology while setting expectations internally. You need the right ingestion points, interface, and checks and balances to make this a better experience outwardly for customers and inwardly for those using the solution. With Aisera, you have a system of record and intelligence that navigates on behalf of the user. It understands their multiple intents and can break those down, answering them one at a time.
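To illustrate the idea of breaking a compound request into multiple intents and answering them one at a time, here is a minimal sketch. The keyword-based intent detection and canned responses are deliberate simplifications and assumptions; a production assistant would rely on an LLM or a trained intent model rather than keyword matching.

```python
# Minimal sketch of multi-intent handling: split a compound request into
# intents and answer each one at a time. Keyword matching and canned answers
# are illustrative placeholders, not a production approach.
INTENT_HANDLERS = {
    "reset_password": lambda: "I've sent a password-reset link to your email.",
    "billing_status": lambda: "Your latest invoice was paid on the 1st.",
}

INTENT_KEYWORDS = {
    "reset_password": ["password", "locked out", "reset"],
    "billing_status": ["invoice", "billing", "charged"],
}

def detect_intents(message: str) -> list[str]:
    """Return every intent whose keywords appear in the message."""
    text = message.lower()
    return [
        intent
        for intent, keywords in INTENT_KEYWORDS.items()
        if any(k in text for k in keywords)
    ]

def respond(message: str) -> list[str]:
    """Answer each detected intent in turn."""
    return [INTENT_HANDLERS[i]() for i in detect_intents(message)]

print(respond("I'm locked out after a password reset, and was I charged twice?"))
```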
It’s not the time to “Wait and Watch”
In summary, AI can transform customer service support from a transitional role with high turnover into a desirable career. Employees stay in the role longer because they’re achieving successful customer outcomes more frequently.
Aneel believes in making Generative AI part of a holistic digital transformation strategy, because in isolation it may not deliver the outcomes you’re looking for.
The panel doesn’t believe that this is a time to wait and watch; now is the time to act. And good partnership is key to ensuring that this transformative technology is optimally employed.