Principal is a global financial company with nearly 20,000 employees passionate about improving the wealth and well-being of people and businesses. In business for 145 years, Principal is helping approximately 64 million customers (as of Q2 2024) plan, protect, invest, and retire, while working to support the communities where it does business and build a diverse, inclusive workforce.
As Principal grew, its internal support knowledge base expanded considerably. This wealth of content provides an opportunity to streamline access to information in a compliant and responsible way. Principal wanted to use its existing internal FAQs, documentation, and unstructured data to build an intelligent chatbot that could give different roles quick access to the right information. With QnABot on AWS (QnABot) integrated with Microsoft Entra ID access controls, Principal launched an intelligent self-service solution rooted in generative AI. Now, employees at Principal can receive role-based answers in real time through a conversational chatbot interface. The chatbot improved access to enterprise data and increased productivity across the organization.
In this post, we explore how Principal used QnABot paired with Amazon Q Business and Amazon Bedrock to create Principal AI Generative Experience: a user-friendly, secure internal chatbot for faster access to information.
QnABot is a multilanguage, multichannel conversational interface (chatbot) that responds to customers’ questions, answers, and feedback. It allows companies to deploy a fully functional chatbot integrated with generative AI offerings from Amazon, including Amazon Bedrock, Amazon Q Business, and intelligent search services with natural language understanding (NLU), such as Amazon OpenSearch Service and Amazon Bedrock Knowledge Bases. With QnABot, companies have the flexibility to tier questions and answers based on need, from static FAQs to answers generated on the fly from documents, webpages, indexed data, operational manuals, and more.
Amazon Q Business is a generative AI-powered assistant that can answer questions, provide summaries, generate content, and securely complete tasks based on data and information in your enterprise systems. It empowers employees to be more creative, data-driven, efficient, prepared, and productive.
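To make the interaction pattern concrete, the following is a minimal sketch of asking an Amazon Q Business application a question with the ChatSync API through the AWS SDK for Python (Boto3). The application ID and the question are placeholders, not Principal's actual configuration.

```python
import boto3

# Minimal sketch: ask an Amazon Q Business application a question via ChatSync.
# The application ID and the question text are hypothetical placeholders.
qbusiness = boto3.client("qbusiness", region_name="us-east-1")

response = qbusiness.chat_sync(
    applicationId="YOUR_QBUSINESS_APPLICATION_ID",  # placeholder
    userMessage="Summarize our standard RFP response for retirement plan services.",
)

print(response["systemMessage"])  # generated answer
for source in response.get("sourceAttributions", []):
    print("-", source.get("title"), source.get("url"))  # cited enterprise sources
```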
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.
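As an illustration of that single API, the following sketch calls a foundation model through the Amazon Bedrock Converse API. The model ID (Anthropic's Claude 3 Sonnet) and the prompt are illustrative choices, not Principal's production settings.

```python
import boto3

# Minimal sketch: invoke a foundation model through Amazon Bedrock's Converse API.
# The model ID and prompt are illustrative, not Principal's production configuration.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize the key steps in an RFP review process."}],
    }],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```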
Challenges, opportunities, and constraints
Principal team members need insights from vast amounts of unstructured data to serve their customers. This data includes manuals, communications, documents, and other content across various systems like SharePoint, OneNote, and the company’s intranet. The information exists in various formats such as Word documents, ASPX pages, PDFs, Excel spreadsheets, and PowerPoint presentations that were previously difficult to systematically search and analyze. Principal sought to develop natural language processing (NLP) and question-answering capabilities to accurately query and summarize this unstructured data at scale. This solution would allow for greater understanding of a wide range of employee questions by searching internal documentation for responses and suggesting answers, all through a user-friendly interface. The solution had to adhere to compliance, privacy, and ethics regulations as well as brand standards, and it had to use existing compliance-approved responses without additional summarization. It was important for Principal to maintain fine-grained access controls and make sure all data and sources remained secure within its environment.
Principal needed a solution that could be rapidly deployed without extensive custom coding. It also wanted a flexible platform that it could own and customize for the long term. As a leader in financial services, Principal wanted to make sure all data and responses adhered to strict risk management and responsible AI guidelines. This included preventing any data from leaving its source or being accessible to third parties.
The chatbot solution deployed by Principal had to address two use cases. The first use case, treated as a proof of concept, was to respond to customers’ request for proposal (RFP) inquiries. This first use case was chosen because the RFP process relies on reviewing multiple types of information to generate an accurate response based on the most up-to-date information, which can be time-consuming.
The second use case applied to Principal employees in charge of responding to customer inquiries using a vast well of SharePoint data. The extensive amount of data employees had to search to find appropriate answers for customers made the process difficult and time-consuming. It is estimated that these employees collectively spent hundreds of hours each year searching for information, and as the volume and complexity of customer requests grew, costs were projected to rise significantly without a solution to enhance search capabilities.
Principal saw an opportunity for an internal generic AI assistant that would allow employees to use AI in their daily work without risking exposure of sensitive information through unapproved or unregulated external AI vendors.
The solution: Principal AI Generative Experience with QnABot
Principal began its development of an AI assistant by using the core question-answering capabilities in QnABot. Within QnABot, company subject matter experts authored hard-coded questions and answers using the QnABot editor. Principal also used the AWS open source Lex Web UI repository to build a frontend chat interface with Principal branding.
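For context on what such curated content looks like, the following is a minimal sketch of a question-and-answer item in the JSON import format that QnABot's Content Designer accepts for bulk loading. The item ID, sample questions, and answer text are hypothetical, not Principal's actual content.

```python
import json

# Minimal sketch of a QnABot Q&A item in the JSON import format used by the
# Content Designer. The qid, questions, and answer text are hypothetical examples.
faq_items = {
    "qna": [
        {
            "qid": "HR.Benefits.001",  # unique item ID (hypothetical)
            "q": [
                "How do I enroll in the retirement plan?",
                "Where do I sign up for 401(k) contributions?",
            ],
            "a": "You can enroll through the benefits portal under 'Retirement'. "
                 "Contact HR Support if you need help with your election.",
        }
    ]
}

# Write a file that can be imported through the Content Designer UI.
with open("faq_import.json", "w") as f:
    json.dump(faq_items, f, indent=2)
```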
Initially, Principal relied on the built-in capabilities of QnABot, using Anthropic’s Claude on Amazon Bedrock for information summarization and retrieval. When Amazon Q Business was released in preview, Principal integrated QnABot with Amazon Q Business to take advantage of its advanced response aggregation algorithms and more complete AI assistant features. The integration enhanced the solution by providing more human-like interactions for end users.
Principal implemented several measures to improve the security, governance, and performance of its conversational AI platform. By integrating QnABot with Azure Active Directory, Principal enabled single sign-on and role-based access controls, allowing fine-grained management of user access to content and systems. Generative AI models (for example, Amazon Titan) hosted on Amazon Bedrock were used for query disambiguation and semantic matching for answer lookups and responses.
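One way such semantic matching can work is to embed the user's query and the stored questions with an Amazon Titan embeddings model on Amazon Bedrock and compare them by cosine similarity. The following is a minimal sketch under that assumption; the model ID, example questions, and matching logic are illustrative rather than Principal's actual implementation.

```python
import json
import math
import boto3

# Minimal sketch: semantic matching of a user query against stored questions
# using Amazon Titan Text Embeddings on Amazon Bedrock. The model ID and
# example questions are illustrative only.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> list[float]:
    """Return the embedding vector for a piece of text."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

stored_questions = [
    "How do I reset my SharePoint password?",
    "Where can I find the latest RFP response templates?",
]
stored_embeddings = [embed(q) for q in stored_questions]

query = "I forgot my SharePoint login, how do I recover it?"
query_embedding = embed(query)

scores = [cosine(query_embedding, e) for e in stored_embeddings]
best = max(range(len(scores)), key=scores.__getitem__)
print(f"Best match: {stored_questions[best]} (score={scores[best]:.3f})")
```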
Usability and continual improvement were top priorities, and Principal enhanced the standard user feedback from QnABot to gather input from end users on answer accuracy, outdated content, and relevance. This input made it straightforward for administrators and developers to identify and improve answer relevancy. Custom monitoring dashboards in Amazon OpenSearch Service provided real-time visibility into platform performance. Additional integrations with services like Amazon Data Firehose, AWS Glue, and Amazon Athena enabled historical reporting, user activity analytics, and sentiment trends over time through Amazon QuickSight.
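As a rough illustration of that historical reporting path, the sketch below runs an Amazon Athena query over feedback events that Amazon Data Firehose could deliver to Amazon S3 and AWS Glue could catalog. The database, table, and column names are hypothetical placeholders, not the actual schema used by QnABot or Principal.

```python
import boto3

# Minimal sketch: weekly feedback report via Amazon Athena. The database,
# table, and columns are hypothetical placeholders for a Glue-cataloged table
# populated from Amazon Data Firehose exports.
athena = boto3.client("athena", region_name="us-east-1")

query = """
SELECT date_trunc('week', from_iso8601_timestamp(event_time)) AS week,
       feedback,                      -- for example 'thumbs_up' / 'thumbs_down'
       count(*) AS events
FROM qnabot_feedback                  -- hypothetical table name
GROUP BY 1, 2
ORDER BY 1
"""

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "chatbot_analytics"},              # hypothetical
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},    # placeholder
)
print("Started query:", execution["QueryExecutionId"])
```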
Adherence to responsible and ethical AI practices was a priority for Principal. The Principal AI Enablement team, which built the generative AI experience, consulted with governance and security teams to make sure security and data privacy standards were met. Model monitoring of key NLP metrics was incorporated, and controls were implemented to prevent unsafe, unethical, or off-topic responses. The flexible, scalable nature of AWS services makes it straightforward to continually refine the platform through improvements to the machine learning models and the addition of new features.
The initial proof of concept was deployed in a preproduction environment within 3 months. The first data source connected was an Amazon Simple Storage Service (Amazon S3) bucket, where a 100-page RFP manual was uploaded for natural language querying by users. The data source allowed accurate results to be returned based on indexed content.
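A minimal sketch of connecting content this way: upload the document to the S3 bucket behind the data source, then start a sync job so Amazon Q Business indexes it. The bucket name, application ID, index ID, and data source ID are placeholders.

```python
import boto3

# Minimal sketch: upload a document to the S3 bucket behind an Amazon Q Business
# data source, then trigger an index sync. All names and IDs are placeholders.
s3 = boto3.client("s3")
qbusiness = boto3.client("qbusiness", region_name="us-east-1")

s3.upload_file("rfp-manual.pdf", "my-chatbot-source-bucket", "manuals/rfp-manual.pdf")

qbusiness.start_data_source_sync_job(
    applicationId="YOUR_QBUSINESS_APPLICATION_ID",   # placeholder
    indexId="YOUR_INDEX_ID",                         # placeholder
    dataSourceId="YOUR_S3_DATA_SOURCE_ID",           # placeholder
)
```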
The first large-scale use case directly interfaced with SharePoint data, indexing over 8,000 pages. The Principal team partnered with the Amazon Q Business data connector developers to implement improvements to the SharePoint connector, including the ability to index pages in SharePoint lists and additional data security features. The use case was piloted with 10 users for 1 month, while working to onboard an additional 300 users over the next 3 months.
During the initial pilot, the Principal AI Enablement team worked with business users to gather feedback. The first round of testers needed more training on fine-tuning their prompts to improve returned results. The enablement team took this feedback and partnered with training and development teams to design learning plans that help new users gain proficiency with the AI assistant more quickly. The goal was to onboard future users faster through improved guidance on how to frame questions for the assistant, along with additional coaching resources for those who needed more help learning the system.
Some users lacked access to corporate data, but they could still use the platform as a generative AI chatbot, through what was called the initial generic entitlement, to securely attach internal-use documentation and query it in real time, or to ask questions of the model’s foundational knowledge, without risk of data leaving the tenant. Queries from users were also analyzed to identify beneficial future features to implement.
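An illustrative sketch of that attach-and-query pattern: pass the file bytes as an attachment to the Amazon Q Business ChatSync call so the assistant can answer questions about the document within the session. The application ID, file name, and question are placeholders.

```python
import boto3

# Minimal sketch: attach an internal-use document to a chat request so the
# assistant can answer questions about it in the current session.
# The application ID and file name are placeholders.
qbusiness = boto3.client("qbusiness", region_name="us-east-1")

with open("project-charter.docx", "rb") as f:
    document_bytes = f.read()

response = qbusiness.chat_sync(
    applicationId="YOUR_QBUSINESS_APPLICATION_ID",   # placeholder
    userMessage="List the key milestones and owners described in this charter.",
    attachments=[{"name": "project-charter.docx", "data": document_bytes}],
)
print(response["systemMessage"])
```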
The following diagram illustrates the Principal generative AI chatbot architecture with AWS services.
Principal started by deploying QnABot, which draws on numerous services, including Amazon Bedrock, Amazon Q Business, QuickSight, and others. AWS services are high-performing, secure, scalable, and purpose-built; they are designed to meet specific industry, cross-industry, and technology use cases and are developed, maintained, and supported by AWS. AWS solutions (for example, QnABot) bring together AWS services into preconfigured deployable products, with architecture diagrams and implementation guides, simplifying the deployment of optimized infrastructure tailored to customer use cases.
Principal strategically worked with the Amazon Q Business and QnABot teams to test and improve the Amazon Q Business conversational AI platform. The QnABot team worked closely with the Principal AI Enablement team on the implementation of QnABot, helping to define and build out capabilities to meet the incoming use cases. As an early adopter of Amazon Q Business, engineers from Principal worked directly with the Amazon Q Business team to validate updates and new features. When Amazon Q Business became generally available, Principal collaborated with the team to implement the integration of AWS IAM Identity Center, helping to define the process for IAM Identity Center implementation and software development kit (SDK) integration. The results of the IAM Identity Center integration were contributed back to the QnABot Amazon Q Business plugin repository so other customers could benefit from this work.
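The general shape of an IAM Identity Center integration like this is trusted identity propagation: exchange the corporate identity provider's token for an IAM Identity Center token, assume a role with the resulting identity context, and call Amazon Q Business as the end user. The sketch below is heavily hedged; the client ID, role ARN, IdP token, and claim handling are assumptions and placeholders, not the specific process Principal and AWS defined, so consult the current AWS documentation for the exact flow.

```python
import base64
import json
import boto3

# Hedged sketch of identity-aware Amazon Q Business calls using IAM Identity
# Center trusted identity propagation. IDs, ARNs, and the IdP token are
# placeholders; details of the flow are assumptions.
oidc = boto3.client("sso-oidc", region_name="us-east-1")
sts = boto3.client("sts", region_name="us-east-1")

# 1. Exchange the ID token from the corporate IdP for an IAM Identity Center token.
token = oidc.create_token_with_iam(
    clientId="arn:aws:sso::111122223333:application/ssoins-example/apl-example",  # placeholder
    grantType="urn:ietf:params:oauth:grant-type:jwt-bearer",
    assertion="<ID_TOKEN_FROM_IDP>",  # placeholder
)

# 2. Extract the identity context claim from the returned ID token (a JWT).
payload_segment = token["idToken"].split(".")[1]
claims = json.loads(
    base64.urlsafe_b64decode(payload_segment + "=" * (-len(payload_segment) % 4))
)
identity_context = claims["sts:identity_context"]  # assumed claim name

# 3. Assume a role with that identity context so calls run as the end user.
session = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/qbusiness-chat-role",  # placeholder
    RoleSessionName="qnabot-user-session",
    ProvidedContexts=[{
        "ProviderArn": "arn:aws:iam::aws:contextProvider/IdentityCenter",
        "ContextAssertion": identity_context,
    }],
)

# 4. Call Amazon Q Business with the identity-aware credentials.
creds = session["Credentials"]
qbusiness = boto3.client(
    "qbusiness",
    region_name="us-east-1",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
answer = qbusiness.chat_sync(
    applicationId="YOUR_QBUSINESS_APPLICATION_ID",  # placeholder
    userMessage="What is our policy for responding to RFP security questionnaires?",
)
print(answer["systemMessage"])
```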
Results
The initial proof of concept was highly successful in driving efficiencies for users. It achieved an estimated 50% reduction in time required for users to respond to client inquiries and requests for proposals. This reduction in time stemmed from the platform’s ability to search for and summarize the data needed to quickly and accurately respond to inquiries. This early success demonstrated the solution’s effectiveness, generating excitement within the organization to broaden use cases.
The initial generic entitlement option allowed users to attach files to their chat sessions and dynamically query that content. This option proved popular because of the large productivity gains it enabled across various roles, including project management, enterprise architecture, communications, and education. Users who interact with the application in their daily work have received it well, and some have reported up to a 50% reduction in time spent on rote work. Removing that routine work allows employees to focus on human judgment-based and strategic decisions.
The platform has delivered strong results across several key metrics. Over 95% of queries received answers that users accepted or built upon, with only 4% of answers receiving negative feedback. Of the queries earning negative feedback, less than 1% involved answers or documentation deemed irrelevant to the original question. Over 99% of documents provided through the system were evaluated as relevant and containing up-to-date information. 56% of total queries were addressed either by sourcing documentation related to the question or by having the user attach a relevant file through the chat interface; the remaining queries were answered from the foundational knowledge built into the platform or from the current session’s chat history. These results indicate that users benefit from both Retrieval Augmented Generation (RAG) functionality and Amazon Q Business foundational knowledge, which together provide helpful responses.
The positive feedback validates the application’s ability to deliver timely, accurate information to users, optimizing processes and empowering employees with data-driven insights. Metrics indicate a high level of success in delivering the right information, reducing time spent on client inquiries and tasks and producing significant savings in hours and dollars. The platform effectively and quickly resolves issues by surfacing relevant information faster than manual searches, improving processes, productivity, and the customer experience. As usage expands, it is expected that the benefits will multiply for both users and stakeholders. This initial proof of concept provides a strong foundation for continued optimization and value, with potential expansion to maximize benefits for Principal employees.
Roadmap
The Principal AI Enablement team has an ambitious roadmap for 2024 focused on expanding the capabilities of its conversational AI platform. There is a commitment to scale and accelerate the development of generative AI technology to meet the growing needs of the enterprise.
Numerous use cases are currently in development by the AI Enablement team. Many future use cases are expected to use the Principal AI Generative Experience application because of its success in automating processes and empowering users with self-service insights. As adoption increases, ongoing feature additions will further strengthen the platform’s value.
At Principal, the roadmap reflects a commitment to continual innovation, which will drive further optimization of operations and workflows and could create new opportunities to enhance customer and employee experiences through advanced AI applications.
Principal is well positioned to build upon early successes by fulfilling its vision. The roadmap provides a strategic framework to maximize the platform’s business impact and differentiate solutions in the years ahead.
Conclusion
Principal used QnABot on AWS paired with Amazon Q Business and Amazon Bedrock to deliver a generative AI experience for its users, reducing manual time spent on client inquiries and tasks and producing significant savings in hours and dollars. Using generative AI, Principal’s employees can now focus on deeper, human judgment-based decisions instead of manually scouring data sources for answers. Get started with QnABot on AWS, Amazon Q Business, and Amazon Bedrock.
About the Authors
Ajay Swamy is the Global Product Leader for Data, AI/ML, and Generative AI AWS Solutions. He specializes in building AWS Solutions (production-ready software packages) that deliver compelling value to customers by solving for their unique business needs. In addition to QnABot on AWS, he manages Generative AI Application Builder, Enhanced Document Understanding, Discovering Hot Topics using Machine Learning, and other AWS Solutions. He lives with his wife (Tina) and dog (Figaro) in New York, NY.
Dr. Nicki Susman is a Senior Machine Learning Engineer and the Technical Lead of the Principal AI Enablement team. She has extensive experience in data and analytics, application development, infrastructure engineering, and DevSecOps.
Joel Elscott is a Senior Data Engineer on the Principal AI Enablement team. He has over 20 years of software development experience in the financial services industry, specializing in ML/AI application development and cloud data architecture. Joel lives in Des Moines, Iowa, with his wife and five children, and is also a group fitness instructor.
Bob Strahan is a Principal Solutions Architect on the AWS Generative AI Innovation Center team.
Austin Johnson is a Solutions Architect, maintaining the Lex Web UI open source library.
The subject matter in this communication is educational only and provided with the understanding that Principal is not endorsing or necessarily recommending the use of artificial intelligence. You should consult with appropriate counsel, compliance, and information security for your business needs.
Insurance products and plan administrative services provided through Principal Life Insurance Company, a member of the Principal Financial Group, Des Moines, IA 50392.
© 2024, Principal Financial Services, Inc.
3778998-082024