Beyond ChatGPT: Generative AI for knowledge management at work
Soon after ChatGPT and generative AI took the world by storm just under a year ago, our inboxes exploded with questions, speculation, and trepidation about what this could mean for business. In what seemed like an instant, client inquiries about Turnberry’s AI capabilities and solutions skyrocketed. Our Data & Insights and Digital Modernization practices have collaboratively developed and delivered data-powered solutions for clients for over two decades – even so, the flood of big questions from our clients and consultants about the future of AI-driven products at work was eye-opening.
Alongside this undeniable excitement about the implications of and opportunities from AI, our company was grappling with growing pains from several recent acquisitions. We had greater access to internal knowledge, expertise, and solution accelerators than ever before – but lacked efficient, user-friendly discovery capabilities for navigating and leveraging the mass of information squirreled away in various document repositories. Thus began our journey to apply generative AI to our internal knowledge management challenges.
A new retrieval-augmented generation (RAG)-based web tool, a valuable outcome of this journey, is now beginning to transform the way we access and use information at Turnberry. It has been a foundational (and fun!) stepping stone in our journey to reshape knowledge management, and a key pillar for the future of work with AI. Given the tremendous success of our partners in this space – such as Lucidworks and its Fusion platform – we considered leveraging and extending an existing, proven tool to meet our needs. However, having already leveraged that platform for clients, we also wanted to exercise our own capabilities by building and deploying from the ground up, learning in the open along the way.
Where we started
Turnberry Labs, our company’s internal innovation incubator, became the birthplace of this RAG platform. We recognized the need for a more efficient, effective knowledge management system – so we got to work. Our goals were two-fold: produce a solution for revolutionized knowledge management while also learning the ins and outs of building a real-life RAG-based application from scratch. We’ve experimented with LangChain and its huge, growing collection of building blocks for RAG and LLM applications, vector databases, Azure Cognitive Search, Lucidworks Fusion, OpenAI’s LLMs, and a variety of other related technologies that can make up these solutions. We’ve also wrestled with what our usual SDLC best practices – CI/CD and automated deployment, immutable and code-declared infrastructure, cloud-native preferences, and test-driven mantras – look like in this wild new world of tools. We have accumulated a mountain of know-how and produced something of huge value in the process. Win-win!
Like ChatGPT, Turnberry’s proprietary RAG platform provides a conversational interface where a user can pose natural-language questions and requests for content. Unlike ChatGPT, whose knowledge is limited to web-based information available only up to a static point in the past, Turnberry’s RAG platform first semantically searches our company’s internal content about solutions, services, niche capabilities, and more. It then feeds this content to the LLM (à la core ChatGPT) to leverage the model’s ability to reason over and generate content from our own company’s information. The interface provides not only the groundbreaking conversational response style typical of ChatGPT, but also the precise text provided as context to the LLM. It also includes citations for all documents used in the response, with direct links to access those documents at their source. No more clicking through 18 layers in SharePoint only to find this is not the folder you are looking for. Ultimately, this tool offers the same great “intelligent conversation” capability we have come to expect from ChatGPT, but with intelligence underpinned by our internal corporate knowledge, and a direct one-stop link to those knowledge sources.
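To make that concrete, here is a minimal sketch of what a single response might look like as a data structure. The names and fields here are hypothetical illustrations, not the platform’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Citation:
    """One internal document used as context for the answer."""
    title: str    # document name as it appears in the repository
    url: str      # direct link back to the document at its source
    snippet: str  # the exact text that was passed to the LLM as context

@dataclass
class RagResponse:
    """Shape of one conversational answer from a RAG platform."""
    answer: str                                      # the LLM-generated reply
    citations: list[Citation] = field(default_factory=list)
```

The key design choice is that every answer carries its sources with it – the user never has to take the LLM’s word for anything, because each claim can be traced one click back to the document it came from.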
Supporting the front-end user interface, we have also developed mechanisms to curate content into focused, user-defined “topics.” Users can register the key content they want included in retrieval while leaving it where it currently resides; when that content changes, all RAG-supporting repositories are kept up to date with no manual intervention.
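One simple way to implement that kind of hands-off synchronization is to re-embed only the documents whose content has actually changed. The sketch below assumes a hypothetical `vector_store` interface with `get_hash` and `upsert` methods and an `embed` function – stand-ins for whatever vector database and embedding model you deploy:

```python
import hashlib

def sync_topic(topic_docs, vector_store, embed):
    """Re-embed and upsert only the documents that changed since the last sync.

    topic_docs yields (doc_id, text) pairs from a user-curated topic;
    vector_store and embed are placeholders for your chosen stack.
    """
    for doc_id, text in topic_docs:
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        # Skip documents whose stored content hash is unchanged.
        if vector_store.get_hash(doc_id) == digest:
            continue
        vector_store.upsert(doc_id, embed(text), metadata={"hash": digest})
```

Run on a schedule or triggered by repository change events, a loop like this keeps the search index current without anyone having to re-upload documents by hand.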
Fundamentally, the skeleton of a RAG solution boils down to three steps, sketched in code after this list:
- Retrieval: When the system receives a prompt or question, it first queries a dataset of documents or knowledge sources to find relevant information. This retrieval is often done using dense vector search, where documents and queries are embedded in the same high-dimensional space and a nearest-neighbor search finds the most relevant documents.
- Augmentation: The retrieved documents are then provided to the language model as additional context. This allows the model to incorporate or reference the retrieved information in its responses.
- Generation: The language model generates a response based on the combination of the original prompt and the information retrieved from the external sources.
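As a minimal sketch of these three steps – assuming stand-in `embed` and `llm` callables for whichever embedding model and LLM you deploy, and NumPy for the similarity math – the whole loop fits in a few lines:

```python
import numpy as np

def rag_answer(question, doc_texts, doc_vectors, embed, llm, k=3):
    """Minimal retrieve-augment-generate loop over a small document set."""
    # Retrieval: embed the question and rank all documents by cosine
    # similarity in the shared embedding space, keeping the top k.
    q = np.asarray(embed(question))
    sims = (doc_vectors @ q) / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q)
    )
    top_k = np.argsort(sims)[::-1][:k]

    # Augmentation: splice the retrieved passages into the prompt as context.
    context = "\n\n".join(doc_texts[i] for i in top_k)
    prompt = (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

    # Generation: the LLM answers from the combined prompt and context.
    return llm(prompt)
```

In production, the brute-force cosine similarity above is replaced by a proper vector database or search index – this is where tools like Azure Cognitive Search or Lucidworks Fusion come in – but the retrieve-augment-generate shape stays the same.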
What we’ve seen
We have an enormous amount of internal knowledge and expertise scattered across a myriad of repositories, each with its own hierarchy. To stay relevant, competitive, and ahead of the curve, we need to leverage this knowledge. Employees using Turnberry’s RAG platform now spend less time searching for pre-existing content and more time strategizing about how to leverage it effectively for new client solutions. We are effectively creating “semantic guided discovery” for all of Turnberry’s proprietary documentation and learnings from our past engagements. By leveraging this exciting new platform, Turnberry’s consultants – and entire practices – get better with each new engagement, from baseline documentation to code repositories.
We have learned that, just like us, many of our client partners do not have a strong grasp of internal knowledge management. Solutions like a proprietary, employee-facing RAG platform can bridge the knowledge gap and enhance efficiency in knowledge-based decision-making – and with our hard-earned learnings, we can help clients get there relatively quickly too. Harnessing generative AI for internal knowledge management makes employees’ knowledge base wider and their jobs easier. Instead of being unintentionally forced into reinventing the wheel, they are empowered to leverage one another’s best work to produce the best possible outcomes more efficiently.
What’s next
We will keep developing our RAG platform, getting a better handle on wrangling content and enhancing its capabilities and AI algorithms, while continuing to leverage best-of-breed partner technologies in the market, like Lucidworks. As we expand our own capabilities internally – and grow the number and kinds of users who can benefit from them – we learn more about how this game-changing tool can best benefit us. Further, while we have made considerable progress, there is still much to learn about how best to handle content suppression and access control, result accuracy and hallucination reduction, cost control, and more.
We are also champing at the bit to continue extending our tool’s benefits to our clients, as we have done with the Lucidworks platform, customizing it to meet clients’ unique knowledge management needs. Our ongoing commitment is to harness AI’s potential for innovative solutions and more efficient knowledge management.
Tools like these pave the way for a new working world, where employees benefit from:
- Augmented decision-making: AI will not replace humans but will augment their decision-making processes. Technologies like RAG will become even more intelligent, providing data-driven insights to support human judgment.
- Personalized learning: AI will enable personalized learning experiences, allowing employees to acquire new skills and knowledge in a way that suits their individual needs and pace.
- Enhanced productivity: As AI systems become more integrated into the workplace, productivity gains will be substantial. Mundane tasks will be automated, freeing up employees to focus on higher-value activities.
- Predictive capabilities: AI will increasingly assist us in predicting future trends, helping organizations prepare for what is to come. This will be particularly valuable in industries that face constant change and disruption.
- Prescriptive curated content: With each new project, engagement, or implementation, not only will Turnberry add to its knowledge base – but our clients eventually will too. Employees and managers will feel a new level of accomplishment knowing that their work not only serves the immediate task at hand, but also enriches the collective knowledge of their organization.
The emergence of ChatGPT and related generative AI technologies has had a significant impact on how we approach problem-solving and knowledge management in the business world. While we continue to iterate on new AI solutions within Turnberry Labs and for clients, we are also excited about the rapidly evolving landscape, and we will continue to innovate with deliberate execution for our clients and ourselves.