In today’s complex, geographically dispersed organizations, with remote teams and a sprawl of knowledge systems, tracking down crucial data across the enterprise knowledge ecosystem has become a formidable task. Employees feel the effects directly: lost productivity and waning engagement across the workforce.
During the recent VB Spotlight event, “The Impact of Generative AI on Enterprise Search: A Game-Changer for Businesses,” Phu Nguyen, head of digital workplace at Pure Storage, underscored the cost of this problem: employees grow frustrated when they can’t locate the information they need, and both engagement and productivity suffer.
To shed light on potential solutions, the event brought together industry experts including Jean-Claude Monney, a digital workplace, technology and knowledge management advisor, and Eddie Zhou, founding engineer specializing in intelligence at Glean. The panelists discussed a new generation of workplace-specific search tools that harness generative AI to give employees access to the knowledge they need, along with its context, wherever it lives in the organization.
By leveraging generative AI, organizations can overcome the limitations of traditional search methods and unlock a wealth of information that was previously challenging to navigate. This transformative technology enables employees to swiftly and efficiently access the precise knowledge they need, empowering them to make informed decisions and perform their tasks effectively.
Moreover, the contextual understanding offered by generative AI allows users to grasp the interconnectedness of information across different departments and teams. This comprehensive perspective fosters collaboration, facilitates cross-functional problem-solving, and encourages knowledge sharing within the organization.
The adoption of generative AI-powered search tools marks a significant leap forward in the quest to streamline knowledge access within enterprises. As organizations embrace this game-changing technology, they can alleviate the frustration experienced by employees, enhance productivity, and ultimately drive higher levels of engagement throughout the workforce.
The evolution of enterprise search
Traditional enterprise search can’t reach all the knowledge in an organization, because that knowledge is spread across multiple systems. It can mine structured knowledge, such as the data found in Jira, Confluence, intranets and sales portals, but unstructured knowledge, the information shared over IM, Teams, Slack and email, has been uncharted territory, difficult to corral in any helpful contextual way, Nguyen adds.
“The paradigm of knowledge management has changed significantly,” he says. “How do you have a system that can look at both structured and unstructured data and provide you with the answers that you’re ultimately looking for? Not the information that you need, but the answer that you’re looking for.”
Solutions that integrate with multiple systems and utilize generative AI can address these challenges, and help employees find the information they need to perform their jobs effectively, no matter where that knowledge resides.
“Companies are now building searches specifically for the workplace, built for internal searches that work across your internal system,” Nguyen explains. “Most importantly, they’re built on a knowledge graph that returns a search that’s more relevant to your employees. This is all very exciting for us because we think of this as part of our employee information center strategy. Previously it was just an intranet and our support portal, but now we have this workplace search that can connect information across multiple systems inside our organization.”
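As a rough sketch of what such a cross-system search layer involves, the snippet below indexes documents from a structured source and a chat archive into one corpus and ranks them against a single query. The source names, fields and scoring here are illustrative assumptions, not any particular product’s implementation, and a real knowledge graph would go well beyond this simple keyword matching.

```python
from dataclasses import dataclass

@dataclass
class Document:
    """One unit of workplace knowledge, regardless of which system it lives in."""
    doc_id: str
    source: str        # e.g. "confluence", "jira", "slack" (illustrative names)
    title: str
    body: str
    structured: bool   # structured system of record vs. free-form chat or email

class WorkplaceIndex:
    """Toy unified index over structured and unstructured sources."""
    def __init__(self) -> None:
        self.docs: list[Document] = []

    def ingest(self, doc: Document) -> None:
        self.docs.append(doc)

    def search(self, query: str, limit: int = 5) -> list[Document]:
        # Naive relevance: count how often query terms appear in title + body.
        terms = query.lower().split()
        scored = []
        for doc in self.docs:
            text = f"{doc.title} {doc.body}".lower()
            score = sum(text.count(t) for t in terms)
            if score > 0:
                scored.append((score, doc))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [doc for _, doc in scored[:limit]]

index = WorkplaceIndex()
index.ingest(Document("C-1", "confluence", "VPN setup", "How to configure the corporate VPN", True))
index.ingest(Document("S-9", "slack", "#it-help thread", "The VPN cert expired; renew it in the portal", False))
print([d.doc_id for d in index.search("vpn cert")])   # both sources surface in one result set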
How organizations can leverage generative AI
There are three major ways companies can leverage generative AI, and they’re game changers, Monney says. First are the benefits that a natural language interface brings.
“Time to knowledge is a new business currency,” says Monney. “What we’ve seen with generative AI is this quantum leap in user experience. ChatGPT has democratized ways to talk to a system and get very succinct responses.”
At home, users have grown accustomed to the ease and convenience of natural language interfaces like Alexa and Siri; generative AI brings that user experience to the workplace, giving workers not just an enterprise search tool, but a digital knowledge assistant, he adds. It enables employees to find not just information but precise answers quickly, boosting productivity and efficiency, especially in complex decision-making scenarios. Generative AI also has the potential to go beyond answering individual questions and assist in more complex decision journeys, providing users with synthesized and relevant information without the need for explicit queries.
Generative AI can also automate repetitive tasks and streamline workflows. For example, chatbots powered by generative AI can handle customer service inquiries, make product recommendations or assist with booking appointments, freeing time for more complex tasks and greatly increasing productivity.
Lastly, these generative AI solutions can be precisely refined for industry-specific and case-specific use. Companies can add their own corpus of knowledge to the large language models that generative AI uses, improving relevance and shortening time to knowledge.
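The sketch below illustrates one common way of pairing a company’s own corpus with a general-purpose model: retrieve the most relevant internal passages and prepend them to the prompt before generation. The corpus contents, the toy retriever and the prompt wording are assumptions for illustration only; the final call to an LLM endpoint is left out because it depends on whichever provider an organization actually uses.

```python
import re

def tokens(text: str) -> set[str]:
    return set(re.findall(r"[a-z0-9$]+", text.lower()))

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[tuple[str, str]]:
    """Toy retriever: rank corpus entries by term overlap with the query."""
    q = tokens(query)
    scored = sorted(corpus.items(), key=lambda item: len(q & tokens(item[1])), reverse=True)
    return [(name, text) for name, text in scored[:k] if q & tokens(text)]

def build_prompt(query: str, corpus: dict[str, str]) -> str:
    """Ground the model in company knowledge by prepending retrieved passages."""
    context = "\n".join(f"[{name}] {text}" for name, text in retrieve(query, corpus))
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

# Hypothetical company corpus; in practice this comes from the connected systems.
corpus = {
    "expense-policy": "Meals on business trips are reimbursed up to $75 per day.",
    "travel-guide": "Book flights through the internal travel portal at least 14 days ahead.",
}
prompt = build_prompt("What is the limit for meals per day on a business trip?", corpus)
print(prompt)
# The assembled prompt would then go to whichever LLM endpoint the organization uses.
```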
Bringing generative AI into the workplace
“To bring this technology into the workplace is not an easy thing,” Zhou cautions. It requires a knowledge model, which is composed of three pillars. The first is company knowledge and context. An off-the-shelf model or system, without being properly connected to the right knowledge and the right data, will not be functional, correct, or relevant.
“You need to build generative AI into a system that has the company knowledge and context,” he explains. “That allows for this trusted knowledge model to form out of the combination of these things. Search is one such method that can deliver this company knowledge and context, in conjunction with generative AI. But it’s one of several.”
The second pillar of the trusted knowledge model is permissioning and data governance: being aware, as a user interacts with the product and the system, of what information they should and should not have access to.
“We speak of knowledge in the company as if it’s free-flowing currency, but the reality is that different users and different employees in a company have access to different pieces of knowledge,” he says. “That’s objective and clear when it comes to documents. You might be part of a group alias which has access to a shared drive, but there are plenty of other things that a given person should not have access to, and in the generative setting it’s incredibly important to get this right.”
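A minimal way to respect those boundaries is to enforce each document’s access control list before anything reaches the retrieval results or the generative model, as in the hypothetical sketch below. The group names and the shape of the ACL are assumptions, not any specific product’s permission model.

```python
from dataclasses import dataclass

@dataclass
class SecureDoc:
    doc_id: str
    text: str
    allowed_groups: frozenset[str]   # ACL mirrored from the source system

def visible_to(user_groups: set[str], docs: list[SecureDoc]) -> list[SecureDoc]:
    """Drop anything the current user is not entitled to see,
    before retrieval results ever reach the generative model."""
    return [d for d in docs if user_groups & d.allowed_groups]

docs = [
    SecureDoc("wiki-1", "Engineering on-call rotation", frozenset({"eng", "all-staff"})),
    SecureDoc("hr-7", "Executive compensation bands", frozenset({"hr-leadership"})),
]
print([d.doc_id for d in visible_to({"eng", "all-staff"}, docs)])   # -> ['wiki-1']
```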
The third and final pillar is referenceability. As the product interface has evolved, users need to build trust with the system and be able to verify where it is pulling information from.
“Without that kind of provenance, it’s hard to build trust, and it can lead to runaway factuality errors and hallucinations,” he says, especially in an enterprise system where each user is accountable for their decisions.
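One way to make that provenance concrete is to carry source references alongside every generated answer, as in the illustrative sketch below. The data shapes and the synthesis step are assumptions standing in for a real generation pipeline, which would call an LLM rather than concatenate snippets.

```python
from dataclasses import dataclass

@dataclass
class Citation:
    doc_id: str
    source: str     # system the passage came from, e.g. "confluence" (illustrative)
    snippet: str

@dataclass
class Answer:
    text: str
    citations: list[Citation]   # every claim stays traceable to its source

def answer_with_provenance(question: str, passages: list[Citation]) -> Answer:
    """Stand-in for generation: a real system would have an LLM synthesize the
    passages, but the references are carried through to the user either way."""
    synthesized = " ".join(p.snippet for p in passages)
    return Answer(text=synthesized, citations=passages)

passages = [Citation("policy-3", "confluence", "Unused PTO carries over, up to 5 days per year.")]
ans = answer_with_provenance("Does PTO carry over?", passages)
print(ans.text)
for c in ans.citations:
    print(f"  source: {c.source}/{c.doc_id}")
```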
The emerging possibilities of generative AI
Generative AI means moving from questions to decisions, Zhou says, decreasing time to knowledge. Basic enterprise search might turn up a series of documents to read, leaving the user to dig out the information they need. With augmented, answer-first enterprise search, the user doesn’t ask those questions individually; instead, they express the underlying journey, the overall decisions that need to be made, and the LLM agent brings it all together.
“This generative technology, when we pair it with search, and not just single searches, it gives us the ability to say, ‘I’m going on a business trip to X. Tell me everything I need to know,’” he says. “An LLM agent can go and figure out all the information I might need and repeatedly issue different searches, collect that information, synthesize it for me and deliver it to me.”
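A rough sketch of that agent loop: decompose one high-level request into several searches, run each against enterprise search, then synthesize the findings into a single briefing. The planning, search and summarization functions here are hard-coded stand-ins for the LLM and search calls a real system would make.

```python
def plan_searches(request: str) -> list[str]:
    """Stand-in for an LLM planning step: break one request into concrete searches."""
    # A real agent would ask the model to propose these; hard-coded here for illustration.
    return [
        f"travel policy for: {request}",
        f"expense limits for: {request}",
        f"colleagues based at the destination: {request}",
    ]

def search(query: str) -> list[str]:
    """Stand-in for the enterprise search call; returns snippets."""
    return [f"(result for '{query}')"]

def synthesize(request: str, findings: list[str]) -> str:
    """Stand-in for the LLM summarization step that merges everything into one briefing."""
    return f"Briefing for '{request}':\n" + "\n".join(f"- {f}" for f in findings)

def agent(request: str) -> str:
    findings: list[str] = []
    for query in plan_searches(request):     # repeatedly issue different searches
        findings.extend(search(query))       # collect the information
    return synthesize(request, findings)     # synthesize it and deliver it

print(agent("business trip to Tokyo"))
```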
For more on the ways that generative AI and large language models can transform how knowledge is accessed and used in enterprises, the types of use cases and more, don’t miss this VB Spotlight!