How to Use ChatGPT on Your Company's Knowledgebase

10 Sep, 2023

As businesses evolve in the digital age, the need for quick, accurate, and efficient data retrieval and problem-solving has never been more crucial. One tool that’s making waves in this regard is ChatGPT, a conversational agent based on the GPT-4 architecture. You have probably heard plenty of warnings about the accuracy of ChatGPT’s answers; that concern, together with a lack of important business or domain context, is a major impediment keeping organizations from getting more value out of ChatGPT. But how does ChatGPT gain the information it needs to answer questions or perform tasks? This is where the integration of a knowledge base comes in.

As we delve deeper into the workings of ChatGPT, it’s important to note the core principle of its functioning: the usage of embeddings. Embeddings are used to translate text into high-dimensional numerical vectors that encapsulate both semantic and syntactical meaning. This is pivotal for training the neural network to create contextually appropriate and coherent responses. Here’s how it works:

  1. Word Embeddings: This refers to how words are represented in the language model. Each word is replaced by a dense vector representation that encapsulates its semantic meaning. The model learns and refines these embeddings during the training phase.
  2. Tokenization and Embedding: Tokenization is the process of breaking text down into smaller units called tokens (often whole words or word fragments). These tokens are then embedded into vector representations, where each dimension captures some aspect of the token’s meaning. A short sketch of this step follows the list.
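
To make the tokenization step concrete, here is a minimal sketch, assuming the tiktoken library that OpenAI models use for tokenization; the exact splits and IDs depend on the encoding you pick:

```python
# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by recent OpenAI chat and embedding models.
encoding = tiktoken.get_encoding("cl100k_base")

text = "Reset the VPN password for a contractor account"
token_ids = encoding.encode(text)                    # text -> integer token IDs
tokens = [encoding.decode([t]) for t in token_ids]   # each ID back to its text piece

print(token_ids)  # a list of integers, one per token
print(tokens)     # the word and sub-word pieces the model actually sees
```

The embedding step then maps each of these token IDs to a learned vector, which is what the model actually computes with.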

The implications of incorporating ChatGPT into your business’s internal database or knowledge base can be substantial. It could significantly enhance how quickly and accurately your teams retrieve information and solve problems.

This article dives into the practical aspects of using ChatGPT: its accuracy, cost, and potential risks. We’ll also discuss how ChatGPT can be integrated with your company’s SharePoint, Google Drive, Confluence, or any other knowledgebase source, and how you can use it on your own custom data.

ChatGPT Training on Company Knowledgebase - Truth or Myth

ChatGPT, a Large Language Model (LLM), is frequently sought after to augment a company’s knowledgebase. However, employing this LLM effectively requires a clear understanding of the concepts of training, prompting and vector embeddings, and how they influence the integration of ChatGPT.

In the pursuit of an enhanced knowledgebase, there’s a misconception that uploading an encyclopedia’s worth of data as part of a training process will reshape the model’s understanding. Yet this is far from the truth. ChatGPT’s training is aimed at general context awareness, not at memorizing specific knowledge sources or databases. This key aspect balances the practical use of ChatGPT with the necessary caution in handling enterprise knowledge.

What Is Training in Large Language Models (LLMs)?

Training in Language Models like ChatGPT refers to the initial phase where a model learns by ingesting and processing vast amounts of text data. The primary objective is to enable the model to generate coherent and contextually accurate responses when prompted with queries. During the training phase, the model learns patterns, associations, and structures in the text data, allowing it to make educated guesses when faced with new, unseen text.

Why Won’t Uploading an Encyclopedia Train the Model?

There is a common misconception that uploading a single comprehensive document, like an encyclopedia or a company’s organizational structure, will make the model an “expert” in that domain. However, this is far from the truth: uploading a document does not change the model’s weights at all, and even genuine fine-tuning teaches general patterns rather than storing the document for later recall. To make ChatGPT answer from a specific source, the relevant passages have to be retrieved and supplied in the prompt, which is where prompting and vector embeddings come in.

What is Prompting in LLM?

Prompting, on the other hand, refers to the queries or statements used to trigger a model into generating a specific response. These prompts can be finely tuned to get the most relevant and accurate output from the model. Prompting becomes especially crucial in the application phase where you engage the model in real-time or batch tasks.
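
As a simple illustration (the prompts below are made up for this sketch), compare a vague prompt with a refined one that pins down the role, the source to rely on, and the expected output format:

```python
# A vague prompt leaves the model guessing about scope, source, and format.
vague_prompt = "Tell me about our VPN."

# A refined prompt states the role, restricts the source of truth, and fixes the shape
# of the answer. The {policy_excerpt} placeholder would be filled in at runtime.
refined_prompt = (
    "You are an internal IT support assistant. Using only the excerpt from the "
    "company's VPN policy below, list the steps an employee must follow to reset "
    "their VPN password. Answer as a numbered list.\n\n"
    "Policy excerpt:\n{policy_excerpt}"
)
```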

What Are Vector Embeddings in LLMs?

Vector embeddings in Language Models (LLM) like ChatGPT refer to the representation of words, phrases, or even entire sentences as vectors in a continuous mathematical space. This enables the model to understand semantic similarity between different pieces of text, and forms the basis of the model’s ability to generate coherent and contextually relevant text. Each word or piece of text is transformed into a series of numbers that capture its meaning and context within the dataset the model was trained on. Words that are contextually or semantically similar will have vector representations that are close to each other in this multi-dimensional space.

What are vector embeddings
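
Here is a minimal sketch of that idea in Python, assuming the OpenAI Python SDK (v1.x interface) with an OPENAI_API_KEY set in the environment; the example texts are hypothetical:

```python
# pip install openai numpy
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(text: str) -> np.ndarray:
    """Turn a piece of text into its embedding vector."""
    response = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return np.array(response.data[0].embedding)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Measure how close two vectors are in embedding space (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

vpn_doc = embed("How to reset your corporate VPN password")
question = embed("I forgot my VPN login, what do I do?")
unrelated = embed("Quarterly catering budget for the Berlin office")

# Semantically related texts score noticeably higher than unrelated ones.
print(cosine_similarity(question, vpn_doc))
print(cosine_similarity(question, unrelated))
```

This closeness-in-space property is exactly what a knowledgebase integration exploits: instead of searching for matching keywords, it searches for nearby vectors.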

Cost-Effective Approach: Training vs. Prompt Engineering

For further details, you can refer to our article “AI Model Training or Prompt Engineering? Cost-effective Approach”, which elaborates on why prompt engineering could be a more efficient way to get the desired outputs. In many cases, you don’t need to go through the resource-heavy process of retraining the model. Instead, careful crafting of prompts can lead to equally satisfying results.
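
As a rough sketch of what “prompting instead of retraining” looks like in practice (the function and passage names are illustrative, and the OpenAI Python SDK v1.x is assumed), the model’s weights are never touched; the relevant passages are simply placed into the prompt:

```python
from openai import OpenAI

client = OpenAI()

def answer_from_knowledgebase(question: str, retrieved_passages: list[str]) -> str:
    """Ground the answer in retrieved company passages instead of retraining the model."""
    context = "\n\n".join(retrieved_passages)
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer using only the provided company context. "
                    "If the context does not contain the answer, say so."
                ),
            },
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

# Usage: pass in the passages found by the embedding search sketched earlier.
# print(answer_from_knowledgebase("What is our VPN reset procedure?", top_passages))
```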

Deployment Options

When it comes to deploying ChatGPT in a business environment, the approach can differ based on specific needs, existing tech stacks, and budgets. Here are some options to consider for a seamless and effective deployment:

1. Custom Development

With custom development, you have the option of building your chatbot interface from scratch and integrating it with the OpenAI API.

This option provides the greatest flexibility, allowing you to plug virtually any data source into ChatGPT: specific files, websites, cloud storage (SharePoint, Google Drive, Dropbox), or even parsed web results.
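
To show what the first iteration of such a custom integration can look like, here is a minimal sketch that reuses the embed, cosine_similarity, and answer_from_knowledgebase helpers from the earlier examples; the folder-based loader is just an illustration and would be swapped for a SharePoint, Google Drive, or Dropbox connector in practice:

```python
from pathlib import Path

def build_index(folder: str) -> list[dict]:
    """Embed every text file in a folder once; swap this loader for any other source."""
    index = []
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8")
        index.append({"source": path.name, "text": text, "vector": embed(text)})
    return index

def retrieve(index: list[dict], question: str, top_k: int = 3) -> list[str]:
    """Return the top_k documents closest to the question in embedding space."""
    q = embed(question)
    ranked = sorted(index, key=lambda d: cosine_similarity(q, d["vector"]), reverse=True)
    return [d["text"] for d in ranked[:top_k]]

# index = build_index("./knowledgebase")
# passages = retrieve(index, "How do we onboard a new contractor?")
# print(answer_from_knowledgebase("How do we onboard a new contractor?", passages))
```

A production version would add chunking of long documents, a persistent vector database, and the access controls discussed later in this article.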

This route requires proprietary development and may take up to a month for the first version to go live.

Note: For specific numbers, it’s best to request a proposal; it’s free and will give you a detailed estimate.

2. Power Virtual Agents for SharePoint

ChatGPT can be integrated directly into SharePoint using Microsoft’s Power Virtual Agents.

Power Virtual Agents lets you create powerful AI-powered chatbots for a range of requests—from providing simple answers to common questions to resolving issues requiring complex conversations.

Open the bots panel and click New Bot. Enter a name, select a language and an environment, and press Create. This will spin up a new Power Virtual Agents bot. Note: it takes a few minutes for the bot to set up.

Microsoft has added boosted conversations to Power Virtual Agents: you can link an external website, and the bot will generate answers whenever it can’t find a relevant topic. The improved version supports up to 4 public websites and 4 internal Microsoft sources (SharePoint sites and OneDrive).

Power Virtual Agents for SharePoint

3. Google AI Labs for Google Drive

While integrating ChatGPT with Google Docs requires some technical acumen, it can be incredibly beneficial. By creating an Apps Script with your own OpenAI API key, you can use ChatGPT’s capabilities within Google Docs.

Add-ons like GPT for Sheets and Docs, AI Email Writers, and Reclaim.ai can assist you in fusing AI into Google Workspace. These can dramatically change the way you interface with Google Docs, improving productivity and enhancing text creation and editing functionality. However, these add-ons do not yet let ChatGPT truly leverage all the data you have on Google Drive.

At its latest I/O presentation, Google hinted at an assistant that would read from Google Drive, but no details have been provided yet.

ChatGPT in Google Docs

4. ChatGPT on Atlassian Confluence Knowledgebase

You can utilize third-party plugins like Copilot: ChatGPT for Confluence to integrate ChatGPT into Confluence Knowledgebase.

This option offers an easy plug-and-play solution but is limited to specific knowledge base systems, such as Atlassian Confluence.

ChatGPT in Confluence

In summary, the best deployment option will depend on your existing ecosystem, the resources you have at your disposal, and the specific use cases you have in mind for ChatGPT. Each approach comes with its own set of pros and cons, and understanding these can help you make an informed decision.

Leveraging ChatGPT for a Company’s Knowledgebase: A Case Study of a Consulting Firm Working with PDLT

In the rapidly evolving business landscape, timely access to accurate and essential information is crucial to driving success. One such consulting firm realized the need to enhance its knowledge management system for consultants, enabling faster responses and service delivery to its clients.

Situation

In the prior system, the consulting firm faced challenges in quickly retrieving and interpreting the extensive regulations needed to provide adaptable solutions to its customers. For instance, interpreting new regulations or comparing distinct jurisdictions often took more time than a typical client was willing to wait. This lag in response time began affecting customer satisfaction rates and contributed to a reduction in return business.

Solution

In response to these challenges, the firm engaged PDLT, an AI consulting company skilled in building MVPs (Minimum Viable Products) in less than three months. PDLT swiftly went to work and developed a proprietary system that indexed hundreds of terabytes of diverse documents.

The system used vector embeddings to process information effectively, transforming the vast quantity of unstructured text into high-dimensional numerical vectors. This enabled more efficient text retrieval and understanding. Once the vectorization was complete, the firm’s own data was embedded as a company-specific knowledgebase in a web application built on the OpenAI API, creating a unique “ChatGPT knowledgebase”.

PDLT applied chatbot UX best practices, enabling consultants to access vital information whenever they need it.

Outcome

The results were noticeable and immediate. After the first iteration of the project was delivered within a month, the consulting firm began experiencing its transformative impact. Junior consultants started providing senior-level advice thanks to the readily available information, and time to first response improved significantly.

In essence, this AI-empowered knowledgebase allowed the consultants to deliver agile and accurate solutions to the clients, furthered by the remarkable abilities of ChatGPT in understanding, processing, and providing relevant results.

Integrating ChatGPT with Your Company’s Knowledgebase

Your business’s knowledgebase is a valuable repository of information that supports both your internal teams and external customers. However, traditional searching and information retrieval methods can feel cumbersome and outdated. By integrating an advanced AI like ChatGPT into your knowledgebase, you can modernize the way users interact with your information, making it easier and more intuitive to find what they need.

Here is how to proceed:

1. Identification of Core Knowledge Areas:

Identify the key domains of knowledge within your business. These could range from product details, troubleshooting guides, business strategies, or any other crucial information needed for daily operations.

Example: If you run an IT service company, your core knowledge areas could include hardware requirements, software troubleshooting, client relationship management, and cybersecurity protocols. ChatGPT can enhance the searching and retrieval process from the volume of guides, manuals, and past queries stored in your knowledgebase.

2. Define the Risks and Required Access Levels:

Ensure that sensitive data within your business is properly safeguarded. Not every piece of information should be accessible by every user. Design different access levels based on roles, departments, or security clearance within your organization.

Example: Your product development process might be confidential, so you’ll restrict access to only members of the product team. In contrast, generic troubleshooting guides for common technical issues could be widely accessible to all staff and even customers.

3. Limit the Knowledgebase Indexing to Respective Access Levels:

With access levels defined, apply these restrictions to your knowledgebase indexation. That way, ChatGPT can serve relevant data based on an individual user’s access level rather than showing results they are not authorized to view.

Example: When a customer service executive asks ChatGPT a question about available software updates, they receive a response relevant to their level of access: information that’s safe to share with customers. At the same time, an engineer receives more in-depth, technical data about the software update because their access level permits it.
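
One simple way to enforce this at retrieval time is to attach an access level to every indexed document and filter before ranking. The sketch below extends the illustrative in-memory index from the custom development section; the numeric access levels are an assumption for the example:

```python
def retrieve_with_access(index: list[dict], question: str,
                         user_level: int, top_k: int = 3) -> list[str]:
    """Rank only the documents the requesting user is allowed to see."""
    allowed = [d for d in index if d.get("access_level", 0) <= user_level]
    q = embed(question)
    ranked = sorted(allowed, key=lambda d: cosine_similarity(q, d["vector"]), reverse=True)
    return [d["text"] for d in ranked[:top_k]]

# A customer service executive (level 1) only retrieves customer-safe update notes,
# while an engineer (level 3) also retrieves the internal technical documentation.
# passages = retrieve_with_access(index, "What changed in the latest update?", user_level=1)
```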

4. Integrate ChatGPT with Existing Information Systems:

Depending on your existing information management platforms, like SharePoint, Confluence, or Google Drive, your method of ChatGPT integration will differ.

Structuring your company’s knowledgebase this way pays off in faster, more relevant retrieval, safer handling of sensitive information, and smoother integration with the platforms you already use.

These steps might seem overwhelming at first, but remember, you’re not alone in this journey. AI consulting firms like Pragmatic DLT (PDLT) can help guide you through the entire process of setting up ChatGPT for your knowledgebase in less than three months. By leveraging this game-changing AI, you’ll be creating a significantly more efficient and effective knowledge management process for your business.

Potential Applications in Companies

Pragmatic DLT (PDLT) focuses on leveraging cutting-edge technologies such as AI to streamline various business processes, enabling companies to rapidly develop a minimum viable product (MVP) in less than three months. Below are some of the key areas where PDLT’s expertise can be highly beneficial.

Customer Service

ChatGPT can answer common customer questions directly from the knowledgebase and automate routine troubleshooting, freeing agents to handle complex cases.

Employee Onboarding

New hires can ask conversational questions about internal policies, processes, and documentation instead of waiting for a colleague to point them to the right page.

Decision Support

Teams get real-time, context-aware access to the information they need to make decisions, as the consulting case study above illustrates.

In conclusion, the adoption of AI in these sectors can drastically improve efficiency, customer satisfaction, and ultimately, profitability. PDLT can be your ideal partner in this journey, offering tailor-made solutions that fit your business needs.

Conclusion

ChatGPT’s potential for business optimization is evident, particularly in data retrieval and problem-solving. Through various integration options—custom development, Power Virtual Agents for SharePoint, Google AI Labs for Google Drive, and Atlassian Confluence—the technology offers flexibility in deployment tailored to business needs and existing ecosystems.

Key Takeaways:

  1. Data Accuracy: While concerns exist, coupling ChatGPT with a robust knowledge base can mitigate them.

  2. Practical Use-cases: ChatGPT can enhance customer service, automate troubleshooting, and provide real-time decision support, among other applications.

  3. Training and Prompting: No magic bullet; effective use demands understanding of training, prompting, and vector embeddings.

  4. Cost and Time: Implementation timelines and costs vary; ballpark figures start at around $5,000 and one month.

  5. Risks and Access Levels: Attention to data sensitivity and access levels is non-negotiable.

  6. Consulting Options: Specialist firms can expedite and de-risk implementation.

In sum, ChatGPT offers a viable path to augmenting business operations, but the journey demands planning, expertise, and oversight.
