With the increasing popularity of Large Language Models (LLMs), we have created a bot with the specific purpose of answering questions based on a predefined set of internal documents, in our case the company's knowledge base. Users can integrate the bot into their Slack client and connect it to their own data sources.
Users can ask the bot questions by either tagging it or messaging it directly, and it answers based on the content of the documentation. The implementation leverages Akka's actor model and makes use of Akka Streams, Akka HTTP, and an Akka-based Slack client for seamless operation. To communicate with the LLM, the bot uses LangChain, a Python library, accessed through ScalaPy facades.
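To give a feel for what the ScalaPy bridge looks like, here is a minimal sketch of calling into a Python module from Scala. The module name `qa_chain` and its `run` entry point are illustrative assumptions, not the bot's actual code; ScalaPy's `py.module` and dynamic call syntax are the real mechanism.

```scala
import me.shadaj.scalapy.py

object LlmBridge {
  // Load a hypothetical Python module that wraps a LangChain chain.
  // "qa_chain" is an assumed name for illustration only.
  private val qa = py.module("qa_chain")

  // Invoke an assumed `run(question)` function on the Python side and
  // convert the Python string result back into a Scala String.
  def answer(question: String): String =
    qa.run(question).as[String]
}
```

ScalaPy resolves `qa.run(...)` dynamically at runtime against the loaded Python module, so the Scala side stays thin while LangChain does the prompt orchestration in Python.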