Transforming GPT through API into a Fully Functional AI Assistant with LLMKit: A Step-by-Step Guide

obaydmerz

Abderrahmene Merzoug

Posted on July 19, 2024


In recent years, large language models (LLMs) such as GPT-3 have shown incredible promise in generating human-like text, answering questions, and holding conversations. However, to transform an LLM into a fully functional AI assistant, we need more than just text generation capabilities. That's where LLMKit comes in. LLMKit is a powerful library designed to help developers convert their text-to-text LLMs into fully functional AI assistants, enabling them to perform specific tasks and handle real-world scenarios.

LLMKit simplifies the integration of LLMs into your applications by providing a modular plugin system, built-in retry mechanisms, and customizable conversation configurations. In this article, we will walk you through the process of creating a conversation with an external function call using LLMKit. By the end of this guide, you'll have a solid understanding of how to extend the capabilities of your LLM and build a responsive AI assistant.

Getting Started with LLMKit

To start using LLMKit, you'll need to install it from GitHub using npm. Open your terminal and run the following command:

npm i obaydmerz/llmkit

A Step-by-Step Guide to Creating a Conversation

Step 1: Import Modules and Create Instances

import { Conversation, retry } from "llmkit";
import { External, ExternalFunction } from "llmkit/plugins/external";
// Import the OpenAI module here

Also, instantiate your OpenAI client (or any other LLM client):

const gpt = new OpenAI_Instance_Or_Any_LLM();

Step 2: Create the Conversation

Don't worry, we'll explain this code right after the listing:

let conv = Conversation.create(
    // gpt.chatCompletion can be replaced by whatever your LLM client exposes
    (m) => retry({}, (num) => gpt.chatCompletion(m, { debug: true })),
    {
      plugins: [
        External.create([
          ExternalFunction.create("purchase", {
            description: "Orders something",
            parameters: {
              name: "string, The name of the product"
            },
            hook: async ({ name }) => {
              if (name.toLowerCase() === "pizza") {
                return "Ordered! Tell the user that the pizza is yummy";
              }
              return "Product Not Found";
            },
          }),
        ]),
      ],
    }
  );

Conversation is the main class in LLMKit; it holds the conversation between three parties: system, user, and agent.

The Conversation.create() static function accepts two arguments:

  • The first argument is the function called to pass the message string to the GPT. (Note the retry wrapper, which retries the call if it fails.)
  • The second argument is the options object.
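To make the retry idea concrete, here is a minimal sketch of what such a wrapper could look like. This is a hypothetical illustration, not LLMKit's actual implementation; `retrySketch` and its `maxAttempts` option are invented for this example.

```javascript
// Hypothetical sketch of a retry wrapper like the one LLMKit provides.
// It calls fn(attemptNumber) until it succeeds or attempts run out.
async function retrySketch(options, fn) {
  const maxAttempts = options.maxAttempts ?? 3; // assumed default
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn(attempt); // same shape as (num) => gpt.chatCompletion(...)
    } catch (err) {
      lastError = err; // remember the failure and try again
    }
  }
  throw lastError; // all attempts failed
}
```

A wrapper of this shape lets a flaky network call to the LLM recover transparently, which is why LLMKit asks you to wrap the completion call rather than call it directly.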

Here we added External to plugins. External gives the GPT/LLM a way to execute functions in your code.
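To build an intuition for what External does, here is a hypothetical sketch of the dispatch idea: the model's reply names a function and its arguments, and the host code looks up and runs the matching hook. The `dispatch` helper and the call-object shape are invented for illustration; LLMKit's real mechanism may differ.

```javascript
// Hypothetical illustration of external-function dispatch.
// A registry maps function names to hooks, like the "purchase"
// ExternalFunction from the example above.
const hooks = {
  purchase: async ({ name }) => {
    if (name.toLowerCase() === "pizza") {
      return "Ordered! Tell the user that the pizza is yummy";
    }
    return "Product Not Found";
  },
};

// Given a parsed call such as { function: "purchase", parameters: { name: "Pizza" } },
// run the matching hook and return its result.
async function dispatch(call) {
  const hook = hooks[call.function];
  if (!hook) return "Unknown function: " + call.function;
  return await hook(call.parameters);
}
```

The hook's return value is handed back to the LLM as context, which is why the example hook returns an instruction ("Tell the user that the pizza is yummy") rather than final user-facing text.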

Step 3: Send a Message to the Conversation

(async () => {
     await conv.start();
     let res = await conv.send("I wanna purchase a burger");

     // You can access messages through conv.messages
     console.log(res); // I'm sorry, I couldn't find the burger you were looking for. How can I assist you further?
})();

That's all!

If you still have any questions, post them in the comments.
Join our discord server for further assistance!
