Using Bun 1.0 for Serving APIs with Vercel AI SDK and OpenAI

donvitocodes


Posted on September 14, 2023


Just a few days ago, the much-anticipated Bun 1.0 was launched, bringing a fresh perspective to the world of JavaScript. This new JavaScript runtime and toolkit promises faster execution times and a better developer experience.

Although I come from a backend development background and wouldn't consider myself a seasoned JavaScript expert, I've always had an interest in exploring the ecosystem's tools and frameworks. So when Bun 1.0 made its debut, I felt compelled to dive in and see what it had to offer, especially since this realm is somewhat new to me.

Since I've been playing around with OpenAI, I created a simple API using Bun and the Vercel AI SDK. The API serves chat completion requests from a chat web application I built using Next.js 13 and shadcn/ui.
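On the client side, the Vercel AI SDK's `useChat` hook can point at an endpoint like this, or the stream can be consumed directly with `fetch`. Here's a hedged sketch of the latter (the `streamChat` helper is hypothetical, not from my app, and assumes the Bun API below is running on port 3001):

```javascript
// Hypothetical client-side helper (an illustration, not from the post):
// reads the Bun API's streaming response chunk by chunk.
async function streamChat(messages, onChunk) {
    const res = await fetch("http://localhost:3001", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ messages }),
    });
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        onChunk(decoder.decode(value, { stream: true }));
    }
}

// Example call (requires the server to be up):
// streamChat([{ role: "user", content: "Hello!" }],
//     (chunk) => process.stdout.write(chunk));
```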

It was really nice since streaming is supported out of the box! Because I had already implemented the API in Next.js 13, porting the code to Bun was pretty straightforward.

const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages,
    stream: true,
});

// Convert the OpenAI response into a text stream
const stream = OpenAIStream(response);

// Return the streaming response to the client
return new StreamingTextResponse(stream,
    { headers: corsResponseHeaders });
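Conceptually, `OpenAIStream` and `StreamingTextResponse` turn the chunks coming back from OpenAI into a `ReadableStream` wrapped in a `Response`. Here's a simplified sketch of that idea (hypothetical, not the actual `ai` package internals; `fakeCompletionChunks` stands in for the streamed completion):

```javascript
// Stands in for the chunks yielded when stream: true is set
async function* fakeCompletionChunks() {
    yield "Hello";
    yield ", ";
    yield "world!";
}

// Turn an async iterable of text chunks into a ReadableStream of bytes
function toTextStream(iterable) {
    const encoder = new TextEncoder();
    return new ReadableStream({
        async start(controller) {
            for await (const chunk of iterable) {
                controller.enqueue(encoder.encode(chunk));
            }
            controller.close();
        },
    });
}

// Hand the stream to a Response, like StreamingTextResponse does
const response = new Response(toTextStream(fakeCompletionChunks()), {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
});

const textPromise = response.text();
textPromise.then((text) => console.log(text)); // prints "Hello, world!"
```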

Here’s a demo of the chat application I built. The API consumed by the web application on the left is the one created using Bun.

Here's the full source code of the API I built using Bun and Vercel AI SDK.

import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

const openai = new OpenAI({ apiKey: Bun.env.OPENAI_API_KEY });

const corsResponseHeaders = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, GET, OPTIONS",
    "Access-Control-Allow-Headers": "X-PINGOTHER, Content-Type",
    "Access-Control-Max-Age": "86400"
}

const server = Bun.serve({
    port: 3001,
    async fetch(req) {
        if (req.method === 'OPTIONS') {
            // Answer CORS preflight requests so browsers allow the POST below
            return new Response(null, { status: 204, headers: corsResponseHeaders });
        }
        if (req.method === 'POST') {
            try {

                const { messages } = await req.json();

                const response = await openai.chat.completions.create({
                    model: 'gpt-3.5-turbo',
                    messages,
                    stream: true,
                });

                // Convert the OpenAI response into a text stream
                const stream = OpenAIStream(response);

                // Return the streaming response to the client
                return new StreamingTextResponse(stream,
                    { headers: corsResponseHeaders });

            } catch (error) {
                console.log("Error: ", error);
                // Return an error response
                return new Response("Internal Server Error",
                    {
                        status: 500,
                        headers: corsResponseHeaders
                    },

                );
            }
        } else {
            // Handle other request methods or return a default response
            return new Response("Not Found", { status: 404,
                headers: corsResponseHeaders });
        }
    },
});

console.log(`Listening on localhost:${server.port}`);
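Because the `fetch` handler is just a function from `Request` to `Response`, it can be pulled out of `Bun.serve` and exercised with the standard `Request`/`Response` globals, without Bun or an OpenAI key. A hypothetical refactor sketch (the `createCompletion` parameter is an assumption introduced here so the OpenAI call can be stubbed out):

```javascript
const corsResponseHeaders = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "POST, GET, OPTIONS",
    "Access-Control-Allow-Headers": "X-PINGOTHER, Content-Type",
    "Access-Control-Max-Age": "86400"
};

// Hypothetical extraction of the fetch handler; createCompletion is a stub
// point standing in for the OpenAI + OpenAIStream pipeline.
async function handleRequest(req, createCompletion) {
    if (req.method === 'POST') {
        try {
            const { messages } = await req.json();
            const body = await createCompletion(messages);
            return new Response(body, { headers: corsResponseHeaders });
        } catch (error) {
            return new Response("Internal Server Error",
                { status: 500, headers: corsResponseHeaders });
        }
    }
    return new Response("Not Found", { status: 404, headers: corsResponseHeaders });
}

// Exercise the non-POST branch with a plain Request (Node 18+ / Bun globals):
const notFound = handleRequest(new Request("http://localhost:3001"));
notFound.then((res) =>
    console.log(res.status, res.headers.get("Access-Control-Allow-Origin"))); // prints: 404 *
```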

If you like my article, you can follow me and subscribe on Medium for more. You can also support me by buying me a coffee!

You can also follow me on social media. Cheers!

🐦 twitter.com/donvito
🔗 linkedin.com/in/melvinvivas
👾 github.com/donvito
🔗 donvitocodes.com

Reposted from https://techblogs.donvitocodes.com/using-bun-1-0-for-serving-apis-with-vercel-ai-sdk-and-openai-e6d01fedd2ca
