Build Your Own Full Stack ChatGPT with React
Safak
Posted on July 25, 2024
Hi, I'm a full-stack web developer passionate about sharing my web development journey with other devs. I recently created a full-stack ChatGPT-style application that uses Google Gemini (I picked it over OpenAI because it's free). The open-source code is available in my GitHub repository.
I’ve also published a full tutorial of the project on my YouTube channel.
What technologies are used?
Backend:
- Express.js
Database:
- MongoDB
Auth:
- Clerk
Frontend:
- React 19
- React Router Dom
- React Query
AI:
- Google Gemini
Image Uploading:
- ImageKit
The design consists of two layouts, five pages, and three components.
src
├── layouts
│   ├── RootLayout.jsx
│   └── DashboardLayout.jsx
├── routes
│   ├── Homepage.jsx
│   ├── DashboardPage.jsx
│   ├── ChatPage.jsx
│   ├── SignInPage.jsx
│   └── SignOutPage.jsx
├── components
│   ├── ChatList.jsx
│   ├── NewPrompt.jsx
│   └── Upload.jsx
└── lib
    └── gemini.js
Setting Up Google Gemini
To integrate Google Gemini AI, we use the @google/generative-ai library. Here is how to set it up in lib/gemini.js:
import {
  GoogleGenerativeAI,
  HarmBlockThreshold,
  HarmCategory,
} from "@google/generative-ai";
const safetySettings = [
{
category: HarmCategory.HARM_CATEGORY_HARASSMENT,
threshold: HarmBlockThreshold.BLOCK_ONLY_HIGH,
},
{
category: HarmCategory.HARM_CATEGORY_HATE_SPEECH,
threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
},
];
const genAI = new GoogleGenerativeAI(import.meta.env.VITE_GEMINI_API_KEY);
const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash", safetySettings });
export default model;
Setting Up MongoDB Models
Using Mongoose, we create schemas for our chats and user chats. This allows us to validate incoming CRUD inputs. Here are the schemas:
Chat Schema
import mongoose from "mongoose";

const chatSchema = new mongoose.Schema({
userId: {
type: String,
required: true,
},
history: [
{
role: {
type: String,
enum: ['user', 'model'],
required: true,
},
parts: [
{
text: {
type: String,
required: true,
},
},
],
img: {
type: String,
required: false,
},
},
],
}, {
  timestamps: true,
});

export default mongoose.model("chat", chatSchema);
The history array is required to send previous messages back to the AI. This ensures that when we revisit a chat page and ask a new question, we receive contextually relevant answers. Additionally, using the userId, we can verify the owner of the chat.
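For illustration, here is what a chat document matching the schema above might look like once a couple of messages have been exchanged (the values are made up, not taken from the project):

```javascript
// Illustrative chat document matching the schema above (values are made up):
const exampleChat = {
  userId: "user_123",
  history: [
    { role: "user", parts: [{ text: "What is React?" }] },
    {
      role: "model",
      parts: [{ text: "React is a JavaScript library for building UIs." }],
    },
  ],
};
```

Note that `img` is optional, so it only appears on entries where the user attached an upload.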
UserChats Schema
import mongoose from "mongoose";

const userChatsSchema = new mongoose.Schema({
userId: {
type: String,
required: true,
},
chats: [
{
_id: {
type: String,
required: true,
},
title: {
type: String,
required: true,
},
createdAt: {
type: Date,
default: Date.now,
},
},
],
}, {
  timestamps: true,
});

export default mongoose.model("userchats", userChatsSchema);
Each user has a chats array that includes the title and ID of each of their chats. We’ll use the titles to display items in the left menu, and when a user clicks one of them, we’ll open the chat page using the corresponding ID.
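As a small illustration of that idea, here is a hypothetical helper (the function name and newest-first sorting are my own choices, not project code) that turns a userchats document into the items the left menu needs:

```javascript
// Hypothetical helper: turn a userchats document into left-menu items,
// newest chat first. Illustrative only; the project may do this inline.
function toMenuItems(userChats) {
  return [...userChats.chats]
    .sort((a, b) => new Date(b.createdAt) - new Date(a.createdAt))
    .map((chat) => ({ id: chat._id, title: chat.title }));
}
```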
Creating an AI Chat
After creating the endpoint in the Express app and fetching the data on the client, you’ll be able to pass the previous history to the AI.
const chat = model.startChat({
history: chatHistory,
generationConfig: {
maxOutputTokens: 100, // You can limit your tokens
},
});
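The chatHistory passed to startChat has to match the shape Gemini expects: an array of `{ role, parts: [{ text }] }` objects. Since our Mongoose history documents already store role and parts, a small mapper is enough to strip the extra fields such as img and _id. The function name here is hypothetical; the real code may do this mapping inline:

```javascript
// Hypothetical mapper from the stored history documents to the
// { role, parts: [{ text }] } shape that model.startChat expects.
function toGeminiHistory(history) {
  return history.map((item) => ({
    role: item.role,
    parts: item.parts.map((part) => ({ text: part.text })),
  }));
}
```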
After displaying the chat, you’re ready to have a conversation with the AI: take the text from a form and send it to the model.
const add = async (text) => {
  setQuestion(text);
  let accumulatedText = "";
  try {
    // Include the uploaded image data (if any) alongside the prompt text
    const result = await chat.sendMessageStream(
      Object.entries(img?.aiData || {}).length ? [img.aiData, text] : [text]
    );
    // Stream the response chunk by chunk and update the UI as it arrives
    for await (const chunk of result.stream) {
      const chunkText = chunk.text();
      accumulatedText += chunkText;
      setAnswer(accumulatedText);
    }
    // Persist the question/answer pair once the stream completes
    mutation.mutate();
  } catch (err) {
    console.error(err);
  }
};
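The streaming loop above is a simple accumulate-and-rerender pattern. Decoupled from Gemini and React, it can be sketched like this (accumulate and the mock chunk shape are illustrative, not project code):

```javascript
// Sketch of the accumulation pattern: consume an async iterable of chunks
// (each exposing text(), like Gemini's result.stream) and report progress.
async function accumulate(stream, onUpdate) {
  let accumulatedText = "";
  for await (const chunk of stream) {
    accumulatedText += chunk.text();
    onUpdate(accumulatedText); // plays the role of setAnswer
  }
  return accumulatedText;
}
```

Because onUpdate fires on every chunk, the UI shows the answer growing in real time instead of waiting for the full response.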
That’s how to stream a chat with Gemini AI. To see how the messages are saved to the database, you can check the source code or the video tutorial.
I hope it was useful. If you want to learn more about web development and practice with real-world projects, you can check out my channel and other posts 😊
Lama Dev
🔥 Lama Dev YouTube Channel
❌ Lama Dev X/Twitter
⚡️ Lama Dev Instagram
👾 Source Code