Building OpenWorldAI.com
iskurbanov
Posted on May 3, 2023
I just finished putting together the beta version of openworldai.com, which right now is just a directory for AI projects.
My goal with the website is to build it into a hub for all things AI: interesting projects, community, news, and newsletter aggregation.
This post shares my process of building it and the tech stack I used. I eventually want to open source the frontend.
The Stack
To build out the MVP of this project I went with the following stack:
- Next.js for the React framework
- Next-Auth for authentication
- Prisma for the ORM for my database
- PlanetScale for my database
- Tailwind CSS for styling
- Preline.co for Tailwind components
- ScreenshotOne for taking bulk screenshots
- ChatGPT for debugging and prototyping
Future additions:
- Algolia for semantic search (this is a huge addition that I believe will be seen on all websites in the near future)
- Better project submission flow (the current one is too manual)
- Blog (to improve SEO)
A deeper dive into how it works
Next.js
The project is currently all serverless, using Next.js API routes wherever they are needed.
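As a rough example of what that looks like, a route that lists projects is just a small handler backed by Prisma. This is a simplified sketch rather than the exact code from the site, and the model and field names (Project, createdAt) are illustrative:

// pages/api/projects.js — simplified sketch of a serverless API route
import { PrismaClient } from '@prisma/client';

// Reuse one Prisma client across invocations so serverless functions
// don't exhaust database connections
const prisma = global.prisma || new PrismaClient();
if (process.env.NODE_ENV !== 'production') global.prisma = prisma;

export default async function handler(req, res) {
  // Return all projects, newest first
  const projects = await prisma.project.findMany({
    orderBy: { createdAt: 'desc' },
  });
  res.status(200).json(projects);
}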
To add projects, I currently just seed the database using Prisma, which is working quite well. The seed file calls the OpenAI API to generate a description for each project and categorize it; writing those by hand at scale would take way too much time, so I automated that process from the beginning.
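The seed itself is not much more than reading the generated JSON (produced by the scraper below) and pushing it into Prisma with createMany. A minimal sketch, with illustrative model and field names:

// prisma/seed.js — simplified sketch of seeding projects from the generated JSON
const { PrismaClient } = require('@prisma/client');
const fs = require('fs');

const prisma = new PrismaClient();

async function main() {
  const projects = JSON.parse(fs.readFileSync('output.json', 'utf8'));
  // skipDuplicates ignores rows whose unique fields (e.g. url) already exist;
  // categories/subcategories are omitted here — in the real schema they map to relations
  await prisma.project.createMany({
    data: projects.map((p) => ({
      name: p.projectName,
      url: p.url,
      seoTitle: p.seoTitle,
      shortDescription: p.seoShortDesc,
      longDescription: p.seoLongDesc,
    })),
    skipDuplicates: true,
  });
}

main()
  .catch(console.error)
  .finally(() => prisma.$disconnect());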
I also use a Puppeteer scraper to get the website text and feed it into OpenAI:
// Puppeteer scraper + OpenAI: extract site text, then generate SEO copy and categories
const puppeteer = require('puppeteer');
const { Configuration, OpenAIApi } = require('openai');
const fs = require('fs');
const configuration = new Configuration({
apiKey: 'sk-******', // masked in the post — in practice load this from an environment variable
});
const openai = new OpenAIApi(configuration);
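// Send a prompt to gpt-3.5-turbo and return the trimmed completion text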
async function generateSEOContent(prompt) {
try {
const response = await openai.createChatCompletion({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: prompt }],
});
return response.data.choices[0].message.content.trim();
} catch (error) {
console.error('Error generating SEO content:', error);
return '';
}
}
async function generateProjectName(prompt) {
try {
const response = await openai.createChatCompletion({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: prompt }],
});
return response.data.choices[0].message.content.trim();
} catch (error) {
console.error('Error generating project name:', error);
return '';
}
}
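// Ask the model to assign categories/subcategories, then parse the bracketed lists out of its reply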
async function generateCategoriesAndSubcategories(prompt) {
try {
const response = await openai.createChatCompletion({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: prompt }],
});
const categoriesAndSubcategories = response.data.choices[0].message.content.trim();
console.log('categoriesAndSubcategories:', categoriesAndSubcategories);
const categoryMatch = categoriesAndSubcategories.match(/categories: (\[[^\]]+\])/);
const subcategoryMatch = categoriesAndSubcategories.match(/subcategories: (\[[^\]]+\])/);
const categories = categoryMatch ? JSON.parse(categoryMatch[1]) : [];
const subcategories = subcategoryMatch ? JSON.parse(subcategoryMatch[1]) : [];
return { categories, subcategories };
} catch (error) {
console.error('Error generating categories and subcategories:', error);
return { categories: [], subcategories: [] };
}
}
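// Scrape the visible text of a page with Puppeteer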
async function getTextFromURL(url) {
const browser = await puppeteer.launch({ headless: "new" });
const page = await browser.newPage();
try {
await page.goto(url, { waitUntil: 'networkidle2' });
const text = await page.evaluate(() => {
return document.body.innerText;
});
await browser.close();
return text;
} catch (error) {
console.error(`Error fetching text from URL: ${url}`, error);
await browser.close();
return '';
}
}
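// Full pipeline for one URL: scrape the text, generate SEO copy, categorize it, and append the result to output.json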
async function main(url) {
const text = await getTextFromURL(url);
const projectNamePrompt = `Get the project name for the following website content, only output the name:\n\n${text}\n\n`;
const projectName = await generateProjectName(projectNamePrompt);
const seoTitlePrompt = `Generate an SEO optimized title for the following website content:\n\n${text}\n\n`;
const seoTitle = await generateSEOContent(seoTitlePrompt);
console.log('SEO Title:', seoTitle);
const seoShortDescPrompt = `Generate an SEO optimized short description for the following website content:\n\n${text}\n\n`;
const seoShortDesc = await generateSEOContent(seoShortDescPrompt);
console.log('SEO Short Description:', seoShortDesc);
const seoLongDescPrompt = `Generate an SEO optimized long description between 1000-1500 characters for the following website content:\n\n${text}\n\n`;
const seoLongDesc = await generateSEOContent(seoLongDescPrompt);
console.log('SEO Long Description:', seoLongDesc);
const categoryPrompt = `Here is a list of categories and their subcategories, only use these options:
text:[copywriting,email assistant,general writing,prompts,seo,social media assistant,summarizer]
image:[art,avatars,design assistant,image editing,image generation,logo generation]
code:[code assistant,low-code,no-code]
audio:[audio editing,music,text to speech,transcriber]
video:[personalized videos,video editing,video generation]
business:[customer support,e-commerce,education assistant,finance,human resources,legal,presentations,productivity,real estate,sales,marketing]
other:[dating,experimental,fitness,gaming,healthcare,life assistant,research,resources,search engine,travel]
Analyze the following website content and assign it to categories and subcategories: ${seoLongDesc}
Return output in this format:
categories: ["categories", "categories"]
subcategories: ["subcategories", "subcategories","subcategories"]`;
const categoriesAndSubcategories = await generateCategoriesAndSubcategories(categoryPrompt);
const output = {
projectName: removeOuterQuotes(projectName),
url: removeOuterQuotes(url),
seoTitle: removeOuterQuotes(seoTitle),
seoShortDesc: removeOuterQuotes(seoShortDesc),
seoLongDesc: removeOuterQuotes(seoLongDesc),
categories: categoriesAndSubcategories.categories,
subcategories: categoriesAndSubcategories.subcategories,
};
console.log('output', output);
fs.readFile('output.json', 'utf8', (err, data) => {
if (err) {
console.error('Error reading output.json:', err);
return;
}
// Parse the content of the file as a JSON array
const projects = JSON.parse(data);
// Add the new output to the array
projects.push(output);
// Write the updated array back to the file
fs.writeFile('output.json', JSON.stringify(projects, null, 2), 'utf8', (err) => {
if (err) {
console.error('Error writing to output.json:', err);
} else {
console.log('Project output added to output.json.');
}
});
});
// Return the assembled record so processURLs() can collect it
return output;
}
async function processURLs() {
const outputArray = [];
const urls = readURLsFromFile('urls2.json');
for (const url of urls) {
const output = await main(url);
outputArray.push(output);
}
// You can also write the outputArray to a JSON file if needed.
fs.writeFileSync('output2.json', JSON.stringify(outputArray, null, 2));
}
processURLs();
function removeOuterQuotes(str) {
if (!str) return str;
if (str.startsWith('"') && str.endsWith('"')) {
return str.slice(1, -1);
}
return str;
}
function readURLsFromFile(filename) {
const data = fs.readFileSync(filename, 'utf8');
return JSON.parse(data);
}
This lets me feed it an array of URLs and get back a JSON object with all the data I need for the website. I then run another seed that takes a screenshot of each website.
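The screenshot seed is basically a loop over the same URLs calling ScreenshotOne and saving the result. Roughly something like this — the options are simplified, the access-key env var name is my own, and it assumes Node 18+ for the built-in fetch (check the ScreenshotOne docs for the exact parameters):

// screenshots.js — illustrative sketch of bulk screenshots via ScreenshotOne
const fs = require('fs');

async function screenshot(url) {
  // Build a ScreenshotOne request; the access key comes from their dashboard
  const params = new URLSearchParams({
    access_key: process.env.SCREENSHOTONE_KEY,
    url,
    format: 'png',
  });
  const res = await fetch(`https://api.screenshotone.com/take?${params}`);
  if (!res.ok) throw new Error(`Screenshot failed for ${url}: ${res.status}`);
  return Buffer.from(await res.arrayBuffer());
}

async function run() {
  const urls = JSON.parse(fs.readFileSync('urls2.json', 'utf8'));
  fs.mkdirSync('screenshots', { recursive: true });
  for (const url of urls) {
    const image = await screenshot(url);
    // File name derived from the hostname, e.g. example.com.png
    fs.writeFileSync(`screenshots/${new URL(url).hostname}.png`, image);
  }
}

run().catch(console.error);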
Tailwind CSS
I'm not a huge fan of styling and CSS, so I use Tailwind and component libraries where I can. I recently stumbled upon Preline.co and loved it, so I used some of their components and modified them to fit my style.
Prisma + PlanetScale
These two go really well together, and I haven't touched the configuration much since the initial setup, which is exactly what I expect from a good ORM and database: less time managing it, more time building.
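For reference, the setup mostly comes down to pointing the Prisma datasource at the PlanetScale connection string and setting relationMode to "prisma", since PlanetScale doesn't support foreign key constraints. A trimmed-down version of the schema — the Project model here is a simplified stand-in, not my actual schema:

// schema.prisma — simplified; relationMode "prisma" makes Prisma emulate relations
// because PlanetScale doesn't allow foreign key constraints
datasource db {
  provider     = "mysql"
  url          = env("DATABASE_URL")
  relationMode = "prisma"
}

generator client {
  provider = "prisma-client-js"
}

model Project {
  id        Int      @id @default(autoincrement())
  name      String
  url       String   @unique
  createdAt DateTime @default(now())
}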
Conclusion
That is pretty much it for the MVP. I will try to add updates as I add more features to the site. I'm really looking forward to growing it and open sourcing the frontend when it's ready.
You can check out the website in its current form at OpenWorldAI.com
Let me know your thoughts and what you would like to see on the project!