Add sitemap.xml and robots.txt to Next.js app (SEO - 01)

mijim

Medusa

Posted on July 3, 2020


Since Next.js has become the best option (in my opinion) to create serverless apps with React, I'm starting a new series of articles on how to handle SEO with this great framework.

This first one is a very simple recipe to add sitemap.xml and robots.txt files. As almost everybody knows, search engine crawlers such as Googlebot use these files to discover the site structure and to learn which pages they should (or should not) index.

To keep this post short, I'll only show static versions of the files. You can, however, turn them into dynamic ones by fetching the data beforehand and passing whatever you want to include to the getSitemap and getRobots functions.

pages/sitemap.xml.tsx

import React from 'react';

const getSitemap = () => `<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-07-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>
</urlset>`;

class Sitemap extends React.Component {
  // Runs on the server: write the XML straight to the response
  // instead of rendering a React tree.
  public static async getInitialProps({ res }) {
    res.setHeader('Content-Type', 'text/xml');
    res.write(getSitemap());
    res.end();
  }
}

export default Sitemap;
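As mentioned above, the sitemap can be made dynamic. Here is a minimal sketch of how that might look, factored into a pure helper; the SitemapEntry shape and the hard-coded entries are assumptions for illustration (in a real app you would fetch them inside getInitialProps, e.g. from a CMS or database):

```typescript
// Hypothetical entry shape; adjust to whatever data your app exposes.
interface SitemapEntry {
  loc: string;
  lastmod: string;
}

// Pure helper: turn a list of entries into a sitemap XML string.
const buildSitemap = (entries: SitemapEntry[]): string =>
  `<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries
    .map(
      (e) => `  <url>
    <loc>${e.loc}</loc>
    <lastmod>${e.lastmod}</lastmod>
  </url>`,
    )
    .join('\n')}
</urlset>`;

// Example usage with made-up entries:
const xml = buildSitemap([
  { loc: 'https://example.com/', lastmod: '2020-07-01' },
  { loc: 'https://example.com/blog', lastmod: '2020-07-02' },
]);
```

Inside getInitialProps you would then call res.write(buildSitemap(entries)) with the fetched data instead of the static string.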

pages/robots.txt.tsx

import React from 'react';

const getRobots = () => `User-agent: *
Disallow: /_next/static/
`;

class Robots extends React.Component {
  // Same pattern as the sitemap, but served as plain text.
  public static async getInitialProps({ res }) {
    res.setHeader('Content-Type', 'text/plain');
    res.write(getRobots());
    res.end();
  }
}

export default Robots;
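One common refinement, not shown above, is to advertise the sitemap location from robots.txt via the standard Sitemap: directive, which most crawlers honor. A minimal sketch; the siteUrl parameter is an assumption for illustration:

```typescript
// Hypothetical helper: build a robots.txt body that also points
// crawlers at the sitemap via the standard "Sitemap:" directive.
// siteUrl is an assumed parameter, e.g. 'https://example.com'.
const buildRobots = (siteUrl: string): string => `User-agent: *
Disallow: /_next/static/
Sitemap: ${siteUrl}/sitemap.xml
`;
```

You would call res.write(buildRobots('https://example.com')) in getInitialProps instead of the static getRobots string.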