How to use SSR with Gatsby
Martin Klingenberg
Posted on August 31, 2022
Introduction
Gatsby 4 came with new rendering methods: DSG (Deferred Static Generation), which renders pages such as blog articles on demand, and SSR (Server Side Rendering). However, documentation and good examples for this new technology have been scarce, especially regarding its limitations. Earlier this year, I took responsibility for developing my company's web pages and transitioned a lot of them to SSR. The process was confusing and frustrating, and I want to help other developers in my situation. This article will focus on the SSR features of Gatsby, but most of the solutions apply to DSG as well.
Why do I even want to use SSR?
The media guy at my company wanted to make it easier to publish changes to specific parts of the website, and wanted those changes published without any delay. I do understand why he dislikes waiting 10-15 minutes for changes to go live: why wait for a build when the changes can be instantaneous? I haven't converted the whole site to SSR, as only parts of it benefit from the technology. So my first tip is to identify the pages that benefit from SSR the most, then prioritize them with the product owner. My second tip is to do the transition to SSR gradually, one page at a time.
The coding begins
Fetching data
If you have an existing solution, you probably already have some method of getting data, such as page queries. We are going to keep most of those queries. In my project, a lot of the data is needed for the LayoutComponent, navigation and so on, and since the navigation was working perfectly and rarely changes, there is no need for SSR there. Blog articles and open positions are a different story: for those we need to fetch data on the fly, and Gatsby gives us no API for that. We must build our own method of fetching data.
Installing dependencies
In my case, I needed to fetch data from Sanity (a headless CMS). Sanity provides a nice GraphQL interface, so we need to install a GraphQL client in addition to the existing gatsby-source-sanity.
yarn add @apollo/client (or npm install @apollo/client)
Let's get some data
First we need to set up a GraphQL client capable of fetching data. It is no harder than the following:
src/server-side/client.js
import { ApolloClient, InMemoryCache } from '@apollo/client';
import config from '../config';

export const client = new ApolloClient({
  uri: `https://${config.SANITY_PROJECT_ID}.apicdn.sanity.io/v1/graphql/${config.SANITY_DATASET}/${config.SANITY_TAG}`,
  cache: new InMemoryCache(),
  headers: {
    Authorization: `Bearer ${config.SANITY_TOKEN}`,
  },
});
Now comes the sad part of the story: the queries you built against Gatsby's GraphQL layer will not work anymore. Some rewriting is needed, but once you have rewritten your queries, using the client is easy. As the example below shows, I like to put the data-fetching logic into its own file.
import { gql } from '@apollo/client';
import { createGatsbyImages } from '../server-side/imageCreator';
import { client } from '../server-side/client';

export async function getBlogDataServerSide() {
  const response = await client.query({
    fetchPolicy: 'no-cache',
    query: gql`
      {
        Write something clever here ;)
      }
    `,
  });
  response.data.articles.forEach((article) => {
    createGatsbyImages(article);
  });
  return response.data;
}
And then in your component you will have something that looks like this:
import { graphql } from 'gatsby';
// Adjust this path to wherever you placed the data-fetching file
import { getBlogDataServerSide } from '../server-side/blogData';

const Blog = ({ data, serverData }) => {
  return <div>Some blog article templating</div>;
};

export default Blog;

export async function getServerData() {
  try {
    return {
      props: { articles: await getBlogDataServerSide() },
      status: 200,
    };
  } catch {
    return {
      props: { articles: [] },
      status: 500,
    };
  }
}

export const query = graphql`
  {
    You might have some sort of page query here
  }
`;
And that is how easy it is. The page query will only run once per build (obviously), while the server data will always be fresh. And since the component exports a getServerData method, Gatsby knows that this is a server-side rendered component.
Adding some gatsby-image support
Since we are no longer using a Gatsby source plugin at request time, we need to build our own gatsby-image objects. You might want to create something similar for another CMS. After reading the source code of gatsby-source-sanity, I came up with the script below.
import { getGatsbyImageData } from 'gatsby-source-sanity';
import config from '../config';

function resolveNodeType(asset) {
  if (asset._ref) {
    return asset._ref;
  }
  if (asset.id) {
    return { _id: asset.id };
  }
  return asset.url;
}

function imageCreator(asset) {
  const node = resolveNodeType(asset);
  let assets = {};
  if (asset.metadata?.dimensions) {
    assets = {
      ...asset.metadata.dimensions,
    };
  }
  return getGatsbyImageData(node, assets, {
    projectId: config.SANITY_PROJECT_ID,
    dataset: config.SANITY_DATASET,
  });
}
export function createGatsbyImages(element) {
  if (!element) return;
  // Handle arrays up front, so we don't walk the same array once per index
  if (Array.isArray(element)) {
    element.forEach((childElement) => {
      createGatsbyImages(childElement);
    });
    return;
  }
  Object.keys(element).forEach((subElement) => {
    if (typeof element[subElement] === 'object') {
      createGatsbyImages(element[subElement]);
      return;
    }
    // GraphQL responses mark images with __typename, raw documents with _type
    if (
      subElement === '__typename' &&
      element[subElement] === 'Image' &&
      element.asset
    ) {
      element.asset.gatsbyImageData = imageCreator(element.asset);
      return;
    }
    if (
      subElement === '_type' &&
      element[subElement] === 'image' &&
      element.asset
    ) {
      element.asset.gatsbyImageData = imageCreator(element.asset);
    }
  });
}
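To see what the traversal actually does, here is a self-contained sketch with a stub standing in for imageCreator (the real one calls getGatsbyImageData). The stub's return value and the sample data are made up for illustration.

```javascript
// Stub for imageCreator, so the traversal can be run without Sanity
function stubImageCreator(asset) {
  return { layout: 'constrained', width: 1, height: 1 };
}

// Walks any nested response object and tags image nodes, like
// createGatsbyImages does, but using the stub above
function tagImages(element) {
  if (!element || typeof element !== 'object') return;
  if (Array.isArray(element)) {
    element.forEach(tagImages);
    return;
  }
  if (
    (element.__typename === 'Image' || element._type === 'image') &&
    element.asset
  ) {
    element.asset.gatsbyImageData = stubImageCreator(element.asset);
  }
  Object.values(element).forEach(tagImages);
}

const data = {
  articles: [{ hero: { __typename: 'Image', asset: { _ref: 'image-abc' } } }],
};
tagImages(data);
// data.articles[0].hero.asset now carries a gatsbyImageData field
```

The point is that the traversal mutates the response in place, so components can consume the assets exactly as if they had come from the Gatsby data layer.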
Adding dynamic routes
One of the weak spots in the Gatsby documentation is the explanation of how the different routing options work in combination with SSR. However, you are not here for the rant; you are here for the solution. When publishing a new article, we need routes with slugs in them. To make that work, we will use what Gatsby calls fallback routes. Basically, if you want a page named /blogg/:slug, you can create the file /src/pages/blogg/[slug].js. The slug parameter will then be available in props.params['slug']. Easy as pie.
export async function getServerData(props) {
  const slug = props.params['slug'];
  // More SSR code
}
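From here, a slug that doesn't match any article should produce a 404 rather than an empty page. A minimal sketch of the response-building part (fetchArticleBySlug and the article shape are assumptions, not part of the original code):

```javascript
// Turn a (possibly missing) article into the object shape Gatsby
// expects back from getServerData. A null article means the slug
// didn't match anything, so we signal a 404 status.
function toServerData(article) {
  if (!article) {
    return { props: { article: null }, status: 404 };
  }
  return { props: { article }, status: 200 };
}

// Hypothetical usage inside a fallback-route page:
// export async function getServerData(props) {
//   const article = await fetchArticleBySlug(props.params['slug']); // assumed helper
//   return toServerData(article);
// }
```

Keeping the status logic in a small pure function like this also makes it trivial to unit test without spinning up a server.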
Wait! What happened to our sitemap!?
Well, if you have a plugin generating a sitemap for your page, you will find that the articles no longer show up. The reason is that since there is no actual page per article at build time, there is nothing for the plugin to pick up. Most plugins I have seen have an option to query the routes manually.
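As a sketch of what that can look like with gatsby-plugin-sitemap: the allSanityArticle query and the /blogg/ path below are assumptions, so adjust both to your own schema and routes.

```javascript
// Sketch of a gatsby-config.js plugin entry that queries article slugs
// directly from the data layer, since the SSR fallback route itself is
// invisible to the sitemap plugin.
const sitemapPlugin = {
  resolve: `gatsby-plugin-sitemap`,
  options: {
    query: `
      {
        allSanityArticle {
          nodes {
            slug { current }
          }
        }
      }
    `,
    // Map each article onto the path the fallback route will serve it at
    resolvePages: ({ allSanityArticle }) =>
      allSanityArticle.nodes.map((node) => ({
        path: `/blogg/${node.slug.current}/`,
      })),
  },
};
```

Check your plugin's documentation for the exact option names; they differ between plugins and versions.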
Time to deploy
If you have been deploying your web page on your own, you will need to make some changes to the environment. The most naive thing you can do is try to use gatsby serve as a production server. Gatsby wants you to pay for their hosting service and does not give you a production-capable server for free: gatsby serve is slow and has a tendency to add extra redirects, which totally kills your SEO score. We need to do something different. At my company we like to deploy our services into our own Kubernetes cluster. That is pretty much total overkill and way more complicated than paying for Gatsby Cloud, but it comes with the benefit of me and my colleagues learning Kubernetes, Docker and Helm. Other reasons to choose self-hosting can be to reduce hosting costs, or that you are not allowed to use hosting outside your own country; a lot of Norwegian governmental services cannot be hosted in public clouds.
The architecture I suggest is Varnish + Nginx + Fastify: Varnish for caching, Nginx for simple proxying, gzip and edge-readiness, and Fastify to serve our app.
Add fastify to our page
Let's start by adding some dependencies: yarn add gatsby-plugin-fastify-klyngen fastify (or npm install gatsby-plugin-fastify-klyngen fastify).
Next, add the plugin to gatsby-config.js:
plugins: [
  {
    resolve: `gatsby-plugin-fastify-klyngen`,
    options: {
      /* discussed below */
      features: {},
    },
  },
  ...otherPlugins
]
To find a good server for serving the page, I took an existing one and adjusted it: gatsby-plugin-fastify needed some tweaks. I will make a pull request to their repository with my small improvements. Until a new version arrives, I have forked their repository and published my own version, but I strongly recommend using the original package, as it is maintained by a larger team and I have no intention of maintaining a Fastify server for very long.
When the server is configured you can easily try it out. Build your page before serving it: yarn build; yarn gserve.
But how about performance?
It would be nice to have a mechanism that caches the articles and other content, and then invalidates the cache when the content changes. Varnish is really good at exactly that. Varnish was originally made for the largest newspaper in Norway, and as a proud Norwegian I think it is the perfect solution, and a good fit in this case with Sanity. I also spoke with a DevOps engineer to get some insight into how he would solve the problem; his recommendation was to put the page behind a CDN. In my case that was difficult, as there was no easy way of integrating Sanity with a proper CDN.
Varnish is easy to use, but will require some reading to ensure your cache performs as well as possible. I have come up with the following configuration. It is basic and clears the entire cache on every change; it has room for improvement, but it's a good starting point.
vcl 4.0;

backend default {
  .host = "localhost";
  .port = "8080";
}

sub vcl_recv {
  if (req.method == "PATCH") {
    ban("req.url ~ .");
    return (synth(200, "Full cache cleared"));
  }
  unset req.http.Cookie;
}

sub vcl_backend_response {
  set beresp.ttl = 2w;
}
The most important line of the config is unset req.http.Cookie;. It tells Varnish to cache content even when the request carries cookies. Not caching when a cookie is present is actually a clever default, but it gives you no caching at all once you use Google Analytics or similar products, since those set cookies on every visitor.
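To trigger the full cache clear when content changes, something has to send that PATCH request. A minimal sketch of a Fastify route receiving a content-change webhook (from Sanity, for example); the route path, the header name and VARNISH_URL are all assumptions, not part of the original setup:

```javascript
// Where Varnish is listening; adjust to your environment
const VARNISH_URL = process.env.VARNISH_URL || 'http://localhost:80';

// Reject purge requests that don't carry the shared secret
function isAuthorizedWebhook(headers, secret) {
  return headers['x-webhook-secret'] === secret;
}

function registerPurgeRoute(fastify, secret) {
  fastify.post('/purge', async (request, reply) => {
    if (!isAuthorizedWebhook(request.headers, secret)) {
      return reply.code(401).send({ error: 'unauthorized' });
    }
    // Any PATCH request makes the vcl_recv hook above ban the entire cache
    const response = await fetch(VARNISH_URL, { method: 'PATCH' });
    return reply.code(200).send({ purged: response.ok });
  });
}
```

Guarding the route with a shared secret matters, since anyone who can reach it can otherwise empty your cache at will.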
Nginx config
To tie everything together we need a simple nginx-config.
server {
  listen 8080;

  gzip on;
  gzip_vary on;
  gzip_min_length 1024;
  gzip_proxied expired no-cache no-store private auth;
  gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/javascript application/xml;
  gzip_disable "MSIE [1-6]\.";

  absolute_redirect off;
  error_page 404 /404.html;
  rewrite ^([^.\?]*[^/])$ $1/ permanent;

  location / {
    add_header Cache-Control "public";
    proxy_pass http://localhost:9000;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_intercept_errors on;
    recursive_error_pages on;
  }
}
And let's wrap it all up with a Docker image:
FROM nginx:1.20.2-alpine as dev
# Install packages and dependencies
RUN apk update && apk add --no-cache supervisor python3 make gcc g++ && apk add --update nodejs yarn varnish
# Build
WORKDIR /app
RUN mkdir -p /app/packages/website
RUN mkdir -p /app/.yarn/releases
COPY package.json yarn.lock .yarnrc.yml /app/
COPY .yarn/releases /app/.yarn/releases/
COPY .yarn/plugins /app/.yarn/plugins/
COPY packages/website /app/packages/website/
COPY packages/shared-components /app/packages/shared-components/
RUN yarn
RUN yarn workspace website run disable-telemetry
# The build step shouldn't be cached since it's non-deterministic
# As such we add the next line to try and do a cache bust
# Recommended by: https://stackoverflow.com/a/58801213/359825
#ADD "https://www.random.org/cgi-bin/randbyte?nbytes=10&format=h" skipcache
# Ensure that proper .env files exists before building
RUN test -f "/app/packages/website/.env.production"
RUN yarn workspace website run build
# Configuring NginX
COPY website.nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
# Configure varnish
COPY default.vcl /etc/varnish/
# Configure supervisor
COPY supervisord.conf /app/supervisord.conf
CMD ["supervisord","-c","/app/supervisord.conf"]
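The Dockerfile above expects a supervisord.conf that starts all three processes. A minimal sketch, assuming the ports and paths used earlier in this article; the exact commands and flags are assumptions and will need adjusting to your setup:

```ini
[supervisord]
nodaemon=true

[program:fastify]
; Serve the built Gatsby site on port 9000 (the nginx proxy_pass target)
command=yarn workspace website run gserve
directory=/app

[program:nginx]
; Nginx listens on 8080, which is the Varnish backend
command=nginx -g "daemon off;"

[program:varnish]
; Varnish answers on port 80 in front of nginx
command=varnishd -F -f /etc/varnish/default.vcl -a :80
```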
Wrapping up
I hope this is helpful for other people using Gatsby. It may not be the perfect solution, but it is working well. If you want to see the whole solution, our website is open source and available at Alv Website. Fill the comment section with questions and feedback; it's appreciated.