My web performance journey with Nuxt, Storyblok & Netlify
Alba Silvente Fuentes
Posted on April 17, 2021
In this post I will show you the main web performance concerns I had while building my website and how a Jamstack architecture helped me solve them.
To build my website I used: Nuxt as my static site generator, Storyblok as my headless CMS (with an image service provider), and Netlify to host my fully static site.
I'm using Lighthouse as the tool that shows us all the opportunities we have to improve or fix the performance of the project.
Resource treatment
1. Preload key requests
Consider using link rel=preload to prioritize fetching resources that are currently requested later in page load.
Solution (uses rel preload) → Preload critical assets to improve loading speed.
Declare preload links in your HTML to instruct the browser to download key resources as soon as possible.
<head>
<link rel="preload" href="critical.css" as="style">
<link rel="preload" href="critical.js" as="script">
</head>
What I use → As I use Nuxt as my static site generator, it already takes advantage of this technique for me. Check crazy fast static applications to learn more about how it does it for us.
2. Preconnect to required origins
Consider adding preconnect or dns-prefetch resource hints to establish early connections to important third-party origins.
Solution (uses rel preconnect) → Informing the browser of your intention is as simple as adding a link preconnect tag to your page:
<link rel="preconnect" href="https://example.com">
This lets the browser know that the page intends to connect to example.com and retrieve content from there.
In general, it's better to use link rel="preload", as it's a more comprehensive performance tweak, but link rel="preconnect" is still useful for edge cases where you know you will need an origin soon but don't yet know exactly which resources you will fetch from it.
Link dns-prefetch is another type related to connections. It handles the DNS lookup only, but it has wider browser support, so it may serve as a nice fallback. You use it the exact same way:
<link rel="dns-prefetch" href="https://example.com">.
What I use → A good example of this is a link to Google Fonts. In my case, as I have the font files inside my project itself, I didn't need to take this into account.
But Nuxt already thought about this for you: there is a module to improve your font loading performance, @nuxtjs/google-fonts.
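For reference, a rough sketch of how that module could be configured in nuxt.config.js. The families and display options shown here are placeholders based on the module's documentation, not my actual setup:
// nuxt.config.js (sketch)
export default {
  buildModules: ['@nuxtjs/google-fonts'],
  googleFonts: {
    families: {
      Roboto: [400, 700], // weights to download
    },
    display: 'swap', // keep text visible while the font loads
  },
}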
3. Lazy load third-party resources with facades
Some third-party embeds can be lazy loaded. Consider replacing them with a facade until they are required.
Solution (third party facades) → Instead of adding a third-party embed directly to your HTML, load the page with a static element that looks similar to the embedded third party. The interaction pattern should look like this:
- On load: Add facade to the page (as the cover of a video).
- On mouseover: The facade preconnects to third-party resources.
- On click: The facade replaces itself with the third-party product.
What I use → For YouTube videos I started using the lite-youtube-embed package, following the advice of Debbie O'Brien and web.dev!
The difference in page loading time is huge, not to mention that you no longer start with a bunch of iframes lengthening the time to interactive.
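If you haven't used it before, this is roughly what it looks like; the asset paths and video ID below are placeholders:
<!-- Load the custom element's CSS and script -->
<link rel="stylesheet" href="/lite-yt-embed.css" />
<script src="/lite-yt-embed.js"></script>

<!-- Renders a static thumbnail; the real YouTube iframe is only injected on click -->
<lite-youtube videoid="ogfYd705cRs" playlabel="Play: my talk"></lite-youtube>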
4. Reduce the impact of third-party code / Minimize third-party usage
Third-party code can significantly impact load performance. Limit the number of redundant third-party providers and try to load third-party code after your page has primarily finished loading.
Solution (loading third party javascript) → If a third-party script is slowing down your page load, you have several options to improve performance:
- Load the script using the async or defer attribute to avoid blocking document parsing.
- Self-host the script if the third-party server is slow.
- Consider removing the script if it doesn't add clear value to your site.
- Use resource hints like link rel=preconnect or link rel=dns-prefetch to perform an early DNS lookup for domains hosting third-party scripts.
What I use → I'm using Google Analytics, a third party, but a package called vue-gtag helps me load it only with user consent and, once active, it preconnects to googletagmanager and loads analytics asynchronously:
<link href="https://www.googletagmanager.com" rel="preconnect">
<script type="text/javascript" async src="https://www.google-analytics.com/analytics.js"></script>
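For context, a sketch of how that consent-based setup can look as a Nuxt client plugin. The enabled/optIn flow is vue-gtag's opt-in API; the file name and the GA ID are placeholders, not my exact code:
// plugins/vue-gtag.client.js (sketch)
import Vue from 'vue'
import VueGtag from 'vue-gtag'

export default ({ app }) => {
  Vue.use(
    VueGtag,
    {
      config: { id: 'UA-XXXXXXXXX-X' }, // placeholder tracking ID
      enabled: false, // nothing is loaded or tracked until the user consents
    },
    app.router
  )
}

// Later, e.g. in the cookie banner's "Accept" handler:
// this.$gtag.optIn()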
As I'm using the Storyblok image service provider, I preconnected to it, so it can improve the image load time:
<link rel="preconnect" href="//img2.storyblok.com">
// Nuxt config
head: {
  link: [
    { rel: 'preconnect', href: '//img2.storyblok.com' },
  ],
}
5. Eliminate render-blocking resources
Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles.
You can reduce the size of your pages by only shipping the code and styles that you need. In the DevTools Coverage tab, click on a URL to inspect that file in the Sources panel. Styles in CSS files and code in JavaScript files are marked in two colors:
- Green (critical): Styles that are required for first paint; code that's critical to the page's core functionality.
- Red (non-critical): Styles that apply to content not immediately visible; code that isn't used in the page's core functionality.
Solution (render blocking resources) → Let's see in depth how to eliminate scripts or stylesheets render-blocking our page load.
- How to eliminate render-blocking scripts
Once you've identified critical code, move that code from the render-blocking URL to an inline script tag in your HTML page.
If there's code in a render-blocking URL that's not critical, you can keep it in the URL, and then mark the URL with async or defer attributes.
Code that isn't being used at all should be removed.
- How to eliminate render-blocking stylesheets
Inline critical styles required for the first paint inside a style block at the head of the HTML page. Then load the rest of the styles asynchronously using the preload link.
Consider automating the process of extracting and inlining "Above the Fold" CSS using the Critical tool.
What I use → In Netlify we have a plugin for critical CSS called netlify-plugin-inline-critical-css.
Another cool technique is to split the styles into different files, organized by media query. Then add a media attribute to each stylesheet link. When loading the page, the browser only blocks the first paint to retrieve the stylesheets that match the user's device.
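For example, the non-matching stylesheets below are downloaded with low priority and don't block rendering (file names are illustrative):
<link rel="stylesheet" href="base.css" media="all">
<link rel="stylesheet" href="print.css" media="print">
<link rel="stylesheet" href="desktop.css" media="(min-width: 1024px)">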
Keep CSS/JS files smaller
1. Minify CSS/JavaScript
Minification is the process of removing whitespace and any code that is not necessary to create a smaller but perfectly valid code file.
Minifying CSS files → reduces network payload sizes (read more about minifying CSS).
Solution CSS → Minify with tools like webpack: https://web.dev/minify-css/#css-minification-with-webpack.
Minifying JavaScript files → reduces payload sizes and script parse time (read more about minifying JS).
Solution JS → Terser, the successor to uglify-js: https://github.com/terser/terser, or continue using webpack, since Terser is already included in its production configuration.
What I use → Nuxt is already using Terser webpack plugin in its build configuration, taking care of the minification by itself.
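If you don't use Nuxt and manage webpack yourself, a minimal production setup could look like this sketch, using the standard Terser and CSS minimizer plugins (not anything Nuxt-specific):
// webpack.config.js (sketch)
const TerserPlugin = require('terser-webpack-plugin')
const CssMinimizerPlugin = require('css-minimizer-webpack-plugin')

module.exports = {
  mode: 'production',
  optimization: {
    minimize: true,
    // Minify JS with Terser and CSS with the CSS minimizer plugin
    minimizer: [new TerserPlugin(), new CssMinimizerPlugin()],
  },
}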
2. Remove unused CSS/JavaScript
Remove dead rules from stylesheets and defer the loading of CSS not used for above-the-fold content to reduce unnecessary bytes consumed by network activity.
Solution (unused css rules) → Take into account the Critical/Non-Critical CSS technique from the render-blocking stylesheets section, combined with a tool that deletes the CSS not used in your page, like the famous PurgeCSS.
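A sketch of wiring PurgeCSS into PostCSS; the content globs are placeholders for a typical Nuxt project (there is also a dedicated nuxt-purgecss module):
// postcss.config.js (sketch)
const purgecss = require('@fullhuman/postcss-purgecss')

module.exports = {
  plugins: [
    purgecss({
      // Only selectors found in these files survive the purge
      content: [
        './pages/**/*.vue',
        './layouts/**/*.vue',
        './components/**/*.vue',
      ],
    }),
  ],
}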
Remove unused JavaScript to reduce bytes consumed by network activity. (unused JS)
Solution (detailed info and tool comparatives) → Let's see which techniques we can use if our framework doesn't do it for us:
Record code coverage to start analyzing the unused code in specific files:
Coverage tab in DevTools:
- Click Start Instrumenting Coverage And Reload Page if you want to see what code is needed to load the page.
- Click Instrument Coverage if you want to see what code is used after interacting with the page.
Build tool support for removing unused code
Webpack makes it easier to avoid or remove unused code with the following techniques:
- Code Splitting - Extract common dependencies into shared bundles.
The process of breaking up bundled code into multiple smaller bundles that can be loaded and executed independently as needed.
Nuxt takes care with webpack of code-splitting the application!
- Unused Code Elimination - Dead Code Elimination is the process of removing code that is not used by the current application.
There are a number of tools available with the most popular being Terser and Closure Compiler. Webpack's Dead Code Elimination is implemented by removing unused module exports, then relying on Terser.
- Unused Imported Code - tricky optimization cases where a module's exports are used in a way that is difficult to statically analyze.
Dynamic imports are one of these cases. Webpack doesn't understand the special destructuring syntax well enough to eliminate dead code:
const { transformImage } = await import('./image.utils.js');
But it allows you to manually list the exports that are used via a magic comment:
const { transformImage } = await import(/* webpackExports: "transformImage" */ './image.utils.js');
What I use → Nuxt already does this for me, since it uses webpack under the hood. It splits my code by pages, so I can forget about the magic webpackChunkName comment that you would otherwise need to add on every route with a dynamic import.
3. Enable text compression
Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes.
Solution (uses text compression) → Enable text compression on your server.
When a browser requests a resource, it will use the Accept-Encoding HTTP request header to indicate what compression algorithms it supports.
Accept-Encoding: gzip, compress, br
If the browser supports Brotli (br) you should use Brotli because it can reduce the file size of the resources more than the other compression algorithms.
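A minimal Node sketch of that content negotiation (Netlify handles this at the edge for you, so this is purely illustrative): serve Brotli when the browser advertises br, otherwise fall back to gzip.
const http = require('http')
const zlib = require('zlib')
const fs = require('fs')

http
  .createServer((req, res) => {
    const raw = fs.createReadStream('./index.html')
    const accepts = req.headers['accept-encoding'] || ''

    if (accepts.includes('br')) {
      // Brotli usually compresses text assets better than gzip
      res.writeHead(200, { 'Content-Encoding': 'br' })
      raw.pipe(zlib.createBrotliCompress()).pipe(res)
    } else if (accepts.includes('gzip')) {
      res.writeHead(200, { 'Content-Encoding': 'gzip' })
      raw.pipe(zlib.createGzip()).pipe(res)
    } else {
      res.writeHead(200)
      raw.pipe(res)
    }
  })
  .listen(3000)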
What I use → My hosting, Netlify, is already using Brotli compression by default.
Brotli compression has gained widespread browser support and is particularly effective for text-based files such as HTML, JavaScript and CSS assets.
Netlify Edge is already encoding and caching suitable assets with Brotli, and delivering correctly compressed assets depending on the requesting browser.
Median Brotli / gzip comparisons, according to Akamai’s testing:
- JavaScript files compressed with Brotli are 14% smaller than gzip.
- HTML files are 21% smaller than gzip.
- CSS files are 17% smaller than gzip.
4. Remove duplicate modules in JavaScript bundles
Remove large, duplicate JavaScript modules from bundles to reduce unnecessary bytes consumed by network activity.
Solution → With webpack you have https://www.npmjs.com/package/webpack-bundle-analyzer to check JS bundles and start cleaning up your project.
What I use → In Nuxt I already have that package; I just need to add the --analyze flag to my build command and voilà!
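The equivalent in the Nuxt config, if you prefer it over the CLI flag, should be something like:
// nuxt.config.js
export default {
  build: {
    analyze: true, // generates the webpack-bundle-analyzer report on build
  },
}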
Reduce execution time
1. JavaScript execution time
Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this.
Solution (bootup time) → The combination of code splitting, minification and compression, removal of unused code and caching techniques will greatly improve execution time.
What I use → As always, Nuxt is one step ahead; in this video you can see the technique they use with your own eyes: https://www.youtube.com/watch?v=J6airiY8e84
2. Minimizes main-thread work
Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this.
Solution (mainthread work breakdown) → In the end, it's a compendium of many of the things we've already seen in this article or will see later on.
In summary, the idea is to optimize both our JS and CSS code, minifying it and removing unused code, as well as the third-party libraries we are using. Always serve the CSS and JS critical to the page being viewed first and defer the rest.
3. User Timing marks and measures (a cool tool, not an issue)
Consider instrumenting your app with the User Timing API to measure your app's real-world performance during key user experiences. Read more about user timings.
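As a quick illustration, the API boils down to marking two points and measuring between them; applyProjectFilters() is a hypothetical piece of UI work you'd want to time:
// Measure a key user experience with the User Timing API
performance.mark('filter-start')
applyProjectFilters() // hypothetical expensive UI work
performance.mark('filter-end')

performance.measure('filter-projects', 'filter-start', 'filter-end')

const [entry] = performance.getEntriesByName('filter-projects')
console.log(`Filtering projects took ${entry.duration}ms`)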
4. Initial server response time was short
Keep the server response time for the main document short because all other requests depend on it.
Solution (time to first byte) → Take this into account when choosing your hosting: with static hosting everything is usually configured correctly out of the box, and serving from a CDN brings many advantages.
What I use → In my case Netlify gives me a response of 33ms. You can use this speed test tool to see my results and test your own site: testmysite.io/dawntraoz.com
The DOM troubles
1. Avoid large layout shifts
These DOM elements contribute most to the CLS of the page.
Cumulative Layout Shift (CLS) is a Core Web Vitals metric calculated by summing all layout shifts that aren’t caused by user interaction.
What I use → The https://webvitals.dev/cls site gives you detailed information on how your website's CLS is performing.
2. Avoids an excessive DOM size
A large DOM will increase memory usage, cause longer style calculations, and produce costly layout reflows.
Solutions (dom size) → In general, look for ways to create DOM nodes only when needed, and destroy nodes when they're no longer needed.
We can make use of lazy loading components in Nuxt, as in the sketch below.
Keeping your HTML smaller or using a load-more-on-scroll technique can also help.
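A sketch of what lazy loading a component looks like in Nuxt: with the components module enabled, prefixing a component with Lazy turns it into a dynamic import. TheComments is a placeholder component name:
<template>
  <article>
    <PostBody />
    <!-- Only fetched and rendered once showComments becomes true -->
    <LazyTheComments v-if="showComments" />
    <button @click="showComments = true">Show comments</button>
  </article>
</template>

<script>
export default {
  data() {
    return { showComments: false }
  },
}
</script>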
Images, our bigger headache
1. Properly size images
Serve images that are appropriately-sized to save cellular data and improve load time.
Solutions (uses responsive images) → Let's take a look at the different techniques recommended by Google:
- Srcset: The main strategy for serving appropriately sized images is called "responsive images". With responsive images, you generate multiple versions of each image, and then specify which version to use in your HTML or CSS using media queries, viewport dimensions, and so on.
<img src="flower-large.jpg" srcset="flower-small.jpg 480w, flower-large.jpg 1080w" sizes="50vw">
- Image CDNs: another main strategy for serving appropriately sized images. You can think of image CDNs like web service APIs for transforming images.
What I use → I'm using the one available in Storyblok: storyblok image service, always requesting the proper sizes.
SVG: another strategy is to use vector-based image formats. With a finite amount of code, an SVG image can scale to any size. See Replace complex icons with SVG to learn more.
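Coming back to the image CDN approach, this is roughly the kind of helper I rely on to build Storyblok image service URLs. The function and the URL pattern (resize segment plus optional filters before the asset path) are a sketch based on how img2.storyblok.com worked at the time, not my exact implementation:
// utils/image.js (sketch)
export function transformImage(image, size, filters = '') {
  if (!image) return ''
  // Assets come from Storyblok as '//a.storyblok.com/f/<space>/<dimensions>/<hash>/photo.jpg'
  const path = image.replace('//a.storyblok.com', '')
  return `//img2.storyblok.com/${size}${filters}${path}`
}

// e.g. transformImage(story.content.featured_image, '672x0', '/filters:format(webp)')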
2. Defer offscreen images
Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower time to interactive.
Solution (offscreen images) → Lazy load your images. You can use the loading attribute set to lazy, as per the MDN recommendation: Lazy loading.
What I use → In my case I'm using Vue Lazyload to lazy-load my images and background images: https://github.com/hilongjw/vue-lazyload#demo
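The setup is small; a sketch of the plugin plus the directive usage (options and file name are illustrative):
// plugins/vue-lazyload.client.js (sketch)
import Vue from 'vue'
import VueLazyload from 'vue-lazyload'

Vue.use(VueLazyload, {
  preLoad: 1.3, // start loading slightly before the image enters the viewport
  attempt: 1,
})

// In templates:
//   <img v-lazy="imageUrl">                   lazy <img>
//   <div v-lazy:background-image="imageUrl">  lazy background image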
3. Efficiently encode images
Optimized images load faster and consume less cellular data.
Solution (uses optimized images) → This should already be covered if you are using the different techniques we've seen in this article. Using your image CDN service or compressing your images should be enough.
If you don't use any image CDN, you can use this online tool: https://squoosh.app/
4. Serve images in next-gen formats
Image formats like JPEG 2000, JPEG XR, and WebP often provide better compression than PNG or JPEG, which means faster downloads and less data consumption.
Solution (uses webp images) → If you use an image service, as I do, it also has a format filter to get the webp/jpeg format. So you can upload any kind of image, but you will always download the optimized one!
What I use → I use the img2.storyblok service, adding filters:format(webp), but only when the browser supports this format.
Problem I found → I needed a client-side check, using a canvas element, to avoid displaying WebP images in browsers that didn't support the format, like Safari at the time (WebP works in later versions):
format = this.canUseWebP() ? '/filters:format(webp)' : '/filters:format(/*jpeg OR png*/)'
// In methods
canUseWebP() {
  // Cache the result on window so the check only runs once per session
  if (window.canUseWebP) {
    return window.canUseWebP
  }
  const el = document.createElement('canvas')
  if (el.getContext && el.getContext('2d')) {
    // Browsers with WebP support can export a canvas as a WebP data URL
    window.canUseWebP =
      el.toDataURL('image/webp').indexOf('data:image/webp') === 0
    return window.canUseWebP
  }
  window.canUseWebP = false
  return window.canUseWebP
},
Thanks to my colleagues at @passionPeopleNL for shedding light on this matter!
5. Image elements have explicit width and height
Set an explicit width and height on image elements to reduce layout shifts and improve CLS.
Solution (optimize CLS) → Always include width and height size attributes on your images and video elements.
Alternatively, reserve the required space with CSS aspect ratio boxes.
What I use → I have created a generic component for images.
Every time I define an image I call this component, which not only optimizes the image by using v-lazy and filtering the format, but also declares width and height as required props, so you can't forget to pass them.
This way we always make sure we comply with the recommendation. A sketch of the idea is below.
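The component and prop names here are illustrative, not my exact implementation, but they show the pattern of forcing explicit dimensions:
<!-- components/AppImage.vue (sketch) -->
<template>
  <img v-lazy="optimizedSrc" :width="width" :height="height" :alt="alt" />
</template>

<script>
import { transformImage } from '~/utils/image'

export default {
  props: {
    src: { type: String, required: true },
    alt: { type: String, required: true },
    // Required, so an image can never be rendered without explicit dimensions
    width: { type: Number, required: true },
    height: { type: Number, required: true },
  },
  computed: {
    optimizedSrc() {
      return transformImage(this.src, `${this.width}x${this.height}`)
    },
  },
}
</script>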
6. Use video formats for animated content
Large GIFs are inefficient for delivering animated content. Consider using MPEG4/WebM videos for animations and PNG/WebP for static images instead of GIF to save network bytes.
Solution (efficient animated content) → Many image CDNs support GIF to HTML5 video conversion. You upload a GIF to the image CDN, and the image CDN returns an HTML5 video.
I recommend the article Improve Animated GIF Performance With HTML5 Video if you need to do this yourself.
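The markup that replaces a GIF is the classic muted, looping, autoplaying video (file names are placeholders):
<video autoplay loop muted playsinline>
  <source src="animation.webm" type="video/webm">
  <source src="animation.mp4" type="video/mp4">
</video>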
7. Preload Largest Contentful Paint image
Preload the image used by the LCP element in order to improve your LCP time.
Solution (optimize LCP) → If you know that a particular resource should be prioritized, use link rel="preload" to fetch it sooner.
Many types of resources can be preloaded, but you should first focus on preloading critical assets, such as fonts, above-the-fold images or videos, and critical-path CSS or JavaScript.
What I use → On the article page I've placed the featured image of the article as a preload link in the head tag, using the head method that Nuxt provides us:
head() {
  return {
    link: [
      {
        rel: 'preload',
        as: 'image',
        href: transformImage(this.story.content.featured_image, '672x0'),
      },
    ],
  }
}
Fonts
1. All text remains visible during webfont loads
Leverage the font-display CSS feature to ensure text is user-visible while webfonts are loading.
Solution (font display) → The easiest way to avoid showing invisible text while custom fonts load is to temporarily show a system font. By including font-display: swap in your @font-face style, you can avoid FOIT in most modern browsers:
@font-face {
  font-family: 'Pacifico';
  font-style: normal;
  font-weight: 400;
  src: local('Pacifico Regular'), local('Pacifico-Regular'), url(https://fonts.gstatic.com/s/pacifico/v12/FwZY7-Qmy14u9lezJ-6H6MmBp0u-.woff2) format('woff2');
  font-display: swap;
}
The font-display API specifies how a font is displayed. swap tells the browser that text using the font should be displayed immediately using a system font. Once the custom font is ready, it replaces the system font.
For Google Fonts, for example, it's as simple as adding the &display=swap parameter to the end of the Google Fonts URL:
<link href="https://fonts.googleapis.com/css?family=Roboto:400,700&display=swap" rel="stylesheet">
What I use → The @font-face swap technique is the one I'm using at the moment, with the font files included directly in my project.
What to avoid?
1. Avoid multiple page redirects
Redirects introduce additional delays before the page can be loaded (avoid multiple redirects).
I avoid → I'm not doing any redirects.
2. Avoid serving legacy JavaScript to modern browsers
Polyfills and transforms enable legacy browsers to use new JavaScript features. However, many aren't necessary for modern browsers.
Solution (detailed info) → For your bundled JavaScript, adopt a modern script deployment strategy using module/nomodule feature detection to reduce the amount of code shipped to modern browsers, while retaining support for legacy browsers.
What I use → In Nuxt we have the --modern flag with some options for the build command. In my case, for generate, --modern is sufficient.
Check this awesome tutorial https://dev.to/debs_obrien/modern-build-in-nuxt-js-17lc to learn more about it.
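In practice that's just a matter of adding the flag to the generate script in package.json, something like:
{
  "scripts": {
    "generate": "nuxt generate --modern"
  }
}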
3. Avoids enormous network payloads
Large network payloads cost users real money and are highly correlated with long load times.
Solution (total byte weight) → There are some ways to minimize our payload size:
- Defer requests until they're needed. Nuxt is taking care of it.
- Optimize requests to be as small as possible, minifying and compressing, and try to use WebP for images when possible. An image CDN will always be there to keep our performance up!
- Cache requests so the page doesn't re-download the resources on repeat visits.
web.dev recommends checking the Network reliability landing page to learn more about how caching works and how to implement it.
4. Avoids document.write()
For users on slow connections, external scripts dynamically injected via document.write() can delay page load by tens of seconds.
Solution (no document write) → In your own code you have full control to not add it, but I recommend that, whenever you are going to use a third party, you check that it is not using document.write() for something.
5. Avoid non-composited animations
Animations which are not composited can be janky and increase CLS.
Solution (non composited animations) → Right now I don't have many animations, but for the few I have I use the properties that are cheap for the browser to animate: translate and scale.
Reading this tutorial https://www.html5rocks.com/en/tutorials/speed/high-performance-animations/ will make it clear why.
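As a small illustration of the difference, assuming a simple slide-in element (class names are made up): animating left forces layout on every frame, while transform can be handled by the compositor.
/* Avoid: animating `left` triggers layout on every frame */
.toast--janky {
  animation: slide-left 300ms ease-out;
}
@keyframes slide-left {
  from { left: 100px; }
  to { left: 0; }
}

/* Prefer: `transform` can be composited off the main thread */
.toast--smooth {
  animation: slide-in 300ms ease-out;
}
@keyframes slide-in {
  from { transform: translateX(100px); }
  to { transform: translateX(0); }
}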
Interesting articles on this topic
https://wildbit.com/blog/2020/09/30/getting-postmark-lighthouse-performance-score-to-100