Improving Website Performance: Utilizing Webpack for Efficient File Compression and Optimization.
Oleh SYPIAHIN
Posted on August 4, 2023
In web development, application performance is crucial for providing a satisfying user experience. Page loading speed and responsiveness directly affect user satisfaction and project success. One effective optimization is file compression: it significantly reduces the size of transmitted data and speeds up page loading. This article explores powerful file compression tools like Gzip and Brotli and demonstrates how to use them with the Webpack Compression Plugin to optimize web applications and achieve rapid loading across all devices and network conditions.
A little bit about me: 👨‍💻 I excel in optimizing performance and enjoy enhancing applications to save valuable ⏰ and resources 💰.
Today, you will get an in-depth understanding of webpack and some valuable tips. By going deeper, you will gain a complete understanding of what you are doing, with answers to questions like "Why?" and "How does it work?"
Let's analyze the events that we will influence when building the application.
Research Scope.
Our goal is clear. Before we start, let us quickly remind you what comes next after your code passes the PR.
- Building the project on the server. In other words, we create the bundle of files the user will open in their browser.
- The user opens your application's address, and the browser sends HTTP requests.
- In response, the browser downloads the bundle files.
- The browser runs the bundle files; all of your application's code is now accessible to the browser.
- The DOM is created and displayed. The user interacts with the site.
What is a bundle file?
A bundle file is a single file that combines and optimizes all the resources (e.g., JavaScript code, CSS styles, images) of your web application, making it easier to load and run on the client side.
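For readers who have not looked at webpack outside of Angular's tooling, here is a minimal, framework-agnostic sketch of how a bundle is produced; the file names are purely illustrative:

const path = require('path');

// webpack.config.js - minimal sketch: walk the import graph starting at the
// entry point and emit a single optimized bundle file into dist/
module.exports = {
  mode: 'production',
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.[contenthash].js',
  },
};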
The browser, then, is the tool that decodes this code into a form digestible for the user. It understands what needs to be done to turn the code into a website or application. I want to note that at this stage, the speed of decoding and constructing the DOM depends on the computing power of the user's device.
As a developer, you cannot control the performance of the browser or the power of the user's device. However, there is a crucial aspect where you and/or the DevOps engineer can make a significant impact: the bundle size. This is the focus of our analysis, and the solution lies in compression.
What is Compression? A Theoretical Dive.
Compression of bundle files during the Webpack build is the process of using compression algorithms, such as Gzip and Brotli, to reduce the size of the final bundle files containing all the code and resources of your web application. This improves website loading performance, reduces network traffic, and allows your application to run faster on the client side.
Compression algorithms identify repeating data and replace it with shorter symbols, reducing the amount of transmitted information and the file size. Gzip and Brotli are lossless, so they are completely safe for text and code; for already-compressed formats such as images and videos they simply bring little additional benefit.
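You can see the effect locally with Node's built-in zlib module, which implements the same Gzip and Brotli algorithms; a minimal sketch with artificially repetitive input:

const zlib = require('zlib');

// Highly repetitive "code" compresses extremely well
const text = 'const x = 1;\n'.repeat(1000);

const gzipped = zlib.gzipSync(text);
const brotli = zlib.brotliCompressSync(text);

console.log('original:', Buffer.byteLength(text), 'bytes');
console.log('gzip:    ', gzipped.length, 'bytes');
console.log('brotli:  ', brotli.length, 'bytes');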
To use a compression algorithm when building and serving a web application, every link in the chain has to support it:
- Server Configuration: Ensure that your web server supports gzip/brotli compression. Usually, this can be set up through the web server's configuration files.
- Compression Configuration: Configure the server to compress specific types of files when transmitting them to clients. Typically, text files like HTML, CSS, JavaScript, JSON, XML, and others, such as SVG, are compressed.
- File Compression: During the project build, you can use compression tools like Gulp or Webpack to automatically compress and optimize your files before hosting them on the server.
- Setting Headers: After compressing the files, the server sends the appropriate HTTP headers when using Gzip or Brotli compression. The «Content-Encoding: gzip» and «Content-Type» headers tell clients that the data is compressed and how to unpack it correctly. The small schema below illustrates the flow.
Flow Compression/Decompression Representation
The good news is that most browsers support decompressing data from the server "under the hood" and do it on the fly.
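To make the "Setting Headers" step concrete, here is a minimal sketch of a plain Node server that ships a pre-compressed bundle; the URL and file path are hypothetical, and a real server would map requests to files more carefully:

const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');

  if (req.url === '/main.js' && acceptsGzip) {
    // The headers tell the browser what the payload is and how to decode it
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      'Content-Encoding': 'gzip',
    });
    fs.createReadStream('./dist/main.js.gz').pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);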
If you need to support specific browsers, please check the compatibility tables first.
Compression. First Steps.
So, now we need to understand how to compress the data.
I am using my well-known Angular pet application from my earlier article about lazy imports, and I will now focus on applying the gzip algorithm.
I will break down the steps into smaller subtopics to make them easier to follow:
- Switch from the standard builder to a custom one. This will help us better understand its benefits.
- It is necessary to generate a specific configuration file for webpack.
- To prepare for work, it is necessary to install additional libraries to facilitate the workflow.
- Create an application using the updated configuration.
- Run and compare performance.
Standard Builder vs. Custom Builder.
The process of building the app is outlined in the angular.json configuration file. Within the "build" configuration, there is a package that offers a standard build setup for the majority of Angular applications.
"projects": {
...
"[project]": {
...
"architect": {
"builder": "@angular-devkit/build-angular:browser",
We require something else. We aim to adjust the settings and enhance the functionality of the standard build. We need greater flexibility in managing the build process and Webpack configuration. We strive for excellence in our standards.
The good news is that a custom-webpack package already exists for developers with non-standard needs. Install it and change the builder string.
npm i @angular-builders/custom-webpack
"architect": {
...
"build": {
"builder": "@angular-builders/custom-webpack:browser",
"options": {
"customWebpackConfig": {
"path": "./custom-webpack.config.js",
"replaceDuplicatePlugins": true
},
...
}
},
The place where we change the webpack configuration is a dedicated JavaScript file. In the options section we now have a customWebpackConfig block: create a new file (with this name or any other) and point its path property to it.
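As a side note, the same package also ships a dev-server builder so that ng serve keeps using the custom configuration; a minimal sketch of the serve target (the option may be called browserTarget or buildTarget depending on your Angular version):

"serve": {
  "builder": "@angular-builders/custom-webpack:dev-server",
  "options": {
    "browserTarget": "[project]:build"
  }
},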
We need to install the special CompressionPlugin (the main star of our topic) before we can proceed with configuring the custom-webpack.config file.
npm install compression-webpack-plugin --save-dev
The initial custom-webpack.config.js looks like this:
const CompressionPlugin = require("compression-webpack-plugin");

module.exports = {
  mode: "production",
  plugins: [
    new CompressionPlugin({
      // Be very careful with 'true' here; it can sometimes cause issues (see below)
      deleteOriginalAssets: false,
      algorithm: 'gzip',
      test: /\.(js|css|html)$/,
    }),
  ]
};
We have installed the CompressionPlugin dependency and set it up. We use the gzip algorithm for compression, and we have specified the file extensions we want to compress as a regular expression.
We will not rely on our own intuition 🤔, but instead use internationally recognized and well-tested algorithms 🌍 as we dig deeper into the subject 🔍. This library supports algorithms such as Gzip and Google's Brotli. We also have to tell it which files we want to compress.
Brotli is known for providing better compression than Gzip, especially for text-based files such as HTML, CSS, and JavaScript, with compression rates 20-30% higher than Gzip. For instance, Gzip typically reduces JavaScript and CSS files to 30-40% of their original size, whereas Brotli can shrink them by 50% or more.
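If you want to emit Brotli files alongside the gzip ones, compression-webpack-plugin lets you register a second plugin instance; a minimal sketch, where the filename pattern is just one common convention:

new CompressionPlugin({
  algorithm: 'brotliCompress',   // uses Node's built-in Brotli implementation
  filename: '[path][base].br',   // emit main.js.br next to main.js.gz
  test: /\.(js|css|html)$/,
  deleteOriginalAssets: false,
}),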
Additionally, you may have noticed the comment in the code warning about setting deleteOriginalAssets to true. We will address this further in a later discussion.
Our upcoming project will be built using custom-webpack and its new configuration file, custom-webpack.config.js. Let's build:
ng build
Let's take a look at the dist/lazy-import folder. Here, you will find both the original files and … the compressed gzip files.
What? Copies? Wait, wait, wait.
A question that DevOps engineers may have is whether keeping both the clean and the compressed application files increases the application's size on the server. The answer is yes: the size becomes 🔼 X plus (compression ratio multiplied by X), whereas previously only the clean application files, weighing X megabytes, were kept. 💾
Can we keep only the compressed files and still serve the application safely? Yes, we can.
Compress and delete.
Overall, there are libraries available to remove non-compressed files automatically.
The CompressionPlugin has a deleteOriginalAssets option that, when set to true, removes the originals for you. However, some forums report that this feature can accidentally delete important files, causing the application to fail.
However, in this article, we will create a script for this purpose and integrate it into the process. This will help you better understand how these libraries function behind the scenes.
To begin, create a new JavaScript file in the project's root, like remove-non-compressed-files.js, and add the script.
const fs = require('fs');
const path = require('path');

const directoryPath = './dist/lazy-import';

fs.readdir(directoryPath, (err, files) => {
  if (err) {
    console.error('Error reading directory contents:', err);
    return;
  }

  // Filter files to keep only those with the .js extension
  const jsFiles = files.filter((file) => path.extname(file) === '.js');

  // Delete each file
  jsFiles.forEach((file) => {
    const filePath = path.join(directoryPath, file);
    fs.unlinkSync(filePath);
    console.log(`Deleted file: ${filePath}`);
  });

  console.log('File deletion completed!');
});
If we add this script to the build process, it will run after the build and delete every file matching the condition.
ng build && node remove-non-compressed-files.js
The result is more respectable now.
Fantastic! Storing only compressed files can be helpful, but it requires a careful approach and proper integration with your actual server and build process.
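One simple way to keep the cleanup step tied to the build is an npm script; a minimal package.json sketch, where the script name is my own choice:

"scripts": {
  "build": "ng build",
  "build:gzip-only": "ng build && node remove-non-compressed-files.js"
}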
In our upcoming article, we will discuss Docker Example and cover important topics such as server Dynamic Compression, Gzip Support, caching, and more. It's crucial to have a comprehensive understanding of these aspects, and we'll provide the necessary information.
Run and compare performance.
It is time to make sure that our approach is correct and efficient.
Let's set up a local server to serve the isolated, built application using only the gzip files. Run the command below and read my notes.
npm install express compression express-static-gzip
- express: A popular web framework for Node.js used to build web applications. It provides features for handling routes, requests, and responses.
- compression: An Express middleware that enables HTTP response compression on the server side using methods like Gzip and Brotli. It reduces the size of data sent to the client, improving application performance.
- express-static-gzip: An Express middleware for serving static files, such as JavaScript, CSS, and images, that automatically serves their pre-compressed gzip or Brotli versions when available. This reduces the size of transferred files, leading to faster loading times and improved performance for web applications.
Let's create a server configuration file called server.js.
const express = require('express');
const expressStaticGzip = require('express-static-gzip');
const path = require('path');

const app = express();

// Serve the pre-compressed bundle files from the dist folder
app.use(expressStaticGzip(path.join(__dirname, 'dist', 'lazy-import'), {
  enableBrotli: true, // Enable Brotli support
  orderPreference: ['br', 'gz'], // Compression preference order (Brotli will be preferred if available)
  setHeaders: (res) => {
    // Content-Encoding and Content-Type are handled by express-static-gzip;
    // here we only add long-term caching for the served files
    res.setHeader('Cache-Control', 'public, max-age=31536000');
  },
}));

// Start the server on port 8080
const PORT = 8080;
app.listen(PORT, () => {
  console.log(`Server running at http://localhost:${PORT}/`);
});
Let's run the server and open the application.
node server.js
Performance Tests.
Prerequisites:
- I often throttle network speed, for example simulating a slow 3G connection, to test how the application performs under real-world conditions with limited bandwidth and higher latency. Simulating a slow connection shows how the application behaves when network speed is far from optimal.
- After each test, kindly delete the dist/[app-name] folder, restart the server following the build, and clear the cache in your browser.
- This approach only works in browsers that support gzip compression, which means all modern browsers. Goodbye, IE.

The first picture represents the uncompressed version of the application: it resulted in a transfer of 284 kB. The second picture shows the compressed application, which resulted in a transfer of only 106 kB.
The results show a transfer-size win of around 50% 📊, which also translates into a faster ⌛️ DOMContentLoaded event.
Scale the application up and you will see even more impressive results.
One last look at the headers: the responses of our compressed application now carry Content-Encoding: gzip in addition to the standard Content-Type: application/javascript.
The browser decompresses the data and uses the resulting JavaScript to process and render the page.
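You can verify the same thing from Node; a small sketch against the local server above (the asset path /main.js is hypothetical; use a real file from your dist folder). The raw http module does not decompress automatically, so the headers arrive exactly as the server sent them:

const http = require('http');

http.get(
  {
    host: 'localhost',
    port: 8080,
    path: '/main.js',
    headers: { 'Accept-Encoding': 'gzip, br' },
  },
  (res) => {
    console.log('content-encoding:', res.headers['content-encoding']);
    console.log('content-type:', res.headers['content-type']);
    res.resume(); // discard the body; we only care about the headers
  }
);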
Decompression Process.
We can see that the overall loading now includes a decompression step. The legitimate question is: "How long does the decompression process take, and could it potentially affect the user's experience?"
Gzip and Brotli decompression usually happens very quickly in modern browsers and computers. Compression algorithms are designed to provide high-speed decompression with minimal computational overhead.
The speed of decompression depends on several factors:
- Hardware: The processing power and available memory impact the decompression speed. Modern computers and mobile devices typically have sufficient resources for rapid gzip data processing.
- Algorithm Optimization: Browser developers continuously work on optimizing decompression algorithms to improve performance. This helps speed up the unpacking of data in the browser.
Also, note that compression libraries let you configure the compression level. Add the option below and check the balance between bundle size and compression/decompression time for yourself.
new CompressionPlugin({
  ...
  // For gzip, the maximum compression level is 9
  compressionOptions: { level: 9 },
  ...
}),
- Size of Source Data: Smaller files decompress faster than larger ones. Therefore, optimizing the size of files and combining multiple small files into one can reduce decompression time.
- Caching: If files have been previously decompressed and cached in the browser, it contributes to faster page loading.

Overall, gzip decompression is a relatively fast operation and is usually not a performance bottleneck when working with web applications. However, like any performance optimization, monitoring file sizes and ensuring proper caching is essential to maximize the user experience.
Conclusion:
In summary, by harnessing the full potential of compression technologies and staying dedicated to continuous improvement, web developers can pave the way toward a faster, more sustainable, and more engaging web experience for users worldwide.
Please, follow me:
LinkedIn
Medium
Hackernoon