Porting from RequireJS to ES6
Brian Kirkpatrick
Posted on January 20, 2023
Background
We have a modest (6-figure SLOC) size codebase for web applications that has historically been built on RequireJS.
If you haven't used RequireJS before, know that it's actually a pretty neat solution. Particularly in the early-to-mid 2010s, there were no consistent cross-context module loader environments. (Remember the YUI library?) If you didn't have a decent loader to use, you were either packing your JS source (still piecemeal, even with something like Browserify), probably from a highly-dynamic and inconsistent CommonJS/Node context, or writing directly within <script> tags themselves (my stomach feels sick...).
Compared to getting Node-style CommonJS modules to work within a browser context, RequireJS could really feel like a breath of fresh air. Developers simply used CommonJS-like require() statements, but could define modules within a self-contained closure with some expectation of consistent meta symbols (module, exports, and the require function itself). The loader extensions were particularly unique and useful, giving you a way to hook in a wide variety of different resources. I even had a .ZIP filesystem loader for archiving/compressing and exposing static assets!
RequireJS used a different loader standard, called AMD (for "asynchronous module definition"). By defining and loading modules asynchronously, you could expect better startup performance when your page or app loaded (as opposed to loading one big bundle of JS, which was difficult if not impossible to debug in production without a heavy cast of developer tool plugins--if they existed at all). It was great to use a browser-first context with explicit symbols (you'd be surprised how much Node implements implicitly), and porting other resources from CommonJS was surprisingly easy since you could always just wrap a definition in an AMD-compatible closure. This remained the case even when UMD (universal module definition) headers had their time in the sun for a few years.
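For readers who never used it, an AMD module looked something like the following. This is a minimal runnable sketch: the "geometry" module name is hypothetical, and the define() stand-in is included only so the snippet runs outside a RequireJS-enabled page.

```javascript
// Minimal stand-in for RequireJS's global define(), included only so
// this snippet runs anywhere; in a real page, RequireJS provides it.
const registry = { "geometry": { PI: Math.PI } };
function define(deps, factory) {
  // RequireJS resolved each dependency path (asynchronously, in the
  // real implementation) and injected the loaded modules into the
  // factory callback.
  return factory(...deps.map((name) => registry[name]));
}

// A typical AMD module: dependencies declared up front, exports
// produced by returning a value from the closure.
const circle = define(["geometry"], function (geometry) {
  return {
    area: (r) => geometry.PI * r * r,
  };
});

console.log(circle.area(2)); // ~12.566
```

The closure kept the module's internals private, and the returned object became the module's public surface--the same role `export default` plays today.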
ES6 introduced support for import and export symbols, syntax, and behavior. Surprisingly (largely, I think, because of the strong buy-in to CommonJS/Node environments), it took a long time for ES6 modules to take off. Even over the past few years, major packages focused largely on (say) cross-building to an ES6 module, but many projects have since successfully migrated the underlying codebase to ES6 with builds "out" to other contexts.
There are still a large variety of reasons to retain other module environments, largely (in my experience) because of other abstraction buy-ins that obfuscate how code is loaded and utilized within the browser context on deployment. React (in general) and .TSX (in particular) are great examples, because of how strongly components are mapped to file structures and because an up-front build pass is required anyway. And who can forget jQuery? Ha ha ha, just kidding.
Motivation
This brings us to our official migration. Farewell, RequireJS! You have served us well.
(A side note: Sometimes you will see ES6 modules referred to as "ESM". You will also see a file extension .MJS used, largely to tell Node that it is using a JavaScript module. From the client's perspective, type="module" is sufficient, but it can be nice to make it obvious, and popular editors like VS Code know about .MJS already. The biggest adjustment you might need is adding the MIME type mapping to your static file server, which is pretty easy to do in something like nginx.)
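In nginx, for example, the mapping is a one-line addition to the types block (a sketch; the exact file and location vary by distribution):

```nginx
# Serve .mjs files with a JavaScript MIME type so browsers will
# execute them as modules (text/javascript is the standard type).
types {
    text/javascript  js mjs;
}
```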
There are several major reasons for this migration:
- RequireJS has reached EOL (end of life) and is no longer receiving major updates. In something as fundamental as a module loader, stability is a good thing, but in this case it is losing support overall within the ecosystem as ES6 modules finally reach a level of critical mass.
- RequireJS has had, by design, significant performance issues on module loading. ES6 modules implement import and export symbols and logic within the JavaScript interpreter specification itself, which means much better performance for module loading in general. This is in addition to not string-parsing every module load for require() statements. Look it up, it's true! When loading a define-enclosed module, RequireJS will turn it into a string and match against a pattern for require() calls to resolve internal dependencies. For a loader system whose primary benefits are supposed to include asynchronous performance, this was actually kind of mind-boggling when I learned about it. Among other things, this means you can't dynamically "construct" require() arguments to resolve dependencies at runtime.
- ES6 modules are natively cross-compatible across both browser and command-line (e.g., Node and NPM/Yarn) contexts. This means it becomes MUCH easier to roll out CI integration at the module level for any dependency that would benefit from it, including obfuscation/minification; testing; documentation; deployment/release triggers & announcements; and many other great features--even though the module itself is written to be browser-first.
- I mentioned the community at large has finally reached critical mass in its migration towards ES6 modules (away from browser-native, RequireJS, Node/CommonJS, and any other UMD-header contexts). This means reusability (for both our own internally-developed modules and for any other dependencies we want to leverage) is MUCH better and easier. No more random headers! No more arbitrary closures to enforce a cross-compatible context! It's pretty exciting, the more you think about it. Most open source JS projects are at the point where they have native support for ES6-compatible builds, so it's simply a matter of import symbol from "path"; and you're good to go. This is particularly true if, like us, you are using git submodules as an alternative to package management, especially since you can build and pull an ES6 module from (say) a dist/ folder for any projects that aren't already ported. (For more background on the various module formats, see https://www.zachgollwitzer.com/posts/scripts-commonjs-umd-amd-es6-modules.)
Speaking of paths, import has several modes in which it can be used with respect to how module paths are resolved. (There is also somewhat-experimental support in major browsers for an importmap feature to define how module paths are resolved, but we haven't touched that yet.) The path resolution for RequireJS wasn't particularly bad, because it was always based on the host path anyway (being evaluated in the browser context), but on the shell side (for CI etc.) this is a nice improvement for consistency's sake, especially to get away from node_modules/ hell.
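For reference (again, we haven't adopted this ourselves yet), an import map is just a bit of JSON in the page that tells the browser how to resolve bare or prefixed specifiers. The paths and names here are hypothetical:

```html
<!-- Maps the bare specifier "papaparse" and the "app/" prefix to
     concrete URLs; the map must appear before any module scripts. -->
<script type="importmap">
{
  "imports": {
    "papaparse": "/vendor/papaparse.mjs",
    "app/": "/static/modules/app/"
  }
}
</script>
<script type="module">
  import Papa from "papaparse";
  import { MyClass } from "app/myclass.mjs";
</script>
```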
Porting
So, you have a large collection of RequireJS modules. How do you port them? I find there are three major changes needed for most modules:
- First, you need to eliminate the "define()" closure and any dependencies on the symbols it exposes (module, exports, etc.).
BEFORE:
define(function(require, exports, module) {
    class MyClass {
        ...
    }
});
AFTER:
class MyClass {
...
}
- Next, you need to replace "require()" statements themselves with appropriate "import" statements.
BEFORE:
const dependency = require("path/to/dependency");
AFTER:
import dependency from "path/to/dependency.mjs";
- Then, you need to change your export statements (there were three ways to export symbols in RequireJS closures: returning from the closure, assigning to exports, and assigning to module.exports).
BEFORE:
return Object.assign(MyClass, {
    "__metadata__": "..."
});
AFTER:
export default Object.assign(MyClass, {
    "__metadata__": "..."
});
Finally, there are likely some modules that utilize external resources. For example, some of our modules reference a side-loaded .CSV table for which we had a custom RequireJS module loader. When require() arguments began with the registered prefix "csv!", the loader extension would use Papaparse to implicitly transform the content into a module-level Array of Objects. This can easily be replaced by an inline chain of fetch() and then() Promises, using the await keyword. (Incidentally, this frees us from an additional dependency, because we no longer need the "loader" module to define the extension, as well as freeing us from the sticky global-context registration of the extension against the prefix.)
BEFORE:
require.config({ "paths": {
    "csv": "mycsvloader"
}});
const data = require("csv!path/to/table.csv");
AFTER:
const data = await fetch("path/to/table.csv")
    .then(response => response.text())
    .then(text => papaparse.parse(text.trim(), {
        "header": true
    }).data);
This ends up being sufficient for 99% of cases I've come across.
Conclusion
I will admit that RequireJS still holds sentimental value for me. For years, it let us "short-circuit" the module wars and focus explicitly on robust development of reusable front-end-first modules. But once we invested the time and effort to port to ES6 modules, the benefits were clear, and we're already seeing significant improvements. In particular, I have to say I'm very excited about how much CI power we can bring to bear on individual modules themselves.
If you're in a similar position, I'd encourage you to consider investing the time and effort to conduct a similar refactor. The JavaScript ecosystem is nothing if not incredibly hyper-pluralistic--but for something as fundamental as module loading, it's nice that we finally seem to be converging on something that will be stable well into the future (especially at the interpreter level, instead of being part of yet another higher-level framework). I think in the long run this is actually going to enable much more diversity and pluralism in the JavaScript ecosystem as the world of developers starts to collectively leverage a much greater degree of reusability and cross-compatibility.