Writing Deno/Node Cross-Compatible Javascript
ndesmic
Posted on September 28, 2021
Note: Not long after this was published, Deno 1.15 added several new compatibility features like http polyfills and the --compat command-line flag, which automatically sets up Node polyfills. Try those instead! https://deno.com/blog/v1.15
Being a fan of Deno, I've been trying to see how I can move more of my Node workflows and code over to it. This is fine for private code, but it's likely to make people who use Node a bit uncomfortable. Node is nice and stable, it's well known, it doesn't rock the boat. When developing new libraries, I figured it might be an interesting idea to make them cross-compatible so that when the time comes I don't have to re-implement them to change runtimes. Sadly, this turns out to be harder than expected. The problem isn't too unlike sharing code between Node and the browser. Node made many early decisions pre-standardization that have rendered important parts of it incompatible. We don't have easy access to fetch, for example; we have the much more low-level http module. For more algorithmic code this isn't much of an issue, but when it comes to things like direct file system access, which is not and probably never will be standardized with a simple API in browsers, we need ways to bridge some of the divide.
Basic Tips
Use an updated version of node
First off, we should use and build for the latest versions of Node. Node is generally moving closer toward newer, standard implementations and away from its old ones. This can be seen in things like WHATWG (or web) streams, which will one day replace the existing Node streams. While we can't do much about existing libraries using the older paradigms, when building our own stuff we can make sure we're using new versions of Node that support these standard APIs. This keeps our code from being specific to Node's implementations. It might deter consumers not on the latest Node, but time will solve that for us.
We also want to be using ESM. CommonJS is going to complicate things quite a bit so let's use the modern and standardized module format.
Don't use Typescript
At least not directly. The problem is that Deno lets us do unusual things like import TypeScript directly, which complicates matters (though in some cases we don't have a choice, see below). It also causes problems because Deno adheres to ESM rules and requires extensions (.ts), while TypeScript wants you to leave extensions out. Unfortunately this is just a big mess. I would instead push for something like JS with JSDoc comments to get the typing benefits if you need them.
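For instance, a plain JS file with JSDoc annotations (a hypothetical example, not from the article's repo) gets you editor hints and type-checking without any compile step:

```javascript
// Plain .js file: no compile step, but editors and `tsc --checkJs`
// can still type-check it from the JSDoc annotations.

/**
 * @param {string} name
 * @param {number} count
 * @returns {string}
 */
function repeatGreeting(name, count){
    return `Hello, ${name}! `.repeat(count).trim();
}

console.log(repeatGreeting("Deno", 2)); // Hello, Deno! Hello, Deno!
```

Since it's plain JS with real extensions, the same file loads unchanged in Node, Deno, and the browser.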
Opening a file in Node
Opening a file in Node requires importing fs. The bare fs module only nets you the old callback versions; what you really want is the promisified versions in fs/promises. readFile takes a second parameter for the encoding, typically utf-8 for text.
We also don't want to deal with the .mjs extension if we can help it, so I'd recommend setting type: "module" in your package.json.
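For reference, a minimal package.json that opts the whole package into ESM might look like this (the name and version are placeholders):

```json
{
    "name": "read-file-demo",
    "version": "1.0.0",
    "type": "module"
}
```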
//read-file-node.js
import fs from "fs/promises";
const data = await fs.readFile("../data/hello.txt", "utf-8");
console.log(data);
And we can run with node ../src/read-file/read-file-node.js
Opening a file in Deno
Deno is a little simpler. For one, the standard library is separate but the basic runtime gives us a handy method to read files so we don't need it for such a basic operation.
//read-file-deno.js
const data = await Deno.readTextFile("../data/hello.txt");
console.log(data);
Deno has fancy permissions so the command to run needs to give it the power to read files: deno run --allow-read ../src/read-file/read-file-deno.js
Node In Deno: Polyfill + Import Maps
First lets see what happens running Node version in Deno:
error: Relative import path "fs/promises" not prefixed with / or ./ or ../ from "file:///D:/projects/deno-node/src/read-file/read-file-node.js"
Deno doesn't know what to do with unprefixed paths that Node provides.
We can teach Deno about fs using import maps. An import map is a JSON file that tells the runtime how to map one module path to another. The nice thing is this works for bare specifiers too, so we can point fs at something more useful. In fact, Deno comes with some Node polyfills, so we can point directly at those.
{
    "imports": {
        "fs": "https://deno.land/std/node/fs.ts"
    }
}
We can call this import-map-deno.json and tell Deno to run with it: deno run --allow-read --import-map=../src/read-file/import-map-deno.json ../src/read-file/read-file-node.js.
This will allow us to run the Node code in Deno!
Deno in Node: Global Deno Polyfill
And when we run the Deno version in Node:
file:///D:/projects/deno-node/src/read-file/read-file-deno.js:1
const data = await Deno.readTextFile("../data/hello.txt");
^
ReferenceError: Deno is not defined
It doesn't know what Deno is. Unfortunately, the reverse direction is slightly harder. What we want is to be able to use a Deno global object. To do so we'll need to modify the code to import a script that sets up a Deno polyfill on the global object.
//deno.js
import fs from "fs/promises";

function readTextFile(path){
    return fs.readFile(path, "utf-8");
}

globalThis.Deno = {
    readTextFile
};
And then import it:
//read-file-deno.js
import "./deno.js";
const data = await Deno.readTextFile("../data/hello.txt");
console.log(data);
This will now work when run from node!
But uh-oh, we modified the script, so when we go back and try to run it from Deno:
error: Relative import path "fs/promises" not prefixed with / or ./ or ../ from "file:///D:/projects/deno-node/src/read-file/deno.js"
Again we can use those cool import maps. While Node will always resolve import "./deno.js", in Deno we can tell it to resolve to something completely different. In fact, since the Deno object already exists there, we don't need to import anything at all! Doing this with import maps is a little weird: as far as I understand, they have to point at something and we can't inline a no-op. So we'll create a completely empty file, null.js.
{
    "imports": {
        "./deno.js": "./null.js"
    }
}
Now we need to change how we run Deno to take this import map into account: deno run --allow-read --import-map=../src/read-file/import_map_deno.json ../src/read-file/read-file-deno.js
And this will work.
Another Example: Fetch
This will be a bit harder because the APIs are not as 1-to-1. We'll also need 3rd party dependencies to deal with this. We'll start with Deno this time because it's easier.
Deno
const response = await fetch(`https://api.github.com/users/ndesmic/repos`, {
    headers: {
        "Accept": "application/vnd.github.v3+json"
    }
});
const json = await response.json();
console.log(json);
Nothing interesting here, standard fetch like you'd use in the browser. We run it like deno run --allow-net ../src/fetch/fetch-deno.js
Node
Here we'll need to pull in a library to do fetching. We want this to be close to standard fetch so that it'll just work™. What I don't want is a different API like axios that I'd need to adapt. For this I decided to go with node-fetch, since that seems like the popular choice.
import fetch from "node-fetch";

const response = await fetch(`https://api.github.com/users/ndesmic/repos`, {
    headers: {
        "Accept": "application/vnd.github.v3+json"
    }
});
const json = await response.json();
console.log(json);
And we run it thusly: node ../src/fetch/fetch-node.js
Deno in Node
We can start from the error:
const response = await fetch(`https://api.github.com/users/ndesmic/repos`, {
^
ReferenceError: fetch is not defined
Well, we know we didn't have fetch, so let's add it:
import fetch from "node-fetch";
Hey, wait a second, now it's exactly the same as the Node version!
Node in Deno
Well, all that means is that we need to make the Node version work in Deno. The error:
error: Relative import path "node-fetch" not prefixed with / or ./ or ../ from "file:///D:/projects/deno-node/src/fetch/fetch-node.js"
We need to use an import map to point this somewhere. This time the module actually has an export, so we can't just map it to null. This is where it's nice when the APIs match; otherwise we might have to do some complicated internal import-mapping. But node-fetch is easy to polyfill:
//node-fetch.js
export default fetch;
And the import map:
{
    "imports": {
        "node-fetch": "./node-fetch.js"
    }
}
And we run it with the import map: deno run --allow-net --import-map=../src/fetch/import_map_deno.json ../src/fetch/fetch-node.js
Hard Mode: HTTP Listen
Both Deno and Node provide APIs for listening for HTTP traffic, and there is no such thing in browsers. There's the service worker API, which has similarities and which Deno follows, but browsers have no concept of listening on a TCP socket. The two runtimes' APIs are very different, though, so this is a much harder problem.
Http Listen in Deno
//http-listen.js
const port = parseInt(Deno.env.get("PORT") ?? "8080");
const server = Deno.listen({ port });

async function serveHttp(connection) {
    const httpConnection = Deno.serveHttp(connection);
    for await (const requestEvent of httpConnection) {
        requestEvent.respondWith(
            new Response(`Hello from Server!`, {
                status: 200,
                headers: {
                    "Content-Type": "text/plain"
                }
            })
        );
    }
}

console.log(`Server running on port ${port}`);
for await (const connection of server) {
    serveHttp(connection);
}
Deno has a listen method for incoming TCP connections. Those are then "upgraded" to HTTP with serveHttp, which hands back the web-standard Request/Response objects. What's also interesting is that we're using async iteration over these stream primitives, which Node only gained very recently, so even the API shapes are different.
What this will do is listen on a port given by the environment (for a little extra compatibility spice) or default to 8080. It will respond with "Hello from Server!".
We can run it with PORT=8081 deno run --allow-net --allow-env ../src/http-listen/http-listen-deno.js to listen on port 8081.
Http Listen Deno from Node
We'll instantly hit a bunch of issues here. The first is Deno.env. We'll again be polyfilling the Deno object like we did for file reads. To make env work we create an object and attach it to the global Deno object:
//deno.js
const env = {
    get: name => process.env[name]
};
//(as before, this gets attached via globalThis.Deno, along with the listen and serveHttp polyfills defined below)
Easy enough. Now the tricky part: we need to polyfill Deno.listen. The polyfill we'll be making is extremely sparse and will only handle exactly the cases we need and nothing else, because making a robust polyfill is really hard and requires a lot of code and testing. I want to keep things simple. Deno.listen gives back a stream of incoming TCP connections.
//deno.js
import net from "net";

function listen({ port }){
    const stream = new ReadableStream({
        start(controller){
            const server = net.createServer(socket => {
                controller.enqueue(socket);
            });
            server.listen(port);
        }
    });
    return stream;
}
Here we're going to use a ReadableStream because it greatly simplifies the enqueuing logic. Node requires that ReadableStream be imported, so we need to do that:
//deno.js
import { ReadableStream } from "node:stream/web";
When the controller starts, we also start a Node net.Server, which has a callback for each incoming connection. We enqueue those connections into the stream. The nice part is that both Node (16+) and Deno (but, surprisingly, not browsers yet) can do async iteration over a readable stream, which is exactly what we want. We also have to call listen on the server to start listening.
Now here's a problem. If you tried to run this, you would get an exit with error code 13. Top-level iteration of readable streams is troublesome in both Node and Deno: both like to eagerly exit rather than wait on a top-level promise, so long as there's nothing processing in the event loop. This is never an issue in the browser, since the script lives for as long as the page does. This behavior can be extremely confusing, and we actually need a hack for it to work. Just before iterating over the connections:
//http-listen.js
//keep alive
setInterval(() => {}, 1000);
This will keep enqueuing timer events which will prevent Node from exiting the process. I also said Deno has this problem, which is true, but the original code works because the underlying socket code seems to produce events that keep it alive indefinitely.
Now we need to handle the TCP socket connections with serveHttp. Again, this will be modeled as a stream of requests made on the socket, and each will need to be responded to:
//deno.js
function serveHttp(socket){
    const stream = new ReadableStream({
        start(controller){
            socket.on("data", data => {
                controller.enqueue({
                    respondWith: response => {
                        socket.write(responseToHttp(response));
                    }
                });
            });
            socket.on("close", () => {
                controller.close();
            });
        }
    });
    return stream;
}
The underlying Node socket has a data event which signals a request; we enqueue a mock request event into the stream. I didn't build any more than was absolutely necessary for the demo, so it just has a respondWith method; we never even read the request. respondWith takes a Response object and serializes it back onto the socket. If the socket closes, we close our stream as well.
To serialize the Response we use responseToHttp, which is a super-minimal implementation (note that the indentation inside the template literal matters):
//deno.js
function responseToHttp(response){
    if(!response.options.headers["Content-Length"]){
        response.options.headers["Content-Length"] = response.body.length;
    }
    return `HTTP/1.1 ${response.options.status} ${response.options.statusText ?? "OK"}
${Object.entries(response.options.headers).map(([name, value]) => `${name}: ${value}`).join("\n")}

${response.body}`;
}
It can serialize a text body, some headers, and a status code, but you can probably already see a bunch of issues with it. It's enough to work, though.
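To make the output concrete, here's a hedged reproduction of the serializer above fed a mock object with the { options, body } shape it expects (note the blank line separating headers from body, which HTTP requires):

```javascript
// Minimal HTTP/1.1 serializer matching the sketch above, plus a sample call.
function responseToHttp(response){
    if(!response.options.headers["Content-Length"]){
        response.options.headers["Content-Length"] = response.body.length;
    }
    const headerLines = Object.entries(response.options.headers)
        .map(([name, value]) => `${name}: ${value}`)
        .join("\n");
    return `HTTP/1.1 ${response.options.status} ${response.options.statusText ?? "OK"}\n${headerLines}\n\n${response.body}`;
}

const raw = responseToHttp({
    options: { status: 200, headers: { "Content-Type": "text/plain" } },
    body: "Hello from Server!"
});
console.log(raw);
// → HTTP/1.1 200 OK
// → Content-Type: text/plain
// → Content-Length: 18
// → (blank line)
// → Hello from Server!
```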
Finally we add the Deno polyfill:
//http-listen-deno.js
import "../deno.js";
And it should work.
Fixing Deno back up
So we made modifications and now we need to fix the Deno script so that it works again.
Again we just substitute the global Deno import with a null module:
{
    "imports": {
        "../deno.js": "../null.js"
    }
}
And run appropriately
PORT=8081 deno run --allow-net --allow-env --import-map=../src/http-listen/import-map-deno.json ../src/http-listen/http-listen-deno.js
Http Listen in Node
Node relies on a module called http that sits on top of net. We're going to deal with it at this level rather than getting down into the muck of TCP directly, because that's how you would actually write this for Node.
import http from "http";

const port = process.env["PORT"] ?? "8080";

function requestListener(req, res) {
    res.writeHead(200, "OK", {
        "Content-Type": "text/plain"
    });
    res.end("Hello from server!");
}

const server = http.createServer(requestListener);
console.log(`Server running on port ${port}`);
server.listen(port);
Already we can see many differences. No async, no Request/Response objects etc.
Http Listen Node from Deno
First off, we hit an error with process.env[name]. The problem is that polyfilling globals is harder in this direction. In Deno we could ignore an unnecessary polyfill import using import maps, but Node has no such feature, so instead we need to do a check at runtime.
//http-listen.js
function getEnv(name){
    return globalThis.Deno ? Deno.env.get(name) : process.env[name];
}

const port = getEnv("PORT") ?? "8080";
We could put this in a module but it's simple enough to be inline for now.
Now for the http stuff. I had thought that Deno would have a polyfill for this, but it doesn't seem to yet. Here's what I came up with, again strictly dealing with just the things in use and nothing else:
//http.js
class NodeResponse {
    #request;
    #status;
    #statusText;
    #headers;
    #body;
    constructor(request){
        this.#request = request;
    }
    writeHead(status, statusText, headers){
        this.#status = status;
        this.#statusText = statusText;
        this.#headers = headers;
    }
    end(body){
        this.#body = body;
        this.#end();
    }
    #end(){
        const response = new Response(this.#body, {
            status: this.#status,
            statusText: this.#statusText,
            headers: this.#headers
        });
        this.#request.respondWith(response);
    }
}

function createServer(requestHandler){
    return {
        listen: async port => {
            const server = Deno.listen({ port: parseInt(port) });
            for await(const connection of server){
                const httpConnection = Deno.serveHttp(connection);
                for await(const requestEvent of httpConnection){
                    requestHandler(null, new NodeResponse(requestEvent));
                }
            }
        }
    };
}

export default {
    createServer
};
The order in which we attach a handler and listen on a port is different, but that's not too hard to deal with using a little currying. Where it gets tricky is the difference between the Request and Response objects. We aren't dealing with the Request, so we can just null that out. For the Response we need to create an object with the methods Node expects that will eventually turn into a real Response object. So we hold all the written attributes in private fields, and when end is called we save the body and commit the response with respondWith. This won't work for streaming and whatnot, but it will for our simple case.
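The handler-first vs port-first reordering boils down to a closure. Here's a stripped-down sketch of that shape with hypothetical names and no actual networking:

```javascript
// createServer captures the handler first; listen supplies the port later.
// This mirrors how the adapter bridges Node's handler-first API onto
// Deno's port-first one.
function createServer(handler){
    return {
        listen(port){
            // the real adapter would call Deno.listen({ port }) here
            return handler(`request on port ${port}`);
        }
    };
}

const server = createServer(message => `handled: ${message}`);
console.log(server.listen(8080)); // handled: request on port 8080
```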
Lastly let's wire up the import map:
{
    "imports": {
        "http": "./http.js"
    }
}
And run:
PORT=8081 deno run --allow-net --allow-env --import-map=../src/http-listen/import-map-node.json ../src/http-listen/http-listen-node.js
Modules
Both Deno and Node have different ways for dealing with modules and we need to make those work.
Deno modules
Deno modules are just ECMAScript modules with one tiny difference: they can import TypeScript.
//deno-import.js
import { join } from "https://deno.land/std/path/mod.ts";
console.log(join("Hello", "World!"));
In fact, I don't think you can even get compiled JS versions of the standard library without building them yourself. For third-party stuff this is usually not an issue, though.
Deno Modules in Node
So TypeScript is an obvious problem, but Node will stop us before we even get that far: Node does not support loading modules from web URLs.
Error [ERR_UNSUPPORTED_ESM_URL_SCHEME]: Only file and data URLs are supported by the default ESM loader. Received 'https:'
The way around this is to build a loader. Loaders are a new Node feature that lets us load custom types of modules. They are experimental right now, though, so expect this code to age poorly. A loader is simply a module that exports functions with well-known names. Here's a loader that can load from web URLs:
//deno-loader.js
import fetch from "node-fetch";

const isWebUrl = specifier => /^https?:\/\//.test(specifier);

export async function resolve(specifier, context, defaultResolve){
    if(isWebUrl(specifier)){
        return {
            url: specifier
        };
    } else if(context.parentURL && isWebUrl(context.parentURL)){
        return {
            url: new URL(specifier, context.parentURL).href
        };
    }
    return defaultResolve(specifier, context, defaultResolve);
}

export function getFormat(url, context, defaultGetFormat) {
    if (isWebUrl(url)) {
        return {
            format: "module"
        };
    }
    return defaultGetFormat(url, context, defaultGetFormat);
}

export async function getSource(url, context, defaultGetSource){
    if(isWebUrl(url)){
        const response = await fetch(url);
        let source = await response.text();
        return {
            source
        };
    }
    return defaultGetSource(url, context, defaultGetSource);
}
We have 3 functions here. resolve takes the module specifier and some context, like the importing file's location, and lets you return a new URL string. We override this if the URL starts with http:// or https:// so that Node doesn't block it. When the specifier is relative, we still need to check whether the parent is a web URL; otherwise we pass it back to Node's default resolver.
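The relative case leans entirely on the standard URL constructor, which resolves a specifier against the parent module's URL:

```javascript
// Resolving a relative specifier against a remote parent module,
// the same way the resolve() hook above does.
const parentURL = "https://deno.land/std/path/mod.ts";
const resolved = new URL("./_util.ts", parentURL).href;
console.log(resolved); // https://deno.land/std/path/_util.ts
```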
getFormat tells Node what format the module is. This is what lets you handle things like WASM modules. We just want our modules to be plain JS modules, so we return format "module" or kick it back to Node's default.
Finally, getSource takes a URL and turns it into source. Internally we use node-fetch to make a network request and download the module, or we fall back to Node's default.
This is enough to get JS working. However, we need a small change to deal with TS. In getSource, just after we get the source text, we can examine the URL: if it ends in .ts, we transpile using typescript (npm install this).
//deno-loader.js
import typescript from "typescript";

//in getSource, after fetching the source text:
if(url.endsWith(".ts")){
    source = typescript.transpileModule(source, {
        compilerOptions: {
            module: typescript.ModuleKind.ESNext
        }
    }).outputText;
}
Luckily, that's all we need. The compiler option makes it so we emit ESM instead of CJS. It's not super robust, but for simple, shallow modules it works just fine.
We can now run this like node --experimental-loader ../src/imports/deno-loader.js ../src/imports/deno-import.js
Node imports
We've actually already done this. All we need to do is replace the module with another one in the import map; this works for bare modules or any other module on disk. I don't recommend using npm and trying to hack things into working. Instead you can use https://www.skypack.dev/, which works for most packages on npm and automatically deals with the references and transpiles CJS code to ESM for you. Just create the import map.
For completion's sake:
//import-node.js
//unfortunately we can't destructure because the mapped module export isn't quite the same.
import path from "path";
console.log(path.join("Hello", "World!"));
Import map:
{
    "imports": {
        "path": "https://cdn.skypack.dev/path"
    }
}
We could also make our own path or use Deno's Node polyfill library, but I wanted to show off using Skypack.
And run:
deno run --import-map=../src/imports/import-map-deno.json ../src/imports/import-node.js
Trade-offs
We've sort of developed two separate strategies. Which one you pick really depends on what you want to do. Do you want Node-centric code or Deno-centric code?
It's easier to get Node code to run in Deno, because Deno has both a set of Node polyfills already available and import maps to make patching modules nice and easy. However, if you are looking to convert from Node to Deno, all your code stays written Node-style, which can feel a bit dated if you write lots of browser code.
If you want to go the other way and write Deno code for use in Node, you generally need to "unpolyfill" the Deno object: implement the polyfill modules for Node in the source, then point them at null references when running in Deno. This can be a lot harder, but it keeps you on the more standard APIs, which can be especially helpful if you need to move that code to the browser as well.
Deno's remote imports need loaders in Node (or even more complex systems if you avoid experimental features), but for Node in Deno all you really need is import maps and a good CDN like Skypack.
Sometimes the two will simply have divergent APIs and you'll have to polyfill them yourself. On the Node side this might involve checking for the globalThis.Deno object to see which runtime you are in and running the appropriate code. Since import maps are not yet available in Node, you will need to ship both implementations to get it to work.
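As a sketch, such a dual-implementation helper might look like this (hypothetical, echoing the getEnv trick from the HTTP example):

```javascript
// Both implementations ship; a runtime check on globalThis.Deno picks one
// at call time, so the same module loads in either runtime.
const isDeno = typeof globalThis.Deno !== "undefined";

function getEnv(name){
    return isDeno ? globalThis.Deno.env.get(name) : process.env[name];
}

process.env.GREETING = "hello from env"; // simulate an environment variable (Node path)
console.log(getEnv("GREETING")); // hello from env
```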