A Deno-licious Workflow
Geert-Jan Zwiers
Posted on November 14, 2021
If there is one project that has increased my developer happiness, it is probably Deno. One of the best things is the ease at which one can set up a coding workflow and maintain a project with the combination of Deno, Git and the GitHub CLI.
With this workflow, pull requests (PRs) can be made and merged from a terminal, release notes can be generated automatically and releases are made in the blink of an eye. Once you get used to this workflow, it feels about as fluent as coding gets.
Requirements
- A GitHub account
- deno installed
- gh (GitHub CLI) installed
- git installed

Recommended:

- an autocompletion tool for your terminal, e.g. oh-my-zsh for the zsh shell or posh-git for PowerShell
Setting up verified commits
As Deno places more emphasis on security, let's begin by creating a key to sign our commits with. This way we can make verified commits that prove we are not some impostor trying to upload a million cat.jpg files or something. In a way, GPG keys are an implementation of 'Just be yourself'!
Read here how to generate a GPG key for GitHub and how to add it to your account.
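Once the key exists locally, you point git at it. A minimal sketch; the key ID below is a placeholder for your own, which you can look up with gpg --list-secret-keys --keyid-format=long:

```shell
# Tell git which key signs your commits.
# 3AA5C34371567BD2 is a placeholder; replace it with the long key ID
# shown by: gpg --list-secret-keys --keyid-format=long
git config --global user.signingkey 3AA5C34371567BD2
```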
Creating a module
One convention in Deno is to have a file mod.ts as the entrypoint and two files deps.ts and dev_deps.ts as places to import other modules for use throughout yours. Note that the filenames have no special meaning in Deno; they are merely a convention. We'd probably like a .gitignore file as well, but I know what you're thinking: do I really have to make four whole files by hand? No way! Okay, hang on, because there is a solution. Just run mod, a Deno CLI program that scaffolds a basic module for you:
deno install --allow-read --allow-run=git --allow-write --name mod https://deno.land/x/mod/mod.ts
And then run:
mod -n my_deno_project
This creates a directory my_deno_project in the current working directory with the files we just mentioned and runs git init for us. Of course, you can name the directory whatever you like.
Uploading to GitHub
Let's add the code to a remote repository by making a verified commit using our new GPG key. Configure git to require signed commits by running the following command in my_deno_project:
git config commit.gpgsign true
Next, add your files to the working tree and make the first commit:
git add .
git commit -m "initial commit"
At this point you should be prompted to enter your GPG key's password to sign the commit with. Now we can send this code to a remote repository on GitHub with the CLI:
gh repo create
This will let you make a new remote repository interactively, but if you already know what you want you can use something like:
gh repo create my_deno_project --confirm --public
Check that the remote repo was created successfully, then push the local files:
git push -u origin main
Protecting the main branch
Now that the initial code is on GitHub, it's time to set up branch protection, ensuring we can only merge changes to the main branch via pull requests. The major benefit of doing this is that all changes can be checked and reviewed before being included in any sort of release.
Go to the project on GitHub and go to the Settings tab, then go to Branches. Add a rule with the branch name pattern main
and enable the setting "Require a pull request before merging" and also turn on "Include administrators". There is another setting that we want to enable: "Require status checks to pass before merging", but we probably want to have actual checks before enabling it.
We'll add some code and a pipeline soon, but let's do all of that in a new branch:
git checkout -b first_feature
Adding Continuous Integration
When developing modules for Deno, there are three steps that can be achieved quite easily using built-in deno subcommands: formatting code with deno fmt, linting code with deno lint, and running unit and/or integration tests with deno test. Using GitHub Actions, we can also include these steps in a Continuous Integration (CI) pipeline that will run anytime we push changes to the remote.
Wait a minute, do we have to add a whole pipeline manually now? Nope! We can use mod to create a basic pipeline for us! In the current working directory (my_deno_project) run:
mod --ci
You should now have a .github directory with a workflows subdirectory and a build.yaml file. Note that mod doesn't overwrite existing files (you should see some warnings about that), so we could use it to add these additional files to the project.
If you go into build.yaml, you can see it has a basic pipeline structure for Deno that includes the aforementioned steps: it will format, lint and test the code. The only problem with that is we don't have any code yet! Let's fix that.
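For reference, such a Deno pipeline typically looks roughly like the sketch below. The exact contents mod generates may differ; denoland/setup-deno is the official action for installing Deno in a workflow, and the step names here are illustrative:

```yaml
name: build

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: denoland/setup-deno@v1
        with:
          deno-version: v1.x
      # The three built-in deno subcommands as CI steps
      - run: deno fmt --check
      - run: deno lint
      - run: deno test
```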
Test-Driven Development
Making a high-quality module means having well-tested code, amongst other things. Add the following line to dev_deps.ts:
export { assertEquals } from "https://deno.land/std@0.114.0/testing/asserts.ts";
The idea of Test-Driven Development is to write a test that initially fails, and then to write the minimal amount of code required to make the test pass. For the example project we'll just be adding a sum function, so create a new file mod.test.ts and add the following code:
import { assertEquals } from "./dev_deps.ts";
import { sum } from "./mod.ts";
Deno.test({
  name: "sum",
  fn() {
    assertEquals(sum(1, 2), 3);
  },
});
Also add an empty sum function in mod.ts:

export function sum() {}
If you run deno test, you can see the test won't pass. We'll implement a basic sum function here and class it up a bit by allowing it to sum any number of numbers, using spread syntax and Array.reduce:
export function sum(...numbers: number[]): number {
  return numbers.reduce((prev, curr) => prev + curr, 0);
}
If you run the test again, you should see it pass. Now try running the deno fmt and deno lint commands as well. You can also run deno test --coverage=cov to create a code coverage output directory, and then deno coverage cov to view a coverage report in the console (which should show 100% in this case!).
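To see the spread syntax at work, the function can be called with any number of arguments. A self-contained sketch, with the reduce seeded at 0 so a call with no arguments is also safe:

```typescript
// Spread syntax collects all arguments into the numbers array;
// reduce folds them into a single total (the 0 seed keeps sum() safe).
function sum(...numbers: number[]): number {
  return numbers.reduce((prev, curr) => prev + curr, 0);
}

console.log(sum(1, 2)); // 3
console.log(sum(1, 2, 3, 4)); // 10
```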
Merging to main
This code looks ready for release, as all checks are passing, and we want to make those checks a requirement for any pull request. First, create another commit using conventional commit syntax. This style makes it easier to see what type of changes have been made and what sort of version increment would fit best. You can read more about the specification here.
git add .
git commit -m "feat: add sum function"
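To illustrate how the conventional commit prefixes read in a log, here is a throwaway demo repository; the repo name, identity and messages are made up purely for illustration:

```shell
# Throwaway repo purely to illustrate conventional commit prefixes
git init cc-demo
cd cc-demo
git config user.email "dev@example.com"
git config user.name "Dev"
# feat -> minor version bump, fix -> patch; a "!" would mark a breaking change
git commit --allow-empty -m "feat: add sum function"
git commit --allow-empty -m "fix: handle summing no numbers"
git log --oneline
```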
Now, instead of pushing the code to main, which is protected, let's use the GitHub CLI to make a PR. We can use --fill to autofill the title and body of the PR with the commit info.
gh pr create --fill
With the GitHub CLI you don't need to leave the terminal at all. You could keep working on something else, and use gh pr status to check on the PR.
When the pipeline has run, edit the branch protection rule on GitHub, tick "Require status checks to pass before merging", and search for the build job that the pipeline runs, which includes the formatting, linting and testing steps.
If all the checks pass you can merge the changes into main with a (single) squash commit:
gh pr merge --squash
And this is really the core of this workflow: you make changes, create a PR with gh pr create --fill, then check in later and merge with gh pr merge --squash. It takes care of keeping a consistent format in the code and ensures that good practices are applied by running the linter. It's a very fluent and programmatic way of developing and maintaining a codebase.
Auto-generating release notes
The great thing about using conventional commits together with GitHub is that you can create release notes and autofill them with your commits. This gives a very nice, concise overview of what sort of fixes and features were made per release. The only downside right now is that it has to be done from GitHub and not the CLI.
To create a release, go to Create a new release on GitHub (right below Releases on the right-hand side). As long as your project is unstable, meaning breaking changes can happen in any release and not just in major version increments, choose a v0.x.x format for your tag, for example v0.1.0. Click the "auto-generate release notes" button at the top right of the release description field, and there you go!
Summary
This tutorial showcased a module development workflow for Deno using GitHub. We configured a project to require signing commits with a GPG key. We used the mod CLI to quickly scaffold a module for Deno with a GitHub Actions CI pipeline. Finally, we used the GitHub CLI to create a remote repository, make pull requests and merge them into a protected branch. This workflow is highly programmatic, with only a few manual steps required on GitHub in the browser, and it greatly reduces the amount of context switching needed while developing.
I hope this tutorial showed you how using Deno and GitHub greatly simplifies the creation of high-quality code, adhering to many good practices and standards (branch protection, commit signing, conventional commits, test-driven development). I recognize that this workflow takes some time to get used to before it starts to become fast and fluent, but it's absolutely worth making the effort as it will take your code quality to the next level.