Creating the Local First Stack
Ersin Buckley
Posted on December 18, 2023
This week, in the lead-up to the holiday season, I've been busy working on my CRDT (conflict-free replicated data type) project. My goal is to test some of the assumptions I made about building a really effective stack for offline, local-first software. Stay tuned to read how it went on my journey to implement a sync server for CR-SQLite using Go, and a frontend using React + WASM SQLite. If you want to know a bit more about why I'm doing this, check out last week's article, Local First Software.
The foundational technology I want to build on for this project is CR-SQLite. You can think of it as a Git layer on top of SQLite: you create a normal relational schema, then mark tables as replicated, and you can then track the changes to those tables through a simple query. On the other side of the synchronization picture, you handle merges by inserting changes into the database, and the CR-SQLite extension updates the right values in the background.
Keeping two databases in sync
The greatest thing about CR-SQLite is that it just works on top of an existing database. We start with a schema like this:
create table foo (a primary key not null, b);
create table baz (a primary key not null, b, c, d);
select crsql_as_crr('foo');
select crsql_as_crr('baz');
Then we can get a stream of update events made to the database with a query like this!
result, err := conn.Query("SELECT * FROM crsql_changes WHERE db_version > ? ", currentVersion)
The special relation crsql_changes contains the column, row, and database version for each change. Using this data we can 'materialize' a view of the table at a given version. Check out the implementation: you can simply query the database to get the list of changes needed to synchronize two databases.
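To make this concrete, here's a minimal Go sketch of pulling changes into a struct. The Change fields and the crsql_changes column list are assumptions based on the builds I've seen; check your CR-SQLite version's schema before relying on them:

```go
package main

import (
	"database/sql"
	"fmt"
)

// Change mirrors one row of the crsql_changes virtual table.
// Column names are an assumption; verify against your CR-SQLite build.
type Change struct {
	Table      string
	Pk         []byte
	Cid        string
	Val        any
	ColVersion int64
	DbVersion  int64
	SiteID     []byte
}

// changesSince builds the query for every change after a given version.
func changesSince(version int64) (string, []any) {
	return `SELECT "table", pk, cid, val, col_version, db_version, site_id
	        FROM crsql_changes WHERE db_version > ?`, []any{version}
}

// scanChanges reads a result set from the query above into Change values.
func scanChanges(rows *sql.Rows) ([]Change, error) {
	var out []Change
	for rows.Next() {
		var c Change
		if err := rows.Scan(&c.Table, &c.Pk, &c.Cid, &c.Val,
			&c.ColVersion, &c.DbVersion, &c.SiteID); err != nil {
			return nil, err
		}
		out = append(out, c)
	}
	return out, rows.Err()
}

func main() {
	q, args := changesSince(42)
	fmt.Println(q, args)
}
```

Pass the result of `changesSince` straight to `conn.Query` and feed the rows to `scanChanges`; everything stays plain `database/sql`.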
To merge changes, simply INSERT into crsql_changes and your rows and columns will seamlessly be updated. This is really cool because it's just plain old SQL, and it's dead simple to interact with from Go using the database/sql package. You can check out the implementation at the time of writing here.
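A hedged sketch of that merge side, assuming the same Change shape as the pull query; the key idea is that applying remote changes is nothing more than one INSERT per change inside a transaction:

```go
package main

import (
	"database/sql"
	"fmt"
)

// Change is one row pulled from crsql_changes (a sketch; match it to
// the columns your CR-SQLite build actually exposes).
type Change struct {
	Table      string
	Pk         []byte
	Cid        string
	Val        any
	ColVersion int64
	DbVersion  int64
	SiteID     []byte
}

// insertChangeSQL inserts a remote change back into crsql_changes;
// the extension resolves any conflict internally.
const insertChangeSQL = `INSERT INTO crsql_changes
	("table", pk, cid, val, col_version, db_version, site_id)
	VALUES (?, ?, ?, ?, ?, ?, ?)`

// applyChanges merges a batch of remote changes in one transaction.
func applyChanges(db *sql.DB, changes []Change) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	defer tx.Rollback()
	for _, c := range changes {
		if _, err := tx.Exec(insertChangeSQL, c.Table, c.Pk, c.Cid,
			c.Val, c.ColVersion, c.DbVersion, c.SiteID); err != nil {
			return err
		}
	}
	return tx.Commit()
}

func main() {
	fmt.Println(insertChangeSQL)
}
```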
See this all working together in the "dumb version" implemented in this test. The test creates two databases in memory, inserts values into one, and merges changes between them to validate that both databases end up in the same state. After implementing this I'm feeling confident that the method will work, and I have everything the back-end server needs to implement merging of database states.
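The convergence property that test checks can be illustrated with a toy model in plain Go, no SQLite involved: a last-writer-wins map keyed by (row, column), which loosely mirrors how CR-SQLite resolves column-level conflicts. This is a deliberately simplified sketch, not the extension's actual algorithm:

```go
package main

import "fmt"

// cell is a column value tagged with the version that wrote it.
type cell struct {
	val     string
	version int64
}

// replica is a toy stand-in for one database: (row, column) -> cell.
type replica map[[2]string]cell

// merge applies another replica's cells, keeping the higher version
// (last-writer-wins per column).
func (r replica) merge(other replica) {
	for k, c := range other {
		if cur, ok := r[k]; !ok || c.version > cur.version {
			r[k] = c
		}
	}
}

func main() {
	a := replica{{"row1", "b"}: {"from-a", 1}}
	b := replica{{"row1", "b"}: {"from-b", 2}, {"row2", "b"}: {"x", 1}}
	// Merge in both directions, as the test does between the two DBs.
	a.merge(b)
	b.merge(a)
	fmt.Println(a[[2]string{"row1", "b"}].val) // both sides converge on "from-b"
}
```

Because the merge only ever keeps the highest version, applying changes in either order leaves both replicas identical, which is exactly the property the in-memory test asserts.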
Let's move on.
Implementing the server
How shall we merge the []Change values in a meaningful way over the network? My test just passes them through memory, and that's not realistic for the application we want. My requirement is to support many web clients, with a simple server in the background for caching and orchestrating them. The server is just like a client, except it's always online and available for peers to use as a source for synchronizing data.
We can solve this with a service! There are many ways I could have started, but I decided to try gRPC along the way. This was a mistake. I hoped for the best, but gRPC ended up not being a good choice for the web client. Why, you ask? The gRPC protocol works with all the bells and whistles of HTTP when used server to server, but browsers are not as well served. The JavaScript client depends on HTTP/2 features the browser doesn't expose, so it requires a proxy like Envoy to work at all. What's more, I didn't love the structure of the generated web client. So in the process of working on this 'local first stack' I got sucked into a big rabbit hole making the RPC system work. I ended up going with Connect, a tool that generates a service from a protobuf service definition and also speaks a simple HTTP 1.1 protocol. What ultimately sold me on it is that it comes with a very pleasant generated web client, and it even plugs into my favorite React HTTP helper, useQuery.
With the backend implemented, I have a simplified service definition here, an implementation of the server in Go, and a client implemented in TypeScript.
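For flavour, a service definition for this kind of sync could look something like the following protobuf sketch. The message and RPC names here are illustrative guesses, not the project's actual schema:

```proto
syntax = "proto3";

package sync.v1;

// One row from crsql_changes, serialized for the wire.
message Change {
  string table       = 1;
  bytes  pk          = 2;
  string cid         = 3;
  bytes  val         = 4;
  int64  col_version = 5;
  int64  db_version  = 6;
  bytes  site_id     = 7;
}

// A sync peer: push local changes up, pull everything newer than
// the last db_version we've seen.
service SyncService {
  rpc Push(PushRequest) returns (PushResponse);
  rpc Pull(PullRequest) returns (PullResponse);
}

message PushRequest  { repeated Change changes = 1; }
message PushResponse {}
message PullRequest  { int64 since_version = 1; }
message PullResponse { repeated Change changes = 1; }
```

Connect generates both the Go handlers and the TypeScript client from a definition like this, which is what makes the single-schema, many-clients setup pleasant.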
Implementing the frontend
The coolest thing in this whole stack is that we run the exact same database on the frontend as on the backend: a build of the whole SQLite project, including the CR-SQLite extension. This is cutting-edge stuff, so let's dive in and see how it works.
The database is this really high performance low level piece of systems programming, how the heck is it running on the web?
Great question, dear reader! Despite the database being written in C and Rust, we can use it from JavaScript through a simple database-oriented API, in the exact same way we interacted with the db in our backend.
The technology that unlocks this is WebAssembly (WASM): a low-level binary format that runs in the browser and can expose APIs to your plain old JavaScript. It ships with the browser, and it is becoming more and more widely used on the web. Compiling the C and Rust code with LLVM enables the CR-SQLite project to run on the web. I can use all this really advanced and deeply complicated stuff through simple npm imports. Got to love the 2023 way of plugging in thousands of lines of advanced database code with a few lines of bun install
:D
Wrapping it all up
Now the frontend is far from perfect so far; there are in fact critical bugs in the synchronization work. But I have the bones of the project set up -- and over the next weeks I plan to keep refining and improving it so I can test my theory that local-first software is a simpler, faster, and better way to build software. It's amazing how much prior work I can leverage in the CRDT space, thanks to the way the web as a platform has evolved and the efforts of people building great open-source tools for RPC, user interfaces, and databases.
And one last note, if you've made it this far: thanks for reading! It would be awesome to hear from you, so feel free to reach out on Twitter or in the comments and help me push this project (the writing) a bit further. My goal has been to write once a week, and I'm hoping the process will help me hone my craft.