Write your own type-safe reducer to normalize your data


Hans Hoffman

Posted on October 7, 2020


Background

Given the following JSON data with nested objects, what is the best data structure to use in our Redux store?

[
  {
    "id": "xlkxhemkuiam",
    "title": "voluptates sequi et praesentium eos consequatur cumque omnis",
    "body": "Beatae distinctio libero voluptates nobis voluptatem...",
    "createdAt": "Tue, 22 Sep 2020 16:28:53 GMT",
    "user": {
      "id": "lswamlcggqlw",
      "handle": "Payton_Carter",
      "imgUrl": "https://s3.amazonaws.com/uifaces/faces/twitter/dawidwu/128.jpg"
    },
    "comments": [
      {
        "id": "jsyrjkxwtpmu",
        "body": "Sint deserunt assumenda voluptas doloremque repudiandae...",
        "createdAt": "Fri, 25 Sep 2020 18:03:26 GMT",
        "user": {
          "id": "hqhhywrxpprz",
          "handle": "Orlo97",
          "imgUrl": "https://s3.amazonaws.com/uifaces/faces/twitter/ponchomendivil/128.jpg"
        }
      }
    ]
  },
...
]

The easiest and most common approach would be to store the array of blog posts exactly as it was received. But if we wanted to display a particular post given its id, we'd have to iterate over the array until we found the matching post. Likewise, we'd have to rely on iteration whenever we wanted to perform an upsert in our Redux store. Both tasks have a time complexity of O(n); by normalizing our data we can bring lookups and updates down to O(1).
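To make the trade-off concrete, here is a simplified illustration (the Post type is trimmed down for the example) of looking up a post by id in the raw array versus in a record keyed by id:

type Post = { id: string; title: string }; // trimmed down for illustration

// O(n): scan the array until we find the matching post
const findInArray = (posts: ReadonlyArray<Post>, id: string): Post | undefined =>
  posts.find((post) => post.id === id);

// O(1): index directly into a record keyed by post id
const findInRecord = (posts: Record<string, Post>, id: string): Post | undefined =>
  posts[id];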

You don't always have to deal with data in the same format the server gives you.

Yes, this idea has been around for years, and there are popular tools like normalizr to help with it. But what if you have deeply nested data that isn't easily parsed by such tools? Here I present one possible approach using a few popular functional programming libraries for TypeScript (fp-ts, io-ts, and monocle-ts) to build a custom, type-safe reducer function.

This is more of a quick run-through than a step-by-step guide. If you're interested, I encourage you to dive into the source code. You can also see a live demo here.

GitHub: rhoskal/fp-data-normalization (type-safe data normalization using fp-ts)

Let's normalize

Before we start, let's specify the shape of our normalized data in a way that would allow us O(1) lookups:

export type AppState = {
  entities: {
    comments: NormalizedComments;
    posts: NormalizedPosts;
    users: NormalizedUsers;
  };
};
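Each Normalized* slice is just a record keyed by entity id. As a rough sketch (the exact entity shapes live in the repo, so treat these field lists as assumptions), the users slice could look like this:

// Entities are stored flat and keyed by id; references to other entities are
// kept as ids rather than nested objects. Field lists are assumptions for illustration.
type UserEntity = {
  id: IdString;
  handle: NonEmptyString;
  imgUrl: NonEmptyString;
};

type NormalizedUsers = Record<IdString, UserEntity>;
// NormalizedPosts and NormalizedComments follow the same Record<IdString, ...> pattern,
// with posts referencing their author and comments by id instead of embedding them.

This keyed-by-id layout is what buys us the O(1) access: reading a user is just state.entities.users[id].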

Step 1

We can get both compile-time and runtime type safety by using io-ts to declare our domain types. For example, our Post:

/**
 * Composite types
 */

export const Post = t.type({
  id: IdString,
  title: NonEmptyString,
  body: NonEmptyString,
  createdAt: UtcDateString,
  user: User,
  comments: Comments,
});

/**
 * Static types
 */

export type Post = t.TypeOf<typeof Post>;

We can add a few constraints instead of just using basic strings by specifying custom types. For example, IdString ensures the given string is exactly 12 characters long and contains only letters (no digits), e.g. "jsyrjkxwtpmu".

/**
 * Type guards
 */

const isIdString = (input: unknown): input is string => {
  // Anchored so the id must be exactly 12 letters, nothing more and nothing less
  return typeof input === "string" && /^[A-Za-z]{12}$/.test(input);
};

/**
 * Custom codecs
 */

const IdString = new t.Type<string, string, unknown>(
  "idString",
  isIdString,
  (input, context) => (isIdString(input) ? t.success(input) : t.failure(input, context)),
  t.identity,
);
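The other custom codecs follow the same recipe: a type guard plus a t.Type that wraps it. For instance, a UtcDateString codec could be sketched like this (the actual implementation in the repo may use a stricter check than Date.parse):

const isUtcDateString = (input: unknown): input is string => {
  // Accept any string the runtime can parse as a date, e.g. "Tue, 22 Sep 2020 16:28:53 GMT"
  return typeof input === "string" && !Number.isNaN(Date.parse(input));
};

const UtcDateString = new t.Type<string, string, unknown>(
  "utcDateString",
  isUtcDateString,
  (input, context) => (isUtcDateString(input) ? t.success(input) : t.failure(input, context)),
  t.identity,
);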

Step 2

Now we can protect our React app from crashing due to an unexpected API response by using our domain static types. We've also lifted all the scattered error checking out of our domain logic and into one simple check. Thank you, decoders! 🎉

import * as E from "fp-ts/Either";
import { pipe } from "fp-ts/function";
import { PathReporter } from "io-ts/PathReporter";

const fetchPosts = (): Posts => {
  // Decode the raw JSON payload (data.json in the repo) against the Posts codec
  const result = Posts.decode(data);

  return pipe(
    result,
    E.fold(
      // Decoding failed: log the errors and fall back to an empty list
      () => {
        console.warn(PathReporter.report(result));

        return [];
      },
      // Decoding succeeded: hand back the well-typed posts
      (posts) => posts,
    ),
  );
};

This is the really cool part! If the API response contains an id in the wrong format, or the id is missing entirely, we catch it before we ever enter our reducer function. Let that sink in for a bit... Even an internal API can change right under our feet or have corrupt data make its way in. We can protect our app from this. Manipulate data.json yourself and see it in action.

The ability to declare your types once and get both compile-time and runtime safety is a joy worth experiencing.

The Either type returned by the io-ts decoder has one interesting consequence worth pointing out: on failure we return an empty array, which eventually results in no blog posts being rendered in our React app. Is that a good UX? Sure, not crashing is better than the alternative, but maybe we can find a happy medium and still render some of the data?

I'm still working through this myself. A few co-workers suggested looking into fp-ts's These type, and one even submitted a PR! Check it out for yourself.
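One possible middle ground (just a sketch, and not what the PR does) is to decode each raw post individually and keep only the ones that pass, so a single malformed post no longer throws away the whole list:

import * as A from "fp-ts/Array";
import * as O from "fp-ts/Option";
import { pipe } from "fp-ts/function";

// Keep every post that decodes successfully and drop the rest
const decodeLeniently = (rawPosts: Array<unknown>): Posts =>
  pipe(
    rawPosts,
    A.filterMap((raw) => O.fromEither(Post.decode(raw))),
  );

In a real app you'd also want to surface the posts that failed, which is exactly where a type like These, carrying both successes and warnings, gets interesting.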

Step 3

Finally, instead of doing nasty, error-prone JS object spreading when adding or updating entities in our state, we can use monocle-ts to define lenses that make our lives easier. Below, our upsert function first checks whether we already have the given user stored, so we can ensure that certain user properties cannot be updated once inserted, such as a user's id. A user can still change their handle and profile image in my example, so we do want those properties to be updatable.

/**
 * Optics
 */

const usersLens = Lens.fromPath<AppState>()(["entities", "users"]);
const atUser = (id: IdString) => Lens.fromProp<NormalizedUsers>()(id);

/**
 * Upserts
 */

const upsertUser = (user: User) => (state: AppState): AppState => {
  return pipe(
    usersLens.get(state),
    R.lookup(user.id),
    O.fold(
      // User not stored yet: insert a brand new entity
      () => {
        return pipe(
          state,
          usersLens.compose(atUser(user.id)).set({
            id: user.id,
            handle: user.handle,
            imgUrl: user.imgUrl,
          }),
        );
      },
      // User already stored: update only the mutable properties, never the id
      (_user) => {
        return pipe(
          state,
          usersLens.compose(atUser(user.id)).modify(
            (prevUser): UserEntity => ({
              ...prevUser,
              handle: user.handle,
              imgUrl: user.imgUrl,
            }),
          ),
        );
      },
    ),
  );
};
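To show where these upserts slot in, here is a rough sketch of folding decoded posts into state (upsertPost and upsertComments are assumed helpers analogous to upsertUser; the real reducer in the repo may be wired differently):

import * as A from "fp-ts/Array";
import { pipe } from "fp-ts/function";

const normalizePosts = (posts: Posts) => (state: AppState): AppState =>
  pipe(
    posts,
    A.reduce(state, (acc, post) =>
      pipe(
        acc,
        upsertUser(post.user),
        // upsertPost(post),              // assumed helper for the posts slice
        // upsertComments(post.comments), // assumed helper for the comments slice
      ),
    ),
  );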

Conclusion

Normalizing data using lenses and decoders does require some effort, but I hope I have demonstrated the reward for doing so. Doesn't type-safe code like this put a smile on your face? 😎

P.S. — Please let me know if you have a more elegant or idiomatic way of doing this! I'm all ears.
