Literature Synthesis


Wahaj Javed

Posted on May 25, 2020


Introduction

This project stemmed from Generative Adversarial Networks and the philosophical implications of a GAN-generated artwork selling for a few hundred thousand dollars at Christie's. The premise behind this idea is that art is not a uniquely human endeavor: whatever we as humans accomplish, machines may eventually beat us at it. Keeping that in mind, I set out to find the last nail in the coffin of human exceptionalism, literature.

Utility

Literature is one of the few things that has transcended time, and consuming it well is essential to the evolution of human society. A presumably infinite supply of original writing, with the world's largest encyclopedia embedded at its roots and some of the greatest cultural works as a guiding star, could produce some high-quality literature.

Demo Link

It is currently a work in progress. I will probably have a working demo by the end of May. *fingers crossed*

How I built it

I had initially started building it with the Transformer language model, but because full self-attention stores attention weights between every pair of tokens, its memory consumption is heavy and its usable context was severely limited.
However, the introduction of the Reformer architecture by Google was a godsend. Reformer improves the long-distance context retention that is quintessential to quality literature by using LSH (Locality-Sensitive Hashing) attention and reversible residual layers, reducing the attention memory footprint from O(N^2) (for the Transformer) to O(N log N).
I have built the model using this architecture as a base on the Trax library:

GitHub: google/trax — Trax, Deep Learning with Clear Code and Speed
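
To make that concrete, here is a minimal sketch of what defining a Reformer language model in Trax can look like. The hyperparameters below (vocabulary size, depth, context length) are illustrative assumptions rather than the configuration I am actually training.

```python
# Minimal sketch: a Reformer language model in Trax.
# All hyperparameters here are illustrative assumptions, not my final config.
import trax

# ReformerLM replaces full self-attention with LSH attention and uses
# reversible residual layers, so memory grows roughly as O(N log N) with
# sequence length N instead of O(N^2).
model = trax.models.ReformerLM(
    vocab_size=32000,   # subword vocabulary size (assumed)
    d_model=512,        # embedding / hidden dimension
    d_ff=2048,          # feed-forward width
    n_layers=6,         # number of reversible decoder blocks
    n_heads=8,          # attention heads per block
    max_len=16384,      # the long context the architecture is built for
    mode='train',
)
```

The long `max_len` is the whole point: with LSH attention the model can attend over book-length context that a vanilla Transformer could not fit in memory.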

I intend to use TPUs from Google Colab for extensive training on the enwiki dataset (unless someone releases pre-trained Reformer weights before then), and then to use transfer learning on the Project Gutenberg open dataset.
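
The transfer-learning step could look roughly like the sketch below: reuse the weights from the enwiki run and keep training on a stream of Project Gutenberg text. The checkpoint directory and the `gutenberg_batches` generator are placeholders I made up for illustration, not real artifacts of the project.

```python
# Hedged sketch of the planned workflow: pre-train on enwiki, then fine-tune
# on Project Gutenberg. Paths and the data generator are made-up placeholders.
import numpy as np
import trax
from trax import layers as tl
from trax.supervised import training

model = trax.models.ReformerLM(vocab_size=32000, max_len=16384, mode='train')

def gutenberg_batches():
    # Placeholder stream of (inputs, targets, loss_weights) token batches.
    # The real pipeline would tokenize Project Gutenberg texts instead.
    while True:
        tokens = np.random.randint(0, 32000, size=(1, 16384), dtype=np.int32)
        yield (tokens, tokens, np.ones_like(tokens, dtype=np.float32))

train_task = training.TrainTask(
    labeled_data=gutenberg_batches(),
    loss_layer=tl.CrossEntropyLoss(),
    optimizer=trax.optimizers.Adam(1e-4),
)

# Pointing output_dir at the directory holding the enwiki checkpoint lets the
# Loop resume from those weights and continue training on Gutenberg data
# (exact checkpoint handling may vary with the Trax version).
loop = training.Loop(model, train_task, output_dir='reformer_enwiki_ckpt')
loop.run(n_steps=10000)
```

In practice the random-token generator would be replaced by a trax.data pipeline that tokenizes the Gutenberg texts with the same vocabulary used for enwiki, so the pre-trained embeddings carry over.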

Additional Thoughts / Feelings / Stories

A GPT-2-esque model with highly efficient context retention could usher in the next generation of natural-language literature proliferating throughout the internet-connected world.

