Literature Synthesis
Wahaj Javed
Posted on May 25, 2020
Introduction
This project stemmed from the idea of Generative Adversarial Networks and the philosophical implications of a GAN-generated artwork selling for a few hundred thousand dollars at Christie's. The premise behind this idea is that art itself is not a uniquely human endeavor: whatever we as humans accomplish, machines may beat us at it. Keeping that in mind, I set out to drive the last nail into the coffin of human exceptionalism: literature.
Utility
Literature is one of the few things that has transcended time, and thoughtful consumption of it is essential to the evolution of human society. The advent of a (presumably) infinite supply of original literature, with the world's largest encyclopedia embedded in its roots and some of the greatest cultural works as a guiding star, could produce some high-quality writing.
Demo Link
It is currently a Work in Progress. Will probably have a working demo by the end of May. *fingers crossed*
How I built it
I had initially started building it with the Transformer language model, but its heavy memory consumption from storing attention over all other tokens severely limited its context.
However, the introduction of the Reformer architecture by Google was a godsend. Reformer improves on long-distance context retention, quintessential to quality literature, by using LSH (Locality-Sensitive Hashing) attention and reversible residual layers, reducing the attention memory footprint from O(L²) (for the Transformer) to O(L log L).
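To make the LSH idea concrete, here is a minimal numpy sketch (not the Trax implementation) of the angular hashing scheme Reformer uses: each token vector is projected onto random directions, and the argmax over the projections and their negations becomes its bucket id. Attention is then restricted to tokens sharing a bucket, which is how the quadratic all-pairs comparison is avoided.

```python
import numpy as np

def lsh_buckets(x, n_buckets, seed=0):
    """Angular LSH, Reformer-style: project each vector onto random
    directions and take the argmax over [proj, -proj] as its bucket id.
    Vectors pointing in similar directions hash to the same bucket."""
    rng = np.random.default_rng(seed)
    r = rng.normal(size=(x.shape[-1], n_buckets // 2))  # random projection matrix
    proj = x @ r
    return np.argmax(np.concatenate([proj, -proj], axis=-1), axis=-1)

# With L tokens, full attention scores all L*L pairs; LSH attention only
# scores tokens within the same bucket, giving roughly O(L log L) cost.
L, d = 1024, 64
x = np.random.default_rng(1).normal(size=(L, d))
buckets = lsh_buckets(x, n_buckets=16)
```

Note that the hash depends only on a vector's direction, so a positively scaled copy of a vector always lands in the same bucket.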
I built the model using this architecture as a base, on top of the Trax library.
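The other half of Reformer's memory savings, the reversible residual layers mentioned above, can be sketched in a few lines. This is a simplified numpy illustration (the sublayer functions here are stand-ins, not Reformer's actual attention and feed-forward blocks): because each block's inputs can be recomputed exactly from its outputs, activations never need to be stored for backpropagation.

```python
import numpy as np

def rev_forward(x1, x2, f, g):
    # Reversible residual block: y1 = x1 + F(x2), y2 = x2 + G(y1).
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def rev_inverse(y1, y2, f, g):
    # Recover the inputs exactly from the outputs by undoing each
    # residual step in reverse order.
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

# Toy sublayers standing in for attention / feed-forward.
f = lambda v: np.tanh(v)
g = lambda v: np.maximum(v, 0.0)

x1, x2 = np.ones(4), np.arange(4.0)
y1, y2 = rev_forward(x1, x2, f, g)
rx1, rx2 = rev_inverse(y1, y2, f, g)
```

The round trip is exact, which is what lets a reversible network trade recomputation for activation memory.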
Additional Thoughts / Feelings / Stories
A GPT-2-esque model with highly efficient context retention could spell the next generation of natural language literature, proliferating throughout the internet-connected world.