
WasifArmanHaque/text-generating-lstm


Recurrent Neural Network writes Sherlock

Overview

In this simple project, I used the Keras deep learning framework to train an LSTM (Long Short-Term Memory) network, a variant of the recurrent neural network, on a portion of Arthur Conan Doyle's famous novel 'The Hound of the Baskervilles'.

The generate_sherlock_word.py script loads the data, trains an LSTM model on it, and prints generated text at each training iteration. It takes at least 20 iterations before the generated text starts to appear somewhat coherent.
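Before training, such a script typically turns the raw text into fixed-length input windows paired with the character that follows each window. The sketch below is illustrative only (the function and variable names are assumptions, not taken from the repository's code):

```python
def make_sequences(text, seq_len=40, step=3):
    """Slide a fixed-length window over the text, producing
    (input indices, next-character index) training pairs."""
    chars = sorted(set(text))                      # model vocabulary
    char_to_idx = {c: i for i, c in enumerate(chars)}
    inputs, targets = [], []
    for start in range(0, len(text) - seq_len, step):
        window = text[start:start + seq_len]
        inputs.append([char_to_idx[c] for c in window])
        targets.append(char_to_idx[text[start + seq_len]])
    return inputs, targets, char_to_idx
```

The integer sequences would then be one-hot encoded (or fed through an embedding layer) before being passed to the LSTM.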

The text data used is taken from here. The weights.best.hdf5 file contains the trained weights (10 epochs).
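Once weights like those in weights.best.hdf5 are loaded, text is generated by repeatedly sampling the next character from the model's predicted distribution. Temperature-scaled sampling is one common way to do this (a hedged sketch; the repository's actual sampling method may differ):

```python
import math
import random

def sample_with_temperature(probs, temperature=1.0, rng=random):
    """Re-weight a probability distribution by temperature and draw
    one index: low temperature -> conservative (near-argmax) picks,
    high temperature -> more diverse, riskier picks."""
    # Convert to logits, scale by temperature
    logits = [math.log(p + 1e-12) / temperature for p in probs]
    # Softmax (subtracting the max for numerical stability)
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Draw one index from the re-weighted distribution
    r = rng.random()
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r < acc:
            return i
    return len(weights) - 1
```

In a generation loop, the sampled index is mapped back to a character, appended to the seed text, and the window is shifted forward by one before the next prediction.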

There is plenty of room for improvement, which I will be working on soon; in the meantime, any suggestions will be highly appreciated.

About

An LSTM (Long Short-Term Memory) model trained to write like Conan Doyle's Sherlock.
