
LSTM-GRU-from-scratch

LSTM and GRU cell implementations from scratch in TensorFlow.

Assignment 4 for Deep Learning, CS60010.

Currently includes trained weights for LSTM and GRU for hidden layer sizes of 32, 64, 128, and 256.

Objective

The aim of this assignment was to compare the performance of LSTM, GRU, and MLP for a fixed number of iterations, with variable hidden layer size. Please refer to Report.pdf for the details.

Suggested reading: colah's blog on LSTM. That's really all you need.

  • Plot: loss as a function of iterations (see Report.pdf)
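
To make the cell mechanics concrete, here is a minimal single-step LSTM cell in NumPy. This is an illustrative sketch following the standard equations from colah's blog, not the repo's TensorFlow code; the function name, gate ordering, and shapes are assumptions.

```python
# A minimal single-step LSTM cell in NumPy. Illustrative only; the repo's
# actual implementation is in TensorFlow, and all names/shapes here are
# assumptions for exposition.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x:       input vector, shape (input_size,)
    h_prev:  previous hidden state, shape (hidden_size,)
    c_prev:  previous cell state, shape (hidden_size,)
    W:       stacked gate weights, shape (input_size + hidden_size, 4 * hidden_size)
    b:       stacked gate biases, shape (4 * hidden_size,)
    """
    n = h_prev.shape[0]
    z = np.concatenate([x, h_prev]) @ W + b
    i = sigmoid(z[:n])           # input gate
    f = sigmoid(z[n:2 * n])      # forget gate
    o = sigmoid(z[2 * n:3 * n])  # output gate
    g = np.tanh(z[3 * n:])       # candidate cell update
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Example: one step with input size 3 and hidden size 4.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
h = c = np.zeros(4)
W = 0.1 * rng.standard_normal((3 + 4, 4 * 4))
b = np.zeros(4 * 4)
h, c = lstm_step(x, h, c, W, b)
```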

Usage

sh run.sh - Run LSTM and GRU for all hidden unit sizes and report accuracy.

python train.py --train - Run training and save weights into the weights/ folder. Defaults to LSTM, hidden_unit 32, 30 iterations / epochs.

python train.py --train --hidden_unit 32 --model lstm --iter 5 - Train an LSTM for the specified number of iterations and dump its weights.

python train.py --test --hidden_unit 32 --model lstm - Load precomputed weights and report test accuracy.
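
For reference, an argument-parsing setup consistent with the flags above could look like the following. This is a sketch based only on the commands documented in this README; the actual train.py may differ in names, defaults, and behaviour.

```python
# Sketch of argparse flags matching the commands above (assumed, not the
# repo's actual code).
import argparse

parser = argparse.ArgumentParser(description="LSTM/GRU from scratch")
parser.add_argument("--train", action="store_true",
                    help="run training and save weights into weights/")
parser.add_argument("--test", action="store_true",
                    help="load precomputed weights and report test accuracy")
parser.add_argument("--model", default="lstm", choices=["lstm", "gru"],
                    help="cell type to use")
parser.add_argument("--hidden_unit", type=int, default=32,
                    help="hidden layer size (32, 64, 128 or 256)")
parser.add_argument("--iter", type=int, default=30,
                    help="number of training iterations / epochs")
args = parser.parse_args()
```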

Code structure

  • data_loader loads the data from the zip files in the data folder.
  • module defines the basic LSTM and GRU code (see the GRU sketch below).
  • train handles the input and sets up the model.
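
For comparison with the LSTM sketch above, here is a minimal single-step GRU cell in NumPy. Again illustrative, not the repo's module code; the weight layout and the update convention used here are assumptions.

```python
# A minimal single-step GRU cell in NumPy. Illustrative assumptions
# throughout; the repo's TensorFlow code may combine or order the gate
# weights differently.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Wr, Wh, bz, br, bh):
    """One GRU time step; each W* has shape (input_size + hidden_size, hidden_size)."""
    xh = np.concatenate([x, h_prev])
    z = sigmoid(xh @ Wz + bz)  # update gate
    r = sigmoid(xh @ Wr + br)  # reset gate
    h_cand = np.tanh(np.concatenate([x, r * h_prev]) @ Wh + bh)  # candidate state
    # One common convention; some texts swap the roles of z and (1 - z).
    return (1.0 - z) * h_prev + z * h_cand
```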

License

The MIT License (MIT) 2018 - Kaustubh Hiware.
