Deep-Learning-Personal-Notebooks

This collection of notebooks is based on the Dive into Deep Learning book. It was created to serve as a reference when working on future projects. All of the notes are written in PyTorch and the d2l library.
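For readers skimming the repo, here is a minimal example of the style used throughout the notebooks. It is a hedged sketch assuming only that PyTorch is installed; the d2l helpers are omitted to keep it self-contained:

```python
import torch
from torch import nn

# A single linear layer, the basic building block used throughout the notebooks
net = nn.Linear(in_features=4, out_features=1)
X = torch.randn(2, 4)   # a toy minibatch of 2 examples with 4 features each
print(net(X).shape)     # torch.Size([2, 1])
```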

Ralph Waldo Emerson: "Nothing great was ever achieved without enthusiasm..."

Steven Wright: "Everywhere is walking distance if you have the time..."

Lee Haney: "Exercise to stimulate, not to annihilate. The world wasn't formed in a day, and neither were we. Set small goals and build upon them..."

"The Man who loves walking will walk further than the man who loves the destination. When you fall in love with the journey, everything else takes care of itself. Trip, fall, pick yourself up. Get up, learn, do it over again..."

Study Plan:

  1. Basics: ✅

    • Linear Neural Networks
    • Multilayer Perceptrons
    • Builder's guide
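The "Basics" topics above boil down to models like the following multilayer perceptron. This is an illustrative sketch; the layer sizes are my own choices, not taken from the book:

```python
import torch
from torch import nn

# MLP: flatten input, affine layer -> ReLU nonlinearity -> affine output layer
mlp = nn.Sequential(
    nn.Flatten(),             # flatten e.g. 28x28 images into vectors
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),       # 10 class scores
)
X = torch.randn(8, 1, 28, 28)  # toy batch of 8 grayscale images
assert mlp(X).shape == (8, 10)
```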
  2. Convolutional Neural Networks ✅

    • LeNet → DenseNet
    • CNNs for Audio and Text (Maybe)
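Item 2 covers the progression from LeNet through DenseNet. The starting point can be sketched as a LeNet-style network; this is a rough reconstruction, not necessarily identical to the d2l version:

```python
import torch
from torch import nn

# LeNet-style CNN: two conv/pool blocks followed by three dense layers
lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5, padding=2), nn.Sigmoid(),
    nn.AvgPool2d(kernel_size=2, stride=2),
    nn.Conv2d(6, 16, kernel_size=5), nn.Sigmoid(),
    nn.AvgPool2d(kernel_size=2, stride=2),
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.Sigmoid(),
    nn.Linear(120, 84), nn.Sigmoid(),
    nn.Linear(84, 10),
)
X = torch.randn(1, 1, 28, 28)   # one 28x28 grayscale image
assert lenet(X).shape == (1, 10)
```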
  3. Review Probability and Information Theory (Deep Learning: Adaptive Computation and Machine Learning, Chapter III)

    • Estimators, Bias and Variance
    • Maximum Likelihood Estimation
    • Bayesian Statistics
    • Deep Feedforward Networks
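The estimation topics above have a compact worked instance: for a Gaussian, the maximum-likelihood estimates are the sample mean and the mean squared deviation, and the latter also illustrates estimator bias (it divides by n rather than n - 1). A plain-Python sketch with simulated data:

```python
import random

random.seed(0)
# simulate 10,000 draws from a Gaussian with mean 5 and std dev 2
data = [random.gauss(5.0, 2.0) for _ in range(10_000)]

# MLE for a Gaussian: mu_hat = sample mean, sigma2_hat = mean squared deviation
mu_hat = sum(data) / len(data)
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / len(data)  # biased: divides by n

print(mu_hat, sigma2_hat)   # close to the true values 5 and 4
```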
    
  4. Deep Learning: Adaptive Computation and Machine Learning, Chapter VII

    • Regularization for Deep Learning (applied to CNNs)
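Since the plan is to apply Chapter VII to CNNs, the two regularizers most likely to recur are weight decay and dropout. A minimal PyTorch sketch; the hyperparameter values are illustrative only:

```python
import torch
from torch import nn

net = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(),
    nn.Dropout(p=0.5),            # dropout regularization on hidden activations
    nn.Linear(64, 2),
)
# weight decay (L2 regularization) is applied through the optimizer
opt = torch.optim.SGD(net.parameters(), lr=0.1, weight_decay=1e-4)

net.train()                       # dropout is active in training mode
X = torch.randn(4, 20)
assert net(X).shape == (4, 2)
```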
  5. Optimization Algorithms d2l.ai chapter 12

  6. Deep Learning: Adaptive Computation and Machine Learning, Chapter IX

    • Convolutional Neural Networks from a maths perspective
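A useful fact for the maths perspective: the "convolution" in deep learning libraries is actually cross-correlation, Y[i, j] = sum_a sum_b X[i + a, j + b] * K[a, b]. Implementing it directly makes the definition concrete (a sketch written for clarity rather than speed):

```python
import torch

def corr2d(X, K):
    """2D cross-correlation of input X with kernel K (no padding, stride 1)."""
    h, w = K.shape
    Y = torch.zeros(X.shape[0] - h + 1, X.shape[1] - w + 1)
    for i in range(Y.shape[0]):
        for j in range(Y.shape[1]):
            Y[i, j] = (X[i:i + h, j:j + w] * K).sum()
    return Y

X = torch.arange(9.0).reshape(3, 3)
K = torch.tensor([[0.0, 1.0], [2.0, 3.0]])
Y = corr2d(X, K)   # [[19., 25.], [37., 43.]]
```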
  7. Computational Performance d2l.ai chapter 13 🔜

    • When talking about parallelization, do not forget to check the multi-GPU implementation shown in the AlexNet paper.
    • Implementation: cuda-convnet
  8. Computer Vision d2l.ai chapter 14

  9. Final Project 🔜

  10. Recurrent Neural Networks d2l.ai chapter 9-10

  11. Final Project ✅

  12. Attention Mechanisms and Transformers d2l.ai chapter 11
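The core of item 12 is one formula, softmax(Q K^T / sqrt(d)) V, i.e. scaled dot-product attention. It can be sketched directly in PyTorch; the shapes below are illustrative:

```python
import math
import torch

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d)) V — the core operation of the transformer."""
    d = Q.shape[-1]
    scores = Q @ K.transpose(-2, -1) / math.sqrt(d)
    weights = torch.softmax(scores, dim=-1)   # attention weights; rows sum to 1
    return weights @ V

Q = torch.randn(2, 5, 8)   # (batch, num_queries, d)
K = torch.randn(2, 7, 8)   # (batch, num_keys, d)
V = torch.randn(2, 7, 8)   # (batch, num_keys, d)
assert scaled_dot_product_attention(Q, K, V).shape == (2, 5, 8)
```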

  13. Natural Language Processing: Pretraining d2l.ai chapter 15

  14. Natural Language Processing: Applications d2l.ai chapter 16

  15. Final Project ✅

    • Machine Translation
    • Document Summarization
    • Document Generation

    • Transformers in Computer Vision
      • Diffusion Models
      • Video Understanding
      • Nice time to reconsider GAN + Transformers!
  16. Hyperparameter Optimization d2l.ai chapter 19 🔜

  17. Generative Adversarial Networks d2l.ai chapter 20

  18. Recommender Systems d2l.ai chapter 21

  19. Reinforcement Learning d2l.ai chapter 17

  20. Gaussian Processes d2l.ai chapter 18

    • Why study Gaussian Processes?
      • They provide a function space perspective of modelling, which makes understanding a variety of model classes, including deep neural networks, much more approachable
      • They have an extraordinary range of applications where they are SOTA, including active learning, hyperparameter learning, auto-ML, and spatiotemporal regression
      • Over the last few years, algorithmic advances have made Gaussian processes increasingly scalable and relevant, harmonizing with deep learning through frameworks such as GPyTorch
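To make the function-space perspective concrete, here is a minimal Gaussian process regression sketch: an RBF kernel plus the standard posterior mean K_*^T (K + sigma^2 I)^{-1} y. This is a toy illustration in raw PyTorch, not how GPyTorch actually does it:

```python
import torch

torch.manual_seed(0)

def rbf_kernel(A, B, length_scale=1.0):
    """RBF kernel: k(a, b) = exp(-||a - b||^2 / (2 * length_scale^2))."""
    d2 = (A.unsqueeze(1) - B.unsqueeze(0)).pow(2).sum(-1)
    return torch.exp(-d2 / (2 * length_scale ** 2))

# training data: noisy observations of sin(x)
X = torch.linspace(-3, 3, 20).unsqueeze(-1)
y = torch.sin(X.squeeze()) + 0.05 * torch.randn(20)

K = rbf_kernel(X, X) + (0.05 ** 2) * torch.eye(20)   # add observation noise
Xs = torch.tensor([[0.0]])                           # a single test input
Ks = rbf_kernel(X, Xs)                               # (20, 1) cross-covariances
mean = Ks.T @ torch.linalg.solve(K, y)               # posterior mean at Xs
print(float(mean))                                   # close to sin(0) = 0
```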

Citation

@article{zhang2021dive,
    title={Dive into Deep Learning},
    author={Zhang, Aston and Lipton, Zachary C. and Li, Mu and Smola, Alexander J.},
    journal={arXiv preprint arXiv:2106.11342},
    year={2021}
}