
NYU Deep Learning Course by Yann LeCun and Alfredo Canziani

Course Structure

  • History and Motivation
  • Evolution and DL
  • Neural Nets
  • SGD and backpropagation
  • Backprop in practice
  • NN training
  • Parameter transformation
  • Convolutional Nets
  • Natural signals' properties
  • 1-Dimensional Convolutional Nets
  • Optimization
  • Autograd
  • CNNs (again)
  • CNN applications
  • Recurrent Nets and attention
  • Training Recurrent Nets
  • Energy-based models
  • Self-supervised learning (SSL) and energy-based models (EBMs)
  • Autoencoders
  • Contrastive methods
  • Regularised latent variables
  • Training Variational Autoencoders (VAEs)
  • Sparsity
  • World models, Generative Adversarial Networks (GANs)
  • Training GANs
  • Self-supervised learning for computer vision (CV SSL)
  • Predictive Control
  • Activations
  • Losses
  • Prediction and Policy learning Under Uncertainty (PPUU)
  • Deep Learning for Natural Language Processing (NLP)
  • Attention and transformers
  • Graph Convolutional Networks (GCNs)
  • Structured prediction
  • Graphical Methods
  • Regularization and Bayesian methods
  • Inference for latent-variable EBMs
  • Training latent-variable EBMs

Disclaimer

This is my personal repository containing my notes and modifications of the course notebooks. I am not affiliated with NYU or Yann LeCun in any way; I am simply a student learning from the content they provide. Any mistakes in the solutions are mine and not the course's, so don't hesitate to point them out if you find any.
