
Lecture notes, resources, and programming assignments from the Deep Learning Specialization (a sequence of courses on deep learning) offered by DeepLearning.AI on Coursera.




deep-learning-specialization

The Deep Learning Specialization is a foundational program that will help us understand the capabilities, challenges, and consequences of deep learning and prepare us to participate in the development of leading-edge AI technology.

In this Specialization, we will build and train neural network architectures such as Convolutional Neural Networks, Recurrent Neural Networks, LSTMs, and Transformers, and learn how to improve them with strategies such as Dropout, BatchNorm, Xavier/He initialization, and more. We will master theoretical concepts and their industry applications using Python and TensorFlow, tackling real-world cases such as speech recognition, music synthesis, chatbots, machine translation, natural language processing, and more.
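As a taste of the initialization strategies mentioned above, here is a minimal NumPy sketch of Xavier and He weight initialization. This is an illustration, not the course's exact helper; the function name `initialize_layer` and its interface are my own.

```python
import numpy as np

def initialize_layer(n_in, n_out, method="he", seed=0):
    """Initialize a weight matrix of shape (n_out, n_in).

    'he' scales by sqrt(2 / n_in), suited to ReLU layers;
    'xavier' scales by sqrt(1 / n_in), suited to tanh layers.
    """
    rng = np.random.default_rng(seed)
    if method == "he":
        scale = np.sqrt(2.0 / n_in)
    elif method == "xavier":
        scale = np.sqrt(1.0 / n_in)
    else:
        raise ValueError(f"unknown method: {method}")
    W = rng.standard_normal((n_out, n_in)) * scale
    b = np.zeros((n_out, 1))  # biases can safely start at zero
    return W, b

W, b = initialize_layer(100, 50, method="he")
print(W.shape, b.shape)  # (50, 100) (50, 1)
```

Scaling by the fan-in keeps the variance of activations roughly constant across layers, which is what prevents signals from vanishing or exploding early in training.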

There are 5 courses in this Specialization:

Course 1: Neural Networks and Deep Learning

In the first course of the Deep Learning Specialization, we will study the foundational concepts of neural networks and deep learning.

  • Week 1 - Introduction to Deep Learning: Understand the significant technological trends driving deep learning development and where and how it’s applied.
  • Week 2 - Neural Networks Basics: Set up a machine learning problem with a neural network mindset and use vectorization to speed up your models.
  • Week 3 - Shallow Neural Networks: Build a neural network with one hidden layer using forward propagation and backpropagation.
  • Week 4 - Deep Neural Networks: Understand the key computations underlying deep learning, use them to build and train deep neural networks, and apply them to computer vision.
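The Week 3 material above can be sketched in a few lines of NumPy: a one-hidden-layer network with vectorized forward propagation and backpropagation, trained by plain gradient descent. This is a simplified illustration under my own naming choices, not the assignment's exact code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    # Vectorized forward pass: all m examples (the columns of X) at once.
    Z1 = params["W1"] @ X + params["b1"]
    A1 = np.tanh(Z1)                       # hidden-layer activation
    Z2 = params["W2"] @ A1 + params["b2"]
    A2 = sigmoid(Z2)                       # output probabilities
    return A1, A2

def backward(X, Y, A1, A2, params):
    # Gradients of the mean binary cross-entropy loss.
    m = X.shape[1]
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (params["W2"].T @ dZ2) * (1.0 - A1 ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return {"W1": dW1, "b1": db1, "W2": dW2, "b2": db2}

def bce_loss(A2, Y):
    m = Y.shape[1]
    return float(-(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)).sum() / m)

# Tiny demo: 2 input features, 4 hidden units, 5 examples.
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 5))
Y = (X.sum(axis=0, keepdims=True) > 0).astype(float)
params = {
    "W1": rng.standard_normal((4, 2)) * 0.01, "b1": np.zeros((4, 1)),
    "W2": rng.standard_normal((1, 4)) * 0.01, "b2": np.zeros((1, 1)),
}
_, A2 = forward(X, params)
loss_before = bce_loss(A2, Y)
for _ in range(200):                       # plain batch gradient descent
    A1, A2 = forward(X, params)
    grads = backward(X, Y, A1, A2, params)
    for k in params:
        params[k] -= 0.5 * grads[k]
_, A2 = forward(X, params)
loss_after = bce_loss(A2, Y)
print(loss_before, loss_after)
```

The key vectorization idea from Week 2 is visible here: every matrix product processes all training examples at once, with no explicit loop over examples.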

Course 2: Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

In the second course of the Deep Learning Specialization, we will open the deep learning black box to understand the processes that drive performance and generate good results systematically.

  • Week 1 - Practical Aspects of Deep Learning: Discover and experiment with various initialization methods, apply L2 regularization and dropout to avoid model overfitting, and use gradient checking to identify errors in a fraud detection model.
  • Week 2 - Optimization Algorithms: Develop your deep learning toolbox by adding more advanced optimizations, random mini-batching, and learning rate decay scheduling to speed up your models.
  • Week 3 - Hyperparameter Tuning, Batch Normalization and Programming Frameworks: Explore TensorFlow, a deep learning framework that allows you to build neural networks quickly and easily, and train a neural network on a TensorFlow dataset.
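Two of the Week 1–2 techniques above can be sketched directly in NumPy: inverted dropout and the L2 cost term, plus one common learning-rate decay schedule. These are illustrative helpers with names of my choosing, not the assignments' exact functions.

```python
import numpy as np

def dropout_forward(A, keep_prob, rng):
    """Inverted dropout: zero each unit with probability 1 - keep_prob,
    then divide by keep_prob so expected activations are unchanged
    (which is why no rescaling is needed at test time)."""
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return A * mask / keep_prob, mask

def l2_cost_term(weights, lambd, m):
    # L2 regularization added to the cost: (lambda / 2m) * sum ||W||_F^2
    return lambd / (2.0 * m) * sum(float((W ** 2).sum()) for W in weights)

def decayed_lr(alpha0, decay_rate, epoch):
    # One standard schedule: alpha = alpha_0 / (1 + decay_rate * epoch)
    return alpha0 / (1.0 + decay_rate * epoch)

rng = np.random.default_rng(0)
A = np.ones((4, 1000))
A_drop, mask = dropout_forward(A, keep_prob=0.8, rng=rng)
print(A_drop.mean())                  # close to 1.0 in expectation
print(decayed_lr(0.2, 1.0, epoch=3))  # 0.05
```

Both regularizers push the network toward simpler functions: L2 shrinks weights toward zero, while dropout prevents any unit from relying too heavily on specific other units.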

Course 3: Structuring Machine Learning Projects

In the third course of the Deep Learning Specialization, we will learn how to build a successful machine learning project and practice decision-making as a machine learning project leader.

  • ML Strategy [1] - Strategic Guidelines and Human-level Performance: Streamline and optimize your ML production workflow by implementing strategic guidelines for goal-setting and applying human-level performance to help define key priorities.
  • ML Strategy [2] - Error Analysis & End-to-end Deep Learning: Develop time-saving error analysis procedures to evaluate the most worthwhile options to pursue and gain intuition for how to split your data and when to use multi-task, transfer, and end-to-end deep learning.
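The human-level-performance heuristic from this course reduces to a simple comparison: the gap between training error and human-level error (a proxy for Bayes error) is the avoidable bias, and the gap between dev and training error is the variance; you work on whichever gap is larger. A sketch of that decision rule, with a function name of my own:

```python
def diagnose(human_error, train_error, dev_error):
    """Decide what to work on next, using human-level error as a proxy
    for Bayes error (the course's orthogonalization heuristic, sketched)."""
    avoidable_bias = train_error - human_error
    variance = dev_error - train_error
    return "reduce bias" if avoidable_bias > variance else "reduce variance"

# Human 1%, train 8%, dev 10%: bias gap 7% dwarfs variance gap 2%.
print(diagnose(0.01, 0.08, 0.10))  # reduce bias
# Human 1%, train 2%, dev 10%: variance gap 8% dominates.
print(diagnose(0.01, 0.02, 0.10))  # reduce variance
```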

Course 4: Convolutional Neural Networks

In the fourth course of the Deep Learning Specialization, we will understand how computer vision has evolved and become familiar with its exciting applications, such as autonomous driving, face recognition, reading radiology images, and more.

Course 5: Sequence Models

In the fifth course of the Deep Learning Specialization, we will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more.

  • Week 1 - Recurrent Neural Networks: Discover recurrent neural networks (RNNs) and several of their variants, including LSTMs, GRUs, and bidirectional RNNs, all of which perform exceptionally well on temporal data.
  • Week 2 - Natural Language Processing and Word Embeddings: Use word vector representations and embedding layers to train recurrent neural networks that perform well across a wide variety of applications, including sentiment analysis, named entity recognition, and neural machine translation.
  • Week 3 - Sequence Models and the Attention Mechanism: Augment your sequence models with an attention mechanism, an algorithm that helps your model decide where to focus given a sequence of inputs, and explore speech recognition and how to deal with audio data.
  • Week 4 - Transformers: Build the transformer architecture and apply it to natural language processing (NLP) tasks such as named entity recognition (NER) and question answering (QA).
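The core computation behind both the Week 3 attention mechanism and the Week 4 transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. A minimal single-head NumPy sketch (the course assignments build this with TensorFlow instead):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)              # similarity of queries to keys
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V, weights                   # weighted mix of the values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 8))  # 3 query positions, d_k = 8
K = rng.standard_normal((5, 8))  # 5 key positions
V = rng.standard_normal((5, 8))  # 5 value vectors
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

The softmax rows are exactly the "where to focus" decision described above: each output position is a convex combination of the value vectors, weighted by query-key similarity.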

Disclaimer

The solutions to the assignments uploaded here are only for reference.

  • In the Convolutional Neural Networks course (course 4), some datasets and pretrained models were removed from the assignments to keep the downloads lighter.