
📄 Natural Language Processing in TensorFlow

I successfully completed Natural Language Processing in TensorFlow, an online course offered by deeplearning.ai on Coursera. The four-week course was taught by Laurence Moroney and required about 4-5 hours of work per week.

Laurence Moroney gave clear, detailed instruction on building natural language processing systems with TensorFlow, including processing text through tokenization and representing sentences as vectors. The course also covered applying RNNs, GRUs, and LSTMs in TensorFlow, and training an LSTM on existing text to generate original poetry.

I am thankful for the skills I gained:

  • Natural Language Processing
  • Tokenization
  • Machine Learning
  • TensorFlow
  • RNNs

📝 About Course

If you are a software developer who wants to build scalable AI-powered algorithms, you need to understand how to use the tools to build them. This Specialization will teach you best practices for using TensorFlow, a popular open-source framework for machine learning.

In Course 3 of the deeplearning.ai TensorFlow Specialization, you will build natural language processing systems using TensorFlow. You will learn to process text, including tokenizing and representing sentences as vectors, so that they can be input to a neural network. You’ll also learn to apply RNNs, GRUs, and LSTMs in TensorFlow. Finally, you’ll get to train an LSTM on existing text to create original poetry!

The Machine Learning course and Deep Learning Specialization from Andrew Ng teach the most important and foundational principles of Machine Learning and Deep Learning. This new deeplearning.ai TensorFlow Specialization teaches you how to use TensorFlow to implement those principles so that you can start building and applying scalable models to real-world problems. To develop a deeper understanding of how neural networks work, we recommend that you take the Deep Learning Specialization.

📌 Week 1


Sentiment in text

The first step in understanding sentiment in text, particularly when training a neural network to do so, is the tokenization of that text. This is the process of converting the text into numeric values, with a number representing a word or a character. This week you'll learn about the Tokenizer and pad_sequences APIs in TensorFlow and how they can be used to prepare and encode text and sentences to get them ready for training neural networks!
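The Tokenizer and pad_sequences workflow described above can be sketched as follows; the sentences, `num_words` cap, and `maxlen` are illustrative values, not ones from the course notebooks:

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = [
    "I love my dog",
    "I love my cat",
    "Do you think my dog is amazing?",
]

# num_words caps the vocabulary size; oov_token stands in for
# words not seen during fitting
tokenizer = Tokenizer(num_words=100, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)

# Convert each sentence to a list of word indices, then pad every
# sequence to the same length so they can be batched for training
sequences = tokenizer.texts_to_sequences(sentences)
padded = pad_sequences(sequences, maxlen=8, padding="post", truncating="post")

print(tokenizer.word_index)
print(padded)
```

Padding `"post"` appends zeros after each sequence; the default `"pre"` prepends them instead.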

📘 Notebooks

Lesson 1

Lesson 2

Lesson 3 - Sarcasm

Exercise - Solved

📌 Week 2


Word Embeddings

These tokens are mapped to vectors in a high-dimensional space. With embeddings and labelled examples, these vectors can then be tuned so that words with similar meanings point in similar directions in the vector space. This begins the process of training a neural network to understand sentiment in text -- you'll start by looking at movie reviews, training a neural network on texts labelled 'positive' or 'negative', and determining which words in a sentence drive those meanings.
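A minimal sketch of a sentiment classifier built around an Embedding layer, in the style this week covers; the vocabulary size, embedding dimension, and layer widths are illustrative hyperparameters, not the course's exact values:

```python
import tensorflow as tf

vocab_size = 10000     # illustrative vocabulary cap
embedding_dim = 16     # each word index maps to a 16-dim trainable vector

model = tf.keras.Sequential([
    # The Embedding layer learns one vector per word index
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    # Average the word vectors into a single sentence vector
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(24, activation="relu"),
    # Single sigmoid unit: positive vs. negative sentiment
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
```

Training on padded sequences with 0/1 labels tunes the embedding vectors so that sentiment-bearing words cluster together.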

📘 Notebooks

Lesson 1

Lesson 2

Sarcasm Classifier

Exercise - Solved

📌 Week 3


Sequence models

This week covers a variety of model architectures used to train models that understand context in sequences!
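One such architecture is a stack of bidirectional LSTMs, which read each sequence in both directions; this is a sketch with illustrative dimensions, not the course's exact model:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 64),
    # return_sequences=True so the second LSTM receives a full sequence
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(24, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
```

Swapping the LSTM layers for `tf.keras.layers.GRU` or a plain `tf.keras.layers.SimpleRNN` gives the other sequence-model variants the week compares.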

📘 Notebooks

Lesson 1

Lesson 2 - Sarcasm

Exercise - Solved

📌 Week 4


Sequence models and literature

Given a body of words, you could conceivably predict the word most likely to follow a given word or phrase, and once you've done that, do it again, and again. With that in mind, this week you'll build a poetry generator. It's trained on the lyrics of traditional Irish songs, and can be used to produce beautiful-sounding verse of its own!
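The training data for such a generator is built by turning each line of the corpus into n-gram prefixes, where the last token of each prefix is the next-word label. A sketch of that preparation step, using two made-up sample lines rather than the course's actual lyrics dataset:

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Illustrative sample lines, not the course's Irish-song corpus
corpus = [
    "come all ye maidens young and fair",
    "and you that are blooming in your prime",
]

tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1

# For each line, emit every prefix of length >= 2 as a training sequence
input_sequences = []
for line in corpus:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(token_list)):
        input_sequences.append(token_list[: i + 1])

max_len = max(len(s) for s in input_sequences)
padded = pad_sequences(input_sequences, maxlen=max_len, padding="pre")

# Inputs are all tokens but the last; the label is the final token
xs, labels = padded[:, :-1], padded[:, -1]
```

An LSTM trained on `xs` to predict `labels` (one-hot over `total_words`) can then generate verse by repeatedly sampling the next word and appending it to the seed text.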

📘 Notebooks

Lesson 1

Lesson 2

Exercise - Solved

🔗 Course Link

📄 Certificate


Parth Maniar
