
GradientDescent

Overview/Summary

This is an implementation of the gradient descent algorithm that finds the values of the parameters (m, b) which minimize the cost function, i.e. the line of best fit for the data.
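
At each iteration the parameters are moved a small step against the gradient of the cost. The sketch below illustrates one such step; the name GradientStep, the learning_rate argument, and the assumption that data is a NumPy array of shape (N, 2) (feature in column 0, label in column 1) are illustrative choices, not necessarily what this repository uses.

def GradientStep(m, b, data, learning_rate):
    # Illustrative sketch: compute the gradient of the mean squared error
    # with respect to m and b, then step against it.
    N = len(data)
    m_gradient = 0.0
    b_gradient = 0.0
    for itr in range(N):
        feature = data[itr, 0]
        label = data[itr, 1]
        error = label - ((m * feature) + b)
        m_gradient += -(2.0 / N) * feature * error
        b_gradient += -(2.0 / N) * error
    new_m = m - (learning_rate * m_gradient)
    new_b = b - (learning_rate * b_gradient)
    return new_m, new_b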

Visualize data

(Figure: the raw data.)

Cost Function - Sum of Squared Error

A simple sum-of-squared-errors (SSE) cost function for measuring how well the line fits the data; the implementation below averages the squared errors over the training examples.

def CostFunction(m, b, data):
    # data is a NumPy array of shape (N, 2): column 0 = feature, column 1 = label
    m_examples = len(data)              # number of training examples
    sumError = 0
    for itr in range(m_examples):
        feature = data[itr, 0]
        label = data[itr, 1]
        predLabel = (m * feature) + b   # prediction of the line y = m*x + b
        sumError += (label - predLabel) ** 2
    sumError = sumError / m_examples    # average the squared errors
    return sumError
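
A minimal sketch of how CostFunction could be driven from a training loop, assuming the dataset is a two-column CSV read with pandas and using the illustrative GradientStep from the overview; the file name, learning rate and iteration count are placeholders.

import pandas as pd

# Placeholder file name; the actual dataset in the repository may differ.
data = pd.read_csv('data.csv', header=None).values   # NumPy array of shape (N, 2)

m, b = 0.0, 0.0           # initial guess for the line y = m*x + b
learning_rate = 0.0001    # illustrative value
num_iterations = 1000     # illustrative value

errors = []
for _ in range(num_iterations):
    m, b = GradientStep(m, b, data, learning_rate)
    errors.append(CostFunction(m, b, data))

print('Final parameters: m = {:.4f}, b = {:.4f}'.format(m, b))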

Visualize data plus fitted line

  • Before fitting the parameters

    (Figure: the data with the initial, unfitted line.)

  • After running gradient descent

    (Figure: the data with the line of best fit.)
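
Plots like the ones above can be produced with matplotlib; a minimal sketch, assuming data, m and b come from the training-loop sketch earlier.

import numpy as np
import matplotlib.pyplot as plt

# Scatter the raw points and overlay the current line y = m*x + b
plt.scatter(data[:, 0], data[:, 1], s=10, label='data')
x_line = np.linspace(data[:, 0].min(), data[:, 0].max(), 100)
plt.plot(x_line, (m * x_line) + b, color='red', label='y = m*x + b')
plt.xlabel('feature')
plt.ylabel('label')
plt.legend()
plt.show()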

Plot - Error

The error decreases as the parameters move toward the minimum of the cost function.

(Figure: error vs. iteration.)
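
The curve can be reproduced from the errors list collected in the training-loop sketch above; a minimal matplotlib example.

import matplotlib.pyplot as plt

# errors holds one cost value per gradient descent iteration
plt.plot(errors)
plt.xlabel('iteration')
plt.ylabel('sum of squared error')
plt.show()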

Dependencies

  • numpy
  • pandas (read the dataset)
  • matplotlib (plotting)

References

Siraj Raval - The Math of Intelligence (Intro), YouTube

Andrew Trask - A Neural Network in 13 lines of Python (Part 2: Gradient Descent)
