ML-Algorithms-from-Scratch

In this repository, I will be implementing popular machine learning algorithms right from scratch.

On most complex, non-linear data, tree-based algorithms such as Decision Tree, Random Forest, and XGBoost work better than most other algorithms. But have you ever wondered how these algorithms actually work?
Here you will understand how a Decision Tree works and then implement it in Python using NumPy. Here is an example:
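Below is a minimal sketch of a classification tree built with NumPy, using Gini impurity and a fixed maximum depth. The function names (`gini`, `best_split`, `build_tree`, `predict_one`) and the `max_depth` default are illustrative assumptions, not necessarily the code in this repository.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Find the (feature, threshold) pair that most reduces weighted Gini impurity."""
    best_feature, best_threshold, best_score = None, None, gini(y)
    for feature in range(X.shape[1]):
        for threshold in np.unique(X[:, feature]):
            left = X[:, feature] <= threshold
            if left.all() or (~left).all():
                continue  # split puts everything on one side
            score = (left.sum() * gini(y[left]) + (~left).sum() * gini(y[~left])) / len(y)
            if score < best_score:
                best_feature, best_threshold, best_score = feature, threshold, score
    return best_feature, best_threshold

def build_tree(X, y, depth=0, max_depth=3):
    """Recursively grow a binary tree; leaves store the majority class."""
    if depth == max_depth or len(np.unique(y)) == 1:
        values, counts = np.unique(y, return_counts=True)
        return values[np.argmax(counts)]            # leaf: majority label
    feature, threshold = best_split(X, y)
    if feature is None:                             # no split improves impurity
        values, counts = np.unique(y, return_counts=True)
        return values[np.argmax(counts)]
    left = X[:, feature] <= threshold
    return {
        "feature": feature,
        "threshold": threshold,
        "left": build_tree(X[left], y[left], depth + 1, max_depth),
        "right": build_tree(X[~left], y[~left], depth + 1, max_depth),
    }

def predict_one(tree, x):
    """Walk the tree until a leaf (a plain label) is reached."""
    while isinstance(tree, dict):
        tree = tree["left"] if x[tree["feature"]] <= tree["threshold"] else tree["right"]
    return tree

# Toy usage: one feature, two classes separated at x = 3
X = np.array([[2.0], [3.0], [10.0], [11.0]])
y = np.array([0, 0, 1, 1])
tree = build_tree(X, y)
print(predict_one(tree, np.array([2.5])), predict_one(tree, np.array([10.5])))  # -> 0 1
```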

In this section, we implement the KNN algorithm from scratch. KNN is a classification algorithm that determines which group a data point belongs to by looking at the labelled data points around it.
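A minimal sketch of KNN with NumPy is shown below: compute the distance from a query point to every training point, take the k closest labels, and let the majority vote decide. The function name `knn_predict` and the toy data are illustrative assumptions, not necessarily the code in this repository.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Predict a label for each row of X_test by majority vote among its k nearest training points."""
    predictions = []
    for x in X_test:
        # Euclidean distance from x to every training point
        distances = np.sqrt(np.sum((X_train - x) ** 2, axis=1))
        # Indices of the k closest training points
        nearest = np.argsort(distances)[:k]
        # Majority vote among their labels
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        predictions.append(labels[np.argmax(counts)])
    return np.array(predictions)

# Toy usage: two clusters in 2-D
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [8.0, 8.0], [7.8, 8.2]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([[1.1, 0.9], [8.1, 7.9]]), k=3))  # -> [0 1]
```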


In this section, we implement the K-Means algorithm from scratch and show an example application: reducing the number of colors in an image. K-Means is an entry-level unsupervised clustering algorithm. In contrast to supervised methods, in unsupervised learning we have data points but no labels. Based on the distances between points, the algorithm finds k groups in a given dataset.
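Below is a minimal sketch of K-Means in NumPy, assuming random initialization from the data points and plain Euclidean distance. The `kmeans` function name, the parameters, and the color-reduction snippet in the comments are illustrative, not necessarily the code in this repository.

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Cluster the rows of X into k groups; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids as k distinct randomly chosen data points
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: each point goes to its nearest centroid (Euclidean distance)
        distances = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = np.argmin(distances, axis=1)
        # Update step: each centroid becomes the mean of the points assigned to it
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break  # converged: centroids stopped moving
        centroids = new_centroids
    return centroids, labels

# Example: color reduction. Treating each pixel of a hypothetical (H, W, 3) `image`
# array as a 3-D point, every pixel is replaced by the centroid of its cluster:
#   pixels = image.reshape(-1, 3).astype(float)
#   centroids, labels = kmeans(pixels, k=16)
#   reduced = centroids[labels].reshape(image.shape)
```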
