Topics covered:
- Analytical solution of Linear Regression using the pseudo-inverse (Moore-Penrose inverse; PRML by Bishop)
- Time taken to compute the analytical solution with increasing sample size
- Linear Regression with optimization using gradient descent
- Impact of increasing sample size and L2-regularization parameter on test error
- Impact of increasing L1-regularization parameter on test error and model weights
- Elastic net effect on model weights where features are correlated
- Linear Classification using logistic regression
- Impact of increasing sample size and L2-regularization parameter on test error
- Sampling from a known distribution
- Rejection Sampling
- Importance Sampling
- Markov Chain Monte Carlo Sampling
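The analytical solution from the first bullet can be sketched with `np.linalg.pinv`; the toy data here (y = 3x + 2 plus Gaussian noise) is illustrative and not the repo's actual dataset:

```python
import numpy as np

# Illustrative data: y = 3x + 2 plus Gaussian noise (not the repo's dataset)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=100)

# Design matrix with a bias column
Phi = np.hstack([np.ones((100, 1)), X])

# Closed-form weights via the Moore-Penrose pseudo-inverse (PRML, Bishop)
w = np.linalg.pinv(Phi) @ y
print(w)  # close to [2, 3]
```

`np.linalg.pinv` handles rank-deficient design matrices gracefully, which is why it is preferred over inverting `Phi.T @ Phi` directly.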
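Linear regression via gradient descent with an L2 penalty (ridge) might look like the following sketch; the learning rate and penalty strength are illustrative values, not the ones used in the experiments:

```python
import numpy as np

# Same kind of toy data: y = 3x + 2 plus noise (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=200)
Phi = np.hstack([np.ones((200, 1)), X])

w = np.zeros(2)
lr, lam = 0.1, 0.01  # illustrative learning rate and L2 strength
for _ in range(2000):
    # Gradient of mean squared error plus the L2 penalty term
    grad = Phi.T @ (Phi @ w - y) / len(y) + lam * w
    w -= lr * grad
print(w)  # close to [2, 3], shrunk slightly by the penalty
```

Increasing `lam` pulls the weights toward zero, which is exactly the effect the test-error experiments in the bullets above explore.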
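A minimal sketch of L2-regularized logistic regression trained by gradient descent, on made-up Gaussian-blob data (the blob centers, learning rate, and penalty are all assumptions for illustration):

```python
import numpy as np

# Two illustrative Gaussian blobs for binary classification
rng = np.random.default_rng(0)
X0 = rng.normal(-1, 0.7, size=(100, 2))
X1 = rng.normal(+1, 0.7, size=(100, 2))
X = np.hstack([np.ones((200, 1)), np.vstack([X0, X1])])  # bias column
y = np.array([0] * 100 + [1] * 100)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w = np.zeros(3)
lr, lam = 0.5, 0.01  # illustrative learning rate and L2 strength
for _ in range(2000):
    p = sigmoid(X @ w)
    # Gradient of the mean cross-entropy loss plus the L2 penalty
    grad = X.T @ (p - y) / len(y) + lam * w
    w -= lr * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
```

The gradient has the same `X.T @ (prediction - y)` shape as linear regression, which keeps the two implementations symmetric.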
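Rejection sampling can be sketched as follows for an unnormalized bimodal target with a uniform proposal; the target density and envelope constant are illustrative choices, not the repo's exact setup:

```python
import numpy as np

# Unnormalized bimodal target: two Gaussian bumps at x = -2 and x = +2
def p_tilde(x):
    return np.exp(-0.5 * (x - 2) ** 2) + np.exp(-0.5 * (x + 2) ** 2)

rng = np.random.default_rng(0)
k_q = 1.1  # envelope height; p_tilde never exceeds ~1.0003 on [-4, 4]
samples = []
while len(samples) < 5000:
    x = rng.uniform(-4, 4)   # draw from the uniform proposal q
    u = rng.uniform(0, k_q)  # uniform under the envelope
    if u <= p_tilde(x):      # accept with probability p_tilde(x) / k_q
        samples.append(x)
samples = np.array(samples)
print(samples.mean())  # near 0, by symmetry of the target
```

The acceptance rate falls as the envelope gets looser, which is the usual argument for why rejection sampling scales poorly in high dimensions.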
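Importance sampling can be sketched by estimating an expectation under one Gaussian using samples from a wider one; the target, proposal, and test statistic here are all illustrative:

```python
import numpy as np

# Estimate E_p[x^2] for p = N(0, 1), sampling from the wider proposal q = N(0, 2)
rng = np.random.default_rng(0)
xs = rng.normal(0, 2, size=100_000)

def log_pdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Importance weights p(x) / q(x), computed in log space for stability
w = np.exp(log_pdf(xs, 0, 1) - log_pdf(xs, 0, 2))
estimate = np.mean(w * xs ** 2)  # close to 1, the variance of N(0, 1)
```

Choosing a proposal with heavier tails than the target keeps the weights bounded; the reverse choice can make the estimator's variance blow up.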
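For MCMC, a minimal Metropolis-Hastings sketch with a Gaussian random-walk proposal, targeting a standard normal for illustration (the target, proposal scale, and burn-in length are assumptions):

```python
import numpy as np

# Log of the (unnormalized) target density: standard normal
def log_p(x):
    return -0.5 * x ** 2

rng = np.random.default_rng(0)
x = 0.0
chain = []
for _ in range(20_000):
    prop = x + rng.normal(0, 1.0)  # symmetric random-walk proposal
    # Accept with probability min(1, p(prop) / p(x)), done in log space
    if np.log(rng.uniform()) < log_p(prop) - log_p(x):
        x = prop
    chain.append(x)
chain = np.array(chain[5_000:])  # discard burn-in
```

Because the proposal is symmetric, the Hastings correction term cancels and only the target ratio appears in the acceptance test.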
Libraries used:
- numpy
- math
- time
- seaborn
- matplotlib
- tqdm
Generated synthetic data for regression and classification with user-defined variance.
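A data generator with user-defined variance might be sketched like this; the function names, the y = 3x + 2 regression target, and the blob centers are hypothetical, not the repo's actual code:

```python
import numpy as np

def make_regression_data(n=200, noise_var=0.5, seed=0):
    """Hypothetical generator: y = 3x + 2 plus noise with user-defined variance."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, size=(n, 1))
    y = 3 * X[:, 0] + 2 + rng.normal(0, np.sqrt(noise_var), size=n)
    return X, y

def make_classification_data(n=200, noise_var=0.5, seed=0):
    """Hypothetical generator: two Gaussian blobs whose spread is noise_var."""
    rng = np.random.default_rng(seed)
    half = n // 2
    X0 = rng.normal(-1, np.sqrt(noise_var), size=(half, 2))
    X1 = rng.normal(+1, np.sqrt(noise_var), size=(half, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * half + [1] * half)
    return X, y
```

Passing the variance (rather than the standard deviation) as the knob makes the "user-defined variance" experiments above directly parameterizable.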
Coded the Adam optimizer from scratch, drawing inspiration from https://towardsdatascience.com/the-math-behind-adam-optimizer-c41407efe59b
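A from-scratch Adam step follows the standard update (first and second moment estimates with bias correction); this sketch and its quadratic sanity check are illustrative, not the repo's exact implementation:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns new parameters and moment estimates."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Sanity check: minimize f(w) = (w - 3)^2
w = np.array([0.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 5001):
    grad = 2 * (w - 3)
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
print(w)  # approaches 3
```

Note that the timestep `t` starts at 1; starting it at 0 would divide by zero in the bias-correction terms.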
References:
- https://github.com/MadhumithaKannan/linear-regression-using-only-numpy
- https://www.geeksforgeeks.org/how-to-split-data-into-training-and-testing-in-python-without-sklearn/
- https://medium.com/@Suraj_Yadav/compute-performance-metrics-from-scratch-53025140fe1d
- https://inria.github.io/scikit-learn-mooc/overfit/learning_validation_curves_slides.html