
adamLutPhi/MachineLearner

Machine Learner

A machine-learning repository and its algorithms

"Life is a Set of random variables of opportunities"

  • A random variable is:
  1. Neither random: it always follows a distribution
  2. Nor a variable: it is a function of probability density (i.e. frequency)

The random-variable quote is from Professor Krishna Jagannathan, IIT Madras (from the course Probability Foundations for Electrical Engineers)

The objective is to make the most of it, effectively, roughly 90% of the time

🎮 Mixed-Coding mode

1. Focused
The best learning is the kind where you push yourself to face off against poly-shaped, spaghetti-monster complexity (like this one)

2. Relaxed
Keep space in mind and call for relaxation at the early signs:

  • Creativity starts lacking, and new ideas fade away
  • Novelty vanishes into thin air, and a cloud of routine takes over (its shadow appears everywhere you go)

Currently learning

  • Use an RNG (random number generator), e.g. StableRNG

  • Divide & Conquer algorithm (see the repository: Cause & Effect)
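The seeded-RNG idea above can be sketched briefly. StableRNG is a Julia package for reproducible random streams; as an analogous (not equivalent) sketch, Python's standard library offers a dedicated, seedable RNG instance. The seed value here is an arbitrary illustrative choice:

```python
import random

# A dedicated, seeded RNG instance (not the shared global state), so
# experiments can be replayed. Analogous in spirit to Julia's StableRNG,
# though Python only guarantees reproducibility within one interpreter version.
def make_rng(seed: int) -> random.Random:
    """Return an independent RNG seeded deterministically."""
    return random.Random(seed)

rng_a = make_rng(42)
rng_b = make_rng(42)
# Two RNGs with the same seed produce identical streams.
assert [rng_a.random() for _ in range(3)] == [rng_b.random() for _ in range(3)]
```

Passing the RNG object explicitly, instead of seeding global state, keeps experiments reproducible even when several components draw random numbers.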

Machine Learning Algorithms

  • Outline a Widrow-Hoff (LMS) algorithm
  • Outline a Green's function algorithm
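The Widrow-Hoff (least mean squares) rule updates weights toward the target by w ← w + η·(y − w·x)·x. A minimal sketch in plain Python; the learning rate, epoch count, and toy data are illustrative assumptions, not part of this repository:

```python
# Widrow-Hoff / LMS: for each sample (x, y), nudge the weights in the
# direction that reduces the squared prediction error.
def widrow_hoff(samples, lr=0.1, epochs=50):
    n = len(samples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, y in samples:
            pred = sum(wi * xi for wi, xi in zip(w, x))   # w . x
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

# Toy problem: learn y = 2*x + 1, with a constant 1.0 input acting as bias.
data = [([x, 1.0], 2 * x + 1) for x in (0.0, 0.5, 1.0, 1.5, 2.0)]
w = widrow_hoff(data)   # converges near [2.0, 1.0]
```

Because the toy data is noise-free, the true weights are a fixed point of the update, so the iteration settles close to [2.0, 1.0].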

Open Questions

  • How to build a neural-network model (learning algorithm, error function `erf`, optimization)?
  • Which differentiation module to pick? 🤔
  • How could we implement an optimization function?

Credits

Prof. Srinivasa Sengupta, Department of Electronics and Electrical Communication Engineering, IIT Kharagpur.

Special Thanks

To the heroes behind the scenes: a list of human beings and tutors without whom I couldn't have done this project, in alphabetical order:

Lessons learned

  1. The Ziggurat function
  • Opens up the possibility for a statistical repo 💡
  • For more info, please visit the Discussion

Papers Used

- The Ziggurat paper, by Christopher D. McFarland: "A Modified Ziggurat Algorithm for Generating Exponentially and Normally Distributed Pseudorandom Numbers" (including the author's view on the tail-generation issue)

A good point of this paper: it does not use rejection regions, which makes it interesting for application (please review p. 3, if you will)

> We do not improve upon these approaches here and, instead, reuse previous techniques ... Overall, the ZA is ideal for distributions with infrequent sampling from the tail, i.e. not heavy-tailed distributions.

The paper can be found here: A modified ziggurat algorithm for generating exponentially- and normally-distributed pseudorandom numbers
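For context on what the ziggurat algorithm speeds up, here is a hedged baseline sketch (not the ziggurat itself, and not from the paper): inverse-transform sampling for the Exp(1) distribution, which is correct but pays for a logarithm on every draw. The ziggurat covers the density with equal-area layers so most draws need only a table lookup and a comparison:

```python
import math
import random

# Baseline the ziggurat is usually benchmarked against: for Exp(1),
# F(x) = 1 - e^{-x}, so F^{-1}(u) = -ln(1 - u).
def exp_inverse_transform(rng: random.Random) -> float:
    u = rng.random()            # u ~ Uniform(0, 1)
    return -math.log1p(-u)      # log1p avoids precision loss for small u

rng = random.Random(0)
samples = [exp_inverse_transform(rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)   # Exp(1) has mean 1, so this is near 1.0
```

`log1p(-u)` rather than `log(1 - u)` is a small numerical-stability choice; the distribution sampled is identical.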

YouTube

A great neural-network course, clearly explained by the humble Prof. S. Sengupta (S: Srinivasa) [R.I.P.], from NPTEL India, right here

WIP (Work in Progress)

This project is a seed sown in the ground 🌱

If you do not mind, have free time, and can lend a hand 🤝, please step in: your help would be much appreciated. Thank you 🙏

Disclaimer

The author will not be held responsible for any immature actions or any signs of code abuse, at all

Author
