Member: 🌜 Ymanzi 🌛
Implement a Multilayer Perceptron Library from scratch
- Cross Entropy
- Mean Squared Error
- Sigmoid
- Tanh
- ReLU
- SoftMax
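As a minimal NumPy sketch (the function names here are illustrative, not necessarily the library's actual API), the listed losses and activations can be written as:

```python
import numpy as np

def sigmoid(z):
    # squashes each value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    # keeps positive values, zeroes the rest
    return np.maximum(0.0, z)

def softmax(z):
    # subtract the row max for numerical stability before exponentiating
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is one-hot, y_pred holds probabilities (e.g. softmax output);
    # eps guards against log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred + eps), axis=-1))
```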
Regularization is a set of techniques that slightly modify the learning algorithm so that the model generalizes better, improving performance on unseen data and reducing overfitting.
- L1/L2 Regularization : L1/L2 regularization reduces the risk of overfitting by adding a penalty on the magnitude of the weights to the loss (the sum of absolute values for L1, the sum of squares for L2), which keeps the weights small.
- Dropout : To apply dropout, we randomly select a subset of the units and clamp their output to zero, regardless of the input; this effectively removes those units from the model for that training step.
- DropConnect : We apply the same idea to individual weights instead of whole units, randomly zeroing entries of the weight matrices during training.
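A possible sketch of the L1/L2 penalty and its gradient contribution (the function names and the `l1`/`l2` coefficient parameters are assumptions for illustration, not the library's API):

```python
import numpy as np

def regularized_loss(base_loss, weights, l1=0.0, l2=0.0):
    # L1 adds l1 * sum(|w|); L2 adds (l2 / 2) * sum(w^2)
    penalty = l1 * sum(np.abs(w).sum() for w in weights)
    penalty += 0.5 * l2 * sum((w ** 2).sum() for w in weights)
    return base_loss + penalty

def regularized_grad(grad_w, w, l1=0.0, l2=0.0):
    # gradient of the penalty w.r.t. w: l1 * sign(w) + l2 * w
    return grad_w + l1 * np.sign(w) + l2 * w
```

Biases are usually left out of the penalty, which is why only the weight matrices are passed in here.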
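Dropout and DropConnect can both be sketched as random binary masks; this version uses "inverted" scaling by 1/(1 - p) so the expected value is unchanged at test time (the names and the `training` flag are illustrative assumptions, not the library's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop, training=True):
    # zero each unit's output with probability p_drop; scale the
    # survivors so the expected activation stays the same
    if not training or p_drop == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

def dropconnect(weights, p_drop, training=True):
    # same idea applied to the weight matrix instead of unit outputs
    if not training or p_drop == 0.0:
        return weights
    mask = rng.random(weights.shape) >= p_drop
    return weights * mask / (1.0 - p_drop)
```

At inference time (`training=False`) both functions return their input untouched, so no rescaling is needed there.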