Programming Exercise 5: Feed Forward Single/Multiple-Hidden Layer Classifier for MNIST Dataset
A Python (scikit-learn) implementation that explores how different hyperparameters affect a feed-forward neural network with one or more fully connected hidden layers.
A brief analysis of the results is provided in Portuguese. It was submitted as an assignment for a graduate course named Connectionist Artificial Intelligence at UFSC, Brazil.
In short, multiple normalization methods are evaluated on a single-hidden-layer FFNET for classifying handwritten digits from the MNIST dataset, sweeping over training algorithms, learning rates (alpha), numbers of epochs, and activation functions. The best configurations are then applied to multi-layer networks of fully connected perceptrons for comparison.
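The sweep described above can be sketched with scikit-learn's `MLPClassifier`. This is a minimal illustration, not the assignment's original code: it uses `load_digits` as a lightweight stand-in for MNIST, and the hyperparameter values shown are examples, not the ones from the experiments.

```python
from sklearn.datasets import load_digits
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import MinMaxScaler

# load_digits (8x8 digits) stands in for MNIST to keep the sketch fast
X, y = load_digits(return_X_y=True)
X = MinMaxScaler().fit_transform(X)  # one of the normalizations compared
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y
)

clf = MLPClassifier(
    hidden_layer_sizes=(64,),   # single hidden layer; e.g. (64, 32) for multi-layer
    activation="relu",          # activation function is one of the swept parameters
    solver="adam",              # training algorithm is another swept parameter
    learning_rate_init=0.001,   # learning rate (alpha)
    max_iter=200,               # upper bound on training epochs
    random_state=0,
)
clf.fit(X_train, y_train)
print(f1_score(y_test, clf.predict(X_test), average="macro"))
```

Repeating this with different scalers, solvers, activations, and learning rates reproduces the kind of grid explored in the assignment.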
Before normalization | MinMax normalization | MaxAbs normalization
---|---|---
L2 normalization | (x - u) / s normalization | Quantile-Uniform normalization
Quantile-Normal normalization | |
Confusion matrix of the multi-layer experiment with the highest F1-score (0.93).