fredericoschardong/MNIST-hyper-parameterization


Programming Exercise 5: Feed Forward Single/Multiple-Hidden Layer Classifier for MNIST Dataset

Description

A Python (scikit-learn-based) implementation that explores how different hyperparameters affect a feed-forward neural network with one or more fully connected hidden layers.

A brief analysis of the results is provided in Portuguese. It was submitted as an assignment for a graduate course named Connectionist Artificial Intelligence at UFSC, Brazil.

In short, several normalization methods are evaluated on a single-hidden-layer feed-forward network classifying handwritten digits from the MNIST dataset, sweeping over training algorithms, learning rate (alpha), number of epochs, and activation functions. The best configurations are then run on networks with multiple fully connected hidden layers for comparison.
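A sweep like the one described above can be sketched with scikit-learn's `MLPClassifier` and `GridSearchCV`. This is a minimal illustration, not the assignment's actual script: the grid values, the 8x8 `load_digits` stand-in for full MNIST, and the hidden-layer size are all assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier

# Small 8x8 digits dataset as a fast stand-in for full MNIST.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Illustrative grid: training algorithm (solver), alpha, activation function.
# Note: in scikit-learn, `alpha` is the L2 penalty; the step size is
# controlled separately by `learning_rate_init`.
param_grid = {
    "solver": ["adam", "sgd"],
    "alpha": [1e-4, 1e-2],
    "activation": ["relu", "logistic"],
}
search = GridSearchCV(
    MLPClassifier(hidden_layer_sizes=(50,), max_iter=50, random_state=0),
    param_grid, scoring="f1_macro", cv=2, n_jobs=-1,
)
search.fit(X_train, y_train)
print(search.best_params_, round(search.best_score_, 3))
```

Each grid point is cross-validated and scored with macro-averaged F1, matching the metric reported in the results below.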

Normalization

- No normalization (raw data)
- MinMax normalization
- MaxAbs normalization
- L2 normalization
- Standard-score normalization, (x - u) / s
- Quantile-uniform normalization
- Quantile-normal normalization
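Each of the methods above maps to a scikit-learn preprocessing class. The sketch below shows that mapping; the dictionary layout and the random toy data are assumptions, only the class names are scikit-learn's.

```python
import numpy as np
from sklearn.preprocessing import (
    MaxAbsScaler, MinMaxScaler, Normalizer, QuantileTransformer, StandardScaler,
)

# One transformer per normalization strategy listed above; None means raw data.
scalers = {
    "none": None,
    "minmax": MinMaxScaler(),         # scale each feature to [0, 1]
    "maxabs": MaxAbsScaler(),         # divide each feature by its max absolute value
    "l2": Normalizer(norm="l2"),      # scale each sample to unit L2 norm
    "standard": StandardScaler(),     # (x - u) / s per feature
    "quantile-uniform": QuantileTransformer(
        n_quantiles=100, output_distribution="uniform"),
    "quantile-normal": QuantileTransformer(
        n_quantiles=100, output_distribution="normal"),
}

X = np.random.rand(100, 784) * 255.0  # toy stand-in for MNIST pixel intensities
for name, scaler in scalers.items():
    X_t = X if scaler is None else scaler.fit_transform(X)
    print(f"{name}: min={X_t.min():.2f} max={X_t.max():.2f}")
```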

Result

Confusion matrix of the multi-layer experiment with the highest F1-score (0.93).
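A confusion matrix and macro F1-score like the ones reported can be produced with scikit-learn's metrics module. The classifier configuration and the `load_digits` data below are placeholders for the actual experiment, not its reported setup.

```python
from sklearn.datasets import load_digits
from sklearn.metrics import confusion_matrix, f1_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A network with multiple fully connected hidden layers (sizes are illustrative).
clf = MLPClassifier(hidden_layer_sizes=(100, 50), max_iter=300, random_state=0)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

print(confusion_matrix(y_test, y_pred))  # 10x10 matrix, one row per true digit
print(round(f1_score(y_test, y_pred, average="macro"), 3))
```

Rows of the matrix are true digit classes and columns are predictions, so off-diagonal entries show which digits the network confuses.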

About

A naive scikit-learn-driven script to find the best hyperparameters for the MNIST dataset
