# backward-propagation

Here are 43 public repositories matching this topic...

This notebook demonstrates a neural network implemented from scratch in NumPy, without TensorFlow or PyTorch. Trained on the MNIST dataset, it uses an architecture with an input layer (784 neurons), two hidden layers (132 and 40 neurons), and an output layer (10 neurons), with sigmoid activations.

  • Updated Mar 15, 2024
  • Jupyter Notebook
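The repository's own code is not shown here, but a backward-propagation pass for the described 784-132-40-10 sigmoid network can be sketched in plain NumPy. This is a minimal illustration, not the notebook's actual implementation; the mean-squared-error loss, learning rate, and weight initialization are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)
sizes = [784, 132, 40, 10]  # architecture from the description above
weights = [rng.standard_normal((m, n)) * 0.01
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros((m, 1)) for m in sizes[1:]]

def forward(x):
    """Forward pass; caches pre-activations (zs) and activations."""
    a = x
    activations, zs = [x], []
    for W, b in zip(weights, biases):
        z = W @ a + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    return activations, zs

def backward(x, y):
    """Backpropagation with a mean-squared-error loss (an assumption)."""
    activations, zs = forward(x)
    grads_W = [np.zeros_like(W) for W in weights]
    grads_b = [np.zeros_like(b) for b in biases]
    # error at the output layer
    delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
    grads_W[-1] = delta @ activations[-2].T
    grads_b[-1] = delta
    # propagate the error backwards through the hidden layers
    for l in range(2, len(sizes)):
        delta = (weights[-l + 1].T @ delta) * sigmoid_prime(zs[-l])
        grads_W[-l] = delta @ activations[-l - 1].T
        grads_b[-l] = delta
    return grads_W, grads_b

# one gradient-descent step on a dummy example
x = rng.random((784, 1))
y = np.zeros((10, 1))
y[3] = 1.0  # hypothetical one-hot label
gW, gb = backward(x, y)
for W, b, dW, db in zip(weights, biases, gW, gb):
    W -= 0.1 * dW
    b -= 0.1 * db
```

Each gradient has the same shape as the parameter it updates, which is a useful sanity check when writing backpropagation by hand.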
