
"Deep MNIST for Experts" too complex for second tutorial. #769

Closed
NHDaly opened this issue Jan 14, 2016 · 3 comments

Comments

@NHDaly
Contributor

NHDaly commented Jan 14, 2016

The end of the first tutorial, "MNIST for ML Beginners", has this paragraph:

What matters is that we learned from this model. Still, if you're feeling a bit down about these results, check out the next tutorial where we do a lot better, and learn how to build more sophisticated models using TensorFlow!

But the next tutorial is "Deep MNIST for Experts", and it does not explain things nearly as well as the first tutorial. For example, the second tutorial first diverges from the beginner material at the Build a Multilayer Convolutional Network section, but the very first paragraph under Weight Initialization does not explain the concepts it uses:

To create this model, we're going to need to create a lot of weights and biases. One should generally initialize weights with a small amount of noise for symmetry breaking, and to prevent 0 gradients. Since we're using ReLU neurons, it is also good practice to initialize them with a slightly positive initial bias to avoid "dead neurons." Instead of doing this repeatedly while we build the model, let's create two handy functions to do it for us.
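The two helper functions that paragraph alludes to can be paraphrased outside of TensorFlow. This NumPy sketch (mine, not the tutorial's actual code, which builds `tf.Variable` objects from `tf.truncated_normal`) only illustrates the initialization scheme it describes: small Gaussian noise for the weights, a slightly positive constant for the biases:

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_variable(shape, stddev=0.1):
    # Small random noise breaks the symmetry between neurons
    # (identical weights would receive identical gradients)
    # and avoids starting with zero gradients.
    return rng.normal(0.0, stddev, size=shape)

def bias_variable(shape, value=0.1):
    # A slightly positive bias keeps ReLU units in their active
    # (nonzero-gradient) region at the start of training,
    # reducing the chance of "dead neurons."
    return np.full(shape, value)

# Example: one fully connected layer, 784 inputs -> 32 units.
W = weight_variable((784, 32))
b = bias_variable((32,))
```

The layer sizes above are arbitrary; the point is only that both helpers take a shape and return an initialized array.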

What are "ReLU neurons"? Why are we using them? What is a convolutional network even?
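For readers stuck on the same questions, here is a minimal illustration of the two undefined terms (my sketch, not anything from the tutorial): a ReLU neuron just computes max(0, x), and the core operation of a convolutional network is sliding a small kernel over the image and taking a weighted sum at each position:

```python
import numpy as np

def relu(x):
    # A ReLU ("rectified linear unit") neuron outputs max(0, x):
    # zero for negative input, the input itself otherwise.
    return np.maximum(0.0, x)

def conv2d_valid(image, kernel):
    # One convolution: slide the kernel over the image and take a
    # weighted sum at each position ("valid" padding, no overlap
    # past the border).
    h, w = kernel.shape
    H, W = image.shape
    out = np.empty((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out

# A 4x4 "image" whose values increase by 1 along each row, and a
# kernel that responds to that horizontal difference.
img = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[-1.0, 1.0]])
feat = relu(conv2d_valid(img, edge))   # a small "feature map"
```

A convolutional network stacks layers of such kernels (with the kernel weights learned, not hand-chosen as here) and passes each feature map through ReLU.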

I would say either:

  1. This tutorial should be expanded to clarify/explain things better for beginners, or
  2. This tutorial should be moved to later in the list of tutorials, and
  3. The first tutorial, "MNIST for ML Beginners," should then point to "a later tutorial" instead of "the next tutorial."

Thanks!

@ushnish

ushnish commented Jan 14, 2016

Seconded. I think an explanation of conv nets would be helpful in the context of MNIST digits.

@martinwicke
Member

The TensorFlow tutorials cannot replace a deep learning or machine learning textbook or course. We may add pointers to additional resources as those become available.

@zhaolewen

Not only is this tutorial complicated; all of these tutorials are complicated.
For example, what I want to do is:

  1. load a CSV of image file names, along with the classification label associated with each of them
  2. load a folder of images
  3. get the neural network up and running !!! no matter how bad the accuracy !!!
  4. improve the accuracy

But right at the beginning, TensorFlow gives me a four-file MNIST tutorial, with complicated graphs and complicated mechanisms for loading files into the network.
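For what it's worth, step 1 of the wished-for workflow above is a few lines of plain Python before TensorFlow even enters the picture. The file names, column headers, and labels in this sketch are made up for illustration:

```python
import csv
import io

# Hypothetical labels.csv: one "filename,label" pair per line.
csv_text = """filename,label
img_001.png,cat
img_002.png,dog
"""

def load_manifest(fp):
    # Read the CSV and return parallel lists of file names and labels.
    reader = csv.DictReader(fp)
    filenames, labels = [], []
    for row in reader:
        filenames.append(row["filename"])
        labels.append(row["label"])
    return filenames, labels

filenames, labels = load_manifest(io.StringIO(csv_text))
# Next steps (not shown): decode each image file from the folder and
# feed (image, label) batches to the network.
```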

darkbuck pushed a commit to darkbuck/tensorflow that referenced this issue on Jan 23, 2020 (…pstream-zero-division): "Reenable the zero division test for ROCm"
4 participants