
Implement Gradient #12

Open
eggie5 opened this issue Nov 20, 2016 · 2 comments

Comments


eggie5 commented Nov 20, 2016

I notice you use a 3rd-party module to evaluate the gradient of your cost function in your GD routine. What was the reasoning behind this, and why not implement the gradient computation yourself?

rushter (Owner) commented Nov 20, 2016

  1. There are tens of cost functions in the deep learning module. I prefer simplicity, so that everyone can understand the concepts behind them.
  2. Flexibility. People can experiment with custom cost functions without having to differentiate them by hand.

But we can definitely get rid of this library in the linear models.
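The flexibility point above can be sketched without any third-party package: a central-difference numerical gradient approximates the derivative of any scalar cost function, so no manual differentiation is needed. (This is only an illustrative stand-in, not the autodiff approach the repo actually uses; the function name `numerical_gradient` is made up here.)

```python
def numerical_gradient(cost, params, eps=1e-6):
    """Approximate the gradient of `cost` at `params` by central differences.

    Works for any scalar cost function of a list of parameters, which is
    the flexibility argument: custom costs need no hand-derived gradients.
    """
    grad = []
    for i in range(len(params)):
        plus, minus = params[:], params[:]
        plus[i] += eps
        minus[i] -= eps
        grad.append((cost(plus) - cost(minus)) / (2 * eps))
    return grad

# Example: cost(w) = w0^2 + 3*w1, so the gradient at (2, 5) is (4, 3).
g = numerical_gradient(lambda w: w[0] ** 2 + 3 * w[1], [2.0, 5.0])
```

Automatic differentiation gives exact gradients rather than finite-difference approximations, which is presumably why the repo uses a dedicated library instead of a sketch like this.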

mynameisvinn (Contributor) commented

For completeness' sake: the closed-form derivative of MSE (i.e., the cost function used in linear regression) is fairly straightforward. The problem, however, is computing gradients for more complicated cost functions like the ones seen in neural networks.
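For reference, that closed-form gradient can be written in a few lines. For J(w) = (1/n) Σᵢ (xᵢ·w − yᵢ)², the partial derivative with respect to wⱼ is (2/n) Σᵢ rᵢ xᵢⱼ, where rᵢ is the residual. A minimal sketch (the function name `mse_gradient` is hypothetical, not from the repo):

```python
def mse_gradient(X, y, w):
    """Closed-form gradient of MSE for linear regression:
    dJ/dw_j = (2/n) * sum_i (x_i . w - y_i) * x_ij
    """
    n = len(X)
    grad = [0.0] * len(w)
    for xi, yi in zip(X, y):
        r = sum(xj * wj for xj, wj in zip(xi, w)) - yi  # residual
        for j, xj in enumerate(xi):
            grad[j] += 2.0 * r * xj / n
    return grad

# Toy data where y equals the second feature exactly.
X = [[1.0, 2.0], [1.0, 3.0], [1.0, 5.0]]
y = [2.0, 3.0, 5.0]
g = mse_gradient(X, y, [0.0, 1.0])
# w = [0, 1] fits the data exactly, so the gradient is [0.0, 0.0]
```

This is exactly the "straightforward" case; the trade-off the thread discusses is that every new cost function would need its own derivation like this, whereas an autodiff library handles arbitrary costs automatically.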
