
Implement relu #32

Open
kasey- opened this issue May 23, 2019 · 4 comments
Comments

@kasey- commented May 23, 2019

Hello,

I started implementing the ReLU function for the genann library on a fork under my name (https://github.com/kasey-/genann) before sending you a PR:

/* Rectified linear unit: relu(a) = max(0, a). `unused` is the attribute macro from genann.c. */
double inline genann_act_relu(const struct genann *ann unused, double a) {
    return (a > 0.0) ? a : 0.0;
}

But I am a bit lost in the way you compute the back propagation of the neural network. The derivative of ReLU is trivial, (a > 0.0) ? 1.0 : 0.0, but I cannot understand where I should plug it into your formula, as I do not understand how you compute your backpropagation. Did you implement only the derivative of the sigmoid?
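For what it's worth, here is a sketch of the matching derivative, under the assumption that genann follows the usual trick of expressing the derivative in terms of the neuron's already-computed output o (for sigmoid that is o * (1 - o)). Since relu(x) equals x when x > 0 and 0 otherwise, the derivative written in terms of the output is just a comparison. The name genann_act_relu_deriv is hypothetical, not part of genann:

double inline genann_act_relu_deriv(const struct genann *ann unused, double o) {
    /* relu'(x) = 1 for x > 0, 0 otherwise; the derivative at x == 0 is taken as 0.
       o is the cached neuron output, which is positive exactly when x is. */
    return (o > 0.0) ? 1.0 : 0.0;
}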

@ilia3101 commented Sep 8, 2019

I would like ReLU in genann too, but the back propagation is also something I don't understand here :(

@msrdinesh commented

Yes, in the code only the derivative of the sigmoid, dσ(x)/dx = σ(x)(1 − σ(x)), is implemented. I think we have to write a generic derivative function so that we can add other activation functions like tanh and ReLU.
Check the code here:
https://github.com/kasey-/genann/blob/27c4c4288728791def0c5fd175c1c3999057ad9d/genann.c#L335
If you agree I can also work on this.
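One possible shape for such a generic hook, sketched under the assumption that the genann_actfun typedef from genann.h stays as it is; the genann_actfun_deriv typedef and genann_act_sigmoid_deriv below are hypothetical names, not existing genann API (the ReLU counterpart would be the genann_act_relu_deriv sketched earlier in this thread):

/* Hypothetical: derivative counterpart to genann_actfun, written in terms
   of the neuron's cached output o so it can be called from genann_train. */
typedef double (*genann_actfun_deriv)(const struct genann *ann, double o);

double genann_act_sigmoid_deriv(const struct genann *ann unused, double o) {
    return o * (1.0 - o);   /* d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)) */
}

genann_train could then call through a function pointer stored alongside activation_hidden / activation_output instead of hardcoding the sigmoid expression in its delta loops.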

@AnnieJohnson25 commented

Hi everyone,
To actually implement ReLU and linear, what exactly should we be looking at other than the backpropagation derivative?
Would it also require any additional functions, like how sigmoid has genann_act_sigmoid_cached and genann_init_sigmoid_lookup?
Any advice...
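A tentative answer, not confirmed by the maintainer: genann_act_sigmoid_cached and genann_init_sigmoid_lookup exist only because exp() is expensive enough to justify a precomputed lookup table; ReLU is a single comparison, so it should need no cached variant and no init function. Picking the activation would then just be an assignment to the fields genann.h already exposes, for example (genann_act_relu here is still the hypothetical function from the first comment, and training would remain wrong until the hardcoded sigmoid derivative is dealt with):

#include "genann.h"

int main(void) {
    genann *ann = genann_init(2, 1, 4, 1);        /* 2 inputs, 1 hidden layer of 4, 1 output */
    ann->activation_hidden = genann_act_relu;     /* hypothetical: not yet in genann */
    ann->activation_output = genann_act_linear;   /* genann_act_linear does exist */
    /* genann_run() would now use ReLU in the forward pass, but genann_train()
       would still back-propagate with the sigmoid derivative baked into it. */
    genann_free(ann);
    return 0;
}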

@doug65536 commented Mar 12, 2023

Hi everyone, To actually implement ReLU and linear, what exactly should we be looking at other than the backpropagation derivative? [snip]

It's right there at the top of this bug report thread: double inline genann_act_relu. (To be clear: it still doesn't work using that alone, because the derivative is hardcoded to sigmoid.)
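To make that concrete, and going from memory of genann_train rather than quoting it exactly: the output-layer delta is computed along the lines of (*t - *o) * *o * (1.0 - *o), where the *o * (1.0 - *o) factor is the sigmoid derivative expressed through the cached output. Supporting ReLU means swapping that factor per activation, roughly like this hypothetical helper (not genann code):

/* Sketch of the idea only, not actual genann_train code:
   delta = (target - output) * f'(output), with f' chosen per activation. */
double delta_for_output(double target, double output, int use_relu) {
    double fprime = use_relu
        ? ((output > 0.0) ? 1.0 : 0.0)   /* ReLU derivative */
        : output * (1.0 - output);       /* sigmoid derivative */
    return (target - output) * fprime;
}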
