
APTx activation function #214

Merged · 1 commit merged on May 14, 2024

Conversation

Rockdeldiablo
Contributor

I saw you implemented the Swish function. As an exercise, I implemented the APTx activation (https://arxiv.org/abs/2209.06119), which appears to be a variant of the newer Mish function that requires less computation. I tested it on the DE given as an exercise during the Harvard ComputeFest, and it seems to give better results than Swish at low/very low epoch counts. However, more tests are needed before a definitive conclusion.
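
For context, here is a minimal, self-contained sketch of an APTx module following the formula in the paper, APTx(x) = (alpha + tanh(beta * x)) * gamma * x. The default values alpha=1, beta=1, gamma=0.5 are an assumption based on the paper, not necessarily what this PR uses.

import torch
import torch.nn as nn

class APTx(nn.Module):
    # APTx activation: (alpha + tanh(beta * x)) * gamma * x.
    # Defaults alpha=1, beta=1, gamma=0.5 are assumed from the paper
    # (https://arxiv.org/abs/2209.06119), not taken from this PR's diff.
    def __init__(self, alpha=1.0, beta=1.0, gamma=0.5):
        super().__init__()
        self.alpha = alpha
        self.beta = beta
        self.gamma = gamma

    def forward(self, x):
        # torch.tanh applies element-wise to the input tensor
        return (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x

# quick sanity check on a small tensor
x = torch.linspace(-3.0, 3.0, steps=7)
print(APTx()(x))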

Collaborator

Deleted by mistake?

Contributor Author

Wow, I guess so. I thought I had only added two lines. It's been a while since I used git; I must have made a mistake.

Contributor Author

Hi, I removed the modifications to the markdown in advanced.ipynb, since VS Code automatically deleted those 3k lines for no apparent reason, and I don't know why at the moment.

        self.gamma = gamma

    def forward(self, x):
        return (self.alpha + torch.nn.functional.tanh(self.beta * x)) * self.gamma * x

Member

Optionally, you can just use torch.tanh(). If I recall correctly, torch.nn.functional.tanh is deprecated. I'll merge this one, though.
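
For illustration, the suggested change would only touch the return line of the snippet above, swapping the deprecated functional alias for torch.tanh:

    def forward(self, x):
        # torch.tanh is the tensor-aware, non-deprecated equivalent
        return (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x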

@shuheng-liu shuheng-liu merged commit 0ee81fd into NeuroDiffGym:master May 14, 2024