
config.txt parsing error? #257

Open
Danielkim002 opened this issue Jan 5, 2023 · 1 comment

Comments


Danielkim002 commented Jan 5, 2023

Hello, this is my first time using neat-python, and it is also my first time posting an issue report on GitHub. I apologize if I break any unspoken rules or guidelines in this post due to my inexperience.

Due to my inexperience with neat-python and neural networks in general, I have been relying on a lot of online sources, including ChatGPT, YouTube, and any available online documentation. In my experimentation I have been playing with the activation functions. Using the xor example's config.txt file as boilerplate, along with some ChatGPT suggestions, I made the following changes to the config file:

[ActivationFunction]
name = Leaky ReLU
slope = 0.01

[DefaultGenome]
#node activation options
activation_default = Leaky ReLU
activation_mutate_rate = 0.005
activation_options = Leaky ReLU, sigmoid

These changes worked and I was able to see their effect. However, after pushing them to the project's repository and revisiting the code at a later date, I was greeted with an error saying that "Leaky ReLU" was not a recognized activation function. After replacing Leaky ReLU with relu, a function that is defined in activations.py, I was greeted with a new error: No such activation function: 'relu,'. Note the comma inside the quotes, which suggests this could be a parsing error.

Because of these errors I decided to reset my commit and experiment with the activation functions, with the goal of recreating the error. After resetting to before the commit, I was unable to reproduce the error with relu and sigmoid. However, when I pushed my changes to a separate branch to see whether the error was caused by the push, the error returned on my next attempt to run the code. In the version of the project prior to the push to GitHub, the options in activation_options are separated by a comma:

PRIOR TO PUSH TO GITHUB

[DefaultGenome]
#node activation options
activation_default = relu
activation_mutate_rate = 0.005
activation_options = relu, sigmoid

This config would work; however, after the push it no longer does.

I have found a workaround, and it is as simple as taking out the comma:

AFTER PUSH TO GITHUB

[DefaultGenome]
#node activation options
activation_default = relu
activation_mutate_rate = 0.005
activation_options = relu sigmoid

I am unsure why this error is happening, and it is very frustrating. Is anyone else having this issue? Could it be a problem with my local machine, or is it something to do with GitHub?

@Ball-Man

The reason it appeared to work at first is probably that the 0.5% mutation rate you specified never fired during your early experiments. It appears that the activation_options field is not validated up front, but queried at runtime only when needed. You can test this behaviour empirically by adding a random activation name to the options and observing that NEAT will not complain about it in the first generations.
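As a rough illustration of why a 0.5% rate can easily go unnoticed (the mutation-opportunity counts below are hypothetical, not taken from any actual run):

```python
# Probability that an activation mutation with rate 0.005 never fires
# across n independent mutation opportunities.
rate = 0.005
for n in (10, 100, 500):
    p_never = (1 - rate) ** n
    print(f"{n} opportunities: P(never fires) = {p_never:.3f}")
```

With a few hundred opportunities there is still a reasonable chance the invalid option is never selected, so the bad config can survive many generations before the error surfaces.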

As far as I can tell, the supported syntax is a space between options, with no commas or other separators. For this reason, the string relu, sigmoid is parsed as the two options relu, and sigmoid.
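A minimal sketch of why the comma ends up inside the option name, assuming the option string is split on whitespace as Python's str.split() does (the exact parsing code in neat-python may differ):

```python
# Splitting on whitespace treats the comma as part of the option name,
# not as a separator.
options_with_comma = "relu, sigmoid".split()
options_with_space = "relu sigmoid".split()

print(options_with_comma)  # ['relu,', 'sigmoid'] -- 'relu,' is not a known activation
print(options_with_space)  # ['relu', 'sigmoid']  -- both names resolve correctly
```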

About leaky ReLU, you are right: it is not implemented by this library, as you can see in activations.py. You could implement it and add it as a custom activation function if you're interested. In that regard, keep in mind that leaky ReLU usually expects an additional parameter: the slope applied to the input when it is negative.
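For reference, a leaky ReLU could be sketched like this. The function name and the fixed slope value are my own choices, and since NEAT activation functions take a single argument, the slope is baked in as a default rather than evolved:

```python
def leaky_relu(z, slope=0.01):
    """Leaky ReLU: identity for positive inputs, a small linear slope for negatives."""
    return z if z > 0.0 else slope * z

# Registration would follow neat-python's custom-activation mechanism, e.g.
# (assuming `config` is a loaded neat.Config object):
#   config.genome_config.add_activation('leaky_relu', leaky_relu)
# after which the config file could use:
#   activation_options = leaky_relu sigmoid
```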
