
Are the weights optimized by the autograd of torch? #4

Open
kevin031060 opened this issue Aug 27, 2019 · 5 comments
@kevin031060

Hi, I understand that the topology is optimized through the EA. How are the weights optimized: by the EA as well, or by backpropagation in torch?

@ddehueck
Owner

ddehueck commented Aug 27, 2019

The weights are optimized in an evolutionary fashion as well.

Note the following in config files:

# Float between 0.0 and 1.0 - rate at which a connection gene will be mutated
CONNECTION_MUTATION_RATE = 0.80
# Float between 0.0 and 1.0 - rate at which a connection's weight is perturbed (if the connection is to be mutated)
CONNECTION_PERTURBATION_RATE = 0.90
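For anyone reading along, here is a minimal sketch of how those two rates are typically applied per connection gene in a NEAT-style mutation step. This is an illustration, not the repo's actual code: it assumes weights stored as a flat list, and the perturbation scale and reset range are made-up parameters.

```python
import random

# Rates mirroring the config values quoted above.
CONNECTION_MUTATION_RATE = 0.80
CONNECTION_PERTURBATION_RATE = 0.90

def mutate_weights(weights, perturb_scale=0.1, reset_range=2.0):
    """NEAT-style weight mutation (sketch): each weight is mutated with
    probability CONNECTION_MUTATION_RATE; a mutation is usually a small
    perturbation, occasionally a full random reset."""
    out = []
    for w in weights:
        if random.random() < CONNECTION_MUTATION_RATE:
            if random.random() < CONNECTION_PERTURBATION_RATE:
                # Small Gaussian perturbation around the current value.
                w = w + random.gauss(0.0, perturb_scale)
            else:
                # Otherwise replace with a fresh random weight.
                w = random.uniform(-reset_range, reset_range)
        out.append(w)
    return out
```

So no gradients are involved: selection pressure on fitness, plus these perturbations, is what moves the weights.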

@ddehueck
Owner

It should be possible to also update the weights with some training procedure before fitness evaluation. It could be interesting to see if something like the Baldwin Effect is evident in the resulting fitness distribution.
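A minimal illustration of that Baldwinian setup, under the assumption that learning only affects the fitness selection sees and the learned weights are never written back into the genome (that write-back would be the Lamarckian variant). The hill-climbing loop is a toy stand-in for a real torch training procedure:

```python
import random

def baldwinian_fitness(genome_weights, loss_fn, steps=10, lr=0.05):
    """Evaluate fitness after a brief learning phase (sketch).
    The genome itself is left untouched: Baldwinian, not Lamarckian."""
    w = list(genome_weights)  # work on a copy, never mutate the genome
    for _ in range(steps):
        # Toy local-search step standing in for a gradient-descent update.
        candidate = [wi + random.gauss(0.0, lr) for wi in w]
        if loss_fn(candidate) < loss_fn(w):
            w = candidate
    return -loss_fn(w)  # higher fitness = lower post-learning loss
```

If the Baldwin Effect shows up, genomes that learn quickly should come to dominate the fitness distribution even though their learned weights are never inherited.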

@kevin031060
Author

Thank you, this is really cool work.
Would it be possible to use backpropagation to train the generated network after using the EA to determine its structure? In other words, can we use standard backpropagation to train an arbitrarily connected network? I think that could be more efficient.

@ddehueck
Owner

Yes, that would absolutely be possible. I'd be curious to see your results if you do so - we could add such an experiment to this repo perhaps.
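For reference, backprop on an arbitrarily connected feed-forward network works because the forward pass can evaluate nodes in topological order, so every operation stays differentiable. The sketch below uses a made-up 4-node DAG with a skip connection, and finite-difference gradients as a stand-in for what torch autograd would compute exactly:

```python
import math

# Hypothetical tiny feed-forward DAG: inputs are nodes 0 and 1,
# hidden node 2, output node 3, with a skip connection 0 -> 3.
INCOMING = {2: [0, 1], 3: [0, 2]}

def forward(x, weights, topo=(2, 3)):
    """Evaluate the DAG node-by-node in topological order."""
    act = {0: x[0], 1: x[1]}
    for n in topo:
        s = sum(weights[(src, n)] * act[src] for src in INCOMING[n])
        act[n] = math.tanh(s)
    return act[3]

def train_step(weights, batch, lr=0.1, eps=1e-5):
    """One gradient-descent step using finite-difference gradients
    (torch autograd would give the same gradients analytically)."""
    def loss(w):
        return sum((forward(x, w) - y) ** 2 for x, y in batch) / len(batch)
    grads = {}
    for k in weights:
        w2 = dict(weights)
        w2[k] += eps
        grads[k] = (loss(w2) - loss(weights)) / eps
    return {k: weights[k] - lr * grads[k] for k in weights}
```

In torch you would register each connection weight as a parameter and let autograd track the topological-order forward pass; the arbitrary topology is not an obstacle as long as the graph stays acyclic.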

@kevin031060
Author

I'll try it and hope to make some progress.
