
neuroevolution - activation functions, prefab substructures, how much of the PyTorch API is available? #83

Open
gminorcoles opened this issue Jul 28, 2023 · 0 comments

Hi,
I did a lot of neuroevolution and evolutionary-computing work long ago, but I have not attempted it in the PyTorch world. Is there somewhere in the documentation or the code where I can see what the possible ingredients of evolution are? In NEAT there was a limited set of operators and structural components that could make up a solution.
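To make the question concrete, this is the kind of NEAT-style operator set I have in mind. The names and structure here are purely illustrative on my part, not this project's API:

```python
import random

# Illustrative minimal NEAT-style genome: the structural operator set is
# just "add connection" and "add node", as in the original NEAT scheme.
class Genome:
    def __init__(self, n_inputs, n_outputs):
        self.nodes = list(range(n_inputs + n_outputs))  # node ids
        self.conns = {}  # (src, dst) -> weight

    def mutate_add_connection(self, rng):
        # pick two nodes and connect them if not already connected
        src, dst = rng.choice(self.nodes), rng.choice(self.nodes)
        if (src, dst) not in self.conns:
            self.conns[(src, dst)] = rng.uniform(-1.0, 1.0)

    def mutate_add_node(self, rng):
        if not self.conns:
            return
        # split an existing connection: a->b becomes a->new->b
        (src, dst), w = rng.choice(list(self.conns.items()))
        new = max(self.nodes) + 1
        self.nodes.append(new)
        del self.conns[(src, dst)]
        self.conns[(src, new)] = 1.0  # NEAT convention: incoming weight 1
        self.conns[(new, dst)] = w    # outgoing edge keeps the old weight

g = Genome(2, 1)
rng = random.Random(0)
g.mutate_add_connection(rng)
g.mutate_add_node(rng)
```

Is the searchable structure space here something like this, or is it a different (e.g. layer-level) set of components?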

Also, is it possible to add evolved or hand-coded structures to the set of structures that can be incorporated into the evolutionary process?

Also, where can I find out more about the weights or other evolved parameters? Are they all lost after evolution, so that an individual has to be retrained from scratch? It would be useful to fine-tune an evolved individual, though since these are prototypes I understand that might not be the intention.
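By fine-tuning I mean something like the following generic sketch: if the evolved parameters do survive as a flat vector, they could be loaded into a module and refined with gradient descent. The `load_evolved` helper and the stand-in random vector are my assumptions, not anything this library provides:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def load_evolved(model, flat_params):
    # copy a flat parameter vector (stand-in for evolved weights)
    # into the module, in parameter order
    torch.nn.utils.vector_to_parameters(flat_params, model.parameters())

model = nn.Sequential(nn.Linear(4, 8), nn.Tanh(), nn.Linear(8, 1))
n_params = sum(p.numel() for p in model.parameters())
evolved = torch.randn(n_params)  # hypothetical evolved weight vector
load_evolved(model, evolved)

# fine-tune on a toy regression task
x = torch.randn(64, 4)
y = x.sum(dim=1, keepdim=True)
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_before = nn.functional.mse_loss(model(x), y).item()
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
loss_after = loss.item()
```

If the evolved individuals are ordinary `nn.Module`s, this kind of post-evolution refinement would presumably just work; if they are a different representation, that is what I am asking about.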

Finally, is there a grammar for the string representation of a network? I am interested in compressing individuals to speed up the search, and a string or symbolic representation seems like it might work. I would also like to try incorporating guided evolution every few generations using a fine-tuned LLM, and a string representation might provide a shortcut for that.
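For illustration, the kind of grammar I am picturing is a tiny round-trippable encoding like `"Linear(4,8)>ReLU>Linear(8,1)"`. This grammar is hypothetical on my end, not something I believe the project defines:

```python
import re

# hypothetical grammar: layers joined by ">", parametric layers
# written Name(fan_in,fan_out), parameter-free layers written bare
TOKEN = re.compile(r"(\w+)(?:\((\d+),(\d+)\))?")

def to_string(layers):
    parts = []
    for layer in layers:
        if len(layer) == 1:
            parts.append(layer[0])
        else:
            name, fan_in, fan_out = layer
            parts.append(f"{name}({fan_in},{fan_out})")
    return ">".join(parts)

def from_string(s):
    layers = []
    for tok in s.split(">"):
        name, a, b = TOKEN.fullmatch(tok).groups()
        layers.append((name,) if a is None else (name, int(a), int(b)))
    return layers

net = [("Linear", 4, 8), ("ReLU",), ("Linear", 8, 1)]
s = to_string(net)
assert from_string(s) == net  # encoding round-trips
```

If the library already has a canonical serialization like this, that would be the natural interface for an LLM-guided mutation step.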
