Any tutorial? #32

Open
Jenscaasen opened this issue Aug 8, 2018 · 7 comments

Comments

@Jenscaasen

Hi,
Sorry for opening an issue for this, but I don't know how else to contact you.
Is there any documentation on how to actually implement SharpNeat? There is a collection of examples somewhere on the internet using SharpNeat, which I cannot find again, and there is that eight-year-old tutorial from someone using it for Tic-Tac-Toe, but it is incompatible with the current NuGet release. Beyond that, how do I use the SharpNeat classes to give them inputs and read outputs?

Best regards

@polgaro

polgaro commented Mar 31, 2019

This guy made a tutorial:
http://www.nashcoding.com/2010/07/17/tutorial-evolving-neural-networks-with-sharpneat-2-part-1/

Also, here's a simple implementation you can copy from:
https://github.com/polgaro/NEAT

@t4ccer

t4ccer commented May 7, 2020

I created a simple tutorial here:
https://t4ccer.com/posts/sharpneat-tutorial/

@Jenscaasen
Author

> I created a simple tutorial here:
> https://t4ccer.com/posts/sharpneat-tutorial/

Thank you for that; so far it's the easiest-to-use tutorial that relies only on the NuGet package.

I was wondering if there is any documentation or experience on how to make the algorithm explore the possible outputs more "aggressively". In my case, after 2000 generations it still keeps most outputs at 0.5 (the default, I presume).

@t4ccer

t4ccer commented Aug 10, 2020

I'm glad to hear that my tutorial is useful.

In my experience, setting complexityRegulationStrategy to NullComplexityRegulationStrategy made the network evolve faster. You can also increase the number of specimens in each generation. Of course, if you need a deeper understanding of NEAT, you can read the original paper: http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf. Unfortunately I didn't find any SharpNeat documentation, so I experimented myself.

Also, if your network always performs with the same fitness, it may be a problem with your fitness-computing function.
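For reference, the two suggestions above can be sketched roughly like this. This is only a sketch based on the SharpNeat 2.x experiment classes; the exact namespaces, constructor signatures, and the ManhattanDistanceMetric coefficients are assumptions taken from how the bundled experiments wire things up, so check them against the version of the NuGet package you actually have:

```csharp
using SharpNeat.Core;
using SharpNeat.DistanceMetrics;
using SharpNeat.EvolutionAlgorithms;
using SharpNeat.EvolutionAlgorithms.ComplexityRegulation;
using SharpNeat.Genomes.Neat;
using SharpNeat.SpeciationStrategies;

var eaParams = new NeatEvolutionAlgorithmParameters();
eaParams.SpecieCount = 10;

// Suggestion 1: disable complexity regulation, so the search is never
// periodically pushed back toward simpler networks ("simplification" phases).
IComplexityRegulationStrategy complexityStrategy =
    new NullComplexityRegulationStrategy();

// Speciation setup as used by the stock SharpNeat 2.x experiments
// (coefficients assumed from the example experiment code).
IDistanceMetric distanceMetric = new ManhattanDistanceMetric(1.0, 0.0, 10.0);
ISpeciationStrategy<NeatGenome> speciationStrategy =
    new KMeansClusteringStrategy<NeatGenome>(distanceMetric);

var ea = new NeatEvolutionAlgorithm<NeatGenome>(
    eaParams, speciationStrategy, complexityStrategy);

// Suggestion 2: a larger population explores more broadly. 150 is the
// NEAT paper's default; try several hundred if evolution stalls.
// (3 inputs / 1 output here are placeholders for your own problem.)
var genomeFactory = new NeatGenomeFactory(3, 1);
var genomeList = genomeFactory.CreateGenomeList(300, 0);
```

Disabling complexity regulation trades parsimony for search speed, so expect larger networks than with the default strategy.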

@garnier77

garnier77 commented Aug 13, 2020 via email

Hello everyone, thank you for the tutorials; NEAT is very poorly documented and hard to understand and to put into models. I am willing to pay anyone who can help me build a time-series model to implement in NEAT. We would do this through screen share (TeamViewer or Skype) while you teach me how NEAT works. Please email me or Skype me if interested: JGARNIER77

@marcus1337

Hello, I have been developing highly efficient NEAT code for free, which is available through my GitHub page. I am willing to give tutorials.

@zelding mentioned this issue Jan 3, 2023
@CopilotCoding

CopilotCoding commented Mar 18, 2023

I generated my own API reference for SharpNEAT at https://copilotcoding.github.io/ — if you want to use it for your own reference, it's there.

Most of the information about SharpNEAT is heavily outdated. You could use the older, better-documented versions instead, but if you want documentation for 4.0.0 you will need to create it or wait for someone else to make it.

2 months ago, January 14, 2023: https://www.youtube.com/watch?v=pqVOAo669n0

Dead link: http://www.nashcoding.com/2010/07/17/tutorial-evolving-neural-networks-with-sharpneat-2-part-1/

12 years ago, March 23, 2011: https://github.com/tansey/sharpneat-tutorials
4 years ago, March 17, 2019: https://vbstudio.hu/en/blog/20190317-Growing-an-AI-with-NEAT
4 years ago, Mar 31, 2019: https://github.com/polgaro/NEAT
2 years ago, May 7, 2020: https://t4ccer.com/posts/sharpneat-tutorial/

If anyone has a more updated version of these based on the new version 4 library, please let me know:

4 years ago, Aug 31, 2019: https://github.com/lordjesus/UnityNEAT

3 years ago, Nov 7, 2020: https://github.com/flo-wolf/UnitySharpNEAT
