
Neat disconnected #163

Open
chill0chill opened this issue Apr 9, 2019 · 3 comments

@chill0chill

Hi, I am running NEAT with more than 28 features and comparing it with a deep backprop network on a regression problem. After running multiple configurations (population size, crossover rate, mutation rate, connection add rate, etc.), I am observing that NEAT leaves important features out of the network if the initial state is unconnected. If I start with a fully connected network, then the accuracy drops a lot. Not to mention, both methods underperform the deep fully connected network. Has anybody else faced this issue, or does anybody have thoughts on this problem?

@mathiasose
Contributor

I did some thinking about this a while back, but I didn't implement it, so take that into consideration:

It might be beneficial to create a fitness function that rewards both objective score and smaller numbers of connections and/or hidden neurons. So if two different networks perform the same at the objective task, the smaller of the two networks would get a higher fitness score than the other. This would create a selection pressure towards smaller networks, which might make the solution space exploration more efficient (assuming the task can in fact be solved by a small network).
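Such a fitness function could be sketched like this (a minimal, library-agnostic illustration; the penalty weights and the function name are made up for the example, not something from this thread or the library):

```python
def parsimony_fitness(raw_score, num_connections, num_hidden,
                      conn_penalty=0.01, node_penalty=0.05):
    """Combine the objective score with a complexity penalty.

    Two genomes with equal raw_score are ranked in favor of the
    smaller one, creating selection pressure toward compact networks.
    """
    return raw_score - conn_penalty * num_connections - node_penalty * num_hidden

# A larger network must outscore a smaller one on the objective to win:
small = parsimony_fitness(raw_score=0.90, num_connections=10, num_hidden=2)
large = parsimony_fitness(raw_score=0.90, num_connections=40, num_hidden=8)
assert small > large
```

The penalty weights would need tuning per problem: too large and evolution never adds the structure the task needs, too small and the pressure has no effect.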

In your case you might also want to try starting with partially (randomly) connected initial states, not just fully connected and fully disconnected.
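If this is neat-python, partial initial connectivity can, if I recall the config options correctly, be requested in the genome section of the config file (check the configuration docs for your version; the option name and argument here are from memory):

```
[DefaultGenome]
# Each possible input-to-output connection is created with probability 0.5
initial_connection = partial_direct 0.5
```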

@winatawelly


Did you ever find any solution to this problem?
I am also running a regression problem, with 22 features, and just noticed that most features don't even make it to the output node.
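One way to check this directly is to walk the genome's connection graph backwards from the outputs and list the inputs that never reach them. This is a minimal sketch, not neat-python API: `connections` is assumed to be a plain list of `(src, dst)` pairs for enabled connections (in neat-python you could build it from `genome.connections`, where input node keys are negative integers by convention):

```python
from collections import defaultdict

def unreachable_inputs(connections, input_keys, output_keys):
    """Return the input node keys with no directed path to any output."""
    # Index each node's predecessors so we can walk backwards.
    incoming = defaultdict(set)
    for src, dst in connections:
        incoming[dst].add(src)

    # Depth-first search backwards from every output node.
    reaches_output = set()
    stack = list(output_keys)
    while stack:
        node = stack.pop()
        for src in incoming[node]:
            if src not in reaches_output:
                reaches_output.add(src)
                stack.append(src)

    return [k for k in input_keys if k not in reaches_output]

# Inputs -1 and -2 feed output 0 through hidden node 1; input -3 only
# feeds a dead-end hidden node 2, so it never influences the output.
conns = [(-1, 1), (-2, 1), (1, 0), (-3, 2)]
print(unreachable_inputs(conns, [-1, -2, -3], [0]))  # [-3]
```

Running this over the best genome each generation would show whether the missing features are truly disconnected or just carry near-zero weights.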

@ntraft

ntraft commented Nov 20, 2022

I'm not sure what you mean by "disconnected" and "most features don't even make it to the output node", but it sounds like you may be affected by the issue I filed just now: #255.
