
Compatibility with neural network: replacing with constant value instead of dropping the feature #52

Open · stephanecollot opened this issue Dec 6, 2022 · 2 comments

stephanecollot (Collaborator) commented Dec 6, 2022

Hi,

For a neural network, if you change the number of features, you also need to change the input dimension and therefore the number of neurons.
So we could have an option like:

  • leaving='drop' for the current behaviour
  • leaving='replace' for neural networks: overwrite the column with a constant so the input dimension stays unchanged (sketched below)

What do you think?
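For illustration, a minimal sketch of the two strategies. The `leaving` parameter name comes from the proposal above; the function name is hypothetical and not part of the lofo-importance API:

```python
import pandas as pd

def leave_feature_out(df: pd.DataFrame, feature: str, leaving: str = "drop") -> pd.DataFrame:
    """Hypothetical sketch of the proposed option.

    leaving='drop'    -> remove the column (current LOFO behaviour).
    leaving='replace' -> overwrite the column with a constant, so a
                         network's input dimension stays unchanged.
    """
    if leaving == "drop":
        return df.drop(columns=[feature])
    if leaving == "replace":
        out = df.copy()
        out[feature] = df[feature].mean()  # any constant works; the mean keeps the scale
        return out
    raise ValueError(f"Unknown leaving strategy: {leaving!r}")
```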

aerdem4 (Owner) commented Dec 7, 2022

Makes sense. Would this be a flag to set?

I have only one concern. Most batch-normalization implementations and the like normalize safely, but someone may have a custom model that assumes all features have non-zero standard deviation. So replacing with small random noise could also be an option (see the sketch below).
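A hedged sketch of that alternative, not library code; the function name and noise scale are illustrative:

```python
import numpy as np
import pandas as pd

def replace_with_noise(df: pd.DataFrame, feature: str,
                       scale: float = 1e-3, seed: int = 0) -> pd.DataFrame:
    """Replace a feature with low-variance noise around its mean, so that
    models assuming non-zero std (e.g. custom normalization layers) never
    divide by zero. Scale and seed defaults are arbitrary choices."""
    rng = np.random.default_rng(seed)
    out = df.copy()
    out[feature] = df[feature].mean() + rng.normal(0.0, scale, size=len(df))
    return out
```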

stephanecollot (Collaborator, Author) commented Dec 7, 2022

Yes, there could be different "leaving strategies"; I think a string parameter would give more flexibility. Good point about batch normalization and the risk of dividing by zero.

Side remark: with a NN you will most probably use FLOFO instead of LOFO anyway; at least that fits my current use case, because I have too many features and retraining for every feature takes too long.
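For context: FLOFO (Fast LOFO) scores an already-trained model on validation data with one feature's values shuffled, so no retraining is needed. A rough sketch of that idea follows; it is not the lofo-importance implementation, and the AUC scoring and array inputs are assumptions:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def fast_lofo_score(model, X_valid, y_valid, feature_idx, n_repeats=5, seed=0):
    """Permutation-style importance: shuffle one column of the validation
    set and measure the mean drop in AUC for a trained binary classifier.
    A sketch of the FLOFO idea, not the library's implementation."""
    rng = np.random.default_rng(seed)
    base = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
    drops = []
    for _ in range(n_repeats):
        X_perm = X_valid.copy()
        X_perm[:, feature_idx] = rng.permutation(X_perm[:, feature_idx])
        drops.append(base - roc_auc_score(y_valid, model.predict_proba(X_perm)[:, 1]))
    return float(np.mean(drops))
```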
