
Any way to send hyperparameter_hunter.Integer to custom feature_engineer function? #201

Open
dePuff opened this issue Oct 3, 2019 · 3 comments
Labels
Enhancement New feature or request

Comments

@dePuff

dePuff commented Oct 3, 2019

Hello.

Actually, I'm looking for a way to do feature selection, or to apply some transformation to only a subset of columns, during a hyperparameter_hunter experiment.

Finding that subset of columns is the whole point of the experiment.

I didn't find any way to pass a hyperparameter_hunter.Integer (for example) into my custom feature_engineer function, which was my idea for solving this task.

One ugly workaround is to generate tons of functions and use them as the search space, but I'm pretty sure I'm missing something and a nicer way exists for tasks like mine.
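For reference, the brute-force workaround described above can at least be generated programmatically instead of written by hand. A minimal sketch (the column names and the two-argument engineer-function signature are illustrative assumptions, not hyperparameter_hunter's confirmed API):

```python
# Hypothetical sketch: build one selector function per column subset, so each
# subset can be offered as a separate option in a search space.
from itertools import combinations

def make_selector(columns):
    """Build a function that keeps only the given columns of a DataFrame."""
    def select(train_inputs, non_train_inputs):
        return train_inputs[list(columns)], non_train_inputs[list(columns)]
    select.__name__ = "select_" + "_".join(columns)
    return select

all_columns = ["col_1", "col_2", "col_3", "col_4"]
# Every non-empty subset of 4 columns -> 2**4 - 1 = 15 candidate functions
candidate_steps = [make_selector(subset)
                   for r in range(1, len(all_columns) + 1)
                   for subset in combinations(all_columns, r)]
```

Even generated this way, the combinatorial blow-up is exactly why a first-class search dimension would be preferable.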

Actually, any example of feature selection with hyperparameter_hunter would be enough; this part isn't covered well in the docs.

Best regards

@HunterMcGushion
Owner

HunterMcGushion commented Oct 3, 2019

@dePuff, thank you for opening this issue!

I'm sorry, but I don't think I'm completely understanding what functionality you're looking for.

It sounds like you're familiar with HH's feature_engineer, so you may have already seen this Medium article. If you haven't seen it, would you mind checking that out and letting me know if that's at all what you're talking about?

My other thought is that you might be looking for the feature_selector kwarg of CVExperiment. Both feature_engineer and feature_selector are also kwargs in the forge_experiment method of all the OptPros for optimization, which is when you would be using Integer to define a search space.
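For anyone landing here, what feature_selector amounts to is restricting the input DataFrame to a list of column names before training. A plain-pandas illustration (the column names are made up, and this snippet doesn't use hyperparameter_hunter itself):

```python
# Plain-pandas illustration of column selection; hyperparameter_hunter's
# feature_selector kwarg takes a list of column names like this one.
import pandas as pd

df = pd.DataFrame({
    "col_1": [1, 2], "col_2": [3, 4], "col_3": [5, 6], "col_4": [7, 8],
})
feature_selector = ["col_1", "col_2", "col_4"]  # columns to keep
selected = df[feature_selector]
```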

I'm sorry if I'm completely missing your point. Would you mind providing a minimal code snippet of what you're trying to accomplish?

@puffofsmoke

puffofsmoke commented Oct 4, 2019

First of all, thank you for your time and your response.

Yes, I started with the Medium article, but I'm probably missing something, because I've never used libraries like this one before.

For example, if I run something like the code below, hyperparameter_hunter will search for a good learning rate by trying learning rates close to the best ones found so far.
I find this capability amazing.

from hyperparameter_hunter import BayesianOptPro, Real
from sklearn.ensemble import AdaBoostRegressor

opt_1 = BayesianOptPro(iterations=100, random_state=32)
opt_1.forge_experiment(
    AdaBoostRegressor,
    model_init_params=dict(
        n_estimators=40,
        learning_rate=Real(0.01, 1.0),  # continuous search dimension
        loss='linear',
    ),
)
opt_1.go()

Now I want to use the same approach for feature selection, but I'm failing at it.
My expectation was something like rasbt.github.io/mlxtend/user_guide/feature_selection/SequentialFeatureSelector/
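As a point of comparison, the mlxtend-style sequential search referenced above can be run outside hyperparameter_hunter with scikit-learn's SequentialFeatureSelector (available since scikit-learn 0.24). A minimal sketch on synthetic data:

```python
# Forward sequential feature selection with scikit-learn (not
# hyperparameter_hunter): greedily adds the feature that most improves CV score.
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=100, n_features=6, n_informative=3,
                       random_state=0)
sfs = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=3, direction="forward", cv=3
)
sfs.fit(X, y)
mask = sfs.get_support()  # boolean mask over the 6 input features
```

The difference is that this wraps a single estimator, whereas the ask here is to make the column subset a dimension of hyperparameter_hunter's own search space.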

Let's forget for a moment about the question in the title, because it's probably the wrong way to get what I want.

Could you provide any code snippet showing how to do any kind of feature selection with hyperparameter_hunter? It would really help me understand the technique.

I tried the feature_selector kwarg this way:

possible_features = [['col_1', 'col_2', 'col_3', 'col_4'], ['col_1', 'col_2', 'col_4']]
...
feature_selector=Categorical(possible_features)

And it didn't work at all.

Here I had specific feature combinations in mind that should work. But I really want to know how to make hyperparameter_hunter find good combinations when we don't have them in advance.

Any example of any technique, please.

@HunterMcGushion
Owner

Hahaha thank you so much for bringing this up, because I forgot to add that when I was working on the rest of Feature Engineering! I had always intended to add optimization of feature_selector, so thank you for reminding me it's not actually in there yet.

I'll get started on this and update you with any developments!

@HunterMcGushion HunterMcGushion added the Enhancement New feature or request label Oct 4, 2019