
Computational complexity of EBM #495

Open

JWKKWJ123 opened this issue Jan 7, 2024 · 5 comments

Comments

@JWKKWJ123

Hi all,
I'm interested in comparing the computational complexity of EBMs and other machine learning models. I think calculating the number of trainable and non-trainable parameters of a model is one approach. Can EBM report the total number of parameters of the fitted model (including trainable and non-trainable parameters) based on the input features and the hyperparameters of the model?

@paulbkoch
Collaborator

Hi @JWKKWJ123 -- I think what you're asking for is the total number of bins within the "term_scores_" attribute of the EBM model? I'm not clear on what a trainable vs non-trainable parameter would be in the context of EBMs. We don't currently expose the ability to freeze parts of an EBM in the way you might with a neural net, although we do offer the init_score parameter for these scenarios.

@JWKKWJ123
Author

JWKKWJ123 commented Jan 8, 2024

Hi Paul,
Thanks for the reply. I understand now that all parameters are "trainable" in an EBM. Because I'm not particularly familiar with GBDTs, I'm still confused about how to calculate the total number of parameters.
If I have a dataset with 10 features, and the EBM classifier has the default settings:
class interpret.glassbox.ExplainableBoostingClassifier(feature_names=None, feature_types=None, max_bins=256, max_interaction_bins=32, interactions=10, exclude=[], validation_size=0.15, outer_bags=8, inner_bags=0, learning_rate=0.01, greediness=0.0, smoothing_rounds=0, max_rounds=5000, early_stopping_rounds=50, early_stopping_tolerance=0.0001, min_samples_leaf=2, max_leaves=3, objective='log_loss', n_jobs=-2, random_state=42)
then how do I calculate the total number of parameters?
For example (I think this is not correct):
N(parameters) ≈ N(features) × max_bins + N(pairwise terms) × max_interaction_bins
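
For concreteness, a worked instance of this formula under the defaults above, assuming all 10 requested pairwise interactions are retained:

N(parameters) ≈ 10 × 256 + 10 × 32 = 2,880

As the replies below explain, this is only an approximation.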

@Harsha-Nori
Collaborator

As Paul mentioned, it's not ideal to think of each bin as an independent parameter, but it's probably the closest approximation we have to a "trainable parameter" in EBMs.

Your formula is approximately right for binary classification and regression. You'll need to multiply by the number of classes in the case of multiclass classification with >= 3 classes.

That said, the exact number of parameters varies beyond that formula because 1) the number of bins may be smaller than max_bins if there aren't sufficiently many unique values in the data (e.g. a boolean feature will always have only 2 bins, not 256) and 2) categorical features are handled separately and can have either more or fewer than max_bins values.

In practice, for any specific dataset, the way to calculate this exactly is to just sum up the lengths of the values in term_scores_ as Paul mentioned. But your formula is a reasonable approximation!
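
A minimal sketch of that exact count, assuming a fitted model bound to a variable named ebm (the dataset and names below are illustrative, not from the thread):

```python
import numpy as np
from interpret.glassbox import ExplainableBoostingClassifier

# Tiny hypothetical dataset: 10 features, binary target.
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=100)

ebm = ExplainableBoostingClassifier().fit(X, y)

# term_scores_ holds one numpy array of learned bin scores per term.
n_params = sum(len(scores) for scores in ebm.term_scores_)
print(n_params)
```

As Paul notes in the next comment, this needs a small adjustment for pairs and multiclass.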

@paulbkoch
Collaborator

> In practice, for any specific dataset, the way to calculate this exactly is to just sum up the lengths of the values in term_scores_ as Paul mentioned.

I agree with everything Harsha said, but wanted to add one detail to this sentence. When you get the length of the arrays in "term_scores_", you'll want to ravel them before taking the length if you have pairs or multiclass.
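
A sketch of that adjustment, reusing the illustrative ebm fitted in the earlier snippet:

```python
import numpy as np

# Pair terms are 2-D arrays, and multiclass adds a class dimension;
# ravel flattens each array so every learned score is counted once.
n_params = sum(len(np.ravel(scores)) for scores in ebm.term_scores_)
print(n_params)
```

Equivalently, scores.size gives the same count for a numpy array.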

@JWKKWJ123
Author

Hi all,
Thank you very much! The approximation of the total parameter count is enough for me for now, so I will calculate it from the number of features and bins.
