A unified ensemble framework for PyTorch that makes it easy to improve the performance and robustness of your deep learning models. Ensemble-PyTorch is part of the PyTorch Ecosystem, which requires projects to be actively maintained.
```bash
pip install torchensemble
```
```python
from torch.utils.data import DataLoader
from torchensemble import VotingClassifier  # voting is a classic ensemble strategy

# Load data
train_loader = DataLoader(...)
test_loader = DataLoader(...)

# Define the ensemble
ensemble = VotingClassifier(
    estimator=base_estimator,      # estimator is your PyTorch model
    n_estimators=10,               # number of base estimators
)

# Set the optimizer
ensemble.set_optimizer(
    "Adam",                        # type of parameter optimizer
    lr=learning_rate,              # learning rate of parameter optimizer
    weight_decay=weight_decay,     # weight decay of parameter optimizer
)

# Set the learning rate scheduler
ensemble.set_scheduler(
    "CosineAnnealingLR",           # type of learning rate scheduler
    T_max=epochs,                  # additional arguments on the scheduler
)

# Train the ensemble
ensemble.fit(
    train_loader,
    epochs=epochs,                 # number of training epochs
)

# Evaluate the ensemble
acc = ensemble.evaluate(test_loader)  # testing accuracy
```
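The `VotingClassifier` above combines its base estimators by averaging their predicted class probabilities (soft voting). A minimal, framework-free sketch of that idea, where the toy lambda "models" are illustrative stand-ins for trained networks, not part of the torchensemble API:

```python
# Soft voting: average the class-probability outputs of several base
# estimators, then predict the class with the highest averaged probability.

def average_probs(models, x):
    """Average the probability vectors produced by each model on input x."""
    prob_lists = [m(x) for m in models]
    n = len(models)
    return [sum(p[i] for p in prob_lists) / n for i in range(len(prob_lists[0]))]

# Three hypothetical base estimators with slightly different outputs.
model_a = lambda x: [0.7, 0.3]
model_b = lambda x: [0.6, 0.4]
model_c = lambda x: [0.8, 0.2]

probs = average_probs([model_a, model_b, model_c], x=None)
prediction = max(range(len(probs)), key=probs.__getitem__)
print(probs, prediction)  # → [0.7, 0.3] 0
```

Averaging probabilities rather than hard class labels lets confident base estimators outvote uncertain ones, which is why soft voting usually outperforms majority voting on well-calibrated models.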
| Ensemble Name | Type | Source Code | Problem |
|---|---|---|---|
| Fusion | Mixed | fusion.py | Classification / Regression |
| Voting [1] | Parallel | voting.py | Classification / Regression |
| Neural Forest | Parallel | voting.py | Classification / Regression |
| Bagging [2] | Parallel | bagging.py | Classification / Regression |
| Gradient Boosting [3] | Sequential | gradient_boosting.py | Classification / Regression |
| Snapshot Ensemble [4] | Sequential | snapshot_ensemble.py | Classification / Regression |
| Adversarial Training [5] | Parallel | adversarial_training.py | Classification / Regression |
| Fast Geometric Ensemble [6] | Sequential | fast_geometric.py | Classification / Regression |
| Soft Gradient Boosting [7] | Parallel | soft_gradient_boosting.py | Classification / Regression |
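Parallel methods train their base estimators independently, while sequential methods such as gradient boosting fit each new estimator to the errors left by the current ensemble. A pure-Python sketch of that sequential residual-fitting idea, using hypothetical one-split stumps on 1-D data (the helper names are illustrative, not torchensemble code):

```python
# Gradient boosting for squared loss: each round fits a weak learner to the
# residuals (negative gradient) of the current ensemble, then adds it with
# a shrinkage factor lr.

def fit_stump(xs, residuals):
    """Fit a one-split decision stump minimizing squared error."""
    best = None
    for threshold in xs:
        left = [r for x, r in zip(xs, residuals) if x <= threshold]
        right = [r for x, r in zip(xs, residuals) if x > threshold]
        lv = sum(left) / len(left) if left else 0.0
        rv = sum(right) / len(right) if right else 0.0
        err = sum((r - (lv if x <= threshold else rv)) ** 2
                  for x, r in zip(xs, residuals))
        if best is None or err < best[0]:
            best = (err, threshold, lv, rv)
    _, t, lv, rv = best
    return lambda x: lv if x <= t else rv

def boost(xs, ys, n_rounds=20, lr=0.5):
    """Sequentially fit stumps to residuals; return the boosted model."""
    ensemble = []
    preds = [0.0] * len(xs)
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        ensemble.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: sum(lr * s(x) for s in ensemble)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.0, 1.0, 1.0]   # a simple step function
model = boost(xs, ys)
print([round(model(x), 3) for x in xs])  # → [0.0, 0.0, 1.0, 1.0]
```

Because each stump corrects what the previous ones missed, the ensemble's error shrinks geometrically here; this dependence between rounds is what makes sequential methods harder to parallelize than voting or bagging.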
- scikit-learn>=0.23.0
- torch>=1.4.0
- torchvision>=0.2.2
[1] Zhou, Zhi-Hua. Ensemble Methods: Foundations and Algorithms. CRC Press, 2012.
[2] Breiman, Leo. Bagging Predictors. Machine Learning (1996): 123-140.
[3] Friedman, Jerome H. Greedy Function Approximation: A Gradient Boosting Machine. Annals of Statistics (2001): 1189-1232.
[4] Huang, Gao, et al. Snapshot Ensembles: Train 1, Get M For Free. ICLR, 2017.
[5] Lakshminarayanan, Balaji, et al. Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles. NIPS, 2017.
[6] Garipov, Timur, et al. Loss Surfaces, Mode Connectivity, and Fast Ensembling of DNNs. NeurIPS, 2018.
[7] Feng, Ji, et al. Soft Gradient Boosting Machine. ArXiv, 2020.