Ensemble-PyTorch is a unified ensemble framework for PyTorch that makes it easy to improve the performance and robustness of your deep learning models. It is part of the PyTorch Ecosystem, which requires the project to be well maintained.
Stable version:

```shell
pip install torchensemble
```

Latest version (under development):

```shell
pip install git+https://github.com/TorchEnsemble-Community/Ensemble-Pytorch.git
```
```python
from torchensemble import VotingClassifier  # voting is a classic ensemble strategy

# Load data
train_loader = DataLoader(...)
test_loader = DataLoader(...)

# Define the ensemble
ensemble = VotingClassifier(
    estimator=base_estimator,       # here is your deep learning model
    n_estimators=10,                # number of base estimators
)

# Set the optimizer
ensemble.set_optimizer(
    "Adam",                         # type of parameter optimizer
    lr=learning_rate,               # learning rate of parameter optimizer
    weight_decay=weight_decay,      # weight decay of parameter optimizer
)

# Set the learning rate scheduler
ensemble.set_scheduler(
    "CosineAnnealingLR",            # type of learning rate scheduler
    T_max=epochs,                   # additional arguments on the scheduler
)

# Train the ensemble
ensemble.fit(
    train_loader,
    epochs=epochs,                  # number of training epochs
)

# Evaluate the ensemble
acc = ensemble.predict(test_loader) # testing accuracy
```
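Under the hood, a voting classifier combines its base estimators by averaging their predicted class probabilities and taking the argmax. A minimal pure-Python sketch of that soft-voting rule (the `soft_vote` helper and the example probabilities are illustrative, not part of the torchensemble API):

```python
def soft_vote(prob_lists):
    """Average class probabilities from several estimators, then argmax.

    prob_lists: one probability vector per base estimator for a single
    sample, e.g. three estimators over two classes.
    """
    n_estimators = len(prob_lists)
    n_classes = len(prob_lists[0])
    # Average each class's probability across all estimators
    avg = [sum(p[c] for p in prob_lists) / n_estimators for c in range(n_classes)]
    # Predict the class with the highest average probability
    return max(range(n_classes), key=lambda c: avg[c])

# Three hypothetical estimators, two classes; two of the three favor class 1
preds = [[0.6, 0.4], [0.3, 0.7], [0.4, 0.6]]
print(soft_vote(preds))  # -> 1
```

Because averaging smooths out the mistakes of individual estimators, the ensemble is typically more accurate and more robust than any single base model.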
| Ensemble Name | Type | Source Code |
|---|---|---|
| Fusion | Mixed | fusion.py |
| Voting [1] | Parallel | voting.py |
| Bagging [2] | Parallel | bagging.py |
| Gradient Boosting [3] | Sequential | gradient_boosting.py |
| Snapshot Ensemble [4] | Sequential | snapshot_ensemble.py |
| Adversarial Training [5] | Parallel | adversarial_training.py |
| Fast Geometric Ensemble [6] | Sequential | fast_geometric.py |
| Soft Gradient Boosting [7] | Parallel | soft_gradient_boosting.py |
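The Type column distinguishes parallel ensembles, whose base estimators are trained independently (e.g. voting, bagging), from sequential ones, where each new estimator is fitted to correct what the ensemble so far got wrong. A toy pure-Python sketch of the sequential idea behind gradient boosting with squared loss, using constant predictors as stand-in base learners (the `fit_boosting` helper and its parameters are illustrative only; real base learners are functions of the inputs):

```python
def fit_boosting(targets, n_estimators, lr=0.5):
    """Sequentially fit each new base learner to the residuals
    (gradient boosting with squared loss, constant base learners)."""
    preds = [0.0] * len(targets)
    learners = []
    for _ in range(n_estimators):
        # Residuals = negative gradient of squared loss at current predictions
        residuals = [t - p for t, p in zip(targets, preds)]
        learner = sum(residuals) / len(residuals)  # best constant fit
        learners.append(lr * learner)
        # Add the shrunken new learner to the running ensemble prediction
        preds = [p + lr * learner for p in preds]
    return learners, preds

targets = [1.0, 2.0, 3.0]
learners, preds = fit_boosting(targets, n_estimators=10)
# With constant learners, predictions converge toward the target mean (2.0)
```

Each round depends on the predictions of all previous rounds, which is why sequential ensembles cannot train their base estimators in parallel.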
- scikit-learn>=0.23.0
- torch>=1.4.0
- torchvision>=0.2.2
1. Zhou, Zhi-Hua. Ensemble Methods: Foundations and Algorithms. CRC Press, 2012.
2. Breiman, Leo. Bagging Predictors. Machine Learning (1996): 123-140.
3. Friedman, Jerome H. Greedy Function Approximation: A Gradient Boosting Machine. Annals of Statistics (2001): 1189-1232.
4. Huang, Gao, et al. Snapshot Ensembles: Train 1, Get M For Free. ICLR, 2017.
5. Lakshminarayanan, Balaji, et al. Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles. NIPS, 2017.
6. Garipov, Timur, et al. Loss Surfaces, Mode Connectivity, and Fast Ensembling of DNNs. NeurIPS, 2018.
7. Feng, Ji, et al. Soft Gradient Boosting Machine. ArXiv, 2020.