A Comprehensive List of Hyperparameter Optimization & Tuning Solutions
This article has one purpose: to maintain an up-to-date list of available hyperparameter optimization and tuning solutions for deep learning and other machine learning applications. If you want to suggest a solution to add, please share it in the comments below.
Hyperparameter Optimization for Keras
- DeepReplay — Hyperparameter Visualization
- Hyperas — Keras Wrapper for Hyperopt
- Kopt — Another Hyperopt Based Optimizer
- Talos — Hyperparameter Optimization for Keras
Keras models can also be wrapped with KerasClassifier (a scikit-learn-compatible wrapper) and tuned with scikit-learn's grid search.
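As a minimal sketch of that grid-search pattern, the example below uses scikit-learn's GridSearchCV with its built-in MLPClassifier standing in for a wrapped Keras model; a KerasClassifier instance would plug into GridSearchCV the same way. The parameter grid values are illustrative, not recommendations.

```python
# Grid search over neural-network hyperparameters with scikit-learn.
# MLPClassifier stands in for a wrapped Keras model; KerasClassifier
# would slot into GridSearchCV identically.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Hyperparameters to sweep; names follow MLPClassifier's constructor.
param_grid = {
    "hidden_layer_sizes": [(8,), (16,)],
    "alpha": [1e-4, 1e-2],
}

search = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid,
    cv=3,
)
search.fit(X, y)
print(search.best_params_)
```

GridSearchCV exhaustively evaluates every combination in the grid with cross-validation, which is simple but scales poorly; the dedicated optimizers listed below trade that exhaustiveness for smarter search.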
Other Hyperparameter Optimization Solutions
- advisor — Open-source implementation of Google Vizier
- auto_ml — Automated machine learning
- BTB — Bayesian Tuning and Bandits
- Chocolate — Decentralized Hyperparameter Optimization
- Cornell-MOE — Parallel Bayesian optimization algorithms
- deap — Evolutionary Algorithm Optimization
- devol — Evolutionary Algorithm optimization
- GPyOpt — Gaussian Process Optimization
- H2O — Automatic Machine Learning
- HORD — Deterministic RBF Surrogates
- HPOlib — Hyperparameter Optimizer Wrapper
- HpBandSter — A distributed Hyperband implementation on Steroids
- hypergrad — Gradient-based hyperparameter optimization
- Hyperopt — Distributed Async Optimization
- mlrMBO — Bayesian optimization for R
- pbt — Population Based Training
- pycma — CMA-ES optimization
- rbfopt — Derivative-free optimization
- ROBO — Bayesian Optimization Framework
- Sherpa — Hyperparameter Optimization
- SMAC3 — Sequential Model-based Algorithm Configuration
- spearmint — Bayesian-based Optimization
- TPOT — Automated Machine Learning tool
- test-tube — Track and Test Machine Learning Code
- Tune — Scalable Hyperparameter Search