Auto Hyperparameter Optimization
Set your model parameters in a systematic way
Overview of Auto-HPO (Hyperparameter Optimization)
Nomadic started from an emerging customer need for automated hyperparameter optimization (HPO). Setting model parameters across ML development stages (such as temperature at inference time, or learning rate and epoch count during training) is often a matter of intuition and guesswork.
Nomadic’s HPO tuner library enables teams to identify the best model parameter configurations systematically. We make state-of-the-art search techniques developed by Microsoft Research, along with the latest HPO libraries, available off-the-shelf, so you can easily search, set, and test your parameters as priorities evolve, whether toward greater cost-awareness or maximum performance.
Choose a Tuner
To use Auto-HPO, choose a supported Tuner. Nomadic Tuners employ different search techniques based on your needs, iterating over hyperparameters to find the best configuration according to an evaluator.
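Conceptually, a tuner loops over candidate configurations and keeps the one an evaluator scores highest. The sketch below illustrates that loop in plain Python with a toy evaluator and an exhaustive grid search; the search space, the evaluator, and the parameter names are illustrative, not Nomadic's API:

```python
import itertools

# Toy evaluator: scores a configuration (higher is better).
# In practice this would run your model and measure output quality.
def evaluate(config):
    return -(config["temperature"] - 0.7) ** 2 - 0.0001 * config["max_tokens"]

# Illustrative search space: candidate values per hyperparameter.
search_space = {
    "temperature": [0.0, 0.3, 0.7, 1.0],
    "max_tokens": [128, 256, 512],
}

# Exhaustive grid search, the simplest possible "tuner".
# A real Tuner replaces this loop with a smarter search strategy.
best_config, best_score = None, float("-inf")
for values in itertools.product(*search_space.values()):
    config = dict(zip(search_space.keys(), values))
    score = evaluate(config)
    if score > best_score:
        best_config, best_score = config, score

print(best_config, best_score)
```

Grid search evaluates every combination, which quickly becomes expensive; the search techniques described below exist precisely to avoid that exhaustive sweep.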
Create an Experiment using the Tuner
The following is an example using FlamlParamTuner. FlamlParamTuner uses FLAML (Fast Library for Automated Machine Learning) for efficient hyperparameter tuning. FLAML leverages the structure of the search space to optimize for both cost and model performance simultaneously. It contains two methods developed by Microsoft Research:
- Cost-Frugal Optimization (CFO), a local search method that starts from low-cost configurations and moves toward better-performing (and typically costlier) ones
- BlendSearch, which combines CFO's cost-frugal local search with global search to handle more complex search spaces
Using FLAML, you can specify budget constraints on your parameter search.
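As a sketch of the underlying mechanism (this drives FLAML directly rather than through Nomadic's FlamlParamTuner wrapper), the example below runs flaml.tune with a wall-clock time budget; the toy objective and the parameters x and y are illustrative:

```python
from flaml import tune

# Objective function: FLAML calls this with a sampled configuration
# and expects a dict containing the metric being optimized.
def evaluate_config(config):
    # Toy score with a known optimum near x = 3 (illustrative only).
    score = -(config["x"] - 3) ** 2 + config["y"]
    return {"score": score}

analysis = tune.run(
    evaluate_config,
    config={
        "x": tune.uniform(lower=0, upper=10),  # continuous parameter
        "y": tune.randint(lower=1, upper=5),   # integer parameter
    },
    metric="score",
    mode="max",         # maximize the metric
    time_budget_s=10,   # budget constraint: stop after 10 seconds
    num_samples=-1,     # no trial cap; the time budget governs the search
)

print(analysis.best_config)
```

Here time_budget_s caps total search time; FLAML's cost-aware search tries to make the most of whatever budget you give it.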