Hyperparameter tuning is the process of choosing optimal hyperparameters for a given algorithm. It is crucial to the success of a machine learning model or deep learning architecture, since hyperparameters heavily influence how the model learns. For most machine learning and deep learning algorithms, the hyperparameter search space is so large that manual tuning is impractical.
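To see why manual tuning quickly becomes intractable, consider a back-of-the-envelope sketch. The hyperparameter names and candidate values below are purely illustrative, not tied to any particular model:

```python
# Illustrative search space: a handful of candidate values per hyperparameter.
search_space = {
    "learning_rate": [0.1, 0.01, 0.001, 0.0001],
    "batch_size": [16, 32, 64, 128],
    "num_layers": [2, 3, 4, 5],
    "dropout": [0.0, 0.2, 0.4, 0.5],
    "optimizer": ["sgd", "adam", "rmsprop"],
}

# An exhaustive grid requires one full training run per combination;
# the count is the product of the value counts.
n_configs = 1
for values in search_space.values():
    n_configs *= len(values)
print(n_configs)  # 4 * 4 * 4 * 4 * 3 = 768 full training runs
```

Even with only five hyperparameters and a few values each, an exhaustive grid already means hundreds of training runs, and each additional hyperparameter multiplies that number.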

The hyperparameter optimization literature is rich in algorithms, and many open source implementations already exist. But finding good hyperparameters involves more than an efficient search algorithm over the space of possible values; it also requires organization, so that experiments can be reviewed, compared, restarted, or resumed.

Polyaxon has a concept for suggesting hyperparameters and managing their results, very similar to Google Vizier, called experiment groups. An experiment group in Polyaxon defines a search algorithm, a search space, and a model to train.
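To make the three ingredients concrete, here is a minimal sketch of one such search algorithm, random search, over a dictionary-shaped search space. This is a generic illustration, not Polyaxon's actual API; the function names and the toy scoring function are assumptions for the example:

```python
import random

def sample_config(space, rng):
    """Draw one hyperparameter configuration uniformly from the search space."""
    return {name: rng.choice(values) for name, values in space.items()}

def random_search(space, train_fn, n_trials=10, seed=0):
    """Minimal random search: sample configs, evaluate each, keep the best score."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = sample_config(space, rng)
        score = train_fn(config)  # in practice: train the model, return a metric
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Toy usage: a stand-in "training" function that rewards closeness to lr=0.01.
space = {"learning_rate": [0.1, 0.01, 0.001], "batch_size": [32, 64]}
best, score = random_search(
    space, lambda c: -abs(c["learning_rate"] - 0.01), n_trials=20
)
print(best)
```

A managed system like an experiment group wraps this same loop, but runs the trials concurrently and records each configuration and its result so experiments can be compared or resumed later.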
