Hyperas

Hyperparameter tuning

A very simple convenience wrapper around hyperopt for fast prototyping with Keras models. Hyperas lets you use the power of hyperopt without having to learn its syntax. Instead, define your Keras model as you normally would, and use a simple template notation to mark the hyperparameter ranges you want to tune.
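To make the template idea concrete, here is a minimal, stdlib-only sketch of how a `{{...}}` marker could be expanded into a sampled value. This is a conceptual illustration, not Hyperas' actual implementation: the `expand` helper and its regex are assumptions made for this example.

```python
import random
import re

def expand(template, rng):
    """Replace {{...}} markers with values sampled from the stated range.

    Conceptual sketch of Hyperas-style template expansion; `expand` is a
    hypothetical helper, not part of the Hyperas API.
    """
    samplers = {"uniform": rng.uniform, "choice": rng.choice}
    def substitute(match):
        # Evaluate e.g. "uniform(0, 1)" against the sampler namespace only.
        return repr(eval(match.group(1), {"__builtins__": {}}, samplers))
    return re.sub(r"\{\{(.+?)\}\}", substitute, template)

rng = random.Random(0)
print(expand("Dropout({{uniform(0, 1)}})", rng))
print(expand("Dense({{choice([256, 512, 1024])}})", rng))
```

Each run of the search would expand the same template with freshly sampled values, which is the core convenience the wrapper provides.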

Features

* Varying dropout probabilities, sampling from a uniform distribution
* Different layer output sizes
* Different optimization algorithms to use
* Varying choices of activation functions
* Conditionally adding layers depending on a choice
* Swapping whole sets of layers
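The kinds of choices listed above can be sketched as one sampled configuration. The snippet below is a stdlib-only illustration of the search space, with all names and ranges assumed for the example; it does not use Hyperas' or hyperopt's API.

```python
import random

rng = random.Random(7)

# Sample one candidate configuration; every range here is illustrative.
config = {
    "dropout": rng.uniform(0.0, 1.0),                    # varying dropout probability
    "dense_units": rng.choice([256, 512, 1024]),         # layer output size
    "optimizer": rng.choice(["rmsprop", "adam", "sgd"]), # optimization algorithm
    "activation": rng.choice(["relu", "tanh", "sigmoid"]),
    "extra_layer": rng.choice([False, True]),            # conditionally add a layer
}

layers = [("Dense", config["dense_units"], config["activation"])]
if config["extra_layer"]:
    # The extra layer only exists in some sampled configurations.
    layers.append(("Dense", rng.choice([64, 128]), config["activation"]))

print(config)
print(layers)
```

A tuner then evaluates many such sampled configurations and keeps the best-performing one; hyperopt's TPE algorithm does this sampling adaptively rather than uniformly at random.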

Official website

Tutorial and documentation
