Implemented optimizers

The following optimizer implementations can be found in this extension package:

Pygmo implementations

These derivative-free global optimizers are ported from the pygmo Python library.

In order to use these optimizers, you need to install the pygmo dependency:

pip install pygmo

Note: The pygmo library is not yet compatible with Python 3.9 and is only available on Linux and Unix systems.

Name                    Keyword argument         Reference
----------------------  -----------------------  -------------------------
CMAES                   "CMAES"                  pygmo cmaes
PSO                     "PygmoPSO"               pygmo pso_gen
SGA                     "SGA"                    pygmo sga
SEA                     "SEA"                    pygmo sea
XNES                    "XNES"                   pygmo xnes
Differential Evolution  "DifferentialEvolution"  pygmo de
Simulated Annealing     "SimulatedAnnealing"     pygmo simulated_annealing

TensorFlow Keras optimizers

These gradient-based optimizers are ported from the TensorFlow Keras Python library.

In order to use these optimizers, you need to install the tensorflow dependency:

pip install tensorflow

Name     Keyword argument  Reference
-------  ----------------  ----------------------------
SGD      "SGD"             tf.keras.optimizers.SGD
RMSprop  "RMSprop"         tf.keras.optimizers.RMSprop
Adam     "AdamTensorflow"  tf.keras.optimizers.Adam
Nadam    "NAdam"           tf.keras.optimizers.Nadam
Adamax   "Adamax"          tf.keras.optimizers.Adamax
Ftrl     "Ftrl"            tf.keras.optimizers.Ftrl
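
Being gradient-based, these optimizers need an objective whose gradient can be computed. A minimal sketch of the underlying tf.keras API (the quadratic objective here is illustrative):

import tensorflow as tf

# Minimize f(x) = sum(x^2) with Adam, computing gradients via GradientTape.
x = tf.Variable([1.5, -0.5])
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1)

for _ in range(100):
    with tf.GradientTape() as tape:
        loss = tf.reduce_sum(x ** 2)
    grads = tape.gradient(loss, [x])
    optimizer.apply_gradients(zip(grads, [x]))

print(x.numpy())  # close to the minimum at the origin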

Nevergrad optimizers

These derivative-free global optimizers are ported from the nevergrad Python library.

In order to use these optimizers, you need to install the nevergrad dependency:

pip install nevergrad

Name                    Keyword argument  Reference
----------------------  ----------------  -------------------------------------------
Differential Evolution  "NevergradDE"     nevergrad.optimizers.DifferentialEvolution
PSO                     "PSO"             nevergrad.optimizers.ConfPSO
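
A minimal sketch of the underlying nevergrad API (the sphere objective, dimension, and budget are illustrative): DifferentialEvolution is a configurable optimizer family, so it is first configured and then instantiated with a parametrization and a budget:

import nevergrad as ng

def sphere(x):
    # Simple quadratic test objective.
    return sum(xi ** 2 for xi in x)

# Configure the DE family, then instantiate it for a 2-D problem.
optimizer = ng.optimizers.DifferentialEvolution()(parametrization=2, budget=200)
recommendation = optimizer.minimize(sphere)
print(recommendation.value)  # recommended point, close to the origin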

Evosax optimizers

These derivative-free global optimizers are ported from the evosax Python library.

In order to use these optimizers, you need to install the evosax dependency:

pip install evosax

Note: The evosax library is only available on Linux and Unix systems.

Name                    Keyword argument   Reference
----------------------  -----------------  ----------------------------
CMAES                   "EvoSaxCMAES"      evosax.strategies.cma_es
PSO                     "EvoSaxPSO"        evosax.strategies.pso
Simulated Annealing     "EvoSaxSimAnneal"  evosax.strategies.sim_anneal
Differential Evolution  "EvoSaxDE"         evosax.strategies.de
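
A minimal sketch of the underlying evosax interface (this assumes the evosax 0.1.x ask/tell strategy API; the sphere objective and the population and generation settings are illustrative):

import jax
import jax.numpy as jnp
from evosax import CMA_ES

# evosax strategies are JAX-based and follow an ask/tell loop.
rng = jax.random.PRNGKey(0)
strategy = CMA_ES(popsize=20, num_dims=2)
es_params = strategy.default_params
state = strategy.initialize(rng, es_params)

for generation in range(50):
    rng, rng_ask = jax.random.split(rng)
    x, state = strategy.ask(rng_ask, state, es_params)  # sample candidates
    fitness = jax.vmap(lambda v: jnp.sum(v ** 2))(x)    # evaluate sphere objective
    state = strategy.tell(x, fitness, state, es_params)  # update the strategy

print(state.best_member, state.best_fitness)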

Optuna optimizers

These derivative-free global optimizers are ported from the optuna Python library.

In order to use these optimizers, you need to install the optuna dependency:

pip install optuna

Name                              Keyword argument  Reference
--------------------------------  ----------------  --------------------------
Tree-structured Parzen Estimator  "TPESampler"      optuna.samplers.TPESampler
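
A minimal sketch of the underlying optuna API (the one-dimensional quadratic objective and trial budget are illustrative): the TPE sampler is passed to a study, which then drives the optimization:

import optuna

def objective(trial):
    # One-dimensional quadratic test objective.
    x = trial.suggest_float("x", -10.0, 10.0)
    return x ** 2

# Use the Tree-structured Parzen Estimator as the sampling strategy.
study = optuna.create_study(sampler=optuna.samplers.TPESampler(seed=42))
study.optimize(objective, n_trials=50)
print(study.best_params)  # close to {"x": 0.0}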

Optax optimizers

These gradient-based optimizers are ported from the optax Python library.

In order to use these optimizers, you need to install the optax dependency:

pip install optax

Name  Keyword argument  Reference
----  ----------------  ----------
Adam  "Adam"            optax.adam
SGD   "SGDOptax"        optax.sgd
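
A minimal sketch of the underlying optax API (the quadratic objective and step count are illustrative): optax optimizers are pure functions, so the parameters and optimizer state are threaded through the update loop explicitly:

import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
    # Simple quadratic test objective.
    return jnp.sum(params ** 2)

params = jnp.array([1.5, -0.5])
optimizer = optax.adam(learning_rate=0.1)
opt_state = optimizer.init(params)

for _ in range(100):
    grads = jax.grad(loss_fn)(params)
    updates, opt_state = optimizer.update(grads, opt_state)
    params = optax.apply_updates(params, updates)

print(params)  # close to the minimum at the origin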