deepr.optimizers package

Submodules

deepr.optimizers.base module

Interface for Optimizers

class deepr.optimizers.base.Optimizer[source]

Bases: ABC

Interface for Optimizers
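
As a rough illustration of how this interface is meant to be extended, the sketch below subclasses Optimizer with a plain gradient-descent implementation. This page does not show the abstract method, so the __call__ signature (a dictionary of named tensors in, a dictionary containing a "train_op" out) and the SGDOptimizer name are assumptions for illustration, not the library's actual contract.

   from typing import Dict

   import tensorflow as tf

   from deepr.optimizers.base import Optimizer


   class SGDOptimizer(Optimizer):
       """Hypothetical subclass: plain gradient descent on a named loss tensor."""

       def __init__(self, learning_rate: float, loss: str = "loss"):
           self.learning_rate = learning_rate
           self.loss = loss

       def __call__(self, inputs: Dict) -> Dict:
           # Assumed contract: `inputs` maps tensor names to tensors and the
           # caller expects the training op under the key "train_op".
           sgd = tf.train.GradientDescentOptimizer(self.learning_rate)
           return {"train_op": sgd.minimize(inputs[self.loss])}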

deepr.optimizers.core module

TensorFlow Optimizers

class deepr.optimizers.core.TensorflowOptimizer(optimizer, learning_rate, loss='loss', grad_norms=None, exclude_vars=None, clip=None, skip_vars=None, skip_steps=None, **kwargs)[source]

Bases: Optimizer

Default TensorFlow Optimizers

learning_rate

Learning rate

Type: float

optimizer

Name of the optimizer. See TensorflowOptimizer.OPTIMIZERS for a description of the available TensorFlow optimizers.

Type: str

kwargs

Optional keyword arguments forwarded to the underlying TensorFlow optimizer.

OPTIMIZERS = {
    'adagrad': tensorflow.python.training.adagrad.AdagradOptimizer,
    'adam': tensorflow.python.training.adam.AdamOptimizer,
    'lazyadam': tensorflow.contrib.opt.python.training.lazy_adam_optimizer.LazyAdamOptimizer,
    'momentum': tensorflow.python.training.momentum.MomentumOptimizer,
    'sgd': tensorflow.python.training.gradient_descent.GradientDescentOptimizer,
}
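
For reference, a minimal instantiation sketch based on the constructor signature above. The argument values are examples only, and the extra epsilon keyword is assumed to be forwarded through **kwargs to tf.train.AdamOptimizer.

   from deepr.optimizers.core import TensorflowOptimizer

   # "adam" selects tf.train.AdamOptimizer from TensorflowOptimizer.OPTIMIZERS;
   # `loss` is the name of the loss tensor to minimize.
   optimizer = TensorflowOptimizer(
       optimizer="adam",
       learning_rate=0.001,
       loss="loss",
       epsilon=1e-8,  # example extra keyword argument (assumed pass-through)
   )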

Module contents