deepr.hooks package

Submodules

deepr.hooks.base module

Base Hooks Factories

Some TensorFlow hooks cannot be defined before runtime. For example, a LoggingTensorHook requires the tensors it logs to already be defined in the graph.

To resolve this issue, we provide hook factory abstractions that let you parametrize hooks whose actual creation is deferred until runtime.

See the LoggingTensorHookFactory for an example.

class deepr.hooks.base.EstimatorHookFactory[source]

Bases: ABC

Estimator Hook Factory

class deepr.hooks.base.TensorHookFactory[source]

Bases: ABC

Tensor Hook Factory
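
As a minimal sketch of how these abstractions can be subclassed (the exact abstract method name and signature are an assumption inferred from the factories documented below, not confirmed by this page), a custom TensorHookFactory could look like this:

    from typing import Dict

    import tensorflow as tf

    from deepr.hooks.base import TensorHookFactory


    class PrintTensorsHookFactory(TensorHookFactory):
        """Hypothetical factory: builds a logging hook once tensors exist."""

        def __call__(self, tensors: Dict[str, tf.Tensor]) -> tf.estimator.SessionRunHook:
            # The tensors dict only exists at graph-building time, which is
            # why the hook itself cannot be instantiated when the job is
            # configured.
            return tf.estimator.LoggingTensorHook(tensors=tensors, every_n_iter=100)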

deepr.hooks.early_stopping module

Early Stopping Hook

class deepr.hooks.early_stopping.EarlyStoppingHookFactory(metric, max_steps_without_improvement, min_steps=0, mode=BestMode.DECREASE, run_every_secs=None, run_every_steps=None, final_step=None)[source]

Bases: EstimatorHookFactory

Early Stopping Hook Factory

metric

Name of the metric to read from evaluation checkpoints.

Type:

str

max_steps_without_improvement

If there is no improvement in the metric for this many steps, stop training.

Type:

int

min_steps

Do not attempt to early stop for this many steps.

Type:

int

mode

INCREASE or DECREASE. If DECREASE, a lower metric value is better.

Type:

BestMode

run_every_secs

If given, the early stopping hook runs every run_every_secs seconds.

Type:

int, Optional

run_every_steps

If given, the early stopping hook runs every run_every_steps steps.

Either run_every_secs or run_every_steps should be given.

Type:

int, Optional

final_step

If given, the global_step will be set to this value when early stopping triggers.

This is a useful way to signal the end of training to the evaluator in the case of distributed training (early stopping causes issues with train_and_evaluate).

Type:

int, Optional
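
As a sketch of how the constructor documented above might be configured (only the argument names and their semantics come from this page; the values and the "loss" metric name are illustrative):

    from deepr.hooks.early_stopping import EarlyStoppingHookFactory

    early_stopping = EarlyStoppingHookFactory(
        metric="loss",                       # metric read from evaluation checkpoints
        max_steps_without_improvement=1000,  # stop if no improvement for 1000 steps
        min_steps=5000,                      # never stop before 5000 steps
        run_every_steps=100,                 # check the metric every 100 steps
        final_step=100000,                   # set global_step to this value on stop
    )

With the default mode (BestMode.DECREASE), a lower "loss" is considered better. Since this is an EstimatorHookFactory, the hook itself is only created at runtime.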

deepr.hooks.log_variables_init module

Log Variables Statistics after initialization.

class deepr.hooks.log_variables_init.LogVariablesInitHook(use_mlflow=False, whitelist=None, blacklist=('adam', 'beta', 'stopping', 'step'))[source]

Bases: SessionRunHook

Log Variables Statistics after initialization.

after_create_session(session, coord)[source]

Log the average norm and the number of zeros of variable values.
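
A minimal usage sketch based on the signature above (the assumption that whitelist / blacklist entries are matched against variable names is not confirmed by this page):

    from deepr.hooks.log_variables_init import LogVariablesInitHook

    # Log initialization statistics only for variables matching the whitelist,
    # and push them to MLFlow in addition to the standard logs.
    hook = LogVariablesInitHook(use_mlflow=True, whitelist=("embedding",))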

deepr.hooks.logging_tensor module

MLFlow Metrics Hook

class deepr.hooks.logging_tensor.LoggingTensorHook(tensors, functions=None, name=None, use_mlflow=False, use_graphite=False, skip_after_step=None, every_n_iter=None, every_n_secs=None, at_end=False, formatter=<function _default_formatter>)[source]

Bases: LoggingTensorHook

Logging Hook (tensors and custom metrics as functions)

class deepr.hooks.logging_tensor.LoggingTensorHookFactory(tensors=None, functions=None, name=None, use_mlflow=False, use_graphite=False, skip_after_step=None, every_n_iter=None, every_n_secs=None, at_end=False, formatter=<function _default_formatter>)[source]

Bases: TensorHookFactory

Parametrize the creation of a LoggingTensorHook at runtime.

Arguments for instantiation should be provided as keyword arguments.

tensors

Names of the tensors to log at runtime. If None (default), log all scalar tensors.

Type:

List[str], Optional

functions

Additional “python” metrics. Each function should return a float.

Type:

Dict[str, Callable[[], float]], Optional

name

Name used as a prefix for tags when sending metrics to MLFlow / Graphite.

Type:

str, Optional

use_mlflow

If True, send metrics to MLFlow. Default is False.

Type:

bool, Optional

use_graphite

If True, send metrics to Graphite. Default is False.

Type:

bool, Optional

skip_after_step

If not None, do not run the hooks after this step.

Prevents outliers when used in conjunction with an early stopping hook that overrides the global_step.

Type:

int, Optional

formatter

Formatter for logging, default uses 7 digits precision.

Type:

Callable[[str, Any], str], Optional
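
A configuration sketch based on the attributes above (the tensor names and the custom function are illustrative, not part of the library):

    import time

    from deepr.hooks.logging_tensor import LoggingTensorHookFactory

    start = time.time()

    logging_factory = LoggingTensorHookFactory(
        tensors=["loss", "learning_rate"],                        # tensor names resolved at runtime
        functions={"elapsed_secs": lambda: time.time() - start},  # extra "python" metric
        name="training",                                          # prefix for MLFlow / Graphite tags
        use_mlflow=True,
        every_n_iter=100,
        skip_after_step=100000,  # avoid outliers if early stopping overrides global_step
    )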

class deepr.hooks.logging_tensor.MaxResidentMemory(unit='gb')[source]

Bases: ResidentMemory

Measure maximum resident memory of the current process

class deepr.hooks.logging_tensor.ResidentMemory(unit='gb')[source]

Bases: object

Measure resident memory of the current process
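
These classes are natural candidates for the functions argument of LoggingTensorHookFactory, assuming (as the Callable[[], float] signature above suggests) that their instances are callable and return a float:

    from deepr.hooks.logging_tensor import (
        LoggingTensorHookFactory,
        MaxResidentMemory,
        ResidentMemory,
    )

    logging_factory = LoggingTensorHookFactory(
        functions={
            "resident_memory_gb": ResidentMemory(unit="gb"),
            "max_resident_memory_gb": MaxResidentMemory(unit="gb"),
        },
        every_n_iter=100,
    )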

deepr.hooks.num_params module

Log Number of Parameters after session creation

class deepr.hooks.num_params.NumParamsHook(use_mlflow=False)[source]

Bases: SessionRunHook

Log Number of Parameters after session creation

after_create_session(session, coord)[source]

Called when a new TensorFlow session is created.

This is called to signal the hooks that a new session has been created. This has two essential differences with the situation in which begin is called:

  • When this is called, the graph is finalized and ops can no longer be added to the graph.

  • This method will also be called as a result of recovering a wrapped session, not only at the beginning of the overall session.

Parameters:
  • session – A TensorFlow Session that has been created.

  • coord – A Coordinator object which keeps track of all threads.

deepr.hooks.num_params.get_num_params()[source]

Get number of global and trainable parameters

Returns:

num_global, num_trainable

Return type:

Tuple[int, int]
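
A usage sketch; presumably the function must be called after the graph holding the variables has been built (this timing constraint is an assumption):

    from deepr.hooks.num_params import get_num_params

    num_global, num_trainable = get_num_params()
    print(f"parameters: global={num_global}, trainable={num_trainable}")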

deepr.hooks.steps_per_sec module

Steps Per Second Hook

class deepr.hooks.steps_per_sec.StepsPerSecHook(batch_size, name=None, use_mlflow=False, use_graphite=False, skip_after_step=None, every_n_steps=100, every_n_secs=None, output_dir=None, summary_writer=None)[source]

Bases: StepCounterHook

Logs steps per second and num_examples_per_sec.

batch_size

Batch Size

Type:

int

prefix

Prefix of tags when sending to MLFlow / Graphite

Type:

str, Optional

use_mlflow

If True, send metrics to MLFlow. Default is False.

Type:

bool, Optional

use_graphite

If True, send metrics to Graphite. Default is False.

Type:

bool, Optional

skip_after_step

If not None, do not run the hooks after this step.

Type:

int, Optional
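
A configuration sketch based on the signature above (the batch size value is illustrative):

    from deepr.hooks.steps_per_sec import StepsPerSecHook

    steps_per_sec = StepsPerSecHook(
        batch_size=512,        # used to derive num_examples_per_sec from steps per second
        name="training",       # prefix for MLFlow / Graphite tags
        use_graphite=True,
        every_n_steps=100,
    )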

deepr.hooks.summary module

Summary Saver Hook

class deepr.hooks.summary.SummarySaverHookFactory(tensors=None, save_steps=None, save_secs=None, output_dir=None, summary_writer=None, scaffold=None)[source]

Bases: TensorHookFactory

Summary Saver Hook
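
A configuration sketch based on the signature above (the tensor names and output directory are illustrative):

    from deepr.hooks.summary import SummarySaverHookFactory

    summary_factory = SummarySaverHookFactory(
        tensors=["loss", "learning_rate"],  # tensors resolved at runtime by the factory
        save_steps=100,
        output_dir="model_dir/summaries",
    )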

Module contents