DeepR

Config

The configuration module makes it possible to configure any object from a dictionary; a short example follows the list below.

assert_no_macros(item)

Raises a ValueError if item has macro parameters.

fill_macros(item[, macros])

Create a copy of item in which the macro params present in macros are filled.

fill_references(item[, references])

Fill all params that are references; fail if a reference is not found.

from_config(item)

Instantiate item from config.

ismacro(item)

True if item is a string that looks like '$macro:param'.

isreference(item)

True if item is a string that looks like '@reference'.

parse_config(config[, macros])

Fill macro parameters and references in config from macros.
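
A minimal sketch of the typical workflow, assuming the helpers are exposed at the package root and that configs use a "type" key giving the full import path:

```python
import deepr as dpr

# "$params:units" is a macro parameter, filled from the macros dictionary.
config = {"type": "deepr.layers.Dense", "units": "$params:units"}
macros = {"params": {"units": 16}}

parsed = dpr.parse_config(config, macros)  # fills macros and references
dense = dpr.from_config(parsed)            # equivalent to deepr.layers.Dense(units=16)
```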

Exporter

Exporters run at the end of training.

Exporter()

Base class for Exporters

BestCheckpoint(metric[, mode, use_mlflow, tag])

Overrides Checkpoint Information to point to the best checkpoint.

SaveVariables(path_variables, variable_names)

Save Variables as Parquet; supports chunking.

SavedModel(path_saved_model, fields)

Saved Model Exporter

Hook

Hooks are called regularly during training to send some information to another service.

EarlyStoppingHookFactory(metric, ...[, ...])

Early Stopping Hook Factory

EstimatorHookFactory()

Estimator Hook Factory

LoggingTensorHookFactory([tensors, ...])

Parametrize the creation of a LoggingTensorHook factory.

MaxResidentMemory([unit])

Measure maximum resident memory of the current process

ResidentMemory([unit])

Measure resident memory of the current process

StepsPerSecHook(batch_size[, name, ...])

Logs steps per second and num_examples_per_sec.

SummarySaverHookFactory([tensors, ...])

Summary Saver Hook

TensorHookFactory()

Tensor Hook Factory

Initializer

Initializers run before training.

CheckpointInitializer(path_init_ckpt, ...)

Checkpoint Initializer

Io

Io provides helpers to read from and write to file systems; see the example after the list.

HDFSFileSystem()

Context-aware HDFSFileSystem using pyarrow.hdfs.

ParquetDataset(path_or_paths[, filesystem, ...])

Context-aware ParquetDataset with support for chunked writing.

Path(*args)

Equivalent of pathlib.Path for local and HDFS file systems.

read_json(path)

Read a json or jsonnet file into a dictionary.
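
For example, Path can be used like pathlib.Path on both local and HDFS paths (a sketch; the HDFS URI and file names below are hypothetical):

```python
import deepr as dpr

path = dpr.io.Path("viewfs://root/user/data") / "model"  # hypothetical URI
config = dpr.io.read_json("config.jsonnet")  # json or jsonnet into a dict
```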

Job

Jobs are the programs that actually run. They are composable through the Pipeline job, the YarnLauncher, and the YarnTrainer; see the sketch after the list.

CleanupCheckpoints(path_model[, ...])

Cleanup Checkpoints in path_model

CopyDir(source, target[, skip_copy, overwrite])

Copy Directory.

ExportXlaModelMetadata(path_optimized_model, ...)

Export XLA-compatible model metadata from a SavedModel.

GridSampler(param_grid[, repeat])

Grid Sampler wrapping ParameterGrid from sklearn

Job()

Interface for jobs

LogMetric(key, value[, use_mlflow])

Log Metric job

MLFlowSaveConfigs([use_mlflow, config, ...])

Upload Configs to MLFlow

MLFlowSaveInfo([use_mlflow, path_mlflow, ...])

Save MLFlow info to path

OptimizeSavedModel(path_saved_model, ...[, ...])

Converts SavedModel into an optimized protobuf for inference

ParamsSampler(param_grid[, n_iter, repeat, seed])

Parameter Sampler

ParamsTuner(job, macros, sampler)

Params tuner

Pipeline(jobs)

Pipeline; executes a list of jobs in order.

Sampler()

Parameters Sampler

SaveDataset(input_fn, path, fields[, ...])

Save Dataset Job.

Trainer(path_model, pred_fn, loss_fn, ...[, ...])

Train and evaluate a tf.Estimator on the current machine.

YarnLauncher(job, config[, run_on_yarn, ...])

Packages the current environment, uploads the .pex, and runs the job on YARN.

YarnTrainer(trainer, config[, train_on_yarn])

Run a TrainerBase job on YARN.
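
A sketch of composition via Pipeline; it assumes every Job exposes a run() method, and the job arguments below are hypothetical:

```python
import deepr as dpr

pipeline = dpr.jobs.Pipeline(
    [
        dpr.jobs.CleanupCheckpoints(path_model="model"),  # hypothetical path
        dpr.jobs.LogMetric(key="done", value=1),          # hypothetical metric
    ]
)
pipeline.run()  # executes each job in order
```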

Layer

TensorFlow logic is preferably defined in a Layer for reusability and composability; it is the equivalent of Keras, Trax, etc. layers. A Layer takes as input / returns a dictionary of Tensors, which means that the __init__ method of a Layer must define which keys are used for inputs / outputs; see the sketch after the list.

ActiveMode(layer[, mode, inputs, outputs])

Active Mode Layer.

Add()

Add two tensors of any compatible shapes.

Average(**kwargs)

Average Layer

BPR(**kwargs)

Vanilla BPR Loss Layer.

BooleanMask(**kwargs)

Boolean Mask Layer

ClickRank(**kwargs)

Click Rank Layer

Concat([axis])

Concatenate tensors on axis

Dense(units[, inputs, outputs, name])

Dense Layer

DotProduct([n_in])

Dot Product on the last dimension of the input vectors.

Embedding(variable_name, shape[, trainable, ...])

Partitioned Embedding Layer

Equal(values[, reduce_mode])

Equal Layer

Identity([inputs, name])

Identity Layer

IsMinSize(size, **kwargs)

Compare size of inputs to minimum

Layer([n_in, n_out, inputs, outputs, name])

Base class for composable layers in a deep learning network.

LogicalAnd()

Perform logical_and on two tensors of compatible shapes.

Lookup(table_initializer_fn, **kwargs)

Lookup Layer.

LookupFromFile(table_name, path[, ...])

Lookup From File Layer.

LookupFromMapping(table_name, mapping[, ...])

Lookup From Mapping Layer.

LookupIndexToString(table_name[, path, ...])

Lookup Index To String.

MaskedBPR(**kwargs)

Masked BPR Loss Layer.

NotEqual(values[, reduce_mode])

Not Equal Layer

Parallel(*layers)

Apply layers in parallel on consecutive inputs.

Product([n_in])

Product Layer

Rename(layer[, inputs, outputs])

Wrap Layer in a Node to rename inputs / outputs.

Select([inputs, outputs, indices, n_in])

Layer to extract inputs / outputs from previous layers

DAG(*layers)

Class to easily compose layers in a deep learning network.

Sequential

alias of DAG

Slice(begin, end, **kwargs)

Slice Layer

SliceFirst(size, **kwargs)

Slice First Layer

SliceLast(size, **kwargs)

Slice Last Layer

StringJoin([n_in, separator])

String Join Layer

Sum([n_in])

Sum Layer

ToDense(default_value, **kwargs)

Sparse to Dense Layer

ToFloat()

Cast tensor to float32

WeightedAverage([default])

Weighted Average Layer
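
A minimal sketch of the dictionary-in / dictionary-out convention, assuming layers are callable on dictionaries of Tensors:

```python
import tensorflow as tf
import deepr as dpr

# inputs / outputs define which dictionary keys each layer reads and writes.
layer = dpr.layers.DAG(
    dpr.layers.Dense(units=8, inputs="x", outputs="hidden"),
    dpr.layers.Dense(units=1, inputs="hidden", outputs="y"),
)
outputs = layer({"x": tf.ones([2, 4])})  # assumption: returns {"y": <Tensor>}
```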

Macros

Macros are subclasses of dictionaries that dynamically create params for configs; a hypothetical example follows the list.

MLFlowInit([use_mlflow, run_name, ...])

MLFlow Macro initializes MLFlow run and sets MLFlow parameters
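
A hypothetical custom macro; since macros are dictionary subclasses, their items become the '$name:param' values available when parsing a config:

```python
class Training(dict):
    """Hypothetical macro computing params at config-parsing time."""

    def __init__(self, batch_size: int = 32):
        super().__init__(batch_size=batch_size, steps=1000 // batch_size)

# With macros = {"training": Training()}, the string "$training:batch_size"
# in a config resolves to 32 during parse_config.
```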

Metrics

Metrics compute training and validation information during training.

DecayMean([decay, tensors, pattern])

Decay Mean Metric

FiniteMean([tensors, pattern])

Finite Mean Metric

LastValue([tensors, pattern])

Last value Metric

MaxValue([tensors, pattern])

Max value Metric

Mean([tensors, pattern])

Mean Metric

Metric()

Base class for Metrics

StepCounter(name)

StepCounter Metric

VariableValue(name)

Variable Value Metric.

Optimizer

An Optimizer defines how the variables of the graph are optimized during training.

Optimizer()

Interface for Optimizers

TensorflowOptimizer(optimizer, learning_rate)

Default Tensorflow Optimizers
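
For instance, a sketch wrapping a standard TensorFlow optimizer by name (resolving "Adam" to the corresponding TensorFlow optimizer class is an assumption):

```python
import deepr as dpr

optimizer = dpr.optimizers.TensorflowOptimizer(
    optimizer="Adam", learning_rate=0.001
)
```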

Prepro

The Prepro classes are utilities to transform a tf.data.Dataset.

The most common way to define a Prepro is to wrap a Layer with a Map or Filter transform; see the sketch after the list.

Batch(batch_size[, drop_remainder])

Combines consecutive elements of a dataset into batches.

Filter(predicate[, on_dict, modes])

Filter a dataset, keeping only elements on which predicate is True.

FromExample(fields[, sequence, modes, ...])

Parse TF Record Sequence Example

Map(map_func[, on_dict, update, modes, ...])

Map a function on each element of a tf.data.Dataset.

PaddedBatch(batch_size, fields[, drop_remainder])

Combines consecutive elements of a dataset into padded batches.

Prefetch(buffer_size)

Creates a dataset that prefetches elements on CPU / GPU.

Prepro()

Base class for composable preprocessing functions.

Repeat([count, modes])

Repeats a dataset so each original value is seen count times.

Serial(*preprocessors[, fuse, ...])

Chain preprocessors to define complex preprocessing pipelines.

Shuffle(buffer_size[, modes, seed, ...])

Randomly shuffles the elements of a dataset.

TFRecordSequenceExample

alias of FromExample

TableInitializer(table_initializer_fn)

Table Initializer.

Take([count])

Creates a dataset with at most count elements.

ToExample(fields[, sequence, modes, ...])

Convert dictionary of Tensors to tf.SequenceExample.
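
A sketch of a Serial pipeline, assuming a Prepro instance is applied to a tf.data.Dataset by calling it (the map_func below is hypothetical):

```python
import tensorflow as tf
import deepr as dpr

prepro = dpr.prepros.Serial(
    dpr.prepros.Map(lambda x: {"x": tf.cast(x["x"], tf.float32)}),
    dpr.prepros.Batch(batch_size=32),
    dpr.prepros.Prefetch(buffer_size=1),
)
dataset = tf.data.Dataset.from_tensor_slices({"x": list(range(100))})
dataset = prepro(dataset)  # assumption: prepros are callable on datasets
```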

Reader

A Reader is the equivalent of a tensorflow_datasets reader: its __init__ method defines all the parameters necessary to create a Dataset; see the sketch after the list.

GeneratorReader(generator_fn, output_types)

Reader Class for datasets using generator functions

Reader()

Interface for readers, similar to tensorflow_datasets

TFRecordReader(path[, num_parallel_reads, ...])

Class for TFRecord Reader of tf.train.Example.
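
A sketch with GeneratorReader; the as_dataset() method name mirrors tensorflow_datasets and is an assumption here:

```python
import tensorflow as tf
import deepr as dpr

def generator_fn():
    for idx in range(4):
        yield {"x": idx}

reader = dpr.readers.GeneratorReader(generator_fn, output_types={"x": tf.int64})
dataset = reader.as_dataset()  # assumption: mirrors tensorflow_datasets
```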

Utils

Various helper functions and classes; a short example follows the list.

Field(name, shape, dtype[, default, sequence])

Convenient way to define fields for features.

TableContext()

Context Manager to reuse Tensorflow tables.

TensorType(dtype)

Return TensorType from Python, TensorFlow or NumPy type

chunks(iterable, chunk_size)

Split Iterable into Iterable chunks.

dict_to_item(data, keys)

Convert dictionary into object or tuple of objects.

get_feedable_tensors(graph, names)

Retrieve feed tensors from graph.

get_fetchable_tensors(graph, names)

Retrieve fetch tensors from graph.

handle_exceptions(fn)

Handle Exceptions Decorator.

import_graph_def(path_pb[, name])

Import Graph Definition from a protobuf into the current Graph.

index_to_string_table_from_file(name[, ...])

Create reverse table from file

item_to_dict(items, keys)

Convert tuple or object to dictionary.

make_same_shape(tensors[, broadcast])

Make list of tensors the same shape

msb_lsb_to_str(msb, lsb)

Convert two 64-bit integers MSB and LSB to a 128-bit UUID.

progress(iterable[, secs])

Log progress on an Iterable.

save_variables_in_ckpt(path, variables[, ...])

Save variables in checkpoint

str_to_msb_lsb(el)

Convert a 128-bit UUID to two 64-bit integers MSB and LSB.

table_from_file(name[, path, key_dtype, ...])

Create table from file

table_from_mapping(name[, mapping, ...])

Create table from mapping

to_flat_tuple(items)

Convert nested list, tuples and generators to a flat tuple.
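
For example, Field groups a feature's name, shape, and dtype (a sketch; the top-level export dpr.Field is an assumption):

```python
import tensorflow as tf
import deepr as dpr

# Variable-length int64 feature, with 0 used as the padding / default value.
ids = dpr.Field(name="ids", shape=[None], dtype=tf.int64, default=0)
```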

Vocab

Simple helpers for vocabularies

read(path)

Read vocabulary from file.

size(path)

Return vocabulary size from mapping file.

write(path, vocab)

Write vocabulary to file.
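
A sketch of the round trip, assuming one vocabulary entry per line and that read returns the entries as a list:

```python
import deepr as dpr

dpr.vocab.write("vocab.txt", ["a", "b", "c"])
assert dpr.vocab.size("vocab.txt") == 3
assert dpr.vocab.read("vocab.txt") == ["a", "b", "c"]
```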

Writer

A Writer makes it possible to write a dataset to disk; a sketch follows the list.

Writer()

Base class for writers.

TFRecordWriter(path[, chunk_size, ...])

TFRecords writer.
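
A sketch pairing TFRecordWriter with a small dataset; the write(dataset) method name is an assumption:

```python
import tensorflow as tf
import deepr as dpr

dataset = tf.data.Dataset.from_tensor_slices({"x": [0, 1, 2]})
writer = dpr.writers.TFRecordWriter(path="data.tfrecord")
writer.write(dataset)  # assumption: Writer exposes write(dataset)
```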