deepr.layers.Layer
- class deepr.layers.Layer(n_in=None, n_out=None, inputs=None, outputs=None, name=None)
Base class for composable layers in a deep learning network.
Heavily inspired by TRAX layers, adapted for TF1.X and tf.estimator.
Layers are the basic building block of models. A Layer is a function from one or more inputs to one or more outputs.
- The inputs of a Layer are tensors, packaged as follows:
  - n_in = 1: one tensor (NOT wrapped in a tuple)
  - n_in > 1: a tuple of tensors
- The outputs of a Layer are tensors, packaged as follows:
  - n_out = 1: one tensor (NOT wrapped in a tuple)
  - n_out > 1: a tuple of tensors
The basic usage of a Layer is to build graphs as intuitively as possible. For example:

>>> from deepr.layers import Dense
>>> input_tensor = tf.ones([32, 8])
>>> dense = Dense(16)
>>> output_tensor = dense(input_tensor)
>>> output_tensor
<tf.Tensor 'dense/BiasAdd:0' shape=(32, 16) dtype=float32>
Because some layers (like Dropout) might behave differently depending on the mode (TRAIN, EVAL, PREDICT), an optional mode argument can be provided at call time:

>>> from deepr.layers import Dropout
>>> tensor = tf.ones([32, 8])
>>> dropout = Dropout(0.5)
>>> dropped = dropout(tensor, tf.estimator.ModeKeys.TRAIN)
>>> not_dropped = dropout(tensor, tf.estimator.ModeKeys.EVAL)
Because in many cases a Layer needs to be applied to a dictionary (yielded by a tf.data.Dataset, for example), you can also do:

>>> tf.reset_default_graph()
>>> tensors = {"x": tf.ones([32, 8])}
>>> dense = Dense(16, inputs="x", outputs="y")
>>> tensors = dense(tensors)
>>> tensors
{'y': <tf.Tensor 'dense/BiasAdd:0' shape=(32, 16) dtype=float32>}
The inputs and outputs arguments are optional (they default to "t_0", "t_1", etc.), and their order must be consistent with the order of the tensors in the tuples.
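As a quick illustration of the default names (a sketch; the exact default key names are an assumption based on the description above, not verified against a specific deepr version):

>>> tf.reset_default_graph()
>>> dense = Dense(16)  # no inputs/outputs given: both default to "t_0"
>>> dense({"t_0": tf.ones([32, 8])})
{'t_0': <tf.Tensor ... shape=(32, 16) dtype=float32>}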
Authors of new layer subclasses typically override one of the two methods of the base Layer class:

    def forward(self, tensors, mode: str = None):
        # tensors is either a Tensor (n_in=1) or a tuple of Tensors

    def forward_as_dict(self, tensors: Dict, mode: str = None) -> Dict:
        # tensors is a dictionary whose keys contain self.inputs
Implementing either of these two methods gives the other for free, thanks to automatic tuple-to-dictionary conversion.
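To make this concrete, here is a minimal sketch of a subclass that only overrides forward (the Offset layer and its attribute are illustrative, not part of deepr):

    import tensorflow as tf
    from deepr.layers import Layer

    class Offset(Layer):
        """Illustrative layer adding a constant offset to its single input."""

        def __init__(self, offset, **kwargs):
            super().__init__(n_in=1, n_out=1, **kwargs)
            self.offset = offset  # plain Python state, no graph references

        def forward(self, tensors, mode: str = None):
            # n_in=1, so tensors is a single Tensor, not a tuple
            return tensors + self.offset

    offset = Offset(1.0, inputs="x", outputs="y")
    result = offset(tf.ones([4]))              # Tensor in, Tensor out
    result_dict = offset({"x": tf.ones([4])})  # dict in, {"y": ...} out

The dictionary call goes through forward_as_dict, which the base class derives from the forward implementation.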
The easiest way to define custom layers is to use the layer decorator (see documentation).
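As a sketch of the decorator approach (assuming the layer decorator accepts n_in and n_out and builds a Layer subclass from the decorated function; the Add layer and its key names are illustrative):

    import tensorflow as tf
    from deepr.layers import layer

    @layer(n_in=2, n_out=1)
    def Add(tensors):
        """Illustrative layer adding its two inputs."""
        x, y = tensors  # n_in=2, so tensors is a tuple of Tensors
        return x + y

    add = Add(inputs=("a", "b"), outputs="c")
    tensors = add({"a": tf.constant(1.0), "b": tf.constant(2.0)})  # {"c": ...}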
Note that layers using parameters (a Dense layer, for example) should not create variables at instantiation time, nor store variables or any other graph references as attributes.

>>> tf.reset_default_graph()
>>> dense = Dense(16)  # No parameters are created
>>> dense(tf.ones([32, 8]))  # Parameters are created in the current tf.Graph
<tf.Tensor 'dense/BiasAdd:0' shape=(32, 16) dtype=float32>
In other words, calling the layer should not change its state. This is effectively enforcing functional programming. The state of the layer is only used to parametrize its runtime. This makes it simpler to define graphs with the tf.estimator API.
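For instance, because a layer holds no graph references, it can be defined once and called inside a model_fn that tf.estimator rebuilds in fresh graphs. The following is a sketch with illustrative names, not a pattern prescribed by deepr:

    import tensorflow as tf
    from deepr.layers import Dense

    dense = Dense(1)  # holds no graph state, safe to define at module level

    def model_fn(features, labels, mode):
        # Variables are created in whichever tf.Graph the estimator is building
        predictions = dense(features["x"], mode)
        if mode == tf.estimator.ModeKeys.PREDICT:
            return tf.estimator.EstimatorSpec(mode, predictions=predictions)
        loss = tf.losses.mean_squared_error(labels, predictions)
        train_op = tf.train.GradientDescentOptimizer(0.1).minimize(
            loss, global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)

    estimator = tf.estimator.Estimator(model_fn)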
If you want to define a layer and use it twice (effectively reusing its variables), you need to be explicit and set the reuse=True argument at call time. Behind the scenes, it simply wraps the TF1.X variable management into a variable_scope().

>>> tf.reset_default_graph()
>>> dense = Dense(16)
>>> dense(tf.ones([32, 8]))
<tf.Tensor 'dense/BiasAdd:0' shape=(32, 16) dtype=float32>
>>> dense(tf.ones([32, 8]), reuse=True)
<tf.Tensor 'dense_1/BiasAdd:0' shape=(32, 16) dtype=float32>
While the two operations have different names, 'dense/BiasAdd:0' and 'dense_1/BiasAdd:0', they both share the same weights.
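One way to verify the sharing (the variable names below are illustrative assumptions; the actual names depend on how Dense creates its variables):

>>> [v.name for v in tf.global_variables()]  # a single set of weights
['dense/kernel:0', 'dense/bias:0']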
For good examples of how to implement parametrized layers, see deepr.Dense and embedding.Embedding.
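For reference, a parametrized layer might look like the following sketch (MyDense and its variable names are illustrative assumptions, not deepr's actual Dense implementation):

    import tensorflow as tf
    from deepr.layers import Layer

    class MyDense(Layer):
        """Illustrative Dense-like layer: variables are created at call time."""

        def __init__(self, units, **kwargs):
            super().__init__(n_in=1, n_out=1, **kwargs)
            self.units = units  # plain Python state only

        def forward(self, tensors, mode: str = None):
            # tf.get_variable creates the weights when the layer is called,
            # inside the variable_scope managed by the base class, so that
            # reuse=True at call time shares them across calls.
            weights = tf.get_variable(
                "weights", shape=[int(tensors.shape[-1]), self.units])
            biases = tf.get_variable(
                "biases", shape=[self.units], initializer=tf.zeros_initializer())
            return tf.matmul(tensors, weights) + biases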
Attributes

- inputs
  Names of the n_in input keys in a dictionary. Tuple if n_in > 1, else string.
- outputs
  Names of the n_out output keys in a dictionary. Tuple if n_out > 1, else string.
Methods
__init__([n_in, n_out, inputs, outputs, name])

forward(tensors[, mode])
    Forward method on one Tensor or a tuple of Tensors.

forward_as_dict(tensors[, mode])
    Forward method on a dictionary of Tensors.