# Precautions when using tf.keras.layers.TimeDistributed for a tf.keras custom layer

## Purpose of this article

I ran into an error when applying `tf.keras.layers.TimeDistributed` to a custom layer defined with tf.keras, so I will share the error and how to solve it.



## Versions
- Python: 3.6.9
- Tensorflow: 2.1.0

## Table of contents
- What is tf.keras.layers.TimeDistributed?
- What is a custom layer?
- The error when tf.keras.layers.TimeDistributed is applied to a custom layer
- Solution

## What is tf.keras.layers.TimeDistributed?
It is used when you want to apply a single layer repeatedly along the time dimension. ([Reference](https://www.tensorflow.org/api_docs/python/tf/keras/layers/TimeDistributed))

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import TimeDistributed, Dense

temporal_dim = 4
emb_dim = 16
inputs = Input((temporal_dim, 8))
outputs = TimeDistributed(Dense(emb_dim))(inputs)
model = Model(inputs, outputs)
model.summary()
"""
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         [(None, 4, 8)]            0         
_________________________________________________________________
time_distributed (TimeDistri (None, 4, 16)             144       
=================================================================
Total params: 144
Trainable params: 144
Non-trainable params: 0
_________________________________________________________________
"""

Note) The result above does not depend on whether `tf.keras.layers.TimeDistributed` is used, because `tf.keras.layers.Dense` only acts on the last dimension. Where it really helps is with layers whose input shape is restricted, such as `tf.keras.layers.Embedding`.
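As a quick check of that note (same setup as above), applying `Dense` directly to the 3D input gives the same output shape and parameter count:

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Dense

temporal_dim = 4
emb_dim = 16
inputs = Input((temporal_dim, 8))

# Dense acts only on the last axis, so no TimeDistributed is needed here
outputs = Dense(emb_dim)(inputs)
model = Model(inputs, outputs)
model.summary()  # Output Shape: (None, 4, 16), Param #: 144 -- same as above
```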

`tf.keras.layers.Embedding` only accepts a 2D tensor of shape (batch_size, sequence_length), so if you want to embed a 3D tensor you need `tf.keras.layers.TimeDistributed` to iterate over the extra dimension. (This pattern is used in recommendation models, for example.)
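A minimal sketch of that pattern (the vocabulary size and inner sequence length below are made-up values):

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import TimeDistributed, Embedding

temporal_dim = 4
seq_len = 8       # hypothetical inner sequence length
vocab_size = 100  # hypothetical vocabulary size
emb_dim = 16

# 3D integer input: (batch_size, temporal_dim, seq_len)
inputs = Input((temporal_dim, seq_len), dtype="int32")
# Embedding alone expects (batch_size, seq_len); TimeDistributed repeats it over temporal_dim
outputs = TimeDistributed(Embedding(vocab_size, emb_dim))(inputs)
model = Model(inputs, outputs)
print(outputs.shape)  # (None, 4, 8, 16)
```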


## What is a custom layer

**In tf.keras, you can define a custom layer by inheriting from `tf.keras.layers.Layer`.**

tf.keras already implements many useful layers, and you combine them to build a tf.keras model. A large number of models can be implemented with the standard layers alone.



If you want more flexible processing, you can implement it as a custom layer without losing the `tf.keras` feel. ([Reference](https://www.tensorflow.org/tutorials/customization/custom_layers))

### Example 1: Combining standard `tf.keras.layers`
In this case, the `tf.keras.layers` instances are created in `.__init__()`, and the actual processing is defined in `.call()`.

(The following layer is equivalent to `tf.keras.layers.Dense`.)

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense

class Linear(tf.keras.layers.Layer):
    def __init__(self, emb_dim, *args, **kwargs):
        super(Linear, self).__init__(*args, **kwargs)
        self.dense = Dense(emb_dim)

    def call(self, x):
        return self.dense(x)
```

### Example 2: Defining a learnable variable

As a way to use the lower level API, you can write:

```python
class Linear(tf.keras.layers.Layer):
    def __init__(self, emb_dim):
        super(Linear, self).__init__()
        self.emb_dim = emb_dim

    def build(self, input_shape):
        self.kernel = self.add_weight(
            "kernel", shape=[int(input_shape[-1]), self.emb_dim]
        )

    def call(self, x):
        return tf.matmul(x, self.kernel)
```
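A quick check that this layer works when called directly (a sketch; the shapes are arbitrary):

```python
layer = Linear(16)
out = layer(tf.zeros((2, 8)))  # build() runs on the first call and creates an (8, 16) kernel
print(out.shape)  # (2, 16)
```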

## Error: tf.keras.layers.TimeDistributed does not work on a custom layer

Apply `tf.keras.layers.TimeDistributed` to the custom layer as shown below.

```python
from tensorflow.keras import Input
from tensorflow.keras.layers import TimeDistributed

temporal_dim = 4
emb_dim = 16
inputs = Input((temporal_dim, 8))

outputs = TimeDistributed(Linear(emb_dim))(inputs)
assert outputs.shape.rank == 3
assert outputs.shape[1] == temporal_dim
assert outputs.shape[2] == emb_dim
```

Then the following error is returned (excerpt):

```
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/engine/base_layer.py:773: in __call__
    outputs = call_fn(cast_inputs, *args, **kwargs)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/layers/wrappers.py:270: in call
    output_shape = self.compute_output_shape(input_shape).as_list()
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/layers/wrappers.py:212: in compute_output_shape
    child_output_shape = self.layer.compute_output_shape(child_input_shape)

>     raise NotImplementedError
E     NotImplementedError

/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/keras/engine/base_layer.py:564: NotImplementedError
```

This message says that `.compute_output_shape()` is missing from the custom layer, so you have to implement it yourself.

In tf.keras.layers, the input_shape and output_shape are not determined until `.build()` or `.call()` is called. However, `tf.keras.layers.TimeDistributed` wants to know the output_shape of the wrapped layer when it is called, that is, before the wrapped layer's `.build()` or `.call()` has run. For this purpose, `tf.keras.layers` provides a method called `.compute_output_shape()`. As far as I can tell from the [code](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/engine/base_layer.py/#L530), the default implementation tries to build the layer, but for a custom layer `NotImplementedError` is raised without a successful build.
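For example (a quick check; the shape is arbitrary), the standard `Dense` layer implements this method and can report its output shape without being built:

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense

dense = Dense(16)
# Dense overrides compute_output_shape, so the shape is known before build()/call()
print(dense.compute_output_shape(tf.TensorShape([None, 8])))  # (None, 16)
```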

## Solution

Override `.compute_output_shape()` when defining the custom layer so that the output_shape is explicitly defined.

```python
class Linear(tf.keras.layers.Layer):
    def __init__(self, emb_dim, *args, **kwargs):
        super(Linear, self).__init__(*args, **kwargs)
        self.emb_dim = emb_dim
        self.dense = Dense(self.emb_dim)

    def call(self, x):
        return self.dense(x)

    def compute_output_shape(self, input_shape):
        # Replace the last dimension with emb_dim, keeping the rest of the shape
        output_shape = input_shape[0:-1] + [self.emb_dim]
        return output_shape
```

With this custom layer, `tf.keras.layers.TimeDistributed` now works as expected.
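As a quick check, re-running the earlier snippet with the fixed layer builds without errors (same setup as above):

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import TimeDistributed

temporal_dim = 4
emb_dim = 16
inputs = Input((temporal_dim, 8))

outputs = TimeDistributed(Linear(emb_dim))(inputs)  # no NotImplementedError this time
model = Model(inputs, outputs)
model.summary()  # time_distributed output shape: (None, 4, 16)
```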
