Hello, I am getting the following error from the `TimeDistributed()` layer while running the code in Colab. Please help:
```
WARNING:tensorflow:The following Variables were used in a Lambda layer's call (tf.nn.bias_add), but are not present in its tracked objects: <tf.Variable 'att_layer_2/b:0' shape=(100,) dtype=float32>. This is a strong indication that the Lambda layer should be rewritten as a subclassed Layer.

NotImplementedError                       Traceback (most recent call last)
<ipython-input> in <module>
      6
      7 review_input = Input(shape=(MAX_SENTS, MAX_SENT_LENGTH), dtype='int32')
----> 8 review_encoder = TimeDistributed(sentEncoder)(review_input)
      9 l_lstm_sent = Bidirectional(GRU(100, return_sequences=True))(review_encoder)
     10 l_att_sent = AttLayer(100)(l_lstm_sent)

1 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py in compute_output_shape(self, input_shape)
    840       raise NotImplementedError(
    841           'Please run in eager mode or implement the `compute_output_shape` '
--> 842           'method on your layer (%s).' % self.__class__.__name__)
    843
    844   @doc_controls.for_subclass_implementers

NotImplementedError: Exception encountered when calling layer "time_distributed" (type TimeDistributed).

Please run in eager mode or implement the `compute_output_shape` method on your layer (TFOpLambda).

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 15, 100), dtype=int32)
  • training=None
  • mask=None
```
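The warning and the error point at the same fix: the attention layer is currently built from a `Lambda`/raw-op construction, so Keras cannot track its weights or infer its output shape when it is wrapped (via `sentEncoder`) in `TimeDistributed`. Rewriting it as a proper `Layer` subclass that owns its weights and implements `compute_output_shape` should resolve both messages. Below is a minimal sketch of such a rewrite; since the original `AttLayer` code is not shown in the issue, the HAN-style attention math here is an assumption, and `attention_dim` is a placeholder name:

```python
import tensorflow as tf
from tensorflow import keras


class AttLayer(keras.layers.Layer):
    """Hypothetical rewrite of the Lambda-based attention as a subclassed Layer."""

    def __init__(self, attention_dim, **kwargs):
        super().__init__(**kwargs)
        self.attention_dim = attention_dim

    def build(self, input_shape):
        # input_shape: (batch, steps, features)
        self.W = self.add_weight(name="W",
                                 shape=(int(input_shape[-1]), self.attention_dim),
                                 initializer="glorot_uniform", trainable=True)
        self.b = self.add_weight(name="b", shape=(self.attention_dim,),
                                 initializer="zeros", trainable=True)
        self.u = self.add_weight(name="u", shape=(self.attention_dim, 1),
                                 initializer="glorot_uniform", trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        # Score each timestep, softmax over steps, return weighted sum of features.
        uit = tf.tanh(tf.tensordot(inputs, self.W, axes=1) + self.b)   # (B, T, A)
        ait = tf.nn.softmax(
            tf.squeeze(tf.tensordot(uit, self.u, axes=1), -1), axis=1)  # (B, T)
        return tf.reduce_sum(inputs * tf.expand_dims(ait, -1), axis=1)  # (B, F)

    def compute_output_shape(self, input_shape):
        # (batch, steps, features) -> (batch, features); this is the method
        # the NotImplementedError asks for.
        return (input_shape[0], input_shape[-1])
```

With the attention implemented as a tracked `Layer` subclass like this inside `sentEncoder`, `TimeDistributed(sentEncoder)` should no longer fall into the `TFOpLambda` shape-inference path that raises the `NotImplementedError`.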