Not able to train HAN because of the following error. #32
Comments
I got the same error.
I found that changing `compute_mask` in `AttLayer` (the method with signature `def compute_mask(self, inputs, mask=None):`) fixes it. It means the layer should not support masking.
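A minimal NumPy sketch of what that fix amounts to, using a hypothetical `AttLayerSketch` class (not the repo's exact Keras layer): `compute_mask` always returns `None`, so the mask coming from the embedding layer is not propagated past the attention layer, and the attention weights are expanded with a trailing axis before the elementwise multiply.

```python
import numpy as np

class AttLayerSketch:
    """Hypothetical stand-in for the repo's AttLayer, in plain NumPy."""

    def __init__(self, attention_dim, input_dim, seed=0):
        rng = np.random.default_rng(seed)
        # parameters of the attention MLP (randomly initialised here)
        self.W = rng.normal(size=(input_dim, attention_dim))
        self.b = np.zeros(attention_dim)
        self.u = rng.normal(size=(attention_dim,))

    def compute_mask(self, inputs, mask=None):
        # The fix discussed above: do not support masking, return None.
        return None

    def call(self, x):
        # x: (batch, timesteps, input_dim)
        uit = np.tanh(x @ self.W + self.b)            # (batch, timesteps, att_dim)
        ait = uit @ self.u                            # (batch, timesteps)
        ait = np.exp(ait - ait.max(axis=1, keepdims=True))
        ait /= ait.sum(axis=1, keepdims=True)         # softmax over timesteps
        # expand weights to (batch, timesteps, 1) so the multiply broadcasts
        return (x * ait[:, :, None]).sum(axis=1)      # (batch, input_dim)

x = np.zeros((2, 15, 200))   # e.g. 2 documents, 15 sentences, 200-dim Bi-GRU output
layer = AttLayerSketch(attention_dim=100, input_dim=200)
out = layer.call(x)
print(out.shape)                  # (2, 200)
print(layer.compute_mask(x))      # None
```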
Solved the issue. |
But have you noticed that the first `l_att = AttLayer(100)(l_lstm)` raises no error, while the second, `l_att_sent = AttLayer(100)(l_lstm_sent)`, fails when computing the mask?
Well, I'm still experimenting with it. Earlier I was facing the same issue, and now it is working fine.
I have the same problem and changing the code as @cjopengler says works. |
Solved the issue. |
It is because of the `TimeDistributed` wrapper:
# Assumes Keras with a TensorFlow backend; AttLayer is the repo's custom
# attention layer, and embedding_layer / MAX_SENTS / MAX_SENT_LENGTH are
# defined earlier in the script.
from keras.layers import Input, Dense, GRU, Bidirectional, TimeDistributed
from keras.models import Model

# sentence-level encoder: word indices -> sentence vector
sentence_input = Input(shape=(MAX_SENT_LENGTH,), dtype='int32')
embedded_sequences = embedding_layer(sentence_input)
l_lstm = Bidirectional(GRU(100, return_sequences=True))(embedded_sequences)
l_att = AttLayer(100)(l_lstm)
sentEncoder = Model(sentence_input, l_att)

# document-level encoder: sentence vectors -> document vector -> prediction
review_input = Input(shape=(MAX_SENTS, MAX_SENT_LENGTH), dtype='int32')
review_encoder = TimeDistributed(sentEncoder)(review_input)
l_lstm_sent = Bidirectional(GRU(100, return_sequences=True))(review_encoder)
l_att_sent = AttLayer(100)(l_lstm_sent)
preds = Dense(2, activation='softmax')(l_att_sent)
model = Model(review_input, preds)
model.compile(loss='categorical_crossentropy',
              optimizer='rmsprop',
              metrics=['acc'])
print("model fitting - Hierarchical attention network")
Error
ValueError: Dimensions must be equal, but are 15 and 100 for 'att_layer_10/mul' (op: 'Mul') with input shapes: [?,15], [?,15,100].
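The error can be reproduced as a plain NumPy broadcasting failure (a sketch of the shape mismatch, not the repo's actual TensorFlow graph): the attention weights come out as `(batch, 15)` while the Bi-GRU output is `(batch, 15, 100)`, and an elementwise multiply cannot broadcast 15 against 100 until a trailing axis is added to the weights.

```python
import numpy as np

a = np.ones((2, 15))        # attention weights: (batch, timesteps)
x = np.ones((2, 15, 100))   # Bi-GRU output: (batch, timesteps, features)

failed = False
try:
    _ = a * x               # trailing dims 15 vs 100 -> ValueError, as in the TF error
except ValueError:
    failed = True

# adding a trailing axis aligns the shapes for broadcasting
weighted = a[:, :, None] * x    # (2, 15, 100)
print(failed, weighted.shape)
```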