How Could I Get Both the Final Hidden State and the Sequence in an LSTM Layer When Using a Bidirectional Wrapper
I have followed the steps in https://machinelearningmastery.com/return-sequences-and-return-states-for-lstms-in-keras/, but when it comes to the Bidirectional LSTM, I tried this:
lstm, state_h, state_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
Solution 1:
The call Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
returns 5 tensors:
- The entire sequence of hidden states; by default this is the concatenation of the forward and backward sequences.
- The last hidden state h for the forward LSTM
- The last cell state c for the forward LSTM
- The last hidden state h for the backward LSTM
- The last cell state c for the backward LSTM
The shape check after this list illustrates each of these.
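As a quick sanity check, here is a minimal sketch, assuming TensorFlow's Keras and a hypothetical input of shape (10 timesteps, 8 features), that prints the shape of each of the five returned tensors:
from tensorflow.keras.layers import Input, LSTM, Bidirectional

# Hypothetical input: batches of 10 timesteps with 8 features each
inputs = Input(shape=(10, 8))
outputs = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(inputs)

for t in outputs:
    print(t.shape)
# (None, 10, 256)  <- full sequence, forward and backward concatenated
# (None, 128)      <- forward last hidden state h
# (None, 128)      <- forward last cell state c
# (None, 128)      <- backward last hidden state h
# (None, 128)      <- backward last cell state c
Note that only the full sequence is concatenated by default (hence the 256 in its last dimension); the four final states are returned separately at 128 each.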
The line you've posted would raise an error, since it tries to unpack the returned value into just three variables (lstm, state_h, state_c).
To correct it, simply unpack the returned value into 5 variables. If you want to merge the forward and backward states, you can concatenate them with Concatenate layers:
from tensorflow.keras.layers import Input, LSTM, Bidirectional, Concatenate

input = Input(shape=(10, 8))  # hypothetical (timesteps, features) shape
lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
state_h = Concatenate()([forward_h, backward_h])  # shape (None, 256)
state_c = Concatenate()([forward_c, backward_c])  # shape (None, 256)
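A common follow-up (not part of the original question) is to use these merged states to initialize a decoder LSTM in an encoder-decoder model. Here is a minimal sketch, assuming hypothetical input shapes; note that the decoder must have 256 units so its state size matches the concatenated 128+128 states:
from tensorflow.keras.layers import Input, LSTM, Bidirectional, Concatenate
from tensorflow.keras.models import Model

encoder_inputs = Input(shape=(10, 8))  # hypothetical (timesteps, features)
lstm, fh, fc, bh, bc = Bidirectional(
    LSTM(128, return_sequences=True, return_state=True))(encoder_inputs)
state_h = Concatenate()([fh, bh])  # shape (None, 256)
state_c = Concatenate()([fc, bc])  # shape (None, 256)

decoder_inputs = Input(shape=(None, 8))
# The decoder needs 256 units so its states line up with the merged encoder states
decoder_out = LSTM(256, return_sequences=True)(decoder_inputs, initial_state=[state_h, state_c])
model = Model([encoder_inputs, decoder_inputs], decoder_out)
model.summary()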