4 changes: 2 additions & 2 deletions chapter_attention-mechanisms/bahdanau-attention.md
@@ -97,7 +97,7 @@ class AttentionDecoder(d2l.Decoder):
First, initializing the decoder's state requires the following inputs:

1. The final-layer hidden states of the encoder at all time steps, which serve as the keys and values for attention;
-1. The full-layer hidden state of the encoder at the previous time step, which serves as the initial hidden state of the decoder;
+1. The full-layer hidden state of the encoder at the final time step, which serves as the initial hidden state of the decoder;
1. The valid lengths of the encoder (to exclude padding tokens from attention pooling).

At each decoding time step, the final-layer hidden state of the decoder at the previous time step is used as the query.
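
For reference, the numbered list in the hunk above maps onto the decoder's `init_state` method. Below is a minimal PyTorch-style sketch: it mirrors the book's implementation in spirit (`d2l.AttentionDecoder` is the base class saved by this chapter), but the body here is illustrative rather than a copy of the file under review.

```python
from d2l import torch as d2l

class Seq2SeqAttentionDecoder(d2l.AttentionDecoder):
    def init_state(self, enc_outputs, enc_valid_lens, *args):
        # outputs: encoder final-layer hidden states at *all* time steps,
        #   shape (num_steps, batch_size, num_hiddens) -> attention keys/values
        # hidden_state: encoder full-layer hidden state at the *final* time step,
        #   shape (num_layers, batch_size, num_hiddens) -> initial decoder state
        outputs, hidden_state = enc_outputs
        # Put the batch dimension first for attention pooling; enc_valid_lens
        # lets the attention layer mask out padding tokens.
        return (outputs.permute(1, 0, 2), hidden_state, enc_valid_lens)
```
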
@@ -458,4 +458,4 @@ d2l.show_heatmaps(attention_weights[:, :, :, :len(engs[-1].split()) + 1],

:begin_tab:`paddle`
[Discussions](https://discuss.d2l.ai/t/11842)
-:end_tab:
+:end_tab: