
pack_padded_sequence bug fix #5

Open

landoskape wants to merge 4 commits into ast0414:master from landoskape:suggestions

Conversation

@landoskape

In my version of PyTorch (1.12.1), torch.nn.utils.rnn.pack_padded_sequence requires the lengths input to be on the CPU. This is a quick fix to the problem, as suggested by this discussion: pytorch/pytorch#43227
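A minimal sketch of the fix described above (the tensor names and shapes are illustrative, not taken from the PR): moving `lengths` with `.cpu()` satisfies the CPU requirement and is a no-op when the tensor is already on the CPU.

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence

# Illustrative padded batch: 2 sequences, max length 5, 3 features.
padded = torch.zeros(2, 5, 3)
lengths = torch.tensor([5, 3])  # true length of each sequence

# Recent PyTorch versions require `lengths` to live on the CPU even
# when `padded` is on the GPU; .cpu() is a no-op if it already does.
packed = pack_padded_sequence(
    padded, lengths.cpu(), batch_first=True, enforce_sorted=False
)
```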
The byte() call is deprecated as the mask for "masked_fill"; the code now calls bool() instead.
It didn't do this before.
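The deprecation fix above can be sketched as follows (a hypothetical attention-masking snippet, not code from this repository): `masked_fill` warns when given a uint8 mask, so the mask is converted with `.bool()` rather than `.byte()`.

```python
import torch

scores = torch.zeros(2, 4)                  # e.g. attention scores
pad = torch.tensor([[0, 0, 1, 1],           # 1 marks padding positions
                    [0, 1, 1, 1]])

# Old (deprecated): scores.masked_fill(pad.byte(), float('-inf'))
masked = scores.masked_fill(pad.bool(), float('-inf'))
```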
