Error when decoder has more than 1 layer. #312

@pajola

Description

The output is the following:
RuntimeError: Input batch size 128 doesn't match hidden[0] batch size 256

The issue is caused by the `initial_state=lstm_states` argument passed when the decoder is forwarded.
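The reported numbers (input batch 128 vs. hidden batch 256) are consistent with the encoder's final LSTM states carrying a leading `num_layers` dimension that gets folded into the batch dimension before being handed to the decoder. A minimal PyTorch sketch that reproduces a mismatch of this kind (the shapes and module names here are illustrative, not the repository's actual code):

```python
import torch
import torch.nn as nn

batch, seq, feat, hid = 128, 10, 32, 64

# Encoder with 2 stacked layers: h_n / c_n have shape (num_layers, batch, hid)
encoder = nn.LSTM(feat, hid, num_layers=2, batch_first=True)
x = torch.randn(batch, seq, feat)
_, (h_n, c_n) = encoder(x)
print(h_n.shape)  # torch.Size([2, 128, 64])

# A decoder step built around nn.LSTMCell expects states of shape (batch, hid).
# Collapsing the layer dimension into the batch dimension yields 2 * 128 = 256
# rows, so the hidden state's batch no longer matches the input's batch of 128.
cell = nn.LSTMCell(feat, hid)
bad_state = (h_n.reshape(-1, hid), c_n.reshape(-1, hid))  # shape (256, 64)
step = torch.randn(batch, feat)
try:
    cell(step, bad_state)
except RuntimeError as e:
    # Some PyTorch versions report this as:
    # "Input batch size 128 doesn't match hidden[0] batch size 256"
    print(e)
```

With a single decoder layer the two batch sizes coincide, which would explain why the example only breaks once the decoder has more than one layer.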
