
Fix WanSelfAttention forward docstring #576

Open

arifahmad-py wants to merge 1 commit into Wan-Video:main from arifahmad-py:fix-wanselfattention-docstring

Conversation

@arifahmad-py

  • The `WanSelfAttention.forward` docstring currently states that `x` has shape `[B, L, num_heads, head_dim]`, but the implementation (and its call sites) pass `x` as `[B, L, C]` and split it into heads only after the linear projections (`qkv_fn`).
  • This PR updates the docstring to reflect the actual shape.
  • No functional changes.
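The shape flow described above can be sketched as follows. This is a minimal illustration with numpy, not the actual Wan implementation; the function name, weight `w_q`, and the dimensions used are assumptions chosen only to show where the head split happens relative to the projection.

```python
import numpy as np

def project_and_split_heads(x, w_q, num_heads):
    """Sketch: x enters as [B, L, C]; heads are split only AFTER
    the linear projection (analogous to qkv_fn in the PR)."""
    B, L, C = x.shape
    head_dim = C // num_heads
    q = x @ w_q                               # projection keeps shape [B, L, C]
    q = q.reshape(B, L, num_heads, head_dim)  # head split happens here
    return q

x = np.zeros((2, 16, 64))       # B=2, L=16, C=64 (hypothetical sizes)
w_q = np.zeros((64, 64))
q = project_and_split_heads(x, w_q, num_heads=8)
print(q.shape)                  # (2, 16, 8, 8)
```

The point of the fix: the docstring claimed the per-head layout `[B, L, num_heads, head_dim]` for the input, when that layout only exists after the reshape inside `forward`.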

