
[BUG] Version 2.6 is bound to flash_attn by default with no way to disable it, and no matching flash_attn version or installation example is provided. #429

Closed
@kaixindelele

Description


Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in the FAQ?

  • I have searched the FAQ

Current Behavior

ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`

Running the 2.6 demo raises this error.

Expected Behavior

Please provide either a way to disable the flash_attn dependency, or an installation tutorial that specifies the required flash_attn version. A sketch of the kind of switch I have in mind follows.
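
For reference, this is a minimal sketch of the hoped-for behavior, assuming the remote modeling code could honor the standard transformers `attn_implementation` argument (today it still hard-requires flash_attn at import time); the repo id is my assumption for the 2.6 checkpoint:

```python
# Sketch of the hoped-for behavior, assuming the remote code respected the
# standard `attn_implementation` argument instead of hard-requiring flash_attn.
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "openbmb/MiniCPM-V-2_6",       # assumed repo id for the 2.6 demo
    trust_remote_code=True,
    attn_implementation="sdpa",    # PyTorch SDPA backend, no flash_attn needed
    torch_dtype=torch.bfloat16,
)
```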

Steps To Reproduce

Run the 2.6 demo example on Linux (a minimal sketch follows).
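
A minimal reproduction sketch, assumed to mirror the official 2.6 demo (the "openbmb/MiniCPM-V-2_6" repo id is my assumption for the 2.6 checkpoint):

```python
# Minimal reproduction sketch on Linux.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-V-2_6"

# This call fails in the remote-code import check:
# ImportError: This modeling file requires the following packages that were
# not found in your environment: flash_attn. Run `pip install flash_attn`
model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
```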

Environment

Python: 3.10
Transformers: 4.40.0
PyTorch: 2.4.0+cu121
CUDA: 12.2

Anything else?

No response
