
Environment Setup - Issues with Flash-Attn #84

@FeCardoza

Description


Setting up this environment is far more complicated than the README explains.

I'm using Windows 11; perhaps I should try this on my Linux distro instead.

requirements.txt specifies torch>=2.4.0, but flash_attn is only compatible with torch up to 2.5 and Python up to 3.12, while the latest releases are torch 2.8 and Python 3.13. On top of that, no torch release is compatible with the latest CUDA release (13.0), so we have to install old versions of everything for this to work.
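For anyone else hitting this, here's a quick sanity check I ran before attempting the flash_attn install, just to confirm which torch/CUDA/Python combination is actually present (a minimal sketch; the specific versions shown in the comments are my assumptions about what the prebuilt wheels target, not official requirements):

```python
import sys
import torch

# Print the versions that determine which flash_attn wheel (if any) will work.
print(f"Python : {sys.version.split()[0]}")      # e.g. 3.12.x (3.13 may not be supported)
print(f"torch  : {torch.__version__}")           # e.g. 2.5.1+cu124
print(f"CUDA   : {torch.version.cuda}")          # CUDA version torch was built against
print(f"GPU ok : {torch.cuda.is_available()}")   # flash_attn needs a working CUDA GPU
```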

Edit: Installing from these wheels worked: Dao-AILab/flash-attention#1469

Can the devs update the installation procedure, or am I the only one having this much trouble?
