Setting up this environment is far more complicated than the README explains. I'm using Windows 11; perhaps I should try this on my Linux distro instead.
`requirements.txt` specifies `torch>=2.4.0`, but `flash_attn` is only compatible up to torch 2.5 and Python 3.12, while the latest releases are torch 2.8 and Python 3.13, and no version of torch is compatible with the latest release of CUDA, which is 13.0. So we have to install old versions of everything to get this to work.
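For reference, here is a minimal sketch of the pre-install sanity check I ended up doing by hand. The version ceilings (torch <= 2.5, Python <= 3.12) are the ones described above from my own attempts, not an official compatibility matrix, so treat them as assumptions:

```python
import sys
import torch

# Assumed ceilings based on the constraints described in this issue,
# not on any official flash_attn compatibility matrix.
TORCH_MAX = (2, 5)
PYTHON_MAX = (3, 12)

# Parse "2.4.0+cu121" -> (2, 4); sys.version_info[:2] -> e.g. (3, 12)
torch_version = tuple(int(x) for x in torch.__version__.split("+")[0].split(".")[:2])
python_version = sys.version_info[:2]

print(f"torch {torch.__version__} (CUDA {torch.version.cuda}), Python {sys.version.split()[0]}")

if torch_version > TORCH_MAX:
    print(f"warning: torch {torch_version} is newer than {TORCH_MAX}; flash_attn may not build/install")
if python_version > PYTHON_MAX:
    print(f"warning: Python {python_version} is newer than {PYTHON_MAX}; flash_attn may not build/install")
```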
Edit: Installing from these wheels worked: Dao-AILab/flash-attention#1469
Can the devs update the installation procedure, or am I the only one having this much trouble?