python-version / requirements.txt
winglian: Update requirements.txt (commit edc06dc, verified)
# Pinned framework versions
torch==2.10.0
transformers==5.8.0
# Triton kernels for linear-attention model architectures
flash-linear-attention
# Prebuilt causal-conv1d wheel: CUDA 12, torch 2.10, CPython 3.12, linux x86_64 only
https://github.com/Dao-AILab/causal-conv1d/releases/download/v1.6.1.post4/causal_conv1d-1.6.1+cu12torch2.10cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
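The direct wheel URL above is only installable in an environment matching its filename tags. As a quick sketch of how those tags decompose (the filename is copied from the URL; the five-field layout follows the standard wheel naming convention, PEP 427):

```python
# Split the wheel filename into its PEP 427 components:
# {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl
wheel = "causal_conv1d-1.6.1+cu12torch2.10cxx11abiTRUE-cp312-cp312-linux_x86_64.whl"
name, version, python_tag, abi_tag, platform_tag = wheel[: -len(".whl")].split("-")

print(name)          # distribution: causal_conv1d
print(version)       # 1.6.1 with a local build tag encoding CUDA 12 / torch 2.10
print(python_tag)    # cp312 -> CPython 3.12 interpreters only
print(platform_tag)  # linux_x86_64 -> 64-bit Linux only
```

If the local interpreter or platform differs (e.g. Python 3.11, macOS, or a different CUDA/torch build), pip will refuse this wheel, and causal-conv1d would need to be installed from source instead.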