torch==2.10.0
transformers==5.8.0
flash-linear-attention
# Prebuilt causal-conv1d wheel (CUDA 12, torch 2.10, CPython 3.12, linux x86_64)
https://github.com/Dao-AILab/causal-conv1d/releases/download/v1.6.1.post4/causal_conv1d-1.6.1+cu12torch2.10cxx11abiTRUE-cp312-cp312-linux_x86_64.whl