yuccaaa committed
Commit b609f54 · verified · 1 Parent(s): 2991e6f

Upload flash-attention/flashinfer_python-0.2.2.post1+cu124torch2.6-cp38-abi3-linux_x86_64.whl with huggingface_hub

.gitattributes CHANGED
@@ -134,3 +134,4 @@ ProtT3/all_checkpoints/stage2_07301646_2datasets_construct/wandb/run-20250730_17
  ProtT3/results/2datasets_construct_predictions.txt filter=lfs diff=lfs merge=lfs -text
  flash-attention/flash_attn-2.6.0.post1+cu122torch2.4cxx11abiFALSE-cp38-cp38-linux_x86_64.whl filter=lfs diff=lfs merge=lfs -text
  flash-attention/flash_attn-2.7.1.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl filter=lfs diff=lfs merge=lfs -text
+ flash-attention/flashinfer_python-0.2.2.post1+cu124torch2.6-cp38-abi3-linux_x86_64.whl filter=lfs diff=lfs merge=lfs -text
flash-attention/flashinfer_python-0.2.2.post1+cu124torch2.6-cp38-abi3-linux_x86_64.whl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7b5950853a0769809199f4f252eb271f63700e4f8a51e0da582f0f066b22cd7c
+ size 540118154
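
The file added in this commit is not the wheel itself but a Git LFS pointer: a small text file recording the object's `version`, a `sha256` object id, and its byte `size`, while the actual 540 MB wheel lives in LFS storage. A minimal sketch of parsing such a pointer and verifying a downloaded file against it (the function names here are illustrative, not part of any Git LFS or huggingface_hub API):

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def verify_lfs_object(path: str, pointer: dict, chunk_size: int = 1 << 20) -> bool:
    """Check a downloaded file against the size and sha256 oid from its pointer."""
    algo, _, expected = pointer["oid"].partition(":")
    if algo != "sha256":
        raise ValueError(f"unsupported oid algorithm: {algo}")
    digest = hashlib.sha256()
    size = 0
    with open(path, "rb") as f:
        # Hash in chunks so large artifacts (like this wheel) fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
            size += len(chunk)
    return size == int(pointer["size"]) and digest.hexdigest() == expected

# The exact pointer contents from this commit:
pointer = parse_lfs_pointer(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:7b5950853a0769809199f4f252eb271f63700e4f8a51e0da582f0f066b22cd7c\n"
    "size 540118154\n"
)
```

After downloading the wheel, `verify_lfs_object("flashinfer_python-0.2.2.post1+cu124torch2.6-cp38-abi3-linux_x86_64.whl", pointer)` would confirm both the size and the checksum match the pointer committed here.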