prithivMLmods committed
Commit e886955 · verified · 1 Parent(s): 73dc4fb

update app

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -23,8 +23,8 @@ library_name: transformers
 This repository provides pre-built wheels for `flash-attn` version **2.8.3** for various PyTorch versions, Python versions, and architectures (compiled with CUDA 12). You can install these directly using `pip install <url>` or add the provided strings directly to your `requirements.txt`.
 
 > [!IMPORTANT]
-> The detailed categories and structured view of the `strangertoolshf/flash_attention_2_wheelhouse` folders and files on the hf-tree shareable link are available here: [huggingface-tree](https://strangertoolshf-huggingface-tree.static.hf.space/index.html#models/strangertoolshf/flash_attention_2_wheelhouse/main)
-
+> The detailed categories and structured view of the `strangertoolshf/flash_attention_2_wheelhouse` folders and files on the hf-tree shareable link are available here: [huggingface-tree](https://strangertoolshf-huggingface-tree.hf.space/#models/strangertoolshf/flash_attention_2_wheelhouse/main)
+
 <div style="
 background: rgba(61, 122, 255, 0.15);
 padding: 16px;
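The README above describes pinning a pre-built wheel directly in `requirements.txt`. A minimal sketch of what such a pin looks like, using PEP 508 direct-reference syntax with a placeholder instead of a real URL (the actual wheel URLs are listed in the wheelhouse repository and depend on your PyTorch version, Python version, and architecture):

```
# requirements.txt — sketch only; <wheel-url> is a placeholder, not a real URL.
# Replace it with the wheel from the flash_attention_2_wheelhouse repository
# that matches your PyTorch version, Python version, and architecture.
flash-attn @ https://<wheel-url>
```

Equivalently, the same URL can be installed directly with `pip install <url>`, as the README notes.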