Update README.md
README.md CHANGED

```diff
@@ -34,7 +34,7 @@ device: NVIDIA H200 MIG 3g.71gb
 ```
 
 ```
-Currently supported up to `transformers==4.57.
+Currently supported up to `transformers==4.57.2`. Support for Transformers v5 will be added soon.
 ```
 
 This version allows flexible configuration of attention implementations (such as `flash_attention` or `sdpa`) for performance optimization or standardization. Users can also **opt out** of specific attention implementations if desired.
```
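The selection and opt-out behavior described in the final sentence could be sketched roughly as follows. This is a hypothetical stand-alone helper for illustration only: `SUPPORTED_IMPLS`, `choose_attn_impl`, and the fallback order are assumptions, not this project's actual API.

```python
# Hypothetical sketch of attention-implementation selection with opt-out.
# The names and the preference order below are illustrative assumptions,
# not this repository's real configuration interface.

SUPPORTED_IMPLS = ("flash_attention", "sdpa", "eager")


def choose_attn_impl(preferred=None, opt_out=()):
    """Pick an attention implementation, honoring a preference and opt-outs.

    If `preferred` is given and allowed, use it; otherwise fall back to the
    first supported implementation that has not been opted out.
    """
    candidates = [preferred] if preferred else list(SUPPORTED_IMPLS)
    for impl in candidates:
        if impl in SUPPORTED_IMPLS and impl not in opt_out:
            return impl
    # Preference unavailable or opted out: fall back to anything remaining.
    for impl in SUPPORTED_IMPLS:
        if impl not in opt_out:
            return impl
    raise ValueError("all attention implementations were opted out")
```

For example, `choose_attn_impl(opt_out=("flash_attention",))` would fall back to `sdpa`, matching the opt-out behavior the sentence describes.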