prithivMLmods committed
Commit e954b70 · verified · 1 Parent(s): c2aba6b

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -34,7 +34,7 @@ device: NVIDIA H200 MIG 3g.71gb
 ```
 
 ```
-Currently supported up to `transformers==4.57.1`. Support for Transformers v5 will be added soon.
+Currently supported up to `transformers==4.57.2`. Support for Transformers v5 will be added soon.
 ```
 
 This version allows flexible configuration of attention implementations—such as `flash_attention` or `sdpa`—for performance optimization or standardization. Users can also **opt out** of specific attention implementations if desired.
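As a minimal sketch of the selection logic the README describes: Transformers accepts an `attn_implementation` argument (with values such as `"flash_attention_2"`, `"sdpa"`, or `"eager"`) in `from_pretrained`. The helper below is hypothetical, not part of this repository; it just illustrates falling back to `"sdpa"` when the `flash_attn` package is not installed.

```python
import importlib.util

def pick_attn_implementation(prefer_flash: bool = True) -> str:
    """Choose an attention implementation name to pass to from_pretrained.

    Hypothetical helper: prefers "flash_attention_2" when the flash_attn
    package is importable, otherwise falls back to PyTorch's built-in
    scaled-dot-product attention ("sdpa").
    """
    if prefer_flash and importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "sdpa"
```

In use, the returned name would be passed as, e.g., `AutoModelForCausalLM.from_pretrained(model_id, attn_implementation=pick_attn_implementation())`; opting out of a specific implementation amounts to never returning its name here.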