joejiang (shanghaijiang)
AI & ML interests: None yet
The transformers version in config.json does not match the updated instructions in the README.
#1 opened 4 months ago by shanghaijiang
Only transformers==4.51.3 works; higher versions cause issues.
#2 opened 4 months ago by shanghaijiang
Missing Arxiv link?
5 · 1
#17 opened 4 months ago by RoversX
Flash attention issue with python 3.13
2
#7 opened 4 months ago by freakynit