flash_attn_2 not supported?

#17 opened by bdsqlsz

Loading fails at modeling_utils.py:2406 in _flash_attn_2_can_dispatch with:

ValueError: Step3VL10BForCausalLM does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/stepfun-ai/Step3-VL-10B/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
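
For context, a minimal sketch of the kind of loading call that hits this check, assuming the standard transformers from_pretrained API (the dtype and trust_remote_code flag here are illustrative, not taken from the report):

```python
import torch
from transformers import AutoModelForCausalLM

# Illustrative reproduction: requesting Flash Attention 2 at load time.
# _flash_attn_2_can_dispatch raises the ValueError above when the model
# class does not declare Flash Attention 2 support.
model = AutoModelForCausalLM.from_pretrained(
    "stepfun-ai/Step3-VL-10B",
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    trust_remote_code=True,
)
```

Until support is added, passing attn_implementation="sdpa" or attn_implementation="eager" instead should let the model load with a non-Flash attention backend.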
