After deploying locally, I keep encountering errors when running the examples. Is there any solution?
(.venv_step3) D:\qr-code>d:/qr-code/.venv_step3/Scripts/python.exe d:/qr-code/test6.py
The tokenizer you are loading from 'D:\huggingface\Step3-VL-10B-AWQ' with an incorrect regex pattern: https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503/discussions/84#69121093e8b480e709447d5e. This will lead to incorrect tokenization. You should set the fix_mistral_regex=True flag when loading this tokenizer to fix this issue.
Encountered exception while importing configuration_step_vl: No module named 'configuration_step_vl'
Encountered exception while importing vision_encoder: No module named 'vision_encoder'
Traceback (most recent call last):
File "d:\qr-code\test6.py", line 24, in <module>
model = AutoModelForCausalLM.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "d:\qr-code\.venv_step3\Lib\site-packages\transformers\models\auto\auto_factory.py", line 586, in from_pretrained
model_class = get_class_from_dynamic_module(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "d:\qr-code\.venv_step3\Lib\site-packages\transformers\dynamic_module_utils.py", line 604, in get_class_from_dynamic_module
final_module = get_cached_module_file(
^^^^^^^^^^^^^^^^^^^^^^^
File "d:\qr-code\.venv_step3\Lib\site-packages\transformers\dynamic_module_utils.py", line 427, in get_cached_module_file
modules_needed = check_imports(resolved_module_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "d:\qr-code\.venv_step3\Lib\site-packages\transformers\dynamic_module_utils.py", line 260, in check_imports
raise ImportError(
ImportError: This modeling file requires the following packages that were not found in your environment: configuration_step_vl, vision_encoder. Run pip install configuration_step_vl vision_encoder
Based on the logs, I believe the current version of Transformers does not yet support configuration_step_vl. Also note that all of our quantized models have been tested only on Linux systems with the vLLM inference engine; the Windows environment and the Transformers backend have not undergone comprehensive testing.
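For what it's worth, the `pip install configuration_step_vl vision_encoder` hint in the traceback is misleading: those names are not PyPI packages but sibling files of the model's remote code, which transformers' import check treats as missing third-party dependencies. A minimal sketch of that check follows; it is an assumption about the spirit of `check_imports`, not transformers' actual implementation:

```python
# Sketch (assumption): scan a remote-code modeling file for top-level
# imports and report any module Python cannot locate. This is roughly
# why transformers suggests "pip install configuration_step_vl".
import importlib.util
import re

def find_missing_imports(source: str) -> list[str]:
    # Collect module names from lines like "import x" / "from x import y".
    pattern = re.compile(r"^\s*(?:from|import)\s+([\w.]+)", re.MULTILINE)
    modules = {m.group(1).split(".")[0] for m in pattern.finditer(source)}
    # A module counts as missing if it cannot be found on sys.path.
    return sorted(m for m in modules if importlib.util.find_spec(m) is None)

# The Step3-VL modeling file imports its sibling files as if they were
# installable packages, so a fresh environment reports them as missing:
modeling_snippet = (
    "import json\n"
    "from configuration_step_vl import Step3VLConfig\n"
    "from vision_encoder import VisionEncoder\n"
)
print(find_missing_imports(modeling_snippet))
# → ['configuration_step_vl', 'vision_encoder']
```

This is why installing those "packages" with pip cannot work; the real fix has to come from the model's remote-code files being resolvable, not from pip.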