micas23/Agents
Runtime error


Exit code: 1. Reason: space`. The tokenizer needs to be converted from the slow tokenizers

Traceback (most recent call last):
  File "/app/app.py", line 12, in <module>
    pipeline = OVDiffusionPipeline.from_pretrained(model_id, device="CPU")
  File "/usr/local/lib/python3.13/site-packages/optimum/intel/openvino/modeling_base.py", line 625, in from_pretrained
    return super().from_pretrained(
        model_id,
        ...<9 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/optimum/modeling_base.py", line 407, in from_pretrained
    return from_pretrained_method(
        model_id=model_id,
        ...<9 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/optimum/intel/openvino/modeling_diffusion.py", line 485, in _from_pretrained
    submodels[name] = load_method(model_save_path / name)
  File "/usr/local/lib/python3.13/site-packages/transformers/tokenization_utils_base.py", line 2113, in from_pretrained
    return cls._from_pretrained(
        resolved_vocab_files,
        ...<9 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/tokenization_utils_base.py", line 2359, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.13/site-packages/transformers/models/t5/tokenization_t5_fast.py", line 119, in __init__
    super().__init__(
        vocab_file=vocab_file,
        ...<7 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/tokenization_utils_fast.py", line 108, in __init__
    raise ValueError(
        ...<2 lines>...
    )
ValueError: Cannot instantiate this tokenizer from a slow version. If it's based on sentencepiece, make sure you have sentencepiece installed.
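The final ValueError is raised by transformers while the pipeline loads its T5 tokenizer submodel: the fast tokenizer has to be converted from the slow, sentencepiece-based files, and the sentencepiece package is not installed in the container. A likely fix (a sketch; the Space's actual dependency file is not shown here) is to add the package to the environment and restart the Space:

```shell
# Add sentencepiece to the Space's requirements.txt, or install it directly.
# transformers needs it to convert the slow T5 tokenizer to a fast one.
pip install sentencepiece
```

If the error persists after installing, the model repository may simply lack a prebuilt `tokenizer.json` for the fast tokenizer; in that case the conversion step above is exactly what runs at load time.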

Container logs:
