runtime error

Exit code: 1. Reason:

…| 456k/456k [00:00<00:00, 73.0MB/s]
tokenizer.json: 100%|██████████| 3.57M/3.57M [00:00<00:00, 32.0MB/s]
added_tokens.json: 100%|██████████| 1.47k/1.47k [00:00<00:00, 10.2MB/s]
special_tokens_map.json: 100%|██████████| 957/957 [00:00<00:00, 6.17MB/s]

Traceback (most recent call last):
  File "/app/app.py", line 11, in <module>
    processor = TrOCRProcessor.from_pretrained(MODEL_NAME)
  File "/usr/local/lib/python3.10/site-packages/transformers/processing_utils.py", line 228, in from_pretrained
    args = cls._get_arguments_from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/processing_utils.py", line 272, in _get_arguments_from_pretrained
    args.append(attribute_class.from_pretrained(pretrained_model_name_or_path, **kwargs))
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 787, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/roberta/tokenization_roberta_fast.py", line 185, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 111, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: data did not match any variant of untagged enum ModelWrapper at line 251144 column 3
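The files all downloaded successfully, and the exception is raised inside the Rust-backed `tokenizers` library while parsing the downloaded tokenizer.json, so the failure is local deserialization rather than a network problem. An "untagged enum ModelWrapper" parse error of this kind commonly means the installed `tokenizers` package is too old to read a tokenizer.json serialized by a newer version; upgrading `transformers` and `tokenizers` together is the usual first step (an assumption from the traceback, not confirmed from this log). A minimal sketch to capture the versions in play before retrying:

```python
from importlib.metadata import version, PackageNotFoundError

def report_versions():
    # Collect the versions of the two packages involved in the failure.
    # A transformers/tokenizers pair that is out of sync with the model's
    # serialized tokenizer.json is a common cause of this parse error.
    info = {}
    for pkg in ("transformers", "tokenizers"):
        try:
            info[pkg] = version(pkg)
        except PackageNotFoundError:
            info[pkg] = "not installed"
    return info

if __name__ == "__main__":
    print(report_versions())
```

If the reported versions are old, rebuilding the container with both packages pinned to a recent matching release (e.g. via requirements.txt) and retrying `TrOCRProcessor.from_pretrained(MODEL_NAME)` is a reasonable next step.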
