Runtime error

Exit code: 1. Reason:

tokenizer_config.json: 100%|██████████| 9.93k/9.93k [00:00<00:00, 55.9MB/s]
vocab.json: 100%|██████████| 777k/777k [00:00<00:00, 86.6MB/s]
tokenizer.json: 100%|██████████| 3.48M/3.48M [00:00<00:00, 44.0MB/s]
added_tokens.json: 100%|██████████| 207/207 [00:00<00:00, 2.18MB/s]
special_tokens_map.json: 100%|██████████| 801/801 [00:00<00:00, 7.09MB/s]
config.json: 100%|██████████| 787/787 [00:00<00:00, 8.57MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 309, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4422, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1127, in _get_resolved_checkpoint_files
    raise EnvironmentError(
OSError: ibm-granite/granite-3.3-2b-instruct does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
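The tokenizer and config files download successfully, so the repo is reachable; the load only fails when `from_pretrained` tries to resolve a weight file. A quick way to narrow this down is to list the files the Hub actually serves for the repo and check which weight format is present. Below is a minimal diagnostic sketch, not the Space's actual code: `model_id` is taken from the traceback, and the guess that a stale `transformers`/`huggingface_hub` pin may fail to resolve sharded safetensors is an assumption on my part.

```python
# Diagnostic sketch (assumption: run this locally or in the Space to see
# what weight files the Hub repo actually contains).
from huggingface_hub import list_repo_files

model_id = "ibm-granite/granite-3.3-2b-instruct"  # from the traceback above

files = list_repo_files(model_id)
weight_files = [
    f for f in files
    if f.endswith((".safetensors", ".bin", ".h5", ".ckpt", ".msgpack"))
]

print("All repo files:", files)
print("Weight files found:", weight_files or "none")
```

If this shows weight files on the Hub (for example sharded safetensors plus a `model.safetensors.index.json`), the error likely comes from the environment rather than the repo itself: outdated `transformers`/`huggingface_hub` versions in `requirements.txt`, or a failed/partial cached download, are plausible causes worth checking.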
