runtime error

Exit code: 1.

tokenizer_config.json: 100%|██████████| 425/425 [00:00<00:00, 1.75MB/s]
tokenizer.json: 100%|██████████| 17.2M/17.2M [00:00<00:00, 26.0MB/s]
chat_template.jinja: 100%|██████████| 3.83k/3.83k [00:00<00:00, 13.9MB/s]
`torch_dtype` is deprecated! Use `dtype` instead!
Traceback (most recent call last):
  File "/app/app.py", line 9, in <module>
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 374, in from_pretrained
    return model_class.from_pretrained(
        pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/modeling_utils.py", line 4060, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
        pretrained_model_name_or_path=pretrained_model_name_or_path,
        ...<6 lines>...
        transformers_explicit_filename=getattr(config, "transformers_weights", None),
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/modeling_utils.py", line 688, in _get_resolved_checkpoint_files
    raise OSError(
        ...<2 lines>...
    )
OSError: devNaam/vakilai-llama32-3b-v1 does not appear to have a file named pytorch_model.bin or model.safetensors.
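This OSError means transformers found the repo on the Hub (the tokenizer and chat template downloaded fine) but could not find any full weight file in it. A common cause is that the repo hosts only PEFT/LoRA adapter weights (e.g. adapter_model.safetensors) from a fine-tune, which AutoModelForCausalLM.from_pretrained cannot load directly. As a sketch of the diagnosis, a hypothetical helper like the following could classify a repo's file listing (for example, one obtained with huggingface_hub's list_repo_files) before attempting to load; classify_checkpoint and its categories are illustrative, not a transformers API:

```python
def classify_checkpoint(files):
    """Classify a Hugging Face repo file listing.

    Returns "full" if full model weights (or a sharded-weights index) are
    present, "adapter" if only PEFT adapter weights exist, and "none"
    otherwise.
    """
    full_weight_names = {"pytorch_model.bin", "model.safetensors"}
    adapter_names = {"adapter_model.safetensors", "adapter_model.bin"}
    names = set(files)
    # Sharded checkpoints ship an index file such as
    # model.safetensors.index.json instead of a single weight file.
    has_shard_index = any(f.endswith(".index.json") for f in names)
    if names & full_weight_names or has_shard_index:
        return "full"
    if names & adapter_names:
        return "adapter"
    return "none"
```

If the repo turns out to be adapter-only, the usual fix is to load the base model (here presumably a Llama 3.2 3B checkpoint) with AutoModelForCausalLM.from_pretrained and then attach the adapter via PeftModel.from_pretrained from the peft library, or ask the repo owner to merge and push full weights. Independently of that, the deprecation warning in the log suggests passing dtype=torch.float16 instead of torch_dtype=torch.float16.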
