runtime error
Exit code: 1. Reason:
pytorch_model-00001-of-00002.bin: 100%|█████████▉| 9.97G/9.97G [00:18<00:00, 548MB/s]
pytorch_model-00002-of-00002.bin: 100%|█████████▉| 3.85G/3.85G [00:07<00:00, 498MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4399, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4736, in _load_pretrained_model
    raise ValueError(
ValueError: The current `device_map` had weights offloaded to the disk. Please provide an `offload_folder` for them. Alternatively, make sure you have `safetensors` installed if the model you are using offers the weights in this format.
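The error itself points at the fix: with `device_map="auto"`, Accelerate spilled some weights to disk because they did not fit in GPU/CPU memory, and `from_pretrained` then needs an `offload_folder` to know where to put them (or, alternatively, `safetensors` installed so the sharded weights can be memory-mapped). A minimal sketch of the first fix, assuming a hypothetical `model_id` (substitute the repo id your `app.py` actually loads):

```python
import torch
from transformers import AutoModelForCausalLM

# Placeholder repo id -- replace with the model your Space loads.
model_id = "your-org/your-13b-model"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    # Directory where weights that fit neither on GPU nor in CPU RAM
    # are offloaded; any writable path inside the container works.
    offload_folder="offload",
)
```

The alternative the error mentions is simply `pip install safetensors` (and using a checkpoint that ships `.safetensors` shards instead of `.bin`), which avoids the disk-offload copy step entirely. Note that disk offload means very slow inference; on a Space, upgrading the hardware tier or loading a quantized variant of the model is usually the more practical route.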