runtime error
Exit code: 1.

Downloading shards: 100%|██████████| 5/5 [00:33<00:00, 6.71s/it]
The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function.
Loading checkpoint shards: 100%|██████████| 5/5 [00:00<00:00, 115864.75it/s]
generation_config.json: 100%|██████████| 215/215 [00:00<00:00, 1.37MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 14, in <module>
    model = MllamaForConditionalGeneration.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4302, in from_pretrained
    dispatch_model(model, **device_map_kwargs)
  File "/usr/local/lib/python3.10/site-packages/accelerate/big_modeling.py", line 498, in dispatch_model
    raise ValueError(
ValueError: You are trying to offload the whole model to the disk. Please use the `disk_offload` function instead.
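The traceback shows accelerate's dispatch_model refusing to proceed: with device_map="auto", every single module was mapped to "disk", which usually means the container has no GPU and too little CPU RAM to hold any part of the model. A sketch of one possible workaround is below. It assumes the app is loading an Mllama checkpoint (the actual model id at app.py line 14 is not visible in the log, so the name here is a placeholder) and that a half-precision dtype plus an offload folder lets at least some layers stay in memory; if the hardware genuinely cannot hold any layer, the real fix is a larger instance, quantization, or accelerate's disk_offload as the error message suggests.

```python
# Sketch of a possible fix for app.py -- NOT the author's actual code.
# "your-org/your-mllama-checkpoint" is a placeholder: the log does not
# reveal which model is being loaded.
import torch
from transformers import MllamaForConditionalGeneration

model = MllamaForConditionalGeneration.from_pretrained(
    "your-org/your-mllama-checkpoint",  # placeholder checkpoint id
    torch_dtype=torch.bfloat16,  # half precision: ~2x smaller than float32
    device_map="auto",           # let accelerate place layers on GPU/CPU/disk
    offload_folder="offload",    # allows *partial* disk offload; the error
                                 # fires only when the WHOLE model maps to disk
)
```

This loading configuration only helps when memory is merely tight; on a CPU-only free-tier container, an 11B-class vision model will still map entirely to disk and raise the same ValueError, so upgrading the Space hardware is the more likely resolution.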