Runtime error

Exit code: 1. Reason:

checkpoints/self_forcing_dmd.pt: 100%|██████████| 5.68G/5.68G [00:04<00:00, 1.18GB/s]
tokenizer_config.json: 100%|██████████| 9.73k/9.73k [00:00<00:00, 52.9MB/s]
vocab.json: 100%|██████████| 2.78M/2.78M [00:00<00:00, 70.6MB/s]
merges.txt: 100%|██████████| 1.67M/1.67M [00:00<00:00, 66.4MB/s]
tokenizer.json: 100%|██████████| 11.4M/11.4M [00:00<00:00, 106MB/s]
config.json: 100%|██████████| 728/728 [00:00<00:00, 6.26MB/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 53, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/home/user/.pyenv/versions/3.10.18/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 600, in from_pretrained
    return model_class.from_pretrained(
  File "/home/user/.pyenv/versions/3.10.18/lib/python3.10/site-packages/transformers/modeling_utils.py", line 311, in _wrapper
    return func(*args, **kwargs)
  File "/home/user/.pyenv/versions/3.10.18/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4674, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/home/user/.pyenv/versions/3.10.18/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1243, in _get_resolved_checkpoint_files
    raise OSError(
OSError: Qwen/Qwen3-8B does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.
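The traceback shows the app crashing at the `AutoModelForCausalLM.from_pretrained` call in `app.py` because the checkpoint resolver found none of the recognized weight files in the repo. A minimal sketch of a defensive loading pattern that surfaces this `OSError` and optionally retries another repo id — the helper name, `fallback_id` parameter, and injected `loader` callable are illustrative assumptions, not part of the original app:

```python
def load_causal_lm(model_id, loader, fallback_id=None):
    """Load a causal LM via `loader` (e.g. AutoModelForCausalLM.from_pretrained).

    If the repo has none of pytorch_model.bin / model.safetensors /
    tf_model.h5 / model.ckpt / flax_model.msgpack, transformers raises
    OSError; in that case optionally retry with `fallback_id`.
    """
    try:
        return loader(model_id)
    except OSError as err:
        if fallback_id is None:
            raise  # nothing to fall back to; propagate the original error
        print(f"Loading {model_id} failed ({err}); retrying with {fallback_id}")
        return loader(fallback_id)
```

In the app this would be called as `load_causal_lm("Qwen/Qwen3-8B", AutoModelForCausalLM.from_pretrained)`; taking the loader as a parameter keeps the wrapper testable without downloading weights. (An error of this shape can also occur when an older `transformers` does not recognize the weight layout a newer repo ships, so checking the installed `transformers` version is another avenue — stated here as a possibility, not a diagnosis from this log.)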
