runtime error
Exit code: 1. Reason: 85df29?x-id=GetObject&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIA2JU7TKAQDPQNWMNT%2F20260304%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260304T032811Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host%3Brange&X-Amz-Signature=8c2ce8242773ee68427247cb7ebb6a025e780fd159c251726ccf9888cb4193c7", source: hyper_util::client::legacy::Error(SendRequest, hyper::Error(IncompleteMessage)) })), caller: "/home/runner/work/xet-core/xet-core/cas_client/src/download_utils.rs:528" at /home/runner/work/xet-core/xet-core/error_printer/src/lib.rs:28
model-00003-of-00004.safetensors:  53%|█████▎    | 2.62G/4.92G [00:11<00:10, 217MB/s]
model-00003-of-00004.safetensors:  53%|█████▎    | 2.62G/4.92G [00:11<00:10, 225MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 180, in <module>
    model = LlavaForConditionalGeneration.from_pretrained(MODEL_PATH, torch_dtype="bfloat16", device_map=0)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 279, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4261, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1152, in _get_resolved_checkpoint_files
    checkpoint_files, sharded_metadata = get_checkpoint_shard_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 1115, in get_checkpoint_shard_files
    cached_filenames = cached_files(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 517, in cached_files
    raise EnvironmentError(
OSError: fancyfeast/llama-joycaption-beta-one-hf-llava does not appear to have files named ('model-00003-of-00004.safetensors', 'model-00004-of-00004.safetensors'). Checkout 'https://huggingface.co/fancyfeast/llama-joycaption-beta-one-hf-llava/tree/main' for available files.
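Reading the log: the download of shard `model-00003-of-00004.safetensors` was cut off mid-transfer (hyper's `IncompleteMessage`), and `from_pretrained` then surfaced the interrupted shards as "missing" files rather than retrying. A common way to make such startup code resilient to transient network drops is to wrap the failing call in a retry loop with exponential backoff. Below is a minimal, self-contained sketch; `retry` and `flaky_download` are hypothetical illustrations (the real app would wrap the `from_pretrained` call instead), not part of transformers or the app above.

```python
import time


def retry(fn, attempts=3, base_delay=0.01, retryable=(ConnectionError, OSError)):
    """Call fn(), retrying transient failures with exponential backoff.

    Hypothetical helper: in app.py you might wrap the
    LlavaForConditionalGeneration.from_pretrained(...) call in
    something like this so a dropped connection during shard download
    does not take the whole container down.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: re-raise the last error
            time.sleep(base_delay * (2 ** attempt))


# Simulated flaky download: fails twice with a connection error,
# then succeeds on the third attempt.
calls = {"n": 0}


def flaky_download():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("IncompleteMessage (simulated)")
    return "model-00003-of-00004.safetensors"


result = retry(flaky_download)
print(result)  # → model-00003-of-00004.safetensors
```

Note that retrying alone does not clear a corrupted partial shard from the local cache; if the same `OSError` persists across restarts, deleting the cached snapshot for the repo before retrying is the usual next step.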