Runtime error

Exit code: 1. Reason:

  ...nsformer.py", line 339, in __init__
    modules = self._load_auto_model(
        model_name_or_path,
    ...<8 lines>...
        has_modules=has_modules,
    )
  File "/usr/local/lib/python3.13/site-packages/sentence_transformers/SentenceTransformer.py", line 2112, in _load_auto_model
    transformer_model = Transformer(
        model_name_or_path,
    ...<4 lines>...
        backend=self.backend,
    )
  File "/usr/local/lib/python3.13/site-packages/sentence_transformers/models/Transformer.py", line 88, in __init__
    config, is_peft_model = self._load_config(model_name_or_path, cache_dir, backend, config_args)
                            ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/sentence_transformers/models/Transformer.py", line 139, in _load_config
    find_adapter_config_file(
    ~~~~~~~~~~~~~~~~~~~~~~~~^
        model_name_or_path,
        ^^^^^^^^^^^^^^^^^^^
    ...<3 lines>...
        local_files_only=config_args.get("local_files_only", False),
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/transformers/utils/peft_utils.py", line 84, in find_adapter_config_file
    adapter_cached_filename = cached_file(
        model_id,
    ...<11 lines>...
        _raise_exceptions_for_connection_errors=False,
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/utils/hub.py", line 277, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "/usr/local/lib/python3.13/site-packages/transformers/utils/hub.py", line 450, in cached_files
    raise OSError(
    ...<4 lines>...
    ) from e
OSError: sentence-transformers/fine_tuned_model is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `hf auth login` or by passing `token=<your_token>`
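The OSError suggests that a bare model name (apparently `fine_tuned_model`) was passed to `SentenceTransformer(...)` with no matching local directory, so the library fell back to looking up `sentence-transformers/fine_tuned_model` on the Hugging Face Hub, which does not exist. As a sketch of that resolution behavior (a simplified illustration, not the library's actual code; the directory and repo names here are hypothetical):

```python
from pathlib import Path

def resolve_model_ref(name: str) -> str:
    """Simplified mimic of how sentence-transformers resolves a model name:
    an existing directory is used as local files; otherwise a name without a
    "/" falls back to the "sentence-transformers/" Hub namespace, which is
    what produced the OSError above. Illustration only, not the real code.
    """
    if Path(name).is_dir():
        # A local fine-tuned model (e.g. saved via model.save("fine_tuned_model"))
        return str(Path(name).resolve())
    if "/" not in name:
        # Hub fallback: this is the lookup that failed in the traceback
        return f"sentence-transformers/{name}"
    # Already a full repo id, e.g. "your-username/fine_tuned_model" (hypothetical)
    return name

# The failing call was effectively SentenceTransformer("fine_tuned_model")
# with no "fine_tuned_model" directory present in the working directory.
```

Likely fixes, then: pass the absolute path of the saved model directory so it is treated as local files, or pass a full, existing repo id (with `token=...` for a private repo, or after `hf auth login`).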
