runtime error

Exit code: 1. Reason:

✓ sentencepiece is installed
============================================================
Loading model and tokenizer...
============================================================
Attempt 1: Loading with default settings...
✗ Attempt 1 failed: Couldn't instantiate the backend tokenizer from one of: (1) a `tokenizers` library serialization fi
Attempt 2: Loading with use_fast=False...
✗ Attempt 2 failed: Couldn't instantiate the backend tokenizer from one of: (1) a `tokenizers` library serialization fi
Attempt 3: Trying LlamaTokenizer directly...
✓ Tokenizer loaded successfully with LlamaTokenizer
✓ Set pad_token to eos_token
Loading model...
✗ Model loading failed: Unrecognized model in aab20abdullah/akin-yurt-finely. Should have a `model_type` key in its config.json.

Traceback (most recent call last):
  File "/app/app.py", line 110, in <module>
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_NAME,
        ...<3 lines>...
        low_cpu_mem_usage=True
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 319, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
        pretrained_model_name_or_path,
        ...<4 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1471, in from_pretrained
    raise ValueError(
        ...<2 lines>...
    )
ValueError: Unrecognized model in aab20abdullah/akin-yurt-finely. Should have a `model_type` key in its config.json.
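The final ValueError comes from `AutoConfig.from_pretrained`: the repository's config.json has no `model_type` key, so transformers cannot pick a model class. A minimal sketch of that check, assuming the usual fix of adding `"model_type"` to config.json (the value `"llama"` here is an assumption, inferred only from LlamaTokenizer loading successfully in Attempt 3):

```python
import json

def get_model_type(config_text: str):
    """Return the `model_type` declared in a config.json string,
    or None if the key is missing -- the missing-key case is what
    makes AutoConfig raise the ValueError in the log above."""
    config = json.loads(config_text)
    return config.get("model_type")

# Hypothetical config contents, for illustration only:
broken_config = '{"hidden_size": 4096, "vocab_size": 32000}'
fixed_config = '{"model_type": "llama", "hidden_size": 4096, "vocab_size": 32000}'

print(get_model_type(broken_config))  # None -> AutoConfig would raise
print(get_model_type(fixed_config))   # "llama" -> AutoConfig can dispatch
```

If the repo owner adds the correct `model_type` to config.json (or the model is loaded with its concrete class, e.g. `LlamaForCausalLM.from_pretrained`, instead of `AutoModelForCausalLM`), the auto-class dispatch error should not occur.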
