Runtime error

Exit code: 1. Reason:
adapter_config.json: 100%|██████████| 753/753 [00:00<00:00, 4.39MB/s]
/usr/local/lib/python3.12/site-packages/peft/tuners/tuners_utils.py:1225: UserWarning: Model has `tie_word_embeddings=True` and a tied layer is part of the adapter, but `ensure_weight_tying` is not set to True. This can lead to complications, for example when merging the adapter or converting your model to formats other than safetensors. Check the discussion here: https://github.com/huggingface/peft/issues/2777
  warnings.warn(msg)
adapter_model.safetensors: 100%|██████████| 430M/430M [00:02<00:00, 181MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 49, in <module>
    model = ReactionPredictionModel(candidate_models)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/app/utils.py", line 120, in __init__
    self.load_retro_model(candidate_models[model])
  File "/home/user/app/utils.py", line 173, in load_retro_model
    self.retro_model = PeftModel.from_pretrained(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/peft/peft_model.py", line 568, in from_pretrained
    load_result = model.load_adapter(
                  ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/peft/peft_model.py", line 1368, in load_adapter
    load_result = set_peft_model_state_dict(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/peft/utils/save_and_load.py", line 455, in set_peft_model_state_dict
    state_dict[store_key] = peft_model_state_dict[lookup_key]
                            ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
KeyError: 'base_model.model.lm_head.weight'
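A likely reading of this failure, consistent with the UserWarning above: the base model has `tie_word_embeddings=True`, so the saved adapter checkpoint contains only the tied embedding tensor and no separate `lm_head` entry, while the PEFT loader (`set_peft_model_state_dict`) still looks up `base_model.model.lm_head.weight` and raises a KeyError. Plausible fixes include pinning `peft` to the version the adapter was trained with, or re-saving the adapter with `ensure_weight_tying` enabled as the warning suggests. The sketch below is not the PEFT implementation; it uses plain dicts in place of real state dicts, and the key names (`patch_tied_lm_head`, the embedding key) are hypothetical, to illustrate the alias-the-tied-weight workaround:

```python
# Sketch only: demonstrates re-aliasing a tied weight under the missing key.
# Key names are assumptions modeled on the traceback, not PEFT internals.

def patch_tied_lm_head(
    state_dict,
    embed_key="base_model.model.model.embed_tokens.weight",  # hypothetical
    head_key="base_model.model.lm_head.weight",              # key from the KeyError
):
    """If the checkpoint lacks an lm_head entry but contains the tied
    embedding weight, alias that tensor under the lm_head key so the
    loader's lookup succeeds. Returns a shallow copy; tensors are shared."""
    if head_key not in state_dict and embed_key in state_dict:
        state_dict = dict(state_dict)
        state_dict[head_key] = state_dict[embed_key]  # tied layers share weights
    return state_dict

# Toy checkpoint: only the embedding tensor was serialized.
checkpoint = {"base_model.model.model.embed_tokens.weight": [[0.1, 0.2]]}
patched = patch_tied_lm_head(checkpoint)
print("base_model.model.lm_head.weight" in patched)  # the missing key now resolves
```

In a real app one would apply a patch like this to the loaded `adapter_model.safetensors` state dict before handing it to PEFT, or avoid the mismatch entirely by matching the `peft` version used at training time.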
