ValueError: The checkpoint you are trying to load has model type `mol_llama` but Transformers does not recognize this architecture.

#2
by Codemaster67 - opened

So I tried using the model via the official Colab notebook provided: https://colab.research.google.com/#fileId=https%3A//huggingface.co/DongkiKim/Mol-Llama-3.1-8B-Instruct.ipynb — running it, I first got this warning:

The secret `HF_TOKEN` does not exist in your Colab secrets.
To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.
You will be able to reuse this secret in all of your notebooks.
Please note that authentication is recommended but still optional to access public models or datasets.
  warnings.warn(
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1430             try:
-> 1431                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1432             except KeyError:

KeyError: 'mol_llama'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1431                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1432             except KeyError:
-> 1433                 raise ValueError(
   1434                     f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
   1435                     "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `mol_llama` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
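For context on why upgrading alone may not help: the `KeyError` in the traceback above is essentially a dictionary lookup. Transformers keeps a registry mapping the `model_type` string from `config.json` to a config class, and `mol_llama` is not a key in the installed release's registry. A minimal sketch of that dispatch, using a toy stand-in mapping (not the real `CONFIG_MAPPING`):

```python
# Toy reproduction of the AutoConfig dispatch shown in the traceback above.
# CONFIG_MAPPING here is a stand-in dict, not the real Transformers registry:
# the installed release maps known `model_type` strings to config classes,
# and `mol_llama` is simply not among them.
CONFIG_MAPPING = {"llama": "LlamaConfig", "bert": "BertConfig"}

def resolve_config_class(config_dict):
    """Mimic the model_type lookup from AutoConfig.from_pretrained."""
    try:
        return CONFIG_MAPPING[config_dict["model_type"]]
    except KeyError:
        raise ValueError(
            f"The checkpoint you are trying to load has model type "
            f"`{config_dict['model_type']}` but Transformers does not "
            f"recognize this architecture."
        ) from None

print(resolve_config_class({"model_type": "llama"}))  # a registered type

try:
    resolve_config_class({"model_type": "mol_llama"})  # an unregistered type
except ValueError as exc:
    print(exc)
```

So a newer Transformers version only fixes this if that version (or custom modeling code shipped in the model repo itself) actually registers the `mol_llama` key.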

I tried upgrading Transformers to 5.2.0 (by running `!pip install --upgrade transformers`) and got the same error.

For the third cell I got this error:

ImportError                               Traceback (most recent call last)
/tmp/ipython-input-2461261565.py in <cell line: 0>()
      1 # Load model directly
----> 2 from transformers import MolLlaMA
      3 model = MolLLaMA.from_pretrained("DongkiKim/Mol-Llama-3.1-8B-Instruct", dtype="auto")

ImportError: cannot import name 'MolLlaMA' from 'transformers' (/usr/local/lib/python3.12/dist-packages/transformers/__init__.py)
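This `ImportError` just means the name is not exported at the top level of the installed `transformers` package. A quick stdlib sketch for checking whether a symbol is actually exported, illustrated with the `json` module since `transformers` may not be installed here; in the notebook, the equivalent check on `transformers` for `MolLlaMA` would come back `False`:

```python
import importlib

def has_export(module_name: str, symbol: str) -> bool:
    """Return True if `symbol` can be imported from `module_name`."""
    module = importlib.import_module(module_name)
    return hasattr(module, symbol)

# Illustrated with the stdlib `json` module; in the failing notebook the
# equivalent check would be has_export("transformers", "MolLlaMA").
print(has_export("json", "dumps"))     # True: `dumps` is exported
print(has_export("json", "MolLlaMA"))  # False: no such name
```

Note also that the cell imports `MolLlaMA` but then calls `MolLLaMA.from_pretrained` (different capitalization), so even if the import succeeded, the next line would fail on the name mismatch.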

