ValueError: Unrecognized model in google/t5gemma-2-270m-270m. Should have a `model_type` key in its config.json.
I get the following error when loading google/t5gemma-2-270m-270m:
ValueError: Unrecognized model in google/t5gemma-2-270m-270m. Should have a model_type key in its config.json.
I can reproduce it easily:
from transformers import AutoModelForSeq2SeqLM
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5gemma-2-270m-270m")
Interestingly, this does not happen with google/t5gemma-2-1b-1b, I can load this model without issue.
My transformers version is 5.5.0.
Hi @aeth0r -
Thanks for reaching out to us. I tested google/t5gemma-2-270m-270m on Transformers 5.5.0 and it loaded successfully, so the error appears to be resolved. If the issue persists, try a fresh environment, clear the Hugging Face cache, and reinstall Transformers to make sure no old files are causing the error.
I did the following:
- cleared my Hugging Face cache
- uninstalled transformers and installed the latest available version (5.5.3)
However, I can still reproduce the same error. In fact, the model download (it is not in my cache) does not even start. The complete stack trace:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/aethor/Dev/Renard/.venv/lib64/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 329, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/aethor/Dev/Renard/.venv/lib64/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1528, in from_pretrained
raise ValueError(
ValueError: Unrecognized model in google/t5gemma-2-270m-270m. Should have a `model_type` key in its config.json.
Hi @aeth0r -
I am able to load the model successfully with transformers==5.5.3 using:
from transformers import AutoModelForSeq2SeqLM
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5gemma-2-270m-270m")
The model downloads and loads fine on my side (config, weights, and generation config all load successfully). This likely points to an environment mismatch or install conflict. Could you confirm whether the script is running in the same environment where transformers==5.5.3 is installed?
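For context, this error is raised when AutoConfig parses config.json and cannot find a `model_type` key to map the file to a model class. A simplified sketch of that check (the inline JSON is illustrative, not the real t5gemma-2 config):

```python
import json

# Simplified sketch of the check behind this error: the config loader
# parses config.json and requires a "model_type" key to select a model class.
# The inline JSON below is illustrative, not the real t5gemma-2 config.
raw = '{"model_type": "example-model", "vocab_size": 32000}'
config = json.loads(raw)

if "model_type" not in config:
    raise ValueError("Unrecognized model. Should have a `model_type` key in its config.json.")
print(config["model_type"])  # prints: example-model
```

Since the check itself is simple, a stale or corrupted cached copy of config.json on your machine would be enough to trigger it even when the file on the Hub is fine.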
Hi,
thanks for your answer. I can confirm that the script is running in the right environment (and that, in that same environment, I can load other similar models such as google/t5gemma-2-1b-1b).
But I can also confirm that, on another machine, I don't experience any issue and can load the model just fine. So something specific to my setup/machine must be causing this. It is puzzling; I will report back if I find out what is happening.
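In the meantime, here is the quick check I plan to run on both machines to compare environments (a stdlib-only sketch; the printed paths will of course differ per setup):

```python
import sys
import importlib.util

# Print which interpreter is actually running the script, and where the
# transformers package resolves from, to compare the working and failing machines.
print("python      :", sys.executable)
spec = importlib.util.find_spec("transformers")
print("transformers:", spec.origin if spec else "not installed")
```

If the two machines report different interpreters or different transformers install paths, that would point to the environment mismatch suggested above.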