```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="google/gemma-7b-it")
```
This results in a `KeyError` on Colab:
```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1116         try:
-> 1117             config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1118         except KeyError:

KeyError: 'gemma'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1117             config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1118         except KeyError:
-> 1119             raise ValueError(
   1120                 f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
   1121                 "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `gemma` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```