KeyError: 'type' when using pipeline with Phi-3-mini-4k-instruct on Kaggle/Colab

#114
by tusherbhomik - opened

I am trying to run the Phi-3-mini-4k-instruct model using the high-level pipeline API, but it fails during initialization with a KeyError: 'type' raised inside the _init_rope method of the remote modeling code (modeling_phi3.py).

Environment:

Platform: Kaggle Notebook (also reproducible in Google Colab)

Library: transformers (latest version installed via pip install -U transformers)

Python: 3.12

Code Snippet Used:


from transformers import pipeline

# Standard snippet from the model card
pipe = pipeline("text-generation", model="microsoft/Phi-3-mini-4k-instruct", trust_remote_code=True)

messages = [
    {"role": "user", "content": "Who are you?"},
]
print(pipe(messages))

Full Traceback:

KeyError: 'type'
...
~/.cache/huggingface/modules/transformers_modules/microsoft/Phi-3-mini-4k-instruct/.../modeling_phi3.py in _init_rope(self)
    294             )
    295         else:
--> 296             scaling_type = self.config.rope_scaling["type"]
    297             if scaling_type == "longrope":
    298                 self.rotary_emb = Phi3LongRoPEScaledRotaryEmbedding(self.head_dim, self.config)

Observation:
It seems that the rope_scaling dictionary in the model's config.json is missing the "type" key, or the remote modeling_phi3.py expects a schema that no longer matches the configuration file shipped in the repository (recent transformers releases have moved toward a "rope_type" key, which could explain the mismatch).

The error also persists with attn_implementation='eager', so it does not appear to be related to the attention backend.
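As a temporary workaround while this is investigated, here is a minimal sketch of the approach I am considering: normalizing the rope_scaling dict so that both the legacy "type" key and a newer "rope_type" key are present before the model is built. The function name normalize_rope_scaling is mine, and the assumption that the config carries "rope_type" instead of "type" is unverified; the commented lines show hypothetical usage against the hub config.

```python
def normalize_rope_scaling(rope_scaling):
    """Hypothetical workaround (assumption, not an official fix): if the
    config dict carries a newer "rope_type" key but the remote modeling
    code indexes the legacy "type" key, mirror the value under "type"
    so self.config.rope_scaling["type"] no longer raises KeyError."""
    if rope_scaling and "type" not in rope_scaling and "rope_type" in rope_scaling:
        rope_scaling = dict(rope_scaling)  # copy, don't mutate the original
        rope_scaling["type"] = rope_scaling["rope_type"]
    return rope_scaling

# Hypothetical usage (requires network access to the Hub, so shown commented out):
# from transformers import AutoConfig, AutoModelForCausalLM
# cfg = AutoConfig.from_pretrained("microsoft/Phi-3-mini-4k-instruct", trust_remote_code=True)
# print(cfg.rope_scaling)  # inspect which key the shipped config actually has
# cfg.rope_scaling = normalize_rope_scaling(cfg.rope_scaling)
# model = AutoModelForCausalLM.from_pretrained(
#     "microsoft/Phi-3-mini-4k-instruct", config=cfg, trust_remote_code=True
# )
```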

Request:
Could the team please check if the config.json needs an update or if the modeling_phi3.py script needs a fallback for the missing scaling_type?
