Daemontatox committed 5c044ab (verified) · 1 parent: 78d4fe7

Update config.json


https://huggingface.co/google/functiongemma-270m-it/discussions/5#694457b5ad6f76a0d3c43917

"""
vLLM has started using the Transformers v5-style rope_parameters and patches it into configs loaded with Transformers v4.

vLLM 0.12.0 uses the existence of rope_parameters to decide whether or not to set a default value for it, i.e.:

    if rope_theta is not None:
        if not hasattr(config, "rope_parameters"):
            config.rope_parameters = {"rope_type": "default"}
        config.rope_parameters["rope_theta"] = rope_theta

This causes an error with this model because the checkpoint sets rope_parameters explicitly to null: the attribute exists, so hasattr() returns True and the default is never assigned, rope_parameters stays None, and the subsequent dict assignment fails with a TypeError.

https://github.com/vllm-project/vllm/pull/30983 fixes this on the vLLM side for (hopefully) v0.13.0 onwards.

Removing these fields from the checkpoint should fix this model for vLLM v0.12.0 in the meantime.
"""

Files changed (1): config.json (+0 −2)
config.json CHANGED
@@ -44,8 +44,6 @@
   "query_pre_attn_scalar": 256,
   "rms_norm_eps": 1e-06,
   "rope_local_base_freq": 10000.0,
-  "rope_parameters": null,
-  "rope_scaling": null,
   "rope_theta": 1000000.0,
   "sliding_window": 512,
   "transformers_version": "4.57.3",
 