Safetensors · qwen2

Zihao-Li committed · verified
Commit 300710d · Parent: c529b5a

Fix missing `metadata` in `model.safetensors.index.json`


### **Description:**
This PR fixes the missing `"metadata.total_size"` field in `model.safetensors.index.json`, which was causing a `KeyError: 'metadata'` when loading the model with `transformers`.


### **Reproduction:**
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("AIDC-AI/Marco-LLM-GLO")
```

Error:

```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-2-73061392afe9> in <cell line: 0>()
----> 1 model = AutoModelForCausalLM.from_pretrained("AIDC-AI/Marco-LLM-GLO")

2 frames
/usr/local/lib/python3.11/dist-packages/transformers/utils/hub.py in get_checkpoint_shard_files(pretrained_model_name_or_path, index_filename, cache_dir, force_download, proxies, resume_download, local_files_only, token, user_agent, revision, subfolder, _commit_hash, **deprecated_kwargs)
   1076
   1077     shard_filenames = sorted(set(index["weight_map"].values()))
-> 1078     sharded_metadata = index["metadata"]
   1079     sharded_metadata["all_checkpoint_keys"] = list(index["weight_map"].keys())
   1080     sharded_metadata["weight_map"] = index["weight_map"].copy()

KeyError: 'metadata'
```
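For context, the failure is reproducible without downloading any weights: a minimal stand-in for the published index file (a `weight_map` but no top-level `metadata` key) trips the same unconditional lookup that `get_checkpoint_shard_files` performs at line 1078 above.

```python
import json

# Minimal stand-in for the published index file: it has a "weight_map"
# but no top-level "metadata" key.
index = json.loads("""
{
  "weight_map": {
    "model.embed_tokens.weight": "model-00002-of-00004.safetensors"
  }
}
""")

shard_filenames = sorted(set(index["weight_map"].values()))  # this step succeeds
try:
    sharded_metadata = index["metadata"]  # same lookup as hub.py line 1078
except KeyError as exc:
    print(f"KeyError: {exc}")  # prints: KeyError: 'metadata'
```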

### **Changes Made:**
- Added `"metadata": { "total_size": 15231271760 }` to `model.safetensors.index.json`
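For anyone hitting this on a local copy before the fix lands, here is a hedged sketch of a repair script. The helper name `add_missing_total_size` is mine, and note that summing shard file sizes only approximates the canonical `total_size` (each safetensors shard also carries a small JSON header, so file sizes slightly exceed the sum of tensor byte sizes).

```python
import json
import os

def add_missing_total_size(index_path: str) -> int:
    """Insert a `metadata.total_size` entry into a sharded index if absent.

    NOTE: approximates total_size by summing the on-disk sizes of the
    shard files named in `weight_map`; the canonical value is the sum of
    the tensor byte sizes, which is slightly smaller.
    """
    with open(index_path) as f:
        index = json.load(f)

    if "metadata" not in index:
        shard_dir = os.path.dirname(index_path) or "."
        shard_files = sorted(set(index["weight_map"].values()))
        total = sum(os.path.getsize(os.path.join(shard_dir, s)) for s in shard_files)
        # Rebuild the dict so "metadata" comes first, matching the diff below.
        index = {"metadata": {"total_size": total}, **index}
        with open(index_path, "w") as f:
            json.dump(index, f, indent=2)
    return index["metadata"]["total_size"]
```

Running this once against the cached snapshot's `model.safetensors.index.json` lets the `from_pretrained` call above proceed.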

### **Files changed (1):**

`model.safetensors.index.json` (+3 −0)

```diff
@@ -1,4 +1,7 @@
 {
+  "metadata": {
+    "total_size": 15231271760
+  },
   "weight_map": {
     "model.embed_tokens.weight": "model-00002-of-00004.safetensors",
     "model.layers.0.self_attn.q_proj.weight": "model-00003-of-00004.safetensors",
```