danielhanchen committed on
Commit edfe3be · 0 Parent(s):

Duplicate from unsloth/GLM-5.1-FP8


Co-authored-by: Daniel (Unsloth) <danielhanchen@users.noreply.huggingface.co>

This view is limited to 50 files because it contains too many changes.

Files changed (50)
  1. .gitattributes +37 -0
  2. README.md +115 -0
  3. chat_template.jinja +117 -0
  4. config.json +862 -0
  5. generation_config.json +12 -0
  6. model-00001-of-00142.safetensors +3 -0
  7. model-00002-of-00142.safetensors +3 -0
  8. model-00003-of-00142.safetensors +3 -0
  9. model-00004-of-00142.safetensors +3 -0
  10. model-00005-of-00142.safetensors +3 -0
  11. model-00006-of-00142.safetensors +3 -0
  12. model-00007-of-00142.safetensors +3 -0
  13. model-00008-of-00142.safetensors +3 -0
  14. model-00009-of-00142.safetensors +3 -0
  15. model-00010-of-00142.safetensors +3 -0
  16. model-00011-of-00142.safetensors +3 -0
  17. model-00012-of-00142.safetensors +3 -0
  18. model-00013-of-00142.safetensors +3 -0
  19. model-00014-of-00142.safetensors +3 -0
  20. model-00015-of-00142.safetensors +3 -0
  21. model-00016-of-00142.safetensors +3 -0
  22. model-00017-of-00142.safetensors +3 -0
  23. model-00018-of-00142.safetensors +3 -0
  24. model-00019-of-00142.safetensors +3 -0
  25. model-00020-of-00142.safetensors +3 -0
  26. model-00021-of-00142.safetensors +3 -0
  27. model-00022-of-00142.safetensors +3 -0
  28. model-00023-of-00142.safetensors +3 -0
  29. model-00024-of-00142.safetensors +3 -0
  30. model-00025-of-00142.safetensors +3 -0
  31. model-00026-of-00142.safetensors +3 -0
  32. model-00027-of-00142.safetensors +3 -0
  33. model-00028-of-00142.safetensors +3 -0
  34. model-00029-of-00142.safetensors +3 -0
  35. model-00030-of-00142.safetensors +3 -0
  36. model-00031-of-00142.safetensors +3 -0
  37. model-00032-of-00142.safetensors +3 -0
  38. model-00033-of-00142.safetensors +3 -0
  39. model-00034-of-00142.safetensors +3 -0
  40. model-00035-of-00142.safetensors +3 -0
  41. model-00036-of-00142.safetensors +3 -0
  42. model-00037-of-00142.safetensors +3 -0
  43. model-00038-of-00142.safetensors +3 -0
  44. model-00039-of-00142.safetensors +3 -0
  45. model-00040-of-00142.safetensors +3 -0
  46. model-00041-of-00142.safetensors +3 -0
  47. model-00042-of-00142.safetensors +3 -0
  48. model-00043-of-00142.safetensors +3 -0
  49. model-00044-of-00142.safetensors +3 -0
  50. model-00045-of-00142.safetensors +3 -0
.gitattributes ADDED
@@ -0,0 +1,37 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ model.safetensors.index.json filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,115 @@
+ ---
+ language:
+ - en
+ - zh
+ library_name: transformers
+ license: mit
+ pipeline_tag: text-generation
+ base_model:
+ - zai-org/GLM-5.1
+ tags:
+ - unsloth
+ - glm_moe_dsa
+ ---
+ <div>
+ <p style="margin-bottom: 0; margin-top: 0;">
+ <h1 style="margin-top: 0rem;">See how to run GLM-5.1 locally - <a href="https://unsloth.ai/docs/models/glm-5.1">Read our Guide!</a></h1>
+ </p>
+ <p style="margin-top: 0;margin-bottom: 0;">
+ <em><a href="https://unsloth.ai/docs/basics/unsloth-dynamic-v2.0-gguf">Unsloth Dynamic 2.0</a> achieves superior accuracy & outperforms other leading quants.</em>
+ </p>
+ <div style="margin-top: 0;display: flex; gap: 5px; align-items: center; ">
+ <a href="https://github.com/unslothai/unsloth/">
+ <img src="https://github.com/unslothai/unsloth/raw/main/images/unsloth%20new%20logo.png" width="133">
+ </a>
+ <a href="https://discord.gg/unsloth">
+ <img src="https://github.com/unslothai/unsloth/raw/main/images/Discord%20button.png" width="173">
+ </a>
+ <a href="https://unsloth.ai/docs/models/glm-5.1">
+ <img src="https://raw.githubusercontent.com/unslothai/unsloth/refs/heads/main/images/documentation%20green%20button.png" width="143">
+ </a>
+ </div>
+ </div>
+
+ You can follow the instructions in our [guide here](https://unsloth.ai/docs/models/glm-5.1).
+
+ ---
+
+
+ # GLM-5.1
+
+ <div align="center">
+ <img src="https://raw.githubusercontent.com/zai-org/GLM-5/refs/heads/main/resources/logo.svg" width="15%"/>
+ </div>
+ <p align="center">
+ 👋 Join our <a href="https://raw.githubusercontent.com/zai-org/GLM-5/refs/heads/main/resources/wechat.png" target="_blank">WeChat</a> or <a href="https://discord.gg/QR7SARHRxK" target="_blank">Discord</a> community.
+ <br>
+ 📖 Check out the GLM-5.1 <a href="https://z.ai/blog/glm-5.1" target="_blank">blog</a> and the GLM-5 <a href="https://arxiv.org/abs/2602.15763" target="_blank">technical report</a>.
+ <br>
+ 📍 Use the GLM-5.1 API services on the <a href="https://docs.z.ai/guides/llm/glm-5.1">Z.ai API Platform</a>.
+ <br>
+ 🔜 <a href="https://chat.z.ai">GLM-5.1</a> will be available on chat.z.ai in the coming days.
+ </p>
+
+ <p align="center">
+ [<a href="https://huggingface.co/papers/2602.15763" target="_blank">Paper</a>]
+ [<a href="https://github.com/zai-org/GLM-5" target="_blank">GitHub</a>]
+ </p>
+
+ ## Introduction
+
+ GLM-5.1 is our next-generation flagship model for agentic engineering, with significantly stronger coding capabilities than its predecessor. It achieves state-of-the-art performance on SWE-Bench Pro and leads GLM-5 by a wide margin on NL2Repo (repo generation) and Terminal-Bench 2.0 (real-world terminal tasks).
+
+ ![bench_51](https://raw.githubusercontent.com/zai-org/GLM-5/refs/heads/main/resources/bench_51.png)
+
+ But the most meaningful leap goes beyond first-pass performance. Previous models—including GLM-5—tend to exhaust their repertoire early: they apply familiar techniques for quick initial gains, then plateau. Giving them more time doesn't help.
+
+ GLM-5.1, by contrast, is built to stay effective on agentic tasks over much longer horizons. We've found that the model handles ambiguous problems with better judgment and stays productive over longer sessions. It breaks complex problems down, runs experiments, reads results, and identifies blockers with real precision. By revisiting its reasoning and revising its strategy through repeated iteration, GLM-5.1 sustains optimization over hundreds of rounds and thousands of tool calls. The longer it runs, the better the result.
+
+ ## Benchmark
+
+ | | GLM-5.1 | GLM-5 | Qwen3.6-Plus | Minimax M2.7 | DeepSeek-V3.2 | Kimi K2.5 | Claude Opus 4.6 | Gemini 3.1 Pro | GPT-5.4 |
+ | ------------------------------------------ | ------------------ | ------------------- | ------------ | -------------------- | -------------------- | ---------- | --------------- | -------------- | ---------------- |
+ | HLE | 31.0 | 30.5 | 28.8 | 28.0 | 25.1 | 31.5 | 36.7 | **45.0** | 39.8 |
+ | HLE (w/ Tools) | 52.3 | 50.4 | 50.6 | - | 40.8 | 51.8 | **53.1*** | 51.4* | 52.1* |
+ | AIME 2026 | 95.3 | 95.4 | 95.1 | 89.8 | 95.1 | 94.5 | 95.6 | 98.2 | **98.7** |
+ | HMMT Nov. 2025 | 94.0 | **96.9** | 94.6 | 81.0 | 90.2 | 91.1 | 96.3 | 94.8 | 95.8 |
+ | HMMT Feb. 2026 | 82.6 | 82.8 | 87.8 | 72.7 | 79.9 | 81.3 | 84.3 | 87.3 | **91.8** |
+ | IMOAnswerBench | 83.8 | 82.5 | 83.8 | 66.3 | 78.3 | 81.8 | 75.3 | 81.0 | **91.4** |
+ | GPQA-Diamond | 86.2 | 86.0 | 90.4 | 87.0 | 82.4 | 87.6 | 91.3 | **94.3** | 92.0 |
+ | SWE-Bench Pro | **58.4** | 55.1 | 56.6 | 56.2 | - | 53.8 | 57.3 | 54.2 | 57.7 |
+ | NL2Repo | 42.7 | 35.9 | 37.9 | 39.8 | - | 32.0 | **49.8** | 33.4 | 41.3 |
+ | Terminal-Bench 2.0 (Terminus-2) | 63.5 | 56.2 | 61.6 | - | 39.3 | 50.8 | 65.4 | **68.5** | - |
+ | Terminal-Bench 2.0 (Best self-reported) | 66.5 (Claude Code) | 56.2 (Claude Code) | - | 57.0 (Claude Code) | 46.4 (Claude Code) | - | - | - | **75.1** (Codex) |
+ | CyberGym | **68.7** | 48.3 | - | - | 17.3 | 41.3 | 66.6 | - | - |
+ | BrowseComp | **68.0** | 62.0 | - | - | 51.4 | 60.6 | - | - | - |
+ | BrowseComp (w/ Context Manage) | 79.3 | 75.9 | - | - | 67.6 | 74.9 | 84.0 | **85.9** | 82.7 |
+ | τ³-Bench | 70.6 | 69.2 | 70.7 | 67.6 | 69.2 | 66.0 | 72.4 | 67.1 | **72.9** |
+ | MCP-Atlas (Public Set) | 71.8 | 69.2 | **74.1** | 48.8 | 62.2 | 63.8 | 73.8 | 69.2 | 67.2 |
+ | Tool-Decathlon | 40.7 | 38.0 | 39.8 | 46.3 | 35.2 | 27.8 | 47.2 | 48.8 | **54.6** |
+ | Vending Bench 2 | $5,634.00 | $4,432.12 | $5,114.87 | - | $1,034.00 | $1,198.46 | **$8,017.59** | $911.21 | $6,144.18 |
+ ## Serve GLM-5.1 Locally
+
+ The following open-source frameworks support local deployment of GLM-5.1:
+
+ - [SGLang](https://github.com/sgl-project/sglang) (v0.5.10+) — see [cookbook](https://cookbook.sglang.io/autoregressive/GLM/GLM-5.1)
+ - [vLLM](https://github.com/vllm-project/vllm) (v0.19.0+) — see [recipes](https://github.com/vllm-project/recipes/blob/main/GLM/GLM5.md)
+ - [xLLM](https://github.com/jd-opensource/xllm) (v0.8.0+) — see [example](https://github.com/zai-org/GLM-5/blob/main/example/ascend.md)
+ - [Transformers](https://github.com/huggingface/transformers) (v0.5.3+) — see [transformers docs](https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/glm_moe_dsa.md)
+ - [KTransformers](https://github.com/kvcache-ai/ktransformers) (v0.5.3+) — see [tutorial](https://github.com/kvcache-ai/ktransformers/blob/main/doc/en/kt-kernel/GLM-5.1-Tutorial.md)
+
+ ## Citation
+
+ If you find GLM-5.1 or GLM-5 useful in your research, please cite our technical report:
+
+ ```bibtex
+ @misc{glm5team2026glm5vibecodingagentic,
+ title={GLM-5: from Vibe Coding to Agentic Engineering},
+ author={GLM-5-Team and : and Aohan Zeng and Xin Lv and Zhenyu Hou and Zhengxiao Du and Qinkai Zheng and Bin Chen and Da Yin and Chendi Ge and Chenghua Huang and Chengxing Xie and Chenzheng Zhu and Congfeng Yin and Cunxiang Wang and Gengzheng Pan and Hao Zeng and Haoke Zhang and Haoran Wang and Huilong Chen and Jiajie Zhang and Jian Jiao and Jiaqi Guo and Jingsen Wang and Jingzhao Du and Jinzhu Wu and Kedong Wang and Lei Li and Lin Fan and Lucen Zhong and Mingdao Liu and Mingming Zhao and Pengfan Du and Qian Dong and Rui Lu and Shuang-Li and Shulin Cao and Song Liu and Ting Jiang and Xiaodong Chen and Xiaohan Zhang and Xuancheng Huang and Xuezhen Dong and Yabo Xu and Yao Wei and Yifan An and Yilin Niu and Yitong Zhu and Yuanhao Wen and Yukuo Cen and Yushi Bai and Zhongpei Qiao and Zihan Wang and Zikang Wang and Zilin Zhu and Ziqiang Liu and Zixuan Li and Bojie Wang and Bosi Wen and Can Huang and Changpeng Cai and Chao Yu and Chen Li and Chengwei Hu and Chenhui Zhang and Dan Zhang and Daoyan Lin and Dayong Yang and Di Wang and Ding Ai and Erle Zhu and Fangzhou Yi and Feiyu Chen and Guohong Wen and Hailong Sun and Haisha Zhao and Haiyi Hu and Hanchen Zhang and Hanrui Liu and Hanyu Zhang and Hao Peng and Hao Tai and Haobo Zhang and He Liu and Hongwei Wang and Hongxi Yan and Hongyu Ge and Huan Liu and Huanpeng Chu and Jia'ni Zhao and Jiachen Wang and Jiajing Zhao and Jiamin Ren and Jiapeng Wang and Jiaxin Zhang and Jiayi Gui and Jiayue Zhao and Jijie Li and Jing An and Jing Li and Jingwei Yuan and Jinhua Du and Jinxin Liu and Junkai Zhi and Junwen Duan and Kaiyue Zhou and Kangjian Wei and Ke Wang and Keyun Luo and Laiqiang Zhang and Leigang Sha and Liang Xu and Lindong Wu and Lintao Ding and Lu Chen and Minghao Li and Nianyi Lin and Pan Ta and Qiang Zou and Rongjun Song and Ruiqi Yang and Shangqing Tu and Shangtong Yang and Shaoxiang Wu and Shengyan Zhang and Shijie Li and Shuang Li and Shuyi Fan and Wei Qin and Wei Tian and Weining Zhang and Wenbo Yu and Wenjie Liang and Xiang Kuang and Xiangmeng Cheng and Xiangyang Li and Xiaoquan Yan and Xiaowei Hu and Xiaoying Ling and Xing Fan and Xingye Xia and Xinyuan Zhang and Xinze Zhang and Xirui Pan and Xu Zou and Xunkai Zhang and Yadi Liu and Yandong Wu and Yanfu Li and Yidong Wang and Yifan Zhu and Yijun Tan and Yilin Zhou and Yiming Pan and Ying Zhang and Yinpei Su and Yipeng Geng and Yong Yan and Yonglin Tan and Yuean Bi and Yuhan Shen and Yuhao Yang and Yujiang Li and Yunan Liu and Yunqing Wang and Yuntao Li and Yurong Wu and Yutao Zhang and Yuxi Duan and Yuxuan Zhang and Zezhen Liu and Zhengtao Jiang and Zhenhe Yan and Zheyu Zhang and Zhixiang Wei and Zhuo Chen and Zhuoer Feng and Zijun Yao and Ziwei Chai and Ziyuan Wang and Zuzhou Zhang and Bin Xu and Minlie Huang and Hongning Wang and Juanzi Li and Yuxiao Dong and Jie Tang},
+ year={2026},
+ eprint={2602.15763},
+ archivePrefix={arXiv},
+ primaryClass={cs.LG},
+ url={https://arxiv.org/abs/2602.15763},
+ }
+ ```
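The serving frameworks listed above (SGLang, vLLM, and others) typically expose an OpenAI-compatible chat-completions endpoint once the model is running. As a minimal sketch, assuming a local server at `http://localhost:8000/v1/chat/completions` (the URL, model name, and parameter values here are illustrative assumptions, not values from this repo), a request payload could be assembled like this:

```python
import json

def build_chat_request(prompt: str, model: str = "zai-org/GLM-5.1") -> dict:
    """Assemble an OpenAI-style chat-completions payload.

    Hypothetical helper for illustration; the served model name must match
    whatever name the local server registered at startup.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
        "max_tokens": 512,
    }

# The resulting dict can be POSTed as JSON to the server's
# /v1/chat/completions route with any HTTP client.
payload = build_chat_request("Summarize the GLM-5.1 release notes.")
print(json.dumps(payload, indent=2))
```

This is only the request-side shape; consult the SGLang cookbook or vLLM recipe linked above for the exact launch flags each server needs for GLM-5.1.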
chat_template.jinja ADDED
@@ -0,0 +1,117 @@
+ [gMASK]<sop>
+ {%- if tools -%}
+ {%- macro tool_to_json(tool) -%}
+ {%- set ns_tool = namespace(first=true) -%}
+ {{ '{' -}}
+ {%- for k, v in tool.items() -%}
+ {%- if k != 'defer_loading' and k != 'strict' -%}
+ {%- if not ns_tool.first -%}{{- ', ' -}}{%- endif -%}
+ {%- set ns_tool.first = false -%}
+ "{{ k }}": {{ v | tojson(ensure_ascii=False) }}
+ {%- endif -%}
+ {%- endfor -%}
+ {{- '}' -}}
+ {%- endmacro -%}
+ <|system|>
+ # Tools
+
+ You may call one or more functions to assist with the user query.
+
+ You are provided with function signatures within <tools></tools> XML tags:
+ <tools>
+ {% for tool in tools %}
+ {%- if 'function' in tool -%}
+ {%- set tool = tool['function'] -%}
+ {%- endif -%}
+ {% if tool.defer_loading is not defined or not tool.defer_loading %}
+ {{ tool_to_json(tool) }}
+ {% endif %}
+ {% endfor %}
+ </tools>
+
+ For each function call, output the function name and arguments within the following XML format:
+ <tool_call>{function-name}<arg_key>{arg-key-1}</arg_key><arg_value>{arg-value-1}</arg_value><arg_key>{arg-key-2}</arg_key><arg_value>{arg-value-2}</arg_value>...</tool_call>{%- endif -%}
+ {%- macro visible_text(content) -%}
+ {%- if content is string -%}
+ {{- content }}
+ {%- elif content is iterable and content is not mapping -%}
+ {%- for item in content -%}
+ {%- if item is mapping and item.type == 'text' -%}
+ {{- item.text }}
+ {%- elif item is string -%}
+ {{- item }}
+ {%- endif -%}
+ {%- endfor -%}
+ {%- else -%}
+ {{- content }}
+ {%- endif -%}
+ {%- endmacro -%}
+ {%- set ns = namespace(last_user_index=-1, thinking_indices='') -%}
+ {%- for m in messages %}
+ {%- if m.role == 'user' %}
+ {%- set ns.last_user_index = loop.index0 -%}
+ {%- elif m.role == 'assistant' %}
+ {%- if m.reasoning_content is string %}
+ {%- set ns.thinking_indices = ns.thinking_indices ~ ',' ~ ns.last_user_index ~ ',' -%}
+ {%- endif %}
+ {%- endif %}
+ {%- endfor %}
+ {%- set ns.has_thinking = false -%}
+ {%- for m in messages -%}
+ {%- if m.role == 'user' -%}<|user|>{{ visible_text(m.content) }}{% set ns.has_thinking = (',' ~ loop.index0 ~ ',') in ns.thinking_indices -%}
+ {%- elif m.role == 'assistant' -%}
+ <|assistant|>
+ {%- set content = visible_text(m.content) %}
+ {%- if m.reasoning_content is string %}
+ {%- set reasoning_content = m.reasoning_content %}
+ {%- elif '</think>' in content %}
+ {%- set reasoning_content = content.split('</think>')[0].split('<think>')[-1] %}
+ {%- set content = content.split('</think>')[-1] %}
+ {%- elif loop.index0 > ns.last_user_index and not (enable_thinking is defined and not enable_thinking) %}
+ {%- set reasoning_content = '' %}
+ {%- elif loop.index0 < ns.last_user_index and ns.has_thinking %}
+ {%- set reasoning_content = '' %}
+ {%- endif %}
+ {%- if ((clear_thinking is defined and not clear_thinking) or loop.index0 > ns.last_user_index) and reasoning_content is defined -%}
+ {{ '<think>' + reasoning_content + '</think>'}}
+ {%- else -%}
+ {{ '</think>' }}
+ {%- endif -%}
+ {%- if content.strip() -%}
+ {{ content.strip() }}
+ {%- endif -%}
+ {% if m.tool_calls %}
+ {% for tc in m.tool_calls %}
+ {%- if tc.function %}
+ {%- set tc = tc.function %}
+ {%- endif %}
+ {{- '<tool_call>' + tc.name -}}
+ {% set _args = tc.arguments %}{% for k, v in _args.items() %}<arg_key>{{ k }}</arg_key><arg_value>{{ v | tojson(ensure_ascii=False) if v is not string else v }}</arg_value>{% endfor %}</tool_call>{% endfor %}
+ {% endif %}
+ {%- elif m.role == 'tool' -%}
+ {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
+ {{- '<|observation|>' -}}
+ {%- endif %}
+ {%- if m.content is string -%}
+ {{- '<tool_response>' + m.content + '</tool_response>' -}}
+ {%- else -%}
+ {{- '<tool_response><tools>\n' -}}
+ {% for tr in m.content %}
+ {%- for tool in tools -%}
+ {%- if 'function' in tool -%}
+ {%- set tool = tool['function'] -%}
+ {%- endif -%}
+ {%- if tool.name == tr.name -%}
+ {{- tool_to_json(tool) + '\n' -}}
+ {%- endif -%}
+ {%- endfor -%}
+ {%- endfor -%}
+ {{- '</tools></tool_response>' -}}
+ {% endif -%}
+ {%- elif m.role == 'system' -%}
+ <|system|>{{ visible_text(m.content) }}
+ {%- endif -%}
+ {%- endfor -%}
+ {%- if add_generation_prompt -%}
+ <|assistant|>{{- '</think>' if (enable_thinking is defined and not enable_thinking) else '<think>' -}}
+ {%- endif -%}
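The chat template above emits tool calls in an XML-ish wire format: the function name, then alternating `<arg_key>`/`<arg_value>` pairs, all wrapped in `<tool_call>…</tool_call>`. As a minimal sketch of how client code might read that format back (this is an illustrative parser, not the official one shipped with any framework):

```python
import re

def parse_tool_call(text: str) -> tuple[str, dict]:
    """Parse the first <tool_call>...</tool_call> span in the format the
    chat template emits. Illustrative sketch; real servers use their own
    parsers and may handle nesting/escaping differently."""
    m = re.search(r"<tool_call>(.*?)</tool_call>", text, re.S)
    if not m:
        raise ValueError("no tool call found")
    body = m.group(1)
    # The function name is everything before the first <arg_key>.
    name = body.split("<arg_key>", 1)[0]
    # Collect alternating key/value pairs into a dict.
    args = dict(
        re.findall(r"<arg_key>(.*?)</arg_key><arg_value>(.*?)</arg_value>", body, re.S)
    )
    return name, args

name, args = parse_tool_call(
    "<tool_call>get_weather<arg_key>city</arg_key>"
    "<arg_value>Beijing</arg_value></tool_call>"
)
# name == "get_weather", args == {"city": "Beijing"}
```

Note that argument values may themselves be JSON (the template serializes non-string values with `tojson`), so a full client would additionally attempt `json.loads` on each value.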
config.json ADDED
@@ -0,0 +1,862 @@
+ {
+ "architectures": [
+ "GlmMoeDsaForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "torch_dtype": "bfloat16",
+ "eos_token_id": [
+ 154820,
+ 154827,
+ 154829
+ ],
+ "ep_size": 1,
+ "first_k_dense_replace": 3,
+ "hidden_act": "silu",
+ "hidden_size": 6144,
+ "index_head_dim": 128,
+ "index_n_heads": 32,
+ "index_topk": 2048,
+ "indexer_rope_interleave": true,
+ "initializer_range": 0.02,
+ "intermediate_size": 12288,
+ "kv_lora_rank": 512,
+ "max_position_embeddings": 202752,
+ "mlp_layer_types": [
+ "dense",
+ "dense",
+ "dense",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse",
+ "sparse"
+ ],
+ "model_type": "glm_moe_dsa",
+ "moe_intermediate_size": 2048,
+ "moe_layer_freq": 1,
+ "n_group": 1,
+ "n_routed_experts": 256,
+ "n_shared_experts": 1,
+ "norm_topk_prob": true,
+ "num_attention_heads": 64,
+ "num_experts_per_tok": 8,
+ "num_hidden_layers": 78,
+ "num_key_value_heads": 64,
+ "num_nextn_predict_layers": 1,
+ "pad_token_id": 154821,
+ "pretraining_tp": 1,
+ "q_lora_rank": 2048,
+ "qk_head_dim": 256,
+ "qk_nope_head_dim": 192,
+ "qk_rope_head_dim": 64,
+ "quantization_config": {
+ "activation_scheme": "dynamic",
+ "fmt": "e4m3",
+ "modules_to_not_convert": [
+ "lm_head",
+ "model.embed_tokens",
+ "model.layers.0.input_layernorm",
+ "model.layers.0.post_attention_layernorm",
+ "model.layers.0.self_attn.indexer.k_norm",
+ "model.layers.0.self_attn.indexer.k_norm.bias",
+ "model.layers.0.self_attn.indexers_proj",
+ "model.layers.0.self_attn.kv_a_layernorm",
+ "model.layers.0.self_attn.q_a_layernorm",
+ "model.layers.1.input_layernorm",
+ "model.layers.1.post_attention_layernorm",
+ "model.layers.1.self_attn.indexer.k_norm",
+ "model.layers.1.self_attn.indexer.k_norm.bias",
+ "model.layers.1.self_attn.indexers_proj",
+ "model.layers.1.self_attn.kv_a_layernorm",
+ "model.layers.1.self_attn.q_a_layernorm",
+ "model.layers.2.input_layernorm",
+ "model.layers.2.post_attention_layernorm",
+ "model.layers.2.self_attn.indexer.k_norm",
+ "model.layers.2.self_attn.indexer.k_norm.bias",
+ "model.layers.2.self_attn.indexers_proj",
+ "model.layers.2.self_attn.kv_a_layernorm",
+ "model.layers.2.self_attn.q_a_layernorm",
+ "model.layers.3.input_layernorm",
+ "model.layers.3.mlp.gate",
+ "model.layers.3.mlp.gate.e_score_correction_bias",
+ "model.layers.3.post_attention_layernorm",
+ "model.layers.3.self_attn.indexer.k_norm",
+ "model.layers.3.self_attn.indexer.k_norm.bias",
+ "model.layers.3.self_attn.indexers_proj",
+ "model.layers.3.self_attn.kv_a_layernorm",
+ "model.layers.3.self_attn.q_a_layernorm",
+ "model.layers.4.input_layernorm",
+ "model.layers.4.mlp.gate",
+ "model.layers.4.mlp.gate.e_score_correction_bias",
+ "model.layers.4.post_attention_layernorm",
+ "model.layers.4.self_attn.indexer.k_norm",
+ "model.layers.4.self_attn.indexer.k_norm.bias",
+ "model.layers.4.self_attn.indexers_proj",
+ "model.layers.4.self_attn.kv_a_layernorm",
+ "model.layers.4.self_attn.q_a_layernorm",
+ "model.layers.5.input_layernorm",
+ "model.layers.5.mlp.gate",
+ "model.layers.5.mlp.gate.e_score_correction_bias",
+ "model.layers.5.post_attention_layernorm",
+ "model.layers.5.self_attn.indexer.k_norm",
+ "model.layers.5.self_attn.indexer.k_norm.bias",
+ "model.layers.5.self_attn.indexers_proj",
+ "model.layers.5.self_attn.kv_a_layernorm",
+ "model.layers.5.self_attn.q_a_layernorm",
+ "model.layers.6.input_layernorm",
+ "model.layers.6.mlp.gate",
+ "model.layers.6.mlp.gate.e_score_correction_bias",
+ "model.layers.6.post_attention_layernorm",
+ "model.layers.6.self_attn.indexer.k_norm",
+ "model.layers.6.self_attn.indexer.k_norm.bias",
+ "model.layers.6.self_attn.indexers_proj",
+ "model.layers.6.self_attn.kv_a_layernorm",
+ "model.layers.6.self_attn.q_a_layernorm",
+ "model.layers.7.input_layernorm",
+ "model.layers.7.mlp.gate",
+ "model.layers.7.mlp.gate.e_score_correction_bias",
+ "model.layers.7.post_attention_layernorm",
+ "model.layers.7.self_attn.indexer.k_norm",
+ "model.layers.7.self_attn.indexer.k_norm.bias",
+ "model.layers.7.self_attn.indexers_proj",
+ "model.layers.7.self_attn.kv_a_layernorm",
+ "model.layers.7.self_attn.q_a_layernorm",
+ "model.layers.8.input_layernorm",
+ "model.layers.8.mlp.gate",
+ "model.layers.8.mlp.gate.e_score_correction_bias",
+ "model.layers.8.post_attention_layernorm",
+ "model.layers.8.self_attn.indexer.k_norm",
+ "model.layers.8.self_attn.indexer.k_norm.bias",
+ "model.layers.8.self_attn.indexers_proj",
+ "model.layers.8.self_attn.kv_a_layernorm",
+ "model.layers.8.self_attn.q_a_layernorm",
+ "model.layers.9.input_layernorm",
+ "model.layers.9.mlp.gate",
+ "model.layers.9.mlp.gate.e_score_correction_bias",
+ "model.layers.9.post_attention_layernorm",
+ "model.layers.9.self_attn.indexer.k_norm",
+ "model.layers.9.self_attn.indexer.k_norm.bias",
+ "model.layers.9.self_attn.indexers_proj",
+ "model.layers.9.self_attn.kv_a_layernorm",
+ "model.layers.9.self_attn.q_a_layernorm",
+ "model.layers.10.input_layernorm",
+ "model.layers.10.mlp.gate",
+ "model.layers.10.mlp.gate.e_score_correction_bias",
+ "model.layers.10.post_attention_layernorm",
+ "model.layers.10.self_attn.indexer.k_norm",
+ "model.layers.10.self_attn.indexer.k_norm.bias",
+ "model.layers.10.self_attn.indexers_proj",
+ "model.layers.10.self_attn.kv_a_layernorm",
+ "model.layers.10.self_attn.q_a_layernorm",
+ "model.layers.11.input_layernorm",
+ "model.layers.11.mlp.gate",
+ "model.layers.11.mlp.gate.e_score_correction_bias",
+ "model.layers.11.post_attention_layernorm",
+ "model.layers.11.self_attn.indexer.k_norm",
+ "model.layers.11.self_attn.indexer.k_norm.bias",
+ "model.layers.11.self_attn.indexers_proj",
+ "model.layers.11.self_attn.kv_a_layernorm",
+ "model.layers.11.self_attn.q_a_layernorm",
+ "model.layers.12.input_layernorm",
+ "model.layers.12.mlp.gate",
+ "model.layers.12.mlp.gate.e_score_correction_bias",
+ "model.layers.12.post_attention_layernorm",
+ "model.layers.12.self_attn.indexer.k_norm",
+ "model.layers.12.self_attn.indexer.k_norm.bias",
+ "model.layers.12.self_attn.indexers_proj",
+ "model.layers.12.self_attn.kv_a_layernorm",
+ "model.layers.12.self_attn.q_a_layernorm",
+ "model.layers.13.input_layernorm",
+ "model.layers.13.mlp.gate",
+ "model.layers.13.mlp.gate.e_score_correction_bias",
+ "model.layers.13.post_attention_layernorm",
+ "model.layers.13.self_attn.indexer.k_norm",
+ "model.layers.13.self_attn.indexer.k_norm.bias",
+ "model.layers.13.self_attn.indexers_proj",
+ "model.layers.13.self_attn.kv_a_layernorm",
+ "model.layers.13.self_attn.q_a_layernorm",
+ "model.layers.14.input_layernorm",
+ "model.layers.14.mlp.gate",
+ "model.layers.14.mlp.gate.e_score_correction_bias",
+ "model.layers.14.post_attention_layernorm",
+ "model.layers.14.self_attn.indexer.k_norm",
+ "model.layers.14.self_attn.indexer.k_norm.bias",
+ "model.layers.14.self_attn.indexers_proj",
+ "model.layers.14.self_attn.kv_a_layernorm",
+ "model.layers.14.self_attn.q_a_layernorm",
+ "model.layers.15.input_layernorm",
+ "model.layers.15.mlp.gate",
+ "model.layers.15.mlp.gate.e_score_correction_bias",
+ "model.layers.15.post_attention_layernorm",
+ "model.layers.15.self_attn.indexer.k_norm",
+ "model.layers.15.self_attn.indexer.k_norm.bias",
+ "model.layers.15.self_attn.indexers_proj",
+ "model.layers.15.self_attn.kv_a_layernorm",
+ "model.layers.15.self_attn.q_a_layernorm",
+ "model.layers.16.input_layernorm",
+ "model.layers.16.mlp.gate",
+ "model.layers.16.mlp.gate.e_score_correction_bias",
+ "model.layers.16.post_attention_layernorm",
+ "model.layers.16.self_attn.indexer.k_norm",
+ "model.layers.16.self_attn.indexer.k_norm.bias",
+ "model.layers.16.self_attn.indexers_proj",
+ "model.layers.16.self_attn.kv_a_layernorm",
+ "model.layers.16.self_attn.q_a_layernorm",
+ "model.layers.17.input_layernorm",
+ "model.layers.17.mlp.gate",
+ "model.layers.17.mlp.gate.e_score_correction_bias",
+ "model.layers.17.post_attention_layernorm",
+ "model.layers.17.self_attn.indexer.k_norm",
+ "model.layers.17.self_attn.indexer.k_norm.bias",
+ "model.layers.17.self_attn.indexers_proj",
+ "model.layers.17.self_attn.kv_a_layernorm",
+ "model.layers.17.self_attn.q_a_layernorm",
+ "model.layers.18.input_layernorm",
+ "model.layers.18.mlp.gate",
+ "model.layers.18.mlp.gate.e_score_correction_bias",
+ "model.layers.18.post_attention_layernorm",
+ "model.layers.18.self_attn.indexer.k_norm",
+ "model.layers.18.self_attn.indexer.k_norm.bias",
+ "model.layers.18.self_attn.indexers_proj",
+ "model.layers.18.self_attn.kv_a_layernorm",
+ "model.layers.18.self_attn.q_a_layernorm",
+ "model.layers.19.input_layernorm",
+ "model.layers.19.mlp.gate",
+ "model.layers.19.mlp.gate.e_score_correction_bias",
+ "model.layers.19.post_attention_layernorm",
+ "model.layers.19.self_attn.indexer.k_norm",
+ "model.layers.19.self_attn.indexer.k_norm.bias",
+ "model.layers.19.self_attn.indexers_proj",
+ "model.layers.19.self_attn.kv_a_layernorm",
+ "model.layers.19.self_attn.q_a_layernorm",
+ "model.layers.20.input_layernorm",
+ "model.layers.20.mlp.gate",
+ "model.layers.20.mlp.gate.e_score_correction_bias",
+ "model.layers.20.post_attention_layernorm",
+ "model.layers.20.self_attn.indexer.k_norm",
+ "model.layers.20.self_attn.indexer.k_norm.bias",
+ "model.layers.20.self_attn.indexers_proj",
+ "model.layers.20.self_attn.kv_a_layernorm",
+ "model.layers.20.self_attn.q_a_layernorm",
+ "model.layers.21.input_layernorm",
+ "model.layers.21.mlp.gate",
+ "model.layers.21.mlp.gate.e_score_correction_bias",
+ "model.layers.21.post_attention_layernorm",
+ "model.layers.21.self_attn.indexer.k_norm",
+ "model.layers.21.self_attn.indexer.k_norm.bias",
+ "model.layers.21.self_attn.indexers_proj",
+ "model.layers.21.self_attn.kv_a_layernorm",
+ "model.layers.21.self_attn.q_a_layernorm",
+ "model.layers.22.input_layernorm",
+ "model.layers.22.mlp.gate",
+ "model.layers.22.mlp.gate.e_score_correction_bias",
+ "model.layers.22.post_attention_layernorm",
+ "model.layers.22.self_attn.indexer.k_norm",
+ "model.layers.22.self_attn.indexer.k_norm.bias",
+ "model.layers.22.self_attn.indexers_proj",
+ "model.layers.22.self_attn.kv_a_layernorm",
+ "model.layers.22.self_attn.q_a_layernorm",
+ "model.layers.23.input_layernorm",
+ "model.layers.23.mlp.gate",
+ "model.layers.23.mlp.gate.e_score_correction_bias",
+ "model.layers.23.post_attention_layernorm",
+ "model.layers.23.self_attn.indexer.k_norm",
+ "model.layers.23.self_attn.indexer.k_norm.bias",
+ "model.layers.23.self_attn.indexers_proj",
+ "model.layers.23.self_attn.kv_a_layernorm",
+ "model.layers.23.self_attn.q_a_layernorm",
+ "model.layers.24.input_layernorm",
+ "model.layers.24.mlp.gate",
341
+ "model.layers.24.mlp.gate.e_score_correction_bias",
342
+ "model.layers.24.post_attention_layernorm",
343
+ "model.layers.24.self_attn.indexer.k_norm",
344
+ "model.layers.24.self_attn.indexer.k_norm.bias",
345
+ "model.layers.24.self_attn.indexers_proj",
346
+ "model.layers.24.self_attn.kv_a_layernorm",
347
+ "model.layers.24.self_attn.q_a_layernorm",
348
+ "model.layers.25.input_layernorm",
349
+ "model.layers.25.mlp.gate",
350
+ "model.layers.25.mlp.gate.e_score_correction_bias",
351
+ "model.layers.25.post_attention_layernorm",
352
+ "model.layers.25.self_attn.indexer.k_norm",
353
+ "model.layers.25.self_attn.indexer.k_norm.bias",
354
+ "model.layers.25.self_attn.indexers_proj",
355
+ "model.layers.25.self_attn.kv_a_layernorm",
356
+ "model.layers.25.self_attn.q_a_layernorm",
357
+ "model.layers.26.input_layernorm",
358
+ "model.layers.26.mlp.gate",
359
+ "model.layers.26.mlp.gate.e_score_correction_bias",
360
+ "model.layers.26.post_attention_layernorm",
361
+ "model.layers.26.self_attn.indexer.k_norm",
362
+ "model.layers.26.self_attn.indexer.k_norm.bias",
363
+ "model.layers.26.self_attn.indexers_proj",
364
+ "model.layers.26.self_attn.kv_a_layernorm",
365
+ "model.layers.26.self_attn.q_a_layernorm",
366
+ "model.layers.27.input_layernorm",
367
+ "model.layers.27.mlp.gate",
368
+ "model.layers.27.mlp.gate.e_score_correction_bias",
369
+ "model.layers.27.post_attention_layernorm",
370
+ "model.layers.27.self_attn.indexer.k_norm",
371
+ "model.layers.27.self_attn.indexer.k_norm.bias",
372
+ "model.layers.27.self_attn.indexers_proj",
373
+ "model.layers.27.self_attn.kv_a_layernorm",
374
+ "model.layers.27.self_attn.q_a_layernorm",
375
+ "model.layers.28.input_layernorm",
376
+ "model.layers.28.mlp.gate",
377
+ "model.layers.28.mlp.gate.e_score_correction_bias",
378
+ "model.layers.28.post_attention_layernorm",
379
+ "model.layers.28.self_attn.indexer.k_norm",
380
+ "model.layers.28.self_attn.indexer.k_norm.bias",
381
+ "model.layers.28.self_attn.indexers_proj",
382
+ "model.layers.28.self_attn.kv_a_layernorm",
383
+ "model.layers.28.self_attn.q_a_layernorm",
384
+ "model.layers.29.input_layernorm",
385
+ "model.layers.29.mlp.gate",
386
+ "model.layers.29.mlp.gate.e_score_correction_bias",
387
+ "model.layers.29.post_attention_layernorm",
388
+ "model.layers.29.self_attn.indexer.k_norm",
389
+ "model.layers.29.self_attn.indexer.k_norm.bias",
390
+ "model.layers.29.self_attn.indexers_proj",
391
+ "model.layers.29.self_attn.kv_a_layernorm",
392
+ "model.layers.29.self_attn.q_a_layernorm",
393
+ "model.layers.30.input_layernorm",
394
+ "model.layers.30.mlp.gate",
395
+ "model.layers.30.mlp.gate.e_score_correction_bias",
396
+ "model.layers.30.post_attention_layernorm",
397
+ "model.layers.30.self_attn.indexer.k_norm",
398
+ "model.layers.30.self_attn.indexer.k_norm.bias",
399
+ "model.layers.30.self_attn.indexers_proj",
400
+ "model.layers.30.self_attn.kv_a_layernorm",
401
+ "model.layers.30.self_attn.q_a_layernorm",
402
+ "model.layers.31.input_layernorm",
403
+ "model.layers.31.mlp.gate",
404
+ "model.layers.31.mlp.gate.e_score_correction_bias",
405
+ "model.layers.31.post_attention_layernorm",
406
+ "model.layers.31.self_attn.indexer.k_norm",
407
+ "model.layers.31.self_attn.indexer.k_norm.bias",
408
+ "model.layers.31.self_attn.indexers_proj",
409
+ "model.layers.31.self_attn.kv_a_layernorm",
410
+ "model.layers.31.self_attn.q_a_layernorm",
411
+ "model.layers.32.input_layernorm",
412
+ "model.layers.32.mlp.gate",
413
+ "model.layers.32.mlp.gate.e_score_correction_bias",
414
+ "model.layers.32.post_attention_layernorm",
415
+ "model.layers.32.self_attn.indexer.k_norm",
416
+ "model.layers.32.self_attn.indexer.k_norm.bias",
417
+ "model.layers.32.self_attn.indexers_proj",
418
+ "model.layers.32.self_attn.kv_a_layernorm",
419
+ "model.layers.32.self_attn.q_a_layernorm",
420
+ "model.layers.33.input_layernorm",
421
+ "model.layers.33.mlp.gate",
422
+ "model.layers.33.mlp.gate.e_score_correction_bias",
423
+ "model.layers.33.post_attention_layernorm",
424
+ "model.layers.33.self_attn.indexer.k_norm",
425
+ "model.layers.33.self_attn.indexer.k_norm.bias",
426
+ "model.layers.33.self_attn.indexers_proj",
427
+ "model.layers.33.self_attn.kv_a_layernorm",
428
+ "model.layers.33.self_attn.q_a_layernorm",
429
+ "model.layers.34.input_layernorm",
430
+ "model.layers.34.mlp.gate",
431
+ "model.layers.34.mlp.gate.e_score_correction_bias",
432
+ "model.layers.34.post_attention_layernorm",
433
+ "model.layers.34.self_attn.indexer.k_norm",
434
+ "model.layers.34.self_attn.indexer.k_norm.bias",
435
+ "model.layers.34.self_attn.indexers_proj",
436
+ "model.layers.34.self_attn.kv_a_layernorm",
437
+ "model.layers.34.self_attn.q_a_layernorm",
438
+ "model.layers.35.input_layernorm",
439
+ "model.layers.35.mlp.gate",
440
+ "model.layers.35.mlp.gate.e_score_correction_bias",
441
+ "model.layers.35.post_attention_layernorm",
442
+ "model.layers.35.self_attn.indexer.k_norm",
443
+ "model.layers.35.self_attn.indexer.k_norm.bias",
444
+ "model.layers.35.self_attn.indexers_proj",
445
+ "model.layers.35.self_attn.kv_a_layernorm",
446
+ "model.layers.35.self_attn.q_a_layernorm",
447
+ "model.layers.36.input_layernorm",
448
+ "model.layers.36.mlp.gate",
449
+ "model.layers.36.mlp.gate.e_score_correction_bias",
450
+ "model.layers.36.post_attention_layernorm",
451
+ "model.layers.36.self_attn.indexer.k_norm",
452
+ "model.layers.36.self_attn.indexer.k_norm.bias",
453
+ "model.layers.36.self_attn.indexers_proj",
454
+ "model.layers.36.self_attn.kv_a_layernorm",
455
+ "model.layers.36.self_attn.q_a_layernorm",
456
+ "model.layers.37.input_layernorm",
457
+ "model.layers.37.mlp.gate",
458
+ "model.layers.37.mlp.gate.e_score_correction_bias",
459
+ "model.layers.37.post_attention_layernorm",
460
+ "model.layers.37.self_attn.indexer.k_norm",
461
+ "model.layers.37.self_attn.indexer.k_norm.bias",
462
+ "model.layers.37.self_attn.indexers_proj",
463
+ "model.layers.37.self_attn.kv_a_layernorm",
464
+ "model.layers.37.self_attn.q_a_layernorm",
465
+ "model.layers.38.input_layernorm",
466
+ "model.layers.38.mlp.gate",
467
+ "model.layers.38.mlp.gate.e_score_correction_bias",
468
+ "model.layers.38.post_attention_layernorm",
469
+ "model.layers.38.self_attn.indexer.k_norm",
470
+ "model.layers.38.self_attn.indexer.k_norm.bias",
471
+ "model.layers.38.self_attn.indexers_proj",
472
+ "model.layers.38.self_attn.kv_a_layernorm",
473
+ "model.layers.38.self_attn.q_a_layernorm",
474
+ "model.layers.39.input_layernorm",
475
+ "model.layers.39.mlp.gate",
476
+ "model.layers.39.mlp.gate.e_score_correction_bias",
477
+ "model.layers.39.post_attention_layernorm",
478
+ "model.layers.39.self_attn.indexer.k_norm",
479
+ "model.layers.39.self_attn.indexer.k_norm.bias",
480
+ "model.layers.39.self_attn.indexers_proj",
481
+ "model.layers.39.self_attn.kv_a_layernorm",
482
+ "model.layers.39.self_attn.q_a_layernorm",
483
+ "model.layers.40.input_layernorm",
484
+ "model.layers.40.mlp.gate",
485
+ "model.layers.40.mlp.gate.e_score_correction_bias",
486
+ "model.layers.40.post_attention_layernorm",
487
+ "model.layers.40.self_attn.indexer.k_norm",
488
+ "model.layers.40.self_attn.indexer.k_norm.bias",
489
+ "model.layers.40.self_attn.indexers_proj",
490
+ "model.layers.40.self_attn.kv_a_layernorm",
491
+ "model.layers.40.self_attn.q_a_layernorm",
492
+ "model.layers.41.input_layernorm",
493
+ "model.layers.41.mlp.gate",
494
+ "model.layers.41.mlp.gate.e_score_correction_bias",
495
+ "model.layers.41.post_attention_layernorm",
496
+ "model.layers.41.self_attn.indexer.k_norm",
497
+ "model.layers.41.self_attn.indexer.k_norm.bias",
498
+ "model.layers.41.self_attn.indexers_proj",
499
+ "model.layers.41.self_attn.kv_a_layernorm",
500
+ "model.layers.41.self_attn.q_a_layernorm",
501
+ "model.layers.42.input_layernorm",
502
+ "model.layers.42.mlp.gate",
503
+ "model.layers.42.mlp.gate.e_score_correction_bias",
504
+ "model.layers.42.post_attention_layernorm",
505
+ "model.layers.42.self_attn.indexer.k_norm",
506
+ "model.layers.42.self_attn.indexer.k_norm.bias",
507
+ "model.layers.42.self_attn.indexers_proj",
508
+ "model.layers.42.self_attn.kv_a_layernorm",
509
+ "model.layers.42.self_attn.q_a_layernorm",
510
+ "model.layers.43.input_layernorm",
511
+ "model.layers.43.mlp.gate",
512
+ "model.layers.43.mlp.gate.e_score_correction_bias",
513
+ "model.layers.43.post_attention_layernorm",
514
+ "model.layers.43.self_attn.indexer.k_norm",
515
+ "model.layers.43.self_attn.indexer.k_norm.bias",
516
+ "model.layers.43.self_attn.indexers_proj",
517
+ "model.layers.43.self_attn.kv_a_layernorm",
518
+ "model.layers.43.self_attn.q_a_layernorm",
519
+ "model.layers.44.input_layernorm",
520
+ "model.layers.44.mlp.gate",
521
+ "model.layers.44.mlp.gate.e_score_correction_bias",
522
+ "model.layers.44.post_attention_layernorm",
523
+ "model.layers.44.self_attn.indexer.k_norm",
524
+ "model.layers.44.self_attn.indexer.k_norm.bias",
525
+ "model.layers.44.self_attn.indexers_proj",
526
+ "model.layers.44.self_attn.kv_a_layernorm",
527
+ "model.layers.44.self_attn.q_a_layernorm",
528
+ "model.layers.45.input_layernorm",
529
+ "model.layers.45.mlp.gate",
530
+ "model.layers.45.mlp.gate.e_score_correction_bias",
531
+ "model.layers.45.post_attention_layernorm",
532
+ "model.layers.45.self_attn.indexer.k_norm",
533
+ "model.layers.45.self_attn.indexer.k_norm.bias",
534
+ "model.layers.45.self_attn.indexers_proj",
535
+ "model.layers.45.self_attn.kv_a_layernorm",
536
+ "model.layers.45.self_attn.q_a_layernorm",
537
+ "model.layers.46.input_layernorm",
538
+ "model.layers.46.mlp.gate",
539
+ "model.layers.46.mlp.gate.e_score_correction_bias",
540
+ "model.layers.46.post_attention_layernorm",
541
+ "model.layers.46.self_attn.indexer.k_norm",
542
+ "model.layers.46.self_attn.indexer.k_norm.bias",
543
+ "model.layers.46.self_attn.indexers_proj",
544
+ "model.layers.46.self_attn.kv_a_layernorm",
545
+ "model.layers.46.self_attn.q_a_layernorm",
546
+ "model.layers.47.input_layernorm",
547
+ "model.layers.47.mlp.gate",
548
+ "model.layers.47.mlp.gate.e_score_correction_bias",
549
+ "model.layers.47.post_attention_layernorm",
550
+ "model.layers.47.self_attn.indexer.k_norm",
551
+ "model.layers.47.self_attn.indexer.k_norm.bias",
552
+ "model.layers.47.self_attn.indexers_proj",
553
+ "model.layers.47.self_attn.kv_a_layernorm",
554
+ "model.layers.47.self_attn.q_a_layernorm",
555
+ "model.layers.48.input_layernorm",
556
+ "model.layers.48.mlp.gate",
557
+ "model.layers.48.mlp.gate.e_score_correction_bias",
558
+ "model.layers.48.post_attention_layernorm",
559
+ "model.layers.48.self_attn.indexer.k_norm",
560
+ "model.layers.48.self_attn.indexer.k_norm.bias",
561
+ "model.layers.48.self_attn.indexers_proj",
562
+ "model.layers.48.self_attn.kv_a_layernorm",
563
+ "model.layers.48.self_attn.q_a_layernorm",
564
+ "model.layers.49.input_layernorm",
565
+ "model.layers.49.mlp.gate",
566
+ "model.layers.49.mlp.gate.e_score_correction_bias",
567
+ "model.layers.49.post_attention_layernorm",
568
+ "model.layers.49.self_attn.indexer.k_norm",
569
+ "model.layers.49.self_attn.indexer.k_norm.bias",
570
+ "model.layers.49.self_attn.indexers_proj",
571
+ "model.layers.49.self_attn.kv_a_layernorm",
572
+ "model.layers.49.self_attn.q_a_layernorm",
573
+ "model.layers.50.input_layernorm",
574
+ "model.layers.50.mlp.gate",
575
+ "model.layers.50.mlp.gate.e_score_correction_bias",
576
+ "model.layers.50.post_attention_layernorm",
577
+ "model.layers.50.self_attn.indexer.k_norm",
578
+ "model.layers.50.self_attn.indexer.k_norm.bias",
579
+ "model.layers.50.self_attn.indexers_proj",
580
+ "model.layers.50.self_attn.kv_a_layernorm",
581
+ "model.layers.50.self_attn.q_a_layernorm",
582
+ "model.layers.51.input_layernorm",
583
+ "model.layers.51.mlp.gate",
584
+ "model.layers.51.mlp.gate.e_score_correction_bias",
585
+ "model.layers.51.post_attention_layernorm",
586
+ "model.layers.51.self_attn.indexer.k_norm",
587
+ "model.layers.51.self_attn.indexer.k_norm.bias",
588
+ "model.layers.51.self_attn.indexers_proj",
589
+ "model.layers.51.self_attn.kv_a_layernorm",
590
+ "model.layers.51.self_attn.q_a_layernorm",
591
+ "model.layers.52.input_layernorm",
592
+ "model.layers.52.mlp.gate",
593
+ "model.layers.52.mlp.gate.e_score_correction_bias",
594
+ "model.layers.52.post_attention_layernorm",
595
+ "model.layers.52.self_attn.indexer.k_norm",
596
+ "model.layers.52.self_attn.indexer.k_norm.bias",
597
+ "model.layers.52.self_attn.indexers_proj",
598
+ "model.layers.52.self_attn.kv_a_layernorm",
599
+ "model.layers.52.self_attn.q_a_layernorm",
600
+ "model.layers.53.input_layernorm",
601
+ "model.layers.53.mlp.gate",
602
+ "model.layers.53.mlp.gate.e_score_correction_bias",
603
+ "model.layers.53.post_attention_layernorm",
604
+ "model.layers.53.self_attn.indexer.k_norm",
605
+ "model.layers.53.self_attn.indexer.k_norm.bias",
606
+ "model.layers.53.self_attn.indexers_proj",
607
+ "model.layers.53.self_attn.kv_a_layernorm",
608
+ "model.layers.53.self_attn.q_a_layernorm",
609
+ "model.layers.54.input_layernorm",
610
+ "model.layers.54.mlp.gate",
611
+ "model.layers.54.mlp.gate.e_score_correction_bias",
612
+ "model.layers.54.post_attention_layernorm",
613
+ "model.layers.54.self_attn.indexer.k_norm",
614
+ "model.layers.54.self_attn.indexer.k_norm.bias",
615
+ "model.layers.54.self_attn.indexers_proj",
616
+ "model.layers.54.self_attn.kv_a_layernorm",
617
+ "model.layers.54.self_attn.q_a_layernorm",
618
+ "model.layers.55.input_layernorm",
619
+ "model.layers.55.mlp.gate",
620
+ "model.layers.55.mlp.gate.e_score_correction_bias",
621
+ "model.layers.55.post_attention_layernorm",
622
+ "model.layers.55.self_attn.indexer.k_norm",
623
+ "model.layers.55.self_attn.indexer.k_norm.bias",
624
+ "model.layers.55.self_attn.indexers_proj",
625
+ "model.layers.55.self_attn.kv_a_layernorm",
626
+ "model.layers.55.self_attn.q_a_layernorm",
627
+ "model.layers.56.input_layernorm",
628
+ "model.layers.56.mlp.gate",
629
+ "model.layers.56.mlp.gate.e_score_correction_bias",
630
+ "model.layers.56.post_attention_layernorm",
631
+ "model.layers.56.self_attn.indexer.k_norm",
632
+ "model.layers.56.self_attn.indexer.k_norm.bias",
633
+ "model.layers.56.self_attn.indexers_proj",
634
+ "model.layers.56.self_attn.kv_a_layernorm",
635
+ "model.layers.56.self_attn.q_a_layernorm",
636
+ "model.layers.57.input_layernorm",
637
+ "model.layers.57.mlp.gate",
638
+ "model.layers.57.mlp.gate.e_score_correction_bias",
639
+ "model.layers.57.post_attention_layernorm",
640
+ "model.layers.57.self_attn.indexer.k_norm",
641
+ "model.layers.57.self_attn.indexer.k_norm.bias",
642
+ "model.layers.57.self_attn.indexers_proj",
643
+ "model.layers.57.self_attn.kv_a_layernorm",
644
+ "model.layers.57.self_attn.q_a_layernorm",
645
+ "model.layers.58.input_layernorm",
646
+ "model.layers.58.mlp.gate",
647
+ "model.layers.58.mlp.gate.e_score_correction_bias",
648
+ "model.layers.58.post_attention_layernorm",
649
+ "model.layers.58.self_attn.indexer.k_norm",
650
+ "model.layers.58.self_attn.indexer.k_norm.bias",
651
+ "model.layers.58.self_attn.indexers_proj",
652
+ "model.layers.58.self_attn.kv_a_layernorm",
653
+ "model.layers.58.self_attn.q_a_layernorm",
654
+ "model.layers.59.input_layernorm",
655
+ "model.layers.59.mlp.gate",
656
+ "model.layers.59.mlp.gate.e_score_correction_bias",
657
+ "model.layers.59.post_attention_layernorm",
658
+ "model.layers.59.self_attn.indexer.k_norm",
659
+ "model.layers.59.self_attn.indexer.k_norm.bias",
660
+ "model.layers.59.self_attn.indexers_proj",
661
+ "model.layers.59.self_attn.kv_a_layernorm",
662
+ "model.layers.59.self_attn.q_a_layernorm",
663
+ "model.layers.60.input_layernorm",
664
+ "model.layers.60.mlp.gate",
665
+ "model.layers.60.mlp.gate.e_score_correction_bias",
666
+ "model.layers.60.post_attention_layernorm",
667
+ "model.layers.60.self_attn.indexer.k_norm",
668
+ "model.layers.60.self_attn.indexer.k_norm.bias",
669
+ "model.layers.60.self_attn.indexers_proj",
670
+ "model.layers.60.self_attn.kv_a_layernorm",
671
+ "model.layers.60.self_attn.q_a_layernorm",
672
+ "model.layers.61.input_layernorm",
673
+ "model.layers.61.mlp.gate",
674
+ "model.layers.61.mlp.gate.e_score_correction_bias",
675
+ "model.layers.61.post_attention_layernorm",
676
+ "model.layers.61.self_attn.indexer.k_norm",
677
+ "model.layers.61.self_attn.indexer.k_norm.bias",
678
+ "model.layers.61.self_attn.indexers_proj",
679
+ "model.layers.61.self_attn.kv_a_layernorm",
680
+ "model.layers.61.self_attn.q_a_layernorm",
681
+ "model.layers.62.input_layernorm",
682
+ "model.layers.62.mlp.gate",
683
+ "model.layers.62.mlp.gate.e_score_correction_bias",
684
+ "model.layers.62.post_attention_layernorm",
685
+ "model.layers.62.self_attn.indexer.k_norm",
686
+ "model.layers.62.self_attn.indexer.k_norm.bias",
687
+ "model.layers.62.self_attn.indexers_proj",
688
+ "model.layers.62.self_attn.kv_a_layernorm",
689
+ "model.layers.62.self_attn.q_a_layernorm",
690
+ "model.layers.63.input_layernorm",
691
+ "model.layers.63.mlp.gate",
692
+ "model.layers.63.mlp.gate.e_score_correction_bias",
693
+ "model.layers.63.post_attention_layernorm",
694
+ "model.layers.63.self_attn.indexer.k_norm",
695
+ "model.layers.63.self_attn.indexer.k_norm.bias",
696
+ "model.layers.63.self_attn.indexers_proj",
697
+ "model.layers.63.self_attn.kv_a_layernorm",
698
+ "model.layers.63.self_attn.q_a_layernorm",
699
+ "model.layers.64.input_layernorm",
700
+ "model.layers.64.mlp.gate",
701
+ "model.layers.64.mlp.gate.e_score_correction_bias",
702
+ "model.layers.64.post_attention_layernorm",
703
+ "model.layers.64.self_attn.indexer.k_norm",
704
+ "model.layers.64.self_attn.indexer.k_norm.bias",
705
+ "model.layers.64.self_attn.indexers_proj",
706
+ "model.layers.64.self_attn.kv_a_layernorm",
707
+ "model.layers.64.self_attn.q_a_layernorm",
708
+ "model.layers.65.input_layernorm",
709
+ "model.layers.65.mlp.gate",
710
+ "model.layers.65.mlp.gate.e_score_correction_bias",
711
+ "model.layers.65.post_attention_layernorm",
712
+ "model.layers.65.self_attn.indexer.k_norm",
713
+ "model.layers.65.self_attn.indexer.k_norm.bias",
714
+ "model.layers.65.self_attn.indexers_proj",
715
+ "model.layers.65.self_attn.kv_a_layernorm",
716
+ "model.layers.65.self_attn.q_a_layernorm",
717
+ "model.layers.66.input_layernorm",
718
+ "model.layers.66.mlp.gate",
719
+ "model.layers.66.mlp.gate.e_score_correction_bias",
720
+ "model.layers.66.post_attention_layernorm",
721
+ "model.layers.66.self_attn.indexer.k_norm",
722
+ "model.layers.66.self_attn.indexer.k_norm.bias",
723
+ "model.layers.66.self_attn.indexers_proj",
724
+ "model.layers.66.self_attn.kv_a_layernorm",
725
+ "model.layers.66.self_attn.q_a_layernorm",
726
+ "model.layers.67.input_layernorm",
727
+ "model.layers.67.mlp.gate",
728
+ "model.layers.67.mlp.gate.e_score_correction_bias",
729
+ "model.layers.67.post_attention_layernorm",
730
+ "model.layers.67.self_attn.indexer.k_norm",
731
+ "model.layers.67.self_attn.indexer.k_norm.bias",
732
+ "model.layers.67.self_attn.indexers_proj",
733
+ "model.layers.67.self_attn.kv_a_layernorm",
734
+ "model.layers.67.self_attn.q_a_layernorm",
735
+ "model.layers.68.input_layernorm",
736
+ "model.layers.68.mlp.gate",
737
+ "model.layers.68.mlp.gate.e_score_correction_bias",
738
+ "model.layers.68.post_attention_layernorm",
739
+ "model.layers.68.self_attn.indexer.k_norm",
740
+ "model.layers.68.self_attn.indexer.k_norm.bias",
741
+ "model.layers.68.self_attn.indexers_proj",
742
+ "model.layers.68.self_attn.kv_a_layernorm",
743
+ "model.layers.68.self_attn.q_a_layernorm",
744
+ "model.layers.69.input_layernorm",
745
+ "model.layers.69.mlp.gate",
746
+ "model.layers.69.mlp.gate.e_score_correction_bias",
747
+ "model.layers.69.post_attention_layernorm",
748
+ "model.layers.69.self_attn.indexer.k_norm",
749
+ "model.layers.69.self_attn.indexer.k_norm.bias",
750
+ "model.layers.69.self_attn.indexers_proj",
751
+ "model.layers.69.self_attn.kv_a_layernorm",
752
+ "model.layers.69.self_attn.q_a_layernorm",
753
+ "model.layers.70.input_layernorm",
754
+ "model.layers.70.mlp.gate",
755
+ "model.layers.70.mlp.gate.e_score_correction_bias",
756
+ "model.layers.70.post_attention_layernorm",
757
+ "model.layers.70.self_attn.indexer.k_norm",
758
+ "model.layers.70.self_attn.indexer.k_norm.bias",
759
+ "model.layers.70.self_attn.indexers_proj",
760
+ "model.layers.70.self_attn.kv_a_layernorm",
761
+ "model.layers.70.self_attn.q_a_layernorm",
762
+ "model.layers.71.input_layernorm",
763
+ "model.layers.71.mlp.gate",
764
+ "model.layers.71.mlp.gate.e_score_correction_bias",
765
+ "model.layers.71.post_attention_layernorm",
766
+ "model.layers.71.self_attn.indexer.k_norm",
767
+ "model.layers.71.self_attn.indexer.k_norm.bias",
768
+ "model.layers.71.self_attn.indexers_proj",
769
+ "model.layers.71.self_attn.kv_a_layernorm",
770
+ "model.layers.71.self_attn.q_a_layernorm",
771
+ "model.layers.72.input_layernorm",
772
+ "model.layers.72.mlp.gate",
773
+ "model.layers.72.mlp.gate.e_score_correction_bias",
774
+ "model.layers.72.post_attention_layernorm",
775
+ "model.layers.72.self_attn.indexer.k_norm",
776
+ "model.layers.72.self_attn.indexer.k_norm.bias",
777
+ "model.layers.72.self_attn.indexers_proj",
778
+ "model.layers.72.self_attn.kv_a_layernorm",
779
+ "model.layers.72.self_attn.q_a_layernorm",
780
+ "model.layers.73.input_layernorm",
781
+ "model.layers.73.mlp.gate",
782
+ "model.layers.73.mlp.gate.e_score_correction_bias",
783
+ "model.layers.73.post_attention_layernorm",
784
+ "model.layers.73.self_attn.indexer.k_norm",
785
+ "model.layers.73.self_attn.indexer.k_norm.bias",
786
+ "model.layers.73.self_attn.indexers_proj",
787
+ "model.layers.73.self_attn.kv_a_layernorm",
788
+ "model.layers.73.self_attn.q_a_layernorm",
789
+ "model.layers.74.input_layernorm",
790
+ "model.layers.74.mlp.gate",
791
+ "model.layers.74.mlp.gate.e_score_correction_bias",
792
+ "model.layers.74.post_attention_layernorm",
793
+ "model.layers.74.self_attn.indexer.k_norm",
794
+ "model.layers.74.self_attn.indexer.k_norm.bias",
795
+ "model.layers.74.self_attn.indexers_proj",
796
+ "model.layers.74.self_attn.kv_a_layernorm",
797
+ "model.layers.74.self_attn.q_a_layernorm",
798
+ "model.layers.75.input_layernorm",
799
+ "model.layers.75.mlp.gate",
800
+ "model.layers.75.mlp.gate.e_score_correction_bias",
801
+ "model.layers.75.post_attention_layernorm",
802
+ "model.layers.75.self_attn.indexer.k_norm",
803
+ "model.layers.75.self_attn.indexer.k_norm.bias",
804
+ "model.layers.75.self_attn.indexers_proj",
805
+ "model.layers.75.self_attn.kv_a_layernorm",
806
+ "model.layers.75.self_attn.q_a_layernorm",
807
+ "model.layers.76.input_layernorm",
808
+ "model.layers.76.mlp.gate",
809
+ "model.layers.76.mlp.gate.e_score_correction_bias",
810
+ "model.layers.76.post_attention_layernorm",
811
+ "model.layers.76.self_attn.indexer.k_norm",
812
+ "model.layers.76.self_attn.indexer.k_norm.bias",
813
+ "model.layers.76.self_attn.indexers_proj",
814
+ "model.layers.76.self_attn.kv_a_layernorm",
815
+ "model.layers.76.self_attn.q_a_layernorm",
816
+ "model.layers.77.input_layernorm",
817
+ "model.layers.77.mlp.gate",
818
+ "model.layers.77.mlp.gate.e_score_correction_bias",
819
+ "model.layers.77.post_attention_layernorm",
820
+ "model.layers.77.self_attn.indexer.k_norm",
821
+ "model.layers.77.self_attn.indexer.k_norm.bias",
822
+ "model.layers.77.self_attn.indexers_proj",
823
+ "model.layers.77.self_attn.kv_a_layernorm",
824
+ "model.layers.77.self_attn.q_a_layernorm",
825
+ "model.layers.78.eh_proj",
826
+ "model.layers.78.enorm",
827
+ "model.layers.78.hnorm",
828
+ "model.layers.78.input_layernorm",
829
+ "model.layers.78.mlp.gate",
830
+ "model.layers.78.mlp.gate.e_score_correction_bias",
831
+ "model.layers.78.post_attention_layernorm",
832
+ "model.layers.78.self_attn.indexer.k_norm",
833
+ "model.layers.78.self_attn.indexer.k_norm.bias",
834
+ "model.layers.78.self_attn.indexers_proj",
835
+ "model.layers.78.self_attn.kv_a_layernorm",
836
+ "model.layers.78.self_attn.q_a_layernorm",
837
+ "model.layers.78.shared_head.norm",
838
+ "model.norm"
839
+ ],
840
+ "quant_method": "fp8",
841
+ "weight_block_size": [
842
+ 128,
843
+ 128
844
+ ]
845
+ },
846
+ "rms_norm_eps": 1e-05,
847
+ "rope_interleave": true,
848
+ "rope_parameters": {
849
+ "rope_theta": 1000000,
850
+ "rope_type": "default"
851
+ },
852
+ "routed_scaling_factor": 2.5,
853
+ "scoring_func": "sigmoid",
854
+ "tie_word_embeddings": false,
855
+ "topk_group": 1,
856
+ "topk_method": "noaux_tc",
857
+ "transformers_version": "5.6.0.dev0",
858
+ "unsloth_fixed": true,
859
+ "use_cache": true,
860
+ "v_head_dim": 256,
861
+ "vocab_size": 154880
862
+ }
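The quantization block above pairs an `ignore` list (norms, MoE gates, and indexer modules kept in higher precision) with `"quant_method": "fp8"` and 128x128 weight blocks for everything else. A minimal sketch of how a loader might consult this config, using a hypothetical `is_quantized` helper and a hand-written excerpt of the JSON rather than the full file:

```python
import json

# Hand-written excerpt of the quantization_config above (illustrative only;
# the real "ignore" list names every excluded module in the checkpoint).
config_excerpt = json.loads("""
{
  "quantization_config": {
    "ignore": [
      "model.layers.15.input_layernorm",
      "model.layers.15.mlp.gate",
      "model.norm"
    ],
    "quant_method": "fp8",
    "weight_block_size": [128, 128]
  }
}
""")

def is_quantized(module_name: str, quant_cfg: dict) -> bool:
    """True if a module's weights are stored as FP8, i.e. not on the ignore list."""
    return module_name not in quant_cfg["ignore"]

qc = config_excerpt["quantization_config"]
print(is_quantized("model.layers.15.mlp.gate", qc))  # False: gates stay high precision
print(is_quantized("model.layers.15.mlp.experts.0.gate_proj", qc))  # True: FP8 blocks
```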
generation_config.json ADDED
@@ -0,0 +1,12 @@
+ {
+ "_from_model_config": true,
+ "eos_token_id": [
+ 154820,
+ 154827,
+ 154829
+ ],
+ "pad_token_id": 154820,
+ "temperature": 1.0,
+ "top_p": 0.95,
+ "transformers_version": "5.4.0"
+ }
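Note that `eos_token_id` in the generation config above is a list, so decoding should stop on any of the three ids, while `pad_token_id` reuses the first of them. A small sketch of that stopping rule (the `should_stop` helper is illustrative, not part of the repo):

```python
import json

# The generation_config.json contents shown above (transformers_version omitted).
gen_cfg = json.loads("""
{
  "eos_token_id": [154820, 154827, 154829],
  "pad_token_id": 154820,
  "temperature": 1.0,
  "top_p": 0.95
}
""")

def should_stop(token_id: int, cfg: dict) -> bool:
    """Stop generation when the sampled token matches any configured EOS id."""
    eos = cfg["eos_token_id"]
    eos_ids = eos if isinstance(eos, list) else [eos]
    return token_id in eos_ids

print(should_stop(154827, gen_cfg))  # True: one of the EOS ids
print(should_stop(42, gen_cfg))      # False: keep generating
```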
model-00001-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:205976fdd1cddd00be3bc1e20755a163054e225af995a72f6e4149c018204be4
+ size 5363940952
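Each of these three-line files is a Git LFS pointer, not the shard itself: the `oid` is the sha256 of the real blob and `size` is its byte count, which is enough to verify a download. A sketch of that check, demonstrated on synthetic bytes since the real shards are ~5.4 GB each:

```python
import hashlib

def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def verify_blob(blob: bytes, pointer: dict) -> bool:
    """Check a downloaded blob against the pointer's sha256 oid and size."""
    oid = pointer["oid"].removeprefix("sha256:")
    return (hashlib.sha256(blob).hexdigest() == oid
            and len(blob) == int(pointer["size"]))

# Synthetic stand-in for a shard, with a pointer built from it.
blob = b"example shard contents"
pointer_text = (
    "version https://git-lfs.github.com/spec/v1\n"
    f"oid sha256:{hashlib.sha256(blob).hexdigest()}\n"
    f"size {len(blob)}\n"
)
print(verify_blob(blob, parse_lfs_pointer(pointer_text)))  # True
```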
model-00002-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d909a059ea5207282552fe5f5e4a4d411da68cf4644057ba1cc7a329ebfb2b9a
+ size 5361736696
model-00003-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e1c708c55c71b2f537917ab7144ca1334f365724869662713439fe16e910582b
+ size 5363339120
model-00004-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c8405236f949cd74c17af1d3ba65818f731e3dce4112a3760bb55ab808d86cae
+ size 5361736640
model-00005-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9f76ae761b928c7e905748b3a5cb914528784ea0c6d4e8d84d3ebbd384998f62
+ size 5363339176
model-00006-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:eec9fcc3a36b3cd7df973960072f42b1e2fa93094cd9e798ea8b857ff9ba8bae
+ size 5361736504
model-00007-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:80205be2a99e07bc102228bf4b36b64d0ba1c175a75bd2905a83bbdd59bda06c
+ size 5363339304
model-00008-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c3778a0c3bcaca60f56c44fbc4a87f60bfab72323bdf8f2cabd255ec7eb24ff7
+ size 5361736368
model-00009-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e2a4ad3a27c1863f32611e297b7ad85ddf229237d3aef3abff5895be59ef2ef4
+ size 5363339440
model-00010-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:46449f548211661cb5c3845f16501b4250b9fae313914f9b83b31a365ed498d6
+ size 5361736232
model-00011-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:36557532935d5edf38aef425de1494f51039f61b34d2f0a39aabf285bbfa7752
+ size 5363339592
model-00012-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:36c6ff15d39f1355bbda9802a595ef91262495f1d8381d8634bc2d7468d82509
+ size 5363339104
model-00013-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1722fbe67d4bfa26442617837f908445a272e269b26219791e7509e71f3041fb
+ size 5361736696
model-00014-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0a4028b398d2e09e362dc360fa5fb536cafc80fa9326fd3a5f43a867cc937704
+ size 5363339120
model-00015-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:916305b18ec04fd399ec4f197bf29c7600df868f974327ead4280bd6c9004352
+ size 5361736688
model-00016-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:00444ce81f597e639561a3fa0126cdeaefc8122ae25bd14cce949b260861e554
+ size 5363339128
model-00017-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:345af0570dc88a2ae4cceae6cd99e52ad6d149b619f49ee18bfce0aaad037838
+ size 5361736552
model-00018-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:69187f3011e235f55a47143b7b9998836acca7550e8924047e58b0f35ab6d554
+ size 5363339256
model-00019-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9fe4da41bf7287e53e77fa66ad7253b0189509b9fc964f00dffe77810c5fff19
+ size 5361736416
model-00020-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ffbd31d3787cbb991c7026c263490206b84084f48f555fd1f5e5076ae75b1b5b
+ size 5361791448
model-00021-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2da172eb3d7dcdb22196bef3e5d2f7997e27975007f32072c26e9b0cebdf3c27
+ size 5361736352
model-00022-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6f8903dd5a119b301d23e37aa56b6c91b9c144b31d863a26af59177c9afab4dd
+ size 5363339456
model-00023-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:66c41ca799608fa422b382f31ea56d43b97872de552b1dada68a4d5ed6be1d28
+ size 5361736224
model-00024-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2d8e543d1361c6504cf9f7cf937210cac1d7772c7f568bec5aac7f38e3e4aa30
+ size 5363339608
model-00025-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4939a8045fc5619d72adf41d638107b2270845adc2862d8ec0cce349ded7c54c
+ size 5363339104
model-00026-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1ece8ece94c8e90d459951fc1e5acadbc744fb6c2fb06ceafa3ac3b829db25ab
+ size 5361736696
model-00027-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ad676bf742023e409af3722d70d097fc257991bdb0bd04bed8f17397b7def3c6
+ size 5363339112
model-00028-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5f37bad9f196213cd05081cf2d2a466f4dabaa22c594eff452933fcd5ab3dc64
+ size 5361736664
model-00029-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3b49905f4405d2e7a70ca1b4af0851350c86e685f982a0570c600d11dd802880
+ size 5363339152
model-00030-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e7b654485f9c01c86621f6cb024e1f6bdd81300011b6b9b11275c3fa7982472d
+ size 5361736528
model-00031-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34d78a779bd69810b5cb6467dcaa8705c6a88d8c2e48e8408881ffd2dcbd4a46
+ size 5363339280
model-00032-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:995e8d62d59d3dc544864b1f3d64f43d698acb411bd9f001993437fe8ffc99bb
+ size 5361736392
model-00033-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:98b8ed708b1c6c14283c378440bb687f71c2a4612ab35e910169a02e4678d6f7
+ size 5363339416
model-00034-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c2d5253375958275caa3525b9a5a8bdca3a75cea15b98b36747fc4931777e7cc
+ size 5361736264
model-00035-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:17a2d75e9d0598413de333185465867a1e1722441266a449c421e09a4b3bf813
+ size 5363339544
model-00036-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:441ae5c24d32a8b674dd8fb1db1d4835ef46bd470c9d2f096d86b5031a3fcb41
+ size 5363339104
model-00037-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9bf844f6bee957fc203431c39b0a5eb9f104258c170e4e5490f1ea8487d593b6
+ size 5361736696
model-00038-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1ba412f5ccc524070eb35272438a2f549bb7165cb4b7b86530955bf281fe8ea2
+ size 5363338920
model-00039-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5ebc73ae8cfce5ebcaf759962748d66b5010beccacd1f77b02e45d300c36c2b8
+ size 5361735840
model-00040-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:05e6033cf198187e3a89fc90f6c9364712056b73669ba853e442a4bff812e695
+ size 5363338584
model-00041-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b96b36ed04a5750c16a09df81f01cc396ca29bd7b50605990fcfab083b1d70c9
+ size 5361736576
model-00042-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c32899afc42f70a40d48640f15991f55a691ab04e49af60844cf476fb9e65ca9
+ size 5363339232
model-00043-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:91036a2dbf0f882d0ca1a39db1e8735d40ba5e689f3cc06e0afb859cee5b288c
+ size 5361736440
model-00044-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7763ff9d8d8f85e0d9430a1688734744d5c5d4b46224f3b0cc555529f5d0a529
+ size 5363339368
model-00045-of-00142.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a3e1fb4ed18f8fd9b3a89ececacbdfaa2af3b0f61a4550ebb6a31ce600b4b834
+ size 5361736312