Upload folder using huggingface_hub
- .gitattributes +1 -0
- LICENSE +96 -0
- README.md +161 -5
- chat_template.jinja +156 -0
- config.json +52 -0
- configuration_solar_open.py +242 -0
- generation_config.json +14 -0
- model-00001-of-00012.safetensors +3 -0
- model-00002-of-00012.safetensors +3 -0
- model-00003-of-00012.safetensors +3 -0
- model-00004-of-00012.safetensors +3 -0
- model-00005-of-00012.safetensors +3 -0
- model-00006-of-00012.safetensors +3 -0
- model-00007-of-00012.safetensors +3 -0
- model-00008-of-00012.safetensors +3 -0
- model-00009-of-00012.safetensors +3 -0
- model-00010-of-00012.safetensors +3 -0
- model-00011-of-00012.safetensors +3 -0
- model-00012-of-00012.safetensors +3 -0
- model.safetensors.index.json +0 -0
- modeling_solar_open.py +608 -0
- parallel_tool_call_logits_processor.py +104 -0
- solar_open_logits_processor.py +763 -0
- solar_open_reasoning_parser.py +351 -0
- solar_open_tool_parser.py +267 -0
- special_tokens_map.json +4006 -0
- tokenizer.json +3 -0
- tokenizer_config.json +0 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
LICENSE CHANGED
@@ -0,0 +1,96 @@
# Upstage Solar License

# Preamble

The 'Upstage Solar License' (hereinafter referred to as "this License") was established by Upstage Co., Ltd., incorporated under the laws of the Republic of Korea, to encourage the development of open-source software using Solar AI models, and is not affiliated with the Apache Software Foundation.

This License basically adopts all provisions of the Apache License, Version 2.0 (hereinafter referred to as "Apache License 2.0"), including the principle of allowing commercial use, but prescribes minimum strategic conditions for the global expansion of AI technology and the sustainable development of the AI ecosystem.

The key additional condition, as specified in Section 4(e), is that if you distribute a "Derivative AI Model" based on the "Work", you must specify the 'Solar' brand. This applies as an exception to Section 6 (Trademarks) of the Apache License 2.0.

# TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work.

2. Grant of Copyright License.
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form.

3. Grant of Patent License.
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.

4. Redistribution.
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:

(a) You must give any other recipients of the Work or Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.
(e) If You distribute or make available a Derivative Work that is an artificial intelligence model created, trained, fine-tuned, or otherwise improved using the Work (the "Derivative AI Model"), You must adhere to the following conditions:
(i) The name of such Derivative AI Model must begin with "Solar" (e.g., "Solar-MyModel-v1"); and
(ii) You must prominently display the phrase "Built with Solar" in any related websites, user interfaces, or documentation associated with the Derivative AI Model; and
(iii) You must provide a copy of this License, including the original copyright notice and NOTICE file as included in the distribution of the Works, alongside the Derivative AI Model.

You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License.

5. Submission of Contributions.
Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions.

6. Trademarks.
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work, reproducing the content of the NOTICE file, and as explicitly required for attribution in Section 4(e) of this License.
Any use of the "Solar" name under this License must not imply any sponsorship, endorsement, certification, or official relationship with the Licensor, nor mislead users into believing that a Derivative AI Model is an official product of the Licensor.

7. Disclaimer of Warranty.
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License.

8. Limitation of Liability.
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability.
While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

# APPENDIX: How to apply the Upstage Solar License to your work.

To apply the Upstage Solar License to your work, attach the following boilerplate notice, with the fields enclosed by brackets replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives.

Copyright [yyyy] [Upstage AI (or other copyright owner)]

Licensed under the Upstage Solar License (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

https://huggingface.co/Upstage/Solar-Open-100B/blob/main/LICENSE

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

This work is based on or derived from materials licensed under the
Upstage Solar License.
If you distribute or make available a Derivative AI Model (as defined
in Section 4(e) of the License), your model name must begin with "Solar"
and you must prominently display "Built with Solar" in associated
documentation or interfaces.
README.md CHANGED
@@ -1,5 +1,161 @@
---
language:
- en
- ko
library_name: transformers
license: other
license_name: upstage-solar-license
pipeline_tag: text-generation
tags:
- upstage
- solar
- moe
- 100b
- llm
- nota
- quantization
---

# **Solar-Open-100B-NotaMoEQuant-Int4**

This repository provides **Upstage's flagship model, [Solar-Open-100B](https://huggingface.co/upstage/Solar-Open-100B)**, packaged with [**Nota AI**](https://www.nota.ai/)'s proprietary quantization technique developed specifically for **Mixture-of-Experts (MoE)-based LLMs**. Unlike conventional quantization methods, this approach incorporates a **novel method** designed to mitigate the **representation distortion** that can occur when experts are mixed under quantization in MoE architectures.

## Overview

- **Base model:** [Solar-Open-100B](https://huggingface.co/upstage/Solar-Open-100B)
- **Quantization:** Int4 weight-only (see the quick check after this list)
- **Packing format:** `auto_round:auto_gptq` (ensuring backend compatibility with PyTorch and vLLM)
- **Quantization group size:** 128
- **Supported tensor parallel sizes:** {1, 2}
- **Hardware requirements:**
  - **Minimum:** 2 x NVIDIA A100 (80GB)
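These settings are recorded in this repository's `config.json` under `quantization_config`, so they can be verified without downloading any weights; a quick check with `AutoConfig` (only the configuration files are fetched):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained(
    "nota-ai/Solar-Open-100B-NotaMoEQuant-Int4",
    trust_remote_code=True,  # the repo ships a custom SolarOpenConfig
)

# Expected per config.json: 4 bits, group_size 128, packing "auto_round:auto_gptq"
print(config.quantization_config)
# MoE shape: 128 routed experts, 8 active per token, plus 1 shared expert
print(config.n_routed_experts, config.num_experts_per_tok, config.n_shared_experts)
```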
## License

This repository contains both model weights and code, which are licensed under different terms:

1. **Model weights** (`*.safetensors`)
   Licensed under the **Upstage Solar License**.
   See: https://huggingface.co/upstage/Solar-Open-100B/blob/main/LICENSE
2. **Code** (`*.py`, `*.json`, `*.jinja` files)
   Licensed under the **Apache License 2.0**.
   See: https://www.apache.org/licenses/LICENSE-2.0

## Performance

- English

| | **Solar-Open-100B** | **Nota MoE Quantization (Ours)** | **AutoRound** | **cyankiwi AWQ** |
| --- | --- | --- | --- | --- |
| PPL (WikiText-2)↓ | 6.06 | **6.81** | 7.12 | 30.52 |
| PPL (C4)↓ | 20.37 | **20.84** | 20.94 | 50.16 |
| PIQA↑ | 82.37 | **82.75** | 82.05 | 78.94 |
| BoolQ↑ | 84.89 | 84.86 | **85.29** | 68.87 |
| ARC-E↑ | 87.25 | **86.48** | 85.77 | 83.12 |
| ARC-C↑ | 61.43 | **61.69** | 60.84 | 56.40 |
| TruthfulQA↑ | 59.25 | **60.14** | 59.18 | 52.38 |
| WinoGrande↑ | 76.09 | **75.77** | **75.77** | 68.59 |

- Korean

| | **Solar-Open-100B** | **Nota MoE Quantization (Ours)** | **AutoRound** | **cyankiwi AWQ** |
| --- | --- | --- | --- | --- |
| HRM8K↑ | 81.52 | 80.68 | **81.56** | 32.67 |
| MMLU-ProX-Lite↑ | 55.44 | **51.84** | 51.26 | 6.19 |
| KoBEST↑ | 62.00 | **62.80** | 61.80 | 61.80 |
| CLiCK↑ | 71.33 | **70.03** | 69.77 | 51.18 |

- Model weight memory footprint

| **Solar-Open-100B** | **Nota MoE Quantization (Ours)** | **cyankiwi AWQ** |
| --- | --- | --- |
| 191.2 GB | 51.9 GB | 57.0 GB |

* Note
  - ↑ / ↓ denote the direction of improvement: higher is better (↑), lower is better (↓). Bold marks the best result among the three quantized variants.
  - cyankiwi AWQ is a publicly available [INT4 (4-bit AWQ) quantized version of Solar-Open-100B](https://huggingface.co/cyankiwi/Solar-Open-100B-AWQ-4bit).
  - Because we used a smaller thinking budget, the results for HRM8K and CLiCK are slightly lower than the numbers reported in the original Solar-Open-100B repository.
  - Memory refers to the pure VRAM footprint occupied by the model weights alone.
  - The perplexity setup can be approximated with the sliding-window sketch below.
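For reference, a simple non-overlapping-window perplexity evaluation in the spirit of the WikiText-2 row is sketched below. The exact protocol behind the table (window length, stride, preprocessing) is not specified here, so the settings are assumptions and the numbers will not match exactly:

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "nota-ai/Solar-Open-100B-NotaMoEQuant-Int4"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

text = "\n\n".join(load_dataset("wikitext", "wikitext-2-raw-v1", split="test")["text"])
input_ids = tokenizer(text, return_tensors="pt").input_ids

window = 4096  # assumed context length; the table's setting may differ
nll_sum, n_tokens = 0.0, 0
for start in range(0, input_ids.size(1), window):
    chunk = input_ids[:, start : start + window].to(model.device)
    if chunk.size(1) < 2:  # need at least one next-token prediction
        break
    with torch.no_grad():
        # With labels=chunk, the model returns the mean NLL over the shifted tokens.
        loss = model(chunk, labels=chunk).loss
    nll_sum += loss.item() * (chunk.size(1) - 1)
    n_tokens += chunk.size(1) - 1

print("perplexity:", torch.exp(torch.tensor(nll_sum / n_tokens)).item())
```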
## Inference

### Transformers

Install the required dependencies:

```bash
pip install -U transformers kernels torch accelerate auto-round==0.8.0
```

Run inference with the following code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "nota-ai/Solar-Open-100B-NotaMoEQuant-Int4"

# Load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

model = AutoModelForCausalLM.from_pretrained(
    pretrained_model_name_or_path=MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Prepare input
messages = [{"role": "user", "content": "who are you?"}]
inputs = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_dict=True,
    return_tensors="pt",
)
inputs = inputs.to(model.device)

# Generate response
generated_ids = model.generate(
    **inputs,
    max_new_tokens=4096,
    temperature=0.8,
    top_p=0.95,
    top_k=50,
    do_sample=True,
)
generated_text = tokenizer.decode(generated_ids[0][inputs.input_ids.shape[1] :])
print(generated_text)
```
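If you prefer tokens to appear as they are generated rather than after `generate` returns, transformers' built-in `TextStreamer` drops into the same call; a small variant reusing `model`, `tokenizer`, and `inputs` from above:

```python
from transformers import TextStreamer

# Prints decoded tokens to stdout as they are produced; the prompt is skipped.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

model.generate(
    **inputs,
    max_new_tokens=4096,
    temperature=0.8,
    top_p=0.95,
    top_k=50,
    do_sample=True,
    streamer=streamer,
)
```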
+
|
| 134 |
+
### vLLM
|
| 135 |
+
Create and activate a Python virtual environment
|
| 136 |
+
```bash
|
| 137 |
+
uv venv --python 3.12 --seed
|
| 138 |
+
source .venv/bin/activate
|
| 139 |
+
```
|
| 140 |
+
|
| 141 |
+
Install Solar Open's optimized vLLM
|
| 142 |
+
```bash
|
| 143 |
+
VLLM_PRECOMPILED_WHEEL_LOCATION="https://github.com/vllm-project/vllm/releases/download/v0.12.0/vllm-0.12.0-cp38-abi3-manylinux_2_31_x86_64.whl" \
|
| 144 |
+
VLLM_USE_PRECOMPILED=1 \
|
| 145 |
+
uv pip install git+https://github.com/UpstageAI/vllm.git@v0.12.0-solar-open
|
| 146 |
+
```
|
| 147 |
+
|
| 148 |
+
Start the vLLM server (For 2 GPUs)
|
| 149 |
+
```bash
|
| 150 |
+
PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True
|
| 151 |
+
vllm serve nota-ai/Solar-Open-100B-NotaMoEQuant-Int4 \
|
| 152 |
+
--trust-remote-code \
|
| 153 |
+
--enable-auto-tool-choice \
|
| 154 |
+
--tool-call-parser solar_open \
|
| 155 |
+
--reasoning-parser solar_open \
|
| 156 |
+
--logits-processors vllm.model_executor.models.parallel_tool_call_logits_processor:ParallelToolCallLogitsProcessor \
|
| 157 |
+
--logits-processors vllm.model_executor.models.solar_open_logits_processor:SolarOpenTemplateLogitsProcessor \
|
| 158 |
+
--tensor-parallel-size 2 \
|
| 159 |
+
--max-num-seqs 64 \
|
| 160 |
+
--gpu-memory-utilization 0.8
|
| 161 |
+
```
|
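Once the server is up, it exposes vLLM's standard OpenAI-compatible API (default port 8000; the base URL below is an assumption if you changed the serve options):

```python
from openai import OpenAI

# vLLM's OpenAI-compatible endpoint; api_key is unused but required by the client.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="nota-ai/Solar-Open-100B-NotaMoEQuant-Int4",
    messages=[{"role": "user", "content": "who are you?"}],
    temperature=0.8,
    top_p=0.95,
)
print(response.choices[0].message.content)
```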
chat_template.jinja ADDED
@@ -0,0 +1,156 @@
{#- ======== Template Parameters ======== #}
{%- set add_generation_prompt = add_generation_prompt if add_generation_prompt is defined else true %}
{%- set default_system_prompt = default_system_prompt if default_system_prompt is defined else true %}
{%- set reasoning_effort = reasoning_effort if reasoning_effort is defined else "high" %}
{%- set think_render_option = think_render_option if think_render_option is defined else "lastthink" %}

{#- ======== System Block State ======== #}
{%- set sys_ns = namespace(is_first_block=true) -%}

{#- ======== Find last user message index ======== #}
{%- set last_user_idx = namespace(value=-1) -%}
{%- for message in messages -%}
{%- if message.role == 'user' -%}
{%- set last_user_idx.value = loop.index0 -%}
{%- endif -%}
{%- endfor -%}

{#- ======== System messages renderers ======== #}
{%- macro render_system_message(user_system_messages) %}
{%- if default_system_prompt %}
{%- if not sys_ns.is_first_block %}{{- "\n\n" }}{%- endif %}
{%- set sys_ns.is_first_block = false %}
{{- "## Provider System Prompt\n\nYou are Solar Open 100B, a large language model trained by Upstage AI, a Korean startup. Your knowledge cutoff is 2025-07. The current date is " + strftime_now("%Y-%m-%d") + "." }}
{%- endif -%}
{%- if user_system_messages %}
{%- if not sys_ns.is_first_block %}{{- "\n\n" }}{%- endif %}
{%- set sys_ns.is_first_block = false %}
{{- "## System Prompt" }}
{%- for system_message in user_system_messages %}
{{- "\n\n" }}
{{- system_message }}
{%- endfor %}
{%- endif -%}
{%- endmacro %}

{%- macro render_tool_instruction(tools) %}
{%- if not sys_ns.is_first_block %}{{- "\n\n" }}{%- endif %}
{%- set sys_ns.is_first_block = false %}
{{- "## Tools\n\n### Tool Call Instruction" }}
{{- "\nYou may invoke one or more tools to assist with the user's query. Available tools are provided in JSON Schema format: <|tools:begin|><|tool:begin|><tools-json-object><|tool:end|>...<|tools:end|>\n" }}
{{- "\n### Available Tools\n" }}
{{- "<|tools:begin|>" }}
{%- for tool in tools %}
{{- "<|tool:begin|>" }}
{{- tool.function | tojson }}
{{- "<|tool:end|>" }}
{%- endfor %}
{{- "<|tools:end|>\n" }}
{{- "\n### Tool Call Format\n" }}
{{- "For each tool call, return a JSON object with the following structure, enclosed within <|tool_call:begin|> and <|tool_call:end|> tags: \n<|tool_call:begin|><tool-call-id><|tool_call:name|><tool-name><|tool_call:args|><args-json-object><|tool_call:end|>\n" }}
{{- "- The <tool-call-id> must be a randomly generated string consisting of 10 lowercase letters (a-z) and/or digits (0-9) (e.g., a1b2c3d4e5)\n" }}
{{- "\n### Tool Response Format\n" }}
{{- "Each tool is responded by `tool` with the following structure:\n<|tool_response:id|><tool-call-id><|tool_response:name|><tool-name><|tool_response:result|><results><|tool_response:end|>\n" }}
{{- "- Ensure the <tool-call-id> matches the corresponding tool call" -}}
{%- endmacro %}

{%- macro render_json_response_format_instruction(response_format) %}
{%- if not sys_ns.is_first_block %}{{- "\n\n" }}{%- endif %}
{%- set sys_ns.is_first_block = false %}
{{- "## Output Format Constraint" }}
{{- "\n\nYour final response should follow the JSON schema: \n[Start of schema]" }}
{{- response_format }}
{{- "\n[End of schema]\nPlease ensure your answers adhere to this format and do not contain any unnecessary text." }}
{%- endmacro %}

{%- macro get_tool_name(messages, tool_call_id) %}
{%- for msg in messages -%}
{%- if msg.role == 'assistant' and msg.tool_calls -%}
{%- for tool_call in msg.tool_calls -%}
{%- if tool_call.id == tool_call_id -%}
{{- tool_call.function.name }}
{%- endif -%}
{%- endfor -%}
{%- endif -%}
{%- endfor -%}
{%- endmacro %}

{%- macro render_tool_arguments(tool_arguments) %}
{%- if tool_arguments is mapping -%}
{{- tool_arguments | tojson }}
{%- else -%}
{{- tool_arguments }}
{%- endif -%}
{%- endmacro %}

{#- ======== Render system message ======== #}
{%- set ns = namespace(system_messages=[]) -%}
{%- for message in messages -%}
{%- if message.role == 'system' -%}
{%- set ns.system_messages = ns.system_messages + [message.content] -%}
{%- endif -%}
{%- endfor -%}

{%- if ns.system_messages or default_system_prompt or tools or response_format -%}
{{- "<|begin|>system<|content|>" }}
{{- render_system_message(ns.system_messages) }}
{%- if tools -%}
{{- render_tool_instruction(tools) }}
{%- endif %}
{%- if response_format -%}
{{- render_json_response_format_instruction(response_format) }}
{%- endif %}
{{- "<|end|>" }}
{%- endif -%}

{#- ======== Render main messages ======== #}
{%- for message in messages -%}
{%- if message.role == 'user' -%}
{{- "<|begin|>user<|content|>" + message.content + "<|end|>" }}
{%- elif message.role == 'tool' -%}
{%- set prev_is_tool = loop.index0 > 0 and messages[loop.index0 - 1].role == 'tool' -%}
{%- set next_is_tool = loop.index0 < (messages | length - 1) and messages[loop.index0 + 1].role == 'tool' -%}
{%- if not prev_is_tool -%}
{{- "<|begin|>tool<|tool_response|>" }}
{%- endif -%}
{{- "<|tool_response:begin|>" + message.tool_call_id + "<|tool_response:name|>" }}
{{- get_tool_name(messages, message.tool_call_id) }}
{{- "<|tool_response:result|>" }}
{{- message.content }}
{{- "<|tool_response:end|>" }}
{%- if not next_is_tool -%}
{{- "<|end|>" }}
{%- endif -%}
{%- elif message.role == 'assistant' -%}
{#- ======== Assistant Thinking ======== #}
{%- if think_render_option == "all" -%}
{%- if message.reasoning -%}
{{- "<|begin|>assistant<|think|>" + message.reasoning + "<|end|>" }}
{%- endif -%}
{%- elif think_render_option == "lastthink" -%}
{%- if message.reasoning and loop.index0 > last_user_idx.value -%}
{{- "<|begin|>assistant<|think|>" + message.reasoning + "<|end|>" }}
{%- endif -%}
{%- endif -%}

{#- ======== Assistant Messages ======== #}
{%- if message.tool_calls -%}
{{- "<|begin|>assistant<|tool_calls|>" }}
{%- for tool_call in message.tool_calls -%}
{{- "<|tool_call:begin|>" + tool_call.id +"<|tool_call:name|>" + tool_call.function.name + "<|tool_call:args|>" }}
{{- render_tool_arguments(tool_call.function.arguments) }}
{{- "<|tool_call:end|>" }}
{%- endfor -%}
{{- "<|calls|>" }}
{%- else -%}
{{- "<|begin|>assistant<|content|>" + message.content + "<|end|>" }}
{%- endif -%}
{%- endif -%}
{%- endfor -%}

{%- if add_generation_prompt -%}
{%- if reasoning_effort in ["low", "minimal"] -%}
{{- "<|begin|>assistant<|think|><|end|>" }}
{%- endif -%}
{{- "<|begin|>assistant" }}
{%- endif -%}
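The template above defines custom parameters (`reasoning_effort`, `think_render_option`, `default_system_prompt`) on top of the standard `add_generation_prompt`; `apply_chat_template` forwards extra keyword arguments into the template, so they can be set per call. A minimal illustrative sketch (the message contents are placeholders):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "nota-ai/Solar-Open-100B-NotaMoEQuant-Int4", trust_remote_code=True
)

messages = [
    {"role": "system", "content": "Answer concisely."},
    {"role": "user", "content": "What is 2 + 2?"},
]

# reasoning_effort="low" pre-emits an empty <|think|> block (see the end of the
# template), which steers the model away from extended reasoning.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    reasoning_effort="low",
    think_render_option="lastthink",
)
print(prompt)
```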
config.json ADDED
@@ -0,0 +1,52 @@
{
  "architectures": [
    "SolarOpenForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "auto_map": {
    "AutoConfig": "configuration_solar_open.SolarOpenConfig",
    "AutoModel": "modeling_solar_open.SolarOpenModel",
    "AutoModelForCausalLM": "modeling_solar_open.SolarOpenForCausalLM"
  },
  "bos_token_id": 1,
  "dtype": "bfloat16",
  "eos_token_id": 2,
  "first_k_dense_replace": 0,
  "head_dim": 128,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 10240,
  "max_position_embeddings": 131072,
  "model_type": "solar_open",
  "moe_intermediate_size": 1280,
  "n_group": 1,
  "n_routed_experts": 128,
  "n_shared_experts": 1,
  "norm_topk_prob": true,
  "num_attention_heads": 64,
  "num_experts_per_tok": 8,
  "num_hidden_layers": 48,
  "num_key_value_heads": 8,
  "pad_token_id": 2,
  "partial_rotary_factor": 1.0,
  "quantization_config": {
    "bits": 4,
    "data_type": "int",
    "group_size": 128,
    "packing_format": "auto_round:auto_gptq",
    "quant_method": "auto-round",
    "sym": true
  },
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "rope_theta": 1000000,
  "routed_scaling_factor": 1.0,
  "tie_word_embeddings": false,
  "topk_group": 1,
  "transformers_version": "4.57.3",
  "use_cache": true,
  "use_qk_norm": false,
  "vocab_size": 196608
}
configuration_solar_open.py ADDED
@@ -0,0 +1,242 @@
# coding=utf-8
# Copyright 2025 Upstage AI.
# Copyright 2025 The ZhipuAI Inc. team and HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# This file has been modified by Upstage AI including:
# - Hyperparameter Adjustments: Modified the model architecture by increasing vocab_size and num_hidden_layers, while decreasing num_attention_heads, intermediate_size, and moe_intermediate_size.
# - RoPE Configuration: Replaced the generic rope_parameters argument with explicit rope_theta and rope_scaling parameters to define Rotary Positional Embeddings settings.
#
# Based on code from: https://github.com/huggingface/transformers/blob/main/src/transformers/models/glm4_moe/configuration_glm4_moe.py

from transformers.configuration_utils import PretrainedConfig
from transformers.modeling_rope_utils import rope_config_validation


class SolarOpenConfig(PretrainedConfig):
    r"""
    This is the configuration class to store the configuration of a [`SolarOpenModel`]. It is used to instantiate a
    SolarOpen model according to the specified arguments, defining the model architecture.

    Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
    documentation from [`PretrainedConfig`] for more information.


    Args:
        vocab_size (`int`, *optional*, defaults to 196608):
            Vocabulary size of the SolarOpen model. Defines the number of different tokens that can be represented
            by the `inputs_ids` passed when calling [`SolarOpenModel`]
        hidden_size (`int`, *optional*, defaults to 4096):
            Dimension of the hidden representations.
        intermediate_size (`int`, *optional*, defaults to 10240):
            Dimension of the MLP representations.
        num_hidden_layers (`int`, *optional*, defaults to 48):
            Number of hidden layers in the Transformer encoder.
        num_attention_heads (`int`, *optional*, defaults to 64):
            Number of attention heads for each attention layer in the Transformer encoder.
        partial_rotary_factor (`float`, *optional*, defaults to 1.0):
            The factor of the partial rotary position.
        num_key_value_heads (`int`, *optional*, defaults to 8):
            This is the number of key_value heads that should be used to implement Grouped Query Attention. If
            `num_key_value_heads=num_attention_heads`, the model will use Multi Head Attention (MHA), if
            `num_key_value_heads=1` the model will use Multi Query Attention (MQA), otherwise GQA is used. When
            converting a multi-head checkpoint to a GQA checkpoint, each group key and value head should be
            constructed by meanpooling all the original heads within that group. For more details, check out [this
            paper](https://huggingface.co/papers/2305.13245). If it is not specified, will default to `8`.
        hidden_act (`str` or `function`, *optional*, defaults to `"silu"`):
            The non-linear activation function (function or string) in the decoder.
        max_position_embeddings (`int`, *optional*, defaults to 131072):
            The maximum sequence length that this model might ever be used with.
        initializer_range (`float`, *optional*, defaults to 0.02):
            The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
        rms_norm_eps (`float`, *optional*, defaults to 1e-05):
            The epsilon used by the rms normalization layers.
        use_cache (`bool`, *optional*, defaults to `True`):
            Whether or not the model should return the last key/values attentions (not used by all models). Only
            relevant if `config.is_decoder=True`.
        tie_word_embeddings (`bool`, *optional*, defaults to `False`):
            Whether the model's input and output word embeddings should be tied.
        rope_theta (`float`, *optional*, defaults to 1000000.0):
            The base period of the RoPE embeddings.
        rope_scaling (`Dict`, *optional*):
            Dictionary containing the scaling configuration for the RoPE embeddings. NOTE: if you apply a new rope
            type and you expect the model to work on longer `max_position_embeddings`, we recommend you to update
            this value accordingly.
            Expected contents:
                `rope_type` (`str`):
                    The sub-variant of RoPE to use. Can be one of ['default', 'linear', 'dynamic', 'yarn', 'longrope',
                    'llama3'], with 'default' being the original RoPE implementation.
                `factor` (`float`, *optional*):
                    Used with all rope types except 'default'. The scaling factor to apply to the RoPE embeddings. In
                    most scaling types, a `factor` of x will enable the model to handle sequences of length x *
                    original maximum pre-trained length.
                `original_max_position_embeddings` (`int`, *optional*):
                    Used with 'dynamic', 'longrope' and 'llama3'. The original max position embeddings used during
                    pretraining.
                `attention_factor` (`float`, *optional*):
                    Used with 'yarn' and 'longrope'. The scaling factor to be applied on the attention computation.
                    If unspecified, it defaults to the value recommended by the implementation, using the `factor`
                    field to infer the suggested value.
                `beta_fast` (`float`, *optional*):
                    Only used with 'yarn'. Parameter to set the boundary for extrapolation (only) in the linear ramp
                    function. If unspecified, it defaults to 32.
                `beta_slow` (`float`, *optional*):
                    Only used with 'yarn'. Parameter to set the boundary for interpolation (only) in the linear ramp
                    function. If unspecified, it defaults to 1.
                `short_factor` (`list[float]`, *optional*):
                    Only used with 'longrope'. The scaling factor to be applied to short contexts (<
                    `original_max_position_embeddings`). Must be a list of numbers with the same length as the hidden
                    size divided by the number of attention heads divided by 2
                `long_factor` (`list[float]`, *optional*):
                    Only used with 'longrope'. The scaling factor to be applied to long contexts (>
                    `original_max_position_embeddings`). Must be a list of numbers with the same length as the hidden
                    size divided by the number of attention heads divided by 2
                `low_freq_factor` (`float`, *optional*):
                    Only used with 'llama3'. Scaling factor applied to low frequency components of the RoPE
                `high_freq_factor` (`float`, *optional*):
                    Only used with 'llama3'. Scaling factor applied to high frequency components of the RoPE
        attention_bias (`bool`, *optional*, defaults to `False`):
            Whether to use a bias in the query, key, value and output projection layers during self-attention.
        attention_dropout (`float`, *optional*, defaults to 0.0):
            The dropout ratio for the attention probabilities.
        moe_intermediate_size (`int`, *optional*, defaults to 1280):
            Intermediate size of the routed expert.
        num_experts_per_tok (`int`, *optional*, defaults to 8):
            Number of experts per token.
        n_shared_experts (`int`, *optional*, defaults to 1):
            Number of shared experts.
        n_routed_experts (`int`, *optional*, defaults to 128):
            Number of routed experts.
        routed_scaling_factor (`float`, *optional*, defaults to 1.0):
            Scaling factor for routed experts.
        n_group (`int`, *optional*, defaults to 1):
            Number of groups for routed experts.
        topk_group (`int`, *optional*, defaults to 1):
            Number of selected groups for each token (ensuring the selected experts are only within `topk_group`
            groups).
        first_k_dense_replace (`int`, *optional*, defaults to 0):
            Number of dense layers in shallow layers (embed->dense->dense->...->dense->moe->moe...->lm_head).
                                                              \--k dense layers--/
        norm_topk_prob (`bool`, *optional*, defaults to `True`):
            Whether to normalize the topk probabilities.
        use_qk_norm (`bool`, *optional*, defaults to `False`):
            Whether to use query-key normalization in the attention.
    ```python
    >>> from transformers import SolarOpenModel, SolarOpenConfig

    >>> # Initializing a SolarOpen style configuration
    >>> configuration = SolarOpenConfig()

    >>> # Initializing a model from the SolarOpen style configuration
    >>> model = SolarOpenModel(configuration)

    >>> # Accessing the model configuration
    >>> configuration = model.config
    ```"""

    model_type = "solar_open"
    keys_to_ignore_at_inference = ["past_key_values"]

    # Default tensor parallel plan for base model `SolarOpen`
    base_model_tp_plan = {
        "layers.*.self_attn.q_proj": "colwise",
        "layers.*.self_attn.k_proj": "colwise",
        "layers.*.self_attn.v_proj": "colwise",
        "layers.*.self_attn.o_proj": "rowwise",
        "layers.*.mlp.experts.*.gate_proj": "colwise",
        "layers.*.mlp.experts.*.up_proj": "colwise",
        "layers.*.mlp.experts.*.down_proj": "rowwise",
        "layers.*.mlp.gate_proj": "colwise",
        "layers.*.mlp.up_proj": "colwise",
        "layers.*.mlp.down_proj": "rowwise",
    }
    base_model_pp_plan = {
        "embed_tokens": (["input_ids"], ["inputs_embeds"]),
        "layers": (["hidden_states", "attention_mask"], ["hidden_states"]),
        "norm": (["hidden_states"], ["hidden_states"]),
    }

    def __init__(
        self,
        vocab_size=196608,
        hidden_size=4096,
        intermediate_size=10240,
        num_hidden_layers=48,
        num_attention_heads=64,
        partial_rotary_factor=1.0,
        num_key_value_heads=8,
        hidden_act="silu",
        max_position_embeddings=131072,
        initializer_range=0.02,
        rms_norm_eps=1e-5,
        use_cache=True,
        tie_word_embeddings=False,
        rope_theta=1000000.0,
        rope_scaling=None,
        attention_bias=False,
        attention_dropout=0.0,
        moe_intermediate_size=1280,
        num_experts_per_tok=8,
        n_shared_experts=1,
        n_routed_experts=128,
        routed_scaling_factor=1.0,
        n_group=1,
        topk_group=1,
        first_k_dense_replace=0,
        norm_topk_prob=True,
        use_qk_norm=False,
        **kwargs,
    ):
        self.vocab_size = vocab_size
        self.max_position_embeddings = max_position_embeddings
        self.hidden_size = hidden_size
        self.intermediate_size = intermediate_size
        self.num_hidden_layers = num_hidden_layers
        self.num_attention_heads = num_attention_heads
        self.partial_rotary_factor = partial_rotary_factor

        self.num_key_value_heads = num_key_value_heads
        self.hidden_act = hidden_act
        self.initializer_range = initializer_range
        self.rms_norm_eps = rms_norm_eps
        self.use_cache = use_cache
        self.rope_theta = rope_theta
        self.rope_scaling = rope_scaling
        self.attention_bias = attention_bias
        self.attention_dropout = attention_dropout
        # Validate the correctness of rotary position embeddings parameters
        # BC: if there is a 'type' field, move it to 'rope_type'.
        if self.rope_scaling is not None and "type" in self.rope_scaling:
            self.rope_scaling["rope_type"] = self.rope_scaling["type"]
        rope_config_validation(self)

        # MoE arguments
        self.moe_intermediate_size = moe_intermediate_size
        self.num_experts_per_tok = num_experts_per_tok
        self.n_group = n_group
        self.topk_group = topk_group
        self.n_shared_experts = n_shared_experts
        self.n_routed_experts = n_routed_experts
        self.routed_scaling_factor = routed_scaling_factor
        self.first_k_dense_replace = first_k_dense_replace
        self.norm_topk_prob = norm_topk_prob
        self.use_qk_norm = use_qk_norm

        super().__init__(
            tie_word_embeddings=tie_word_embeddings,
            **kwargs,
        )


__all__ = ["SolarOpenConfig"]
generation_config.json ADDED
@@ -0,0 +1,14 @@
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "do_sample": true,
  "eos_token_id": [
    2,
    24,
    25
  ],
  "pad_token_id": 2,
  "temperature": 0.8,
  "top_p": 0.95,
  "transformers_version": "4.57.3"
}
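The `eos_token_id` list means decoding stops on any of token ids 2, 24, or 25, not only the base EOS; which special tokens those ids map to can be checked against the tokenizer (a quick sketch, repository ID as used elsewhere in this README):

```python
from transformers import AutoTokenizer, GenerationConfig

MODEL_ID = "nota-ai/Solar-Open-100B-NotaMoEQuant-Int4"
gen_cfg = GenerationConfig.from_pretrained(MODEL_ID)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Decode each stop id to see which template terminator it corresponds to.
for tok_id in gen_cfg.eos_token_id:
    print(tok_id, "->", tokenizer.convert_ids_to_tokens(tok_id))
```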
model-00001-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:726e9cc90c9db7d8e5578a7ff1997e4b34856f1d50d7c319c1e4cbbd8e369ce3
size 4998891080

model-00002-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:24fc10c0db0a943d46ea7fe809240cb04ded39ec49f01d063ca342eb85bdb188
size 4999349568

model-00003-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:194db66f219568d98fc3600198c9bc745f6d519077394551bd7e83e087511054
size 4998779896

model-00004-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:734c2b61bd4d2985d4bf4d61c487311abe2694e52053cb1f54dfbe1879af3830
size 4999354992

model-00005-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:16650572334c932a060823aabb053a759726ef925586b8e71b6a802b1655b50e
size 4998782640

model-00006-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:90a37535c982473c33acdb3f2708616b72aab381ba5426c8fef5f9ac6aba98d7
size 4999355160

model-00007-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d39dd29a56e51b1d434034c45da93510b16a4c5df78628c65bdc6084c67878ea
size 4998782480

model-00008-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9a1b13eedbc43fe8af8d84e4cb7cf7f0bdd4fcde885ce7074ae80839032f24ea
size 4998782712

model-00009-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:49d0881af4ececfb30cb71b1135441f66153633251c891a545b5dd7610cda705
size 4999354920

model-00010-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4e48cf559b0230f47a5a46f647603b3694c1c712893d8c11ade4d9678eb89f6e
size 4998782656

model-00011-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4a084585013b1b3276f086c74a859315c4a8c6fcea9c47d6bcd2d27f3f8d5838
size 4159221512

model-00012-of-00012.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9742b5e5f3945b0705b4e852072f39d4fd65aa6b3472dbc1a115f0778d160c1b
size 1610612864
model.safetensors.index.json ADDED
The diff for this file is too large to render. See raw diff.
modeling_solar_open.py
ADDED
|
@@ -0,0 +1,608 @@
# coding=utf-8
# Copyright 2025 Upstage AI.
# Copyright 2025 The GLM4 & ZhipuAI team and HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# This file has been modified by Upstage AI including:
# - Hybrid MoE Architecture: Replaced the standard dense structure with a depth-dependent Hybrid MoE, adding `SolarOpenMoE` and `SolarOpenTopkRouter` classes.
# - RoPE Strategy: Changed the rotary position embedding strategy from GLM4's interleaved rotation to Llama-style block rotation (via modified `rotate_half`).
# - Normalization Logic: Simplified the layer normalization structure by removing GLM4's extra post-operation norms and adding optional Query-Key Normalization (`use_qk_norm`).
#
# Based on code from: https://github.com/huggingface/transformers/blob/main/src/transformers/models/glm4/modeling_glm4.py

from typing import Callable, Optional, Union

import torch
import torch.nn.functional as F
from torch import nn

from transformers.activations import ACT2FN
from transformers.cache_utils import Cache, DynamicCache
from transformers.generation import GenerationMixin
from transformers.integrations import use_kernel_forward_from_hub
from transformers.masking_utils import create_causal_mask
from transformers.modeling_flash_attention_utils import FlashAttentionKwargs
from transformers.modeling_layers import GradientCheckpointingLayer
from transformers.modeling_outputs import BaseModelOutputWithPast, CausalLMOutputWithPast
from transformers.modeling_rope_utils import ROPE_INIT_FUNCTIONS, dynamic_rope_update
from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS, PreTrainedModel
from transformers.processing_utils import Unpack
from transformers.utils import TransformersKwargs, auto_docstring, can_return_tuple
from transformers.utils.deprecation import deprecate_kwarg
from transformers.utils.generic import check_model_inputs
from .configuration_solar_open import SolarOpenConfig


def repeat_kv(hidden_states: torch.Tensor, n_rep: int) -> torch.Tensor:
    """
    This is the equivalent of torch.repeat_interleave(x, dim=1, repeats=n_rep). The hidden states go from (batch,
    num_key_value_heads, seqlen, head_dim) to (batch, num_attention_heads, seqlen, head_dim)
    """
    batch, num_key_value_heads, slen, head_dim = hidden_states.shape
    if n_rep == 1:
        return hidden_states
    hidden_states = hidden_states[:, :, None, :, :].expand(batch, num_key_value_heads, n_rep, slen, head_dim)
    return hidden_states.reshape(batch, num_key_value_heads * n_rep, slen, head_dim)


def eager_attention_forward(
    module: nn.Module,
    query: torch.Tensor,
    key: torch.Tensor,
    value: torch.Tensor,
    attention_mask: Optional[torch.Tensor],
    scaling: float,
    dropout: float = 0.0,
    **kwargs: Unpack[TransformersKwargs],
):
    key_states = repeat_kv(key, module.num_key_value_groups)
    value_states = repeat_kv(value, module.num_key_value_groups)

    attn_weights = torch.matmul(query, key_states.transpose(2, 3)) * scaling
    if attention_mask is not None:
        causal_mask = attention_mask[:, :, :, : key_states.shape[-2]]
        attn_weights = attn_weights + causal_mask

    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype)
    attn_weights = nn.functional.dropout(attn_weights, p=dropout, training=module.training)
    attn_output = torch.matmul(attn_weights, value_states)
    attn_output = attn_output.transpose(1, 2).contiguous()

    return attn_output, attn_weights


def rotate_half(x):
    """Rotates half the hidden dims of the input."""
    x1 = x[..., : x.shape[-1] // 2]
    x2 = x[..., x.shape[-1] // 2 :]
    return torch.cat((-x2, x1), dim=-1)


def apply_rotary_pos_emb(q, k, cos, sin, position_ids=None, unsqueeze_dim=1):
    """Applies Rotary Position Embedding to the query and key tensors.

    Args:
        q (`torch.Tensor`): The query tensor.
        k (`torch.Tensor`): The key tensor.
        cos (`torch.Tensor`): The cosine part of the rotary embedding.
        sin (`torch.Tensor`): The sine part of the rotary embedding.
        position_ids (`torch.Tensor`, *optional*):
            Deprecated and unused.
        unsqueeze_dim (`int`, *optional*, defaults to 1):
            The 'unsqueeze_dim' argument specifies the dimension along which to unsqueeze cos[position_ids] and
            sin[position_ids] so that they can be properly broadcasted to the dimensions of q and k. For example, note
            that cos[position_ids] and sin[position_ids] have the shape [batch_size, seq_len, head_dim]. Then, if q and
            k have the shape [batch_size, heads, seq_len, head_dim], then setting unsqueeze_dim=1 makes
            cos[position_ids] and sin[position_ids] broadcastable to the shapes of q and k. Similarly, if q and k have
            the shape [batch_size, seq_len, heads, head_dim], then set unsqueeze_dim=2.
    Returns:
        `tuple(torch.Tensor)` comprising of the query and key tensors rotated using the Rotary Position Embedding.
    """
    cos = cos.unsqueeze(unsqueeze_dim)
    sin = sin.unsqueeze(unsqueeze_dim)

    # Keep half or full tensor for later concatenation
    rotary_dim = cos.shape[-1]
    q_rot, q_pass = q[..., :rotary_dim], q[..., rotary_dim:]
    k_rot, k_pass = k[..., :rotary_dim], k[..., rotary_dim:]

    # Apply rotary embeddings on the first half or full tensor
    q_embed = (q_rot * cos) + (rotate_half(q_rot) * sin)
    k_embed = (k_rot * cos) + (rotate_half(k_rot) * sin)

    # Concatenate back to full shape
    q_embed = torch.cat([q_embed, q_pass], dim=-1)
    k_embed = torch.cat([k_embed, k_pass], dim=-1)
    return q_embed, k_embed

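# Note on the block rotation above: with head_dim = 4,
# rotate_half([x0, x1, x2, x3]) == [-x2, -x3, x0, x1], i.e. dimension i is paired
# with dimension i + head_dim // 2 (Llama-style). GLM4's interleaved variant
# instead pairs adjacent dimensions (x0, x1), (x2, x3); this is the RoPE change
# called out in the header of this file.
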
class SolarOpenAttention(nn.Module):
    """Multi-headed attention from 'Attention Is All You Need' paper"""

    def __init__(self, config: SolarOpenConfig, layer_idx: Optional[int] = None):
        super().__init__()
        self.config = config
        self.layer_idx = layer_idx
        self.head_dim = getattr(config, "head_dim", config.hidden_size // config.num_attention_heads)
        self.num_key_value_groups = config.num_attention_heads // config.num_key_value_heads
        self.scaling = self.head_dim**-0.5
        self.rope_scaling = config.rope_scaling
        self.attention_dropout = config.attention_dropout
        self.is_causal = True

        self.q_proj = nn.Linear(
            config.hidden_size, config.num_attention_heads * self.head_dim, bias=config.attention_bias
        )
        self.k_proj = nn.Linear(
            config.hidden_size, config.num_key_value_heads * self.head_dim, bias=config.attention_bias
        )
        self.v_proj = nn.Linear(
            config.hidden_size, config.num_key_value_heads * self.head_dim, bias=config.attention_bias
        )
        self.o_proj = nn.Linear(config.num_attention_heads * self.head_dim, config.hidden_size, bias=False)
        self.use_qk_norm = config.use_qk_norm
        if self.use_qk_norm:
            self.q_norm = SolarOpenRMSNorm(self.head_dim, eps=config.rms_norm_eps)
            self.k_norm = SolarOpenRMSNorm(self.head_dim, eps=config.rms_norm_eps)

    @deprecate_kwarg("past_key_value", new_name="past_key_values", version="4.58")
    def forward(
        self,
        hidden_states: torch.Tensor,
        position_embeddings: tuple[torch.Tensor, torch.Tensor],
        attention_mask: Optional[torch.Tensor],
        past_key_values: Optional[Cache] = None,
        cache_position: Optional[torch.LongTensor] = None,
        **kwargs: Unpack[FlashAttentionKwargs],
    ) -> tuple[torch.Tensor, Optional[torch.Tensor]]:
        input_shape = hidden_states.shape[:-1]
        hidden_shape = (*input_shape, -1, self.head_dim)

        query_states = self.q_proj(hidden_states).view(hidden_shape)
        key_states = self.k_proj(hidden_states).view(hidden_shape)
        value_states = self.v_proj(hidden_states).view(hidden_shape)

        if self.use_qk_norm:  # main diff from Llama
            query_states = self.q_norm(query_states)
            key_states = self.k_norm(key_states)

        query_states = query_states.transpose(1, 2)
        key_states = key_states.transpose(1, 2)
        value_states = value_states.transpose(1, 2)

        cos, sin = position_embeddings
        query_states, key_states = apply_rotary_pos_emb(query_states, key_states, cos, sin)

        if past_key_values is not None:
            # sin and cos are specific to RoPE models; position_ids needed for the static cache
            cache_kwargs = {"sin": sin, "cos": cos, "cache_position": cache_position}
            key_states, value_states = past_key_values.update(key_states, value_states, self.layer_idx, cache_kwargs)

        attention_interface: Callable = eager_attention_forward
        if self.config._attn_implementation != "eager":
            attention_interface = ALL_ATTENTION_FUNCTIONS[self.config._attn_implementation]

        attn_output, attn_weights = attention_interface(
            self,
            query_states,
            key_states,
            value_states,
            attention_mask,
            dropout=0.0 if not self.training else self.attention_dropout,
            scaling=self.scaling,
            **kwargs,
        )

        attn_output = attn_output.reshape(*input_shape, -1).contiguous()
        attn_output = self.o_proj(attn_output)
        return attn_output, attn_weights


class SolarOpenMLP(nn.Module):
    def __init__(self, config, hidden_size=None, intermediate_size=None):
        super().__init__()
        self.config = config
        self.hidden_size = config.hidden_size if hidden_size is None else hidden_size
        self.intermediate_size = config.intermediate_size if intermediate_size is None else intermediate_size

        self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size, bias=False)
        self.up_proj = nn.Linear(self.hidden_size, self.intermediate_size, bias=False)
        self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False)
        self.act_fn = ACT2FN[config.hidden_act]

    def forward(self, x):
        down_proj = self.down_proj(self.act_fn(self.gate_proj(x)) * self.up_proj(x))
        return down_proj


class SolarOpenTopkRouter(nn.Module):
    def __init__(self, config: SolarOpenConfig):
        super().__init__()
        self.config = config
        self.top_k = config.num_experts_per_tok
        self.n_routed_experts = config.n_routed_experts
        self.routed_scaling_factor = config.routed_scaling_factor
        self.n_group = config.n_group
        self.topk_group = config.topk_group
        self.norm_topk_prob = config.norm_topk_prob

        self.weight = nn.Parameter(torch.empty((self.n_routed_experts, config.hidden_size)))
        self.e_score_correction_bias = nn.Parameter(
            torch.zeros((self.n_routed_experts), dtype=torch.float32))

        self._gate_logits = None

    @torch.no_grad()
    def get_topk_indices(self, scores):
        scores_for_choice = scores.view(-1, self.n_routed_experts) + self.e_score_correction_bias.unsqueeze(0)
        group_scores = (
            scores_for_choice.view(-1, self.n_group, self.n_routed_experts // self.n_group)
            .topk(2, dim=-1)[0]
            .sum(dim=-1)
        )
        group_idx = torch.topk(group_scores, k=self.topk_group, dim=-1, sorted=False)[1]
        group_mask = torch.zeros_like(group_scores)
        group_mask.scatter_(1, group_idx, 1)
        score_mask = (
            group_mask.unsqueeze(-1)
            .expand(-1, self.n_group, self.n_routed_experts // self.n_group)
            .reshape(-1, self.n_routed_experts)
        )
        scores_for_choice = scores_for_choice.masked_fill(~score_mask.bool(), 0.0)
        topk_indices = torch.topk(scores_for_choice, k=self.top_k, dim=-1, sorted=False)[1]
        return topk_indices

    def forward(self, hidden_states):
        hidden_states = hidden_states.view(-1, self.config.hidden_size)
        router_logits = F.linear(hidden_states.type(torch.float32), self.weight.type(torch.float32))
        self._gate_logits = router_logits
        scores = router_logits.sigmoid()
        topk_indices = self.get_topk_indices(scores)
        topk_weights = scores.gather(1, topk_indices)
        if self.norm_topk_prob:
            denominator = topk_weights.sum(dim=-1, keepdim=True) + 1e-20
            topk_weights /= denominator
        topk_weights = topk_weights * self.routed_scaling_factor
        return topk_indices, topk_weights

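# Worked example of the group-limited routing above (illustrative numbers, not
# this checkpoint's config): with n_routed_experts = 16, n_group = 4,
# topk_group = 2 and top_k = 4, each token's 16 bias-corrected scores are split
# into 4 groups of 4; each group is ranked by the sum of its two highest member
# scores, only the best 2 groups survive the mask, and the final top-4 experts
# are picked from the 8 surviving candidates. Note that the returned
# topk_weights are gathered from the unbiased sigmoid scores, so
# e_score_correction_bias only influences selection, not weighting.
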
@use_kernel_forward_from_hub("RMSNorm")
class SolarOpenRMSNorm(nn.Module):
    def __init__(self, hidden_size, eps=1e-6):
        """
        SolarOpenRMSNorm is equivalent to T5LayerNorm
        """
        super().__init__()
        self.weight = nn.Parameter(torch.ones(hidden_size))
        self.variance_epsilon = eps

    def forward(self, hidden_states):
        input_dtype = hidden_states.dtype
        hidden_states = hidden_states.to(torch.float32)
        variance = hidden_states.pow(2).mean(-1, keepdim=True)
        hidden_states = hidden_states * torch.rsqrt(variance + self.variance_epsilon)
        return self.weight * hidden_states.to(input_dtype)

    def extra_repr(self):
        return f"{tuple(self.weight.shape)}, eps={self.variance_epsilon}"


class SolarOpenMoE(nn.Module):
    """
    A mixture-of-experts module containing shared experts.
    """

    def __init__(self, config):
        super().__init__()
        self.config = config
        self.experts = nn.ModuleList(
            [
                SolarOpenMLP(config, intermediate_size=config.moe_intermediate_size)
                for _ in range(config.n_routed_experts)
            ]
        )
        self.gate = SolarOpenTopkRouter(config)
        self.shared_experts = SolarOpenMLP(
            config=config, intermediate_size=config.moe_intermediate_size * config.n_shared_experts
        )

    @torch.compiler.disable()
    def moe(self, hidden_states: torch.Tensor, topk_indices: torch.Tensor, topk_weights: torch.Tensor):
        r"""
        MoE forward pass that only executes selected experts.
        Uses @torch.compiler.disable() to allow dynamic shape operations.
        Requires the --enforce-eager flag when serving with vLLM.
        """
        final_hidden_states = torch.zeros_like(hidden_states)

        for expert_idx in range(len(self.experts)):
            expert = self.experts[expert_idx]

            # Find positions where this expert was selected
            batch_idx, topk_pos = torch.where(topk_indices == expert_idx)

            if batch_idx.numel() == 0:
                continue

            # Extract only the tokens routed to this expert
            expert_input = hidden_states[batch_idx]
            expert_output = expert(expert_input)

            # Apply weights and accumulate results
            weights = topk_weights[batch_idx, topk_pos].unsqueeze(-1)
            final_hidden_states.index_add_(0, batch_idx, (expert_output * weights).to(hidden_states.dtype))

        return final_hidden_states

    def forward(self, hidden_states):
        residuals = hidden_states
        orig_shape = hidden_states.shape
        topk_indices, topk_weights = self.gate(hidden_states)
        hidden_states = hidden_states.view(-1, hidden_states.shape[-1])
        hidden_states = self.moe(hidden_states, topk_indices, topk_weights).view(*orig_shape)
        hidden_states = hidden_states + self.shared_experts(residuals)
        return hidden_states

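# Dispatch note: after the gate, `topk_indices` and `topk_weights` both have
# shape (num_tokens, top_k). The loop in `moe` visits each expert once, gathers
# the tokens routed to it via torch.where, and scatters the weighted expert
# outputs back with index_add_, so only selected experts ever run; `forward`
# then adds the always-on shared-expert path on top for every token.
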
class SolarOpenDecoderLayer(GradientCheckpointingLayer):
    def __init__(self, config: SolarOpenConfig, layer_idx: int):
        super().__init__()
        self.hidden_size = config.hidden_size

        self.self_attn = SolarOpenAttention(config=config, layer_idx=layer_idx)

        if layer_idx >= config.first_k_dense_replace:
            self.mlp = SolarOpenMoE(config)
        else:
            self.mlp = SolarOpenMLP(config)

        self.input_layernorm = SolarOpenRMSNorm(config.hidden_size, eps=config.rms_norm_eps)
        self.post_attention_layernorm = SolarOpenRMSNorm(config.hidden_size, eps=config.rms_norm_eps)

    @deprecate_kwarg("past_key_value", new_name="past_key_values", version="4.58")
    def forward(
        self,
        hidden_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
        position_ids: Optional[torch.LongTensor] = None,
        past_key_values: Optional[Cache] = None,
        use_cache: Optional[bool] = False,
        cache_position: Optional[torch.LongTensor] = None,
        position_embeddings: Optional[tuple[torch.Tensor, torch.Tensor]] = None,  # necessary, but kept here for BC
        **kwargs: Unpack[TransformersKwargs],
    ) -> torch.Tensor:
        residual = hidden_states
        hidden_states = self.input_layernorm(hidden_states)
        # Self Attention
        hidden_states, _ = self.self_attn(
            hidden_states=hidden_states,
            attention_mask=attention_mask,
            position_ids=position_ids,
            past_key_values=past_key_values,
            use_cache=use_cache,
            cache_position=cache_position,
            position_embeddings=position_embeddings,
            **kwargs,
        )
        hidden_states = residual + hidden_states

        # Fully Connected
        residual = hidden_states
        hidden_states = self.post_attention_layernorm(hidden_states)
        hidden_states = self.mlp(hidden_states)
        hidden_states = residual + hidden_states
        return hidden_states


@auto_docstring
class SolarOpenPreTrainedModel(PreTrainedModel):
    config: SolarOpenConfig
    base_model_prefix = "model"
    supports_gradient_checkpointing = True
    _no_split_modules = ["SolarOpenDecoderLayer"]
    _skip_keys_device_placement = ["past_key_values"]
    _supports_flash_attn = True
    _supports_sdpa = True
    _supports_flex_attn = True
    _can_compile_fullgraph = False
    _supports_attention_backend = True
    _can_record_outputs = {
        "hidden_states": SolarOpenDecoderLayer,
        "attentions": SolarOpenAttention,
    }

    def _init_weights(self, module):
        super()._init_weights(module)
        if isinstance(module, SolarOpenTopkRouter):
            module.weight.data.normal_(mean=0.0, std=self.config.initializer_range)


class SolarOpenRotaryEmbedding(nn.Module):
    inv_freq: torch.Tensor  # fix linting for `register_buffer`

    def __init__(self, config: SolarOpenConfig, device=None):
        super().__init__()
        # BC: "rope_type" was originally "type"
        if hasattr(config, "rope_scaling") and isinstance(config.rope_scaling, dict):
            self.rope_type = config.rope_scaling.get("rope_type", config.rope_scaling.get("type"))
        else:
            self.rope_type = "default"
        self.max_seq_len_cached = config.max_position_embeddings
        self.original_max_seq_len = config.max_position_embeddings

        self.config = config
        self.rope_init_fn = ROPE_INIT_FUNCTIONS[self.rope_type]

        inv_freq, self.attention_scaling = self.rope_init_fn(self.config, device)
        self.register_buffer("inv_freq", inv_freq, persistent=False)
        self.original_inv_freq = self.inv_freq

    @torch.no_grad()
    @dynamic_rope_update  # power user: used with advanced RoPE types (e.g. dynamic rope)
    def forward(self, x, position_ids):
        inv_freq_expanded = self.inv_freq[None, :, None].float().expand(position_ids.shape[0], -1, 1).to(x.device)
        position_ids_expanded = position_ids[:, None, :].float()

        device_type = x.device.type if isinstance(x.device.type, str) and x.device.type != "mps" else "cpu"
        with torch.autocast(device_type=device_type, enabled=False):  # Force float32
            freqs = (inv_freq_expanded.float() @ position_ids_expanded.float()).transpose(1, 2)
            emb = torch.cat((freqs, freqs), dim=-1)
            cos = emb.cos() * self.attention_scaling
            sin = emb.sin() * self.attention_scaling

        return cos.to(dtype=x.dtype), sin.to(dtype=x.dtype)


@auto_docstring
class SolarOpenModel(SolarOpenPreTrainedModel):
    _keys_to_ignore_on_load_unexpected = [r"model\.layers\.92.*", r"model\.layers\.46.*"]

    def __init__(self, config: SolarOpenConfig):
        super().__init__(config)
        self.padding_idx = config.pad_token_id
        self.vocab_size = config.vocab_size

        self.embed_tokens = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx)
        self.layers = nn.ModuleList(
            [SolarOpenDecoderLayer(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]
        )
        self.norm = SolarOpenRMSNorm(config.hidden_size, eps=config.rms_norm_eps)
        self.rotary_emb = SolarOpenRotaryEmbedding(config=config)
        self.gradient_checkpointing = False

        # Initialize weights and apply final processing
        self.post_init()

    @check_model_inputs()
    @auto_docstring
    def forward(
        self,
        input_ids: Optional[torch.LongTensor] = None,
        attention_mask: Optional[torch.Tensor] = None,
        position_ids: Optional[torch.LongTensor] = None,
        past_key_values: Optional[Cache] = None,
        inputs_embeds: Optional[torch.FloatTensor] = None,
        cache_position: Optional[torch.LongTensor] = None,
        use_cache: Optional[bool] = None,
        **kwargs: Unpack[TransformersKwargs],
    ) -> BaseModelOutputWithPast:
        if (input_ids is None) ^ (inputs_embeds is not None):
            raise ValueError("You must specify exactly one of input_ids or inputs_embeds")

        if inputs_embeds is None:
            inputs_embeds: torch.Tensor = self.embed_tokens(input_ids)

        if use_cache and past_key_values is None:
            past_key_values = DynamicCache(config=self.config)

        if cache_position is None:
            past_seen_tokens = past_key_values.get_seq_length() if past_key_values is not None else 0
            cache_position: torch.Tensor = torch.arange(
                past_seen_tokens, past_seen_tokens + inputs_embeds.shape[1], device=inputs_embeds.device
            )

        if position_ids is None:
            position_ids = cache_position.unsqueeze(0)

        causal_mask = create_causal_mask(
            config=self.config,
            input_embeds=inputs_embeds,
            attention_mask=attention_mask,
            cache_position=cache_position,
            past_key_values=past_key_values,
            position_ids=position_ids,
        )

        hidden_states = inputs_embeds
        position_embeddings = self.rotary_emb(hidden_states, position_ids)

        for decoder_layer in self.layers[: self.config.num_hidden_layers]:
            hidden_states = decoder_layer(
                hidden_states,
                attention_mask=causal_mask,
                position_ids=position_ids,
                past_key_values=past_key_values,
                cache_position=cache_position,
                position_embeddings=position_embeddings,
                **kwargs,
            )

        hidden_states = self.norm(hidden_states)
        return BaseModelOutputWithPast(
            last_hidden_state=hidden_states,
            past_key_values=past_key_values,
        )


@auto_docstring
class SolarOpenForCausalLM(SolarOpenPreTrainedModel, GenerationMixin):
    _tied_weights_keys = ["lm_head.weight"]
    _tp_plan = {"lm_head": "colwise_rep"}
    _pp_plan = {"lm_head": (["hidden_states"], ["logits"])}

    def __init__(self, config):
        super().__init__(config)
        self.model = SolarOpenModel(config)
        self.vocab_size = config.vocab_size
        self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False)

        # Initialize weights and apply final processing
        self.post_init()

    @can_return_tuple
    @auto_docstring
    def forward(
        self,
        input_ids: Optional[torch.LongTensor] = None,
        attention_mask: Optional[torch.Tensor] = None,
        position_ids: Optional[torch.LongTensor] = None,
        past_key_values: Optional[Cache] = None,
        inputs_embeds: Optional[torch.FloatTensor] = None,
        labels: Optional[torch.LongTensor] = None,
        use_cache: Optional[bool] = None,
        cache_position: Optional[torch.LongTensor] = None,
        logits_to_keep: Union[int, torch.Tensor] = 0,
        **kwargs: Unpack[TransformersKwargs],
    ) -> CausalLMOutputWithPast:

        outputs: BaseModelOutputWithPast = self.model(
            input_ids=input_ids,
            attention_mask=attention_mask,
            position_ids=position_ids,
            past_key_values=past_key_values,
            inputs_embeds=inputs_embeds,
            use_cache=use_cache,
            cache_position=cache_position,
            **kwargs,
        )

        hidden_states = outputs.last_hidden_state
        # Only compute necessary logits, and do not upcast them to float if we are not computing the loss
        slice_indices = slice(-logits_to_keep, None) if isinstance(logits_to_keep, int) else logits_to_keep
        logits = self.lm_head(hidden_states[:, slice_indices, :])

        loss = None
        if labels is not None:
            loss = self.loss_function(logits=logits, labels=labels, vocab_size=self.config.vocab_size, **kwargs)

        return CausalLMOutputWithPast(
            loss=loss,
            logits=logits,
            past_key_values=outputs.past_key_values,
            hidden_states=outputs.hidden_states,
            attentions=outputs.attentions,
        )


__all__ = ["SolarOpenPreTrainedModel", "SolarOpenModel", "SolarOpenForCausalLM"]
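
Because the checkpoint ships this custom modeling code, loading it goes through transformers' remote-code path. A minimal sketch (the repository id below is a placeholder for this upload, not confirmed by the files above):

from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "upstage/<this-repo>"  # placeholder: substitute the actual repo id
# trust_remote_code=True makes transformers pick up modeling_solar_open.py
# and configuration_solar_open.py from the repository.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
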
parallel_tool_call_logits_processor.py
ADDED
@@ -0,0 +1,104 @@
# coding=utf-8
# Copyright 2025 Upstage AI.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from typing import TYPE_CHECKING

import torch

from vllm.sampling_params import SamplingParams
from vllm.v1.sample.logits_processor import (
    AdapterLogitsProcessor,
    RequestLogitsProcessor,
)

if TYPE_CHECKING:
    from vllm.config import VllmConfig

# Hardcoded token IDs for Solar tokenizer
TOOL_CALL_END_TOKEN_ID = 32  # <|tool_call:end|>
CALLS_TOKEN_ID = 25  # <|calls|>


class SingleToolCallEnforcer:
    """Request-level logits processor that enforces a single tool call.

    When the <|tool_call:end|> token is generated, forces the next token
    to be <|calls|> (which is a stop token), preventing parallel tool calls.
    """

    def __init__(
        self,
        tool_call_end_token_id: int,
        calls_token_id: int,
    ):
        self._tool_call_end_token_id = tool_call_end_token_id
        self._calls_token_id = calls_token_id

    def __call__(
        self,
        output_token_ids: list[int],
        logits: torch.Tensor,
    ) -> torch.Tensor:
        # Check if the last generated token is <|tool_call:end|>
        if output_token_ids and output_token_ids[-1] == self._tool_call_end_token_id:
            # Force the next token to be <|calls|> by masking all other tokens
            mask = torch.full_like(logits, -float("inf"))
            mask[self._calls_token_id] = logits[self._calls_token_id]
            return mask

        return logits


class ParallelToolCallLogitsProcessor(AdapterLogitsProcessor):
    """Logits processor that enforces a single tool call when parallel_tool_calls=False.

    When parallel_tool_calls is disabled in SamplingParams, this processor
    ensures that after <|tool_call:end|> is generated, the next token is
    forced to be <|calls|> (a stop token), preventing multiple tool calls.
    """

    def __init__(
        self,
        vllm_config: "VllmConfig",
        device: torch.device,
        is_pin_memory: bool,
    ):
        super().__init__(vllm_config, device, is_pin_memory)

    def is_argmax_invariant(self) -> bool:
        """This processor can change the argmax result by forcing specific tokens."""
        return False

    def new_req_logits_processor(
        self,
        params: SamplingParams,
    ) -> RequestLogitsProcessor | None:
        """Return a request-level logits processor if parallel_tool_calls=False.

        Args:
            params: Request sampling params

        Returns:
            SingleToolCallEnforcer if parallel_tool_calls is False, otherwise None.
        """
        # Only apply when parallel_tool_calls is explicitly disabled
        if params.parallel_tool_calls is False:
            return SingleToolCallEnforcer(
                tool_call_end_token_id=TOOL_CALL_END_TOKEN_ID,
                calls_token_id=CALLS_TOKEN_ID,
            )

        return None
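
A minimal sanity check of the enforcer above (pure torch; the token IDs are the module's hardcoded Solar IDs, and the toy vocabulary size is arbitrary):

import torch

enforcer = SingleToolCallEnforcer(
    tool_call_end_token_id=TOOL_CALL_END_TOKEN_ID,  # 32
    calls_token_id=CALLS_TOKEN_ID,  # 25
)
logits = torch.randn(100)  # toy vocabulary of 100 tokens
# Last generated token is <|tool_call:end|>: everything except <|calls|> is masked.
masked = enforcer([31, 33, 34, 32], logits.clone())
assert masked.argmax().item() == CALLS_TOKEN_ID
# Any other history passes the logits through unchanged.
assert torch.equal(enforcer([31, 33], logits.clone()), logits)
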
solar_open_logits_processor.py
ADDED
@@ -0,0 +1,763 @@
# coding=utf-8
# Copyright 2025 Upstage AI.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
from enum import Enum
from typing import TYPE_CHECKING

import torch

from vllm.sampling_params import SamplingParams
from vllm.v1.sample.logits_processor import (
    AdapterLogitsProcessor,
    RequestLogitsProcessor,
)

if TYPE_CHECKING:
    from vllm.config import VllmConfig

# Hardcoded token IDs for Solar tokenizer

# Special token IDs for chat template
BEGIN_TOKEN_ID = 20  # <|begin|>
END_TOKEN_ID = 21  # <|end|>
THINK_TOKEN_ID = 22  # <|think|>
CONTENT_TOKEN_ID = 23  # <|content|>
FLUSH_TOKEN_ID = 24  # <|flush|> (eos token)
ASSISTANT_TOKEN_ID = 163444  # assistant
'''
'assistant' is not strictly a special token, but it is treated as one by the
logits processing.
'''

# Tool call related tokens
CALLS_TOKEN_ID = 25  # <|calls|> (eos token for tool calls)
TOOL_CALLS_TOKEN_ID = 30  # <|tool_calls|>
TOOL_CALL_BEGIN_TOKEN_ID = 31  # <|tool_call:begin|>
TOOL_CALL_END_TOKEN_ID = 32  # <|tool_call:end|>
TOOL_CALL_NAME_TOKEN_ID = 33  # <|tool_call:name|>
TOOL_CALL_ARGS_TOKEN_ID = 34  # <|tool_call:args|>

# =============================================================================
# Dynamic Reasoning Budget Configuration
# =============================================================================
# budget = min(max_budget, max(min_budget, max_tokens * ratio / 100))
# Priority: max_budget > min_budget > ratio
#
# Available environment variables:
# HIGH effort:
#   SOLAR_REASONING_BUDGET_HIGH_MAX   (default: 32768) - max_budget
#   SOLAR_REASONING_BUDGET_HIGH_MIN   (default: 8192)  - min_budget
#   SOLAR_REASONING_BUDGET_HIGH_RATIO (default: 60)    - % of max_tokens
#
# MEDIUM effort:
#   SOLAR_REASONING_BUDGET_MEDIUM_MAX   (default: 16384) - max_budget
#   SOLAR_REASONING_BUDGET_MEDIUM_MIN   (default: 4096)  - min_budget
#   SOLAR_REASONING_BUDGET_MEDIUM_RATIO (default: 30)    - % of max_tokens
#
# Tool call:
#   SOLAR_TOOL_CALL_ID_BUDGET (default: 10) - Max tokens for tool call ID
# =============================================================================

DEFAULT_REASONING_EFFORT = "high"

# HIGH effort settings (1k = 1024 tokens)
DEFAULT_REASONING_BUDGET_HIGH_MAX = 32 * 1024
DEFAULT_REASONING_BUDGET_HIGH_MIN = 8 * 1024
DEFAULT_REASONING_BUDGET_HIGH_RATIO = 60

# MEDIUM effort settings
DEFAULT_REASONING_BUDGET_MEDIUM_MAX = 16 * 1024
DEFAULT_REASONING_BUDGET_MEDIUM_MIN = 4 * 1024
DEFAULT_REASONING_BUDGET_MEDIUM_RATIO = 30

# Tool call settings
DEFAULT_TOOL_CALL_ID_BUDGET = 10

# Pre-computed constant to avoid repeated string parsing
NEG_INF = float("-inf")

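# Worked example of the budget formula above (illustrative numbers): with
# max_tokens = 20480 and "high" effort under the defaults,
#   budget = min(32768, max(8192, 20480 * 60 / 100)) = 12288,
# while a small max_tokens such as 2048 is clamped up to the 8192 floor.
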
def is_reasoning_request(params: SamplingParams) -> bool:
    """Check if the request is a reasoning request based on reasoning_effort."""
    return (params.reasoning_effort is None) or (params.reasoning_effort in ("medium", "high"))


def is_structured_outputs(params: SamplingParams) -> bool:
    """Check if the request has structured outputs constraints."""
    return (
        params.structured_outputs is not None
        and not params.structured_outputs.all_constraints_none()
    )


class GenerationState(Enum):
    """Enum representing the current state of response generation."""

    # Initial state - no tokens generated yet
    INITIAL = "initial"

    # New message states (after think_end)
    NEW_MESSAGE_BEGIN = "new_message_begin"  # <|begin|> token was just generated
    NEW_MESSAGE_ASSISTANT = "new_message_assistant"  # assistant token after <|begin|>

    # Think mode states
    THINK_BEGIN = "think_begin"  # <|think|> token was just generated
    THINK_IN_PROGRESS = "think_in_progress"  # Generating think content
    THINK_END = "think_end"  # <|end|> after think content
    THINK_FLUSH = "think_flush"  # <|flush|> after think content

    # Content states
    CONTENT_BEGIN = "content_begin"  # <|content|> token was just generated
    CONTENT_IN_PROGRESS = "content_in_progress"  # Generating content
    CONTENT_END = "content_end"  # <|end|> after content
    CONTENT_FLUSH = "content_flush"  # <|flush|> after content

    # Tool call states
    # Flow: <|tool_calls|> -> (<|tool_call:begin|> -> id -> <|tool_call:name|> -> name -> <|tool_call:args|> -> args -> <|tool_call:end|>)+ -> <|calls|>
    # Note: Think message can appear before <|tool_calls|>
    TOOL_CALLS_BEGIN = "tool_calls_begin"  # <|tool_calls|> token was just generated
    TOOL_CALL_BEGIN = "tool_call_begin"  # <|tool_call:begin|> token was just generated
    TOOL_CALL_ID_IN_PROGRESS = "tool_call_id_in_progress"  # Generating tool call ID
    TOOL_CALL_NAME_BEGIN = "tool_call_name_begin"  # <|tool_call:name|> token was just generated
    TOOL_CALL_NAME_IN_PROGRESS = "tool_call_name_in_progress"  # Generating tool name
    TOOL_CALL_ARGS_BEGIN = "tool_call_args_begin"  # <|tool_call:args|> token was just generated
    TOOL_CALL_ARGS_IN_PROGRESS = "tool_call_args_in_progress"  # Generating tool arguments (JSON)
    TOOL_CALL_END = "tool_call_end"  # <|tool_call:end|> token was just generated (can start another tool call or end)
    CALLS = "calls"  # <|calls|> token was just generated (eos token for tool calls)


def get_generation_state(
    output_token_ids: list[int],
    begin_token_id: int = BEGIN_TOKEN_ID,
    end_token_id: int = END_TOKEN_ID,
    flush_token_id: int = FLUSH_TOKEN_ID,
    think_token_id: int = THINK_TOKEN_ID,
    content_token_id: int = CONTENT_TOKEN_ID,
    tool_calls_token_id: int = TOOL_CALLS_TOKEN_ID,
    tool_call_begin_token_id: int = TOOL_CALL_BEGIN_TOKEN_ID,
    tool_call_name_token_id: int = TOOL_CALL_NAME_TOKEN_ID,
    tool_call_args_token_id: int = TOOL_CALL_ARGS_TOKEN_ID,
    tool_call_end_token_id: int = TOOL_CALL_END_TOKEN_ID,
    calls_token_id: int = CALLS_TOKEN_ID,
    assistant_token_id: int = ASSISTANT_TOKEN_ID,
) -> GenerationState:
    """Determine the current generation state based on output token IDs.

    Analyzes the sequence of generated tokens to determine which phase
    of the chat template the generation is currently in.

    Response format specs:
    - think mode: <|think|>{{think-tokens}}<|end|><|begin|>assistant<|content|>{{content-tokens}}<|flush|>
    - tool mode: <|begin|>assistant<|tool_calls|><|tool_call:begin|>{{id}}<|tool_call:name|>{{name}}<|tool_call:args|>{{args}}<|tool_call:end|><|calls|>
    - tool mode (with think): <|think|>{{think-tokens}}<|end|><|begin|>assistant<|tool_calls|>...<|calls|>
    - no-think mode: <|content|>{{content-tokens}}<|flush|>

    Args:
        output_token_ids: List of token IDs generated so far.
        begin_token_id: Token ID for <|begin|>.
        end_token_id: Token ID for <|end|>.
        flush_token_id: Token ID for <|flush|> (eos).
        think_token_id: Token ID for <|think|>.
        content_token_id: Token ID for <|content|>.
        tool_calls_token_id: Token ID for <|tool_calls|>.
        tool_call_begin_token_id: Token ID for <|tool_call:begin|>.
        tool_call_name_token_id: Token ID for <|tool_call:name|>.
        tool_call_args_token_id: Token ID for <|tool_call:args|>.
        tool_call_end_token_id: Token ID for <|tool_call:end|>.
        calls_token_id: Token ID for <|calls|> (eos).
        assistant_token_id: Token ID for assistant.

    Returns:
        GenerationState indicating the current phase of generation.
    """
    if not output_token_ids:
        return GenerationState.INITIAL

    # Track state by scanning through tokens
    state = GenerationState.INITIAL
    in_think = False
    in_content = False

    for token_id in output_token_ids:
        if token_id == think_token_id:
            state = GenerationState.THINK_BEGIN
            in_think = True
            in_content = False

        elif token_id == content_token_id:
            state = GenerationState.CONTENT_BEGIN
            in_content = True
            in_think = False

        elif token_id == tool_calls_token_id:
            state = GenerationState.TOOL_CALLS_BEGIN
            in_think = False
            in_content = False

        elif token_id == tool_call_begin_token_id:
            state = GenerationState.TOOL_CALL_BEGIN

        elif token_id == tool_call_name_token_id:
            state = GenerationState.TOOL_CALL_NAME_BEGIN

        elif token_id == tool_call_args_token_id:
            state = GenerationState.TOOL_CALL_ARGS_BEGIN

        elif token_id == tool_call_end_token_id:
            state = GenerationState.TOOL_CALL_END

        elif token_id == calls_token_id:
            state = GenerationState.CALLS

        elif token_id == begin_token_id:
            state = GenerationState.NEW_MESSAGE_BEGIN

        elif token_id == assistant_token_id:
            if state == GenerationState.NEW_MESSAGE_BEGIN:
                state = GenerationState.NEW_MESSAGE_ASSISTANT

        elif token_id == end_token_id:
            if in_think:
                state = GenerationState.THINK_END
                in_think = False
            elif in_content:
                state = GenerationState.CONTENT_END
                in_content = False

        elif token_id == flush_token_id:
            if in_think:
                state = GenerationState.THINK_FLUSH
                in_think = False
            elif in_content:
                state = GenerationState.CONTENT_FLUSH
                in_content = False

        else:
            # Regular token - update state based on current context
            if state == GenerationState.THINK_BEGIN:
                state = GenerationState.THINK_IN_PROGRESS
            elif state == GenerationState.THINK_IN_PROGRESS:
                pass  # Stay in think_in_progress
            elif state == GenerationState.CONTENT_BEGIN:
                state = GenerationState.CONTENT_IN_PROGRESS
            elif state == GenerationState.CONTENT_IN_PROGRESS:
                pass  # Stay in content_in_progress
            elif state == GenerationState.TOOL_CALL_BEGIN:
                state = GenerationState.TOOL_CALL_ID_IN_PROGRESS
            elif state == GenerationState.TOOL_CALL_ID_IN_PROGRESS:
                pass  # Stay in tool_call_id_in_progress
            elif state == GenerationState.TOOL_CALL_NAME_BEGIN:
                state = GenerationState.TOOL_CALL_NAME_IN_PROGRESS
            elif state == GenerationState.TOOL_CALL_NAME_IN_PROGRESS:
                pass  # Stay in tool_call_name_in_progress
            elif state == GenerationState.TOOL_CALL_ARGS_BEGIN:
                state = GenerationState.TOOL_CALL_ARGS_IN_PROGRESS
            elif state == GenerationState.TOOL_CALL_ARGS_IN_PROGRESS:
                pass  # Stay in tool_call_args_in_progress

    return state

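# Example trace for the scanner above (1001/1002 stand in for ordinary tokens):
# get_generation_state([22, 1001, 1002, 21]) sees <|think|> (THINK_BEGIN), two
# ordinary tokens (THINK_IN_PROGRESS), and an <|end|> while in_think, so it
# returns GenerationState.THINK_END.
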
| 274 |
+
# Pre-computed list of all special token IDs for batch indexing
|
| 275 |
+
_ALL_SPECIAL_TOKEN_IDS = [
|
| 276 |
+
BEGIN_TOKEN_ID,
|
| 277 |
+
END_TOKEN_ID,
|
| 278 |
+
THINK_TOKEN_ID,
|
| 279 |
+
CONTENT_TOKEN_ID,
|
| 280 |
+
FLUSH_TOKEN_ID,
|
| 281 |
+
CALLS_TOKEN_ID,
|
| 282 |
+
TOOL_CALLS_TOKEN_ID,
|
| 283 |
+
TOOL_CALL_BEGIN_TOKEN_ID,
|
| 284 |
+
TOOL_CALL_END_TOKEN_ID,
|
| 285 |
+
TOOL_CALL_NAME_TOKEN_ID,
|
| 286 |
+
TOOL_CALL_ARGS_TOKEN_ID,
|
| 287 |
+
]
|
| 288 |
+
|
| 289 |
+
# Pre-computed lists for state-specific batch indexing (excluding allowed tokens)
|
| 290 |
+
_SPECIAL_EXCEPT_END = [ # For THINK states (allow END)
|
| 291 |
+
BEGIN_TOKEN_ID, FLUSH_TOKEN_ID, THINK_TOKEN_ID, CONTENT_TOKEN_ID,
|
| 292 |
+
TOOL_CALLS_TOKEN_ID, CALLS_TOKEN_ID, TOOL_CALL_BEGIN_TOKEN_ID,
|
| 293 |
+
TOOL_CALL_END_TOKEN_ID, TOOL_CALL_NAME_TOKEN_ID, TOOL_CALL_ARGS_TOKEN_ID,
|
| 294 |
+
]
|
| 295 |
+
|
| 296 |
+
_SPECIAL_EXCEPT_CONTENT_TOOLCALLS = [ # For NEW_MESSAGE_ASSISTANT (allow CONTENT, TOOL_CALLS)
|
| 297 |
+
THINK_TOKEN_ID, BEGIN_TOKEN_ID, END_TOKEN_ID, FLUSH_TOKEN_ID,
|
| 298 |
+
CALLS_TOKEN_ID, TOOL_CALL_BEGIN_TOKEN_ID, TOOL_CALL_END_TOKEN_ID,
|
| 299 |
+
TOOL_CALL_NAME_TOKEN_ID, TOOL_CALL_ARGS_TOKEN_ID,
|
| 300 |
+
]
|
| 301 |
+
|
| 302 |
+
_SPECIAL_EXCEPT_FLUSH = [ # For CONTENT states (allow FLUSH)
|
| 303 |
+
BEGIN_TOKEN_ID, END_TOKEN_ID, THINK_TOKEN_ID, CONTENT_TOKEN_ID,
|
| 304 |
+
TOOL_CALLS_TOKEN_ID, CALLS_TOKEN_ID, TOOL_CALL_BEGIN_TOKEN_ID,
|
| 305 |
+
TOOL_CALL_END_TOKEN_ID, TOOL_CALL_NAME_TOKEN_ID, TOOL_CALL_ARGS_TOKEN_ID,
|
| 306 |
+
]
|
| 307 |
+
|
| 308 |
+
_SPECIAL_EXCEPT_TOOLCALL_NAME = [ # For TOOL_CALL_ID_IN_PROGRESS (allow TOOL_CALL_NAME)
|
| 309 |
+
BEGIN_TOKEN_ID, END_TOKEN_ID, THINK_TOKEN_ID, CONTENT_TOKEN_ID,
|
| 310 |
+
FLUSH_TOKEN_ID, CALLS_TOKEN_ID, TOOL_CALLS_TOKEN_ID,
|
| 311 |
+
TOOL_CALL_BEGIN_TOKEN_ID, TOOL_CALL_END_TOKEN_ID, TOOL_CALL_ARGS_TOKEN_ID,
|
| 312 |
+
]
|
| 313 |
+
|
| 314 |
+
_SPECIAL_EXCEPT_TOOLCALL_ARGS = [ # For TOOL_CALL_NAME_IN_PROGRESS (allow TOOL_CALL_ARGS)
|
| 315 |
+
BEGIN_TOKEN_ID, END_TOKEN_ID, THINK_TOKEN_ID, CONTENT_TOKEN_ID,
|
| 316 |
+
FLUSH_TOKEN_ID, CALLS_TOKEN_ID, TOOL_CALLS_TOKEN_ID,
|
| 317 |
+
TOOL_CALL_BEGIN_TOKEN_ID, TOOL_CALL_END_TOKEN_ID, TOOL_CALL_NAME_TOKEN_ID,
|
| 318 |
+
]
|
| 319 |
+
|
| 320 |
+
_SPECIAL_EXCEPT_TOOLCALL_END = [ # For TOOL_CALL_ARGS_IN_PROGRESS (allow TOOL_CALL_END)
|
| 321 |
+
BEGIN_TOKEN_ID, END_TOKEN_ID, THINK_TOKEN_ID, CONTENT_TOKEN_ID,
|
| 322 |
+
FLUSH_TOKEN_ID, CALLS_TOKEN_ID, TOOL_CALLS_TOKEN_ID,
|
| 323 |
+
TOOL_CALL_BEGIN_TOKEN_ID, TOOL_CALL_NAME_TOKEN_ID, TOOL_CALL_ARGS_TOKEN_ID,
|
| 324 |
+
]
|
| 325 |
+
|
| 326 |
+
|
| 327 |
+
def _forbid_all_special_tokens(logits: torch.Tensor) -> None:
|
| 328 |
+
"""Set all special token logits to -inf."""
|
| 329 |
+
logits[_ALL_SPECIAL_TOKEN_IDS] = NEG_INF
|
| 330 |
+
|
| 331 |
+
|
| 332 |
+
class SolarOpenTemplateEnforcer:
    """Request-level logits processor that enforces the Solar Open chat template.

    Enforces the following generation rules:
    - think mode: <|think|>{{tokens}}<|end|><|begin|>assistant<|content|>{{tokens}}<|flush|>
    - tool mode: <|tool_calls|><|tool_call:begin|>{{id}}<|tool_call:name|>{{name}}<|tool_call:args|>{{args}}<|tool_call:end|><|calls|>
    - tool+think mode: <|think|>{{tokens}}<|end|><|begin|>assistant<|tool_calls|>...<|calls|>
    - no-think mode: <|content|>{{tokens}}<|flush|>

    Key constraints:
    - A think message can only appear first
    - A think message must be followed by another message
    - Content and tool messages cannot coexist
    - Maximum 2 messages (think + content/tool, or just content/tool)

    Performance optimizations:
    - Uses incremental state tracking to avoid a full token-sequence scan on each call
    - Maintains local counters for budget tracking
    - Uses pre-computed constants to avoid repeated object creation
    """

    # Pre-computed frozenset for the reasoning-state check (avoids set creation per call)
    _REASONING_STATES = frozenset({
        GenerationState.INITIAL,
        GenerationState.THINK_BEGIN,
        GenerationState.THINK_IN_PROGRESS,
    })

    def __init__(
        self,
        is_reasoning_request: bool,
        is_structured_outputs: bool,
        reasoning_budget: int | None = None,
        tool_call_id_budget: int = DEFAULT_TOOL_CALL_ID_BUDGET,
    ):
        self._is_reasoning_request = is_reasoning_request
        self._is_structured_outputs = is_structured_outputs
        self._reasoning_budget = reasoning_budget
        self._tool_call_id_budget = tool_call_id_budget

        # Incremental state tracking
        self._state = GenerationState.INITIAL
        self._last_processed_len = 0
        self._in_think = False
        self._in_content = False

        # Budget counters
        self._think_token_count = 0
        self._tool_call_id_token_count = 0

    def _reset_state(self) -> None:
        """Reset all incremental state to initial values.

        Called when defensive reprocessing is needed (e.g., token sequence inconsistency).
        """
        self._state = GenerationState.INITIAL
        self._last_processed_len = 0
        self._in_think = False
        self._in_content = False
        self._think_token_count = 0
        self._tool_call_id_token_count = 0

    def _process_token(self, token_id: int) -> None:
        """Process a single token and update internal state incrementally.

        Args:
            token_id: The token ID to process.
        """
        if token_id == THINK_TOKEN_ID:
            self._state = GenerationState.THINK_BEGIN
            self._in_think = True
            self._in_content = False
            self._think_token_count = 0  # Reset counter for new think block

        elif token_id == CONTENT_TOKEN_ID:
            self._state = GenerationState.CONTENT_BEGIN
            self._in_content = True
            self._in_think = False

        elif token_id == TOOL_CALLS_TOKEN_ID:
            self._state = GenerationState.TOOL_CALLS_BEGIN
            self._in_think = False
            self._in_content = False

        elif token_id == TOOL_CALL_BEGIN_TOKEN_ID:
            self._state = GenerationState.TOOL_CALL_BEGIN
            self._tool_call_id_token_count = 0  # Reset counter for new tool call

        elif token_id == TOOL_CALL_NAME_TOKEN_ID:
            self._state = GenerationState.TOOL_CALL_NAME_BEGIN

        elif token_id == TOOL_CALL_ARGS_TOKEN_ID:
            self._state = GenerationState.TOOL_CALL_ARGS_BEGIN

        elif token_id == TOOL_CALL_END_TOKEN_ID:
            self._state = GenerationState.TOOL_CALL_END

        elif token_id == CALLS_TOKEN_ID:
            self._state = GenerationState.CALLS

        elif token_id == BEGIN_TOKEN_ID:
            self._state = GenerationState.NEW_MESSAGE_BEGIN

        elif token_id == ASSISTANT_TOKEN_ID:
            if self._state == GenerationState.NEW_MESSAGE_BEGIN:
                self._state = GenerationState.NEW_MESSAGE_ASSISTANT

        elif token_id == END_TOKEN_ID:
            if self._in_think:
                self._state = GenerationState.THINK_END
                self._in_think = False
            elif self._in_content:
                self._state = GenerationState.CONTENT_END
                self._in_content = False

        elif token_id == FLUSH_TOKEN_ID:
            if self._in_think:
                self._state = GenerationState.THINK_FLUSH
                self._in_think = False
            elif self._in_content:
                self._state = GenerationState.CONTENT_FLUSH
                self._in_content = False

        else:
            # Regular token - update state and counters based on current context
            if self._state == GenerationState.THINK_BEGIN:
                self._state = GenerationState.THINK_IN_PROGRESS
                self._think_token_count += 1
            elif self._state == GenerationState.THINK_IN_PROGRESS:
                self._think_token_count += 1
            elif self._state == GenerationState.CONTENT_BEGIN:
                self._state = GenerationState.CONTENT_IN_PROGRESS
            elif self._state == GenerationState.CONTENT_IN_PROGRESS:
                pass  # Stay in CONTENT_IN_PROGRESS
            elif self._state == GenerationState.TOOL_CALL_BEGIN:
                self._state = GenerationState.TOOL_CALL_ID_IN_PROGRESS
                self._tool_call_id_token_count += 1
            elif self._state == GenerationState.TOOL_CALL_ID_IN_PROGRESS:
                self._tool_call_id_token_count += 1
            elif self._state == GenerationState.TOOL_CALL_NAME_BEGIN:
                self._state = GenerationState.TOOL_CALL_NAME_IN_PROGRESS
            elif self._state == GenerationState.TOOL_CALL_NAME_IN_PROGRESS:
                pass  # Stay in TOOL_CALL_NAME_IN_PROGRESS
            elif self._state == GenerationState.TOOL_CALL_ARGS_BEGIN:
                self._state = GenerationState.TOOL_CALL_ARGS_IN_PROGRESS
            elif self._state == GenerationState.TOOL_CALL_ARGS_IN_PROGRESS:
                pass  # Stay in TOOL_CALL_ARGS_IN_PROGRESS

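# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the per-token FSM idea
# above, reduced to the think phase only, with made-up ids 100 (<|think|>)
# and 101 (<|end|>); everything else counts as a regular token.
from enum import Enum as _Enum, auto as _auto

class _DemoState(_Enum):
    INITIAL = _auto()
    THINK_BEGIN = _auto()
    THINK_IN_PROGRESS = _auto()
    THINK_END = _auto()

def _demo_step(state, tok):
    if tok == 100:
        return _DemoState.THINK_BEGIN
    if tok == 101:
        return _DemoState.THINK_END
    if state in (_DemoState.THINK_BEGIN, _DemoState.THINK_IN_PROGRESS):
        return _DemoState.THINK_IN_PROGRESS  # regular token inside the think block
    return state

_s = _DemoState.INITIAL
for _tok in [100, 7, 8, 9, 101]:  # <|think|> t t t <|end|>
    _s = _demo_step(_s, _tok)
print(_s)  # _DemoState.THINK_END
# ---------------------------------------------------------------------------
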
    def _update_state_incremental(self, output_token_ids: list[int]) -> None:
        """Update internal state by processing only new tokens.

        Args:
            output_token_ids: Full list of output token IDs.
        """
        current_len = len(output_token_ids)

        # Defensive check: if the token sequence is shorter than expected, reset and reprocess
        if current_len < self._last_processed_len:
            self._reset_state()

        # Process only new tokens
        for i in range(self._last_processed_len, current_len):
            self._process_token(output_token_ids[i])

        self._last_processed_len = current_len

    @staticmethod
    def _count_think_tokens(output_token_ids: list[int]) -> int:
        """Count the number of tokens generated after the <|think|> token.

        Returns 0 if the <|think|> token is not found (defensive).
        Note: This static method is kept for backward compatibility and testing.
        The incremental version uses _think_token_count instead.
        """
        try:
            think_index = output_token_ids.index(THINK_TOKEN_ID)
            return len(output_token_ids) - think_index - 1
        except ValueError:
            return 0

    @staticmethod
    def _count_tool_call_id_tokens(output_token_ids: list[int]) -> int:
        """Count the number of tokens generated after the last <|tool_call:begin|> token.

        Returns 0 if the <|tool_call:begin|> token is not found (defensive).
        Note: This static method is kept for backward compatibility and testing.
        The incremental version uses _tool_call_id_token_count instead.
        """
        # Find the last occurrence of <|tool_call:begin|> for multi-tool-call support
        try:
            # Reverse search for the last <|tool_call:begin|>
            reversed_index = output_token_ids[::-1].index(TOOL_CALL_BEGIN_TOKEN_ID)
            last_begin_index = len(output_token_ids) - 1 - reversed_index
            return len(output_token_ids) - last_begin_index - 1
        except ValueError:
            return 0

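# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the reverse-search
# arithmetic in _count_tool_call_id_tokens, with 200 standing in for
# TOOL_CALL_BEGIN_TOKEN_ID.
_ids = [1, 200, 5, 6, 200, 7, 8, 9]                  # two calls; the last begins at index 4
_reversed_index = _ids[::-1].index(200)              # 3 = distance from the end
_last_begin_index = len(_ids) - 1 - _reversed_index  # 4
print(len(_ids) - _last_begin_index - 1)             # 3 tokens follow the last begin tag
# ---------------------------------------------------------------------------
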
    def __call__(
        self,
        output_token_ids: list[int],
        logits: torch.Tensor,
    ) -> torch.Tensor:
        # Update state incrementally (only process new tokens)
        self._update_state_incremental(output_token_ids)
        state = self._state

        # Handle structured outputs mode
        if self._is_structured_outputs:
            if not self._is_reasoning_request:
                # Non-reasoning request with structured outputs: no logit control
                return logits
            else:
                # Reasoning request with structured outputs:
                # control logits only during the reasoning phase
                if state not in self._REASONING_STATES:
                    # Reasoning finished, let structured outputs handle it
                    return logits

        if state == GenerationState.INITIAL:
            if self._is_reasoning_request:
                # Force: <|think|> only (a reasoning request must start with think)
                think_logit = logits[THINK_TOKEN_ID].clone()
                logits.fill_(NEG_INF)
                logits[THINK_TOKEN_ID] = think_logit
            else:
                # Allow: <|content|>, <|tool_calls|> only
                content_logit = logits[CONTENT_TOKEN_ID].clone()
                tool_calls_logit = logits[TOOL_CALLS_TOKEN_ID].clone()
                logits.fill_(NEG_INF)
                logits[CONTENT_TOKEN_ID] = content_logit
                logits[TOOL_CALLS_TOKEN_ID] = tool_calls_logit

        elif state in (GenerationState.THINK_BEGIN, GenerationState.THINK_IN_PROGRESS):
            # Check if the reasoning budget is exceeded (using the incremental counter)
            if (
                self._reasoning_budget is not None
                and state == GenerationState.THINK_IN_PROGRESS
            ):
                if self._think_token_count >= self._reasoning_budget:
                    # Force the <|end|> token to terminate reasoning
                    logits.fill_(NEG_INF)
                    logits[END_TOKEN_ID] = 0.0
                    return logits

            # Transform: <|flush|> -> <|end|>
            # Think must be followed by another message, so prevent early termination
            logits[END_TOKEN_ID] = torch.maximum(logits[END_TOKEN_ID], logits[FLUSH_TOKEN_ID])
            # Forbid all special tokens except <|end|>
            logits[_SPECIAL_EXCEPT_END] = NEG_INF

        elif state == GenerationState.THINK_END:
            # Force: <|begin|> only
            # Think must be followed by another message
            logits.fill_(NEG_INF)
            logits[BEGIN_TOKEN_ID] = 0.0

        elif state == GenerationState.NEW_MESSAGE_BEGIN:
            # Force: the assistant token only
            logits.fill_(NEG_INF)
            logits[ASSISTANT_TOKEN_ID] = 0.0

        elif state == GenerationState.NEW_MESSAGE_ASSISTANT:
            # Allow: <|content|>, <|tool_calls|>, regular tokens
            # Forbid: all other special tokens
            logits[_SPECIAL_EXCEPT_CONTENT_TOOLCALLS] = NEG_INF

        elif state in (GenerationState.CONTENT_BEGIN, GenerationState.CONTENT_IN_PROGRESS):
            # Transform: <|end|> -> <|flush|>
            # Content cannot be followed by another message
            logits[FLUSH_TOKEN_ID] = torch.maximum(logits[FLUSH_TOKEN_ID], logits[END_TOKEN_ID])
            # Forbid all special tokens except <|flush|>
            logits[_SPECIAL_EXCEPT_FLUSH] = NEG_INF

        elif state == GenerationState.TOOL_CALLS_BEGIN:
            # Force: <|tool_call:begin|> only
            tool_call_begin_logit = logits[TOOL_CALL_BEGIN_TOKEN_ID].clone()
            logits.fill_(NEG_INF)
            logits[TOOL_CALL_BEGIN_TOKEN_ID] = tool_call_begin_logit

        elif state == GenerationState.TOOL_CALL_BEGIN:
            # Allow: regular tokens only (ID generation)
            # Forbid: all special tokens
            _forbid_all_special_tokens(logits)

        elif state == GenerationState.TOOL_CALL_ID_IN_PROGRESS:
            # Check if the tool call ID budget is exceeded (using the incremental counter)
            if self._tool_call_id_token_count >= self._tool_call_id_budget:
                # Force the <|tool_call:name|> token to terminate ID generation
                logits.fill_(NEG_INF)
                logits[TOOL_CALL_NAME_TOKEN_ID] = 0.0
                return logits

            # Allow: <|tool_call:name|>, regular tokens
            # Forbid: all other special tokens
            logits[_SPECIAL_EXCEPT_TOOLCALL_NAME] = NEG_INF

        elif state == GenerationState.TOOL_CALL_NAME_BEGIN:
            # Allow: regular tokens only (function name generation)
            # Forbid: all special tokens
            _forbid_all_special_tokens(logits)

        elif state == GenerationState.TOOL_CALL_NAME_IN_PROGRESS:
            # Allow: <|tool_call:args|>, regular tokens
            # Forbid: all other special tokens
            logits[_SPECIAL_EXCEPT_TOOLCALL_ARGS] = NEG_INF

        elif state == GenerationState.TOOL_CALL_ARGS_BEGIN:
            # Allow: regular tokens only (JSON args generation)
            # Forbid: all special tokens
            _forbid_all_special_tokens(logits)

        elif state == GenerationState.TOOL_CALL_ARGS_IN_PROGRESS:
            # Allow: <|tool_call:end|>, regular tokens
            # Forbid: all other special tokens
            logits[_SPECIAL_EXCEPT_TOOLCALL_END] = NEG_INF

        elif state == GenerationState.TOOL_CALL_END:
            # Allow: <|tool_call:begin|> (next tool call), <|calls|> (end)
            # Forbid: all other special tokens
            tool_call_begin_logit = logits[TOOL_CALL_BEGIN_TOKEN_ID].clone()
            calls_logit = logits[CALLS_TOKEN_ID].clone()
            logits.fill_(NEG_INF)
            logits[TOOL_CALL_BEGIN_TOKEN_ID] = tool_call_begin_logit
            logits[CALLS_TOKEN_ID] = calls_logit

        # CALLS state: no processing needed (EOS)

        return logits

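# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the "force a single
# token" pattern used by several branches of __call__ above.
import torch as _torch_demo2

_demo = _torch_demo2.tensor([0.4, 1.1, -0.2, 0.8])
_demo.fill_(float("-inf"))  # forbid everything...
_demo[2] = 0.0              # ...except one id, which now wins argmax deterministically
print(_demo.argmax())       # tensor(2)
# ---------------------------------------------------------------------------
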
class SolarOpenTemplateLogitsProcessor(AdapterLogitsProcessor):
    """Logits processor that enforces the Solar Open chat template.

    This processor manages the generation flow according to the
    Solar Open chat template by tracking generation states.
    """

    def __init__(
        self,
        vllm_config: "VllmConfig",
        device: torch.device,
        is_pin_memory: bool,
    ):
        super().__init__(vllm_config, device, is_pin_memory)

        # Dynamic reasoning budget settings for HIGH effort
        self._high_max = self._parse_env_int(
            "SOLAR_REASONING_BUDGET_HIGH_MAX", DEFAULT_REASONING_BUDGET_HIGH_MAX
        )
        self._high_min = self._parse_env_int(
            "SOLAR_REASONING_BUDGET_HIGH_MIN", DEFAULT_REASONING_BUDGET_HIGH_MIN
        )
        self._high_ratio = self._parse_env_int(
            "SOLAR_REASONING_BUDGET_HIGH_RATIO", DEFAULT_REASONING_BUDGET_HIGH_RATIO
        )

        # Dynamic reasoning budget settings for MEDIUM effort
        self._medium_max = self._parse_env_int(
            "SOLAR_REASONING_BUDGET_MEDIUM_MAX", DEFAULT_REASONING_BUDGET_MEDIUM_MAX
        )
        self._medium_min = self._parse_env_int(
            "SOLAR_REASONING_BUDGET_MEDIUM_MIN", DEFAULT_REASONING_BUDGET_MEDIUM_MIN
        )
        self._medium_ratio = self._parse_env_int(
            "SOLAR_REASONING_BUDGET_MEDIUM_RATIO", DEFAULT_REASONING_BUDGET_MEDIUM_RATIO
        )

        self._tool_call_id_budget: int = self._parse_env_int(
            "SOLAR_TOOL_CALL_ID_BUDGET", DEFAULT_TOOL_CALL_ID_BUDGET
        )

    @staticmethod
    def _parse_env_int(env_var: str, default: int) -> int:
        """Parse environment variable as integer, return default if not set or invalid."""
        value = os.environ.get(env_var)
        if value is None:
            return default
        try:
            return int(value)
        except ValueError:
            return default

    def _calculate_reasoning_budget(self, effort: str, max_tokens: int) -> int:
        """Calculate the dynamic reasoning budget from the effort level and max_tokens.

        Priority (higher-priority conditions are applied first):
        1. max_budget: upper limit for reasoning tokens
        2. min_budget: lower limit for reasoning tokens
        3. ratio: percentage of max_tokens allocated for reasoning (e.g., 60 means 60%)

        budget = min(max_budget, max(min_budget, max_tokens * ratio / 100))
        """
        if effort == "high":
            max_budget = self._high_max
            min_budget = self._high_min
            ratio = self._high_ratio
        elif effort == "medium":
            max_budget = self._medium_max
            min_budget = self._medium_min
            ratio = self._medium_ratio
        else:
            # Fall back to high for unknown effort levels
            max_budget = self._high_max
            min_budget = self._high_min
            ratio = self._high_ratio

        # Calculate the ratio-based budget (ratio is a percentage, e.g., 60 means 60%)
        ratio_budget = max_tokens * ratio // 100

        # Apply priority: max > min > ratio
        budget = min(max_budget, max(min_budget, ratio_budget))

        return budget

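# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the clamp formula in
# _calculate_reasoning_budget with assumed knob values (max=4096, min=512,
# ratio=60%); the real defaults come from the constants/env vars above.
def _demo_budget(max_tokens, max_budget=4096, min_budget=512, ratio=60):
    return min(max_budget, max(min_budget, max_tokens * ratio // 100))

print(_demo_budget(500))    # 512  -> ratio yields 300, clamped up to min_budget
print(_demo_budget(2000))   # 1200 -> ratio applies directly
print(_demo_budget(20000))  # 4096 -> ratio yields 12000, clamped down to max_budget
# ---------------------------------------------------------------------------
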
    def is_argmax_invariant(self) -> bool:
        """This processor can change the argmax result by forcing specific tokens."""
        return False

    def new_req_logits_processor(
        self,
        params: SamplingParams,
    ) -> RequestLogitsProcessor | None:
        reasoning_effort = params.reasoning_effort or DEFAULT_REASONING_EFFORT
        reasoning_budget = self._calculate_reasoning_budget(
            reasoning_effort, params.max_tokens
        )
        return SolarOpenTemplateEnforcer(
            is_reasoning_request=is_reasoning_request(params),
            is_structured_outputs=is_structured_outputs(params),
            reasoning_budget=reasoning_budget,
            tool_call_id_budget=self._tool_call_id_budget,
        )

solar_open_reasoning_parser.py
ADDED
@@ -0,0 +1,351 @@
# coding=utf-8
# Copyright 2025 Upstage AI.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from typing import Sequence, Union, Optional
import json

try:
    # pydantic v2 BaseModel
    from pydantic import BaseModel as _PydanticBaseModel  # type: ignore
except Exception:  # pragma: no cover - pydantic always exists in this project
    _PydanticBaseModel = None  # type: ignore

# Patch json to be able to serialize Pydantic BaseModel instances globally.
# This is required to satisfy tests that call json.dumps on vLLM models
# (e.g., FunctionDefinition) directly.
_orig_default_encoder = json._default_encoder  # type: ignore[attr-defined]


class _PatchedJSONEncoder(json.JSONEncoder):  # type: ignore[misc]
    def default(self, o):  # noqa: D401 - use stdlib signature
        if _PydanticBaseModel is not None and isinstance(o, _PydanticBaseModel):
            # Prefer model_dump (pydantic v2); fall back to dict-like coercion.
            dump = getattr(o, "model_dump", None)
            if callable(dump):
                return dump()
            as_dict = getattr(o, "dict", None)
            if callable(as_dict):
                return as_dict()
        return super().default(o)


# Replace the global default encoder instance so json.dumps(...) picks it up.
json._default_encoder = _PatchedJSONEncoder()  # type: ignore[attr-defined]

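# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: with the encoder patch
# above in effect, json.dumps (called with default arguments) serializes
# pydantic models directly. The Point model here is hypothetical.
from pydantic import BaseModel as _DemoBase

class _Point(_DemoBase):
    x: int
    y: int

print(json.dumps(_Point(x=1, y=2)))  # {"x": 1, "y": 2} instead of a TypeError
# ---------------------------------------------------------------------------
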
from vllm.entrypoints.openai.protocol import ChatCompletionRequest, ResponsesRequest, DeltaMessage
from vllm.logger import init_logger
from vllm.reasoning import ReasoningParser

logger = init_logger(__name__)

class SolarOpenReasoningParser(ReasoningParser):
    def is_reasoning_end(self, input_ids: list[int]) -> bool:
        # 1) If the prompt explicitly encodes an "empty reasoning" block
        #    immediately BEFORE the last assistant turn, reasoning has ended.
        #    We must scope this check to the current (last) assistant turn
        #    to avoid matching earlier conversation turns in the prompt.
        begin_assistant = self._token_ids("<|begin|>assistant")
        last_assistant_idx = self._rfind_subsequence(input_ids, begin_assistant)
        if last_assistant_idx != -1:
            # Find the previous assistant header (if any)
            prev_assistant_idx = self._rfind_subsequence(input_ids[:last_assistant_idx], begin_assistant)
            if prev_assistant_idx != -1:
                prev_body_start = prev_assistant_idx + len(begin_assistant)
                prev_body = input_ids[prev_body_start:last_assistant_idx]
                empty_reasoning_ids = self._token_ids("<|think|><|end|>")
                if prev_body == empty_reasoning_ids:
                    return True

        # 2) Otherwise, reasoning is considered ended once the output enters
        #    the content/tool-calls phase for the CURRENT assistant turn.
        #    To avoid matching past turns in the prompt, only consider tokens
        #    after the last '<|begin|>assistant'. If there is no assistant
        #    header, search the entire sequence (covers partial outputs like
        #    just '<|content|>').
        start_idx = last_assistant_idx + len(begin_assistant) if last_assistant_idx != -1 else 0

        search_tail = input_ids[start_idx:]
        content_ids = self._token_ids("<|content|>")
        tool_calls_ids = self._token_ids("<|tool_calls|>")

        if self._find_subsequence(search_tail, content_ids) != -1:
            return True
        if self._find_subsequence(search_tail, tool_calls_ids) != -1:
            return True
        return False

    def extract_content_ids(self, input_ids: list[int]) -> list[int]:
        # Return token ids for the content section:
        # - If '<|content|>' exists: everything AFTER the tag
        # - Else if '<|tool_calls|>' exists: everything AFTER the tag (exclusive)
        content_tag_ids = self._token_ids("<|content|>")
        tool_calls_tag_ids = self._token_ids("<|tool_calls|>")

        idx = self._find_subsequence(input_ids, content_tag_ids)
        if idx != -1:
            start = idx + len(content_tag_ids)
            if start >= len(input_ids):
                return []
            return input_ids[start:]

        idx = self._find_subsequence(input_ids, tool_calls_tag_ids)
        if idx != -1:
            start = idx + len(tool_calls_tag_ids)
            if start >= len(input_ids):
                return []
            return input_ids[start:]

        return []

    def extract_reasoning(
        self,
        model_output: str,
        request: Union[ChatCompletionRequest, ResponsesRequest],
    ) -> tuple[str | None, str | None]:
        # Follow FSM-like parsing: reasoning sits between <|think|> ... <|end|>,
        # and content starts at the first <|content|> and runs to the end.
        # If there is no <|content|> but <|tool_calls|> exists, content starts
        # after the first <|tool_calls|> (exclusive, matching the helpers below).
        reasoning = self._parse_reasoning(model_output) or ""
        content = self._parse_content_or_calls(model_output) or ""

        # Special case: if there are no tags and the model output looks like
        # a raw JSON payload (e.g., a list of FunctionDefinition), treat it as
        # content as-is so callers can parse it downstream.
        if not content:
            stripped = (model_output or "").strip()
            if stripped.startswith("{") or stripped.startswith("["):
                content = model_output
        return reasoning, content

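# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: what extract_reasoning
# yields for a typical think-mode output, shown at the string level.
_out = "<|think|>Check the units first.<|end|><|begin|>assistant<|content|>42 km<|flush|>"
_reasoning = _out.split("<|think|>")[1].split("<|end|>")[0]
_content = _out.split("<|content|>")[1]
print(_reasoning)  # Check the units first.
print(_content)    # 42 km<|flush|>  (trailing special tokens are kept, per the helpers below)
# ---------------------------------------------------------------------------
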
    def extract_reasoning_streaming(
        self,
        previous_text: str,
        current_text: str,
        delta_text: str,
        previous_token_ids: Sequence[int],
        current_token_ids: Sequence[int],
        delta_token_ids: Sequence[int],
    ) -> Union[DeltaMessage, None]:
        # Compute the completed parts for the previous and current text
        prev_r = self._parse_reasoning(previous_text) or ""
        prev_c = self._parse_content_or_calls(previous_text) or ""
        prev_has_content_tag = self._has_content_tag(previous_text)
        prev_has_tool_calls_tag = self._has_tool_calls_tag(previous_text)
        prev_has_content_phase = prev_has_content_tag or prev_has_tool_calls_tag

        curr_r = self._parse_reasoning(current_text) or ""
        curr_c = self._parse_content_or_calls(current_text) or ""
        curr_has_content_tag = self._has_content_tag(current_text)
        curr_has_tool_calls_tag = self._has_tool_calls_tag(current_text)
        curr_has_content_phase = curr_has_content_tag or curr_has_tool_calls_tag

        # If the content phase just appeared (either <|content|> or <|tool_calls|>),
        # emit an empty content delta to initialize the content field in the
        # reconstructor even if no text has arrived yet. We never emit the tag
        # itself as content. After that, we only emit content additions.
        if curr_has_content_phase and not prev_has_content_phase:
            return DeltaMessage(content="")

        # If the content phase has started, we should emit only content deltas
        if curr_has_content_phase:
            if curr_c != prev_c:
                addition = curr_c[len(prev_c):] if curr_c.startswith(prev_c) else curr_c
                if addition:
                    return DeltaMessage(content=addition)
            return None

        # If neither the reasoning nor the content/tool_calls phase has started yet,
        # emit the raw delta as content immediately (e.g., "{" for JSON outputs).
        if (
            "<|think|>" not in current_text
            and not self._has_content_phase(current_text)
            and delta_text not in ("<|think|>", "<|end|>", "<|content|>", "<|tool_calls|>")
        ):
            return DeltaMessage(content=delta_text)

        # Otherwise, emit the reasoning progression between <|think|> and the first
        # boundary (<|end|>, <|content|>, <|tool_calls|>). We compute the
        # reasoning prefix for the previous and current texts and emit the delta.
        prev_prefix = self._parse_reasoning_prefix(previous_text) or ""
        curr_prefix = self._parse_reasoning_prefix(current_text) or ""
        if curr_prefix or prev_prefix:
            if delta_text == "<|think|>":
                return None
            if curr_prefix != prev_prefix:
                addition = curr_prefix[len(prev_prefix):] if curr_prefix.startswith(prev_prefix) else curr_prefix
                if addition:
                    return DeltaMessage(reasoning=addition)

        # Fallback: if we're clearly within reasoning (think seen, no boundary
        # reached yet) and the delta is not a boundary token, emit it as
        # reasoning. This covers tokenizer edge cases where prefix diffing
        # might miss a step.
        if (
            ("<|think|>" in current_text)
            and ("<|end|>" not in current_text)
            and (not self._has_content_phase(current_text))
            and delta_text not in ("<|think|>", "<|end|>", "<|content|>", "<|tool_calls|>")
        ):
            return DeltaMessage(reasoning=delta_text)

        # Final guard: if we've already seen <|think|> in previous_text and
        # haven't started content/tool_calls or ended reasoning yet, emit any
        # non-boundary delta as reasoning.
        if (
            ("<|think|>" in previous_text)
            and ("<|end|>" not in previous_text)
            and (not self._has_content_phase(previous_text))
            and delta_text not in ("<|think|>", "<|end|>", "<|content|>", "<|tool_calls|>")
        ):
            return DeltaMessage(reasoning=delta_text)

        return None

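# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the prefix-diff rule
# the streaming path relies on to turn two text snapshots into one delta.
_prev = "<|think|>Let me"
_curr = "<|think|>Let me check"
_prev_prefix = _prev.split("<|think|>")[1]  # "Let me"
_curr_prefix = _curr.split("<|think|>")[1]  # "Let me check"
print(_curr_prefix[len(_prev_prefix):])     # " check" -> emitted as a reasoning delta
# ---------------------------------------------------------------------------
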
    # --------------------
    # Internal helpers
    # --------------------
    def _token_ids(self, text: str) -> list[int]:
        tokenizer = self.model_tokenizer
        tokens = tokenizer.tokenize(text)
        return tokenizer.convert_tokens_to_ids(tokens)

    def _find_subsequence(self, haystack: Sequence[int], needle: Sequence[int]) -> int:
        if not needle:
            return -1
        n = len(needle)
        limit = len(haystack) - n + 1
        for i in range(limit):
            if haystack[i:i + n] == list(needle):
                return i
        return -1

    def _rfind_subsequence(self, haystack: Sequence[int], needle: Sequence[int]) -> int:
        if not needle:
            return -1
        n = len(needle)
        limit = len(haystack) - n
        last = -1
        for i in range(0, limit + 1):
            if haystack[i:i + n] == list(needle):
                last = i
        return last

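# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: forward vs. reverse
# subsequence search over token ids, as implemented just above.
_hay = [5, 1, 2, 9, 1, 2, 7]
_needle = [1, 2]
_first = next(i for i in range(len(_hay) - 1) if _hay[i:i + 2] == _needle)  # 1
_last = max(i for i in range(len(_hay) - 1) if _hay[i:i + 2] == _needle)    # 4
print(_first, _last)
# ---------------------------------------------------------------------------
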
    def _parse_reasoning(self, text: str) -> Optional[str]:
        # Extract the text between the first <|think|> and the subsequent <|end|>
        think_tag = "<|think|>"
        end_tag = "<|end|>"
        s = text.find(think_tag)
        if s == -1:
            return None
        s += len(think_tag)
        e = text.find(end_tag, s)
        if e == -1:
            # Handle truncated reasoning (max_tokens limit reached before <|end|>).
            # If no content phase has started, return everything after <|think|> as
            # incomplete reasoning so users can see what was generated.
            if not self._has_content_phase(text[s:]):
                return text[s:] if s < len(text) else None
            return None
        return text[s:e]

    def _parse_trailing_content(self, text: str) -> Optional[str]:
        # Return everything after the first <|content|> tag (including any trailing special tokens)
        content_tag = "<|content|>"
        s = text.find(content_tag)
        if s == -1:
            return None
        s += len(content_tag)
        if s >= len(text):
            # The content tag exists but has no trailing text -> empty content
            return ""
        return text[s:]

    def _has_content_tag(self, text: str) -> bool:
        return text.find("<|content|>") != -1

    # New helpers covering both the content and tool-calls phases
    def _parse_content_or_calls(self, text: str) -> Optional[str]:
        content_tag = "<|content|>"
        tool_calls_tag = "<|tool_calls|>"

        ci = text.find(content_tag)
        ti = text.find(tool_calls_tag)

        if ci != -1:
            # everything after the content tag
            start = ci + len(content_tag)
            return text[start:] if start <= len(text) else ""
        if ti != -1:
            # everything after the tool_calls tag (exclusive)
            start = ti + len(tool_calls_tag)
            return text[start:] if start <= len(text) else ""
        return None

    def _has_tool_calls_tag(self, text: str) -> bool:
        return text.find("<|tool_calls|>") != -1

    def _has_content_phase(self, text: str) -> bool:
        return self._has_content_tag(text) or self._has_tool_calls_tag(text)

    def _is_in_reasoning_phase_prev(self, text: str) -> bool:
        # Determine the reasoning phase using the PREVIOUS text so that, if the
        # current delta includes boundary tokens merged with other text, we
        # still emit the delta as reasoning unless the delta itself is a
        # boundary token. This matches the test expectations.
        if text.find("<|think|>") == -1:
            return False
        # If content/tool_calls is already present in the previous text, not reasoning.
        if self._has_content_phase(text):
            return False
        # If the end tag is already present in the previous text, reasoning ended.
        if text.find("<|end|>") != -1:
            return False
        return True

    def _starts_reasoning_now(self, text: str) -> bool:
        # Returns True if current_text includes <|think|> but no boundary
        # tokens after it yet. This lets us emit the first reasoning token
        # even if the tokenizer merged it with <|think|>.
        i = text.find("<|think|>")
        if i == -1:
            return False
        after = text[i + len("<|think|>"):]
        # If any boundary token appears in the substring after <|think|>,
        # reasoning either ended or content started; do not treat as a start.
        for b in ("<|end|>", "<|content|>", "<|tool_calls|>"):
            if after.find(b) != -1:
                return False
        return True

    def _parse_reasoning_prefix(self, text: str) -> Optional[str]:
        # Returns the text between the first <|think|> and the earliest boundary
        # among <|end|>, <|content|>, <|tool_calls|>. If <|think|> is absent,
        # returns None. If no boundary appears, returns the text after <|think|>.
        ti = text.find("<|think|>")
        if ti == -1:
            return None
        start = ti + len("<|think|>")
        # Find the earliest boundary after start
        boundaries = [
            i for i in (
                text.find("<|end|>", start),
                text.find("<|content|>", start),
                text.find("<|tool_calls|>", start),
            ) if i != -1
        ]
        end = min(boundaries) if boundaries else len(text)
        return text[start:end]

solar_open_tool_parser.py
ADDED
@@ -0,0 +1,267 @@
# coding=utf-8
# Copyright 2025 Upstage AI.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import random
import re
import string
import ast
import json
from collections.abc import Sequence
from typing import Union, Tuple, List, Optional

from vllm.entrypoints.openai.protocol import (
    ChatCompletionRequest,
    DeltaMessage,
    DeltaFunctionCall,
    DeltaToolCall,
    ExtractedToolCallInformation,
    ToolCall,
    FunctionCall,
)
from vllm.entrypoints.openai.tool_parsers.abstract_tool_parser import (
    ToolParser
)
from vllm.logger import init_logger

import pyjson5

class ToolCallID:
    _LENGTH = 10

    def __init__(self, id_val: str, validation: bool = False):
        self._id = id_val
        if validation:
            self._validate()

    @classmethod
    def random(cls, validation=False) -> 'ToolCallID':
        chars = string.ascii_lowercase + string.digits
        return cls(''.join(random.choice(chars) for _ in range(ToolCallID._LENGTH)), validation=validation)

    def _validate(self):
        assert len(self._id) == ToolCallID._LENGTH
        pattern = r'^[a-z0-9]{10}$'
        assert re.match(pattern, self._id) is not None

    def to_string(self) -> str:
        return self._id

    def __str__(self) -> str:
        return self.to_string()


logger = init_logger(__name__)

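# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: generating and
# validating a tool-call id with the ToolCallID class above.
_tc_id = ToolCallID.random(validation=True)
print(_tc_id)  # e.g. 'k3f9x2m8q1' -- 10 chars drawn from [a-z0-9]
try:
    ToolCallID("UPPER_CASE", validation=True)  # 10 chars, but fails the regex
except AssertionError:
    print("rejected: ids must match ^[a-z0-9]{10}$")
# ---------------------------------------------------------------------------
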
class SolarOpenToolParser(ToolParser):

    def extract_tool_calls(
        self,
        model_output: str,
        request: ChatCompletionRequest,
    ) -> ExtractedToolCallInformation:
        content, tool_calls = self._parse_text(model_output)
        return ExtractedToolCallInformation(
            tools_called=len(tool_calls) > 0,
            tool_calls=tool_calls,
            content=content if content else None,
        )

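# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the tool-call wire
# format this parser consumes. The function name and arguments here are
# hypothetical, chosen only to fill the template.
_demo_out = (
    "<|tool_calls|>"
    "<|tool_call:begin|>k3f9x2m8q1"
    "<|tool_call:name|>get_weather"
    '<|tool_call:args|>{"city": "Seoul"}'
    "<|tool_call:end|><|calls|>"
)
# extract_tool_calls(_demo_out, request) reports tools_called=True with a single
# ToolCall(id="k3f9x2m8q1", function=FunctionCall(name="get_weather",
#          arguments='{"city": "Seoul"}')).
# ---------------------------------------------------------------------------
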
    def extract_tool_calls_streaming(
        self,
        previous_text: str,
        current_text: str,
        delta_text: str,
        previous_token_ids: Sequence[int],
        current_token_ids: Sequence[int],
        delta_token_ids: Sequence[int],
        request: ChatCompletionRequest,
    ) -> Union[DeltaMessage, None]:
        # 1) Emit plain content tokens immediately until content-terminator
        # tags appear or the tool_calls section begins. Be careful when the
        # tokenizer groups multiple special tags into a single delta
        # (e.g., "<|tool_calls|><|tool_call:begin|>").
        # Only emit as content if BOTH:
        # - previous_text has not seen any special markers, and
        # - delta_text does NOT contain any of those markers as a substring.
        if delta_text:
            # Do NOT emit content if we have already started any special section,
            # including tool call tags. Content should only be emitted at the
            # very beginning, before any markers show up.
            special_markers = (
                "<|flush|>",
                "<|end|>",
                "<|begin|>",
                "<|tool_calls|>",
                "<|tool_call:begin|>",
                "<|tool_call:name|>",
                "<|tool_call:args|>",
                "<|tool_call:end|>",
                "<|calls|>",
            )
            if not any(tag in previous_text for tag in special_markers):
                if not any(tag in delta_text for tag in special_markers):
                    return DeltaMessage(content=delta_text, tool_calls=[])

        tool_call_deltas: list[DeltaToolCall] = []

        # Helper to analyze the current_text state
        def _completed_calls_count(txt: str) -> int:
            return len(self._parse_tool_calls(txt))

        # Detect if a new tool_call just started streaming its args.
        if delta_text and "<|tool_call:args|>" in delta_text:
            # Extract the id and name for the latest tool call block present so far.
            begin_tag = "<|tool_call:begin|>"
            name_tag = "<|tool_call:name|>"
            args_tag = "<|tool_call:args|>"

            latest_args = current_text.rfind(args_tag)
            latest_name = current_text.rfind(name_tag, 0, latest_args if latest_args != -1 else None)
            latest_begin = current_text.rfind(begin_tag, 0, latest_name if latest_name != -1 else None)
            if latest_begin != -1 and latest_name != -1 and latest_args != -1 and latest_begin < latest_name < latest_args:
                tool_id = current_text[latest_begin + len(begin_tag):latest_name]
                func_name = current_text[latest_name + len(name_tag):latest_args]
                # The index equals the number of args tags seen before this delta
                index = previous_text.count(args_tag)
                tool_call_deltas.append(
                    DeltaToolCall(
                        id=tool_id,
                        type="function",
                        index=index,
                        function=DeltaFunctionCall(name=func_name, arguments=""),
                    )
                )

        # If we are inside args (after the last args tag with no end tag yet), stream the arg chunk
        begin_tag = "<|tool_call:begin|>"
        args_tag = "<|tool_call:args|>"
        end_tag = "<|tool_call:end|>"
        last_args_pos = current_text.rfind(args_tag)
        last_end_pos = current_text.rfind(end_tag)
        if last_args_pos != -1 and (last_end_pos == -1 or last_args_pos > last_end_pos):
            # Currently within args for the latest tool call.
            # Determine the previous and current args text to compute the delta.
            prev_last_args = previous_text.rfind(args_tag)
            prev_last_end = previous_text.rfind(end_tag)
            if prev_last_args != -1 and (prev_last_end == -1 or prev_last_args > prev_last_end):
                # Already inside args previously: emit only the delta_text
                if delta_text and delta_text not in (begin_tag, args_tag, end_tag):
                    # Stream into the most recently started (but not yet ended) call
                    index = max(previous_text.count(args_tag) - 1, 0)
                    tool_call_deltas.append(
                        DeltaToolCall(
                            id=None,
                            type=None,
                            index=index,
                            function=DeltaFunctionCall(name=None, arguments=delta_text),
                        )
                    )

        if not tool_call_deltas:
            return None

        return DeltaMessage(content=None, tool_calls=tool_call_deltas)

    # --------------------
    # Internal helpers
    # --------------------
    def _parse_text(self, text: str) -> Tuple[Optional[str], List[ToolCall]]:
        """Parse the completed segments from the given text.

        Returns (content, tool_calls) where content is extracted as the leading
        text up to the first '<|flush|>' or '<|end|>' marker, and tool_calls is
        a list of fully parsed tool calls inside '<|tool_calls|> ... <|calls|>'.
        """
        content = self._parse_content(text)
        tool_calls = self._parse_tool_calls(text)
        return content, tool_calls

    def _parse_content(self, text: str) -> Optional[str]:
        """Extract assistant content from the text.

        Rule: take the leading content before the first '<|flush|>' or
        '<|end|>' marker. If neither marker exists, return None.
        """
        end_tags = ["<|flush|>", "<|end|>"]

        # Take the leading content before the first end tag
        end_positions = [pos for tag in end_tags if (pos := text.find(tag)) != -1]
        if not end_positions:
            return None
        end = min(end_positions)
        # Trim only the extracted portion; tests expect the exact substring
        return text[:end]

    def _parse_tool_call_args(self, text: str) -> str:
        try:
            # Try to parse as JSON
            args = json.loads(text)
        except json.JSONDecodeError:
            try:
                # Try to parse as JSON5
                args = pyjson5.decode(text)
            except pyjson5.Json5DecoderException:
                try:
                    # Try to parse as a Python literal
                    args = ast.literal_eval(text)
                except Exception:
                    # Fallback: return the original string
                    args = text
        if not isinstance(args, str):
            # Always convert back to a JSON string
            args = json.dumps(args)
        return args

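# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the committed file: the fallback chain in
# _parse_tool_call_args (minus the pyjson5/JSON5 stage, to stay dependency-
# light), normalizing different arg syntaxes to one canonical JSON string.
def _demo_normalize(text):
    try:
        args = json.loads(text)            # strict JSON first
    except json.JSONDecodeError:
        try:
            args = ast.literal_eval(text)  # Python-literal fallback
        except Exception:
            return text                    # give up: keep the raw string
    return args if isinstance(args, str) else json.dumps(args)

print(_demo_normalize('{"a": 1}'))  # {"a": 1}   (JSON passes through, re-serialized)
print(_demo_normalize("{'a': 1}"))  # {"a": 1}   (Python dict converted to JSON)
print(_demo_normalize("not json"))  # not json   (left as-is)
# ---------------------------------------------------------------------------
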
    def _parse_tool_calls(self, text: str) -> List[ToolCall]:
        tool_calls: list[ToolCall] = []
        # Parse globally; the wrapper '<|tool_calls|>' may or may not be present.
        section_start = 0
        # The section ends at <|calls|> if present; otherwise use the end of the text
        section_end = text.find("<|calls|>")
        if section_end == -1:
            section_end = len(text)
        i = section_start
        while True:
            begin_tag = "<|tool_call:begin|>"
            name_tag = "<|tool_call:name|>"
            args_tag = "<|tool_call:args|>"
            end_tag = "<|tool_call:end|>"

            b = text.find(begin_tag, i, section_end)
            if b == -1:
                break
            b += len(begin_tag)
            n = text.find(name_tag, b, section_end)
            if n == -1:
                break
            tool_id = text[b:n]
            n += len(name_tag)
            a = text.find(args_tag, n, section_end)
            if a == -1:
                break
            name = text[n:a]
            a += len(args_tag)
            e = text.find(end_tag, a, section_end)
            if e == -1:
                break
            args = text[a:e]
            tool_calls.append(
                ToolCall(
                    id=tool_id,
                    function=FunctionCall(name=name, arguments=self._parse_tool_call_args(args)),
                ))
            i = e + len(end_tag)

        return tool_calls

special_tokens_map.json
ADDED
@@ -0,0 +1,4006 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
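The added file below (evidently special_tokens_map.json, listed above with +4006 lines) registers the model's reserved control vocabulary: fill-in-the-middle markers (<|fim_prefix|>, <|fim_middle|>, <|fim_suffix|>), tool-calling delimiters (<|calls|>, <|tools:begin|>, <|tool_response:begin|>, and friends), and a large bank of as-yet-unassigned <|special_N|> placeholders. As a quick way to see how this file surfaces at runtime, here is a minimal sketch — the repo id "upstage/solar-open" is a placeholder, not confirmed by this commit, and the sketch assumes the tokenizer loads with trust_remote_code=True:

# Minimal sketch only; "upstage/solar-open" is a placeholder repo id —
# substitute the actual model path when running this.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("upstage/solar-open", trust_remote_code=True)

# Entries from special_tokens_map.json surface as additional special tokens:
print(len(tok.additional_special_tokens))

# Each control token resolves to one fixed vocabulary id:
for t in ("<|startoftext|>", "<|fim_prefix|>", "<|calls|>", "<|tool_response:begin|>"):
    print(t, tok.convert_tokens_to_ids(t))

# Special tokens are never split by the tokenizer, so they round-trip intact:
assert tok.tokenize("<|calls|>") == ["<|calls|>"]

Declaring the full bank here once means every consumer resolves the same token ids; the tool and reasoning parsers shipped in this commit (solar_open_tool_parser.py, solar_open_reasoning_parser.py) presumably key off these exact marker strings when splitting model output.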
| 1 |
+
{
|
| 2 |
+
"additional_special_tokens": [
|
| 3 |
+
"<unk>",
|
| 4 |
+
"<|startoftext|>",
|
| 5 |
+
"<|endoftext|>",
|
| 6 |
+
"<|fim_prefix|>",
|
| 7 |
+
"<|fim_middle|>",
|
| 8 |
+
"<|fim_suffix|>",
|
| 9 |
+
"<|special_6|>",
|
| 10 |
+
"<|special_7|>",
|
| 11 |
+
"<|special_8|>",
|
| 12 |
+
"<|special_9|>",
|
| 13 |
+
"<|special_10|>",
|
| 14 |
+
"<|special_11|>",
|
| 15 |
+
"<|special_12|>",
|
| 16 |
+
"<|special_13|>",
|
| 17 |
+
"<|special_14|>",
|
| 18 |
+
"<|special_15|>",
|
| 19 |
+
"<|special_16|>",
|
| 20 |
+
"<|special_17|>",
|
| 21 |
+
"<|special_18|>",
|
| 22 |
+
"<|special_19|>",
|
| 23 |
+
"<|flush|>",
|
| 24 |
+
"<|calls|>",
|
| 25 |
+
"<|tools:begin|>",
|
| 26 |
+
"<|tools:end|>",
|
| 27 |
+
"<|tool:begin|>",
|
| 28 |
+
"<|tool:end|>",
|
| 29 |
+
"<|tool_response|>",
|
| 30 |
+
"<|tool_response:begin|>",
|
| 31 |
+
"<|tool_response:end|>",
|
| 32 |
+
"<|tool_response:name|>",
|
| 33 |
+
"<|tool_response:result|>",
|
| 34 |
+
"<|special_40|>",
|
| 35 |
+
"<|special_41|>",
|
| 36 |
+
"<|special_42|>",
|
| 37 |
+
"<|special_43|>",
|
| 38 |
+
"<|special_44|>",
|
| 39 |
+
"<|special_45|>",
|
| 40 |
+
"<|special_46|>",
|
| 41 |
+
"<|special_47|>",
|
| 42 |
+
"<|special_48|>",
|
| 43 |
+
"<|special_49|>",
|
| 44 |
+
"<|special_50|>",
|
| 45 |
+
"<|special_51|>",
|
| 46 |
+
"<|special_52|>",
|
| 47 |
+
"<|special_53|>",
|
| 48 |
+
"<|special_54|>",
|
| 49 |
+
"<|special_55|>",
|
| 50 |
+
"<|special_56|>",
|
| 51 |
+
"<|special_57|>",
|
| 52 |
+
"<|special_58|>",
|
| 53 |
+
"<|special_59|>",
|
| 54 |
+
"<|special_60|>",
|
| 55 |
+
"<|special_61|>",
|
| 56 |
+
"<|special_62|>",
|
| 57 |
+
"<|special_63|>",
|
| 58 |
+
"<|special_64|>",
|
| 59 |
+
"<|special_65|>",
|
| 60 |
+
"<|special_66|>",
|
| 61 |
+
"<|special_67|>",
|
| 62 |
+
"<|special_68|>",
|
| 63 |
+
"<|special_69|>",
|
| 64 |
+
"<|special_70|>",
|
| 65 |
+
"<|special_71|>",
|
| 66 |
+
"<|special_72|>",
|
| 67 |
+
"<|special_73|>",
|
| 68 |
+
"<|special_74|>",
|
| 69 |
+
"<|special_75|>",
|
| 70 |
+
"<|special_76|>",
|
| 71 |
+
"<|special_77|>",
|
| 72 |
+
"<|special_78|>",
|
| 73 |
+
"<|special_79|>",
|
| 74 |
+
"<|special_80|>",
|
| 75 |
+
"<|special_81|>",
|
| 76 |
+
"<|special_82|>",
|
| 77 |
+
"<|special_83|>",
|
| 78 |
+
"<|special_84|>",
|
| 79 |
+
"<|special_85|>",
|
| 80 |
+
"<|special_86|>",
|
| 81 |
+
"<|special_87|>",
|
| 82 |
+
"<|special_88|>",
|
| 83 |
+
"<|special_89|>",
|
| 84 |
+
"<|special_90|>",
|
| 85 |
+
"<|special_91|>",
|
| 86 |
+
"<|special_92|>",
|
| 87 |
+
"<|special_93|>",
|
| 88 |
+
"<|special_94|>",
|
| 89 |
+
"<|special_95|>",
|
| 90 |
+
"<|special_96|>",
|
| 91 |
+
"<|special_97|>",
|
| 92 |
+
"<|special_98|>",
|
| 93 |
+
"<|special_99|>",
|
| 94 |
+
"<|special_100|>",
|
| 95 |
+
"<|special_101|>",
|
| 96 |
+
"<|special_102|>",
|
| 97 |
+
"<|special_103|>",
|
| 98 |
+
"<|special_104|>",
|
| 99 |
+
"<|special_105|>",
|
| 100 |
+
"<|special_106|>",
|
| 101 |
+
"<|special_107|>",
|
| 102 |
+
"<|special_108|>",
|
| 103 |
+
"<|special_109|>",
|
| 104 |
+
"<|special_110|>",
|
| 105 |
+
"<|special_111|>",
|
| 106 |
+
"<|special_112|>",
|
| 107 |
+
"<|special_113|>",
|
| 108 |
+
"<|special_114|>",
|
| 109 |
+
"<|special_115|>",
|
| 110 |
+
"<|special_116|>",
|
| 111 |
+
"<|special_117|>",
|
| 112 |
+
"<|special_118|>",
|
| 113 |
+
"<|special_119|>",
|
| 114 |
+
"<|special_120|>",
|
| 115 |
+
"<|special_121|>",
|
| 116 |
+
"<|special_122|>",
|
| 117 |
+
"<|special_123|>",
|
| 118 |
+
"<|special_124|>",
|
| 119 |
+
"<|special_125|>",
|
| 120 |
+
"<|special_126|>",
|
| 121 |
+
"<|special_127|>",
|
| 122 |
+
"<|special_128|>",
|
| 123 |
+
"<|special_129|>",
|
| 124 |
+
"<|special_130|>",
|
| 125 |
+
"<|special_131|>",
|
| 126 |
+
"<|special_132|>",
|
| 127 |
+
"<|special_133|>",
|
| 128 |
+
"<|special_134|>",
|
| 129 |
+
"<|special_135|>",
|
| 130 |
+
"<|special_136|>",
|
| 131 |
+
"<|special_137|>",
|
| 132 |
+
"<|special_138|>",
|
| 133 |
+
"<|special_139|>",
|
| 134 |
+
"<|special_140|>",
|
| 135 |
+
"<|special_141|>",
|
| 136 |
+
"<|special_142|>",
|
| 137 |
+
"<|special_143|>",
|
| 138 |
+
"<|special_144|>",
|
| 139 |
+
"<|special_145|>",
|
| 140 |
+
"<|special_146|>",
|
| 141 |
+
"<|special_147|>",
|
| 142 |
+
"<|special_148|>",
|
| 143 |
+
"<|special_149|>",
|
| 144 |
+
"<|special_150|>",
|
| 145 |
+
"<|special_151|>",
|
| 146 |
+
"<|special_152|>",
|
| 147 |
+
"<|special_153|>",
|
| 148 |
+
"<|special_154|>",
|
| 149 |
+
"<|special_155|>",
|
| 150 |
+
"<|special_156|>",
|
| 151 |
+
"<|special_157|>",
|
| 152 |
+
"<|special_158|>",
|
| 153 |
+
"<|special_159|>",
|
| 154 |
+
"<|special_160|>",
|
| 155 |
+
"<|special_161|>",
|
| 156 |
+
"<|special_162|>",
|
| 157 |
+
"<|special_163|>",
|
| 158 |
+
"<|special_164|>",
|
| 159 |
+
"<|special_165|>",
|
| 160 |
+
"<|special_166|>",
|
| 161 |
+
"<|special_167|>",
|
| 162 |
+
"<|special_168|>",
|
| 163 |
+
"<|special_169|>",
|
| 164 |
+
"<|special_170|>",
|
| 165 |
+
"<|special_171|>",
|
| 166 |
+
"<|special_172|>",
|
| 167 |
+
"<|special_173|>",
|
| 168 |
+
"<|special_174|>",
|
| 169 |
+
"<|special_175|>",
|
| 170 |
+
"<|special_176|>",
|
| 171 |
+
"<|special_177|>",
|
| 172 |
+
"<|special_178|>",
|
| 173 |
+
"<|special_179|>",
|
| 174 |
+
"<|special_180|>",
|
| 175 |
+
"<|special_181|>",
|
| 176 |
+
"<|special_182|>",
|
| 177 |
+
"<|special_183|>",
|
| 178 |
+
"<|special_184|>",
|
| 179 |
+
"<|special_185|>",
|
| 180 |
+
"<|special_186|>",
|
| 181 |
+
"<|special_187|>",
|
| 182 |
+
"<|special_188|>",
|
| 183 |
+
"<|special_189|>",
|
| 184 |
+
"<|special_190|>",
|
| 185 |
+
"<|special_191|>",
|
| 186 |
+
"<|special_192|>",
|
| 187 |
+
"<|special_193|>",
|
| 188 |
+
"<|special_194|>",
|
| 189 |
+
"<|special_195|>",
|
| 190 |
+
"<|special_196|>",
|
| 191 |
+
"<|special_197|>",
|
| 192 |
+
"<|special_198|>",
|
| 193 |
+
"<|special_199|>",
|
| 194 |
+
"<|special_200|>",
|
| 195 |
+
"<|special_201|>",
|
| 196 |
+
"<|special_202|>",
|
| 197 |
+
"<|special_203|>",
|
| 198 |
+
"<|special_204|>",
|
| 199 |
+
"<|special_205|>",
|
| 200 |
+
"<|special_206|>",
|
| 201 |
+
"<|special_207|>",
|
| 202 |
+
"<|special_208|>",
|
| 203 |
+
"<|special_209|>",
|
| 204 |
+
"<|special_210|>",
|
| 205 |
+
"<|special_211|>",
|
| 206 |
+
"<|special_212|>",
|
| 207 |
+
"<|special_213|>",
|
| 208 |
+
"<|special_214|>",
|
| 209 |
+
"<|special_215|>",
|
| 210 |
+
"<|special_216|>",
|
| 211 |
+
"<|special_217|>",
|
| 212 |
+
"<|special_218|>",
|
| 213 |
+
"<|special_219|>",
|
| 214 |
+
"<|special_220|>",
|
| 215 |
+
"<|special_221|>",
|
| 216 |
+
"<|special_222|>",
|
| 217 |
+
"<|special_223|>",
|
| 218 |
+
"<|special_224|>",
|
| 219 |
+
"<|special_225|>",
|
| 220 |
+
"<|special_226|>",
|
| 221 |
+
"<|special_227|>",
|
| 222 |
+
"<|special_228|>",
|
| 223 |
+
"<|special_229|>",
|
| 224 |
+
"<|special_230|>",
|
| 225 |
+
"<|special_231|>",
|
| 226 |
+
"<|special_232|>",
|
| 227 |
+
"<|special_233|>",
|
| 228 |
+
"<|special_234|>",
|
| 229 |
+
"<|special_235|>",
|
| 230 |
+
"<|special_236|>",
|
| 231 |
+
"<|special_237|>",
|
| 232 |
+
"<|special_238|>",
|
| 233 |
+
"<|special_239|>",
|
| 234 |
+
"<|special_240|>",
|
| 235 |
+
"<|special_241|>",
|
| 236 |
+
"<|special_242|>",
|
| 237 |
+
"<|special_243|>",
|
| 238 |
+
"<|special_244|>",
|
| 239 |
+
"<|special_245|>",
|
| 240 |
+
"<|special_246|>",
|
| 241 |
+
"<|special_247|>",
|
| 242 |
+
"<|special_248|>",
|
| 243 |
+
"<|special_249|>",
|
| 244 |
+
"<|special_250|>",
|
| 245 |
+
"<|special_251|>",
|
| 246 |
+
"<|special_252|>",
|
| 247 |
+
"<|special_253|>",
|
| 248 |
+
"<|special_254|>",
|
| 249 |
+
"<|special_255|>",
|
| 250 |
+
"<|special_256|>",
|
| 251 |
+
"<|special_257|>",
|
| 252 |
+
"<|special_258|>",
|
| 253 |
+
"<|special_259|>",
|
| 254 |
+
"<|special_260|>",
|
| 255 |
+
"<|special_261|>",
|
| 256 |
+
"<|special_262|>",
|
| 257 |
+
"<|special_263|>",
|
| 258 |
+
"<|special_264|>",
|
| 259 |
+
"<|special_265|>",
|
| 260 |
+
"<|special_266|>",
|
| 261 |
+
"<|special_267|>",
|
| 262 |
+
"<|special_268|>",
|
| 263 |
+
"<|special_269|>",
|
| 264 |
+
"<|special_270|>",
|
| 265 |
+
"<|special_271|>",
|
| 266 |
+
"<|special_272|>",
|
| 267 |
+
"<|special_273|>",
|
| 268 |
+
"<|special_274|>",
|
| 269 |
+
"<|special_275|>",
|
| 270 |
+
"<|special_276|>",
|
| 271 |
+
"<|special_277|>",
|
| 272 |
+
"<|special_278|>",
|
| 273 |
+
"<|special_279|>",
|
| 274 |
+
"<|special_280|>",
|
| 275 |
+
"<|special_281|>",
|
| 276 |
+
"<|special_282|>",
|
| 277 |
+
"<|special_283|>",
|
| 278 |
+
"<|special_284|>",
|
| 279 |
+
"<|special_285|>",
|
| 280 |
+
"<|special_286|>",
|
| 281 |
+
"<|special_287|>",
|
| 282 |
+
"<|special_288|>",
|
| 283 |
+
"<|special_289|>",
|
| 284 |
+
"<|special_290|>",
|
| 285 |
+
"<|special_291|>",
|
| 286 |
+
"<|special_292|>",
|
| 287 |
+
"<|special_293|>",
|
| 288 |
+
"<|special_294|>",
|
| 289 |
+
"<|special_295|>",
|
| 290 |
+
"<|special_296|>",
|
| 291 |
+
"<|special_297|>",
|
| 292 |
+
"<|special_298|>",
|
| 293 |
+
"<|special_299|>",
|
| 294 |
+
"<|special_300|>",
|
| 295 |
+
"<|special_301|>",
|
| 296 |
+
"<|special_302|>",
|
| 297 |
+
"<|special_303|>",
|
| 298 |
+
"<|special_304|>",
|
| 299 |
+
"<|special_305|>",
|
| 300 |
+
"<|special_306|>",
|
| 301 |
+
"<|special_307|>",
|
| 302 |
+
"<|special_308|>",
|
| 303 |
+
"<|special_309|>",
|
| 304 |
+
"<|special_310|>",
|
| 305 |
+
"<|special_311|>",
|
| 306 |
+
"<|special_312|>",
|
| 307 |
+
"<|special_313|>",
|
| 308 |
+
"<|special_314|>",
|
| 309 |
+
"<|special_315|>",
|
| 310 |
+
"<|special_316|>",
|
| 311 |
+
"<|special_317|>",
|
| 312 |
+
"<|special_318|>",
|
| 313 |
+
"<|special_319|>",
|
| 314 |
+
"<|special_320|>",
|
| 315 |
+
"<|special_321|>",
|
| 316 |
+
"<|special_322|>",
|
| 317 |
+
"<|special_323|>",
|
| 318 |
+
"<|special_324|>",
|
| 319 |
+
"<|special_325|>",
|
| 320 |
+
"<|special_326|>",
|
| 321 |
+
"<|special_327|>",
|
| 322 |
+
"<|special_328|>",
|
| 323 |
+
"<|special_329|>",
|
| 324 |
+
"<|special_330|>",
|
| 325 |
+
"<|special_331|>",
|
| 326 |
+
"<|special_332|>",
|
| 327 |
+
"<|special_333|>",
|
| 328 |
+
"<|special_334|>",
|
| 329 |
+
"<|special_335|>",
|
| 330 |
+
"<|special_336|>",
|
| 331 |
+
"<|special_337|>",
|
| 332 |
+
"<|special_338|>",
|
| 333 |
+
"<|special_339|>",
|
| 334 |
+
"<|special_340|>",
|
| 335 |
+
"<|special_341|>",
|
| 336 |
+
"<|special_342|>",
|
| 337 |
+
"<|special_343|>",
|
| 338 |
+
"<|special_344|>",
|
| 339 |
+
"<|special_345|>",
|
| 340 |
+
"<|special_346|>",
|
| 341 |
+
"<|special_347|>",
|
| 342 |
+
"<|special_348|>",
|
| 343 |
+
"<|special_349|>",
|
| 344 |
+
"<|special_350|>",
|
| 345 |
+
"<|special_351|>",
|
| 346 |
+
"<|special_352|>",
|
| 347 |
+
"<|special_353|>",
|
| 348 |
+
"<|special_354|>",
|
| 349 |
+
"<|special_355|>",
|
| 350 |
+
"<|special_356|>",
|
| 351 |
+
"<|special_357|>",
|
| 352 |
+
"<|special_358|>",
|
| 353 |
+
"<|special_359|>",
|
| 354 |
+
"<|special_360|>",
|
| 355 |
+
"<|special_361|>",
|
| 356 |
+
"<|special_362|>",
|
| 357 |
+
"<|special_363|>",
|
| 358 |
+
"<|special_364|>",
|
| 359 |
+
"<|special_365|>",
|
| 360 |
+
"<|special_366|>",
|
| 361 |
+
"<|special_367|>",
|
| 362 |
+
"<|special_368|>",
|
| 363 |
+
"<|special_369|>",
|
| 364 |
+
"<|special_370|>",
|
| 365 |
+
"<|special_371|>",
|
| 366 |
+
"<|special_372|>",
|
| 367 |
+
"<|special_373|>",
|
| 368 |
+
"<|special_374|>",
|
| 369 |
+
"<|special_375|>",
|
| 370 |
+
"<|special_376|>",
|
| 371 |
+
"<|special_377|>",
|
| 372 |
+
"<|special_378|>",
|
| 373 |
+
"<|special_379|>",
|
| 374 |
+
"<|special_380|>",
|
| 375 |
+
"<|special_381|>",
|
| 376 |
+
"<|special_382|>",
|
| 377 |
+
"<|special_383|>",
|
| 378 |
+
"<|special_384|>",
|
| 379 |
+
"<|special_385|>",
|
| 380 |
+
"<|special_386|>",
|
| 381 |
+
"<|special_387|>",
|
| 382 |
+
"<|special_388|>",
|
| 383 |
+
"<|special_389|>",
|
| 384 |
+
"<|special_390|>",
|
| 385 |
+
"<|special_391|>",
|
| 386 |
+
"<|special_392|>",
|
| 387 |
+
"<|special_393|>",
|
| 388 |
+
"<|special_394|>",
|
| 389 |
+
"<|special_395|>",
|
| 390 |
+
"<|special_396|>",
|
| 391 |
+
"<|special_397|>",
|
| 392 |
+
"<|special_398|>",
|
| 393 |
+
"<|special_399|>",
|
| 394 |
+
"<|special_400|>",
|
| 395 |
+
"<|special_401|>",
|
| 396 |
+
"<|special_402|>",
|
| 397 |
+
"<|special_403|>",
|
| 398 |
+
"<|special_404|>",
|
| 399 |
+
"<|special_405|>",
|
| 400 |
+
"<|special_406|>",
|
| 401 |
+
"<|special_407|>",
|
| 402 |
+
"<|special_408|>",
|
| 403 |
+
"<|special_409|>",
|
| 404 |
+
"<|special_410|>",
|
| 405 |
+
"<|special_411|>",
|
| 406 |
+
"<|special_412|>",
|
| 407 |
+
"<|special_413|>",
|
| 408 |
+
"<|special_414|>",
|
| 409 |
+
"<|special_415|>",
|
| 410 |
+
"<|special_416|>",
|
| 411 |
+
"<|special_417|>",
|
| 412 |
+
"<|special_418|>",
|
| 413 |
+
"<|special_419|>",
|
| 414 |
+
"<|special_420|>",
|
| 415 |
+
"<|special_421|>",
|
| 416 |
+
"<|special_422|>",
|
| 417 |
+
"<|special_423|>",
|
| 418 |
+
"<|special_424|>",
|
| 419 |
+
"<|special_425|>",
|
| 420 |
+
"<|special_426|>",
|
| 421 |
+
"<|special_427|>",
|
| 422 |
+
"<|special_428|>",
|
| 423 |
+
"<|special_429|>",
|
| 424 |
+
"<|special_430|>",
|
| 425 |
+
"<|special_431|>",
|
| 426 |
+
"<|special_432|>",
|
| 427 |
+
"<|special_433|>",
|
| 428 |
+
"<|special_434|>",
|
| 429 |
+
"<|special_435|>",
|
| 430 |
+
"<|special_436|>",
|
| 431 |
+
"<|special_437|>",
|
| 432 |
+
"<|special_438|>",
|
| 433 |
+
"<|special_439|>",
|
| 434 |
+
"<|special_440|>",
|
| 435 |
+
"<|special_441|>",
|
| 436 |
+
"<|special_442|>",
|
| 437 |
+
"<|special_443|>",
|
| 438 |
+
"<|special_444|>",
|
| 439 |
+
"<|special_445|>",
|
| 440 |
+
"<|special_446|>",
|
| 441 |
+
"<|special_447|>",
|
| 442 |
+
"<|special_448|>",
|
| 443 |
+
"<|special_449|>",
|
| 444 |
+
"<|special_450|>",
|
| 445 |
+
"<|special_451|>",
|
| 446 |
+
"<|special_452|>",
|
| 447 |
+
"<|special_453|>",
|
| 448 |
+
"<|special_454|>",
|
| 449 |
+
"<|special_455|>",
|
| 450 |
+
"<|special_456|>",
|
| 451 |
+
"<|special_457|>",
|
| 452 |
+
"<|special_458|>",
|
| 453 |
+
"<|special_459|>",
|
| 454 |
+
"<|special_460|>",
|
| 455 |
+
"<|special_461|>",
|
| 456 |
+
"<|special_462|>",
|
| 457 |
+
"<|special_463|>",
|
| 458 |
+
"<|special_464|>",
|
| 459 |
+
"<|special_465|>",
|
| 460 |
+
"<|special_466|>",
|
| 461 |
+
"<|special_467|>",
|
| 462 |
+
"<|special_468|>",
|
| 463 |
+
"<|special_469|>",
|
| 464 |
+
"<|special_470|>",
|
| 465 |
+
"<|special_471|>",
|
| 466 |
+
"<|special_472|>",
|
| 467 |
+
"<|special_473|>",
|
| 468 |
+
"<|special_474|>",
|
| 469 |
+
"<|special_475|>",
|
| 470 |
+
"<|special_476|>",
|
| 471 |
+
"<|special_477|>",
|
| 472 |
+
"<|special_478|>",
|
| 473 |
+
"<|special_479|>",
|
| 474 |
+
"<|special_480|>",
|
| 475 |
+
"<|special_481|>",
|
| 476 |
+
"<|special_482|>",
|
| 477 |
+
"<|special_483|>",
|
| 478 |
+
"<|special_484|>",
|
| 479 |
+
"<|special_485|>",
|
| 480 |
+
"<|special_486|>",
|
| 481 |
+
"<|special_487|>",
|
| 482 |
+
"<|special_488|>",
|
| 483 |
+
"<|special_489|>",
|
| 484 |
+
"<|special_490|>",
|
| 485 |
+
"<|special_491|>",
|
| 486 |
+
"<|special_492|>",
|
| 487 |
+
"<|special_493|>",
|
| 488 |
+
"<|special_494|>",
|
| 489 |
+
"<|special_495|>",
|
| 490 |
+
"<|special_496|>",
|
| 491 |
+
"<|special_497|>",
|
| 492 |
+
"<|special_498|>",
|
| 493 |
+
"<|special_499|>",
|
| 494 |
+
"<|special_500|>",
|
| 495 |
+
"<|special_501|>",
|
| 496 |
+
"<|special_502|>",
|
| 497 |
+
"<|special_503|>",
|
| 498 |
+
"<|special_504|>",
|
| 499 |
+
"<|special_505|>",
|
| 500 |
+
"<|special_506|>",
|
| 501 |
+
"<|special_507|>",
|
| 502 |
+
"<|special_508|>",
|
| 503 |
+
"<|special_509|>",
|
| 504 |
+
"<|special_510|>",
|
| 505 |
+
"<|special_511|>",
|
| 506 |
+
"<|special_625|>",
|
| 507 |
+
"<|special_626|>",
|
| 508 |
+
"<|special_627|>",
|
| 509 |
+
"<|special_628|>",
|
| 510 |
+
"<|special_629|>",
|
| 511 |
+
"<|special_630|>",
|
| 512 |
+
"<|special_631|>",
|
| 513 |
+
"<|special_632|>",
|
| 514 |
+
"<|special_633|>",
|
| 515 |
+
"<|special_634|>",
|
| 516 |
+
"<|special_635|>",
|
| 517 |
+
"<|special_636|>",
|
| 518 |
+
"<|special_637|>",
|
| 519 |
+
"<|special_638|>",
|
| 520 |
+
"<|special_639|>",
|
| 521 |
+
"<|special_640|>",
|
| 522 |
+
"<|special_641|>",
|
| 523 |
+
"<|special_642|>",
|
| 524 |
+
"<|special_643|>",
|
| 525 |
+
"<|special_644|>",
|
| 526 |
+
"<|special_645|>",
|
| 527 |
+
"<|special_646|>",
|
| 528 |
+
"<|special_647|>",
|
| 529 |
+
"<|special_648|>",
|
| 530 |
+
"<|special_649|>",
|
| 531 |
+
"<|special_650|>",
|
| 532 |
+
"<|special_651|>",
|
| 533 |
+
"<|special_652|>",
|
| 534 |
+
"<|special_653|>",
|
| 535 |
+
"<|special_654|>",
|
| 536 |
+
"<|special_655|>",
|
| 537 |
+
"<|special_656|>",
|
| 538 |
+
"<|special_657|>",
|
| 539 |
+
"<|special_658|>",
|
| 540 |
+
"<|special_659|>",
|
| 541 |
+
"<|special_660|>",
|
| 542 |
+
"<|special_661|>",
|
| 543 |
+
"<|special_662|>",
|
| 544 |
+
"<|special_663|>",
|
| 545 |
+
"<|special_664|>",
|
| 546 |
+
"<|special_665|>",
|
| 547 |
+
"<|special_666|>",
|
| 548 |
+
"<|special_667|>",
|
| 549 |
+
"<|special_668|>",
|
| 550 |
+
"<|special_669|>",
|
| 551 |
+
"<|special_670|>",
|
| 552 |
+
"<|special_671|>",
|
| 553 |
+
"<|special_672|>",
|
| 554 |
+
"<|special_673|>",
|
| 555 |
+
"<|special_674|>",
|
| 556 |
+
"<|special_675|>",
|
| 557 |
+
"<|special_676|>",
|
| 558 |
+
"<|special_677|>",
|
| 559 |
+
"<|special_678|>",
|
| 560 |
+
"<|special_679|>",
|
| 561 |
+
"<|special_680|>",
|
| 562 |
+
"<|special_681|>",
|
| 563 |
+
"<|special_682|>",
|
| 564 |
+
"<|special_683|>",
|
| 565 |
+
"<|special_684|>",
|
| 566 |
+
"<|special_685|>",
|
| 567 |
+
"<|special_686|>",
|
| 568 |
+
"<|special_687|>",
|
| 569 |
+
"<|special_688|>",
|
| 570 |
+
"<|special_689|>",
|
| 571 |
+
"<|special_690|>",
|
| 572 |
+
"<|special_691|>",
|
| 573 |
+
"<|special_692|>",
|
| 574 |
+
"<|special_693|>",
|
| 575 |
+
"<|special_694|>",
|
| 576 |
+
"<|special_695|>",
|
| 577 |
+
"<|special_696|>",
|
| 578 |
+
"<|special_697|>",
|
| 579 |
+
"<|special_698|>",
|
| 580 |
+
"<|special_699|>",
|
| 581 |
+
"<|special_700|>",
|
| 582 |
+
"<|special_701|>",
|
| 583 |
+
"<|special_702|>",
|
| 584 |
+
"<|special_703|>",
|
| 585 |
+
"<|special_704|>",
|
| 586 |
+
"<|special_705|>",
|
| 587 |
+
"<|special_706|>",
|
| 588 |
+
"<|special_707|>",
|
| 589 |
+
"<|special_708|>",
|
| 590 |
+
"<|special_709|>",
|
| 591 |
+
"<|special_710|>",
|
| 592 |
+
"<|special_711|>",
|
| 593 |
+
"<|special_712|>",
|
| 594 |
+
"<|special_713|>",
|
| 595 |
+
"<|special_714|>",
|
| 596 |
+
"<|special_715|>",
|
| 597 |
+
"<|special_716|>",
|
| 598 |
+
"<|special_717|>",
|
| 599 |
+
"<|special_718|>",
|
| 600 |
+
"<|special_719|>",
|
| 601 |
+
"<|special_720|>",
|
| 602 |
+
"<|special_721|>",
|
| 603 |
+
"<|special_722|>",
|
| 604 |
+
"<|special_723|>",
|
| 605 |
+
"<|special_724|>",
|
| 606 |
+
"<|special_725|>",
|
| 607 |
+
"<|special_726|>",
|
| 608 |
+
"<|special_727|>",
|
| 609 |
+
"<|special_728|>",
|
| 610 |
+
"<|special_729|>",
|
| 611 |
+
"<|special_730|>",
|
| 612 |
+
"<|special_731|>",
|
| 613 |
+
"<|special_732|>",
|
| 614 |
+
"<|special_733|>",
|
| 615 |
+
"<|special_734|>",
|
| 616 |
+
"<|special_735|>",
|
| 617 |
+
"<|special_736|>",
|
| 618 |
+
"<|special_737|>",
|
| 619 |
+
"<|special_738|>",
|
| 620 |
+
"<|special_739|>",
|
| 621 |
+
"<|special_740|>",
|
| 622 |
+
"<|special_741|>",
|
| 623 |
+
"<|special_742|>",
|
| 624 |
+
"<|special_743|>",
|
| 625 |
+
"<|special_744|>",
|
| 626 |
+
"<|special_745|>",
|
| 627 |
+
"<|special_746|>",
|
| 628 |
+
"<|special_747|>",
|
| 629 |
+
"<|special_748|>",
|
| 630 |
+
"<|special_749|>",
|
| 631 |
+
"<|special_750|>",
|
| 632 |
+
"<|special_751|>",
|
| 633 |
+
"<|special_752|>",
|
| 634 |
+
"<|special_753|>",
|
| 635 |
+
"<|special_754|>",
|
| 636 |
+
"<|special_755|>",
|
| 637 |
+
"<|special_756|>",
|
| 638 |
+
"<|special_757|>",
|
| 639 |
+
"<|special_758|>",
|
| 640 |
+
"<|special_759|>",
|
| 641 |
+
"<|special_760|>",
|
| 642 |
+
"<|special_761|>",
|
| 643 |
+
"<|special_762|>",
|
| 644 |
+
"<|special_763|>",
|
| 645 |
+
"<|special_764|>",
|
| 646 |
+
"<|special_765|>",
|
| 647 |
+
"<|special_766|>",
|
| 648 |
+
"<|special_767|>",
|
| 649 |
+
"<|special_768|>",
|
| 650 |
+
"<|special_769|>",
|
| 651 |
+
"<|special_770|>",
|
| 652 |
+
"<|special_771|>",
|
| 653 |
+
"<|special_772|>",
|
| 654 |
+
"<|special_773|>",
|
| 655 |
+
"<|special_774|>",
|
| 656 |
+
"<|special_775|>",
|
| 657 |
+
"<|special_776|>",
|
| 658 |
+
"<|special_777|>",
|
| 659 |
+
"<|special_778|>",
|
| 660 |
+
"<|special_779|>",
|
| 661 |
+
"<|special_780|>",
|
| 662 |
+
"<|special_781|>",
|
| 663 |
+
"<|special_782|>",
|
| 664 |
+
"<|special_783|>",
|
| 665 |
+
"<|special_784|>",
|
| 666 |
+
"<|special_785|>",
|
| 667 |
+
"<|special_786|>",
|
| 668 |
+
"<|special_787|>",
|
| 669 |
+
"<|special_788|>",
|
| 670 |
+
"<|special_789|>",
|
| 671 |
+
"<|special_790|>",
|
| 672 |
+
"<|special_791|>",
|
| 673 |
+
"<|special_792|>",
|
| 674 |
+
"<|special_793|>",
|
| 675 |
+
"<|special_794|>",
|
| 676 |
+
"<|special_795|>",
|
| 677 |
+
"<|special_796|>",
|
| 678 |
+
"<|special_797|>",
|
| 679 |
+
"<|special_798|>",
|
| 680 |
+
"<|special_799|>",
|
| 681 |
+
"<|special_800|>",
|
| 682 |
+
"<|special_801|>",
|
| 683 |
+
"<|special_802|>",
|
| 684 |
+
"<|special_803|>",
|
| 685 |
+
"<|special_804|>",
|
| 686 |
+
"<|special_805|>",
|
| 687 |
+
"<|special_806|>",
|
| 688 |
+
"<|special_807|>",
|
| 689 |
+
"<|special_808|>",
|
| 690 |
+
"<|special_809|>",
|
| 691 |
+
"<|special_810|>",
|
| 692 |
+
"<|special_811|>",
|
| 693 |
+
"<|special_812|>",
|
| 694 |
+
"<|special_813|>",
|
| 695 |
+
"<|special_814|>",
|
| 696 |
+
"<|special_815|>",
|
| 697 |
+
"<|special_816|>",
|
| 698 |
+
"<|special_817|>",
|
| 699 |
+
"<|special_818|>",
|
| 700 |
+
"<|special_819|>",
|
| 701 |
+
"<|special_820|>",
|
| 702 |
+
"<|special_821|>",
|
| 703 |
+
"<|special_822|>",
|
| 704 |
+
"<|special_823|>",
|
| 705 |
+
"<|special_824|>",
|
| 706 |
+
"<|special_825|>",
|
| 707 |
+
"<|special_826|>",
|
| 708 |
+
"<|special_827|>",
|
| 709 |
+
"<|special_828|>",
|
| 710 |
+
"<|special_829|>",
|
| 711 |
+
"<|special_830|>",
|
| 712 |
+
"<|special_831|>",
|
| 713 |
+
"<|special_832|>",
|
| 714 |
+
"<|special_833|>",
|
| 715 |
+
"<|special_834|>",
|
| 716 |
+
"<|special_835|>",
|
| 717 |
+
"<|special_836|>",
|
| 718 |
+
"<|special_837|>",
|
| 719 |
+
"<|special_838|>",
|
| 720 |
+
"<|special_839|>",
|
| 721 |
+
"<|special_840|>",
|
| 722 |
+
"<|special_841|>",
|
| 723 |
+
"<|special_842|>",
|
| 724 |
+
"<|special_843|>",
|
| 725 |
+
"<|special_844|>",
|
| 726 |
+
"<|special_845|>",
|
| 727 |
+
"<|special_846|>",
|
| 728 |
+
"<|special_847|>",
|
| 729 |
+
"<|special_848|>",
|
| 730 |
+
"<|special_849|>",
|
| 731 |
+
"<|special_850|>",
|
| 732 |
+
"<|special_851|>",
|
| 733 |
+
"<|special_852|>",
|
| 734 |
+
"<|special_853|>",
|
| 735 |
+
"<|special_854|>",
|
| 736 |
+
"<|special_855|>",
|
| 737 |
+
"<|special_856|>",
|
| 738 |
+
"<|special_857|>",
|
| 739 |
+
"<|special_858|>",
|
| 740 |
+
"<|special_859|>",
|
| 741 |
+
"<|special_860|>",
|
| 742 |
+
"<|special_861|>",
|
| 743 |
+
"<|special_862|>",
|
| 744 |
+
"<|special_863|>",
|
| 745 |
+
"<|special_864|>",
|
| 746 |
+
"<|special_865|>",
|
| 747 |
+
"<|special_866|>",
|
| 748 |
+
"<|special_867|>",
|
| 749 |
+
"<|special_868|>",
|
| 750 |
+
"<|special_869|>",
|
| 751 |
+
"<|special_870|>",
|
| 752 |
+
"<|special_871|>",
|
| 753 |
+
"<|special_872|>",
|
| 754 |
+
"<|special_873|>",
|
| 755 |
+
"<|special_874|>",
|
| 756 |
+
"<|special_875|>",
|
| 757 |
+
"<|special_876|>",
|
| 758 |
+
"<|special_877|>",
|
| 759 |
+
"<|special_878|>",
|
| 760 |
+
"<|special_879|>",
|
| 761 |
+
"<|special_880|>",
|
| 762 |
+
"<|special_881|>",
|
| 763 |
+
"<|special_882|>",
|
| 764 |
+
"<|special_883|>",
|
| 765 |
+
"<|special_884|>",
|
| 766 |
+
"<|special_885|>",
|
| 767 |
+
"<|special_886|>",
|
| 768 |
+
"<|special_887|>",
|
| 769 |
+
"<|special_888|>",
|
| 770 |
+
"<|special_889|>",
|
| 771 |
+
"<|special_890|>",
|
| 772 |
+
"<|special_891|>",
|
| 773 |
+
"<|special_892|>",
|
| 774 |
+
"<|special_893|>",
|
| 775 |
+
"<|special_894|>",
|
| 776 |
+
"<|special_895|>",
|
| 777 |
+
"<|special_896|>",
|
| 778 |
+
"<|special_897|>",
|
| 779 |
+
"<|special_898|>",
|
| 780 |
+
"<|special_899|>",
|
| 781 |
+
"<|special_900|>",
|
| 782 |
+
"<|special_901|>",
|
| 783 |
+
"<|special_902|>",
|
| 784 |
+
"<|special_903|>",
|
| 785 |
+
"<|special_904|>",
|
| 786 |
+
"<|special_905|>",
|
| 787 |
+
"<|special_906|>",
|
| 788 |
+
"<|special_907|>",
|
| 789 |
+
"<|special_908|>",
|
| 790 |
+
"<|special_909|>",
|
| 791 |
+
"<|special_910|>",
|
| 792 |
+
"<|special_911|>",
|
| 793 |
+
"<|special_912|>",
|
| 794 |
+
"<|special_913|>",
|
| 795 |
+
"<|special_914|>",
|
| 796 |
+
"<|special_915|>",
|
| 797 |
+
"<|special_916|>",
|
| 798 |
+
"<|special_917|>",
|
| 799 |
+
"<|special_918|>",
|
| 800 |
+
"<|special_919|>",
|
| 801 |
+
"<|special_920|>",
|
| 802 |
+
"<|special_921|>",
|
| 803 |
+
"<|special_922|>",
|
| 804 |
+
"<|special_923|>",
|
| 805 |
+
"<|special_924|>",
|
| 806 |
+
"<|special_925|>",
|
| 807 |
+
"<|special_926|>",
|
| 808 |
+
"<|special_927|>",
|
| 809 |
+
"<|special_928|>",
|
| 810 |
+
"<|special_929|>",
|
| 811 |
+
"<|special_930|>",
|
| 812 |
+
"<|special_931|>",
|
| 813 |
+
"<|special_932|>",
|
| 814 |
+
"<|special_933|>",
|
| 815 |
+
"<|special_934|>",
|
| 816 |
+
"<|special_935|>",
|
| 817 |
+
"<|special_936|>",
|
| 818 |
+
"<|special_937|>",
|
| 819 |
+
"<|special_938|>",
|
| 820 |
+
"<|special_939|>",
|
| 821 |
+
"<|special_940|>",
|
| 822 |
+
"<|special_941|>",
|
| 823 |
+
"<|special_942|>",
|
| 824 |
+
"<|special_943|>",
|
| 825 |
+
"<|special_944|>",
|
| 826 |
+
"<|special_945|>",
|
| 827 |
+
"<|special_946|>",
|
| 828 |
+
"<|special_947|>",
|
| 829 |
+
"<|special_948|>",
|
| 830 |
+
"<|special_949|>",
|
| 831 |
+
"<|special_950|>",
|
| 832 |
+
"<|special_951|>",
|
| 833 |
+
"<|special_952|>",
|
| 834 |
+
"<|special_953|>",
|
| 835 |
+
"<|special_954|>",
|
| 836 |
+
"<|special_955|>",
|
| 837 |
+
"<|special_956|>",
|
| 838 |
+
"<|special_957|>",
|
| 839 |
+
"<|special_958|>",
|
| 840 |
+
"<|special_959|>",
|
| 841 |
+
"<|special_960|>",
|
| 842 |
+
"<|special_961|>",
|
| 843 |
+
"<|special_962|>",
|
| 844 |
+
"<|special_963|>",
|
| 845 |
+
"<|special_964|>",
|
| 846 |
+
"<|special_965|>",
|
| 847 |
+
"<|special_966|>",
|
| 848 |
+
"<|special_967|>",
|
| 849 |
+
"<|special_968|>",
|
| 850 |
+
"<|special_969|>",
|
| 851 |
+
"<|special_970|>",
|
| 852 |
+
"<|special_971|>",
|
| 853 |
+
"<|special_972|>",
|
| 854 |
+
"<|special_973|>",
|
| 855 |
+
"<|special_974|>",
|
| 856 |
+
"<|special_975|>",
|
| 857 |
+
"<|special_976|>",
|
| 858 |
+
"<|special_977|>",
|
| 859 |
+
"<|special_978|>",
|
| 860 |
+
"<|special_979|>",
|
| 861 |
+
"<|special_980|>",
|
| 862 |
+
"<|special_981|>",
|
| 863 |
+
"<|special_982|>",
|
| 864 |
+
"<|special_983|>",
|
| 865 |
+
"<|special_984|>",
|
| 866 |
+
"<|special_985|>",
|
| 867 |
+
"<|special_986|>",
|
| 868 |
+
"<|special_987|>",
|
| 869 |
+
"<|special_988|>",
|
| 870 |
+
"<|special_989|>",
|
| 871 |
+
"<|special_990|>",
|
| 872 |
+
"<|special_991|>",
|
| 873 |
+
"<|special_992|>",
|
| 874 |
+
"<|special_993|>",
|
| 875 |
+
"<|special_994|>",
|
| 876 |
+
"<|special_995|>",
|
| 877 |
+
"<|special_996|>",
|
| 878 |
+
"<|special_997|>",
|
| 879 |
+
"<|special_998|>",
|
| 880 |
+
"<|special_999|>",
|
| 881 |
+
"<|special_1000|>",
|
| 882 |
+
"<|special_1001|>",
|
| 883 |
+
"<|special_1002|>",
|
| 884 |
+
"<|special_1003|>",
|
| 885 |
+
"<|special_1004|>",
|
| 886 |
+
"<|special_1005|>",
|
| 887 |
+
"<|special_1006|>",
|
| 888 |
+
"<|special_1007|>",
|
| 889 |
+
"<|special_1008|>",
|
| 890 |
+
"<|special_1009|>",
|
| 891 |
+
"<|special_1010|>",
|
| 892 |
+
"<|special_1011|>",
|
| 893 |
+
"<|special_1012|>",
|
| 894 |
+
"<|special_1013|>",
|
| 895 |
+
"<|special_1014|>",
|
| 896 |
+
"<|special_1015|>",
|
| 897 |
+
"<|special_1016|>",
|
| 898 |
+
"<|special_1017|>",
|
| 899 |
+
"<|special_1018|>",
|
| 900 |
+
"<|special_1019|>",
|
| 901 |
+
"<|special_1020|>",
|
| 902 |
+
"<|special_1021|>",
|
| 903 |
+
"<|special_1022|>",
|
| 904 |
+
"<|special_1023|>",
|
| 905 |
+
"<|special_1024|>",
|
| 906 |
+
"<|special_1025|>",
|
| 907 |
+
"<|special_1026|>",
|
| 908 |
+
"<|special_1027|>",
|
| 909 |
+
"<|special_1028|>",
|
| 910 |
+
"<|special_1029|>",
|
| 911 |
+
"<|special_1030|>",
|
| 912 |
+
"<|special_1031|>",
|
| 913 |
+
"<|special_1032|>",
|
| 914 |
+
"<|special_1033|>",
|
| 915 |
+
"<|special_1034|>",
|
| 916 |
+
"<|special_1035|>",
|
| 917 |
+
"<|special_1036|>",
|
| 918 |
+
"<|special_1037|>",
|
| 919 |
+
"<|special_1038|>",
|
| 920 |
+
"<|special_1039|>",
|
| 921 |
+
"<|special_1040|>",
|
| 922 |
+
"<|special_1041|>",
|
| 923 |
+
"<|special_1042|>",
|
| 924 |
+
"<|special_1043|>",
|
| 925 |
+
"<|special_1044|>",
|
| 926 |
+
"<|special_1045|>",
|
| 927 |
+
"<|special_1046|>",
|
| 928 |
+
"<|special_1047|>",
|
| 929 |
+
"<|special_1048|>",
|
| 930 |
+
"<|special_1049|>",
|
| 931 |
+
"<|special_1050|>",
|
| 932 |
+
"<|special_1051|>",
|
| 933 |
+
"<|special_1052|>",
|
| 934 |
+
"<|special_1053|>",
|
| 935 |
+
"<|special_1054|>",
|
| 936 |
+
"<|special_1055|>",
|
| 937 |
+
"<|special_1056|>",
|
| 938 |
+
"<|special_1057|>",
|
| 939 |
+
"<|special_1058|>",
|
| 940 |
+
"<|special_1059|>",
|
| 941 |
+
"<|special_1060|>",
|
| 942 |
+
"<|special_1061|>",
|
| 943 |
+
"<|special_1062|>",
|
| 944 |
+
"<|special_1063|>",
|
| 945 |
+
"<|special_1064|>",
|
| 946 |
+
"<|special_1065|>",
|
| 947 |
+
"<|special_1066|>",
|
| 948 |
+
"<|special_1067|>",
|
| 949 |
+
"<|special_1068|>",
|
| 950 |
+
"<|special_1069|>",
|
| 951 |
+
"<|special_1070|>",
|
| 952 |
+
"<|special_1071|>",
|
| 953 |
+
"<|special_1072|>",
|
| 954 |
+
"<|special_1073|>",
|
| 955 |
+
"<|special_1074|>",
|
| 956 |
+
"<|special_1075|>",
|
| 957 |
+
"<|special_1076|>",
|
| 958 |
+
"<|special_1077|>",
|
| 959 |
+
"<|special_1078|>",
|
| 960 |
+
"<|special_1079|>",
|
| 961 |
+
"<|special_1080|>",
|
| 962 |
+
"<|special_1081|>",
|
| 963 |
+
"<|special_1082|>",
|
| 964 |
+
"<|special_1083|>",
|
| 965 |
+
"<|special_1084|>",
|
| 966 |
+
"<|special_1085|>",
|
| 967 |
+
"<|special_1086|>",
|
| 968 |
+
"<|special_1087|>",
|
| 969 |
+
"<|special_1088|>",
|
| 970 |
+
"<|special_1089|>",
|
| 971 |
+
"<|special_1090|>",
|
| 972 |
+
"<|special_1091|>",
|
| 973 |
+
"<|special_1092|>",
|
| 974 |
+
"<|special_1093|>",
|
| 975 |
+
"<|special_1094|>",
|
| 976 |
+
"<|special_1095|>",
|
| 977 |
+
"<|special_1096|>",
|
| 978 |
+
"<|special_1097|>",
|
| 979 |
+
"<|special_1098|>",
|
| 980 |
+
"<|special_1099|>",
|
| 981 |
+
"<|special_1100|>",
|
| 982 |
+
"<|special_1101|>",
|
| 983 |
+
"<|special_1102|>",
|
| 984 |
+
"<|special_1103|>",
|
| 985 |
+
"<|special_1104|>",
|
| 986 |
+
"<|special_1105|>",
|
| 987 |
+
"<|special_1106|>",
|
| 988 |
+
"<|special_1107|>",
|
| 989 |
+
"<|special_1108|>",
|
| 990 |
+
"<|special_1109|>",
|
| 991 |
+
"<|special_1110|>",
|
| 992 |
+
"<|special_1111|>",
|
| 993 |
+
"<|special_1112|>",
|
| 994 |
+
"<|special_1113|>",
|
| 995 |
+
"<|special_1114|>",
|
| 996 |
+
"<|special_1115|>",
|
| 997 |
+
"<|special_1116|>",
|
| 998 |
+
"<|special_1117|>",
|
| 999 |
+
"<|special_1118|>",
|
| 1000 |
+
"<|special_1119|>",
|
| 1001 |
+
"<|special_1120|>",
|
| 1002 |
+
"<|special_1121|>",
|
| 1003 |
+
"<|special_1122|>",
|
| 1004 |
+
"<|special_1123|>",
|
| 1005 |
+
"<|special_1124|>",
|
| 1006 |
+
"<|special_1125|>",
|
| 1007 |
+
"<|special_1126|>",
|
| 1008 |
+
"<|special_1127|>",
|
| 1009 |
+
"<|special_1128|>",
|
| 1010 |
+
"<|special_1129|>",
|
| 1011 |
+
"<|special_1130|>",
|
| 1012 |
+
"<|special_1131|>",
|
| 1013 |
+
"<|special_1132|>",
|
| 1014 |
+
"<|special_1133|>",
|
| 1015 |
+
"<|special_1134|>",
|
| 1016 |
+
"<|special_1135|>",
|
| 1017 |
+
"<|special_1136|>",
|
| 1018 |
+
"<|special_1137|>",
|
| 1019 |
+
"<|special_1138|>",
|
| 1020 |
+
"<|special_1139|>",
|
| 1021 |
+
"<|special_1140|>",
|
| 1022 |
+
"<|special_1141|>",
|
| 1023 |
+
"<|special_1142|>",
|
| 1024 |
+
"<|special_1143|>",
|
| 1025 |
+
"<|special_1144|>",
|
| 1026 |
+
"<|special_1145|>",
|
| 1027 |
+
"<|special_1146|>",
|
| 1028 |
+
"<|special_1147|>",
|
| 1029 |
+
"<|special_1148|>",
|
| 1030 |
+
"<|special_1149|>",
|
| 1031 |
+
"<|special_1150|>",
|
| 1032 |
+
"<|special_1151|>",
|
| 1033 |
+
"<|special_1152|>",
|
| 1034 |
+
"<|special_1153|>",
|
| 1035 |
+
"<|special_1154|>",
|
| 1036 |
+
"<|special_1155|>",
|
| 1037 |
+
"<|special_1156|>",
|
| 1038 |
+
"<|special_1157|>",
|
| 1039 |
+
"<|special_1158|>",
|
| 1040 |
+
"<|special_1159|>",
|
| 1041 |
+
"<|special_1160|>",
|
| 1042 |
+
"<|special_1161|>",
|
| 1043 |
+
"<|special_1162|>",
|
| 1044 |
+
"<|special_1163|>",
|
| 1045 |
+
"<|special_1164|>",
|
| 1046 |
+
"<|special_1165|>",
|
| 1047 |
+
"<|special_1166|>",
|
| 1048 |
+
"<|special_1167|>",
|
| 1049 |
+
"<|special_1168|>",
|
| 1050 |
+
"<|special_1169|>",
|
| 1051 |
+
"<|special_1170|>",
|
| 1052 |
+
"<|special_1171|>",
|
| 1053 |
+
"<|special_1172|>",
|
| 1054 |
+
"<|special_1173|>",
|
| 1055 |
+
"<|special_1174|>",
|
| 1056 |
+
"<|special_1175|>",
|
| 1057 |
+
"<|special_1176|>",
|
| 1058 |
+
"<|special_1177|>",
|
| 1059 |
+
"<|special_1178|>",
|
| 1060 |
+
"<|special_1179|>",
|
| 1061 |
+
"<|special_1180|>",
|
| 1062 |
+
"<|special_1181|>",
|
| 1063 |
+
"<|special_1182|>",
|
| 1064 |
+
"<|special_1183|>",
|
| 1065 |
+
"<|special_1184|>",
|
| 1066 |
+
"<|special_1185|>",
|
| 1067 |
+
"<|special_1186|>",
|
| 1068 |
+
"<|special_1187|>",
|
| 1069 |
+
"<|special_1188|>",
|
| 1070 |
+
"<|special_1189|>",
|
| 1071 |
+
"<|special_1190|>",
|
| 1072 |
+
"<|special_1191|>",
|
| 1073 |
+
"<|special_1192|>",
|
| 1074 |
+
"<|special_1193|>",
|
| 1075 |
+
"<|special_1194|>",
|
| 1076 |
+
"<|special_1195|>",
|
| 1077 |
+
"<|special_1196|>",
|
| 1078 |
+
"<|special_1197|>",
|
| 1079 |
+
"<|special_1198|>",
|
| 1080 |
+
"<|special_1199|>",
|
| 1081 |
+
"<|special_1200|>",
|
| 1082 |
+
"<|special_1201|>",
|
| 1083 |
+
"<|special_1202|>",
|
| 1084 |
+
"<|special_1203|>",
|
| 1085 |
+
"<|special_1204|>",
|
| 1086 |
+
"<|special_1205|>",
|
| 1087 |
+
"<|special_1206|>",
|
| 1088 |
+
"<|special_1207|>",
|
| 1089 |
+
"<|special_1208|>",
|
| 1090 |
+
"<|special_1209|>",
|
| 1091 |
+
"<|special_1210|>",
|
| 1092 |
+
"<|special_1211|>",
|
| 1093 |
+
"<|special_1212|>",
|
| 1094 |
+
"<|special_1213|>",
|
| 1095 |
+
"<|special_1214|>",
|
| 1096 |
+
"<|special_1215|>",
|
| 1097 |
+
"<|special_1216|>",
|
| 1098 |
+
"<|special_1217|>",
|
| 1099 |
+
"<|special_1218|>",
|
| 1100 |
+
"<|special_1219|>",
|
| 1101 |
+
"<|special_1220|>",
|
| 1102 |
+
"<|special_1221|>",
|
| 1103 |
+
"<|special_1222|>",
|
| 1104 |
+
"<|special_1223|>",
|
| 1105 |
+
"<|special_1224|>",
|
| 1106 |
+
"<|special_1225|>",
|
| 1107 |
+
"<|special_1226|>",
|
| 1108 |
+
"<|special_1227|>",
|
| 1109 |
+
"<|special_1228|>",
|
| 1110 |
+
"<|special_1229|>",
|
| 1111 |
+
"<|special_1230|>",
|
| 1112 |
+
"<|special_1231|>",
|
| 1113 |
+
"<|special_1232|>",
|
| 1114 |
+
"<|special_1233|>",
|
| 1115 |
+
"<|special_1234|>",
|
| 1116 |
+
"<|special_1235|>",
|
| 1117 |
+
"<|special_1236|>",
|
| 1118 |
+
"<|special_1237|>",
|
| 1119 |
+
"<|special_1238|>",
|
| 1120 |
+
"<|special_1239|>",
|
| 1121 |
+
"<|special_1240|>",
|
| 1122 |
+
"<|special_1241|>",
|
| 1123 |
+
"<|special_1242|>",
|
| 1124 |
+
"<|special_1243|>",
|
| 1125 |
+
"<|special_1244|>",
|
| 1126 |
+
"<|special_1245|>",
|
| 1127 |
+
"<|special_1246|>",
|
| 1128 |
+
"<|special_1247|>",
|
| 1129 |
+
"<|special_1248|>",
|
| 1130 |
+
"<|special_1249|>",
|
| 1131 |
+
"<|special_1250|>",
|
| 1132 |
+
"<|special_1251|>",
|
| 1133 |
+
"<|special_1252|>",
|
| 1134 |
+
"<|special_1253|>",
|
| 1135 |
+
"<|special_1254|>",
|
| 1136 |
+
"<|special_1255|>",
|
| 1137 |
+
"<|special_1256|>",
|
| 1138 |
+
"<|special_1257|>",
|
| 1139 |
+
"<|special_1258|>",
|
| 1140 |
+
"<|special_1259|>",
|
| 1141 |
+
"<|special_1260|>",
|
| 1142 |
+
"<|special_1261|>",
|
| 1143 |
+
"<|special_1262|>",
|
| 1144 |
+
"<|special_1263|>",
|
| 1145 |
+
"<|special_1264|>",
|
| 1146 |
+
"<|special_1265|>",
|
| 1147 |
+
"<|special_1266|>",
|
| 1148 |
+
"<|special_1267|>",
|
| 1149 |
+
"<|special_1268|>",
|
| 1150 |
+
"<|special_1269|>",
|
| 1151 |
+
"<|special_1270|>",
|
| 1152 |
+
"<|special_1271|>",
|
| 1153 |
+
"<|special_1272|>",
|
| 1154 |
+
"<|special_1273|>",
|
| 1155 |
+
"<|special_1274|>",
|
| 1156 |
+
"<|special_1275|>",
|
| 1157 |
+
"<|special_1276|>",
|
| 1158 |
+
"<|special_1277|>",
|
| 1159 |
+
"<|special_1278|>",
|
| 1160 |
+
"<|special_1279|>",
|
| 1161 |
+
"<|special_1280|>",
|
| 1162 |
+
"<|special_1281|>",
|
| 1163 |
+
"<|special_1282|>",
|
| 1164 |
+
"<|special_1283|>",
|
| 1165 |
+
"<|special_1284|>",
|
| 1166 |
+
"<|special_1285|>",
|
| 1167 |
+
"<|special_1286|>",
|
| 1168 |
+
"<|special_1287|>",
|
| 1169 |
+
"<|special_1288|>",
|
| 1170 |
+
"<|special_1289|>",
|
| 1171 |
+
"<|special_1290|>",
|
| 1172 |
+
"<|special_1291|>",
|
| 1173 |
+
"<|special_1292|>",
|
| 1174 |
+
"<|special_1293|>",
|
| 1175 |
+
"<|special_1294|>",
|
| 1176 |
+
"<|special_1295|>",
|
| 1177 |
+
"<|special_1296|>",
|
| 1178 |
+
"<|special_1297|>",
|
| 1179 |
+
"<|special_1298|>",
|
| 1180 |
+
"<|special_1299|>",
|
| 1181 |
+
"<|special_1300|>",
|
| 1182 |
+
"<|special_1301|>",
|
| 1183 |
+
"<|special_1302|>",
|
| 1184 |
+
"<|special_1303|>",
|
| 1185 |
+
"<|special_1304|>",
|
| 1186 |
+
"<|special_1305|>",
|
| 1187 |
+
"<|special_1306|>",
|
| 1188 |
+
"<|special_1307|>",
|
| 1189 |
+
"<|special_1308|>",
|
| 1190 |
+
"<|special_1309|>",
|
| 1191 |
+
"<|special_1310|>",
|
| 1192 |
+
"<|special_1311|>",
|
| 1193 |
+
"<|special_1312|>",
|
| 1194 |
+
"<|special_1313|>",
|
| 1195 |
+
"<|special_1314|>",
|
| 1196 |
+
"<|special_1315|>",
|
| 1197 |
+
"<|special_1316|>",
|
| 1198 |
+
"<|special_1317|>",
|
| 1199 |
+
"<|special_1318|>",
|
| 1200 |
+
"<|special_1319|>",
|
| 1201 |
+
"<|special_1320|>",
|
| 1202 |
+
"<|special_1321|>",
|
| 1203 |
+
"<|special_1322|>",
|
| 1204 |
+
"<|special_1323|>",
|
| 1205 |
+
"<|special_1324|>",
|
| 1206 |
+
"<|special_1325|>",
|
| 1207 |
+
"<|special_1326|>",
|
| 1208 |
+
"<|special_1327|>",
|
| 1209 |
+
"<|special_1328|>",
|
| 1210 |
+
"<|special_1329|>",
|
| 1211 |
+
"<|special_1330|>",
|
| 1212 |
+
"<|special_1331|>",
|
| 1213 |
+
"<|special_1332|>",
|
| 1214 |
+
"<|special_1333|>",
|
| 1215 |
+
"<|special_1334|>",
|
| 1216 |
+
"<|special_1335|>",
|
| 1217 |
+
"<|special_1336|>",
|
| 1218 |
+
"<|special_1337|>",
|
| 1219 |
+
"<|special_1338|>",
|
| 1220 |
+
"<|special_1339|>",
|
| 1221 |
+
"<|special_1340|>",
|
| 1222 |
+
"<|special_1341|>",
|
| 1223 |
+
"<|special_1342|>",
|
| 1224 |
+
"<|special_1343|>",
|
| 1225 |
+
"<|special_1344|>",
|
| 1226 |
+
"<|special_1345|>",
|
| 1227 |
+
"<|special_1346|>",
|
| 1228 |
+
"<|special_1347|>",
|
| 1229 |
+
"<|special_1348|>",
|
| 1230 |
+
"<|special_1349|>",
|
| 1231 |
+
"<|special_1350|>",
|
| 1232 |
+
"<|special_1351|>",
|
| 1233 |
+
"<|special_1352|>",
|
| 1234 |
+
"<|special_1353|>",
|
| 1235 |
+
"<|special_1354|>",
|
| 1236 |
+
"<|special_1355|>",
|
| 1237 |
+
"<|special_1356|>",
|
| 1238 |
+
"<|special_1357|>",
|
| 1239 |
+
"<|special_1358|>",
|
| 1240 |
+
"<|special_1359|>",
|
| 1241 |
+
"<|special_1360|>",
|
| 1242 |
+
"<|special_1361|>",
|
| 1243 |
+
"<|special_1362|>",
|
| 1244 |
+
"<|special_1363|>",
|
| 1245 |
+
"<|special_1364|>",
|
| 1246 |
+
"<|special_1365|>",
|
| 1247 |
+
"<|special_1366|>",
|
| 1248 |
+
"<|special_1367|>",
|
| 1249 |
+
"<|special_1368|>",
|
| 1250 |
+
"<|special_1369|>",
|
| 1251 |
+
"<|special_1370|>",
|
| 1252 |
+
"<|special_1371|>",
|
| 1253 |
+
"<|special_1372|>",
|
| 1254 |
+
"<|special_1373|>",
|
| 1255 |
+
"<|special_1374|>",
|
| 1256 |
+
"<|special_1375|>",
|
| 1257 |
+
"<|special_1376|>",
|
| 1258 |
+
"<|special_1377|>",
|
| 1259 |
+
"<|special_1378|>",
|
| 1260 |
+
"<|special_1379|>",
|
| 1261 |
+
"<|special_1380|>",
|
| 1262 |
+
"<|special_1381|>",
|
| 1263 |
+
"<|special_1382|>",
|
| 1264 |
+
"<|special_1383|>",
|
| 1265 |
+
"<|special_1384|>",
|
| 1266 |
+
"<|special_1385|>",
|
| 1267 |
+
"<|special_1386|>",
|
| 1268 |
+
"<|special_1387|>",
|
| 1269 |
+
"<|special_1388|>",
|
| 1270 |
+
"<|special_1389|>",
|
| 1271 |
+
"<|special_1390|>",
|
| 1272 |
+
"<|special_1391|>",
|
| 1273 |
+
"<|special_1392|>",
|
| 1274 |
+
"<|special_1393|>",
|
| 1275 |
+
"<|special_1394|>",
|
| 1276 |
+
"<|special_1395|>",
|
| 1277 |
+
"<|special_1396|>",
|
| 1278 |
+
"<|special_1397|>",
|
| 1279 |
+
"<|special_1398|>",
|
| 1280 |
+
"<|special_1399|>",
|
| 1281 |
+
"<|special_1400|>",
|
| 1282 |
+
"<|special_1401|>",
|
| 1283 |
+
"<|special_1402|>",
|
| 1284 |
+
"<|special_1403|>",
|
| 1285 |
+
"<|special_1404|>",
|
| 1286 |
+
"<|special_1405|>",
|
| 1287 |
+
"<|special_1406|>",
|
| 1288 |
+
"<|special_1407|>",
|
| 1289 |
+
"<|special_1408|>",
|
| 1290 |
+
"<|special_1409|>",
|
| 1291 |
+
"<|special_1410|>",
|
| 1292 |
+
"<|special_1411|>",
|
| 1293 |
+
"<|special_1412|>",
|
| 1294 |
+
"<|special_1413|>",
|
| 1295 |
+
"<|special_1414|>",
|
| 1296 |
+
"<|special_1415|>",
|
| 1297 |
+
"<|special_1416|>",
|
| 1298 |
+
"<|special_1417|>",
|
| 1299 |
+
"<|special_1418|>",
|
| 1300 |
+
"<|special_1419|>",
|
| 1301 |
+
"<|special_1420|>",
|
| 1302 |
+
"<|special_1421|>",
|
| 1303 |
+
"<|special_1422|>",
|
| 1304 |
+
"<|special_1423|>",
|
| 1305 |
+
"<|special_1424|>",
|
| 1306 |
+
"<|special_1425|>",
|
| 1307 |
+
"<|special_1426|>",
|
| 1308 |
+
"<|special_1427|>",
|
| 1309 |
+
"<|special_1428|>",
|
| 1310 |
+
"<|special_1429|>",
|
| 1311 |
+
"<|special_1430|>",
|
| 1312 |
+
"<|special_1431|>",
|
| 1313 |
+
"<|special_1432|>",
|
| 1314 |
+
"<|special_1433|>",
|
| 1315 |
+
"<|special_1434|>",
|
| 1316 |
+
"<|special_1435|>",
|
| 1317 |
+
"<|special_1436|>",
|
| 1318 |
+
"<|special_1437|>",
|
| 1319 |
+
"<|special_1438|>",
|
| 1320 |
+
"<|special_1439|>",
|
| 1321 |
+
"<|special_1440|>",
|
| 1322 |
+
"<|special_1441|>",
|
| 1323 |
+
"<|special_1442|>",
|
| 1324 |
+
"<|special_1443|>",
|
| 1325 |
+
"<|special_1444|>",
|
| 1326 |
+
"<|special_1445|>",
|
| 1327 |
+
"<|special_1446|>",
|
| 1328 |
+
"<|special_1447|>",
|
| 1329 |
+
"<|special_1448|>",
|
| 1330 |
+
"<|special_1449|>",
|
| 1331 |
+
"<|special_1450|>",
|
| 1332 |
+
"<|special_1451|>",
|
| 1333 |
+
"<|special_1452|>",
|
| 1334 |
+
"<|special_1453|>",
|
| 1335 |
+
"<|special_1454|>",
|
| 1336 |
+
"<|special_1455|>",
|
| 1337 |
+
"<|special_1456|>",
|
| 1338 |
+
"<|special_1457|>",
|
| 1339 |
+
"<|special_1458|>",
|
| 1340 |
+
"<|special_1459|>",
|
| 1341 |
+
"<|special_1460|>",
|
| 1342 |
+
"<|special_1461|>",
|
| 1343 |
+
"<|special_1462|>",
|
| 1344 |
+
"<|special_1463|>",
|
| 1345 |
+
"<|special_1464|>",
|
| 1346 |
+
"<|special_1465|>",
|
| 1347 |
+
"<|special_1466|>",
|
| 1348 |
+
"<|special_1467|>",
|
| 1349 |
+
"<|special_1468|>",
|
| 1350 |
+
"<|special_1469|>",
|
| 1351 |
+
"<|special_1470|>",
|
| 1352 |
+
"<|special_1471|>",
|
| 1353 |
+
"<|special_1472|>",
|
| 1354 |
+
"<|special_1473|>",
|
| 1355 |
+
"<|special_1474|>",
|
| 1356 |
+
"<|special_1475|>",
|
| 1357 |
+
"<|special_1476|>",
|
| 1358 |
+
"<|special_1477|>",
|
| 1359 |
+
"<|special_1478|>",
|
| 1360 |
+
"<|special_1479|>",
|
| 1361 |
+
"<|special_1480|>",
|
| 1362 |
+
"<|special_1481|>",
|
| 1363 |
+
"<|special_1482|>",
|
| 1364 |
+
"<|special_1483|>",
|
| 1365 |
+
"<|special_1484|>",
|
| 1366 |
+
"<|special_1485|>",
|
| 1367 |
+
"<|special_1486|>",
|
| 1368 |
+
"<|special_1487|>",
|
| 1369 |
+
"<|special_1488|>",
|
| 1370 |
+
"<|special_1489|>",
|
| 1371 |
+
"<|special_1490|>",
|
| 1372 |
+
"<|special_1491|>",
|
| 1373 |
+
"<|special_1492|>",
|
| 1374 |
+
"<|special_1493|>",
|
| 1375 |
+
"<|special_1494|>",
|
| 1376 |
+
"<|special_1495|>",
|
| 1377 |
+
"<|special_1496|>",
|
| 1378 |
+
"<|special_1497|>",
|
| 1379 |
+
"<|special_1498|>",
|
| 1380 |
+
"<|special_1499|>",
|
| 1381 |
+
"<|special_1500|>",
|
| 1382 |
+
"<|special_1501|>",
|
| 1383 |
+
"<|special_1502|>",
|
| 1384 |
+
"<|special_1503|>",
|
| 1385 |
+
"<|special_1504|>",
|
| 1386 |
+
"<|special_1505|>",
|
| 1387 |
+
"<|special_1506|>",
|
| 1388 |
+
"<|special_1507|>",
|
| 1389 |
+
"<|special_1508|>",
|
| 1390 |
+
"<|special_1509|>",
|
| 1391 |
+
"<|special_1510|>",
|
| 1392 |
+
"<|special_1511|>",
|
| 1393 |
+
"<|special_1512|>",
|
| 1394 |
+
"<|special_1513|>",
|
| 1395 |
+
"<|special_1514|>",
|
| 1396 |
+
"<|special_1515|>",
|
| 1397 |
+
"<|special_1516|>",
|
| 1398 |
+
"<|special_1517|>",
|
| 1399 |
+
"<|special_1518|>",
|
| 1400 |
+
"<|special_1519|>",
|
| 1401 |
+
"<|special_1520|>",
|
| 1402 |
+
"<|special_1521|>",
|
| 1403 |
+
"<|special_1522|>",
|
| 1404 |
+
"<|special_1523|>",
|
| 1405 |
+
"<|special_1524|>",
|
| 1406 |
+
"<|special_1525|>",
|
| 1407 |
+
"<|special_1526|>",
|
| 1408 |
+
"<|special_1527|>",
|
| 1409 |
+
"<|special_1528|>",
|
| 1410 |
+
"<|special_1529|>",
|
| 1411 |
+
"<|special_1530|>",
|
| 1412 |
+
"<|special_1531|>",
|
| 1413 |
+
"<|special_1532|>",
|
| 1414 |
+
"<|special_1533|>",
|
| 1415 |
+
"<|special_1534|>",
|
| 1416 |
+
"<|special_1535|>",
|
| 1417 |
+
"<|special_1536|>",
|
| 1418 |
+
"<|special_1537|>",
|
| 1419 |
+
"<|special_1538|>",
|
| 1420 |
+
"<|special_1539|>",
|
| 1421 |
+
"<|special_1540|>",
|
| 1422 |
+
"<|special_1541|>",
|
| 1423 |
+
"<|special_1542|>",
|
| 1424 |
+
"<|special_1543|>",
|
| 1425 |
+
"<|special_1544|>",
|
| 1426 |
+
"<|special_1545|>",
|
| 1427 |
+
"<|special_1546|>",
|
| 1428 |
+
"<|special_1547|>",
|
| 1429 |
+
"<|special_1548|>",
|
| 1430 |
+
"<|special_1549|>",
|
| 1431 |
+
"<|special_1550|>",
|
| 1432 |
+
"<|special_1551|>",
|
| 1433 |
+
"<|special_1552|>",
|
| 1434 |
+
"<|special_1553|>",
|
| 1435 |
+
"<|special_1554|>",
|
| 1436 |
+
"<|special_1555|>",
|
| 1437 |
+
"<|special_1556|>",
|
| 1438 |
+
"<|special_1557|>",
|
| 1439 |
+
"<|special_1558|>",
|
| 1440 |
+
"<|special_1559|>",
|
| 1441 |
+
"<|special_1560|>",
|
| 1442 |
+
"<|special_1561|>",
|
| 1443 |
+
"<|special_1562|>",
|
| 1444 |
+
"<|special_1563|>",
|
| 1445 |
+
"<|special_1564|>",
|
| 1446 |
+
"<|special_1565|>",
|
| 1447 |
+
"<|special_1566|>",
|
| 1448 |
+
"<|special_1567|>",
|
| 1449 |
+
"<|special_1568|>",
|
| 1450 |
+
"<|special_1569|>",
|
| 1451 |
+
"<|special_1570|>",
|
| 1452 |
+
"<|special_1571|>",
|
| 1453 |
+
"<|special_1572|>",
|
| 1454 |
+
"<|special_1573|>",
|
| 1455 |
+
"<|special_1574|>",
|
| 1456 |
+
"<|special_1575|>",
|
| 1457 |
+
"<|special_1576|>",
|
| 1458 |
+
"<|special_1577|>",
|
| 1459 |
+
"<|special_1578|>",
|
| 1460 |
+
"<|special_1579|>",
|
| 1461 |
+
"<|special_1580|>",
|
| 1462 |
+
"<|special_1581|>",
|
| 1463 |
+
"<|special_1582|>",
|
| 1464 |
+
"<|special_1583|>",
|
| 1465 |
+
"<|special_1584|>",
|
| 1466 |
+
"<|special_1585|>",
|
| 1467 |
+
"<|special_1586|>",
|
| 1468 |
+
"<|special_1587|>",
|
| 1469 |
+
"<|special_1588|>",
|
| 1470 |
+
"<|special_1589|>",
|
| 1471 |
+
"<|special_1590|>",
|
| 1472 |
+
"<|special_1591|>",
|
| 1473 |
+
"<|special_1592|>",
|
| 1474 |
+
"<|special_1593|>",
|
| 1475 |
+
"<|special_1594|>",
|
| 1476 |
+
"<|special_1595|>",
|
| 1477 |
+
"<|special_1596|>",
|
| 1478 |
+
"<|special_1597|>",
|
| 1479 |
+
"<|special_1598|>",
|
| 1480 |
+
"<|special_1599|>",
|
| 1481 |
+
"<|special_1600|>",
|
| 1482 |
+
"<|special_1601|>",
|
| 1483 |
+
"<|special_1602|>",
|
| 1484 |
+
"<|special_1603|>",
|
| 1485 |
+
"<|special_1604|>",
|
| 1486 |
+
"<|special_1605|>",
|
| 1487 |
+
"<|special_1606|>",
|
| 1488 |
+
"<|special_1607|>",
|
| 1489 |
+
"<|special_1608|>",
|
| 1490 |
+
"<|special_1609|>",
|
| 1491 |
+
"<|special_1610|>",
|
| 1492 |
+
"<|special_1611|>",
|
| 1493 |
+
"<|special_1612|>",
|
| 1494 |
+
"<|special_1613|>",
|
| 1495 |
+
"<|special_1614|>",
|
| 1496 |
+
"<|special_1615|>",
|
| 1497 |
+
"<|special_1616|>",
|
| 1498 |
+
"<|special_1617|>",
|
| 1499 |
+
"<|special_1618|>",
|
| 1500 |
+
"<|special_1619|>",
|
| 1501 |
+
"<|special_1620|>",
|
| 1502 |
+
"<|special_1621|>",
|
| 1503 |
+
"<|special_1622|>",
|
| 1504 |
+
"<|special_1623|>",
|
| 1505 |
+
"<|special_1624|>",
|
| 1506 |
+
"<|special_1625|>",
|
| 1507 |
+
"<|special_1626|>",
|
| 1508 |
+
"<|special_1627|>",
|
| 1509 |
+
"<|special_1628|>",
|
| 1510 |
+
"<|special_1629|>",
|
| 1511 |
+
"<|special_1630|>",
|
| 1512 |
+
"<|special_1631|>",
|
| 1513 |
+
"<|special_1632|>",
|
| 1514 |
+
"<|special_1633|>",
|
| 1515 |
+
"<|special_1634|>",
|
| 1516 |
+
"<|special_1635|>",
|
| 1517 |
+
"<|special_1636|>",
|
| 1518 |
+
"<|special_1637|>",
|
| 1519 |
+
"<|special_1638|>",
|
| 1520 |
+
"<|special_1639|>",
|
| 1521 |
+
"<|special_1640|>",
|
| 1522 |
+
"<|special_1641|>",
|
| 1523 |
+
"<|special_1642|>",
|
| 1524 |
+
"<|special_1643|>",
|
| 1525 |
+
"<|special_1644|>",
|
| 1526 |
+
"<|special_1645|>",
|
| 1527 |
+
"<|special_1646|>",
|
| 1528 |
+
"<|special_1647|>",
|
| 1529 |
+
"<|special_1648|>",
|
| 1530 |
+
"<|special_1649|>",
|
| 1531 |
+
"<|special_1650|>",
|
| 1532 |
+
"<|special_1651|>",
|
| 1533 |
+
"<|special_1652|>",
|
| 1534 |
+
"<|special_1653|>",
|
| 1535 |
+
"<|special_1654|>",
|
| 1536 |
+
"<|special_1655|>",
|
| 1537 |
+
"<|special_1656|>",
|
| 1538 |
+
"<|special_1657|>",
|
| 1539 |
+
"<|special_1658|>",
|
| 1540 |
+
"<|special_1659|>",
|
| 1541 |
+
"<|special_1660|>",
|
| 1542 |
+
"<|special_1661|>",
|
| 1543 |
+
"<|special_1662|>",
|
| 1544 |
+
"<|special_1663|>",
|
| 1545 |
+
"<|special_1664|>",
|
| 1546 |
+
"<|special_1665|>",
|
| 1547 |
+
"<|special_1666|>",
|
| 1548 |
+
"<|special_1667|>",
|
| 1549 |
+
"<|special_1668|>",
|
| 1550 |
+
"<|special_1669|>",
|
| 1551 |
+
"<|special_1670|>",
|
| 1552 |
+
"<|special_1671|>",
|
| 1553 |
+
"<|special_1672|>",
|
| 1554 |
+
"<|special_1673|>",
|
| 1555 |
+
"<|special_1674|>",
|
| 1556 |
+
"<|special_1675|>",
|
| 1557 |
+
"<|special_1676|>",
|
| 1558 |
+
"<|special_1677|>",
|
| 1559 |
+
"<|special_1678|>",
|
| 1560 |
+
"<|special_1679|>",
|
| 1561 |
+
"<|special_1680|>",
|
| 1562 |
+
"<|special_1681|>",
|
| 1563 |
+
"<|special_1682|>",
|
| 1564 |
+
"<|special_1683|>",
|
| 1565 |
+
"<|special_1684|>",
|
| 1566 |
+
"<|special_1685|>",
|
| 1567 |
+
"<|special_1686|>",
|
| 1568 |
+
"<|special_1687|>",
|
| 1569 |
+
"<|special_1688|>",
|
| 1570 |
+
"<|special_1689|>",
|
| 1571 |
+
"<|special_1690|>",
|
| 1572 |
+
"<|special_1691|>",
|
| 1573 |
+
"<|special_1692|>",
|
| 1574 |
+
"<|special_1693|>",
|
| 1575 |
+
"<|special_1694|>",
|
| 1576 |
+
"<|special_1695|>",
|
| 1577 |
+
"<|special_1696|>",
|
| 1578 |
+
"<|special_1697|>",
|
| 1579 |
+
"<|special_1698|>",
|
| 1580 |
+
"<|special_1699|>",
|
| 1581 |
+
"<|special_1700|>",
|
| 1582 |
+
"<|special_1701|>",
|
| 1583 |
+
"<|special_1702|>",
|
| 1584 |
+
"<|special_1703|>",
|
| 1585 |
+
"<|special_1704|>",
|
| 1586 |
+
"<|special_1705|>",
|
| 1587 |
+
"<|special_1706|>",
|
| 1588 |
+
"<|special_1707|>",
|
| 1589 |
+
"<|special_1708|>",
|
| 1590 |
+
"<|special_1709|>",
|
| 1591 |
+
"<|special_1710|>",
|
| 1592 |
+
"<|special_1711|>",
|
| 1593 |
+
"<|special_1712|>",
|
| 1594 |
+
"<|special_1713|>",
|
| 1595 |
+
"<|special_1714|>",
|
| 1596 |
+
"<|special_1715|>",
|
| 1597 |
+
"<|special_1716|>",
|
| 1598 |
+
"<|special_1717|>",
|
| 1599 |
+
"<|special_1718|>",
|
| 1600 |
+
"<|special_1719|>",
|
| 1601 |
+
"<|special_1720|>",
|
| 1602 |
+
"<|special_1721|>",
|
| 1603 |
+
"<|special_1722|>",
|
| 1604 |
+
"<|special_1723|>",
|
| 1605 |
+
"<|special_1724|>",
|
| 1606 |
+
"<|special_1725|>",
|
| 1607 |
+
"<|special_1726|>",
|
| 1608 |
+
"<|special_1727|>",
|
| 1609 |
+
"<|special_1728|>",
|
| 1610 |
+
"<|special_1729|>",
|
| 1611 |
+
"<|special_1730|>",
|
| 1612 |
+
"<|special_1731|>",
|
| 1613 |
+
"<|special_1732|>",
|
| 1614 |
+
"<|special_1733|>",
|
| 1615 |
+
"<|special_1734|>",
|
| 1616 |
+
"<|special_1735|>",
|
| 1617 |
+
"<|special_1736|>",
|
| 1618 |
+
"<|special_1737|>",
|
| 1619 |
+
"<|special_1738|>",
|
| 1620 |
+
"<|special_1739|>",
|
| 1621 |
+
"<|special_1740|>",
|
| 1622 |
+
"<|special_1741|>",
|
| 1623 |
+
"<|special_1742|>",
|
| 1624 |
+
"<|special_1743|>",
|
| 1625 |
+
"<|special_1744|>",
|
| 1626 |
+
"<|special_1745|>",
|
| 1627 |
+
"<|special_1746|>",
|
| 1628 |
+
"<|special_1747|>",
|
| 1629 |
+
"<|special_1748|>",
|
| 1630 |
+
"<|special_1749|>",
|
| 1631 |
+
"<|special_1750|>",
|
| 1632 |
+
"<|special_1751|>",
|
| 1633 |
+
"<|special_1752|>",
|
| 1634 |
+
"<|special_1753|>",
|
| 1635 |
+
"<|special_1754|>",
|
| 1636 |
+
"<|special_1755|>",
|
| 1637 |
+
"<|special_1756|>",
|
| 1638 |
+
"<|special_1757|>",
|
| 1639 |
+
"<|special_1758|>",
|
| 1640 |
+
"<|special_1759|>",
|
| 1641 |
+
"<|special_1760|>",
|
| 1642 |
+
"<|special_1761|>",
|
| 1643 |
+
"<|special_1762|>",
|
| 1644 |
+
"<|special_1763|>",
|
| 1645 |
+
"<|special_1764|>",
|
| 1646 |
+
"<|special_1765|>",
|
| 1647 |
+
"<|special_1766|>",
|
| 1648 |
+
"<|special_1767|>",
|
| 1649 |
+
"<|special_1768|>",
|
| 1650 |
+
"<|special_1769|>",
|
| 1651 |
+
"<|special_1770|>",
|
| 1652 |
+
"<|special_1771|>",
|
| 1653 |
+
"<|special_1772|>",
|
| 1654 |
+
"<|special_1773|>",
|
| 1655 |
+
"<|special_1774|>",
|
| 1656 |
+
"<|special_1775|>",
|
| 1657 |
+
"<|special_1776|>",
|
| 1658 |
+
"<|special_1777|>",
|
| 1659 |
+
"<|special_1778|>",
|
| 1660 |
+
"<|special_1779|>",
|
| 1661 |
+
"<|special_1780|>",
|
| 1662 |
+
"<|special_1781|>",
|
| 1663 |
+
"<|special_1782|>",
|
| 1664 |
+
"<|special_1783|>",
|
| 1665 |
+
"<|special_1784|>",
|
| 1666 |
+
"<|special_1785|>",
|
| 1667 |
+
"<|special_1786|>",
|
| 1668 |
+
"<|special_1787|>",
|
| 1669 |
+
"<|special_1788|>",
|
| 1670 |
+
"<|special_1789|>",
|
| 1671 |
+
"<|special_1790|>",
|
| 1672 |
+
"<|special_1791|>",
|
| 1673 |
+
"<|special_1792|>",
|
| 1674 |
+
"<|special_1793|>",
|
| 1675 |
+
"<|special_1794|>",
|
| 1676 |
+
"<|special_1795|>",
|
| 1677 |
+
"<|special_1796|>",
|
| 1678 |
+
"<|special_1797|>",
|
| 1679 |
+
"<|special_1798|>",
|
| 1680 |
+
"<|special_1799|>",
|
| 1681 |
+
"<|special_1800|>",
|
| 1682 |
+
"<|special_1801|>",
|
| 1683 |
+
"<|special_1802|>",
|
| 1684 |
+
"<|special_1803|>",
|
| 1685 |
+
"<|special_1804|>",
|
| 1686 |
+
"<|special_1805|>",
|
| 1687 |
+
"<|special_1806|>",
|
| 1688 |
+
"<|special_1807|>",
|
| 1689 |
+
"<|special_1808|>",
|
| 1690 |
+
"<|special_1809|>",
|
| 1691 |
+
"<|special_1810|>",
|
| 1692 |
+
"<|special_1811|>",
|
| 1693 |
+
"<|special_1812|>",
|
| 1694 |
+
"<|special_1813|>",
|
| 1695 |
+
"<|special_1814|>",
|
| 1696 |
+
"<|special_1815|>",
|
| 1697 |
+
"<|special_1816|>",
|
| 1698 |
+
"<|special_1817|>",
|
| 1699 |
+
"<|special_1818|>",
|
| 1700 |
+
"<|special_1819|>",
|
| 1701 |
+
"<|special_1820|>",
|
| 1702 |
+
"<|special_1821|>",
|
| 1703 |
+
"<|special_1822|>",
|
| 1704 |
+
"<|special_1823|>",
|
| 1705 |
+
"<|special_1824|>",
|
| 1706 |
+
"<|special_1825|>",
|
| 1707 |
+
"<|special_1826|>",
|
| 1708 |
+
"<|special_1827|>",
|
| 1709 |
+
"<|special_1828|>",
|
| 1710 |
+
"<|special_1829|>",
|
| 1711 |
+
"<|special_1830|>",
|
| 1712 |
+
"<|special_1831|>",
|
| 1713 |
+
"<|special_1832|>",
|
| 1714 |
+
"<|special_1833|>",
|
| 1715 |
+
"<|special_1834|>",
|
| 1716 |
+
"<|special_1835|>",
|
| 1717 |
+
"<|special_1836|>",
|
| 1718 |
+
"<|special_1837|>",
|
| 1719 |
+
"<|special_1838|>",
|
| 1720 |
+
"<|special_1839|>",
|
| 1721 |
+
"<|special_1840|>",
|
| 1722 |
+
"<|special_1841|>",
|
| 1723 |
+
"<|special_1842|>",
|
| 1724 |
+
"<|special_1843|>",
|
| 1725 |
+
"<|special_1844|>",
|
| 1726 |
+
"<|special_1845|>",
|
| 1727 |
+
"<|special_1846|>",
|
| 1728 |
+
"<|special_1847|>",
|
| 1729 |
+
"<|special_1848|>",
|
| 1730 |
+
"<|special_1849|>",
|
| 1731 |
+
"<|special_1850|>",
|
| 1732 |
+
"<|special_1851|>",
|
| 1733 |
+
"<|special_1852|>",
|
| 1734 |
+
"<|special_1853|>",
|
| 1735 |
+
"<|special_1854|>",
|
| 1736 |
+
"<|special_1855|>",
|
| 1737 |
+
"<|special_1856|>",
|
| 1738 |
+
"<|special_1857|>",
|
| 1739 |
+
"<|special_1858|>",
|
| 1740 |
+
"<|special_1859|>",
|
| 1741 |
+
"<|special_1860|>",
|
| 1742 |
+
"<|special_1861|>",
|
| 1743 |
+
"<|special_1862|>",
|
| 1744 |
+
"<|special_1863|>",
|
| 1745 |
+
"<|special_1864|>",
|
| 1746 |
+
"<|special_1865|>",
|
| 1747 |
+
"<|special_1866|>",
|
| 1748 |
+
"<|special_1867|>",
|
| 1749 |
+
"<|special_1868|>",
|
| 1750 |
+
"<|special_1869|>",
|
| 1751 |
+
"<|special_1870|>",
|
| 1752 |
+
"<|special_1871|>",
|
| 1753 |
+
"<|special_1872|>",
|
| 1754 |
+
"<|special_1873|>",
|
| 1755 |
+
"<|special_1874|>",
|
| 1756 |
+
"<|special_1875|>",
|
| 1757 |
+
"<|special_1876|>",
|
| 1758 |
+
"<|special_1877|>",
|
| 1759 |
+
"<|special_1878|>",
|
| 1760 |
+
"<|special_1879|>",
|
| 1761 |
+
"<|special_1880|>",
|
| 1762 |
+
"<|special_1881|>",
|
| 1763 |
+
"<|special_1882|>",
|
| 1764 |
+
"<|special_1883|>",
|
| 1765 |
+
"<|special_1884|>",
|
| 1766 |
+
"<|special_1885|>",
|
| 1767 |
+
"<|special_1886|>",
|
| 1768 |
+
"<|special_1887|>",
|
| 1769 |
+
"<|special_1888|>",
|
| 1770 |
+
"<|special_1889|>",
|
| 1771 |
+
"<|special_1890|>",
|
| 1772 |
+
"<|special_1891|>",
|
| 1773 |
+
"<|special_1892|>",
|
| 1774 |
+
"<|special_1893|>",
|
| 1775 |
+
"<|special_1894|>",
|
| 1776 |
+
"<|special_1895|>",
|
| 1777 |
+
"<|special_1896|>",
|
| 1778 |
+
"<|special_1897|>",
|
| 1779 |
+
"<|special_1898|>",
|
| 1780 |
+
"<|special_1899|>",
|
| 1781 |
+
"<|special_1900|>",
|
| 1782 |
+
"<|special_1901|>",
|
| 1783 |
+
"<|special_1902|>",
|
| 1784 |
+
"<|special_1903|>",
|
| 1785 |
+
"<|special_1904|>",
|
| 1786 |
+
"<|special_1905|>",
|
| 1787 |
+
"<|special_1906|>",
|
| 1788 |
+
"<|special_1907|>",
|
| 1789 |
+
"<|special_1908|>",
|
| 1790 |
+
"<|special_1909|>",
|
| 1791 |
+
"<|special_1910|>",
|
| 1792 |
+
"<|special_1911|>",
|
| 1793 |
+
"<|special_1912|>",
|
| 1794 |
+
"<|special_1913|>",
|
| 1795 |
+
"<|special_1914|>",
|
| 1796 |
+
"<|special_1915|>",
|
| 1797 |
+
"<|special_1916|>",
|
| 1798 |
+
"<|special_1917|>",
|
| 1799 |
+
"<|special_1918|>",
|
| 1800 |
+
"<|special_1919|>",
|
| 1801 |
+
"<|special_1920|>",
|
| 1802 |
+
"<|special_1921|>",
|
| 1803 |
+
"<|special_1922|>",
|
| 1804 |
+
"<|special_1923|>",
|
| 1805 |
+
"<|special_1924|>",
|
| 1806 |
+
"<|special_1925|>",
|
| 1807 |
+
"<|special_1926|>",
|
| 1808 |
+
"<|special_1927|>",
|
| 1809 |
+
"<|special_1928|>",
|
| 1810 |
+
"<|special_1929|>",
|
| 1811 |
+
"<|special_1930|>",
|
| 1812 |
+
"<|special_1931|>",
|
| 1813 |
+
"<|special_1932|>",
|
| 1814 |
+
"<|special_1933|>",
|
| 1815 |
+
"<|special_1934|>",
|
| 1816 |
+
"<|special_1935|>",
|
| 1817 |
+
"<|special_1936|>",
|
| 1818 |
+
"<|special_1937|>",
|
| 1819 |
+
"<|special_1938|>",
|
| 1820 |
+
"<|special_1939|>",
|
| 1821 |
+
"<|special_1940|>",
|
| 1822 |
+
"<|special_1941|>",
|
| 1823 |
+
"<|special_1942|>",
|
| 1824 |
+
"<|special_1943|>",
|
| 1825 |
+
"<|special_1944|>",
|
| 1826 |
+
"<|special_1945|>",
|
| 1827 |
+
"<|special_1946|>",
|
| 1828 |
+
"<|special_1947|>",
|
| 1829 |
+
"<|special_1948|>",
|
| 1830 |
+
"<|special_1949|>",
|
| 1831 |
+
"<|special_1950|>",
|
| 1832 |
+
"<|special_1951|>",
|
| 1833 |
+
"<|special_1952|>",
|
| 1834 |
+
"<|special_1953|>",
|
| 1835 |
+
"<|special_1954|>",
|
| 1836 |
+
"<|special_1955|>",
|
| 1837 |
+
"<|special_1956|>",
|
| 1838 |
+
"<|special_1957|>",
|
| 1839 |
+
"<|special_1958|>",
|
| 1840 |
+
"<|special_1959|>",
|
| 1841 |
+
"<|special_1960|>",
|
| 1842 |
+
"<|special_1961|>",
|
| 1843 |
+
"<|special_1962|>",
|
| 1844 |
+
"<|special_1963|>",
|
| 1845 |
+
"<|special_1964|>",
|
| 1846 |
+
"<|special_1965|>",
|
| 1847 |
+
"<|special_1966|>",
|
| 1848 |
+
"<|special_1967|>",
|
| 1849 |
+
"<|special_1968|>",
|
| 1850 |
+
"<|special_1969|>",
|
| 1851 |
+
"<|special_1970|>",
|
| 1852 |
+
"<|special_1971|>",
|
| 1853 |
+
"<|special_1972|>",
|
| 1854 |
+
"<|special_1973|>",
|
| 1855 |
+
"<|special_1974|>",
|
| 1856 |
+
"<|special_1975|>",
|
| 1857 |
+
"<|special_1976|>",
|
| 1858 |
+
"<|special_1977|>",
|
| 1859 |
+
"<|special_1978|>",
|
| 1860 |
+
"<|special_1979|>",
|
| 1861 |
+
"<|special_1980|>",
|
| 1862 |
+
"<|special_1981|>",
|
| 1863 |
+
"<|special_1982|>",
|
| 1864 |
+
"<|special_1983|>",
|
| 1865 |
+
"<|special_1984|>",
|
| 1866 |
+
"<|special_1985|>",
|
| 1867 |
+
"<|special_1986|>",
|
| 1868 |
+
"<|special_1987|>",
|
| 1869 |
+
"<|special_1988|>",
|
| 1870 |
+
"<|special_1989|>",
|
| 1871 |
+
"<|special_1990|>",
|
| 1872 |
+
"<|special_1991|>",
|
| 1873 |
+
"<|special_1992|>",
|
| 1874 |
+
"<|special_1993|>",
|
| 1875 |
+
"<|special_1994|>",
|
| 1876 |
+
"<|special_1995|>",
|
| 1877 |
+
"<|special_1996|>",
|
| 1878 |
+
"<|special_1997|>",
|
| 1879 |
+
"<|special_1998|>",
|
| 1880 |
+
"<|special_1999|>",
|
| 1881 |
+
"<|special_2000|>",
|
| 1882 |
+
"<|special_2001|>",
|
| 1883 |
+
"<|special_2002|>",
|
| 1884 |
+
"<|special_2003|>",
|
| 1885 |
+
"<|special_2004|>",
|
| 1886 |
+
"<|special_2005|>",
|
| 1887 |
+
"<|special_2006|>",
|
| 1888 |
+
"<|special_2007|>",
|
| 1889 |
+
"<|special_2008|>",
|
| 1890 |
+
"<|special_2009|>",
|
| 1891 |
+
"<|special_2010|>",
|
| 1892 |
+
"<|special_2011|>",
|
| 1893 |
+
"<|special_2012|>",
|
| 1894 |
+
"<|special_2013|>",
|
| 1895 |
+
"<|special_2014|>",
|
| 1896 |
+
"<|special_2015|>",
|
| 1897 |
+
"<|special_2016|>",
|
| 1898 |
+
"<|special_2017|>",
|
| 1899 |
+
"<|special_2018|>",
|
| 1900 |
+
"<|special_2019|>",
|
| 1901 |
+
"<|special_2020|>",
|
| 1902 |
+
"<|special_2021|>",
|
| 1903 |
+
"<|special_2022|>",
|
| 1904 |
+
"<|special_2023|>",
|
| 1905 |
+
"<|special_2024|>",
|
| 1906 |
+
"<|special_2025|>",
|
| 1907 |
+
"<|special_2026|>",
|
| 1908 |
+
"<|special_2027|>",
|
| 1909 |
+
"<|special_2028|>",
|
| 1910 |
+
"<|special_2029|>",
|
| 1911 |
+
"<|special_2030|>",
|
| 1912 |
+
"<|special_2031|>",
|
| 1913 |
+
"<|special_2032|>",
|
| 1914 |
+
"<|special_2033|>",
|
| 1915 |
+
"<|special_2034|>",
|
| 1916 |
+
"<|special_2035|>",
|
| 1917 |
+
"<|special_2036|>",
|
| 1918 |
+
"<|special_2037|>",
|
| 1919 |
+
"<|special_2038|>",
|
| 1920 |
+
"<|special_2039|>",
|
| 1921 |
+
"<|special_2040|>",
|
| 1922 |
+
"<|special_2041|>",
|
| 1923 |
+
"<|special_2042|>",
|
| 1924 |
+
"<|special_2043|>",
|
| 1925 |
+
"<|special_2044|>",
|
| 1926 |
+
"<|special_2045|>",
|
| 1927 |
+
"<|special_2046|>",
|
| 1928 |
+
"<|special_2047|>",
|
| 1929 |
+
"<|special_2048|>",
|
| 1930 |
+
"<|special_2049|>",
|
| 1931 |
+
"<|special_2050|>",
|
| 1932 |
+
"<|special_2051|>",
|
| 1933 |
+
"<|special_2052|>",
|
| 1934 |
+
"<|special_2053|>",
|
| 1935 |
+
"<|special_2054|>",
|
| 1936 |
+
"<|special_2055|>",
|
| 1937 |
+
"<|special_2056|>",
|
| 1938 |
+
"<|special_2057|>",
|
| 1939 |
+
"<|special_2058|>",
|
| 1940 |
+
"<|special_2059|>",
|
| 1941 |
+
"<|special_2060|>",
|
| 1942 |
+
"<|special_2061|>",
|
| 1943 |
+
"<|special_2062|>",
|
| 1944 |
+
"<|special_2063|>",
|
| 1945 |
+
"<|special_2064|>",
|
| 1946 |
+
"<|special_2065|>",
|
| 1947 |
+
"<|special_2066|>",
|
| 1948 |
+
"<|special_2067|>",
|
| 1949 |
+
"<|special_2068|>",
|
| 1950 |
+
"<|special_2069|>",
|
| 1951 |
+
"<|special_2070|>",
|
| 1952 |
+
"<|special_2071|>",
|
| 1953 |
+
"<|special_2072|>",
|
| 1954 |
+
"<|special_2073|>",
|
| 1955 |
+
"<|special_2074|>",
|
| 1956 |
+
"<|special_2075|>",
|
| 1957 |
+
"<|special_2076|>",
|
| 1958 |
+
"<|special_2077|>",
|
| 1959 |
+
"<|special_2078|>",
|
| 1960 |
+
"<|special_2079|>",
|
| 1961 |
+
"<|special_2080|>",
|
| 1962 |
+
"<|special_2081|>",
|
| 1963 |
+
"<|special_2082|>",
|
| 1964 |
+
"<|special_2083|>",
|
| 1965 |
+
"<|special_2084|>",
|
| 1966 |
+
"<|special_2085|>",
|
| 1967 |
+
"<|special_2086|>",
|
| 1968 |
+
"<|special_2087|>",
|
| 1969 |
+
"<|special_2088|>",
|
| 1970 |
+
"<|special_2089|>",
|
| 1971 |
+
"<|special_2090|>",
|
| 1972 |
+
"<|special_2091|>",
|
| 1973 |
+
"<|special_2092|>",
|
| 1974 |
+
"<|special_2093|>",
|
| 1975 |
+
"<|special_2094|>",
|
| 1976 |
+
"<|special_2095|>",
|
| 1977 |
+
"<|special_2096|>",
|
| 1978 |
+
"<|special_2097|>",
|
| 1979 |
+
"<|special_2098|>",
|
| 1980 |
+
"<|special_2099|>",
|
| 1981 |
+
"<|special_2100|>",
|
| 1982 |
+
"<|special_2101|>",
|
| 1983 |
+
"<|special_2102|>",
|
| 1984 |
+
"<|special_2103|>",
|
| 1985 |
+
"<|special_2104|>",
|
| 1986 |
+
"<|special_2105|>",
|
| 1987 |
+
"<|special_2106|>",
|
| 1988 |
+
"<|special_2107|>",
|
| 1989 |
+
"<|special_2108|>",
|
| 1990 |
+
"<|special_2109|>",
|
| 1991 |
+
"<|special_2110|>",
|
| 1992 |
+
"<|special_2111|>",
|
| 1993 |
+
"<|special_2112|>",
|
| 1994 |
+
"<|special_2113|>",
|
| 1995 |
+
"<|special_2114|>",
|
| 1996 |
+
"<|special_2115|>",
|
| 1997 |
+
"<|special_2116|>",
|
| 1998 |
+
"<|special_2117|>",
|
| 1999 |
+
"<|special_2118|>",
|
| 2000 |
+
"<|special_2119|>",
|
| 2001 |
+
"<|special_2120|>",
|
| 2002 |
+
"<|special_2121|>",
|
| 2003 |
+
"<|special_2122|>",
|
| 2004 |
+
"<|special_2123|>",
|
| 2005 |
+
"<|special_2124|>",
|
| 2006 |
+
"<|special_2125|>",
|
| 2007 |
+
"<|special_2126|>",
|
| 2008 |
+
"<|special_2127|>",
|
| 2009 |
+
"<|special_2128|>",
|
| 2010 |
+
"<|special_2129|>",
|
| 2011 |
+
"<|special_2130|>",
|
| 2012 |
+
"<|special_2131|>",
|
| 2013 |
+
"<|special_2132|>",
|
| 2014 |
+
"<|special_2133|>",
|
| 2015 |
+
"<|special_2134|>",
|
| 2016 |
+
"<|special_2135|>",
|
| 2017 |
+
"<|special_2136|>",
|
| 2018 |
+
"<|special_2137|>",
|
| 2019 |
+
"<|special_2138|>",
|
| 2020 |
+
"<|special_2139|>",
|
| 2021 |
+
"<|special_2140|>",
|
| 2022 |
+
"<|special_2141|>",
|
| 2023 |
+
"<|special_2142|>",
|
| 2024 |
+
"<|special_2143|>",
|
| 2025 |
+
"<|special_2144|>",
|
| 2026 |
+
"<|special_2145|>",
|
| 2027 |
+
"<|special_2146|>",
|
| 2028 |
+
"<|special_2147|>",
|
| 2029 |
+
"<|special_2148|>",
|
| 2030 |
+
"<|special_2149|>",
|
| 2031 |
+
"<|special_2150|>",
|
| 2032 |
+
"<|special_2151|>",
|
| 2033 |
+
"<|special_2152|>",
|
| 2034 |
+
"<|special_2153|>",
|
| 2035 |
+
"<|special_2154|>",
|
| 2036 |
+
"<|special_2155|>",
|
| 2037 |
+
"<|special_2156|>",
|
| 2038 |
+
"<|special_2157|>",
|
| 2039 |
+
"<|special_2158|>",
|
| 2040 |
+
"<|special_2159|>",
|
| 2041 |
+
"<|special_2160|>",
|
| 2042 |
+
"<|special_2161|>",
|
| 2043 |
+
"<|special_2162|>",
|
| 2044 |
+
"<|special_2163|>",
|
| 2045 |
+
"<|special_2164|>",
|
| 2046 |
+
"<|special_2165|>",
|
| 2047 |
+
"<|special_2166|>",
|
| 2048 |
+
"<|special_2167|>",
|
| 2049 |
+
"<|special_2168|>",
|
| 2050 |
+
"<|special_2169|>",
|
| 2051 |
+
"<|special_2170|>",
|
| 2052 |
+
"<|special_2171|>",
|
| 2053 |
+
"<|special_2172|>",
|
| 2054 |
+
"<|special_2173|>",
|
| 2055 |
+
"<|special_2174|>",
|
| 2056 |
+
"<|special_2175|>",
|
| 2057 |
+
"<|special_2176|>",
|
| 2058 |
+
"<|special_2177|>",
|
| 2059 |
+
"<|special_2178|>",
|
| 2060 |
+
"<|special_2179|>",
|
| 2061 |
+
"<|special_2180|>",
|
| 2062 |
+
"<|special_2181|>",
|
| 2063 |
+
"<|special_2182|>",
|
| 2064 |
+
"<|special_2183|>",
|
| 2065 |
+
"<|special_2184|>",
|
| 2066 |
+
"<|special_2185|>",
|
| 2067 |
+
"<|special_2186|>",
|
| 2068 |
+
"<|special_2187|>",
|
| 2069 |
+
"<|special_2188|>",
|
| 2070 |
+
"<|special_2189|>",
|
| 2071 |
+
"<|special_2190|>",
|
| 2072 |
+
"<|special_2191|>",
|
| 2073 |
+
"<|special_2192|>",
|
| 2074 |
+
"<|special_2193|>",
|
| 2075 |
+
"<|special_2194|>",
|
| 2076 |
+
"<|special_2195|>",
|
| 2077 |
+
"<|special_2196|>",
|
| 2078 |
+
"<|special_2197|>",
|
| 2079 |
+
"<|special_2198|>",
|
| 2080 |
+
"<|special_2199|>",
|
| 2081 |
+
"<|special_2200|>",
|
| 2082 |
+
"<|special_2201|>",
|
| 2083 |
+
"<|special_2202|>",
|
| 2084 |
+
"<|special_2203|>",
|
| 2085 |
+
"<|special_2204|>",
|
| 2086 |
+
"<|special_2205|>",
|
| 2087 |
+
"<|special_2206|>",
|
| 2088 |
+
"<|special_2207|>",
|
| 2089 |
+
"<|special_2208|>",
|
| 2090 |
+
"<|special_2209|>",
|
| 2091 |
+
"<|special_2210|>",
|
| 2092 |
+
"<|special_2211|>",
|
| 2093 |
+
"<|special_2212|>",
|
| 2094 |
+
"<|special_2213|>",
|
| 2095 |
+
"<|special_2214|>",
|
| 2096 |
+
"<|special_2215|>",
|
| 2097 |
+
"<|special_2216|>",
|
| 2098 |
+
"<|special_2217|>",
|
| 2099 |
+
"<|special_2218|>",
|
| 2100 |
+
"<|special_2219|>",
|
| 2101 |
+
"<|special_2220|>",
|
| 2102 |
+
"<|special_2221|>",
|
| 2103 |
+
"<|special_2222|>",
|
| 2104 |
+
"<|special_2223|>",
|
| 2105 |
+
"<|special_2224|>",
|
| 2106 |
+
"<|special_2225|>",
|
| 2107 |
+
"<|special_2226|>",
|
| 2108 |
+
"<|special_2227|>",
|
| 2109 |
+
"<|special_2228|>",
|
| 2110 |
+
"<|special_2229|>",
|
| 2111 |
+
"<|special_2230|>",
|
| 2112 |
+
"<|special_2231|>",
|
| 2113 |
+
"<|special_2232|>",
|
| 2114 |
+
"<|special_2233|>",
|
| 2115 |
+
"<|special_2234|>",
|
| 2116 |
+
"<|special_2235|>",
|
| 2117 |
+
"<|special_2236|>",
|
| 2118 |
+
"<|special_2237|>",
|
| 2119 |
+
"<|special_2238|>",
|
| 2120 |
+
"<|special_2239|>",
|
| 2121 |
+
"<|special_2240|>",
|
| 2122 |
+
"<|special_2241|>",
|
| 2123 |
+
"<|special_2242|>",
|
| 2124 |
+
"<|special_2243|>",
|
| 2125 |
+
"<|special_2244|>",
|
| 2126 |
+
"<|special_2245|>",
|
| 2127 |
+
"<|special_2246|>",
|
| 2128 |
+
"<|special_2247|>",
|
| 2129 |
+
"<|special_2248|>",
|
| 2130 |
+
"<|special_2249|>",
|
| 2131 |
+
"<|special_2250|>",
|
| 2132 |
+
"<|special_2251|>",
|
| 2133 |
+
"<|special_2252|>",
|
| 2134 |
+
"<|special_2253|>",
|
| 2135 |
+
"<|special_2254|>",
|
| 2136 |
+
"<|special_2255|>",
|
| 2137 |
+
"<|special_2256|>",
|
| 2138 |
+
"<|special_2257|>",
|
| 2139 |
+
"<|special_2258|>",
|
| 2140 |
+
"<|special_2259|>",
|
| 2141 |
+
"<|special_2260|>",
|
| 2142 |
+
"<|special_2261|>",
|
| 2143 |
+
"<|special_2262|>",
|
| 2144 |
+
"<|special_2263|>",
|
| 2145 |
+
"<|special_2264|>",
|
| 2146 |
+
"<|special_2265|>",
|
| 2147 |
+
"<|special_2266|>",
|
| 2148 |
+
"<|special_2267|>",
|
| 2149 |
+
"<|special_2268|>",
|
| 2150 |
+
"<|special_2269|>",
|
| 2151 |
+
"<|special_2270|>",
|
| 2152 |
+
"<|special_2271|>",
|
| 2153 |
+
"<|special_2272|>",
|
| 2154 |
+
"<|special_2273|>",
|
| 2155 |
+
"<|special_2274|>",
|
| 2156 |
+
"<|special_2275|>",
|
| 2157 |
+
"<|special_2276|>",
|
| 2158 |
+
"<|special_2277|>",
|
| 2159 |
+
"<|special_2278|>",
|
| 2160 |
+
"<|special_2279|>",
|
| 2161 |
+
"<|special_2280|>",
|
| 2162 |
+
"<|special_2281|>",
|
| 2163 |
+
"<|special_2282|>",
|
| 2164 |
+
"<|special_2283|>",
|
| 2165 |
+
"<|special_2284|>",
|
| 2166 |
+
"<|special_2285|>",
|
| 2167 |
+
"<|special_2286|>",
|
| 2168 |
+
"<|special_2287|>",
|
| 2169 |
+
"<|special_2288|>",
|
| 2170 |
+
"<|special_2289|>",
|
| 2171 |
+
"<|special_2290|>",
|
| 2172 |
+
"<|special_2291|>",
|
| 2173 |
+
"<|special_2292|>",
|
| 2174 |
+
"<|special_2293|>",
|
| 2175 |
+
"<|special_2294|>",
|
| 2176 |
+
"<|special_2295|>",
|
| 2177 |
+
"<|special_2296|>",
|
| 2178 |
+
"<|special_2297|>",
|
| 2179 |
+
"<|special_2298|>",
|
| 2180 |
+
"<|special_2299|>",
|
| 2181 |
+
"<|special_2300|>",
|
| 2182 |
+
"<|special_2301|>",
|
| 2183 |
+
"<|special_2302|>",
|
| 2184 |
+
"<|special_2303|>",
|
| 2185 |
+
"<|special_2304|>",
|
| 2186 |
+
"<|special_2305|>",
|
| 2187 |
+
"<|special_2306|>",
|
| 2188 |
+
"<|special_2307|>",
|
| 2189 |
+
"<|special_2308|>",
|
| 2190 |
+
"<|special_2309|>",
|
| 2191 |
+
"<|special_2310|>",
|
| 2192 |
+
"<|special_2311|>",
|
| 2193 |
+
"<|special_2312|>",
|
| 2194 |
+
"<|special_2313|>",
|
| 2195 |
+
"<|special_2314|>",
|
| 2196 |
+
"<|special_2315|>",
|
| 2197 |
+
"<|special_2316|>",
|
| 2198 |
+
"<|special_2317|>",
|
| 2199 |
+
"<|special_2318|>",
|
| 2200 |
+
"<|special_2319|>",
|
| 2201 |
+
"<|special_2320|>",
|
| 2202 |
+
"<|special_2321|>",
|
| 2203 |
+
"<|special_2322|>",
|
| 2204 |
+
"<|special_2323|>",
|
| 2205 |
+
"<|special_2324|>",
|
| 2206 |
+
"<|special_2325|>",
|
| 2207 |
+
"<|special_2326|>",
|
| 2208 |
+
"<|special_2327|>",
|
| 2209 |
+
"<|special_2328|>",
|
| 2210 |
+
"<|special_2329|>",
|
| 2211 |
+
"<|special_2330|>",
|
| 2212 |
+
"<|special_2331|>",
|
| 2213 |
+
"<|special_2332|>",
|
| 2214 |
+
"<|special_2333|>",
|
| 2215 |
+
"<|special_2334|>",
|
| 2216 |
+
"<|special_2335|>",
|
| 2217 |
+
"<|special_2336|>",
|
| 2218 |
+
"<|special_2337|>",
|
| 2219 |
+
"<|special_2338|>",
|
| 2220 |
+
"<|special_2339|>",
|
| 2221 |
+
"<|special_2340|>",
|
| 2222 |
+
"<|special_2341|>",
|
| 2223 |
+
"<|special_2342|>",
|
| 2224 |
+
"<|special_2343|>",
|
| 2225 |
+
"<|special_2344|>",
|
| 2226 |
+
"<|special_2345|>",
|
| 2227 |
+
"<|special_2346|>",
|
| 2228 |
+
"<|special_2347|>",
|
| 2229 |
+
"<|special_2348|>",
|
| 2230 |
+
"<|special_2349|>",
|
| 2231 |
+
"<|special_2350|>",
|
| 2232 |
+
"<|special_2351|>",
|
| 2233 |
+
"<|special_2352|>",
|
| 2234 |
+
"<|special_2353|>",
|
| 2235 |
+
"<|special_2354|>",
|
| 2236 |
+
"<|special_2355|>",
|
| 2237 |
+
"<|special_2356|>",
|
| 2238 |
+
"<|special_2357|>",
|
| 2239 |
+
"<|special_2358|>",
|
| 2240 |
+
"<|special_2359|>",
|
| 2241 |
+
"<|special_2360|>",
|
| 2242 |
+
"<|special_2361|>",
|
| 2243 |
+
"<|special_2362|>",
|
| 2244 |
+
"<|special_2363|>",
|
| 2245 |
+
"<|special_2364|>",
|
| 2246 |
+
"<|special_2365|>",
|
| 2247 |
+
"<|special_2366|>",
|
| 2248 |
+
"<|special_2367|>",
|
| 2249 |
+
"<|special_2368|>",
|
| 2250 |
+
"<|special_2369|>",
|
| 2251 |
+
"<|special_2370|>",
|
| 2252 |
+
"<|special_2371|>",
|
| 2253 |
+
"<|special_2372|>",
|
| 2254 |
+
"<|special_2373|>",
|
| 2255 |
+
"<|special_2374|>",
|
| 2256 |
+
"<|special_2375|>",
|
| 2257 |
+
"<|special_2376|>",
|
| 2258 |
+
"<|special_2377|>",
|
| 2259 |
+
"<|special_2378|>",
|
| 2260 |
+
"<|special_2379|>",
|
| 2261 |
+
"<|special_2380|>",
|
| 2262 |
+
"<|special_2381|>",
|
| 2263 |
+
"<|special_2382|>",
|
| 2264 |
+
"<|special_2383|>",
|
| 2265 |
+
"<|special_2384|>",
|
| 2266 |
+
"<|special_2385|>",
|
| 2267 |
+
"<|special_2386|>",
|
| 2268 |
+
"<|special_2387|>",
|
| 2269 |
+
"<|special_2388|>",
|
| 2270 |
+
"<|special_2389|>",
|
| 2271 |
+
"<|special_2390|>",
|
| 2272 |
+
"<|special_2391|>",
|
| 2273 |
+
"<|special_2392|>",
|
| 2274 |
+
"<|special_2393|>",
|
| 2275 |
+
"<|special_2394|>",
|
| 2276 |
+
"<|special_2395|>",
|
| 2277 |
+
"<|special_2396|>",
|
| 2278 |
+
"<|special_2397|>",
|
| 2279 |
+
"<|special_2398|>",
|
| 2280 |
+
"<|special_2399|>",
|
| 2281 |
+
"<|special_2400|>",
|
| 2282 |
+
"<|special_2401|>",
|
| 2283 |
+
"<|special_2402|>",
|
| 2284 |
+
"<|special_2403|>",
|
| 2285 |
+
"<|special_2404|>",
|
| 2286 |
+
"<|special_2405|>",
|
| 2287 |
+
"<|special_2406|>",
|
| 2288 |
+
"<|special_2407|>",
|
| 2289 |
+
"<|special_2408|>",
|
| 2290 |
+
"<|special_2409|>",
|
| 2291 |
+
"<|special_2410|>",
|
| 2292 |
+
"<|special_2411|>",
|
| 2293 |
+
"<|special_2412|>",
|
| 2294 |
+
"<|special_2413|>",
|
| 2295 |
+
"<|special_2414|>",
|
| 2296 |
+
"<|special_2415|>",
|
| 2297 |
+
"<|special_2416|>",
|
| 2298 |
+
"<|special_2417|>",
|
| 2299 |
+
"<|special_2418|>",
|
| 2300 |
+
"<|special_2419|>",
|
| 2301 |
+
"<|special_2420|>",
|
| 2302 |
+
"<|special_2421|>",
|
| 2303 |
+
"<|special_2422|>",
|
| 2304 |
+
"<|special_2423|>",
|
| 2305 |
+
"<|special_2424|>",
|
| 2306 |
+
"<|special_2425|>",
|
| 2307 |
+
"<|special_2426|>",
|
| 2308 |
+
"<|special_2427|>",
|
| 2309 |
+
"<|special_2428|>",
|
| 2310 |
+
"<|special_2429|>",
|
| 2311 |
+
"<|special_2430|>",
|
| 2312 |
+
"<|special_2431|>",
|
| 2313 |
+
"<|special_2432|>",
|
| 2314 |
+
"<|special_2433|>",
|
| 2315 |
+
"<|special_2434|>",
|
| 2316 |
+
"<|special_2435|>",
|
| 2317 |
+
"<|special_2436|>",
|
| 2318 |
+
"<|special_2437|>",
|
| 2319 |
+
"<|special_2438|>",
|
| 2320 |
+
"<|special_2439|>",
|
| 2321 |
+
"<|special_2440|>",
|
| 2322 |
+
"<|special_2441|>",
|
| 2323 |
+
"<|special_2442|>",
|
| 2324 |
+
"<|special_2443|>",
|
| 2325 |
+
"<|special_2444|>",
|
| 2326 |
+
"<|special_2445|>",
|
| 2327 |
+
"<|special_2446|>",
|
| 2328 |
+
"<|special_2447|>",
|
| 2329 |
+
"<|special_2448|>",
|
| 2330 |
+
"<|special_2449|>",
|
| 2331 |
+
"<|special_2450|>",
|
| 2332 |
+
"<|special_2451|>",
|
| 2333 |
+
"<|special_2452|>",
|
| 2334 |
+
"<|special_2453|>",
|
| 2335 |
+
"<|special_2454|>",
|
| 2336 |
+
"<|special_2455|>",
|
| 2337 |
+
"<|special_2456|>",
|
| 2338 |
+
"<|special_2457|>",
|
| 2339 |
+
"<|special_2458|>",
|
| 2340 |
+
"<|special_2459|>",
|
| 2341 |
+
"<|special_2460|>",
|
| 2342 |
+
"<|special_2461|>",
|
| 2343 |
+
"<|special_2462|>",
|
| 2344 |
+
"<|special_2463|>",
|
| 2345 |
+
"<|special_2464|>",
|
| 2346 |
+
"<|special_2465|>",
|
| 2347 |
+
"<|special_2466|>",
|
| 2348 |
+
"<|special_2467|>",
|
| 2349 |
+
"<|special_2468|>",
|
| 2350 |
+
"<|special_2469|>",
|
| 2351 |
+
"<|special_2470|>",
|
| 2352 |
+
"<|special_2471|>",
|
| 2353 |
+
"<|special_2472|>",
|
| 2354 |
+
"<|special_2473|>",
|
| 2355 |
+
"<|special_2474|>",
|
| 2356 |
+
"<|special_2475|>",
|
| 2357 |
+
"<|special_2476|>",
|
| 2358 |
+
"<|special_2477|>",
|
| 2359 |
+
"<|special_2478|>",
|
| 2360 |
+
"<|special_2479|>",
|
| 2361 |
+
"<|special_2480|>",
|
| 2362 |
+
"<|special_2481|>",
|
| 2363 |
+
"<|special_2482|>",
|
| 2364 |
+
"<|special_2483|>",
|
| 2365 |
+
"<|special_2484|>",
|
| 2366 |
+
"<|special_2485|>",
|
| 2367 |
+
"<|special_2486|>",
|
| 2368 |
+
"<|special_2487|>",
|
| 2369 |
+
"<|special_2488|>",
|
| 2370 |
+
"<|special_2489|>",
|
| 2371 |
+
"<|special_2490|>",
|
| 2372 |
+
"<|special_2491|>",
|
| 2373 |
+
"<|special_2492|>",
|
| 2374 |
+
"<|special_2493|>",
|
| 2375 |
+
"<|special_2494|>",
|
| 2376 |
+
"<|special_2495|>",
|
| 2377 |
+
"<|special_2496|>",
|
| 2378 |
+
"<|special_2497|>",
|
| 2379 |
+
"<|special_2498|>",
|
| 2380 |
+
"<|special_2499|>",
|
| 2381 |
+
"<|special_2500|>",
|
| 2382 |
+
"<|special_2501|>",
|
| 2383 |
+
"<|special_2502|>",
|
| 2384 |
+
"<|special_2503|>",
|
| 2385 |
+
"<|special_2504|>",
|
| 2386 |
+
"<|special_2505|>",
|
| 2387 |
+
"<|special_2506|>",
|
| 2388 |
+
"<|special_2507|>",
|
| 2389 |
+
"<|special_2508|>",
|
| 2390 |
+
"<|special_2509|>",
|
| 2391 |
+
"<|special_2510|>",
|
| 2392 |
+
"<|special_2511|>",
|
| 2393 |
+
"<|special_2512|>",
|
| 2394 |
+
"<|special_2513|>",
|
| 2395 |
+
"<|special_2514|>",
|
| 2396 |
+
"<|special_2515|>",
|
| 2397 |
+
"<|special_2516|>",
|
| 2398 |
+
"<|special_2517|>",
|
| 2399 |
+
"<|special_2518|>",
|
| 2400 |
+
"<|special_2519|>",
|
| 2401 |
+
"<|special_2520|>",
|
| 2402 |
+
"<|special_2521|>",
|
| 2403 |
+
"<|special_2522|>",
|
| 2404 |
+
"<|special_2523|>",
|
| 2405 |
+
"<|special_2524|>",
|
| 2406 |
+
"<|special_2525|>",
|
| 2407 |
+
"<|special_2526|>",
|
| 2408 |
+
"<|special_2527|>",
|
| 2409 |
+
"<|special_2528|>",
|
| 2410 |
+
"<|special_2529|>",
|
| 2411 |
+
"<|special_2530|>",
|
| 2412 |
+
"<|special_2531|>",
|
| 2413 |
+
"<|special_2532|>",
|
| 2414 |
+
"<|special_2533|>",
|
| 2415 |
+
"<|special_2534|>",
|
| 2416 |
+
"<|special_2535|>",
|
| 2417 |
+
"<|special_2536|>",
|
| 2418 |
+
"<|special_2537|>",
|
| 2419 |
+
"<|special_2538|>",
|
| 2420 |
+
"<|special_2539|>",
|
| 2421 |
+
"<|special_2540|>",
|
| 2422 |
+
"<|special_2541|>",
|
| 2423 |
+
"<|special_2542|>",
|
| 2424 |
+
"<|special_2543|>",
|
| 2425 |
+
"<|special_2544|>",
|
| 2426 |
+
"<|special_2545|>",
|
| 2427 |
+
"<|special_2546|>",
|
| 2428 |
+
"<|special_2547|>",
|
| 2429 |
+
"<|special_2548|>",
|
| 2430 |
+
"<|special_2549|>",
|
| 2431 |
+
"<|special_2550|>",
|
| 2432 |
+
"<|special_2551|>",
|
| 2433 |
+
"<|special_2552|>",
|
| 2434 |
+
"<|special_2553|>",
|
| 2435 |
+
"<|special_2554|>",
|
| 2436 |
+
"<|special_2555|>",
|
| 2437 |
+
"<|special_2556|>",
|
| 2438 |
+
"<|special_2557|>",
|
| 2439 |
+
"<|special_2558|>",
|
| 2440 |
+
"<|special_2559|>",
|
| 2441 |
+
"<|special_2560|>",
|
| 2442 |
+
"<|special_2561|>",
|
| 2443 |
+
"<|special_2562|>",
|
| 2444 |
+
"<|special_2563|>",
|
| 2445 |
+
"<|special_2564|>",
|
| 2446 |
+
"<|special_2565|>",
|
| 2447 |
+
"<|special_2566|>",
|
| 2448 |
+
"<|special_2567|>",
|
| 2449 |
+
"<|special_2568|>",
|
| 2450 |
+
"<|special_2569|>",
|
| 2451 |
+
"<|special_2570|>",
|
| 2452 |
+
"<|special_2571|>",
|
| 2453 |
+
"<|special_2572|>",
|
| 2454 |
+
"<|special_2573|>",
|
| 2455 |
+
"<|special_2574|>",
|
| 2456 |
+
"<|special_2575|>",
|
| 2457 |
+
"<|special_2576|>",
|
| 2458 |
+
"<|special_2577|>",
|
| 2459 |
+
"<|special_2578|>",
|
| 2460 |
+
"<|special_2579|>",
|
| 2461 |
+
"<|special_2580|>",
|
| 2462 |
+
"<|special_2581|>",
|
| 2463 |
+
"<|special_2582|>",
|
| 2464 |
+
"<|special_2583|>",
|
| 2465 |
+
"<|special_2584|>",
|
| 2466 |
+
"<|special_2585|>",
|
| 2467 |
+
"<|special_2586|>",
|
| 2468 |
+
"<|special_2587|>",
|
| 2469 |
+
"<|special_2588|>",
|
| 2470 |
+
"<|special_2589|>",
|
| 2471 |
+
"<|special_2590|>",
|
| 2472 |
+
"<|special_2591|>",
|
| 2473 |
+
"<|special_2592|>",
|
| 2474 |
+
"<|special_2593|>",
|
| 2475 |
+
"<|special_2594|>",
|
| 2476 |
+
"<|special_2595|>",
|
| 2477 |
+
"<|special_2596|>",
|
| 2478 |
+
"<|special_2597|>",
|
| 2479 |
+
"<|special_2598|>",
|
| 2480 |
+
"<|special_2599|>",
|
| 2481 |
+
"<|special_2600|>",
|
| 2482 |
+
"<|special_2601|>",
|
| 2483 |
+
"<|special_2602|>",
|
| 2484 |
+
"<|special_2603|>",
|
| 2485 |
+
"<|special_2604|>",
|
| 2486 |
+
"<|special_2605|>",
|
| 2487 |
+
"<|special_2606|>",
|
| 2488 |
+
"<|special_2607|>",
|
| 2489 |
+
"<|special_2608|>",
|
| 2490 |
+
"<|special_2609|>",
|
| 2491 |
+
"<|special_2610|>",
|
| 2492 |
+
"<|special_2611|>",
|
| 2493 |
+
"<|special_2612|>",
|
| 2494 |
+
"<|special_2613|>",
|
| 2495 |
+
"<|special_2614|>",
|
| 2496 |
+
"<|special_2615|>",
|
| 2497 |
+
"<|special_2616|>",
|
| 2498 |
+
"<|special_2617|>",
|
| 2499 |
+
"<|special_2618|>",
|
| 2500 |
+
"<|special_2619|>",
|
| 2501 |
+
"<|special_2620|>",
|
| 2502 |
+
"<|special_2621|>",
|
| 2503 |
+
"<|special_2622|>",
|
| 2504 |
+
"<|special_2623|>",
|
| 2505 |
+
"<|special_2624|>",
|
| 2506 |
+
"<|special_2625|>",
|
| 2507 |
+
"<|special_2626|>",
|
| 2508 |
+
"<|special_2627|>",
|
| 2509 |
+
"<|special_2628|>",
|
| 2510 |
+
"<|special_2629|>",
|
| 2511 |
+
"<|special_2630|>",
|
| 2512 |
+
"<|special_2631|>",
|
| 2513 |
+
"<|special_2632|>",
|
| 2514 |
+
"<|special_2633|>",
|
| 2515 |
+
"<|special_2634|>",
|
| 2516 |
+
"<|special_2635|>",
|
| 2517 |
+
"<|special_2636|>",
|
| 2518 |
+
"<|special_2637|>",
|
| 2519 |
+
"<|special_2638|>",
|
| 2520 |
+
"<|special_2639|>",
|
| 2521 |
+
"<|special_2640|>",
|
| 2522 |
+
"<|special_2641|>",
|
| 2523 |
+
"<|special_2642|>",
|
| 2524 |
+
"<|special_2643|>",
|
| 2525 |
+
"<|special_2644|>",
|
| 2526 |
+
"<|special_2645|>",
|
| 2527 |
+
"<|special_2646|>",
|
| 2528 |
+
"<|special_2647|>",
|
| 2529 |
+
"<|special_2648|>",
|
| 2530 |
+
"<|special_2649|>",
|
| 2531 |
+
"<|special_2650|>",
|
| 2532 |
+
"<|special_2651|>",
|
| 2533 |
+
"<|special_2652|>",
|
| 2534 |
+
"<|special_2653|>",
|
| 2535 |
+
"<|special_2654|>",
|
| 2536 |
+
"<|special_2655|>",
|
| 2537 |
+
"<|special_2656|>",
|
| 2538 |
+
"<|special_2657|>",
|
| 2539 |
+
"<|special_2658|>",
|
| 2540 |
+
"<|special_2659|>",
|
| 2541 |
+
"<|special_2660|>",
|
| 2542 |
+
"<|special_2661|>",
|
| 2543 |
+
"<|special_2662|>",
|
| 2544 |
+
"<|special_2663|>",
|
| 2545 |
+
"<|special_2664|>",
|
| 2546 |
+
"<|special_2665|>",
|
| 2547 |
+
"<|special_2666|>",
|
| 2548 |
+
"<|special_2667|>",
|
| 2549 |
+
"<|special_2668|>",
|
| 2550 |
+
"<|special_2669|>",
|
| 2551 |
+
"<|special_2670|>",
|
| 2552 |
+
"<|special_2671|>",
|
| 2553 |
+
"<|special_2672|>",
|
| 2554 |
+
"<|special_2673|>",
|
| 2555 |
+
"<|special_2674|>",
|
| 2556 |
+
"<|special_2675|>",
|
| 2557 |
+
"<|special_2676|>",
|
| 2558 |
+
"<|special_2677|>",
|
| 2559 |
+
"<|special_2678|>",
|
| 2560 |
+
"<|special_2679|>",
|
| 2561 |
+
"<|special_2680|>",
|
| 2562 |
+
"<|special_2681|>",
|
| 2563 |
+
"<|special_2682|>",
|
| 2564 |
+
"<|special_2683|>",
|
| 2565 |
+
"<|special_2684|>",
|
| 2566 |
+
"<|special_2685|>",
|
| 2567 |
+
"<|special_2686|>",
|
| 2568 |
+
"<|special_2687|>",
|
| 2569 |
+
"<|special_2688|>",
|
| 2570 |
+
"<|special_2689|>",
|
| 2571 |
+
"<|special_2690|>",
|
| 2572 |
+
"<|special_2691|>",
|
| 2573 |
+
"<|special_2692|>",
|
| 2574 |
+
"<|special_2693|>",
|
| 2575 |
+
"<|special_2694|>",
|
| 2576 |
+
"<|special_2695|>",
|
| 2577 |
+
"<|special_2696|>",
|
| 2578 |
+
"<|special_2697|>",
|
| 2579 |
+
"<|special_2698|>",
|
| 2580 |
+
"<|special_2699|>",
|
| 2581 |
+
"<|special_2700|>",
|
| 2582 |
+
"<|special_2701|>",
|
| 2583 |
+
"<|special_2702|>",
|
| 2584 |
+
"<|special_2703|>",
|
| 2585 |
+
"<|special_2704|>",
|
| 2586 |
+
"<|special_2705|>",
|
| 2587 |
+
"<|special_2706|>",
|
| 2588 |
+
"<|special_2707|>",
|
| 2589 |
+
"<|special_2708|>",
|
| 2590 |
+
"<|special_2709|>",
|
| 2591 |
+
"<|special_2710|>",
|
| 2592 |
+
"<|special_2711|>",
|
| 2593 |
+
"<|special_2712|>",
|
| 2594 |
+
"<|special_2713|>",
|
| 2595 |
+
"<|special_2714|>",
|
| 2596 |
+
"<|special_2715|>",
|
| 2597 |
+
"<|special_2716|>",
|
| 2598 |
+
"<|special_2717|>",
|
| 2599 |
+
"<|special_2718|>",
|
| 2600 |
+
"<|special_2719|>",
|
| 2601 |
+
"<|special_2720|>",
|
| 2602 |
+
"<|special_2721|>",
|
| 2603 |
+
"<|special_2722|>",
|
| 2604 |
+
"<|special_2723|>",
|
| 2605 |
+
"<|special_2724|>",
|
| 2606 |
+
"<|special_2725|>",
|
| 2607 |
+
"<|special_2726|>",
|
| 2608 |
+
"<|special_2727|>",
|
| 2609 |
+
"<|special_2728|>",
|
| 2610 |
+
"<|special_2729|>",
|
| 2611 |
+
"<|special_2730|>",
|
| 2612 |
+
"<|special_2731|>",
|
| 2613 |
+
"<|special_2732|>",
|
| 2614 |
+
"<|special_2733|>",
|
| 2615 |
+
"<|special_2734|>",
|
| 2616 |
+
"<|special_2735|>",
|
| 2617 |
+
"<|special_2736|>",
|
| 2618 |
+
"<|special_2737|>",
|
| 2619 |
+
"<|special_2738|>",
|
| 2620 |
+
"<|special_2739|>",
|
| 2621 |
+
"<|special_2740|>",
|
| 2622 |
+
"<|special_2741|>",
|
| 2623 |
+
"<|special_2742|>",
|
| 2624 |
+
"<|special_2743|>",
|
| 2625 |
+
"<|special_2744|>",
|
| 2626 |
+
"<|special_2745|>",
|
| 2627 |
+
"<|special_2746|>",
|
| 2628 |
+
"<|special_2747|>",
|
| 2629 |
+
"<|special_2748|>",
|
| 2630 |
+
"<|special_2749|>",
|
| 2631 |
+
"<|special_2750|>",
|
| 2632 |
+
"<|special_2751|>",
|
| 2633 |
+
"<|special_2752|>",
|
| 2634 |
+
"<|special_2753|>",
|
| 2635 |
+
"<|special_2754|>",
|
| 2636 |
+
"<|special_2755|>",
|
| 2637 |
+
"<|special_2756|>",
|
| 2638 |
+
"<|special_2757|>",
|
| 2639 |
+
"<|special_2758|>",
|
| 2640 |
+
"<|special_2759|>",
|
| 2641 |
+
"<|special_2760|>",
|
| 2642 |
+
"<|special_2761|>",
|
| 2643 |
+
"<|special_2762|>",
|
| 2644 |
+
"<|special_2763|>",
|
| 2645 |
+
"<|special_2764|>",
|
| 2646 |
+
"<|special_2765|>",
|
| 2647 |
+
"<|special_2766|>",
|
| 2648 |
+
"<|special_2767|>",
|
| 2649 |
+
"<|special_2768|>",
|
| 2650 |
+
"<|special_2769|>",
|
| 2651 |
+
"<|special_2770|>",
|
| 2652 |
+
"<|special_2771|>",
|
| 2653 |
+
"<|special_2772|>",
|
| 2654 |
+
"<|special_2773|>",
|
| 2655 |
+
"<|special_2774|>",
|
| 2656 |
+
"<|special_2775|>",
|
| 2657 |
+
"<|special_2776|>",
|
| 2658 |
+
"<|special_2777|>",
|
| 2659 |
+
"<|special_2778|>",
|
| 2660 |
+
"<|special_2779|>",
|
| 2661 |
+
"<|special_2780|>",
|
| 2662 |
+
"<|special_2781|>",
|
| 2663 |
+
"<|special_2782|>",
|
| 2664 |
+
"<|special_2783|>",
|
| 2665 |
+
"<|special_2784|>",
|
| 2666 |
+
"<|special_2785|>",
|
| 2667 |
+
"<|special_2786|>",
|
| 2668 |
+
"<|special_2787|>",
|
| 2669 |
+
"<|special_2788|>",
|
| 2670 |
+
"<|special_2789|>",
|
| 2671 |
+
"<|special_2790|>",
|
| 2672 |
+
"<|special_2791|>",
|
| 2673 |
+
"<|special_2792|>",
|
| 2674 |
+
"<|special_2793|>",
|
| 2675 |
+
"<|special_2794|>",
|
| 2676 |
+
"<|special_2795|>",
|
| 2677 |
+
"<|special_2796|>",
|
| 2678 |
+
"<|special_2797|>",
|
| 2679 |
+
"<|special_2798|>",
|
| 2680 |
+
"<|special_2799|>",
|
| 2681 |
+
"<|special_2800|>",
|
| 2682 |
+
"<|special_2801|>",
|
| 2683 |
+
"<|special_2802|>",
|
| 2684 |
+
"<|special_2803|>",
|
| 2685 |
+
"<|special_2804|>",
|
| 2686 |
+
"<|special_2805|>",
|
| 2687 |
+
"<|special_2806|>",
|
| 2688 |
+
"<|special_2807|>",
|
| 2689 |
+
"<|special_2808|>",
|
| 2690 |
+
"<|special_2809|>",
|
| 2691 |
+
"<|special_2810|>",
|
| 2692 |
+
"<|special_2811|>",
|
| 2693 |
+
"<|special_2812|>",
|
| 2694 |
+
"<|special_2813|>",
|
| 2695 |
+
"<|special_2814|>",
|
| 2696 |
+
"<|special_2815|>",
|
| 2697 |
+
"<|special_2816|>",
|
| 2698 |
+
"<|special_2817|>",
|
| 2699 |
+
"<|special_2818|>",
|
| 2700 |
+
"<|special_2819|>",
|
| 2701 |
+
"<|special_2820|>",
|
| 2702 |
+
"<|special_2821|>",
|
| 2703 |
+
"<|special_2822|>",
|
| 2704 |
+
"<|special_2823|>",
|
| 2705 |
+
"<|special_2824|>",
|
| 2706 |
+
"<|special_2825|>",
|
| 2707 |
+
"<|special_2826|>",
|
| 2708 |
+
"<|special_2827|>",
|
| 2709 |
+
"<|special_2828|>",
|
| 2710 |
+
"<|special_2829|>",
|
| 2711 |
+
"<|special_2830|>",
|
| 2712 |
+
"<|special_2831|>",
|
| 2713 |
+
"<|special_2832|>",
|
| 2714 |
+
"<|special_2833|>",
|
| 2715 |
+
"<|special_2834|>",
|
| 2716 |
+
"<|special_2835|>",
|
| 2717 |
+
"<|special_2836|>",
|
| 2718 |
+
"<|special_2837|>",
|
| 2719 |
+
"<|special_2838|>",
|
| 2720 |
+
"<|special_2839|>",
|
| 2721 |
+
"<|special_2840|>",
|
| 2722 |
+
"<|special_2841|>",
|
| 2723 |
+
"<|special_2842|>",
|
| 2724 |
+
"<|special_2843|>",
|
| 2725 |
+
"<|special_2844|>",
|
| 2726 |
+
"<|special_2845|>",
|
| 2727 |
+
"<|special_2846|>",
|
| 2728 |
+
"<|special_2847|>",
|
| 2729 |
+
"<|special_2848|>",
|
| 2730 |
+
"<|special_2849|>",
|
| 2731 |
+
"<|special_2850|>",
|
| 2732 |
+
"<|special_2851|>",
|
| 2733 |
+
"<|special_2852|>",
|
| 2734 |
+
"<|special_2853|>",
|
| 2735 |
+
"<|special_2854|>",
|
| 2736 |
+
"<|special_2855|>",
|
| 2737 |
+
"<|special_2856|>",
|
| 2738 |
+
"<|special_2857|>",
|
| 2739 |
+
"<|special_2858|>",
|
| 2740 |
+
"<|special_2859|>",
|
| 2741 |
+
"<|special_2860|>",
|
| 2742 |
+
"<|special_2861|>",
|
| 2743 |
+
"<|special_2862|>",
|
| 2744 |
+
"<|special_2863|>",
|
| 2745 |
+
"<|special_2864|>",
|
| 2746 |
+
"<|special_2865|>",
|
| 2747 |
+
"<|special_2866|>",
|
| 2748 |
+
"<|special_2867|>",
|
| 2749 |
+
"<|special_2868|>",
|
| 2750 |
+
"<|special_2869|>",
|
| 2751 |
+
"<|special_2870|>",
|
| 2752 |
+
"<|special_2871|>",
|
| 2753 |
+
"<|special_2872|>",
|
| 2754 |
+
"<|special_2873|>",
|
| 2755 |
+
"<|special_2874|>",
|
| 2756 |
+
"<|special_2875|>",
|
| 2757 |
+
"<|special_2876|>",
|
| 2758 |
+
"<|special_2877|>",
|
| 2759 |
+
"<|special_2878|>",
|
| 2760 |
+
"<|special_2879|>",
|
| 2761 |
+
"<|special_2880|>",
|
| 2762 |
+
"<|special_2881|>",
|
| 2763 |
+
"<|special_2882|>",
|
| 2764 |
+
"<|special_2883|>",
|
| 2765 |
+
"<|special_2884|>",
|
| 2766 |
+
"<|special_2885|>",
|
| 2767 |
+
"<|special_2886|>",
|
| 2768 |
+
"<|special_2887|>",
|
| 2769 |
+
"<|special_2888|>",
|
| 2770 |
+
"<|special_2889|>",
|
| 2771 |
+
"<|special_2890|>",
|
| 2772 |
+
"<|special_2891|>",
|
| 2773 |
+
"<|special_2892|>",
|
| 2774 |
+
"<|special_2893|>",
|
| 2775 |
+
"<|special_2894|>",
|
| 2776 |
+
"<|special_2895|>",
|
| 2777 |
+
"<|special_2896|>",
|
| 2778 |
+
"<|special_2897|>",
|
| 2779 |
+
"<|special_2898|>",
|
| 2780 |
+
"<|special_2899|>",
|
| 2781 |
+
"<|special_2900|>",
|
| 2782 |
+
"<|special_2901|>",
|
| 2783 |
+
"<|special_2902|>",
|
| 2784 |
+
"<|special_2903|>",
|
| 2785 |
+
"<|special_2904|>",
|
| 2786 |
+
"<|special_2905|>",
|
| 2787 |
+
"<|special_2906|>",
|
| 2788 |
+
"<|special_2907|>",
|
| 2789 |
+
"<|special_2908|>",
|
| 2790 |
+
"<|special_2909|>",
|
| 2791 |
+
"<|special_2910|>",
|
| 2792 |
+
"<|special_2911|>",
|
| 2793 |
+
"<|special_2912|>",
|
| 2794 |
+
"<|special_2913|>",
|
| 2795 |
+
"<|special_2914|>",
|
| 2796 |
+
"<|special_2915|>",
|
| 2797 |
+
"<|special_2916|>",
|
| 2798 |
+
"<|special_2917|>",
|
| 2799 |
+
"<|special_2918|>",
|
| 2800 |
+
"<|special_2919|>",
|
| 2801 |
+
"<|special_2920|>",
|
| 2802 |
+
"<|special_2921|>",
|
| 2803 |
+
"<|special_2922|>",
|
| 2804 |
+
"<|special_2923|>",
|
| 2805 |
+
"<|special_2924|>",
|
| 2806 |
+
"<|special_2925|>",
|
| 2807 |
+
"<|special_2926|>",
|
| 2808 |
+
"<|special_2927|>",
|
| 2809 |
+
"<|special_2928|>",
|
| 2810 |
+
"<|special_2929|>",
|
| 2811 |
+
"<|special_2930|>",
|
| 2812 |
+
"<|special_2931|>",
|
| 2813 |
+
"<|special_2932|>",
|
| 2814 |
+
"<|special_2933|>",
|
| 2815 |
+
"<|special_2934|>",
|
| 2816 |
+
"<|special_2935|>",
|
| 2817 |
+
"<|special_2936|>",
|
| 2818 |
+
"<|special_2937|>",
|
| 2819 |
+
"<|special_2938|>",
|
| 2820 |
+
"<|special_2939|>",
|
| 2821 |
+
"<|special_2940|>",
|
| 2822 |
+
"<|special_2941|>",
|
| 2823 |
+
"<|special_2942|>",
|
| 2824 |
+
"<|special_2943|>",
|
| 2825 |
+
"<|special_2944|>",
|
| 2826 |
+
"<|special_2945|>",
|
| 2827 |
+
"<|special_2946|>",
|
| 2828 |
+
"<|special_2947|>",
|
| 2829 |
+
"<|special_2948|>",
|
| 2830 |
+
"<|special_2949|>",
|
| 2831 |
+
"<|special_2950|>",
|
| 2832 |
+
"<|special_2951|>",
|
| 2833 |
+
"<|special_2952|>",
|
| 2834 |
+
"<|special_2953|>",
|
| 2835 |
+
"<|special_2954|>",
|
| 2836 |
+
"<|special_2955|>",
|
| 2837 |
+
"<|special_2956|>",
|
| 2838 |
+
"<|special_2957|>",
|
| 2839 |
+
"<|special_2958|>",
|
| 2840 |
+
"<|special_2959|>",
|
| 2841 |
+
"<|special_2960|>",
|
| 2842 |
+
"<|special_2961|>",
|
| 2843 |
+
"<|special_2962|>",
|
| 2844 |
+
"<|special_2963|>",
|
| 2845 |
+
"<|special_2964|>",
|
| 2846 |
+
"<|special_2965|>",
|
| 2847 |
+
"<|special_2966|>",
|
| 2848 |
+
"<|special_2967|>",
|
| 2849 |
+
"<|special_2968|>",
|
| 2850 |
+
"<|special_2969|>",
|
| 2851 |
+
"<|special_2970|>",
|
| 2852 |
+
"<|special_2971|>",
|
| 2853 |
+
"<|special_2972|>",
|
| 2854 |
+
"<|special_2973|>",
|
| 2855 |
+
"<|special_2974|>",
|
| 2856 |
+
"<|special_2975|>",
|
| 2857 |
+
"<|special_2976|>",
|
| 2858 |
+
"<|special_2977|>",
|
| 2859 |
+
"<|special_2978|>",
|
| 2860 |
+
"<|special_2979|>",
|
| 2861 |
+
"<|special_2980|>",
|
| 2862 |
+
"<|special_2981|>",
|
| 2863 |
+
"<|special_2982|>",
|
| 2864 |
+
"<|special_2983|>",
|
| 2865 |
+
"<|special_2984|>",
|
| 2866 |
+
"<|special_2985|>",
|
| 2867 |
+
"<|special_2986|>",
|
| 2868 |
+
"<|special_2987|>",
|
| 2869 |
+
"<|special_2988|>",
|
| 2870 |
+
"<|special_2989|>",
|
| 2871 |
+
"<|special_2990|>",
|
| 2872 |
+
"<|special_2991|>",
|
| 2873 |
+
"<|special_2992|>",
|
| 2874 |
+
"<|special_2993|>",
|
| 2875 |
+
"<|special_2994|>",
|
| 2876 |
+
"<|special_2995|>",
|
| 2877 |
+
"<|special_2996|>",
|
| 2878 |
+
"<|special_2997|>",
|
| 2879 |
+
"<|special_2998|>",
|
| 2880 |
+
"<|special_2999|>",
|
| 2881 |
+
"<|special_3000|>",
|
| 2882 |
+
"<|special_3001|>",
|
| 2883 |
+
"<|special_3002|>",
|
| 2884 |
+
"<|special_3003|>",
|
| 2885 |
+
"<|special_3004|>",
|
| 2886 |
+
"<|special_3005|>",
|
| 2887 |
+
"<|special_3006|>",
|
| 2888 |
+
"<|special_3007|>",
|
| 2889 |
+
"<|special_3008|>",
|
| 2890 |
+
"<|special_3009|>",
|
| 2891 |
+
"<|special_3010|>",
|
| 2892 |
+
"<|special_3011|>",
|
| 2893 |
+
"<|special_3012|>",
|
| 2894 |
+
"<|special_3013|>",
|
| 2895 |
+
"<|special_3014|>",
|
| 2896 |
+
"<|special_3015|>",
|
| 2897 |
+
"<|special_3016|>",
|
| 2898 |
+
"<|special_3017|>",
|
| 2899 |
+
"<|special_3018|>",
|
| 2900 |
+
"<|special_3019|>",
|
| 2901 |
+
"<|special_3020|>",
|
| 2902 |
+
"<|special_3021|>",
|
| 2903 |
+
"<|special_3022|>",
|
| 2904 |
+
"<|special_3023|>",
|
| 2905 |
+
"<|special_3024|>",
|
| 2906 |
+
"<|special_3025|>",
|
| 2907 |
+
"<|special_3026|>",
|
| 2908 |
+
"<|special_3027|>",
|
| 2909 |
+
"<|special_3028|>",
|
| 2910 |
+
"<|special_3029|>",
|
| 2911 |
+
"<|special_3030|>",
|
| 2912 |
+
"<|special_3031|>",
|
| 2913 |
+
"<|special_3032|>",
|
| 2914 |
+
"<|special_3033|>",
|
| 2915 |
+
"<|special_3034|>",
|
| 2916 |
+
"<|special_3035|>",
|
| 2917 |
+
"<|special_3036|>",
|
| 2918 |
+
"<|special_3037|>",
|
| 2919 |
+
"<|special_3038|>",
|
| 2920 |
+
"<|special_3039|>",
|
| 2921 |
+
"<|special_3040|>",
|
| 2922 |
+
"<|special_3041|>",
|
| 2923 |
+
"<|special_3042|>",
|
| 2924 |
+
"<|special_3043|>",
|
| 2925 |
+
"<|special_3044|>",
|
| 2926 |
+
"<|special_3045|>",
|
| 2927 |
+
"<|special_3046|>",
|
| 2928 |
+
"<|special_3047|>",
|
| 2929 |
+
"<|special_3048|>",
|
| 2930 |
+
"<|special_3049|>",
|
| 2931 |
+
"<|special_3050|>",
|
| 2932 |
+
"<|special_3051|>",
|
| 2933 |
+
"<|special_3052|>",
|
| 2934 |
+
"<|special_3053|>",
|
| 2935 |
+
"<|special_3054|>",
|
| 2936 |
+
"<|special_3055|>",
|
| 2937 |
+
"<|special_3056|>",
|
| 2938 |
+
"<|special_3057|>",
|
| 2939 |
+
"<|special_3058|>",
|
| 2940 |
+
"<|special_3059|>",
|
| 2941 |
+
"<|special_3060|>",
|
| 2942 |
+
"<|special_3061|>",
|
| 2943 |
+
"<|special_3062|>",
|
| 2944 |
+
"<|special_3063|>",
|
| 2945 |
+
"<|special_3064|>",
|
| 2946 |
+
"<|special_3065|>",
|
| 2947 |
+
"<|special_3066|>",
|
| 2948 |
+
"<|special_3067|>",
|
| 2949 |
+
"<|special_3068|>",
|
| 2950 |
+
"<|special_3069|>",
|
| 2951 |
+
"<|special_3070|>",
|
| 2952 |
+
"<|special_3071|>",
|
| 2953 |
+
"<|special_3072|>",
|
| 2954 |
+
"<|special_3073|>",
|
| 2955 |
+
"<|special_3074|>",
|
| 2956 |
+
"<|special_3075|>",
|
| 2957 |
+
"<|special_3076|>",
|
| 2958 |
+
"<|special_3077|>",
|
| 2959 |
+
"<|special_3078|>",
|
| 2960 |
+
"<|special_3079|>",
|
| 2961 |
+
"<|special_3080|>",
|
| 2962 |
+
"<|special_3081|>",
|
| 2963 |
+
"<|special_3082|>",
|
| 2964 |
+
"<|special_3083|>",
|
| 2965 |
+
"<|special_3084|>",
|
| 2966 |
+
"<|special_3085|>",
|
| 2967 |
+
"<|special_3086|>",
|
| 2968 |
+
"<|special_3087|>",
|
| 2969 |
+
"<|special_3088|>",
|
| 2970 |
+
"<|special_3089|>",
|
| 2971 |
+
"<|special_3090|>",
|
| 2972 |
+
"<|special_3091|>",
|
| 2973 |
+
"<|special_3092|>",
|
| 2974 |
+
"<|special_3093|>",
|
| 2975 |
+
"<|special_3094|>",
|
| 2976 |
+
"<|special_3095|>",
|
| 2977 |
+
"<|special_3096|>",
|
| 2978 |
+
"<|special_3097|>",
|
| 2979 |
+
"<|special_3098|>",
|
| 2980 |
+
"<|special_3099|>",
|
| 2981 |
+
"<|special_3100|>",
|
| 2982 |
+
"<|special_3101|>",
|
| 2983 |
+
"<|special_3102|>",
|
| 2984 |
+
"<|special_3103|>",
|
| 2985 |
+
"<|special_3104|>",
|
| 2986 |
+
"<|special_3105|>",
|
| 2987 |
+
"<|special_3106|>",
|
| 2988 |
+
"<|special_3107|>",
|
| 2989 |
+
"<|special_3108|>",
|
| 2990 |
+
"<|special_3109|>",
|
| 2991 |
+
"<|special_3110|>",
|
| 2992 |
+
"<|special_3111|>",
|
| 2993 |
+
"<|special_3112|>",
|
| 2994 |
+
"<|special_3113|>",
|
| 2995 |
+
"<|special_3114|>",
|
| 2996 |
+
"<|special_3115|>",
|
| 2997 |
+
"<|special_3116|>",
|
| 2998 |
+
"<|special_3117|>",
|
| 2999 |
+
"<|special_3118|>",
|
| 3000 |
+
"<|special_3119|>",
|
| 3001 |
+
"<|special_3120|>",
|
| 3002 |
+
"<|special_3121|>",
|
| 3003 |
+
"<|special_3122|>",
|
| 3004 |
+
"<|special_3123|>",
|
| 3005 |
+
"<|special_3124|>",
|
| 3006 |
+
"<|special_3125|>",
|
| 3007 |
+
"<|special_3126|>",
|
| 3008 |
+
"<|special_3127|>",
|
| 3009 |
+
"<|special_3128|>",
|
| 3010 |
+
"<|special_3129|>",
|
| 3011 |
+
"<|special_3130|>",
|
| 3012 |
+
"<|special_3131|>",
|
| 3013 |
+
"<|special_3132|>",
|
| 3014 |
+
"<|special_3133|>",
|
| 3015 |
+
"<|special_3134|>",
|
| 3016 |
+
"<|special_3135|>",
|
| 3017 |
+
"<|special_3136|>",
|
| 3018 |
+
"<|special_3137|>",
|
| 3019 |
+
"<|special_3138|>",
|
| 3020 |
+
"<|special_3139|>",
|
| 3021 |
+
"<|special_3140|>",
|
| 3022 |
+
"<|special_3141|>",
|
| 3023 |
+
"<|special_3142|>",
|
| 3024 |
+
"<|special_3143|>",
|
| 3025 |
+
"<|special_3144|>",
|
| 3026 |
+
"<|special_3145|>",
|
| 3027 |
+
"<|special_3146|>",
|
| 3028 |
+
"<|special_3147|>",
|
| 3029 |
+
"<|special_3148|>",
|
| 3030 |
+
"<|special_3149|>",
|
| 3031 |
+
"<|special_3150|>",
|
| 3032 |
+
"<|special_3151|>",
|
| 3033 |
+
"<|special_3152|>",
|
| 3034 |
+
"<|special_3153|>",
|
| 3035 |
+
"<|special_3154|>",
|
| 3036 |
+
"<|special_3155|>",
|
| 3037 |
+
"<|special_3156|>",
|
| 3038 |
+
"<|special_3157|>",
|
| 3039 |
+
"<|special_3158|>",
|
| 3040 |
+
"<|special_3159|>",
|
| 3041 |
+
"<|special_3160|>",
|
| 3042 |
+
"<|special_3161|>",
|
| 3043 |
+
"<|special_3162|>",
|
| 3044 |
+
"<|special_3163|>",
|
| 3045 |
+
"<|special_3164|>",
|
| 3046 |
+
"<|special_3165|>",
|
| 3047 |
+
"<|special_3166|>",
|
| 3048 |
+
"<|special_3167|>",
|
| 3049 |
+
"<|special_3168|>",
|
| 3050 |
+
"<|special_3169|>",
|
| 3051 |
+
"<|special_3170|>",
|
| 3052 |
+
"<|special_3171|>",
|
| 3053 |
+
"<|special_3172|>",
|
| 3054 |
+
"<|special_3173|>",
|
| 3055 |
+
"<|special_3174|>",
|
| 3056 |
+
"<|special_3175|>",
|
| 3057 |
+
"<|special_3176|>",
|
| 3058 |
+
"<|special_3177|>",
|
| 3059 |
+
"<|special_3178|>",
|
| 3060 |
+
"<|special_3179|>",
|
| 3061 |
+
"<|special_3180|>",
|
| 3062 |
+
"<|special_3181|>",
|
| 3063 |
+
"<|special_3182|>",
|
| 3064 |
+
"<|special_3183|>",
|
| 3065 |
+
"<|special_3184|>",
|
| 3066 |
+
"<|special_3185|>",
|
| 3067 |
+
"<|special_3186|>",
|
| 3068 |
+
"<|special_3187|>",
|
| 3069 |
+
"<|special_3188|>",
|
| 3070 |
+
"<|special_3189|>",
|
| 3071 |
+
"<|special_3190|>",
|
| 3072 |
+
"<|special_3191|>",
|
| 3073 |
+
"<|special_3192|>",
|
| 3074 |
+
"<|special_3193|>",
|
| 3075 |
+
"<|special_3194|>",
|
| 3076 |
+
"<|special_3195|>",
|
| 3077 |
+
"<|special_3196|>",
|
| 3078 |
+
"<|special_3197|>",
|
| 3079 |
+
"<|special_3198|>",
|
| 3080 |
+
"<|special_3199|>",
|
| 3081 |
+
"<|special_3200|>",
|
| 3082 |
+
"<|special_3201|>",
|
| 3083 |
+
"<|special_3202|>",
|
| 3084 |
+
"<|special_3203|>",
|
| 3085 |
+
"<|special_3204|>",
|
| 3086 |
+
"<|special_3205|>",
|
| 3087 |
+
"<|special_3206|>",
|
| 3088 |
+
"<|special_3207|>",
|
| 3089 |
+
"<|special_3208|>",
|
| 3090 |
+
"<|special_3209|>",
|
| 3091 |
+
"<|special_3210|>",
|
| 3092 |
+
"<|special_3211|>",
|
| 3093 |
+
"<|special_3212|>",
|
| 3094 |
+
"<|special_3213|>",
|
| 3095 |
+
"<|special_3214|>",
|
| 3096 |
+
"<|special_3215|>",
|
| 3097 |
+
"<|special_3216|>",
|
| 3098 |
+
"<|special_3217|>",
|
| 3099 |
+
"<|special_3218|>",
|
| 3100 |
+
"<|special_3219|>",
|
| 3101 |
+
"<|special_3220|>",
|
| 3102 |
+
"<|special_3221|>",
|
| 3103 |
+
"<|special_3222|>",
|
| 3104 |
+
"<|special_3223|>",
|
| 3105 |
+
"<|special_3224|>",
|
| 3106 |
+
"<|special_3225|>",
|
| 3107 |
+
"<|special_3226|>",
|
| 3108 |
+
"<|special_3227|>",
|
| 3109 |
+
"<|special_3228|>",
|
| 3110 |
+
"<|special_3229|>",
|
| 3111 |
+
"<|special_3230|>",
|
| 3112 |
+
"<|special_3231|>",
|
| 3113 |
+
"<|special_3232|>",
|
| 3114 |
+
"<|special_3233|>",
|
| 3115 |
+
"<|special_3234|>",
|
| 3116 |
+
"<|special_3235|>",
|
| 3117 |
+
"<|special_3236|>",
|
| 3118 |
+
"<|special_3237|>",
|
| 3119 |
+
"<|special_3238|>",
|
| 3120 |
+
"<|special_3239|>",
|
| 3121 |
+
"<|special_3240|>",
|
| 3122 |
+
"<|special_3241|>",
|
| 3123 |
+
"<|special_3242|>",
|
| 3124 |
+
"<|special_3243|>",
|
| 3125 |
+
"<|special_3244|>",
|
| 3126 |
+
"<|special_3245|>",
|
| 3127 |
+
"<|special_3246|>",
|
| 3128 |
+
"<|special_3247|>",
|
| 3129 |
+
"<|special_3248|>",
|
| 3130 |
+
"<|special_3249|>",
|
| 3131 |
+
"<|special_3250|>",
|
| 3132 |
+
"<|special_3251|>",
|
| 3133 |
+
"<|special_3252|>",
|
| 3134 |
+
"<|special_3253|>",
|
| 3135 |
+
"<|special_3254|>",
|
| 3136 |
+
"<|special_3255|>",
|
| 3137 |
+
"<|special_3256|>",
|
| 3138 |
+
"<|special_3257|>",
|
| 3139 |
+
"<|special_3258|>",
|
| 3140 |
+
"<|special_3259|>",
|
| 3141 |
+
"<|special_3260|>",
|
| 3142 |
+
"<|special_3261|>",
|
| 3143 |
+
"<|special_3262|>",
|
| 3144 |
+
"<|special_3263|>",
|
| 3145 |
+
"<|special_3264|>",
|
| 3146 |
+
"<|special_3265|>",
|
| 3147 |
+
"<|special_3266|>",
|
| 3148 |
+
"<|special_3267|>",
|
| 3149 |
+
"<|special_3268|>",
|
| 3150 |
+
"<|special_3269|>",
|
| 3151 |
+
"<|special_3270|>",
|
| 3152 |
+
"<|special_3271|>",
|
| 3153 |
+
"<|special_3272|>",
|
| 3154 |
+
"<|special_3273|>",
|
| 3155 |
+
"<|special_3274|>",
|
| 3156 |
+
"<|special_3275|>",
|
| 3157 |
+
"<|special_3276|>",
|
| 3158 |
+
"<|special_3277|>",
|
| 3159 |
+
"<|special_3278|>",
|
| 3160 |
+
"<|special_3279|>",
|
| 3161 |
+
"<|special_3280|>",
|
| 3162 |
+
"<|special_3281|>",
|
| 3163 |
+
"<|special_3282|>",
|
| 3164 |
+
"<|special_3283|>",
|
| 3165 |
+
"<|special_3284|>",
|
| 3166 |
+
"<|special_3285|>",
|
| 3167 |
+
"<|special_3286|>",
|
| 3168 |
+
"<|special_3287|>",
|
| 3169 |
+
"<|special_3288|>",
|
| 3170 |
+
"<|special_3289|>",
|
| 3171 |
+
"<|special_3290|>",
|
| 3172 |
+
"<|special_3291|>",
|
| 3173 |
+
"<|special_3292|>",
|
| 3174 |
+
"<|special_3293|>",
|
| 3175 |
+
"<|special_3294|>",
|
| 3176 |
+
"<|special_3295|>",
|
| 3177 |
+
"<|special_3296|>",
|
| 3178 |
+
"<|special_3297|>",
|
| 3179 |
+
"<|special_3298|>",
|
| 3180 |
+
"<|special_3299|>",
|
| 3181 |
+
"<|special_3300|>",
|
| 3182 |
+
"<|special_3301|>",
|
| 3183 |
+
"<|special_3302|>",
|
| 3184 |
+
"<|special_3303|>",
|
| 3185 |
+
"<|special_3304|>",
|
| 3186 |
+
"<|special_3305|>",
|
| 3187 |
+
"<|special_3306|>",
|
| 3188 |
+
"<|special_3307|>",
|
| 3189 |
+
"<|special_3308|>",
|
| 3190 |
+
"<|special_3309|>",
|
| 3191 |
+
"<|special_3310|>",
|
| 3192 |
+
"<|special_3311|>",
|
| 3193 |
+
"<|special_3312|>",
|
| 3194 |
+
"<|special_3313|>",
|
| 3195 |
+
"<|special_3314|>",
|
| 3196 |
+
"<|special_3315|>",
|
| 3197 |
+
"<|special_3316|>",
|
| 3198 |
+
"<|special_3317|>",
|
| 3199 |
+
"<|special_3318|>",
|
| 3200 |
+
"<|special_3319|>",
|
| 3201 |
+
"<|special_3320|>",
|
| 3202 |
+
"<|special_3321|>",
|
| 3203 |
+
"<|special_3322|>",
|
| 3204 |
+
"<|special_3323|>",
|
| 3205 |
+
"<|special_3324|>",
|
| 3206 |
+
"<|special_3325|>",
|
| 3207 |
+
"<|special_3326|>",
|
| 3208 |
+
"<|special_3327|>",
|
| 3209 |
+
"<|special_3328|>",
|
| 3210 |
+
"<|special_3329|>",
|
| 3211 |
+
"<|special_3330|>",
|
| 3212 |
+
"<|special_3331|>",
|
| 3213 |
+
"<|special_3332|>",
|
| 3214 |
+
"<|special_3333|>",
|
| 3215 |
+
"<|special_3334|>",
|
| 3216 |
+
"<|special_3335|>",
|
| 3217 |
+
"<|special_3336|>",
|
| 3218 |
+
"<|special_3337|>",
|
| 3219 |
+
"<|special_3338|>",
|
| 3220 |
+
"<|special_3339|>",
|
| 3221 |
+
"<|special_3340|>",
|
| 3222 |
+
"<|special_3341|>",
|
| 3223 |
+
"<|special_3342|>",
|
| 3224 |
+
"<|special_3343|>",
|
| 3225 |
+
"<|special_3344|>",
|
| 3226 |
+
"<|special_3345|>",
|
| 3227 |
+
"<|special_3346|>",
|
| 3228 |
+
"<|special_3347|>",
|
| 3229 |
+
"<|special_3348|>",
|
| 3230 |
+
"<|special_3349|>",
|
| 3231 |
+
"<|special_3350|>",
|
| 3232 |
+
"<|special_3351|>",
|
| 3233 |
+
"<|special_3352|>",
|
| 3234 |
+
"<|special_3353|>",
|
| 3235 |
+
"<|special_3354|>",
|
| 3236 |
+
"<|special_3355|>",
|
| 3237 |
+
"<|special_3356|>",
|
| 3238 |
+
"<|special_3357|>",
|
| 3239 |
+
"<|special_3358|>",
|
| 3240 |
+
"<|special_3359|>",
|
| 3241 |
+
"<|special_3360|>",
|
| 3242 |
+
"<|special_3361|>",
|
| 3243 |
+
"<|special_3362|>",
|
| 3244 |
+
"<|special_3363|>",
|
| 3245 |
+
"<|special_3364|>",
|
| 3246 |
+
"<|special_3365|>",
|
| 3247 |
+
"<|special_3366|>",
|
| 3248 |
+
"<|special_3367|>",
|
| 3249 |
+
"<|special_3368|>",
|
| 3250 |
+
"<|special_3369|>",
|
| 3251 |
+
"<|special_3370|>",
|
| 3252 |
+
"<|special_3371|>",
|
| 3253 |
+
"<|special_3372|>",
|
| 3254 |
+
"<|special_3373|>",
|
| 3255 |
+
"<|special_3374|>",
|
| 3256 |
+
"<|special_3375|>",
|
| 3257 |
+
"<|special_3376|>",
|
| 3258 |
+
"<|special_3377|>",
|
| 3259 |
+
"<|special_3378|>",
|
| 3260 |
+
"<|special_3379|>",
|
| 3261 |
+
"<|special_3380|>",
|
| 3262 |
+
"<|special_3381|>",
|
| 3263 |
+
"<|special_3382|>",
|
| 3264 |
+
"<|special_3383|>",
|
| 3265 |
+
"<|special_3384|>",
|
| 3266 |
+
"<|special_3385|>",
|
| 3267 |
+
"<|special_3386|>",
|
| 3268 |
+
"<|special_3387|>",
|
| 3269 |
+
"<|special_3388|>",
|
| 3270 |
+
"<|special_3389|>",
|
| 3271 |
+
"<|special_3390|>",
|
| 3272 |
+
"<|special_3391|>",
|
| 3273 |
+
"<|special_3392|>",
|
| 3274 |
+
"<|special_3393|>",
|
| 3275 |
+
"<|special_3394|>",
|
| 3276 |
+
"<|special_3395|>",
|
| 3277 |
+
"<|special_3396|>",
|
| 3278 |
+
"<|special_3397|>",
|
| 3279 |
+
"<|special_3398|>",
|
| 3280 |
+
"<|special_3399|>",
|
| 3281 |
+
"<|special_3400|>",
|
| 3282 |
+
"<|special_3401|>",
|
| 3283 |
+
"<|special_3402|>",
|
| 3284 |
+
"<|special_3403|>",
|
| 3285 |
+
"<|special_3404|>",
|
| 3286 |
+
"<|special_3405|>",
|
| 3287 |
+
"<|special_3406|>",
|
| 3288 |
+
"<|special_3407|>",
|
| 3289 |
+
"<|special_3408|>",
|
| 3290 |
+
"<|special_3409|>",
|
| 3291 |
+
"<|special_3410|>",
|
| 3292 |
+
"<|special_3411|>",
|
| 3293 |
+
"<|special_3412|>",
|
| 3294 |
+
"<|special_3413|>",
|
| 3295 |
+
"<|special_3414|>",
|
| 3296 |
+
"<|special_3415|>",
|
| 3297 |
+
"<|special_3416|>",
|
| 3298 |
+
"<|special_3417|>",
|
| 3299 |
+
"<|special_3418|>",
|
| 3300 |
+
"<|special_3419|>",
|
| 3301 |
+
"<|special_3420|>",
|
| 3302 |
+
"<|special_3421|>",
|
| 3303 |
+
"<|special_3422|>",
|
| 3304 |
+
"<|special_3423|>",
|
| 3305 |
+
"<|special_3424|>",
|
| 3306 |
+
"<|special_3425|>",
|
| 3307 |
+
"<|special_3426|>",
|
| 3308 |
+
"<|special_3427|>",
|
| 3309 |
+
"<|special_3428|>",
|
| 3310 |
+
"<|special_3429|>",
|
| 3311 |
+
"<|special_3430|>",
|
| 3312 |
+
"<|special_3431|>",
|
| 3313 |
+
"<|special_3432|>",
|
| 3314 |
+
"<|special_3433|>",
|
| 3315 |
+
"<|special_3434|>",
|
| 3316 |
+
"<|special_3435|>",
|
| 3317 |
+
"<|special_3436|>",
|
| 3318 |
+
"<|special_3437|>",
|
| 3319 |
+
"<|special_3438|>",
|
| 3320 |
+
"<|special_3439|>",
|
| 3321 |
+
"<|special_3440|>",
|
| 3322 |
+
"<|special_3441|>",
|
| 3323 |
+
"<|special_3442|>",
|
| 3324 |
+
"<|special_3443|>",
|
| 3325 |
+
"<|special_3444|>",
|
| 3326 |
+
"<|special_3445|>",
|
| 3327 |
+
"<|special_3446|>",
|
| 3328 |
+
"<|special_3447|>",
|
| 3329 |
+
"<|special_3448|>",
|
| 3330 |
+
"<|special_3449|>",
|
| 3331 |
+
"<|special_3450|>",
|
| 3332 |
+
"<|special_3451|>",
|
| 3333 |
+
"<|special_3452|>",
|
| 3334 |
+
"<|special_3453|>",
|
| 3335 |
+
"<|special_3454|>",
|
| 3336 |
+
"<|special_3455|>",
|
| 3337 |
+
"<|special_3456|>",
|
| 3338 |
+
"<|special_3457|>",
|
| 3339 |
+
"<|special_3458|>",
|
| 3340 |
+
"<|special_3459|>",
|
| 3341 |
+
"<|special_3460|>",
|
| 3342 |
+
"<|special_3461|>",
|
| 3343 |
+
"<|special_3462|>",
|
| 3344 |
+
"<|special_3463|>",
|
| 3345 |
+
"<|special_3464|>",
|
| 3346 |
+
"<|special_3465|>",
|
| 3347 |
+
"<|special_3466|>",
|
| 3348 |
+
"<|special_3467|>",
|
| 3349 |
+
"<|special_3468|>",
|
| 3350 |
+
"<|special_3469|>",
|
| 3351 |
+
"<|special_3470|>",
|
| 3352 |
+
"<|special_3471|>",
|
| 3353 |
+
"<|special_3472|>",
|
| 3354 |
+
"<|special_3473|>",
|
| 3355 |
+
"<|special_3474|>",
|
| 3356 |
+
"<|special_3475|>",
|
| 3357 |
+
"<|special_3476|>",
|
| 3358 |
+
"<|special_3477|>",
|
| 3359 |
+
"<|special_3478|>",
|
| 3360 |
+
"<|special_3479|>",
|
| 3361 |
+
"<|special_3480|>",
|
| 3362 |
+
"<|special_3481|>",
|
| 3363 |
+
"<|special_3482|>",
|
| 3364 |
+
"<|special_3483|>",
|
| 3365 |
+
"<|special_3484|>",
|
| 3366 |
+
"<|special_3485|>",
|
| 3367 |
+
"<|special_3486|>",
|
| 3368 |
+
"<|special_3487|>",
|
| 3369 |
+
"<|special_3488|>",
|
| 3370 |
+
"<|special_3489|>",
|
| 3371 |
+
"<|special_3490|>",
|
| 3372 |
+
"<|special_3491|>",
|
| 3373 |
+
"<|special_3492|>",
|
| 3374 |
+
"<|special_3493|>",
|
| 3375 |
+
"<|special_3494|>",
|
| 3376 |
+
"<|special_3495|>",
|
| 3377 |
+
"<|special_3496|>",
|
| 3378 |
+
"<|special_3497|>",
|
| 3379 |
+
"<|special_3498|>",
|
| 3380 |
+
"<|special_3499|>",
|
| 3381 |
+
"<|special_3500|>",
|
| 3382 |
+
"<|special_3501|>",
|
| 3383 |
+
"<|special_3502|>",
|
| 3384 |
+
"<|special_3503|>",
|
| 3385 |
+
"<|special_3504|>",
|
| 3386 |
+
"<|special_3505|>",
|
| 3387 |
+
"<|special_3506|>",
|
| 3388 |
+
"<|special_3507|>",
|
| 3389 |
+
"<|special_3508|>",
|
| 3390 |
+
"<|special_3509|>",
|
| 3391 |
+
"<|special_3510|>",
|
| 3392 |
+
"<|special_3511|>",
|
| 3393 |
+
"<|special_3512|>",
|
| 3394 |
+
"<|special_3513|>",
|
| 3395 |
+
"<|special_3514|>",
|
| 3396 |
+
"<|special_3515|>",
|
| 3397 |
+
"<|special_3516|>",
|
| 3398 |
+
"<|special_3517|>",
|
| 3399 |
+
"<|special_3518|>",
|
| 3400 |
+
"<|special_3519|>",
|
| 3401 |
+
"<|special_3520|>",
|
| 3402 |
+
"<|special_3521|>",
|
| 3403 |
+
"<|special_3522|>",
|
| 3404 |
+
"<|special_3523|>",
|
| 3405 |
+
"<|special_3524|>",
|
| 3406 |
+
"<|special_3525|>",
|
| 3407 |
+
"<|special_3526|>",
|
| 3408 |
+
"<|special_3527|>",
|
| 3409 |
+
"<|special_3528|>",
|
| 3410 |
+
"<|special_3529|>",
|
| 3411 |
+
"<|special_3530|>",
|
| 3412 |
+
"<|special_3531|>",
|
| 3413 |
+
"<|special_3532|>",
|
| 3414 |
+
"<|special_3533|>",
|
| 3415 |
+
"<|special_3534|>",
|
| 3416 |
+
"<|special_3535|>",
|
| 3417 |
+
"<|special_3536|>",
|
| 3418 |
+
"<|special_3537|>",
|
| 3419 |
+
"<|special_3538|>",
|
| 3420 |
+
"<|special_3539|>",
|
| 3421 |
+
"<|special_3540|>",
|
| 3422 |
+
"<|special_3541|>",
|
| 3423 |
+
"<|special_3542|>",
|
| 3424 |
+
"<|special_3543|>",
|
| 3425 |
+
"<|special_3544|>",
|
| 3426 |
+
"<|special_3545|>",
|
| 3427 |
+
"<|special_3546|>",
|
| 3428 |
+
"<|special_3547|>",
|
| 3429 |
+
"<|special_3548|>",
|
| 3430 |
+
"<|special_3549|>",
|
| 3431 |
+
"<|special_3550|>",
|
| 3432 |
+
"<|special_3551|>",
|
| 3433 |
+
"<|special_3552|>",
|
| 3434 |
+
"<|special_3553|>",
|
| 3435 |
+
"<|special_3554|>",
|
| 3436 |
+
"<|special_3555|>",
|
| 3437 |
+
"<|special_3556|>",
|
| 3438 |
+
"<|special_3557|>",
|
| 3439 |
+
"<|special_3558|>",
|
| 3440 |
+
"<|special_3559|>",
|
| 3441 |
+
"<|special_3560|>",
|
| 3442 |
+
"<|special_3561|>",
|
| 3443 |
+
"<|special_3562|>",
|
| 3444 |
+
"<|special_3563|>",
|
| 3445 |
+
"<|special_3564|>",
|
| 3446 |
+
"<|special_3565|>",
|
| 3447 |
+
"<|special_3566|>",
|
| 3448 |
+
"<|special_3567|>",
|
| 3449 |
+
"<|special_3568|>",
|
| 3450 |
+
"<|special_3569|>",
|
| 3451 |
+
"<|special_3570|>",
|
| 3452 |
+
"<|special_3571|>",
|
| 3453 |
+
"<|special_3572|>",
|
| 3454 |
+
"<|special_3573|>",
|
| 3455 |
+
"<|special_3574|>",
|
| 3456 |
+
"<|special_3575|>",
|
| 3457 |
+
"<|special_3576|>",
|
| 3458 |
+
"<|special_3577|>",
|
| 3459 |
+
"<|special_3578|>",
|
| 3460 |
+
"<|special_3579|>",
|
| 3461 |
+
"<|special_3580|>",
|
| 3462 |
+
"<|special_3581|>",
|
| 3463 |
+
"<|special_3582|>",
|
| 3464 |
+
"<|special_3583|>",
|
| 3465 |
+
"<|special_3584|>",
|
| 3466 |
+
"<|special_3585|>",
|
| 3467 |
+
"<|special_3586|>",
|
| 3468 |
+
"<|special_3587|>",
|
| 3469 |
+
"<|special_3588|>",
|
| 3470 |
+
"<|special_3589|>",
|
| 3471 |
+
"<|special_3590|>",
|
| 3472 |
+
"<|special_3591|>",
|
| 3473 |
+
"<|special_3592|>",
|
| 3474 |
+
"<|special_3593|>",
|
| 3475 |
+
"<|special_3594|>",
|
| 3476 |
+
"<|special_3595|>",
|
| 3477 |
+
"<|special_3596|>",
|
| 3478 |
+
"<|special_3597|>",
|
| 3479 |
+
"<|special_3598|>",
|
| 3480 |
+
"<|special_3599|>",
|
| 3481 |
+
"<|special_3600|>",
|
| 3482 |
+
"<|special_3601|>",
|
| 3483 |
+
"<|special_3602|>",
|
| 3484 |
+
"<|special_3603|>",
|
| 3485 |
+
"<|special_3604|>",
|
| 3486 |
+
"<|special_3605|>",
|
| 3487 |
+
"<|special_3606|>",
|
| 3488 |
+
"<|special_3607|>",
|
| 3489 |
+
"<|special_3608|>",
|
| 3490 |
+
"<|special_3609|>",
|
| 3491 |
+
"<|special_3610|>",
|
| 3492 |
+
"<|special_3611|>",
|
| 3493 |
+
"<|special_3612|>",
|
| 3494 |
+
"<|special_3613|>",
|
| 3495 |
+
"<|special_3614|>",
|
| 3496 |
+
"<|special_3615|>",
|
| 3497 |
+
"<|special_3616|>",
|
| 3498 |
+
"<|special_3617|>",
|
| 3499 |
+
"<|special_3618|>",
|
| 3500 |
+
"<|special_3619|>",
|
| 3501 |
+
"<|special_3620|>",
|
| 3502 |
+
"<|special_3621|>",
|
| 3503 |
+
"<|special_3622|>",
|
| 3504 |
+
"<|special_3623|>",
|
| 3505 |
+
"<|special_3624|>",
|
| 3506 |
+
"<|special_3625|>",
|
| 3507 |
+
"<|special_3626|>",
|
| 3508 |
+
"<|special_3627|>",
|
| 3509 |
+
"<|special_3628|>",
|
| 3510 |
+
"<|special_3629|>",
|
| 3511 |
+
"<|special_3630|>",
|
| 3512 |
+
"<|special_3631|>",
|
| 3513 |
+
"<|special_3632|>",
|
| 3514 |
+
"<|special_3633|>",
|
| 3515 |
+
"<|special_3634|>",
|
| 3516 |
+
"<|special_3635|>",
|
| 3517 |
+
"<|special_3636|>",
|
| 3518 |
+
"<|special_3637|>",
|
| 3519 |
+
"<|special_3638|>",
|
| 3520 |
+
"<|special_3639|>",
|
| 3521 |
+
"<|special_3640|>",
|
| 3522 |
+
"<|special_3641|>",
|
| 3523 |
+
"<|special_3642|>",
|
| 3524 |
+
"<|special_3643|>",
|
| 3525 |
+
"<|special_3644|>",
|
| 3526 |
+
"<|special_3645|>",
|
| 3527 |
+
"<|special_3646|>",
|
| 3528 |
+
"<|special_3647|>",
|
| 3529 |
+
"<|special_3648|>",
|
| 3530 |
+
"<|special_3649|>",
|
| 3531 |
+
"<|special_3650|>",
|
| 3532 |
+
"<|special_3651|>",
|
| 3533 |
+
"<|special_3652|>",
|
| 3534 |
+
"<|special_3653|>",
|
| 3535 |
+
"<|special_3654|>",
|
| 3536 |
+
"<|special_3655|>",
|
| 3537 |
+
"<|special_3656|>",
|
| 3538 |
+
"<|special_3657|>",
|
| 3539 |
+
"<|special_3658|>",
|
| 3540 |
+
"<|special_3659|>",
|
| 3541 |
+
"<|special_3660|>",
|
| 3542 |
+
"<|special_3661|>",
|
| 3543 |
+
"<|special_3662|>",
|
| 3544 |
+
"<|special_3663|>",
|
| 3545 |
+
"<|special_3664|>",
|
| 3546 |
+
"<|special_3665|>",
|
| 3547 |
+
"<|special_3666|>",
|
| 3548 |
+
"<|special_3667|>",
|
| 3549 |
+
"<|special_3668|>",
|
| 3550 |
+
"<|special_3669|>",
|
| 3551 |
+
"<|special_3670|>",
|
| 3552 |
+
"<|special_3671|>",
|
| 3553 |
+
"<|special_3672|>",
|
| 3554 |
+
"<|special_3673|>",
|
| 3555 |
+
"<|special_3674|>",
|
| 3556 |
+
"<|special_3675|>",
|
| 3557 |
+
"<|special_3676|>",
|
| 3558 |
+
"<|special_3677|>",
|
| 3559 |
+
"<|special_3678|>",
|
| 3560 |
+
"<|special_3679|>",
|
| 3561 |
+
"<|special_3680|>",
|
| 3562 |
+
"<|special_3681|>",
|
| 3563 |
+
"<|special_3682|>",
|
| 3564 |
+
"<|special_3683|>",
|
| 3565 |
+
"<|special_3684|>",
|
| 3566 |
+
"<|special_3685|>",
|
| 3567 |
+
"<|special_3686|>",
|
| 3568 |
+
"<|special_3687|>",
|
| 3569 |
+
"<|special_3688|>",
|
| 3570 |
+
"<|special_3689|>",
|
| 3571 |
+
"<|special_3690|>",
|
| 3572 |
+
"<|special_3691|>",
|
| 3573 |
+
"<|special_3692|>",
|
| 3574 |
+
"<|special_3693|>",
|
| 3575 |
+
"<|special_3694|>",
|
| 3576 |
+
"<|special_3695|>",
|
| 3577 |
+
"<|special_3696|>",
|
| 3578 |
+
"<|special_3697|>",
|
| 3579 |
+
"<|special_3698|>",
|
| 3580 |
+
"<|special_3699|>",
|
| 3581 |
+
"<|special_3700|>",
|
| 3582 |
+
"<|special_3701|>",
|
| 3583 |
+
"<|special_3702|>",
|
| 3584 |
+
"<|special_3703|>",
|
| 3585 |
+
"<|special_3704|>",
|
| 3586 |
+
"<|special_3705|>",
|
| 3587 |
+
"<|special_3706|>",
|
| 3588 |
+
"<|special_3707|>",
|
| 3589 |
+
"<|special_3708|>",
|
| 3590 |
+
"<|special_3709|>",
|
| 3591 |
+
"<|special_3710|>",
|
| 3592 |
+
"<|special_3711|>",
|
| 3593 |
+
"<|special_3712|>",
|
| 3594 |
+
"<|special_3713|>",
|
| 3595 |
+
"<|special_3714|>",
|
| 3596 |
+
"<|special_3715|>",
|
| 3597 |
+
"<|special_3716|>",
|
| 3598 |
+
"<|special_3717|>",
|
| 3599 |
+
"<|special_3718|>",
|
| 3600 |
+
"<|special_3719|>",
|
| 3601 |
+
"<|special_3720|>",
|
| 3602 |
+
"<|special_3721|>",
|
| 3603 |
+
"<|special_3722|>",
|
| 3604 |
+
"<|special_3723|>",
|
| 3605 |
+
"<|special_3724|>",
|
| 3606 |
+
"<|special_3725|>",
|
| 3607 |
+
"<|special_3726|>",
|
| 3608 |
+
"<|special_3727|>",
|
| 3609 |
+
"<|special_3728|>",
|
| 3610 |
+
"<|special_3729|>",
|
| 3611 |
+
"<|special_3730|>",
|
| 3612 |
+
"<|special_3731|>",
|
| 3613 |
+
"<|special_3732|>",
|
| 3614 |
+
"<|special_3733|>",
|
| 3615 |
+
"<|special_3734|>",
|
| 3616 |
+
"<|special_3735|>",
|
| 3617 |
+
"<|special_3736|>",
|
| 3618 |
+
"<|special_3737|>",
|
| 3619 |
+
"<|special_3738|>",
|
| 3620 |
+
"<|special_3739|>",
|
| 3621 |
+
"<|special_3740|>",
|
| 3622 |
+
"<|special_3741|>",
|
| 3623 |
+
"<|special_3742|>",
|
| 3624 |
+
"<|special_3743|>",
|
| 3625 |
+
"<|special_3744|>",
|
| 3626 |
+
"<|special_3745|>",
|
| 3627 |
+
"<|special_3746|>",
|
| 3628 |
+
"<|special_3747|>",
|
| 3629 |
+
"<|special_3748|>",
|
| 3630 |
+
"<|special_3749|>",
|
| 3631 |
+
"<|special_3750|>",
|
| 3632 |
+
"<|special_3751|>",
|
| 3633 |
+
"<|special_3752|>",
|
| 3634 |
+
"<|special_3753|>",
|
| 3635 |
+
"<|special_3754|>",
|
| 3636 |
+
"<|special_3755|>",
|
| 3637 |
+
"<|special_3756|>",
|
| 3638 |
+
"<|special_3757|>",
|
| 3639 |
+
"<|special_3758|>",
|
| 3640 |
+
"<|special_3759|>",
|
| 3641 |
+
"<|special_3760|>",
|
| 3642 |
+
"<|special_3761|>",
|
| 3643 |
+
"<|special_3762|>",
|
| 3644 |
+
"<|special_3763|>",
|
| 3645 |
+
"<|special_3764|>",
|
| 3646 |
+
"<|special_3765|>",
|
| 3647 |
+
"<|special_3766|>",
|
| 3648 |
+
"<|special_3767|>",
|
| 3649 |
+
"<|special_3768|>",
|
| 3650 |
+
"<|special_3769|>",
|
| 3651 |
+
"<|special_3770|>",
|
| 3652 |
+
"<|special_3771|>",
|
| 3653 |
+
"<|special_3772|>",
|
| 3654 |
+
"<|special_3773|>",
|
| 3655 |
+
"<|special_3774|>",
|
| 3656 |
+
"<|special_3775|>",
|
| 3657 |
+
"<|special_3776|>",
|
| 3658 |
+
"<|special_3777|>",
|
| 3659 |
+
"<|special_3778|>",
|
| 3660 |
+
"<|special_3779|>",
|
| 3661 |
+
"<|special_3780|>",
|
| 3662 |
+
"<|special_3781|>",
|
| 3663 |
+
"<|special_3782|>",
|
| 3664 |
+
"<|special_3783|>",
|
| 3665 |
+
"<|special_3784|>",
|
| 3666 |
+
"<|special_3785|>",
|
| 3667 |
+
"<|special_3786|>",
|
| 3668 |
+
"<|special_3787|>",
|
| 3669 |
+
"<|special_3788|>",
|
| 3670 |
+
"<|special_3789|>",
|
| 3671 |
+
"<|special_3790|>",
|
| 3672 |
+
"<|special_3791|>",
|
| 3673 |
+
"<|special_3792|>",
|
| 3674 |
+
"<|special_3793|>",
|
| 3675 |
+
"<|special_3794|>",
|
| 3676 |
+
"<|special_3795|>",
|
| 3677 |
+
"<|special_3796|>",
|
| 3678 |
+
"<|special_3797|>",
|
| 3679 |
+
"<|special_3798|>",
|
| 3680 |
+
"<|special_3799|>",
|
| 3681 |
+
"<|special_3800|>",
|
| 3682 |
+
"<|special_3801|>",
|
| 3683 |
+
"<|special_3802|>",
|
| 3684 |
+
"<|special_3803|>",
|
| 3685 |
+
"<|special_3804|>",
|
| 3686 |
+
"<|special_3805|>",
|
| 3687 |
+
"<|special_3806|>",
|
| 3688 |
+
"<|special_3807|>",
|
| 3689 |
+
"<|special_3808|>",
|
| 3690 |
+
"<|special_3809|>",
|
| 3691 |
+
"<|special_3810|>",
|
| 3692 |
+
"<|special_3811|>",
|
| 3693 |
+
"<|special_3812|>",
|
| 3694 |
+
"<|special_3813|>",
|
| 3695 |
+
"<|special_3814|>",
|
| 3696 |
+
"<|special_3815|>",
|
| 3697 |
+
"<|special_3816|>",
|
| 3698 |
+
"<|special_3817|>",
|
| 3699 |
+
"<|special_3818|>",
|
| 3700 |
+
"<|special_3819|>",
|
| 3701 |
+
"<|special_3820|>",
|
| 3702 |
+
"<|special_3821|>",
|
| 3703 |
+
"<|special_3822|>",
|
| 3704 |
+
"<|special_3823|>",
|
| 3705 |
+
"<|special_3824|>",
|
| 3706 |
+
"<|special_3825|>",
|
| 3707 |
+
"<|special_3826|>",
|
| 3708 |
+
"<|special_3827|>",
|
| 3709 |
+
"<|special_3828|>",
|
| 3710 |
+
"<|special_3829|>",
|
| 3711 |
+
"<|special_3830|>",
|
| 3712 |
+
"<|special_3831|>",
|
| 3713 |
+
"<|special_3832|>",
|
| 3714 |
+
"<|special_3833|>",
|
| 3715 |
+
"<|special_3834|>",
|
| 3716 |
+
"<|special_3835|>",
|
| 3717 |
+
"<|special_3836|>",
|
| 3718 |
+
"<|special_3837|>",
|
| 3719 |
+
"<|special_3838|>",
|
| 3720 |
+
"<|special_3839|>",
|
| 3721 |
+
"<|special_3840|>",
|
| 3722 |
+
"<|special_3841|>",
|
| 3723 |
+
"<|special_3842|>",
|
| 3724 |
+
"<|special_3843|>",
|
| 3725 |
+
"<|special_3844|>",
|
| 3726 |
+
"<|special_3845|>",
|
| 3727 |
+
"<|special_3846|>",
|
| 3728 |
+
"<|special_3847|>",
|
| 3729 |
+
"<|special_3848|>",
|
| 3730 |
+
"<|special_3849|>",
|
| 3731 |
+
"<|special_3850|>",
|
| 3732 |
+
"<|special_3851|>",
|
| 3733 |
+
"<|special_3852|>",
|
| 3734 |
+
"<|special_3853|>",
|
| 3735 |
+
"<|special_3854|>",
|
| 3736 |
+
"<|special_3855|>",
|
| 3737 |
+
"<|special_3856|>",
|
| 3738 |
+
"<|special_3857|>",
|
| 3739 |
+
"<|special_3858|>",
|
| 3740 |
+
"<|special_3859|>",
|
| 3741 |
+
"<|special_3860|>",
|
| 3742 |
+
"<|special_3861|>",
|
| 3743 |
+
"<|special_3862|>",
|
| 3744 |
+
"<|special_3863|>",
|
| 3745 |
+
"<|special_3864|>",
|
| 3746 |
+
"<|special_3865|>",
|
| 3747 |
+
"<|special_3866|>",
|
| 3748 |
+
"<|special_3867|>",
|
| 3749 |
+
"<|special_3868|>",
|
| 3750 |
+
"<|special_3869|>",
|
| 3751 |
+
"<|special_3870|>",
|
| 3752 |
+
"<|special_3871|>",
|
| 3753 |
+
"<|special_3872|>",
|
| 3754 |
+
"<|special_3873|>",
|
| 3755 |
+
"<|special_3874|>",
|
| 3756 |
+
"<|special_3875|>",
|
| 3757 |
+
"<|special_3876|>",
|
| 3758 |
+
"<|special_3877|>",
|
| 3759 |
+
"<|special_3878|>",
|
| 3760 |
+
"<|special_3879|>",
|
| 3761 |
+
"<|special_3880|>",
|
| 3762 |
+
"<|special_3881|>",
|
| 3763 |
+
"<|special_3882|>",
|
| 3764 |
+
"<|special_3883|>",
|
| 3765 |
+
"<|special_3884|>",
|
| 3766 |
+
"<|special_3885|>",
|
| 3767 |
+
"<|special_3886|>",
|
| 3768 |
+
"<|special_3887|>",
|
| 3769 |
+
"<|special_3888|>",
|
| 3770 |
+
"<|special_3889|>",
|
| 3771 |
+
"<|special_3890|>",
|
| 3772 |
+
"<|special_3891|>",
|
| 3773 |
+
"<|special_3892|>",
|
| 3774 |
+
"<|special_3893|>",
|
| 3775 |
+
"<|special_3894|>",
|
| 3776 |
+
"<|special_3895|>",
|
| 3777 |
+
"<|special_3896|>",
|
| 3778 |
+
"<|special_3897|>",
|
| 3779 |
+
"<|special_3898|>",
|
| 3780 |
+
"<|special_3899|>",
|
| 3781 |
+
"<|special_3900|>",
|
| 3782 |
+
"<|special_3901|>",
|
| 3783 |
+
"<|special_3902|>",
|
| 3784 |
+
"<|special_3903|>",
|
| 3785 |
+
"<|special_3904|>",
|
| 3786 |
+
"<|special_3905|>",
|
| 3787 |
+
"<|special_3906|>",
|
| 3788 |
+
"<|special_3907|>",
|
| 3789 |
+
"<|special_3908|>",
|
| 3790 |
+
"<|special_3909|>",
|
| 3791 |
+
"<|special_3910|>",
|
| 3792 |
+
"<|special_3911|>",
|
| 3793 |
+
"<|special_3912|>",
|
| 3794 |
+
"<|special_3913|>",
|
| 3795 |
+
"<|special_3914|>",
|
| 3796 |
+
"<|special_3915|>",
|
| 3797 |
+
"<|special_3916|>",
|
| 3798 |
+
"<|special_3917|>",
|
| 3799 |
+
"<|special_3918|>",
|
| 3800 |
+
"<|special_3919|>",
|
| 3801 |
+
"<|special_3920|>",
|
| 3802 |
+
"<|special_3921|>",
|
| 3803 |
+
"<|special_3922|>",
|
| 3804 |
+
"<|special_3923|>",
|
| 3805 |
+
"<|special_3924|>",
|
| 3806 |
+
"<|special_3925|>",
|
| 3807 |
+
"<|special_3926|>",
|
| 3808 |
+
"<|special_3927|>",
|
| 3809 |
+
"<|special_3928|>",
|
| 3810 |
+
"<|special_3929|>",
|
| 3811 |
+
"<|special_3930|>",
|
| 3812 |
+
"<|special_3931|>",
|
| 3813 |
+
"<|special_3932|>",
|
| 3814 |
+
"<|special_3933|>",
|
| 3815 |
+
"<|special_3934|>",
|
| 3816 |
+
"<|special_3935|>",
|
| 3817 |
+
"<|special_3936|>",
|
| 3818 |
+
"<|special_3937|>",
|
| 3819 |
+
"<|special_3938|>",
|
| 3820 |
+
"<|special_3939|>",
|
| 3821 |
+
"<|special_3940|>",
|
| 3822 |
+
"<|special_3941|>",
|
| 3823 |
+
"<|special_3942|>",
|
| 3824 |
+
"<|special_3943|>",
|
| 3825 |
+
"<|special_3944|>",
|
| 3826 |
+
"<|special_3945|>",
|
| 3827 |
+
"<|special_3946|>",
|
| 3828 |
+
"<|special_3947|>",
|
| 3829 |
+
"<|special_3948|>",
|
| 3830 |
+
"<|special_3949|>",
|
| 3831 |
+
"<|special_3950|>",
|
| 3832 |
+
"<|special_3951|>",
|
| 3833 |
+
"<|special_3952|>",
|
| 3834 |
+
"<|special_3953|>",
|
| 3835 |
+
"<|special_3954|>",
|
| 3836 |
+
"<|special_3955|>",
|
| 3837 |
+
"<|special_3956|>",
|
| 3838 |
+
"<|special_3957|>",
|
| 3839 |
+
"<|special_3958|>",
|
| 3840 |
+
"<|special_3959|>",
|
| 3841 |
+
"<|special_3960|>",
|
| 3842 |
+
"<|special_3961|>",
|
| 3843 |
+
"<|special_3962|>",
|
| 3844 |
+
"<|special_3963|>",
|
| 3845 |
+
"<|special_3964|>",
|
| 3846 |
+
"<|special_3965|>",
|
| 3847 |
+
"<|special_3966|>",
|
| 3848 |
+
"<|special_3967|>",
|
| 3849 |
+
"<|special_3968|>",
|
| 3850 |
+
"<|special_3969|>",
|
| 3851 |
+
"<|special_3970|>",
|
| 3852 |
+
"<|special_3971|>",
|
| 3853 |
+
"<|special_3972|>",
|
| 3854 |
+
"<|special_3973|>",
|
| 3855 |
+
"<|special_3974|>",
|
| 3856 |
+
"<|special_3975|>",
|
| 3857 |
+
"<|special_3976|>",
|
| 3858 |
+
"<|special_3977|>",
|
| 3859 |
+
"<|special_3978|>",
|
| 3860 |
+
"<|special_3979|>",
|
| 3861 |
+
"<|special_3980|>",
|
| 3862 |
+
"<|special_3981|>",
|
| 3863 |
+
"<|special_3982|>",
|
| 3864 |
+
"<|special_3983|>",
|
| 3865 |
+
"<|special_3984|>",
|
| 3866 |
+
"<|special_3985|>",
|
| 3867 |
+
"<|special_3986|>",
|
| 3868 |
+
"<|special_3987|>",
|
| 3869 |
+
"<|special_3988|>",
|
| 3870 |
+
"<|special_3989|>",
|
| 3871 |
+
"<|special_3990|>",
|
| 3872 |
+
"<|special_3991|>",
|
| 3873 |
+
"<|special_3992|>",
|
| 3874 |
+
"<|special_3993|>",
|
| 3875 |
+
"<|special_3994|>",
|
| 3876 |
+
"<|special_3995|>",
|
| 3877 |
+
"<|special_3996|>",
|
| 3878 |
+
"<|special_3997|>",
|
| 3879 |
+
"<|special_3998|>",
|
| 3880 |
+
"<|special_3999|>",
|
| 3881 |
+
"<|special_4000|>",
|
| 3882 |
+
"<|special_4001|>",
|
| 3883 |
+
"<|special_4002|>",
|
| 3884 |
+
"<|special_4003|>",
|
| 3885 |
+
"<|special_4004|>",
|
| 3886 |
+
"<|special_4005|>",
|
| 3887 |
+
"<|special_4006|>",
|
| 3888 |
+
"<|special_4007|>",
|
| 3889 |
+
"<|special_4008|>",
|
| 3890 |
+
"<|special_4009|>",
|
| 3891 |
+
"<|special_4010|>",
|
| 3892 |
+
"<|special_4011|>",
|
| 3893 |
+
"<|special_4012|>",
|
| 3894 |
+
"<|special_4013|>",
|
| 3895 |
+
"<|special_4014|>",
|
| 3896 |
+
"<|special_4015|>",
|
| 3897 |
+
"<|special_4016|>",
|
| 3898 |
+
"<|special_4017|>",
|
| 3899 |
+
"<|special_4018|>",
|
| 3900 |
+
"<|special_4019|>",
|
| 3901 |
+
"<|special_4020|>",
|
| 3902 |
+
"<|special_4021|>",
|
| 3903 |
+
"<|special_4022|>",
|
| 3904 |
+
"<|special_4023|>",
|
| 3905 |
+
"<|special_4024|>",
|
| 3906 |
+
"<|special_4025|>",
|
| 3907 |
+
"<|special_4026|>",
|
| 3908 |
+
"<|special_4027|>",
|
| 3909 |
+
"<|special_4028|>",
|
| 3910 |
+
"<|special_4029|>",
|
| 3911 |
+
"<|special_4030|>",
|
| 3912 |
+
"<|special_4031|>",
|
| 3913 |
+
"<|special_4032|>",
|
| 3914 |
+
"<|special_4033|>",
|
| 3915 |
+
"<|special_4034|>",
|
| 3916 |
+
"<|special_4035|>",
|
| 3917 |
+
"<|special_4036|>",
|
| 3918 |
+
"<|special_4037|>",
|
| 3919 |
+
"<|special_4038|>",
|
| 3920 |
+
"<|special_4039|>",
|
| 3921 |
+
"<|special_4040|>",
|
| 3922 |
+
"<|special_4041|>",
|
| 3923 |
+
"<|special_4042|>",
|
| 3924 |
+
"<|special_4043|>",
|
| 3925 |
+
"<|special_4044|>",
|
| 3926 |
+
"<|special_4045|>",
|
| 3927 |
+
"<|special_4046|>",
|
| 3928 |
+
"<|special_4047|>",
|
| 3929 |
+
"<|special_4048|>",
|
| 3930 |
+
"<|special_4049|>",
|
| 3931 |
+
"<|special_4050|>",
|
| 3932 |
+
"<|special_4051|>",
|
| 3933 |
+
"<|special_4052|>",
|
| 3934 |
+
"<|special_4053|>",
|
| 3935 |
+
"<|special_4054|>",
|
| 3936 |
+
"<|special_4055|>",
|
| 3937 |
+
"<|special_4056|>",
|
| 3938 |
+
"<|special_4057|>",
|
| 3939 |
+
"<|special_4058|>",
|
| 3940 |
+
"<|special_4059|>",
|
| 3941 |
+
"<|special_4060|>",
|
| 3942 |
+
"<|special_4061|>",
|
| 3943 |
+
"<|special_4062|>",
|
| 3944 |
+
"<|special_4063|>",
|
| 3945 |
+
"<|special_4064|>",
|
| 3946 |
+
"<|special_4065|>",
|
| 3947 |
+
"<|special_4066|>",
|
| 3948 |
+
"<|special_4067|>",
|
| 3949 |
+
"<|special_4068|>",
|
| 3950 |
+
"<|special_4069|>",
|
| 3951 |
+
"<|special_4070|>",
|
| 3952 |
+
"<|special_4071|>",
|
| 3953 |
+
"<|special_4072|>",
|
| 3954 |
+
"<|special_4073|>",
|
| 3955 |
+
"<|special_4074|>",
|
| 3956 |
+
"<|special_4075|>",
|
| 3957 |
+
"<|special_4076|>",
|
| 3958 |
+
"<|special_4077|>",
|
| 3959 |
+
"<|special_4078|>",
|
| 3960 |
+
"<|special_4079|>",
|
| 3961 |
+
"<|special_4080|>",
|
| 3962 |
+
"<|special_4081|>",
|
| 3963 |
+
"<|special_4082|>",
|
| 3964 |
+
"<|special_4083|>",
|
| 3965 |
+
"<|special_4084|>",
|
| 3966 |
+
"<|special_4085|>",
|
| 3967 |
+
"<|special_4086|>",
|
| 3968 |
+
"<|special_4087|>",
|
| 3969 |
+
"<|special_4088|>",
|
| 3970 |
+
"<|special_4089|>",
|
| 3971 |
+
"<|special_4090|>",
|
| 3972 |
+
"<|special_4091|>",
|
| 3973 |
+
"<|special_4092|>",
|
| 3974 |
+
"<|special_4093|>",
|
| 3975 |
+
"<|special_4094|>",
|
| 3976 |
+
"<|special_4095|>"
|
| 3977 |
+
],
|
+  "bos_token": {
+    "content": "<|startoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "eos_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": {
+    "content": "<|endoftext|>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "unk_token": {
+    "content": "<unk>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}
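Taken together, these special_tokens_map.json additions reserve a large block of `<|special_NNNN|>` placeholder tokens and pin down the BOS, EOS, PAD, and UNK tokens, with padding reusing the `<|endoftext|>` EOS token. A minimal sketch of how this setup surfaces through `transformers` — the repo id `upstage/solar-open` is a placeholder for the actual repository, and `trust_remote_code=True` is assumed because the commit ships custom `configuration_solar_open.py` / `modeling_solar_open.py` files:

```python
# Minimal sketch: inspect the special-token setup added above.
# NOTE: "upstage/solar-open" is a placeholder repo id, and
# trust_remote_code=True is an assumption based on the custom
# modeling/configuration files uploaded in this commit.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("upstage/solar-open", trust_remote_code=True)

print(tok.bos_token)  # "<|startoftext|>"
print(tok.eos_token)  # "<|endoftext|>"
print(tok.pad_token)  # "<|endoftext|>" -- padding reuses the EOS token
print(tok.unk_token)  # "<unk>"

# The "<|special_NNNN|>" placeholders are registered as additional special
# tokens, so each one maps to a single id instead of being split by BPE.
print(tok.convert_tokens_to_ids("<|special_4095|>"))
```

Because the pad token and EOS token share `<|endoftext|>`, batched generation should pass an explicit attention mask so padding positions are not mistaken for genuine end-of-sequence tokens.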
tokenizer.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:cad2dc0cc1dd988ef022be5d94131ec9eecfbf66a955ca820fde0c462fa69add
+size 16473584
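These three lines are a Git LFS pointer, not the tokenizer itself: `oid` is the SHA-256 of the real file and `size` (16,473,584 bytes, roughly 16 MB) is its length. Hugging Face tooling resolves the pointer transparently; a sketch using the same placeholder repo id as above:

```python
# Sketch: fetch the LFS-backed tokenizer.json. hf_hub_download resolves
# the pointer to the actual ~16 MB file and returns a local cache path.
# "upstage/solar-open" is a placeholder repo id.
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="upstage/solar-open", filename="tokenizer.json")
print(path)
```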
tokenizer_config.json
ADDED
The diff for this file is too large to render. See raw diff.